
POST request on API endpoint does not return data

I'm currently trying to use a website's API directly instead of writing a web scraper. I have been inspecting the site with Chrome DevTools, capturing the information under the Network tab when a POST request is made.

So far I have had no luck: the endpoint returns <Response [200]> but no data.

Please advise what I did wrong.

Below is my Python code:

    import requests

    def get_data_from_api():
        search_keyword = 'money'
        api_endpoint = 'https://45bwzj1sgc-dsn.algolia.net/1/indexes/*/queries?x-algolia-agent=Algolia%20for%20JavaScript%20(4.2.0)%3B%20Browser'
        payload = {
            'x-xxxx-application-id': '45BWZJ1SGC',
            'x-xxx-api-key': 'MTRkMGNlOGFhMWQ4YThmZDA0YmViNGY2M2ViYzBlMGMwNTBmMjllNjJmMDA4YmY1YzY3YzI2NzRmMzlhYThkM3RhZ0ZpbHRlcnM9JTVCJTVCJTIyc3VzX3B1YmxpYyUyMiUyQyUyMnN1c19jaGFubmVsX2FsbCUyMiUyQyUyMnN1c19jaGFubmVsX2ZlYXR1cmVkJTIyJTJDJTIyc3VzX2NoYW5uZWxfd2Vla2x5LW5ld3NsZXR0ZXIlMjIlMkMlMjJzdXNfY2hhbm5lbF9hbWElMjIlMkMlMjJzdXNfY2hhbm5lbF9saXZlc3RyZWFtJTIyJTJDJTIyc3VzX2NoYW5uZWxfc2hvdy1zdXMlMjIlMkMlMjJzdXNfY2hhbm5lbF9lbmdpbmVlcmluZyUyMiU1RCU1RCZhbmFseXRpY3NUYWdzPSU1QiUyMnN1cyUyMiU1RA=='
        }

        r = requests.post(api_endpoint, headers=payload, data=str(''' {"requests":[{"indexName":"OmniSearch_sus_production","query":"''' + search_keyword + '''","params":"attributesToSnippet=%5B%22body%3A80%22%2C%22searchable_comments%3A30%22%5D&page=0&hitsPerPage=10&filters=&facetFilters=%5B%5B%5D%5D&facets=%5B%22channel%22%5D&sortFacetValuesBy=count&clickAnalytics=true"}]} '''))

        print(r)

    if __name__ == "__main__":
        get_data_from_api()

You can try:

    print(r.json())

`r` is a `Response` object, not the response body. You can use:

  • `r.content` to get the binary body
  • `r.text` to get the string body
  • `r.json()` (or `orjson.loads(r.content)`) to get the parsed JSON body

Check out the Requests documentation for more details.
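To make the distinction concrete, here is a minimal sketch using a hand-built `Response` so no network call is needed (the status code and JSON body are made up for illustration):

```python
import requests

# build a Response by hand so the example needs no network access
resp = requests.models.Response()
resp.status_code = 200
resp._content = b'{"hits": [{"title": "money talks"}]}'

print(resp)         # <Response [200]>  -- just the object's repr, not the body
print(resp.text)    # the raw string body
print(resp.json())  # the parsed JSON body (a dict)
```

Printing the object itself only shows the status code, which is why the question's code produced `<Response [200]>` and nothing else.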

To follow up on my comment: there are a few places where you can let Python do the hard work for you. Consider this alternative, which uses urllib.parse.urlencode to build the params value and lets Requests serialize the request body:

    from urllib.parse import urlencode

    import requests


    def get_data_from_api():
        search_keyword = "money"
        api_endpoint = "..."
        headers = {
            "x-xxxx-application-id": "...",
            "x-xxx-api-key": "...",
        }

        params = {
            "attributesToSnippet": '["body:80","searchable_comments:30"]',
            "page": "0",
            "hitsPerPage": "10",
            "facetFilters": "[[]]",
            "filters": "",
            "facets": '["channel"]',
            "sortFacetValuesBy": "count",
            "clickAnalytics": "true",
        }

        r = requests.post(
            api_endpoint,
            headers=headers,
            # `json=` serializes the nested structure as JSON and sets the
            # Content-Type header; `data=` would form-encode it and mangle
            # the nested dicts
            json={
                "requests": [
                    {
                        "indexName": "OmniSearch_sus_production",
                        "query": search_keyword,
                        "params": urlencode(params),
                    }
                ]
            },
        )

        return r.json()
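For reference, `urlencode` percent-encodes the dict into the query-string form Algolia expects inside `params`. A small standalone sketch, trimmed to three of the keys above:

```python
from urllib.parse import urlencode

params = {
    "page": "0",
    "hitsPerPage": "10",
    "facets": '["channel"]',
}

# urlencode quotes each value, so brackets and quotes in the
# JSON-ish strings are escaped rather than passed through raw
print(urlencode(params))
# page=0&hitsPerPage=10&facets=%5B%22channel%22%5D
```

This is exactly the encoding the original code had hand-written into the giant string literal.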

Quick, ApplePie! It's 2 AM, something broke, and everyone's in a panic: which version of the code (they are functionally identical) would you rather have to troubleshoot?

I know this doesn't directly answer your question, but take this in the spirit of "while we're on the subject...".
