
How can I send a payload with multiple requests in Python using ThreadPoolExecutor and requests?

I have managed to send multiple requests to a web API at the same time through ThreadPoolExecutor and get the JSON responses, but I can't send requests with a payload. Would you be kind enough to look at my code and suggest an edit so that it sends the payload (data, headers)? I just don't know how to send the payload.

from concurrent.futures import ThreadPoolExecutor
import requests
from timer import timer

URL = 'whatever.com'
payload = {'aaaaa': '0xxxxxxx'}
headers = {
    'abc': 'xyz',
    'Content-Type': 'application/json',
}

def fetch(session, url):
    with session.post(url) as response:
        print(response.json())

@timer(1, 1)
def main():
    with ThreadPoolExecutor(max_workers=100) as executor:
        with requests.session() as session:
            executor.map(fetch, [session] * 100, [URL] * 100)
            executor.shutdown(wait=True)

Normally you specify a "payload" using the data keyword argument on the call to the post method. But if you want to send it in JSON format, then you should use the json keyword argument:

session.post(url, json=payload, headers=headers)

(If the header specifies 'Content-Type': 'application/json', as yours does, and if payload were already a JSON string, which yours is not, then you would be correct to use the data keyword argument, for then you would not need any JSON conversion. But here you clearly need requests to first convert a Python dictionary to a JSON string for transmission, and that is why the json argument is being used. You do not really need to explicitly specify the Content-Type header, since requests will provide an appropriate one for you.)
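
As a minimal sketch of the two equivalent calls (the URL here is a placeholder; the point is only to contrast json= with data= plus manual serialization):

import json
import requests

payload = {'aaaaa': '0xxxxxxx'}

with requests.Session() as session:
    # json= lets requests serialize the dict and set Content-Type for you
    r1 = session.post('https://example.com/api', json=payload)
    # data= with a pre-serialized string requires setting the header yourself
    r2 = session.post('https://example.com/api',
                      data=json.dumps(payload),
                      headers={'Content-Type': 'application/json'})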

Now I know this is just a "dummy" program fetching the same URL 100 times. In a more realistic version you would be fetching 100 different URLs, but you would, of course, be using the same Session instance for each call to fetch. You could therefore simplify the program in the following way:

from functools import partial

...

def main():
    with ThreadPoolExecutor(max_workers=100) as executor:
        with requests.Session() as session:
            worker = partial(fetch, session)  # first argument will be session
            executor.map(worker, [URL] * 100)
            # remove the following line:
            #executor.shutdown(wait=True)

Note that I have also commented out your explicit call to the shutdown method, since shutdown will be called automatically when the with ... as executor: block terminates.
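
Putting the pieces together, a sketch of the full corrected program might look like this (I have left out your @timer decorator, since it comes from your own timer module, and added a standard entry-point guard):

from concurrent.futures import ThreadPoolExecutor
from functools import partial
import requests

URL = 'whatever.com'
payload = {'aaaaa': '0xxxxxxx'}
headers = {'abc': 'xyz'}  # Content-Type is set automatically by json=

def fetch(session, url):
    # send the payload as JSON; requests serializes the dict for us
    with session.post(url, json=payload, headers=headers) as response:
        print(response.json())

def main():
    with ThreadPoolExecutor(max_workers=100) as executor:
        with requests.Session() as session:
            worker = partial(fetch, session)  # bind the shared session
            executor.map(worker, [URL] * 100)

if __name__ == '__main__':
    main()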
