
Asynchronous HTTP POST requests in Python

I have code that posts an HTTP request sequentially for each tuple in a list and appends the response to another list.

import requests

url = 'https://www....'
input_list = [(1,2),(6,4),(7,8)]
output_list = []

for i in input_list:
    # Post the second element of the tuple and pair the response text with the first element
    resp = (requests.post(url, data=i[1]).text, i[0])
    output_list.append(resp)

print(output_list)

Can someone please point me in the right direction for making these HTTP requests in parallel?

Since the requests library doesn't support asyncio natively, I'd use multiprocessing.pool.ThreadPool, assuming most of the time is spent waiting for I/O. Otherwise it might be beneficial to use multiprocessing.Pool instead.

from multiprocessing.pool import ThreadPool
import requests

url = 'https://www....'
input_list = [(1,2),(6,4),(7,8)]

def get_url(i):
    # Post the second element of the tuple and pair the response text with the first element
    return (requests.post(url, data=i[1]).text, i[0])

with ThreadPool(10) as pool:  # up to ten requests run in parallel
    output_list = list(pool.map(get_url, input_list))
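If you want a fully asynchronous version instead, a common route is the third-party aiohttp library together with asyncio (my own suggestion, not part of the answer above). Here is a minimal sketch, assuming the same endpoint and tuple format as in the question and producing the same (response_text, first_element) pairs:

import asyncio
import aiohttp

url = 'https://www....'
input_list = [(1, 2), (6, 4), (7, 8)]

async def post_one(session, i):
    # Send the second element of the tuple as the request body (serialized as a string)
    # and pair the response text with the first element
    async with session.post(url, data=str(i[1])) as resp:
        text = await resp.text()
        return (text, i[0])

async def main():
    async with aiohttp.ClientSession() as session:
        # Schedule all requests concurrently and collect the results
        tasks = [post_one(session, i) for i in input_list]
        return await asyncio.gather(*tasks)

output_list = asyncio.run(main())
print(output_list)

asyncio.gather returns results in the order of input_list, so output_list lines up with the input just as it does in the ThreadPool version.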
