Asyncio not running Aiohttp requests in parallel
I want to run many HTTP requests in parallel using Python. I tried the aiohttp module together with asyncio.
import aiohttp
import asyncio

async def main():
    async with aiohttp.ClientSession() as session:
        for i in range(10):
            async with session.get('https://httpbin.org/get') as response:
                html = await response.text()
                print('done' + str(i))

loop = asyncio.get_event_loop()
loop.run_until_complete(main())
I expected it to execute all the requests in parallel, but they are executed one after another. I later solved this using threading, but I would like to know what is wrong with this approach.
You need to make the requests in a concurrent manner. Currently, you have a single task defined by main(), so the HTTP requests run serially within that one task: each await on a response completes before the next request is even started.
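The serial-vs-concurrent difference can be demonstrated without aiohttp at all. The sketch below (my own illustration, using asyncio.sleep() as a stand-in for a network request) times the two patterns: awaiting inside a loop takes roughly the sum of all delays, while asyncio.gather() overlaps them so the total is roughly one delay.

    import asyncio
    import time

    async def fake_request(i):
        # Stand-in for an HTTP request: just sleeps instead of fetching.
        await asyncio.sleep(0.2)
        return i

    async def serial():
        # Awaiting inside the loop: each "request" finishes before the next starts.
        return [await fake_request(i) for i in range(5)]

    async def concurrent():
        # gather() schedules all coroutines at once; total time is about one delay.
        return await asyncio.gather(*(fake_request(i) for i in range(5)))

    start = time.perf_counter()
    asyncio.run(serial())
    serial_time = time.perf_counter() - start  # about 1.0 s (5 x 0.2 s)

    start = time.perf_counter()
    asyncio.run(concurrent())
    concurrent_time = time.perf_counter() - start  # about 0.2 s

    print(f"serial: {serial_time:.2f}s, concurrent: {concurrent_time:.2f}s")

Your original code is the serial() pattern; the answer below restructures it into the concurrent() pattern.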
If you are using Python 3.7+, you can also use asyncio.run(), which abstracts away creation of the event loop:
import aiohttp
import asyncio

async def getResponse(session, i):
    async with session.get('https://httpbin.org/get') as response:
        html = await response.text()
        print('done' + str(i))

async def main():
    async with aiohttp.ClientSession() as session:
        tasks = [getResponse(session, i) for i in range(10)]  # create list of tasks
        await asyncio.gather(*tasks)  # run them concurrently

asyncio.run(main())
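One practical caveat: firing hundreds of requests at once can hit server rate limits or exhaust connections. A common way to cap the number of in-flight requests is an asyncio.Semaphore. This is a sketch of my own, again using asyncio.sleep() as a stand-in for the real session.get() call; the limit value of 3 is an arbitrary example.

    import asyncio

    async def fetch(sem, i):
        async with sem:  # at most `limit` coroutines run this body at a time
            await asyncio.sleep(0.1)  # replace with the real aiohttp request
            return i

    async def main(limit=3):
        sem = asyncio.Semaphore(limit)
        # gather() still starts all ten tasks, but the semaphore
        # lets only `limit` of them past the `async with` at once.
        return await asyncio.gather(*(fetch(sem, i) for i in range(10)))

    results = asyncio.run(main())
    print(results)  # gather() returns results in task order: [0, 1, ..., 9]

Note that aiohttp's ClientSession also limits concurrent connections on its own (via its connector), so the semaphore is only needed when you want a stricter or request-level cap.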