asyncio.create_task() and asyncio.gather() running in sequence
I have an API endpoint that takes a long time to complete. So I want to split the work into many smaller jobs, run them in parallel, and wait for all the results before sending the response.
A snippet of my code:
@app.post("/data")
async def get_tsp_events():
    queries = [foo, bar, foo, bar]
    tasks = [asyncio.create_task(do_work(query)) for query in queries]
    events = await asyncio.gather(*tasks)
    return events

async def do_work(query):
    log("Start time")
    # This takes a long time
    events = [event for event in range(10000000000)]
    log("End time")
    return events
As far as I can see, all the tasks run in sequence, just like normal code (as if asyncio.create_task() and asyncio.gather() were not used at all). I'm new to Python, and my question is: why don't these tasks run in parallel?
Thank you all
In fact, async and await do not introduce real parallelism; they only help when your code performs genuinely asynchronous operations underneath, such as those provided by aiohttp or aiofile. Even then, they only provide concurrency for IO-bound tasks: whenever a coroutine awaits, the event loop can switch to another one, but a pure-Python loop like yours never awaits, so it blocks the loop and everything runs in sequence.
If you want real parallelism for CPU-bound tasks, use multiprocessing.
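To make the distinction concrete, here is a minimal sketch where asyncio.sleep() stands in for a real non-blocking IO call (such as an aiohttp request); the names fake_io and main are made up for this example. Because each coroutine actually awaits, the three one-second waits overlap and the whole gather finishes in about one second, not three:

```python
import asyncio
import time

async def fake_io(n):
    # asyncio.sleep yields control to the event loop, like a real
    # non-blocking IO call would; a CPU-bound loop never yields.
    await asyncio.sleep(1)
    return n

async def main():
    start = time.perf_counter()
    results = await asyncio.gather(*(fake_io(i) for i in range(3)))
    elapsed = time.perf_counter() - start
    return results, elapsed

results, elapsed = asyncio.run(main())
print(results)   # [0, 1, 2]
print(elapsed)   # roughly 1 second, because the sleeps overlap
```

If you replace the await asyncio.sleep(1) with a busy loop, the total time becomes the sum of the individual times again, which is exactly the behaviour described in the question.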
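A convenient way to combine this with an async endpoint is to push the CPU-bound function into a process pool via loop.run_in_executor(). The sketch below (the helper names cpu_bound and main are illustrative, and the workload is a stand-in for the real computation) shows four jobs running in separate worker processes while the event loop stays responsive:

```python
import asyncio
from concurrent.futures import ProcessPoolExecutor

def cpu_bound(n):
    # A pure-Python loop holds the GIL, so to run such jobs in
    # parallel each one must live in its own worker process.
    total = 0
    for i in range(n):
        total += i
    return total

async def main():
    loop = asyncio.get_running_loop()
    with ProcessPoolExecutor() as pool:
        # run_in_executor returns awaitable futures, so they can be
        # gathered just like ordinary coroutines.
        futures = [loop.run_in_executor(pool, cpu_bound, 100_000) for _ in range(4)]
        return await asyncio.gather(*futures)

if __name__ == "__main__":
    print(asyncio.run(main()))
```

Inside a FastAPI handler you would await the same gather call; the worker processes do the heavy lifting while the server keeps handling other requests.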