When using asyncio.gather
, not all tasks finish executing when one of them raises an exception:
# coding: utf-8
import asyncio

async def foo(i):
    await asyncio.sleep(0.1)
    if i == 2:
        print("2: 1/0")
        1/0
    print(i)

async def main():
    futures = []
    for i in range(1000):
        futures.append(foo(i))
    await asyncio.gather(*futures)

asyncio.run(main())
0
1
2: 1/0
3
4
5
6
7
[...]
501
502
503
504
Traceback (most recent call last):
  File "<input>", line 24, in <module>
  File "/usr/lib/python3.8/asyncio/runners.py", line 43, in run
    return loop.run_until_complete(main)
  File "/usr/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
    return future.result()
  File "<input>", line 21, in main
  File "<input>", line 9, in foo
ZeroDivisionError: division by zero
How can I ensure all tasks are executed before asyncio.gather
returns? I understand this cannot be the default behavior of asyncio.gather
, because if one of my tasks never finished, the exception would never be raised. My question is more: how can I run a pool of tasks with gather and wait until all of them have finished or raised before continuing?
I think you want to pay close attention to the documentation for asyncio.gather:
If return_exceptions is False (default), the first raised exception is immediately propagated to the task that awaits on gather(). Other awaitables in the aws sequence won't be cancelled and will continue to run.
If return_exceptions is True, exceptions are treated the same as successful results, and aggregated in the result list.
From my reading, it looks like if you were to call asyncio.gather
like this...
await asyncio.gather(*futures, return_exceptions=True)
...you should get the behavior you want.
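Adapting the question's example (a sketch, not the original code, with the range shortened to 5 for brevity): with return_exceptions=True, gather waits for every coroutine to complete and puts any exception into the result list in place of a return value, so you can inspect or re-raise failures after everything has run.

```python
import asyncio

async def foo(i):
    await asyncio.sleep(0.1)
    if i == 2:
        # This task fails, but the others still run to completion.
        raise ZeroDivisionError("boom")
    return i

async def main():
    # gather returns once ALL coroutines have finished; exceptions
    # are returned as ordinary items in the results list.
    results = await asyncio.gather(
        *(foo(i) for i in range(5)), return_exceptions=True
    )
    for i, r in enumerate(results):
        if isinstance(r, Exception):
            print(f"task {i} failed: {r!r}")
        else:
            print(f"task {i} returned {r}")

asyncio.run(main())
```

Note that this means you must check the results yourself; nothing is raised automatically, so a failure can pass silently if you never look at the list.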