
Should a Python process wait for a job started with `run_in_executor`, even though I did not await it?

Assume we have the following script:


import asyncio
import time


def sync_sleep(x):
    print("sync_sleep going to sleep")
    time.sleep(x)
    print("sync_sleep awake")


async def main():
    loop = asyncio.get_running_loop()
    loop.run_in_executor(None, sync_sleep, 4)
    print("main going to sleep")
    await asyncio.sleep(1)
    print("main awake ")


if __name__ == '__main__':
    loop = asyncio.get_event_loop()
    loop.run_until_complete(main())
    print("main finished")

Note: I have omitted `loop.close()` intentionally.

Now when I execute the above script:

python script.py

I get the following output:

sync_sleep going to sleep
main going to sleep
main awake 
main finished
sync_sleep awake

I expected the process to exit after printing "main finished", but it did not exit until the sync_sleep job had finished. I mean, if I had wanted to wait for that job, I would have added the line:

loop.run_until_complete(loop.shutdown_default_executor())

Is my expectation wrong?

The default `ThreadPoolExecutor` joins its worker threads at interpreter exit (via an exit hook), so the process waits for every submitted job to finish before it stops.
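This join-at-exit behavior can be observed with a plain `ThreadPoolExecutor`, no asyncio involved. A minimal sketch (function and timing chosen for illustration):

```python
import time
from concurrent.futures import ThreadPoolExecutor


def work():
    time.sleep(2)
    print("worker done")


executor = ThreadPoolExecutor()
executor.submit(work)  # submitted, never waited on explicitly
print("main exiting")
# The interpreter does not terminate here: an exit hook joins the
# executor's worker threads, so "worker done" is still printed.
```

Running this prints "main exiting" immediately, but the process lingers about two seconds before "worker done" appears and it finally exits, which is exactly the behavior observed in the question.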

Apart from implementing an `Executor` class yourself, the only alternative is to use the `threading` module to create a thread with `daemon=True`, since daemon threads do not keep the process alive. If you still want to wait for that thread from asyncio, you can run `thread.join` in an executor and await it.
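A sketch of that workaround, adapted from the question's script (the thread arguments and sleep durations are kept from the original; the commented-out `join` line shows the optional wait):

```python
import asyncio
import threading
import time


def sync_sleep(x):
    print("sync_sleep going to sleep")
    time.sleep(x)
    print("sync_sleep awake")


async def main():
    # A daemon thread does not keep the process alive at exit.
    t = threading.Thread(target=sync_sleep, args=(4,), daemon=True)
    t.start()
    print("main going to sleep")
    await asyncio.sleep(1)
    print("main awake")
    # To wait for the thread without blocking the event loop, uncomment:
    # await asyncio.get_running_loop().run_in_executor(None, t.join)


if __name__ == "__main__":
    asyncio.run(main())
    print("main finished")
```

With the `join` line commented out, the process exits right after "main finished"; "sync_sleep awake" is never printed because the daemon thread is killed at interpreter exit.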
