
How are threads (and asyncio tasks) scheduled in Python?

I'm trying to understand concurrency in Python and am confused about how threads are scheduled and how tasks (in asyncio library) are scheduled to run/wait.

Suppose a thread tries to acquire a Lock and is blocked. Does the Python interpreter immediately put that thread into the 'blocked' queue? How is this blocked thread put back into the running state? Is there busy waiting involved?

How is this different when a task (the equivalent of a thread) in the asyncio library is blocked on an async mutex?

What is the advantage of asyncio, if there is no busy waiting involved in either of the above two cases?

Suppose a thread tries to acquire a Lock and is blocked. Does the Python interpreter immediately put that thread into the 'blocked' queue?

Python threads are real operating system threads, so the interpreter does no queuing or scheduling of its own. A thread blocked on a lock is parked by the kernel and woken when the lock becomes available; there is no busy waiting.
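A minimal sketch of this: each `threading.Thread` maps to a distinct OS thread (visible via `threading.get_native_id()`), and a thread blocked on a `Lock` is simply resumed by the OS once the holder releases it — no polling loop in the interpreter.

```python
import threading

lock = threading.Lock()
native_ids = []

def worker():
    # Record the OS-level thread ID of this worker.
    native_ids.append(threading.get_native_id())
    with lock:  # if the lock is held, the OS parks this thread until release
        pass

lock.acquire()  # hold the lock so the worker must block
t = threading.Thread(target=worker)
t.start()
lock.release()  # the OS wakes the blocked worker; no busy waiting
t.join()

# The worker ran on a different OS thread than the main thread.
print(native_ids[0] != threading.get_native_id())
```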

The one possible exception is the global interpreter lock (GIL), which the interpreter uses to serialize execution of Python bytecode and access to Python objects. This lock is released not only before acquiring a threading lock, but also before any (potentially) blocking operation, such as reading from an IO handle or sleeping.
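This release of the GIL around blocking calls can be observed directly: several threads sleeping at once finish in roughly the time of one sleep, because each `time.sleep` releases the GIL while it waits in the OS. A small sketch:

```python
import threading
import time

def sleeper():
    time.sleep(0.2)  # the GIL is released while this thread blocks in the OS

start = time.monotonic()
threads = [threading.Thread(target=sleeper) for _ in range(5)]
for t in threads:
    t.start()
for t in threads:
    t.join()
elapsed = time.monotonic() - start

# All five sleeps overlap: total wall time is ~0.2 s, not ~1.0 s.
print(elapsed < 0.6)
```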

What is the advantage of asyncio, if there is no busy waiting involved in either of the above two cases?

The advantage is that asyncio doesn't require a new OS thread for each coroutine it runs concurrently. OS threads are expensive, and asyncio tasks are quite lightweight. Also, asyncio makes the potential switch points visible (the await keyword), so there's less potential for race conditions.
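A sketch of both points: the tasks below all share one OS thread, a task blocked on an `asyncio.Lock` is merely suspended until the event loop resumes it, and the only places another task can run are the explicit `await` expressions.

```python
import asyncio

async def worker(name, lock, order):
    # Blocking here suspends this coroutine; the event loop resumes it
    # when the lock is released -- no extra OS thread is involved.
    async with lock:
        order.append(name)
        await asyncio.sleep(0)  # an explicit, visible switch point

async def main():
    lock = asyncio.Lock()
    order = []
    # Thousands of tasks would be fine; OS threads at this scale are costly.
    await asyncio.gather(*(worker(i, lock, order) for i in range(5)))
    return order

order = asyncio.run(main())
print(order)
```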

You can think of asyncio as a successor to Twisted, but with a modern API and using suspendable coroutines instead of explicit callback chaining.
