
Do Multiple Python Programs Use Different CPU Cores?

I have a question regarding Python and load balancing. Consider a virtual machine with 16 virtual CPU cores and a single Python REST service wrapped in Docker. I suppose it will only use one of the 16 virtual cores when run.

Now consider 8 duplicates of the Python REST service running in parallel via Docker Compose behind a load balancer (e.g. an nginx upstream). Do they use 8 of the 16 cores? And if so, how do they know which ones to use?

Also, do the virtual cores make any difference? What about a real system with 16 physical cores?

If anyone has any experience or knowledge I would be happy if you could share it.

Thank you!

Whether a REST service uses one core or several depends on how you code it. You could, for instance, have a pool of Python processes handling the requests.

Let's say that you have a Python process. Since Python grabs the Global Interpreter Lock (GIL) when running bytecode, even if the process has multiple threads, only one will run at a time. If you have multiple Python processes, each one has its own GIL, so those processes can run in parallel on multiple cores.

How they run is up to the operating system, and it doesn't care whether it's running Python or anything else. At any given time there is some number of runnable threads, and the operating system scheduler keeps all cores busy as much as possible. When the code on one core makes an operating system call, or when a certain time slice has elapsed, the OS may pick one of the pending threads to execute on that core. The old thread is put back in the wait queue.
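You can ask the OS about this from Python. A small sketch, assuming a Linux host (`os.sched_getaffinity` is not available on every platform, hence the guard):

```python
# Sketch: querying the OS about cores. os.cpu_count() reports the
# logical core count; os.sched_getaffinity(0) (Linux-only) reports
# which cores *this* process may be scheduled on -- the scheduler
# picks among exactly these.
import os

print(os.cpu_count())  # e.g. 16 on the VM from the question

if hasattr(os, "sched_getaffinity"):
    allowed = sorted(os.sched_getaffinity(0))
    print(allowed)  # core IDs this process is allowed to run on
```

Inside a Docker container the affinity set can be narrower than the host's core count if the container was started with a CPU restriction such as `--cpuset-cpus`.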

There are rules about which threads get the highest priority, and so on, but this is all quite operating-system dependent, so one has to stay vague when speaking in general terms.
