
Does only one Python interpreter execute multiple concurrent scripts?

I have a Python script that sends 4GB worth of data to a server in 10MB chunks using a REST API. No matter how many of these scripts I invoke concurrently, I get exactly the same overall throughput client-side (10Gb network, server-class system):
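The chunking pattern described can be sketched with a stdlib-only generator (the REST call itself is omitted, since the question does not specify the endpoint or HTTP client library):

```python
import io

CHUNK_SIZE = 10 * 1024 * 1024  # 10MB chunks, as in the question


def iter_chunks(stream, chunk_size=CHUNK_SIZE):
    """Yield successive fixed-size chunks from a binary stream."""
    while True:
        chunk = stream.read(chunk_size)
        if not chunk:
            break
        yield chunk  # each chunk would be one REST request body


# A 4GB transfer in 10MB chunks is 4 * 1024**3 / (10 * 1024**2) = 409.6,
# i.e. 410 requests with the last one partial.
total_bytes = 4 * 1024**3
n_chunks = -(-total_bytes // CHUNK_SIZE)  # ceiling division
print(n_chunks)  # 410
```

Each yielded chunk would be posted as one request; the generator keeps memory use at one chunk regardless of total transfer size.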

1 invocation = 300MB/s

2 invocations = 300MB/s

4 invocations = 300MB/s

8 invocations = 300MB/s

At first I thought it was some kind of disk read limitation, but I modified the script so that it does not require hard drive access and uses minimal memory, and I still get exactly the same throughput. CPU and memory usage during execution is minimal.

Researching further, I read that the Python interpreter is single-threaded. That is fine (and makes sense, I guess), but is it possible that only one instance of the Python interpreter is invoked at a time, despite multiple Python scripts being invoked concurrently?

No, multiple Python processes executed separately will not share threads or any other state.
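This is easy to verify directly: each script invocation starts its own interpreter process with its own process ID (and its own GIL). A minimal check using the `subprocess` module:

```python
import os
import subprocess
import sys

# Launch two independent Python interpreters; each reports its own PID.
# If all invocations shared a single interpreter, the PIDs would collide.
cmd = [sys.executable, "-c", "import os; print(os.getpid())"]
pid_a = int(subprocess.run(cmd, capture_output=True, text=True).stdout)
pid_b = int(subprocess.run(cmd, capture_output=True, text=True).stdout)

print(os.getpid(), pid_a, pid_b)  # three distinct process IDs
```

The GIL only serializes threads *within* one process; separate processes run their interpreters fully in parallel, so it cannot explain the flat throughput here.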

The most likely case is that 300MB/s is either the fastest your client can support or the fastest your server can support.

300MB/s is extremely fast, so much so that I wonder if you haven't confused megabytes with megabits.
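For context, the megabyte/megabit arithmetic works out as follows (decimal units assumed):

```python
# A 10Gb (gigabit) link tops out around 1250 megabytes per second,
# so a sustained 300MB/s is plausible but well below line rate.
# 300 megabits per second, by contrast, is only 37.5MB/s.
link_MBps = 10_000 / 8      # 10Gb/s expressed in MB/s
mbit_as_MBps = 300 / 8      # 300Mb/s expressed in MB/s
print(link_MBps)            # 1250.0
print(mbit_as_MBps)         # 37.5
```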
