Python multiprocessing Queue memory management
Say I have the Main process and 2 additional processes, A and B. In this program, A is supposed to send data to B. If we have some code like this:
from multiprocessing import Process, Queue

def process_a(iterable, q):
    for x in iterable:
        q.put(x)

def process_b(q):
    while some_condition():
        x = q.get()

iterable = some_iterable()
q = Queue()
pa = Process(target=process_a, args=(iterable, q))
pb = Process(target=process_b, args=(q,))
pa.start()
pb.start()
pa.join()
pb.join()
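For reference, here is a runnable variant of the sketch above. The concrete iterable, the sentinel that tells B when to stop, and the extra result queue are my additions (the original leaves some_iterable and some_condition unspecified); the __main__ guard is needed for start methods that re-import the module, such as spawn on Windows.

```python
from multiprocessing import Process, Queue

SENTINEL = None  # hypothetical end-of-stream marker; not in the original sketch

def process_a(iterable, q):
    # A sends every item, then the sentinel so B knows the stream is done.
    for x in iterable:
        q.put(x)
    q.put(SENTINEL)

def process_b(q, result_q):
    # B reads until the sentinel; here it just sums what it receives.
    total = 0
    while True:
        x = q.get()
        if x is SENTINEL:
            break
        total += x
    result_q.put(total)

if __name__ == "__main__":
    q = Queue()
    result_q = Queue()
    pa = Process(target=process_a, args=(range(10), q))
    pb = Process(target=process_b, args=(q, result_q))
    pa.start()
    pb.start()
    pa.join()
    pb.join()
    print(result_q.get())  # 45
```

Note that Main only creates the Queue and hands it to the children; it never calls put or get on q itself while A and B are running.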
given that the Queue q was created in the Main process, does the data flow like this?

A => Main => B

If so, is there a way to have a Queue initialized on B and passed to A, such that data goes directly from A to B, skipping Main?
given that the Queue q was created in the Main process, does the data flow like this?
A => Main => B
No. As explained in the docs, a Queue is just an auto-synchronizing wrapper around a Pipe. When you pass a Queue to a child, you're just passing that Pipe and some locks.
And the Pipe is just a wrapper around an operating system pipe. When you pass a Pipe to a child, you're just passing the pipe's file descriptor/handle.
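To make that concrete, here is a minimal sketch (my own, not from the docs) that wires A to B with a Pipe directly: Main creates the connection objects but never sends or receives on the A→B channel. The sentinel and the extra result pipe (used only so Main can display B's answer) are my additions.

```python
from multiprocessing import Process, Pipe

SENTINEL = None  # hypothetical end-of-stream marker

def process_a(conn):
    # A writes into its end of the pipe, then signals the end of the stream.
    for x in range(5):
        conn.send(x)
    conn.send(SENTINEL)

def process_b(conn, out_conn):
    # B reads from its end until the sentinel; here it just sums the items.
    total = 0
    while True:
        x = conn.recv()
        if x is SENTINEL:
            break
        total += x
    out_conn.send(total)

if __name__ == "__main__":
    a_end, b_end = Pipe()                    # the A -> B channel
    out_recv, out_send = Pipe(duplex=False)  # only for reporting B's result
    pa = Process(target=process_a, args=(a_end,))
    pb = Process(target=process_b, args=(b_end, out_send))
    pa.start()
    pb.start()
    pa.join()
    pb.join()
    print(out_recv.recv())  # 10
```

The data travels over the operating system pipe between A's and B's processes; Main holds file descriptors for both ends but never touches them after start().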
Ignoring the locks, process A is basically just writing to a pipe, and process B is just reading from it.

The locks do make things a bit more complicated (and may also mean that process A spins up a hidden background thread), but they still don't involve the main process at all.

Unless the main process calls a method on a queue, it has nothing to do with that queue at all.