Python multiprocessing: share large data through a Queue?
I would like to put output data into a queue in a multiprocessing computation. It seems that when the returned data is too large, the program gets stuck. To illustrate the problem, here is a minimal example. Can anyone help make this work?
from multiprocessing import Process, Queue
import numpy as np

def foo(q, qid):
    x = np.random.randint(0, 5, 7)
    y = np.random.random(100 * 10 * 10).reshape(100, 10, 10)
    q.put([qid, x, y])

def main():
    processes = []
    q = Queue()
    for qid in range(5):
        p = Process(target=foo, args=(q, qid))
        p.start()
        processes.append(p)
    for process in processes:
        process.join()
    for qid in range(5):
        [_, x, y] = q.get()
        print(x)
        print(y)

if __name__ == '__main__':
    main()
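For context, the hang here is a documented multiprocessing pitfall: a child process that has put a large object on a Queue does not terminate until the data is flushed to the underlying pipe, so calling join() before draining the queue can deadlock. A minimal sketch of the effect (the helper names _worker and demonstrate are mine, and the 'fork' context assumes a Unix host):

```python
from multiprocessing import get_context

def _worker(q):
    # Put ~2 MB on the queue -- far more than a typical OS pipe
    # buffer, so the queue's feeder thread blocks until a consumer
    # drains the data.
    q.put(b'x' * (2 * 1024 * 1024))

def demonstrate():
    ctx = get_context('fork')  # assumes a Unix host
    q = ctx.Queue()
    p = ctx.Process(target=_worker, args=(q,))
    p.start()
    p.join(timeout=2)                # would block forever without the timeout
    alive_before_get = p.is_alive()  # child still alive: feeder thread is stuck
    data = q.get()                   # draining the queue unblocks the child
    p.join()                         # now the join succeeds
    return alive_before_get, len(data), p.exitcode
```

Running demonstrate() shows the child is still alive after the timed-out join, and only exits cleanly once the queue has been read.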
I figured out one solution, shown below: swap the order of join and get. By default, the get method blocks.
from multiprocessing import Process, Queue
import numpy as np

def foo(q, qid):
    x = np.random.randint(0, 5, 7)
    y = np.random.random(100 * 10 * 10).reshape(100, 10, 10)
    q.put([qid, x, y])

def main():
    processes = []
    q = Queue()
    for qid in range(5):
        p = Process(target=foo, args=(q, qid))
        p.start()
        processes.append(p)
    for qid in range(5):
        [_, x, y] = q.get()
        print(x)
        print(y)
    for process in processes:
        process.join()

if __name__ == '__main__':
    main()
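An alternative that avoids managing the Queue and the join/get ordering by hand is to let a Pool collect the return values; pool.map blocks until every worker's result has been transferred back, so there is no join-before-get deadlock. A sketch, assuming it is acceptable to return the arrays from the worker instead of calling q.put (the helper name collect is mine):

```python
from multiprocessing import Pool
import numpy as np

def foo(qid):
    # Same payload as the question, but returned instead of put on a queue.
    x = np.random.randint(0, 5, 7)
    y = np.random.random(100 * 10 * 10).reshape(100, 10, 10)
    return qid, x, y

def collect():
    # The Pool manages the result pipes internally and map() only
    # returns once all results have been received in the parent.
    with Pool(5) as pool:
        return pool.map(foo, range(5))

if __name__ == '__main__':
    for qid, x, y in collect():
        print(qid, x, y.shape)
```

A further advantage is that map() preserves input order, so the results come back sorted by qid without needing the id tag at all.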