Sharing objects across Python's multiprocessing Pool
I can't find any reference to this question, and it seems trivial.
Is it even possible to share objects across the Python workers of a multiprocessing Pool?
Here is a small example:
from multiprocessing import Pool

def work(a):
    return do_work(obj_b)

def main(obj_a, obj_b):
    my_iterable = get_iter(obj_a)
    p = Pool(processes=6)
    res = p.map(work, my_iterable)
Assume get_iter(obj_a) returns an iterable object. How can work know about obj_b?
After reading a lot of material, I came to realize a few things.
Here is the code:
from multiprocessing import Pool, cpu_count

def work(a):
    print("I'm aware of obj_b: {}".format(obj_b))

def initPoolResources(_obj_b):
    # Declare all your shared read-only objects here
    global obj_b
    # Initialize them
    obj_b = _obj_b

def main(obj_a, obj_b):
    # Assume obj_a is an iterable object.
    # We want to create a "shared read-only" object between the pool of processes.
    p = Pool(processes=cpu_count() - 1, initializer=initPoolResources, initargs=(obj_b,))
    result = p.map(work, obj_a)
    p.close()
    p.join()
work(a) is never passed obj_b, yet it is fully aware of it.
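To make the pattern above concrete, here is a minimal, runnable sketch of the same initializer technique. The worker function, the shared value (10 here), and the inputs are all placeholders chosen for illustration; the key point is that the initializer runs once per worker process and installs the shared object as a module-level global, so it does not need to be pickled with every task.

```python
from multiprocessing import Pool

def work(x):
    # obj_b was installed as a module-level global by the initializer,
    # so every worker process can read it directly.
    return x * obj_b

def init_pool(shared_b):
    # Runs once in each worker process when the pool starts.
    global obj_b
    obj_b = shared_b

def main():
    # Note the trailing comma: initargs must be a tuple.
    with Pool(processes=2, initializer=init_pool, initargs=(10,)) as p:
        return p.map(work, [1, 2, 3])

if __name__ == "__main__":
    print(main())  # → [10, 20, 30]
```

The `if __name__ == "__main__"` guard matters on platforms that use the "spawn" start method (e.g. Windows, recent macOS), where each worker re-imports the main module.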