python: understanding multiprocessing for a simple for loop
I have never used the multiprocessing module before. Is there a way to turn a for loop into concurrent subprocesses, so that something like

for i in xrange(10):
    list.append(i)

runs in parallel instead of sequentially?
I tried using the Queue module:
q = Queue.Queue()
for i in xrange(10):
    q.put(i)

def addto(q):
    new.append(q.get(block=False))

processes = [Process(target=addto, args=(q,))]
for p in processes:
    p.start()
for p in processes:
    p.join()
And it gave out a long error; I'm pasting the last part of it:
C:\WinPython-64bit-2.7.3.3\python-2.7.3.amd64\lib\pickle.pyc in save_global(self, obj, name, pack)
746 raise PicklingError(
747 "Can't pickle %r: it's not found as %s.%s" %
--> 748 (obj, module, name))
749 else:
750 if klass is not obj:
PicklingError: Can't pickle <type 'thread.lock'>: it's not found as thread.lock
I also see this a lot:
processes = [Process(target=func, args=(q,x)) for i in some iterable]
So okay, there is a func(q, x), and I have a map() or a for/while loop inside my function func(), so why iterate in processes again? I wouldn't want to run the whole function in a process, but just make those particular loops into parallel processes. Why iterate over the target function with args when I have already done q.put?
What if I just do
processes = Process(target=addto, args=(q,)).start()
Queue.Queue is for thread-safe queues, and thread primitives cannot be transferred to other processes. You want multiprocessing.Queue instead; simply replace

import Queue
q = Queue.Queue()

with

import multiprocessing
q = multiprocessing.Queue()

Additionally, new must be of type multiprocessing.managers.list.
However, note that you're just replicating a multiprocessing.Pool; you can just write

import multiprocessing

new = multiprocessing.Manager().list()

def addto(val):
    new.append(val)

pool = multiprocessing.Pool()
for i in xrange(10):
    pool.apply_async(addto, (i,))
pool.close()
pool.join()
print(new)