python multiprocessing script does not exit
I am trying to get a bit more comfortable with Python 2.7's multiprocessing module. So I wrote a small script that takes a list of filenames and a desired number of processes as input, and then starts several processes that apply a function to each filename drawn from a queue. It looks like this:
import multiprocessing, argparse, sys
from argparse import RawTextHelpFormatter

def parse_arguments():
    descr = '%r\n\nTest different functions of multiprocessing module\n%r' % ('_'*80, '_'*80)
    parser = argparse.ArgumentParser(description=descr.replace("'", ""), formatter_class=RawTextHelpFormatter)
    parser.add_argument('-f', '--files', help='list of filenames', required=True, nargs='+')
    parser.add_argument('-p', '--processes', help='number of processes for script', default=1, type=int)
    args = parser.parse_args()
    return args

def print_names(name):
    print name

###MAIN###
if __name__ == '__main__':
    args = parse_arguments()
    q = multiprocessing.Queue()
    procs = args.processes
    proc_num = 0
    for name in args.files:
        q.put(name)
    while q.qsize() != 0:
        for x in xrange(procs):
            proc_num += 1
            file_name = q.get()
            print 'Starting process %d' % proc_num
            p = multiprocessing.Process(target=print_names, args=(file_name,))
            p.start()
            p.join()
            print 'Process %d finished' % proc_num
The script works fine and starts a new process whenever an old one finishes (at least I think that is what it does?), until all objects in the queue are used up. However, the script does not exit once the queue is empty; instead it sits idle and I have to kill it with Ctrl+C. What is the problem here?

Thanks for your answers!
It looks like you are mixing a few things up here. You spawn a process, let it do its work, and then wait for it to exit before starting a new one in the next iteration. With this approach you are stuck in sequential processing; no actual multiprocessing takes place.
Maybe you want to take this as a starting point:
import sys
import os
import time
import multiprocessing as mp

def work_work(q):
    # Draw work from the queue
    item = q.get()
    while item:
        # Print own process id and the item drawn from the queue
        print(os.getpid(), item)
        # Sleep is only for demonstration here. Usually, you
        # do not want to use this! In this case, it gives the processes
        # the chance to "work" in parallel; otherwise one process
        # would have finished the entire queue before a second one
        # could be spawned, because this work is quickly done.
        time.sleep(0.1)
        # Draw new work
        item = q.get()

if __name__ == '__main__':
    nproc = 2   # Number of processes to be used
    procs = []  # List to keep track of all processes
    work = [chr(i + 65) for i in range(5)]
    q = mp.Queue()  # Create a queue...
    for w in work:
        q.put(w)    # ...and fill it with some work.
    for _ in range(nproc):
        # One sentinel per process: q.get() would block forever on an
        # empty queue, so each worker exits once it draws a None.
        q.put(None)
    for _ in range(nproc):
        # Spawn new processes and pass each of them a reference
        # to the queue where they can pull their work from.
        procs.append(mp.Process(target=work_work, args=(q,)))
        # Start the process just created.
        procs[-1].start()
    for p in procs:
        # Wait for all processes to finish their work. They only
        # exit once they have drawn their sentinel from the queue.
        p.join()