Wait for all multiprocessing jobs to finish before continuing
I want to run a bunch of jobs in parallel and then continue once all the jobs are finished. I've got something like
```python
# based on example code from https://pymotw.com/2/multiprocessing/basics.html
import multiprocessing
import random
import time

def worker(num):
    """A job that runs for a random amount of time between 5 and 10 seconds."""
    time.sleep(random.randrange(5, 11))
    print('Worker:' + str(num) + ' finished')
    return

if __name__ == '__main__':
    jobs = []
    for i in range(5):
        p = multiprocessing.Process(target=worker, args=(i,))
        jobs.append(p)
        p.start()

    # Iterate through the list of jobs and remove those that are finished,
    # checking every second.
    while len(jobs) > 0:
        jobs = [job for job in jobs if job.is_alive()]
        time.sleep(1)

    print('*** All jobs finished ***')
```
It works, but I'm sure there must be a better way to wait for all the jobs to finish than iterating over them again and again until they are done.
You can make use of `join`. It lets you wait for another process to end.
```python
from multiprocessing import Process

t1 = Process(target=f, args=(x,))
t2 = Process(target=f, args=('bob',))
t1.start()
t2.start()
# join() blocks until the corresponding process has terminated.
t1.join()
t2.join()
```
You can also use a `Barrier`. It works the same way as for threads: you specify the number of processes to wait on, and once that many have reached the barrier, it releases them all. Here `client` and `server` are assumed to be spawned as `Process`es.
```python
b = Barrier(2, timeout=5)

def server():
    start_server()
    b.wait()
    while True:
        connection = accept_connection()
        process_server_connection(connection)

def client():
    b.wait()
    while True:
        connection = make_connection()
        process_client_connection(connection)
```
And if you want more functionality, such as sharing data and more flow control, you can use a `Manager`.