
Wait for all multiprocessing jobs to finish before continuing

I want to run a bunch of jobs in parallel and then continue once all the jobs are finished. I've got something like:

# based on example code from https://pymotw.com/2/multiprocessing/basics.html
import multiprocessing
import random
import time

def worker(num):
    """A job that runs for a random amount of time between 5 and 10 seconds."""
    time.sleep(random.randrange(5,11))
    print('Worker:' + str(num) + ' finished')
    return

if __name__ == '__main__':
    jobs = []
    for i in range(5):
        p = multiprocessing.Process(target=worker, args=(i,))
        jobs.append(p)
        p.start()

    # Iterate through the list of jobs and remove the ones that are finished, checking every second.
    while len(jobs) > 0:
        jobs = [job for job in jobs if job.is_alive()]
        time.sleep(1)

    print('*** All jobs finished ***')

It works, but I'm sure there must be a better way to wait for all the jobs to finish than iterating over them again and again until they are done.

What about:

for job in jobs:
    job.join()

This blocks until the first process finishes, then the next one, and so on. See the documentation of join() for more details.
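Applied to the code in the question, the polling loop can be replaced by a join loop. This is a minimal sketch reusing the worker function and jobs list from above:

if __name__ == '__main__':
    jobs = []
    for i in range(5):
        p = multiprocessing.Process(target=worker, args=(i,))
        jobs.append(p)
        p.start()

    # join() blocks until each process has exited; after the loop, all are done.
    for job in jobs:
        job.join()

    print('*** All jobs finished ***')

The order of the join() calls does not matter: the loop only returns once every process has exited, so the total wait time is governed by the slowest job.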

You can make use of join(). It lets you wait for another process to end.

from multiprocessing import Process

def f(name):
    print('hello', name)   # placeholder worker (f and x are not defined in the answer)

if __name__ == '__main__':
    x = 'alice'            # placeholder argument
    t1 = Process(target=f, args=(x,))
    t2 = Process(target=f, args=('bob',))

    t1.start()
    t2.start()

    t1.join()   # blocks until t1 has finished
    t2.join()   # then blocks until t2 has finished

You can also use a Barrier. It works as it does for threads, letting you specify the number of processes you want to wait on; once that number is reached, the barrier frees them all. Here client and server are assumed to be spawned as Process instances.

from multiprocessing import Barrier

b = Barrier(2, timeout=5)

def server():
    start_server()                 # placeholder, not defined in the answer
    b.wait()                       # block until the client has also reached the barrier
    while True:
        connection = accept_connection()
        process_server_connection(connection)

def client():
    b.wait()                       # block until the server has also reached the barrier
    while True:
        connection = make_connection()
        process_client_connection(connection)
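The snippet above leaves start_server() and the connection helpers undefined. Below is a self-contained sketch of the same idea (my own example with a hypothetical worker, not code from the answer): the Barrier is passed explicitly to the child processes, which also works with the spawn start method, and join() is still what waits for the processes themselves.

import multiprocessing
import time

def rendezvous_worker(barrier, name):
    """Hypothetical worker: wait at the barrier, then do some work."""
    print(name, 'waiting at the barrier')
    barrier.wait()              # blocks until both parties have arrived
    print(name, 'released')
    time.sleep(1)

if __name__ == '__main__':
    barrier = multiprocessing.Barrier(2, timeout=5)
    procs = [multiprocessing.Process(target=rendezvous_worker, args=(barrier, name))
             for name in ('server', 'client')]
    for p in procs:
        p.start()
    for p in procs:
        p.join()                # wait for both processes to finish
    print('*** Both processes finished ***')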

And if you want more functionality, such as sharing data and more flow control, you can use a Manager.
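For example, a Manager can expose a shared list that workers append their results to, while join() still provides the wait-for-everything step. This is a sketch with a hypothetical square-computing worker, not code from the answer:

import multiprocessing

def square_worker(num, results):
    results.append(num * num)   # write into the manager-backed shared list

if __name__ == '__main__':
    with multiprocessing.Manager() as manager:
        results = manager.list()
        jobs = [multiprocessing.Process(target=square_worker, args=(i, results))
                for i in range(5)]
        for p in jobs:
            p.start()
        for p in jobs:
            p.join()
        print('*** All jobs finished ***', list(results))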
