
How to run multiple functions sequentially with multiprocessing in Python?

I'm trying to run multiple functions with multiprocessing and I'm running into a bit of a wall. I want to run an initial function to completion on all processes/inputs, and then run 2 or 3 other functions in parallel on the output of that first function. I already have my search function; the code below is only there for the sake of explanation.

I'm not sure how to continue the code from here. I've put my initial attempt below. I want all instances of process1 to finish and then process2 and process3 to start in parallel.

Code is something like:

import multiprocessing
from multiprocessing import Pool


def init(*args):
    global working_dir
    [working_dir] = args

def process1(InFile):
    # do stuff with InFile and save the output in working_dir
    pass

def process2(queue):
    inputfiles2 = []
    # use the search function to append the output of process1 to inputfiles2
    # do stuff with the process1 output and produce output
    pass

def process3(queue):
    inputfiles2 = []
    # use the search function to append the output of process1 to inputfiles2
    # do stuff with the process1 output and produce output
    pass

def MCprocess():
    working_dir = input("enter input: ")
    inputfiles1 = []
    # use the search function to append the files in working_dir to inputfiles1
    with Pool(initializer=init, initargs=[working_dir], processes=16) as pool:
        pool.map(process1, inputfiles1)
        pool.close()
    
    # Edited code
    queue = multiprocessing.Queue()
    queue.put(working_dir)
    queue.put(working_dir)
    ProcessTwo = multiprocessing.Process(target=process2, args=(queue,))
    ProcessThree = multiprocessing.Process(target=process3, args=(queue,))
    ProcessTwo.start()
    ProcessThree.start()

    # OLD CODE
    # with Pool(initializer=init, initargs=[working_dir], processes=16) as pool:
    #     pool.map_async(process2)
    #     pool.map_async(process3)
    


if __name__ == '__main__':
    MCprocess()

Your best bet is to use an Event. The first process calls event.set() when it is done, to indicate that the event has happened. The waiting processes call event.wait() (or one of its variants) to block until the event has been set.
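A minimal sketch of that Event-based idea, under simplified assumptions: the names first_stage, second_stage, and done are placeholders for illustration, not the asker's actual process1/process2/process3 code.

import multiprocessing

def first_stage(done_event):
    # ... run the initial work to completion here ...
    done_event.set()   # signal that the first stage has finished

def second_stage(done_event):
    done_event.wait()  # block until first_stage has called set()
    # ... work on the output of the first stage here ...

if __name__ == '__main__':
    done = multiprocessing.Event()
    p1 = multiprocessing.Process(target=first_stage, args=(done,))
    p2 = multiprocessing.Process(target=second_stage, args=(done,))
    p3 = multiprocessing.Process(target=second_stage, args=(done,))
    for p in (p1, p2, p3):
        p.start()
    for p in (p1, p2, p3):
        p.join()

Because p2 and p3 both wait on the same event, they only begin their real work after p1 has signalled completion, while still running in parallel with each other.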
