Run separate processes in parallel - Python

I use the Python 'multiprocessing' module to run single processes on multiple cores, but I want to run a couple of independent processes in parallel.

For example, Process-one parses large files, Process-two finds patterns in different files, and Process-three does some calculation. Can all three of these different processes, which take different sets of arguments, be run in parallel?

import sys

def Process1(largefile):
    # parse large file (runtime ~2 hrs)
    return parsed_file

def Process2(bigfile):
    # find pattern in big file (runtime ~2.5 hrs)
    return pattern

def Process3(integer):
    # do astronomical calculation (runtime ~2.25 hrs)
    return calculation_results

def FinalProcess(parsed, pattern, calc_results):
    # do analysis (runtime ~10 min)
    return final_results

def main():
    parsed = Process1(largefile)
    pattern = Process2(bigfile)
    calc_res = Process3(integer)
    final = FinalProcess(parsed, pattern, calc_res)

if __name__ == '__main__':
    main()
    sys.exit()

In the above pseudo-code, Process1, Process2 and Process3 are single-core processes, i.e. none of them can be spread across multiple processors. Run sequentially, they take 2 + 2.5 + 2.25 = 6.75 hours. Is it possible to run these three processes in parallel, so that they run at the same time on different processors/cores, and FinalProcess starts as soon as the longest-running one (Process2) finishes?
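
A rough, self-contained sketch of that layout, assuming Python 3 and using multiprocessing.Process with a Manager dict to collect return values (the task bodies, sleeps and file names below are placeholders, not the real Process1/Process2/Process3):

import multiprocessing as mp
import time

# Placeholder single-core tasks standing in for Process1/Process2/Process3;
# the sleeps only simulate long runtimes.
def parse_large_file(path):
    time.sleep(2)
    return 'parsed:' + path

def find_pattern(path):
    time.sleep(3)
    return 'pattern:' + path

def calculate(n):
    time.sleep(2)
    return n * n

def run_task(func, arg, key, results):
    # run one task in its own process and store its result under `key`
    results[key] = func(arg)

if __name__ == '__main__':
    manager = mp.Manager()
    results = manager.dict()      # shared dict to collect return values

    jobs = [
        mp.Process(target=run_task, args=(parse_large_file, 'large.txt', 'parsed', results)),
        mp.Process(target=run_task, args=(find_pattern, 'big.txt', 'pattern', results)),
        mp.Process(target=run_task, args=(calculate, 42, 'calc', results)),
    ]
    for job in jobs:
        job.start()               # all three tasks run at the same time
    for job in jobs:
        job.join()                # waits roughly as long as the slowest task

    print(dict(results))

With this layout the wall-clock time for the three tasks is roughly the longest single runtime rather than the sum of all three.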

From 16.6.1.5. Using a pool of workers:

from multiprocessing import Pool

def f(x):
    return x*x

if __name__ == '__main__':
    pool = Pool(processes=4)              # start 4 worker processes
    result = pool.apply_async(f, [10])    # evaluate "f(10)" asynchronously
    print result.get(timeout=1)           # prints "100" unless your computer is *very* slow
    print pool.map(f, range(10))          # prints "[0, 1, 4,..., 81]"

You can, therefore, apply_async against a pool and get your results after everything is ready.

from multiprocessing import Pool

# all your methods declarations above go here
# (...)

def main():
    pool = Pool(processes=3)
    parsed = pool.apply_async(Process1, [largefile])
    pattern = pool.apply_async(Process2, [bigfile])
    calc_res = pool.apply_async(Process3, [integer])

    pool.close()
    pool.join()

    final = FinalProcess(parsed.get(), pattern.get(), calc_res.get())

# your __main__ handler goes here
# (...)
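
For comparison, a roughly equivalent sketch on Python 3, assuming the same Process1/Process2/Process3/FinalProcess functions and arguments as above, but using concurrent.futures.ProcessPoolExecutor instead of multiprocessing.Pool:

from concurrent.futures import ProcessPoolExecutor

# Process1/Process2/Process3/FinalProcess and their arguments are assumed
# to be defined above, exactly as in the pseudo-code from the question.
def main():
    with ProcessPoolExecutor(max_workers=3) as pool:
        parsed = pool.submit(Process1, largefile)      # all three start right away
        pattern = pool.submit(Process2, bigfile)
        calc_res = pool.submit(Process3, integer)

        # result() blocks until the corresponding task has finished, so
        # FinalProcess runs once the slowest of the three is done
        final = FinalProcess(parsed.result(), pattern.result(), calc_res.result())

Either way, the three long tasks overlap, so the wall-clock time is roughly the 2.5 hours of the slowest task plus the 10 minutes of FinalProcess, instead of the sequential 6.75 hours.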
