
Running multiple Python files in parallel

The following is my code. It's quite simple and straightforward, and the goal is to run three files at the same time.

When I run it, it only runs the last process (each file should generate a CSV file).

import multiprocessing
import subprocess

def worker(file):
    subprocess.call(["python", file])


if __name__ == '__main__':
    files = ["launch_day_t1.py","launch_day_t2.py","launch_day_t3.py"]
    for i in files:
        p = multiprocessing.Process(target=worker(i))
        p.start()

As you can see, I have three files ("launch_day_t1.py", "launch_day_t2.py", and "launch_day_t3.py") that I wish to run.

My question is: is this the best way to run scripts in parallel, or is there a better approach?

Thanks!

As an aside, the immediate bug in your loop is that multiprocessing.Process(target=worker(i)) calls worker(i) right away, in the parent process, and passes its return value (None) as the target, so nothing actually runs in parallel; the correct form is target=worker, args=(i,), as sketched at the end of this answer.

That said, the best approach here would be to refactor the code in "launch_day_t1.py" so that the parts you need to call are wrapped in functions, and the command-line code is guarded by if __name__ == "__main__": ... something that looks like:

#launch_day_t1.py
def do_stuff():
    pass
def do_more_stuff():
    pass
if __name__ == "__main__":
    do_stuff()
    do_more_stuff()

Then, instead of calling the file from the command line, you can import the parts you need and call them from your existing Python interpreter, which makes things like gathering up the output much easier.

import multiprocessing
import launch_day_t1
import launch_day_t2
import launch_day_t3

def worker(module):
    # Call the refactored functions from the imported module
    module.do_stuff()
    return module.do_more_stuff()


if __name__ == '__main__':
    modules = [launch_day_t1, launch_day_t2, launch_day_t3]
    # One pool worker per module; map runs them in parallel
    # and collects each do_more_stuff() return value
    with multiprocessing.Pool(len(modules)) as p:
        print(p.map(worker, modules))
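
For reference, if you would rather keep launching the scripts as separate interpreter processes with subprocess, the original loop only needs two fixes: pass the function and its argument separately (target=worker(i) calls worker immediately and hands Process a target of None), and join the processes so the parent waits for the CSV files to be written. A minimal sketch using the file names from the question:

import multiprocessing
import subprocess

def worker(file):
    # Run one script in its own Python interpreter
    subprocess.call(["python", file])


if __name__ == '__main__':
    files = ["launch_day_t1.py", "launch_day_t2.py", "launch_day_t3.py"]
    processes = []
    for f in files:
        # Pass the callable and its argument separately so the
        # script runs in the child process, not in this loop
        p = multiprocessing.Process(target=worker, args=(f,))
        p.start()
        processes.append(p)
    for p in processes:
        p.join()  # wait for all three scripts to finish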
