
How to use python script to start other python scripts and check whether all “child scripts” are finished?

Basically, I want to use a script to start other scripts in different directories and do follow-up work once all the "child scripts" are finished. I use multiprocessing.Pool to run all the child processes together and wait() to block until they are all done. Here is my script:

import os
import subprocess
import multiprocessing

rootdir = os.getcwd()
argument = []

def Openpy(inputinfo):
    dirpath, filename = inputinfo
    os.chdir(dirpath)
    return subprocess.Popen(['python', filename])

if __name__ == "__main__":
    for dirpath, dirnames, filenames in os.walk(rootdir):
        for filename in filenames:
            if filename == 'Test.py':
                argument.append([dirpath, filename])
    print(argument)
    po = multiprocessing.Pool()
    r = po.map_async(Openpy, argument)
    po.close()
    r.wait()

print('Child scripts are finished')

The child processes start normally and finish within a minute. However, the parent process never seems to detect that they have finished: it never prints "Child scripts are finished". What can I change in my code so that it detects the end of the child processes properly?

All help is appreciated!

2013/9/1 9:16 UTC+8

JF Sebastian's answer is accepted. It runs successfully, and the elegant use of a list comprehension makes the whole thing more readable.

Try swapping the locations of r.wait() and po.close():

if __name__ == "__main__":
    for dirpath, dirnames, filenames in os.walk(rootdir):
        for filename in filenames:
            if filename == 'Test.py':
                argument.append([dirpath, filename])
    print(argument)
    po = multiprocessing.Pool()
    r = po.map_async(Openpy, argument)
    r.wait()
    po.close()

It might not be your problem, but it looks like you may be closing the pool before the result is evaluated.

Popen() returns immediately, so your parent process finishes long before your child scripts do, i.e., look for the "finished message" at the very beginning of the output.
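You can see this directly with a small sketch: the Popen() call comes back almost instantly, and only .wait() actually blocks until the child exits (the sleeping -c child here stands in for a Test.py script):

```python
import sys
import time
from subprocess import Popen

# Popen() returns as soon as the child is launched, not when it exits.
start = time.time()
p = Popen([sys.executable, '-c', 'import time; time.sleep(1)'])
launch_elapsed = time.time() - start   # well under one second

p.wait()                               # blocks until the child exits
total_elapsed = time.time() - start    # roughly one second
print(launch_elapsed, total_elapsed)
```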

You don't need multiprocessing to run subprocesses concurrently:

import os
import sys
from subprocess import Popen

# run all child scripts in parallel
processes = [Popen([sys.executable, filename], cwd=dirpath)
             for dirpath, dirnames, filenames in os.walk('.')
             for filename in filenames
             if filename == 'Test.py']

# wait until they finish
for p in processes:
    p.wait()
print("all done")
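If you also want to know whether any child script failed, the same pattern can collect the exit codes, since p.wait() returns the child's return code. A small self-contained sketch, where the sys.exit one-liners stand in for Test.py scripts:

```python
import sys
from subprocess import Popen

# Launch three children that exit with known codes (0 = success).
processes = [Popen([sys.executable, '-c', 'import sys; sys.exit(%d)' % code])
             for code in (0, 0, 1)]

# wait() blocks until each child exits and returns its exit code
returncodes = [p.wait() for p in processes]
print(returncodes)  # [0, 0, 1]

failed = sum(rc != 0 for rc in returncodes)
print("%d child script(s) failed" % failed)  # 1 child script(s) failed
```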

See also Python threading multiple bash subprocesses?
