
Python - Capturing stdout of multiple Popen subprocesses in real time to a file

I am able to capture the output of a process called by Popen in real time using the following code:

import subprocess
import sys

p = subprocess.Popen(args,
                     stdout=subprocess.PIPE,
                     stderr=subprocess.PIPE)

for line in iter(p.stdout.readline,''):
    sys.stdout.write(line)

output, error = p.communicate()
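
Note: the '' sentinel matches Python 2, where readline() on a pipe returns str. On Python 3 the pipe yields bytes, so the sentinel would need to be b'', or the pipe opened in text mode. A minimal sketch of the text-mode variant, assuming the same args as above:

import subprocess
import sys

# universal_newlines=True makes the pipe yield str on Python 3,
# so the '' sentinel still terminates the loop
p = subprocess.Popen(args,
                     stdout=subprocess.PIPE,
                     stderr=subprocess.PIPE,
                     universal_newlines=True)

for line in iter(p.stdout.readline, ''):
    sys.stdout.write(line)

output, error = p.communicate()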

This works great. However, I now have multiple processes running using the code below, so I need to capture the stdout to a file for each process:

import subprocess
import time

stdout_list = []
process_list = []

for mapped_file in range(1, 3):

    #Fire the command in parallel
    stdout = open(outfile + '.stdout', 'w')
    p = subprocess.Popen(args,
                         stdout=subprocess.PIPE,
                         stderr=subprocess.PIPE)

    stdout_list.append(stdout)
    process_list.append(p)

    #for line in iter(p.stdout.readline,''):
    #   stdout.write(line)

    #Wait for all processes to finish
    while len(process_list) > 0:
        for process in process_list:

            #Loop through the processes to wait until all tasks have finished
            if process.poll() is not None:
                output, error = process.communicate()

                #Remove the process from the list because it has finished
                process_list.remove(process)

        #Sleep for 1 second in between iterations
        time.sleep(1)

Including the for line in iter(p.stdout.readline, ''):... code keeps execution stuck in the first iteration of the loop only.

How can I capture the stdout (and presumably the stderr) of each process executed inside my loop to a file, in real time?

Pass a new file object to subprocess.Popen each time you call it. This allows you to divert stdout to a separate file for each process. Here is an example:

import subprocess

procs = []

for i in range(3):
    args = ['echo', "A Message from process #%d" % i]
    #Funnel stdout to a file object, using buffering
    fout = open("stdout_%d.txt" % i, 'w')
    p = subprocess.Popen(args, stdout=fout, bufsize=-1)
    procs.append(p)

#Wait for all to finish
for p in procs:
    p.communicate()

When I run that, I get 3 separate files:

ericu@eric-phenom-linux:~/Documents$ python write_multiple_proc_to_file.py 
ericu@eric-phenom-linux:~/Documents$ ls -l stdout_*
-rw-rw-r-- 1 ericu ericu 26 Feb 23 09:59 stdout_0.txt
-rw-rw-r-- 1 ericu ericu 26 Feb 23 09:59 stdout_1.txt
-rw-rw-r-- 1 ericu ericu 26 Feb 23 09:59 stdout_2.txt
ericu@eric-phenom-linux:~/Documents$ cat stdout_*.txt
A Message from process #0
A Message from process #1
A Message from process #2
ericu@eric-phenom-linux:~/Documents$ 
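
One detail the example glosses over: the files it opens are never closed. A small variant (my addition, not part of the original answer) keeps each file object alongside its process and closes it once the process exits:

import subprocess

procs = []
for i in range(3):
    args = ['echo', "A Message from process #%d" % i]
    fout = open("stdout_%d.txt" % i, 'w')
    # Keep the Popen object and its file together so the file can be closed later
    procs.append((subprocess.Popen(args, stdout=fout), fout))

# Wait for each process, then release its file handle
for p, fout in procs:
    p.wait()
    fout.close()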

Popen's stdin and stdout arguments accept file objects, so you can just pass the opened file in there.

From the docs:

stdin, stdout and stderr specify the executed program's standard input, standard output and standard error file handles, respectively. Valid values are PIPE, an existing file descriptor (a positive integer), an existing file object, and None. PIPE indicates that a new pipe to the child should be created. [...]
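
Putting both answers together for the loop in the question: open one file per stream, hand them straight to Popen, and poll the processes afterwards. A minimal sketch; the commands and file names here are placeholders, not the original outfile logic:

import subprocess
import time

# Placeholder commands standing in for the real args of the question
commands = [['echo', 'task %d' % i] for i in range(1, 3)]
process_list = []

for i, args in enumerate(commands, start=1):
    # One file per stream; the child writes to them directly,
    # so output lands in the files as the child produces and flushes it
    out = open('task_%d.stdout' % i, 'w')
    err = open('task_%d.stderr' % i, 'w')
    process_list.append((subprocess.Popen(args, stdout=out, stderr=err), out, err))

# Poll until every process has exited, then close its files
while process_list:
    for entry in process_list[:]:   # iterate over a copy so removal is safe
        p, out, err = entry
        if p.poll() is not None:
            out.close()
            err.close()
            process_list.remove(entry)
    if process_list:
        time.sleep(1)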
