
Writing output to a file with subprocess in Python

I have code which spawns at most 4 processes at a time. It looks for any new jobs submitted and, if any exist, runs the Python code:

for index, row in enumerate(rows):
    if index < 4:
        dirs = row[0]
        dirName = os.path.join(homeFolder, dirs)
        logFile = os.path.join(dirName, dirs + ".log")
        proc = subprocess.Popen(["python", "test.py", dirs], stdout=open(logFile, 'w'))

I have a few questions:

  1. When I try to write the output or errors to the log file, nothing is written until the process finishes. Is it possible to write the output to the file as the process runs? This would help me see what stage the process is at.
  2. When one process finishes, I want the next job in the queue to start, rather than waiting for all child processes to finish before the daemon starts any new ones.

Any help will be appreciated. Thanks!

Concerning point 1, try adjusting the buffering of the log file:

open(logFile, 'w', 1)  # line-buffered: flushed to the file after each complete line
open(logFile, 'w', 0)  # unbuffered: writes immediately (Python 2 only for text mode)

Note that in Python 3, unbuffered mode (`buffering=0`) is only allowed for binary-mode files, e.g. `open(logFile, 'wb', 0)`. If it suits your needs, prefer line buffering over fully unbuffered output.
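To illustrate the line-buffered case, here is a minimal sketch (the file name `progress.log` and the message are made up for the demo). With `buffering=1`, each completed line is flushed to disk as soon as the newline is written, so another reader, such as `tail -f`, sees it while the writer is still open:

```python
# Hedged sketch: line-buffered text file in Python 3.
# buffering=1 flushes the internal buffer every time a newline is written.
log = open("progress.log", "w", buffering=1)
log.write("stage 1 started\n")  # flushed immediately because of the newline

# a second handle already sees the line, even though `log` is still open
with open("progress.log") as f:
    seen = f.read()

log.close()
print(seen, end="")
```

The same `buffering=1` argument can be passed to the `open()` call used for `stdout=` in `Popen`, so the child's output appears in the log line by line, provided the child itself flushes its output (e.g. run it with `python -u`).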

Concerning your general problem, as @Tichodroma suggests, you should try Python's multiprocessing module.
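Since the jobs here are external processes, a worker pool that launches one child per worker achieves the same effect: at most 4 children run at once, and the moment one exits, the next queued job starts. The sketch below uses `concurrent.futures.ThreadPoolExecutor` (threads are fine because each worker just blocks on its subprocess); the job list and the inline `print` command stand in for your real `rows` and `test.py`:

```python
import subprocess
import sys
from concurrent.futures import ThreadPoolExecutor

# hypothetical job list: (directory name, log file path) pairs
jobs = [("job%d" % i, "job%d.log" % i) for i in range(6)]

def run_job(dirs, log_file):
    # each worker blocks until its child exits, freeing its pool slot
    with open(log_file, "w", buffering=1) as log:
        # placeholder child; your code would run ["python", "test.py", dirs]
        return subprocess.call(
            [sys.executable, "-c", "print('processing', %r)" % dirs],
            stdout=log, stderr=subprocess.STDOUT)

# at most 4 children at a time; finished slots are refilled immediately
with ThreadPoolExecutor(max_workers=4) as pool:
    codes = list(pool.map(run_job, *zip(*jobs)))

print(codes)  # exit status of every child
```

`pool.map` keeps 4 jobs in flight and feeds the remaining ones in as workers free up, which is exactly the scheduling asked for in point 2, without polling.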
