
Save output of four Python programs running in parallel to different log files

I have four different programs that are started in parallel by a single Python program:

import os
from multiprocessing import Pool

processes = ('p1.py', 'p2.py', 'p3.py', 'p4.py')

def run_process(process):
    os.system('python3 {}'.format(process))

pool = Pool(processes=4)
pool.map(run_process, processes)

Currently I get the log output of all four programs in a single file by running nohup pmain.py > test.log. How can I instead get four separate log files for p1, p2, p3 and p4 respectively?

A simple fix would be:

import os
from multiprocessing import Pool

processes = ('p1.py > p1.log', 'p2.py > p2.log', 'p3.py > p3.log', 'p4.py > p4.log')

def run_process(process):
    os.system('python3 {}'.format(process))

pool = Pool(processes=4)
pool.map(run_process, processes)

But don't do the above: os.system runs each command through a shell and is discouraged in favour of the subprocess module, and relying on shell redirection gives you no direct handle on the child processes or their exit status.

The right way to do this is to use subprocess, with something like:

import os
import subprocess

processes = ('p1.py', 'p2.py', 'p3.py', 'p4.py')

procs = []
for pname in processes:
    # Derive the log file name from the script name, e.g. p1.py -> p1.log
    logfile = os.path.splitext(pname)[0] + '.log'
    with open(logfile, 'w') as f:
        # Start the script with its stdout redirected to its own log file
        proc = subprocess.Popen(['python3', pname], stdout=f)
        procs.append(proc)

# Wait for all four child processes to finish
for proc in procs:
    proc.wait()
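
If you also want tracebacks and other error output from each script in its log, you can redirect stderr as well. This is a minimal sketch, assuming the same p1.py–p4.py scripts and that merging each script's stdout and stderr into one log file is acceptable:

import os
import subprocess

processes = ('p1.py', 'p2.py', 'p3.py', 'p4.py')

procs = []
for pname in processes:
    logfile = os.path.splitext(pname)[0] + '.log'
    f = open(logfile, 'w')
    # stderr=subprocess.STDOUT sends error output to the same log file as stdout
    proc = subprocess.Popen(['python3', pname], stdout=f, stderr=subprocess.STDOUT)
    procs.append((proc, f))

for proc, f in procs:
    proc.wait()
    f.close()

After wait() returns, proc.returncode also tells you whether each script exited cleanly.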
