I have more than 20 independent scripts that I need to execute, and I don't know how many processes it is good practice to run in parallel:
import os
import glob
from multiprocessing import Pool
processos = (
'p1.py',
'p2.py',
'p3.py',
'p4.py',
'p5.py',
'p6.py',
'p7.py',
'p8.py',
'p9.py',
'p10.py',
'p11.py',
'p12.py',
'p13.py',
'p14.py',
'p15.py',
'p16.py',
'p17.py',
'p18.py',
'p19.py',
'p20.py',
'p21.py',
'p22.py',
'p23.py',
'p24.py'
)
def roda_processo(processo):
    os.system('python {}'.format(processo))

pool = Pool(processes=24)
pool.map(roda_processo, processos)
I would like to know the most effective way to run these scripts.
Detail: these processes will run on a schedule.
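A few refinements of the snippet above, as a sketch rather than the only way: size the pool with os.cpu_count() instead of hard-coding 24, use subprocess.run (which returns an object whose exit code you can inspect, unlike os.system), launch the children with sys.executable so they use the same interpreter, and guard the entry point so the module is safe under multiprocessing's spawn start method on Windows. The filter on os.path.exists is an extra assumption so missing scripts are skipped instead of failing.

```python
import os
import subprocess
import sys
from multiprocessing import Pool

def roda_processo(processo):
    # sys.executable launches the child with the same interpreter as this script.
    resultado = subprocess.run([sys.executable, processo])
    return processo, resultado.returncode

if __name__ == '__main__':
    # Only keep the scripts that actually exist in the current directory.
    processos = [p for p in ('p{}.py'.format(i) for i in range(1, 25))
                 if os.path.exists(p)]
    # One worker per CPU: a reasonable default when the scripts are CPU-bound.
    with Pool(processes=os.cpu_count()) as pool:
        for nome, codigo in pool.map(roda_processo, processos):
            print(nome, 'exited with code', codigo)
```

If the scripts are mostly waiting on I/O rather than burning CPU, a larger pool than os.cpu_count() is usually fine.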
Why not use shell?
seq 1 24 | xargs -L1 -I{} -P`nproc` python p{}.py
From the xargs man page:

-P maxprocs
    Parallel mode: run at most maxprocs invocations of utility at once.

And nproc:

nproc
    print the number of processing units available
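If the scripts don't follow a strict numeric sequence, the file names themselves can feed xargs instead of seq. A sketch, assuming GNU find/xargs (-print0, -0 and -r are extensions, though BSD xargs skips empty input by default):

```shell
# NUL-delimited names survive spaces in file names; -n1 runs one script per
# python invocation, -P caps concurrency at the CPU count, and -r makes
# xargs do nothing at all when no p*.py file matches.
find . -maxdepth 1 -name 'p*.py' -print0 | xargs -0 -r -n1 -P"$(nproc)" python
```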
Using GNU Parallel you can do:
parallel python ::: p*.py
It will spawn one process per CPU thread.
If your processes only use 30% CPU power (e.g. because they spend the rest of the time waiting for the network), it makes sense to spawn 3 times as many:
parallel -j300% python ::: p*.py
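The same oversubscription idea carries over to the Python version. Since the real work happens in the child processes, threads are enough on the parent side; the 3x factor below mirrors -j300% and is an assumption to tune, not a rule.

```python
import os
import subprocess
import sys
from concurrent.futures import ThreadPoolExecutor

def roda(processo):
    # The thread only waits on the child process, so threads scale cheaply here.
    return subprocess.run([sys.executable, processo]).returncode

if __name__ == '__main__':
    # Rough equivalent of parallel -j300%: three workers per CPU core.
    trabalhadores = 3 * (os.cpu_count() or 1)
    scripts = [p for p in ('p{}.py'.format(i) for i in range(1, 25))
               if os.path.exists(p)]
    with ThreadPoolExecutor(max_workers=trabalhadores) as executor:
        codigos = list(executor.map(roda, scripts))
```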