Distribute many independent, expensive operations over multiple cores in python
I am trying to run a single external program with multiple arguments in Python. For this I use something like
for i in range(10):
    subprocess.Popen(["./foo", str(i)])
These processes are completely independent. However, they only seem to run on a single core on my Mac OS X. Is there a simple way to distribute the processes over all the cores (4 in my case)?
Have a look at the multiprocessing module.
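As a minimal sketch of the idea (the function name `square` is just an illustration, not from the original post): `multiprocessing.Pool` starts one worker process per CPU core by default, and `Pool.map` distributes the calls across those workers.

```python
from multiprocessing import Pool


def square(x):
    """CPU-bound work that will run in a separate worker process."""
    return x * x


if __name__ == '__main__':
    # Pool() defaults to one worker process per available core.
    with Pool() as pool:
        results = pool.map(square, range(10))
    print(results)  # [0, 1, 4, 9, 16, 25, 36, 49, 64, 81]
```

The `if __name__ == '__main__'` guard matters: on platforms that spawn rather than fork, worker processes re-import the main module, and the guard prevents them from recursively creating pools.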
Here is an example of a program that uses multiprocessing to convert multiple images in parallel:
"""Convert DICOM files to PNG format, remove blank areas. The blank erea
removal is based on the image size of a Philips flat detector. The image
goes from 2048x2048 pixels to 1574x2048 pixels."""
import os
import sys
import subprocess
from multiprocessing import Pool, Lock
globallock = Lock()
def checkfor(args):
"""Make sure that a program necessary for using this script is
available."""
if isinstance(args, str):
args = args.split()
try:
f = open('/dev/null')
subprocess.call(args, stderr=subprocess.STDOUT, stdout=f)
f.close()
except:
print "Required program '{}' not found! exiting.".format(args[0])
sys.exit(1)
def processfile(fname):
"""Use the convert(1) program from the ImageMagick suite to convert the
image and crop it."""
size = '1574x2048'
args = ['convert', fname, '-units', 'PixelsPerInch', '-density', '300',
'-crop', size+'+232+0', '-page', size+'+0+0', fname+'.png']
rv = subprocess.call(args)
globallock.acquire()
if rv != 0:
print "Error '{}' when processing file '{}'.".format(rv, fname)
else:
print "File '{}' processed.".format(fname)
globallock.release()
def main(argv):
"""Main program.
Keyword arguments:
argv -- command line arguments
"""
if len(argv) == 1:
path, binary = os.path.split(argv[0])
print "Usage: {} [file ...]".format(binary)
sys.exit(0)
checkfor('convert')
p = Pool()
p.map(processfile, argv[1:])
p.close()
## This is the main program ##
if __name__ == '__main__':
main(sys.argv)
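Applied to the original question, the same pattern works with subprocess directly: hand the argument values to `Pool.map` and let each worker run one invocation. Note that the elements of a subprocess argument list must be strings, so the `i` from the question needs `str(i)`. A sketch, assuming `./foo` is the asker's program:

```python
import subprocess
from multiprocessing import Pool


def run_foo(i):
    """Run one ./foo invocation and return its exit code."""
    # subprocess argument lists must contain strings, not ints.
    return subprocess.call(["./foo", str(i)])


if __name__ == '__main__':
    with Pool() as pool:  # one worker process per core by default
        exit_codes = pool.map(run_foo, range(10))
    print(exit_codes)
```

Unlike the bare `Popen` loop, `Pool.map` also blocks until every invocation has finished and collects the exit codes, so failures are easy to detect.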
Disclaimer: The technical posts on this site are licensed under CC BY-SA 4.0. If you repost, please credit this site or the original source. For any questions, contact: yoyou2525@163.com.