
Python continuous parallel execution

Looking to build a Python script that runs an infinite read loop from stdin, like for line in sys.stdin: . For each iteration, I would like to get a worker from a pool that executes in the background, using the line as input. When the worker finishes (or times out), it prints to stdout.

I am having a hard time finding a worker-pool module that can run continuously. For example, multiprocessing.Pool only offers methods like join that wait for all workers to finish all tasks. For the above specification, I cannot know all the tasks ahead of time, and need to assign work to background processes as it arrives.

This will run forever:

import sys
from multiprocessing import Pool


def function(line):
    """Process the line in a separate process."""
    print(line)


pool = Pool()

# Iterate over sys.stdin directly; sys.stdin.readline() returns a single
# string, so looping over it would iterate character by character.
# function must also be defined before it is submitted to the pool.
for line in sys.stdin:
    pool.apply_async(function, args=[line])
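The question also asks that results be printed as each task finishes or times out, which apply_async alone does not handle. One way to sketch this, assuming a hypothetical per-line task work and helper process_stream (neither is from the original post): apply_async accepts a callback that the pool invokes in the main process as each result becomes ready, so output appears in completion order while new lines keep being submitted.

```python
import sys
from multiprocessing import Pool


def work(line):
    """Hypothetical per-line task; substitute the real processing."""
    return "Worker got %r" % line.strip()


def process_stream(stream, handle=print, processes=4):
    """Submit each line as it arrives; the pool calls `handle` with each
    result as soon as that task finishes, without blocking submission."""
    with Pool(processes=processes) as pool:
        pending = [pool.apply_async(work, (line,), callback=handle)
                   for line in stream]
        pool.close()          # no more tasks will be submitted
        for res in pending:
            res.wait()        # ensure every callback has fired
        pool.join()


if __name__ == "__main__":
    process_stream(sys.stdin)
```

For a per-task timeout, keep the AsyncResult objects (as pending does here) and call res.get(timeout=...) instead of res.wait(), catching multiprocessing.TimeoutError; note this waits on tasks in submission order, not completion order.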

Using Pool and imap might make it easier, but you have to pick a fixed number of worker processes (processes=5):

import multiprocessing
import sys


def worker(line):
    return "Worker got %r" % (line)


pool = multiprocessing.Pool(processes=5)
for result in pool.imap(worker, sys.stdin):
    print("Result: %r" % (result,))
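One caveat with imap: it yields results in input order, so a slow task delays the output of faster ones behind it. If order does not matter, imap_unordered yields each result as soon as any worker finishes, which is closer to the "print on completion" requirement. A sketch, wrapped in a hypothetical run helper so it works over any iterable of lines (the .strip() is an addition for readability, not in the original answer):

```python
import multiprocessing


def worker(line):
    # Strip the trailing newline before formatting (added for readability).
    return "Worker got %r" % line.strip()


def run(stream, processes=5):
    """Print and collect results in completion order, not input order."""
    results = []
    with multiprocessing.Pool(processes=processes) as pool:
        # imap_unordered yields results as soon as any worker finishes.
        for result in pool.imap_unordered(worker, stream):
            print("Result: %r" % (result,))
            results.append(result)
    return results


if __name__ == "__main__":
    import sys
    run(sys.stdin)
```

Both imap and imap_unordered pull lazily from the input iterable, so feeding them sys.stdin keeps submitting work for as long as input arrives.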
