
Python Queue suddenly empty

I'm trying to run a couple of calculations concurrently based on this example . I extended it so that, instead of a simple function, each worker runs some external software via subprocess. The code runs under Python 2.7.6 on Ubuntu 14.04.2 LTS (GNU/Linux 3.16.0-30-generic x86_64).

A Queue is used to collect the result output. It is supposed to be filled by the workers, but appears to be empty once the calculations are done.

The simplified code looks like this:

import subprocess, shlex, os, random, pickle
from Queue import Queue
from multiprocessing import Process
from time import sleep

def mp_solve(problems, nprocs):
    def worker(problem, out_q):
        outdict = []
        cmd = "..." + problem
        args = shlex.split(cmd)
        output, error = subprocess.Popen(args, stdout=subprocess.PIPE, stderr=subprocess.PIPE).communicate()
        outdict = [str(problem), str(output)]
        out_q.put(outdict)
        print out_q.empty() #prints False

    out_q = Queue() #create Queue
    procs = []

    for i in range(nprocs):
        p = Process(
                target=worker,
                args=(problems[i][1], out_q))
        procs.append(p)
        p.start()

    sleep(10) #calculations are limited to 3 seconds through a parameter passed to external program

    print out_q.empty() #prints True

    resultlist = []
    for i in range(nprocs):
        print "going to Q" + str(i)
        try:
            resultlist.append(out_q.get())
        except Queue.Empty:
            print "Queue empty"


mp_solve(list_of_problems, 10)

The output of this will be

False
False
False
False
False
False
False
False
False
False
True
going to Q0

After this last line of output, the session hangs: I can type in it, but nothing happens, and even Ctrl + C has no effect. I then just close the SSH session.

I'm fairly new to multiprocessing and I can't figure out why the Queue appears to get filled correctly (as seen from the False printed by the workers) but is then empty. Note that the Queue.Empty exception is never caught. Any ideas how to get me back on the right track?

The Queue class from the Queue module isn't suitable for multiprocessing: it is intended only for communication between threads within the same process, so anything a child process puts into it never reaches the parent. Use the Queue class from the multiprocessing module instead.

from multiprocessing import Process, Queue
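A minimal sketch (hypothetical names, not the asker's solver) showing that an item put into a multiprocessing.Queue in a child process is visible to the parent, which is exactly what the thread-only Queue cannot do:

```python
from multiprocessing import Process, Queue

def child(q):
    # Runs in a separate process; the item travels back through a pipe.
    q.put("hello from child")

def demo():
    q = Queue()
    p = Process(target=child, args=(q,))
    p.start()
    msg = q.get()   # blocks until the child's item arrives
    p.join()
    return msg

if __name__ == '__main__':
    print(demo())
```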

That should solve the immediate problem. Here are a couple of other notes:

  • out_q.get() is a blocking call: if there is nothing in the queue, it waits until there is. So in your code above, the Empty exception will never be raised. For a non-blocking version, use out_q.get(block=False). Alternatively, you can specify a timeout: out_q.get(timeout=10.0).

  • It's good practice to join your child processes before your main process exits. Call proc.join() on each process. As dano points out in the comments, there's a possibility of deadlock if you try to join the processes before emptying the queue (see the warning here), so you probably want to do this at the very end of your mp_solve function.

  • It's also good practice to guard your main code with an if __name__ == '__main__': block. It won't make a difference on Linux (under Python 2.7), but without it your code won't run correctly on Windows.

  • Again for cross-platform compatibility, you should move the nested worker function out to module level. For the code to run on Windows, the worker function needs to be pickleable, and nested functions aren't pickleable.
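Putting the notes above together, a corrected sketch of mp_solve might look like the following. Since the original external command is elided ("..."), the worker here produces a trivial stand-in result instead of calling subprocess; the structure (module-level worker, timeout-based get, draining before join, __main__ guard) is the point:

```python
from multiprocessing import Process, Queue
try:
    from Queue import Empty      # Python 2
except ImportError:
    from queue import Empty      # Python 3

def worker(problem, out_q):
    # The real code would invoke the external solver via subprocess here;
    # a stand-in result keeps the sketch self-contained.
    out_q.put([str(problem), "output-for-" + str(problem)])

def mp_solve(problems, nprocs):
    out_q = Queue()
    procs = [Process(target=worker, args=(problems[i], out_q))
             for i in range(nprocs)]
    for p in procs:
        p.start()

    resultlist = []
    for _ in range(nprocs):
        try:
            # Wait up to 10 seconds per result instead of sleeping blindly.
            resultlist.append(out_q.get(timeout=10.0))
        except Empty:
            print("Queue empty")

    # Join only after the queue has been drained, to avoid deadlock.
    for p in procs:
        p.join()
    return resultlist

if __name__ == '__main__':
    print(mp_solve(['p0', 'p1', 'p2'], 3))
```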
