
JoinableQueue join() method blocking main thread even after task_done()

In the code below, if I set daemon = True, the consumer quits before reading all queue entries. If the consumer is non-daemon, the main thread stays blocked even after task_done() has been called for every entry.

from multiprocessing import Process, JoinableQueue

import time


def consumer(queue):
    while True:
        final = queue.get()
        print(final)
        queue.task_done()


def producer1(queue):
    for i in "QWERTYUIOPASDFGHJKLZXCVBNM":
        queue.put(i)

if __name__ == "__main__":

    queue = JoinableQueue(maxsize=100)
    p1 = Process(target=consumer, args=(queue,))
    p2 = Process(target=producer1, args=(queue,))
    #p1.daemon = True
    p1.start()
    p2.start()
    print(p1.is_alive())
    print(p2.is_alive())
    for i in range(1, 10):
        queue.put(i)
        time.sleep(0.01)
    queue.join()

Let's see what—I believe—is happening here:

  1. both processes are being started.
  2. the consumer process starts its loop and blocks until a value is received from the queue.
  3. the producer1 process feeds the queue 26 times with a letter while the main process feeds the queue 9 times with a number. The order in which letters or numbers are being fed is not guaranteed—a number could very well show up before a letter.
  4. when both the producer1 and the main processes are done with feeding their data, the queue is being joined. No problem here, the queue can be joined since all the buffered data has been consumed and task_done() has been called after each read.
  5. the consumer process is still running, but is blocked until more data to consume shows up. Since it is a non-daemon process, the main process waits for it to finish before exiting, and it never does, so the program hangs.

Looking at your code, I believe you are confusing the concept of joining processes with that of joining queues. What you most likely want here is to join processes; you probably don't need a joinable queue at all.

#!/usr/bin/env python3

from multiprocessing import Process, Queue

import time

def consumer(queue):
    for final in iter(queue.get, 'STOP'):
        print(final)

def producer1(queue):
    for i in "QWERTYUIOPASDFGHJKLZXCVBNM":
        queue.put(i)

if __name__ == "__main__":
    queue = Queue(maxsize=100)
    p1 = Process(target=consumer, args=(queue,))
    p2 = Process(target=producer1, args=(queue,))
    p1.start()
    p2.start()
    print(p1.is_alive())
    print(p2.is_alive())
    for i in range(1, 10):
        queue.put(i)
        time.sleep(0.01)
    queue.put('STOP')
    p1.join()
    p2.join()

Also, your producer1 exits on its own after feeding all the letters, but you need a way to tell your consumer process to exit when there won't be any more data for it to process. You can do this by sending a sentinel; here I chose the string 'STOP', but it can be anything.

In fact, this code is not great, since the 'STOP' sentinel could be received before some of the letters: those letters would never be processed, and the program could even deadlock because the processes would be joined while the queue still holds data. But that is a different problem.
