Deadlock in Python multiprocessing queue
I'm using queues from the multiprocessing library to share data between processes.
I have 2 queues, both limited to 10 objects. The first queue has one process that "puts" objects into it and many processes that "get" from it.
The second queue has many processes that "put" objects into it, and only one process that "gets" from it.
The system works perfectly for a while and then starts behaving strangely: only the process that "puts" objects into the first queue continues to work, while the processes that read from the first queue apparently stop doing anything (even though they are still alive). It seems there's a deadlock here, but I'm not sure. Here is my code:
import multiprocessing
import logging
from multiprocessing import Process

logger = logging.getLogger(__name__)

# Processes 2, 3, 4:
class Processes_234(Process):
    def __init__(self, message_queue_1, message_queue_2):
        Process.__init__(self)
        self.message_queue_1 = message_queue_1
        self.message_queue_2 = message_queue_2

    def run(self):
        while True:
            try:
                # get from queue
                el1, el2, el3 = self.message_queue_1.get()
                logger.debug('Processes234: get from queue')
            except Exception as exp:
                logger.debug("message_queue_1: queue empty, Exception message: " + str(exp))
            # do some stuff with el1, el2, el3...
            try:
                # put into second queue
                self.message_queue_2.put_nowait((el1, el2, el3))
                logger.debug('Processes234: put into queue')
            except Exception as excpt:
                logger.debug(excpt)
                logger.debug("message_queue_2: queue is full")
                # the queue is full, so replace the oldest element with the new one
                try:
                    self.message_queue_2.get_nowait()
                    self.message_queue_2.put_nowait((el1, el2, el3))
                # in case another process has already refilled the queue - ignore
                except:
                    pass

# process 5:
class Process5(Process):
    def __init__(self, message_queue_2):
        Process.__init__(self)
        self.message_queue_2 = message_queue_2

    def run(self):
        while True:
            try:
                # get from queue
                el1, el2, el3 = self.message_queue_2.get()
                print('Process5: get from queue')
            except Exception as exp:
                print("message_queue_2: queue empty, Exception message: " + str(exp))

def start_process_1():
    # init queues
    message_queue_1 = multiprocessing.Queue(maxsize=10)
    message_queue_2 = multiprocessing.Queue(maxsize=10)
    processes_234 = [Processes_234(message_queue_1, message_queue_2)
                     for _ in range(3)]
    for proc in processes_234:
        proc.start()
    process5 = Process5(message_queue_2)
    process5.start()

    counter = 1
    while True:
        el1 = counter + 1
        el2 = counter + counter
        el3 = "some string " * ((counter ** 2) % 60000)
        counter += 1
        # start passing data
        try:
            # put into queue
            message_queue_1.put_nowait((el1, el2, el3))
            logger.debug('Process1: put into queue')
        except Exception as excpt:
            logger.debug(excpt)
            logger.debug("message_queue_1: queue is full")
            # the queue is full, so replace the oldest element with the new one
            try:
                message_queue_1.get_nowait()
                message_queue_1.put_nowait((el1, el2, el3))
            # in case another process has already refilled the queue - ignore
            except:
                pass

if __name__ == '__main__':
    start_process_1()
Does anyone know what my problem is?
I'm using Python 3.6.5.
Finally I was able to solve the problem: it was the logger! According to the logging library's documentation, the logger is thread-safe but not process-safe.
I changed the code so that each process has its own logger, and that solved the issue.