
python multiprocessing queue is not in shared memory

I tried to run the following code:

import multiprocessing
import time

def init_queue():
    print("init g_queue start")
    while not g_queue.empty():
        g_queue.get()
    for _index in range(10):
        g_queue.put(_index)
    print("init g_queue end")
    return

def task_io(task_id):
    print("IOTask[%s] start" % task_id)
    print("the size of queue is %s" % g_queue.qsize())
    while not g_queue.empty():
        time.sleep(1)
        try:
            data = g_queue.get(block=True, timeout=1)
            print("IOTask[%s] get data: %s" % (task_id, data))
        except Exception as excep:
            print("IOTask[%s] error: %s" % (task_id, str(excep)))
    print("IOTask[%s] end" % task_id)
    return

g_queue = multiprocessing.Queue()

if __name__ == '__main__':
    print("the size of queue is %s" % g_queue.qsize())
    init_queue()
    print("the size of queue is %s" % g_queue.qsize())
    time_0 = time.time()
    process_list = [multiprocessing.Process(target=task_io, args=(i,)) for i in range(multiprocessing.cpu_count())]
    for p in process_list:
        p.start()
    for p in process_list:
        if p.is_alive():
            p.join()
    print("End:", time.time() - time_0, "\n")

What I got was the following:

the size of queue is 0
init g_queue start
init g_queue end
the size of queue is 10
IOTask[0] start
the size of queue is 0
IOTask[0] end
IOTask[1] start
the size of queue is 0
IOTask[1] end
('End:', 0.6480000019073486, '\n')

What I was expecting was:

IOTask[0] start
the size of queue is 10

Because after the initialization of g_queue, the size of the queue was supposed to be 10, not 0. It seems like the queue is not in shared memory: when the sub-process starts, a copy of g_queue is created, and its size is 0.

Why is multiprocessing.Queue not in shared memory? Please advise. Many thanks!

You should pass your g_queue as a parameter; then it will work.
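For example, here is a minimal sketch of the code from the question, reworked so that the queue is created in the main process and handed to each worker through args (the function names follow the original snippet; using a get() timeout to stop the workers is just one reasonable way to drain the queue):

import multiprocessing
import time

def init_queue(queue):
    print("init queue start")
    for _index in range(10):
        queue.put(_index)
    print("init queue end")

def task_io(task_id, queue):
    # The queue arrives as an argument, so every worker uses
    # the same underlying multiprocessing.Queue.
    print("IOTask[%s] start" % task_id)
    print("the size of queue is %s" % queue.qsize())
    while True:
        try:
            data = queue.get(block=True, timeout=1)
            print("IOTask[%s] get data: %s" % (task_id, data))
        except Exception:
            # The queue stayed empty for a second: assume it is drained.
            break
    print("IOTask[%s] end" % task_id)

if __name__ == '__main__':
    g_queue = multiprocessing.Queue()
    init_queue(g_queue)
    print("the size of queue is %s" % g_queue.qsize())
    time_0 = time.time()
    # Pass the queue explicitly so each child gets a handle to the
    # same queue instead of creating a new, empty one of its own.
    process_list = [multiprocessing.Process(target=task_io, args=(i, g_queue))
                    for i in range(multiprocessing.cpu_count())]
    for p in process_list:
        p.start()
    for p in process_list:
        p.join()
    print("End:", time.time() - time_0)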

Demo for using multiprocessing with a queue:

import multiprocessing
import time


def long_time_calculate(n, result_queue):
    # Simulate a slow task, then report the result through the queue.
    time.sleep(1)
    result_queue.put(n)


if __name__ == '__main__':
    pool_size = multiprocessing.cpu_count() * 2
    pool = multiprocessing.Pool(processes=pool_size, maxtasksperchild=4)

    # Use a manager queue so it can be passed to the pool workers.
    manager = multiprocessing.Manager()
    result_queue = manager.Queue()

    inputs = [(1, result_queue), (2, result_queue), (3, result_queue), (4, result_queue)]

    for task_args in inputs:
        pool.apply_async(long_time_calculate, task_args)

    pool.close()
    pool.join()

    print(list(result_queue.get() for _ in inputs))
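One design choice worth noting in this demo: the queue handed to the pool workers is a manager queue (multiprocessing.Manager().Queue()), not a plain multiprocessing.Queue(). A plain multiprocessing.Queue is only meant to be shared by inheritance or by passing it directly through Process(args=...), as in the sketch above; it is not intended to be shipped to Pool workers as a task argument. A manager queue, by contrast, is accessed through a picklable proxy, so it can be passed to apply_async safely.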
