
Shared queue contents not visible in Python multiprocessing

I have a few coroutines running on one process (A) and one heavier, unbounded job running on a separate process (B). I would like that heavier job to dispatch its results into a queue that is consumed by the original process (A).

Similar to this:

import asyncio
import time
from concurrent.futures import ProcessPoolExecutor


def process__heavy(pipe):
    print("[B] starting...")
    while True:
        print(f"[B] Pipe queue: {pipe.qsize()}")
        pipe.put_nowait(str(time.time()))
        time.sleep(0.5)

async def coroutine__stats(pipe):
    print("[A] starting...")
    while True:
        print(f"[A] Pipe queue: {pipe.qsize()}")
        await asyncio.sleep(1)

  
async def main():
    pipe = asyncio.Queue()
    executor = ProcessPoolExecutor()

    jobs = await asyncio.gather(
        asyncio.get_running_loop().run_in_executor(executor, process__heavy, pipe),
        coroutine__stats(pipe)
    )

    print(f"Finished with result: {jobs}")


if __name__ == '__main__':
    asyncio.run(main())
    print("Bye.")

Output

[A] starting...
[A] Pipe queue: 0
[B] starting...
[B] Pipe queue: 0
[B] Pipe queue: 1
[A] Pipe queue: 0 <--- why zero?
[B] Pipe queue: 2
[B] Pipe queue: 3
[A] Pipe queue: 0 <---
[B] Pipe queue: 4
[B] Pipe queue: 5
[A] Pipe queue: 0 <---
[B] Pipe queue: 6
[B] Pipe queue: 7
[A] Pipe queue: 0
[B] Pipe queue: 8

The original process (A) does not see any data put into the shared queue. I do not remember whether Python lets you share objects across processes, or whether everything is pickled and the only result you can get back is the process's return value when it exits.

What am I doing wrong, and what would be the best way to create a data pipe between those two processes?

Use a multiprocessing.Manager() to create the Queue instead of asyncio.Queue:

import multiprocessing as mp
# ...
pipe = mp.Manager().Queue()
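
For completeness, here is a self-contained, terminating sketch of the corrected program. The bounded iteration count, the `expected` parameter, and the non-blocking polling loop are illustrative additions of mine, not part of the original code, which loops forever:

```python
import asyncio
import multiprocessing as mp
import queue
import time
from concurrent.futures import ProcessPoolExecutor


def process__heavy(pipe):
    # Runs in process B: the Manager queue is a picklable proxy, so each
    # put_nowait() is forwarded to the queue in the manager's server process.
    for _ in range(3):
        pipe.put_nowait(str(time.time()))
        time.sleep(0.1)


async def coroutine__stats(pipe, expected):
    # Runs in process A: poll the proxy without blocking the event loop.
    items = []
    while len(items) < expected:
        try:
            items.append(pipe.get_nowait())
        except queue.Empty:
            await asyncio.sleep(0.05)
    return items


async def main():
    with mp.Manager() as manager, ProcessPoolExecutor() as executor:
        pipe = manager.Queue()
        loop = asyncio.get_running_loop()
        _, items = await asyncio.gather(
            loop.run_in_executor(executor, process__heavy, pipe),
            coroutine__stats(pipe, expected=3),
        )
        return items


if __name__ == "__main__":
    print(f"Finished with result: {asyncio.run(main())}")
```

Note that the proxy's `get_nowait()` re-raises `queue.Empty` in the calling process, which is what lets the coroutine yield to the event loop instead of blocking on `get()`.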

With that change to the OP's code:

[A] starting...
[A] Pipe queue: 0
[B] starting...
[B] Pipe queue: 0
[B] Pipe queue: 1
[A] Pipe queue: 2
[B] Pipe queue: 2
[B] Pipe queue: 3
[A] Pipe queue: 4
[B] Pipe queue: 4
[B] Pipe queue: 5
[A] Pipe queue: 6
[B] Pipe queue: 6
[B] Pipe queue: 7
[A] Pipe queue: 8
[B] Pipe queue: 8
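
A note on why the Manager is needed (my explanation, not part of the original answer): an asyncio.Queue lives entirely inside one process and is not process-aware, and a plain multiprocessing.Queue cannot be pickled as a task argument, so it cannot be handed to a ProcessPoolExecutor worker this way. A Manager().Queue returns a picklable proxy that talks to a real queue in the manager's server process, which is why it can cross the executor boundary. A minimal sketch of the failure mode with a plain multiprocessing.Queue:

```python
import multiprocessing as mp
from concurrent.futures import ProcessPoolExecutor


def worker(q):
    q.put("hello")


if __name__ == "__main__":
    q = mp.Queue()
    with ProcessPoolExecutor(max_workers=1) as executor:
        future = executor.submit(worker, q)
        try:
            future.result()
            print("unexpectedly succeeded")
        except RuntimeError as exc:
            # mp.Queue may only be shared through inheritance (e.g. passed to
            # mp.Process at construction), not pickled into a pool task.
            print(f"failed as expected: {exc}")
```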
