
Make multiprocessing.Queue accessible from asyncio

Given a multiprocessing.Queue that is filled from different Python threads, created via ThreadPoolExecutor.submit(...):

How can that queue be consumed from asyncio / Trio / AnyIO in a safe and reliable way (in my case, inside a FastAPI application)?

I am aware of Janus (https://github.com/aio-libs/janus/blob/master/janus/__init__.py), but would prefer a custom solution here.

Asked (hopefully) more concisely: how do I implement

await <something_is_in_my_multiprocessing_queue>

What synchronization mechanism would you suggest?

(Note: multiprocessing.Queue is not asyncio.Queue.)

Instead of mp.Queue, you can use an asyncio.Queue together with loop.run_in_executor. This provides similar behavior and is designed to work with asyncio:

import asyncio
from concurrent.futures import ProcessPoolExecutor
from typing import Any
from functools import partial
import time

CONSUMERS_NUM = 100
PROCESS_POOL_SIZE = 4
TASKS_NUM = 20
TIME_TO_FINISH_HEAVY_TASK = 5


async def producer(q: asyncio.Queue) -> None:
    for i in range(TASKS_NUM):
        await q.put(i)

    for _ in range(CONSUMERS_NUM):
        await q.put(None)  # poison pill technique


def heavy_job(item: Any) -> Any:
    time.sleep(TIME_TO_FINISH_HEAVY_TASK)  # imagine it is heavy CPU-bound task
    return item ** 2


async def consumers(q: asyncio.Queue, pool: ProcessPoolExecutor, worker_id: str) -> None:
    loop = asyncio.get_running_loop()
    while True:
        item = await q.get()
        if item is None:  # poison pill technique
            break

        res = await loop.run_in_executor(pool, partial(heavy_job, item=item))
        print(f"{worker_id}: Result for {item} is {res}")


async def amain():
    """Wrapper around all async work."""
    with ProcessPoolExecutor(max_workers=PROCESS_POOL_SIZE) as pool:
        async_q = asyncio.Queue()
        await asyncio.gather(
            producer(async_q),
            *[consumers(async_q, pool, f"Worker-{i}") for i in range(CONSUMERS_NUM)],
        )

if __name__ == '__main__':
    asyncio.run(amain())
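If you must keep the existing multiprocessing.Queue (as in the original question) rather than switch to asyncio.Queue, the same run_in_executor idea applies: offload the blocking q.get() call to a thread so the event loop can await it. A minimal sketch (the fill/consume helpers and the poison-pill convention are illustrative, not part of any API):

```python
import asyncio
import multiprocessing as mp
from concurrent.futures import ThreadPoolExecutor


def fill(q: mp.Queue) -> None:
    # Stand-in for the producer threads from the question.
    for i in range(3):
        q.put(i)
    q.put(None)  # poison pill: tells the consumer to stop


async def consume(q: mp.Queue) -> list:
    loop = asyncio.get_running_loop()
    results = []
    # Dedicated single-thread pool so q.get() blocks a worker
    # thread instead of the event loop.
    with ThreadPoolExecutor(max_workers=1) as pool:
        while True:
            item = await loop.run_in_executor(pool, q.get)
            if item is None:
                break
            results.append(item)
    return results


if __name__ == "__main__":
    q: mp.Queue = mp.Queue()
    fill(q)
    print(asyncio.run(consume(q)))
```

One caveat with this approach: a blocked q.get() cannot be cancelled from asyncio, so the worker thread only unblocks when another item (or the poison pill) arrives. If you need clean cancellation on shutdown, call q.get(timeout=...) in a loop, or use a library such as Janus that handles this for you.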
