
How to combine python asyncio and multiprocessing?

I have a device that needs multiprocessing to handle the CPU-bound deserialization and decoding of the incoming data; the rest of the application is slower, IO-limited code, which is an excellent fit for asyncio. However, there seems to be no good way to combine multiprocessing and asyncio.

I have tried https://github.com/dano/aioprocessing , which uses threaded executors for multiprocessing operations. However, this library does not natively support common asyncio operations; for example, cancelling a coroutine that is awaiting a queue.get through this library leads to deadlock.

I have also tried to use a ProcessPoolExecutor, but passing multiprocessing objects to this executor does not work, since the queue objects are not passed at the creation of the process.

import multiprocessing
import asyncio
import atexit
from concurrent.futures import ProcessPoolExecutor


@atexit.register
def kill_children():
    # Kill any remaining worker processes on interpreter exit
    for p in multiprocessing.active_children():
        p.kill()


async def queue_get(queue: multiprocessing.Queue):
    executor = ProcessPoolExecutor(max_workers=1)
    loop = asyncio.get_running_loop()
    return await loop.run_in_executor(executor, queue.get)


async def main():
    queue = multiprocessing.Queue()
    get_task = asyncio.create_task(queue_get(queue))

    queue.put(None)

    print(await get_task)


if __name__ == "__main__":
    asyncio.run(main())

Running this code leads to this exception:

RuntimeError: Queue objects should only be shared between processes through inheritance

Is there any way to cleanly bridge the gap between multiprocessing and asyncio?
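One way to satisfy the inheritance rule (a sketch, not part of the original question; the names _init_worker, _blocking_get, and queue_get_demo are illustrative) is to hand the queue to the pool's workers through ProcessPoolExecutor's initializer and initargs parameters, so the queue travels through the process-creation machinery instead of being pickled with a submitted task:

```python
import asyncio
import multiprocessing
from concurrent.futures import ProcessPoolExecutor

_worker_queue = None  # set in each worker process by the initializer


def _init_worker(q):
    # Runs once in each worker; the queue arrives via the process-creation
    # machinery, so the "inheritance" requirement is satisfied.
    global _worker_queue
    _worker_queue = q


def _blocking_get():
    # Blocking get executed inside the worker process.
    return _worker_queue.get()


async def queue_get_demo():
    queue = multiprocessing.Queue()
    executor = ProcessPoolExecutor(
        max_workers=1,
        initializer=_init_worker,
        initargs=(queue,),
    )
    loop = asyncio.get_running_loop()
    get_task = loop.run_in_executor(executor, _blocking_get)
    queue.put("hello")
    result = await get_task
    executor.shutdown()
    return result


if __name__ == "__main__":
    print(asyncio.run(queue_get_demo()))  # hello
```

This keeps a plain multiprocessing.Queue (no proxy overhead), at the cost of module-level state in the workers.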

Per "Can I somehow share an asynchronous queue with a subprocess?":

The above code can be modified to work with a multiprocessing queue by creating the queue through a multiprocessing.Manager():

import multiprocessing
import asyncio
import atexit
from concurrent.futures import ProcessPoolExecutor


@atexit.register
def kill_children():
    # Kill any remaining worker processes on interpreter exit
    for p in multiprocessing.active_children():
        p.kill()


async def queue_get(queue):  # a Manager queue proxy, not a plain multiprocessing.Queue
    executor = ProcessPoolExecutor(max_workers=1)
    loop = asyncio.get_running_loop()
    return await loop.run_in_executor(executor, queue.get)


async def main():
    manager = multiprocessing.Manager()
    queue = manager.Queue()
    get_task = asyncio.create_task(queue_get(queue))

    queue.put(None)
    print(await get_task)


if __name__ == "__main__":
    asyncio.run(main())
