Can I use asyncio to read from and write to a multiprocessing.Pipe?
I need to communicate between processes in Python and am using asyncio in each of the processes for concurrent network IO.
Currently I'm using multiprocessing.Pipe to send and recv significantly large amounts of data between the processes; however, I do so outside of asyncio, and I believe I'm spending a lot of CPU time in IO_WAIT because of it.
It seems like asyncio can and should be used to handle the Pipe IO between processes; however, I can't find an example for anything but piping STDIN/STDOUT.
From what I read, it seems like I should register the pipe with loop.connect_read_pipe(PROTOCOL_FACTORY, PIPE), and likewise for write. However, I don't understand the purpose of protocol_factory as it would relate to a multiprocessing.Pipe. It's not even clear whether I should be creating a multiprocessing.Pipe or whether I can create a pipe within asyncio.
multiprocessing.Pipe uses the high-level multiprocessing.Connection module, which pickles and unpickles Python objects and transmits additional bytes under the hood. If you wanted to read data from one of these pipes using loop.connect_read_pipe(), you would have to re-implement all of this yourself.
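To see what that means concretely, here is a minimal sketch (not part of the answer's examples) showing that Connection.send()/recv() carry whole Python objects, not raw bytes — the pickling happens under the hood:

```python
import multiprocessing

# multiprocessing.Pipe transports pickled Python objects, not raw bytes.
recv_end, send_end = multiprocessing.Pipe(duplex=False)
send_end.send({'frame': 1, 'payload': [1, 2, 3]})  # pickled on the way in
message = recv_end.recv()                          # unpickled on the way out
print(message)
```

Any byte stream read directly from the underlying file descriptor would therefore contain pickle framing in addition to your data.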
The easiest way to read from a multiprocessing.Pipe without blocking the event loop is to use loop.add_reader(). Consider the following example:
import asyncio
import multiprocessing

def main():
    read, write = multiprocessing.Pipe(duplex=False)
    writer_process = multiprocessing.Process(target=writer, args=(write,))
    writer_process.start()
    asyncio.get_event_loop().run_until_complete(reader(read))

async def reader(read):
    frame_available = asyncio.Event()
    # Wake the coroutine when the pipe's file descriptor becomes readable.
    asyncio.get_event_loop().add_reader(read.fileno(), frame_available.set)
    await frame_available.wait()
    frame_available.clear()
    print(read.recv())

def writer(write):
    write.send('Hello World')

if __name__ == '__main__':
    main()
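Since the question mentions transferring large amounts of data repeatedly, the same add_reader() pattern can be wrapped in a loop. The helper below is a sketch of my own (read_messages is a hypothetical name, not from the example above) that drains a fixed number of messages without blocking the event loop:

```python
import asyncio
import multiprocessing

async def read_messages(conn, count):
    """Receive `count` objects from `conn` without blocking the event loop."""
    loop = asyncio.get_event_loop()
    data_ready = asyncio.Event()
    loop.add_reader(conn.fileno(), data_ready.set)
    messages = []
    try:
        for _ in range(count):
            # Wait until the fd is readable, then drain one pickled message.
            while not conn.poll():
                await data_ready.wait()
                data_ready.clear()
            messages.append(conn.recv())
    finally:
        loop.remove_reader(conn.fileno())
    return messages

def producer(conn):
    for i in range(3):
        conn.send(i)
    conn.close()

if __name__ == '__main__':
    read_end, write_end = multiprocessing.Pipe(duplex=False)
    process = multiprocessing.Process(target=producer, args=(write_end,))
    process.start()
    loop = asyncio.new_event_loop()
    print(loop.run_until_complete(read_messages(read_end, 3)))
    process.join()
    loop.close()
```

The conn.poll() check matters: readability of the fd only guarantees that at least one message is pending, so the loop re-checks before each recv().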
Pipes created using the lower-level os.pipe don't add anything extra the way that pipes from multiprocessing.Pipe do. As a result, we can use os.pipe with loop.connect_read_pipe() without re-implementing any sort of inner workings. Here is an example:
import asyncio
import multiprocessing
import os

def main():
    read, write = os.pipe()
    writer_process = multiprocessing.Process(target=writer, args=(write,))
    writer_process.start()
    asyncio.get_event_loop().run_until_complete(reader(read))

async def reader(read):
    pipe = os.fdopen(read, mode='r')
    loop = asyncio.get_event_loop()
    stream_reader = asyncio.StreamReader()

    def protocol_factory():
        # The protocol feeds bytes from the pipe into the StreamReader.
        return asyncio.StreamReaderProtocol(stream_reader)

    transport, _ = await loop.connect_read_pipe(protocol_factory, pipe)
    print(await stream_reader.readline())
    transport.close()

def writer(write):
    os.write(write, b'Hello World\n')

if __name__ == '__main__':
    main()
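The question also asks about writing. The write end of an os.pipe can be wrapped in a StreamWriter via loop.connect_write_pipe(), using asyncio.streams.FlowControlMixin as the protocol so that drain() applies flow control. This is a sketch under that assumption; the helper name write_side is my own, not part of the example above:

```python
import asyncio
import os

async def write_side(write_fd, data):
    # Wrap the write end of an os.pipe() in a StreamWriter so that
    # writer.drain() cooperates with the event loop's flow control.
    loop = asyncio.get_event_loop()
    pipe = os.fdopen(write_fd, mode='wb')
    transport, protocol = await loop.connect_write_pipe(
        asyncio.streams.FlowControlMixin, pipe)
    writer = asyncio.StreamWriter(transport, protocol, None, loop)
    writer.write(data)
    await writer.drain()
    writer.close()

if __name__ == '__main__':
    read_fd, write_fd = os.pipe()
    loop = asyncio.new_event_loop()
    loop.run_until_complete(write_side(write_fd, b'Hello World\n'))
    print(os.read(read_fd, 1024))
    loop.close()
```

As with connect_read_pipe(), this only works with plain os.pipe file descriptors; a multiprocessing.Pipe end would still need its pickle framing handled separately.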
This code helped me figure out how to use loop.connect_read_pipe.