
How to use a read/write stream between two Python asyncio coroutines?

How can I use asyncio to implement a pipe between two coroutines, one that reads from a stream and the other that writes into it?

Suppose we have this existing code, two simple scripts. One that produces to stdout:

# produce.py

import asyncio
import random
import sys

async def produce(stdout):
    for i in range(10000):
        await asyncio.sleep(random.randint(0, 3))
        print(i, file=stdout, flush=True)

if __name__ == "__main__":
    loop = asyncio.get_event_loop()
    loop.run_until_complete(produce(sys.stdout))
    loop.close()

And the other that reads from stdin:

# consume.py

import asyncio
import sys

async def consume(loop, stdin):
    reader = asyncio.StreamReader(loop=loop)
    reader_protocol = asyncio.StreamReaderProtocol(reader)
    await loop.connect_read_pipe(lambda: reader_protocol, stdin)

    while True:
        line = await reader.readline()
        if not line:
            break
        print(int(line) ** 2)

if __name__ == "__main__":
    loop = asyncio.get_event_loop()
    loop.run_until_complete(consume(loop, sys.stdin))
    loop.close()

Obviously, since our two pieces can be run individually from the command line, we could use the subprocess module with a shell pipe (produce | consume).
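For completeness, that subprocess route can itself be driven from asyncio. A minimal sketch using asyncio.create_subprocess_shell to run a shell pipeline and capture its output, assuming a POSIX shell; the printf/python3 -c pipeline below is a tiny stand-in for produce.py | consume.py:

```python
import asyncio

async def run_pipeline(cmd: str) -> str:
    """Run a shell pipeline and return its captured stdout as text."""
    proc = await asyncio.create_subprocess_shell(
        cmd, stdout=asyncio.subprocess.PIPE
    )
    out, _ = await proc.communicate()
    return out.decode()

# The real invocation would be:
#   asyncio.run(run_pipeline("python produce.py | python consume.py"))
# Here, a small stand-in pipeline that squares each input line:
cmd = ("printf '1\\n2\\n3\\n' | "
       "python3 -c 'import sys; [print(int(l) ** 2) for l in sys.stdin]'")
result = asyncio.run(run_pipeline(cmd))
print(result, end="")
```

Note that this still forks separate processes; it only moves the plumbing from the shell into Python.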

But we would like to implement the equivalent of a Unix pipe in pure Python, i.e. connect the streams of these two existing functions directly.

Something like this won't work:

pipe = io.BytesIO()

await asyncio.gather(produce(pipe),
                     consume(loop, pipe))

If the two functions worked with async generators, we could write something like this (Python 3.6+):

async def produce():
    for i in range(10000):
        await asyncio.sleep(random.randint(0, 3))
        yield str(i)


async def consume(generator):
    async for value in generator:
        print(int(value) ** 2)


if __name__ == "__main__":
    loop = asyncio.get_event_loop()
    loop.run_until_complete(consume(produce()))
    loop.close()

Is there a part of the asyncio API that would allow this?

Thanks!

One way of fixing this is to turn your current functions into async generators and to write wrappers that expose them through Unix pipes:

# wrapper.py

import asyncio
import random
import sys


async def produce():
    for i in range(10000):
        await asyncio.sleep(random.randint(0, 3))
        yield str(i)


async def consume(generator):
    async for value in generator:
        print(int(value) ** 2)


async def system_out_generator(loop, stdout, generator):
    async for line in generator:
        print(line, file=stdout, flush=True)


async def system_in_generator(loop, stdin):
    reader = asyncio.StreamReader(loop=loop)
    reader_protocol = asyncio.StreamReaderProtocol(reader)
    await loop.connect_read_pipe(lambda: reader_protocol, stdin)
    while True:
        line = await reader.readline()
        if not line:
            break
        yield line


async def main(loop):
    try:
        if sys.argv[1] == "produce":
            await system_out_generator(loop, sys.stdout, produce())
        elif sys.argv[1] == "consume":
            await consume(system_in_generator(loop, sys.stdin))
    except IndexError:
        await consume(produce())


if __name__ == "__main__":
    loop = asyncio.get_event_loop()
    loop.run_until_complete(main(loop))
    loop.close()

You can either use:

python wrapper.py  # Python generators

or:

python wrapper.py produce | python wrapper.py consume  # System pipes

The original post says "Something like this won't work." I'm not sure if this statement is intended to mean "the following code did not work" or "I do not want a solution of this style."

I will note that the following code does work (run inside a coroutine, with os imported):

r, w = os.pipe()
read_pipe = os.fdopen(r, 'r')
write_pipe = os.fdopen(w, 'w')

await asyncio.gather(produce(write_pipe), consume(loop, read_pipe))

One caveat: consume only returns once readline() sees EOF, and EOF only arrives when the write end of the pipe is closed, so close write_pipe as soon as produce finishes or the gather will never complete.
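The detail that makes this approach terminate cleanly is closing the write end once produce is done, so that the reader sees EOF. A self-contained sketch under simplified assumptions (produce shortened to three items with no sleeps, consume collecting results into a list instead of printing, POSIX only):

```python
import asyncio
import os

async def produce(stdout, n=3):
    # Shortened stand-in for the original produce(): no random sleeps.
    for i in range(n):
        print(i, file=stdout, flush=True)

async def consume(loop, stdin):
    reader = asyncio.StreamReader(loop=loop)
    protocol = asyncio.StreamReaderProtocol(reader)
    await loop.connect_read_pipe(lambda: protocol, stdin)
    results = []
    while True:
        line = await reader.readline()
        if not line:  # EOF: the write end of the pipe was closed
            break
        results.append(int(line) ** 2)
    return results

async def main(loop):
    r, w = os.pipe()
    read_pipe = os.fdopen(r, "r")
    write_pipe = os.fdopen(w, "w")

    async def produce_then_close():
        try:
            await produce(write_pipe)
        finally:
            write_pipe.close()  # signals EOF to the reader side

    _, results = await asyncio.gather(produce_then_close(),
                                      consume(loop, read_pipe))
    return results

loop = asyncio.new_event_loop()
try:
    results = loop.run_until_complete(main(loop))
finally:
    loop.close()
print(results)
```

One limitation worth noting: produce still writes to the pipe with ordinary blocking I/O, so if it ever filled the OS pipe buffer it would stall the event loop; for a true non-blocking writer you would also wrap the write end with loop.connect_write_pipe.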
