
Create two concurrent async tasks with asyncio

I need to create a program that receives concurrently from a web socket and a pipe, and sends the messages on the other channel (it receives from the socket, creates a new thread and sends to the pipe; in the same way, it receives from the pipe, creates a new thread and sends to the socket).

I have a problem with multithreading: at program startup I have to start the methods socket_receiver and pipe_receiver, but I can only get pipe_receiver to run. I tried removing all the other code and keeping only socket_receiver and pipe_receiver, but it only ever enters the while True loop of pipe_receiver.

import asyncio
import sys
import json
from concurrent.futures.thread import ThreadPoolExecutor
import websockets

# make the Pool of workers
executor = ThreadPoolExecutor(max_workers=10)
# Make connection to socket and pipe
header = {"Authorization": r"Basic XXXX="}
connection = websockets.connect('wss://XXXXXXXX', extra_headers=header)


async def socket_receiver():
    """Listening from web socket"""
    async with connection as web_socket:
        while True:
            message = await web_socket.recv()
            # send the message to the pipe in a new thread
            executor.submit(send_to_pipe(message))


async def pipe_receiver():
    """Listening from pipe"""
    while True:
        message = sys.stdin.readline()
        if not message:
            break
        executor.submit(send_to_socket(message))
        # jsonValue = json.dump(str(line), file);
        sys.stdout.flush()


def send_to_pipe(message):
    # Check if message is CAM or DENM
    json_message = json.loads(message)
    type = int(json_message["header"]["messageID"])
    # 1 is DENM message, 2 is CAM message
    if type == 1  or type == 2:
        # send the message to the pipe
        sys.stdout.print(json_message);


async def send_to_socket(message):
     async with connection as web_socket:
        json_message = json.dumps(message)
        await web_socket.send(json_message)


asyncio.get_event_loop().run_until_complete(
    asyncio.gather(socket_receiver(),pipe_receiver()))

This program is launched as a subprocess; the parent process communicates with it through pipes connected to its stdout and stdin.

UPDATE: I get this exception with @Martijn Pieters' code:

Traceback (most recent call last):
  File "X", line 121, in <module>
    main()
  File "X", line 119, in main
    loop.run_until_complete(asyncio.gather(socket_coro, pipe_coro))
  File "X\AppData\Local\Programs\Python\Python37-32\lib\asyncio\base_events.py", line 568, in run_until_complete
    return future.result()
  File "X", line 92, in connect_pipe
    reader, writer = await stdio()
  File "X", line 53, in stdio
    lambda: asyncio.StreamReaderProtocol(reader), sys.stdin)
  File "X/AppData\Local\Programs\Python\Python37-32\lib\asyncio\base_events.py", line 1421, in connect_read_pipe
    transport = self._make_read_pipe_transport(pipe, protocol, waiter)
  File "X/AppData\Local\Programs\Python\Python37-32\lib\asyncio\base_events.py", line 433, in _make_read_pipe_transport
    raise NotImplementedError
NotImplementedError

You are not using the ThreadPoolExecutor correctly, and you really don't want to use it here. Instead, you need to set up consumers and producers to handle your socket and pipe, with queues to send messages between them.

  • For each connection type, create a coroutine that creates the connection, then passes that single connection to both a consumer task and a producer task (created with asyncio.create_task()) for that connection. Use asyncio.wait() to run both tasks with return_when=asyncio.FIRST_COMPLETED, so you can cancel whichever is still running when one of the two completes 'early' (e.g. has failed).

  • Use a queue to pass messages from the consumer of one connection to the producer of the other.

  • sys.stdin and sys.stdout are blocking streams; don't just read and write to them! See https://gist.github.com/nathan-hoad/8966377 for a gist attempting to set up non-blocking STDIO streams, and this asyncio issue that asks for a non-blocking streams feature.

  • Don't use a global socket connection, certainly not with two separate async with statements. Your send_to_socket() method would actually close the socket, because the async with connection as web_socket: context manager exits when the first message is sent, and this then causes problems for the socket_receiver code, which assumes the socket remains open indefinitely.

  • Don't use threading here! Your connections are entirely managed by asyncio; threading would stomp majorly on this.

  • Executor instances (such as your ThreadPoolExecutor) should only be used with regular callables, not with coroutines. Executor.submit() states that it takes a callable; passing in a coroutine with executor.submit(send_to_pipe(message)) or executor.submit(send_to_socket(message)) will cause an exception to be raised, as coroutines are not callables. You are probably not seeing an exception message because that exception is raised in the other thread.

    This is the reason your socket_receiver() coroutine fails; it certainly starts, but attempts to send messages fail. When I run your code against a local mocked-up websocket server, a warning is printed:

     RuntimeWarning: coroutine 'send_to_socket' was never awaited
       executor.submit(send_to_socket(message))

    When a coroutine is not awaited, the code in that coroutine is never executed. Wrapping the submitted object in a callable that prints the exception to stderr (try: callable(), except Exception: traceback.print_exc(file=sys.stderr)), you get:

     Traceback (most recent call last):
       File "soq52219672.py", line 15, in log_exception
         callable()
     TypeError: 'coroutine' object is not callable
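    For reference, a wrapper along those lines (a minimal sketch; the name log_exception and the commented submit call are illustrative, matching the traceback above rather than the original program) could look like this:

     import sys
     import traceback

     def log_exception(callable):
         # call the object that was handed to the executor and report any
         # exception that would otherwise be swallowed in the worker thread
         try:
             callable()
         except Exception:
             traceback.print_exc(file=sys.stderr)

     # used as: executor.submit(log_exception, send_to_socket(message))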

Executors should only be used to integrate code that can't be converted to using coroutines; the executor manages that code so it runs parallel to the asyncio tasks without interference. Take care if that code needs to interact with asyncio tasks: always use asyncio.run_coroutine_threadsafe() or loop.call_soon_threadsafe() to call across the boundary. See the Concurrency and multithreading section.
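If you do end up with genuinely blocking code running in an executor thread that has to hand work back to the event loop, a minimal sketch of calling across that boundary could look like this (handle_message and blocking_worker are made-up names for illustration):

import asyncio

async def handle_message(message):
    # coroutine that must run on the event loop
    print("handled", message)

def blocking_worker(loop):
    # plain blocking function running in an executor thread; it never touches
    # the loop directly, only through the *_threadsafe helpers
    future = asyncio.run_coroutine_threadsafe(handle_message("hello"), loop)
    # the returned concurrent.futures.Future blocks this worker thread until
    # the coroutine has finished on the loop
    future.result()

async def main():
    loop = asyncio.get_running_loop()
    # hand the blocking function to the default executor
    await loop.run_in_executor(None, blocking_worker, loop)

asyncio.run(main())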

Here is an example of how I'd rewrite your code to use the consumer/producer pattern, with stdio() based on the Nathan Hoad gist on the subject, plus a fallback for Windows where support for treating stdio as pipes is limited:

import asyncio
import json
import os
import sys

import websockets

async def socket_consumer(socket, outgoing):
    # take messages from the web socket and push them into the queue
    async for message in socket:
        await outgoing.put(message)

async def socket_producer(socket, incoming):
    # take messages from the queue and send them to the socket
    while True:
        message = await incoming.get()
        jsonmessage = json.dumps(message)
        await socket.send(jsonmessage)

async def connect_socket(incoming, outgoing):
    header = {"Authorization": r"Basic XXXX="}
    uri = 'wss://XXXXXXXX'
    async with websockets.connect(uri, extra_headers=header) as websocket:
        # create tasks for the consumer and producer. The asyncio loop will
        # manage these independently
        consumer_task = asyncio.create_task(socket_consumer(websocket, outgoing))
        producer_task = asyncio.create_task(socket_producer(websocket, incoming))

        # start both tasks, but have the loop return to us when one of them
        # has ended. We can then cancel the remainder
        done, pending = await asyncio.wait(
            [consumer_task, producer_task],
            return_when=asyncio.FIRST_COMPLETED
        )
        for task in pending:
            task.cancel()
        # force a result check; if there was an exception it'll be re-raised
        for task in done:
            task.result()


# pipe support
async def stdio(loop=None):
    if loop is None:
        loop = asyncio.get_running_loop()

    if sys.platform == 'win32':
        # no support for asyncio stdio yet on Windows, see https://bugs.python.org/issue26832
        # use an executor to read from stdio and write to stdout
        class Win32StdinReader:
            def __init__(self):
                self.stdin = sys.stdin.buffer 
            async def readline(self):
                # a single call to sys.stdin.readline() is thread-safe
                return await loop.run_in_executor(None, self.stdin.readline)

        class Win32StdoutWriter:
            def __init__(self):
                self.buffer = []
                self.stdout = sys.stdout.buffer
            def write(self, data):
                self.buffer.append(data)
            async def drain(self):
                data, self.buffer = self.buffer, []
                # a single call to writelines() on the binary stdout buffer is thread-safe
                return await loop.run_in_executor(None, self.stdout.writelines, data)

        return Win32StdinReader(), Win32StdoutWriter()

    reader = asyncio.StreamReader()
    await loop.connect_read_pipe(
        lambda: asyncio.StreamReaderProtocol(reader),
        sys.stdin
    )

    writer_transport, writer_protocol = await loop.connect_write_pipe(
        asyncio.streams.FlowControlMixin,
        os.fdopen(sys.stdout.fileno(), 'wb')
    )
    writer = asyncio.streams.StreamWriter(writer_transport, writer_protocol, None, loop)

    return reader, writer

async def pipe_consumer(pipereader, outgoing):
    # take messages from the pipe and push them into the queue
    while True:
        message = await pipereader.readline()
        if not message:
            break
        await outgoing.put(message.decode('utf8'))

async def pipe_producer(pipewriter, incoming):
    # take messages from the queue and send them to the pipe
    while True:
        jsonmessage = await incoming.get()
        message = json.loads(jsonmessage)
        type = int(message.get('header', {}).get('messageID', -1))
        # 1 is DENM message, 2 is CAM message
        if type in {1, 2}:
            pipewriter.write(jsonmessage.encode('utf8') + b'\n')
            await pipewriter.drain()

async def connect_pipe(incoming, outgoing):
    reader, writer = await stdio()
    # create tasks for the consumer and producer. The asyncio loop will
    # manage these independently
    consumer_task = asyncio.create_task(pipe_consumer(reader, outgoing))
    producer_task = asyncio.create_task(pipe_producer(writer, incoming))

    # start both tasks, but have the loop return to us when one of them
    # has ended. We can then cancel the remainder
    done, pending = await asyncio.wait(
        [consumer_task, producer_task],
        return_when=asyncio.FIRST_COMPLETED
    )
    for task in pending:
        task.cancel()
    # force a result check; if there was an exception it'll be re-raised
    for task in done:
        task.result()

async def main():
    pipe_to_socket = asyncio.Queue()
    socket_to_pipe = asyncio.Queue()

    socket_coro = connect_socket(pipe_to_socket, socket_to_pipe)
    pipe_coro = connect_pipe(socket_to_pipe, pipe_to_socket)

    await asyncio.gather(socket_coro, pipe_coro)

if __name__ == '__main__':
    asyncio.run(main())

This then starts with two tasks, one to manage the socket, the other to manage the STDIO pipe. Each of these starts two more tasks, for its consumer and producer. There are two queues to send the messages from the consumer of one connection to the producer of the other.
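For completeness, since the question states that this program is launched by a parent process that talks to it over stdin and stdout, the parent side could look roughly like the following sketch (bridge.py is an assumed file name and the message content is made up; it assumes one JSON document per line, matching the newline-delimited framing used above):

import json
import subprocess

# start the bridge program; "bridge.py" is a hypothetical name for the
# script shown above
proc = subprocess.Popen(
    ["python", "bridge.py"],
    stdin=subprocess.PIPE,
    stdout=subprocess.PIPE,
    text=True,
)

# send one message towards the web socket (one JSON document per line)
proc.stdin.write(json.dumps({"header": {"messageID": 2}}) + "\n")
proc.stdin.flush()

# read one message that arrived from the web socket
line = proc.stdout.readline()
if line:
    incoming = json.loads(line)
    print(incoming)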
