
'close' event listener for python subprocess using asyncio

I'm trying to attach an event listener to a Python subprocess that is called whenever the process closes/exits. The callback will respawn the subprocess and attach the same event listener to the new one.

Something akin to calling .on from child_process in Node.js.

I'm already aware of this thread, but I'd ideally like to do this using asyncio.subprocess instead of running each event listener in a separate thread.

I was thinking of something like this:

from asyncio import subprocess as asubprocess
from typing import Callable, Awaitable

async def handle_close(proc: asubprocess.Process,
                       spawner: Callable[[], Awaitable[asubprocess.Process]]):
    while True:
        await proc.wait()       # block until the current process exits
        proc = await spawner()  # respawn it and keep watching the new one

(This function could also be extended to take a list of processes and their respective spawners and asyncio.wait on all of them; once any of them stops, it is respawned and the process repeats. A sketch of that variant follows below.)
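
A minimal sketch of that multi-process variant (handle_close_many is a hypothetical name, and passing the spawners as a process-to-callback mapping is an assumption about how they would be supplied):

import asyncio
from asyncio import subprocess as asubprocess
from typing import Awaitable, Callable, Dict

async def handle_close_many(
    spawners: Dict[asubprocess.Process, Callable[[], Awaitable[asubprocess.Process]]],
):
    # Map each pending wait() task to the spawner that can respawn its process.
    pending = {
        asyncio.ensure_future(proc.wait()): spawner
        for proc, spawner in spawners.items()
    }
    while pending:
        # Wake up as soon as any one of the processes exits.
        done, _ = await asyncio.wait(set(pending), return_when=asyncio.FIRST_COMPLETED)
        for task in done:
            spawner = pending.pop(task)
            new_proc = await spawner()  # respawn the exited process
            pending[asyncio.ensure_future(new_proc.wait())] = spawner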

The proc argument would be the process returned by asubprocess.create_subprocess_exec, and spawner would be an async function (or any other callback) that respawns that subprocess.

The only issue is, I don't know how to run this in the background without polluting my entire codebase with async. Ideally, there will be multiple subprocesses, each requiring its own handle_close running in the background. While all of these handlers are running, the caller code should not be blocked.

In my own use case I don't need the process handles, so those can be discarded. As long as the processes keep running, are respawned whenever they stop, and the caller code that spawns them is free to do other things, it's fine.

The only issue is, I don't know how to run this in the background without polluting my entire codebase with async.

It's not entirely clear from this sentence what the actual constraint is, i.e. how far you are willing to go with async code. Normally a program that uses asyncio is assumed to run inside an asyncio event loop, in which case the whole program is async, i.e. uses callbacks and/or coroutines. However, if you have a large code base built on blocking code and threads, you can also introduce asyncio in a separate thread. For example:

import asyncio
import threading

# Run an asyncio event loop in a background daemon thread.
_loop = asyncio.new_event_loop()
def _run():
    asyncio.set_event_loop(_loop)
    _loop.run_forever()
threading.Thread(target=_run, daemon=True).start()

def async_submit(coro):
    # Schedule a coroutine on the background loop from any thread.
    return asyncio.run_coroutine_threadsafe(coro, _loop)

With the event loop running in the background, you can submit tasks to it from your blocking code. For example:

from asyncio import subprocess as asubprocess
from typing import List

async def handle_process(cmd: List[str]):
    # Keep the command running: respawn it every time it exits.
    while True:
        p = await asubprocess.create_subprocess_exec(*cmd)
        await p.wait()
        print('restarting', cmd)

# tell the event loop running in the background to respawn "sleep 1":
async_submit(handle_process(["sleep", "1"]))

Note that all interaction with the event loop must go through run_coroutine_threadsafe or its cousin call_soon_threadsafe.
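
For instance, building on the _loop and async_submit helpers above (a sketch; stopping the loop this way is one common shutdown pattern, not the only option):

# run_coroutine_threadsafe returns a concurrent.futures.Future, so the
# blocking side can also wait for a result when it needs one:
future = async_submit(asubprocess.create_subprocess_exec("sleep", "1"))
proc = future.result()  # blocks this thread until the coroutine completes

# call_soon_threadsafe schedules a plain callback rather than a coroutine,
# e.g. to stop the background loop at shutdown:
_loop.call_soon_threadsafe(_loop.stop)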
