
How to run an external application in a separate thread and kill it on exit

I've been stuck on this for some time now, and wanted to ask whether someone has had a similar issue.

I need to start an external executable when my application starts; it has to stay alive the whole time and die when the application exits. But because it takes ages to load, I wanted to capture its output in real time and wait for a status message.

Now the interesting part:

import asyncio
import functools
import shlex
import sys
import time
from threading import Event, Thread


async def start_app(event):
    command_raw = 'ping 127.0.0.1'
    command = shlex.split(command_raw, posix="win" not in sys.platform)
    create = asyncio.create_subprocess_exec(*command, stdout=asyncio.subprocess.PIPE)
    proc = await create

    while True:
        data = await proc.stdout.readline()
        line = data.decode('ascii').rstrip()
        if line == 'Ping statistics for 127.0.0.1:':
            break
    event.set()


def runner_worker(runner_loop):
    asyncio.set_event_loop(runner_loop)
    runner_loop.run_forever()

if sys.platform == "win32":
    runner_loop = asyncio.ProactorEventLoop()
else:
    runner_loop = asyncio.new_event_loop()

event = Event()
t = Thread(target=runner_worker, args=(runner_loop,))
t.start()

f = functools.partial(start_app, event)
runner_loop.call_soon_threadsafe(f)

print('waiting for external app to start')
event.wait(timeout=60)
print('resuming application')
time.sleep(2)

print('ending...')

As you can see, I wanted a blocking loop running in a different thread (which will stay blocked forever) that sets a flag once the application has started. I'm not even sure this is the right approach.

Even leaving aside the error that I got:

RuntimeWarning: coroutine 'start_app' was never awaited
  self._callback(*self._args)
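(For reference, that warning appears because `call_soon_threadsafe` expects a plain callback, while calling a `functools.partial` of a coroutine function just creates a coroutine object that nothing ever awaits. The thread-safe way to hand a coroutine to a loop running in another thread is `asyncio.run_coroutine_threadsafe`. A minimal sketch, with a short `asyncio.sleep` standing in for the subprocess-startup wait:)

```python
import asyncio
import threading


def runner_worker(loop):
    asyncio.set_event_loop(loop)
    loop.run_forever()


async def start_app(event):
    # Stand-in for waiting on the subprocess's status line.
    await asyncio.sleep(0.1)
    event.set()


loop = asyncio.new_event_loop()
t = threading.Thread(target=runner_worker, args=(loop,), daemon=True)
t.start()

event = threading.Event()
# run_coroutine_threadsafe wraps the coroutine in a Task on the target
# loop and returns a concurrent.futures.Future usable from this thread.
future = asyncio.run_coroutine_threadsafe(start_app(event), loop)

started = event.wait(timeout=5)
future.result(timeout=5)  # re-raises any exception from the coroutine
loop.call_soon_threadsafe(loop.stop)
t.join()
print(started)  # True
```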

Today I tried a different approach:

import asyncio
import concurrent.futures
import logging
import sys
import time
import shlex
import subprocess


def start_app_syn():
    log = logging.getLogger('ext application')
    log.info('starting application')
    command_raw = 'ping -t 127.0.0.1'
    command = shlex.split(command_raw, posix="win" not in sys.platform)
    # No shell=True: kill() should terminate ping itself, not a wrapper shell
    proc = subprocess.Popen(command, stdout=subprocess.PIPE)
    while True:
        line = proc.stdout.readline().rstrip()
        if line == b'Reply from 127.0.0.1: bytes=32 time<1ms TTL=128':
            break
    return proc


async def run_blocking_tasks(executor):
    log = logging.getLogger('run_blocking_tasks')
    log.info('starting')

    log.info('creating executor ')
    loop = asyncio.get_event_loop()

    log.info('waiting for application start')
    runner = loop.run_in_executor(executor, start_app_syn, )
    try:
        proc = await asyncio.wait_for(runner, timeout=60)
        log.info('application has started')
    except asyncio.TimeoutError as e:
        log.error('Process did not start in acceptable time')
        raise e
    return proc


if __name__ == '__main__':
    logging.basicConfig(
        level=logging.INFO,
        format='%(threadName)10s %(name)18s: %(message)s',
        stream=sys.stderr,
    )

    executor = concurrent.futures.ThreadPoolExecutor()
    event_loop = asyncio.get_event_loop()
    task = asyncio.ensure_future(run_blocking_tasks(executor), loop=event_loop)
    proc = event_loop.run_until_complete(task)

    print('App started')
    print('doing a lot of stuff')
    for i in range(5):
        print('... ')
        time.sleep(1)

    print('Exiting...')
    proc.kill()
    event_loop.close()

It works. But I'm not able to judge whether this is a good approach.
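One thing worth adding to either approach: `proc.kill()` at the bottom of `__main__` only runs if you reach that line. To make the kill-on-exit part hold whenever the interpreter exits normally (including on an unhandled exception), you could register the kill with `atexit`. A sketch under that assumption, with a sleeping `python -c` child standing in for the real executable:

```python
import atexit
import subprocess
import sys


def launch(command):
    """Start the external process and make sure it dies with us."""
    proc = subprocess.Popen(command, stdout=subprocess.PIPE)
    # Registered callbacks run on normal interpreter exit
    # (they do NOT run if the process is hard-killed, e.g. SIGKILL).
    atexit.register(proc.kill)
    return proc


proc = launch([sys.executable, '-c', 'import time; time.sleep(60)'])
print(proc.poll() is None)   # True: child is still running

proc.kill()                  # explicit kill also works; atexit's second
proc.wait()                  # kill() is a no-op once returncode is set
print(proc.poll() is not None)
```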

Why not use a daemon thread? It will cause the thread to be killed when the main application ends:

...
t.daemon = True
t.start()
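One caveat worth noting with this answer: `daemon = True` only lets the interpreter exit without waiting for the thread; the OS process that thread spawned is not killed automatically and still needs an explicit `kill()`. A small sketch of the distinction, again with a sleeping `python -c` child as a stand-in:

```python
import subprocess
import sys
import threading
import time


def worker(holder):
    # Start the child, then block "forever"; daemon=True means the
    # interpreter may exit while this thread is still sleeping.
    holder.append(subprocess.Popen(
        [sys.executable, '-c', 'import time; time.sleep(60)']))
    time.sleep(60)


holder = []
t = threading.Thread(target=worker, args=(holder,), daemon=True)
t.start()
while not holder:           # wait until the thread has spawned the child
    time.sleep(0.01)

proc = holder[0]
print(proc.poll() is None)  # True: the child outlives the thread's fate
proc.kill()                 # still needed; daemon only affects the thread
proc.wait()
print(proc.returncode is not None)
```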
