
Collecting results from celery worker with asyncio

I have a Python application that offloads a number of processing jobs to a set of celery workers. The main application then has to wait for results from these workers. As and when a result is available from a worker, the main application processes it and schedules more workers for execution.

I would like the main application to run in a non-blocking fashion. At the moment, I have a polling function that checks whether results are available from any of the workers.

I am looking at the possibility of using asyncio to get notified about result availability, so that I can avoid the polling. But I could not find any information on how to do this.

Any pointers on this will be highly appreciated.

PS: I know that with gevent I can avoid the polling. However, I am on Python 3.4 and hence would prefer to avoid gevent and use asyncio.
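For context, here is a minimal sketch of the kind of polling loop I want to replace (a stub stands in for celery's `AsyncResult`, just to keep the example self-contained):

```python
import time


class FakeAsyncResult:
    """Stand-in for celery's AsyncResult, for illustration only."""

    def __init__(self, result, ticks_until_ready=3):
        self._result = result
        self._ticks = ticks_until_ready

    def ready(self):
        # pretend the worker finishes after a few polls
        self._ticks -= 1
        return self._ticks <= 0

    def get(self):
        return self._result


def poll_for_results(pending):
    """Busy-wait until every worker result is available -- the part I want to avoid."""
    collected = []
    while pending:
        for res in list(pending):
            if res.ready():
                collected.append(res.get())
                pending.remove(res)
        time.sleep(0.01)  # the polling interval I would like to eliminate
    return collected


print(poll_for_results([FakeAsyncResult(6), FakeAsyncResult(7)]))  # → [6, 7]
```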

You must be looking for asyncio.as_completed(coros). It returns an iterator that yields results as and when they become ready, in the order in which the coroutines complete. You might also want to see how it differs from asyncio.gather(*coros), which returns only once everything submitted to it has finished.

import asyncio
from asyncio import coroutine


@coroutine
def some_work(x, y):
    print("doing some background work")
    yield from asyncio.sleep(1.0)
    return x * y


@coroutine
def some_other_work(x, y):
    print("doing some background other work")
    yield from asyncio.sleep(3.0)
    return x + y


@coroutine
def as_when_completed():
    # give me results as and when they are ready
    coros = [some_work(2, 3), some_other_work(2, 3)]
    for future in asyncio.as_completed(coros):
        res = yield from future
        print(res)


@coroutine
def when_all_completed():
    # when everything is complete
    coros = [some_work(2, 3), some_other_work(2, 3)]
    results = yield from asyncio.gather(*coros)
    print(results)


if __name__ == '__main__':
    loop = asyncio.get_event_loop()
    # loop.run_until_complete(when_all_completed())
    loop.run_until_complete(as_when_completed())
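On newer Python (3.7+) the same pattern can be written with async/await and asyncio.run; behaviour is identical, shown here for comparison:

```python
import asyncio


async def some_work(x, y):
    await asyncio.sleep(0.1)
    return x * y


async def some_other_work(x, y):
    await asyncio.sleep(0.3)
    return x + y


async def as_when_completed():
    # collect results in the order the coroutines finish
    coros = [some_work(2, 3), some_other_work(2, 3)]
    results = []
    for future in asyncio.as_completed(coros):
        results.append(await future)
    return results


print(asyncio.run(as_when_completed()))  # → [6, 5]
```

The faster coroutine (the multiplication, sleeping 0.1 s) is yielded first, which is exactly the "as and when completed" behaviour the question asks for.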

I implemented the on_finish function of the celery worker to publish a message to Redis.

Then the main app uses aioredis to subscribe to the channel; once it gets notified, the result is ready.
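The subscriber side of this approach might look like the following sketch. It assumes aioredis >= 2.0 and a running local Redis server; the channel name "celery-results" and the payload (a task id) are illustrative choices, not anything celery provides:

```python
import asyncio


async def listen_for_results(url="redis://localhost"):
    # Sketch only: assumes aioredis >= 2.0 and a reachable Redis server.
    # The import is deferred so the function can be defined without the
    # dependency installed.
    import aioredis

    redis = aioredis.from_url(url)
    pubsub = redis.pubsub()
    await pubsub.subscribe("celery-results")
    async for message in pubsub.listen():
        if message["type"] == "message":
            task_id = message["data"].decode()
            print("result ready for task", task_id)
            # fetch the result from the celery result backend here and
            # schedule the next batch of workers -- no polling involved
```

Compared with polling, the event loop simply sleeps inside pubsub.listen() until the worker publishes, so the main application stays fully non-blocking.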
