
How to combine Celery with asyncio?

How can I create a wrapper that makes Celery tasks look like asyncio.Task? Or is there a better way to integrate Celery with asyncio?

@asksol, the creator of Celery, said this:

It's quite common to use Celery as a distributed layer on top of async I/O frameworks (top tip: routing CPU-bound tasks to a prefork worker means they will not block your event loop).

But I could not find any code examples specifically for the asyncio framework.

EDIT 01/12/2021: my previous answer (find it at the bottom) did not age well, so I have added a combination of possible solutions that may satisfy those still looking for a way to use asyncio and Celery together.

Let's quickly break down the use cases first (more in-depth analysis here: asyncio and coroutines vs task queues):

  • If the task is I/O bound then it tends to be better to use coroutines and asyncio.
  • If the task is CPU bound then it tends to be better to use Celery or other similar task management systems.

So, in the context of Python's "Do one thing and do it well", it makes sense not to try to mix asyncio and Celery together.
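
As a minimal illustration of that split (the names and broker URL below are illustrative assumptions): I/O-bound work stays in a coroutine, while CPU-bound work is handed to a Celery task.

import asyncio
from celery import Celery

app = Celery('split_example', broker='a_broker_url_goes_here')


# I/O-bound: waiting on the network suits an asyncio coroutine
async def fetch_page(url):
    await asyncio.sleep(1)  # stands in for a non-blocking HTTP call
    return f"contents of {url}"


# CPU-bound: heavy computation suits a Celery (prefork) worker
@app.task
def crunch_numbers(n):
    return sum(i * i for i in range(n))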

BUT what happens in cases where we want to be able to run a method both asynchronously and as an async task? Then we have some options to consider:

  • The best example that I was able to find is the following: https://johnfraney.ca/posts/2018/12/20/writing-unit-tests-celery-tasks-async-functions/ (and I just found out that it is @Franey's response):

    1. Define your async method.

    2. Use asgiref's async_to_sync (from the asgiref.sync module) to wrap the async method and run it synchronously inside a Celery task:

       # tasks.py
       import asyncio
       from asgiref.sync import async_to_sync
       from celery import Celery

       app = Celery('async_test', broker='a_broker_url_goes_here')


       async def return_hello():
           await asyncio.sleep(1)
           return 'hello'


       @app.task(name="sync_task")
       def sync_task():
           async_to_sync(return_hello)()
  • A use case that I came upon in a FastAPI application was the reverse of the previous example:

    1. An intense CPU bound process is hogging up the async endpoints.

    2. The solution is to refactor the CPU-bound process into a Celery task and dispatch it to the Celery queue for execution (a follow-up sketch for checking the task's status appears after this list).

    3. A minimal example for visualization of that case:

       import asyncio
       import uvicorn
       from celery import Celery
       from fastapi import FastAPI

       app = FastAPI(title='Example')
       worker = Celery('worker', broker='a_broker_url_goes_here')


       @worker.task(name='cpu_bound')
       def cpu_bound_task():
           # Does stuff but let's simplify it
           print([n for n in range(1000)])


       @app.get('/calculate')
       async def calculate():
           cpu_bound_task.delay()


       if __name__ == "__main__":
           uvicorn.run('main:app', host='0.0.0.0', port=8000)
  • Another solution seems to be what @juanra and @danius are proposing in their answers, but we have to keep in mind that performance tends to take a hit when we intermix sync and async executions, thus those answers need monitoring before we can decide to use them in a prod environment.
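
As mentioned in the FastAPI case above, here is a hedged follow-up sketch that returns the task id from the endpoint and adds a second route for polling its state; the route paths are illustrative, and reading the state assumes a result backend is configured on the Celery app:

from celery.result import AsyncResult

# modified endpoint: kick off the task and hand back its id
@app.get('/calculate')
async def calculate():
    task = cpu_bound_task.delay()
    return {'task_id': task.id}


# illustrative second endpoint for polling the task's state
@app.get('/status/{task_id}')
async def status(task_id: str):
    result = AsyncResult(task_id, app=worker)
    return {'state': result.state}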

Finally, there are some ready-made solutions that I cannot recommend (because I have not used them myself), but I will list them here:

  • Celery Pool AsyncIO, which seems to solve exactly what Celery 5.0 didn't, but keep in mind that it seems a bit experimental (version 0.2.0 as of 01/12/2021)
  • aiotasks claims to be "a Celery like task manager that distributes Asyncio coroutines" but seems a bit stale (latest commit around 2 years ago)

Well, that didn't age so well, did it? Version 5.0 of Celery did not implement asyncio compatibility, so we cannot know when, or if, this will ever be implemented... I am leaving this here for legacy reasons (as it was the answer at the time) and for comment continuation.

That will be possible from Celery version 5.0 as stated on the official site:

http://docs.celeryproject.org/en/4.0/whatsnew-4.0.html#preface

  1. The next major version of Celery will support Python 3.5 only, where we are planning to take advantage of the new asyncio library.
  2. Dropping support for Python 2 will enable us to remove massive amounts of compatibility code, and going with Python 3.5 allows us to take advantage of typing, async/await, asyncio, and similar concepts there's no alternative for in older versions.

The above was quoted from the previous link.

So the best thing to do is wait for version 5.0 to be distributed!

In the meantime, happy coding :)

This simple way worked fine for me:

import asyncio
from celery import Celery

app = Celery('tasks')

async def async_function(param1, param2):
    # more async stuff...
    pass

@app.task(name='tasks.task_name', queue='queue_name')
def task_name(param1, param2):
    asyncio.run(async_function(param1, param2))

You can wrap any blocking call into a Task using run_in_executor as described in the documentation; I also added a custom timeout to the example:

import asyncio
import functools
from concurrent.futures import ThreadPoolExecutor

# thread pool used to run the blocking Celery calls
executor = ThreadPoolExecutor()


def run_async_task(
    target,
    *args,
    timeout=60,
    **keywords
):
    # run the blocking callable in the executor and enforce a timeout
    loop = asyncio.get_event_loop()
    return asyncio.wait_for(
        loop.run_in_executor(
            executor,
            functools.partial(target, *args, **keywords)
        ),
        timeout=timeout,
    )


loop = asyncio.get_event_loop()
# dispatch the Celery task without blocking the event loop
async_result = loop.run_until_complete(
    run_async_task(your_task.delay, some_arg, some_karg="")
)
# block (inside the executor) until the task result is available
result = loop.run_until_complete(
    run_async_task(async_result.get)
)

Here is a simple helper that you can use to make a Celery task awaitable:

import asyncio
from asgiref.sync import sync_to_async

# Converts a Celery task to an async function
def task_to_async(task):
    async def wrapper(*args, **kwargs):
        delay = 0.1
        async_result = await sync_to_async(task.delay)(*args, **kwargs)
        while not async_result.ready():
            await asyncio.sleep(delay)
            delay = min(delay * 1.5, 2)  # exponential backoff, max 2 seconds
        return async_result.get()
    return wrapper

Like sync_to_async, it can be used as a direct wrapper:

from time import sleep
from celery import shared_task

@shared_task
def get_answer():
    sleep(10)  # simulate long computation
    return 42

result = await task_to_async(get_answer)()

...and as a decorator:

@task_to_async
@shared_task
def get_answer():
    sleep(10) # simulate long computation
    return 42    

result = await get_answer()

Of course, this is not a perfect solution, since it relies on polling. However, it should be a good workaround for calling Celery tasks from Django async views until Celery officially provides a better solution (see the sketch below for an example view).

EDIT 2021/03/02: added the call to sync_to_async to support eager mode.
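
For completeness, here is a minimal sketch of calling the wrapped task from a Django async view; the view name and JSON payload are illustrative assumptions:

from django.http import JsonResponse


async def answer_view(request):
    # await the Celery task without blocking the event loop
    result = await task_to_async(get_answer)()
    return JsonResponse({"answer": result})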

The cleanest way I've found to do this is to wrap the async function in asgiref.sync.async_to_sync (from asgiref):

from asyncio import sleep

from asgiref.sync import async_to_sync
from celery.task import periodic_task


async def return_hello():
    await sleep(1)
    return 'hello'


@periodic_task(
    run_every=2,
    name='return_hello',
)
def task_return_hello():
    async_to_sync(return_hello)()

I pulled this example from a blog post I wrote.

I solved the problem by combining Celery and asyncio using the celery-pool-asyncio library.
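
A minimal sketch of how that library is wired up, based on its README at the time; the monkey-patching import and the worker pool path are assumptions that may differ between versions:

import asyncio

import celery_pool_asyncio  # noqa: importing applies the library's patches (per its README)
from celery import Celery

app = Celery('async_pool_test', broker='a_broker_url_goes_here')


@app.task
async def wait_and_greet():
    await asyncio.sleep(1)
    return 'hello'

# The worker is then started with the library's pool, e.g. (per the README, may vary):
#   celery worker -A tasks -P celery_pool_asyncio:TaskPool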

A nice way to implement Celery with asyncio:

import asyncio
from celery import Celery

app = Celery()

async def async_function(param):
    print('do something')

@app.task()
def celery_task(param):
    loop = asyncio.get_event_loop()
    return loop.run_until_complete(async_function(param))

Here's my implementation of Celery handling async coroutines when necessary:

Wrap the Celery class to extend its functionality:

from celery import Celery
from inspect import isawaitable
import asyncio


class AsyncCelery(Celery):
    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.patch_task()

        if 'app' in kwargs:
            self.init_app(kwargs['app'])

    def patch_task(self):
        TaskBase = self.Task

        class ContextTask(TaskBase):
            abstract = True

            async def _run(self, *args, **kwargs):
                result = TaskBase.__call__(self, *args, **kwargs)
                if isawaitable(result):
                    # await coroutine tasks and propagate their return value
                    return await result
                return result

            def __call__(self, *args, **kwargs):
                return asyncio.run(self._run(*args, **kwargs))

        self.Task = ContextTask

    def init_app(self, app):
        self.app = app

        conf = {}
        for key in app.config.keys():
            if key[0:7] == 'CELERY_':
                conf[key[7:].lower()] = app.config[key]

        if 'broker_transport_options' not in conf and conf.get('broker_url', '')[0:4] == 'sqs:':
            conf['broker_transport_options'] = {'region': 'eu-west-1'}

        self.config_from_object(conf)


celery = AsyncCelery()
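
With the patched Task class in place, a coroutine can be registered on the celery instance above like any other task; a minimal hedged sketch (the task body is an illustrative assumption):

@celery.task()
async def my_async_task():
    await asyncio.sleep(1)
    return 'done'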

Simplest approach (IMHO) that worked out for me; the Python version used is 3.9.6.

TL;DR: Run an asyncio event loop forever in a separate thread, starting and stopping it with every Celery worker process. Submit awaitables to it as asyncio tasks, and block on their results via the Future API.

import asyncio
from threading import Thread
from celery import Celery

app = Celery("workers", backend="rpc://", broker="pyamqp://guest:guest@localhost/")


# 1. define new thread,  loop which would be launched in new thread


def thread_function(loop: asyncio.AbstractEventLoop):
    asyncio.set_event_loop(loop)
    loop.run_forever()


loop = asyncio.get_event_loop()
thread = Thread(target=thread_function, args=(loop,))
# NOTE: it's not started yet

# 2. define your async infinite tasks (optional)

# this is async iteration example
async def my_iterate_async_infinite():
    count = 0
    while True:
        await asyncio.sleep(1)
        print(f"async iteration {count=}")
        count += 1


# 3. subscribe for celery worker signals to start and stop thread

from celery.signals import worker_process_init, worker_process_shutdown


@worker_process_init.connect
def handle_worker_process_init(sender, **_):
    thread.start()
    asyncio.run_coroutine_threadsafe(my_iterate_async_infinite(), loop)


@worker_process_shutdown.connect
def handle(sender, **_):
    # uncomment and use if you need async teardown
    # asyncio.run_coroutine_threadsafe(example_call_async_teardown_function(), loop).result()

    # loop.stop() is not thread-safe when called from another thread,
    # so schedule it on the loop's own thread instead
    loop.call_soon_threadsafe(loop.stop)
    thread.join()


# 4. call async function and wait for it's result, within task


@app.task()
def my_task():
    print("Call async function within task")

    async def do_async_func():
        await asyncio.sleep(1)
        return "my_async_result"

    res = asyncio.run_coroutine_threadsafe(do_async_func(), loop=loop).result()
    print(f"{res=}")  # would be "my_async_result"

For anyone who stumbles on this looking for help specifically with async SQLAlchemy (i.e., using the asyncio extension) and Celery tasks, explicitly disposing of the engine will fix the issue. This particular example worked with asyncpg.

Example:

import contextlib
from typing import AsyncGenerator

from asgiref.sync import async_to_sync
from sqlalchemy.ext.asyncio import (
    AsyncSession,
    create_async_engine,
)
from sqlalchemy.orm import sessionmaker


engine = create_async_engine("some_uri", future=True)
async_session_factory = sessionmaker(engine, expire_on_commit=False, class_=AsyncSession)


@celery_app.task(name="task-name")
def sync_func() -> None:
    async_to_sync(some_func)()


async def some_func() -> None:
    async with get_db_session() as session:
        result = await some_db_query(session)
    # engine.dispose will be called on exit


@contextlib.asynccontextmanager
async def get_db_session() -> AsyncGenerator:
    try:
        db = async_session_factory()
        yield db
    finally:
        await db.close()
        await engine.dispose()
