
How can I ensure that sub-tasks of a Celery task go into the same queue as the parent task?

I have a Celery task that can potentially queue up other sub-tasks. If a worker pulls that task from a high-priority queue, and that task then queues up other tasks, I want the new tasks to go back into the same high-priority queue. But how can I programmatically get the queue that a currently-executing task came from?

I know I could pass an additional parameter to the original my_task.apply_async() call that specifies a queue to use for sub-tasks, and then thread that value through the chain of methods/classes that process the task, but that seems messy and hard to maintain. It seems like the queue information should be available simply by asking Celery.

I found that the queue information is available via current_task.request.delivery_info['exchange']. So, the solution I ended up using is as follows:

def get_source_queue(default=None):
    """
    Return the queue that the currently-executing task (if any) came from,
    or `default` when there is no task context or no queue information.
    """
    from celery import current_task
    if current_task is not None:
        # delivery_info carries the message's routing metadata;
        # 'exchange' holds the queue name under Celery's default routing.
        source_queue = current_task.request.delivery_info.get('exchange')
        if source_queue:  # skip both None and empty string
            return source_queue
    return default

and then I use that with sub-tasks like this:

my_task.apply_async(args=('my', 'args'), queue=get_source_queue(default='foo_queue'))


I don't know if that's the best way to do it; maybe there's something built right into Celery that says "use the same queue as the source task"(?). But the above works.
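One caveat worth hedging: with Celery's default direct exchanges, the 'exchange' field in delivery_info can be an empty string, and the queue name is then typically found in 'routing_key' instead. The sketch below (my own variant, not from the original answer) factors the lookup into a pure function over the delivery_info dict, so it checks both keys and can be tested without a running broker:

```python
def queue_from_delivery_info(delivery_info, default=None):
    """Best-effort extraction of the source queue name from a task's
    delivery_info dict.

    With Celery's default direct exchanges, 'exchange' may be an empty
    string while 'routing_key' carries the queue name, so check both.
    """
    if not delivery_info:
        return default
    for key in ('exchange', 'routing_key'):
        value = delivery_info.get(key)
        if value:  # skip None and empty strings
            return value
    return default
```

Inside a task you would call it as `queue_from_delivery_info(current_task.request.delivery_info, default='foo_queue')` and pass the result to `apply_async(queue=...)`, just like get_source_queue above.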
