
Celery task does not appear in queue after task.apply_async using countdown

I hope this question qualifies for Stack Overflow.

I'm working with Celery 4.2.0, with Redis as both broker and backend.

I have the following task:

@shared_task()
def add(a, b):
    return a+b

While a worker is active, running the following command:

add.apply_async((1, 2), countdown=60)

Results in the task not appearing in the default celery queue, although it is still executed after the period of time stated in countdown.

Why is that, and how can I see all pending tasks? The following would have worked if the task had been placed on the queue:

    with celery_app.pool.acquire(block=True) as conn:
        tasks = conn.default_channel.client.lrange('celery', 0, -1)
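Tasks scheduled with countdown/eta are held by the worker rather than left on the broker list, so they are visible through Celery's remote inspect API instead of LRANGE. A minimal sketch of how to list them, assuming the shape that inspect().scheduled() returns in Celery 4.x (a mapping of worker name to scheduled entries); the sample payload below is illustrative:

```python
# Sketch: list tasks a worker is holding because of countdown/eta.
# inspect().scheduled() returns {worker_name: [entry, ...]}; each entry's
# "request" dict carries the task id and name (shape may vary by version).

def pending_task_ids(scheduled):
    """Flatten the per-worker mapping from inspect().scheduled() to task ids."""
    ids = []
    for entries in (scheduled or {}).values():
        for entry in entries:
            ids.append(entry["request"]["id"])
    return ids

# With a live worker (requires the Celery app from the question):
#   scheduled = celery_app.control.inspect().scheduled()
#   print(pending_task_ids(scheduled))

# Offline demonstration with an illustrative payload:
sample = {"worker1@host": [{"eta": "2018-01-01T00:00:00",
                            "request": {"id": "abc123", "name": "add"}}]}
print(pending_task_ids(sample))  # -> ['abc123']
```

There is also inspect().reserved() for tasks a worker has prefetched but not yet started, which is the other place "missing" queue entries end up.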

If I terminate the worker before the task has started, I get the following:

[WARNING/MainProcess] Restoring 1 unacknowledged message(s)

This tells me the task is kept somewhere other than the queue, but I cannot figure out where.

It turns out this is actually the expected behavior: workers prefetch pending tasks before executing them (to speed things up), which removes them from the queue.

To achieve what I wanted, which is to prevent workers from taking tasks off the queue before actually starting work on them (and thereby losing track of those tasks), I had to use two Celery settings together:

task_acks_late = True
worker_prefetch_multiplier = 1
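For context, a minimal sketch of applying both settings on the app, assuming a local Redis broker/backend as in the question (the URLs are placeholders):

```python
from celery import Celery

# Minimal sketch; broker/backend URLs are assumptions for a local Redis setup.
app = Celery(
    "tasks",
    broker="redis://localhost:6379/0",
    backend="redis://localhost:6379/0",
)

app.conf.update(
    task_acks_late=True,           # acknowledge only after the task finishes
    worker_prefetch_multiplier=1,  # each worker process prefetches at most one message
)
```

With prefetching effectively disabled this way, a worker only takes a message when it is about to execute it.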

This way a task is removed from its original queue but still exists in a structure called 'unacked', which allows me to monitor it (see the 'Optimizing' section of the Celery documentation).
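To peek at those messages, you can read that structure directly. A sketch, assuming kombu's Redis transport, which stores unacknowledged deliveries as JSON values in a Redis hash named 'unacked'; the exact stored payload shape can vary between Celery/kombu versions, so the sample below is simplified:

```python
import json

def unacked_task_names(unacked_hash):
    """Extract task names from the raw 'unacked' Redis hash.

    Each value is a JSON list whose first element is the stored message;
    for Celery's message protocol the headers carry the task name.
    """
    names = []
    for raw in unacked_hash.values():
        message = json.loads(raw)[0]
        names.append(message["headers"]["task"])
    return names

# With a live broker (requires the redis-py package):
#   import redis
#   print(unacked_task_names(redis.Redis().hgetall("unacked")))

# Offline demonstration with a simplified stored payload:
sample = {b"tag1": json.dumps([{"headers": {"task": "add"}}, "", "celery"])}
print(unacked_task_names(sample))  # -> ['add']
```

Note this hash is a transport implementation detail, so monitoring it is best treated as a debugging aid rather than a stable API.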

Be advised that the acks_late setting has side effects you need to be aware of: if a worker terminates unexpectedly, the task will be restored and re-run the next time the worker comes back up.
