
Celery rate_limit affecting multiple tasks

I have a setup with rabbitmq and celery, with workers running on 4 machines with 4 instances each. I have two task functions defined that basically call the same backend function: one named process_transaction with no rate_limit defined, and another called slow_process_transaction with rate_limit="6/m". The tasks go to different queues on rabbitmq, slow and normal.
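For concreteness, a minimal sketch of that setup (the broker URL, the shared backend function do_transaction, and the call-site routing are illustrative stand-ins, not my actual code, and it assumes modern Celery syntax):

from celery import Celery

app = Celery("transactions", broker="amqp://guest@localhost//")

def do_transaction(payload):
    ...  # shared backend logic both tasks call (placeholder)

@app.task
def process_transaction(payload):
    return do_transaction(payload)

@app.task(rate_limit="6/m")
def slow_process_transaction(payload):
    return do_transaction(payload)

# Callers pick the queue explicitly, e.g.:
#   process_transaction.apply_async(args=[payload], queue="normal")
#   slow_process_transaction.apply_async(args=[payload], queue="slow")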

The strange thing is that the rate_limit is being enforced for both tasks. If I try to change the rate limit using celery.control.rate_limit, doing it with the process_transaction name doesn't change the effective rate, while using the slow_process_transaction name changes the effective rate for both tasks.
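The control call I'm using looks roughly like this (a hedged sketch; the fully qualified task names depend on your module layout):

from celery import Celery

app = Celery("transactions", broker="amqp://guest@localhost//")

# Broadcast a new rate limit for one task type to all connected workers.
app.control.rate_limit("tasks.slow_process_transaction", "10/m")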

Any ideas on what is wrong?

By reading the token bucket source code I figured out that celery implements rate limiting by sleeping for the remaining time delta after finishing a task, so if you mix tasks with different rate limits in the same workers, they affect each other.
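To illustrate the interference (a simplified sketch, not celery's actual implementation): the worker's consumer loop sleeps until its bucket allows another execution, and that sleep delays whatever task happens to be next in that worker, regardless of its own rate limit.

import time

class Bucket:
    """Simplified limiter: allow at most one task every `interval` seconds."""
    def __init__(self, interval):
        self.interval = interval
        self.last_run = float("-inf")

    def wait_time(self):
        return max(0.0, self.interval - (time.monotonic() - self.last_run))

    def mark_run(self):
        self.last_run = time.monotonic()

def worker_loop(bucket, tasks):
    # One loop per worker process: the sleep below stalls every task this
    # worker pulls next, not only the rate-limited ones.
    for task in tasks:
        time.sleep(bucket.wait_time())
        task()
        bucket.mark_run()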

Separating the workers solved my problem, but it's not the optimal solution.

You can separate the workers by using node names and per-node parameters on your celeryd-multi call. For instance, suppose you have nodes 'fast' and 'slow', and you want them to consume separate queues with concurrency 5 and 1 respectively:

celeryd-multi start fast slow <other opts> -Q:fast fast_queue -c:fast 5 -Q:slow slow_queue -c:slow 1
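For reference, on current Celery releases the same per-node syntax is available through the celery multi command; this is an assumed modern equivalent of the command above (the -A proj application argument is a placeholder):

celery multi start fast slow -A proj \
    -Q:fast fast_queue -c:fast 5 \
    -Q:slow slow_queue -c:slow 1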
