
Accessing Celery Group Task Results inside a Celery Worker

I have to spawn several tasks and have them execute in parallel. However, I also need all of their results collected and updated centrally.

Is it possible to access the results of all these tasks within a parent task somehow? I know I can't call task_result.get() from a task, since Celery doesn't allow it by default; is there any other way to achieve this?

You can make Celery wait for the result of a subtask (see the disable_sync_subtasks parameter to get()); it's just not recommended, because you could deadlock the worker (see the Celery documentation for more details). So if you use it, you should know what you are doing.

The recommended way for your use case is to use a chord:

A chord is just like a group but with a callback. A chord consists of a header group and a body, where the body is a task that should execute after all of the tasks in the header are complete.

This would indeed require you to refactor your logic a bit, so that you don't need the subtasks' results inside the parent task but instead process them in the chord's body.
