There is a tracker class that just counts success, failed, pending, and started tasks via Redis.
The goal is to extend Celery so that its workers can access the group_id and keep statistics for the group. I expect an interface similar to:
def on_important_event(...):
    group_id = uuid4()
    for _ in range(count_of_jobs):
        my_task.apply_async(..., group_id=group_id)
The custom Task class would look like:
class MyTask(Task):
    # declaring group_id somehow

    def apply_async(...):
        get_tracker(self.request.group_id).task_pending()
        ...

    def before_start(...):
        get_tracker(self.request.group_id).task_started()
        ...

    def on_success(...):
        get_tracker(self.request.group_id).task_success()
        ...

    def on_failure(...):
        get_tracker(self.request.group_id).task_failed()
        ...
I could not find a way to implement the class so that it properly saves and receives the custom attribute through AMQP.
I would recommend a different approach: write a custom monitor (see the Monitoring API section in the official Celery docs; the "Real-time processing" example is a good starting point). This is basically how Flower and Leek work.
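A minimal sketch of such a monitor, following the Receiver pattern from that docs example. The counting handler here is a hypothetical simplification; `app` is assumed to be a configured `celery.Celery` instance, and workers must be started with task events enabled (e.g. `celery -A proj worker -E`):

```python
from collections import Counter


def make_counting_handler(counts):
    """Build a handler that tallies incoming Celery task events by type."""
    def handler(event):
        # Every event dict carries a 'type' key such as 'task-succeeded'.
        counts[event['type']] += 1
    return handler


def run_monitor(app):
    """Consume task events in real time (blocks; needs a running broker).

    `app` is a celery.Celery instance; workers must run with -E so that
    they emit task events.
    """
    counts = Counter()
    handler = make_counting_handler(counts)
    with app.connection() as connection:
        recv = app.events.Receiver(connection, handlers={
            'task-started': handler,
            'task-succeeded': handler,
            'task-failed': handler,
        })
        recv.capture(limit=None, timeout=None, wakeup=True)
```

Note that task events carry the task id, name, and (in task-received) string representations of args/kwargs, so recovering a group_id in the monitor still requires encoding it somewhere the events expose, such as the task kwargs.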