
How to profile memory usage of a Celery task?

I have a Django application that runs background tasks using the Celery library, and I need to obtain and store the maximum memory usage of a task.

I've tried memory_usage from the memory_profiler library, but I can't use this function inside a task because I get the error "daemonic processes are not allowed to have children". I've also tried calling memory_usage outside the task, to monitor the apply_async call, but for some reason the task is triggered twice.

All the other approaches I've found consist of checking the memory usage at different points in the code and then taking the maximum, but I suspect this is quite inaccurate: some calls with high memory usage are probably missed entirely because garbage collection runs before I get a chance to check the current usage.
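For concreteness, the checkpoint pattern I mean looks roughly like this (a sketch only; psutil, the step functions, and the checkpoint placement are my own illustration):

import os
import psutil

from celery import shared_task

process = psutil.Process(os.getpid())

@shared_task
def my_task():  # hypothetical task
    peak = process.memory_info().rss  # RSS before the work starts
    step_one()                        # placeholder workload
    peak = max(peak, process.memory_info().rss)
    step_two()
    peak = max(peak, process.memory_info().rss)
    # Anything allocated and freed between two checks never shows up here.
    return peak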

The official documentation has some useful functions, but they would still rely on the method above: https://docs.celeryproject.org/en/latest/reference/celery.utils.debug.html
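For example, sample_mem() and memdump() from that module record the RSS at explicit checkpoints, which is exactly the manual approach described above (a sketch; the task body and step functions are hypothetical):

from celery import shared_task
from celery.utils.debug import memdump, sample_mem

@shared_task
def my_task():    # hypothetical task
    sample_mem()  # record current RSS
    step_one()    # placeholder workload
    sample_mem()
    step_two()
    sample_mem()
    memdump()     # log a summary of the collected RSS samples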

Thanks in advance!

Why not a controller task?

Celery's control infrastructure lets you query the current status of all workers:

from celery import Celery

app = Celery(...)  # broker/backend configuration elided
# Returns a mapping of worker name -> list of dicts describing the
# tasks that worker is currently executing.
app.control.inspect().active()

This can be used inside a task to poll the cluster every N seconds and understand what's happening; a rough sketch follows below. I've used a similar approach to identify tasks and send a kill() command between them. My tasks are killable, so each of them knows how to handle the soft kill.
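A sketch of such a controller task, assuming the monitor runs on the same host as the workers (the broker URL, the poll_memory name, and the record_usage persistence hook are assumptions for illustration; the worker_pid field is part of what inspect().active() reports for each running task in recent Celery versions):

import psutil
from celery import Celery

app = Celery("monitor", broker="redis://localhost:6379/0")  # hypothetical broker URL

@app.task
def poll_memory():
    # Ask every worker for the tasks it is currently executing.
    active = app.control.inspect().active() or {}
    for worker, tasks in active.items():
        for task in tasks:
            pid = task.get("worker_pid")  # pool process running the task
            if pid is None:
                continue
            try:
                rss = psutil.Process(pid).memory_info().rss
            except psutil.NoSuchProcess:
                continue  # task finished between inspect() and here
            record_usage(task["id"], rss)  # hypothetical: persist the running max per task id

Scheduled with celery beat every few seconds, this keeps a running maximum per task id; the trade-off is that short allocation spikes between polls can still be missed.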
