
Access partial results of a Celery task

I'm not a Python expert, but I'm trying to develop some long-running Celery-based tasks whose partial results I can access instead of waiting for the tasks to finish.

As you can see in the code below, given a multiplier and an initial and final range, the worker creates a list of size final_range - initial_range + 1.

from celery import Celery
app = Celery('trackers', backend='amqp', broker='amqp://')

@app.task
def worker(value, initial_range, final_range):
    if initial_range < final_range:
        list_values = []
        for index in range(initial_range, final_range + 1):
            list_values.append(value * index)
        return list_values
    else:
        return None

So, instead of waiting for the four tasks below to finish, I would like to access the to-be-returned values (list_values) before they are actually returned.

from trackers import worker

res_1 = worker.delay(3, 10, 10000000)
res_2 = worker.delay(5, 1, 20000000)
res_3 = worker.delay(7, 20, 50000000)
res_4 = worker.delay(9, 55, 99999999)

First of all, is it possible? If so, what sort of changes do I have to perform to make it work?

You absolutely need to use external storage such as SQL or Redis/Memcached, because in the general case different tasks can be executed on different servers.

So in your example you should store list_values in some DB and update it during the loop.
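As a minimal sketch of that idea (the Redis connection settings and the key naming scheme below are my own assumptions, not something Celery provides), the worker can push each value into a Redis list keyed by its task id as it loops, so any client can read whatever has been stored so far:

import redis
from celery import Celery

app = Celery('trackers', backend='amqp', broker='amqp://')

# Plain Redis client used only to publish partial results (assumed to run locally).
store = redis.Redis(host='localhost', port=6379, db=0)

@app.task(bind=True)
def worker(self, value, initial_range, final_range):
    if initial_range >= final_range:
        return None
    key = 'partial:%s' % self.request.id  # hypothetical key scheme
    list_values = []
    for index in range(initial_range, final_range + 1):
        result = value * index
        list_values.append(result)
        store.rpush(key, result)  # make the value visible immediately
    return list_values

On the caller's side, the values produced so far can then be fetched while the task is still running, for example with store.lrange('partial:%s' % res_1.id, 0, -1).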

I suggest you split the work performed in each task.
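One way to do that (the chunking helper and chunk size below are illustrative assumptions, not from the answer) is to cut the big range into sub-ranges and dispatch them as a Celery group, so each sub-task returns its small list quickly and finished pieces can be collected as they arrive:

from celery import group
from trackers import worker

def split_range(initial_range, final_range, chunk_size=100000):
    # Yield (start, end) pairs that cover the full range in smaller pieces.
    start = initial_range
    while start <= final_range:
        end = min(start + chunk_size - 1, final_range)
        yield start, end
        start = end + 1

job = group(worker.s(3, start, end) for start, end in split_range(10, 10000000))
result = job.apply_async()

# Collect the sub-lists that are already finished instead of blocking on the whole range.
finished_parts = [r.get() for r in result.results if r.ready()]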

If you need the first values to compute the values later in the list, then you can chain the tasks via the link parameter. See http://docs.celeryproject.org/en/latest/userguide/canvas.html#callbacks
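A minimal sketch of that callback idea, assuming the task module exports app and using a hypothetical follow-up task named process_partial that receives the finished list as its first argument:

from trackers import app, worker

@app.task
def process_partial(list_values):
    # Runs after worker has returned; list_values is worker's return value.
    return sum(list_values)

# link= attaches process_partial as a callback; the parent task's result is
# prepended to the callback's arguments.
worker.apply_async((3, 10, 10000000), link=process_partial.s())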
