I want to add multiple tasks to the Celery queue and wait for the results. I have various ideas for achieving this using some form of shared storage (memcached, redis, a database, etc.), but I would have thought this is something Celery can handle automatically, and I can't find any resources on it online.
Code example:

def do_tasks(b):
    for a in b:
        c.delay(a)
    return c.all_results_some_how()
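For reference, c above is assumed to be an ordinary Celery task; a minimal sketch of that assumed setup (the app name, broker URL, and task body are placeholders):

from celery import Celery

# hypothetical app; broker/backend URLs are placeholders
app = Celery('tasks',
             broker='redis://localhost:6379/0',
             backend='redis://localhost:6379/0')

@app.task
def c(a):
    # stand-in body; the real task does whatever work is needed
    return a * 2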
For Celery >= 3.0, TaskSet is deprecated in favour of group.
from celery import group
from tasks import add

job = group([
    add.s(2, 2),
    add.s(4, 4),
    add.s(8, 8),
    add.s(16, 16),
    add.s(32, 32),
])
Start the group in the background:
result = job.apply_async()
Wait for the results:
result.join()
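join() returns the results in the same order as the signatures in the group, so for the example above:

results = result.join()
print(results)  # [4, 8, 16, 32, 64]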
Task.delay returns an AsyncResult. Use AsyncResult.get to get the result of each task. To do that, you need to keep references to the tasks.
def do_tasks(b):
    tasks = []
    for a in b:
        tasks.append(c.delay(a))
    return [t.get() for t in tasks]
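Note that get() blocks until each task finishes, and the results come back in submission order. AsyncResult.get also accepts a timeout if you want to bound the wait; a sketch of the same helper with that option:

def do_tasks(b, timeout=None):
    tasks = [c.delay(a) for a in b]
    # get() raises celery.exceptions.TimeoutError if a task
    # takes longer than the given timeout
    return [t.get(timeout=timeout) for t in tasks]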
Or you can use ResultSet:

UPDATE: ResultSet is deprecated, please see @laffuste's answer.
from celery.result import ResultSet

def do_tasks(b):
    rs = ResultSet([])
    for a in b:
        rs.add(c.delay(a))
    return rs.get()
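Per the update above, the same helper rewritten with group (a minimal sketch):

from celery import group

def do_tasks(b):
    # group collects the signatures; join() gathers the ordered results
    return group(c.s(a) for a in b).apply_async().join()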
I have a hunch you don't really want delay but the async feature of Celery. I think you really want a TaskSet:
from celery.task.sets import TaskSet
from someapp.tasks import sometask

def do_tasks(b):
    job = TaskSet([sometask.subtask((a,)) for a in b])
    result = job.apply_async()
    # might want to handle result.successful() == False
    return result.join()