
Celery: generating group tasks from a chain task

I am trying to chain the following tasks with Celery (v4.0):

task = group([tasks1.s(), task2.s()]) | generate_job_requests.s() | execute_job.map() | aggregate_result.s()
result = task.get()

The part above works fine up to generate_job_requests, as a chord. But the problem starts with execute_job, which receives a list of jobs from generate_job_requests; for each job I need to create a parallel task and later aggregate the results of all jobs.

I am trying to validate whether such a task graph is possible with Celery. Is there an alternative workflow that could solve a problem with this kind of dependency? Is there anything I am missing in the documentation?
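For clarity, the data flow being asked for can be sketched without Celery at all. The sketch below uses plain functions and a thread pool as stand-ins for the five tasks; the function bodies are hypothetical, only the shape of the graph mirrors the question:

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical stand-ins for the Celery tasks in the question.
def tasks1():            # header task 1
    return 'a'

def task2():             # header task 2
    return 'b'

def generate_job_requests(header_results):
    # Produce one job request per header result (illustrative only).
    return [f'job-{r}' for r in header_results]

def execute_job(req):
    return req.upper()

def aggregate_result(results):
    return sorted(results)

# group -> chord callback -> fan-out over the runtime list -> aggregate
headers = [tasks1(), task2()]
jobs = generate_job_requests(headers)
with ThreadPoolExecutor() as pool:
    executed = list(pool.map(execute_job, jobs))
result = aggregate_result(executed)
# result == ['JOB-A', 'JOB-B']
```

The difficulty with Celery is that the fan-out step (the `pool.map` line) depends on a list that only exists at runtime, after generate_job_requests has finished, which is exactly what a statically declared canvas cannot express directly.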

I used map-like functionality with an intermediate task creator that acts like a chord:

from celery import group, shared_task, subtask

@shared_task(ignore_result=False)
def dmap(it, callback, end_task):
    # Clone the callback signature once per item, run the clones as a
    # group, and pipe the collected results into the end task.
    callback = subtask(callback)
    grp = group(callback.clone([arg, ]) for arg in it)
    c = (grp | end_task)
    return c()
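Stripped of the Celery machinery, dmap's control flow is simply "apply the callback to each item, then hand the collected results to the end task". A broker-free Python sketch of that reduction, with plain callables standing in for signatures:

```python
def dmap_local(it, callback, end_task):
    # Same shape as dmap: one callback call per item (the group),
    # then the end task over the collected results (the chord body).
    return end_task([callback(arg) for arg in it])

# e.g. doubling each job value, then summing the results
total = dmap_local([1, 2, 3], lambda x: x * 2, sum)
# total == 12
```

The Celery version differs in that the group runs asynchronously on workers, so dmap returns a result handle rather than the final value, which is why the extra unwrapping below is needed.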

So the task flow was reduced to this:

task = (group([tasks1.s(), task2.s()]) | generate_job_requests.s() | dmap.s(
        execute_job.s(), aggregate_result.s())).apply_async()

To get the ultimate output of the task, I made a few tweaks:

# task.id here is the id of the dmap task
dmap_task = celery_app.AsyncResult(task.id)
dmap_result = dmap_task.get()
# Extract the actual aggregate_result task id from the nested result
aggr_res_task_id = dmap_result[0][0]
result = celery_app.AsyncResult(aggr_res_task_id)
# This returns the actual output of the overall task
result.get()

I referred to this answer.
