
How does Celery store task results in Redis?

I am new to Python and Celery-Redis, so please correct me if my understanding is incorrect.

I have been debugging a code base with a structure like this:

TaskClass -> a Celery Task

HandlerClass1, HandlerClass2 -> plain Python classes extending object

The application creates a TaskClass instance, say dumyTask, and dumyTask creates Celery subtasks (I believe these subtasks are unique), say dumySubTask1 and dumySubTask2, by taking signatures of the handlers.

What I am not able to understand:

1) How does Celery manage the results of dumySubTask1, dumySubTask2 and dumyTask? I mean, the results of dumySubTask1 and dumySubTask2 should be aggregated and given as the result of dumyTask. How does Celery-Redis manage this?

2) Once a task is executed, how does Celery store the task results in the backend? I mean, will the results of dumySubTask1 and dumySubTask2 be stored in the backend, then returned to dumyTask, which in turn returns its result to the QUEUE (please correct me if I am wrong)?

3) Does Celery maintain tasks and subtasks as a STACK? Please see the snapshot: Task-SubTask Tree

Any guidance is highly appreciated. Thanks.

A Celery worker can invoke 'tasks'. A 'task' can have 'subtasks', which can be 'chained' together, i.e. invoked sequentially. 'Chain' is the term used in the Celery canvas guide. The result is then returned to the queue in Redis.

Celery workers are used to invoke 'independent tasks', mostly for 'network use cases', e.g. sending an email or hitting a URL.

You need to get it from the Celery instance with:

task = app_celery.AsyncResult(task_id)

Full example below.

My celery_worker.py file is:

import os
import time

from celery import Celery
from dotenv import load_dotenv

load_dotenv(".env")

celery = Celery(__name__)
celery.conf.broker_url = os.environ.get("CELERY_BROKER_URL")
celery.conf.result_backend = os.environ.get("CELERY_RESULT_BACKEND")


@celery.task(name="create_task")
def create_task(a, b, c):
    print(f"Executing create_task; it will take {a} seconds")
    for i in range(100):
        print(i)
    time.sleep(a)
    return b + c

I'm using FastAPI; my endpoints are:

# To execute the task
@app.get("/sum")
async def root(sleep_time: int, first_number: int, second_number: int):
    process = create_task.delay(sleep_time, first_number, second_number)
    return {"process_id": process.task_id, "result": process.result}


# To get the task status and result
from celery_worker import create_task, celery
@app.get("/task/{task_id}")
async def check_task_status(task_id: str):
    task = celery.AsyncResult(task_id)
    return {"status": task.status, "result": task.result}

My .env file has:

CELERY_BROKER_URL=redis://redis:6379/0
CELERY_RESULT_BACKEND=redis://redis:6379/0
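For reference, with the Redis result backend each finished task's state is written under a key of the form `celery-task-meta-<task_id>`, serialized as JSON by default. A small sketch of the key format and an abridged, illustrative payload shape (the exact stored document contains more fields than shown here):

```python
import json

def result_key(task_id):
    # Key format used by Celery's Redis result backend
    return f"celery-task-meta-{task_id}"

# A representative (abridged) payload, roughly what the backend stores;
# the task_id value below is purely illustrative
example_payload = json.dumps(
    {"status": "SUCCESS", "result": 7, "task_id": "some-task-id"}
)
meta = json.loads(example_payload)
```

Reading `celery-task-meta-<task_id>` back from Redis (e.g. with redis-py or redis-cli) and JSON-decoding it yields the same status/result data that `AsyncResult(task_id)` exposes.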
