
How does Celery store task results in Redis?

I am new to Python and Celery-Redis, so please correct me if my understanding is incorrect.

I have been debugging a code base which has a structure like this:

TaskClass -> Celery Task

HandlerClass1, HandlerClass2 -> These are plain Python classes extending object

The application creates a TaskClass instance, say dumyTask, and dumyTask creates Celery subtasks (I believe these subtasks are unique), say dumySubTask1 and dumySubTask2, by taking the signatures of the handlers.

What I am not able to understand:

1) How does Celery manage the results of dumySubTask1, dumySubTask2, and dumyTask? I mean, the results of dumySubTask1 and dumySubTask2 should be aggregated and given as the result of dumyTask. How does Celery-Redis manage this?

2) Once a task is executed, how does Celery store the task result in the backend? I mean, will the results of dumySubTask1 and dumySubTask2 be stored in the backend, then returned to dumyTask, and then will dumyTask return its result to the queue (please correct me if I am wrong)?

3) Does Celery maintain tasks and subtasks as a STACK? Please see the snapshot: Task-SubTask Tree.

Any guidance is highly appreciated. Thanks.

A Celery worker can invoke 'tasks'. A 'task' can have 'subtasks', which can be 'chained' together, i.e. invoked sequentially. 'Chain' is the term used specifically in the Celery canvas guide. The result is then stored in the Redis result backend.

Celery workers are also used to invoke 'independent tasks', mostly for network use cases, e.g. sending an email or hitting a URL.

You need to fetch the result from the Celery app instance with:

task = app_celery.AsyncResult(task_id)

A full example is below.

My celery_worker.py file is:

import os
import time

from celery import Celery
from dotenv import load_dotenv

load_dotenv(".env")

celery = Celery(__name__)
celery.conf.broker_url = os.environ.get("CELERY_BROKER_URL")
celery.conf.result_backend = os.environ.get("CELERY_RESULT_BACKEND")


@celery.task(name="create_task")
def create_task(a, b, c):
    print(f"Executing create_task; it will take {a} seconds")
    for i in range(100):
        print(i)
    time.sleep(a)
    return b + c

I'm using FastAPI; my endpoints are:

from celery_worker import celery, create_task


# To execute the task
@app.get("/sum")
async def root(sleep_time: int, first_number: int, second_number: int):
    process = create_task.delay(sleep_time, first_number, second_number)
    # process.result is usually still None here; the task has only
    # just been queued, not finished.
    return {"process_id": process.task_id, "result": process.result}


# To get the task status and result
@app.get("/task/{task_id}")
async def check_task_status(task_id: str):
    task = celery.AsyncResult(task_id)
    return {"status": task.status, "result": task.result}

My .env file has:

CELERY_BROKER_URL=redis://redis:6379/0
CELERY_RESULT_BACKEND=redis://redis:6379/0
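To make the storage model concrete: with a Redis result backend, Celery writes each finished task's state under a key named celery-task-meta-<task_id>, serialized with the result serializer (JSON by default) and expiring after result_expires (one day by default). A sketch of what such a stored document looks like (the task id and values here are illustrative, not from a real run):

```python
import json
import uuid

# Hypothetical task id; real ids are UUIDs that Celery generates.
task_id = str(uuid.uuid4())

# The key Celery's Redis result backend uses for this task's result.
key = f"celery-task-meta-{task_id}"

# Roughly the JSON document stored under that key once the task
# succeeds (field values here are made up for illustration).
stored = json.dumps({
    "status": "SUCCESS",
    "result": 7,
    "traceback": None,
    "children": [],
    "task_id": task_id,
})

print(key)
```

You can check this against a live setup with redis-cli, e.g. KEYS 'celery-task-meta-*', and GET on one of the keys to see the stored JSON.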
