
Celery task results not persisted with rpc

I have been trying to route Celery task results to another process by persisting them to a queue, so that the other process can pick the results up from that queue. I have configured Celery with CELERY_RESULT_BACKEND = 'rpc', but the value returned by the Python function is still not persisted to a queue.

I am not sure whether any other configuration or code change is required. Please help.

Here is the code example:

celery.py

from __future__ import absolute_import

from celery import Celery

app = Celery('proj',
         broker='amqp://',
         backend='rpc://',
         include=['proj.tasks'])

# Optional configuration, see the application user guide.
app.conf.update(
    CELERY_RESULT_BACKEND = 'rpc',
    CELERY_RESULT_PERSISTENT = True,
    CELERY_TASK_SERIALIZER = 'json',
    CELERY_RESULT_SERIALIZER = 'json'
)

if __name__ == '__main__':
    app.start()
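
Note: the settings above use the old uppercase names. On Celery 4.x and later, the lowercase setting names are preferred; a roughly equivalent configuration sketch with the same values under the newer names might look like this:

app.conf.update(
    result_backend='rpc://',
    result_persistent=True,
    task_serializer='json',
    result_serializer='json'
)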

tasks.py

from proj.celery import app

@app.task
def add(x, y):
    return x + y

Running the Celery worker as:

celery worker --app=proj -l info --pool=eventlet -c 4
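
For reference, a minimal sketch of how the task might be invoked and its result fetched from a client process (assuming the proj package is importable on that machine):

from proj.tasks import add

result = add.delay(4, 4)       # send the task to the broker
print(result.get(timeout=10))  # block until the worker returns 8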

Solved by publishing the results back to the celeryresults queue using Pika, a Python implementation of the AMQP 0-9-1 protocol ( https://pika.readthedocs.org ).
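
For illustration, a minimal sketch of that approach with Pika, assuming RabbitMQ runs on localhost and using the celeryresults queue name mentioned above (host, durability, and message shape are assumptions, not taken from the original code):

import json
import pika

# connect to the local RabbitMQ broker (assumed host)
connection = pika.BlockingConnection(pika.ConnectionParameters('localhost'))
channel = connection.channel()

# declare the durable queue the other process will consume from
channel.queue_declare(queue='celeryresults', durable=True)

# publish the task result as a persistent JSON message
channel.basic_publish(
    exchange='',
    routing_key='celeryresults',
    body=json.dumps({'task': 'add', 'result': 8}),
    properties=pika.BasicProperties(delivery_mode=2),  # mark message persistent
)
connection.close()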
