
Celery + Redis losing connection

I have a very simple Celery task that runs a (long-running) shell script:

import os
from celery import Celery

os.environ['CELERY_TIMEZONE'] = 'Europe/Rome'
os.environ['TIMEZONE'] = 'Europe/Rome'

app = Celery('tasks', backend='redis', broker='redis://OTHER_SERVER:6379/0')

@app.task(name='ct.execute_script')
def execute_script(command):
    return os.system(command)

I have this task running on server MY_SERVER, and I launch it from OTHER_SERVER, where the Redis database is also running. The task seems to run successfully (I see the result of executing the script on the filesystem), but then I always start getting the following error:

INTERNAL ERROR: ConnectionError('Error 111 connecting to localhost:6379. Connection refused.',)

What could it be? Why is it trying to contact localhost when I've set the Redis server to redis://OTHER_SERVER:6379/0, and that setting clearly works (since the task is launched)? Thanks

When you set the backend argument, Celery uses it as the result backend.
In your code, backend='redis' contains no hostname, so Celery falls back to the default and tries to store results in a Redis server on localhost.

You see the ConnectionError because Celery can't save the result to a local Redis server (none is running on MY_SERVER).
You can disable the result backend, start a local Redis server, or set the backend to the full URL of OTHER_SERVER, just like the broker.
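
For example, a corrected configuration might look like this (a minimal sketch assuming the same hosts as in the question; OTHER_SERVER is the placeholder Redis host, and ignore_result is the standard Celery task option for skipping the result backend):

import os
from celery import Celery

# Give the result backend a full URL so it does not default to localhost;
# both broker and backend now point at the Redis instance on OTHER_SERVER.
app = Celery(
    'tasks',
    backend='redis://OTHER_SERVER:6379/0',
    broker='redis://OTHER_SERVER:6379/0',
)

# Alternative: if the return value of os.system() is not needed,
# skip the result backend entirely for this task.
@app.task(name='ct.execute_script', ignore_result=True)
def execute_script(command):
    return os.system(command)

With either change, the worker on MY_SERVER no longer tries to open a connection to a Redis server on its own localhost.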

