I'm close to getting Celery to work with my Django + Docker Compose project, but I have a problem where the worker never recognizes the task given to it. The basic idea is that I have a function insertIntoDatabase that is called from a task.

myapp/tasks.py:

    @task(name='tasks.db_ins')
    def db_ins_task(datapoints, user, description):
        from utils.db.databaseinserter import insertIntoDatabase
        insertIntoDatabase(datapoints, user, description)
And in views.py, I do:

    from .tasks import db_ins_task
    ...
    db_ins_task.delay(datapoints, user, description)
datapoints is basically a list of dictionaries, and user and description are just strings. The problem is that when the Celery worker container starts, db_ins_task is never found as one of the listed tasks, so when I try to upload anything to my website, I get the following sort of error:
worker_1 | [2015-09-25 19:38:00,205: ERROR/MainProcess] Received unregistered task of type u'tasks.db_ins'.
worker_1 | The message has been ignored and discarded.
worker_1 |
worker_1 | Did you remember to import the module containing this task?
worker_1 | Or maybe you are using relative imports?
worker_1 | Please see http://bit.ly/gLye1c for more information.
...
worker_1 | Traceback (most recent call last):
worker_1 | File "/usr/local/lib/python2.7/site-packages/celery/worker/consumer.py", line 455, in on_task_received
worker_1 | strategies[name](message, body,
worker_1 | KeyError: u'tasks.db_ins'
I've been trying to get the worker to recognize the task, including adding this setting to settings.py:

    CELERY_IMPORTS = ('myapp.tasks',)
I added some debug logging to tasks.py to make sure it wasn't being missed entirely, and I can confirm that every time I try to run the task, the logger reports that tasks.py is being run. For reference, here's the worker container in docker-compose.yml:
    worker:
      build: .
      links:
        - redis
      command: bash -c "celery -A myproj worker --app=taskman.celery --loglevel=DEBUG"
celery.py is in a separate app named taskman. What exactly am I not doing right that would be causing this error with the tasks?
In your question you show starting your worker with:

    celery -A myproj worker --app=taskman.celery --loglevel=DEBUG
Now, the problem is that -A and --app mean the same thing, which suggests you've been waffling between using myproj and taskman.celery as the holder of your Celery app. Your worker uses taskman.celery, because from testing I've found that if any combination of -A or --app options is given to a single worker invocation, only the last one is used.
That being said, there is one way I can imagine your problem happening: if your myapp/tasks.py file gets the task decorator from myproj.celery.app rather than taskman.celery.app, you'd be registering your tasks with the wrong app.