Django Celery Periodic Tasks Run But RabbitMQ Queues Aren't Consumed

Question

After running tasks via celery's periodic task scheduler, beat, why do I have so many unconsumed queues remaining in RabbitMQ?

Setup

  • Django web app running on Heroku
  • Tasks scheduled via celery beat
  • Tasks run via celery worker
  • Message broker is RabbitMQ from CloudAMQP

Procfile

web: gunicorn --workers=2 --worker-class=gevent --bind=0.0.0.0:$PORT project_name.wsgi:application
scheduler: python manage.py celery worker --loglevel=ERROR -B -E --maxtasksperchild=1000
worker: python manage.py celery worker -E --maxtasksperchild=1000 --loglevel=ERROR

settings.py

import datetime

CELERYBEAT_SCHEDULE = {
    'do_some_task': {
        'task': 'project_name.apps.appname.tasks.some_task',
        # Run every 15 minutes
        'schedule': datetime.timedelta(seconds=60 * 15),
        'args': (),
    },
}

tasks.py

@celery.task
def some_task():
    # Get some data from external resources
    # Save that data to the database
    # No return value specified
    pass

Result

Every time the task runs, I get (via the RabbitMQ web interface):

  • An additional message in the "Ready" state under my "Queued Messages"
  • An additional queue with a single message in the "ready" state
    • This queue has no listed consumers
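
The same queue state can be checked programmatically. Below is a hypothetical sketch against the RabbitMQ management HTTP API (which CloudAMQP exposes); the host, user, and password are placeholders, not values from this setup:

# Hypothetical sketch: list queues holding ready messages but having no
# consumers, i.e. the orphaned queues described in the list above.
import requests

MGMT_URL = 'https://example.cloudamqp.com/api/queues'     # placeholder host
resp = requests.get(MGMT_URL, auth=('user', 'password'))  # placeholder creds
resp.raise_for_status()

for queue in resp.json():
    if queue.get('consumers', 0) == 0 and queue.get('messages_ready', 0) > 0:
        print(queue['name'], queue['messages_ready'])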

Answer

It ended up being my setting for CELERY_RESULT_BACKEND.

Previously, it was:

CELERY_RESULT_BACKEND = 'amqp'

I no longer had unconsumed messages / queues in RabbitMQ after I changed it to:

CELERY_RESULT_BACKEND = 'database'

What was happening, it would appear, is that after a task executed, celery was sending info about that task back via RabbitMQ, but nothing was set up to consume these response messages, so a bunch of unread ones ended up in the queue.
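
Under the 'amqp' result backend, each task gets its own result queue, and that queue is only drained when something actually asks for the result. A minimal sketch of the consuming side (task import path taken from the schedule above; run from a Django shell so the celery app is configured):

# Sketch: fetching a task's result is what consumes the result message.
# Beat-scheduled tasks have no caller waiting on them, so their result
# queues linger unconsumed.
from project_name.apps.appname.tasks import some_task

result = some_task.delay()   # one message to the broker
result.get(timeout=30)       # blocks until done, then consumes the result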

NOTE: This means that celery would be adding database entries recording the outcomes of tasks. To keep my database from getting loaded up with useless messages, I added:

# Delete result records ("tombstones") from database after 4 hours
# http://docs.celeryproject.org/en/latest/configuration.html#celery-task-result-expires
CELERY_TASK_RESULT_EXPIRES = 14400
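
Note that with a non-amqp backend, expired results are removed by celery's built-in celery.backend_cleanup periodic task, which beat schedules automatically once a day, so the scheduler process from the Procfile above is also what enforces this expiry.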

Relevant parts from settings.py

########## CELERY CONFIGURATION
import djcelery
# https://github.com/celery/django-celery/
djcelery.setup_loader()

INSTALLED_APPS = INSTALLED_APPS + (
    'djcelery',
)

# Compress all the messages using gzip
# http://celery.readthedocs.org/en/latest/userguide/calling.html#compression
CELERY_MESSAGE_COMPRESSION = 'gzip'

# See: http://docs.celeryproject.org/en/latest/configuration.html#broker-transport
BROKER_TRANSPORT = 'amqplib'

# Set this number to the amount of allowed concurrent connections on your AMQP
# provider, divided by the amount of active workers you have.
#
# For example, if you have the 'Little Lemur' CloudAMQP plan (their free tier),
# they allow 3 concurrent connections. So if you run a single worker, you'd
# want this number to be 3. If you had 3 workers running, you'd lower this
# number to 1, since 3 workers each maintaining one open connection = 3
# connections total.
#
# See: http://docs.celeryproject.org/en/latest/configuration.html#broker-pool-limit
BROKER_POOL_LIMIT = 3

# See: http://docs.celeryproject.org/en/latest/configuration.html#broker-connection-max-retries
BROKER_CONNECTION_MAX_RETRIES = 0

# See: http://docs.celeryproject.org/en/latest/configuration.html#broker-url
BROKER_URL = os.environ.get('CLOUDAMQP_URL')

# Previously, this was set to 'amqp', which resulted in many unread /
# unconsumed queues and messages in RabbitMQ
# See: http://docs.celeryproject.org/en/latest/configuration.html#celery-result-backend
CELERY_RESULT_BACKEND = 'database'

# Delete result records ("tombstones") from database after 4 hours
# http://docs.celeryproject.org/en/latest/configuration.html#celery-task-result-expires
CELERY_TASK_RESULT_EXPIRES = 14400
########## END CELERY CONFIGURATION

Another answer

Looks like you are getting back result messages from your executed tasks.

You can avoid that by doing:

@celery.task(ignore_result=True)
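
Applied to the task above, that looks like the sketch below; with ignore_result=True the worker never publishes a result message, so no per-task result queue is created in the first place. Setting CELERY_IGNORE_RESULT = True in settings.py is the global equivalent.

@celery.task(ignore_result=True)
def some_task():
    # Get some data from external resources
    # Save that data to the database
    pass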
