
Celery with Redis broker in Django: tasks successfully execute, but too many persistent Redis keys and connections remain

Our Python server (Django 1.11.17) uses Celery 4.2.1 with Redis as the broker (the pip redis package we're using is 3.0.1). The Django app is deployed to Heroku, and the Celery broker was set up using Heroku's Redis Cloud add-on.

The Celery tasks we have should definitely complete within a minute (median completion time is ~100 ms), but we're seeing that Redis keys and connections persist for much, much longer than that (up to 24 hours). Otherwise, tasks are being executed correctly.

What could be causing these keys and connections to persist in our Redis broker? How can we clear them when Celery tasks conclude?
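One way to confirm that the leftover keys are Celery task results is to look at their names: Celery's Redis result backend stores each result under a key prefixed `celery-task-meta-` followed by the task ID. A minimal sketch (the helper name and the sample key names are hypothetical; the key names would come from scanning your Redis instance, e.g. with `redis-cli --scan` or redis-py's `scan_iter`):

```python
# Prefix used by Celery's Redis result backend for stored task results.
TASK_META_PREFIX = "celery-task-meta-"

def split_celery_result_keys(keys):
    """Partition dumped Redis key names into Celery result keys and the rest."""
    results = [k for k in keys if k.startswith(TASK_META_PREFIX)]
    other = [k for k in keys if not k.startswith(TASK_META_PREFIX)]
    return results, other

# Example with made-up key names resembling what a scan might return:
sample = [
    "celery-task-meta-0b2f6f6e-1111-2222-3333-444455556666",
    "_kombu.binding.celery",
    "celery-task-meta-9a8b7c6d-aaaa-bbbb-cccc-ddddeeeeffff",
]
results, other = split_celery_result_keys(sample)
print(len(results))  # number of lingering task-result keys
```

If most of the persistent keys carry this prefix, the culprit is the result backend rather than the broker queues themselves.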

Here's a Redis Labs screenshot of this happening (all tasks should have completed, so we'd expect zero keys and zero connections):

[Redis Labs screenshot]

Resolved my own question: if the CELERY_IGNORE_RESULT config variable is set to True (which I'm able to do because I don't use any return values from my tasks), then the keys and connections are back under control.
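As a sketch of what that looks like in a Django settings module (assuming old-style Celery setting names, as in the question; with Celery 4's namespaced config the equivalent option is `task_ignore_result`). The 24-hour persistence also lines up with the result backend's `result_expires` default of one day, which is worth knowing if you only want results to expire faster rather than vanish entirely:

```python
# settings.py (sketch, not the author's exact file)

# Don't store task results at all: no celery-task-meta-* keys are written,
# and the result-backend connections disappear with them.
CELERY_IGNORE_RESULT = True
```

If some tasks do need results, an alternative is to disable result storage per task with Celery's `@app.task(ignore_result=True)` decorator argument instead of the global setting.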

Source: Celery project documentation

