
Django Celery Max DB connections reached

I am running tasks on my Celery worker in a Django application, where each task takes about 1-2 seconds to execute. Usually these executions are fine, but from time to time, especially if the Django application has been deployed for a while, I start seeing errors like this:

File "/usr/lib64/python3.6/site-packages/sqlalchemy/pool/base.py", line 428, in __init__
    self.__connect(first_connect_check=True)
  File "/usr/lib64/python3.6/site-packages/sqlalchemy/pool/base.py", line 630, in __connect
    connection = pool._invoke_creator(self)
  File "/usr/lib64/python3.6/site-packages/sqlalchemy/engine/strategies.py", line 114, in connect
    return dialect.connect(*cargs, **cparams)
  File "/usr/lib64/python3.6/site-packages/sqlalchemy/engine/default.py", line 453, in connect
    return self.dbapi.connect(*cargs, **cparams)
  File "/usr/lib64/python3.6/site-packages/psycopg2/__init__.py", line 130, in connect
    conn = _connect(dsn, connection_factory=connection_factory, **kwasync)
sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) FATAL:  remaining connection slots are reserved for non-replication superuser connections

This indicates to me that the Celery worker is not closing connections properly.

I checked the idle connection count on the DB when this error occurred -- there were definitely some connections left, so the DB's max connection limit had not been reached.

My question: how can I ensure that the Celery worker is closing DB connections?

Celery settings:

from __future__ import absolute_import, unicode_literals
import os
from celery import Celery
from django.conf import settings

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'my_proj.settings')

_celery_broker = settings.CELERY_BROKER
_result_backend = settings.RESULT_BACKEND

app = Celery('my_proj', broker=_celery_broker, backend=_result_backend)

app.autodiscover_tasks(['common'])

app.conf.update(
    worker_prefetch_multiplier=0,  # 0 disables the prefetch limit
    event_queue_ttl=0,
    task_acks_late=True,  # acknowledge messages only after the task completes
)

My Django DB settings:

'DATABASES': {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': <...>,
        'USER': <...>,
        'PASSWORD': <...>,
        'HOST': <...>,
        'PORT': 5480,
    }
}

How I start my deployed Django server:

gunicorn --config gunicorn.config my_proj.wsgi:application

gunicorn config:

bind = '0.0.0.0:8201'
workers = 3
worker_class = 'gthread'
threads = 3
limit_request_line = 0
timeout = 1800

How I start my Celery worker:

celery -A my_proj worker -l info

I read in the Django docs that if unspecified, the CONN_MAX_AGE setting defaults to 0, and from my understanding the Celery worker should pick this up as well.
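For reference, a minimal sketch of what making that default explicit would look like in settings.py (the credential values below are hypothetical placeholders, not the question's real settings):

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'mydb',        # hypothetical placeholder
        'USER': 'myuser',      # hypothetical placeholder
        'PASSWORD': 'secret',  # hypothetical placeholder
        'HOST': 'localhost',   # hypothetical placeholder
        'PORT': 5480,
        'CONN_MAX_AGE': 0,     # 0 = close the connection at the end of each request
    }
}

Note that CONN_MAX_AGE is enforced through Django's request lifecycle; whether a Celery worker applies the same cleanup around tasks depends on Celery's Django integration, which is part of what the question is probing.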

Setting up some kind of connection pooling, or deliberately opening and closing the connection for each task, may help. Go through https://code.i-harness.com/en/q/2263d77 -- the discussion there is about DB connection pooling and creating/closing connections for Celery tasks. I haven't tried it myself yet.
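A minimal sketch of the deliberate open/close approach, using Celery's task signals to call Django's close_old_connections() around every task; the handler names are assumptions for illustration, not code from the linked discussion:

from celery.signals import task_prerun, task_postrun
from django.db import close_old_connections


@task_prerun.connect
def close_stale_connections_before(**kwargs):
    # Drop connections that are unusable or have outlived CONN_MAX_AGE
    # before the task touches the ORM (with CONN_MAX_AGE=0, this closes
    # every open connection).
    close_old_connections()


@task_postrun.connect
def close_stale_connections_after(**kwargs):
    # Close stale connections again once the task finishes, so idle
    # worker processes do not keep DB slots occupied between tasks.
    close_old_connections()

Putting these handlers in a module the worker imports (for example the same module that defines the Celery app) ensures they are registered before any task runs.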
