
Celery in Docker container: ERROR/MainProcess consumer: Cannot connect to redis

I'm really frustrated with this one and have been trying to get it to work for days. Asking for help here.

This is a Django project with Postgres, Celery and Docker. I first tried it with RabbitMQ and got the same error I'm now getting with Redis; I've switched to Redis several times since and the error is still the same, so I think the problem is with Celery rather than with RabbitMQ/Redis.

Dockerfile:

FROM python:3.8.5-alpine

ENV PYTHONUNBUFFERED 1

RUN apk update \
    # psycopg2 dependencies
    && apk add --virtual build-deps gcc python3-dev musl-dev \
    && apk add postgresql-dev \
    # Pillow dependencies
    && apk add jpeg-dev zlib-dev freetype-dev lcms2-dev openjpeg-dev tiff-dev tk-dev tcl-dev \
    # Translation dependencies
    && apk add gettext \
    # CFFI dependencies
    && apk add libffi-dev py-cffi \
    && apk add --no-cache openssl-dev libffi-dev \
    && apk add --no-cache --virtual .pynacl_deps build-base python3-dev libffi-dev

RUN mkdir /app
WORKDIR /app
COPY requirements.txt /app/
RUN pip install -r requirements.txt
COPY . /app/

docker-compose.yml:

version: '3'

volumes:
  local_postgres_data: {}

services:
  postgres:
    image: postgres
    environment:
      - POSTGRES_DB=postgres
      - POSTGRES_USER=postgres
      - POSTGRES_PASSWORD=postgres
    volumes:
      - local_postgres_data:/var/lib/postgresql/data
    env_file:
      - ./.envs/.postgres

  django: &django
    build: .
    command: python manage.py runserver 0.0.0.0:8000
    volumes:
      - .:/app/
    ports:
      - "8000:8000"
    depends_on:
      - postgres

  redis:
    image: redis:6.0.8

  celeryworker:
    <<: *django
    image: pyrty_celeryworker
    depends_on:
      - redis
      - postgres
    ports: []
    command: celery -A pyrty worker -l INFO

  celerybeat:
    <<: *django
    image: pyrty_celerybeat
    depends_on:
      - redis
      - postgres
    ports: []
    command: celery -A pyrty beat -l INFO

pyrty/pyrty/celery.py:

from __future__ import absolute_import, unicode_literals

import os

from celery import Celery


os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'pyrty.settings')

app = Celery('pyrty')

app.config_from_object('django.conf:settings', namespace='CELERY')

app.autodiscover_tasks()


@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))

pyrty/pyrty/settings.py:

# Celery conf
CELERY_BROKER_URL = 'redis://127.0.0.1:6379/0' #also tried localhost and
CELERY_RESULT_BACKEND = 'redis://127.0.0.1:6379/0' #also tried without the '/0'
CELERY_ACCEPT_CONTENT = ['json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TIMEZONE = 'America/Argentina/Buenos_Aires'

pyrty/pyrty/__init__.py:

from __future__ import absolute_import, unicode_literals

from .celery import app as celery_app


__all__ = ('celery_app',)

requirements.txt:

Django==3.1
psycopg2==2.8.3
djangorestframework==3.11.0
celery==4.4.7
redis==3.5.3
Pillow==7.1.2
django-extensions==2.2.9
amqp==2.6.1
billiard==3.6.3
kombu==4.6.11
vine==1.3.0
pytz==2020.1

That is all the configuration. When I run docker-compose up I get the following in the terminal (the parts relating to Celery and Redis):

redis_1         | 1:M 19 Sep 2020 18:09:08.117 # Server initialized
redis_1         | 1:M 19 Sep 2020 18:09:08.117 # WARNING overcommit_memory is set to 0! Background save may fail under low memory condition. To fix this issue add 'vm.overcommit_memory = 1' to /etc/sysctl.conf and then reboot or run the command 'sysctl vm.overcommit_memory=1' for this to take effect.
redis_1         | 1:M 19 Sep 2020 18:09:08.118 * Loading RDB produced by version 6.0.8
redis_1         | 1:M 19 Sep 2020 18:09:08.118 * RDB age 16 seconds
redis_1         | 1:M 19 Sep 2020 18:09:08.118 * RDB memory usage when created 0.77 Mb
redis_1         | 1:M 19 Sep 2020 18:09:08.118 * DB loaded from disk: 0.000 seconds
redis_1         | 1:M 19 Sep 2020 18:09:08.118 * Ready to accept connections

celeryworker_1  |  
celeryworker_1  |  -------------- celery@f334b468b079 v4.4.7 (cliffs)
celeryworker_1  | --- ***** ----- 
celeryworker_1  | -- ******* ---- Linux-5.4.0-47-generic-x86_64-with 2020-09-19 18:09:16
celeryworker_1  | - *** --- * --- 
celeryworker_1  | - ** ---------- [config]
celeryworker_1  | - ** ---------- .> app:         pyrty:0x7fd280ac7640
celeryworker_1  | - ** ---------- .> transport:   redis://127.0.0.1:6379/0
celeryworker_1  | - ** ---------- .> results:     redis://127.0.0.1:6379/0
celeryworker_1  | - *** --- * --- .> concurrency: 6 (prefork)
celeryworker_1  | -- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
celeryworker_1  | --- ***** ----- 
celeryworker_1  |  -------------- [queues]
celeryworker_1  |                 .> celery           exchange=celery(direct) key=celery
celeryworker_1  |                 
celeryworker_1  | 
celeryworker_1  | [tasks]
celeryworker_1  |   . pyrty.celery.debug_task
celeryworker_1  | 
celeryworker_1  | [2020-09-19 18:09:16,865: ERROR/MainProcess] consumer: Cannot connect to redis://127.0.0.1:6379/0: Error 111 connecting to 127.0.0.1:6379. Connection refused..
celeryworker_1  | Trying again in 2.00 seconds... (1/100)
celeryworker_1  | 
celeryworker_1  | [2020-09-19 18:09:18,871: ERROR/MainProcess] consumer: Cannot connect to redis://127.0.0.1:6379/0: Error 111 connecting to 127.0.0.1:6379. Connection refused..
celeryworker_1  | Trying again in 4.00 seconds... (2/100)

I really don't understand what I'm missing. I've been reading the docs but I can't solve this. Please help!

Try updating your app settings to use the Redis hostname redis instead of 127.0.0.1:

# Celery conf
CELERY_BROKER_URL = 'redis://redis:6379/0'
CELERY_RESULT_BACKEND = 'redis://redis:6379/0'
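
If the project is also run outside Docker (for example with the Django dev server on the host), one common pattern, shown here only as a sketch with assumed environment-variable names, is to read the broker URL from the environment and fall back to the Compose hostname:

# Celery conf - sketch only: reading CELERY_BROKER_URL / CELERY_RESULT_BACKEND
# from the environment is an illustration, not part of the original answer.
import os

CELERY_BROKER_URL = os.environ.get('CELERY_BROKER_URL', 'redis://redis:6379/0')
CELERY_RESULT_BACKEND = os.environ.get('CELERY_RESULT_BACKEND', 'redis://redis:6379/0')

The variables could then be set per environment, e.g. under environment: for the django/celeryworker/celerybeat services in docker-compose.yml.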

Reference:

Each container can now look up the hostname web or db and get back the appropriate container's IP address. For example, web's application code could connect to the URL postgres://db:5432 and start using the Postgres database.

https://docs.docker.com/compose/networking/
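
To confirm from inside the Compose network that the redis hostname resolves and the broker accepts connections, a quick check can be run in the worker container, for example with docker-compose run --rm celeryworker python check_redis.py. This is only an illustration: it uses the redis-py package already listed in requirements.txt, and the file name check_redis.py is made up.

# check_redis.py - minimal connectivity check (illustrative only).
# Inside a container, 127.0.0.1 refers to that container itself, which is
# why the worker's connection was refused; the Compose service name "redis"
# resolves to the Redis container instead.
import redis

client = redis.Redis(host='redis', port=6379, db=0)
print(client.ping())  # prints True when the Redis broker is reachable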
