
Celery worker in docker won't get correct message broker

I'm creating a Flask service using the application factory pattern, and I need to use celery for asynchronous tasks. I'm also using docker and docker-compose to contain and run everything. My structure looks like this:

server
 |
 +-- manage.py
 +-- docker-compose.yml
 +-- requirements.txt
 +-- Dockerfile
 |    
 +-- project
 |  |  
 |  +-- api
 |      |
 |      +--tasks.py
 |
 |  +-- __init__.py

My tasks.py file looks like this:

from project import celery_app

@celery_app.task
def celery_check(test):
    print(test)

I run everything by calling manage.py, like so:

# manage.py

from flask_script import Manager
from project import create_app

app = create_app()
manager = Manager(app)

if __name__ == '__main__':
    manager.run()

My __init__.py looks like this:

# project/__init__.py

import os
import json
from flask_mongoalchemy import MongoAlchemy
from flask_cas import CAS
from flask import Flask
from itsdangerous import JSONWebSignatureSerializer as JWT
from flask_httpauth import HTTPTokenAuth
from celery import Celery

# instantiate the database and CAS
db = MongoAlchemy()
cas = CAS()

# Auth stuff (ReplaceMe is replaced below in create_app())
jwt = JWT("ReplaceMe")
auth = HTTPTokenAuth('Bearer')
celery_app = Celery(__name__, broker=os.environ.get("CELERY_BROKER_URL"))


def create_app():
    # instantiate the app
    app = Flask(__name__, template_folder='client/templates', static_folder='client/static')

    # set config
    app_settings = os.getenv('APP_SETTINGS')
    app.config.from_object(app_settings)

    # Send new static files every time if debug is enabled
    if app.debug:
        app.config['SEND_FILE_MAX_AGE_DEFAULT'] = 0

    # Get the secret keys
    parse_secret(app.config['CONFIG_FILE'], app)

    celery_app.conf.update(app.config)
    print(celery_app.conf)

    # set up extensions
    db.init_app(app)
    cas.init_app(app)
    # Replace the secret key with the app's
    jwt.secret_key = app.config["SECRET_KEY"]

    parse_config(app.config['CONFIG_FILE'])

    # register blueprints
    from project.api.views import twist_blueprint
    app.register_blueprint(twist_blueprint)

    return app

In my docker-compose file I start a worker and define some environment variables, like this:

version: '2.1'

services:
  twist-service:
    container_name: twist-service
    build: .
    volumes:
      - '.:/usr/src/app'
    ports:
      - 5001:5000 # expose ports - HOST:CONTAINER
    environment:
      - APP_SETTINGS=project.config.DevelopmentConfig
      - DATABASE_NAME_TESTING=testing
      - DATABASE_NAME_DEV=dev
      - DATABASE_URL=twist-database
      - CONFIG_FILE=./project/default_config.json
      - MONGO_PASSWORD=user
      - CELERY_RESULT_BACKEND=redis://redis:6379
      - CELERY_BROKER_URL=redis://redis:6379/0
      - MONGO_PORT=27017
    depends_on:
      - celery
      - twist-database
  celery:
    container_name: celery
    build: .
    command: celery -A project.api.tasks --loglevel=debug worker
    volumes:
      - '.:/usr/src/app'
  twist-database:
    image: mongo:latest
    container_name: "twist-database"
    environment:
      - MONGO_DATA_DIR=/data/db
      - MONGO_USER=mongo
    volumes:
      - /data/db
    ports:
      - 27017:27017  # expose ports - HOST:CONTAINER
    command: mongod
  redis:
    image: "redis:alpine"
    command: redis-server
    volumes:
      - '/redis'
    ports:
      - '6379:6379'

However, when I run the docker-compose file and the containers come up, I end up with this in the celery worker's logs:

[2017-07-20 16:53:06,721: ERROR/MainProcess] consumer: Cannot connect to amqp://guest:**@127.0.0.1:5672//: [Errno 111] Connection refused.

Meaning the worker ignores the redis broker configured when celery was created, and instead tries to use RabbitMQ. I've tried changing project.api.tasks to project and to project.celery_app, but to no avail.

It seems to me that the celery service should also get the environment variables CELERY_RESULT_BACKEND and CELERY_BROKER_URL.
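Since CELERY_BROKER_URL is only set under twist-service, os.environ.get("CELERY_BROKER_URL") returns None inside the celery container, and Celery falls back to its default amqp://guest@localhost broker, which is exactly the connection the error above shows. A minimal sketch of the celery service with the variables copied over (assuming the rest of the compose file stays as posted; the hostname redis resolves to the redis service):

  celery:
    container_name: celery
    build: .
    command: celery -A project.api.tasks --loglevel=debug worker
    volumes:
      - '.:/usr/src/app'
    environment:
      # assumed values, copied from the twist-service section above
      - CELERY_RESULT_BACKEND=redis://redis:6379
      - CELERY_BROKER_URL=redis://redis:6379/0
    depends_on:
      - redis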

You need to link the docker services together. The simplest way is to add a networks section to your docker-compose.yml.
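A hedged sketch of what that could look like, using an illustrative network name backend that is not in the original file; every service that has to reach redis joins the same network and can then address it by service name:

version: '2.1'

services:
  celery:
    build: .
    command: celery -A project.api.tasks --loglevel=debug worker
    environment:
      - CELERY_BROKER_URL=redis://redis:6379/0
    networks:
      - backend
  redis:
    image: "redis:alpine"
    networks:
      - backend

networks:
  backend: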
