
Celery events specific to a queue

I have two Django projects, each with a Celery app:

- fooproj.celery_app
- barproj.celery_app

Each app is running its own Celery worker:

celery worker -A fooproj.celery_app -l info -E -Q foo_queue
celery worker -A barproj.celery_app -l info -E -Q bar_queue

Here's how I am configuring my Celery apps:

import os
from celery import Celery
from django.conf import settings


# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings.local')


app = Celery('celery_app', broker=settings.BROKER_URL)
app.conf.update(
    CELERY_ACCEPT_CONTENT=['json'],
    CELERY_TASK_SERIALIZER='json',
    CELERY_RESULT_SERIALIZER='json',
    CELERY_RESULT_BACKEND='djcelery.backends.database:DatabaseBackend',
    CELERY_SEND_EVENTS=True,
    CELERY_DEFAULT_QUEUE=settings.CELERY_DEFAULT_QUEUE,
    CELERY_DEFAULT_EXCHANGE=settings.CELERY_DEFAULT_EXCHANGE,
    CELERY_DEFAULT_ROUTING_KEY=settings.CELERY_DEFAULT_ROUTING_KEY,
    CELERY_DEFAULT_EXCHANGE_TYPE='direct',
    CELERY_ROUTES=('proj.celeryrouters.MainRouter',),
    CELERY_IMPORTS=(
        'apps.qux.tasks',
        'apps.lorem.tasks',
        'apps.ipsum.tasks',
        'apps.sit.tasks'
    ),
)

My router class:

from django.conf import settings


class MainRouter(object):
    """
    Routes Celery tasks to a proper exchange and queue
    """
    def route_for_task(self, task, args=None, kwargs=None):
        return {
            'exchange': settings.CELERY_DEFAULT_EXCHANGE,
            'exchange_type': 'direct',
            'queue': settings.CELERY_DEFAULT_QUEUE,
            'routing_key': settings.CELERY_DEFAULT_ROUTING_KEY,
        }
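For illustration, here is a minimal, self-contained sketch of how Celery consults such a router when a task is published. The `Settings` class is a stand-in for `django.conf.settings`, and the task name is made up; Celery passes the task's dotted name to `route_for_task`, and the returned dict overrides the default exchange, queue, and routing key for that message.

```python
# Standalone sketch of the routing behaviour above. The Settings class
# is a stand-in for django.conf.settings; the task name is hypothetical.

class Settings:
    CELERY_DEFAULT_EXCHANGE = 'foo_exchange'
    CELERY_DEFAULT_QUEUE = 'foo_queue'
    CELERY_DEFAULT_ROUTING_KEY = 'foo_routing_key'

settings = Settings()


class MainRouter(object):
    """Routes every task to the project's default exchange and queue."""
    def route_for_task(self, task, args=None, kwargs=None):
        return {
            'exchange': settings.CELERY_DEFAULT_EXCHANGE,
            'exchange_type': 'direct',
            'queue': settings.CELERY_DEFAULT_QUEUE,
            'routing_key': settings.CELERY_DEFAULT_ROUTING_KEY,
        }


# Celery calls route_for_task with the task's dotted name:
route = MainRouter().route_for_task('apps.qux.tasks.some_task')
print(route['queue'])  # foo_queue
```

Because the router ignores the task name, every task from a given project lands on that project's single default queue, which is what the worker's `-Q` option consumes from.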

fooproj has settings:

BROKER_URL = 'redis://localhost:6379/0'
CELERY_DEFAULT_EXCHANGE = 'foo_exchange'
CELERY_DEFAULT_QUEUE = 'foo_queue'
CELERY_DEFAULT_ROUTING_KEY = 'foo_routing_key'

barproj has settings:

BROKER_URL = 'redis://localhost:6379/1'
CELERY_DEFAULT_EXCHANGE = 'bar_exchange'
CELERY_DEFAULT_QUEUE = 'bar_queue'
CELERY_DEFAULT_ROUTING_KEY = 'bar_routing_key'

As you can see, both projects use their own Redis database as a broker, their own MySQL database as a result backend, and their own exchange, queue, and routing key.

I am trying to have two Celery events processes running, one for each app:

celery events -A fooproj.celery_app -l info -c djcelery.snapshot.Camera
celery events -A barproj.celery_app -l info -c djcelery.snapshot.Camera

The problem is, both celery events processes are picking up events from all of my Celery workers! So in the fooproj database, I can see task results from barproj.

Any idea how to solve this problem?

From http://celery.readthedocs.org/en/latest/getting-started/brokers/redis.html :

Monitoring events (as used by flower and other tools) are global and are not affected by the virtual host setting.

This is caused by a limitation in Redis. The Redis PUB/SUB channels are global and not affected by the database number.

This seems to be one of Redis' caveats :(
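One possible workaround, sketched here under stated assumptions rather than as a tested solution: since all events arrive on the shared PUB/SUB channel regardless of database number, each camera could drop events from the other project's workers before persisting them. Celery event dicts do carry a `hostname` field; the `foo`/`bar` prefix convention below is an assumption for illustration.

```python
# Sketch: filter incoming Celery events by worker hostname before
# recording them. Event dicts include a 'hostname' key (e.g. 'foo@host1');
# the prefix convention ('foo' vs 'bar') is an assumed naming scheme,
# which would require starting workers with matching --hostname values.

def filter_events_by_hostname(events, prefix):
    """Keep only events emitted by workers whose hostname starts with prefix."""
    return [e for e in events if e.get('hostname', '').startswith(prefix)]


events = [
    {'type': 'task-succeeded', 'hostname': 'foo@host1', 'uuid': 'a1'},
    {'type': 'task-succeeded', 'hostname': 'bar@host1', 'uuid': 'b1'},
]

print(filter_events_by_hostname(events, 'foo'))
# [{'type': 'task-succeeded', 'hostname': 'foo@host1', 'uuid': 'a1'}]
```

In practice this filter would live inside a `djcelery.snapshot.Camera` subclass (for example, applied to the state before saving in `on_shutter`) and be passed to `celery events` via `-c` in place of the stock camera.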
