
Getting reusable tasks to work in a setup with one celery server and 3k+ django sites, each with its own database

Here's the problem: I have one celery server and 3k+ django sites, each with its own database. New sites (and databases) can be added dynamically.

I'm writing celery tasks which need to be run for each site, through the common celery server. The code is in an app which is meant to be reusable, so it shouldn't be written in a way that ties it to this particular setup.

So. Without mangling the task code to fit my exact setup, how can I make sure that the tasks connect to the correct database when they run?

This is hard to accomplish because of an inherent limitation in Django: the settings are global. So unless all the apps share the same settings, this is going to be a problem.
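To see why, consider a reusable task written the obvious way: it just uses the ORM, and the ORM resolves its connection from the single global django.conf.settings, so every run talks to whichever database the worker process was configured with. A minimal sketch (the app name myapp, the Article model, and the task name are hypothetical stand-ins, and the import path assumes an older Celery of the same era as setup_environ):

# tasks.py -- hypothetical reusable task; "myapp" and "Article" are stand-ins.
from celery.task import task

@task
def publish_pending_articles():
    from myapp.models import Article
    # The ORM reads the one global settings.DATABASES, so this always hits
    # the database the worker environment was set up with -- not necessarily
    # the database of the site that queued the task.
    Article.objects.filter(pending=True).update(pending=False)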

You could try spawning a new worker process for every task and creating the Django environment each time. Don't use django-celery; use celery directly, with something like this in celeryconfig.py:

from celery import signals
from importlib import import_module

def before_task(task, **kwargs):
    # Each task is expected to be queued with an extra "settings_module"
    # keyword argument naming the Django settings module of its site.
    settings_module = task.request.kwargs.pop("settings_module", None)
    if settings_module:
        settings = import_module(settings_module)
        # setup_environ lives in django.core.management (not django.conf);
        # it configures the global Django settings for this site.
        from django.core.management import setup_environ
        setup_environ(settings)

signals.task_prerun.connect(before_task)

# Each worker child handles exactly one task, so the environment configured
# above never leaks into a task belonging to a different site.
CELERYD_MAX_TASKS_PER_CHILD = 1
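On the calling side, whoever queues the task would then pass the target site's settings module as an extra keyword argument. A minimal sketch (the task and settings module names here are hypothetical):

from myapp.tasks import publish_pending_articles

# "site_0042.settings" stands in for whatever per-site settings module
# your provisioning creates for each of the 3k+ sites.
publish_pending_articles.apply_async(
    kwargs={"settings_module": "site_0042.settings"},
)

The idea is that the prerun handler pops settings_module off the request before the task body runs, so the reusable task code itself never has to know about this setup.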
