
Start celery workers with multiple brokers

Currently, I have a celery.py with a single Redis broker:

proj/celery.py

from __future__ import absolute_import

from kombu import Exchange, Queue
from celery import Celery

app = Celery('proj',
             broker='redis://myredis.com',
             backend='redis://myredis.com',
             include=['proj.tasks'])

if __name__ == '__main__':
    app.start()

I would start a worker with:

celery multi start somename -A proj -Q work -c20 --pidfile='somepidfile' --logfile='somelogfile'

across multiple machines (let's say 20).

So the workers on all 20 machines use a single broker:

'redis://myredis.com'

I would like to split that so that 10 machines use 'redis://myredis.com' and the other 10 machines use 'redis://myredis2.com'.

What changes do I need to make that happen?

Thank you

A simple way to do this is to have a DNS record that resolves to both myredis.com and myredis2.com in round-robin fashion, and point the broker URL at that name. Provided you have enough workers, this should split the load roughly evenly between the two brokers.
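Besides the DNS approach, a more explicit alternative (my own sketch, not from the original answer) is to make the broker URL configurable per machine, for example via an environment variable, so each group of machines points its workers at a different Redis instance:

```python
# proj/celery.py
from __future__ import absolute_import

import os

from celery import Celery

# Each machine exports CELERY_BROKER_URL before starting its workers.
# The variable name here is arbitrary (my own choice); the fallback
# keeps the original single-broker behaviour.
broker_url = os.environ.get('CELERY_BROKER_URL', 'redis://myredis.com')

app = Celery('proj',
             broker=broker_url,
             backend=broker_url,
             include=['proj.tasks'])

if __name__ == '__main__':
    app.start()
```

On the second group of 10 machines you would then export `CELERY_BROKER_URL='redis://myredis2.com'` before running the same `celery multi start` command, and the workers there would connect to the second broker without any code changes.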
