
Running celery worker from script and concurrency

I have a wrapper script that starts a worker, but I don't understand how to limit its concurrency. I tried to read the Celery source code, but it's very hard for me to see how they do it.

My code:

from celery.bin import worker as w

my_worker = w.worker(app=app)
options = {
    'loglevel': loglevel,
    'queues': [service],
    'hostname': hostname,
}
my_worker.run(**options)

I don't really know how to add the concurrency to this.

Here is my code, which just invokes the celery command:

def start(self, datas):
    queue = datas['queue']
    # check whether a worker already exists with app.control.inspect().active()
    all_workers = self.active(datas).keys()
    if all_workers:
        return None
    # note: no shell quotes around the queue name, because str.split()
    # would keep them as literal characters in the argument
    cmd = "celery multi start %s_worker -A celeryserver -Q %s --concurrency=1 -l DEBUG" % (queue, queue)
    sys.argv = cmd.split()
    from celery.bin.celery import main
    try:
        main()
    except SystemExit as exc:  # don't shadow the builtin 'exit'
        return exc.code
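One caveat with building argv this way: `str.split()` only splits on whitespace and does not process shell quoting, so embedded quotes become literal characters in the arguments. A small sketch (the command string here is a made-up example) comparing it with `shlex.split`, which does apply shell-style quoting rules:

```python
import shlex

# Hypothetical command with a quoted, space-containing queue name
cmd = "celery multi start my_worker -A celeryserver -Q 'my queue' --concurrency=1"

# str.split keeps the quote characters and breaks the quoted argument apart:
assert "'my" in cmd.split()

# shlex.split honours the quotes and yields one clean argument:
assert 'my queue' in shlex.split(cmd)
```

As long as no argument contains spaces, plain `split()` is fine; otherwise prefer `shlex.split`.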

For your code, add 'concurrency': 1 to your options (it should work, but I haven't tested it). However, I suggest you call main with sys.argv just like I do, because it automatically parses sys.argv into *args and **options via Command.handle_argv and invokes __call__, which calls self.run() (just like you do, but more compatible and simpler).
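To make the first suggestion concrete, here is a minimal sketch of the options dict with the extra key, with placeholder values. Note this relies on the old `celery.bin.worker` API; in Celery 5 the CLI was rewritten on top of Click, and the supported programmatic entry point is `app.worker_main`:

```python
# Sketch, not tested against every Celery version: with the Celery 4.x
# style worker.run(**options), 'concurrency' caps the pool size.
options = {
    'loglevel': 'INFO',            # placeholder values for illustration
    'queues': ['my_queue'],
    'hostname': 'worker1@localhost',
    'concurrency': 1,              # limit the pool to one worker process
}

# Celery 5 equivalent: pass CLI-style arguments to app.worker_main.
argv = ['worker', '--loglevel=INFO', '-Q', 'my_queue', '--concurrency=1']
# app.worker_main(argv)  # would start the worker (requires a Celery app instance)
```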
