
celery group runs tasks sequentially instead of in parallel

I am learning Celery's group function:

from datetime import datetime
import time

from celery import group

@celery_app.task
def celery_task():
    time.sleep(30)
    print('task 1')

@celery_app.task
def celery_task2():
    time.sleep(10)
    print('task 2')

@celery_app.task
def test():
    print(datetime.now())
    job = group(
        celery_task.s(),
        celery_task2.s()
    )
    result = job()
    result.get()
    print(datetime.now())

However, when I ran test() from the Python console and watched the Celery logs, it looked like task 1 ran first and then task 2 ran.

Shouldn't they run in parallel? The whole test() function took 30s to complete.

To start my Celery workers I use the command celery -A tasks worker -l=INFO

Are you sure the whole test() took 30s? If so, I don't understand what the problem is: if it weren't parallel, it would have taken 30s + 10s = 40s.
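A quick illustration of that arithmetic, using only the Python standard library rather than Celery itself: two tasks running in parallel finish in roughly max(t1, t2), while running them sequentially takes t1 + t2. (The sleep durations here are scaled down from the question's 30s/10s.)

```python
import time
from concurrent.futures import ThreadPoolExecutor

def sleeper(seconds):
    # Stand-in for a Celery task that sleeps, like celery_task above
    time.sleep(seconds)
    return seconds

start = time.monotonic()
with ThreadPoolExecutor(max_workers=2) as pool:
    # Submit both "tasks" at once, analogous to a group of two signatures
    futures = [pool.submit(sleeper, 0.3), pool.submit(sleeper, 0.1)]
    results = [f.result() for f in futures]
elapsed = time.monotonic() - start
# elapsed is about max(0.3, 0.1) = 0.3, not 0.3 + 0.1 = 0.4
```

So a 30s total for the whole test() is exactly what parallel execution of a 30s and a 10s task should look like; only the log ordering made it appear sequential.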

Two things here:

  1. Use the --concurrency flag when you run your worker so it can handle more than one task at a time, or alternatively start more than one worker process: celery -A tasks worker -l=INFO --concurrency=4. (By default, concurrency is set to the number of CPUs on the machine.)
  2. Run your canvas asynchronously with job.delay() or job.apply_async() instead of blocking on the result.
