
How to run periodic tasks concurrently with Celery and Django

I have some tasks being run by Celery in my Django project.

I use crontab to specify the time each task should run, like this:

from celery.schedules import crontab

CELERY_BEAT_SCHEDULE = {
    'task_a': {
        'task': 'tasks.task_a',
        'schedule': crontab(minute=0, hour='5,18'),
    },
    'task_b': {
        'task': 'tasks.task_b',
        'schedule': crontab(minute=4, hour='5,18'),
    },
}
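For reference, crontab(minute=0, hour='5,18') matches 05:00 and 18:00, while crontab(minute=4, hour='5,18') matches 05:04 and 18:04. A minimal way to check when a schedule will next fire, using Celery's remaining_estimate helper (the variable names here are illustrative):

from datetime import datetime, timezone

from celery.schedules import crontab

schedule = crontab(minute=0, hour='5,18')
# remaining_estimate() returns a timedelta until the next matching time
print(schedule.remaining_estimate(datetime.now(timezone.utc)))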

What has been happening is that one task is executed, and only about 5 minutes later the other starts, when they should be executed at the same time.

I would like all of them to start at the same time, but that is not what is happening.

There are about eight tasks in total, some of which take a long time to complete.

I am using the following command at the moment.

Initially, it was like this:

celery -A api worker  --concurrency=4 -n <name>

Then I tried:

celery -A api multi  --concurrency=4 -n <name>

And finally:

celery -A api multi -P gevent --concurrency=4 -n <name>

They are all shared_tasks:

@shared_task(bind=True, name="tasks.task_a")
def task_a(self):
    pass

And I'm using autodiscover_tasks:

app = Celery('<app-name>')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)
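
For context, a minimal sketch of the usual Django + Celery wiring around those two lines (the module path api.settings and the app name are assumptions based on the -A api flag above; adjust to your project layout):

import os

from celery import Celery
from django.conf import settings

# assumption: the Django settings module lives at api/settings.py
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'api.settings')

app = Celery('api')
# read CELERY_*-prefixed settings (including CELERY_BEAT_SCHEDULE) from Django settings
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)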

Sounds like it might be a hardware issue. Make sure you have enough CPU resources available to the Celery worker.

Concurrency should be set to the number of CPUs available on your machine (which is the default). If your tasks are CPU-bound, the pool should be set to prefork (which is the default) and not gevent (which is for I/O-bound work like HTTP requests).
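For example, a CPU-bound setup on a 4-core machine might be started like this (the worker name is a placeholder, and --pool=prefork can be omitted since it is the default). Remember that celery beat must also be running, otherwise nothing in CELERY_BEAT_SCHEDULE is dispatched at all:

celery -A api worker --pool=prefork --concurrency=4 -n worker1@%h
celery -A api beat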

Celery multi is for starting multiple workers. Generally, Celery's default advice is one worker per machine, but they do tell you to do your own testing.
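As a rough sketch, celery multi takes a subcommand and a list of worker names (w1 and w2 here are placeholders):

celery multi start w1 w2 -A api --concurrency=4
celery multi stop w1 w2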

Edit: This link does a great job of explaining it: https://www.distributedpython.com/2018/10/26/celery-execution-pool/
