
Django + Celery tasks on multiple worker nodes

I've deployed Django (1.10) + Celery (4.x) on a single VM, with RabbitMQ as the broker (on the same machine). I want to run the same application on a multi-node architecture, where I can simply replicate a number of worker nodes and scale task execution. Here,

  1. How do I configure Celery with RabbitMQ for this architecture?
  2. What should the setup be on the other worker nodes?

You should have the broker on one node and configure it so that workers from other nodes can access it.

For that, you can create a new user/vhost on RabbitMQ.

# add new user
sudo rabbitmqctl add_user <user> <password>

# add new virtual host
sudo rabbitmqctl add_vhost <vhost_name>

# set permissions for user on vhost
sudo rabbitmqctl set_permissions -p <vhost_name> <user> ".*" ".*" ".*"

# restart rabbitmq (note: rabbitmqctl has no 'restart' subcommand,
# so use the service manager instead)
sudo systemctl restart rabbitmq-server
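
To confirm the setup, you can list the users, vhosts, and permissions (these are standard rabbitmqctl subcommands). Note that RabbitMQ's default guest user can only connect from localhost, which is why a dedicated user is needed for remote workers.

# verify that the user, vhost, and permissions exist
sudo rabbitmqctl list_users
sudo rabbitmqctl list_vhosts
sudo rabbitmqctl list_permissions -p <vhost_name>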

From the other nodes, you can queue up tasks, or you can just run workers to consume them.

from celery import Celery

# point the app at the broker node's vhost; the 'amqp' result backend
# is deprecated in Celery 4.x ('rpc' is the usual replacement)
app = Celery('tasks', backend='amqp',
             broker='amqp://<user>:<password>@<ip>/<vhost>')

# the @app.task decorator is required so that add.delay() is available
@app.task
def add(x, y):
    return x + y

If you have a file (say task.py) like this, you can queue up tasks using add.delay().
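
As a minimal sketch of the producer side (using the same placeholder broker credentials as above):

from task import add

# send the task to the broker; any connected worker node can pick it up
result = add.delay(4, 4)

# wait for a worker to execute the task and return the result via the backend
print(result.get(timeout=10))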

You can also start a worker with

celery worker -A task -l info
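
To scale out, repeat this on each worker node. Standard Celery flags such as --concurrency and --hostname let you size and name each node, for example:

# run 4 worker processes on this node, with a unique node name
# (%h expands to the machine's hostname)
celery worker -A task -l info --concurrency=4 -n worker1@%h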

You can see my answer here to get a brief idea of how to run tasks on remote machines. For a step-by-step process, you can check out a post I have written on scaling Celery.
