
Dynamically add periodic tasks in Celery

Is it possible to dynamically add periodic tasks to Celery?

I'm using Flask, not Django, and I'm building an app that should allow users to define recurring tasks through a web interface.

I've tried using Periodic Tasks from Celery 4.1, but to add new tasks I have to stop the Celery server, change the config (even if done through Python), and start it again. Maybe there's a way to dynamically load the config (without having to restart it)?

I've considered having a crontab that restarts the Celery service every 5 minutes, but that seems highly unnatural; among other reasons, the whole point of using Celery was to avoid crontab.

Does anyone have some insight on this?

PS: I'm aware of another similar question, but it's from 2012. I was hoping things had changed since then, namely with the introduction of beat in v4.1.

This works for Celery 4.0.1+, Python 2.7, and Redis.

from celery import Celery
import os
import logging

logger = logging.getLogger(__name__)
current_module = __import__(__name__)  # handle to this module, used to register tasks on it

CELERY_CONFIG = {
    'CELERY_BROKER_URL':
        'redis://{}/0'.format(os.environ.get('REDIS_URL', 'localhost:6379')),
    'CELERY_TASK_SERIALIZER': 'json',
}

celery = Celery(__name__, broker=CELERY_CONFIG['CELERY_BROKER_URL'])
celery.conf.update(CELERY_CONFIG)
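With this in place, a worker with an embedded beat scheduler can be started in the usual way; a minimal invocation, assuming the module above is saved as app.py:

celery -A app worker -B --loglevel=info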

I define a job in the following way:

job = {
    'task': 'my_function',                # name of a predefined function
    'schedule': {'minute': 0, 'hour': 0}, # crontab schedule
    'args': [2, 3],
    'kwargs': {}
}
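The schedule dict is expanded into a celery.schedules.crontab by the add_task function further below; for reference, the entry above is equivalent to:

from celery.schedules import crontab

crontab(minute=0, hour=0)  # fire once a day, at midnight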

I then define a decorator like this:

def add_to_module(f):
    # f is a Celery task; register it on this module under a predictable name
    setattr(current_module, 'tasks_{}__'.format(f.name), f)
    return f

My task is:

@add_to_module
@celery.task(name='my_function')  # explicit name so it matches job['task'] above
def my_function(x, y, **kwargs):
    return x + y

Then add a function which adds the task on the fly:

from celery.schedules import crontab

def add_task(job):
    logger.info("Adding periodic job: %s", job)
    if not (isinstance(job, dict) and 'task' in job):
        logger.error("Job %s is ill-formed", job)
        return False
    celery.add_periodic_task(
        crontab(**job.get('schedule', {'minute': 0, 'hour': 0})),
        get_from_module(job['task']).s(
            *job.get('args', []),
            **job.get('kwargs', {})
        ),
        name=job.get('name'),
        expires=job.get('expires')
    )
    return True


def get_from_module(f):
    # look up a task previously registered by add_to_module
    return getattr(current_module, 'tasks_{}__'.format(f))

After this, you can link the add_task function to a URL and have it create periodic tasks out of functions in your current module.
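A minimal sketch of such an endpoint, assuming a Flask app object alongside the Celery setup above (the route and payload shape are illustrative, not part of the original answer):

from flask import Flask, request, jsonify

app = Flask(__name__)

@app.route('/tasks', methods=['POST'])
def create_task():
    # expects a JSON body shaped like the job dict above
    job = request.get_json()
    ok = add_task(job)
    return jsonify({'created': ok}), 201 if ok else 400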

For this purpose you can use RedBeat. From the RedBeat GitHub:

RedBeat is a Celery Beat Scheduler that stores the scheduled tasks and runtime metadata in Redis.

For task creation:

import tasks  # module defining the Celery app (tasks.app) and its tasks
from redbeat import RedBeatSchedulerEntry as Entry

entry = Entry(f'urlCheck_{key}', 'tasks.urlSpeed', repeat,
              args=['GET', url, timeout, key], app=tasks.app)
entry.save()
entry.key  # keep this key around if you want to remove the entry later
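Here key, url, and timeout are placeholders from the snippet, and repeat is a Celery schedule object; a fixed interval could be built like this, for example:

from celery.schedules import schedule

repeat = schedule(run_every=60)  # run every 60 seconds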

To delete a task:

import tasks  # module defining the Celery app (tasks.app) and its tasks
from redbeat import RedBeatSchedulerEntry as Entry

entry = Entry.from_key(key, app=tasks.app)  # key from the previous step
entry.delete()
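For these entries to actually run, beat has to be started with the RedBeat scheduler, as described in the RedBeat README:

celery beat -S redbeat.RedBeatScheduler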

And there is an example that you can use: https://github.com/hamedsh/redBeat_example
