
Connect new Celery periodic task in Django

This is not a question, but rather help for those who found the periodic task declaration described in the Celery 4.0.1 documentation hard to integrate into Django: http://docs.celeryproject.org/en/latest/userguide/periodic-tasks.html#entries

Copy-pasted Celery configuration file main_app/celery.py:

from celery import Celery
from celery.schedules import crontab

app = Celery()

@app.on_after_configure.connect
def setup_periodic_tasks(sender, **kwargs):
    # Calls test('hello') every 10 seconds.
    sender.add_periodic_task(10.0, test.s('hello'), name='add every 10')

    # Calls test('world') every 30 seconds
    sender.add_periodic_task(30.0, test.s('world'), expires=10)

    # Executes every Monday morning at 7:30 a.m.
    sender.add_periodic_task(
        crontab(hour=7, minute=30, day_of_week=1),
        test.s('Happy Mondays!'),
    )

@app.task
def test(arg):
    print(arg)

But what if we use Django and our tasks are placed in another app? In Celery 4.0.1 we no longer have the @periodic_task decorator, so let's see what we can do.
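
For context, this is roughly what the removed decorator looked like (a sketch of the old Celery 3.x style, from memory; the exact import path varied between releases, and none of this runs on 4.x):

from datetime import timedelta
from celery.task import periodic_task  # removed in Celery 4.x

@periodic_task(run_every=timedelta(seconds=10))
def test():
    print('hello')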

First case

If you prefer to keep tasks and their schedules close to each other:

main_app/some_app/tasks.py

from main_app.celery import app as celery_app

@celery_app.on_after_configure.connect
def setup_periodic_tasks(sender, **kwargs):
    # Calls test('hello') every 10 seconds.
    sender.add_periodic_task(10.0, test.s('hello'))

@celery_app.task
def test(arg):
    print(arg)

We can run beat in debug mode:

celery -A main_app beat -l debug

and we will see that there is no such periodic task.

Second case

We can try to describe all periodic tasks in the configuration file instead, like this:

main_app/celery.py

...
app = Celery()

@app.on_after_configure.connect
def setup_periodic_tasks(sender, **kwargs):
    # Calls test('hello') every 10 seconds.
    from main_app.some_app.tasks import test
    sender.add_periodic_task(10.0, test.s('hello'))
...

The result is the same, but the behavior differs, as manual debugging with pdb shows. In the first example the setup_periodic_tasks callback is simply never triggered. In the second example we get django.core.exceptions.AppRegistryNotReady: Apps aren't loaded yet. (this exception is not printed anywhere).

With Django we need to use another signal: @celery_app.on_after_finalize.connect. It works for both:

  • declaring the task schedule close to the task in app/tasks.py, because this signal is fired only after all tasks.py modules, and therefore all possible receivers, have been imported (first case);
  • a centralized schedule declaration, because the Django apps will already be initialized and ready for import (second case).

I think I should write down the final declarations:

First case

Declaring the task schedule close to the task:

main_app/some_app/tasks.py

from main_app.celery import app as celery_app

@celery_app.on_after_finalize.connect
def setup_periodic_tasks(sender, **kwargs):
    # Calls test('hello') every 10 seconds.
    sender.add_periodic_task(10.0, test.s('hello'))

@celery_app.task
def test(arg):
    print(arg)

Second case

Centralized schedule declaration in the configuration file main_app/celery.py:

...

app = Celery()

@app.on_after_finalize.connect
def setup_periodic_tasks(sender, **kwargs):
    # Calls test('hello') every 10 seconds.
    from main_app.some_app.tasks import test
    sender.add_periodic_task(10.0, test.s('hello'))
...
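
As an aside (not part of the original answer): when the Celery app is configured from Django settings via app.config_from_object('django.conf:settings', namespace='CELERY'), the same schedule can also be declared statically in settings.py, without any signal at all. A minimal sketch, assuming the task is registered under the dotted name main_app.some_app.tasks.test:

# settings.py
from celery.schedules import crontab

CELERY_BEAT_SCHEDULE = {
    'test-every-10-seconds': {
        'task': 'main_app.some_app.tasks.test',  # assumed registered task name
        'schedule': 10.0,
        'args': ('hello',),
    },
    'happy-mondays': {
        'task': 'main_app.some_app.tasks.test',
        'schedule': crontab(hour=7, minute=30, day_of_week=1),
        'args': ('Happy Mondays!',),
    },
}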

If the intent is to keep the task logic separate in tasks.py, then calling from main_app.some_app.tasks import test inside setup_periodic_tasks did not work for me. What did work is the following:

celery.py

@app.on_after_configure.connect
def setup_periodic_tasks(sender, **kwargs):
    # Calls test('hello') every 10 seconds.
    sender.add_periodic_task(10.0, test.s('hello'), name='add every 10')

@app.task
def test(arg):
    print(arg)
    from some_app.tasks import test
    test(arg)

tasks.py

from celery import shared_task

@shared_task
def test(arg):
    print('world')

This results in the following output:

[2017-10-26 22:52:42,262: INFO/MainProcess] celery@ubuntu-xenial ready.
[2017-10-26 22:52:42,263: INFO/MainProcess] Received task: main_app.celery.test[3cbdf4fa-ff63-401a-a9e4-cfd1b6bb4ad4]  
[2017-10-26 22:52:42,367: WARNING/ForkPoolWorker-2] hello
[2017-10-26 22:52:42,368: WARNING/ForkPoolWorker-2] world
[2017-10-26 22:52:42,369: INFO/ForkPoolWorker-2] Task main_app.celery.test[3cbdf4fa-ff63-401a-a9e4-cfd1b6bb4ad4] succeeded in 0.002823335991706699s: None
[2017-10-26 22:52:51,205: INFO/Beat] Scheduler: Sending due task add every 10 (main_app.celery.test)
[2017-10-26 22:52:51,207: INFO/MainProcess] Received task: main_app.celery.test[ce0f3cfc-54d5-4d74-94eb-7ced2e5a6c4b]  
[2017-10-26 22:52:51,209: WARNING/ForkPoolWorker-2] hello
[2017-10-26 22:52:51,209: WARNING/ForkPoolWorker-2] world

I got it working using this:

celery.py

import os
from celery import Celery

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'mysite.settings')

app = Celery('mysite')
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks()

tasks.py

from celery import current_app
app = current_app._get_current_object()

@app.task
def test(arg):
    print(arg)

@app.on_after_finalize.connect
def app_ready(**kwargs):
    """
    Called once after app has been finalized.
    """
    sender = kwargs.get('sender')

    # periodic tasks
    speed = 5
    sender.add_periodic_task(speed, test.s('foo'),name='update leases every {} seconds'.format(speed))

Running the worker:

celery -A mysite worker --beat --scheduler django --loglevel=info
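
Note (not stated in the answer itself): the --scheduler django option presumably refers to the database scheduler from the django-celery-beat package, which has to be installed and configured separately. A minimal sketch of that extra setup, assuming pip install django-celery-beat and python manage.py migrate have been run:

# settings.py
INSTALLED_APPS = [
    # ...
    'django_celery_beat',
]

# scheduler used by beat when not overridden on the command line
CELERY_BEAT_SCHEDULER = 'django_celery_beat.schedulers:DatabaseScheduler'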

If you want to keep the task logic separate, use the following setup:

celery.py

import os
from celery import Celery
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'backend.settings') # your settings.py path

app = Celery()

@app.on_after_configure.connect
def setup_periodic_tasks(sender, **kwargs):
    sender.add_periodic_task(5, periodic_task.s('sms'), name='SMS Process')
    sender.add_periodic_task(60, periodic_task.s('email'), name='Email Process')


@app.task
def periodic_task(taskname):
    from myapp.tasks import sms_process, email_process

    if taskname == 'sms':
        sms_process()

    elif taskname == 'email':
        email_process()

An example task in the Django app myapp:

myapp/tasks.py

def sms_process():
    print('send sms task')

def email_process():
    print('send email task')
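
The answer does not show how it was started; assuming the project package is named backend (inferred from the 'backend.settings' path above), a worker with an embedded beat could be run like this:

celery -A backend worker --beat --loglevel=info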

I was also struggling with this (no activity in the terminal) and got it working as follows:

Django version 3.2.8, Celery version 5.2.0

In a Django project called Proj:

Proj/Proj/celery.py (file next to settings.py)

celery.py

import os

from celery import Celery

# Set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'Proj.settings')

app = Celery('Proj')

# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
#   should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')

# Load task modules from all registered Django apps.
app.autodiscover_tasks()

__init__.py (in the same folder as settings.py)

__init__.py

# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .celery import app as celery_app

__all__ = ('celery_app',)

In any Django sub-app folder, a file named tasks.py (next to models.py):

tasks.py

from Proj.celery import app

# Schedule
@app.on_after_finalize.connect
def setup_periodic_tasks(sender, **kwargs):
    # Calls test('hello') every 1 second.
    sender.add_periodic_task(1.0, test.s('hello'), name='add every 1')

    # Calls test('world') every 3 seconds
    sender.add_periodic_task(3.0, test.s('world'), expires=10)

# Tasks
@app.task
def test(arg):
    print(arg)

Then run the following in the terminal, using your virtual environment if applicable:

celery -A Proj worker -B

Result (confirming it is working):

[2021-11-10 11:22:22,070: WARNING/MainProcess] /.venv/lib/python3.9/site-packages/celery/fixups/django.py:203: UserWarning: Using settings.DEBUG leads to a memory
            leak, never use this setting in production environments!
  warnings.warn('''Using settings.DEBUG leads to a memory

[2021-11-10 11:22:22,173: WARNING/ForkPoolWorker-9] hello
[2021-11-10 11:22:22,173: WARNING/ForkPoolWorker-3] hello
[2021-11-10 11:22:22,173: WARNING/ForkPoolWorker-2] world
