
Django Celery - Passing an object to the views and between tasks using RabbitMQ

This is the first time I'm using Celery, and honestly, I'm not sure I'm doing it right. My system has to run on Windows, so I'm using RabbitMQ as the broker.

As a proof of concept, I'm trying to create a single object where one task sets a value, another task reads it, and a certain URL shows the object's current value. However, I'm having trouble sharing the object between all of them.

This is my celery.py

from __future__ import absolute_import, unicode_literals
import os
from celery import Celery
from django.conf import settings

os.environ.setdefault('DJANGO_SETTINGS_MODULE','cesGroundStation.settings')

app = Celery('cesGroundStation')

app.config_from_object('django.conf:settings')

app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)

@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))

The object I'm trying to share is:

class SchedulerQ:

    item = 0

    def setItem(self, item):
        self.item = item

    def getItem(self):
        return self.item

This is my tasks.py

from celery import shared_task
from time import sleep
from scheduler.schedulerQueue import SchedulerQ

schedulerQ = SchedulerQ()

@shared_task()
def SchedulerThread():
    print("Starting Scheduler")
    counter = 0
    while True:
        counter += 1
        if counter > 100:
            counter = 0
        schedulerQ.setItem(counter)
        print("In Scheduler thread - " + str(counter))
        sleep(2)
    print("Exiting Scheduler")

@shared_task()
def RotatorsThread():
    print("Starting Rotators")
    while True:
        item = schedulerQ.getItem()
        print("In Rotators thread - " + str(item))
        sleep(2)
    print("Exiting Rotators")

@shared_task()
def setSchedulerQ(schedulerQueue):
    schedulerQ = schedulerQueue

@shared_task()
def getSchedulerQ():
    return schedulerQ

I'm starting my tasks in my apps.py. I'm not sure this is the right place, as the tasks/workers don't seem to do anything until I start the workers in a separate console with celery -A cesGroundStation -l info.

from django.apps import AppConfig
from scheduler.schedulerQueue import SchedulerQ
from scheduler.tasks import SchedulerThread, RotatorsThread, setSchedulerQ, getSchedulerQ

class SchedulerConfig(AppConfig):
    name = 'scheduler'

    def ready(self):
        schedulerQ = SchedulerQ()
        setSchedulerQ.delay(schedulerQ)
        SchedulerThread.delay()
        RotatorsThread.delay()

In my views.py I have this:

def schedulerQ():
    queue = getSchedulerQ.delay()
    return HttpResponse("Your list: " + queue)

The Django app runs without errors; however, the output from "celery -A cesGroundStation -l info" looks like this: [screenshot: Celery command output]

First, it seems to start multiple "SchedulerThread" tasks; second, the "SchedulerQ" object isn't being passed to the Rotators task, as it never reads the updated value.

And if I go to the URL that maps to the views.schedulerQ view, I get this error: [screenshot: Django views error]

I have very little experience with Python, Django, and web development in general, so I have no idea where to start with that last error. Solutions I've found suggest using Redis to pass the object to the views, but I don't know how I'd do that with RabbitMQ. Later on, the schedulerQ object will hold a queue, the scheduler and rotators will act more like a producer and a consumer, and the view will show the contents of the queue, so I suspect using the database might be too resource-intensive. How can I share this object across all the tasks, and is this even the right approach?

If you need to share information between tasks (in this example, what you are currently putting in your class), the right approach is to use a persistence layer, such as a database or a result backend, to store it.

Celery operates on a distributed message-passing paradigm. A good way to distill that idea for this example: every time a task is dispatched, you must assume it may run in a separate interpreter, with your module loaded independently of every other task. That module-level SchedulerQ instance is created anew in each worker process, so the value one task sets is invisible to the others.
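A minimal stdlib illustration of that isolation (no Celery involved): a child process increments its own copy of a module-level variable, and the parent's copy never changes. The `fork` start method used below is not available on Windows, but `spawn` isolates module state in the same way.

```python
# Demonstration that module-level state is per-process, like schedulerQ
# in tasks.py: a child process mutates only its own copy.
import multiprocessing as mp

counter = 0  # module-level state, analogous to the schedulerQ instance

def bump_and_report(q):
    global counter
    counter += 1           # mutates this process's private copy only
    q.put(counter)

if __name__ == '__main__':
    ctx = mp.get_context('fork')   # Windows would use 'spawn'; same isolation
    q = ctx.Queue()
    p = ctx.Process(target=bump_and_report, args=(q,))
    p.start()
    p.join()
    print(q.get())   # child saw 1
    print(counter)   # parent still has 0
```

This is why SchedulerThread's updates never reach RotatorsThread: each worker process holds its own independent schedulerQ.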

You can share information between tasks in the ways described in the docs linked previously; the best-practice tips also discuss data-persistence concerns.
