
Sending messages to django-channels via celery

So I have a scheduled celery beat task (celery.py):

@app.on_after_configure.connect
def setup_periodic_tasks(sender, **kwargs):
    sender.add_periodic_task(10.0, test_event, name='test')

And the task (events/tasks.py):

@shared_task
def test_event():
    from .models import Event
    Event.objects.create()

When the event is created, a receiver fires that should send a message to a channels group (events/receivers.py):

@receiver(post_save, sender=Event)
def event_post_add(sender, instance, created, *args, **kwargs):
    if created:
        print("receiver fired")
        Group("test").send({
            "text": json.dumps({
                'type': 'test',
            })
        })

The main problem is that the receiver fires in the celery beat process, and nothing gets sent via django channels. No error messages, nothing; it's simply not sent.

How can I integrate these two so that I will be able to send messages to channels from celery background processes?
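For what it's worth, on Channels 2+ the usual integration is to call the channel layer directly from the Celery process with asgiref's async_to_sync, using a channel backend (such as Redis) shared between the worker and the server; the Group("test").send API above is Channels 1. Below is a stdlib-only sketch of that sync-to-async bridge, with a hypothetical group_send standing in for channel_layer.group_send:

```python
import asyncio

# In a real Channels 2+ project you would instead use:
#     from asgiref.sync import async_to_sync
#     from channels.layers import get_channel_layer
#     async_to_sync(get_channel_layer().group_send)("test", {...})
# Here `group_send` is a stand-in that just records messages.
sent = []

async def group_send(group, message):
    # The real channel layer pushes the message to every consumer
    # subscribed to the group (e.g. via Redis); we only record it.
    sent.append((group, message))

def sync_group_send(group, message):
    # async_to_sync does roughly this: run the coroutine to completion
    # from synchronous code such as a Celery task or a signal handler.
    asyncio.run(group_send(group, message))

# What the post_save receiver would do through the bridge:
sync_group_send("test", {"type": "test.message", "text": '{"type": "test"}'})
```

The key point is that the sending side and the consumers must share the same channel layer backend; an in-memory layer in the Celery process cannot reach consumers living in the web server process.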

Hi, I don't know if you found a solution or not, but I was stuck on this problem myself, so I tried a workaround: I created a view for the message that needs to be sent over the websocket, and I make a request to it from celery beat. The view:

from asgiref.sync import async_to_sync
from channels.layers import get_channel_layer
from django.http import HttpResponse

def send_message(request, uuid, name):
    group = f"uuid_{uuid}"
    data = {
        'message': f'{name} Driver is now Available',
        'sender': 'HQ',
        'id': str(uuid),
        'to': str(uuid),
        'type': 'DriverAvailable',
    }
    channel_layer = get_channel_layer()
    async_to_sync(channel_layer.group_send)(
        group,
        {
            'type': 'chat_message',
            'message': data,
        }
    )
    return HttpResponse('sent')
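For the task's GET request to reach this view, a matching URL route is needed. A hypothetical urls.py entry matching the path the task requests:

```python
# urls.py -- hypothetical route matching the URL used in the task:
#   {DOMAIN}message/sendMessage/<uuid>/<name>
from django.urls import path

from . import views

urlpatterns = [
    path('message/sendMessage/<uuid:uuid>/<str:name>',
         views.send_message, name='send_message'),
]
```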

and the task:

import logging

import requests
from celery import shared_task
from django.utils import timezone

from .models import Driver_api

logger = logging.getLogger(__name__)

@shared_task
def my_task():
    for d in Driver_api.objects.all():
        if d.available_on is not None:
            if d.available_on <= timezone.now():
                d.available_on = None
                d.save()
            uuid = str(d.user.uuid)
            # DOMAIN is the site's base URL, defined elsewhere
            requests.get(f'{DOMAIN}message/sendMessage/{uuid}/{d.name}')
        logger.info('Adding {0}'.format(d.user.uuid))

Sorry for any oversights in my approach to the problem.

Signals are not asynchronous in Django. So in:

@shared_task
def test_event():
    from .models import Event
    Event.objects.create()  # This fires the signal, and the signal handler
                            # still runs inside the celery worker process

The only clean approach I know of would be for your backend to listen for RabbitMQ messages. A simple but hacky approach is to fire a request to a Django view.
