
Celery SQS Broker not receiving event on task

We want to use Celery to listen to an SQS queue and process each incoming event in a task.

This is the celeryconfig.py file:

from kombu import (
    Exchange,
    Queue
)


# Broker: Amazon SQS
broker_transport = 'sqs'
broker_transport_options = {'region': 'us-east-1'}
worker_concurrency = 10

# Serialization settings
accept_content = ['application/json']
result_serializer = 'json'
content_encoding = 'utf-8'
task_serializer = 'json'

worker_enable_remote_control = False
worker_send_task_events = True
result_backend = None

# Consume from the re.fifo queue
task_queues = (
    Queue('re.fifo', exchange=Exchange('consume', type='direct'), routing_key='consume'),
)

# Route the 'consume' task to the re.fifo queue
task_routes = {'consume': {'queue': 're.fifo'}}
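
For completeness: a standalone config module like this only takes effect once it is loaded into the app, typically with app.config_from_object. The sketch below is only for reference; the exact wiring in our project is omitted from this post.

from celery import Celery

app = Celery(__name__)
app.config_from_object('celeryconfig')   # pulls in broker_transport, task_queues, task_routes, ...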

And this is the celery.py file:


from celery.utils.log import get_task_logger
from celery import Celery

app = Celery(__name__)

logger = get_task_logger(__name__)

@app.task(routing_key='consume', name="consume", bind=True, acks_late=True, ignore_result=True)
def consume(self, msg):
    print('Message received')
    logger.info('Message received')
    # DO SOMETHING WITH THE RECEIVED MESSAGE
    # print('this is the new message', msg)
    return True

We are pushing events to SQS using the AWS CLI:

aws --endpoint-url http://localhost:9324 sqs send-message --queue-url http://localhost:9324/queue/re.fifo --message-group-id owais --message-deduplication-id test18 --message-body {\"test\":\"test\"}

We are receiving the event on the Celery worker, but our consume task is not being called, and we want it to be called.


How can we get the consume task called for events coming from SQS? Any help would be appreciated.

The message you pushed to SQS with the AWS CLI is not in Celery's task message format, so the worker cannot recognize it as a task. You need to call consume.delay(msg) to push messages to SQS; that publishes a properly formatted task message, and the worker will then recognize it and run the task.
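
For illustration, a minimal producer sketch. Assumptions: the module that defines consume (celery.py in the question) is importable in the producer process, and the producer uses the same celeryconfig so the message lands on the re.fifo queue; the import path below is hypothetical and needs to be adjusted to your project layout.

from myproject.celery import app, consume   # hypothetical import path; point it at the celery.py above

# .delay() wraps the arguments in Celery's task message protocol and publishes
# the message to the broker, so the worker can recognize it and run the task.
consume.delay({'test': 'test'})

# Alternative: publish by registered task name without importing the task object.
app.send_task('consume', args=[{'test': 'test'}], queue='re.fifo')

Either call produces a message with the envelope the worker expects; a plain SQS message containing only a JSON body lacks that envelope, which is why the worker receives it but never dispatches it to the consume task.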
