
Consume SQS messages using an AWS Lambda function

I have 2 FIFO SQS queues which receive JSON messages that are to be indexed into Elasticsearch. One queue constantly receives the delta changes made to the database. The second queue is used for database re-indexing, i.e. the entire 50 TB of data is re-indexed every couple of months (when everything is added to the queue). I have a Lambda function that consumes the messages from the queues and writes them to the appropriate index (either the active index or the index being rebuilt).

How should I trigger the Lambda function to best process the backlog of messages in SQS, so that it processes both queues as quickly as possible?

A constraint I have is that the queue items need to be processed in order. If the Lambda function could run indefinitely without the 5-minute limit, I could keep one function running that constantly processes messages.

The standard way to do this is to use a CloudWatch Events rule that runs periodically. This lets you pull data from the queue on a regular schedule.

Because you have to poll SQS, this may not lead to the fastest processing of messages. Also, be careful if you constantly have messages to process: Lambda will end up being far more expensive than a small EC2 instance for handling them.
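As a rough illustration of the scheduled-polling approach, here is a minimal sketch of a Lambda handler that a CloudWatch Events rule could invoke every minute or so. The queue URL and the `index_document` helper are hypothetical placeholders; the handler long-polls the FIFO queue and stops shortly before the function would time out, so the last batch can finish cleanly.

```
import boto3

sqs = boto3.client("sqs")

# Hypothetical FIFO queue URL -- replace with your own.
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/delta-changes.fifo"


def index_document(body):
    """Placeholder for the real Elasticsearch indexing call."""
    pass


def handler(event, context):
    # Keep polling until the queue is empty or the function is close
    # to its timeout (leave ~30 s of headroom to finish the last batch).
    while context.get_remaining_time_in_millis() > 30_000:
        resp = sqs.receive_message(
            QueueUrl=QUEUE_URL,
            MaxNumberOfMessages=10,   # SQS returns at most 10 messages per call
            WaitTimeSeconds=20,       # long polling
        )
        messages = resp.get("Messages", [])
        if not messages:
            break  # queue drained for now; wait for the next scheduled run

        for msg in messages:          # delivered in FIFO order
            index_document(msg["Body"])
            sqs.delete_message(
                QueueUrl=QUEUE_URL,
                ReceiptHandle=msg["ReceiptHandle"],
            )
```

Processing and deleting one message at a time preserves ordering within a message group, and if a scheduled run overlaps a previous one, the visibility timeout keeps the two invocations from seeing the same messages.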

Instead of pushing your messages directly into SQS, you could publish them to an SNS topic with two subscribers registered:

  1. Subscriber: SQS
  2. Subscriber: Lambda Function

This has the benefit that your Lambda function is invoked at the same time as the message is stored in SQS.
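For illustration, here is a minimal sketch of that fan-out, assuming a standard (non-FIFO) SNS topic; the topic ARN is a hypothetical placeholder. The producer publishes each change to the topic, and the Lambda subscriber receives the same payload that SNS also delivers to the SQS queue.

```
import json
import boto3

sns = boto3.client("sns")

# Hypothetical topic ARN -- replace with your own.
TOPIC_ARN = "arn:aws:sns:us-east-1:123456789012:index-updates"


def publish_change(document):
    """Producer side: publish one change to the topic instead of
    writing it straight to SQS."""
    sns.publish(TopicArn=TOPIC_ARN, Message=json.dumps(document))


def handler(event, context):
    """Lambda subscriber: SNS invokes this as soon as the message is
    published, while the SQS subscription keeps a durable copy."""
    for record in event["Records"]:
        body = json.loads(record["Sns"]["Message"])
        # index `body` into Elasticsearch here
        print("received", body)
```

The SQS subscription gives you a durable, replayable copy of every message, while the Lambda subscription gives you the immediate invocation.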

Not sure I fully understand your problem, but here are my 2 cents:

  1. If you have a constant, real-time stream of data, consider using Kinesis Streams with one shard in order to preserve FIFO ordering. You can consume the data in batches of n items using Lambda; the batch size n and the Lambda memory size are up to you (see the sketch after this list).

    • with this solution you pay a low, constant price for Kinesis Streams and a variable price for Lambda.
  2. If you are really attached to SQS and real-time processing does not matter, you can consume items with Lambda, EC2, or AWS Batch: either trigger many Lambdas with CloudWatch Events, keep an EC2 instance alive, or trigger an AWS Batch job on a regular basis.

    • there is an economic trade-off to explore; each solution is the best for one use case and the worst for another, so make your choice ;)
    • I prefer SQS + Lambdas when there are few items to consume and SQS + Batch when there are a lot of items to consume.
  3. You could probably also consider using SNS + SQS + Lambda as @maikay suggests in his answer, but I wouldn't choose that solution.
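To make option 1 concrete, here is a minimal sketch of the Kinesis variant. The stream name and partition key are hypothetical placeholders; with a single shard and a constant partition key, records reach the Lambda consumer strictly in the order they were put on the stream.

```
import base64
import json

import boto3

kinesis = boto3.client("kinesis")


def publish_change(document):
    """Producer side: put one JSON document on the stream.
    With a single shard, a constant partition key keeps strict order."""
    kinesis.put_record(
        StreamName="index-changes",   # hypothetical stream name
        Data=json.dumps(document),
        PartitionKey="all",
    )


def handler(event, context):
    """Consumer side: Lambda is invoked by the Kinesis event source
    with an ordered batch of records from the shard."""
    for record in event["Records"]:
        payload = base64.b64decode(record["kinesis"]["data"])
        document = json.loads(payload)
        # index `document` into Elasticsearch here
        print("indexing", document)
```

You configure the stream as the Lambda's event source and choose the batch size there; Lambda then invokes the handler with at most that many records per call.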

Hope it helps. Feel free to ask for clarifications. Good luck!
