
How to place dynamo db records in a SQS queue to trigger lambda

I have a Lambda that scans through the items in a DynamoDB table and does some post-processing on them. This works fine while the table holds a small number of entries, but it will soon grow and the 15-minute Lambda timeout will be reached.

I am considering using SQS, but I'm not sure how to place records from the table into an SQS queue that would then trigger the Lambda concurrently.

Is this a feasible solution? Or should I just spawn threads inside the Lambda and process the items that way? I'm also unsure whether that would still count towards the 15-minute limit.
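The approach I'm considering would look roughly like this: paginate a Scan and push the items into SQS in batches of 10 (the SendMessageBatch maximum). The table and queue names are placeholders:

```python
import json


def batches(items, size=10):
    """Split a list into chunks of at most `size` (SQS batch limit is 10)."""
    return [items[i:i + size] for i in range(0, len(items), size)]


def enqueue_table_items(table_name, queue_url):
    """Scan the whole table and enqueue every item as an SQS message."""
    import boto3  # deferred so `batches` can be used without AWS deps

    dynamodb = boto3.client("dynamodb")
    sqs = boto3.client("sqs")
    # The scan paginator transparently follows LastEvaluatedKey for us.
    for page in dynamodb.get_paginator("scan").paginate(TableName=table_name):
        for batch in batches(page["Items"]):
            sqs.send_message_batch(
                QueueUrl=queue_url,
                Entries=[
                    {"Id": str(i), "MessageBody": json.dumps(item)}
                    for i, item in enumerate(batch)
                ],
            )
```

This still means one "dispatcher" invocation has to walk the whole table, so it only pushes the timeout problem around rather than removing it.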

Any suggestions will be appreciated, thanks

DynamoDB Streams is a perfect use case for this: every item added or modified enters the stream, which in turn triggers your Lambda function to do the processing. Of course, whether this fits depends strongly on your particular use case.
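A minimal handler for a stream-triggered Lambda might look like this. The event field names (`Records`, `eventName`, `dynamodb.NewImage`) follow the documented stream record shape; the processing step itself is a placeholder:

```python
def handler(event, context=None):
    """Process INSERT/MODIFY records from a DynamoDB stream event."""
    processed = []
    for record in event.get("Records", []):
        if record.get("eventName") in ("INSERT", "MODIFY"):
            # NewImage holds the item after the change, in DynamoDB JSON
            # (e.g. {"pk": {"S": "abc"}}); REMOVE events carry no NewImage.
            new_image = record["dynamodb"].get("NewImage", {})
            processed.append(new_image)  # stand-in for real post-processing
    return {"processed": len(processed)}
```

With this wiring there is no Scan at all: each invocation handles only a small batch of changed items, so the 15-minute limit stops being a concern.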

If, for example, you require all the data from the table, you can maintain useful aggregations and store those aggregates in a single item. Then, instead of having to Scan the table to get all the items, a single GetItem request already returns your aggregate data.
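As a sketch of that pattern, a stream-triggered Lambda could maintain a running item count in one aggregate item via an atomic `ADD` update. The table name, key, and attribute names here are illustrative, not from your schema:

```python
def count_delta(event_name):
    """Map a stream event type to the change in total item count."""
    return {"INSERT": 1, "REMOVE": -1}.get(event_name, 0)


def handler(event, context=None):
    """Fold a batch of stream records into the single aggregate item."""
    import boto3  # deferred so count_delta stays dependency-free

    table = boto3.resource("dynamodb").Table("my-table")  # placeholder name
    delta = sum(count_delta(r["eventName"]) for r in event.get("Records", []))
    if delta:
        table.update_item(
            Key={"pk": "AGGREGATE"},  # the one well-known aggregate item
            UpdateExpression="ADD itemCount :d",  # atomic increment/decrement
            ExpressionAttributeValues={":d": delta},
        )
```

Readers then do `GetItem(Key={"pk": "AGGREGATE"})` instead of scanning the table.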

https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/Streams.html

As @LeeHannigan says, use DynamoDB Streams to capture your table's CRUD events. Streams has traditionally had two targets for consuming these change events: Lambda and Kinesis.

But what about an SQS destination? EventBridge Pipes adds another way to consume DynamoDB Streams. An EB Pipe, from the newer EventBridge Pipes integration service, would have the DynamoDB stream as its source and SQS as its target.

The flow would be DynamoDB Streams -> EB Pipes -> SQS -> Lambda.
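A rough sketch of wiring this up with boto3's `pipes` client follows. All ARNs and the pipe name are placeholders, and the parameter shape follows the CreatePipe API; check the current API reference before relying on it:

```python
def pipe_request(name, stream_arn, queue_arn, role_arn):
    """Build the CreatePipe request for a DynamoDB Stream -> SQS pipe."""
    return {
        "Name": name,
        "RoleArn": role_arn,  # role must allow stream reads and sqs:SendMessage
        "Source": stream_arn,
        "SourceParameters": {
            "DynamoDBStreamParameters": {"StartingPosition": "LATEST"}
        },
        "Target": queue_arn,
    }


def create_pipe(name, stream_arn, queue_arn, role_arn):
    import boto3  # deferred import: only needed when actually calling AWS

    return boto3.client("pipes").create_pipe(
        **pipe_request(name, stream_arn, queue_arn, role_arn)
    )
```

The Lambda then consumes the SQS queue through a normal event source mapping, so you get SQS's batching, retry, and DLQ behavior between the stream and your function.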
