
How to have more than two DynamoDB Streams that trigger lambdas

We have a DynamoDB table with two processes reading from its stream (two Lambdas, configured as triggers). We now need to add a third, but are aware that AWS strongly recommends having no more than two simultaneous readers per stream shard (I assume adding a third Lambda trigger brings our reader count to 3?). How can we add a third service that can make use of the DynamoDB stream without impacting performance?

My very early thought is to replace those two Lambdas with a single one that puts the stream record onto SQS or publishes it to an SNS topic, alerting any Lambdas subscribed to the topic; they can then do their thing with the stream record.

Your observation is correct: since your architecture is moving toward a Pub-Sub model, it makes sense to publish the stream messages as notifications on an SNS topic and have the subscribers listen to that topic.
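As a rough sketch of that Pub-Sub approach (the topic ARN and region below are placeholders, not anything from your setup), the stream-triggered Lambda could simply publish each stream record to an SNS topic and let every subscribed Lambda or SQS queue consume it:

import json
import boto3

sns_client = boto3.client('sns', region_name='your-region')

# Placeholder ARN -- use the topic your consumer lambdas subscribe to.
TOPIC_ARN = 'arn:aws:sns:your-region:123456789012:dynamodb-stream-fanout'

def lambda_handler(event, context):
    # Publish each DynamoDB stream record individually so subscribers
    # can process records independently of one another.
    records = event.get('Records', [])
    for record in records:
        sns_client.publish(TopicArn=TOPIC_ARN, Message=json.dumps(record))
    return 'Published %d records' % len(records)

Each additional consumer is then just another subscription on the topic, so you never add more readers to the stream itself.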

I would have only one thin Lambda that triggers on the DynamoDB stream, and have that Lambda just invoke your other 3 "actual" Lambdas. You can configure dead-letter SQS queues, but other than that I would skip using SQS or SNS for anything.

In Python, the trigger Lambda's handler would look something like this:

import json
import boto3

lambda_client = boto3.client('lambda', region_name='your-region')

def lambda_handler(event, context):
    # The Invoke API expects the payload as JSON text, not a dict.
    payload = json.dumps(event)
    # 'Event' = asynchronous invocation: fire and forget, so each target
    # lambda processes the stream records independently.
    lambda_client.invoke(FunctionName='function-name-1', InvocationType='Event', Payload=payload)
    lambda_client.invoke(FunctionName='function-name-2', InvocationType='Event', Payload=payload)
    lambda_client.invoke(FunctionName='function-name-3', InvocationType='Event', Payload=payload)
    return 'Replication completed'
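
For the dead-letter queues mentioned above, one option (just a sketch; the function name and queue ARN are placeholders) is to attach an SQS dead-letter queue to each downstream Lambda, so asynchronous invocations that still fail after retries are kept instead of silently dropped:

import boto3

lambda_client = boto3.client('lambda', region_name='your-region')

# Send failed async invocations of this function to an SQS dead-letter queue.
lambda_client.update_function_configuration(
    FunctionName='function-name-1',
    DeadLetterConfig={
        'TargetArn': 'arn:aws:sqs:your-region:123456789012:function-1-dlq'
    },
)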

If you start having too many functions to invoke, consider migrating this to something like Kinesis.
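If you do go that way, DynamoDB can also stream its change data into a Kinesis data stream, which tolerates more concurrent consumers than DynamoDB Streams. A minimal sketch, assuming the table and the Kinesis stream already exist (both names below are placeholders):

import boto3

dynamodb = boto3.client('dynamodb', region_name='your-region')

# Route the table's change data capture into an existing Kinesis data stream,
# then attach as many consumers to Kinesis as you need.
dynamodb.enable_kinesis_streaming_destination(
    TableName='your-table',
    StreamArn='arn:aws:kinesis:your-region:123456789012:stream/your-stream',
)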
