
Pass arguments from S3 trigger to Lambda function

Using AWS cloud services, I have an S3 trigger that monitors a bucket and invokes a Lambda function. The function then reads the S3 object and uses it to populate a DynamoDB table.

The problem is that I now need to monitor multiple directories for changes, and each directory has metadata (not available in the object itself) that needs to be written to DynamoDB. I do not know of a way to pass this metadata from the trigger to the Lambda. I currently have the Lambda duplicated for each directory, with the metadata saved as environment variables on each copy. This works, but it feels like a terrible hack.

How can I use a single Lambda to monitor multiple directories, passing the metadata from the trigger to the Lambda?

Sadly, you can't add extra information to the S3 notification records. But if the folders are part of the same bucket, then one Lambda should be enough, in my opinion.

This is based on the fact that you can differentiate between directories using the prefixes of the S3 object keys.

For example, if you upload the following two files to the bucket:

dir1/file1.csv
dir2/file3.txt

your Lambda would be triggered for each of them. In the Lambda you could use basic if-else matching to check whether an object's prefix is `dir1` or `dir2`. Based on this, you could choose different metadata to write to DynamoDB.
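For reference, the object key arrives inside the S3 event record, so the handler can extract it before doing the prefix check. Here is a minimal sketch; the record fields below follow the standard S3 notification format, while the handler name and the trimmed sample event are illustrative:

```python
import urllib.parse

def lambda_handler(event, context):
    # A single S3 notification can batch multiple records into one invocation.
    keys = []
    for record in event['Records']:
        # Object keys are URL-encoded in the event (e.g. spaces become '+').
        key = urllib.parse.unquote_plus(record['s3']['object']['key'])
        keys.append(key)
    return keys

# Example event, trimmed to just the fields the handler reads:
sample_event = {'Records': [{'s3': {'object': {'key': 'dir1/file1.csv'}}}]}
```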

Very roughly, with two folders, your Lambda function could contain something like (pseudo-code):

if object_key.startswith('dir1/'):
    metadata = {...}  # metadata for dir1
elif object_key.startswith('dir2/'):
    metadata = {...}  # metadata for dir2

dynamodb.put_item(Item={'object_key': object_key, **metadata})

You are basically doing this already, just by passing the metadata through environment variables to different Lambda functions. Obviously, if you have many folders to monitor, you can store the metadata outside the Lambda, e.g. in another DynamoDB table, or in Parameter Store if the metadata changes often.
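As a sketch of that idea, the per-prefix metadata can live in a single mapping that the one Lambda consults; the same mapping could instead be loaded once at cold start from another DynamoDB table or from SSM Parameter Store. All names and metadata values below are illustrative:

```python
# Hypothetical mapping from key prefix to that directory's metadata.
# In practice this dict could be fetched from DynamoDB or Parameter Store.
PREFIX_METADATA = {
    'dir1/': {'source': 'team-a', 'format': 'csv'},
    'dir2/': {'source': 'team-b', 'format': 'txt'},
}

def metadata_for(object_key):
    """Return the metadata for the first prefix matching the object key."""
    for prefix, metadata in PREFIX_METADATA.items():
        if object_key.startswith(prefix):
            return metadata
    raise ValueError(f'No metadata configured for key: {object_key}')
```

This keeps the if-else chain out of the handler: adding a new monitored directory becomes a one-line (or one-row) config change rather than a code change or a new Lambda.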

Hope this helps.

