
Lambda trigger for a dynamic specific path on S3 upload

I am trying to create a Lambda function that is triggered when a folder is uploaded to an S3 bucket. But the Lambda will perform an operation that saves files back into the same folder; how can I do this without the function triggering itself?

I want to upload the following folder structure to the bucket:

Project_0001/input/inputs.csv

The outputs will be created and saved at:

Project_0001/output/outputs.csv

But my project number will change, so I can't simply assign a static prefix. Is there a way to change the prefix dynamically, something like:

Project_*/input/

From Shubham's comment I drafted my solution using a prefix and a suffix filter.

For my case, I set the prefix to 'Project_', and for the suffix I chose one specific file as the trigger, so my suffix is '/input/myFile.csv'.

So every time I upload the structure Project_/input/allmyfiles_with_myFile.csv, it triggers the function. I then save my output in the same project folder, under the output folder, so the function is not triggered again.
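The prefix/suffix filter above is evaluated by S3 itself, but its matching logic can be sketched in plain Python to see why the output write does not re-trigger the function (a minimal sketch; the helper name `would_trigger` is mine, not part of the S3 API):

```python
PREFIX = "Project_"
SUFFIX = "/input/myFile.csv"

def would_trigger(key: str) -> bool:
    """Mimic the S3 event notification filter: the Lambda fires only for
    object keys that start with the project prefix AND end with the chosen
    trigger file."""
    return key.startswith(PREFIX) and key.endswith(SUFFIX)

# Uploading the trigger file under input/ fires the Lambda:
print(would_trigger("Project_0001/input/myFile.csv"))
# The result the Lambda writes back under output/ does not match the suffix,
# so it does not fire again:
print(would_trigger("Project_0001/output/outputs.csv"))
```

This is why picking a single, specific input file as the suffix is enough to break the self-invocation loop.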

I get the project name with the following code:

# Object key from the S3 event, e.g. "Project_0001/input/inputs.csv"
key = event['Records'][0]['s3']['object']['key']
project_id = key.split("/")[0]  # first path segment, e.g. "Project_0001"
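That project id can then be used to build the key the output is written under. A minimal sketch of the mapping, assuming the layout from the question (the helper name `output_key_for` and the `output_name` parameter are mine for illustration):

```python
def output_key_for(input_key: str, output_name: str) -> str:
    """Map an input key like 'Project_0001/input/inputs.csv' to the
    corresponding key under the same project's output/ folder."""
    project_id = input_key.split("/")[0]  # e.g. "Project_0001"
    return f"{project_id}/output/{output_name}"

print(output_key_for("Project_0001/input/inputs.csv", "outputs.csv"))
```

Inside the Lambda, the result would then be uploaded with a regular `put_object` call to that key; since it lands under `output/`, the trigger's suffix filter ignores it.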


 