
Create a lambda from a zip file and an S3 bucket using one CloudFormation template

How do I create an S3 bucket and a lambda in the same CloudFormation template?

The lambda has a lot of lines of code, so it can't be coded inline. Usually I upload the lambda zip to an S3 bucket and then specify the S3 key for the zip to create the lambda in my CloudFormation template. How can I do this without having to manually create an S3 bucket beforehand? Basically what I'm asking is whether there is a temporary storage option in AWS that files can be uploaded to, without needing to create an S3 bucket manually.

I tried searching online, but all the results point to uploading the zip file to an S3 bucket and using that in the CloudFormation template to create the lambda. That doesn't work here, because the S3 bucket also gets created in the same CloudFormation template.

You could do something like the template below, which creates an S3 bucket and a Lambda function, zips the inline code, and creates an event notification that triggers the Lambda function when an object is uploaded into the specified bucket. I've included the event notification for completeness; you can ignore or remove it accordingly.
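One detail of that event notification: S3 URL-encodes object keys in the event it sends to Lambda, which is why the handler in the template decodes them with unquote_plus. A minimal sketch of that decoding step, using a hypothetical bucket and key:

```python
from urllib.parse import unquote_plus

# A minimal S3 put-event payload, shaped like what S3 delivers to Lambda.
# Bucket and key names here are hypothetical.
event = {
    "Records": [
        {
            "s3": {
                "bucket": {"name": "my-notification-bucket"},
                "object": {"key": "incoming/report+2024.txt"},
            }
        }
    ]
}

def extract_objects(event):
    """Return (bucket, key) pairs from an S3 event, URL-decoding keys
    the same way the handler in the template does."""
    return [
        (r["s3"]["bucket"]["name"], unquote_plus(r["s3"]["object"]["key"]))
        for r in event["Records"]
    ]

print(extract_objects(event))
# [('my-notification-bucket', 'incoming/report 2024.txt')]
```

Note that the `+` in the raw key becomes a space after decoding; skipping this step would make the subsequent get_object call fail for keys containing spaces.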

Make sure to replace my code snippet with your own within the Lambda function.

As far as I know, either you have to create the S3 bucket and upload the file into it beforehand, and then use those details to point your Lambda function at the zip file; or else create the S3 bucket through the template first and then upload the file into it manually once the resources are provisioned.

In my Lambda function, you can notice I have provided inline code to zip, but you can still give the S3 bucket and key if you have the bucket already.

https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-lambda-function-code.html
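For comparison, if you already had a bucket containing the deployment package, the Code property of the Lambda resource could point at it instead of inlining the source. A sketch with hypothetical bucket and key names:

```yaml
      Code:
        S3Bucket: my-artifact-bucket  # pre-existing bucket holding the zip (hypothetical name)
        S3Key: lambda.zip             # key of the deployment package in that bucket
```

This is exactly the pattern the question describes, and it requires the bucket to exist before the stack is created, which is why it can't reference a bucket defined in the same template.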

You can also check this approach, where an S3 object is created on the fly and pointed at the bucket that was created. But I haven't personally tested this, so you may have to test and see whether you can upload a zip file too.

AWSTemplateFormatVersion: 2010-09-09

Parameters:
  LambdaFunctionName:
    Type: String
    MinLength: '1'
    MaxLength: '64'
    AllowedPattern: '[a-zA-Z][a-zA-Z0-9_-]*'
    Description: The name of the Lambda function to be deployed
    Default: convert_csv_to_parquet_v2
  LambdaRoleName:
    Type: String
    MinLength: '1'
    MaxLength: '64'
    AllowedPattern: '[\w+=,.@-]+'
    Description: The name of the IAM role used as the Lambda execution role
    Default: Lambda-Role-CFNExample
  LambdaPolicyName:
    Type: String
    MinLength: '1'
    MaxLength: '128'
    AllowedPattern: '[\w+=,.@-]+'
    Default: Lambda-Policy-CFNExample
  NotificationBucket:
    Type: String
    Description: S3 bucket that's used for the Lambda event notification

Resources:
  ExampleS3:
    Type: AWS::S3::Bucket
    DependsOn: LambdaInvokePermission
    Properties:
      BucketName: !Ref NotificationBucket
      NotificationConfiguration:
        LambdaConfigurations:
          - Event: s3:ObjectCreated:Put
            Filter:
              S3Key:
                Rules:
                  - Name: suffix
                    Value: txt
            Function: !GetAtt LambdaFunction.Arn
  LambdaRole:
    Type: AWS::IAM::Role
    Properties:
      RoleName: !Ref LambdaRoleName
      Description: An execution role for a Lambda function launched by CloudFormation
      ManagedPolicyArns:
        - !Ref LambdaPolicy
      AssumeRolePolicyDocument:
        Version: '2012-10-17'
        Statement:
        - Effect: Allow
          Principal:
            Service: lambda.amazonaws.com
          Action:
          - 'sts:AssumeRole'
      
  LambdaPolicy:
    Type: AWS::IAM::ManagedPolicy
    Properties:
      ManagedPolicyName: !Ref LambdaPolicyName
      Description: Managed policy for a Lambda function launched by CloudFormation
      PolicyDocument:
        Version: '2012-10-17'
        Statement:
          - Effect: Allow
            Action:
              - 'logs:CreateLogStream'
              - 'logs:PutLogEvents'
            Resource: !Join ['',['arn:', !Ref AWS::Partition, ':logs:', !Ref AWS::Region, ':', !Ref AWS::AccountId, ':log-group:/aws/lambda/', !Ref LambdaFunctionName, ':*']]
          - Effect: Allow
            Action:
              - 'logs:CreateLogGroup'
            Resource: !Sub 'arn:${AWS::Partition}:logs:${AWS::Region}:${AWS::AccountId}:*'
          - Effect: Allow
            Action:
              - 's3:*'
            Resource: '*'
        
  LogGroup:
    Type: AWS::Logs::LogGroup
    Properties:
      LogGroupName: !Join ['',['/aws/lambda/', !Ref LambdaFunctionName]]
      RetentionInDays: 30
            
  LambdaFunction:
    Type: AWS::Lambda::Function
    Properties:
      Description: Read CSV files from a S3 location and converting them into Parquet
      FunctionName: !Ref LambdaFunctionName
      Handler: lambda_function.lambda_handler
      MemorySize: 128
      Runtime: python3.9
      Role: !GetAtt 'LambdaRole.Arn'
      Timeout: 60
      Code:
        ZipFile: |
            # Imports
            import pandas  # NOTE: not bundled with the python3.9 runtime; needs a layer or deployment package
            from urllib.parse import unquote_plus
            import boto3
            import os            
            
            def lambda_handler(event, context):
                print(f'event >> {event}')
                s3 = boto3.client('s3', region_name='us-east-1')
            
                for record in event['Records']:
                    key = unquote_plus(record['s3']['object']['key'])
                    print(f'key >> {key}')
            
                    bucket = unquote_plus(record['s3']['bucket']['name'])
                    print(f'bucket >> {bucket}')
            
                    get_file = s3.get_object(Bucket=bucket, Key=key)
                    body = get_file['Body']
                    print(f'body >> {body}')
            
                    # parse the CSV content from the streaming body
                    df = pandas.read_csv(body)
                    print('updating columns..')
                    df.columns = df.columns.astype(str)
            
                    print('saving file to s3 location...')
                    df.to_parquet(f's3://csvtoparquetconverted/{key}.parquet')
                    print('file converted to parquet')
  LambdaInvokePermission:
    Type: 'AWS::Lambda::Permission'
    Properties:
      FunctionName: !GetAtt LambdaFunction.Arn
      Action: 'lambda:InvokeFunction'
      Principal: s3.amazonaws.com
      SourceAccount: !Ref 'AWS::AccountId'
      SourceArn: !Sub 'arn:aws:s3:::${NotificationBucket}'

Outputs:
  CLI:
    Description: Use this command to invoke the Lambda function
    Value: !Sub |
        aws lambda invoke --function-name ${LambdaFunction} --payload '{"null": "null"}' lambda-output.txt --cli-binary-format raw-in-base64-out
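Since the template names its IAM role and managed policy explicitly, deploying it requires the CAPABILITY_NAMED_IAM acknowledgement. A possible deploy command (stack and bucket names below are placeholders):

```sh
aws cloudformation deploy \
  --template-file template.yaml \
  --stack-name csv-to-parquet-stack \
  --capabilities CAPABILITY_NAMED_IAM \
  --parameter-overrides NotificationBucket=my-notification-bucket
```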
