
Create a Lambda function from a zip file and an S3 bucket using one CloudFormation template

How do I create an S3 bucket and a Lambda function in the same CloudFormation template?

The Lambda function has too many lines of code to be defined inline. Usually I upload the Lambda zip to an S3 bucket and then specify the S3 key for the zip when creating the function in my CloudFormation template. How can I do this without having to manually create an S3 bucket beforehand? Basically, what I'm asking is whether there is a temporary storage option in AWS that files can be uploaded to without manually creating an S3 bucket first.

I tried searching online, but all the results point to uploading the zip file to an S3 bucket and referencing it in the CloudFormation template to create the Lambda function. That doesn't work here, because the S3 bucket is also created in the same CloudFormation template.

You could do something like the template below, which creates an S3 bucket and a Lambda function whose inline code CloudFormation zips for you, and configures an event notification that triggers the function whenever an object is uploaded to the bucket. The event notification is optional; ignore or remove it as needed.

Make sure to replace my code snippet with yours inside the Lambda function.

As far as I know, you either have to create the S3 bucket and upload the zip file into it beforehand, then point the Code property of your Lambda function at that bucket and key, or create the S3 bucket through this template first and upload the file into it manually once the resources are provisioned.
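For reference, the upload-first approach replaces the inline ZipFile with an S3 pointer in the function's Code property. A minimal sketch, where the bucket name and key are placeholders for a bucket you created and a zip you uploaded beforehand:

```yaml
Code:
  S3Bucket: my-artifacts-bucket          # hypothetical pre-existing bucket
  S3Key: convert_csv_to_parquet_v2.zip   # hypothetical key of the uploaded deployment package
```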

In my Lambda function below, notice that I have provided inline code for CloudFormation to zip, but you can still specify an S3 bucket and key instead if you already have the bucket.

https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-lambda-function-code.html

You can also check this approach, where an S3 object is created on the fly and pointed at the bucket created in the same template. I haven't personally tested it, so you may have to try it and see whether you can upload a zip file that way too.

AWSTemplateFormatVersion: 2010-09-09

Parameters:
  LambdaFunctionName:
    Type: String
    MinLength: '1'
    MaxLength: '64'
    AllowedPattern: '[a-zA-Z][a-zA-Z0-9_-]*'
    Description: The name of the Lambda function to be deployed
    Default: convert_csv_to_parquet_v2
  LambdaRoleName:
    Type: String
    MinLength: '1'
    MaxLength: '64'
    AllowedPattern: '[\w+=,.@-]+'
    Description: The name of the IAM role used as the Lambda execution role
    Default: Lambda-Role-CFNExample
  LambdaPolicyName:
    Type: String
    MinLength: '1'
    MaxLength: '128'
    AllowedPattern: '[\w+=,.@-]+'
    Default: Lambda-Policy-CFNExample
  NotificationBucket:
    Type: String
    Description: S3 bucket that's used for the Lambda event notification

Resources:
  ExampleS3:
    Type: AWS::S3::Bucket
    DependsOn: LambdaInvokePermission
    Properties:
      BucketName: !Ref NotificationBucket
      NotificationConfiguration:
        LambdaConfigurations:
          - Event: s3:ObjectCreated:Put
            Filter:
              S3Key:
                Rules:
                  - Name: suffix
                    Value: txt
            Function: !GetAtt LambdaFunction.Arn
  LambdaRole:
    Type: AWS::IAM::Role
    Properties:
      RoleName: !Ref LambdaRoleName
      Description: An execution role for a Lambda function launched by CloudFormation
      ManagedPolicyArns:
        - !Ref LambdaPolicy
      AssumeRolePolicyDocument:
        Version: '2012-10-17'
        Statement:
        - Effect: Allow
          Principal:
            Service: lambda.amazonaws.com
          Action:
          - 'sts:AssumeRole'
      
  LambdaPolicy:
    Type: AWS::IAM::ManagedPolicy
    Properties:
      ManagedPolicyName: !Ref LambdaPolicyName
      Description: Managed policy for a Lambda function launched by CloudFormation
      PolicyDocument:
        Version: '2012-10-17'
        Statement:
          - Effect: Allow
            Action:
              - 'logs:CreateLogStream'
              - 'logs:PutLogEvents'
            Resource: !Join ['',['arn:', !Ref AWS::Partition, ':logs:', !Ref AWS::Region, ':', !Ref AWS::AccountId, ':log-group:/aws/lambda/', !Ref LambdaFunctionName, ':*']]
          - Effect: Allow
            Action:
              - 'logs:CreateLogGroup'
            Resource: !Sub 'arn:${AWS::Partition}:logs:${AWS::Region}:${AWS::AccountId}:*'
          - Effect: Allow
            Action:
              - 's3:*'
            Resource: '*'
        
  LogGroup:
    Type: AWS::Logs::LogGroup
    Properties:
      LogGroupName: !Join ['',['/aws/lambda/', !Ref LambdaFunctionName]]
      RetentionInDays: 30
            
  LambdaFunction:
    Type: AWS::Lambda::Function
    Properties:
      Description: Read CSV files from a S3 location and converting them into Parquet
      FunctionName: !Ref LambdaFunctionName
      Handler: lambda_function.lambda_handler
      MemorySize: 128
      Runtime: python3.9
      Role: !GetAtt 'LambdaRole.Arn'
      Timeout: 60
      Code:
        ZipFile: |
            # Imports
            # NOTE: pandas is not bundled with the Lambda Python 3.9 runtime;
            # attach a layer that provides it (for example the managed AWS SDK for pandas layer)
            import pandas
            from urllib.parse import unquote_plus
            import boto3
            import os            
            
            def lambda_handler(event, context):
                print(f'event >> {event}')
                s3 = boto3.client('s3', region_name='us-east-1')
            
                for record in event['Records']:
                    key = unquote_plus(record['s3']['object']['key'])
                    print(f'key >> {key}')
            
                    bucket = unquote_plus(record['s3']['bucket']['name'])
                    print(f'bucket >> {bucket}')
            
                    get_file = s3.get_object(Bucket=bucket, Key=key)
                    get = get_file['Body']
                    print(f'get >> {get}')
            
                    # get_object returns a streaming body, so parse it as CSV
                    df = pandas.read_csv(get)
                    print('updating columns..')
                    df.columns = df.columns.astype(str)
            
                    print('saving file to s3 location...')
                    # NOTE: writing to an s3:// URL requires s3fs, and this target bucket must already exist
                    df.to_parquet(f's3://csvtoparquetconverted/{key}.parquet')
                    print('file converted to parquet')
  LambdaInvokePermission:
    Type: 'AWS::Lambda::Permission'
    Properties:
      FunctionName: !GetAtt LambdaFunction.Arn
      Action: 'lambda:InvokeFunction'
      Principal: s3.amazonaws.com
      SourceAccount: !Ref 'AWS::AccountId'
      SourceArn: !Sub 'arn:${AWS::Partition}:s3:::${NotificationBucket}'

Outputs:
  CLI:
    Description: Use this command to invoke the Lambda function
    Value: !Sub |
        aws lambda invoke --function-name ${LambdaFunction} --payload '{"null": "null"}' lambda-output.txt --cli-binary-format raw-in-base64-out
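Assuming the template above is saved as template.yaml, the stack can be deployed with a command along these lines (the stack name and bucket name are placeholders; --capabilities CAPABILITY_NAMED_IAM is required because the template creates named IAM resources):

```shell
aws cloudformation deploy \
  --template-file template.yaml \
  --stack-name csv-to-parquet-example \
  --capabilities CAPABILITY_NAMED_IAM \
  --parameter-overrides NotificationBucket=my-example-notification-bucket
```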
