
“KeyError: 'Records'” in AWS S3 - Lambda trigger

I have the following Lambda function code, which simply prints out the author and metadata of an object uploaded to an S3 bucket:

from __future__ import print_function
import json
import logging
import urllib
import boto3

print('Loading function')

# Logger used inside the handler below.
logger = logging.getLogger()
logger.setLevel(logging.INFO)

s3 = boto3.client('s3')


def lambda_handler(event, context):

    #print("Received event: " + json.dumps(event, indent=2))
    # bucket = event['Records'][0]['s3']['bucket']['name']

    for record in event['Records']:
        # Each record is already a single event dict.
        bucket = record['s3']['bucket']['name']
        # Object keys arrive URL-encoded in S3 events.
        key = urllib.unquote_plus(record['s3']['object']['key'])
        response = s3.head_object(Bucket=bucket, Key=key)

        logger.info('Response: {}'.format(response))

        print("Author : " + response['Metadata']['author'])
        print("Description : " + response['Metadata']['description'])

However, I am getting the following error while testing:

{
  "stackTrace": [
    [
      "/var/task/lambda_function.py",
      17,
      "lambda_handler",
      "for record in event['Records']:"
    ]
  ],
  "errorType": "KeyError",
  "errorMessage": "'Records'"
}

Am I doing anything wrong while accessing the bucket name and key name of the S3 object? If not, then what am I doing wrong here?

A bit late to the party, but here is my first post!

EXPLANATION:

When you test in the Lambda console, the test event is injected directly into def lambda_handler(event, context).
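For reference, here is a minimal sketch of what an S3 put event looks like (trimmed to the fields the handler reads; bucket and key names are placeholders), so you can see where 'Records' sits:

# Minimal sketch of an S3 put event; the console's "S3 Put" test
# template has the same shape. Names below are placeholders.
sample_event = {
    "Records": [
        {
            "eventSource": "aws:s3",
            "s3": {
                "bucket": {"name": "my-example-bucket"},
                "object": {"key": "uploads/example.txt"}
            }
        }
    ]
}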

However, when invoking through AWS API Gateway, it's necessary to add a Mapping Template; otherwise event is empty, resulting in the puzzling:

"errorType": "KeyError", "errorMessage": "'Records'"

This is effectively a null pointer: 'Records' doesn't exist, since event has no content.
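One way to make that failure obvious (a sketch, not a substitute for the fix below) is to guard against the missing key and print the actual payload:

import json

def lambda_handler(event, context):
    # Fail loudly with the real payload instead of a bare KeyError.
    records = event.get('Records')
    if not records:
        print("No 'Records' in event, payload was: " + json.dumps(event))
        return
    for record in records:
        pass  # existing per-record logic goes here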

SOLUTION:

You need to configure the Integration Request inside API Gateway: click on Body Mapping Templates, then add a Mapping Template, set the content type to application/json, and edit the generated mapping template:

 { "body" : $input.json('$'), "headers": { #foreach($header in $input.params().header.keySet()) "$header": "$util.escapeJavaScript($input.params().header.get($header))" #if($foreach.hasNext),#end #end }, "method": "$context.httpMethod", "params": { #foreach($param in $input.params().path.keySet()) "$param": "$util.escapeJavaScript($input.params().path.get($param))" #if($foreach.hasNext),#end #end }, "query": { #foreach($queryParam in $input.params().querystring.keySet()) "$queryParam": "$util.escapeJavaScript($input.params().querystring.get($queryParam))" #if($foreach.hasNext),#end #end } }

And edit the Lambda function:

replace:

for record in event['Records']:

with:

for record in event['query']['Records']:
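Putting it together, a sketch of the edited handler (same logic as the question's code, just reading through the 'query' key that this answer's mapping template introduces):

import boto3

s3 = boto3.client('s3')

def lambda_handler(event, context):
    # With the mapping template above, the original S3 payload
    # arrives under the 'query' key of the mapped event.
    for record in event['query']['Records']:
        bucket = record['s3']['bucket']['name']
        key = record['s3']['object']['key']
        response = s3.head_object(Bucket=bucket, Key=key)
        print("Author : " + response['Metadata']['author'])
        print("Description : " + response['Metadata']['description'])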

I don't know whether Stack will ping you with this answer, so I'll mention you: @Dawny33 @KevinOelen @franklinsijo

As for the explanation, I figured it out on my own. The "mapping template" part, however, comes from https://medium.com/simple-thoughts-amplified/passing-variables-from-aws-api-gateway-to-lambda-3c5d8602081b

I had this issue yesterday, and the problem is that when you run a test there isn't actually any event for Lambda to process (the event argument in def lambda_handler(event, context)). The event document is normally generated when an actual event occurs: it is a JSON document, which your Python code parses to retrieve the keys. So the solution is to create an actual event by dropping a file into S3; as long as you have created the event notification, it will get picked up.
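For example, you could generate a real event from a short script (bucket and key names below are placeholders; the Metadata keys match what the question's handler reads):

import boto3

s3 = boto3.client('s3')

# Uploading an object fires the bucket's event notification, so the
# Lambda function receives a genuine S3 event with a 'Records' list.
s3.put_object(
    Bucket='my-example-bucket',
    Key='uploads/example.txt',
    Body=b'hello world',
    Metadata={'author': 'jane', 'description': 'test upload'}
)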

I also faced the same issue, and what we need to do here is very simple: the region of the Lambda function and the region used while creating the bucket/trigger should be the same. If they are the same, files uploaded to the S3 bucket will be reflected in DynamoDB directly.
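If you want to verify that the two regions actually match, a quick sketch (bucket name is a placeholder):

import os
import boto3

s3 = boto3.client('s3')

# get_bucket_location returns None for buckets in us-east-1.
bucket_region = s3.get_bucket_location(
    Bucket='my-example-bucket')['LocationConstraint'] or 'us-east-1'
print('Bucket region : ' + bucket_region)
# AWS_REGION is set automatically inside the Lambda runtime.
print('Lambda region : ' + os.environ.get('AWS_REGION', 'unknown'))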
