AWS Lambda Python boto3 - reading the content of a file on S3
I am writing a Lambda function that reads the content of a JSON file on an S3 bucket and writes it into a Kinesis stream. Simple requirement. In the Lambda I set the trigger to the S3 bucket (with the name of the bucket). When I drop a file into the bucket, the Lambda gets triggered, but the CloudWatch logs say it can't find the key name. I am able to print the file name, so I don't understand the error:
[ERROR] NoSuchKey: An error occurred (NoSuchKey) when calling the GetObject operation: The specified key does not exist.
This is the code that reads the file on S3:
import boto3
import json
#from urllib.parse import unquote_plus
import time
import csv
from pprint import pprint
s3 = boto3.client('s3')
kinesis = boto3.client('kinesis')
def lambda_handler(event, context):
    if event:
        # Read the bucket name and file name from the event record
        file_obj = event["Records"][0]
        bucketname = file_obj['s3']['bucket']['name']
        filename = file_obj['s3']['object']['key']
        print('bucket name is ' + str(bucketname) + ' file name is ' + str(filename))
        # Get a reference to the file in the S3 bucket
        fileObj = s3.get_object(Bucket=bucketname, Key=filename)
        #pprint(fileObj)
        # Decode the file contents
        file_content = fileObj["Body"].read().decode('utf-8')
        # Put the record onto the Kinesis stream
        kinesis.put_record(Data=bytes(file_content, 'utf-8'),
                           StreamName='LambdaSourceKinesisStream',
                           PartitionKey='basam')
        return "thanks"
The file name is agent.json.
It is working all of a sudden. No changes were made to the code.
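A common cause of this NoSuchKey error (and a likely reason it seems intermittent) is that S3 event notifications deliver the object key URL-encoded: spaces arrive as `+` and special characters as percent escapes, so a key like `my file.json` will not match when the raw event value is passed straight to `get_object`. The commented-out `unquote_plus` import in the code above suggests this was already suspected; here is a minimal sketch of decoding the key first (the helper name `normalize_s3_key` is mine, not from the original code):

```python
from urllib.parse import unquote_plus

def normalize_s3_key(raw_key):
    """Decode the URL-encoded object key found in an S3 event record."""
    # S3 event notifications encode keys, e.g. the object 'my file.json'
    # is delivered as 'my+file.json'. Passing that raw value to
    # s3.get_object(Bucket=..., Key=...) fails with NoSuchKey.
    return unquote_plus(raw_key)

# In the handler, decode before calling get_object:
# filename = normalize_s3_key(file_obj['s3']['object']['key'])
```

Keys with no spaces or special characters (such as `agent.json`) pass through unchanged, which would explain why the function appears to work for some uploads and fail for others without any code change.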