Background: I am using Python inside an AWS Lambda to send a CSV file to S3.
Issue: I cannot get Boto3 to accept my CSV file or a csv.reader object.
Example:
# writing to csv file
with open('/tmp/' + output_file_name, 'a+', newline='') as csvfile:
    writer = csv.DictWriter(csvfile, fieldnames=fields)
    for row in csv_reader:
        # ... do data manipulation
        writer.writerow(row)
# read and send to s3
with open('/tmp/' + output_file_name, 'r') as file:
    s3_client = boto3.client('s3')
    s3_client.put_object(Body=file, Bucket='bucket-output', Key=output_file_name)
I receive the error TypeError: Unicode-objects must be encoded before hashing. So I tried opening the file for reading with the parameter encoding='utf-8', but no luck there.
What needs to be done for Boto3 to 'accept' a csv file?
This works for me to read a CSV from a local drive and upload it to S3:
with open('test.csv', 'rb') as f:
    data = f.read().decode('utf-8')

boto3.client('s3').put_object(Body=data, Bucket=bucket, Key=key)
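The reason the original code fails is that put_object hashes and signs the Body, which must ultimately be bytes (or a binary file object); a file opened in text mode yields str, hence the "Unicode-objects must be encoded" error. A minimal stdlib-only sketch of the encoding step is below (no actual S3 call is made; the fieldnames and row data are made-up placeholders):

```python
import csv
import io

# Build the CSV in memory; csv writers expect a text stream
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=['name', 'score'])
writer.writeheader()
writer.writerow({'name': 'alice', 'score': 1})

# Encode the text to bytes before handing it to put_object;
# this is the step that avoids the TypeError
body = buf.getvalue().encode('utf-8')

# hypothetical upload, shown for context only:
# boto3.client('s3').put_object(Body=body, Bucket='bucket-output', Key='out.csv')
```

This also avoids the round trip through /tmp entirely, which is often preferable in a Lambda.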