
Python CSV file to bytes or seekable file-like object

Background: using Python inside an AWS Lambda to send a CSV file to S3.

Issue: I cannot get Boto3 to accept my CSV file or a csv.reader object.

Example:

# write rows to the csv file
import csv

with open('/tmp/' + output_file_name, 'a+') as csvfile:
    writer = csv.DictWriter(csvfile, fieldnames=fields)
    writer.writeheader()
    for row in csv_reader:
        # ... do data manipulation
        writer.writerow(row)

# read the file back and send it to S3
import boto3

with open('/tmp/' + output_file_name, 'r') as file:
    s3_client = boto3.client('s3')
    s3_client.put_object(Body=file, Bucket='bucket-output', Key=output_file_name)

I receive the error TypeError: Unicode-objects must be encoded before hashing. So I tried opening the file for reading with encoding='utf-8', but that did not help.

What needs to be done for Boto3 to 'accept' a csv file?

This works for me to read a CSV from a local drive and upload it to S3:

with open('test.csv', 'rb') as f:
    data = f.read().decode('utf-8')

boto3.client('s3').put_object(Body=data, Bucket=bucket, Key=key)
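
If you want to skip the temporary file entirely, the "bytes or seekable file-like object" idea from the title also works: build the CSV in memory and hand Boto3 either the encoded bytes or a BytesIO wrapper. This is a minimal sketch, assuming hypothetical field names, rows, bucket, and key; adapt them to your data.

import csv
import io
import boto3

# build the CSV in memory instead of writing to /tmp
fields = ['id', 'name']          # hypothetical column names
rows = [{'id': 1, 'name': 'a'}]  # hypothetical data

buffer = io.StringIO()
writer = csv.DictWriter(buffer, fieldnames=fields)
writer.writeheader()
writer.writerows(rows)

s3_client = boto3.client('s3')

# option 1: pass the encoded bytes directly
s3_client.put_object(Body=buffer.getvalue().encode('utf-8'),
                     Bucket='bucket-output', Key='output.csv')

# option 2: pass a seekable binary file-like object
s3_client.upload_fileobj(io.BytesIO(buffer.getvalue().encode('utf-8')),
                         'bucket-output', 'output.csv')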
