
Writing to a CSV file in an S3 bucket using boto3

I'm working on a project that needs to update a CSV file with user info periodically. The CSV is stored in an S3 bucket, so I'm assuming I would use boto3 to do this. However, I'm not exactly sure how to go about it: would I need to download the CSV from S3 and then append to it, or is there a way to do it directly? Any code samples would be appreciated.

Ideally this would be something where DynamoDB would work pretty well (as long as you can create a hash key). Your solution would require the following steps (sketched in code after the list):

  1. Download the CSV
  2. Append the new values to the CSV file
  3. Upload the CSV
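
A minimal sketch of that download/append/upload cycle with boto3 could look like this; the bucket name, object key, and row contents are placeholders you would swap for your own values:

    import csv
    import io

    import boto3

    BUCKET = "my-bucket"      # placeholder bucket name
    KEY = "users/users.csv"   # placeholder object key

    s3 = boto3.client("s3")

    # 1. Download the CSV from S3
    obj = s3.get_object(Bucket=BUCKET, Key=KEY)
    rows = list(csv.reader(io.StringIO(obj["Body"].read().decode("utf-8"))))

    # 2. Append the new values
    rows.append(["new-user-id", "new-user@example.com"])  # example row

    # 3. Upload the CSV, overwriting the existing object
    out = io.StringIO()
    csv.writer(out).writerows(rows)
    s3.put_object(Bucket=BUCKET, Key=KEY, Body=out.getvalue().encode("utf-8"))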

A big issue here is the possibility (not sure how this is planned) that the CSV file is updated multiple times before being uploaded, which would lead to data loss.

Using something like DynamoDB, you could have a table and just use the put_item API call to add new values as you see fit. Then, whenever you wish, you could write a Python script to scan all the values and write out a CSV file however you like.
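
A rough sketch of that approach, assuming a table named "UserInfo" with hash key "user_id" and an "email" attribute (all of these names are placeholders): add items with put_item as updates arrive, then scan the table and dump it to a CSV whenever you need the file.

    import csv

    import boto3

    dynamodb = boto3.resource("dynamodb")
    table = dynamodb.Table("UserInfo")  # assumed table with hash key "user_id"

    # Add a new record whenever user info changes
    table.put_item(Item={"user_id": "123", "email": "user@example.com"})

    # Later: scan every item (paging through large tables) and write a CSV
    response = table.scan()
    items = response["Items"]
    while "LastEvaluatedKey" in response:
        response = table.scan(ExclusiveStartKey=response["LastEvaluatedKey"])
        items.extend(response["Items"])

    with open("users.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["user_id", "email"])
        writer.writeheader()
        writer.writerows(items)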
