
Python script to store JSON files in real time to Amazon S3

I have a Python script that gives me tweets in real time using the Twitter Streaming API. I store the output in a JSON file which keeps updating dynamically as new tweets arrive. However, I would like to save this JSON to Amazon S3, which I could then use to trigger events with the Amazon Lambda service. Can somebody suggest a way to solve this problem?

Amazon has a Python library to interact with its web services. It's called Boto, and it supports both S3 and Lambda.

To save the JSON to S3:

import boto3

s3 = boto3.resource('s3')

# Create the bucket (skip this if the bucket already exists).
s3.create_bucket(Bucket='jsonbucket')

# Upload the local JSON file as an object in the bucket.
with open('yourlocaljson.json', 'rb') as data:
    s3.Object('jsonbucket', 'yourstoredjson.json').put(Body=data)
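
Since your file keeps growing in real time, one option (a minimal sketch, not part of the original answer) is to write each new tweet as its own S3 object instead of re-uploading the whole file; that way an S3 event notification can trigger your Lambda function once per tweet. The handle_tweet helper below is hypothetical, call it from wherever your streaming code hands you a raw tweet, and the bucket name and key layout are just placeholders:

import json
from datetime import datetime, timezone

import boto3

s3 = boto3.resource('s3')

def handle_tweet(raw_tweet):
    """Store a single tweet (a raw JSON string) as its own S3 object.

    Hypothetical helper: call it for every tweet your streaming code
    receives. Bucket name and key layout are placeholders.
    """
    tweet = json.loads(raw_tweet)
    # Use the tweet id (or a UTC timestamp) to build a unique key.
    key = 'tweets/{}.json'.format(
        tweet.get('id_str',
                  datetime.now(timezone.utc).strftime('%Y%m%dT%H%M%S%f')))
    s3.Object('jsonbucket', key).put(
        Body=json.dumps(tweet).encode('utf-8'),
        ContentType='application/json',
    )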

Note: before saving files to S3 with Boto, you have to set up authentication credentials for AWS.
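
For completeness, here is a sketch of one way to supply those credentials from Python; the key values and region below are placeholders, and the shared credentials file (created with `aws configure`) or environment variables are the more common route:

import boto3

# Normally boto3 reads credentials from ~/.aws/credentials or from
# environment variables. They can also be passed explicitly; the values
# below are placeholders -- never hard-code real keys in source control.
session = boto3.Session(
    aws_access_key_id='YOUR_ACCESS_KEY_ID',
    aws_secret_access_key='YOUR_SECRET_ACCESS_KEY',
    region_name='us-east-1',
)
s3 = session.resource('s3')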
