batch upload json to dynamodb using boto3
I would like to batch upload a JSON file to DynamoDB. At the moment I can successfully put items manually in a Python script (as below) and upload them to a table. How can I amend the script to read an external JSON file (containing 200 items) and batch upload all 200 items to the table?
import boto3

dynamodb = boto3.resource('dynamodb')
table = dynamodb.Table('exampletable')

with table.batch_writer() as batch:
    batch.put_item(
        Item={
            'ID': '2',
            'DateTime': '21/12/2017 13:16',
            'SourceDevice': '10',
            'DestinationDevice': '20',
            'DataType': 'full',
            'Activity': 'unusual'
        }
    )
    batch.put_item(
        Item={
            'ID': '3',
            'DateTime': '21/12/2017 13:40',
            'SourceDevice': '10',
            'DestinationDevice': '20',
            'DataType': 'full',
            'Activity': 'unusual'
        }
    )
JSON file contents as below:
[{
    "ID": "1",
    "DateTime": "21/12/2017 13:16",
    "SourceDevice": "10",
    "DestinationDevice": "20",
    "DataType": "part",
    "Activity": "normal"
}, {
    "ID": "1",
    "DateTime": "21/12/2017 13:16",
    "SourceDevice": "40",
    "DestinationDevice": "25",
    "DataType": "full",
    "Activity": "unusual"
}]
You would simply break that down into two tasks:

- Load the JSON data from the file into a list.
- Loop through the list, calling batch.put_item for each item.

There are lots of results when you do a search for the first task, and the second task is literally just writing a loop.
A full solution would look something like this:
import json
import boto3

dynamodb = boto3.resource('dynamodb')
table = dynamodb.Table('exampletable')

# Read the JSON file
with open('items.json') as json_data:
    items = json.load(json_data)

with table.batch_writer() as batch:
    # Loop through the JSON objects
    for item in items:
        batch.put_item(Item=item)
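One caveat worth noting, beyond the original answer: the boto3 resource layer rejects Python `float` values, so if your JSON ever contains fractional numbers (the sample file here happens to be all strings), load them as `Decimal`. A minimal sketch, using an inline JSON string with a hypothetical `Score` field just for illustration:

```python
import json
from decimal import Decimal

# parse_float=Decimal makes json.load/json.loads produce Decimal
# values instead of floats, which DynamoDB's put_item would reject.
items = json.loads('[{"ID": "4", "Score": 0.75}]', parse_float=Decimal)

assert isinstance(items[0]['Score'], Decimal)
```

The same `parse_float=Decimal` argument works with `json.load(json_data, ...)` in the solution above. Note also that the sample file contains two items with the same `"ID"`; duplicate keys within one batch may be rejected by DynamoDB unless the writer is created with `table.batch_writer(overwrite_by_pkeys=['ID'])`, which keeps only the last occurrence of each key.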