
Exceeding 20 attribute limit on DynamoDB batch write operation

I am working on learning AWS DynamoDB. I have been attempting to bulk upload a CSV to a table using Python and the batch write operation. The data loads successfully, but only 20 of the attributes (including the primary key) show up, even though my data has 157 attributes. I am using DynamoDB's on-demand capacity mode, not the free tier.

My code is essentially the same as the AWS samples:

import boto3
from itertools import islice

def dynamodb_batch_write(table_name, rows):
    dynamodb_obj = boto3.resource('dynamodb', region_name='us-east-1')
    table = dynamodb_obj.Table(table_name)

    # batch_writer buffers the puts and sends them as BatchWriteItem requests
    with table.batch_writer() as batch:
        for row in rows:
            batch.put_item(Item=row)
    return True

sample_import = read_csv('sample-files/SampleFile2.csv')  # local helper; returns a list of dicts
import_iter = iter(sample_import)
for startrow in range(0, len(sample_import), 25):
    # take the next 25 rows off the iterator and write them as one batch
    row_slice = islice(import_iter, 25)
    upload = dynamodb_batch_write('custom-sample2', row_slice)
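(Side note: boto3's batch_writer already buffers items internally and flushes them in batches of 25, so the manual chunking above isn't strictly necessary. A minimal equivalent sketch, assuming the CSV header row supplies the attribute names; load_csv_to_dynamodb is a hypothetical helper, not from the AWS samples:)

import csv
import boto3

def load_csv_to_dynamodb(table_name, csv_path):
    table = boto3.resource('dynamodb', region_name='us-east-1').Table(table_name)
    with open(csv_path, newline='') as f, table.batch_writer() as batch:
        for row in csv.DictReader(f):
            # row is a dict of column name -> string value;
            # batch_writer handles the 25-item batching automatically
            batch.put_item(Item=row)

load_csv_to_dynamodb('custom-sample2', 'sample-files/SampleFile2.csv')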

Does anyone know a way to get past this limit? Looking at the documentation, I don't see any limit on attribute count except for attributes used in secondary indexes.

Thanks, Seth

It turns out that the DynamoDB console hides columns by default, which I hadn't realized. In the table view, click the gear icon in the top right to choose which columns are shown or hidden.

The data was there all along; I just didn't have it showing in the view.
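To double-check that every attribute actually made it into the table, you can fetch one item through the API and count its attributes. A minimal sketch; 'id' and its value are placeholders, so substitute your table's real partition key name and a key value you know exists:

import boto3

table = boto3.resource('dynamodb', region_name='us-east-1').Table('custom-sample2')
# 'id' / 'some-known-key' are hypothetical; use your actual key schema
response = table.get_item(Key={'id': 'some-known-key'})
item = response.get('Item', {})
print(f'attributes stored: {len(item)}')  # should report all 157 attributes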
