
How do I update metadata on an S3 object larger than 5GB?

I am using the boto3 API to update the S3 metadata on an object.

I am following the approach from How to update metadata of an existing object in AWS S3 using python boto3?

My code looks like this:

    import boto3

    s3 = boto3.resource('s3')
    s3_object = s3.Object(bucket, key)
    new_metadata = {'foo': 'bar'}
    s3_object.metadata.update(new_metadata)
    s3_object.copy_from(CopySource={'Bucket': bucket, 'Key': key},
                        Metadata=s3_object.metadata, MetadataDirective='REPLACE')

This code fails when the object is larger than 5GB. I get this error:

botocore.exceptions.ClientError: An error occurred (InvalidRequest) when calling the CopyObject operation: The specified copy source is larger than the maximum allowable size for a copy source: 5368709120

How does one update the metadata on an object larger than 5GB?

Due to the size of your object, CopyObject won't work: its copy source is capped at 5GB. Instead, start a multipart upload and copy the object in parts, using copy_from on each part. See the boto3 docs here for more information:

https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/s3.html#S3.MultipartUploadPart.copy_from
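A minimal sketch of that multipart-copy approach, using the lower-level boto3 client calls (`create_multipart_upload`, `upload_part_copy`, `complete_multipart_upload`). The `part_ranges` and `replace_metadata` helpers are illustrative names, not from the original post; the new metadata is attached when the multipart upload is created, and each part is copied from the existing object by byte range:

```python
def part_ranges(total_size, part_size=5 * 1024 ** 3):
    """Split total_size bytes into (part_number, 'bytes=start-end') ranges.

    5 GiB is the maximum size UploadPartCopy accepts per part.
    """
    ranges = []
    start, part_number = 0, 1
    while start < total_size:
        end = min(start + part_size, total_size) - 1
        ranges.append((part_number, f"bytes={start}-{end}"))
        start, part_number = end + 1, part_number + 1
    return ranges


def replace_metadata(bucket, key, new_metadata, part_size=5 * 1024 ** 3):
    """Rewrite an object onto itself via multipart copy, replacing metadata."""
    import boto3  # imported here so part_ranges stays usable without boto3

    s3 = boto3.client("s3")
    size = s3.head_object(Bucket=bucket, Key=key)["ContentLength"]

    # The replacement metadata is set on the multipart upload itself.
    mpu = s3.create_multipart_upload(Bucket=bucket, Key=key,
                                     Metadata=new_metadata)
    upload_id = mpu["UploadId"]

    parts = []
    for part_number, byte_range in part_ranges(size, part_size):
        resp = s3.upload_part_copy(
            Bucket=bucket, Key=key, UploadId=upload_id,
            PartNumber=part_number,
            CopySource={"Bucket": bucket, "Key": key},
            CopySourceRange=byte_range,
        )
        parts.append({"PartNumber": part_number,
                      "ETag": resp["CopyPartResult"]["ETag"]})

    s3.complete_multipart_upload(
        Bucket=bucket, Key=key, UploadId=upload_id,
        MultipartUpload={"Parts": parts},
    )
```

Note that no data leaves S3: `upload_part_copy` copies server-side, so this is much faster than downloading and re-uploading, though you still pay for the copy requests.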

Apparently, you can't update the metadata in place - you have to re-copy the object. The copy can stay entirely within S3, but there is no metadata-only update, which is annoying for objects in the 100-500GB range.
