How to update metadata on an S3 object larger than 5GB?
I am using the boto3 API to update the S3 metadata on an object.
I am making use of How to update metadata of an existing object in AWS S3 using python boto3?
My code looks like this:
import boto3

s3 = boto3.resource('s3')

s3_object = s3.Object(bucket, key)
new_metadata = {'foo': 'bar'}
s3_object.metadata.update(new_metadata)
s3_object.copy_from(CopySource={'Bucket': bucket, 'Key': key},
                    Metadata=s3_object.metadata,
                    MetadataDirective='REPLACE')
This code fails when the object is larger than 5GB. I get this error:
botocore.exceptions.ClientError: An error occurred (InvalidRequest) when calling the CopyObject operation: The specified copy source is larger than the maximum allowable size for a copy source: 5368709120
How does one update the metadata on an object larger than 5GB?
Because of the size of your object, you need to perform a multipart upload and use the copy_from method on each part. See the boto3 docs here for more information:
https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/s3.html#S3.MultipartUploadPart.copy_from
Apparently, you can't just update the metadata in place - you need to re-copy the object to S3. You can copy it from S3 back to S3, but you can't simply update, which is annoying for objects in the 100-500GB range.