Multipart upload using boto3
I am following the docs below for multipart upload using boto3, but I am not able to get it working. Can you walk me through the concept and syntax for it?
http://boto3.readthedocs.io/en/latest/reference/services/s3.html#S3.Client.create_multipart_upload
http://boto3.readthedocs.io/en/latest/reference/services/s3.html#S3.Client.complete_multipart_upload
Using the Transfer Manager
boto3 provides interfaces for managing various types of transfers with S3. Functionality includes:
Automatically managing multipart and non-multipart uploads
To ensure that multipart uploads only happen when absolutely necessary, you can use the multipart_threshold configuration parameter:
Use the following Python code, which uploads a file to S3 and handles multipart uploads automatically:
import argparse
import os
import time

import boto3
import botocore
import pandas as pd
from boto3.s3.transfer import TransferConfig


def environment_set(access_key, secret_access_key):
    os.environ["AWS_ACCESS_KEY_ID"] = access_key
    os.environ["AWS_SECRET_ACCESS_KEY"] = secret_access_key


def s3_upload_file(args):
    while True:
        try:
            s3 = boto3.resource('s3')
            GB = 1024 ** 3
            # Ensure that multipart uploads only happen if the size of a transfer
            # is larger than S3's size limit for non-multipart uploads, which is 5 GB.
            config = TransferConfig(multipart_threshold=5 * GB)
            s3.meta.client.upload_file(args.path, args.bucket,
                                       os.path.basename(args.path),
                                       Config=config)
            print("S3 upload successful")
            break
        except botocore.exceptions.EndpointConnectionError:
            print("Network error: please check your internet connection")
            time.sleep(5)  # wait before retrying instead of spinning
        except Exception as e:
            print(e)
            break  # unexpected error; retrying will not help


if __name__ == '__main__':
    parser = argparse.ArgumentParser(
        description='Upload a file to a pre-existing S3 bucket')
    parser.add_argument('path', metavar='PATH',
                        help='Path to the file to be uploaded to S3')
    parser.add_argument('bucket', metavar='BUCKET_NAME',
                        help='Name of the bucket to upload the file to')
    parser.add_argument('cred', metavar='CREDENTIALS',
                        help='Path to credentials.csv containing the AWS access '
                             'key and secret access key')
    args = parser.parse_args()

    df = pd.read_csv(args.cred, header=None)
    access_key = df.iloc[1, 1]
    secret_access_key = df.iloc[1, 2]
    environment_set(access_key, secret_access_key)
    s3_upload_file(args)
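Since the question links the low-level create_multipart_upload and complete_multipart_upload calls, here is a minimal sketch of driving that API directly instead of letting the transfer manager do it. The bucket, key, and part size are illustrative; parts other than the last must be at least 5 MB. The `part_ranges` helper (my own name, not a boto3 API) just computes the byte offsets of each part.

```python
import os


PART_SIZE = 8 * 1024 * 1024  # 8 MB; every part except the last must be >= 5 MB


def part_ranges(total_size, part_size=PART_SIZE):
    """Return (start, end) byte offsets for each part of a file."""
    return [(start, min(start + part_size, total_size))
            for start in range(0, total_size, part_size)]


def multipart_upload(path, bucket, key):
    # boto3 is imported here so the part_ranges helper works without it.
    import boto3

    client = boto3.client('s3')
    upload = client.create_multipart_upload(Bucket=bucket, Key=key)
    upload_id = upload['UploadId']
    parts = []
    try:
        with open(path, 'rb') as f:
            for number, (start, end) in enumerate(
                    part_ranges(os.path.getsize(path)), start=1):
                f.seek(start)
                response = client.upload_part(
                    Bucket=bucket, Key=key, UploadId=upload_id,
                    PartNumber=number, Body=f.read(end - start))
                # S3 needs each part's ETag and number to assemble the object.
                parts.append({'ETag': response['ETag'], 'PartNumber': number})
        client.complete_multipart_upload(
            Bucket=bucket, Key=key, UploadId=upload_id,
            MultipartUpload={'Parts': parts})
    except Exception:
        # Abort so S3 does not keep storing (and billing for) orphaned parts.
        client.abort_multipart_upload(Bucket=bucket, Key=key, UploadId=upload_id)
        raise
```

The transfer manager above does all of this (plus parallelism and retries) for you, so the low-level API is mainly useful when you need control over individual parts, e.g. resuming an interrupted upload.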