How to upload nested directories and files into an S3 bucket using Python boto3
I have a folder called "to be uploaded" which contains files, folders, and nested folders/directories.
Whatever I want to upload to the S3 bucket, I just put into the "to be uploaded" folder and run a Python 3 script, which should then upload everything inside that folder to the S3 bucket exactly as it is.
The pictures below are for reference, just to give an idea of what my "to be uploaded" folder looks like.
I have set up all the AWS configuration using boto3 but got stuck here. So, I'd like a Python user-defined/pre-defined function which, when run, uploads all the contents of the "to be uploaded" folder (as shown in the first picture) to the S3 bucket as-is.
I am using Windows 10 and the Python 3.9.2 interpreter.
Any help is highly appreciated! Thanks in advance.
Try this:
params is a local module in which the access keys and other confidential values are stored in the variables a, b, c, d:
import os
import subprocess

import params  # local module holding credentials in variables a, b, c, d

# The AWS CLI reads credentials from these environment variables;
# the names must be upper-case or they are ignored.
os.environ['AWS_ACCESS_KEY_ID'] = params.a
os.environ['AWS_SECRET_ACCESS_KEY'] = params.b
os.environ['AWS_DEFAULT_REGION'] = params.c
os.environ['AWS_DEFAULT_OUTPUT'] = params.d

def upload_to_s3():
    # `aws s3 sync` recursively uploads the folder, preserving its structure
    subprocess.run(
        ['aws', 's3', 'sync', './folder-name', 's3://bucket-name'],
        check=True)

upload_to_s3()
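The approach above shells out to the AWS CLI, which must be installed separately. Since the question asks for boto3 specifically, here is a sketch of a pure-boto3 alternative that walks the directory tree and uploads each file, turning the relative path into the S3 key so the nested structure is preserved. The folder and bucket names are placeholders; adjust them to your setup.

```python
import os

def build_key(local_path, local_dir, s3_prefix=""):
    """Turn a local file path into an S3 key relative to local_dir.

    Forward slashes are used in the key so the result looks the
    same whether the script runs on Windows or elsewhere.
    """
    rel_path = os.path.relpath(local_path, local_dir)
    return "/".join(filter(None, [s3_prefix, rel_path.replace(os.sep, "/")]))

def upload_directory(local_dir, bucket_name, s3_prefix=""):
    """Recursively upload local_dir to bucket_name, preserving structure."""
    import boto3  # imported here so build_key works without boto3 installed
    s3 = boto3.client("s3")
    for root, _, files in os.walk(local_dir):
        for name in files:
            local_path = os.path.join(root, name)
            s3.upload_file(local_path, bucket_name, build_key(local_path, local_dir, s3_prefix))

# Example (placeholder names):
# upload_directory("to be uploaded", "bucket-name")
```

Unlike `aws s3 sync`, this re-uploads every file on each run; if you need incremental behavior you would have to compare ETags or timestamps yourself, which is why the CLI answer above is often the simpler choice.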