
How to download all files from an S3 bucket to a local Linux server, passing the bucket and local folder values at runtime, using Python

I am writing a script to download files from an S3 bucket to a local Linux folder. To achieve that, I have to use dynamic values for the bucket and for the folder the files should be downloaded to.

I know how to do it with the AWS CLI:

aws s3 cp s3://bucket /linux/local/folder --recursive --p alusta

But how do I accept the bucket value at runtime?

dwn_cmd = "aws s3 cp s3://bucket/name/" + str(year_name) + '/' + str(month_name)

folder_path = "/local/linux/folder/" + folder_name

#subprocess.call(['aws','s3','cp',dwn_cmd,folder_path,'--recursive','--p', 'alusta'])

This shows an error saying that subprocess needs the S3 bucket path and the local folder path; I think it is not picking up the path. If I hard-code the path it works, but not like this. How can I achieve my result?

With

dwn_cmd = "aws s3 cp s3://bucket/name/" + "2019" + '/' + "June"
folder_path = "/local/linux/folder/" + "test"

You will be calling

subprocess.call(['aws', 's3', 'cp',
                 "aws s3 cp s3://bucket/name/2019/June",
                 "/local/linux/folder/test",
                 '--recursive', '--p', 'alusta'])

Delete the aws s3 cp prefix from dwn_cmd:

dwn_cmd = "s3://bucket/name/" + "2019" + '/' + "June"

Note: do not use subprocess.call([dwn_cmd, folder_path, '--recursive', '--p', 'alusta']) with the original dwn_cmd either. That is wrong: the spaces in "aws s3 cp s3://..." would be treated as part of the command name, so the system would look for an executable literally named aws s3 cp s3://... (spaces included) instead of running aws with separate arguments.
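
If you would rather build the whole command as a single string, you can tokenize it back into a list with the standard library's shlex.split before handing it to subprocess. A sketch under the same assumptions as above:

import shlex
import subprocess

cmd = ("aws s3 cp s3://bucket/name/2019/June "
       "/local/linux/folder/test --recursive --profile alusta")

# shlex.split turns the string into ['aws', 's3', 'cp', ...],
# which subprocess can execute without a shell
subprocess.call(shlex.split(cmd))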
