
How to move files from Amazon EC2 to an S3 bucket using the command line

In my Amazon EC2 instance, I have a folder named uploads. In this folder I have 1000 images. Now I want to copy all the images to my new S3 bucket. How can I do this?

First Option: s3cmd

Use s3cmd. To download a file from S3:

s3cmd get s3://AWS_S3_Bucket/dir/file

To upload, which is what the question asks, use s3cmd put (example below). Take a look at the s3cmd documentation.

If you are on Debian or Ubuntu, run this on the command line:

sudo apt-get install s3cmd

On CentOS or Fedora:

sudo yum install s3cmd
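
After installing, s3cmd needs your AWS credentials; it ships with a guided setup:

s3cmd --configure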

Example of usage:

s3cmd put my.file s3://pactsRamun/folderExample/fileExample
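
For the question's scenario, the whole uploads folder can be pushed in one command. A minimal sketch, assuming a bucket named my-new-bucket (replace with yours):

s3cmd put --recursive uploads/ s3://my-new-bucket/uploads/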

Second Option: AWS CLI

Using the CLI from Amazon.

Update

Like @tedder42 said in the comments, instead of using cp, use sync.

Take a look at the following syntax:

aws s3 sync <source> <target> [--options]

Example:

aws s3 sync . s3://my-bucket/MyFolder
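
Applied to the question, where my-new-bucket stands in for your bucket name:

aws s3 sync uploads s3://my-new-bucket/uploads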

More information and examples are available in Managing Objects Using High-Level s3 Commands with the AWS Command Line Interface.

aws s3 sync your-dir-name s3://your-s3-bucket-name/folder-name
  • Important: This will copy each item in your named directory into the S3 bucket folder you selected. It will not copy your directory as a whole.

For a single selected file, use cp instead, since sync operates on directories/prefixes rather than individual files:

aws s3 cp your-dir-name/file-name s3://your-s3-bucket-name/folder-name/file-name

Or you can sync the current directory (.) to select everything in it. Note that this copies the directory's contents as a whole into your S3 bucket folder:

aws s3 sync . s3://your-s3-bucket-name/folder-name
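
Since the question is about images, you can also restrict a sync to particular file types with the CLI's --exclude/--include filters (bucket and folder names are placeholders):

aws s3 sync . s3://your-s3-bucket-name/folder-name --exclude "*" --include "*.jpg" --include "*.png"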

To copy from EC2 to S3, first give an "IAM Role with full S3 access" to your EC2 instance, then run the command below in the instance's command line:

aws s3 cp Your_Ec2_Folder s3://Your_S3_bucket/Your_folder --recursive
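
After the copy, you can sanity-check which credentials the instance is using and list what landed in the bucket (same placeholder names as above):

aws sts get-caller-identity
aws s3 ls s3://Your_S3_bucket/Your_folder/ --recursive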

Also note that when syncing with S3, the AWS CLI is multithreaded and uploads multiple parts of a file at a time. In current CLI versions the number of concurrent requests can be tuned through the CLI's S3 configuration settings.
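
For example, to raise the concurrency (20 is just an illustrative value):

aws configure set default.s3.max_concurrent_requests 20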

Note that mv moves rather than copies, so the local files are deleted after upload:

aws s3 mv /home/inbound/ s3://test/ --recursive --region us-west-2

This can be done very simply. Follow these steps:

  • Open the EC2 console in AWS.
  • Select the instance and navigate to Actions.
  • Select Instance Settings and select Attach/Replace IAM Role.
  • When this is done, connect to the AWS instance, and the rest is done via the following CLI command:

aws s3 cp filelocation/filename s3://bucketname

Hence you don't need to install anything or make any extra effort.

Please note that the file location refers to the local path on the instance, and bucketname is the name of your bucket. Also note: this works as described when your instance and S3 bucket are in the same account. Cheers.
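
As an aside, the Attach/Replace IAM Role step can also be performed from the CLI on a machine that already has credentials; the instance ID and role name below are hypothetical placeholders:

aws ec2 associate-iam-instance-profile --instance-id i-0123456789abcdef0 --iam-instance-profile Name=MyS3AccessRole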

The CLI has a --dryrun flag available for testing.

  • To begin with, I would assign the EC2 instance a role that can read and write to S3.
  • SSH into the instance and perform the following:
  • vi tmp1.txt
  • aws s3 mv ./ s3://bucketname-bucketurl.com/ --recursive --dryrun
  • If this works, then all you have to do is create a script that uploads all files matching a specific pattern from this folder to the S3 bucket.
  • I have written the following command in my script to move files older than 2 minutes from the current directory to the bucket/folder (a fuller script sketch follows this list):
  • cd dir; find . -maxdepth 1 -name '*.txt' -mmin +2 -exec aws s3 mv '{}' s3://bucketurl.com/ \;
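
A minimal sketch of such a script; the directory and bucket path are hypothetical placeholders, adjust both before use:

#!/bin/bash
# Move *.txt files older than 2 minutes from /home/inbound to S3.
# /home/inbound and s3://your-bucket/inbound/ are placeholders.
cd /home/inbound || exit 1
find . -maxdepth 1 -name '*.txt' -mmin +2 \
  -exec aws s3 mv '{}' s3://your-bucket/inbound/ \;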
