
Files copied from a remote server to S3 do not show up in the S3 bucket

I am uploading almost 7 TB of files and folders from my remote server to an S3 bucket, but I cannot see the files in the bucket. Only a few files that were copied successfully are visible in S3.

I have one EC2 server on which I have mounted the S3 bucket by following this link.

On the remote server, I am using the following script. I have tested it and it worked fine for small files:

rsync -uvPz --recursive -e "ssh -i /tmp/key.pem" /eb_bkup/OMCS_USB/* appadmin@10.118.33.124:/tmp/tmp/s3fs-demo/source/backups/eb/ >> /tmp/log.txt &

The log file I am generating shows that files are being copied, along with the relevant details such as transfer speed and filename. But in the S3 bucket I cannot see any file after the first one is copied.

Each file is between 500 MB and 25 GB in size.

Why can I not see these files in S3?

Amazon S3 is an object storage service, not a filesystem. I recommend you use the AWS Command-Line Interface (CLI) to copy files rather than mounting S3 as a disk.

The AWS CLI includes an aws s3 sync command that is ideal for your purpose -- it synchronizes files between two locations. If something fails, you can re-run it and it will not re-copy files that have already been transferred.
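A minimal sketch of that approach, run directly on the remote server so the EC2/s3fs hop is skipped entirely (the bucket name and key prefix here are placeholders, not values from the question):

```shell
# Sync the backup directory straight to S3.
# "my-backup-bucket" and the prefix are illustrative placeholders.
aws s3 sync /eb_bkup/OMCS_USB/ s3://my-backup-bucket/backups/eb/

# Re-running the same command after a failure only transfers files
# that are new or changed since the last run.
aws s3 sync /eb_bkup/OMCS_USB/ s3://my-backup-bucket/backups/eb/
```

The remote server would need the AWS CLI installed and credentials (or an attached IAM role) with write access to the bucket; large files are automatically uploaded as multipart uploads.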

So the issue I was facing was that rsync was copying files to the target EC2 instance by first creating a temporary file and then writing it out to the S3 bucket. Multiple rsync jobs were running, and the local EBS volume on the EC2 server filled up. Because of that, rsync could not create its temporary files and just kept writing to the socket.
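If you stay on the rsync route, one mitigation (a sketch, untested against s3fs) is to stop rsync from staging whole-file temporary copies on the full volume, using rsync's --inplace option:

```shell
# --inplace makes rsync write each file directly to its destination
# path instead of building a temporary copy and renaming it, so no
# extra staging space is needed on the receiving volume (at the cost
# of non-atomic updates if a transfer is interrupted).
rsync -uvPz --inplace --recursive -e "ssh -i /tmp/key.pem" \
    /eb_bkup/OMCS_USB/ \
    appadmin@10.118.33.124:/tmp/tmp/s3fs-demo/source/backups/eb/
```

Alternatively, rsync's --temp-dir=DIR option can point the temporary files at a volume with enough free space. Either way, running a single rsync job instead of several in parallel avoids multiplying the staging-space requirement.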

