
Unable to download AWS S3 Glacier objects

I was trying to copy all the files from my S3 bucket to a local folder in a VM, and I am getting the following error:

warning: Skipping file s3://bucket/object. Object is of storage class GLACIER.
Unable to perform download operations on GLACIER objects. You must restore the
object to be able to perform the operation. See aws s3 download help for
additional parameter options to ignore or force these transfers.


To copy the files from my S3 bucket to a local folder, I used the following command:

aws s3 cp s3://${s3Location} ${localDumpPath}

Where:

  • ${s3Location} = my S3 location, and
  • ${localDumpPath} = my local folder path

What do I need to change to be able to copy successfully?

I fixed the issue by using the following command:

aws s3 cp s3://${s3Location} ${localDumpPath} --storage-class STANDARD --recursive --force-glacier-transfer

You can also refer to the following link for details on how to restore an S3 object from the Amazon S3 Glacier storage class using the AWS CLI: Restore S3 object from the Amazon Glacier storage class

The problem: you are trying to copy an AWS S3 object whose storage class is GLACIER, and you get the following error:

warning: Skipping file s3://<SomePathToS3Object> Object is of storage class GLACIER.
Unable to perform download operations on GLACIER objects.
You must restore the object to be able to perform the operation.
See aws s3 download help for additional parameter options to ignore or force these transfers.

Explanation: Amazon S3 Glacier is a secure, durable, and extremely low-cost cloud storage service for data archiving and long-term backup. When you need to use a file, you submit a restore request, pay the retrieval price, and after a couple of hours the object becomes available. Companies typically use this feature to archive files/logs/databases/backups that are rarely consumed.

Solution: to get Glacier files you need to initiate a restore request, monitor its status, and, as soon as it finishes, copy the object with the STANDARD storage class. You can use the AWS CLI reference:

// Initiate restore request:
$ aws s3api restore-object --bucket examplebucket --key dir1/example.obj \
--restore-request '{"Days":7,"GlacierJobParameters":{"Tier":"Standard"}}'

// Monitor status:
$ aws s3api head-object --bucket examplebucket --key dir1/example.obj

// output example - restore in progress
{
    "Restore": "ongoing-request=\"true\"",
    ...
    "StorageClass": "GLACIER",
    "Metadata": {}
}

// output example - restore completed
{
    "Restore": "ongoing-request=\"false\", expiry-date=\"Sat, 01 Jan 2000 00:00:00 GMT\"",
    ...
    "StorageClass": "GLACIER",
    "Metadata": {}
}

$ aws s3 cp s3://examplebucket/dir1/ ~/Downloads \
--storage-class STANDARD --recursive --force-glacier-transfer

You wrote that you needed "to copy all the files" to a local folder, so I assume you want to copy the files recursively.

Because the files are kept in the Glacier storage class, you need to restore them from the Glacier archive before you can copy them to your local folder, i.e. make the files available for retrieval for a specified number of days. After the restore completes, you can copy the files by specifying the --force-glacier-transfer parameter, until the number of days that you specified expires.

Unless you store the files in the "S3 Glacier Instant Retrieval" storage class, you have to restore the files first (make them available for retrieval) so that the --force-glacier-transfer option does not fail. Therefore, the solution proposed at https://stackoverflow.com/a/62651252/6910868 does not apply to the "S3 Glacier Deep Archive" storage class, for which you have to explicitly issue the restore-object command and wait for its completion before you can copy files to your local folder.

However, aws s3api restore-object restores just one object and does not support recursive restores. The solution specified at https://stackoverflow.com/a/65925266/6910868 also does not work for a directory with multiple files, where you wish to specify just the folder without listing all the files one by one. A scripted workaround is sketched below.
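
Here is a minimal sketch of such a recursive restore, assuming bash and the AWS CLI; examplebucket and dir1/ are placeholder names. It lists all GLACIER-class keys under a prefix and issues a restore request for each:

# List every GLACIER-class key under the prefix and request a restore for it.
# Re-running this on an object whose restore is already in progress returns a
# RestoreAlreadyInProgress error, which is harmless here.
aws s3api list-objects-v2 --bucket examplebucket --prefix dir1/ \
    --query "Contents[?StorageClass=='GLACIER'].Key" --output text |
tr '\t' '\n' |
while read -r key; do
    aws s3api restore-object --bucket examplebucket --key "$key" \
        --restore-request '{"Days":7,"GlacierJobParameters":{"Tier":"Standard"}}'
done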

As an alternative, instead of restoring the files (making them available for retrieval), you can change the objects' storage class to Amazon S3 Standard. To do that, copy the files within S3, either overwriting the existing files in place or copying them from one S3 location to another, specifying the correct destination storage class in each case; see the sketch below.
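
For example, a minimal sketch of the in-place variant, with placeholder bucket and prefix names; note that objects in Glacier Flexible Retrieval or Deep Archive must already be restored before S3 will let you copy them:

# Copy the objects onto themselves, rewriting them with the STANDARD class;
# --force-glacier-transfer stops the CLI from skipping GLACIER objects.
aws s3 cp s3://examplebucket/dir1/ s3://examplebucket/dir1/ \
    --storage-class STANDARD --recursive --force-glacier-transfer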

If you just need to retrieve files recursively from the Glacier storage class without changing their storage class or making additional copies within S3, you can use a script that lists the files recursively and restores them from Glacier individually. Such a script can not only initiate the restore at a specified restore tier but also monitor the process.
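
A minimal sketch of the monitoring half, again assuming bash, the AWS CLI, and placeholder names; rerun it until every key reports ongoing-request="false":

# Print the restore status of every GLACIER object under the prefix.
aws s3api list-objects-v2 --bucket examplebucket --prefix dir1/ \
    --query "Contents[?StorageClass=='GLACIER'].Key" --output text |
tr '\t' '\n' |
while read -r key; do
    status=$(aws s3api head-object --bucket examplebucket --key "$key" \
        --query "Restore" --output text)
    echo "$key: $status"
done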
