
Unload from Redshift to an S3 bucket in a different AWS account

I'm trying to unload data from a Redshift cluster in one AWS account to an S3 bucket in another account. I have managed to send the file. However, since the bucket owner is not the owner of the file I'm sending, they cannot access it. As it's a straight unload from Redshift, I don't think I can specify a condition that grants the bucket owner the right permissions.

Is this even possible to achieve (without having to unload from Redshift using the same account), and if so, how?

Thanks!

(Attempt #2...)

Okay, it seems your situation is:

  • You are doing an UNLOAD from Amazon Redshift into an Amazon S3 bucket that belongs to a different AWS Account
  • A user within that different AWS Account wishes to access the files but says that they are unable to do so

There is no concept of a "file owner" in Amazon S3. Instead, there are:

  • Permissions associated with each object in Amazon S3
  • A bucket policy that applies to a specific bucket
  • IAM policies that can be applied to Users, Groups and Roles

As long as at least one of these permissions grants access and none of them specifically deny access, then users will be able to access the files.

If the user reports not being able to view the files, then ensure that ListBucket and GetObject permissions have been granted via one of the above methods.
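As a quick sanity check, a bucket policy document can be scanned for those two grants before digging further. Here is a minimal sketch (the helper function `grants_action` and the example ARNs are illustrative, not an AWS API; it also does not expand wildcards such as `s3:*`):

```python
def grants_action(policy, principal_arn, action):
    """Return True if any Allow statement in the given bucket policy
    grants `action` to `principal_arn`.

    Note: this simple check matches actions and principals literally;
    it does not expand wildcards like "s3:*" or "AWS": "*".
    """
    for stmt in policy.get("Statement", []):
        if stmt.get("Effect") != "Allow":
            continue
        # "Action" and "Principal.AWS" may be a string or a list
        actions = stmt.get("Action", [])
        if isinstance(actions, str):
            actions = [actions]
        principals = stmt.get("Principal", {}).get("AWS", [])
        if isinstance(principals, str):
            principals = [principals]
        if action in actions and principal_arn in principals:
            return True
    return False

# Example: a policy that grants ListBucket (but not GetObject) to one user
policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "Sid1",
        "Effect": "Allow",
        "Action": ["s3:ListBucket"],
        "Resource": "arn:aws:s3:::MY-BUCKET",
        "Principal": {"AWS": ["arn:aws:iam::123456789012:user/username"]},
    }],
}

user = "arn:aws:iam::123456789012:user/username"
print(grants_action(policy, user, "s3:ListBucket"))  # True
print(grants_action(policy, user, "s3:GetObject"))   # False
```

If the second check comes back False, the user will be able to list the bucket but every download will fail with Access Denied.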

It sounds like your situation is:

  • You have used the UNLOAD command to export data from Amazon Redshift to an Amazon S3 bucket that you own
  • You wish to grant access to the files to an AWS user that belongs to a different AWS account

Permission to access an object in Amazon S3 can be granted in several ways:

  • On the object itself, by manually setting permissions on the file(s)
  • In IAM (Identity and Access Management), by attaching a policy to a specific user that grants permission to access a bucket - but this only works for users in the same AWS account
  • By defining a Bucket Policy that grants access to bucket content, either to everyone (public) or to particular AWS users (including a user in a different account) - the user would then access the content by supplying their own credentials, for example with the AWS Command-Line Interface (CLI) aws s3 cp command.

The Bucket Policy option seems the best for your situation. To enable it, create a policy on the bucket that grants access to a particular directory where you will put the files, for example:

{
    "Id": "Policy",
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "Sid1",
            "Action": [
                "s3:ListBucket"
            ],
            "Effect": "Allow",
            "Resource": "arn:aws:s3:::MY-BUCKET",
            "Principal": {
                "AWS": [
                    "arn:aws:iam::123456789012:user/username"
                ]
            }
        },
        {
            "Sid": "Sid2",
            "Action": [
                "s3:GetObject"
            ],
            "Effect": "Allow",
            "Resource": "arn:aws:s3:::MY-BUCKET/PATH/*",
            "Principal": {
                "AWS": [
                    "arn:aws:iam::123456789012:user/username"
                ]
            }
        }
    ]
}

The ARN in the policy refers to the Account ID and username of the person to whom you are granting access. They can then use the AWS CLI to list the contents of the bucket and download content.
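If you need to produce this policy for several buckets or users, it can also be generated programmatically, which helps avoid typos in the ARNs. A minimal sketch (the bucket name, path, and user ARN are the placeholders from the example above, and `make_bucket_policy` is an illustrative helper, not an AWS API):

```python
import json

def make_bucket_policy(bucket, path, user_arn):
    """Build a bucket policy granting s3:ListBucket on the bucket and
    s3:GetObject on objects under `path` to the given user ARN."""
    return {
        "Id": "Policy",
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "Sid1",
                "Action": ["s3:ListBucket"],
                "Effect": "Allow",
                "Resource": f"arn:aws:s3:::{bucket}",
                "Principal": {"AWS": [user_arn]},
            },
            {
                "Sid": "Sid2",
                "Action": ["s3:GetObject"],
                "Effect": "Allow",
                "Resource": f"arn:aws:s3:::{bucket}/{path}/*",
                "Principal": {"AWS": [user_arn]},
            },
        ],
    }

policy_json = json.dumps(
    make_bucket_policy("MY-BUCKET", "PATH",
                       "arn:aws:iam::123456789012:user/username"),
    indent=4,
)
print(policy_json)
```

The resulting JSON can then be attached to the bucket via the S3 console, or with the AWS CLI's aws s3api put-bucket-policy command.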
