
Getting 403 forbidden from s3 when attempting to download a file

I have a bucket on S3, and a user who has been given full access to that bucket.

I can perform an ls command and see the files in the bucket, but downloading them fails with:

A client error (403) occurred when calling the HeadObject operation: Forbidden

I also attempted this with a user granted full S3 permissions through the IAM console. Same problem.

For reference, here is the IAM policy I have:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": "s3:ListAllMyBuckets",
            "Resource": "arn:aws:s3:::*"
        },
        {
            "Effect": "Allow",
            "Action": "s3:*",
            "Resource": [
                "arn:aws:s3:::mybucket",
                "arn:aws:s3:::mybucket/*"
            ]
        }
    ]
}

I also tried adding a bucket policy, and even making the bucket public, but still no go. Also, from the console, I tried to set individual permissions on the files in the bucket and got an error saying I cannot view the bucket. This is strange, since I was viewing it from the console when the message appeared, and I can ls anything in the bucket.

EDIT: the files in my bucket were copied there from another bucket belonging to a different account, using credentials from my account. May or may not be relevant...

2nd EDIT: I just tried to upload, download, and copy my own files to and from this bucket and other buckets, and it works fine. The issue is specifically with the files placed there from another account's bucket.

Thanks!

I think you need to make sure the "bucket-owner-full-control" ACL is applied to objects when moving/copying them between buckets.

Here are the details on how to do this when moving or copying files, as well as retroactively: https://aws.amazon.com/premiumsupport/knowledge-center/s3-bucket-owner-access/

Also, you can read about the various predefined grants here: http://docs.aws.amazon.com/AmazonS3/latest/dev/acl-overview.html#canned-acl
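To confirm that ownership is the problem, you can check who actually owns one of the affected objects via its ACL. A minimal sketch using boto3's get_object_acl (the helper name is mine; pass in a boto3 S3 client built with credentials that can read the object's ACL, e.g. the uploading account's):

```python
def get_object_owner(s3_client, bucket, key):
    """Return the owner of an object, as reported by its ACL.

    If the owner is not the bucket-owning account, that explains
    the 403 on HeadObject/GetObject from the bucket owner's side.
    """
    acl = s3_client.get_object_acl(Bucket=bucket, Key=key)
    return acl['Owner']
```

For example, `get_object_owner(boto3.client('s3'), 'mybucket', 'some/file')` should return the other account's canonical user for the problem files.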

The problem here stems from how you got the files into the bucket, specifically the credentials you had and/or the privileges you granted at the time of upload. I ran into a similar permissions issue when I had multiple AWS accounts, even though my bucket policy was quite open (as yours is here). I had accidentally used credentials from one account (call it A1) when uploading to a bucket owned by a different account (A2). Because of this, A1 kept the permissions on the object and the bucket owner did not get them. There are at least 3 possible ways to fix this at the time of upload:

  • Switch accounts. Run export AWS_DEFAULT_PROFILE=A2 or, for a more permanent change, modify ~/.aws/credentials and ~/.aws/config to move the correct credentials and configuration under [default]. Then re-upload.
  • Specify the other profile at the time of upload: aws s3 cp foo s3://mybucket --profile A2
  • Open up the permissions to the bucket owner (doesn't require changing profiles): aws s3 cp foo s3://mybucket --acl bucket-owner-full-control

Note that the first two ways involve having a separate AWS profile. If you want to keep two sets of account credentials available to you, this is the way to go. You can set up a profile with your keys, region, etc. by running aws configure --profile Foo. See the AWS CLI documentation on named profiles for more info.

There are also slightly more involved ways to do this retroactively (post-upload), which are covered in the AWS knowledge-center article linked above.
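One retroactive option is for the uploading account, which still owns the objects, to re-grant the ACL with put_object_acl. A sketch under that assumption (the helper name is mine; the client must be built from the uploader's, i.e. A1's, credentials):

```python
def grant_bucket_owner_full_control(s3_client, bucket, key):
    # Reset the object's ACL so the bucket-owning account (A2)
    # gains full control. Only the object's current owner (A1)
    # is allowed to make this call.
    return s3_client.put_object_acl(
        Bucket=bucket,
        Key=key,
        ACL='bucket-owner-full-control',
    )
```

You would loop this over each affected key (e.g. from a list_objects_v2 listing) to fix the whole bucket.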

To correctly set the appropriate permissions for newly added files, add this bucket policy:

[...]
{
    "Effect": "Allow",
    "Principal": {
        "AWS": "arn:aws:iam::123456789012:user/their-user"
    },
    "Action": [
        "s3:PutObject",
        "s3:PutObjectAcl"
    ],
    "Resource": "arn:aws:s3:::my-bucket/*"
}

Your bucket policy is even more open, so that's not what's blocking you.

However, the uploader needs to set the ACL for newly created files. Python example:

import boto3

client = boto3.client('s3')

local_file_path = '/home/me/data.csv'
bucket_name = 'my-bucket'
bucket_file_path = 'exports/data.csv'

client.upload_file(
    local_file_path,
    bucket_name,
    bucket_file_path,
    # Grant the bucket owner full control of the new object
    ExtraArgs={'ACL': 'bucket-owner-full-control'}
)

source: https://medium.com/artificial-industry/how-to-download-files-that-others-put-in-your-aws-s3-bucket-2269e20ed041 (disclaimer: written by me)
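Since the asker's files arrived via a cross-account copy rather than a direct upload, the same ACL can be passed to a server-side copy. A hedged sketch using boto3's copy_object (the helper name is hypothetical):

```python
def copy_with_owner_acl(s3_client, src_bucket, src_key, dest_bucket, dest_key):
    # Server-side copy that also grants the destination bucket's
    # owner full control of the newly created object, so the
    # cross-account 403 on later reads does not occur.
    return s3_client.copy_object(
        CopySource={'Bucket': src_bucket, 'Key': src_key},
        Bucket=dest_bucket,
        Key=dest_key,
        ACL='bucket-owner-full-control',
    )
```

Run with credentials that can read the source object and write to the destination bucket.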
