
Deploy Lambda with code source from another account's S3 bucket

I store my Lambda zip files in an S3 bucket in Account A. In Account B I have my Lambda. I am trying to have my Lambda use the zip file in Account A's bucket, but I keep getting:

Your access has been denied by S3, please make sure your request credentials have permission to GetObject for bucket/code.zip. S3 Error Code: AccessDenied. S3 Error Message: Access Denied

I have followed guides I found online, but I am still facing issues. Here is my current config:

Account A's S3 bucket policy:

{
    "Version": "2012-10-17",
    "Id": "ExamplePolicy",
    "Statement": [
        {
            "Sid": "ExampleStmt",
            "Effect": "Allow",
            "Principal": {
                "AWS": "arn:aws:iam::AccountBID:role/MyLambdaRole"
            },
            "Action": [
                "s3:GetObject",
                "s3:ListBucket"
            ],
            "Resource": [
                "arn:aws:s3:::bucket",
                "arn:aws:s3:::bucket/*"
            ]
        }
    ]
}

Account B's Lambda execution role policy:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "s3:GetObject",
                "s3:ListBucket"
            ],
            "Resource": [
                "arn:aws:s3:::bucket/*",
                "arn:aws:s3:::bucket"
            ]
        }
    ]
}
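For context, the deployment step where this error appears typically looks something like the following (a sketch with placeholder names; the bucket and key match the `bucket/code.zip` in the error message, and `MyFunction` is hypothetical). The key point is that this command runs under the *deployment* credentials, not the Lambda execution role:

```shell
# Update the function's code from the zip stored in Account A's bucket.
# S3 authorizes the credentials running THIS command (the deployer),
# not the role the function assumes at invocation time.
aws lambda update-function-code \
  --function-name MyFunction \
  --s3-bucket bucket \
  --s3-key code.zip
```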

The principal in your bucket policy is the role that AWS Lambda uses during execution, which is not the identity used when deploying your function. You could instead allow the entire Account B principal in the bucket policy, and then use IAM policies within Account B to grant access to the bucket.

A bucket policy allowing an entire account looks like this:

{
    "Version": "2008-10-17",
    "Statement": [
        {
            "Sid": "ProductAccountAccess",
            "Effect": "Allow",
            "Principal": {
                "AWS": [
                    "arn:aws:iam::XXXX-account-number:root"
                ]
            },
            "Action": [
                "s3:Get*",
                "s3:List*"
            ],
            "Resource": [
                "arn:aws:s3:::bucket",
                "arn:aws:s3:::bucket/*"
            ]
        }
    ]
}

This means that the IAM policies needed in Account B depend on how you do your deployment: whatever credentials are used for the deployment must have S3 permissions for that bucket.
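Concretely, the deploying identity in Account B (an IAM user or CI role, not the Lambda execution role) would need a policy along these lines attached. This is a minimal sketch using the placeholder bucket name from above:

```json
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::bucket/*"
        }
    ]
}
```

With the account-wide bucket policy in Account A plus this identity policy in Account B, both halves of the cross-account check pass for the deployment call.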

