
AWS CodePipeline not uploading files to the S3 bucket

I have an AWS CodePipeline with these steps:

(screenshot of the pipeline stages)

When I run the CodeBuild phase manually, 2 artifacts are successfully uploaded to S3.

(screenshot)

When I change code and push it to CodeCommit, the pipeline executes successfully, but the artifacts are not uploaded to the S3 bucket. When I read the pipeline logs, they assert that 2 files were uploaded successfully to the S3 bucket, yet the files are not in the S3 bucket.

Both the pipeline and CodeBuild use the same S3 bucket. These are the pipeline logs; the last 3 lines assert that 2 artifacts were successfully uploaded to the S3 bucket.

[Container] 2020/08/24 08:25:55 Phase complete: POST_BUILD State: SUCCEEDED
[Container] 2020/08/24 08:25:55 Phase context status code:  Message: 
[Container] 2020/08/24 08:25:55 Expanding base directory path: ./build-artifacts
[Container] 2020/08/24 08:25:55 Assembling file list
[Container] 2020/08/24 08:25:55 Expanding ./build-artifacts
[Container] 2020/08/24 08:25:55 Expanding file paths for base directory ./build-artifacts
[Container] 2020/08/24 08:25:55 Assembling file list
[Container] 2020/08/24 08:25:55 Expanding **/*
[Container] 2020/08/24 08:25:55 Found 2 file(s)
[Container] 2020/08/24 08:25:56 Phase complete: UPLOAD_ARTIFACTS State: SUCCEEDED
[Container] 2020/08/24 08:25:56 Phase context status code:  Message: 

As explained in the comments, the reason why you can't see the artifacts in your bucket is that CodePipeline (CP) stores them in CP's own bucket. You can inspect the CP bucket.

The artifacts will be zipped, although the file in the bucket will have no extension and will have a random name. You can download the file, add a .zip extension, and unpack it to verify.

One way to deploy to S3 is through a Deploy action with the Amazon S3 action provider. But this will just deploy the zip; it will not unpack it. I don't think this is what you are aiming for.
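For reference, a rough sketch of such a Deploy action as it might appear in a CloudFormation-defined pipeline (BuildOutput, my-target-bucket and the object key are placeholders, not values from the question):

- Name: Deploy
  Actions:
    - Name: CopyZipToS3
      ActionTypeId:
        Category: Deploy
        Owner: AWS
        Provider: S3
        Version: '1'
      InputArtifacts:
        - Name: BuildOutput                # artifact produced by the Build stage
      Configuration:
        BucketName: my-target-bucket       # destination bucket
        Extract: 'false'                   # the artifact stays zipped, as described above
        ObjectKey: build-artifacts.zip
      RunOrder: 1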

The alternative is to use the AWS CLI in your CodeBuild project to simply copy the artifacts "manually" to the desired bucket. This would probably be the easiest way.
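A minimal buildspec.yml sketch of that approach, assuming the build output lands in ./build-artifacts as in the logs above; my-destination-bucket is a placeholder, and the CodeBuild service role needs s3:PutObject on that bucket:

version: 0.2

phases:
  build:
    commands:
      - ./build.sh                          # placeholder for your real build commands
  post_build:
    commands:
      # Copy the built files directly to the target bucket,
      # independently of the CodePipeline artifact store.
      - aws s3 cp ./build-artifacts s3://my-destination-bucket/ --recursive

artifacts:
  files:
    - '**/*'
  base-directory: ./build-artifacts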

CodePipeline will override CodeBuild's artifact bucket; I guess it is moving the artifacts to its own bucket. You can see CodePipeline's bucket by running the command below.

aws codepipeline get-pipeline --name PipelineName --query 'pipeline.[artifactStore]'

[
    {
        "type": "S3",
        "location": "codepipeline-us-east-1-xxxxxxxx"
    }
]

If you are using CloudFormation to create the pipeline, you configure the bucket using ArtifactStore.
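As a rough sketch, the relevant part of an AWS::CodePipeline::Pipeline resource could look like this (the resource, role and bucket names are placeholders):

Resources:
  Pipeline:
    Type: AWS::CodePipeline::Pipeline
    Properties:
      RoleArn: !GetAtt PipelineRole.Arn            # assumes a PipelineRole resource defined elsewhere
      ArtifactStore:
        Type: S3
        Location: my-pipeline-artifact-bucket      # bucket where CodePipeline stores the zipped artifacts
      Stages: []                                   # Source / Build / Deploy stage definitions omitted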

Update ArtifactStore in CodePipeline: Currently, I don't see a way to update the ArtifactStore through the console, but it can be done with the update-pipeline command.

Get the pipeline details:

aws codepipeline get-pipeline --name MyFirstPipeline >pipeline.json

Edit pipeline.json to update the S3 bucket in artifactStore, and then run the command:

aws codepipeline update-pipeline --cli-input-json file://pipeline.json

Answer reference: https://stackoverflow.com/a/49292819/11758843
