
How to fetch AWS CodeDeploy logs and show them in a BitBucket pipeline

I want to fetch CodeDeploy logs from my Amazon EC2 instance when a script fails during deployment and then show the logs in BitBucket pipelines.

How can I do that?
Is there any API available for fetching the logs from CodeDeploy?

I am not sure about BitBucket, but natively on AWS you can push the logs from the CodeDeploy agent to CloudWatch Logs using the CloudWatch Logs agent [1]. Once the logs are in CloudWatch Logs, you can create a metric filter that raises an alarm when specific text appears in the log entries [2].

The CodeDeploy agent log file locations are:

LINUX
- /opt/codedeploy-agent/deployment-root/deployment-group-ID/deployment-ID/logs/scripts.log
- /var/log/aws/codedeploy-agent/codedeploy-agent.log
- /tmp/codedeploy-agent.update.log

WINDOWS
- C:\ProgramData\Amazon\CodeDeploy\log\codedeploy-agent-log.txt
- C:\ProgramData\Amazon\CodeDeploy\deployment-group-ID\deployment-ID\logs\scripts.log
- C:\ProgramData\Amazon\CodeDeployUpdater\log\codedeploy-agent.updater.log
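To ship the Linux agent log listed above with the CloudWatch Logs agent, a stanza along these lines in /etc/awslogs/awslogs.conf would do it (a sketch only; the log group name here is an arbitrary choice, not a default):

```ini
[codedeploy-agent-log]
file = /var/log/aws/codedeploy-agent/codedeploy-agent.log
log_group_name = codedeploy-agent
log_stream_name = {instance_id}
```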

References:

[1] Quick Start: Enable Your Amazon EC2 Instances Running Windows Server 2016 to Send Logs to CloudWatch Logs Using the CloudWatch Logs Agent - https://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/QuickStartWindows2016.html

[2] https://theithollow.com/2017/12/11/use-amazon-cloudwatch-logs-metric-filters-send-alerts/

Although I'm not aware of an API for this, there is a simple workaround that does the job: upload the default CodeDeploy log file to S3 and read it from there.

  1. On the EC2 instance, the default log file for AWS CodeDeploy is stored at /var/log/aws/codedeploy-agent/codedeploy-agent.log.

  2. Install the AWS CLI on the EC2 instance. [1]

  3. Write a simple shell script that uploads the log file to an S3 bucket. For convenience, you may reuse the S3 bucket that is already used to ship code from Bitbucket to the EC2 instance. Make sure, however, that the EC2 instance has sufficient privileges to upload a file to the S3 bucket. [2]

    #!/bin/bash

    aws s3 cp /var/log/aws/codedeploy-agent/codedeploy-agent.log s3://S3_BUCKET_NAME/codedeploy-agent.log
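If deployments are frequent, a variant of the script could key each upload by deployment so older logs aren't overwritten. This is only a sketch: the codedeploy-logs/ prefix is an arbitrary choice, and the actual copy is left commented out; CodeDeploy exports DEPLOYMENT_ID to lifecycle-hook scripts, so it is available here.

```shell
#!/bin/bash
# Hypothetical refinement: key the upload by deployment ID so successive
# deployments don't overwrite each other's logs in the bucket.
# Fall back to "manual" when run outside a CodeDeploy deployment.
key="codedeploy-logs/${DEPLOYMENT_ID:-manual}/codedeploy-agent.log"
echo "target key: $key"
# aws s3 cp /var/log/aws/codedeploy-agent/codedeploy-agent.log "s3://$S3_BUCKET_NAME/$key"
```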

  4. Let's say the script created in the above step is saved at /opt/upload-log-file.sh. Make sure the script is executable so that CodeDeploy can run it during the deployment. This can be ensured by running:

    chmod +x /opt/upload-log-file.sh

  5. The appspec.yml defines all the steps for the deployment. In the script of the last defined step (usually the 'ValidateService' step), add the following line to execute the upload-log-file.sh script:

    /opt/upload-log-file.sh
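For context, the relevant hooks section of appspec.yml might look like this (a sketch; the script name, timeout, and runas values are placeholders, and scripts/validate.sh would end with the /opt/upload-log-file.sh call above):

```yaml
hooks:
  ValidateService:
    - location: scripts/validate.sh
      timeout: 300
      runas: root
```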

  6. In the bitbucket-pipelines.yml file, add the following steps to read the contents of the uploaded log file. Note that the variables used here should be declared as Bitbucket user-defined variables. [3]

    apt-get update

    apt-get install -y awscli

    # AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY from the pipeline's
    # user-defined variables are picked up by the AWS CLI automatically
    aws s3 cp s3://$S3_BUCKET_NAME/codedeploy-agent.log .

    tail -100 codedeploy-agent.log

The next time the Bitbucket pipeline runs, you will be able to see the last 100 lines of the log file in the pipeline step output.
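As an optional refinement of that last step, the log can be filtered down to its error lines instead of a blind tail. The sketch below writes a fake two-line log purely so the snippet is self-contained; in the pipeline, the file would come from the aws s3 cp step.

```shell
# Stand-in log file so the example runs on its own; in the real pipeline
# codedeploy-agent.log is the file fetched from S3.
printf '2024-01-01 INFO starting deployment\n2024-01-01 ERROR script exited with code 1\n' > codedeploy-agent.log

# Surface only the error lines, with line numbers, instead of tail -100.
grep -n 'ERROR' codedeploy-agent.log || echo "no ERROR lines in log"
```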


References:

[1] Installing AWS CLI - https://docs.aws.amazon.com/cli/latest/userguide/cli-chap-install.html

[2] Accessing AWS S3 from EC2 - https://aws.amazon.com/premiumsupport/knowledge-center/ec2-instance-access-s3-bucket/

[3] Bitbucket Variables and Secrets - https://support.atlassian.com/bitbucket-cloud/docs/variables-and-secrets/
