

AWS CloudWatch | Export logs to EC2 server

I have the CloudWatch service monitoring logs for my running EC2 instances, but the CloudWatch web console does not seem to have a button that lets you download/export the log data.

Any ideas how I can achieve this through the CLI or GUI?

Programmatically, using boto3 (Python):

import boto3

log_client = boto3.client('logs')
result_1 = log_client.describe_log_streams(logGroupName='<NAME>')

(I don't know what log group names for EC2 instances look like; for Lambda they are of the form '/aws/lambda/FuncName'. Try grabbing the names you see in the console.)

result_1 contains two useful keys: logStreams (the result you want) and nextToken (for pagination; I'll let you look up its usage).

Now result_1['logStreams'] is a list of objects, each containing a logStreamName. Also useful are firstEventTimestamp and lastEventTimestamp.
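The nextToken pagination mentioned above can be sketched as a small helper. (The function name and the injectable `client` argument are my own convention, not part of boto3; this is a sketch, not tested against a live account.)

```python
def list_all_log_streams(log_group_name, client=None):
    """Collect every log stream in a group, following nextToken pagination."""
    if client is None:
        import boto3  # deferred so a stub client can be injected for testing
        client = boto3.client('logs')
    streams = []
    kwargs = {'logGroupName': log_group_name}
    while True:
        resp = client.describe_log_streams(**kwargs)
        streams.extend(resp['logStreams'])
        if 'nextToken' not in resp:
            break  # no more pages
        kwargs['nextToken'] = resp['nextToken']
    return streams
```

Passing the previous response's nextToken back in is what advances the pagination; when the response omits nextToken, you have seen every stream.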

Now that you have log stream names, you can use

log_client.get_log_events(logGroupName='<name>', logStreamName='<name>')

The response contains nextForwardToken and nextBackwardToken for pagination, and events for the log events you want. Each event contains a timestamp and a message.
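Reading a whole stream with those tokens can be sketched like this (the helper name and `client` parameter are mine; note that get_log_events signals the end of the stream by returning the same nextForwardToken again, so you stop when the token stops changing):

```python
def fetch_all_events(log_group_name, log_stream_name, client=None):
    """Read a whole log stream oldest-first, following nextForwardToken."""
    if client is None:
        import boto3  # deferred so a stub client can be injected for testing
        client = boto3.client('logs')
    events = []
    kwargs = {
        'logGroupName': log_group_name,
        'logStreamName': log_stream_name,
        'startFromHead': True,  # oldest events first
    }
    prev_token = None
    while True:
        resp = client.get_log_events(**kwargs)
        events.extend(resp['events'])
        token = resp['nextForwardToken']
        if token == prev_token:
            break  # the token repeats once the stream is exhausted
        prev_token = token
        kwargs['nextToken'] = token
    return events
```

From there, writing each event's message to a local file on your EC2 box is a plain loop over the returned list.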

I'll leave it to you to look up the API to see what other parameters might be useful. By the way, the console will let you stream your logs to an S3 bucket or to AWS's ElasticSearch service. ElasticSearch is a joy to use, and Kibana's UI is intuitive enough that you can get results even without learning its query language.

You can use the console or the AWS CLI to download CloudWatch logs to Amazon S3. You do need to know the log group name, the from/to timestamps in the log, and the destination bucket and prefix. Amazon recommends a separate S3 bucket for your logs. Once you have a bucket, you create an export task: in the console, go to Navigation - Logs - select your log group - Actions - Export data to S3 - fill in the details for your export - select Export data. Amazon's documentation explains it pretty well at http://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/S3Export.html, and CLI instructions are there too if you wanted to use that. With the CLI you could also script your export, but you would have to define the variables somehow so you don't overwrite an existing export.
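Scripting that export is also possible from boto3 via create_export_task. A sketch (the helper name and the task-naming scheme are my own; embedding the time window in the task name is one way to address the "don't overwrite an existing export" concern above):

```python
def export_log_group_to_s3(log_group_name, bucket, prefix,
                           from_ms, to_ms, client=None):
    """Kick off a CloudWatch Logs -> S3 export task for one time window."""
    if client is None:
        import boto3  # deferred so a stub client can be injected for testing
        client = boto3.client('logs')
    # Embed the time window in the task name so repeated runs over
    # different windows don't collide with an existing export.
    task_name = '{}-{}-{}'.format(
        log_group_name.strip('/').replace('/', '-'), from_ms, to_ms)
    return client.create_export_task(
        taskName=task_name,
        logGroupName=log_group_name,
        fromTime=from_ms,    # epoch milliseconds
        to=to_ms,            # epoch milliseconds
        destination=bucket,  # bucket must grant CloudWatch Logs write access
        destinationPrefix=prefix,
    )
```

Note the export runs asynchronously on AWS's side; the bucket policy must allow the CloudWatch Logs service to write to it, as described in the S3Export documentation linked above.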

If this is part of your overall AWS disaster recovery planning, you might want to check out some tips and best practices, such as Amazon's white paper on AWS disaster recovery and NetApp's discussion of using the cloud for disaster recovery.

