
AWS ECS Fargate, send logs to multiple destinations (CloudWatch Logs and Elasticsearch)

I have some containers deployed in ECS Fargate that send their logs to CloudWatch Logs. Now I also want to send the logs to a custom Elasticsearch instance (not Amazon Elasticsearch Service). I have read some information about FireLens, but it is not clear to me whether the logs will still be sent to CloudWatch Logs as well. Also, all the documentation seems to refer to Amazon Elasticsearch Service, not a self-hosted Elasticsearch instance. Do you have any recommendation or snippet for how to do that? Additionally, if I wanted to send the logs to a third or fourth destination, what would be the approach? Ideally in the cheapest possible way, avoiding Firehose or Lambda tricks.

Adding an answer because I don't have the reputation points to comment. Anyway, I suggest you review this AWS blog post on custom FireLens configuration to send ECS Fargate logs to multiple destinations.

First you'll need to build a custom FireLens image and make it available on ECR:

FROM amazon/aws-for-fluent-bit:stable
ADD logDestinations.conf /logDestinations.conf
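Building and pushing that image to ECR typically looks like the following (the account ID, region, and repository name are placeholders for your own values, and an ECR repository is assumed to already exist):

```shell
# Authenticate Docker against your ECR registry (account ID and region are placeholders)
aws ecr get-login-password --region us-east-1 | \
  docker login --username AWS --password-stdin 123456789012.dkr.ecr.us-east-1.amazonaws.com

# Build the custom FireLens image from the Dockerfile above
docker build -t custom-firelens .

# Tag and push it to the ECR repository
docker tag custom-firelens:latest 123456789012.dkr.ecr.us-east-1.amazonaws.com/custom-firelens:latest
docker push 123456789012.dkr.ecr.us-east-1.amazonaws.com/custom-firelens:latest
```

The resulting image URI is what you reference in the log-router container of your task definition.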

The custom image should contain a .conf file with your log destinations. Here's an example:

[OUTPUT]
    Name                cloudwatch
    Match               YourContainerName*
    region              us-east-1
    log_group_name      app-container
    log_stream_prefix   fluent-bit
    auto_create_group   true
[OUTPUT]
    Name              datadog
    Match             *
    Host              http-intake.logs.datadoghq.com
    TLS               on
    compress          gzip
    apikey            <DATADOG_API_KEY>
    dd_service        <APPLICATION_SERVICE>
    dd_source         <SOURCE>
    dd_message_key    log
    dd_tags           env:dev,<TAG_KEY>:<TAG_VALUE>
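To cover the original question (a self-hosted Elasticsearch instance rather than Amazon Elasticsearch Service), you can append Fluent Bit's es output plugin to the same file. The host, port, index, and credentials below are placeholders; a minimal sketch:

```ini
[OUTPUT]
    Name                es
    Match               *
    Host                elasticsearch.example.com
    Port                9200
    Index               ecs-logs
    TLS                 On
    HTTP_User           <ES_USER>
    HTTP_Passwd         <ES_PASSWORD>
    Suppress_Type_Name  On
```

Each additional [OUTPUT] section fans the same log stream out to one more destination, so a third or fourth destination is just another section in this file — no Firehose or Lambda required. (Suppress_Type_Name On is needed for Elasticsearch 8+, which removed mapping types.)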

You then add FireLens as a new container in your TaskDefinition:

ContainerDefinitions:
# ... other containers in your task (e.g. your application, the Datadog agent, etc.)
- Name: log-router
  Essential: true
  Image: amazon/customImage:latest
  Cpu: 100
  Memory: 256
  FirelensConfiguration:
    Type: fluentbit
    Options:
      enable-ecs-log-metadata: true
      config-file-type: file
      config-file-value: "/logDestinations.conf"
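One detail the snippet above leaves implicit: the application container itself must use the awsfirelens log driver so that its stdout/stderr is routed through the log-router container. A minimal sketch of that container definition (the name and image are placeholders):

```yaml
- Name: app-container
  Image: 123456789012.dkr.ecr.us-east-1.amazonaws.com/my-app:latest
  Essential: true
  LogConfiguration:
    LogDriver: awsfirelens
```

Because config-file-type is set to file, all destinations come from /logDestinations.conf baked into the image, so no Options are needed under LogConfiguration here.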

I tested this configuration and it works well. The only thing I had to worry about was setting up my image in a separate pipeline, since it's not possible to fetch the logDestinations.conf file from S3 when using ECS on Fargate.

Reference: https://aws.amazon.com/pt/premiumsupport/knowledge-center/ecs-container-log-destinations-fargate/
