
Does Fluentd support log rotation for file output?

The current setup I am working with is a Docker Compose stack with multiple containers. These containers send their logs to a logging container (inside the Compose stack) running the Fluentd daemon. The Fluentd configuration consists of one in_forward source that collects the logs and writes them to separate files, depending on the container. My Fluentd configuration file looks similar to this:

<source>
  @type forward
  port 24224
  bind 0.0.0.0
</source>

<match container1>
   @type copy
   <store>
     @type file
     path /fluentd/log/container1.*.log
     format single_value
     message_key "log"
   </store>
</match>

...

My docker-compose.yml file looks something like this:

version: '3'

services:

  container1:
    build: ./container1
    container_name: "container1" 
    depends_on:
     - "logger" 
    logging:
      driver: "fluentd"
      options:
        tag: container1  
    networks:
      static-net:
        ipv4_address: 172.28.0.4  


  ...


  logger:
    build: ./logger
    container_name: "logger"
    ports:
     - "24224:24224"
     - "24224:24224/udp"
    volumes:
     - ./logger/logs:/fluentd/log
    networks:
      static-net:
        ipv4_address: 172.28.0.5          

networks:
  static-net:
    ipam:
      driver: default
      config:
       - subnet: 172.28.0.0/16

Everything works as expected, but I would ideally like Fluentd to keep only a certain number of log files. I can change the size of the log files by configuring the chunk_limit_size parameter in a buffer section. However, even with that option, I do not want Fluentd writing an endless number of files. The buffer_queue_limit and overflow_action parameters in the buffer configuration do not seem to affect anything. This application will run continuously once deployed, so log rotation is a necessity. Several questions I have:

  1. Does Fluentd support log rotation when writing logs to files? If so, what parameters do I set in the Fluentd configuration file?
  2. If not, can I configure Docker in such a way that I can utilize the log rotation of its json-file logging driver instead of Fluentd's file output?
  3. And if that is not possible, is there a way to add log rotation to Fluentd via a plugin, or perhaps in the Fluentd Docker container itself (or a sidecar container)?
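For reference, a buffer section like the one described above might look roughly like this (a sketch assuming Fluentd v1 buffer syntax; the values are illustrative). Note that these parameters limit chunk size and the in-memory/queued buffer, not how many flushed files accumulate on disk:

```
<match container1>
  @type copy
  <store>
    @type file
    path /fluentd/log/container1.*.log
    format single_value
    message_key "log"
    <buffer>
      # caps the size of each chunk (and thus each flushed file),
      # but does not limit how many files accumulate on disk
      chunk_limit_size 5m
      # these constrain the buffer queue before flushing,
      # not the files that have already been written
      queued_chunks_limit_size 32
      overflow_action drop_oldest_chunk
    </buffer>
  </store>
</match>
```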

When you use the fluentd logging driver for Docker, there are no container log files; there are only the Fluentd logs, and to rotate them you can use this link. If you want Docker to keep the logs and rotate them itself, then you have to change your stack file from:

    logging:
      driver: "fluentd"
      options:
        tag: container1  

to

    logging:
      driver: "json-file"
      options:
        max-size: "5m"   # max size of each log file
        max-file: "2"    # how many files Docker should keep
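If you prefer to apply this rotation to every container rather than per service, the same json-file options can also be set globally in the Docker daemon configuration (typically /etc/docker/daemon.json); note this requires a daemon restart and only affects newly created containers:

```json
{
  "log-driver": "json-file",
  "log-opts": {
    "max-size": "5m",
    "max-file": "2"
  }
}
```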

In Fluentd you then have to use the in_tail plugin instead of forward (Fluentd needs read access to the log files under /var/lib/docker/containers/*/*-json.log ):

<source>
  @type tail
  read_from_head true
  pos_file fluentd-docker.pos
  path /var/lib/docker/containers/*/*-json.log
  tag docker.*
  format json
</source>
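For the logger container to tail those files, the Docker log directory on the host has to be mounted into it. In the compose file above, that could look roughly like this (the read-only mount is my assumption; adjust paths to your setup):

```yaml
  logger:
    build: ./logger
    volumes:
     - ./logger/logs:/fluentd/log
     # give Fluentd read access to the Docker container logs on the host
     - /var/lib/docker/containers:/var/lib/docker/containers:ro
```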
