
Docker Fluentd Logging Driver for Multiline Logs

I am trying to create a centralized logging system for a Docker environment using fluentd. Currently, I am able to send the Docker logs to fluentd using the fluentd Docker logging driver, which is a much cleaner solution compared to reading the Docker log files with the in_tail method. However, I am currently facing an issue with multi-line logs.
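For reference, a minimal sketch of the setup described here, assuming fluentd exposes the default forward input on port 24224 on the Docker host (the address, tag and image name are placeholders):

    # fluent.conf - input that receives records from the Docker logging driver
    <source>
      @type forward
      port 24224
      bind 0.0.0.0
    </source>

    # start a container with its stdout/stderr routed to that fluentd instance
    docker run --log-driver=fluentd \
        --log-opt fluentd-address=localhost:24224 \
        --log-opt tag="docker.{{.Name}}" \
        my-app-image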

[screenshot: multi-line log entries split into separate records and arriving out of order]

As you can see from the picture above, the multi-line log entries are out of order, which is very confusing for users. Is there any way this can be solved?

Thanks.

Cw

Using the fluent-plugin-concat plugin helped me fix the above problem.

Adding these lines to fluent.conf:

    <filter **>
      @type concat
      key log
      stream_identity_key container_id
      multiline_start_regexp /^\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}\.\d{3}/
      multiline_end_regexp /^\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}\.\d{3}/
    </filter>

My regular expression checks for a date-time stamp in the log field, since each line starts with a date and timestamp (pay attention to "log":"2017-09-21 15:03:27.289" in the records below):

2017-09-21T15:03:27Z    tag     {"container_id":"11b0d89723b9c812be65233adbc51a71507bee04e494134258b7af13f089087f","container_name":"/bel_osc.1.bc1k2z6lke1d7djeq5s28xjyl","source":"stdout","log":"2017-09-21 15:03:27.289  INFO 1 --- [           main] org.apache.catalina.core.StandardEngine  : Starting Servlet Engine: Apache Tomcat/8.5.6"}
2017-09-21T15:03:28Z    tag     {"container_id":"11b0d89723b9c812be65233adbc51a71507bee04e494134258b7af13f089087f","container_name":"/bel_osc.1.bc1k2z6lke1d7djeq5s28xjyl","source":"stdout","log":"2017-09-21 15:03:28.191  INFO 1 --- [ost-startStop-1] o.a.c.c.C.[Tomcat].[localhost].[/]       : Initializing Spring embedded WebApplicationContext"}

Also, I had to add the line below to my Dockerfile to install the plugin:

RUN ["gem", "install", "fluent-plugin-concat", "--version", "2.1.0"] 
#Works with Fluentd v0.14-debian
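For completeness, a minimal sketch of the surrounding Dockerfile, assuming the official fluent/fluentd v0.14 Debian base image and a fluent.conf sitting next to the Dockerfile (the base image tag and config path are my assumptions, not from the answer):

    FROM fluent/fluentd:v0.14-debian

    # install the concat plugin used by the <filter> block above
    RUN ["gem", "install", "fluent-plugin-concat", "--version", "2.1.0"]

    # copy the configuration that contains the concat filter
    COPY fluent.conf /fluentd/etc/fluent.conf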

Though this regular expression doesn't work well when an exception occurs, it is still much better than before. Fluentd link, for reference.
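One possible variant, not from the original answer: fluent-plugin-concat can also work with only the start regexp and flush an unfinished event after a timeout, which keeps stack-trace lines attached to the timestamped line that precedes them. A hedged sketch (the flush_interval value, the @OUTPUT label and the stdout output are assumptions):

    <filter **>
      @type concat
      key log
      stream_identity_key container_id
      multiline_start_regexp /^\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}\.\d{3}/
      # no multiline_end_regexp: non-matching lines (e.g. stack-trace frames)
      # are appended to the current event until the next start line or timeout
      flush_interval 5
      timeout_label @OUTPUT
    </filter>

    <label @OUTPUT>
      # events flushed by the timeout are re-emitted under this label,
      # so they need their own output (stdout here is just a placeholder)
      <match **>
        @type stdout
      </match>
    </label>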

Take a look at multiline parsing in their documentation: http://docs.fluentd.org/articles/parser-plugin-overview#

You basically have to specify a regex that matches the beginning of a new log message; that enables fluentd to aggregate multi-line log events into a single message.

An example for a typical Java stack trace from their docs:

    format multiline
    format_firstline /\d{4}-\d{1,2}-\d{1,2}/
    format1 /^(?<time>\d{4}-\d{1,2}-\d{1,2} \d{1,2}:\d{1,2}:\d{1,2}) \[(?<thread>.*)\] (?<level>[^\s]+)(?<message>.*)/
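For context, those format directives go inside an input plugin's configuration. A minimal sketch, assuming an in_tail source reading a Java application log (path, pos_file and tag are placeholders, not from the docs):

    <source>
      @type tail
      path /var/log/app/application.log
      pos_file /var/log/fluentd/application.log.pos
      tag java.app
      format multiline
      format_firstline /\d{4}-\d{1,2}-\d{1,2}/
      format1 /^(?<time>\d{4}-\d{1,2}-\d{1,2} \d{1,2}:\d{1,2}:\d{1,2}) \[(?<thread>.*)\] (?<level>[^\s]+)(?<message>.*)/
    </source>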

I know this is not an "answer" to the fluentd question, but this guide solves the problem with logstash: http://www.labouisse.com/how-to/2015/09/14/elk-and-docker-1-8

JSON support can be added by adding

    json {
        source => "log_message"
        target => "json"
    }

to his filter after parsing a log line
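A hedged sketch of how that snippet sits inside a full logstash filter block; the grok pattern below only illustrates the "parsing a log line" step and is not the exact pattern from the linked guide:

    filter {
      # first split the raw line into a timestamp and the remaining payload
      grok {
        match => { "message" => "%{TIMESTAMP_ISO8601:log_timestamp} %{GREEDYDATA:log_message}" }
      }
      # then parse the payload as JSON into its own field tree
      json {
        source => "log_message"
        target => "json"
      }
    }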

I never found a solution for fluentd, so I went with this solution instead.

Updated link
