
Configuring Logstash to Decode Its Own Event Format JSON

I have a Java log file for a web app that was created using SLF4J, Logback, and the logstash-logback-encoder, for use with Logstash 1.4.2. While various configurations have succeeded in retrieving data from the logs, none has actually resulted in proper JSON being returned. Based on every guide I have read, the following configuration should work, but it does not.

Sample of Log

{"@timestamp":"2015-02-04T00:03:43.178+00:00","@version":1,"message":"No token was found, creating new token.","logger_name":"com.company.ws.service.AuthService","thread_name":"ajp-nio-8009-exec-10","level":"INFO","level_value":20000,"HOSTNAME":"development.company.com"}
{"@timestamp":"2015-02-04T00:03:43.199+00:00","@version":1,"message":"5f8aaebd-4274-4f00-a2eb-7b2350231ef2","logger_name":"com.company.jaxrs.provider.ParamTest","thread_name":"ajp-nio-8009-exec-1","level":"INFO","level_value":20000,"HOSTNAME":"development.company.com"}
{"@timestamp":"2015-02-04T00:03:43.199+00:00","@version":1,"message":"36","logger_name":"com.company.jaxrs.provider.ParamTest","thread_name":"ajp-nio-8009-exec-1","level":"INFO","level_value":20000,"HOSTNAME":"development.company.com"}
{"@timestamp":"2015-02-04T00:03:43.218+00:00","@version":1,"message":"5f8aaebd-4274-4f00-a2eb-7b2350231ef2","logger_name":"com.company.jaxrs.provider.ParamTest","thread_name":"ajp-nio-8009-exec-3","level":"INFO","level_value":20000,"HOSTNAME":"development.company.com"}
{"@timestamp":"2015-02-04T00:03:43.218+00:00","@version":1,"message":"36","logger_name":"com.company.jaxrs.provider.ParamTest","thread_name":"ajp-nio-8009-exec-3","level":"INFO","level_value":20000,"HOSTNAME":"development.company.com"}
{"@timestamp":"2015-02-04T00:03:43.218+00:00","@version":1,"message":"135a2411-ac96-492b-94e9-df6b65974f9f","logger_name":"com.company.jaxrs.provider.ParamTest","thread_name":"ajp-nio-8009-exec-3","level":"INFO","level_value":20000,"HOSTNAME":"development.company.com"}
{"@timestamp":"2015-02-04T00:03:43.218+00:00","@version":1,"message":"36","logger_name":"com.company.jaxrs.provider.ParamTest","thread_name":"ajp-nio-8009-exec-3","level":"INFO","level_value":20000,"HOSTNAME":"development.company.com"}
{"@timestamp":"2015-02-04T00:03:43.219+00:00","@version":1,"message":"is string","logger_name":"com.company.jaxrs.parameter.RestParameterFactory","thread_name":"ajp-nio-8009-exec-3","level":"INFO","level_value":20000,"HOSTNAME":"development.company.com"}
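For reference, log lines in this shape are what logstash-logback-encoder's LogstashEncoder produces when attached to a file appender. A minimal logback.xml sketch (the file path here is assumed to match the logstash-forwarder config below; the appender name is arbitrary):

<configuration>
  <appender name="JSON_FILE" class="ch.qos.logback.core.FileAppender">
    <file>/company/apache-tomcat-8.0.9/logs/vhost1.log</file>
    <!-- Writes one JSON object per line, with @timestamp, @version,
         message, logger_name, thread_name, level, level_value, HOSTNAME -->
    <encoder class="net.logstash.logback.encoder.LogstashEncoder" />
  </appender>
  <root level="INFO">
    <appender-ref ref="JSON_FILE" />
  </root>
</configuration>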

/etc/logstash/conf.d/01-lumberjack-input.conf

input {

 lumberjack {
    port => 5000
    type => "logs"
    ssl_certificate => "/etc/pki/tls/certs/logstash-forwarder.crt"
    ssl_key => "/etc/pki/tls/private/logstash-forwarder.key"
  }

}

/etc/logstash/conf.d/10-syslog.conf

filter {
  if [type] == "syslog" {
    grok {
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
      add_field => [ "received_at", "%{@timestamp}" ]
      add_field => [ "received_from", "%{host}" ]
    }
    syslog_pri { }
    date {
      match => [ "syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }

  else if [type] == "json" {

        source => "message"

  }

/etc/logstash/conf.d/30-lumberjack-output.conf

output {
  elasticsearch { host => localhost }
  stdout { codec => rubydebug }
}

/etc/logstash-forwarder (other machine)

{
  "network": {
    "servers": [ "utility.company.com:5000" ],
    "timeout": 15,
    "ssl ca": "/etc/pki/tls/certs/logstash-forwarder.crt"
  },
  "files": [
    {
      "paths": ["/company/apache-tomcat-8.0.9/logs/vhost1.log"],
      "fields": { "type": "json"  }

    }
   ]
}

The best results I have been able to get back in Kibana (when anything comes back at all) look something like this:

{
  "_index": "logstash-2015.02.04",
  "_type": "json",
  "_id": "8l1rDYTZSceBCklFxAuvAg",
  "_score": null,
  "_source": {
    "message": "{\"@timestamp\":\"2015-02-04T06:03:18.794+00:00\",\"@version\":1,\"message\":\"Attribute Count 1\",\"logger_name\":\"com.company.ws.service.ReportSearchService\",\"thread_name\":\"ajp-nio-8009-exec-1\",\"level\":\"INFO\",\"level_value\":20000,\"HOSTNAME\":\"development.company.com\"}",
    "@version": "1",
    "@timestamp": "2015-02-04T06:13:10.685Z",
    "type": "json",
    "file": "/company/apache-tomcat-8.0.9/logs/vhost1.log",
    "host": "development.company.com",
    "offset": "4907321"
  },
  "sort": [
    1423030390685,
    1423030390685
  ]
}
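For comparison, once the embedded JSON is actually decoded, its fields should be promoted to the top level of the event instead of sitting inside an escaped "message" string. An illustrative sketch of what _source would then roughly contain, based on the sample event above:

{
  "message": "Attribute Count 1",
  "@timestamp": "2015-02-04T06:03:18.794+00:00",
  "@version": 1,
  "logger_name": "com.company.ws.service.ReportSearchService",
  "thread_name": "ajp-nio-8009-exec-1",
  "level": "INFO",
  "level_value": 20000,
  "HOSTNAME": "development.company.com",
  "type": "json",
  "file": "/company/apache-tomcat-8.0.9/logs/vhost1.log",
  "host": "development.company.com"
}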

Obviously, the JSON conversion logic is not functioning properly, so what am I missing?

The ELK stack was configured using this guide.

This looks very suspicious:

else if [type] == "json" {

      source => "message"

}

If this really is what's in your config file, I don't understand why Logstash doesn't complain about it. This is what it should look like:

else if [type] == "json" {
  json {
    source => "message"
  }
}
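In context, the corrected branch sits inside the filter block of 10-syslog.conf. A minimal sketch showing just the JSON path (note that the filter block itself also needs a closing brace, which is missing in the config as posted):

filter {
  if [type] == "json" {
    # Parse the JSON string in the "message" field and merge
    # its keys into the top level of the event
    json {
      source => "message"
    }
  }
}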

Alternatively, if all messages received via the lumberjack protocol are JSON messages you can use the json codec for your lumberjack input.
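That variant would look something like this, based on the 01-lumberjack-input.conf above; with the codec in place, the json filter branch is no longer needed:

input {
  lumberjack {
    port => 5000
    type => "logs"
    ssl_certificate => "/etc/pki/tls/certs/logstash-forwarder.crt"
    ssl_key => "/etc/pki/tls/private/logstash-forwarder.key"
    # Decode each incoming line as JSON before it enters the pipeline
    codec => json
  }
}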
