
Sending JSON format logs to Kibana using Filebeat, Logstash and Elasticsearch?

I have logs like this:

{"logId":"57aaf6c8d32fb","clientIp":"127.0.0.1","time":"03:11:29 pm","uniqueSubId":"57aaf6c98963b","channelName":"JSPC","apiVersion":"v1","modulName":null,"actionName":"apiRequest","typeOfError":"","statusCode":"","message":"In Auth","exception":"In Auth","logType":"Info"}

{"logId":"57aaf6c8d32fb","clientIp":"127.0.0.1","time":"03:11:29 pm","uniqueSubId":"57aaf6c987206","channelName":"JSPC","apiVersion":"v2","modulName":null,"actionName":"performV2","typeOfError":"","statusCode":"","message":"in inbox api v2 5","exception":"in inbox api v2 5","logType":"Info"}
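Before wiring up the pipeline, it can be worth confirming that the file really is newline-delimited JSON (one complete object per line), since that is what any JSON parsing downstream expects. A minimal Python sanity check (the sample line below is trimmed from the logs above) might look like this:

```python
import json

# Trimmed sample line from the logs above; each line in the file
# must be one complete JSON object (newline-delimited JSON).
line = '{"logId":"57aaf6c8d32fb","clientIp":"127.0.0.1","apiVersion":"v1","logType":"Info"}'

# json.loads raises JSONDecodeError if the line is not valid JSON.
event = json.loads(line)
print(event["logId"], event["logType"])  # 57aaf6c8d32fb Info
```

If any line fails to parse here, it will also fail to parse in Logstash or Filebeat.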

I want to push them to Kibana. I am using Filebeat to send the data to Logstash, with the following configuration:

filebeat.yml

### Logstash as output
logstash:
  # The Logstash hosts
  hosts: ["localhost:5044"]

  # Number of workers per Logstash host.
  #worker: 1

Now, using the following configuration, I want to change the codec type:

input {
  beats {
    port => 5000
    tags => "beats"
    codec => "json_lines"
    #ssl  => true
    #ssl_certificate => "/opt/filebeats/logs.example.com.crt"
    #ssl_key => "/opt/filebeats/logs.example.com.key"
  }

  syslog {
    type => "syslog"
    port => "5514"
  }
}

But I still get the logs in string format:

"message": "{\"logId\":\"57aaf6c96224b\",\"clientIp\":\"127.0.0.1\",\"time\":\"03:11:29 pm\",\"channelName\":\"JSPC\",\"apiVersion\":null,\"modulName\":null,\"actionName\":\"404\",\"typeOfError\":\"EXCEPTION\",\"statusCode\":0,\"message\":\"404 page encountered http:\/\/localjs.com\/uploads\/NonScreenedImages\/profilePic120\/16\/29\/15997002iicee52ad041fed55e952d4e4e163d5972ii4c41f8845105429abbd11cc184d0e330.jpeg\",\"logType\":\"Error\"}",

Please help me solve this.

To parse JSON log lines in Logstash that were sent from Filebeat, you need to use a json filter instead of a codec. This is because Filebeat sends its data as JSON, and the contents of your log line are contained in the message field.

Logstash config:

input {
  beats {
    port => 5044
  }
}

filter {
  if [tags][json] {
    json {
      source => "message"
    }
  }
}

output {
  stdout { codec => rubydebug { metadata => true } } 
}
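To see why the json filter (rather than a codec) is the right tool here, the following rough Python sketch, an approximation rather than Logstash's actual implementation, shows what the filter does: the raw log line arrives wrapped in the message field of the Filebeat event, and the filter parses that string and merges the resulting keys into the event:

```python
import json

def apply_json_filter(event, source="message"):
    """Rough sketch of the Logstash json filter: parse the JSON string
    held in `source` and merge its keys into the top level of the event."""
    merged = dict(event)
    merged.update(json.loads(event[source]))
    return merged

# A Filebeat event wraps the raw log line in "message" (other fields trimmed).
beat_event = {
    "tags": ["json"],
    "message": '{"logId":"57aaf6c8d32fb","clientIp":"127.0.0.1","logType":"Info"}',
}

# Apply the filter only to tagged events, mirroring the conditional above.
if "json" in beat_event["tags"]:
    beat_event = apply_json_filter(beat_event)

print(beat_event["logId"])  # 57aaf6c8d32fb
```

After this step the parsed fields (logId, clientIp, logType, ...) are top-level event fields that Elasticsearch can index individually, instead of one opaque string.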

Filebeat config:

filebeat:
  prospectors:
    - paths:
        - my_json.log
      fields_under_root: true
      fields:
        tags: ['json']
output:
  logstash:
    hosts: ['localhost:5044']

In the Filebeat config, I added a "json" tag to the event so that the json filter can be conditionally applied to the data.

Filebeat 5.0 is able to parse the JSON without the use of Logstash, but it is still an alpha release at the moment. The blog post titled Structured logging with Filebeat demonstrates how to parse JSON with Filebeat 5.0.

From Filebeat 5.x, you can do it without using Logstash.

Filebeat config:

filebeat.prospectors:
- input_type: log
  paths: ["YOUR_LOG_FILE_DIR/*"]
  json.message_key: logId
  json.keys_under_root: true

output.elasticsearch:
  hosts: ["<HOSTNAME:PORT>"]
  template.name: filebeat
  template.path: filebeat.template.json

Filebeat is more lightweight than Logstash. Also, even if you need to index into Elasticsearch version 2.x, you can use this feature of Filebeat 5.x. A real example can be found here.

I've scoured the internet for the exact same problem you are having and tried various suggestions, including those above. However, none of them helped, so I did it the old-fashioned way. I went to the Elasticsearch documentation on Filebeat configuration,

and all that was required was the following (no filter config needed in Logstash):

Filebeat config:

filebeat.prospectors:
- input_type: log
  document_type: #whatever your type is, this is optional
  json.keys_under_root: true
  paths:
    - #your path goes here

keys_under_root copies nested JSON keys to the top level of the output document.
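The effect of keys_under_root can be illustrated with a short Python sketch (an approximation of the two event shapes, not Filebeat's actual code):

```python
import json

raw = '{"logId":"57aaf6c8d32fb","clientIp":"127.0.0.1"}'
decoded = json.loads(raw)

# json.keys_under_root: false -- the decoded object is nested under a "json" key.
nested_event = {"source": "my_json.log", "json": decoded}

# json.keys_under_root: true -- the decoded keys are copied to the top level.
flat_event = {"source": "my_json.log", **decoded}

print("logId" in nested_event)  # False (it lives at nested_event["json"]["logId"])
print("logId" in flat_event)    # True
```

With the flat shape, Kibana can filter and aggregate on fields like logId directly, instead of addressing them through a json prefix.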

My Filebeat version is 5.2.2.
