
Processing multiple JSON events using Logstash

I have the following setup: logs from the application instances are forwarded to Elasticsearch using Filebeat and Logstash.

   Apps
+--------+
| +--------+
| | +--------+     +----------+    +----------+
| | |   +--- |     |          |    |          |
+ | |   |file| --> | logstash | -->| elastic  |
  + |   |beat|     |   (1)    |    | search   |
    +--------+     +----------+    +----------+
                         |               |
             (not avail) X               | (query)
                         V               |
                   +----------+          V
                   |          |      +------+
                   | logstash |<-----| Json |
                   |   (2)    |      | file |
                   +----------+      +------+

I want to test the log processing in logstash-2, but I cannot currently set up the forwarding from logstash-1. So I tried the following: I queried Elasticsearch, retrieved the documents' _source fields, and got some JSON documents like this:

{
 "@timestamp": <timestamp>,
 "@version": "1",
 "requestMethod": "PUT",
 "requestUri": "/api/endopoint",
 "servername": "myserver" 
 ....  many other fields
}
{
 "@timestamp": <timestamp>,
 "@version": "1",

}
... many other json objects

My question is: how can I process these JSON documents from the Elasticsearch query using Logstash?
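One option I am considering, to avoid the intermediate file altogether, is the elasticsearch input plugin, which can pull the _source of matching documents straight from the cluster. A rough sketch of what I have in mind (the host and index name are just placeholders for my setup):

input {
  elasticsearch {
    hosts => ["localhost:9200"]                   # placeholder host
    index => "app-logs"                           # placeholder index name
    query => '{ "query": { "match_all": {} } }'
    docinfo => true                               # keep index/type/id in @metadata
  }
}

output {
  stdout { codec => rubydebug }
}

But for now I would still like to understand how to parse the JSON file I already dumped.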

I have tried to process them using the multiline codec followed by the json filter, but cannot manage to make it work. Here is an attempt:

input {
  file {
    path => "events.json"
    sincedb_path => "/dev/null"
    start_position => beginning
    codec => multiline {
       pattern => "^\}"    #end of each json object
       what => "previous"
    }
  }
}

filter {
  json {
    source => "event"
  }
}

output {
  stdout {}
}

After some additional research, I realized the multiline codec configuration was wrong. I have fixed it, and now I have the whole event in the message field.

input {
  file {
    path => "events.json"     # note: the file input expects an absolute path here
    sincedb_path => "/dev/null"
    start_position => beginning
    codec => multiline {
        pattern => "^\}"      # a line starting with "}" closes each json object
        negate => true        # lines that do NOT match the pattern...
        what => "next"        # ...belong with the following line, so each object becomes one event
    }
  }
}

filter {
  json {
    source => "message"
  }

  mutate {
    remove_field => ["message"]   # drop the raw multiline text once it has been parsed
  }
}

output {
  stdout {}
}
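As an aside, if I re-export the documents as newline-delimited JSON (one object per line) instead of pretty-printed blocks, I think the plain json codec on the file input would parse each line directly, with no multiline codec and no json filter. Roughly (the path is a placeholder):

input {
  file {
    path => "/path/to/events.ndjson"   # placeholder; the file input needs an absolute path
    sincedb_path => "/dev/null"
    start_position => beginning
    codec => json                      # each line is parsed as one JSON event
  }
}

output {
  stdout {}
}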
