
ELK - Filtering data with Logstash

I am experimenting with the ELK stack, and so far so good. I have a small issue that I am trying to resolve. I have a field named 'message' coming from Filebeat. Inside that field is a string with log data. Sometimes the message field contains a line like this:

successfully saved with IP address: [142.93.111.8] user: [testuser@some.com]

I would like to apply a filter so that Logstash sends this to Elasticsearch:

successfully saved with IP address: [] user: [testuser@some.com]

This is what I currently have in the Logstash configuration:

input {
  beats {
    port  => "5043"
    codec => json
  }
}

filter {
  if [message] =~ /IP address:/ {
    mutate { add_tag => "whats happening" }
  }
}

output {
  elasticsearch {
    hosts => [ "localhost:9200" ]
  }
}

Something else caught my attention: the ELK stack can filter text at both the Filebeat level and the Logstash level. Which is the more common scenario? Is filtering in Filebeat more suitable?
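For comparison, Filebeat itself can do lightweight line-level filtering with its `include_lines`/`exclude_lines` input options (a minimal sketch of a hypothetical `filebeat.yml`; field transformations such as the masking discussed here are usually left to Logstash):

filebeat.inputs:
  - type: log
    paths:
      - /var/log/app/*.log
    # Ship only lines that mention an IP address; drop everything else
    include_lines: ['IP address:']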

I have found the correct solution for my case:

mutate {
  gsub => ["message", "address: \[(.*?)]", "address:[not indexable]"]
}
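The `gsub` above is a regex substitution: the non-greedy `(.*?)` matches everything between `address: [` and the first closing `]`, and the whole match is replaced with the literal placeholder. A quick Python sketch (just to illustrate the same pattern outside of Logstash) shows the effect on the sample line:

```python
import re

message = "successfully saved with IP address: [142.93.111.8] user: [testuser@some.com]"

# Same pattern and replacement as the Logstash mutate/gsub above;
# the non-greedy (.*?) stops at the first closing bracket, so the
# user field is left untouched.
masked = re.sub(r"address: \[(.*?)]", "address:[not indexable]", message)

print(masked)
# → successfully saved with IP address:[not indexable] user: [testuser@some.com]
```

Note that the replacement drops the space after `address:`; keep the space in the replacement string if you want the original spacing preserved.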

Hopefully someone will find it useful.
