
ELK LDAP log filtering

Two things: Our logs look like this -

May 11 06:51:31 ldap slapd[6694]: conn=1574001 op=1 SRCH base="cn=s_02,ou=users,o=meta" scope=0 deref=0 filter="(...)"

I need to 1) take the timestamp and set it as the left-hand "Time" column in Kibana's Discover panel, and 2) extract the number after conn= into its own field so the entries can be sorted numerically. I've spent all day researching; the date and mutate filters seem promising, but I haven't managed to implement them correctly.

The config file looks like this:

input {
    file {
        path => "/Desktop/logs/*.log"
        type => "log"
        sincedb_path => "/dev/null"
    }
}

output {
    elasticsearch {
        hosts => "127.0.0.1"
        index => "logstash-%{type}-%{+YYYY.MM.dd}"
    }
    file {
        path => "/home/logsOut/%{type}.%{+yyyy.MM.dd.HH.mm}"
    }
}

If you only need these two as separate fields:

filter {
    grok {
        match => { 
            "message" => [ "%{SYSLOGBASE} conn=%{INT:conn}" ]
        }
    }

    date {
        match => [ "timestamp", "MMM dd HH:mm:ss" ]
        target => "time"
    }

    mutate {
        convert => { "conn" => "integer" }
    }
}
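To sanity-check what the grok, date, and mutate steps produce before wiring them into Logstash, here is a minimal Python sketch of the same parsing logic applied to the sample line. The regex is a hand-written approximation of `%{SYSLOGBASE} conn=%{INT:conn}` (not the actual grok library), and the field names mirror the ones the filter above would emit.

```python
import re
from datetime import datetime

# Sample slapd log line from the question.
line = ('May 11 06:51:31 ldap slapd[6694]: conn=1574001 op=1 '
        'SRCH base="cn=s_02,ou=users,o=meta" scope=0 deref=0 filter="(...)"')

# Rough equivalent of the grok pattern %{SYSLOGBASE} conn=%{INT:conn}:
# capture the syslog timestamp, host, program, pid, and connection number.
pattern = re.compile(
    r'^(?P<timestamp>\w{3}\s+\d{1,2} \d{2}:\d{2}:\d{2}) '
    r'(?P<logsource>\S+) (?P<program>\w+)\[(?P<pid>\d+)\]: '
    r'conn=(?P<conn>\d+)'
)

fields = pattern.match(line).groupdict()

# mutate { convert => { "conn" => "integer" } }:
# make conn numeric so it sorts by value, not lexically.
fields['conn'] = int(fields['conn'])

# date { match => [ "timestamp", "MMM dd HH:mm:ss" ] }:
# the syslog timestamp carries no year, which is why the Logstash
# date filter has to assume one (strptime defaults to 1900 here).
parsed = datetime.strptime(fields['timestamp'], '%b %d %H:%M:%S')

print(fields['conn'])
print(parsed.strftime('%b %d %H:%M:%S'))
```

If the printed `conn` value and timestamp match the original line, the equivalent grok and date configuration should extract the same fields in Logstash.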
