
logstash - filter logs and send to different elasticsearch clusters

Let's say I've got a stack like this: logstash-forwarder -> logstash -> elasticsearch -> kibana

I wonder if it's possible to monitor a whole directory with logstash-forwarder and send the logs to different elasticsearch clusters, based on filters. Use case:

I've got some programs that print out logs to the same directory. These logs may contain two types of messages - either "private" or "debug" - and both types can appear in the same logfile. I know that it is possible to give certain files a different type and route them with an if to different outputs. What I don't know is what to do when a single logfile can contain more than one type of log message.

Is there a way to split them? I want to restrict access to the log messages with private information to certain users, and I thought of two separate elasticsearch clusters, each with its own Kibana and LDAP.

BR

Have your filter add a new field based on the message content and use that field to decide which output this message should go to.

Event flow:

logstash-forwarder --> broker ---> logstash-indexer | --> elasticsearch public
                                                    | --> elasticsearch private
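
The broker isn't named here; as one sketch, assuming Redis as the broker (the host name and list key below are placeholders, not from the original), the indexer's broker input could look like this:

input {
    # read events that the shipping layer pushed onto a Redis list
    redis {
        host      => "broker.example.com"   # assumed broker host
        data_type => "list"
        key       => "logstash"             # assumed list key
    }
}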

Pseudo config:

input { 
    # broker input
}

filter {

    # structure the message
    grok {
        # grok pattern for your log format goes here
    }

    # tag every event based on its content so the output stage can route it
    if [action] == "login" {
        mutate { add_field => { "privacy" => "private" } }
    } else {
        mutate { add_field => { "privacy" => "public" } }
    }
}

output {
    if [privacy] == "private" {
        elasticsearch { 
            # private elasticsearch instance
        }
    }

    if [privacy] == "public" {
        elasticsearch { 
            # public elasticsearch instance
        }
    }

}
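
For reference, here is a more concrete sketch of the whole indexer config under the same assumptions; the grok pattern, host names, and index names are made up and would need to be adapted to your real log format and clusters:

input {
    # broker input, assuming Redis as in the sketch above
    redis {
        host      => "broker.example.com"
        data_type => "list"
        key       => "logstash"
    }
}

filter {
    # structure the message; this pattern is an assumption about the log format
    grok {
        match => { "message" => "%{WORD:action} %{GREEDYDATA:details}" }
    }

    # tag every event so the output stage can route it
    if [action] == "login" {
        mutate { add_field => { "privacy" => "private" } }
    } else {
        mutate { add_field => { "privacy" => "public" } }
    }
}

output {
    if [privacy] == "private" {
        elasticsearch {
            hosts => ["es-private.example.com:9200"]   # assumed private cluster
            index => "logs-private-%{+YYYY.MM.dd}"
        }
    } else {
        elasticsearch {
            hosts => ["es-public.example.com:9200"]    # assumed public cluster
            index => "logs-public-%{+YYYY.MM.dd}"
        }
    }
}

Computing the privacy field in the filter stage keeps the routing decision in one place; the output section only needs to know which cluster each value maps to, and access control is then handled per cluster by its own Kibana and LDAP, as described in the question.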
