
Logstash and Filebeat in the ELK stack

We are setting up Elasticsearch, Kibana, Logstash and Filebeat on a server to analyse log files from many applications. Due to reasons*, each application's log file ends up in a separate directory on the ELK server. We have about 20 log files.

  1. As I understand it, we can run a Logstash pipeline config file for each application log file. That would be one Logstash instance running 20 pipelines in parallel, and each pipeline would need its own Beats port. Can you confirm that this is correct?
  2. Can we run a single Filebeat instance, or do we need one per pipeline/log file?
  3. Is this architecture OK, or do you see any major downsides?

Thank you!

*There are different vendors responsible for different applications, and they run across many different operating systems; many of them will not or cannot install anything like Filebeat.
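For context, the multi-pipeline setup described in point 1 would correspond to a pipelines.yml along these lines. This is only a sketch; the pipeline IDs, config paths, and the idea of one config file per application are assumptions based on the question, not a confirmed layout:

# pipelines.yml - one Logstash pipeline per application (hypothetical paths)
- pipeline.id: app01
  path.config: "/etc/logstash/conf.d/app01.conf"
- pipeline.id: app02
  path.config: "/etc/logstash/conf.d/app02.conf"
# ... one entry per application, each config listening on its own Beats port

As the answer below shows, this per-pipeline split is not the only option: a single pipeline can route events with conditionals instead.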

We do not recommend reading log files from network volumes. Whenever possible, install Filebeat on the host machine and send the log files directly from there. Reading files from network volumes (especially on Windows) can have unexpected side effects. For example, changed file identifiers may result in Filebeat reading a log file from scratch again.

Reference

We always recommend installing Filebeat on the remote servers. Using shared folders is not supported. The typical setup is that you have a Logstash + Elasticsearch + Kibana in a central place (one or multiple servers) and Filebeat installed on the remote machines from where you are collecting data.

Reference

With a single Filebeat instance you can apply different configuration settings to different files by defining multiple input sections, as in the example below; see the Filebeat inputs documentation for more.

filebeat.inputs:
- type: log
  enabled: true
  paths:
    - 'C:\App01_Logs\log.txt'
  tags: ["App01"]
  fields:
    app_name: App01

- type: log
  enabled: true
  paths:
    - 'C:\App02_Logs\log.txt'
  tags: ["App02"]
  fields:
    app_name: App02

- type: log
  enabled: true
  paths:
    - 'C:\App03_Logs\log.txt'
  tags: ["App03"]
  fields:
    app_name: App03
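That single Filebeat instance can then ship all of these inputs to one Logstash Beats endpoint, which answers question 2: one Filebeat is enough. The host name and port below are assumptions for illustration:

# Filebeat output section - all inputs go to the same Logstash Beats port
output.logstash:
  hosts: ["elk-server:5044"]  # hypothetical host and port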

And you can have one Logstash pipeline with if statements in the filter block:

filter {
  if [fields][app_name] == "App01" {
    grok { }
  } else if [fields][app_name] == "App02" {
    grok { }
  } else {
    grok { }
  }
}

The condition can also be if "App02" in [tags] or if [source] == "C:\\App01_Logs\\log.txt", since those values are sent from Filebeat.
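Put together, a minimal single-pipeline sketch using the tag-based condition might look like this. The Beats port, grok pattern, and Elasticsearch index name are illustrative assumptions, not values from the question:

input {
  beats {
    port => 5044  # hypothetical port; all Filebeat inputs arrive here
  }
}

filter {
  if "App01" in [tags] {
    # The pattern is an assumption; replace it with App01's actual log format
    grok { match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{GREEDYDATA:msg}" } }
  }
  # ... one branch per application
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]          # hypothetical Elasticsearch host
    index => "app-logs-%{+YYYY.MM.dd}"   # hypothetical index pattern
  }
}

This keeps one Beats port and one pipeline, avoiding the 20-port setup from question 1.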
