
Elastic ELK stack 8.5 integration with a Spring Boot application using Filebeat

Setting up a pipeline of Elasticsearch, Kibana, and Logstash locally, and using Filebeat to push logs from a Spring Boot application to the pipeline. You will find the official documentation well-defined, but I wrote this up to answer a few points that were not clear. I cover a single Spring Boot app scenario; thanks to the people who are adding their scenarios as well.

I spent a few days configuring the ELK stack with my Spring Boot application. I won't specify the step-by-step integration here; for that, you can refer to the official documentation. This is more focused on what I didn't find in the documentation steps.

Env: This is focused on setting up version 8.5.3 on macOS.

For Elasticsearch and Kibana, I didn't have any trouble following the official documentation word for word.

Elasticsearch: https://www.elastic.co/downloads/elasticsearch

Kibana: https://www.elastic.co/downloads/kibana

In my project, I needed to extract only a specific log line and process it. You can use the official links below to download and extract Logstash and Filebeat, then apply the configs mentioned here before you run them.

Logstash: https://www.elastic.co/downloads/logstash

Filebeat: https://www.elastic.co/downloads/beats/filebeat

Filebeat:

First, you need to make permission changes to your filebeat.yml file. Navigate to your extracted Filebeat folder; you can use the following config if needed.

filebeat.inputs:
- type: filestream
  id: filebeat-id-name
  enabled: true
  paths:
    - /Users/leons/IdeaProjects/SpringELKDemo/myapplogs.log  # Path to your log file
  # I wanted to read only the log lines containing the MainController string
  include_lines: ['MainController']

output.logstash:
  hosts: ["localhost:5044"]
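Before starting Filebeat, you can sanity-check the configuration with Filebeat's built-in test subcommands, run from the extracted Filebeat folder (the output test assumes Logstash is already listening on localhost:5044):

```
# Validate filebeat.yml syntax and settings
./filebeat test config -c filebeat.yml

# Verify Filebeat can reach the Logstash output configured above
./filebeat test output -c filebeat.yml
```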

Then you need to change the ownership of this file using the below command (macOS). Afterwards you can edit the file using sudo nano.

sudo chown root filebeat.yml
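The ownership change is needed because Filebeat refuses to run as root while the config file is writable by a different user. As a quick alternative for local testing, Filebeat also accepts a flag that relaxes this permission check:

```
# Local testing only: skip the config-file ownership/permissions check
sudo ./filebeat -e -c filebeat.yml --strict.perms=false
```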

Logstash:

A sample logstash.conf file is available in the config folder inside the Logstash directory. You can refer to that; also take a look at mine.

input {
  beats {
    port => 5044
  }
}
filter {
  dissect {
    mapping => {
      "message" => "%{}: %{data_message}"
    }
  }
  json {
    source => "data_message"
  }
}
output {
  elasticsearch {
    hosts => ["https://localhost:9200"]
    index => "index_name"
    user => "elastic"
    password => "XXXXXXXXXXXXX-XXX"
    ssl_certificate_verification => false
  }
  stdout {
    codec => rubydebug
  }
}
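Before wiring Filebeat to this pipeline, you can check the config file for syntax errors using Logstash's built-in validation flag:

```
# Parse logstash.conf and exit without starting the pipeline
bin/logstash -f logstash.conf --config.test_and_exit
```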

I used the dissect filter to do string manipulation on my log line, which Filebeat transferred. Below is my log line; I needed only the exact message, which is a JSON string.

2022-12-15 21:14:56.152  INFO 9278 --- [http-nio-8080-exec-10] c.p.t.springdemo.controller.MainController    : {"name":"leons","id":"123123","msg":"hello world"}

For more on dissect, refer to the official docs.

The json filter is used to convert the JSON keys and values into fields and values in your Elasticsearch document.
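As a rough illustration of what the two filters do together (a Python sketch, not Logstash code): the dissect pattern `%{}: %{data_message}` cuts the line at the first `": "` delimiter and discards the prefix, and the json filter then parses the remainder into individual fields.

```python
import json

line = ('2022-12-15 21:14:56.152  INFO 9278 --- [http-nio-8080-exec-10] '
        'c.p.t.springdemo.controller.MainController    : '
        '{"name":"leons","id":"123123","msg":"hello world"}')

# dissect "%{}: %{data_message}": split at the first ": " occurrence;
# the empty %{} key skips everything before the delimiter
_, _, data_message = line.partition(': ')

# json filter: parse data_message into top-level fields
fields = json.loads(data_message)
print(fields['name'], fields['id'], fields['msg'])  # → leons 123123 hello world
```

Note that the timestamp's colons are not followed by a space, so the first `": "` match lands right before the JSON payload.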

Now you should be ready to run Logstash and Filebeat using the commands from the official documentation. For reference, use the below.

Logstash:

bin/logstash -f logstash.conf

Filebeat:

sudo ./filebeat -e -c filebeat.yml
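Once both are running and a matching log line has been picked up, you can confirm the documents landed in Elasticsearch with a quick search query (replace index_name and the password with your own values; -k skips certificate verification for the local self-signed cert):

```
curl -k -u elastic:XXXXXXXXXXXXX-XXX "https://localhost:9200/index_name/_search?pretty"
```

The same documents should also be visible in Kibana under the index you configured.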
