
Logstash setup on Spring Boot application

I am building microservices and want to push all the logs to an ELK stack managed in the cloud. What is the standard practice for pushing the logs to Elasticsearch? Should I configure Logstash on each microservice instance, or should Logstash be configured remotely, with all the microservices pushing their logs to the Logstash service?

You should run Filebeat (or Fluent Bit) on each server where your code runs, and ideally have Spring's SLF4J logging write under /var/log/java/<your app name> using a rolling file appender. The java subdirectory matters because you are going to glob all directories under it, and you don't want to capture all the other /var/log/* files and directories.
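For example, a rolling file appender in Logback (Spring Boot's default logging backend) could look roughly like the sketch below; the service name my-service, the log path, and the retention settings are placeholders, not anything prescribed by the original answer:

```xml
<!-- logback-spring.xml: minimal sketch; "my-service" and the paths are placeholders -->
<configuration>
  <appender name="FILE" class="ch.qos.logback.core.rolling.RollingFileAppender">
    <file>/var/log/java/my-service/app.log</file>
    <rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
      <!-- roll daily and keep one week of history -->
      <fileNamePattern>/var/log/java/my-service/app.%d{yyyy-MM-dd}.log</fileNamePattern>
      <maxHistory>7</maxHistory>
    </rollingPolicy>
    <encoder>
      <pattern>%d{yyyy-MM-dd HH:mm:ss.SSS} %-5level [%thread] %logger{36} - %msg%n</pattern>
    </encoder>
  </appender>
  <root level="INFO">
    <appender-ref ref="FILE"/>
  </root>
</configuration>
```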

Then, configure Filebeat to scan for files under /var/log/java/**
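A minimal sketch of the Filebeat input side, assuming one subdirectory per service under /var/log/java (the paths and the extra field are illustrative only):

```yaml
# filebeat.yml -- input side only
filebeat.inputs:
  - type: log
    paths:
      - /var/log/java/*/*.log   # one subdirectory per service under /var/log/java
    fields:
      env: production           # optional metadata you can filter on later
    fields_under_root: true
```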

From there, you can either ship directly to Elasticsearch, or send the output to TCP, Kafka, etc. and consume that from Logstash (or Fluentd) to transform and filter the events before writing them to Elasticsearch.
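On the Filebeat side, that choice is just the output section; the hostnames and ports below are placeholders:

```yaml
# filebeat.yml -- output side; pick one

# Option A: ship straight to Elasticsearch
# output.elasticsearch:
#   hosts: ["https://elastic.example.com:9200"]

# Option B: ship to Logstash so you can transform/filter first
output.logstash:
  hosts: ["logstash.example.com:5044"]
```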

Logstash is a heavyweight application, and it usually runs as a single, central instance.

In my experience, there is often a Filebeat instance on every node, fetching logs from the apps and pushing them to the Logstash instance over the network.
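A rough sketch of what such a central Logstash pipeline could look like, receiving from Filebeat on the Beats port and writing to Elasticsearch; the grok pattern assumes the Logback layout sketched earlier, and the host and index name are made up:

```text
# pipeline.conf -- illustrative only
input {
  beats {
    port => 5044
  }
}

filter {
  # parse timestamp, level, thread and logger out of the example Logback pattern
  grok {
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:level}%{SPACE}\[%{DATA:thread}\] %{DATA:logger} - %{GREEDYDATA:msg}" }
  }
}

output {
  elasticsearch {
    hosts => ["https://elastic.example.com:9200"]
    index => "microservices-%{+YYYY.MM.dd}"
  }
}
```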
