
How to buffer Logstash logs to Elasticsearch

I'm using the ELK stack and am wondering how to handle a crisis in my Elasticsearch cluster: what is the best practice for buffering the logs that Logstash sends to Elasticsearch when Elasticsearch fails but logs keep coming in?

Or, if you have a better solution for coping with an Elasticsearch failure while keeping Logstash "live and on air", I'd be glad to hear it.

Place a buffer (Redis, RabbitMQ, ...) in front of your Logstash machine as the entry point for all log events shipped into your system. It will then hold the data until the downstream components have enough resources to index it again.
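As a minimal sketch of the Redis variant (host names, the list key, and the Elasticsearch address are placeholders, not values from the question): a shipper-side Logstash pipeline pushes events into a Redis list, and an indexer-side pipeline pops them and forwards to Elasticsearch. If Elasticsearch goes down, the indexer stops consuming and events simply accumulate in Redis until it recovers.

# Shipper pipeline: push incoming events onto a Redis list.
# "redis.example.com" and the key "logstash" are placeholders.
output {
  redis {
    host      => "redis.example.com"
    data_type => "list"
    key       => "logstash"
  }
}

# Indexer pipeline: pop events from the same Redis list and
# index them into Elasticsearch. While Elasticsearch is
# unreachable, events back up in Redis instead of being lost.
input {
  redis {
    host      => "redis.example.com"
    data_type => "list"
    key       => "logstash"
  }
}
output {
  elasticsearch {
    hosts => ["es.example.com:9200"]
  }
}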

You don't mention what your inputs actually are. Filebeat will stop sending data to Logstash/Elasticsearch if it is unable to connect. Since it keeps track of how far into each file it has read, you effectively get a distributed buffer for free. Note that you may lose data if a log file is rotated away while the server is still down.
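For illustration, a minimal filebeat.yml along these lines (the log path and the Logstash host are assumed placeholders) tails a file and ships to Logstash; Filebeat records its read offset in its registry file, so after an outage it resumes from where it left off rather than re-sending or dropping events:

# filebeat.yml -- minimal sketch; adjust paths/hosts for your setup.
filebeat.inputs:
  - type: log
    paths:
      - /var/log/myapp/*.log   # placeholder path

# Ship to Logstash; if Logstash is unreachable, Filebeat retries
# and keeps its position in the registry until delivery succeeds.
output.logstash:
  hosts: ["logstash.example.com:5044"]   # placeholder host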
