
What happens if logstash sends data to elasticsearch at a rate faster than it can index?

So I have multiple hosts with Logstash installed on each one. Logstash on each host reads the log files generated by that host and sends the data to my single AWS Elasticsearch cluster.
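For reference, the per-host pipeline is essentially a file input feeding an elasticsearch output. A rough sketch is below; the log path, endpoint, and index name are placeholders, not details from the question:

# Per-host pipeline (e.g. pipeline.conf): tail local log files and write
# straight to the Elasticsearch cluster. All paths/hosts are assumptions.
input {
  file {
    path => "/var/log/myapp/*.log"
    start_position => "beginning"
  }
}
output {
  elasticsearch {
    hosts => ["https://my-es-domain.us-east-1.es.amazonaws.com:443"]
    index => "app-logs-%{+YYYY.MM.dd}"
  }
}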

Now consider a scenario where every host generates a large volume of logs at the same time. Since Logstash is installed on each host and simply forwards the data to the ES cluster, I assume that even if my Elasticsearch cluster cannot index it fast enough, my hosts won't be affected. Are the logs just lost in such a scenario?

Can my host machines get affected in any way?

In short, you may lose some of the logs from your hosts, which is why a message-queueing solution such as Kafka is used: https://www.elastic.co/guide/en/logstash/current/deploying-and-scaling.html#deploying-message-queueing
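A rough sketch of that setup with Logstash's kafka input/output plugins follows; the broker addresses, topic name, file path, and Elasticsearch endpoint are assumptions, not details from the question. Each host ships to Kafka, and a separate indexer pipeline pulls from Kafka and writes to Elasticsearch, so a slow cluster just makes the Kafka backlog grow instead of logs being dropped.

# Shipper pipeline on each host (e.g. shipper.conf): read local log files
# and publish them to Kafka instead of writing to Elasticsearch directly.
input {
  file {
    path => "/var/log/myapp/*.log"
    start_position => "beginning"
  }
}
output {
  kafka {
    bootstrap_servers => "kafka1:9092,kafka2:9092"
    topic_id => "app-logs"
  }
}

# Indexer pipeline on a separate Logstash node (e.g. indexer.conf): consume
# from Kafka and index into Elasticsearch at whatever rate the cluster can
# sustain; any backlog stays in Kafka.
input {
  kafka {
    bootstrap_servers => "kafka1:9092,kafka2:9092"
    topics => ["app-logs"]
    group_id => "logstash-indexers"
  }
}
output {
  elasticsearch {
    hosts => ["https://my-es-domain.us-east-1.es.amazonaws.com:443"]
    index => "app-logs-%{+YYYY.MM.dd}"
  }
}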
