I am using a minimal Logstash setup and the Syslog input to collect execution data from three remote systems.
This works fine, but sometimes there are data gaps: some log entries that are present in the original log files never make it to Elasticsearch.
My question is whether Logstash drops data when the load increases.
If yes, I would like to know:
Thanks, Michail
The current version of Logstash (1.x) has a very small pipeline queue and will not accept more messages than that once it becomes congested. For a file{} input this isn't a problem, because the file continues to sit on disk waiting for Logstash to resume. For syslog, which has no buffer, messages would be lost.
The current recommendation is to put a broker in between (redis, rabbitmq), which can grow when logstash is congested.
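As a sketch of that setup, the congestion-prone pipeline can be split in two, with Redis as the buffer in between. The host, port, and key names below are placeholders you would adapt to your environment; the syslog and redis plugin options shown are the commonly used ones for Logstash 1.x:

```
# Shipper pipeline: accept syslog and push raw events into Redis,
# which absorbs bursts while the indexer catches up.
input {
  syslog {
    port => 514
  }
}
output {
  redis {
    host      => "broker.example.com"   # hypothetical broker host
    data_type => "list"
    key       => "logstash"
  }
}
```

```
# Indexer pipeline: drain events from Redis at whatever rate
# Logstash and Elasticsearch can sustain.
input {
  redis {
    host      => "broker.example.com"
    data_type => "list"
    key       => "logstash"
  }
}
output {
  elasticsearch {
    host => "localhost"   # option name in the 1.x output plugin
  }
}
```

If Elasticsearch slows down, the Redis list simply grows instead of events being refused at the syslog socket.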
Logstash 2.0 is said to have a real pipeline queue, so an extra broker won't be required.