
logstash->kafka->logstash->elasticsearch dropping some logs

I have a logstash->kafka->logstash->elasticsearch setup. Logstash is tailing a log file. This log file is being appended to with bunyan in nodejs, and everything is set to JSON format/codec. It seems to me that any log entry containing an empty array somewhere doesn't get delivered to the destination. Does anyone know what the problem is? Is Kafka not designed to take in JSON objects with empty lists? Logstash's Kafka output plugin is pretty new; are there known issues related to it? I couldn't find anything from my Google search...

I would do two things to find the root cause of the problem:

  • Add a file output to the first Logstash instance, and check whether the empty array is being written to the file or not. The architecture would look like this: logstash -> file, kafka -> logstash -> elasticsearch
  • Read from the Kafka topic yourself and check whether the empty array made it in or not. You can do this with a Kafka consumer client such as the kafka-python library.
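The first check only needs one extra output in the shipping Logstash pipeline, so every event that reaches the outputs also lands in a local debug file. A minimal sketch — the file paths and topic name are assumptions, and the exact Kafka output option names vary across plugin versions:

```
input {
  file {
    path  => "/var/log/app.log"     # hypothetical path to the bunyan log
    codec => "json"
  }
}

output {
  # existing Kafka output (option names depend on your plugin version)
  kafka {
    topic_id => "app-logs"          # hypothetical topic name
    codec    => "json"
  }
  # debug copy: one JSON document per line
  file {
    path  => "/tmp/logstash-debug.log"
    codec => "json_lines"
  }
}
```

If the events with empty arrays show up in /tmp/logstash-debug.log but never reach Elasticsearch, the drop is happening at or after the Kafka output.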

That way, you can tell whether the problem is on the Logstash plugin side (more likely) or the Kafka side (less likely, because Kafka basically just reads and writes bytes and is not aware of the content itself).
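The "Kafka just moves bytes" point is easy to verify at the payload level: an empty array is valid JSON and survives a serialize/deserialize round trip unchanged, so nothing about the message content itself would make a broker drop it. A minimal check, using a made-up bunyan-style event:

```python
import json

# hypothetical bunyan-style log entry containing an empty array
event = {"name": "app", "level": 30, "msg": "request done", "tags": []}

# serialize to the bytes Kafka would carry, then decode them back
payload = json.dumps(event).encode("utf-8")
decoded = json.loads(payload.decode("utf-8"))

print(decoded == event)   # True: the empty array round-trips intact
print(decoded["tags"])    # []
```

Since the bytes are fine, any loss has to come from how a producer or consumer plugin handles the event, which is why checking each hop separately narrows it down quickly.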
