
logstash -> kafka -> logstash -> elasticsearch dropping some logs

I have a logstash -> kafka -> logstash -> elasticsearch setup. logstash is tailing a log file. This log file is being appended to with bunyan in nodejs, and everything is set to the json format/codec. It seems to me that any log entry that contains an empty array somewhere doesn't get delivered to the destination. Does anyone know what the problem is? Is kafka not designed to take in JSON objects with empty lists? logstash's kafka output plugin is pretty new; are there any known issues related to that? I couldn't find anything from my Google search...
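For context, bunyan writes one JSON object per line. A hypothetical entry of the kind that seems to get dropped might look like this (the empty tags array is made up for illustration; the other fields are bunyan's standard ones):

    {"name":"myapp","hostname":"host1","pid":1234,"level":30,"tags":[],"msg":"request handled","time":"2015-10-02T12:34:56.789Z","v":0}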

I would do two things to check the root source of the problem:

  • Add a file appender to the first logstash, and then check whether the empty array is being written to the file. The architecture would look like this: logstash -> file, kafka -> logstash -> elasticsearch. (See the config sketch after this list.)
  • Read from the kafka topic yourself and check whether the empty array is being written. You can do this using the Apache Kafka Python client. (See the consumer sketch below.)
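For the first check, a minimal sketch of what the tee'd output section of the first logstash could look like. The file path, topic name, and broker address here are placeholders, and the broker_list/topic_id options match the early (pre-5.x) kafka output plugin; later versions renamed broker_list to bootstrap_servers:

    output {
      # Tee everything to a local file so you can inspect exactly what
      # the first logstash produced before it reaches kafka.
      file {
        path => "/tmp/logstash-debug.log"
      }
      kafka {
        topic_id    => "app-logs"
        broker_list => "localhost:9092"
      }
    }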

That way, you can check whether the problem is on the logstash plugin side (more likely) or on the kafka side (less likely, because kafka basically just reads and writes bytes and is not aware of the content itself).
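For the second check, here is a minimal consumer sketch using the kafka-python package; the topic name and broker address are assumptions, so substitute your own:

    import json
    from kafka import KafkaConsumer

    # Topic name and broker address are assumptions; substitute your own.
    consumer = KafkaConsumer('app-logs',
                             bootstrap_servers='localhost:9092',
                             auto_offset_reset='earliest')

    def has_empty_array(value):
        # Recursively look for an empty list anywhere in the decoded document.
        if isinstance(value, list):
            return value == [] or any(has_empty_array(v) for v in value)
        if isinstance(value, dict):
            return any(has_empty_array(v) for v in value.values())
        return False

    for msg in consumer:
        try:
            doc = json.loads(msg.value.decode('utf-8'))
        except ValueError:
            print('non-JSON message:', msg.value[:120])
            continue
        if has_empty_array(doc):
            print('entry with empty array reached kafka:', doc)

If entries with empty arrays show up here, the first logstash and kafka are fine and the problem sits in the second logstash or its elasticsearch output; if they never arrive, the first logstash's kafka output is the place to dig.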
