logstash->kafka->logstash->elasticsearch dropping some logs
I have a logstash->kafka->logstash->elasticsearch setup. Logstash is tailing a log file that is being appended to by bunyan in a Node.js app, and everything is set to the JSON format/codec. It seems to me that any log entry containing an empty array somewhere never gets delivered to the destination. Does anyone know what the problem is?

Is Kafka not designed to accept JSON objects with empty lists? Logstash's Kafka output plugin is pretty new; are there known issues related to that? I couldn't find anything in my Google searches...
I would do two things to check the root cause of the problem:
That way, you could check whether the problem is on the Logstash plugin side (more likely) or on the Kafka side (less likely, because Kafka basically just reads and writes bytes and is agnostic to the content itself).
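One quick way to isolate the failing stage is to bypass Elasticsearch and print what the consumer-side Logstash actually receives from the topic. A minimal debug config might look like this (the topic name and ZooKeeper address are assumptions, and the parameter names are from the early logstash-kafka input plugin; newer plugin versions use different option names such as `bootstrap_servers` and `topics`):

```
input {
  kafka {
    zk_connect => "localhost:2181"   # assumed ZooKeeper address
    topic_id   => "app-logs"         # hypothetical topic name -- use yours
  }
}
output {
  stdout { codec => rubydebug }      # print every event as received
}
```

If entries with empty arrays show up here, the drop is happening between the second Logstash and Elasticsearch; if they never appear, the problem is upstream, in the Kafka output plugin of the first Logstash (or they never reached the topic at all, which you can confirm with Kafka's console consumer tool).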