
Replay downloaded log files into an ELK server

Our deployed systems collect "regular" plain old log4j log files. Since it is a distributed system, there are 5-10 of these. When there is a problem, it is possible to obtain the log files as a zip file.

I found this docker ELK image, which works great. However, this is new to me, and all the examples talk about using something such as filebeat to ship the log information to ElasticSearch. I am wondering if there exists a way to essentially "replay" an existing set of log files into such an ELK container instance? Or is this something I would need to build?

I am not sure why you don't want to use filebeat, as it handles network outages gracefully and guarantees log delivery. I would use filebeat even as just a local installation (in the same domain as ELK), and if you don't want to install filebeat on every log source server shipping to your main logstash instance, I would simply feed the collected logs to it from your system (with rsync or manually).
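If you do go the filebeat route for locally collected files, a minimal `filebeat.yml` sketch might look like the following. The log paths, the multiline pattern, and the `localhost:5044` Logstash port are assumptions about your setup, not values from the question:

```yaml
# filebeat.yml -- minimal sketch; adjust paths, pattern, and hosts to your layout
filebeat.inputs:
  - type: log                        # use 'filestream' on newer Filebeat versions
    paths:
      - /data/unzipped-logs/*.log    # hypothetical location of the extracted log4j files
    # Stitch Java stack-trace lines onto the log event they belong to:
    multiline.pattern: '^\d{4}-\d{2}-\d{2}'   # lines starting with a date begin a new event
    multiline.negate: true
    multiline.match: after

output.logstash:
  hosts: ["localhost:5044"]          # the beats port exposed by the ELK container
```

The multiline settings matter for log4j output: without them, every line of a stack trace becomes its own document in ElasticSearch.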

On the other hand, there is a very basic logstash file input plugin which you can use for direct reading: https://www.elastic.co/guide/en/logstash/current/plugins-inputs-file.html
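For a one-off replay of an already-downloaded set of files, a minimal Logstash pipeline sketch using that file input could look like this. The path, the timestamp pattern, and the grok fields are assumptions about a typical log4j pattern layout; adapt them to your actual format:

```conf
# replay.conf -- run with: bin/logstash -f replay.conf
input {
  file {
    path => "/data/unzipped-logs/*.log"   # hypothetical location of the extracted files
    mode => "read"                        # read whole files once instead of tailing
    start_position => "beginning"
    sincedb_path => "/dev/null"           # don't persist read state, so reruns replay everything
    file_completed_action => "log"        # default in read mode is "delete" -- keep the files!
    file_completed_log_path => "/tmp/replayed-files.log"
    codec => multiline {
      pattern => "^%{TIMESTAMP_ISO8601} "
      negate => true
      what => "previous"                  # join stack-trace lines to the preceding event
    }
  }
}
filter {
  grok {
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:level} %{GREEDYDATA:msg}" }
  }
  date {
    match => ["timestamp", "ISO8601"]     # index by the log's own timestamp, not ingest time
  }
}
output {
  elasticsearch { hosts => ["localhost:9200"] }
}
```

The `date` filter is the key piece for a replay: it makes Kibana show events at the time they originally happened rather than the time you ingested them.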
