
How can I use Kafka to retain logs in logstash for a longer period?

Currently I use a redis -> s3 -> elasticsearch -> kibana stack to pipe and visualise my logs. But due to the large volume of data in elasticsearch, I can only retain logs for up to 7 days.

I want to bring a kafka cluster into this stack and retain logs for more days. I am thinking of the following stack.

app nodes piping logs to kafka -> kafka cluster -> elasticsearch cluster -> kibana
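
For reference, the consuming end of that stack could be wired up in Logstash roughly like this (a sketch assuming the Kafka input plugin shipped with Logstash 5+; the broker addresses, topic name and index pattern below are placeholders):

input {
  kafka {
    bootstrap_servers => "kafka1:9092,kafka2:9092"  # placeholder broker list
    topics            => ["app-logs"]               # placeholder topic
    group_id          => "logstash"
  }
}
output {
  elasticsearch {
    hosts => ["http://es1:9200"]                    # placeholder ES node
    index => "logs-%{+YYYY.MM.dd}"
  }
}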

How can I use kafka to retain logs for more days?

Looking through the Apache Kafka broker configs, there are two properties that determine when a log will get deleted, one by time and the other by size:

log.retention.{ms,minutes,hours}
log.retention.bytes

Also note that if both log.retention.hours and log.retention.bytes are set, a segment is deleted when either limit is exceeded.
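
For example, a broker configured with both limits would delete old segments as soon as either one is crossed (the values below are hypothetical; note that log.retention.bytes applies per partition):

# server.properties
log.retention.hours=720           # time limit: 30 days
log.retention.bytes=107374182400  # size limit: 100 GiB per partition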

Those two dictate when logs are deleted in Kafka. log.retention.bytes defaults to -1, and I'm pretty sure leaving it at -1 lets the time config alone determine when a log gets deleted.

So to directly answer your question: set log.retention.hours to however many hours you wish to retain your data, and don't change the log.retention.bytes configuration.
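
A minimal sketch of that in server.properties (720 hours, i.e. 30 days, is an example value, not a recommendation):

# server.properties
log.retention.hours=720
# log.retention.bytes stays at its default of -1, so no size limit applies

The same limit can also be overridden per topic with retention.ms using the kafka-configs tool (assuming a Kafka release that supports --bootstrap-server; older releases used --zookeeper instead, and the topic name here is a placeholder):

bin/kafka-configs.sh --bootstrap-server localhost:9092 --alter \
  --entity-type topics --entity-name app-logs \
  --add-config retention.ms=2592000000   # 30 days in milliseconds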
