
Logstash is putting old data in Elasticsearch. Can't clean Elasticsearch data

I have Logstash reading from a log file status.log and sending the output to an Elasticsearch instance.

I want to clean the data in Elasticsearch; for that I'm executing curl -XDELETE 'http://localhost:9200/index_name/_all' . If I check through the head plugin, the data is gone.

If that was not enough, I'm also cleaning the log file with echo "" > status.log .
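As a side note, echo "" > status.log still leaves a single newline in the file; a bare redirect truncates it to zero bytes. A minimal sketch (status.log is the filename from the question):

```shell
# Truncate the log file to 0 bytes. ":" is the shell no-op builtin,
# so this opens status.log for writing (truncating it) and writes nothing.
: > status.log
```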

When I execute the application again, the old data reappears in Elasticsearch, but with an updated @timestamp . The data is not present again in status.log . The new data is inserted correctly in Elasticsearch.

How can I get rid of the old data? Is it still stored in Elasticsearch, or does Logstash have some kind of cache?

Assuming you are working with the file plugin: if you add the stdout plugin to your output section like so:

stdout { codec => rubydebug }

Logstash will display each processed log element on the console. When using the file plugin, every processed log message gets a path field telling you where Logstash read the message from. Maybe that helps you to find out where the messages are coming from...
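For reference, that snippet goes inside the output section of the pipeline configuration, alongside the existing elasticsearch output. A minimal sketch (the host and index name here are assumptions, not taken from the question):

```
output {
  # Keep shipping events to Elasticsearch as before
  # (host and index are placeholders -- adjust to your setup)
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "index_name"
  }

  # Also print every processed event to the console;
  # events from the file input carry a "path" field showing
  # which file Logstash read each message from
  stdout { codec => rubydebug }
}
```

With this in place, every event that reaches the output stage is echoed to the console, so you can inspect the path field of any unexpected old messages.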


 