
Logstash error message when using Elasticsearch output => "Failed to flush outgoing items"

I'm using ES 1.4.4, LS 1.5 and Kibana 4 on Debian. I start Logstash and it works fine for a couple of minutes, then I get a fatal error. The only way I have found to shut Logstash down afterwards is to delete the recent data stored in ES. One more relevant fact is that Elasticsearch itself looks OK: I can see old data in Kibana and the head plugin works fine.

My output config:

output { elasticsearch { port => 9200 protocol => http host => "127.0.0.1" } }
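
As a sanity check, the node at 127.0.0.1:9200 can also be queried directly with curl while Logstash is running. This is only a diagnostic sketch; the logstash-* index pattern is an assumption based on the default Logstash index naming.

# Cluster status: a red cluster or a read-only index would explain failed bulk writes
curl 'http://127.0.0.1:9200/_cluster/health?pretty'

# List the Logstash indices with their sizes and document counts
curl 'http://127.0.0.1:9200/_cat/indices/logstash-*?v'

# Check whether any index settings include a read-only block
curl 'http://127.0.0.1:9200/logstash-*/_settings?pretty'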

Any help will be appreciated :)

Here is the full error message: Got error to send bulk of actions to elasticsearch server at 127.0.0.1 : Read timed out {:level=>:error}

Failed to flush outgoing items {:outgoing_count=>1362, :exception=>#, :backtrace=>[
  "/opt/logstash/vendor/bundle/jruby/1.9/gems/manticore-0.3.5-java/lib/manticore/response.rb:35:in `initialize'",
  "org/jruby/RubyProc.java:271:in `call'",
  "/opt/logstash/vendor/bundle/jruby/1.9/gems/manticore-0.3.5-java/lib/manticore/response.rb:61:in `call'",
  "/opt/logstash/vendor/bundle/jruby/1.9/gems/manticore-0.3.5-java/lib/manticore/response.rb:224:in `call_once'",
  "/opt/logstash/vendor/bundle/jruby/1.9/gems/manticore-0.3.5-java/lib/manticore/response.rb:127:in `code'",
  "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.7/lib/elasticsearch/transport/transport/http/manticore.rb:50:in `perform_request'",
  "org/jruby/RubyProc.java:271:in `call'",
  "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.7/lib/elasticsearch/transport/transport/base.rb:187:in `perform_request'",
  "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.7/lib/elasticsearch/transport/transport/http/manticore.rb:33:in `perform_request'",
  "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.7/lib/elasticsearch/transport/client.rb:115:in `perform_request'",
  "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-api-1.0.7/lib/elasticsearch/api/actions/bulk.rb:80:in `bulk'",
  "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-0.1.18-java/lib/logstash/outputs/elasticsearch/protocol.rb:82:in `bulk'",
  "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-0.1.18-java/lib/logstash/outputs/elasticsearch.rb:413:in `submit'",
  "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-0.1.18-java/lib/logstash/outputs/elasticsearch.rb:412:in `submit'",
  "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-0.1.18-java/lib/logstash/outputs/elasticsearch.rb:438:in `flush'",
  "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-0.1.18-java/lib/logstash/outputs/elasticsearch.rb:436:in `flush'",
  "/opt/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.19/lib/stud/buffer.rb:219:in `buffer_flush'",
  "org/jruby/RubyHash.java:1341:in `each'",
  "/opt/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.19/lib/stud/buffer.rb:216:in `buffer_flush'",
  "/opt/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.19/lib/stud/buffer.rb:193:in `buffer_flush'",
  "/opt/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.19/lib/stud/buffer.rb:159:in `buffer_receive'",
  "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-0.1.18-java/lib/logstash/outputs/elasticsearch.rb:402:in `receive'",
  "/opt/logstash/lib/logstash/outputs/base.rb:88:in `handle'",
  "(eval):1070:in `initialize'",
  "org/jruby/RubyArray.java:1613:in `each'",
  "org/jruby/RubyEnumerable.java:805:in `flat_map'",
  "(eval):1067:in `initialize'",
  "org/jruby/RubyProc.java:271:in `call'",
  "/opt/logstash/lib/logstash/pipeline.rb:279:in `output'",
  "/opt/logstash/lib/logstash/pipeline.rb:235:in `outputworker'",
  "/opt/logstash/lib/logstash/pipeline.rb:163:in `start_outputs'"
], :level=>:warn}

Your Elasticsearch node has most likely run out of disk space and can no longer index the new documents coming from Logstash. Try deleting old indices to free space, and then clear the read-only block on the affected index:

PUT your_index/_settings
{
  "index": {
  "blocks.read_only": false
  }
} 
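
If disk space really is the culprit, the whole clean-up can be done from the shell with curl against the node at 127.0.0.1:9200. The commands below are only a sketch: logstash-2015.03.01 stands in for whatever old daily index you decide to drop, and your_index is the index that carries the read-only block.

# See how full each node's disk is and how large the indices are
curl 'http://127.0.0.1:9200/_cat/allocation?v'
curl 'http://127.0.0.1:9200/_cat/indices?v'

# Delete an old index to free space (logstash-2015.03.01 is a hypothetical example)
curl -XDELETE 'http://127.0.0.1:9200/logstash-2015.03.01'

# Clear the read-only block so new documents can be indexed again
curl -XPUT 'http://127.0.0.1:9200/your_index/_settings' -d '{"index": {"blocks.read_only": false}}'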

I hope this will work for you. Thanks!
