
Logstash giving Elasticsearch Unreachable error

I am trying to connect Logstash to Elasticsearch, but I can't get it to work. My Elasticsearch is running fine on localhost:9200 and I can curl it.

My logstash.config looks like this:

 http.host: "0.0.0.0"
 xpack.monitoring.elasticsearch.url: http://localhost:9200

My logstash-sample.conf:

 # Sample Logstash configuration for creating a simple
 # Beats -> Logstash -> Elasticsearch pipeline.
 input {
   beats {
     port => 5044
   }
 }
 output {
   elasticsearch {
     hosts => ["http://localhost:9200"]
     index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
     #user => "elastic"
     #password => "changeme"
   }
 }

This is what I use to start the container:

 docker run --name logstash -p 5044:5044 -e "discovery.type=single-node" docker.elastic.co/logstash/logstash:6.4.2

But I get the following error log:

 Sending Logstash logs to /usr/share/logstash/logs which is now configured via log4j2.properties
 [2018-11-02T15:30:44,622][WARN ][logstash.monitoringextension.pipelineregisterhook] xpack.monitoring.enabled has not been defined, but found elasticsearch configuration. Please explicitly set `xpack.monitoring.enabled: true` in logstash.yml
 [2018-11-02T15:30:44,814][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.4.2"}
 [2018-11-02T15:30:48,350][WARN ][logstash.outputs.elasticsearch] You are using a deprecated config setting "document_type" set in elasticsearch. Deprecated settings will continue to work, but are scheduled for removal from logstash in the future. Document types are being deprecated in Elasticsearch 6.0, and removed entirely in 7.0. You should avoid this feature If you have any questions about this, please visit the #logstash channel on freenode irc. {:name=>"document_type", :plugin=><LogStash::Outputs::ElasticSearch bulk_path=>"/_xpack/monitoring/_bulk?system_id=logstash&system_api_version=2&interval=1s", hosts=>[http://localhost:9200], sniffing=>false, manage_template=>false, id=>"2196aa69258f6adaaf9506d8988cc76ab153e658434074dcf2e424e0aca0d381", document_type=>"%{[@metadata][document_type]}", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>"plain_1afa70a3-eaef-4cf5-9762-0d759d720d1c", enable_metric=>true, charset=>"UTF-8">, workers=>1, template_name=>"logstash", template_overwrite=>false, doc_as_upsert=>false, script_type=>"inline", script_lang=>"painless", script_var_name=>"event", scripted_upsert=>false, retry_initial_interval=>2, retry_max_interval=>64, retry_on_conflict=>1, action=>"index", ssl_certificate_verification=>true, sniffing_delay=>5, timeout=>60, pool_max=>1000, pool_max_per_route=>100, resurrect_delay=>5, validate_after_inactivity=>10000, http_compression=>false>}
 [2018-11-02T15:30:48,710][INFO ][logstash.pipeline ] Starting pipeline {:pipeline_id=>".monitoring-logstash", "pipeline.workers"=>1, "pipeline.batch.size"=>2, "pipeline.batch.delay"=>50}
 [2018-11-02T15:30:50,318][INFO ][logstash.pipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
 [2018-11-02T15:30:51,394][INFO ][logstash.inputs.beats ] Beats inputs: Starting input listener {:address=>"0.0.0.0:5044"}
 [2018-11-02T15:30:51,462][INFO ][logstash.pipeline ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x688d6788 run>"}
 [2018-11-02T15:30:51,698][INFO ][org.logstash.beats.Server] Starting server on port: 5044
 [2018-11-02T15:30:51,972][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
 [2018-11-02T15:30:51,982][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://localhost:9200/, :path=>"/"}
 [2018-11-02T15:30:52,194][WARN ][logstash.outputs.elasticsearch] Attempted to resurrect connection to dead ES instance, but got an error. {:url=>"http://localhost:9200/", :error_type=>LogStash::Outputs::ElasticSearch::HttpClient::Pool::HostUnreachableError, :error=>"Elasticsearch Unreachable: [http://localhost:9200/][Manticore::SocketException] Connection refused (Connection refused)"}
 [2018-11-02T15:30:52,218][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["http://localhost:9200"]}
 [2018-11-02T15:30:52,322][INFO ][logstash.licensechecker.licensereader] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
 [2018-11-02T15:30:52,322][INFO ][logstash.licensechecker.licensereader] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://localhost:9200/, :path=>"/"}
 [2018-11-02T15:30:52,332][WARN ][logstash.licensechecker.licensereader] Attempted to resurrect connection to dead ES instance, but got an error. {:url=>"http://localhost:9200/", :error_type=>LogStash::Outputs::ElasticSearch::HttpClient::Pool::HostUnreachableError, :error=>"Elasticsearch Unreachable: [http://localhost:9200/][Manticore::SocketException] Connection refused (Connection refused)"}
 [2018-11-02T15:30:52,378][WARN ][logstash.licensechecker.licensereader] Marking url as dead. Last error: [LogStash::Outputs::ElasticSearch::HttpClient::Pool::HostUnreachableError] Elasticsearch Unreachable: [http://localhost:9200/][Manticore::SocketException] Connection refused (Connection refused) {:url=>http://localhost:9200/, :error_message=>"Elasticsearch Unreachable: [http://localhost:9200/][Manticore::SocketException] Connection refused (Connection refused)", :error_class=>"LogStash::Outputs::ElasticSearch::HttpClient::Pool::HostUnreachableError"}
 [2018-11-02T15:30:52,509][ERROR][logstash.licensechecker.licensereader] Unable to retrieve license information from license server {:message=>"Elasticsearch Unreachable: [http://localhost:9200/][Manticore::SocketException] Connection refused (Connection refused)", :class=>"LogStash::Outputs::ElasticSearch::HttpClient::Pool::HostUnreachableError", :backtrace=>["/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-output-elasticsearch-9.2.1-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:293:in `perform_request_to_url'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-output-elasticsearch-9.2.1-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:278:in `block in perform_request'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-output-elasticsearch-9.2.1-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:373:in `with_connection'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-output-elasticsearch-9.2.1-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:277:in `perform_request'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-output-elasticsearch-9.2.1-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:285:in `block in get'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-output-elasticsearch-9.2.1-java/lib/logstash/outputs/elasticsearch/http_client.rb:162:in `get'", "/usr/share/logstash/x-pack/lib/license_checker/license_reader.rb:28:in `fetch_xpack_info'", "/usr/share/logstash/x-pack/lib/license_checker/license_manager.rb:40:in `fetch_xpack_info'", "/usr/share/logstash/x-pack/lib/license_checker/license_manager.rb:27:in `initialize'", "/usr/share/logstash/x-pack/lib/license_checker/licensed.rb:37:in `setup_license_checker'", "/usr/share/logstash/x-pack/lib/monitoring/inputs/metrics.rb:56:in `register'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:242:in `register_plugin'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:253:in `block in register_plugins'", "org/jruby/RubyArray.java:1734:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:253:in `register_plugins'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:396:in `start_inputs'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:294:in `start_workers'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:200:in `run'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:160:in `block in start'"]}
 [2018-11-02T15:30:52,776][INFO ][logstash.pipeline ] Pipeline started successfully {:pipeline_id=>".monitoring-logstash", :thread=>"#<Thread:0x4bac2ad3 sleep>"}
 [2018-11-02T15:30:52,842][INFO ][logstash.agent ] Pipelines running {:count=>2, :running_pipelines=>[:main, :".monitoring-logstash"], :non_running_pipelines=>[]}
 [2018-11-02T15:30:52,865][ERROR][logstash.inputs.metrics ] X-Pack is installed on Logstash but not on Elasticsearch. Please install X-Pack on Elasticsearch to use the monitoring feature. Other features may be available.
 [2018-11-02T15:30:53,119][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
 [2018-11-02T15:30:57,213][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://localhost:9200/, :path=>"/"}
 [2018-11-02T15:30:57,222][WARN ][logstash.outputs.elasticsearch] Attempted to resurrect connection to dead ES instance, but got an error. {:url=>"http://localhost:9200/", :error_type=>LogStash::Outputs::ElasticSearch::HttpClient::Pool::HostUnreachableError, :error=>"Elasticsearch Unreachable: [http://localhost:9200/][Manticore::SocketException] Connection refused (Connection refused)"}
 [2018-11-02T15:30:57,347][INFO ][logstash.licensechecker.licensereader] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://localhost:9200/, :path=>"/"}

And this keeps repeating. I would really appreciate any help. Thank you.

As @Val noted in the comments, localhost inside the container is the container itself, not your host machine.
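
For example, here is a minimal sketch (not part of the original answer) of pointing the existing container at the host's Elasticsearch. It assumes "host.docker.internal" resolves to the host, which is the case on Docker Desktop; on Linux the "--add-host ... host-gateway" mapping shown below needs Docker 20.10 or newer, otherwise substitute your host's LAN IP. The mount path is the official Logstash image's documented pipeline directory.

 # Sketch only: first replace http://localhost:9200 with http://host.docker.internal:9200
 # in logstash.yml / logstash-sample.conf, then run the container with a host mapping.
 docker run --name logstash -p 5044:5044 \
   --add-host=host.docker.internal:host-gateway \
   -v "$PWD/logstash-sample.conf:/usr/share/logstash/pipeline/logstash.conf" \
   docker.elastic.co/logstash/logstash:6.4.2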

If Docker Compose works for you, you can instead try the sebp/elk Docker image, which bundles Elasticsearch, Logstash and Kibana, and use this sample compose file:

docker-compose.yml

version: "2.4"
services:
  elk:
    image: sebp/elk
    ports:
      - "5601:5601"
      - "9200:9200"
      - "5044:5044"

  1. Install Docker Compose
  2. Save the sample "docker-compose.yml" in a folder
  3. In the same folder, run: $ docker-compose up
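
Once the stack is up, a quick sanity check from the host (assuming the default ports published in the compose file above) is the same curl the question already uses; Kibana should likewise answer on http://localhost:5601:

 $ curl http://localhost:9200
 $ curl http://localhost:9200/_cat/health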

But if you want to change the configuration, you should either:

  1. Create your own Dockerfile based on that image and override the conf files you want, or
  2. Use Docker Compose volumes to override individual files by editing docker-compose.yml, instead of building a new Docker image with a Dockerfile (see the sketch below)
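
A rough sketch of option 2, assuming the sebp/elk image reads Logstash pipeline files from /etc/logstash/conf.d/; verify the exact path and file name against the image's documentation, as "my-output.conf" and "30-output.conf" here are only illustrative:

version: "2.4"
services:
  elk:
    image: sebp/elk
    ports:
      - "5601:5601"
      - "9200:9200"
      - "5044:5044"
    volumes:
      # Bind-mount a single local file over the container's copy.
      - ./my-output.conf:/etc/logstash/conf.d/30-output.conf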

Alternatives

  1. Use your host's private LAN IP instead of "localhost" in the container configuration, so the container can reach it
  2. Run both systems in containers, use the same network on "docker run", set hostnames, and use those hostnames in the configuration
  3. Use Docker Compose with one service per system (Elasticsearch and Logstash) and use the service names in the configuration, since Docker Compose resolves service names as hostnames by default (see the sketch after this list)
  4. Follow the original Docker Compose suggestion: a single service using the sebp/elk Docker image, so everything runs in the same container and you can keep the localhost conf
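
As a rough sketch of alternative 3 (the image tags match the Logstash version from the question; file names and mount paths are illustrative), the Logstash output would then use hosts => ["http://elasticsearch:9200"] instead of localhost:

version: "2.4"
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:6.4.2
    environment:
      - discovery.type=single-node
    ports:
      - "9200:9200"
  logstash:
    image: docker.elastic.co/logstash/logstash:6.4.2
    ports:
      - "5044:5044"
    volumes:
      - ./logstash-sample.conf:/usr/share/logstash/pipeline/logstash.conf
    depends_on:
      - elasticsearch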

Keep in mind that each alternative may have a different way of customizing the configuration.

I think the clue to the problem is this error message:

[2018-11-02T15:30:52,865][ERROR][logstash.inputs.metrics  ] X-Pack is installed on Logstash but not on Elasticsearch. Please install X-Pack on Elasticsearch to use the monitoring feature. Other features may be available.

It looks like you have X-Pack installed on Logstash but not on your Elasticsearch node. Try this, then re-run Logstash.
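
One way to check whether X-Pack is actually available on the Elasticsearch side (assuming the node is reachable from wherever you run this) is the X-Pack info endpoint; a node without X-Pack will not recognize it:

 $ curl http://localhost:9200/_xpack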
