
logstash fails to create an index in ES

I am trying to parse a log file using Logstash. Filebeat reads sample logs from a directory and indexes them into Elasticsearch through Logstash. (Filebeat reads the input file from a directory, Filebeat.yml specifies Logstash as the output, the Logstash configuration file parses the log file, and the result is put into an index in ES.)

Filebeat.yml

#=========================== Filebeat prospectors =============================

filebeat.prospectors:

  #input_type: log
  document_type: my_log
paths:
  - C:\logsa\elast.log

    #----------------------------- Logstash output --------------------------------
    output.logstash:
      # The Logstash hosts
      hosts: ["localhost:5044"]



elast.log: (I am trying to parse this single log line in the file)

    [2016-11-03 07:30:05,987] [INFO] [o.e.p.PluginsService     ] [hTYKFFt] initializing...

Logstash configuration file:

input {
  beats {
    port => "5044"
  }
}
filter {
  if [type] == "my_log" {
    grok {
      match => { "message" => "\[%{TIMESTAMP_ISO8601:timestamp}\] \[%{DATA:LOGLEVEL}\] \[%{DATA:MESSAGE}\] \[%{GREEDYDATA:message}\] %{GREEDYDATA:message1}"}
    }
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}
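A minimal debugging sketch (not part of the original configuration) is to add a stdout output with the rubydebug codec next to the elasticsearch output, so every event that reaches Logstash is printed to the console and a failed grok match is visible as a _grokparsefailure tag:

output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
  # Print each event to the console; a _grokparsefailure tag on an event
  # means the grok pattern above did not match the incoming line.
  stdout {
    codec => rubydebug
  }
}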

I am running filebeat.exe, the Logstash conf file, and Elasticsearch.

I am not getting any errors as such when running the Logstash configuration file...

Console output when running the Logstash conf:

C:\logstash-5.0.0\logstash-5.0.0\bin>logstash -f log-h.conf
JAVA_OPTS was set to [ -Xmx1g -XX:+UseParNewGC -XX:+UseConcMarkSweepGC -XX:+CMSParallelRemarkEnabled -XX:SurvivorRatio=8 -XX:MaxTenuringThreshold=1 -XX:CMSInitiatingOccupancyFraction=75 -XX:+UseCMSInitiatingOccupancyOnly -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath="$LS_HOME/heapdump.hprof"]. Logstash will trust these options, and not set any defaults that it might usually set
Sending Logstash logs to C:/logstash-5.0.0/logstash-5.0.0/logs which is now configured via log4j2.properties.
[2016-11-08T17:38:02,452][INFO ][logstash.inputs.beats    ] Beats inputs: Starting input listener {:address=>"0.0.0.0:5044"}
[2016-11-08T17:38:02,728][INFO ][org.logstash.beats.Server] Starting server on port: 5044
[2016-11-08T17:38:03,082][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>["http://localhost:9200"]}}
[2016-11-08T17:38:03,089][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2016-11-08T17:38:03,324][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>50001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"_all"=>{"enabled"=>true, "norms"=>false}, "dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword"}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date", "include_in_all"=>false}, "@version"=>{"type"=>"keyword", "include_in_all"=>false}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2016-11-08T17:38:03,359][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["localhost:9200"]}
[2016-11-08T17:38:03,596][INFO ][logstash.pipeline        ] Starting pipeline {"id"=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>500}
[2016-11-08T17:38:03,612][INFO ][logstash.pipeline        ] Pipeline main started
[2016-11-08T17:38:03,783][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}

It is not creating an index in ES, and I am not getting any errors, as seen in the console output above.

Can anyone help here? Thanks in advance.

There are some indentation issues with your Filebeat configuration. It should look like this for Filebeat 5.x:

filebeat.prospectors:
- paths:
    - C:/logsa/elast.log
  document_type: my_log

output.logstash:
  hosts: ["localhost:5044"]
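If the prospector still does not pick up the file, the configuration can be sanity-checked before shipping anything. This is only a sketch: it assumes Filebeat 5.x's -configtest flag and an install under C:\Program Files\Filebeat, which may differ on your machine.

PS C:\Program Files\Filebeat> .\filebeat.exe -c filebeat.yml -configtest -e

The -e flag logs to the console instead of the log file, so YAML or path errors are visible immediately.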

There is a Logstash configuration example provided in the Beats documentation that shows how to configure the Elasticsearch output. This will write the data to a filebeat-YYYY.MM.DD index.

input {
  beats {
    port => "5044"
  }   
}   

filter {
  if [type] == "my_log" {
    grok {
      match => { "message" => "\[%{TIMESTAMP_ISO8601:timestamp}\] \[%{DATA:LOGLEVEL}\] \[%{DATA:MESSAGE}\] \[%{GREEDYDATA:message}\] %{GREEDYDATA:message1}"}
    }   
  }   
}   

output {
  elasticsearch {
    hosts => "localhost:9200"
    manage_template => false
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }   
}   
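A quick way to confirm that events actually reached Elasticsearch (assuming it is still listening on localhost:9200) is to list the indices and look for a filebeat-YYYY.MM.DD entry:

curl 'http://localhost:9200/_cat/indices?v'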

When using Logstash you must also manually install the Filebeat index template to Elasticsearch.

For Windows:

PS C:\Program Files\Filebeat> Invoke-WebRequest -Method Put -InFile filebeat.template.json -Uri http://localhost:9200/_template/filebeat?pretty

For Unix:

curl -XPUT 'http://localhost:9200/_template/filebeat' -d@/etc/filebeat/filebeat.template.json
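To verify the template was accepted, you can ask Elasticsearch for it afterwards (same assumption that it is reachable on localhost:9200):

curl 'http://localhost:9200/_template/filebeat?pretty'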
