
logstash fails to create an index in ES

I am trying to parse a log file using Logstash. Filebeat reads sample logs from a directory and ships them to Logstash (Logstash is configured as the output in filebeat.yml); the Logstash configuration file parses each log line and indexes the result into Elasticsearch.

Filebeat.yml

#=========================== Filebeat prospectors =============================

filebeat.prospectors:

  #input_type: log
  #input_type: log
  document_type: my_log
paths:
  - C:\logsa\elast.log

    #----------------------------- Logstash output --------------------------------
    output.logstash:
      # The Logstash hosts
      hosts: ["localhost:5044"]



elast.log : (I am trying to parse this one line of log in the log file) 

    [2016-11-03 07:30:05,987] [INFO] [o.e.p.PluginsService     ] [hTYKFFt] initializing...

Logstash Configuration file :

input {
  beats {
    port => "5044"
  }
}
filter {
  if [type] == "my_log" {
    grok {
      match => { "message" => "\[%{TIMESTAMP_ISO8601:timestamp}\] \[%{DATA:LOGLEVEL}\] \[%{DATA:MESSAGE}\] \[%{GREEDYDATA:message}\] %{GREEDYDATA:message1}"}
    }
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}

I am running filebeat.exe, Logstash with this configuration file, and Elasticsearch.

I am not getting any errors when running Logstash with this configuration file...

Console when running logstash conf:

C:\logstash-5.0.0\logstash-5.0.0\bin>logstash -f log-h.conf
JAVA_OPTS was set to [ -Xmx1g -XX:+UseParNewGC -XX:+UseConcMarkSweepGC -XX:+CMSParallelRemarkEnabled -XX:SurvivorRatio=8 -XX:MaxTenuringThreshold=1 -XX:CMSInitiatingOccupancyFraction=75 -XX:+UseCMSInitiatingOccupancyOnly -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath="$LS_HOME/heapdump.hprof"]. Logstash will trust these options, and not set any defaults that it might usually set
Sending Logstash logs to C:/logstash-5.0.0/logstash-5.0.0/logs which is now configured via log4j2.properties.
[2016-11-08T17:38:02,452][INFO ][logstash.inputs.beats    ] Beats inputs: Starting input listener {:address=>"0.0.0.0:5044"}
[2016-11-08T17:38:02,728][INFO ][org.logstash.beats.Server] Starting server on port: 5044
[2016-11-08T17:38:03,082][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>["http://localhost:9200"]}}
[2016-11-08T17:38:03,089][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2016-11-08T17:38:03,324][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>50001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"_all"=>{"enabled"=>true, "norms"=>false}, "dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword"}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date", "include_in_all"=>false}, "@version"=>{"type"=>"keyword", "include_in_all"=>false}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2016-11-08T17:38:03,359][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["localhost:9200"]}
[2016-11-08T17:38:03,596][INFO ][logstash.pipeline        ] Starting pipeline {"id"=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>500}
[2016-11-08T17:38:03,612][INFO ][logstash.pipeline        ] Pipeline main started
[2016-11-08T17:38:03,783][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}

It is not creating an index in ES, and I am not getting any errors, as can be seen in the console output above.

Can anyone help here? Thanks in advance.

There are some indentation issues with your Filebeat configuration. For Filebeat 5.x it should look like this:

filebeat.prospectors:
- paths:
    - C:/logsa/elast.log
  document_type: my_log

output.logstash:
  hosts: ["localhost:5044"]
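Before restarting the shipper, the corrected configuration can be validated. Filebeat 5.x has a `-configtest` flag that parses the file and reports YAML or indentation errors without starting Filebeat (flag name as of the 5.x series; `-e` sends the log output to the console):

```shell
# Run from the Filebeat install directory (PowerShell or cmd)
.\filebeat.exe -c filebeat.yml -configtest -e
```

If the configuration is valid, the command exits without error; otherwise it prints the offending line.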

There is a Logstash configuration example in the Beats documentation that shows how to configure the Elasticsearch output. This will write the data to a filebeat-YYYY.MM.DD index.

input {
  beats {
    port => "5044"
  }   
}   

filter {
  if [type] == "my_log" {
    grok {
      match => { "message" => "\[%{TIMESTAMP_ISO8601:timestamp}\] \[%{DATA:LOGLEVEL}\] \[%{DATA:MESSAGE}\] \[%{GREEDYDATA:message}\] %{GREEDYDATA:message1}"}
    }   
  }   
}   

output {
  elasticsearch {
    hosts => "localhost:9200"
    manage_template => false
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }   
}   
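If no filebeat-* index appears, one way to check whether events are reaching Logstash at all is to temporarily add a stdout output next to the elasticsearch output:

```
output {
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => "localhost:9200"
    manage_template => false
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
}
```

If nothing is printed when new lines are appended to the log file, the problem is on the Filebeat side (for example, the registry file already marks the log as read, so only lines appended after startup are shipped) rather than in the Elasticsearch output.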

When using Logstash, you must also manually install the Filebeat index template into Elasticsearch.

For Windows:

PS C:\Program Files\Filebeat> Invoke-WebRequest -Method Put -InFile filebeat.template.json -Uri http://localhost:9200/_template/filebeat?pretty

For Unix:

curl -XPUT 'http://localhost:9200/_template/filebeat' -d@/etc/filebeat/filebeat.template.json
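Once the template is installed and Filebeat has shipped at least one event, both can be verified against Elasticsearch (assuming it is listening on localhost:9200):

```shell
# Should return the filebeat template body rather than an empty {}
curl 'http://localhost:9200/_template/filebeat?pretty'

# Lists any daily Filebeat indices created by the Logstash output
curl 'http://localhost:9200/_cat/indices/filebeat-*?v'
```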
