
Logstash output not showing up in Kibana

I just started learning Elasticsearch and am trying to dump IIS logs into ES via Logstash to see how they look in Kibana. I have set up all three agents successfully and they run without errors. But when I run Logstash on my stored log files, the logs don't show up in Kibana. (I am using ES 5.0, which doesn't have the 'head' plugin.)

This is the output I see from the Logstash command:

Sending Logstash logs to C:/elasticsearch-5.0.0/logstash-5.0.0-rc1/logs which is now configured via log4j2.properties.
06:28:26.067 [[main]-pipeline-manager] INFO  logstash.outputs.elasticsearch - Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>["http://localhost:9200"]}}
06:28:26.081 [[main]-pipeline-manager] INFO  logstash.outputs.elasticsearch - Using mapping template from {:path=>nil}
06:28:26.501 [[main]-pipeline-manager] INFO  logstash.outputs.elasticsearch - Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>50001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"_all"=>{"enabled"=>true, "norms"=>false}, "dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword"}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date", "include_in_all"=>false}, "@version"=>{"type"=>"keyword", "include_in_all"=>false}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
06:28:26.573 [[main]-pipeline-manager] INFO  logstash.outputs.elasticsearch - New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["localhost:9200"]}
06:28:26.717 [[main]-pipeline-manager] INFO  logstash.pipeline - Starting pipeline {"id"=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>500}
06:28:26.736 [[main]-pipeline-manager] INFO  logstash.pipeline - Pipeline main started
06:28:26.857 [Api Webserver] INFO  logstash.agent - Successfully started Logstash API endpoint {:port=>9600}

But Kibana doesn't show any indexes. I am a newbie here and am not sure what's going on internally. Could you please help me understand what is wrong here?

enter image description here

Logstash Config file:

input { 
    file {
        type => "iis-w3c"
        path => "C:/Users/ras/Desktop/logs/logs/LogFiles/test/aug1/*.log"
      }
}
filter {  
    grok {
        match => ["message", "%{TIMESTAMP_ISO8601:log_timestamp} %{WORD:serviceName} %{WORD:serverName} %{IP:serverIP} %{WORD:method} %{URIPATH:uriStem} %{NOTSPACE:uriQuery} %{NUMBER:port} %{NOTSPACE:username} %{IPORHOST:clientIP} %{NOTSPACE:protocolVersion} %{NOTSPACE:userAgent} %{NOTSPACE:cookie} %{NOTSPACE:referer} %{NOTSPACE:requestHost} %{NUMBER:response} %{NUMBER:subresponse} %{NUMBER:win32response} %{NUMBER:bytesSent} %{NUMBER:bytesReceived} %{NUMBER:timetaken}"]
      }
    mutate {
        ## Convert some fields from strings to integers
        convert => ["bytesSent", "integer"]
        convert => ["bytesReceived", "integer"]
        convert => ["timetaken", "integer"]

        ## Create a new field for the reverse DNS lookup below
        add_field => { "clientHostname" => "%{clientIP}" }

        ## Finally remove the original log_timestamp field since the event
        ## will have the proper date on it
        remove_field => [ "log_timestamp" ]
    }
}
output {
  elasticsearch { 
        hosts => ["localhost:9200"]
        index => "%{type}-%{+YYYY.MM}"
  }
  stdout { codec => rubydebug }
}

You can check the names of the indices present in Elasticsearch with a plugin like kopf, or with the _cat/indices endpoint, which you can access directly in a browser at [ip of ES]:9200/_cat/indices or via curl: curl [ip of ES]:9200/_cat/indices .
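For example (assuming ES is listening on localhost:9200; adjust the host to your setup), the ?v flag adds column headers, and the endpoint accepts an index pattern to narrow the listing:

```shell
# List all indices with column headers
curl 'localhost:9200/_cat/indices?v'

# List only the indices this Logstash config would create
curl 'localhost:9200/_cat/indices/iis-w3c-*?v'
```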


With Kibana you have to provide a pattern for the index names, which is by default logstash-* , as shown in your screenshot. This default is used in Kibana because, in Logstash's elasticsearch output plugin, the default index pattern is logstash-%{+YYYY.MM.dd} ( cf doc ), which is used to name the indices created by that plugin.

But in your case, the plugin is configured with index => "%{type}-%{+YYYY.MM}" , so the indices created will be of the form iis-w3c-%{+YYYY.MM} . You'll therefore have to replace logstash-* with iis-w3c-* in the Index name or pattern field.
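To illustrate how that sprintf reference resolves, here is a small sketch (plain Python, not Logstash internals; Logstash formats the event's @timestamp in UTC using Joda-style patterns, where YYYY.MM corresponds to year.month):

```python
from datetime import datetime, timezone

def resolve_index(event_type: str, event_time: datetime) -> str:
    # Mimics the "%{type}-%{+YYYY.MM}" index setting:
    # %{type} is the event's type field ("iis-w3c" in the config above),
    # %{+YYYY.MM} is the event's @timestamp formatted as year.month.
    return f"{event_type}-{event_time.strftime('%Y.%m')}"

print(resolve_index("iis-w3c", datetime(2016, 8, 1, tzinfo=timezone.utc)))
# -> iis-w3c-2016.08
```

So an event from August 2016 lands in an index named iis-w3c-2016.08, which the default logstash-* pattern in Kibana will never match.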
