
Logs sent from Logstash haven't been indexed in Elasticsearch

I've written an apache.conf file for Logstash, shown below:

input 
{
    file {
        path => "E:\ferdowsi-data\data\logs\logs"
        start_position => "beginning"
    }
    
}

filter
{
    grok{
        match => {
            "message" => "%{COMBINEDAPACHELOG}"
        }
    }
    mutate{
        convert => { "bytes" => "integer" }
    }
    date {
        match => [ "timestamp", "dd/MMM/YYYY:HH:mm:ss Z" ]
        locale => en
        remove_field => "timestamp"
    }
    geoip {
        source => "clientip"
    }
    useragent {
        source => "agent"
        target => "useragent"
    }
}


output
{
    stdout {
        codec => dots
    }

    elasticsearch {
        hosts => ["http://localhost:9200/"]
    }

}

After setting up Elasticsearch and Kibana, I ran the following command:

bin\logstash.bat -f E:\ferdowsi-data\data\apache.conf

but I got these results in cmd:

Using JAVA_HOME defined java: C:\\Program Files\\Java\\jdk-15.0.2
WARNING, using JAVA_HOME while Logstash distribution comes with a bundled JDK
Sending Logstash logs to E:/ferdowsi-data/logstash-7.15.1-windows-x86_64/logstash-7.15.1/logs which is now configured via log4j2.properties
[2021-11-02T19:34:53,285][INFO ][logstash.runner ] Log4j configuration path used is: E:\\ferdowsi-data\\logstash-7.15.1-windows-x86_64\\logstash-7.15.1\\config\\log4j2.properties
[2021-11-02T19:34:53,307][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"7.15.1", "jruby.version"=>"jruby 9.2.19.0 (2.5.8) 2021-06-15 55810c552b Java HotSpot(TM) 64-Bit Server VM 15.0.2+7-27 on 15.0.2+7-27 +indy +jit [mswin32-x86_64]"}
[2021-11-02T19:34:53,532][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2021-11-02T19:34:59,861][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2021-11-02T19:35:01,963][INFO ][org.reflections.Reflections] Reflections took 254 ms to scan 1 urls, producing 120 keys and 417 values
[2021-11-02T19:35:08,699][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["http://localhost:9200/"]}
[2021-11-02T19:35:09,667][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2021-11-02T19:35:10,034][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2021-11-02T19:35:10,221][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch version determined (7.15.1) {:es_version=>7}
[2021-11-02T19:35:10,228][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the type event field won't be used to determine the document _type {:es_version=>7}
[2021-11-02T19:35:10,425][WARN ][logstash.outputs.elasticsearch][main] Configuration is data stream compliant but due backwards compatibility Logstash 7.x will not assume writing to a data-stream, default behavior will change on Logstash 8.0 (set data_stream => true/false to disable this warning)
[2021-11-02T19:35:10,425][WARN ][logstash.outputs.elasticsearch][main] Configuration is data stream compliant but due backwards compatibility Logstash 7.x will not assume writing to a data-stream, default behavior will change on Logstash 8.0 (set data_stream => true/false to disable this warning)
[2021-11-02T19:35:10,517][INFO ][logstash.outputs.elasticsearch][main] Using a default mapping template {:es_version=>7, :ecs_compatibility=>:disabled}
[2021-11-02T19:35:11,548][WARN ][org.logstash.instrument.metrics.gauge.LazyDelegatingGauge][main] A gauge metric of an unknown type (org.jruby.RubySymbol) has been created for key: status. This may result in invalid serialization. It is recommended to log an issue to the responsible developer/development team.
[2021-11-02T19:35:11,553][WARN ][org.logstash.instrument.metrics.gauge.LazyDelegatingGauge][main] A gauge metric of an unknown type (org.jruby.RubySymbol) has been created for key: status. This may result in invalid serialization. It is recommended to log an issue to the responsible developer/development team.
[2021-11-02T19:35:11,595][WARN ][org.logstash.instrument.metrics.gauge.LazyDelegatingGauge] A gauge metric of an unknown type (org.jruby.RubySymbol) has been created for key: status. This may result in invalid serialization. It is recommended to log an issue to the responsible developer/development team.
[2021-11-02T19:35:16,863][ERROR][logstash.filters.geoip.databasemanager] Connect to geoip.elastic.co:443 [geoip.elastic.co/104.154.207.153] failed: Connect timed out {:cause=>org.apache.http.conn.ConnectTimeoutException: Connect to geoip.elastic.co:443 [geoip.elastic.co/104.154.207.153] failed: Connect timed out}
[2021-11-02T19:35:17,111][INFO ][logstash.filters.geoip ][main] Using geoip database {:path=>"E:/ferdowsi-data/logstash-7.15.1-windows-x86_64/logstash-7.15.1/data/plugins/filters/geoip/CC/GeoLite2-City.mmdb"}
[2021-11-02T19:35:18,035][INFO ][logstash.javapipeline ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>500, "pipeline.sources"=>["E:/ferdowsi-data/data/apache.conf"], :thread=>"#<Thread:0x67262850 run>"}
[2021-11-02T19:35:22,350][INFO ][logstash.javapipeline ][main] Pipeline Java execution initialization time {"seconds"=>4.3}
[2021-11-02T19:35:22,683][INFO ][logstash.inputs.file ][main] No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"E:/ferdowsi-data/logstash-7.15.1-windows-x86_64/logstash-7.15.1/data/plugins/inputs/file/.sincedb_16641d4dcb06fea1584da5dab1d50d1b", :path=>["E:\\ferdowsi-data\\data\\logs\\logs"]}
[2021-11-02T19:35:22,858][INFO ][logstash.javapipeline ][main] Pipeline started {"pipeline.id"=>"main"}
[2021-11-02T19:35:23,080][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2021-11-02T19:35:23,090][INFO ][filewatch.observingtail ][main][66b892a5d2b1ce6637fc3b1583e69a7f7f213fce29afe3cbec145c6ad96b24cf] START, creating Discoverer, Watch with file and sincedb collections

Unfortunately, nothing has been indexed in Elasticsearch. How can I fix it? My configuration is:

Windows 10
Elasticsearch 7.15.1
Logstash 7.15.1
Kibana 7.15.1

Do not use backslashes in the path option; they are treated as escape characters, so Logstash is waiting for a file named "E:ferdowsi-datadatalogslogs" to be created. Use forward slashes or double backslashes instead.
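
For example, a corrected input block could look like the sketch below. It only changes the path separators; the path itself is copied from the question and is assumed to be where the log file actually lives:

input {
    file {
        # forward slashes are resolved correctly by the file input on Windows
        path => "E:/ferdowsi-data/data/logs/logs"
        # alternative suggested above: escape each backslash instead
        # path => "E:\\ferdowsi-data\\data\\logs\\logs"
        start_position => "beginning"
    }
}

With either form the file input can find the file, read it from the beginning, and the events should start reaching the elasticsearch output.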
