
Why doesn't logstash produce logs?

I read the following article to learn about the Logstash part of the ELK stack I set up: https://tpodolak.com/blog/tag/kibana/


input {
    file {
        path => ["C:/logs/*.log"]
        start_position => beginning
        ignore_older => 0

    }
}
filter {
    grok {
        match => { "message" => "TimeStamp=%{TIMESTAMP_ISO8601:logdate} CorrelationId=%{UUID:correlationId} Level=%{LOGLEVEL:logLevel} Message=%{GREEDYDATA:logMessage}" }
    }
    # set the event timestamp from the log
    # https://www.elastic.co/guide/en/logstash/current/plugins-filters-date.html
    date {
         match => [ "logdate", "yyyy-MM-dd HH:mm:ss.SSSS" ]
        target => "@timestamp"
    }
}
output {
    elasticsearch {
        hosts => "localhost:9200"
    }
    stdout {}
}

I added the input path C:/logs/*.log in logstash.conf. I have a test.log file there that is not empty; it contains:


TimeStamp=2016-07-20 21:22:46.0079 CorrelationId=dc665fe7-9734-456a-92ba-3e1b522f5fd4 Level=INFO Message=About
TimeStamp=2016-07-20 21:22:46.0079 CorrelationId=dc665fe7-9734-456a-92ba-3e1b522f5fd4 Level=INFO Message=About
TimeStamp=2016-11-01 00:13:01.1669 CorrelationId=77530786-8e6b-45c2-bbc1-31837d911c14 Level=INFO Message=Request completed with status code: 200

According to the article above, I should be able to see my logs in Elasticsearch (the article at https://tpodolak.com/blog/tag/kibana/ shows an example result). However, when I open http://localhost:9200/_cat/indices?v in a browser, I cannot see any Logstash index in Elasticsearch. Where does Elasticsearch store the Logstash logs? logstash.conf looks OK to me, but I get no satisfying result. I want to get all the logs under C:/logs/*.log into Elasticsearch through Logstash. What is wrong with my logstash.conf?
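For reference: if no index option is set on the elasticsearch output, Logstash 5.x writes events to a date-based index named logstash-YYYY.MM.DD, so that is the name to look for in the _cat/indices listing. A minimal sketch, with a purely illustrative index name my-app-logs, that names the index explicitly so it is easier to spot:

output {
    elasticsearch {
        hosts => ["localhost:9200"]
        # hypothetical index name, only to make the index easy to find in _cat/indices;
        # without this option the plugin falls back to its default date-based logstash-* index
        index => "my-app-logs"
    }
    stdout {}
}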

My Logstash logs (C:\monitoring\logstash\logs\logs.log):


[2017-03-13T10:47:17,849][INFO ][logstash.runner          ] Using config.test_and_exit mode. Config Validation Result: OK. Exiting Logstash
[2017-03-13T11:46:35,123][INFO ][logstash.runner          ] Using config.test_and_exit mode. Config Validation Result: OK. Exiting Logstash
[2017-03-13T11:48:20,023][INFO ][logstash.runner          ] Using config.test_and_exit mode. Config Validation Result: OK. Exiting Logstash
[2017-03-13T11:55:10,808][INFO ][logstash.pipeline        ] Starting pipeline {"id"=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>500}
[2017-03-13T11:55:10,871][INFO ][logstash.pipeline        ] Pipeline main started
[2017-03-13T11:55:11,316][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2017-03-13T12:00:52,188][INFO ][logstash.runner          ] Using config.test_and_exit mode. Config Validation Result: OK. Exiting Logstash
[2017-03-13T12:02:48,309][INFO ][logstash.runner          ] Using config.test_and_exit mode. Config Validation Result: OK. Exiting Logstash
[2017-03-13T12:06:33,270][ERROR][logstash.agent           ] Cannot load an invalid configuration {:reason=>"Expected one of #, => at line 1, column 52 (byte 52) after output { elasticsearch { hosts "}
[2017-03-13T12:08:51,636][ERROR][logstash.agent           ] Cannot load an invalid configuration {:reason=>"Expected one of #, => at line 1, column 22 (byte 22) after input {  file { path "}
[2017-03-13T12:09:48,114][ERROR][logstash.agent           ] Cannot load an invalid configuration {:reason=>"Expected one of #, => at line 1, column 22 (byte 22) after input {  file { path "}
[2017-03-13T12:11:40,200][ERROR][logstash.agent           ] Cannot load an invalid configuration {:reason=>"Expected one of #, => at line 1, column 22 (byte 22) after input {  file { path "}
[2017-03-13T12:19:17,622][INFO ][logstash.runner          ] Using config.test_and_exit mode. Config Validation Result: OK. Exiting Logstash


First of all, you have a few configuration issues:

  • The hosts option of the elasticsearch output should be an array (e.g. hosts => ["myHost:myPort"]), see the documentation
  • File paths with wildcards on Windows should use forward slashes, not backslashes (see this issue)
  • Your date filter should be looking at the "TimeStamp" field rather than "logdate" (given your log file)
  • One setting I needed for convenience is sincedb_path, because Logstash will not try to parse a file it has already parsed (it checks a .sincedb file, located by default at $HOME/.sincedb, to see whether a file was already processed). When you test with the same log file, you need to delete that file between runs.

That is why, after some research (quite a lot actually, not being a Windows user), I can propose a working configuration:

input {
    file {
        path => "C:/some/log/dir/*"
        start_position => beginning
        ignore_older => 0
        sincedb_path => "NIL" #easier to remove from the current directory, the file will be NIL.sincedb

    }
}
filter {
    grok {
        match => { "message" => "TimeStamp=%{TIMESTAMP_ISO8601:logdate} CorrelationId=%{UUID:correlationId} Level=%{LOGLEVEL:logLevel} Message=%{GREEDYDATA:logMessage}" }
    }
    # set the event timestamp from the log
    # https://www.elastic.co/guide/en/logstash/current/plugins-filters-date.html
    date {
         match => [ "TimeStamp", "yyyy-MM-dd HH:mm:ss.SSS" ]
        target => "@timestamp"
    }
}
output {
    elasticsearch {
        hosts => ["localhost:9200"]
    }
    stdout {}
}
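To take the file input and the sincedb bookkeeping out of the picture while debugging, here is a minimal separate sketch (not part of the fix above) that pushes a log line through the same grok filter via stdin and prints the parsed event with the rubydebug codec:

input {
    # read events typed or pasted into the console, so no sincedb is involved
    stdin {}
}
filter {
    grok {
        # same pattern as above; the timestamp is captured into the "logdate" field
        match => { "message" => "TimeStamp=%{TIMESTAMP_ISO8601:logdate} CorrelationId=%{UUID:correlationId} Level=%{LOGLEVEL:logLevel} Message=%{GREEDYDATA:logMessage}" }
    }
}
output {
    # print every parsed event with all of its extracted fields to the console
    stdout { codec => rubydebug }
}

Pasting one of the sample log lines from the question into the console should print an event containing the correlationId, logLevel and logMessage fields; if the event carries a _grokparsefailure tag instead, the grok pattern is the problem rather than the file input.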
