
logs are not getting pushed to elasticsearch from logstash

logstash-config.conf

input {
  file {
    path => ["D:/project/log/samplex.log"]
    sincedb_path => "D:/Project/logstash-7.5.0/data/plugins/inputs/file/null"
    start_position => "beginning"
  }
}
output {
  elasticsearch {
    hosts => ["192.168.1.8:9200"]
    index => "db"
    #user => "elastic"
    #password => "changeme"
  }
}

Console log

D:\Project\logstash-7.5.0\bin>logstash -f logstash-sample.conf
Thread.exclusive is deprecated, use Thread::Mutex
Sending Logstash logs to D:/Project/logstash-7.5.0/logs which is now configured via log4j2.properties
[2019-12-16T23:26:28,465][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2019-12-16T23:26:28,580][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.5.0"}
[2019-12-16T23:26:30,143][INFO ][org.reflections.Reflections] Reflections took 32 ms to scan 1 urls, producing 20 keys and 40 values
[2019-12-16T23:26:31,024][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://192.168.1.8:9200/]}}
[2019-12-16T23:26:31,201][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"http://192.168.1.8:9200/"}
[2019-12-16T23:26:31,256][INFO ][logstash.outputs.elasticsearch][main] ES Output version determined {:es_version=>7}
[2019-12-16T23:26:31,264][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[2019-12-16T23:26:31,333][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//192.168.1.8:9200"]}
[2019-12-16T23:26:31,404][INFO ][logstash.outputs.elasticsearch][main] Using default mapping template
[2019-12-16T23:26:31,439][WARN ][org.logstash.instrument.metrics.gauge.LazyDelegatingGauge][main] A gauge metric of an unknown type (org.jruby.specialized.RubyArrayOneObject) has been create for key: cluster_uuids. This may result in invalid serialization. It is recommended to log an issue to the responsible developer/development team.
[2019-12-16T23:26:31,449][INFO ][logstash.javapipeline    ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>1000, "pipeline.sources"=>["D:/Project/logstash-7.5.0/bin/logstash-sample.conf"], :thread=>"#"}
[2019-12-16T23:26:31,506][INFO ][logstash.outputs.elasticsearch][main] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[2019-12-16T23:26:32,041][INFO ][logstash.javapipeline    ][main] Pipeline started {"pipeline.id"=>"main"}
[2019-12-16T23:26:32,114][INFO ][filewatch.observingtail  ][main] START, creating Discoverer, Watch with file and sincedb collections
[2019-12-16T23:26:32,118][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2019-12-16T23:26:32,502][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}

Logstash doesn't read the log file mentioned above and stays in an idle state.

samplex.log

[2019-12-16T22:30:59,310][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://192.168.1.8:9200/]}}
[2019-12-16T22:30:59,472][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"http://192.168.1.8:9200/"}
[2019-12-16T22:30:59,558][INFO ][logstash.outputs.elasticsearch][main] ES Output version determined {:es_version=>7}
[2019-12-16T22:30:59,565][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[2019-12-16T22:30:59,653][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//192.168.1.8:9200"]}
[2019-12-16T22:30:59,724][INFO ][logstash.outputs.elasticsearch][main] Using default mapping template
dsdasd

On Windows, I think the file you saved is named samplex.log, but internally it is treated as a text file, so its actual name would be something like "samplex.log.txt".
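
To verify whether a hidden ".txt" extension is the culprit, you can list the file from a command prompt, since Windows Explorer hides known extensions by default. A quick check, assuming the file sits in D:\project\log as in your config:

D:\project\log>dir samplex*

If the listing shows "samplex.log.txt", the file input's path needs to point at that full name.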

So please do try:

input {
  file {
    #type => "log"
    path => "D:/Downloads/logstash-6.7.0/bin/samplex.log.txt"
    sincedb_path => "D:/Downloads/logstash-6.7.0/data/plugins/inputs/file/null"
    start_position => "beginning"
    #ignore_older => 0
  }
}



output {
  stdout { codec => "rubydebug" }
  elasticsearch {
    hosts => "http://xx-xx-xx-xx:9200"
    index => "db"
  }
}
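
With the stdout output in place, every event the file input produces is echoed to the console in rubydebug form, so you can tell whether the problem is on the input side or the Elasticsearch side. Once events show up there, a quick way to confirm they reached Elasticsearch is the _count API, assuming the host and index from your original config:

curl "http://192.168.1.8:9200/db/_count?pretty"

A non-zero count means the documents were indexed into db.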

If the issue is still seen, try deleting the null file at sincedb_path and try again.
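
Alternatively, on Windows you can point sincedb_path at the NUL device so Logstash never persists read positions at all, and start_position => "beginning" takes effect on every run. A minimal sketch of the input block with that change, reusing the paths from the original config:

input {
  file {
    path => "D:/project/log/samplex.log"
    # NUL is the Windows null device: no sincedb file is written, so nothing survives restarts
    sincedb_path => "NUL"
    # start_position is only honored when no sincedb entry exists for the file
    start_position => "beginning"
  }
}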

Please let me know if the issue gets resolved with this. Hope this helps!
