Logstash logs are read but are not pushed to elasticsearch
I have the following Logstash configuration:
input {
  file {
    codec => "json_lines"
    path => ["/etc/logstash/input.log"]
    sincedb_path => "/etc/logstash/dbfile"
    start_position => "beginning"
    ignore_older => "0"
  }
}

output {
  elasticsearch {
    hosts => ["192.168.169.46:9200"]
  }
  stdout {
    codec => rubydebug
  }
}
The /etc/logstash/input.log file is populated with logs from a running Java application. The logs are in the following JSON format (they are written one per line, separated by \n characters):
{
  "exception": {
    "exception_class": "java.lang.RuntimeException",
    "exception_message": "Test runtime exception stack: 0",
    "stacktrace": "java.lang.RuntimeException: Test runtime exception stack: 0"
  },
  "@version": 1,
  "source_host": "WS-169-046",
  "message": "Test runtime exception stack: 0",
  "thread_name": "parallel-1",
  "@timestamp": "2019-12-02T16:30:14.084+02:00",
  "level": "ERROR",
  "logger_name": "nl.hnf.logs.aggregator.demo.LoggingTest",
  "aplication-name": "demo-log-aggregation"
}
I also updated the Logstash default template using the Elasticsearch API (putting the request body at http://192.168.169.46:9200/_template/logstash?pretty):
{
  "index_patterns": "logstash-*",
  "version": 60002,
  "settings": {
    "index.refresh_interval": "5s",
    "number_of_shards": 1
  },
  "mappings": {
    "dynamic_templates": [
      {
        "message_field": {
          "path_match": "message",
          "match_mapping_type": "string",
          "mapping": {
            "type": "text",
            "norms": false
          }
        }
      },
      {
        "string_fields": {
          "match": "*",
          "match_mapping_type": "string",
          "mapping": {
            "type": "text",
            "norms": false,
            "fields": {
              "keyword": {
                "type": "keyword",
                "ignore_above": 256
              }
            }
          }
        }
      }
    ],
    "properties": {
      "@timestamp": {
        "type": "date"
      },
      "@version": {
        "type": "keyword"
      },
      "source_host": {
        "type": "keyword"
      },
      "message": {
        "type": "text"
      },
      "thread_name": {
        "type": "text"
      },
      "level": {
        "type": "keyword"
      },
      "logger_name": {
        "type": "keyword"
      },
      "aplication_name": {
        "type": "keyword"
      },
      "exception": {
        "dynamic": true,
        "properties": {
          "exception_class": {
            "type": "text"
          },
          "exception_message": {
            "type": "text"
          },
          "stacktrace": {
            "type": "text"
          }
        }
      }
    }
  }
}
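For reference, a template body like the one above can be loaded with a plain HTTP PUT. A minimal curl sketch, assuming the body is saved locally as template.json (the file name is just an illustration):

# template.json holds the request body shown above (hypothetical file name)
curl -X PUT "http://192.168.169.46:9200/_template/logstash?pretty" \
     -H 'Content-Type: application/json' \
     -d @template.json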
Elasticsearch responds with "acknowledged": true, and I can see the template being updated through the API. Now, starting Logstash with the debug log level, I can see that the input log is being read but is not sent to Elasticsearch: the index gets created, but it always stays empty (0 documents):
[2019-12-03T09:30:51,655][DEBUG][logstash.inputs.file ][custom] Received line {:path=>"/etc/logstash/input.log", :text=>"{\"@version\":1,\"source_host\":\"ubuntu\",\"message\":\"Generating some logs: 65778 - 2019-12-03T09:30:50.775\",\"thread_name\":\"parallel-1\",\"@timestamp\":\"2019-12-03T09:30:50.775+00:00\",\"level\":\"INFO\",\"logger_name\":\"nl.hnf.logs.aggregator.demo.LoggingTest\",\"aplication-name\":\"demo-log-aggregation\"}"}
[2019-12-03T09:30:51,656][DEBUG][filewatch.sincedbcollection][custom] writing sincedb (delta since last write = 1575365451)
The Elasticsearch logs are also at debug level, but I don't see any errors or anything that would hint at the source of the problem.
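As a side check, the index and its document count can also be queried directly, independently of the Logstash logs; a quick curl sketch against the default logstash-* index pattern used above:

# list logstash indices together with their document counts
curl "http://192.168.169.46:9200/_cat/indices/logstash-*?v"

# or count documents across all logstash indices
curl "http://192.168.169.46:9200/logstash-*/_count?pretty"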
Do you have any ideas or suggestions as to why the logs are not being pushed to Elasticsearch?
In filebeat, setting ignore_older to zero means "do not check the age of the file". For the logstash file input it means "ignore files more than zero seconds old", which is effectively "ignore everything". Remove it. If that does not help, raise the log level to trace and see what the filewatch module says about the files it is watching.
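A minimal sketch of what that suggestion looks like for the input block from the question (only ignore_older is removed, everything else is unchanged; the trace log level can be set on the command line):

input {
  file {
    codec => "json_lines"
    path => ["/etc/logstash/input.log"]
    sincedb_path => "/etc/logstash/dbfile"
    start_position => "beginning"
    # ignore_older removed: with "0" the file input treats every file as too old and skips it
  }
}

# raise the log level when starting Logstash, e.g.:
# bin/logstash -f <your pipeline config> --log.level trace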
Fixed the problem by using the json codec instead of json_lines and by removing start_position, ignore_older and sincedb_path:
input {
  file {
    codec => "json"
    path => ["/etc/logstash/input.log"]
  }
}

output {
  elasticsearch {
    hosts => ["192.168.169.46:9200"]
  }
  stdout {
    codec => rubydebug
  }
}
Also, the json_lines codec does not seem to be compatible with the file input (the \n-delimited lines were not handled as expected). This makes sense, because the file input already emits one event per line, while json_lines expects to do its own line splitting on a streaming input; for line-oriented files the json codec is the right choice.
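One practical note on the fixed configuration: without an explicit sincedb_path, Logstash keeps its read position in a default sincedb file under its data directory, so an already-read file is not re-ingested after a restart. For repeated testing from the beginning of the file, a variant like the following is commonly used (a sketch, not part of the original answer):

input {
  file {
    codec => "json"
    path => ["/etc/logstash/input.log"]
    start_position => "beginning"
    sincedb_path => "/dev/null"   # do not persist read positions, so the file is re-read on every restart
  }
}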