
Logstash logs are read but are not pushed to elasticsearch

I have the following logstash configuration:

input {
  file {
    codec => "json_lines"
    path => ["/etc/logstash/input.log"]
    sincedb_path => "/etc/logstash/dbfile"
    start_position => "beginning"
    ignore_older => "0"
  }
}
output {
   elasticsearch {
      hosts => ["192.168.169.46:9200"]
   }
   stdout {
      codec => rubydebug
   }
}

The /etc/logstash/input.log file is populated with logs from a running Java application. The logs are in the following JSON format (they are written as lines delimited by \n characters):

{
"exception": {
    "exception_class": "java.lang.RuntimeException",
    "exception_message": "Test runtime exception stack: 0",
    "stacktrace": "java.lang.RuntimeException: Test runtime exception stack: 0"
},
"@version": 1,
"source_host": "WS-169-046",
"message": "Test runtime exception stack: 0",
"thread_name": "parallel-1",
"@timestamp": "2019-12-02T16:30:14.084+02:00",
"level": "ERROR",
"logger_name": "nl.hnf.logs.aggregator.demo.LoggingTest",
"aplication-name": "demo-log-aggregation"
}
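As a quick sanity check, every line the application writes must parse as a standalone JSON object for any line-oriented codec to accept it. A minimal Python sketch (the sample line is copied from the log format above, collapsed onto one line as the application actually writes it):

```python
import json

# One log event exactly as the Java app writes it: a single JSON
# object on one line, terminated by "\n".
sample_line = (
    '{"@version":1,"source_host":"WS-169-046",'
    '"message":"Test runtime exception stack: 0",'
    '"thread_name":"parallel-1",'
    '"@timestamp":"2019-12-02T16:30:14.084+02:00",'
    '"level":"ERROR",'
    '"logger_name":"nl.hnf.logs.aggregator.demo.LoggingTest",'
    '"aplication-name":"demo-log-aggregation"}\n'
)

# Each line must parse on its own; json.loads raises on malformed input.
event = json.loads(sample_line)
print(event["level"], event["logger_name"])
```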

I also updated the default logstash template via the Elasticsearch API (by PUTting the request body below to http://192.168.169.46:9200/_template/logstash?pretty):

{
"index_patterns": "logstash-*",
"version": 60002,
"settings": {
    "index.refresh_interval": "5s",
    "number_of_shards": 1
},
"mappings": {
    "dynamic_templates": [
        {
            "message_field": {
                "path_match": "message",
                "match_mapping_type": "string",
                "mapping": {
                    "type": "text",
                    "norms": false
                }
            }
        },
        {
            "string_fields": {
                "match": "*",
                "match_mapping_type": "string",
                "mapping": {
                    "type": "text",
                    "norms": false,
                    "fields": {
                        "keyword": {
                            "type": "keyword",
                            "ignore_above": 256
                        }
                    }
                }
            }
        }
    ],
    "properties": {
        "@timestamp": {
            "type": "date"
        },
        "@version": {
            "type": "keyword"
        },
        "source_host": {
            "type": "keyword"
        },
        "message": {
            "type": "text"
        },
        "thread_name": {
            "type": "text"
        },
        "level": {
            "type": "keyword"
        },
        "logger_name": {
            "type": "keyword"
        },
        "aplication_name": {
            "type": "keyword"
        },
        "exception": {
            "dynamic": true,
            "properties": {
                "exception_class": {
                    "type": "text"
                },
                "exception_message": {
                    "type": "text"
                },
                "stacktrace": {
                    "type": "text"
                }
            }
        }
    }
    }
}
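Before PUTting a template, it can be worth validating the body locally; a missing brace makes Elasticsearch reject the whole request. A small Python sketch (the template here is an abridged copy of the body above, kept short for illustration):

```python
import json

# Abridged copy of the template body above; json.loads raises a
# ValueError on malformed JSON, catching problems before the PUT.
template = """
{
  "index_patterns": "logstash-*",
  "version": 60002,
  "settings": {"index.refresh_interval": "5s", "number_of_shards": 1},
  "mappings": {
    "properties": {
      "@timestamp": {"type": "date"},
      "level": {"type": "keyword"}
    }
  }
}
"""

body = json.loads(template)
print(body["index_patterns"], body["mappings"]["properties"]["level"]["type"])
```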

Elasticsearch responds with "acknowledged": true, and I can see through the API that the template is updated. Now, starting logstash with the debug log level, I can see the input log being read but not sent to elasticsearch; the index is created but always stays empty (0 documents):

[2019-12-03T09:30:51,655][DEBUG][logstash.inputs.file     ][custom] Received line {:path=>"/etc/logstash/input.log", :text=>"{\"@version\":1,\"source_host\":\"ubuntu\",\"message\":\"Generating some logs: 65778 - 2019-12-03T09:30:50.775\",\"thread_name\":\"parallel-1\",\"@timestamp\":\"2019-12-03T09:30:50.775+00:00\",\"level\":\"INFO\",\"logger_name\":\"nl.hnf.logs.aggregator.demo.LoggingTest\",\"aplication-name\":\"demo-log-aggregation\"}"}
[2019-12-03T09:30:51,656][DEBUG][filewatch.sincedbcollection][custom] writing sincedb (delta since last write = 1575365451)

The elasticsearch logs are also at debug level, but I don't see any errors or anything that would hint at the source of the problem.

Do you have any ideas or suggestions as to why the logs are not being pushed to elasticsearch?

In filebeat, setting ignore_older to zero means "do not check the age of the file". For the logstash file input it means "ignore files more than zero seconds old", which is effectively "ignore everything". Remove it. If that does not help, increase the log level to trace and look at what the filewatch module says about the files it is watching.
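Following that advice, the input block would shrink to something like this (a sketch keeping the original paths, with only ignore_older dropped):

input {
  file {
    codec => "json_lines"
    path => ["/etc/logstash/input.log"]
    sincedb_path => "/etc/logstash/dbfile"
    start_position => "beginning"
  }
}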

Fixed the issue by using the json codec instead of json_lines and removing start_position, ignore_older, and sincedb_path:

input {
  file {
    codec => "json"
    path => ["/etc/logstash/input.log"]
  }
}
output {
   elasticsearch {
      hosts => ["192.168.169.46:9200"]
   }
   stdout {
      codec => rubydebug
   }
}

Also, the json_lines codec seems to be incompatible with the file input (the \n delimiter handling does not work as expected).
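A rough model of why that happens (a Python sketch, not Logstash's actual codec code): the file input delivers each line to the codec with the trailing newline already stripped, so a codec that buffers until it sees a "\n" delimiter never emits an event, while a codec that parses each chunk directly succeeds.

```python
import json

def json_codec(chunk: str):
    """Parse the chunk as one JSON document (roughly what codec => "json" does)."""
    return [json.loads(chunk)]

def json_lines_codec(buffer: str):
    """Emit one event per "\\n"-terminated line; keep the rest buffered."""
    *complete, remainder = buffer.split("\n")
    return [json.loads(line) for line in complete], remainder

# What the file input hands over: a full line, newline already stripped.
line_from_file_input = '{"level":"INFO","message":"hello"}'

events = json_codec(line_from_file_input)                 # parsed immediately
pending, remainder = json_lines_codec(line_from_file_input)  # stuck in buffer
print(len(events), len(pending))
```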
