Logstash grok filter - field value duplicated
My Logstash filter configuration looks like this:
filter {
  grok {
    patterns_dir => ["/usr/share/logstash/pipeline/patterns/"]
    match => {
      "[message]" => "%{TIMESTAMP_ISO8601:timestamp} %{THREAD:thread} %{LOGLEVEL:level} %{LOGGER:logger} %{CONTEXT:context} - %{GREEDYDATA:message}"
    }
  }
  mutate {
    rename => { "[fields][index]" => "application" }
    rename => { "[host][name]" => "instance" }
    remove_field => ["@version","agent.ephemeral_id","agent","ecs","fields","input","tags"]
  }
}
The Grok debugger suggests everything is fine; for this log line:
2020-10-28 05:14:41,282 [Worker-5] DEBUG Amount - calculate operation: [1], useCurrencyCodeOfPosition: [false]
I get the following output:
{
  "level": "DEBUG",
  "logger": "Amount",
  "context": "",
  "thread": "Worker-5",
  "message": "calculate operation: [1], useCurrencyCodeOfPosition: [false]",
  "timestamp": "2020-10-28 05:14:41,282"
}
The custom patterns are defined as follows:
THREAD \[(?<thread>[^\]]*)\]
LOGGER (?<logger>[^ ]*)
CONTEXT (?<context>[^-]*)
However, every value produced by the grok filter is duplicated, as in this example:
"logger" => [
  [0] "Amount",
  [1] "Amount"
],
"thread" => [
  [0] "[Worker-5]",
  [1] "Worker-5"
]
What is wrong here? I just can't figure it out. This is my first filter :). I'm using Logstash 7.9.2 (dockerized).
I think the problem is with the custom patterns in your filter: each one already contains a named capture group (e.g. (?<thread>...)), and the match then assigns a second name via %{THREAD:thread}, so every value is captured twice - that is also why one copy of thread still has the surrounding brackets. You can achieve what you want using only the out-of-the-box patterns, like this:
filter {
  grok {
    match => {
      "message" => "%{TIMESTAMP_ISO8601:timestamp}%{SPACE}\[%{DATA:thread}\]%{SPACE}%{LOGLEVEL:level}%{SPACE}%{NOTSPACE:logger} %{DATA:context}-%{SPACE}%{GREEDYDATA:message}"
    }
    overwrite => [ "message" ]
  }
  mutate {
    rename => { "[fields][index]" => "application" }
    rename => { "[host][name]" => "instance" }
    remove_field => ["@version","agent.ephemeral_id","agent","ecs","fields","input","tags"]
  }
}
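Alternatively, if you want to keep the custom patterns, the duplication should disappear once the pattern definitions no longer contain named capture groups - the field name then comes only from the %{PATTERN:name} syntax in the match. A sketch of how the pattern file and match could look (untested; the filename and exact regexes are assumptions based on your originals):

```
# pattern file under patterns_dir (hypothetical name: custom_patterns)
# no (?<name>...) groups - grok names the captures itself
THREAD [^\]]*
LOGGER [^ ]*
CONTEXT [^-]*
```

The match would then wrap THREAD in the literal brackets, e.g. `\[%{THREAD:thread}\]`, so thread is captured once and without brackets.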
See this link for the default grok patterns. If you need to do time-series analysis on these events, I would suggest overwriting @timestamp with the value of timestamp, or at least applying a date filter to timestamp.
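For example, a date filter matching the timestamp format in your sample line (comma-separated milliseconds) could look like this - a sketch, not tested against your pipeline:

```
date {
  # "yyyy-MM-dd HH:mm:ss,SSS" matches "2020-10-28 05:14:41,282"
  match  => ["timestamp", "yyyy-MM-dd HH:mm:ss,SSS"]
  target => "@timestamp"   # default target, shown explicitly
}
```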
If you expect to capture multi-line stack-trace errors, consider using the multiline codec on the input plugin.
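A sketch of a multiline codec on a file input, assuming every new log event starts with an ISO8601 timestamp (the path is hypothetical):

```
input {
  file {
    path => "/var/log/app/app.log"   # hypothetical path
    codec => multiline {
      # lines NOT starting with a timestamp belong to the previous event
      pattern => "^%{TIMESTAMP_ISO8601}"
      negate  => true
      what    => "previous"
    }
  }
}
```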