Logstash JSON filter with mutate to add new fields
I am trying to extract data from the log entry below using the Logstash config file and filters shown, but the fields are not populated from the JSON; instead the events show the raw grok pattern references.
Log:
13:41:37.3921 Info {"message":"CTS execution started","level":"Information","logType":"Default","timeStamp":"2019-12-03T13:41:37.3861868-05:00","fingerprint":"29dad848-4ff7-4d2d-905b-460637f3d534","windowsIdentity":"home","machineName":"L02174400","processName":"CTS","processVersion":"1.0.5","jobId":"5bbc492c-bcb7-451f-b6ac-87d9784ad00d","robotName":"home","machineId":0,"fileName":"SendBackReasons(Autosaved)"}
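For reference, the grok pattern in the config below is meant to split such a line into three pieces: a timestamp-like id, a log level, and the trailing JSON payload, which is then decoded. A rough Python sketch of that same parsing (the regex only approximates grok's LOGLEVEL pattern, and the capture json-data is renamed json_data because Python group names cannot contain hyphens):

```python
import json
import re

# Approximation of the grok pattern from the config:
#   (?<id>[\d\:\.]+)\s%{LOGLEVEL:level} %{GREEDYDATA:json-data}
LINE_RE = re.compile(
    r"(?P<id>[\d:.]+)\s"
    r"(?P<level>Info|Warn|Error|Debug|Trace)\s"
    r"(?P<json_data>.*)"
)

line = ('13:41:37.3921 Info {"message":"CTS execution started",'
        '"level":"Information","processName":"CTS"}')

m = LINE_RE.match(line)
fields = m.groupdict()
# The JSON payload is parsed separately, like the json filter does
parsed_json = json.loads(fields["json_data"])
print(fields["id"], fields["level"], parsed_json["message"])
```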
Configuration:
input {
  file {
    type => "executionlog"
    path => ["c:/users/xyj/appdata/local/uipath/logs/*[^W]_execution.log"]
    start_position => "beginning"
    sincedb_path => "c:/dbfile"
  }
}
filter {
  grok {
    match => { "message" => ["(?<id>[\d\:\.]+)\s%{LOGLEVEL:level} %{GREEDYDATA:json-data}"] }
  }
  json {
    source => "json_data"
    target => "parsed_json"
  }
  mutate {
    add_field => {
      "Info1" => "%{[json_data][message]}" # I tried parsed_json here as well
      "level2" => "%{[json_data][level]}"
    }
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "uipathexecutionlog"
  }
  stdout {}
}
Kibana output: (screenshot)
Try the code below. Your grok pattern captures the JSON payload into a field named json-data (with a hyphen), but the json filter reads json_data, so it never parses anything; the mutate block then has to reference the parsed target, parsed_json, not the raw source field.
filter {
  grok {
    match => { "message" => ["(?<id>[\d\:\.]+)\s%{LOGLEVEL:level} %{GREEDYDATA:json-data}"] }
  }
  json {
    source => "json-data"  # must match the grok capture name exactly
    target => "parsed_json"
  }
  mutate {
    add_field => {
      # reference the json filter's target, not the raw source field
      "Info1" => "%{[parsed_json][message]}"
      "level2" => "%{[parsed_json][level]}"
    }
  }
}
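As a variant (a sketch, not tested against your pipeline): if you omit the json filter's target, the decoded keys are written to the top level of the event, so no mutate copy is needed at all. Note that in that case the decoded message and level keys overwrite the original event's message field and grok's level field:

```conf
filter {
  grok {
    match => { "message" => ["(?<id>[\d\:\.]+)\s%{LOGLEVEL:level} %{GREEDYDATA:json-data}"] }
  }
  json {
    source => "json-data"
    # no target: keys such as message, level, timeStamp, processName
    # land at the event root (overwriting any fields of the same name)
  }
}
```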