I'm sending JSON to logstash with a config like so:
filter {
  json {
    source => "event"
    remove_field => [ "event" ]
  }
}
Here is an example JSON object I'm sending:
{
  "@timestamp": "2015-04-07T22:26:37.786Z",
  "type": "event",
  "event": {
    "activityRecord": {
      "id": 68479,
      "completeTime": 1428445597542,
      "data": {
        "2015-03-16": true,
        "2015-03-17": true,
        "2015-03-18": true,
        "2015-03-19": true
      }
    }
  }
}
Because of the arbitrary nature of the activityRecord.data object, I don't want logstash and elasticsearch to index all these date fields. As is, I see activityRecord.data.2015-03-16 as a field to filter on in Kibana.

Is there a way to ignore this sub-tree of data? Or at least delete it after it has already been parsed? I tried remove_field with wildcards and whatnot, but no luck.
Though not entirely intuitive, it is documented that subfield references are made with square brackets, e.g. [field][subfield], so that's what you'll have to use with remove_field:

mutate {
  remove_field => "[event][activityRecord][data]"
}
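In context, that means placing the mutate after the json filter from your question, since the sub-tree only exists once the JSON has been parsed. Something like:

filter {
  json {
    source => "event"
    remove_field => [ "event" ]
  }
  mutate {
    remove_field => "[event][activityRecord][data]"
  }
}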
To delete fields using wildcard matching, you'd have to use a ruby filter.
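For example, here's a minimal sketch of that approach, assuming the pre-5.x ruby filter event API (where event['[field][subfield]'] returns a live reference to the nested hash; on Logstash 5+ you'd use event.get and event.set instead). It grabs the data sub-tree from your example and drops every key shaped like a date:

ruby {
  code => "
    # Grab the nested hash; the path matches the example event above.
    data = event['[event][activityRecord][data]']
    if data.is_a?(Hash)
      # Drop every key that looks like a YYYY-MM-DD date.
      data.delete_if { |key, _| key =~ /^[0-9]{4}-[0-9]{2}-[0-9]{2}$/ }
    end
  "
}

Adjust the regex to whatever pattern your arbitrary keys follow.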