Here is the mapping of my target Elasticsearch index:
{
  "mappings": {
    "_doc": {
      "properties": {
        "start_time": {
          "format": "epoch_millis",
          "type": "date"
        },
        "channel": {
          "type": "keyword"
        },
        "end_time": {
          "format": "epoch_millis",
          "type": "date"
        },
        "range_time": {
          "format": "epoch_millis",
          "type": "date_range"
        }
      }
    }
  }
}
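For reference, a mapping like this can be applied when creating the index with a request such as the following (shown in Kibana Dev Console syntax and abbreviated to a single field; the `_doc` mapping type suggests an Elasticsearch 6.x cluster):

```
PUT test_live
{
  "mappings": {
    "_doc": {
      "properties": {
        "range_time": { "type": "date_range", "format": "epoch_millis" }
      }
    }
  }
}
```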
And here is the relevant part of my Logstash config file:
filter {
  mutate {
    split => ["message", "|"]
    add_field => {
      "start_time" => "%{[message][1]}"
      "end_time"   => "%{[message][2]}"
      "channel"    => "%{[message][5]}"
      "range_time" => [
        "%{[message][1]}",
        "%{[message][2]}"
      ]
    }
    remove_field => "message"
  }
}
output {
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => [ "localhost" ]
    index => "test_live"
  }
}
My question is: how should the "range_time" => part ([mutate][add_field][range_time]) be written so that date_range data is shipped to ES correctly? In the console, I get output like this:
{
    "@timestamp" => 2021-04-19T01:46:40.617Z,
    "start_time" => "20210401001401",
      "end_time" => "20210401001408",
    "range_time" => [
        [0] "20210401001401",
        [1] "20210401001408"
    ],
          "host" => "localhost.localdomain",
       "channel" => "SCTV-2",
          "path" => "/**/",
      "@version" => "1"
}
But this output doesn't get written to the index correctly. How can I do this?
A date_range field contains two sub-fields, gte and lte, so you simply need to do it like this:
add_field => {
  ...
  "[range_time][gte]" => "%{[message][1]}"
  "[range_time][lte]" => "%{[message][2]}"
}
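Applied to the filter from the question, the whole mutate block would then look like this (the array form of range_time is dropped):

```
filter {
  mutate {
    split => ["message", "|"]
    add_field => {
      "start_time" => "%{[message][1]}"
      "end_time"   => "%{[message][2]}"
      # date_range values must be an object with gte/lte bounds, not an array
      "[range_time][gte]" => "%{[message][1]}"
      "[range_time][lte]" => "%{[message][2]}"
    }
    remove_field => "message"
  }
}
```

Logstash will then ship range_time as an object with gte/lte bounds, which is what the date_range mapping expects. One further thing worth checking: the mapping declares format epoch_millis, but the sample values (e.g. 20210401001401) look like yyyyMMddHHmmss timestamps, so the date format may also need adjusting.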