
Logstash Dynamic Index From Document Field Fails

I still can't figure out how to tell Logstash to write to a dynamic index based on a document field. Furthermore, this field must be transformed in order to get the "real" index name at the very end. Say there is a field "time" (a UNIX timestamp). This field is already transformed with a "date" filter into a DateTime object for Elasticsearch. Additionally, it should serve as the index name (YYYYMM). The index should NOT be derived from @timestamp, which is left untouched.

Example: {...,"time":1453412341,...}

This event should go to the index 201601.

I am using the following config:

filter {
  date {
    match => [ "time", "UNIX" ]
    target => "time"
    timezone => "Europe/Berlin"
  }
}
output {
  elasticsearch {
    index => "%{time}%{+YYYYMM}"
    document_type => "..."
    document_id => "%{ID}"
    hosts => "..."
  }
}

Sadly, it's not working. Any idea how to achieve that?

Thanks a lot!

The "%{+YYYYMM}" says to use the date values from @timestamp. If you want an index named after the YYYYMM in %{time}, you need to make a string out of that date field and then reference that string in the output stanza. There might be a mutate{} that would do it, or drop into ruby{}.

In most installations, you want to set @timestamp to the event's own time. The default of using Logstash's processing time is not very useful (imagine if your events were delayed by an hour during processing). If you did that, then %{+YYYYMM} would work just fine.
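A minimal sketch of that approach, based on the question's config: drop the target option so the date filter writes the parsed value to @timestamp (its default), and the sprintf date format then reflects the event's own time:

filter {
  date {
    match => [ "time", "UNIX" ]
    # No "target" option: the parsed date goes to @timestamp by default.
    timezone => "Europe/Berlin"
  }
}
output {
  elasticsearch {
    # Date formats like %{+YYYYMM} always read from @timestamp.
    index => "%{+YYYYMM}"
    document_type => "..."
    document_id => "%{ID}"
    hosts => "..."
  }
}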

This is because, by default, the index name is generated from the UTC time.
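For example, an event stamped 2016-02-01 00:30 Europe/Berlin is 2016-01-31 23:30 UTC, so %{+YYYYMM} would route it to index 201601 rather than 201602.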
