
Date parsing issue in ELK Logstash with the custom Java timestamp format of logs

The following are sample logs received from a Java application:

2019-04-11 9:08:22:562 Log 1 
2019-04-11 9:08:22:660 Log 2 
2019-04-11 9:08:43:79 Log 3 
2019-04-11 9:08:43:156 Log 4 

From the above logs, I'm facing an issue with Log 3, where the milliseconds value is only 79, but after parsing in Logstash the value is set to 790 ms (the Logstash parsing is correct; the Java log value is the problem). The value should actually be 2019-04-11 9:08:43:079 in the log for proper parsing.
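The 790 ms result is consistent with "SSS" being treated as a fraction-of-a-second field, so a 2-digit value is read as hundredths of a second rather than as milliseconds. A minimal sketch of that interpretation in plain arithmetic (an illustration of the assumed parsing semantics, not Joda-Time itself):

```ruby
# "SSS" as a fractional field: "79" is read as 0.79 s, not 79 ms.
two_digit = "79"
parsed_ms = (two_digit.to_f / 10**two_digit.length * 1000).round
puts parsed_ms  # 790 — the value Logstash stores; the log meant 79
```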

The Logstash filter is as follows:

date {
    match => [ "log_time", "yyyy-MM-dd HH:mm:ss:SSS", "ISO8601" ]
    target => "log_time"
    timezone => "CET"
}

On digging deeper, I found the issue is with the Java logging time format: it would be resolved if the format were yyyy-MM-dd HH:mm:ss.SSS . But the logging application uses the format yyyy-MM-dd HH:mm:ss:SSS , which causes this issue (note the difference between :SSS and .SSS ).

I cannot change the Java logging system, so is there any workaround in the Logstash filter to fix this issue?

I got it resolved by zero-padding milliseconds values that have only one or two digits, using the following gsub:

mutate {
  gsub => [
    # pad 1-digit milliseconds to 3 digits (e.g. :9 -> :009)
    "log_time", "^([0-9-]+ [0-9]+:[0-9]{2}:[0-9]{2}:)([0-9])$", "\100\2",
    # pad 2-digit milliseconds to 3 digits (e.g. :79 -> :079)
    "log_time", "^([0-9-]+ [0-9]+:[0-9]{2}:[0-9]{2}:)([0-9]{2})$", "\10\2"
  ]
}
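Since Logstash's gsub uses Ruby regular expressions, the substitutions above can be sketched and checked in plain Ruby (the `pad_millis` helper name is my own, for illustration; in the replacement strings, `\1` is the backreference and the following `0`s are literal padding):

```ruby
# Pad 1- or 2-digit milliseconds to 3 digits so "SSS" parses correctly.
def pad_millis(log_time)
  log_time
    .sub(/^([0-9-]+ [0-9]+:[0-9]{2}:[0-9]{2}:)([0-9])$/, '\100\2')
    .sub(/^([0-9-]+ [0-9]+:[0-9]{2}:[0-9]{2}:)([0-9]{2})$/, '\10\2')
end

puts pad_millis("2019-04-11 9:08:43:79")   # => "2019-04-11 9:08:43:079"
puts pad_millis("2019-04-11 9:08:43:156")  # => unchanged, already 3 digits
```

Note the order matters: the single-digit pattern runs first and pads to three digits, so the two-digit pattern can no longer match the already-padded value.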

I got help from the Elastic discuss group.

