I'm storing my Jenkins build logs in Elasticsearch with the Jenkins Logstash plugin.
My configuration looks sort of like this:
That part works great, but I'd like to view the full log in Kibana.
The plugin sends the results to Elasticsearch incrementally, splitting on each newline, so a long log can look something like this in Kibana:
Each line becomes a massive JSON document containing tons of fields I do not care about; I really only care about the message field.
I'm reading about aggregations right now, which appear to be what I need, but my results aren't coming out the way I'd like.
curl -X GET "localhost:9200/_search" -H 'Content-Type: application/json' -d'
{
  "aggs": {
    "buildLog": {
      "terms": {
        "field": "data.url"
      }
    }
  }
}'
This prints a large glob of JSON that does not contain what I need.
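One variation that looks closer to what I want is adding a top_hits sub-aggregation, so each data.url bucket actually carries its documents' message fields. This is only a sketch: it assumes data.url is indexed with a keyword sub-field (data.url.keyword, the default dynamic mapping), and the size values are guesses:

```shell
curl -X GET "localhost:9200/_search" -H 'Content-Type: application/json' -d'
{
  "size": 0,
  "aggs": {
    "buildLog": {
      "terms": {
        "field": "data.url.keyword"
      },
      "aggs": {
        "messages": {
          "top_hits": {
            "_source": ["message"],
            "sort": [{ "@timestamp": { "order": "asc" } }],
            "size": 100
          }
        }
      }
    }
  }
}'
```

Note that top_hits is capped at 100 hits per bucket by default, so very long logs would still be truncated this way.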
In a perfect world, I'd like to concatenate every message field from each data.url and fetch that.
In SQL, an individual query for this might look something like:
SELECT message FROM jenkins-logstash WHERE data.url = 'job/playground/36' ORDER BY @timestamp ASC
Where 'job/playground/36' is one example data.url value.
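Translated to the Elasticsearch search API, that query might look roughly like the following. This is a sketch under assumptions: a data.url.keyword sub-field (default dynamic mapping), Logstash's @timestamp field, and a size large enough to cover the whole log:

```shell
curl -X GET "localhost:9200/jenkins-logstash/_search" -H 'Content-Type: application/json' -d'
{
  "query": {
    "term": { "data.url.keyword": "job/playground/36" }
  },
  "sort": [{ "@timestamp": { "order": "asc" } }],
  "_source": ["message"],
  "size": 1000
}'
```

Concatenating the message values from the returned hits, in order, would then reassemble the full log for that build.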
How can I go about doing this?
Update: Better answer than before.
I still ended up using Filebeat, but as of v6.5+ Kibana has a Logs UI! https://www.elastic.co/guide/en/kibana/current/logs-ui.html
The default Filebeat config works fine with it.
---
Old answer:
I ended up solving this by using Filebeat to harvest all the logs, and then using the Kibana log viewer to watch each one. I filtered on the source field, using the path the log was written to.
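For reference, a minimal filebeat.yml along those lines. The paths and hosts here are assumptions; adjust them for your Jenkins home and Elasticsearch address:

```yaml
filebeat.inputs:
  - type: log
    paths:
      # assumed Jenkins build-log location; adjust to your install
      - /var/lib/jenkins/jobs/*/builds/*/log

output.elasticsearch:
  hosts: ["localhost:9200"]
```

Each harvested line then arrives with a source field set to the file path, which is what the Kibana-side filtering keys on.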