
Split Filebeat message field into multiple fields in Kibana

I have the following data in the message field, which is being shipped by Filebeat to Elasticsearch. I am not using Logstash here.

2020-09-20 15:44:23 ::1 get / - 80 - ::1 mozilla/5.0+(windows+nt+10.0;+win64;+x64)+windows/537.36+(khtml,+like+gecko)+chrome/85.0.4183.102+chrome/537.36 - 200 0 0 10

I want to split the above data at every space, assign the parts to separate fields, and have the new fields show up in the Kibana Discover view.

How can we do that?

I have tried to use a scripted field in Kibana, but I am unable to achieve this as I am not familiar with scripted field queries.

You can set up an ingest pipeline in Elasticsearch and use the Grok processor to parse your message field into multiple fields. Reference: https://www.elastic.co/guide/en/elasticsearch/reference/master/grok-processor.html
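Grok patterns are essentially named regular expressions, so if you want to preview how such a pattern carves the line into fields before touching Elasticsearch, you can sketch an equivalent split locally. The regex and field names below are my own rough approximation of the grok pattern, not what Elasticsearch executes internally:

```python
import re

LOG = ("2020-09-20 15:44:23 ::1 get / - 80 - ::1 "
       "mozilla/5.0+(windows+nt+10.0;+win64;+x64)+windows/537.36"
       "+(khtml,+like+gecko)+chrome/85.0.4183.102+chrome/537.36 - 200 0 0 10")

# Named groups loosely mirroring the grok fields (approximation only).
# IIS-style logs encode spaces in the user agent as '+', so \S+ is enough here.
pattern = re.compile(
    r"(?P<timestamp>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}) "
    r"(?P<server_ip>\S+) "
    r"(?P<method>\S+) "
    r"(?P<url>\S+) - "
    r"(?P<port>\d+) - "
    r"(?P<client_ip>\S+) "
    r"(?P<useragent>\S+) - "
    r"(?P<status>\d+) .*"
)

fields = pattern.match(LOG).groupdict()
print(fields["method"])   # get
print(fields["status"])   # 200
```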

For example, based on your log:

POST _ingest/pipeline/_simulate
{
  "pipeline": {
    "description" : "...",
    "processors": [
      {
        "grok": {
          "field": "message",
          "patterns": ["%{DATESTAMP:timestamp} %{DATA} %{WORD:request.method} %{DATA:request.url} - %{NUMBER:port} %{GREEDYDATA:useragent} - %{NUMBER:response.status_code} %{GREEDYDATA}"]
        }
      }
    ]
  },
  "docs":[
    {
      "_source": {
        "message": "2020-09-20 15:44:23 ::1 get / - 80 - ::1 mozilla/5.0+(windows+nt+10.0;+win64;+x64)+windows/537.36+(khtml,+like+gecko)+chrome/85.0.4183.102+chrome/537.36 - 200 0 0 10"
      }
    }
  ]
}
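Once the simulated output looks right, create the pipeline and tell Filebeat to send documents through it. The pipeline name `iis-log` here is just an example choice, not something from your setup:

```
PUT _ingest/pipeline/iis-log
{
  "description": "Split IIS-style message field into separate fields",
  "processors": [
    {
      "grok": {
        "field": "message",
        "patterns": ["%{TIMESTAMP_ISO8601:timestamp} %{DATA} %{WORD:request.method} %{DATA:request.url} - %{NUMBER:port} %{GREEDYDATA:useragent} - %{NUMBER:response.status_code} %{GREEDYDATA}"]
      }
    }
  ]
}
```

Then, in filebeat.yml, the Elasticsearch output supports a `pipeline` option (host shown is a placeholder):

```yaml
output.elasticsearch:
  hosts: ["localhost:9200"]
  pipeline: "iis-log"
```

After restarting Filebeat and refreshing the index pattern in Kibana, the new fields should appear in Discover.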
