
Parsing JSON data into separate fields in Kibana

We have an EKS cluster and an ELK stack to monitor it. The logs from the cluster are shipped via Fluentbit. The requirement is to break the log message into separate fields in Kibana so that it becomes easier to filter the logs and build dashboards from them. Assuming the logs are emitted as JSON, could someone suggest a way to turn the key-value pairs in the JSON into separate fields in Kibana?


TL;DR

There are multiple ways to go about it.

This is not an exhaustive list, but these are the ones I know about.

Solution

Ingest pipeline

Create a pipeline

PUT _ingest/pipeline/pipeline_json_parser
{
  "description" : "json parser pipeline",
  "processors" : [
    {
      "json" : {
        "field" : "string_source",
        "target_field" : "json_target"
      }
    }
  ]
}

Test the pipeline

POST /_ingest/pipeline/pipeline_json_parser/_simulate
{
  "docs": [
    {
      "_index": "index",
      "_id": "id",
      "_source": {
        "string_source": "{\"a\":\"b\"}"
      }
    }
  ]
}
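If the processor does its job, the simulated document should come back with the parsed object under json_target alongside the original string_source. Roughly like this (response abridged; the real response also includes _index, _id and an _ingest timestamp):

{
  "docs": [
    {
      "doc": {
        "_source": {
          "string_source": "{\"a\":\"b\"}",
          "json_target": {
            "a": "b"
          }
        }
      }
    }
  ]
}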

Set the pipeline as default on an index
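A minimal sketch, assuming an existing index named my-logs-index (swap in your own index name). The index.default_pipeline setting applies the pipeline to every document indexed without an explicit pipeline parameter:

PUT my-logs-index/_settings
{
  "index.default_pipeline": "pipeline_json_parser"
}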

And you are all set.
