Insert data into Elasticsearch using Logstash and visualize in Kibana

I have the following CSV file:

tstp,voltage_A_real,voltage_B_real,voltage_C_real  #header not present in actual file
2000-01-01 00:00:00,2535.53,-1065.7,-575.754
2000-01-01 01:00:00,2528.31,-1068.67,-576.866
2000-01-01 02:00:00,2528.76,-1068.49,-576.796
2000-01-01 03:00:00,2530.12,-1067.93,-576.586
2000-01-01 04:00:00,2531.02,-1067.56,-576.446
2000-01-01 05:00:00,2533.28,-1066.63,-576.099
2000-01-01 06:00:00,2535.53,-1065.7,-575.754
2000-01-01 07:00:00,2535.53,-1065.7,-575.754
....

I am trying to insert the data into Elasticsearch through Logstash, using the following Logstash config:

input {
    file {
        path => "path_to_csv_file"
        sincedb_path => "/dev/null"
        start_position => "beginning"
    }
}
filter {
    csv {
        columns => [
          "tstp",
          "Voltage_A_real",
          "Voltage_B_real",
          "Voltage_C_real"
        ]
        separator => ","
    }
    date {
        match => [ "tstp", "yyyy-MM-dd HH:mm:ss"]
    }
    mutate {
        convert => {
            "Voltage_A_real" => "float"
            "Voltage_B_real" => "float"
            "Voltage_C_real" => "float"
        }
    }
}
output {
    stdout { codec => rubydebug }
    elasticsearch {
        hosts => ["localhost:9200"]
        action => "index"
        index => "temp_load_index"
    }
}

The rubydebug output when I run logstash -f conf_file -v is:

{
           "message" => "2000-02-18 16:00:00,2532.38,-1067,-576.238",
          "@version" => "1",
        "@timestamp" => "2000-02-18T21:00:00.000Z",
              "path" => "path_to_csv",
              "host" => "myhost",
              "tstp" => "2000-02-18 16:00:00",
    "Voltage_A_real" => 2532.38,
    "Voltage_B_real" => -1067.0,
    "Voltage_C_real" => -576.238
}

However, I see only 2 events in Kibana when I look at the dashboard, and both have the current timestamp rather than one from the year 2000, which is the range of my data. Could someone please help me figure out what is happening?

A sample Kibana document is as follows:

{
  "_index": "temp_load_index",
  "_type": "logs",
  "_id": "myid",
  "_score": null,
  "_source": {
    "message": "2000-04-02 02:00:00,2528.76,-1068.49,-576.796",
    "@version": "1",
    "@timestamp": "2016-09-27T05:15:29.753Z",
    "path": "path_to_csv",
    "host": "myhost",
    "tstp": "2000-04-02 02:00:00",
    "Voltage_A_real": 2528.76,
    "Voltage_B_real": -1068.49,
    "Voltage_C_real": -576.796,
    "tags": [
      "_dateparsefailure"
    ]
  },
  "fields": {
    "@timestamp": [
      1474953329753
    ]
  },
  "sort": [
    1474953329753
  ]
}

When you open Kibana, it usually shows you only the events from the last 15 minutes, based on the @timestamp field. You therefore need to set the time filter to an appropriate time range (see the Kibana time filter documentation); in your case, use the absolute option and start at 2000-01-01.
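
To confirm that the data was indexed with year-2000 timestamps regardless of Kibana's time filter, you can also query Elasticsearch directly. A minimal sketch (the index name temp_load_index comes from your config; recent Elasticsearch versions also require the Content-Type header):

curl -s -H 'Content-Type: application/json' 'localhost:9200/temp_load_index/_search?pretty' -d '
{
  "query": {
    "range": {
      "@timestamp": { "gte": "2000-01-01", "lt": "2001-01-01" }
    }
  }
}'

If this returns hits, the events are indexed correctly and only the time filter is hiding them.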

Alternatively, you can put the parsed timestamp in another field (for example original_tst), so that the @timestamp added by Logstash is kept:

date {
    match => [ "tstp", "yyyy-MM-dd HH:mm:ss"]
    target => "original_tst"
}
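
With target set, @timestamp keeps the ingestion time, so the events show up under Kibana's default time filter; the trade-off is that you can no longer plot the data on its original year-2000 time axis.

Separately, note the _dateparsefailure tag on your sample document. 2000-04-02 02:00:00 falls in the US daylight-saving-time gap: judging by the +5 hour offset in your rubydebug output, the host is in US Eastern time, where clocks jumped from 02:00 to 03:00 on that date, so that local time does not exist and the date filter fails. Events that fail to parse keep the ingestion time as @timestamp, which matches what you see. If the timestamps in the CSV are actually UTC, telling the date filter so avoids the gap. A sketch, assuming the data is UTC:

date {
    match => [ "tstp", "yyyy-MM-dd HH:mm:ss"]
    timezone => "UTC"
}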
