
Logstash not parsing JSON

Logstash config

input {
    file{
        path => "/home/folder/stash/data/memcached_shadowserver.csv"
        start_position => "beginning"
        sincedb_path => "/dev/null"
    }
}

filter {
    csv {
        separator => ","
        columns => ["CATEGORY", "DESCRIPTION", "IP", "PORT", "ASN", "TIME", "GEOIP", "FQDN", "TAGS"]
    }

    mutate { convert => ["CATEGORY","string"]}
    mutate { convert => ["DESCRIPTION","string"]}
    mutate { convert => ["IP","string"]}
    mutate { convert => ["PORT","integer"]}
    mutate { convert => ["ASN","integer"]}
    mutate { convert => ["TIME","string"]}
    mutate { convert => ["GEOIP","string"]}
    mutate { convert => ["FQDN","string"]}
    mutate { convert => ["TAGS","string"]}

    json {
        source => "TAGS"
        target => "TAGS"
    }

}

output { 
    elasticsearch {
        hosts => "localhost"
        index => "test4"
    }
    stdout {}
}

Example of data

CATEGORY,DESCRIPTION,IP,PORT,ASN,TIME,GEOIP,FQDN,TAGS
vulnerable service,misconfigured memcached servers,10.10.10.10,11211,3000,February 27th 2018 15:46:23.000,SS,,{"tag":"DDoS","tag":"reflection","tag":"amplification","tag":"attack"}

Logstash log

[2018-03-17T00:16:44,182][WARN ][logstash.filters.csv     ] Error parsing csv {:field=>"message", :source=>"vulnerable service,misconfigured memcached servers,10.10.10.10,11211,3333,February 24th 2018 13:19:12.000,SS,,{\"tag\":\"DDoS\",\"tag\":\"reflection\",\"tag\":\"amplification\",\"tag\":\"attack\"}", :exception=>#<CSV::MalformedCSVError: Illegal quoting in line 1.>}

Other than the duplicate keys (which I would think shouldn't matter), the JSON is valid. I also checked another thread, "Logstash does not parse json", where basically the same JSON format was used and it apparently worked.
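As a side note, the duplicate keys may not be as harmless as they look: most JSON parsers keep only the last value for a repeated key. A quick illustration in Python (just a sketch of generic parser behavior, not of Logstash specifically):

```python
import json

# Duplicate keys are accepted by most parsers, but only the last value
# is kept, so the four "tag" entries collapse into a single one.
doc = '{"tag":"DDoS","tag":"reflection","tag":"amplification","tag":"attack"}'
print(json.loads(doc))  # {'tag': 'attack'}
```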

I also ran my log without the JSON field and then it works, so the problem has to be in there somewhere. The data shows up in Kibana as a single message, but nothing gets separated into fields.

Any ideas, suggestions?

Thanks

I also tried to parse a log file containing JSON that I had previously gotten to parse correctly.

CATEGORY,DESC,IP,GEOIP,ASN,TIME,PORT,DNS,TAGS
ics,Modbus protokolliga seadmed,80.235.16.222,EE,AS3249,2017-08-29T06:57:22.546423,1,kala.kalamees.ee,{"cms":"Drupal"} {"cve":"CVE-2018-0111"}

But now for some reason this JSON doesn't get parsed either. The other fields are properly parsed and can be searched in Kibana, but nothing for the JSON, even though I have previously gotten it to work with the same Logstash config.
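One possible complication with this sample: the TAGS field here actually contains two JSON objects back to back, which is not a single valid JSON document, so a strict parser rejects the combined string. A quick check in Python (again just illustrating generic parser behavior):

```python
import json

# Two JSON objects separated by a space are not one valid JSON document;
# a strict parser stops after the first object and reports extra data.
doc = '{"cms":"Drupal"} {"cve":"CVE-2018-0111"}'
try:
    json.loads(doc)
except ValueError as e:
    print("parse failed:", e)
```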

Here is the log file which I've been using
https://github.com/jannoa/visualiseerimisplatvorm-DATA/files/1821633/test1.txt

Maybe the problem is related to the default CSV quote character, the double quote ("), which is present in your JSON field.

Try setting quote_char to some character that is not present in your CSV.
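For example (a sketch only; it assumes the single-quote character never appears in your data, so pick any character that genuinely does not occur in the file):

```
filter {
    csv {
        separator  => ","
        # Assumption: "'" never occurs in the data. The default quote_char
        # is the double quote, which collides with the quotes in the JSON field.
        quote_char => "'"
        columns => ["CATEGORY", "DESCRIPTION", "IP", "PORT", "ASN", "TIME", "GEOIP", "FQDN", "TAGS"]
    }
}
```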
