
Logstash not parsing JSON

Logstash config

input {
    file{
        path => "/home/folder/stash/data/memcached_shadowserver.csv"
        start_position => "beginning"
        sincedb_path => "/dev/null"
    }
}

filter {
    csv {
        separator => ","
        columns => ["CATEGORY", "DESCRIPTION", "IP", "PORT", "ASN", "TIME", "GEOIP", "FQDN", "TAGS"]
    }

    mutate { convert => ["CATEGORY","string"]}
    mutate { convert => ["DESCRIPTION","string"]}
    mutate { convert => ["IP","string"]}
    mutate { convert => ["PORT","integer"]}
    mutate { convert => ["ASN","integer"]}
    mutate { convert => ["TIME","string"]}
    mutate { convert => ["GEOIP","string"]}
    mutate { convert => ["FQDN","string"]}
    mutate { convert => ["TAGS","string"]}

    json {
        source => "TAGS"
        target => "TAGS"
    }

}

output { 
    elasticsearch {
        hosts => "localhost"
        index => "test4"
    }
    stdout {}
}

Example of data

CATEGORY,DESCRIPTION,IP,PORT,ASN,TIME,GEOIP,FQDN,TAGS
vulnerable service,misconfigured memcached servers,10.10.10.10,11211,3000,February 27th 2018 15:46:23.000,SS,,{"tag":"DDoS","tag":"reflection","tag":"amplification","tag":"attack"}

Logstash log

[2018-03-17T00:16:44,182][WARN ][logstash.filters.csv     ] Error parsing csv {:field=>"message", :source=>"vulnerable service,misconfigured memcached servers,10.10.10.10,11211,3333,February 24th 2018 13:19:12.000,SS,,{\"tag\":\"DDoS\",\"tag\":\"reflection\",\"tag\":\"amplification\",\"tag\":\"attack\"}", :exception=>#<CSV::MalformedCSVError: Illegal quoting in line 1.>}
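The MalformedCSVError can be reproduced outside Logstash with any strict CSV parser: the double quotes inside the JSON column collide with the default CSV quote character. A small illustrative sketch (not part of the original question) using Python's csv module:

```python
import csv

# The raw line from the question (the TAGS column holds JSON with double quotes).
line = ('vulnerable service,misconfigured memcached servers,10.10.10.10,11211,'
        '3000,February 27th 2018 15:46:23.000,SS,,'
        '{"tag":"DDoS","tag":"reflection","tag":"amplification","tag":"attack"}')

# With the default quote character ("), a strict parser rejects the line,
# much like the Ruby CSV parser behind Logstash's csv filter does.
try:
    next(csv.reader([line], quotechar='"', strict=True))
    print("parsed")
except csv.Error as e:
    print("CSV error:", e)

# A quote character that never occurs in the data avoids the error, although
# the commas inside the JSON still split it across several columns.
row = next(csv.reader([line], quotechar="'"))
print(len(row), "columns; column 9 =", row[8])
```

Note that even with the quoting fixed, the unquoted TAGS field still contains commas, so a plain comma split cannot keep the JSON in one column.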

Other than the duplicate keys (which I would think shouldn't matter), the JSON is valid. I also checked another thread, Logstash does not parse json, where basically the same JSON format was used and it apparently worked.

I also ran my log without the JSON and then it works, so the problem has to be in there somewhere. The data shows up in Kibana as a message, but nothing gets separated into fields.

Any ideas or suggestions?

Thanks

I also tried to parse a log file containing JSON that I had previously gotten to work.

CATEGORY,DESC,IP,GEOIP,ASN,TIME,PORT,DNS,TAGS
ics,Modbus protokolliga seadmed,80.235.16.222,EE,AS3249,2017-08-29T06:57:22.546423,1,kala.kalamees.ee,{"cms":"Drupal"} {"cve":"CVE-2018-0111"}
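One thing worth checking in this second sample: the TAGS value is two JSON objects separated by a space, which is not a single valid JSON document, so a json filter pointed at that field has nothing it can decode. A quick illustrative check (not from the original question):

```python
import json

# TAGS value from the second sample: two objects concatenated with a space.
tags = '{"cms":"Drupal"} {"cve":"CVE-2018-0111"}'

# json.loads expects exactly one top-level JSON value, so this fails.
try:
    json.loads(tags)
    print("valid JSON")
except json.JSONDecodeError as e:
    print("invalid JSON:", e)
```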

But now for some reason this JSON doesn't get parsed either. The other fields are properly parsed and can be searched in Kibana, but nothing shows up for the JSON, even though I previously got it to work with the same Logstash config.

Here is the log file which I've been using: https://github.com/jannoa/visualiseerimisplatvorm-DATA/files/1821633/test1.txt

Maybe the problem is related to the default CSV quote character, the " character, which is present in the JSON field.

Try setting quote_char to some value that is not present in your CSV.
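For example, a sketch of the csv filter with the quote character changed (the single quote here is just an illustration; pick any character absent from your rows):

```
csv {
    separator => ","
    quote_char => "'"
    columns => ["CATEGORY", "DESCRIPTION", "IP", "PORT", "ASN", "TIME", "GEOIP", "FQDN", "TAGS"]
}
```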
