
Logstash not writing to Elasticsearch

I am having an issue where Logstash is not writing a parsed document to Elasticsearch when the message property contains hierarchical (nested) data. When the message property does not contain hierarchical data, it works fine. Here is some data that works:

{
  "Layer": "Web",
  "DurationMilliseconds": 65,
  "CreatedOn": "2014-09-29T20:44:40.5380157Z",
  "Enviroment": "Dev",
  "AssemblyName": "LoggingTest",
  "ClassName": "HomeController",
  "MethodName": "Index",
  "WindowsIdentity": "XXX\\YYY",
  "SessionId": "wrm11rken3lc442humrxyhoe",
  "Application": "LoggingTest",
  "Machine": "XXX.XXX.XXX.XXX",
  "Browser": "Chrome",
  "@version": "1",
  "@timestamp": "2014-09-29T20:45:38.432Z",
  "type": "Perf"
}

Here is some data that does not work:

{
  "Enviroment": "Dev",
  "Level": "Fatal",
  "CreatedOn": "2014-09-29 20:46:30.5042",
  "WindowsIdentity": "XXX\\XXX",
  "Application": "LoggingTest",
  "UserAgent": "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/37.0.2062.124 Safari/537.36",
  "SessionId": "wrm11rken3lc442humrxyhoe",
  "URL": "/LoggingTest/jsnlog.logger",
  "UserAddress": "XXX.XXX.XXX.XXX",
  "Message": {
    "stack": "TypeError: undefined is not a function\n    at Log (http://XXX/LoggingTest/:58:16)\n    at HTMLInputElement.onclick (http://XXX/LoggingTest/:66:141)",
    "message": "undefined is not a function",
    "name": "TypeError",
    "logData": "JS Fatal Exception"
  },
  "@version": "1",
  "@timestamp": "2014-09-29T20:46:30.331Z",
  "type": "JS"
}

Here is my Logstash config:

input {
    redis {
        host => "127.0.0.1"
        type => "JS"
        data_type => "list"
        key => "JS"
    }
}

filter {
    json { source => "message" }
}

output {
    stdout { codec => rubydebug }

    elasticsearch {
        host => "localhost"
        index => "dev"
    }
}

When I run the above configuration, the first item is parsed and indexed into Elasticsearch successfully, but the second one disappears with no errors written by Logstash.
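For reference, a minimal sketch of how the output could be inspected from the command line, assuming Logstash 1.x and an Elasticsearch instance on localhost:9200 (exact flag names may differ between versions):

# Run Logstash with more verbose logging so any errors from the
# elasticsearch output show up on the console.
bin/logstash agent -f logstash.conf --verbose

# Count the documents that actually made it into the "dev" index.
curl 'http://localhost:9200/dev/_count?pretty'

# Inspect the mapping of the "dev" index; a Message field that was
# previously mapped as a string may reject an event where Message is an object.
curl 'http://localhost:9200/dev/_mapping?pretty'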

I have tested with your logs. Here is my config:

input {
    stdin{}
}

filter {
    json { source => "message" }
}

output {
    stdout {
        codec => "rubydebug"
    }
    elasticsearch {
        host => "localhost"
        cluster => "BENLIM"
    }
}

With this config, when I send the logs that did not work for you to Logstash, it parses them and outputs them to Elasticsearch. FYI.
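A quick way to feed a failing event through the stdin-based config above, assuming Logstash 1.x (where the agent subcommand is used) and that the config is saved as test.conf (a hypothetical filename); the event here is shortened for readability:

# Pipe a single-line JSON event into the stdin input; the json filter
# parses the "message" field and the result is printed by rubydebug
# and sent to Elasticsearch.
echo '{"type":"JS","Message":{"name":"TypeError","message":"undefined is not a function"}}' | bin/logstash agent -f test.conf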
