
Can't get Logstash to handle JSON file

I'm trying to get a JSON file parsed by Logstash into Elasticsearch. I've read several examples, but nothing I've tried seems to work. For a start, all I want to do is read the JSON file and index each key/value pair as the same fields in Elasticsearch. But when I run /opt/logstash$ bin/logstash -f ~/logstash-test.conf with the config below, I just get:

Logstash startup completed

and nothing appears in elasticsearch. What am I missing?

input {
  file {
    type => "json"
    path => ["/home/demo/data.json"]
    start_position => beginning
  }
}

filter {
  json {
    source => message
  }
}

output {
  elasticsearch { host => localhost }
  stdout { codec => rubydebug }
}

I'm not sure if this will solve your problem, but the Logstash docs for the json filter show:

filter {
  json {
    source => "message"
  }
}

Putting quotes around message may fix your issue; likewise, quote beginning in the start_position setting.

Also, path must always be an absolute path, so double-check that you set it correctly.
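For reference, here is your config with the quoting fixes applied (I've quoted the host value as well for consistency; otherwise unchanged and untested):

```conf
input {
  file {
    type => "json"
    path => ["/home/demo/data.json"]
    start_position => "beginning"
  }
}

filter {
  json {
    source => "message"
  }
}

output {
  elasticsearch { host => "localhost" }
  stdout { codec => rubydebug }
}
```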

The main problem turned out to be that my hard drive was almost full, so Elasticsearch was reporting that it couldn't save the documents. Unfortunately, this isn't surfaced by Logstash, so while the process appears to be working, nothing gets saved.
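If you hit the same symptom, two quick checks can rule this out (the mount point and the Elasticsearch address below are assumptions; adjust them to your setup):

```shell
# Check free disk space on the partition holding the Elasticsearch
# data directory (assumed to be / here -- adjust to your setup).
df -h /

# If Elasticsearch is reachable on its default port (an assumption),
# the cluster health will typically go "red" when shards can't be written:
curl -s localhost:9200/_cluster/health?pretty || echo "Elasticsearch not reachable"
```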

I was also helped by Magnus Bäck's comment pointing out that start_position wasn't resetting the read position on a file Logstash had already seen, so thanks for that.
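On that last point: the file input records how far it has read each file in a sincedb file, so start_position => "beginning" only applies to files Logstash has never seen before. While testing, a common trick is to point the sincedb at /dev/null so the file is re-read on every run (a debugging sketch, not for production):

```conf
input {
  file {
    type => "json"
    path => ["/home/demo/data.json"]
    start_position => "beginning"
    sincedb_path => "/dev/null"   # debugging only: discard saved read positions
  }
}
```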
