
Consuming a Kafka topic using Logstash to Elasticsearch

I'm starting to consume messages from Kafka with Logstash, and I want to send the full topic to Elasticsearch. I can see the messages arriving in Kafka, but nothing shows up on the Logstash side. What is the right way to configure it?

input {
  kafka {
    zk_connect => "localhost:2181"
    topic_id => "event"
  }
}

output {
  stdout {
    codec => rubydebug
  }
  elasticsearch {
    index => "event-%{+YYYY.MM.dd}"
    hosts => ["localhost:9201"]
    codec => json
  }
}

curl localhost:9201
{
  "name" : "Flex",
  "cluster_name" : "elasticsearch",
  "version" : {
    "number" : "2.3.4",
    "build_hash" : "e455fd0c13dceca8dbbdbb1665d068ae55dabe3f",
    "build_timestamp" : "2016-06-30T11:24:31Z",
    "build_snapshot" : false,
    "lucene_version" : "5.5.0"
  },
  "tagline" : "You Know, for Search"
}

The command:

/kafka-console-consumer.sh --zookeeper localhost:2181 --topic event

produces results from time to time.
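A likely cause of the silence is that the consumer group has already committed offsets at the end of the topic, so the input only picks up messages published after Logstash connects. With the 0.8-era ZooKeeper-based consumer you can inspect a group's committed offsets and lag with the offset checker tool (the group name event-group below is a placeholder for whatever group_id you configure; the exact script invocation depends on your Kafka version):

```
bin/kafka-run-class.sh kafka.tools.ConsumerOffsetChecker \
  --zookeeper localhost:2181 --group event-group --topic event
```

If the lag column shows 0 while the console consumer still prints messages, the group's offset is simply ahead of (or at) the data you expect to see.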

Try it like this, with auto_offset_reset and reset_beginning set:

  kafka {
    topic_id => "event"
    zk_connect => "localhost:2181"
    group_id => "event-group"
    auto_offset_reset => "smallest"
    reset_beginning => true
    consumer_threads => 1
  }
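Putting it together, a complete config for the old ZooKeeper-based Kafka input that Logstash 2.x ships might look like the sketch below. auto_offset_reset => "smallest" tells the consumer to start from the earliest available offset when no committed offset exists, and reset_beginning => true discards the group's committed offset so the whole topic is replayed. The group_id value and the file name kafka.conf are placeholders:

```
# kafka.conf -- minimal sketch, assuming Logstash 2.x with the
# ZooKeeper-based (Kafka 0.8-style) input plugin
input {
  kafka {
    zk_connect => "localhost:2181"
    topic_id => "event"
    group_id => "event-group"
    # start from the earliest offset when none is committed
    auto_offset_reset => "smallest"
    # drop any committed offset so the topic is re-read from the start
    reset_beginning => true
    consumer_threads => 1
  }
}

output {
  stdout {
    codec => rubydebug
  }
  elasticsearch {
    hosts => ["localhost:9201"]
    index => "event-%{+YYYY.MM.dd}"
  }
}
```

Run it with something like bin/logstash -f kafka.conf and confirm events appear on stdout first. Note that with reset_beginning => true the topic is replayed on every restart, so you would normally remove that option once you have verified that events flow through to Elasticsearch.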
