
KAFKA Java consumer not working

I am not able to get my Java consumer to work on localhost. The console consumer works fine. Below is my consumer code.

public class TestConsumer {
  public static void main(String[] args) throws Exception {

  //Kafka consumer configuration settings
  String topicName = "test";// args[0].toString();
  Properties props = new Properties();

  props.put("bootstrap.servers", "localhost:9092");
  props.put("group.id", "test-consumer-group");
  props.put("enable.auto.commit", "true");
  props.put("auto.commit.interval.ms", "1000");
  props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
  props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

  KafkaConsumer<String, String> consumer = new KafkaConsumer <String, String>(props);

  //Kafka Consumer subscribes list of topics here.
  consumer.subscribe(Arrays.asList(topicName));

  //print the topic name
  System.out.println("Subscribed to topic " + topicName);
  int i = 0;

  while (true) {
      System.out.printf("while loop");
      ConsumerRecords<String, String> records  = consumer.poll(1000);

      for (ConsumerRecord<String, String> record : records) // print the offset,key and value for the consumer records.
          System.out.printf("offset = %d, key = %s, value = %s\n",record.offset(), record.key(), record.value());
  }

  }
}
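A likely cause (my assumption, not something confirmed above): with the new consumer API used here, auto.offset.reset defaults to "latest", so a group with no committed offsets only sees records produced after it joins. A consumer started after the producer has finished then polls forever without printing anything, while a console consumer run with --from-beginning would still show the data. A minimal sketch of the extra properties that usually make the earlier records visible:

    // Sketch only: make this group read the topic from the beginning
    props.put("auto.offset.reset", "earliest");
    // If the group has already committed offsets past the test records,
    // use a fresh group id (hypothetical name) so the reset actually applies
    props.put("group.id", "test-consumer-group-2");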

The Java producer works fine on the same topic.

public class TestProducer {
   public static void main(String[] args) throws Exception {

      //Assign topicName to string variable
      String topicName = "test";//args[0].toString();
      Properties props = new Properties();
      props.put("bootstrap.servers", "localhost:9092");
      props.put("acks", "all");
      props.put("retries", 0);
      props.put("batch.size", 16384);
      props.put("linger.ms", 1);
      props.put("buffer.memory", 33554432);
      props.put("key.serializer","org.apache.kafka.common.serialization.StringSerializer");

      props.put("value.serializer","org.apache.kafka.common.serialization.StringSerializer");

      Producer<String, String> producer = new KafkaProducer<String, String>(props);

      for (int i = 0; i < 10; i++) {
         producer.send(new ProducerRecord<String, String>(topicName,
            Integer.toString(i), Integer.toString(i)));
      }
      System.out.println("Message sent successfully");
      producer.close();
   }
}
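Since producer.send() is asynchronous, it is hard to tell from the code above whether the broker ever acknowledged the records before close(). A minimal sketch, assuming the same org.apache.kafka.clients.producer API, that attaches a Callback to surface acks or errors and flushes before closing:

    for (int i = 0; i < 10; i++) {
       producer.send(new ProducerRecord<String, String>(topicName,
             Integer.toString(i), Integer.toString(i)),
          new Callback() {
             public void onCompletion(RecordMetadata metadata, Exception e) {
                if (e != null) {
                   e.printStackTrace();            // the send failed
                } else {
                   System.out.println("Acked: partition " + metadata.partition()
                         + ", offset " + metadata.offset());
                }
             }
          });
    }
    producer.flush();   // block until all buffered records have been sent
    producer.close();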

I got another implementation of the consumer and that has started working. I guess the properties I was setting were not correct.

public class KafkaConsumer {
   private ConsumerConnector consumerConnector = null;
   private final String topic = "test";

public void initialize() {
      Properties props = new Properties();
      props.put("zookeeper.connect", "localhost:2181");
      props.put("group.id", "testgroup");
      props.put("zookeeper.session.timeout.ms", "400");
      props.put("zookeeper.sync.time.ms", "300");
      props.put("auto.commit.interval.ms", "1000");
      ConsumerConfig conConfig = new ConsumerConfig(props);
      consumerConnector = Consumer.createJavaConsumerConnector(conConfig);
}

public void consume() {
      //Key = topic name, Value = No. of threads for topic
      Map<String, Integer> topicCount = new HashMap<String, Integer>();       
      topicCount.put(topic, new Integer(1));

      //ConsumerConnector creates the message stream for each topic
      Map<String, List<KafkaStream<byte[], byte[]>>> consumerStreams =   consumerConnector.createMessageStreams(topicCount);         

      // Get the Kafka streams for the topic
      List<KafkaStream<byte[], byte[]>> kStreamList = consumerStreams.get(topic);
      // Iterate stream using ConsumerIterator
      for (final KafkaStream<byte[], byte[]> kStreams : kStreamList) {
             ConsumerIterator<byte[], byte[]> consumerIte = kStreams.iterator();

             while (consumerIte.hasNext())
                    System.out.println("Message consumed from topic    [" + topic + "] : "  + new String(consumerIte.next().message()));              
      }
      //Shutdown the consumer connector
      if (consumerConnector != null)   consumerConnector.shutdown();          
}

public static void main(String[] args) throws InterruptedException {
      KafkaConsumer kafkaConsumer = new KafkaConsumer();
      // Configure Kafka consumer
      kafkaConsumer.initialize();
      // Start consumption
      kafkaConsumer.consume();
}
}
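One note on the working version: ConsumerIterator.hasNext() blocks waiting for more messages, so the consumerConnector.shutdown() call placed after the loop is normally never reached. A minimal sketch, assuming it is added at the end of initialize(), that shuts the connector down from a JVM shutdown hook so the iteration can end cleanly:

      Runtime.getRuntime().addShutdownHook(new Thread(new Runnable() {
         public void run() {
            if (consumerConnector != null) {
               consumerConnector.shutdown();   // unblocks the ConsumerIterator
            }
         }
      }));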
