简体   繁体   中英

Apache Kafka does not consume from api

The console commands of kafka-console-producer.sh and kafka-console-consumer.sh are functioning properly but when I try to produce or consume using the api, I am not able to! Can someone tell me if there is something wrong with my scala code?

import java.util.Properties

import org.apache.kafka.clients.producer.{KafkaProducer, ProducerRecord}

object ScalaProducerExample  {
  val topic = "test"
  val brokers = "<broker>:9092"
  val props = new Properties()
  props.put("bootstrap.servers", brokers)
  props.put("client.id", "ScalaProducerExample")
  props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer")
  props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer")
  val producer = new KafkaProducer[String, String](props)
  val data = new ProducerRecord[String, String](topic, "message")
  producer.send(data)
  producer.close()
}

This is the dependencies loaded in build.sbt file:

libraryDependencies += "org.apache.kafka" % "kafka-clients" % "0.8.2.1"

libraryDependencies += "org.apache.kafka" %% "kafka" % "0.10.2.0"

I even wrote it in Java and the same is happening.

import org.apache.kafka.clients.ClientRequest;
import org.apache.kafka.clients.ClientResponse;
import org.apache.kafka.clients.KafkaClient;
import org.apache.kafka.clients.RequestCompletionHandler;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.Node;
import org.apache.kafka.common.requests.AbstractRequest;

import java.io.IOException;
import java.util.Date;
import java.util.List;
import java.util.Properties;
import java.util.Random;

public class ProducerExample {
    public static void main(String[] args) {
        String topic = "test";
        String brokers = "<broker>:9092";
        System.out.println("init " );
        Properties props = new Properties();
        props.put("bootstrap.servers", brokers);
        props.put("acks", "all");
        props.put("retries", 0);
        props.put("batch.size", 16384);
        props.put("linger.ms", 1);
        props.put("buffer.memory", 33554432);


        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

        System.out.println("creating prducer " );
        KafkaProducer<String, String> producer = new KafkaProducer<String, String>(props);
        producer.flush();
        producer.send(new ProducerRecord<>(topic, "1", "2"));
        producer.close();
        System.out.println("close  " );
    }
}

The dependencies in built.sbt are:

libraryDependencies += "org.apache.kafka" % "kafka-clients" % "0.8.2.1"

I know that the connection works because, when I change the broker, I receive an error. But when the broker is correct, the program runs successfully, but I am not receiving any message.

Update: I am assuming that the reason the program runs successfully is that it gives timeout. I ran this

try {
            producer.send(new ProducerRecord<>(topic, "1", "2")).get(30, TimeUnit.SECONDS);
        } catch (InterruptedException e) {
            e.printStackTrace();
        } catch (ExecutionException e) {
            e.printStackTrace();
        } catch (TimeoutException e) {
            e.printStackTrace();
        }

And got this error:

java.util.concurrent.TimeoutException: Timeout after waiting for 30000 ms.
        at org.apache.kafka.clients.producer.internals.FutureRecordMetadata.get(FutureRecordMetadata.java:64)
        at org.apache.kafka.clients.producer.internals.FutureRecordMetadata.get(FutureRecordMetadata.java:25)
        at de.innocow.kafka.ProducerExample.main(ProducerExample.java:45)

How can I debug more than this and investigate why the producer is not sending?

producer.send(new ProducerRecord<>(topic, "1", "2"));
producer.flush();            
producer.close();

Try with this and see the Docs :

 The flush() call gives a convenient way to ensure all previously sent messages have actually completed. This example shows how to consume from one Kafka topic and produce to another Kafka topic: 
 for(ConsumerRecord<String, String> record: consumer.poll(100))
     producer.send(new ProducerRecord("my-topic", record.key(), record.value());
 producer.flush();
 consumer.commit();

The technical post webpages of this site follow the CC BY-SA 4.0 protocol. If you need to reprint, please indicate the site URL or the original address.Any question please contact:yoyou2525@163.com.

 
粤ICP备18138465号  © 2020-2024 STACKOOM.COM