Apache Kafka does not consume from API
The console commands kafka-console-producer.sh and kafka-console-consumer.sh work fine, but when I try to produce or consume through the API, it does not work! Can someone tell me whether there is a problem with my Scala code?
import java.util.Properties

import org.apache.kafka.clients.producer.{KafkaProducer, ProducerRecord}

object ScalaProducerExample extends App {
  val topic = "test"
  val brokers = "<broker>:9092"

  val props = new Properties()
  props.put("bootstrap.servers", brokers)
  props.put("client.id", "ScalaProducerExample")
  props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer")
  props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer")

  val producer = new KafkaProducer[String, String](props)
  val data = new ProducerRecord[String, String](topic, "message")
  producer.send(data)
  producer.close()
}
These are the dependencies loaded in the build.sbt file:
libraryDependencies += "org.apache.kafka" % "kafka-clients" % "0.8.2.1"
libraryDependencies += "org.apache.kafka" %% "kafka" % "0.10.2.0"
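One thing that stands out here (my observation, not something stated in the question): the two Kafka artifacts are pinned to different versions, and an old 0.8.2.1 client talking to a 0.10.x broker can fail without an obvious error. A build.sbt sketch that pins both artifacts to a single version might look like this:

```scala
// Hypothetical fix: keep both Kafka artifacts on the same version
// (0.10.2.0 here, matching the "kafka" dependency above).
libraryDependencies += "org.apache.kafka" % "kafka-clients" % "0.10.2.0"
libraryDependencies += "org.apache.kafka" %% "kafka" % "0.10.2.0"
```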
I even wrote it in Java, and the same thing happens.
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class ProducerExample {
    public static void main(String[] args) {
        String topic = "test";
        String brokers = "<broker>:9092";
        System.out.println("init");

        Properties props = new Properties();
        props.put("bootstrap.servers", brokers);
        props.put("acks", "all");
        props.put("retries", 0);
        props.put("batch.size", 16384);
        props.put("linger.ms", 1);
        props.put("buffer.memory", 33554432);
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

        System.out.println("creating producer");
        KafkaProducer<String, String> producer = new KafkaProducer<String, String>(props);
        producer.flush();
        producer.send(new ProducerRecord<>(topic, "1", "2"));
        producer.close();
        System.out.println("close");
    }
}
The dependencies in build.sbt are:
libraryDependencies += "org.apache.kafka" % "kafka-clients" % "0.8.2.1"
I know the connection works, because when I change the broker address I get an error message. But when the broker address is correct, the program runs successfully, yet I don't receive any messages.
Update: I assume the reason the program "runs successfully" is that the send times out. I ran
try {
    producer.send(new ProducerRecord<>(topic, "1", "2")).get(30, TimeUnit.SECONDS);
} catch (InterruptedException e) {
    e.printStackTrace();
} catch (ExecutionException e) {
    e.printStackTrace();
} catch (TimeoutException e) {
    e.printStackTrace();
}
and got this error:
java.util.concurrent.TimeoutException: Timeout after waiting for 30000 ms.
    at org.apache.kafka.clients.producer.internals.FutureRecordMetadata.get(FutureRecordMetadata.java:64)
    at org.apache.kafka.clients.producer.internals.FutureRecordMetadata.get(FutureRecordMetadata.java:25)
    at de.innocow.kafka.ProducerExample.main(ProducerExample.java:45)
How can I get more debugging information and investigate why the producer is not sending messages?
producer.send(new ProducerRecord<>(topic, "1", "2"));
producer.flush();
producer.close();
Try this, and see the docs:
The flush() call gives a convenient way to ensure all previously sent messages have actually completed. This example shows how to consume from one Kafka topic and produce to another Kafka topic:
for (ConsumerRecord<String, String> record : consumer.poll(100))
    producer.send(new ProducerRecord("my-topic", record.key(), record.value()));
producer.flush();
consumer.commit();
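To see *why* a send times out rather than just that it did (a sketch of my own, not part of the original answer): pass a Callback to send(), so any broker-side or metadata error is printed instead of being silently dropped, and lower max.block.ms so failures surface quickly. The broker address is a placeholder, and this assumes a reachable Kafka cluster.

```java
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class ProducerDebugExample {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "<broker>:9092"); // placeholder broker
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("max.block.ms", 10000); // fail fast instead of blocking on metadata

        KafkaProducer<String, String> producer = new KafkaProducer<>(props);
        // The callback fires when the broker acks the record or the send fails,
        // so connection/metadata problems are printed instead of being lost.
        producer.send(new ProducerRecord<>("test", "1", "2"), (metadata, exception) -> {
            if (exception != null) {
                exception.printStackTrace();
            } else {
                System.out.println("sent to partition " + metadata.partition()
                        + " at offset " + metadata.offset());
            }
        });
        producer.flush(); // flush AFTER send, as in the answer above
        producer.close();
    }
}
```

If the callback prints a TimeoutException mentioning metadata, the client reached the broker but could not use the address the broker advertised back, which is worth checking in the broker's listener configuration.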