
Kafka producer unable to send data to server

Here is my code. I am able to create the topic, but for some reason I am not able to send data to it. I get these errors after a long time. I am using Kafka version 2.11-0.8.2.1.

org.apache.kafka.clients.producer.KafkaProducer$FutureFailure@5474c6c
org.apache.kafka.clients.producer.KafkaProducer$FutureFailure@4b6995df

This is the server.log file for Kafka:

[2016-12-27 21:05:54,873] ERROR Closing socket for /127.0.0.1 because of error (kafka.network.Processor)
java.io.IOException: An established connection was aborted by the software in your host machine
at sun.nio.ch.SocketDispatcher.read0(Native Method)
at sun.nio.ch.SocketDispatcher.read(Unknown Source)
at sun.nio.ch.IOUtil.readIntoNativeBuffer(Unknown Source)
at sun.nio.ch.IOUtil.read(Unknown Source)
at sun.nio.ch.SocketChannelImpl.read(Unknown Source)
at kafka.utils.Utils$.read(Utils.scala:380)
at kafka.network.BoundedByteBufferReceive.readFrom(BoundedByteBufferReceive.scala:54)
at kafka.network.Processor.read(SocketServer.scala:444)
at kafka.network.Processor.run(SocketServer.scala:340)
at java.lang.Thread.run(Unknown Source)
[2016-12-27 21:07:54,727] INFO Closing socket connection to /127.0.0.1. (kafka.network.Processor)
[2016-12-27 21:16:08,559] INFO Closing socket connection to /127.0.0.1. (kafka.network.Processor)

Here is my Java code that sends integer numbers to Kafka:

Properties props = new Properties();
props.put("bootstrap.servers", "localhost:9092");
props.put("acks", "all");
props.put("retries", 0);
props.put("batch.size", 16384);
props.put("linger.ms", 1);
props.put("buffer.memory", 33554432);
props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
props.put("timeout.ms", "50");

Producer<String, String> producer = new KafkaProducer<>(props);
for (int i = 0; i < 2; i++) {
    System.out.println(producer.send(new ProducerRecord<String, String>("testtopic",
            Integer.toString(i), Integer.toString(i))).toString());
}

producer.close();
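Incidentally, the two FutureFailure lines in the output above are just the toString() of the Future objects that send() returns, not the underlying error. A minimal sketch (reusing the producer and "testtopic" from the snippet above, and additionally needing org.apache.kafka.clients.producer.RecordMetadata and java.util.concurrent.ExecutionException) that blocks on each future so the real exception surfaces:

for (int i = 0; i < 2; i++) {
    ProducerRecord<String, String> record =
            new ProducerRecord<String, String>("testtopic", Integer.toString(i), Integer.toString(i));
    try {
        // get() blocks until the broker acknowledges (or the request times out)
        // and rethrows the underlying failure, e.g. a TimeoutException
        RecordMetadata metadata = producer.send(record).get();
        System.out.println("sent to partition " + metadata.partition()
                + " at offset " + metadata.offset());
    } catch (InterruptedException | ExecutionException e) {
        e.printStackTrace();  // prints the actual reason the send failed
    }
}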

Here is the pom.xml:

<dependencies>
  <dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-clients</artifactId>
    <version>0.10.1.0</version>
  </dependency>
  <dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka_2.11</artifactId>
    <version>0.8.2.1</version>
  </dependency>
  <dependency>
    <groupId>org.slf4j</groupId>
    <artifactId>slf4j-simple</artifactId>
    <version>1.6.4</version>
  </dependency>
  <dependency>
    <groupId>log4j</groupId>
    <artifactId>log4j</artifactId>
    <version>1.2.16</version>
    <exclusions>
      <exclusion>
        <groupId>javax.jms</groupId>
        <artifactId>jms</artifactId>
      </exclusion>
    </exclusions>
  </dependency>
</dependencies>

Nothing stands out except

props.put("timeout.ms", "50");

The request timeout should be longer than the default polling interval, which is 5 minutes in Kafka by default. So I guess that if you leave it at the default value (which is just above 5 minutes) it should work.
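As an illustration only (the 30000 ms figure below is just a generously sized example, not a tuned recommendation), the change amounts to dropping the 50 ms override or raising it well above the round-trip time to the broker:

Properties props = new Properties();
props.put("bootstrap.servers", "localhost:9092");
props.put("acks", "all");
props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
// Either omit the timeout override entirely to keep the client default,
// or set it much higher than 50 ms, for example:
props.put("timeout.ms", "30000");            // property name used in the question's code
// props.put("request.timeout.ms", "30000"); // equivalent name in newer clients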

I downgraded the Kafka version to kafka_2.10-0.9.0.0 and the following properties work with it.

    Properties props = new Properties();
    props.put("metadata.broker.list", "localhost:9092");
    props.put("acks", "all");
    props.put("retries", 0);
    props.put("batch.size", 16384);
    props.put("linger.ms", 1);
    props.put("buffer.memory", 33554432);
    props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
    props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
    props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
    props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
    props.put("serializer.class", "kafka.serializer.StringEncoder");
    ProducerConfig producerConfig = new ProducerConfig(props);
    kafka.javaapi.producer.Producer<String, String> producer =
            new kafka.javaapi.producer.Producer<String, String>(producerConfig);
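With this old-producer setup, sending the same two test records looks roughly like the sketch below (it reuses the "testtopic" topic from the question; kafka.producer.KeyedMessage is the record type of the pre-0.9 Scala producer API):

    for (int i = 0; i < 2; i++) {
        // KeyedMessage(topic, key, message): the old producer API's record type
        producer.send(new kafka.producer.KeyedMessage<String, String>(
                "testtopic", Integer.toString(i), Integer.toString(i)));
    }
    producer.close();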

My pom.xml file is as follows:

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>TwitterKafkaPostgre</groupId>
<artifactId>TwitterKafkaPostgre</artifactId>
<version>0.0.1-SNAPSHOT</version>
 <dependencies>
  <dependency>
    <groupId>com.twitter</groupId>
    <artifactId>hbc-core</artifactId> <!-- or hbc-twitter4j -->
    <version>2.2.0</version> <!-- or whatever the latest version is -->
  </dependency>
  <dependency>
   <groupId>org.apache.kafka</groupId>
   <artifactId>kafka-clients</artifactId>
   <version>0.9.0.0</version>
 </dependency>  
 <dependency>
   <groupId>org.apache.kafka</groupId>
   <artifactId>kafka_2.11</artifactId>
   <version>0.9.0.0</version>
 </dependency>
 <dependency>
   <groupId>log4j</groupId>
   <artifactId>log4j</artifactId>
   <version>1.2.16</version>
   <exclusions>
     <exclusion>
       <groupId>javax.jms</groupId>
       <artifactId>jms</artifactId>
     </exclusion>
   </exclusions>
 </dependency>
<dependency>
  <groupId>org.slf4j</groupId>
  <artifactId>slf4j-simple</artifactId>
  <version>1.6.4</version>
</dependency>
<dependency>
  <groupId>com.google.guava</groupId>
  <artifactId>guava</artifactId>
  <version>18.0</version>
</dependency>
 </dependencies>
</project>
