Kafka Spring Boot application producer: unable to consume the messages with a Kafka Sink Connector using Avro format

My goal: I have a Spring Boot Kafka producer configured with Avro serialization, and I expect the messages pushed to the topic to be picked up by the Confluent Sink Connector and inserted into MySQL/Oracle database tables. I can produce Avro-serialized messages, and a Spring Boot consumer can deserialize them, but my sink connector is not working. I can't figure out what kind of payload the sink connector is expecting, or how the Spring Boot producer should be written so that the sink connector can cope with its output.

Thanks in advance :)

This is the application.yml in the Spring Boot application:

server:
  port: 9000

spring:
  kafka:
    bootstrap-servers: "localhost:9092"
    properties:
      schema.registry.url: "http://localhost:8081"
      specific.avro.reader: true
    producer:
      key-serializer: "io.confluent.kafka.serializers.KafkaAvroSerializer"
      value-serializer: "io.confluent.kafka.serializers.KafkaAvroSerializer"

app:
  topic: event_pay2

This is the payload the Spring Boot application sends to create the schema:

{
  "schema": {
    "type": "struct",
    "fields": [
      {
        "type": "string",
        "optional": false,
        "field": "userid"
      },
      {
        "type": "string",
        "optional": false,
        "field": "regionid"
      },
      {
        "type": "string",
        "optional": false,
        "field": "gender"
      }
    ],
    "optional": false,
    "name": "oracle.user"
  },
  "payload": {
    "userid": "User_1",
    "regionid": "Region_5",
    "gender": "FEMALE"
  }
}

pom.xml

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-parent</artifactId>
    <version>2.0.5.RELEASE</version>
    <relativePath/> <!-- lookup parent from repository -->
</parent>
<groupId>com.kafka</groupId>
<artifactId>kafka-producer-example</artifactId>
<version>0.0.1-SNAPSHOT</version>
<name>kafka-producer-example</name>
<description>Demo project for Spring Boot</description>

 <repositories>
    <repository>
        <id>confluent</id>
        <url>http://packages.confluent.io/maven</url>
    </repository>
</repositories>

<properties>
    <java.version>1.8</java.version>
    <confluent.version>4.0.0</confluent.version>
</properties>

<dependencies>
<dependency>
        <groupId>com.google.code.gson</groupId>
        <artifactId>gson</artifactId>
    </dependency>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-web</artifactId>
    </dependency>
    <dependency>
        <groupId>org.springframework.kafka</groupId>
        <artifactId>spring-kafka</artifactId>
    </dependency>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-devtools</artifactId>
        <scope>runtime</scope>
    </dependency>

    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-test</artifactId>
        <scope>test</scope>
    </dependency>
    <dependency>
        <groupId>org.springframework.kafka</groupId>
        <artifactId>spring-kafka-test</artifactId>
        <scope>test</scope>
    </dependency>
     <dependency>
        <groupId>io.confluent</groupId>
        <artifactId>kafka-avro-serializer</artifactId>
        <version>${confluent.version}</version>
        <exclusions>
            <exclusion>
                <groupId>org.slf4j</groupId>
                <artifactId>slf4j-log4j12</artifactId>
            </exclusion>
        </exclusions>
    </dependency>
    <dependency>
  <groupId>org.apache.avro</groupId>
  <artifactId>avro</artifactId>
  <version>1.8.2</version>
</dependency>
</dependencies>

<build>
 <pluginManagement>
    <plugins>
    <plugin>
        <groupId>org.apache.avro</groupId>
        <artifactId>avro-maven-plugin</artifactId>
        <version>1.8.2</version>
        <executions>
            <execution>
                <phase>generate-sources</phase>
                <goals>
                    <goal>schema</goal>
                </goals>
                <configuration>
                    <sourceDirectory>${project.basedir}/src/main/resources/avro/</sourceDirectory>
                    <outputDirectory>${project.build.directory}/generated/avro</outputDirectory>
                </configuration>
            </execution>
        </executions>
    </plugin>


        <plugin>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-maven-plugin</artifactId>
        </plugin>
    </plugins>
     </pluginManagement>
</build>

</project>

This is my REST call, showing how I push the message onto the Kafka topic:

@PostMapping("/publish/avrodata")
public String sendMessage(@RequestBody String request) {
    sender.send(request);
    return "Published successfully";
}

Finally, my sink connector configuration:

{
  "name": "JDBC_Sink_EVENT_PAY",
  "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
  "tasks.max": "1",
  "topics": "event_pay2",
  "connection.url": "jdbc:mysql://localhost:3306/user",
  "connection.user": "****",
  "connection.password": "****",
  "auto.create": "true",
  "auto.evolve": "true",
  "key.converter": "io.confluent.connect.avro.AvroConverter",
  "key.converter.schema.registry.url": "http://localhost:8081",
  "key.converter.schemas.enable": "true",
  "value.converter": "io.confluent.connect.avro.AvroConverter",
  "value.converter.schema.registry.url": "http://localhost:8081",
  "value.converter.schemas.enable": "true"
}

Always, always debug your topic before setting up a connector. Use kafka-avro-console-consumer for this. If that doesn't work, then Connect + AvroConverter likely won't work either (in my experience), and you can reduce the problem space.
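As a minimal sketch, assuming the broker and Schema Registry are running on their default local ports and the topic name from the question:

```shell
# Read the topic back through the Avro deserializer. If this command cannot
# decode the records, the sink connector's AvroConverter will fail the same way.
kafka-avro-console-consumer \
  --bootstrap-server localhost:9092 \
  --topic event_pay2 \
  --from-beginning \
  --property schema.registry.url=http://localhost:8081
```

If the output shows plain JSON strings rather than records matching a registered schema, the producer is not actually writing Avro.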


If I read your code correctly, you've sent a String, and not an Avro object

public String sendMessage(@RequestBody String request) {
    sender.send(request);  // <--- here and ^^^^^^ here
    return "Published successfully";
}

Instead, you need to parse your input request into an object generated from your /src/main/resources/avro schema data, not just forward the incoming request through as a string.
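A rough sketch of what the controller could look like. The `User` class is assumed to be generated by the avro-maven-plugin from the AVSC schema, and `AvroUserSender` is a hypothetical wrapper around a `KafkaTemplate<String, User>`; the field names follow the question's payload.

```java
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class PublishController {

    private final ObjectMapper mapper = new ObjectMapper();
    private final AvroUserSender sender; // hypothetical wrapper around KafkaTemplate<String, User>

    public PublishController(AvroUserSender sender) {
        this.sender = sender;
    }

    @PostMapping("/publish/avrodata")
    public String sendMessage(@RequestBody String request) throws Exception {
        // Parse the raw JSON body instead of forwarding it as a String
        JsonNode json = mapper.readTree(request);

        // Build the Avro-generated object; KafkaAvroSerializer will register
        // its schema and serialize it when the template sends it
        User user = User.newBuilder()
                .setUserId(json.get("userid").asText())
                .setRegionid(json.get("regionid").asText())
                .setGender(json.get("gender").asText())
                .build();

        sender.send(user);
        return "Published successfully";
    }
}
```

With this in place, the topic carries real Avro records and the sink's AvroConverter can look their schema up in the registry.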

And that AVSC file might look something like this:

{
  "type": "record",
  "namespace": "oracle.user",
  "name": "User",
  "fields": [
      { "type": "string", "name": "user_id" },
      { "type": "string", "name": "regionid" },
      { "type": "string", "name": "gender" }
   ]
}

That would generate an oracle.user.User class, so your KafkaTemplate would need to be declared as something like KafkaTemplate<String, oracle.user.User> sender.
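If the serializers are not wired through application.yml, the template can be configured in code. This is only a sketch under the question's local-default assumptions (broker on localhost:9092, registry on localhost:8081); with the YAML properties shown earlier, Spring Boot's auto-configured KafkaTemplate can be injected instead.

```java
import java.util.HashMap;
import java.util.Map;

import io.confluent.kafka.serializers.KafkaAvroSerializer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.core.DefaultKafkaProducerFactory;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.core.ProducerFactory;

@Configuration
public class ProducerConfiguration {

    @Bean
    public ProducerFactory<String, oracle.user.User> producerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        // Both key and value go through the Confluent Avro serializer,
        // matching the connector's AvroConverter on both sides
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, KafkaAvroSerializer.class);
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, KafkaAvroSerializer.class);
        props.put("schema.registry.url", "http://localhost:8081");
        return new DefaultKafkaProducerFactory<>(props);
    }

    @Bean
    public KafkaTemplate<String, oracle.user.User> kafkaTemplate() {
        return new KafkaTemplate<>(producerFactory());
    }
}
```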

