Spring kafka AVRO generated class not found

I'm using the Confluent JDBC connector to connect to a Postgres database, pick up changes and put them into a Kafka topic. Now I want to consume these messages with a Spring Boot consumer. The messages are in Avro format; I have the schema from the connector and generated a POJO class for it with the avro-maven-plugin.

But when the listener starts, all I get is the following error:

   java.lang.IllegalStateException: This error handler cannot process 'SerializationException's directly; please consider configuring an 'ErrorHandlingDeserializer' in the value and/or key deserializer
       at org.springframework.kafka.listener.SeekUtils.seekOrRecover(SeekUtils.java:194) ~[spring-kafka-2.7.2.jar:2.7.2]
       at org.springframework.kafka.listener.SeekToCurrentErrorHandler.handle(SeekToCurrentErrorHandler.java:112) ~[spring-kafka-2.7.2.jar:2.7.2]
       at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.handleConsumerException(KafkaMessageListenerContainer.java:1598) ~[spring-kafka-2.7.2.jar:2.7.2]
       at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.run(KafkaMessageListenerContainer.java:1210) ~[spring-kafka-2.7.2.jar:2.7.2]
       at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) ~[na:na]
       at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264) ~[na:na]
       at java.base/java.lang.Thread.run(Thread.java:832) ~[na:na]
   Caused by: org.apache.kafka.common.errors.SerializationException: Error deserializing key/value for partition ps_git_repo-0 at offset 0. If needed, please seek past the record to continue consumption.
   Caused by: org.apache.kafka.common.errors.SerializationException: Could not find class ps_git_repo specified in writer's schema whilst finding reader's schema for a SpecificRecord.
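
The first line of that stack trace suggests configuring an ErrorHandlingDeserializer. For reference, I believe that would look roughly like this in application.properties (it only reports the deserialization failure more cleanly, it would not fix the class lookup itself):

    spring.kafka.consumer.value-deserializer = org.springframework.kafka.support.serializer.ErrorHandlingDeserializer
    spring.kafka.consumer.properties.spring.deserializer.value.delegate.class = io.confluent.kafka.serializers.KafkaAvroDeserializer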

When I do not use Avro to deserialise the data I do receive the data, but it is unreadable.

In the pom.xml I have the following dependencies:


    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-web</artifactId>
    </dependency>
    <dependency>
        <groupId>org.springframework.kafka</groupId>
        <artifactId>spring-kafka</artifactId>
    </dependency>
    
    <dependency>
        <groupId>org.apache.avro</groupId>
        <artifactId>avro</artifactId>
        <version>1.10.2</version>
    </dependency>
    <dependency>
        <groupId>io.confluent</groupId>
        <artifactId>kafka-avro-serializer</artifactId>
        <version>6.2.0</version>
        <exclusions>
            <exclusion>
                <artifactId>netty</artifactId>
                <groupId>io.netty</groupId>
            </exclusion>
        </exclusions>
    </dependency>

and in application.properties I've added the deserialiser and the schema registry URL:


    spring.kafka.consumer.key-deserializer = org.apache.kafka.common.serialization.StringDeserializer
    spring.kafka.consumer.value-deserializer = io.confluent.kafka.serializers.KafkaAvroDeserializer
    spring.kafka.bootstrap-servers = http://localhost:9092
    spring.kafka.consumer.properties.specific.avro.reader = true
    spring.kafka.consumer.properties.schema.registry.url = http://localhost:8081

In the build I use the avro-maven-plugin to generate a POJO from the schema created by the connector.

The plugin in pom.xml:


    <plugin>
        <groupId>org.apache.avro</groupId>
        <artifactId>avro-maven-plugin</artifactId>
        <version>1.10.2</version>
        <executions>
            <execution>
                <phase>generate-sources</phase>
                <goals>
                    <goal>schema</goal>
                </goals>
                <configuration>
                    <sourceDirectory>${project.basedir}/src/main/avro/</sourceDirectory>
                    <outputDirectory>${project.basedir}/src/main/java/</outputDirectory>
                    <stringType>String</stringType>
                </configuration>
            </execution>
        </executions>
    </plugin>

I've put the following schema into the folder and generated the POJO with mvn generate-sources.

Schema.avsc


    {
      "connect.name": "ps_git_repo",
      "fields": [
        {
          "name": "id",
          "type": "long"
        },
        {
          "default": null,
          "name": "name",
          "type": [
            "null",
            "string"
          ]
        }
      ],
      "name": "ps_git_repo",
      "namespace": "com.company.api.kafkademo",
      "type": "record"
    }
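
For reference, this is roughly the shape of the class I expect the avro-maven-plugin to generate from that schema (heavily abbreviated sketch, not the actual plugin output; the real file also contains builders, encoders, equals/hashCode and so on):

    package com.company.api.kafkademo;

    import org.apache.avro.Schema;
    import org.apache.avro.specific.SpecificRecordBase;

    // Hand-written, abbreviated sketch of the generated class.
    public class ps_git_repo extends SpecificRecordBase {

        public static final Schema SCHEMA$ = new Schema.Parser().parse(
                "{\"type\":\"record\",\"name\":\"ps_git_repo\","
                + "\"namespace\":\"com.company.api.kafkademo\",\"fields\":["
                + "{\"name\":\"id\",\"type\":\"long\"},"
                + "{\"name\":\"name\",\"type\":[\"null\",\"string\"],\"default\":null}]}");

        private long id;
        private String name; // <stringType>String</stringType> maps avro strings to java.lang.String

        @Override
        public Schema getSchema() { return SCHEMA$; }

        @Override
        public Object get(int field) {
            switch (field) {
                case 0: return id;
                case 1: return name;
                default: throw new IndexOutOfBoundsException("Invalid field index: " + field);
            }
        }

        @Override
        public void put(int field, Object value) {
            switch (field) {
                case 0: id = (Long) value; break;
                case 1: name = value == null ? null : value.toString(); break;
                default: throw new IndexOutOfBoundsException("Invalid field index: " + field);
            }
        }
    }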

I get the ps_git_repo.java class in the correct package, and then I have this listener to retrieve the messages:

    @SpringBootApplication
    @EnableKafka
    public class KafkaDemoApplication {
    
        @KafkaListener(groupId = "test123", topics = "ps_git_repo_test")
        public void handleMessage(ps_git_repo message) {
            System.out.println(message);
        }
    
        public static void main(String[] args) {
            SpringApplication.run(KafkaDemoApplication.class, args);
        }
    
    }

The schema cannot be found.

Does anybody know what is wrong?

The deserialiser used the connect.name field instead of the namespace to find the correct class.
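
To illustrate why the lookup fails, here is a small sketch of my understanding (using Avro's SpecificData API directly; the inline schema mimics what the connector registered, i.e. a record named ps_git_repo with no namespace):

    import org.apache.avro.Schema;
    import org.apache.avro.specific.SpecificData;

    public class SchemaLookupSketch {

        public static void main(String[] args) {
            // Roughly the writer's schema as the connector registered it: no namespace.
            Schema writerSchema = new Schema.Parser().parse(
                    "{\"type\":\"record\",\"name\":\"ps_git_repo\",\"fields\":["
                    + "{\"name\":\"id\",\"type\":\"long\"},"
                    + "{\"name\":\"name\",\"type\":[\"null\",\"string\"],\"default\":null}]}");

            // The Confluent deserializer resolves the reader class from the writer
            // schema's full name. Without a namespace that full name is just
            // "ps_git_repo", so no class on the classpath matches, the lookup
            // returns null and the deserializer throws the SerializationException above.
            System.out.println(writerSchema.getFullName());                // ps_git_repo
            System.out.println(SpecificData.get().getClass(writerSchema)); // null
        }
    }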

I've added the following lines to the JDBC connector config so that the connector registers the schema with the full name com.company.api.kafkademo.ps_git_repo, which matches the generated class:

"transforms":"AddNamespace",
"transforms.AddNamespace.type":"org.apache.kafka.connect.transforms.SetSchemaMetadata$Value",
"transforms.AddNamespace.schema.name": "com.company.api.kafkademo.ps_git_repo"
