
Spring Kafka Avro generated class not found

I'm using the Confluent JDBC connector to connect to a Postgres database, retrieve changes, and put them in a Kafka topic. Now I want to consume these messages with a Spring Boot consumer. The messages are in Avro format. I have the schema from the connector and generated a POJO class for it with the Avro Maven plugin.

But when the listener starts, I only get the following error:

   java.lang.IllegalStateException: This error handler cannot process 'SerializationException's directly; please consider configuring an 'ErrorHandlingDeserializer' in the value and/or key deserializer
       at org.springframework.kafka.listener.SeekUtils.seekOrRecover(SeekUtils.java:194) ~[spring-kafka-2.7.2.jar:2.7.2]
       at org.springframework.kafka.listener.SeekToCurrentErrorHandler.handle(SeekToCurrentErrorHandler.java:112) ~[spring-kafka-2.7.2.jar:2.7.2]
       at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.handleConsumerException(KafkaMessageListenerContainer.java:1598) ~[spring-kafka-2.7.2.jar:2.7.2]
       at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.run(KafkaMessageListenerContainer.java:1210) ~[spring-kafka-2.7.2.jar:2.7.2]
       at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) ~[na:na]
       at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264) ~[na:na]
       at java.base/java.lang.Thread.run(Thread.java:832) ~[na:na]
   Caused by: org.apache.kafka.common.errors.SerializationException: Error deserializing key/value for partition ps_git_repo-0 at offset 0. If needed, please seek past the record to continue consumption.
   Caused by: org.apache.kafka.common.errors.SerializationException: Could not find class ps_git_repo specified in writer's schema whilst finding reader's schema for a SpecificRecord.

When I do not use Avro to deserialise the data, I do receive messages, but they are unreadable.

In the pom.xml I have the following dependencies:


    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-web</artifactId>
    </dependency>
    <dependency>
        <groupId>org.springframework.kafka</groupId>
        <artifactId>spring-kafka</artifactId>
    </dependency>
    
    <dependency>
        <groupId>org.apache.avro</groupId>
        <artifactId>avro</artifactId>
        <version>1.10.2</version>
    </dependency>
    <dependency>
        <groupId>io.confluent</groupId>
        <artifactId>kafka-avro-serializer</artifactId>
        <version>6.2.0</version>
        <exclusions>
            <exclusion>
                <artifactId>netty</artifactId>
                <groupId>io.netty</groupId>
            </exclusion>
        </exclusions>
    </dependency>

In application.properties I've added the deserialisers and the schema registry URL:


    spring.kafka.consumer.key-deserializer = org.apache.kafka.common.serialization.StringDeserializer
    spring.kafka.consumer.value-deserializer = io.confluent.kafka.serializers.KafkaAvroDeserializer
    spring.kafka.bootstrap-servers = localhost:9092
    spring.kafka.consumer.properties.specific.avro.reader = true
    spring.kafka.consumer.properties.schema.registry.url = http://localhost:8081
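
As an aside, the stack trace itself suggests wrapping the deserialisers in Spring Kafka's `ErrorHandlingDeserializer` so a poison record surfaces cleanly instead of tripping the error handler. A sketch of that config, delegating to the same deserialisers as above (this only reports deserialisation failures more gracefully; it does not fix the class-lookup problem itself):

```properties
spring.kafka.consumer.key-deserializer = org.springframework.kafka.support.serializer.ErrorHandlingDeserializer
spring.kafka.consumer.value-deserializer = org.springframework.kafka.support.serializer.ErrorHandlingDeserializer
spring.kafka.consumer.properties.spring.deserializer.key.delegate.class = org.apache.kafka.common.serialization.StringDeserializer
spring.kafka.consumer.properties.spring.deserializer.value.delegate.class = io.confluent.kafka.serializers.KafkaAvroDeserializer
```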

In the build I use the Avro Maven plugin to generate a POJO from the schema created by the connector.

Plugin in pom.xml:


    <plugin>
        <groupId>org.apache.avro</groupId>
        <artifactId>avro-maven-plugin</artifactId>
        <version>1.10.2</version>
        <executions>
            <execution>
                <phase>generate-sources</phase>
                <goals>
                    <goal>schema</goal>
                </goals>
                <configuration>
                    <sourceDirectory>${project.basedir}/src/main/avro/</sourceDirectory>
                    <outputDirectory>${project.basedir}/src/main/java/</outputDirectory>
                    <stringType>String</stringType>
                </configuration>
            </execution>
        </executions>
    </plugin>

I've put the following schema into that folder and generated the POJO with `mvn generate-sources`.

Schema.avsc


    {
      "connect.name": "ps_git_repo",
      "fields": [
        {
          "name": "id",
          "type": "long"
        },
        {
          "default": null,
          "name": "name",
          "type": [
            "null",
            "string"
          ]
        }
      ],
      "name": "ps_git_repo",
      "namespace": "com.company.api.kafkademo",
      "type": "record"
    }
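
The error makes sense once you know that, with `specific.avro.reader = true`, the deserialiser resolves the Java class from the writer schema's *full name* (`namespace` + `name`). The schema the connector registered presumably looked like this (reconstructed from the error message, not copied from the registry), with no `namespace`, so the lookup is for a class literally named `ps_git_repo` in the default package:

```json
{
  "type": "record",
  "name": "ps_git_repo",
  "connect.name": "ps_git_repo",
  "fields": [
    { "name": "id", "type": "long" },
    { "name": "name", "type": ["null", "string"], "default": null }
  ]
}
```

The generated class, however, lives at `com.company.api.kafkademo.ps_git_repo`, so the lookup fails.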

I get the ps_git_repo.java class in the correct package, and then I have this listener to retrieve the messages:

    @SpringBootApplication
    @EnableKafka
    public class KafkaDemoApplication {
    
        @KafkaListener(groupId = "test123", topics = "ps_git_repo_test")
        public void handleMessage(ps_git_repo message) {
            System.out.println(message);
        }
    
        public static void main(String[] args) {
            SpringApplication.run(KafkaDemoApplication.class, args);
        }
    
    }

The schema cannot be found.

Does anybody know what is wrong?

The deserialiser used the connect.name field, which lacks the namespace, instead of the fully qualified name to find the correct class.

I've added the following lines to the config of the JDBC connector to make it register the schema with the correct namespace:

"transforms":"AddNamespace",
"transforms.AddNamespace.type":"org.apache.kafka.connect.transforms.SetSchemaMetadata$Value",
"transforms.AddNamespace.schema.name": "com.company.api.kafkademo.ps_git_repo"
