Kafka Avro serialize/deserialize into concrete type using schema registry failing
I can't seem to be able to consume messages as their concrete Avro type; I get the following exception:
class org.apache.avro.generic.GenericData$Record cannot be cast to class my.package.MyConcreteClass
Here is the code (I use Spring Boot):
MyProducer.java
private final KafkaTemplate<String, MyConcreteClass> kafkaTemplate;

public PositionProducer(KafkaTemplate<String, MyConcreteClass> kafkaTemplate) {
    this.kafkaTemplate = kafkaTemplate;
}

public void sendMessage(MyConcreteClass myConcreteClass) {
    this.kafkaTemplate.send(topic, myConcreteClass);
}
MyConsumer.java
@KafkaListener(topics = "#{'${consumer.topic.name}'}", groupId = "#{'${spring.kafka.consumer.group-id}'}")
public void listen(MyConcreteClass incomingMsg) {
//handle
}
Note that if I change everything to GenericRecord, the deserialization works properly, so I know all the config (not pasted) is correct.
It may also be important to note that I didn't register the schema myself; I let my client code do it for me.
Any ideas?
EDIT:

Config:
@Bean
public ConsumerFactory<String, MyConcreteClass> consumerFactory() {
    Map<String, Object> props = new HashMap<>();
    props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
    props.put(ConsumerConfig.GROUP_ID_CONFIG, groupId);
    props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, "org.apache.kafka.common.serialization.StringDeserializer");
    props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, "io.confluent.kafka.serializers.KafkaAvroDeserializer");
    props.put(KafkaAvroDeserializerConfig.SCHEMA_REGISTRY_URL_CONFIG, schemaRegistryUrl);
    return new DefaultKafkaConsumerFactory<>(props);
}
MyConcreteClass needs to implement SpecificRecord (the generated classes extend SpecificRecordBase).
You can use the Avro Maven plugin to generate it from a schema.
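A minimal plugin configuration might look like this (a sketch; the plugin version and the schema/output directories are assumptions, adjust them to your project):

```xml
<plugin>
  <groupId>org.apache.avro</groupId>
  <artifactId>avro-maven-plugin</artifactId>
  <version>1.11.3</version>
  <executions>
    <execution>
      <phase>generate-sources</phase>
      <goals>
        <goal>schema</goal>
      </goals>
      <configuration>
        <sourceDirectory>${project.basedir}/src/main/avro</sourceDirectory>
        <outputDirectory>${project.basedir}/target/generated-sources/avro</outputDirectory>
      </configuration>
    </execution>
  </executions>
</plugin>
```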
Then you must configure the deserializer so it knows you want specific records:
props.put(KafkaAvroDeserializerConfig.SPECIFIC_AVRO_READER_CONFIG, "true");
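For reference, here is what the full consumer property set looks like with that flag added. This sketch uses the plain string keys so it stays self-contained; the Confluent constant `KafkaAvroDeserializerConfig.SPECIFIC_AVRO_READER_CONFIG` resolves to `"specific.avro.reader"`, and the broker, group, and registry values are placeholders:

```java
import java.util.HashMap;
import java.util.Map;

public class ConsumerProps {

    // Builds the consumer properties; the string keys mirror the
    // Kafka ConsumerConfig / Confluent KafkaAvroDeserializerConfig constants.
    static Map<String, Object> build(String bootstrapServers, String groupId, String schemaRegistryUrl) {
        Map<String, Object> props = new HashMap<>();
        props.put("bootstrap.servers", bootstrapServers);
        props.put("group.id", groupId);
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "io.confluent.kafka.serializers.KafkaAvroDeserializer");
        props.put("schema.registry.url", schemaRegistryUrl);
        // The crucial flag: without it the deserializer returns GenericData.Record
        // instead of your generated SpecificRecord class.
        props.put("specific.avro.reader", "true");
        return props;
    }

    public static void main(String[] args) {
        Map<String, Object> props = build("localhost:9092", "my-group", "http://localhost:8081");
        System.out.println(props.get("specific.avro.reader"));
    }
}
```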
In addition to OneCricketeer's answer, I encountered another java.lang.ClassCastException after setting the specific Avro reader config. It was:
nested exception is java.lang.ClassCastException: class my.package.Envelope cannot be cast to class my.package.Envelope (my.package.Envelope is in unnamed module of loader 'app'; my.package.Envelope is in unnamed module of loader org.springframework.boot.devtools.restart.classloader.RestartClassLoader @3be312bd);
It seems Spring Boot devtools loaded the class in its restart classloader, so the JVM treated it as a different class from the one loaded by the app classloader. I removed spring-boot-devtools from the pom and it finally worked as expected.
<!-- Remove this from pom.xml -->
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-devtools</artifactId>
<scope>runtime</scope>
<optional>true</optional>
</dependency>
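Alternatively, if you want to keep devtools on the classpath, disabling its restart feature should avoid the classloader mismatch, since everything then loads through the regular app classloader (an option I haven't verified against this exact setup):

```yaml
spring:
  devtools:
    restart:
      enabled: false
```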
I was getting blocked by the same issue as you. The thing is that the KafkaAvroDeserializer was deserializing the message as GenericData$Record, so Spring Kafka then searched the class annotated with @KafkaListener for @KafkaHandler methods with a parameter of that type.
You'll need to add this property to your Spring Kafka configuration so the deserializer can directly return the SpecificRecord classes that you previously generated with the Avro plugin:
spring:
kafka:
properties:
specific.avro.reader: true
Then your consumer may look like this:
@KafkaListener(...)
public void consumeCreation(MyAvroGeneratedClass specificRecord) {
log.info("Consuming record: {}", specificRecord);
}
You need to customize your consumer configuration. The value deserializer needs to be a KafkaAvroDeserializer with a reference to your schema registry.