
Spring / Avro - using confluent schema registry

I am trying to use the Confluent Schema Registry, and it's working for me when I follow an example I found on GitHub (https://github.com/gAmUssA/springboot-kafka-avro).

When the consumer and the producer share the same namespace for the model, it works.

When the consumer is in a different project with a different namespace but the same class (name- and property-wise), it does not work.

The Confluent Avro deserializer can deserialize the message to a GenericData$Record with the correct values, but it cannot cast that record to the actual object.

I am trying this:


import lombok.AllArgsConstructor;
import lombok.Data;

@Data               // Lombok generates getters/setters, equals/hashCode, toString
@AllArgsConstructor // Lombok generates a constructor taking all fields
public class User {
    String name;
    int age;
}
...

@KafkaListener(topics = "users", groupId = "group_id")
public void consume(ConsumerRecord<String, User> record) {
    log.info(String.format("Consumed message -> %s", record.value().getName()));
}

The above code fails with a casting error.

When I add specific.avro.reader=true to the consumer properties, it also fails.
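For context, a minimal sketch of the kind of consumer configuration being described, assuming Confluent's KafkaAvroDeserializer; the bootstrap and Schema Registry URLs are placeholders, not values from the question:

```java
import java.util.Properties;

public class ConsumerProps {

    // Sketch of typical consumer settings for an Avro + Schema Registry setup.
    static Properties consumerProps() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");          // placeholder
        props.put("group.id", "group_id");
        props.put("key.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer",
                "io.confluent.kafka.serializers.KafkaAvroDeserializer");
        props.put("schema.registry.url", "http://localhost:8081"); // placeholder
        // Without this flag the Avro deserializer returns GenericData.Record;
        // with it, it tries to instantiate the SpecificRecord class whose
        // fully qualified name is taken from the schema's namespace + name.
        props.put("specific.avro.reader", "true");
        return props;
    }
}
```

This is why the namespace mismatch matters: with specific.avro.reader=true, the deserializer looks up the class by the name written in the schema, and if no matching generated class exists on the consumer's classpath, deserialization fails.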

Isn't this the whole purpose of the Schema Registry: to be a central repository so the data can be deserialized using the schema in different projects and even different languages (Python, Java, .NET, etc.)?

What am I missing?

The problem is that your User class is not an Avro-generated class.

You should be using the Avro Maven/Gradle plugins to generate your classes rather than writing them by hand with Lombok, and the namespaces and full schemas do need to align. The best way I've found to manage that is to develop and release the models entirely separately from the actual Kafka code.
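To make the alignment concrete, a minimal sketch of the shared schema file such a model project could contain; the file name `user.avsc` and the namespace `com.example.model` are placeholders. The Avro code generator derives the Java package of the generated `SpecificRecord` class from `namespace`, which is why producer and consumer must agree on it:

```json
{
  "type": "record",
  "name": "User",
  "namespace": "com.example.model",
  "fields": [
    { "name": "name", "type": "string" },
    { "name": "age",  "type": "int" }
  ]
}
```

Both the producer and the consumer projects would then depend on the same released artifact containing the class generated from this schema, instead of each hand-writing their own `User`.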
