How to send a message to Kafka with the Avro serializer and schema registry
I am trying to send an object to Kafka with the Avro serializer and schema registry.
Here is a simplified version of the code:
Properties props = new Properties();
...
props.put(KEY_SERIALIZER_CLASS_CONFIG, "org.apache.kafka.common.serialization.StringSerializer");
props.put(VALUE_SERIALIZER_CLASS_CONFIG, KafkaAvroSerializer.class.getName());
props.put(SCHEMA_REGISTRY_URL_CONFIG, "http://" + schemaRegistryHostname + ":8081");
Producer<String, User> producer = new KafkaProducer<>(props);
User user = new User("name", "address", 123);
ProducerRecord<String, User> record = new ProducerRecord<>(topic, key, user);
producer.send(record);
I assumed that the schema is read "behind the scenes" from the registry and the object (user) is serialized, but I get the error below.
What am I missing?
Do I have to read the schema explicitly and send a GenericRecord?
org.apache.kafka.common.errors.SerializationException: Error serializing Avro message
Caused by: java.lang.IllegalArgumentException: Unsupported Avro type. Supported types are null, Boolean, Integer, Long, Float, Double, String, byte[] and IndexedRecord
	at io.confluent.kafka.serializers.AbstractKafkaAvroSerDe.getSchema(AbstractKafkaAvroSerDe.java:123) ~[kafka-avro-serializer-3.3.0.jar!/:?]
	at io.confluent.kafka.serializers.AbstractKafkaAvroSerializer.serializeImpl(AbstractKafkaAvroSerializer.java:73) ~[kafka-avro-serializer-3.3.0.jar!/:?]
	at io.confluent.kafka.serializers.KafkaAvroSerializer.serialize(KafkaAvroSerializer.java:53) ~[kafka-avro-serializer-3.3.0.jar!/:?]
	at org.apache.kafka.clients.producer.KafkaProducer.send(KafkaProducer.java:424) ~[kafka-clients-0.9.0.1.jar!/:?]
Your code seems to be correct. The only thing that could be missing is that your Avro object was not properly generated by an Avro plugin: your User class needs to implement SpecificRecord, which extends IndexedRecord. A plain POJO does not satisfy the serializer, which is exactly what the "Unsupported Avro type" error is telling you.
Your code seems to be correct; you probably have not generated the classes from the .avsc file. Run mvn generate-sources with Maven (run this command in a terminal in your project folder).
It will then create a bean where you can set the values:
User user = User.newBuilder()
        .setName("xyz")
        .setAddress("CId432")
        .setPrice(123)
        .build();
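For mvn generate-sources to compile the .avsc files, the avro-maven-plugin must be bound to that phase in pom.xml. A typical configuration looks like the sketch below — the version and the source/output directories are conventional choices, not taken from the question:

```xml
<plugin>
  <groupId>org.apache.avro</groupId>
  <artifactId>avro-maven-plugin</artifactId>
  <version>1.8.2</version>
  <executions>
    <execution>
      <phase>generate-sources</phase>
      <goals>
        <goal>schema</goal>
      </goals>
      <configuration>
        <!-- where the .avsc schema files live -->
        <sourceDirectory>${project.basedir}/src/main/avro/</sourceDirectory>
        <!-- where the generated SpecificRecord classes are written -->
        <outputDirectory>${project.basedir}/src/main/java/</outputDirectory>
      </configuration>
    </execution>
  </executions>
</plugin>
```

With this in place, the generated User class implements SpecificRecord and the producer code from the question should serialize without the "Unsupported Avro type" error.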