Kafka, Avro and Schema Registry
I have a Kafka consumer configured with schema resolution from the topic via Schema Registry. What I would like to do is create another Avro schema on top of the current one and hydrate data using it: I don't need about 50% of the information, and I need to write some logic to change a couple of fields. That's just an example:
val consumer: KafkaConsumer<String, GenericRecord> = createConsumer()
while (true) {
    consumer.poll(Duration.ofSeconds(10)).forEach {
        println(it.value())
    }
}
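For context, a minimal sketch of what a `createConsumer()` factory for this setup might look like, assuming the Confluent `KafkaAvroDeserializer`; the broker address, registry URL, group id, and topic name are placeholders:

```kotlin
import io.confluent.kafka.serializers.KafkaAvroDeserializer
import org.apache.avro.generic.GenericRecord
import org.apache.kafka.clients.consumer.ConsumerConfig
import org.apache.kafka.clients.consumer.KafkaConsumer
import java.util.Properties

// Hypothetical factory matching the createConsumer() call above.
fun createConsumer(): KafkaConsumer<String, GenericRecord> {
    val props = Properties().apply {
        put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092")   // placeholder
        put(ConsumerConfig.GROUP_ID_CONFIG, "example-group")             // placeholder
        put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG,
            "org.apache.kafka.common.serialization.StringDeserializer")
        put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, KafkaAvroDeserializer::class.java.name)
        put("schema.registry.url", "http://localhost:8081")              // placeholder
        // specific.avro.reader defaults to false, so values arrive as GenericRecord
    }
    return KafkaConsumer<String, GenericRecord>(props).also {
        it.subscribe(listOf("example-topic"))                            // placeholder
    }
}
```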
The event returned from the stream is pretty complex, so I've modelled a smaller CustomObj as an .avsc file and compiled it to Java. When trying to run the code with CustomObj, I get:

Error deserializing key/value for partition

All I want to do is consume an event and then deserialize it into a much smaller object with just selected fields.
return KafkaConsumer<String, CustomObj>(props)
This didn't work. I'm not sure how I can deserialize it into CustomObj from the GenericRecord. Let me just add that I don't have any access to the stream or its config; I can only consume from it.
In Avro, your reader schema needs to be compatible with the writer schema. By giving the smaller object, you're providing a different reader schema.

It's not possible to directly deserialize to a subset of the input data, so you must parse the larger object and map it to the smaller one (which isn't what deserialization does).
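A sketch of that parse-and-map approach: keep consuming as GenericRecord, then project only the fields you need and apply your transformation logic by hand. The field names (`name`, `age`) and the tweak applied to them are made-up examples; in the question this target type would be the compiled CustomObj:

```kotlin
import org.apache.avro.generic.GenericRecord

// Hypothetical smaller type standing in for the compiled CustomObj.
data class CustomObj(val name: String, val age: Int)

// Manual projection from the full GenericRecord to the smaller object,
// with an example of changing a field along the way.
fun toCustomObj(record: GenericRecord): CustomObj =
    CustomObj(
        name = record.get("name").toString().uppercase(), // example field tweak
        age = record.get("age") as Int,
    )
```

The poll loop then stays typed as `KafkaConsumer<String, GenericRecord>` and calls `toCustomObj(it.value())` on each record, instead of asking the deserializer to produce CustomObj directly.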