Avro fails to deserialize message with updated schema
I have a schema which has been updated to include a new field. I'm using Avro reflection and the Confluent Schema Registry to serialize/deserialize data like so:
Serialization:
Schema schema = REFLECT_DATA.getSchema(value.getClass());
try {
int registeredSchemaId = this.schemaRegistry.register(subject, schema);
ByteArrayOutputStream out = new ByteArrayOutputStream();
out.write(0);
out.write(ByteBuffer.allocate(4).putInt(registeredSchemaId).array());
DatumWriter<Object> dw = new ReflectDatumWriter<>(schema);
Encoder encoder = ENCODER_FACTORY.directBinaryEncoder(out, null);
dw.write(value, encoder);
encoder.flush();
return out.toByteArray();
} catch (RuntimeException | IOException e) {
throw new SerializationException("Error serializing Avro message", e);
} catch (RestClientException e) {
throw new SerializationException("Error registering Avro schema: " + schema, e);
}
Deserialization:
if (readerSchema == null) {
readerSchema = new Schema.Parser().parse(schemaString);
}
int schemaId = -1;
try {
ByteBuffer buffer = ByteBuffer.wrap(payload);
if (buffer.get() != MAGIC_BYTE) {
throw new SerializationException("Unknown magic byte!");
}
schemaId = buffer.getInt();
Schema writerSchema = schemaRegistry.getById(schemaId);
int start = buffer.position() + buffer.arrayOffset();
int length = buffer.limit() - 1 - idSize;
DatumReader<Object> reader = new ReflectDatumReader<>(writerSchema, readerSchema);
BinaryDecoder decoder = decoderFactory.binaryDecoder(buffer.array(), start, length, null);
return reader.read(null, decoder); //line 83
} catch (IOException e) {
throw new SerializationException("Error deserializing Avro message for id " + schemaId, e);
} catch (RestClientException e) {
throw new SerializationException("Error retrieving Avro schema for id " + schemaId, e);
}
The schema is defined by a Scala case class; the old one looks like this:
case class Data(oldField: String) {
  def this() = this("")
}
and it has been updated like so:
case class Data(oldField: String, @AvroDefault("") newField: String) {
  def this() = this("", "")
}
However, deserializing sometimes throws an AvroTypeException with the following stack trace:
Caused by: org.apache.avro.AvroTypeException: Found com.company.project.DataHolder$.Data, expecting com.company.project.DataHolder$.Data
at org.apache.avro.io.ResolvingDecoder.doAction(ResolvingDecoder.java:231)
at org.apache.avro.io.parsing.Parser.advance(Parser.java:88)
at org.apache.avro.io.ResolvingDecoder.readFieldOrder(ResolvingDecoder.java:127)
at org.apache.avro.generic.GenericDatumReader.readRecord(GenericDatumReader.java:173)
at org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:148)
at org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:139)
at io.fama.pubsub.KafkaAvroReflectDeserializer.deserialize(KafkaAvroReflectDeserializer.java:83)
I think this is caused by trouble deserializing messages that were serialized with the old schema (but I'm not entirely sure - I can't reason out what else it could be). Has anyone else experienced this error, or does anyone have ideas on how to fix it?
If you're using org.apache.avro.reflect, then I don't think you can use Scala case classes: case class parameters are immutable, and I believe the reflection-based mapper needs a public no-arg constructor and Java-visible fields, possibly even @BeanProperty to generate Java getters/setters.
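The point about constructors can be illustrated with plain Java reflection: a compiled class that only exposes a parameterized constructor has no no-arg constructor for a reflective mapper to call, which is roughly the situation a compiled Scala case class is in unless an explicit `def this() = this(...)` auxiliary constructor is added. A minimal illustration (this is generic Java reflection, not Avro's actual instantiation path, which may use other mechanisms):

```java
public class NoArgCheck {
    // Mimics a compiled case class: only a parameterized constructor, no zero-arg one.
    static class Data {
        final String oldField;
        Data(String oldField) { this.oldField = oldField; }
    }

    static boolean hasNoArgConstructor(Class<?> cls) {
        try {
            cls.getDeclaredConstructor(); // throws if no zero-arg constructor exists
            return true;
        } catch (NoSuchMethodException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println(hasNoArgConstructor(Data.class));   // prints false
        System.out.println(hasNoArgConstructor(String.class)); // prints true
    }
}
```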