How to convert from Avro GenericRecord to JSON without adding schema name?
I have 2 schemas:
Event.avsc:
{
  "type": "record",
  "namespace": "com.onemount.jobs.transform.schema.avro",
  "name": "Event",
  "fields": [
    {
      "name": "id",
      "type": "string"
    },
    {
      "name": "mtp_interest_submit",
      "type": ["null", "InterestSubmitParam"],
      "default": null
    }
  ]
}
InterestSubmitParam.avsc:
{
  "type": "record",
  "namespace": "com.onemount.jobs.transform.schema.avro",
  "name": "InterestSubmitParam",
  "fields": [
    {
      "name": "interest",
      "type": {
        "type": "array",
        "items": "string"
      }
    }
  ]
}
I'm consuming Avro messages from Kafka (Confluent, with specific.avro.reader=false) and need to convert from GenericRecord to ObjectNode. This is the result:
{
  "id": "c8b76e58-9803-4c78-9f82-a185bda1cabf",
  "mtp_interest_submit": {
    "com.onemount.jobs.transform.schema.avro.InterestSubmitParam": {
      "interest": [
        "fashion",
        "travel"
      ]
    }
  }
}
But I expected it to be:
{
  "id": "c8b76e58-9803-4c78-9f82-a185bda1cabf",
  "mtp_interest_submit": {
    "interest": [
      "fashion",
      "travel"
    ]
  }
}
How can I fix this? This is my converter code:
GenericRecord genericRecord = ...
try (ByteArrayOutputStream outputStream = new ByteArrayOutputStream()) {
    DatumWriter<GenericRecord> writer = new GenericDatumWriter<>(genericRecord.getSchema());
    JsonEncoder encoder = EncoderFactory.get().jsonEncoder(genericRecord.getSchema(), outputStream);
    writer.write(genericRecord, encoder);
    encoder.flush();
    return new String(outputStream.toByteArray(), StandardCharsets.UTF_8);
}
Thanks a lot!
The type-name wrapper is not a bug: Avro's JSON encoding wraps every non-null union value in an object keyed by the branch's full type name, so the JSON can be decoded back unambiguously. By using jackson-dataformat-avro, which renders unions as plain JSON, the problem has been resolved:
ObjectMapper mapper = new ObjectMapper(new AvroFactory());
GenericRecord genericRecord = ...;
try (ByteArrayOutputStream outputStream = new ByteArrayOutputStream()) {
    // Re-encode the record as plain Avro binary (no Confluent wire format)...
    DatumWriter<GenericRecord> writer = new GenericDatumWriter<>(genericRecord.getSchema());
    BinaryEncoder encoder = EncoderFactory.get().binaryEncoder(outputStream, null);
    writer.write(genericRecord, encoder);
    encoder.flush();
    byte[] bytes = outputStream.toByteArray();
    // ...then decode it with Jackson, which emits union values without the type-name wrapper
    return mapper.readerFor(ObjectNode.class)
            .with(new AvroSchema(genericRecord.getSchema()))
            .readValue(bytes);
}
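Alternatively, if you want to keep the original JsonEncoder approach, you can post-process its output and strip the union wrappers yourself. Below is a minimal sketch; the UnionUnwrapper class and its dot-in-key heuristic are my own illustration, not part of Avro or Jackson, and the heuristic assumes union branches are namespaced record types (it would miss primitive branches such as {"string": ...}):

```java
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.node.ArrayNode;
import com.fasterxml.jackson.databind.node.ObjectNode;

import java.util.ArrayList;
import java.util.List;

public class UnionUnwrapper {

    // Recursively replaces {"some.namespace.TypeName": value} union wrappers
    // with the wrapped value. Heuristic: a single-field object whose only key
    // contains a dot is assumed to be an Avro union-branch wrapper.
    static JsonNode unwrapUnions(JsonNode node) {
        if (node.isObject()) {
            ObjectNode obj = (ObjectNode) node;
            if (obj.size() == 1) {
                String key = obj.fieldNames().next();
                if (key.contains(".")) {
                    return unwrapUnions(obj.get(key));
                }
            }
            // Collect names first so we can replace values while iterating safely
            List<String> names = new ArrayList<>();
            obj.fieldNames().forEachRemaining(names::add);
            for (String name : names) {
                obj.set(name, unwrapUnions(obj.get(name)));
            }
        } else if (node.isArray()) {
            ArrayNode arr = (ArrayNode) node;
            for (int i = 0; i < arr.size(); i++) {
                arr.set(i, unwrapUnions(arr.get(i)));
            }
        }
        return node;
    }
}
```

You would parse the JsonEncoder output with a plain ObjectMapper and pass the tree through unwrapUnions before returning it. The jackson-dataformat-avro route is cleaner, since it never produces the wrapper in the first place.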
pom.xml:
<dependency>
    <groupId>com.fasterxml.jackson.dataformat</groupId>
    <artifactId>jackson-dataformat-avro</artifactId>
    <version>2.12.3</version>
</dependency>