
Apache Kafka Avro Deserialization: Unable to deserialize or decode Specific type message.

I am trying to use Avro serialization with Apache Kafka to serialize/deserialize messages. I created a producer that serializes a specific-type message and sends it to the queue. The message is sent to the queue successfully, but when our consumer picks it up and tries to process it, we hit an exception while casting the bytes to the specific object. The exception is as below:

[error] (run-main-0) java.lang.ClassCastException: org.apache.avro.generic.GenericData$Record cannot be cast to com.harmeetsingh13.java.avroserializer.Customer
java.lang.ClassCastException: org.apache.avro.generic.GenericData$Record cannot be cast to com.harmeetsingh13.java.avroserializer.Customer
    at com.harmeetsingh13.java.consumers.avrodesrializer.AvroSpecificDeserializer.lambda$infiniteConsumer$0(AvroSpecificDeserializer.java:51)
    at java.lang.Iterable.forEach(Iterable.java:75)
    at com.harmeetsingh13.java.consumers.avrodesrializer.AvroSpecificDeserializer.infiniteConsumer(AvroSpecificDeserializer.java:46)
    at com.harmeetsingh13.java.consumers.avrodesrializer.AvroSpecificDeserializer.main(AvroSpecificDeserializer.java:63)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)

According to the exception, we are reading the data in an incorrect way. Below is our code:

Kafka Producer Code:

static {
    kafkaProps.put("bootstrap.servers", "localhost:9092");
    kafkaProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, KafkaAvroSerializer.class);
    kafkaProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, KafkaAvroSerializer.class);
    kafkaProps.put("schema.registry.url", "http://localhost:8081");
    kafkaProducer = new KafkaProducer<>(kafkaProps);
}


public static void main(String[] args) throws InterruptedException, IOException {
    Customer customer1 = new Customer(1002, "Jimmy");

    Parser parser = new Parser();
    Schema schema = parser.parse(AvroSpecificProducer.class
            .getClassLoader().getResourceAsStream("avro/customer.avsc"));

    SpecificDatumWriter<Customer> writer = new SpecificDatumWriter<>(schema);
    try (ByteArrayOutputStream os = new ByteArrayOutputStream()) {
        BinaryEncoder encoder = EncoderFactory.get().binaryEncoder(os, null);
        writer.write(customer1, encoder);
        encoder.flush();

        byte[] avroBytes = os.toByteArray();

        ProducerRecord<String, byte[]> record1 = new ProducerRecord<>("CustomerSpecificCountry",
                "Customer One 11 ", avroBytes);

        asyncSend(record1);
    }

    Thread.sleep(10000);
}

Kafka Consumer Code:

static {
    kafkaProps.put("bootstrap.servers", "localhost:9092");
    kafkaProps.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, KafkaAvroDeserializer.class);
    kafkaProps.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, KafkaAvroDeserializer.class);
    kafkaProps.put(ConsumerConfig.GROUP_ID_CONFIG, "CustomerCountryGroup1");
    kafkaProps.put("schema.registry.url", "http://localhost:8081");
}

public static void infiniteConsumer() throws IOException {
    try (KafkaConsumer<String, byte[]> kafkaConsumer = new KafkaConsumer<>(kafkaProps)) {
        kafkaConsumer.subscribe(Arrays.asList("CustomerSpecificCountry"));

        while (true) {
            ConsumerRecords<String, byte[]> records = kafkaConsumer.poll(100);
            System.out.println("<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<" + records.count());

            Schema.Parser parser = new Schema.Parser();
            Schema schema = parser.parse(AvroSpecificDeserializer.class
                    .getClassLoader().getResourceAsStream("avro/customer.avsc"));

            records.forEach(record -> {
                DatumReader<Customer> customerDatumReader = new SpecificDatumReader<>(schema);
                BinaryDecoder binaryDecoder = DecoderFactory.get().binaryDecoder(record.value(), null);
                try {
                    System.out.println(">>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>");
                    Customer customer = customerDatumReader.read(null, binaryDecoder);
                    System.out.println(customer);
                } catch (IOException e) {
                    e.printStackTrace();
                }
            });
        }
    }
}

Using the console consumer, we are able to receive the message successfully. So what is the way to decode the message into our POJO classes?

The solution to this problem is to use

DatumReader<GenericRecord> customerDatumReader = new SpecificDatumReader<>(schema);

instead of

DatumReader<Customer> customerDatumReader = new SpecificDatumReader<>(schema);

The exact reason for this has still not been found. It may be that, because Kafka doesn't know about the structure of the message, we explicitly define the schema for the message, and GenericRecord is useful for converting any message into a readable JSON-like format according to that schema. After creating the JSON, we can easily convert it into our POJO class.
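As an illustration of that last step, here is a minimal sketch that decodes the bytes into a GenericRecord (via a plain GenericDatumReader) and maps it onto the POJO by field name. The field names "id" and "name" are assumptions inferred from new Customer(1002, "Jimmy") above; adjust them to whatever avro/customer.avsc actually declares.

import java.io.IOException;
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericDatumReader;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.BinaryDecoder;
import org.apache.avro.io.DatumReader;
import org.apache.avro.io.DecoderFactory;

public class GenericRecordMapper {

    // Decode raw Avro bytes into a GenericRecord, then build the Customer POJO
    // by hand. Field names "id" and "name" are assumed from the producer code.
    public static Customer toCustomer(byte[] avroBytes, Schema schema) throws IOException {
        DatumReader<GenericRecord> reader = new GenericDatumReader<>(schema);
        BinaryDecoder decoder = DecoderFactory.get().binaryDecoder(avroBytes, null);
        GenericRecord record = reader.read(null, decoder);
        // Avro decodes strings as org.apache.avro.util.Utf8, hence toString().
        return new Customer((Integer) record.get("id"), record.get("name").toString());
    }
}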

But we still need to find a solution that converts directly into our POJO class.
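One candidate for that, sketched under the assumption that Customer was generated by the Avro compiler (and therefore implements SpecificRecord), is to build the reader from the class instead of a hand-parsed schema; a hand-written POJO will not work here.

import java.io.IOException;
import org.apache.avro.io.BinaryDecoder;
import org.apache.avro.io.DecoderFactory;
import org.apache.avro.specific.SpecificDatumReader;

public class SpecificCustomerDecoder {

    // Build the reader from the generated class rather than a parsed .avsc;
    // Avro then resolves the record to Customer directly. This only works
    // when Customer implements SpecificRecord, i.e. it was generated by the
    // Avro compiler rather than written by hand.
    public static Customer decode(byte[] avroBytes) throws IOException {
        SpecificDatumReader<Customer> reader = new SpecificDatumReader<>(Customer.class);
        BinaryDecoder decoder = DecoderFactory.get().binaryDecoder(avroBytes, null);
        return reader.read(null, decoder);
    }
}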

You don't need to do the Avro serialization explicitly before passing the values to ProducerRecord. The serializer will do it for you. Your code would look like:

Customer customer1 = new Customer(1002, "Jimmy");
ProducerRecord<String, Customer> record1 = new ProducerRecord<>("CustomerSpecificCountry", customer1);
asyncSend(record1);
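The consumer side has a matching shortcut. A sketch, assuming the Confluent Avro serde already referenced in the configs above: with specific.avro.reader set to true, KafkaAvroDeserializer returns Customer instances rather than GenericData$Record, which is exactly the cast that fails in the question.

import java.util.Properties;
import io.confluent.kafka.serializers.KafkaAvroDeserializer;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class SpecificAvroConsumerFactory {

    // Configure the Confluent deserializer to hand back the generated specific
    // class. Customer must implement SpecificRecord for this flag to take effect.
    public static KafkaConsumer<String, Customer> create() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "CustomerCountryGroup1");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, KafkaAvroDeserializer.class);
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, KafkaAvroDeserializer.class);
        props.put("schema.registry.url", "http://localhost:8081");
        props.put("specific.avro.reader", "true");
        return new KafkaConsumer<>(props);
    }
}

With that in place, the poll loop can consume ConsumerRecords<String, Customer> without any manual DatumReader work.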

See an example from Confluent of a simple producer using Avro.
