
kafka streams protobuf cast exception

I am using Kafka Streams to read and process protobuf messages.

I am using the following properties for the stream:


    // Builds the configuration passed to the KafkaStreams instance (enclosing method signature assumed; the original snippet starts mid-method).
    private Properties buildStreamsProperties() {
        Properties properties = new Properties();
        properties.put(ConsumerConfig.GROUP_ID_CONFIG, kafkaConfig.getGroupId());
        properties.put(StreamsConfig.CLIENT_ID_CONFIG, kafkaConfig.getClientId());
        properties.put(StreamsConfig.APPLICATION_ID_CONFIG, kafkaConfig.getApplicationId());
        properties.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, kafkaConfig.getBootstrapServers());

        properties.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.StringSerde.class);
        properties.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, KafkaProtobufSerde.class);
        properties.put(AbstractKafkaSchemaSerDeConfig.SCHEMA_REGISTRY_URL_CONFIG, kafkaConfig.getSchemaRegistryUrl());
        properties.put(KafkaProtobufDeserializerConfig.SPECIFIC_PROTOBUF_VALUE_TYPE, ProtobufData.class);
        return properties;
    }

But at runtime I get this error:

Caused by: java.lang.ClassCastException: class com.google.protobuf.DynamicMessage cannot be cast to class model.schema.proto.input.ProtobufDataProto$ProtobufData (com.google.protobuf.DynamicMessage and model.schema.proto.input.ProtobufDataProto$ProtobufData are in unnamed module of loader 'app')

My .proto file looks like this:

import "inner_data.proto";
package myPackage;

option java_package = "model.schema.proto.input";
option java_outer_classname = "ProtobufDataProto";

message OuterData {
    string timestamp = 1;
    string x = 3;
    repeated InnerObject flows = 4;
}

(I have two separate .proto files.)

syntax = "proto3";

package myPackage;

option java_package = "model.schema.proto.input";
option java_outer_classname = "InnerDataProto";

message InnerData {
  string a = 1;
  string b = 2;
  string c = 3;
}

I would like to know why Kafka deserializes into DynamicMessage even though I set the specific protobuf value class in the properties, and how can I fix this?

I ran into the same problem while trying to get Kafka Streams working with protobuf.

I solved it by configuring a dedicated KafkaProtobufSerde for the stream builder and explicitly specifying the class to deserialize into, using the following line: serdeConfig.put(SPECIFIC_PROTOBUF_VALUE_TYPE, StreamStateEvent.class.getName());

    /*
     * Define a specific serde for the protobuf event type
     */
    final KafkaProtobufSerde<StreamStateEvent> protoSerde = new KafkaProtobufSerde<>();
    Map<String, String> serdeConfig = new HashMap<>();
    serdeConfig.put(SCHEMA_REGISTRY_URL_CONFIG, registryUrl);
    /*
     * Technically, the following line is only mandatory in order to deserialize objects into GeneratedMessageV3
     * and NOT into DynamicMessage: https://developers.google.com/protocol-buffers/docs/reference/java/com/google/protobuf/DynamicMessage
     */
    serdeConfig.put(SPECIFIC_PROTOBUF_VALUE_TYPE, StreamStateEvent.class.getName());
    // isKey = false: this serde is configured for record values, not keys
    protoSerde.configure(serdeConfig, false);
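
The snippet above relies on static imports for the config keys. Assuming the standard Confluent serde artifacts, the imports would look roughly like this (exact package names can vary between Confluent Platform versions; StreamStateEvent and registryUrl are the answerer's own class and variable):

    import java.util.HashMap;
    import java.util.Map;

    // KafkaProtobufSerde ships with Confluent's Kafka Streams protobuf serde artifact
    import io.confluent.kafka.streams.serdes.protobuf.KafkaProtobufSerde;

    // static config keys used above
    import static io.confluent.kafka.serializers.AbstractKafkaSchemaSerDeConfig.SCHEMA_REGISTRY_URL_CONFIG;
    import static io.confluent.kafka.serializers.protobuf.KafkaProtobufDeserializerConfig.SPECIFIC_PROTOBUF_VALUE_TYPE;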

然后我可以創建我的輸入 stream 並將其反序列化:

 // Define a Serde for the key
 final Serde<byte[]> bytesSerde = Serdes.ByteArray();
 // Define the stream
 StreamsBuilder streamsBuilder = new StreamsBuilder();
 KStream<byte[], StreamStateEvent> events =
         streamsBuilder.stream("inputTopic", Consumed.with(bytesSerde, protoSerde));
 /*
  * add your transformations here: map, filter, etc.
  */
 streamsBuilder.build();
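
Applied to the schema in the question, a minimal sketch could look like the following. It assumes the generated value class is model.schema.proto.input.ProtobufDataProto.OuterData (protoc nests messages inside the java_outer_classname class when java_multiple_files is not set), and that "inputTopic" and registryUrl are placeholders for your own topic and Schema Registry URL:

 // Value serde bound to the generated OuterData class so records are
 // deserialized into OuterData instead of DynamicMessage.
 final KafkaProtobufSerde<ProtobufDataProto.OuterData> valueSerde = new KafkaProtobufSerde<>();
 Map<String, String> valueSerdeConfig = new HashMap<>();
 valueSerdeConfig.put(SCHEMA_REGISTRY_URL_CONFIG, registryUrl);
 valueSerdeConfig.put(SPECIFIC_PROTOBUF_VALUE_TYPE, ProtobufDataProto.OuterData.class.getName());
 valueSerde.configure(valueSerdeConfig, false); // false = configuring a value serde

 // Pass the serde explicitly via Consumed.with(...) rather than relying on the default serde.
 StreamsBuilder builder = new StreamsBuilder();
 KStream<String, ProtobufDataProto.OuterData> outerData =
         builder.stream("inputTopic", Consumed.with(Serdes.String(), valueSerde));
 outerData.foreach((key, value) -> System.out.println(value.getTimestamp()));

The key point, as in the answer above, is that the specific protobuf type is set on the serde instance actually handed to Consumed.with, rather than only on the default serde in the global properties.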

