
Message key as Long in Kafka Streams

I am trying to use Long as the type of the message key, but I get:

Exception in thread "kafka_stream_app-f236aaca-3f90-469d-9d32-20ff694806ff-StreamThread-1" org.apache.kafka.streams.errors.StreamsException: Failed to deserialize key for record. topic=test, partition=0, offset=0
    at org.apache.kafka.streams.processor.internals.SourceNodeRecordDeserializer.deserialize(SourceNodeRecordDeserializer.java:38)
    at org.apache.kafka.streams.processor.internals.RecordQueue.addRawRecords(RecordQueue.java:84)
    at org.apache.kafka.streams.processor.internals.PartitionGroup.addRawRecords(PartitionGroup.java:117)
    at org.apache.kafka.streams.processor.internals.StreamTask.addRecords(StreamTask.java:474)
    at org.apache.kafka.streams.processor.internals.StreamThread.addRecordsToTasks(StreamThread.java:642)
    at org.apache.kafka.streams.processor.internals.StreamThread.runLoop(StreamThread.java:548)
    at org.apache.kafka.streams.processor.internals.StreamThread.run(StreamThread.java:519)
Caused by: org.apache.kafka.common.errors.SerializationException: Size of data received by LongDeserializer is not 8

I checked, and data.length is 7.

In streamsConfiguration I've set:

streamsConfiguration.put(StreamsConfig.KEY_SERDE_CLASS_CONFIG, Serdes.Long().getClass().getName());

and I use:

KStream<Long, GenericRecord> stream = builder.stream(topic);

I've tried sending the message via a simple app and also with kafka-avro-console-producer:

/opt/confluent-3.3.0/bin/kafka-avro-console-producer \
--broker-list localhost:9092 \
--topic test \
--property key.separator=, \
--property parse.key=true \
--property key.schema='{"type":"long"}' \
--property value.schema='{"type":"string"}' \
--property schema.registry.url=http://localhost:8081

with the message:

123,"293"

Using kafka-avro-console-consumer I can consume the message and see (with --property print.key=true) that the key sent is correctly 123.

Any idea what could be wrong when decoding the message?

Because you are using kafka-avro-console-producer, the key is not serialized as a plain Long but as an Avro type. Thus, you need to use a corresponding Avro Serde with the same schema you used on the write path (i.e., '{"type":"long"}').
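
This also explains the data.length of 7 reported above. A minimal sketch, assuming the standard Confluent wire format (one magic byte plus a 4-byte schema id in front of the Avro payload) and Avro's plain EncoderFactory API, reproduces that length for the key 123:

import java.io.ByteArrayOutputStream;
import java.io.IOException;
import org.apache.avro.io.BinaryEncoder;
import org.apache.avro.io.EncoderFactory;

public class WireFormatSketch {
    public static void main(String[] args) throws IOException {
        // Avro encodes a long as a zig-zag varint, not as 8 fixed bytes.
        ByteArrayOutputStream avroBody = new ByteArrayOutputStream();
        BinaryEncoder encoder = EncoderFactory.get().binaryEncoder(avroBody, null);
        encoder.writeLong(123L);
        encoder.flush();
        System.out.println("Avro-encoded key: " + avroBody.size() + " bytes"); // 2

        // The Confluent serializer prepends 1 magic byte and a 4-byte schema id,
        // so the key on the wire is 1 + 4 + 2 = 7 bytes -- exactly the length
        // LongDeserializer rejects because it is not 8.
        System.out.println("Framed key: " + (1 + 4 + avroBody.size()) + " bytes"); // 7
    }
}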

Also, your return type will not be Long but an Avro type.
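
A minimal sketch of the read path, assuming the Confluent kafka-avro-serializer artifact (KafkaAvroSerializer / KafkaAvroDeserializer) and the Kafka Streams 1.0+ StreamsBuilder API; the application id, bootstrap server, and topic name are placeholders taken from the question:

import java.util.Collections;
import java.util.Map;
import java.util.Properties;
import org.apache.kafka.common.serialization.Serde;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.KStream;
import io.confluent.kafka.serializers.KafkaAvroDeserializer;
import io.confluent.kafka.serializers.KafkaAvroSerializer;

public class AvroKeyStreamSketch {
    public static void main(String[] args) {
        // Schema Registry endpoint used by kafka-avro-console-producer above.
        Map<String, Object> serdeConfig =
                Collections.singletonMap("schema.registry.url", "http://localhost:8081");

        // Wrap the Confluent Avro (de)serializers in Serdes; they understand the
        // magic-byte + schema-id framing the console producer writes.
        Serde<Object> keySerde =
                Serdes.serdeFrom(new KafkaAvroSerializer(), new KafkaAvroDeserializer());
        keySerde.configure(serdeConfig, true);    // true  -> serde is used for keys
        Serde<Object> valueSerde =
                Serdes.serdeFrom(new KafkaAvroSerializer(), new KafkaAvroDeserializer());
        valueSerde.configure(serdeConfig, false); // false -> serde is used for values

        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "kafka_stream_app");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

        StreamsBuilder builder = new StreamsBuilder();
        // The decoded key/value is whatever Java object the Avro decoder yields
        // for the registered schema, not a raw 8-byte long.
        KStream<Object, Object> stream =
                builder.stream("test", Consumed.with(keySerde, valueSerde));
        stream.foreach((key, value) -> System.out.println(key + " -> " + value));

        new KafkaStreams(builder.build(), props).start();
    }
}

Alternatively, if you control the producer and do not need the Schema Registry framing on the key, you can write the key with the plain LongSerializer; then Serdes.Long() on the Streams side works as originally configured.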
