

Error while consuming AVRO Kafka Topic from KSQL Stream

I created some dummy data as a stream in ksqlDB with VALUE_FORMAT='JSON' and TOPIC='MYTOPIC'.
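For context, a ksqlDB statement of that kind looks roughly like the sketch below. The column names are placeholders for illustration; only VALUE_FORMAT='JSON' and the topic name come from my setup.

CREATE STREAM MYSTREAM (ID VARCHAR KEY, NAME VARCHAR, AMOUNT DOUBLE)
  WITH (KAFKA_TOPIC='MYTOPIC', VALUE_FORMAT='JSON', PARTITIONS=1);

INSERT INTO MYSTREAM (ID, NAME, AMOUNT) VALUES ('1', 'test', 42.0);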

The setup runs via Docker Compose: a Kafka broker, Schema Registry, ksqldb-cli, ksqldb-server, and ZooKeeper.

Now I want to consume these records from the topic. My first and last approach was the command line, with the following command:

docker run --net=host --rm confluentinc/cp-schema-registry:5.0.0 kafka-avro-console-consumer \
  --bootstrap-server localhost:29092 --topic DXT --from-beginning --max-messages 10 \
  --property print.key=true --property print.value=true \
  --value-deserializer io.confluent.kafka.serializers.KafkaAvroDeserializer \
  --key-deserializer org.apache.kafka.common.serialization.StringDeserializer

But that just returns the error:

[2021-04-22 21:45:42,926] ERROR Unknown error when running consumer:  (kafka.tools.ConsoleConsumer$:76)
org.apache.kafka.common.errors.SerializationException: Error deserializing Avro message for id -1
Caused by: org.apache.kafka.common.errors.SerializationException: Unknown magic byte!

I also tried different use cases in Java Spring, but to no avail. I just cannot consume the created topics. If I need to define my own schema, where should I do that, and what would be the easiest way, given that I just created a stream in ksqlDB? Is there an easy-to-follow example? I did not specify anything else when I created the stream, just like in the quickstart example on Ksqldb.io (I added the Schema Registry to my deployment). As I am a noob who has been sitting here for almost 10 hours, any help would be appreciated.

Edit: I found that pure JSON does not need the Schema Registry with ksqlDB (see here). But how do I deserialize it?

If you've written JSON data to the topic then you can read it with the kafka-console-consumer.
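A minimal sketch of such an invocation, matching the Dockerised setup above (the broker address and topic name are taken from the question; the confluentinc/cp-kafka image is simply one image that ships kafka-console-consumer):

docker run --net=host --rm confluentinc/cp-kafka:5.0.0 kafka-console-consumer \
  --bootstrap-server localhost:29092 --topic DXT --from-beginning --max-messages 10 \
  --property print.key=true \
  --key-deserializer org.apache.kafka.common.serialization.StringDeserializer \
  --value-deserializer org.apache.kafka.common.serialization.StringDeserializer

Because JSON is plain text, the string deserialisers print the records as-is and no Schema Registry lookup is involved.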

The error you're getting ( Error deserializing Avro message for id -1…Unknown magic byte! ) is because you're using the kafka-avro-console-consumer, which attempts to deserialise the topic data as Avro - which it isn't, hence the error.

You can also use PRINT DXT; from within ksqlDB.
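For example, a quick sketch in the ksqlDB CLI (the FROM BEGINNING and LIMIT clauses are optional):

PRINT DXT FROM BEGINNING LIMIT 10;

PRINT inspects the raw topic and detects the serialisation format itself, so it works for JSON as well as Avro data.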
