
How to use from_json with Kafka connect 0.10 and Spark Structured Streaming?

I was trying to reproduce the example from [Databricks][1] and apply it to the new Kafka connector and Spark Structured Streaming, but I cannot parse the JSON correctly using the out-of-the-box methods in Spark...

Note: the topic is written into Kafka in JSON format.
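For example, a record on the topic could look like {"id": "1", "name": "alice"} (a hypothetical payload, shown only to illustrate the flat JSON shape that the schema in the answer below assumes).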

// Read the topic from Kafka as a streaming DataFrame; key and value arrive as binary columns.
val ds1 = spark
          .readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", IP + ":9092")
          .option("zookeeper.connect", IP + ":2181")
          .option("subscribe", TOPIC)
          .option("startingOffsets", "earliest")
          .option("max.poll.records", 10)
          .option("failOnDataLoss", false)
          .load()

The following code won't work; I believe that's because the column json is a string and does not match the from_json method signature...

    val df = ds1.select($"value" cast "string" as "json")
                .select(from_json("json") as "data")
                .select("data.*")

Any tips?

[UPDATE] Working example: https://github.com/katsou55/kafka-spark-structured-streaming-example/blob/master/src/main/scala-2.11/Main.scala

First you need to define the schema for your JSON message. For example:

import org.apache.spark.sql.types._
import spark.implicits._  // enables the $"..." column syntax used below

val schema = new StructType()
  .add($"id".string)
  .add($"name".string)

Now you can use this schema in the from_json method as shown below.

import org.apache.spark.sql.functions.from_json

val df = ds1.select($"value" cast "string" as "json")
            .select(from_json($"json", schema) as "data")
            .select("data.*")


