
Spark 2.0.1: split JSON Array Column into ArrayType(StringType)

I have a dataframe like this:

root
 |-- sum_id: long (nullable = true)
 |-- json: string (nullable = true)

+-------+------------------------------+
|sum_id |json                          |
+-------+------------------------------+
|8124455|[{"itemId":11},{"itemId":12}] |
|8124457|[{"itemId":53}]               |
|8124458|[{"itemId":11},{"itemId":33}] |
+-------+------------------------------+

which I want to explode with Scala into:

root
 |-- sum_id: long (nullable = true)
 |-- itemId: int (nullable = true)

+-------+--------+
|sum_id |itemId  |
+-------+--------+
|8124455|11      |
|8124455|12      |
|8124457|53      |
|8124458|11      |
|8124458|33      |
+-------+--------+

What I have tried:

  1. Using get_json_object, but the column is an array of JSON objects, so I think it should first be exploded into individual objects, but how?

  2. Tried casting the column json from StringType to ArrayType(StringType), but got a data type mismatch exception (see the sketch below).

Please guide me on how to solve this problem.
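
For reference, the second attempt looked roughly like this (a minimal sketch reconstructed from the description; the exact cast call is an assumption, with df standing in for the dataframe above):

import org.apache.spark.sql.functions.col
import org.apache.spark.sql.types.{ArrayType, StringType}

// Casting a StringType column directly to ArrayType(StringType) is not
// supported, so analysis fails with a "data type mismatch" error.
df.withColumn("json", col("json").cast(ArrayType(StringType)))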

The code below will do exactly what you want:

import org.apache.spark.sql.functions.{col, explode, get_json_object, udf}

// get_json_object(col("json"), "$[*].itemId") extracts the ids as a single
// string such as "[11,12]"; this UDF strips the brackets and splits it
// into an array of id strings.
val toItemArr = udf((jsonArrStr: String) => {
  jsonArrStr.replace("[", "").replace("]", "").split(",")
})

inputDataFrame
  .withColumn("itemId", explode(toItemArr(get_json_object(col("json"), "$[*].itemId"))))
  .drop("json")
  .show


+-------+------+
| sum_id|itemId|
+-------+------+
|8124455|    11|
|8124455|    12|
|8124457|    53|
|8124458|    11|
|8124458|    33|
+-------+------+
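
Note that the UDF splits a plain string, so the resulting itemId column is StringType rather than the int from the desired schema. If you need the integer type, a cast fixes it (my addition, assuming the same inputDataFrame; not part of the original answer):

inputDataFrame
  .withColumn("itemId", explode(toItemArr(get_json_object(col("json"), "$[*].itemId"))))
  .drop("json")
  .withColumn("itemId", col("itemId").cast("int"))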

Since you are working with JSON, this may be the best approach.

Have a look at the following:

import org.apache.spark._
import com.fasterxml.jackson.module.scala.DefaultScalaModule
import com.fasterxml.jackson.module.scala.experimental.ScalaObjectMapper
import com.fasterxml.jackson.databind.ObjectMapper
import com.fasterxml.jackson.databind.DeserializationFeature

val df = sc.parallelize(Seq(
  (8124455, """[{"itemId":11},{"itemId":12}]"""),
  (8124457, """[{"itemId":53}]"""),
  (8124458, """[{"itemId":11},{"itemId":33}]""")
)).toDF("sum_id", "json")

val result = df.rdd.mapPartitions(records => {
  // Build one Jackson mapper per partition rather than per record.
  val mapper = new ObjectMapper with ScalaObjectMapper
  mapper.configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false)
  mapper.registerModule(DefaultScalaModule)

  // Parse each JSON string into the list of itemId values; rows that
  // fail to parse are silently dropped.
  val values = records.flatMap(record => {
    try {
      Some((record.getInt(0),
        mapper.readValue(record.getString(1), classOf[List[Map[String, Int]]])
          .flatMap(_.values)))
    } catch {
      case e: Exception => None
    }
  })

  // Emit one (sum_id, itemId) pair per item.
  values.flatMap { case (sumId, itemIds) => itemIds.map(itemId => (sumId, itemId)) }
}, true)

result.toDF.show()
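
As written, result is an RDD of pairs, so toDF() names the columns _1 and _2. Passing explicit names (a small tweak, not in the original answer) restores the schema from the question:

result.toDF("sum_id", "itemId").show()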
