
java.lang.ClassCastException: GenericRowWithSchema cannot be cast to scala.collection.Seq

How do I convert the RDD of a map to a wrapped Array? I am getting the error below.

Schema:

(Schema screenshot not reproduced; the relevant part, per the answer below: Pduct: struct (nullable = true), nested under the ArrTeber and ArrTeberPRD arrays of structs.)

When I try to convert the dataframe to a POJO, I get the exception below:

java.lang.ClassCastException: 
  org.apache.spark.sql.catalyst.expressions.GenericRowWithSchema cannot 
  be cast to scala.collection.Seq

Code:

rdd.map(row => {
  var arrm_list: Seq[Row] = row.getAs[AnyRef]("ArrTeber").asInstanceOf[Seq[Row]]
  // working fine here
  arrm_list.foreach(x => {
    var arrmtrb__list: Seq[Row] = x.getAs[AnyRef]("ArrTeberPRD").asInstanceOf[Seq[Row]]
    // working fine here
    arrmtrb__list.foreach(pt => {
      var pd__list: Seq[Row] = pt.getAs[AnyRef]("Pduct").asInstanceOf[Seq[Row]] // raises the error
    })
  })
})

The above exception is simply a class cast exception: a struct cannot be cast to a Seq of structs (see the schema: Pduct: struct (nullable = true)). Cast Pduct to Row instead, and then extract its nested members.

df.foreach(row => {
  // arrays of structs come back as Seq[Row]
  var arrm_list: Seq[Row] = row.getAs[Seq[Row]]("ArrTeber")

  arrm_list.foreach(x => {
    var arrmtrb__list: Seq[Row] = x.getAs[Seq[Row]]("ArrTeberPRD")

    arrmtrb__list.foreach(pt => {
      // a lone struct comes back as a single Row, not Seq[Row]
      var pd__list: Row = pt.getAs[Row]("Pduct") // Pduct: struct (nullable = true)
    })
  })
})
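For reference, here is a minimal, self-contained sketch of the fix above that you can run locally. The schema and the leaf field name ("name") are assumptions reconstructed from the code in the question, not the asker's actual schema.

import org.apache.spark.sql.{Row, SparkSession}
import org.apache.spark.sql.types._

object NestedRowDemo {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("nested-row-demo")
      .getOrCreate()

    // Hypothetical schema mirroring the shape implied by the question:
    // ArrTeber is an array of structs, each holding an array ArrTeberPRD,
    // whose elements contain a single struct field Pduct.
    val schema = StructType(Seq(
      StructField("ArrTeber", ArrayType(StructType(Seq(
        StructField("ArrTeberPRD", ArrayType(StructType(Seq(
          StructField("Pduct",
            StructType(Seq(StructField("name", StringType, nullable = true))), // "name" is an assumed leaf field
            nullable = true)
        ))))
      ))))
    ))

    val data = Seq(
      Row(Seq(Row(Seq(Row(Row("widget"))))))
    )
    val df = spark.createDataFrame(spark.sparkContext.parallelize(data), schema)

    // Arrays of structs come back as Seq[Row]; a lone struct comes back as Row.
    df.collect().foreach { row =>
      val arrm_list: Seq[Row] = row.getAs[Seq[Row]]("ArrTeber")
      arrm_list.foreach { x =>
        val arrmtrb_list: Seq[Row] = x.getAs[Seq[Row]]("ArrTeberPRD")
        arrmtrb_list.foreach { pt =>
          val pd: Row = pt.getAs[Row]("Pduct") // struct -> Row, not Seq[Row]
          println(pd.getAs[String]("name"))    // prints "widget"
        }
      }
    }

    spark.stop()
  }
}

As a design note, the same nested members can usually be reached without any manual casts by staying in the DataFrame API, e.g. exploding the arrays with explode(col("ArrTeber")) and then selecting dot paths into the struct; that keeps the work in Catalyst and sidesteps ClassCastExceptions entirely.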
