

Schema for type Any is not supported

I'm trying to create a Spark UDF to extract a Map of (key, value) pairs from a user-defined case class.

The Scala function seems to work fine, but when I try to convert it to a UDF in Spark 2.0, I run into the "Schema for type Any is not supported" error.

case class myType(c1: String, c2: Int)

def getCaseClassParams(cc: Product): Map[String, Any] = {
  cc.getClass
    .getDeclaredFields               // all field names
    .map(_.getName)
    .zip(cc.productIterator.toSeq)   // zipped with all values
    .toMap
}
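For example, calling it directly on an instance (the value below is purely illustrative) returns the expected map in plain Scala:

    getCaseClassParams(myType("abc", 1))
    // Map(c1 -> abc, c2 -> 1)   (field order may vary)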

But when I try to instantiate the function as a UDF, it results in the following error:

val ccUDF = udf{(cc: Product, i: String) => getCaseClassParams(cc).get(i)}

java.lang.UnsupportedOperationException: Schema for type Any is not supported
  at org.apache.spark.sql.catalyst.ScalaReflection$.schemaFor(ScalaReflection.scala:716)
  at org.apache.spark.sql.catalyst.ScalaReflection$.schemaFor(ScalaReflection.scala:668)
  at org.apache.spark.sql.catalyst.ScalaReflection$.schemaFor(ScalaReflection.scala:654)
  at org.apache.spark.sql.functions$.udf(functions.scala:2841)

The error message says it all. You have an Any in the map. Spark SQL and the Dataset API do not support Any in the schema. It has to be one of the supported types: a basic type such as String, Integer, etc., a sequence of supported types, or a map of supported types.
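As a minimal sketch of one way around it (the names caseClassParamsAsStrings and myTypeParamsUDF below are illustrative, not from the original post): render the values as String so the function returns Map[String, String], which is a map of supported types, and give the UDF concrete parameter types rather than Product, since Spark would likely also fail to derive an input schema for Product (struct columns reach a UDF as Row, not as the case class).

    import org.apache.spark.sql.functions.udf

    // Same idea as getCaseClassParams, but values are rendered as String,
    // so the result type Map[String, String] maps to MapType(StringType, StringType).
    def caseClassParamsAsStrings(cc: Product): Map[String, String] =
      cc.getClass
        .getDeclaredFields
        .map(_.getName)
        .zip(cc.productIterator.map(_.toString).toSeq)
        .toMap

    // UDF built from concrete column types instead of Product.
    val myTypeParamsUDF = udf { (c1: String, c2: Int) =>
      caseClassParamsAsStrings(myType(c1, c2))
    }

It can then be applied to the individual columns, e.g. myTypeParamsUDF(col("c1"), col("c2")).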
