Scala: access Map[String, Any] with nested data structures
I have a simple map like
val parameters: Map[String, Any] = Map("digits" -> Seq(1, 2, 3, 4, 5, 6, 7, 8, 9, 0))
and I want to multiply each number by 3, as shown below:
import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession

class PrintMap extends App {
  val conf: SparkConf = new SparkConf()
    .setAppName("sparkApiSample")
    .setMaster("local[*]")
  val session: SparkSession = SparkSession
    .builder()
    .config(conf)
    .getOrCreate()

  val parameters: Map[String, Any] = Map("digits" -> Seq(1, 2, 3, 4, 5, 6, 7, 8, 9, 0))
  val numbers: Seq[Int] = parameters("digits").asInstanceOf[Seq[Int]]
  val rdd = session.sparkContext.parallelize(numbers)
  val result = Map("result" -> rdd.map(x => x * 3).collect())

  // want to access / print the contents of the Array at "result"
  result.get("result") match {
    case Some(x) => x.asInstanceOf[Seq[Any]].foreach(println)
    case None => println("error occurred")
  }
}
Why does it result in the following exception, and how can I actually access the map?

java.lang.ClassCastException: [I cannot be cast to scala.collection.Seq
collect on an RDD returns an Array, and Array does not extend Seq, so your x cannot be cast to a Seq.
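The failure can be reproduced in plain Scala without Spark at all; this is a minimal sketch (the map key and values are made up for illustration) showing that an Array[Int] stored as Any cannot be cast to Seq at runtime:

```scala
object CastDemo extends App {
  // An Array[Int] hiding behind Any, just like the collect() result in the question.
  val result: Map[String, Any] = Map("result" -> Array(3, 6, 9))

  // Array[Int] is a JVM primitive int[] ("[I" in the stack trace),
  // which is not a scala.collection.Seq, so this cast fails at runtime.
  try {
    result("result").asInstanceOf[Seq[Int]].foreach(println)
  } catch {
    case e: ClassCastException => println(s"cast failed: ${e.getMessage}")
  }
}
```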
For example:
Array(2).asInstanceOf[Seq[Int]]
throws the same exception.
Instead, your result should be of type Map[String, Array[Int]].
So just use x.toSeq instead of x.asInstanceOf[Seq[Int]].
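A minimal sketch of that fix, again without Spark (the sample values are stand-ins for the collected result); matching on Array[Int] first gives toSeq something concrete to work on:

```scala
object ToSeqDemo extends App {
  // Stand-in for Map("result" -> rdd.map(_ * 3).collect())
  val result: Map[String, Any] = Map("result" -> Array(3, 6, 9))

  result.get("result") match {
    // Match the concrete Array type, then convert: int[] works at runtime.
    case Some(x: Array[Int]) => x.toSeq.foreach(println) // prints 3, 6, 9
    case _                   => println("error occurred")
  }
}
```

Note that x here is typed Any, so calling x.toSeq directly will not compile; either pattern-match on Array[Int] as above or cast with x.asInstanceOf[Array[Int]].toSeq first.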
EDIT: "[I" in your stack trace means an Array of Int (a primitive int[] on the JVM).
It occurred to me as I was writing this that the reason you're using Any is probably that you have a bunch of different parameter and return types in your map. If that's the case, it would be handy to see a slightly more complete example.