I'm trying to use a generic value type (Any) in a HashMap, like so:
import scala.collection.mutable.{ArrayBuffer, HashMap}

val aMap = ArrayBuffer[HashMap[String, Any]]()
aMap += HashMap()
aMap(0)("aKey") = "aStringVal"
aMap(0)("aKey2") = true // a Boolean value
aMap(0)("aKey3") = 23   // an Int value
This works in the spark-shell, but in my IntelliJ project it throws a ClassNotFoundException on scala.Any:
org.apache.spark.streaming.scheduler.JobScheduler logError - Error running job streaming job 1521859195000 ms.0
java.lang.ClassNotFoundException: scala.Any
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
I'm using Scala 2.11. Any ideas what could be causing this?
In my case, the cause turned out to be creating a DataFrame with mixed column types via .toDF.
I had:
// inferred as Seq[(String, Any)]: column2 mixes String and Int
val baseDataFrame = Seq(
  ("value1", "one"),
  ("value2", 2),
  ("value3", 3)
).toDF("column1", "column2")
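To see why this fails, here is a minimal sketch (plain Scala, no Spark needed; the name `rows` is mine). Because the tuples mix String and Int in their second slot, the compiler widens that slot to Any, so the element type is (String, Any). Spark has no encoder for scala.Any, which is what surfaces as the ClassNotFoundException.

```scala
// Same shape of data as the failing Seq above.
// The second tuple element is widened to Any at compile time:
val rows: Seq[(String, Any)] = Seq(
  ("value1", "one"),
  ("value2", 2),
  ("value3", 3)
)

// The runtime classes of the second elements differ:
val classes = rows.map(_._2.getClass.getSimpleName)
println(classes) // List(String, Integer, Integer)

// Making the second slot uniformly String restores a concrete
// element type, (String, String), which Spark can encode:
val fixed: Seq[(String, String)] = Seq(
  ("value1", "one"),
  ("value2", "2"),
  ("value3", "3")
)
```

If you actually need a numeric column2, the other direction also works: make every second element an Int instead of stringifying them all.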
and this change fixed the issue (column2 is now uniformly String):
// inferred as Seq[(String, String)]
val baseDataFrame = Seq(
  ("value1", "one"),
  ("value2", "2"),
  ("value3", "3")
).toDF("column1", "column2")