Spark Scala API: No TypeTag available in spark.createDataFrame in official example
I just started working with MLlib for Spark and tried to run the provided examples, more specifically https://github.com/apache/spark/blob/master/examples/src/main/scala/org/apache/spark/examples/ml/DCTExample.scala
However, compilation in the IntelliJ IDE fails with the message:
Error:(41, 35) No TypeTag available for (org.apache.spark.ml.linalg.Vector,)
val df = spark.createDataFrame(data.map(Tuple1.apply)).toDF("features")
The project setup uses jdk1.8.0_121, spark_2.11-2.1.0 (Spark 2.1.0 built for Scala 2.11), and Scala 2.10.6.
Any ideas on why the example fails to compile? I followed this tutorial during installation: https://www.supergloo.com/fieldnotes/intellij-scala-spark/
You can't use Spark built for Scala 2.11 (that's what the _2.11 suffix in the artifact name means) with Scala 2.10, though this specific error looks quite strange. Switch to Scala 2.11.8.
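In an sbt project, keeping the Scala version and the Spark artifacts in sync might look like the sketch below. This is an assumption about the build setup (the question only mentions IntelliJ), but the same principle applies there: the binary suffix on every Spark dependency must match the project's Scala version.

```scala
// build.sbt -- a minimal sketch, assuming an sbt-based project.
// Spark 2.1.0 is published for Scala 2.11, so the project must use 2.11 as well.
scalaVersion := "2.11.8"

libraryDependencies ++= Seq(
  // %% appends the Scala binary suffix (_2.11) automatically,
  // so these resolve to spark-sql_2.11 and spark-mllib_2.11.
  "org.apache.spark" %% "spark-sql"   % "2.1.0",
  "org.apache.spark" %% "spark-mllib" % "2.1.0"
)
```

In IntelliJ, the equivalent fix is to set the project's Scala SDK to a 2.11.x version so it matches the _2.11 Spark jars on the classpath.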