
Spark Scala API: No typeTag available in spark.createDataFrame in official example

I just started working with MLlib for Spark and tried to run the provided examples, more specifically https://github.com/apache/spark/blob/master/examples/src/main/scala/org/apache/spark/examples/ml/DCTExample.scala

However, compilation using the IntelliJ IDE fails with the message

Error:(41, 35) No TypeTag available for (org.apache.spark.ml.linalg.Vector,)
    val df = spark.createDataFrame(data.map(Tuple1.apply)).toDF("features")
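
For context, the failing line sits in a snippet roughly like the following (paraphrased from the linked DCTExample; the surrounding SparkSession setup and exact vector values are an approximation, not a verbatim copy). The overload being called is createDataFrame[A <: Product : TypeTag](data: Seq[A]), so the compiler has to find an implicit TypeTag for Tuple1[org.apache.spark.ml.linalg.Vector]:

import org.apache.spark.ml.linalg.Vectors
import org.apache.spark.sql.SparkSession

object DCTExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder.appName("DCTExample").getOrCreate()

    val data = Seq(
      Vectors.dense(0.0, 1.0, -2.0, 3.0),
      Vectors.dense(-1.0, 2.0, 4.0, -7.0),
      Vectors.dense(14.0, -2.0, -5.0, 1.0))

    // createDataFrame needs an implicit TypeTag for Tuple1[Vector];
    // this is the line the compiler rejects in the question.
    val df = spark.createDataFrame(data.map(Tuple1.apply)).toDF("features")
    df.show(truncate = false)

    spark.stop()
  }
}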

The project setup uses jdk1.8.0_121, spark2.11-2.1.0 and scala 2.10.6.

Any ideas on why the example fails to run? I followed the following tutorial during installation: https://www.supergloo.com/fieldnotes/intellij-scala-spark/

You can't use Spark built for Scala 2.11 (that's what the _2.11 suffix in the artifact name means) together with Scala 2.10, though this specific error message does look rather strange. Switch to Scala 2.11.8.
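
If the project is built with sbt, a minimal sketch of the fix might look like this; the artifact names and versions are taken from the question's setup and may need adjusting for your project:

// build.sbt
scalaVersion := "2.11.8"  // must match the _2.11 suffix of the Spark artifacts

libraryDependencies ++= Seq(
  // %% appends the Scala binary version (_2.11) automatically,
  // so the Spark build and the project's Scala version can't drift apart
  "org.apache.spark" %% "spark-sql"   % "2.1.0",
  "org.apache.spark" %% "spark-mllib" % "2.1.0"
)

In IntelliJ, the Scala SDK configured for the module should likewise be a 2.11.x version rather than 2.10.6.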

