
Spark Scala API: No typeTag available in spark.createDataFrame in official example

I just started working with MLlib for Spark and tried to run the provided examples, more specifically https://github.com/apache/spark/blob/master/examples/src/main/scala/org/apache/spark/examples/ml/DCTExample.scala

However, compilation using the IntelliJ IDE fails with the message

Error:(41, 35) No TypeTag available for (org.apache.spark.ml.linalg.Vector,)
    val df = spark.createDataFrame(data.map(Tuple1.apply)).toDF("features")
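For context, the error points at the line where the example builds a DataFrame from a sequence of vectors. The sketch below reconstructs the surrounding lines from memory (the imports and the SparkSession setup are assumptions based on the linked DCTExample.scala):

    import org.apache.spark.ml.linalg.Vectors
    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder.appName("DCTExample").getOrCreate()

    val data = Seq(
      Vectors.dense(0.0, 1.0, -2.0, 3.0),
      Vectors.dense(-1.0, 2.0, 4.0, -7.0),
      Vectors.dense(14.0, -2.0, -5.0, 1.0))

    // createDataFrame(Seq[A <: Product]) needs an implicit TypeTag for
    // Tuple1[Vector]; this is where the compiler reports the error.
    val df = spark.createDataFrame(data.map(Tuple1.apply)).toDF("features")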

The project setup uses JDK 1.8.0_121, Spark 2.1.0 built for Scala 2.11 (spark_2.11-2.1.0), and Scala 2.10.6.

Any ideas on why the example fails to compile? I followed this tutorial during setup: https://www.supergloo.com/fieldnotes/intellij-scala-spark/

You can't use Spark built for Scala 2.11 (that's what the _2.11 suffix in the artifact name means) together with Scala 2.10, though this specific error looks quite strange. Switch to Scala 2.11.8.
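As a minimal sketch, assuming an sbt-based project, the fix is to keep the project's Scala version in line with the Spark artifacts. Using `%%` lets sbt append the matching binary suffix (_2.11) automatically; the exact module list is an assumption for illustration:

    // build.sbt — Scala version and Spark artifact suffix must agree
    scalaVersion := "2.11.8"

    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-core"  % "2.1.0",
      "org.apache.spark" %% "spark-sql"   % "2.1.0",
      "org.apache.spark" %% "spark-mllib" % "2.1.0"
    )

If the project was created from the tutorial's template with Scala 2.10.6, changing the Scala SDK in IntelliJ (or the `scalaVersion` setting above) and reimporting the project should make the TypeTag error go away.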
