Spark 1.3.1 SQL Lib: Exception in thread "main" java.lang.NoSuchMethodError: org.apache.spark.sql.SQLContext.implicits()
Spark MLlib example, NoSuchMethodError: org.apache.spark.sql.SQLContext.createDataFrame()
I am following the documentation example Estimator, Transformer, and Param.
I get the following error message:
15/09/23 11:46:51 INFO BlockManagerMaster: Registered BlockManager
Exception in thread "main" java.lang.NoSuchMethodError: scala.reflect.api.JavaUniverse.runtimeMirror(Ljava/lang/ClassLoader;)Lscala/reflect/api/JavaUniverse$JavaMirror;
    at SimpleApp$.main(hw.scala:75)
Line 75 is the call to sqlContext.createDataFrame():
import java.util.Random
import org.apache.log4j.Logger
import org.apache.log4j.Level
import scala.io.Source
import org.apache.spark.SparkConf
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.rdd._
import org.apache.spark.ml.classification.LogisticRegression
import org.apache.spark.ml.param.ParamMap
import org.apache.spark.mllib.linalg.{Vector, Vectors}
import org.apache.spark.mllib.recommendation.{ALS, Rating, MatrixFactorizationModel}
import org.apache.spark.sql.Row
import org.apache.spark.sql.SQLContext
import org.apache.spark.sql.DataFrame
import org.apache.spark.sql.functions._
object SimpleApp {
  def main(args: Array[String]) {
    val conf = new SparkConf().setAppName("Simple Application").setMaster("local[4]")
    val sc = new SparkContext(conf)
    val sqlContext = new SQLContext(sc)
    val training = sqlContext.createDataFrame(Seq(
      (1.0, Vectors.dense(0.0, 1.1, 0.1)),
      (0.0, Vectors.dense(2.0, 1.0, -1.0)),
      (0.0, Vectors.dense(2.0, 1.3, 1.0)),
      (1.0, Vectors.dense(0.0, 1.2, -0.5))
    )).toDF("label", "features")
  }
}
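As an aside, since this kind of NoSuchMethodError on scala.reflect usually indicates a Scala binary-version mismatch between the application and the Spark jars, a quick diagnostic (hypothetical helper, not part of the original post) is to print the Scala version the application actually runs on and compare it with the version the Spark binaries were built for:

```scala
// Hypothetical diagnostic, not from the original post: print the Scala
// version at runtime so it can be compared with the Scala binary version
// of the Spark jars on the classpath (e.g. _2.10 vs _2.11 artifacts).
object VersionCheck {
  def main(args: Array[String]): Unit = {
    // versionNumberString reports the Scala library version in use.
    println(s"Running on Scala ${scala.util.Properties.versionNumberString}")
  }
}
```

If this prints 2.11.x while the Spark jars on the classpath are _2.10 artifacts (or vice versa), reflection calls like runtimeMirror will fail exactly as shown in the stack trace above.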
My build.sbt looks like this:
lazy val root = (project in file(".")).
  settings(
    name := "hello",
    version := "1.0",
    scalaVersion := "2.11.4"
  )

libraryDependencies ++= {
  Seq(
    "org.apache.spark" %% "spark-core" % "1.4.1" % "provided",
    "org.apache.spark" %% "spark-sql" % "1.4.1" % "provided",
    "org.apache.spark" % "spark-hive_2.11" % "1.4.1",
    "org.apache.spark" % "spark-mllib_2.11" % "1.4.1" % "provided",
    "org.apache.spark" %% "spark-streaming" % "1.4.1" % "provided",
    "org.apache.spark" %% "spark-streaming-kinesis-asl" % "1.4.1" % "provided"
  )
}
I searched around and found this post, which is very similar to my problem, and I tried changing the versions in my sbt settings (spark-mllib_2.11 to 2.10, and Spark 1.4.1 to 1.5.0), but that only brought more dependency conflicts.
My intuition is that this is some kind of version problem, but I cannot figure it out myself. Can anyone help? Thanks a lot.
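For comparison, here is a sketch of a build.sbt that keeps every Spark artifact on the same cross-version convention (%% throughout, no hard-coded _2.11 suffixes) and pins a Scala version that matches the Spark binaries. This is only an illustration of the idea, not the confirmed fix: it assumes the Spark 1.4.1 jars in use were prebuilt for Scala 2.10, which is a common cause of the runtimeMirror NoSuchMethodError.

```scala
// Sketch only: assumes the Spark 1.4.1 binaries were built for Scala 2.10
// (the default for most prebuilt 1.x distributions). The key points are a
// single scalaVersion matching the Spark jars, and %% on every Spark
// dependency so sbt appends the Scala binary version consistently.
lazy val root = (project in file(".")).
  settings(
    name := "hello",
    version := "1.0",
    scalaVersion := "2.10.5"
  )

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"  % "1.4.1" % "provided",
  "org.apache.spark" %% "spark-sql"   % "1.4.1" % "provided",
  "org.apache.spark" %% "spark-hive"  % "1.4.1",
  "org.apache.spark" %% "spark-mllib" % "1.4.1" % "provided"
)
```

Mixing %% with explicit _2.11 suffixes, as in the build above, makes it easy for one artifact to resolve against a different Scala binary version than the rest, which is exactly the situation that produces this class of error.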
It works for me now. Just for the record, following @MartinSenne's answer, what I did is as follows:
@Note: