
Error on java.lang.NoClassDefFoundError: org/apache/spark/sql/SQLContext

I am getting a NoClassDefFoundError for org.apache.spark.sql.hive.HiveContext (and org.apache.spark.sql.SQLContext) when I run my code on a Windows machine through IntelliJ.

I have the following build.sbt:

name := "sample"
version := "1.0"
scalaVersion := "2.10.6"

resolvers += "Maven Central" at "https://repo.maven.apache.org/maven2/"
resolvers += "Hortonworks Releases" at "http://repo.hortonworks.com/content/repositories/releases/"
resolvers += "Nexus Repository Releases" at "http://10.85.114.41/content/repositories/releases/"
libraryDependencies ++= {
  val hortonVer = "1.6.2.2.5.0.0-1245"
  val elasticVer = "5.0.0"
  val kafkaVer = "0.10.0.2.5.0.0-1245"
  Seq(
    "org.apache.spark" % "spark-sql_2.10" % hortonVer % "provided" exclude("org.mortbay.jetty", "jetty") exclude("org.mortbay.jetty", "jetty-util") exclude("net.minidev", "json-smart"),
    "org.apache.spark" % "spark-hive_2.10" % hortonVer % "provided",
    "org.elasticsearch" % "elasticsearch-spark-13_2.10" % elasticVer exclude("org.apache.spark", "spark-streaming-kafka_2.10") exclude("org.apache.spark", "spark-core_2.10") exclude("org.apache.spark", "spark-sql_2.10") exclude("org.mortbay.jetty", "jetty") exclude("org.mortbay.jetty", "jetty-util") exclude ("org.spark-project.spark", "unused"),
    "com.sample.app" % "cap-spark-api-01" % "1.0.7" exclude("org.apache.spark", "spark-streaming-kafka_2.10") exclude("org.mortbay.jetty", "jetty") exclude("org.mortbay.jetty", "jetty-util") exclude("com.datastax.spark", "spark-cassandra-connector_2.10") exclude ("org.elasticsearch", "elasticsearch-spark-13_2.10"),
    "org.apache.kafka" % "kafka-clients" % kafkaVer

    )
}
assemblyJarName in assembly := "sample.jar"

I imported org.apache.spark.sql.hive.HiveContext in my code and got the NoClassDefFoundError.

Since HiveContext throws the NoClassDefFoundError, I also imported org.apache.spark.sql.SQLContext instead, but it throws the same NoClassDefFoundError for SQLContext when I declare it as below:

val hc = new HiveContext(sc) or val hc = new SQLContext(sc)
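For reference, a minimal sketch of how these contexts are typically constructed against the Spark 1.6 API (the app name and the local master setting are placeholders for an IntelliJ run, not taken from the original code):

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext
import org.apache.spark.sql.hive.HiveContext

object Sample {
  def main(args: Array[String]): Unit = {
    // Placeholder configuration for a local run from the IDE.
    val conf = new SparkConf().setAppName("sample").setMaster("local[*]")
    val sc = new SparkContext(conf)

    // In Spark 1.6, HiveContext extends SQLContext, so either can be used here.
    val sqlContext = new SQLContext(sc)
    val hc = new HiveContext(sc)
  }
}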

If I remove the "provided" scope from the sbt build, I get the following error:

sbt.librarymanagement.ResolveException: unresolved dependency: org.mortbay.jetty#jetty-util;6.1.26.hwx: Nexus Repository Releases: unable to get resource for org/mortbay/jetty#jetty-util;6.1.26.hwx: res=http://10.85.114.41/content/repositories/releases/org/mortbay/jetty/jetty-util/6.1.26.hwx/jetty-util-6.1.26.hwx.pom: java.net.ConnectException: Failed to connect to /10.85.114.41:80

Could you please help me resolve this issue?

Thanks, Babu

Try adding spark-core to your SBT dependencies.
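For example, a line like the following could be added to the Seq(...) in the build.sbt above. This is only a sketch that reuses the question's hortonVer value and the Scala 2.10 artifact naming; whether to keep the "provided" scope depends on whether the job is run directly from IntelliJ (where provided dependencies may not be on the runtime classpath) or deployed as an assembly jar to a cluster that supplies Spark.

// Sketch: spark-core for the same Hortonworks Spark build; scope and excludes mirror the existing entries.
"org.apache.spark" % "spark-core_2.10" % hortonVer % "provided" exclude("org.mortbay.jetty", "jetty") exclude("org.mortbay.jetty", "jetty-util")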

