
Error on java.lang.NoClassDefFoundError: org/apache/spark/sql/SQLContext

I am facing a NoClassDefFoundError for org.apache.spark.sql.hive.HiveContext or org.apache.spark.sql.SQLContext while running the code on my Windows machine through IntelliJ.

I have the following build.sbt:

name := "sample"
version := "1.0"
scalaVersion := "2.10.6"

resolvers += "Maven Central" at "https://repo.maven.apache.org/maven2/"
resolvers += "Hortonworks Releases" at "http://repo.hortonworks.com/content/repositories/releases/"
resolvers += "Nexus Repository Releases" at "http://10.85.114.41/content/repositories/releases/"
libraryDependencies ++={
  val hortonVer = "1.6.2.2.5.0.0-1245"
  val elasticVer = "5.0.0"
  val kafkaVer = "0.10.0.2.5.0.0-1245"
  Seq(
    "org.apache.spark" % "spark-sql_2.10" % hortonVer % "provided" exclude("org.mortbay.jetty", "jetty") exclude("org.mortbay.jetty", "jetty-util") exclude("net.minidev", "json-smart"),
    "org.apache.spark" % "spark-hive_2.10" % hortonVer % "provided",
    "org.elasticsearch" % "elasticsearch-spark-13_2.10" % elasticVer exclude("org.apache.spark", "spark-streaming-kafka_2.10") exclude("org.apache.spark", "spark-core_2.10") exclude("org.apache.spark", "spark-sql_2.10") exclude("org.mortbay.jetty", "jetty") exclude("org.mortbay.jetty", "jetty-util") exclude ("org.spark-project.spark", "unused"),
    "com.sample.app" % "cap-spark-api-01" % "1.0.7" exclude("org.apache.spark", "spark-streaming-kafka_2.10") exclude("org.mortbay.jetty", "jetty") exclude("org.mortbay.jetty", "jetty-util") exclude("com.datastax.spark", "spark-cassandra-connector_2.10") exclude ("org.elasticsearch", "elasticsearch-spark-13_2.10"),
    "org.apache.kafka" % "kafka-clients" % kafkaVer

    )
}
assemblyJarName in assembly := "sample.jar"

I have imported org.apache.spark.sql.hive.HiveContext in the code and am getting a NoClassDefFoundError exception.

Since HiveContext gives a NoClassDefFoundError, I imported org.apache.spark.sql.SQLContext instead, but it also gives a NoClassDefFoundError for SQLContext when I declare it as below:

val hc = new HiveContext(sc) or val hc = new SQLContext(sc)

If I remove % "provided", the sbt build fails with the following error:

sbt.librarymanagement.ResolveException: unresolved dependency: org.mortbay.jetty#jetty-util;6.1.26.hwx: Nexus Repository Releases: unable to get resource for org/mortbay/jetty#jetty-util;6.1.26.hwx: res=http://10.85.114.41/content/repositories/releases/org/mortbay/jetty/jetty-util/6.1.26.hwx/jetty-util-6.1.26.hwx.pom: java.net.ConnectException: Failed to connect to /10.85.114.41:80
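A hedged side note (an assumption about this setup, not something the question confirms): dependencies marked "provided" are left off the runtime classpath by sbt and (by default) IntelliJ, which is a classic cause of exactly this NoClassDefFoundError when running locally. A common sbt idiom keeps "provided" for the assembly jar but restores those jars when running from sbt:

```scala
// build.sbt sketch (sbt 0.13.x-style keys, matching the rest of this file):
// keep Spark as "provided" for the assembly jar, but put the provided jars
// back on the classpath when launching the app via `sbt run`.
run in Compile := Defaults.runTask(
  fullClasspath in Compile,
  mainClass in (Compile, run),
  runner in (Compile, run)
).evaluated
```

In IntelliJ, the equivalent is ticking "Include dependencies with 'Provided' scope" in the run configuration (available in recent IntelliJ versions).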

Could you please help me resolve this issue?

Thanks, Bab

Try adding spark-core to your SBT dependencies.
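Following that suggestion, a minimal sketch of the extra dependency line (assuming the same Hortonworks version string and Scala 2.10 artifacts already used in the question's build.sbt; the version is inlined because the question's hortonVer val is scoped inside its own block):

```scala
// build.sbt — hypothetical addition: spark-core supplies the runtime classes
// that spark-sql's SQLContext depends on. Note: if this is also marked
// % "provided", it will again be missing from the IntelliJ run classpath.
libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.6.2.2.5.0.0-1245"
```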

