Error on java.lang.NoClassDefFoundError: org/apache/spark/sql/SQLContext
I am facing a NoClassDefFoundError for org.apache.spark.sql.hive.HiveContext or org.apache.spark.sql.SQLContext while running the code on my Windows machine through IntelliJ.
I have the following build.sbt:
name := "sample"
version := "1.0"
scalaVersion := "2.10.6"
resolvers += "Maven Central" at "https://repo.maven.apache.org/maven2/"
resolvers += "Hortonworks Releases" at "http://repo.hortonworks.com/content/repositories/releases/"
resolvers += "Nexus Repository Releases" at "http://10.85.114.41/content/repositories/releases/"
libraryDependencies ++= {
  val hortonVer = "1.6.2.2.5.0.0-1245"
  val elasticVer = "5.0.0"
  val kafkaVer = "0.10.0.2.5.0.0-1245"
  Seq(
    "org.apache.spark" % "spark-sql_2.10" % hortonVer % "provided" exclude("org.mortbay.jetty", "jetty") exclude("org.mortbay.jetty", "jetty-util") exclude("net.minidev", "json-smart"),
    "org.apache.spark" % "spark-hive_2.10" % hortonVer % "provided",
    "org.elasticsearch" % "elasticsearch-spark-13_2.10" % elasticVer exclude("org.apache.spark", "spark-streaming-kafka_2.10") exclude("org.apache.spark", "spark-core_2.10") exclude("org.apache.spark", "spark-sql_2.10") exclude("org.mortbay.jetty", "jetty") exclude("org.mortbay.jetty", "jetty-util") exclude("org.spark-project.spark", "unused"),
    "com.sample.app" % "cap-spark-api-01" % "1.0.7" exclude("org.apache.spark", "spark-streaming-kafka_2.10") exclude("org.mortbay.jetty", "jetty") exclude("org.mortbay.jetty", "jetty-util") exclude("com.datastax.spark", "spark-cassandra-connector_2.10") exclude("org.elasticsearch", "elasticsearch-spark-13_2.10"),
    "org.apache.kafka" % "kafka-clients" % kafkaVer
  )
}
assemblyJarName in assembly := "sample.jar"
I have imported org.apache.spark.sql.hive.HiveContext in the code and get a NoClassDefFoundError.
Since HiveContext throws a NoClassDefFoundError, I tried importing org.apache.spark.sql.SQLContext instead, but it likewise throws a NoClassDefFoundError for SQLContext when I declare either of the following:

val hc = new HiveContext(sc) or val hc = new SQLContext(sc)
If I remove % "provided", I get the following error during the sbt build:
sbt.librarymanagement.ResolveException: unresolved dependency: org.mortbay.jetty#jetty-util;6.1.26.hwx: Nexus Repository Releases: unable to get resource for org/mortbay/jetty#jetty-util;6.1.26.hwx: res=http://10.85.114.41/content/repositories/releases/org/mortbay/jetty/jetty-util/6.1.26.hwx/jetty-util-6.1.26.hwx.pom: java.net.ConnectException: Failed to connect to /10.85.114.41:80
Could you please help me resolve this issue?

Thanks, Bab
Try adding spark-core to your SBT dependencies.
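A minimal sketch of that suggestion, reusing the hortonVer value from the question's build.sbt. The spark-core coordinates follow the same pattern as the existing spark-sql/spark-hive lines, and the run-task override is the approach documented by sbt-assembly for running locally with "provided" dependencies; treat both as a starting point rather than a verified fix for this exact environment:

```scala
// build.sbt additions (sketch)

// Add spark-core explicitly, kept "provided" so it stays out of the assembly jar,
// matching the scope of the other Spark modules in the question:
libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.6.2.2.5.0.0-1245" % "provided"

// "provided" dependencies are excluded from the runtime classpath, which is why
// HiveContext/SQLContext go missing when launching from IntelliJ or `sbt run`
// even though the build compiles. This override (from the sbt-assembly docs)
// puts provided dependencies back on the classpath for `sbt run` only, while
// `sbt assembly` still omits them:
run in Compile := Defaults
  .runTask(fullClasspath in Compile, mainClass in (Compile, run), runner in (Compile, run))
  .evaluated
```

Alternatively, when running directly from IntelliJ, recent IDEA versions offer an option in the run configuration to include dependencies with "provided" scope, which avoids touching the build file at all.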