idea sbt java.lang.NoClassDefFoundError: org/apache/spark/SparkConf

I'm a beginner with Spark. I built an environment using "linux + idea + sbt", and when I try the Spark quick start I get this problem:

    Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/SparkConf
    at test$.main(test.scala:11)
    at test.main(test.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at com.intellij.rt.execution.application.AppMain.main(AppMain.java:144)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.SparkConf
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 7 more

The versions installed on my machine:

sbt   = 0.13.11
jdk   = 1.8
scala = 2.10
idea  = 2016

My directory structure:

test/
  idea/
  out/
  project/
    build.properties    
    plugins.sbt
  src/
    main/
      java/
      resources/
      scala/
      scala-2.10/
        test.scala
  target/
  assembly.sbt
  build.sbt

In build.properties:

sbt.version = 0.13.8

In plugins.sbt:

logLevel := Level.Warn

addSbtPlugin("com.github.mpeltonen" % "sbt-idea" % "1.6.0")

addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.11.2")

In build.sbt:

import sbt._
import Keys._
import sbtassembly.Plugin._
import AssemblyKeys._

name := "test"

version := "1.0"

scalaVersion := "2.10.4"

libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.6.1" % "provided"

In assembly.sbt:

import AssemblyKeys._ // put this at the top of the file

assemblySettings

In test.scala:

import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf

object test {
  def main(args: Array[String]) {
    val logFile = "/opt/spark-1.6.1-bin-hadoop2.6/README.md" // Should be some file on your system
    val conf = new SparkConf().setAppName("Test Application")
    val sc = new SparkContext(conf)
    val logData = sc.textFile(logFile, 2).cache()
    val numAs = logData.filter(line => line.contains("a")).count()
    val numBs = logData.filter(line => line.contains("b")).count()
    println("Lines with a: %s, Lines with b: %s".format(numAs, numBs))
  }
}

How can I solve this problem?

Dependencies with "provided" scope are only available during compilation and testing, and are not available at runtime or for packaging. So, instead of making an object test with a main method, you should make it an actual test suite placed in src/test/scala. (If you're not familiar with unit testing in Scala, I'd suggest using ScalaTest, for example: first add a dependency on it in your build.sbt, libraryDependencies += "org.scalatest" %% "scalatest" % "2.2.4" % Test, and then follow its quick start tutorial to implement a simple spec.)
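If it helps, here is a minimal sketch of what such a spec could look like (the class name, the file path under src/test/scala and the local[2] master are my own assumptions, not part of the original answer). Running it with sbt test works because "provided" dependencies are on the test classpath:

import org.apache.spark.{SparkConf, SparkContext}
import org.scalatest.{BeforeAndAfterAll, FlatSpec, Matchers}

// Hypothetical test suite, e.g. placed in src/test/scala/testSpec.scala
class testSpec extends FlatSpec with Matchers with BeforeAndAfterAll {

  private var sc: SparkContext = _

  override def beforeAll(): Unit = {
    // local[2] runs Spark inside the test JVM, so the "provided"
    // spark-core dependency on the test classpath is enough.
    val conf = new SparkConf().setMaster("local[2]").setAppName("Test Application")
    sc = new SparkContext(conf)
  }

  override def afterAll(): Unit = {
    if (sc != null) sc.stop()
  }

  "README.md" should "contain lines with 'a' and 'b'" in {
    val logFile = "/opt/spark-1.6.1-bin-hadoop2.6/README.md" // same file as in the question
    val logData = sc.textFile(logFile, 2).cache()
    val numAs = logData.filter(line => line.contains("a")).count()
    val numBs = logData.filter(line => line.contains("b")).count()
    numAs should be > 0L
    numBs should be > 0L
  }
}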


Another option, which is quite hacky in my opinion (but does the trick nonetheless), involves removing provided scope from your spark-core dependency in some configurations and is described in the accepted answer to this question.
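For reference, the commonly cited form of that trick (a sketch only, assuming sbt 0.13-style syntax; the linked answer is the authoritative version) keeps spark-core "provided" for packaging but puts the full compile classpath back on the run task in build.sbt:

// Keep spark-core "provided" for assembly, but let `sbt run`
// (and runs launched through sbt from the IDE) see the full classpath.
run in Compile := Defaults.runTask(
  fullClasspath in Compile,
  mainClass in (Compile, run),
  runner in (Compile, run)
).evaluated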

In IntelliJ version 2018.1 there is a checkbox in the run configuration called "Include dependencies with 'Provided' scope". Checking this option solved it for me.

I had the same issue this morning with the error described. I removed "provided" and ran sbt clean, reload, compile, package, run. I also tested using spark-submit from the command line. But I think keeping "provided" is worth the extra overhead on the code side, since the resulting jar is smaller.
