java.lang.ClassNotFoundException: org.apache.spark.sql.Dataset
When running a Scala file that uses the Spark Dataset type, I get the following stack trace:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/sql/Dataset
at java.lang.Class.getDeclaredMethods0(Native Method)
at java.lang.Class.privateGetDeclaredMethods(Class.java:2701)
at java.lang.Class.privateGetMethodRecursive(Class.java:3048)
at java.lang.Class.getMethod0(Class.java:3018)
at java.lang.Class.getMethod(Class.java:1784)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:125)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.sql.Dataset
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 6 more
I find this strange because I have the following import:
import org.apache.spark.sql._
Also, in my build.sbt I have the following added to libraryDependencies:
"org.apache.spark" %% "spark-core" % "1.6.2" % "provided",
"org.apache.spark" %% "spark-sql" % "1.6.2" % "provided",
If you are executing this standalone, you can try removing provided from your dependencies. The provided scope means that you expect the dependencies to already be on the classpath when you run the application, so the Spark dependencies won't be included in your jar if you use provided.
Your build.sbt file specifies that the Spark dependencies are provided on the application's classpath, but the application wasn't able to locate them there. If you're not running on a cluster, you can try removing "provided" from your build.sbt, or put the Spark dependencies on your classpath yourself.
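Conversely, if you keep the provided scope (the usual setup for cluster deployments), the jar has to be launched through spark-submit so that the installed Spark distribution supplies classes like org.apache.spark.sql.Dataset at runtime. A sketch, where the main class and jar path are placeholders rather than anything from the original post:

```shell
# Launch via spark-submit so the Spark installation, not the jar,
# provides the Spark classes on the classpath.
# Replace the main class and jar path with your own.
spark-submit \
  --class com.example.Main \
  --master "local[*]" \
  target/scala-2.10/myapp_2.10-1.0.jar
```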
In IntelliJ 2020.3.2 Community Edition, go to the Run menu, then Edit Configurations. Finally, under Modify options, select 'Include dependencies with "Provided" scope'.