
java.lang.NoClassDefFoundError when running a Spark jar application

I compiled my Apache Spark application, written in Scala with sbt in IntelliJ IDEA, and it runs fine when started from IntelliJ. However, when I compile and package it as a jar file and run it on a remote server, I get this error as soon as the code reaches the point where it tries to create an instance of org.locationtech.jts.geom.Envelope:

Exception in thread "main" java.lang.NoClassDefFoundError: org/locationtech/jts/geom/Envelope
at java.lang.Class.getDeclaredMethods0(Native Method)
at java.lang.Class.privateGetDeclaredMethods(Class.java:2701)
at java.lang.Class.getDeclaredMethod(Class.java:2128)
at java.io.ObjectStreamClass.getPrivateMethod(ObjectStreamClass.java:1629)
at java.io.ObjectStreamClass.access$1700(ObjectStreamClass.java:79)
at java.io.ObjectStreamClass$3.run(ObjectStreamClass.java:520)...

I understand this is a problem of inconsistent library versions, and I also know that NoClassDefFoundError means the library was accessible at compile time but is not available at runtime, but I haven't been able to solve the problem.

Here is my build.sbt file:

name := "MySpark"
version := "0.1"
scalaVersion := "2.11.6"
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.3.0",
  "org.locationtech.jts" % "jts-core" % "1.16.0"
)

The Spark and Scala versions on the remote machine are the same as the ones in the build.sbt file.

When I run evicted in the sbt shell, I get the output below, but I don't know how to use this information to solve my problem.

[warn] Found version conflict(s) in library dependencies; some are suspected to be binary incompatible:
[warn]  * io.netty:netty:3.9.9.Final is selected over {3.6.2.Final, 3.7.0.Final}
[warn]      +- org.apache.spark:spark-core_2.11:2.3.0             (depends on 3.9.9.Final)
[warn]      +- org.apache.zookeeper:zookeeper:3.4.6               (depends on 3.6.2.Final)
[warn]      +- org.apache.hadoop:hadoop-hdfs:2.6.5                (depends on 3.6.2.Final)
[warn]  * commons-net:commons-net:2.2 is selected over 3.1
[warn]      +- org.apache.spark:spark-core_2.11:2.3.0             (depends on 2.2)
[warn]      +- org.apache.hadoop:hadoop-common:2.6.5              (depends on 3.1)
[warn]  * com.google.guava:guava:11.0.2 is selected over {12.0.1, 16.0.1}
[warn]      +- org.apache.hadoop:hadoop-yarn-client:2.6.5         (depends on 11.0.2)
[warn]      +- org.apache.hadoop:hadoop-yarn-api:2.6.5            (depends on 11.0.2)
[warn]      +- org.apache.hadoop:hadoop-yarn-common:2.6.5         (depends on 11.0.2)
[warn]      +- org.apache.hadoop:hadoop-yarn-server-nodemanager:2.6.5 (depends on 11.0.2)
[warn]      +- org.apache.hadoop:hadoop-yarn-server-common:2.6.5  (depends on 11.0.2)
[warn]      +- org.apache.hadoop:hadoop-hdfs:2.6.5                (depends on 11.0.2)
[warn]      +- org.apache.curator:curator-framework:2.6.0         (depends on 16.0.1)
[warn]      +- org.apache.curator:curator-client:2.6.0            (depends on 16.0.1)
[warn]      +- org.apache.curator:curator-recipes:2.6.0           (depends on 16.0.1)
[warn]      +- org.apache.hadoop:hadoop-common:2.6.5              (depends on 16.0.1)
[warn]      +- org.htrace:htrace-core:3.0.4                       (depends on 12.0.1)
[warn] Run 'evicted' to see detailed eviction warnings
[info] Here are other dependency conflicts that were resolved:
[info]  * com.thoughtworks.paranamer:paranamer:2.8 is selected over {2.6, 2.3}
[info]      +- com.fasterxml.jackson.module:jackson-module-paranamer:2.7.9 (depends on 2.8)
[info]      +- org.json4s:json4s-core_2.11:3.2.11                 (depends on 2.6)
[info]      +- org.apache.avro:avro:1.7.7 

If you run the sbt package command, the resulting jar will contain only your source files, not the third-party libraries.

So if you deploy that jar and try to run it, your runtime environment must provide the external libraries specified here:

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.3.0",
  "org.locationtech.jts" % "jts-core" % "1.16.0"
)

When you run a Spark application, "org.apache.spark" %% "spark-core" % "2.3.0" is already present on the runtime classpath; spark-submit takes care of that for you. "org.locationtech.jts" % "jts-core" % "1.16.0", however, is a specific third-party library, and by default I don't think your runtime environment will include it.
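As a quick sanity check (the jar path below is only an example; the exact name depends on your project settings), you can list the contents of the jar produced by sbt package and confirm that no JTS classes were bundled:

jar tf target/scala-2.11/myspark_2.11-0.1.jar | grep locationtech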

You have to use one of the plugins:

  • sbt-assembly
  • sbt-onejar
  • sbt-native-packager

and configure it to include your third-party artifacts, so that your application ships with the necessary classes, depending on the solution you choose. In that case I strongly recommend adding the provided scope to the Spark libraries that are already present in the runtime environment. That excludes them from your package and prevents potential conflicts between the classes you ship and the ones that already exist in the target (runtime) environment.
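A minimal sketch of the sbt-assembly approach, assuming that plugin is the one you pick (the plugin version below is only an example; check the current release):

// project/plugins.sbt
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.6")

// build.sbt
name := "MySpark"
version := "0.1"
scalaVersion := "2.11.6"

libraryDependencies ++= Seq(
  // Spark is already on the cluster classpath, so keep it out of the fat jar
  "org.apache.spark" %% "spark-core" % "2.3.0" % "provided",
  // jts-core is a third-party library, so it must be bundled
  "org.locationtech.jts" % "jts-core" % "1.16.0"
)

Running sbt assembly then produces a single fat jar (under target/scala-2.11/) that contains jts-core, and that is the jar you submit instead of the one built by sbt package.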

Try adding the dependency to your spark-submit command with --packages:

spark-submit --packages org.locationtech.jts:jts-core:1.16.0 --class labX.applications.ServerTest --executor-memory 2048m --executor-cores 1 --num-executors 17 mySparkApp_2.11-0.1.jar
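--packages resolves the artifact (and its transitive dependencies) from Maven Central at submit time, so the machine running spark-submit needs network access. If the cluster is offline, an alternative is to download jts-core-1.16.0.jar yourself and pass it explicitly with --jars (the local path below is a placeholder):

spark-submit --jars /path/to/jts-core-1.16.0.jar --class labX.applications.ServerTest --executor-memory 2048m --executor-cores 1 --num-executors 17 mySparkApp_2.11-0.1.jar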

