
java.lang.NoClassDefFoundError when running a spark jar application

I compiled my Apache Spark application, which is written in Scala, with sbt inside IntelliJ IDEA, and it works fine when run from within IntelliJ. But when I compile and package it as a jar file and run it on a remote server, I get the following error as soon as the code reaches the point where it tries to create an Envelope instance from org/locationtech/jts/geom/Envelope:

Exception in thread "main" java.lang.NoClassDefFoundError: org/locationtech/jts/geom/Envelope
at java.lang.Class.getDeclaredMethods0(Native Method)
at java.lang.Class.privateGetDeclaredMethods(Class.java:2701)
at java.lang.Class.getDeclaredMethod(Class.java:2128)
at java.io.ObjectStreamClass.getPrivateMethod(ObjectStreamClass.java:1629)
at java.io.ObjectStreamClass.access$1700(ObjectStreamClass.java:79)
at java.io.ObjectStreamClass$3.run(ObjectStreamClass.java:520)...

I understand that this is a problem of inconsistency between library versions, and I also know that NoClassDefFoundError means the class was available at compile time but is not available at runtime, but I cannot solve the problem.

This is my build.sbt file:

name := "MySpark"
version := "0.1"
scalaVersion := "2.11.6"
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.3.0",
  "org.locationtech.jts" % "jts-core" % "1.16.0"
)

The versions of Spark and Scala on the remote computer are the same as those in the build.sbt file.

When I run evicted in the sbt shell I get the output below, but I don't know how to use this information to solve my problem.

[warn] Found version conflict(s) in library dependencies; some are suspected to be binary incompatible:
[warn]  * io.netty:netty:3.9.9.Final is selected over {3.6.2.Final, 3.7.0.Final}
[warn]      +- org.apache.spark:spark-core_2.11:2.3.0             (depends on 3.9.9.Final)
[warn]      +- org.apache.zookeeper:zookeeper:3.4.6               (depends on 3.6.2.Final)
[warn]      +- org.apache.hadoop:hadoop-hdfs:2.6.5                (depends on 3.6.2.Final)
[warn]  * commons-net:commons-net:2.2 is selected over 3.1
[warn]      +- org.apache.spark:spark-core_2.11:2.3.0             (depends on 2.2)
[warn]      +- org.apache.hadoop:hadoop-common:2.6.5              (depends on 3.1)
[warn]  * com.google.guava:guava:11.0.2 is selected over {12.0.1, 16.0.1}
[warn]      +- org.apache.hadoop:hadoop-yarn-client:2.6.5         (depends on 11.0.2)
[warn]      +- org.apache.hadoop:hadoop-yarn-api:2.6.5            (depends on 11.0.2)
[warn]      +- org.apache.hadoop:hadoop-yarn-common:2.6.5         (depends on 11.0.2)
[warn]      +- org.apache.hadoop:hadoop-yarn-server-nodemanager:2.6.5 (depends on 11.0.2)
[warn]      +- org.apache.hadoop:hadoop-yarn-server-common:2.6.5  (depends on 11.0.2)
[warn]      +- org.apache.hadoop:hadoop-hdfs:2.6.5                (depends on 11.0.2)
[warn]      +- org.apache.curator:curator-framework:2.6.0         (depends on 16.0.1)
[warn]      +- org.apache.curator:curator-client:2.6.0            (depends on 16.0.1)
[warn]      +- org.apache.curator:curator-recipes:2.6.0           (depends on 16.0.1)
[warn]      +- org.apache.hadoop:hadoop-common:2.6.5              (depends on 16.0.1)
[warn]      +- org.htrace:htrace-core:3.0.4                       (depends on 12.0.1)
[warn] Run 'evicted' to see detailed eviction warnings
[info] Here are other dependency conflicts that were resolved:
[info]  * com.thoughtworks.paranamer:paranamer:2.8 is selected over {2.6, 2.3}
[info]      +- com.fasterxml.jackson.module:jackson-module-paranamer:2.7.9 (depends on 2.8)
[info]      +- org.json4s:json4s-core_2.11:3.2.11                 (depends on 2.6)
[info]      +- org.apache.avro:avro:1.7.7 

If you run the sbt package command, the generated jar will only contain your compiled classes, not the third-party libraries.

So if you deploy your jar and try to run it, your runtime environment must provide the external libraries specified here:

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.3.0",
  "org.locationtech.jts" % "jts-core" % "1.16.0"
)

When you run a Spark application, "org.apache.spark" %% "spark-core" % "2.3.0" is present on your runtime environment's classpath; spark-submit takes care of that. "org.locationtech.jts" % "jts-core" % "1.16.0" is a specific third-party library, and by default I don't think your runtime environment will include it.
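In build.sbt terms, that distinction can be expressed by marking the Spark artifact as provided while leaving jts-core as a regular compile dependency. This is only a sketch: the coordinates are the ones from the question, and the provided scope is the single change.

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.3.0" % "provided",  // supplied by spark-submit at runtime
  "org.locationtech.jts" % "jts-core" % "1.16.0"              // must be shipped with the application
)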

You have to use a plugin:

  • sbt-assembly
  • sbt-onejar
  • sbt-native-packager

and configure it to include your third-party libraries, so that your application ships with the necessary classes, whichever solution you choose. In this case, I strongly advise you to add the provided scope to the Spark libraries that are already present in your runtime environment. That excludes them from the package and prevents potential conflicts between your package and what already exists in your target (runtime) environment; see the sketch below.
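A minimal sketch of the sbt-assembly route, assuming that plugin is the one picked from the list above (the plugin version is indicative; use whatever matches your sbt version):

// project/plugins.sbt
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.6")

With build.sbt keeping the dependencies as shown earlier (spark-core marked provided, jts-core not), running sbt assembly produces a single "fat" jar under target/scala-2.11/ that contains both your classes and jts-core, and you submit that jar to spark-submit instead of the one produced by sbt package.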

Try adding the dependency to the spark-submit command with --packages:

spark-submit --packages org.locationtech.jts:jts-core:1.16.0 --class labX.applications.ServerTest --executor-memory 2048m --executor-cores 1 --num-executors 17 mySparkApp_2.11-0.1.jar
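As a usage note, --packages takes Maven coordinates, resolves them (along with their transitive dependencies) from Maven Central or the configured repositories, and puts the resolved jars on both the driver and executor classpaths, so the application jar itself does not have to change.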

