
java.lang.NoClassDefFoundError when running a Spark jar application

I compiled my Apache Spark application, which is written in Scala with sbt in IntelliJ IDEA, and it runs fine when run from IntelliJ. However, when I compile and package it as a jar file and run it on a remote server, I get the following error as soon as the code reaches the point where it tries to create an instance of org/locationtech/jts/geom/Envelope:

Exception in thread "main" java.lang.NoClassDefFoundError: org/locationtech/jts/geom/Envelope
at java.lang.Class.getDeclaredMethods0(Native Method)
at java.lang.Class.privateGetDeclaredMethods(Class.java:2701)
at java.lang.Class.getDeclaredMethod(Class.java:2128)
at java.io.ObjectStreamClass.getPrivateMethod(ObjectStreamClass.java:1629)
at java.io.ObjectStreamClass.access$1700(ObjectStreamClass.java:79)
at java.io.ObjectStreamClass$3.run(ObjectStreamClass.java:520)...

I understand that this is a problem of inconsistency between library versions, and I also know that NoClassDefFoundError means the library was accessible at compile time but is not accessible at runtime, but I have not been able to solve the problem.

Here is my build.sbt file:

name := "MySpark"
version := "0.1"
scalaVersion := "2.11.6"
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.3.0",
  "org.locationtech.jts" % "jts-core" % "1.16.0"
)

The Spark and Scala versions on the remote machine are the same as the ones in the build.sbt file.

When I run evicted in the sbt shell, I get the output below, but I don't know how to use this information to solve my problem.

[warn] Found version conflict(s) in library dependencies; some are suspected to be binary incompatible:
[warn]  * io.netty:netty:3.9.9.Final is selected over {3.6.2.Final, 3.7.0.Final}
[warn]      +- org.apache.spark:spark-core_2.11:2.3.0             (depends on 3.9.9.Final)
[warn]      +- org.apache.zookeeper:zookeeper:3.4.6               (depends on 3.6.2.Final)
[warn]      +- org.apache.hadoop:hadoop-hdfs:2.6.5                (depends on 3.6.2.Final)
[warn]  * commons-net:commons-net:2.2 is selected over 3.1
[warn]      +- org.apache.spark:spark-core_2.11:2.3.0             (depends on 2.2)
[warn]      +- org.apache.hadoop:hadoop-common:2.6.5              (depends on 3.1)
[warn]  * com.google.guava:guava:11.0.2 is selected over {12.0.1, 16.0.1}
[warn]      +- org.apache.hadoop:hadoop-yarn-client:2.6.5         (depends on 11.0.2)
[warn]      +- org.apache.hadoop:hadoop-yarn-api:2.6.5            (depends on 11.0.2)
[warn]      +- org.apache.hadoop:hadoop-yarn-common:2.6.5         (depends on 11.0.2)
[warn]      +- org.apache.hadoop:hadoop-yarn-server-nodemanager:2.6.5 (depends on 11.0.2)
[warn]      +- org.apache.hadoop:hadoop-yarn-server-common:2.6.5  (depends on 11.0.2)
[warn]      +- org.apache.hadoop:hadoop-hdfs:2.6.5                (depends on 11.0.2)
[warn]      +- org.apache.curator:curator-framework:2.6.0         (depends on 16.0.1)
[warn]      +- org.apache.curator:curator-client:2.6.0            (depends on 16.0.1)
[warn]      +- org.apache.curator:curator-recipes:2.6.0           (depends on 16.0.1)
[warn]      +- org.apache.hadoop:hadoop-common:2.6.5              (depends on 16.0.1)
[warn]      +- org.htrace:htrace-core:3.0.4                       (depends on 12.0.1)
[warn] Run 'evicted' to see detailed eviction warnings
[info] Here are other dependency conflicts that were resolved:
[info]  * com.thoughtworks.paranamer:paranamer:2.8 is selected over {2.6, 2.3}
[info]      +- com.fasterxml.jackson.module:jackson-module-paranamer:2.7.9 (depends on 2.8)
[info]      +- org.json4s:json4s-core_2.11:3.2.11                 (depends on 2.6)
[info]      +- org.apache.avro:avro:1.7.7 

If you run the sbt package command, the generated jar will contain only your own classes and will not include the third-party libraries.

So if you deploy that jar and try to run it, your runtime environment must provide the external libraries specified here:

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.3.0",
  "org.locationtech.jts" % "jts-core" % "1.16.0"
)

When you run the Spark application, "org.apache.spark" %% "spark-core" % "2.3.0" ends up on the runtime classpath; spark-submit takes care of that for you. "org.locationtech.jts" % "jts-core" % "1.16.0", however, is a specific third-party library, and by default I don't think your runtime environment includes it.

You have to use a plugin such as:

  • sbt-assembly
  • sbt-onejar
  • sbt-native-packager

and configure it to include your third-party artifacts, so that, depending on the chosen solution, your application ships with the classes it needs. In that case, I strongly recommend adding the provided scope to the Spark libraries that already exist in the runtime environment. This excludes them from your package and prevents potential conflicts between the classes you ship and the ones already present in the target (runtime) environment.
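
As a minimal sketch of the sbt-assembly route (the plugin version and file layout are assumptions on my part, not something from the question):

// project/plugins.sbt -- adds the sbt-assembly plugin (version is an assumption)
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.10")

// build.sbt
name := "MySpark"
version := "0.1"
scalaVersion := "2.11.6"

libraryDependencies ++= Seq(
  // Marked "provided": spark-submit already puts spark-core on the runtime classpath,
  // so it is excluded from the fat jar to avoid conflicts with the cluster's version.
  "org.apache.spark" %% "spark-core" % "2.3.0" % "provided",
  // jts-core is not supplied by the cluster, so it is bundled into the fat jar.
  "org.locationtech.jts" % "jts-core" % "1.16.0"
)

Running sbt assembly should then produce a single jar (under target/scala-2.11/) that bundles jts-core, and that is the jar you pass to spark-submit.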

Try adding the dependency to the spark-submit command using --packages:

spark-submit --packages org.locationtech.jts:jts-core:1.16.0 --class labX.applications.ServerTest --executor-memory 2048m --executor-cores 1 --num-executors 17 mySparkApp_2.11-0.1.jar

