
java.lang.NoClassDefFoundError when using Spark in Maven

I have a Maven project in which I use the following Spark dependencies:

<dependencies>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.11</artifactId>
      <version>${spark.version}</version>
    </dependency>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-sql_2.11</artifactId>
      <version>${spark.version}</version>
    </dependency>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-mllib_2.11</artifactId>
      <version>${spark.version}</version>
    </dependency>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-graphx_2.11</artifactId>
      <version>${spark.version}</version>
    </dependency>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-yarn_2.11</artifactId>
      <version>${spark.version}</version>
    </dependency>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-network-shuffle_2.11</artifactId>
      <version>${spark.version}</version>
    </dependency>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-streaming-flume_2.11</artifactId>
      <version>${spark.version}</version>
    </dependency>
    <dependency>
      <groupId>com.databricks</groupId>
      <artifactId>spark-csv_2.11</artifactId>
      <version>1.3.0</version>
    </dependency>
  </dependencies>
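The `${spark.version}` property referenced by the dependencies above is not shown in the question; a minimal sketch of how it would typically be defined in the pom's `<properties>` block, assuming the Spark 2.4.4 version stated below:

```xml
<properties>
  <spark.version>2.4.4</spark.version>
</properties>
```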

The Spark version is 2.4.4.

Now I run the following code:

    SparkSession spark = SparkSession.builder()
            .master("local[*]")
            .config("spark.sql.warehouse.dir", "/tmp/spark")
            .appName("SurvivalPredictionMLP")
            .getOrCreate();
    //Reads the training set
    Dataset<Row> df = spark.sqlContext()
            .read()
            .format("com.databricks.spark.csv")
            .option("header", true)
            .option("inferSchema", true)
            .load("data/train.csv");
    //Show
    df.show();

But I get the following exception at the getOrCreate() line:

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
19/09/22 14:18:06 INFO SparkContext: Running Spark version 2.4.4
19/09/22 14:18:07 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
19/09/22 14:18:07 INFO SparkContext: Submitted application: SurvivalPredictionMLP
19/09/22 14:18:07 INFO SecurityManager: Changing view acls to: pro
19/09/22 14:18:07 INFO SecurityManager: Changing modify acls to: pro
19/09/22 14:18:07 INFO SecurityManager: Changing view acls groups to: 
19/09/22 14:18:07 INFO SecurityManager: Changing modify acls groups to: 
19/09/22 14:18:07 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(pro); groups with view permissions: Set(); users  with modify permissions: Set(pro); groups with modify permissions: Set()
Exception in thread "main" java.lang.NoClassDefFoundError: io/netty/channel/Channel
    at org.apache.spark.rpc.netty.NettyRpcEnv.<init>(NettyRpcEnv.scala:59)
    at org.apache.spark.rpc.netty.NettyRpcEnvFactory.create(NettyRpcEnv.scala:461)
    at org.apache.spark.rpc.RpcEnv$.create(RpcEnv.scala:57)
    at org.apache.spark.SparkEnv$.create(SparkEnv.scala:249)
    at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:175)
    at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:257)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:424)
    at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2520)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:935)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:926)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:926)
    at com.jdlp.projects.titanic.App.<init>(App.java:18)
    at com.jdlp.projects.titanic.App.main(App.java:33)
Caused by: java.lang.ClassNotFoundException: io.netty.channel.Channel
    at java.net.URLClassLoader$1.run(URLClassLoader.java:371)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:363)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:362)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 14 more
Caused by: java.util.zip.ZipException: invalid LOC header (bad signature)
    at java.util.zip.ZipFile.read(Native Method)
    at java.util.zip.ZipFile.access$1400(ZipFile.java:60)
    at java.util.zip.ZipFile$ZipFileInputStream.read(ZipFile.java:734)
    at java.util.zip.ZipFile$ZipFileInflaterInputStream.fill(ZipFile.java:434)
    at java.util.zip.InflaterInputStream.read(InflaterInputStream.java:158)
    at java.util.jar.Manifest$FastInputStream.fill(Manifest.java:476)
    at java.util.jar.Manifest$FastInputStream.readLine(Manifest.java:410)
    at java.util.jar.Manifest$FastInputStream.readLine(Manifest.java:444)
    at java.util.jar.Attributes.read(Attributes.java:376)
    at java.util.jar.Manifest.read(Manifest.java:234)
    at java.util.jar.Manifest.<init>(Manifest.java:81)
    at java.util.jar.Manifest.<init>(Manifest.java:73)
    at java.util.jar.JarFile.getManifestFromReference(JarFile.java:199)
    at java.util.jar.JarFile.getManifest(JarFile.java:180)
    at sun.misc.URLClassPath$JarLoader$2.getManifest(URLClassPath.java:992)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:451)
    at java.net.URLClassLoader.access$100(URLClassLoader.java:74)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:369)
    ... 20 more
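The root `ZipException: invalid LOC header (bad signature)` usually means a jar in the local Maven repository (here, the netty jar that provides `io.netty.channel.Channel`) was corrupted during download; deleting the bad file makes Maven re-download it on the next build. A self-contained sketch that scans a repository directory for jars that fail to open as zip archives (the default `~/.m2/repository` path is an assumption):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.ArrayList;
import java.util.List;
import java.util.stream.Stream;
import java.util.zip.ZipFile;

public class CorruptJarScanner {
    // Returns every *.jar under root that cannot be opened as a zip archive.
    static List<Path> findCorruptJars(Path root) throws IOException {
        List<Path> corrupt = new ArrayList<>();
        try (Stream<Path> files = Files.walk(root)) {
            files.filter(p -> p.toString().endsWith(".jar")).forEach(p -> {
                // Opening the ZipFile reads the central directory, which
                // fails with a ZipException on a truncated/corrupted download.
                try (ZipFile zf = new ZipFile(p.toFile())) {
                    // opened successfully: archive is structurally intact
                } catch (IOException e) {
                    corrupt.add(p);
                }
            });
        }
        return corrupt;
    }

    public static void main(String[] args) throws IOException {
        // Default local repository location is an assumption; pass another path as args[0].
        Path repo = args.length > 0 ? Paths.get(args[0])
                : Paths.get(System.getProperty("user.home"), ".m2", "repository");
        for (Path p : findCorruptJars(repo)) {
            // Delete the flagged files, then re-run the Maven build to re-download them.
            System.out.println("corrupt: " + p);
        }
    }
}
```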

When I google these exceptions, the suggestions are to change files etc., but since I am using Maven, I can't (or shouldn't) change anything.

Is there a way to fix this error?

Thanks!

It looks like you are using spark 2.11 in your pom file, but you run the program with spark 2.4.4. I have seen strange errors when the version in the pom doesn't match the version on my machine.
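For reference, the `_2.11` suffix in the artifact ids is the Scala binary version the artifact was built for, and Spark 2.4.4 is published for both Scala 2.11 and 2.12. One common way to keep that suffix consistent across all the Spark dependencies is a shared property; a sketch (not from the original pom):

```xml
<properties>
  <scala.binary.version>2.11</scala.binary.version>
</properties>
...
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_${scala.binary.version}</artifactId>
  <version>${spark.version}</version>
</dependency>
```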
