
Apache Spark 3.3.0 breaks on Java 17 with "cannot access class sun.nio.ch.DirectBuffer"

A similar question was asked at Running unit tests with Spark 3.3.0 on Java 17 fails with IllegalAccessError: class StorageUtils cannot access class sun.nio.ch.DirectBuffer, but that question (and its solution) was only about unit tests. For me, Spark breaks when actually running the program.

According to the Spark overview, Spark works with Java 17. I'm using Temurin-17.0.4+8 (build 17.0.4+8) on Windows 10, and I include Spark 3.3.0 in Maven like this:

<scala.version>2.13</scala.version>
<spark.version>3.3.0</spark.version>
...
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_${scala.version}</artifactId>
  <version>${spark.version}</version>
</dependency>

<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-sql_${scala.version}</artifactId>
  <version>${spark.version}</version>
</dependency>

I try to run a simple program:

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

final SparkSession spark = SparkSession.builder().appName("Foo Bar").master("local").getOrCreate();
final Dataset<Row> df = spark.read().format("csv").option("header", "false").load("/path/to/file.csv");
df.show(5);

That breaks all over the place:

Caused by: java.lang.IllegalAccessError: class org.apache.spark.storage.StorageUtils$ (in unnamed module @0x59d016c9) cannot access class sun.nio.ch.DirectBuffer (in module java.base) because module java.base does not export sun.nio.ch to unnamed module @0x59d016c9
    at org.apache.spark.storage.StorageUtils$.<clinit>(StorageUtils.scala:213)
    at org.apache.spark.storage.BlockManagerMasterEndpoint.<init>(BlockManagerMasterEndpoint.scala:114)
    at org.apache.spark.SparkEnv$.$anonfun$create$9(SparkEnv.scala:353)
    at org.apache.spark.SparkEnv$.registerOrLookupEndpoint$1(SparkEnv.scala:290)
    at org.apache.spark.SparkEnv$.create(SparkEnv.scala:339)
    at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:194)
    at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:279)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:464)
    at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2704)
    at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$2(SparkSession.scala:953)
    at scala.Option.getOrElse(Option.scala:201)
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:947)

Spark is obviously doing things one is not supposed to do in Java 17.

Disappointing. How do I get around this?

The following steps helped me unblock the issue.

If you are running the application from an IDE (IntelliJ IDEA), follow the instructions below.

Add the JVM option "--add-exports java.base/sun.nio.ch=ALL-UNNAMED".


source: https://arrow.apache.org/docs/java/install.html#java-compatibility
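If you build and test with Maven rather than launching from the IDE, the same flag can be passed to forked test JVMs through the Surefire plugin's argLine. A sketch (the plugin version shown is only an example):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <version>3.0.0-M7</version>
  <configuration>
    <!-- Open the JDK-internal package Spark's StorageUtils needs -->
    <argLine>--add-exports java.base/sun.nio.ch=ALL-UNNAMED</argLine>
  </configuration>
</plugin>
```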

Add this as an explicit dependency in the pom.xml file. Do not change the version from 3.0.16.

<dependency>
    <groupId>org.codehaus.janino</groupId>
    <artifactId>janino</artifactId>
    <version>3.0.16</version>
</dependency>

Then add the command-line arguments. If you use VS Code, add

"vmArgs": "--add-exports java.base/sun.nio.ch=ALL-UNNAMED"

in the configurations section of the launch.json file under the .vscode folder in your project.
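Put together, a minimal launch.json entry might look like the following sketch (the name and mainClass values are placeholders for your own project):

```json
{
  "configurations": [
    {
      "type": "java",
      "name": "Launch App",
      "request": "launch",
      "mainClass": "com.example.App",
      "vmArgs": "--add-exports java.base/sun.nio.ch=ALL-UNNAMED"
    }
  ]
}
```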

Solution


Please consider adding the appropriate Java Virtual Machine command-line options.
The exact way to add them depends on how you run the program: from the command line, from an IDE, etc.

Example: Command line

For example, to run the program (the .jar file) from the command line:

java \
    --add-opens=java.base/java.lang=ALL-UNNAMED \
    --add-opens=java.base/java.lang.invoke=ALL-UNNAMED \
    --add-opens=java.base/java.lang.reflect=ALL-UNNAMED \
    --add-opens=java.base/java.io=ALL-UNNAMED \
    --add-opens=java.base/java.net=ALL-UNNAMED \
    --add-opens=java.base/java.nio=ALL-UNNAMED \
    --add-opens=java.base/java.util=ALL-UNNAMED \
    --add-opens=java.base/java.util.concurrent=ALL-UNNAMED \
    --add-opens=java.base/java.util.concurrent.atomic=ALL-UNNAMED \
    --add-opens=java.base/sun.nio.ch=ALL-UNNAMED \
    --add-opens=java.base/sun.nio.cs=ALL-UNNAMED \
    --add-opens=java.base/sun.security.action=ALL-UNNAMED \
    --add-opens=java.base/sun.util.calendar=ALL-UNNAMED \
    --add-opens=java.security.jgss/sun.security.krb5=ALL-UNNAMED \
    -jar <JAR_FILE_PATH>
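If you submit the application through spark-submit instead of plain java, the same options can be handed to the driver JVM with the --driver-java-options flag. A sketch, with a placeholder class name and jar path (in practice you would pass the full --add-opens list shown above):

```
spark-submit \
    --class com.example.App \
    --driver-java-options "--add-opens=java.base/sun.nio.ch=ALL-UNNAMED" \
    /path/to/app.jar
```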

Example: IDE: IntelliJ IDEA

In IntelliJ IDEA, add the same options to the "VM options" field of your run/debug configuration (Run > Edit Configurations...).


You could use JDK 8, and maybe you really should.

But if you can't, you might try adding these Java options to your build.sbt file. For me they were needed for tests, so I put them into:

val projectSettings = Seq(
...
  Test / javaOptions ++= Seq(
    "base/java.lang", "base/java.lang.invoke", "base/java.lang.reflect", "base/java.io", "base/java.net", "base/java.nio",
    "base/java.util", "base/java.util.concurrent", "base/java.util.concurrent.atomic",
    "base/sun.nio.ch", "base/sun.nio.cs", "base/sun.security.action",
    "base/sun.util.calendar", "security.jgss/sun.security.krb5",
  ).map("--add-opens=java." + _ + "=ALL-UNNAMED"),
...
)
