
spark-shell command giving error (windows cmd and cygwin)

Using Windows 10, JDK 16.0.1, Scala 2.13.5, Spark 3.1.1, and Hadoop 2.7, the spark-shell command gives the following error in cmd.exe:

Exception in thread "main" java.lang.ExceptionInInitializerError
        at org.apache.spark.unsafe.array.ByteArrayMethods.<clinit>(ByteArrayMethods.java:54)
        at org.apache.spark.internal.config.package$.<init>(package.scala:1095)
        at org.apache.spark.internal.config.package$.<clinit>(package.scala)
        at org.apache.spark.deploy.SparkSubmitArguments.$anonfun$loadEnvironmentArguments$3(SparkSubmitArguments.scala:157)
        at scala.Option.orElse(Option.scala:447)
        at org.apache.spark.deploy.SparkSubmitArguments.loadEnvironmentArguments(SparkSubmitArguments.scala:157)
        at org.apache.spark.deploy.SparkSubmitArguments.<init>(SparkSubmitArguments.scala:115)
        at org.apache.spark.deploy.SparkSubmit$$anon$2$$anon$3.<init>(SparkSubmit.scala:1013)
        at org.apache.spark.deploy.SparkSubmit$$anon$2.parseArguments(SparkSubmit.scala:1013)
        at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:85)
        at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1030)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1039)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.reflect.InaccessibleObjectException: Unable to make private java.nio.DirectByteBuffer(long,int) accessible: module java.base does not "opens java.nio" to unnamed module @7530ad9c
        at java.base/java.lang.reflect.AccessibleObject.checkCanSetAccessible(AccessibleObject.java:357)
        at java.base/java.lang.reflect.AccessibleObject.checkCanSetAccessible(AccessibleObject.java:297)
        at java.base/java.lang.reflect.Constructor.checkCanSetAccessible(Constructor.java:188)
        at java.base/java.lang.reflect.Constructor.setAccessible(Constructor.java:181)
        at org.apache.spark.unsafe.Platform.<clinit>(Platform.java:56)
        ... 13 more
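The Caused by line is the key: since JDK 9 the module system restricts reflective access, and JDK 16 denies it by default, so Spark's Platform class can no longer reflectively open java.nio.DirectByteBuffer. The error message itself names the missing grant ("opens java.nio"). A hedged workaround sketch in cmd, before switching Java versions; whether SPARK_SUBMIT_OPTS reaches this code path and whether a single --add-opens flag is enough on JDK 16 are assumptions, and the documented route remains Java 8/11:

:: Sketch only: grant the reflective access named in the error message
:: to the JVM that runs org.apache.spark.deploy.SparkSubmit
set SPARK_SUBMIT_OPTS=--add-opens=java.base/java.nio=ALL-UNNAMED
spark-shell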

Cygwin:

D:\Java\jdk-16.0.1\bin\java -cp "D:\spark/conf\;D:\spark\jars\*" "-Dscala.usejavacp=true" "-Djline.terminal=unix" -Xmx1g org.apache.spark.deploy.SparkSubmit --class org.apache.spark.repl.Main --name "Spark shell" spark-shell
D:\spark/bin/spark-class: line 96: CMD: bad array subscript

Edit:

I just solved it by using Spark 2.4.7 instead and making a separate hadoop\bin for winutils. Though I would like to ask why that happened?
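For reference, a minimal cmd sketch of the separate hadoop\bin layout mentioned above; the C:\hadoop location and the source path of winutils.exe are assumptions:

:: Create a standalone Hadoop home whose bin holds winutils.exe
mkdir C:\hadoop\bin
copy winutils.exe C:\hadoop\bin\
:: Persist HADOOP_HOME and put its bin on the user PATH (takes effect in new cmd windows)
setx HADOOP_HOME "C:\hadoop"
setx PATH "%PATH%;C:\hadoop\bin"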

I just solved this error. It was a Java version issue: if you look at the Spark documentation, it doesn't support anything other than Java 8 or 11.

Please find the link here: https://spark.apache.org/docs/latest/

Try installing Java 8 and then set its respective environment variable. You need not change the Hadoop and Spark version to 2.7; it works with version 3.2.1 as well.
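A minimal sketch of that environment setup in cmd, assuming Java 8 is installed under C:\Program Files\Java\jdk1.8.0_291 (the path is an assumption; adjust it to your install):

:: Persist JAVA_HOME for future sessions (setx does not touch the current one)
setx JAVA_HOME "C:\Program Files\Java\jdk1.8.0_291"
:: Prepend the same bin directory to the user PATH
setx PATH "C:\Program Files\Java\jdk1.8.0_291\bin;%PATH%"
:: In a NEW cmd window, confirm the version spark-shell will pick up
java -version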

After you finish the setup, restart the cmd prompt and type spark-shell, and the magic happens!
