
Apache Spark installation with Java Error

Hi, I am having a problem installing Spark on my PC and am getting the Java error below. Would you help me and give me some tips on how to fix it?

Thanks a million in advance.

[Screenshot of the Java error attached to the question]

(See my comments above, but here is my answer.)

As Spark relies on Scala (though you don't have to use Scala yourself), you are tied to the JVM versions supported by Scala, which in your situation means Java 8 or 11, not 17.

Check out https://spark.apache.org/docs/latest/ for more details.

Install Java 8 or 11, point your JAVA_HOME to the right directory, and run Spark again.
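As a quick sanity check (a minimal sketch, not part of the original answer; the class name JavaVersionCheck is made up for illustration), you can compile and run a tiny Java program with the JDK on your PATH to see which JVM Spark would actually pick up:

    // JavaVersionCheck.java -- hypothetical helper for checking the active JDK.
    // If this prints 17 instead of 1.8.x or 11.x, your JAVA_HOME / PATH still
    // points at the newer JDK and Spark will fail to start.
    public class JavaVersionCheck {
        public static void main(String[] args) {
            System.out.println("java.version = " + System.getProperty("java.version"));
            System.out.println("java.home    = " + System.getProperty("java.home"));
            System.out.println("JAVA_HOME    = " + System.getenv("JAVA_HOME"));
        }
    }

Compile with javac JavaVersionCheck.java and run with java JavaVersionCheck; the java.version line should start with 1.8 or 11 before you retry Spark.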
