Is there a version compatibility issue between Spark/Hadoop/Scala/Java/Python?
I'm getting an error while running the spark-shell command through cmd, and I've had no luck fixing it so far. I have Python/Java/Spark/Hadoop (winutils.exe)/Scala installed with versions as below:
I followed the steps below and ran spark-shell (from C:\Program Files\spark-3.2.0-bin-hadoop3.2\bin>) through cmd:
- Set the JAVA_HOME variable: C:\Program Files\Java\jdk1.8.0_311\bin
- Added %JAVA_HOME%\bin to the Path variable
- Set the SPARK_HOME variable: C:\spark-3.2.0-bin-hadoop3.2\bin
- Added %SPARK_HOME%\bin to the Path variable
- Placed winutils.exe under C:\Hadoop\bin; make sure winutils.exe is located inside this path
- Set the HADOOP_HOME variable: C:\Hadoop
- Added %HADOOP_HOME%\bin to the Path variable
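As a quick sanity check (a hedged sketch based on the paths above, not part of the original steps), each variable can be verified from cmd before launching the shell:

    rem Confirm each variable resolves to the expected path
    echo %JAVA_HOME%
    echo %SPARK_HOME%
    echo %HADOOP_HOME%

    rem Should print C:\Hadoop\bin\winutils.exe once %HADOOP_HOME%\bin is on Path
    where winutils.exe

    rem Then launch the shell
    spark-shell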
Am I missing out on anything? I've posted my question with error details in another thread (spark-shell command throwing this error: SparkContext: Error initializing SparkContext).
You went the difficult way in installing everything by hand. You may need Scala too; be extremely vigilant about the version you are installing. From your example it looks like Scala 2.12. But you are right: Spark is extremely demanding in terms of version matching. Java 8 is good. Java 11 is OK too, but no more recent version.
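To confirm which versions are actually in play (a hedged check, not from the original answer), both the JVM and Spark report themselves from cmd; the Spark banner also shows the Scala version it was built against:

    rem Java 8 reports itself as 1.8.0_xxx, Java 11 as 11.x
    java -version

    rem Prints the Spark version plus the Scala build version
    rem (a spark-3.2.0-bin-hadoop3.2 build should show Scala 2.12.x)
    spark-shell --version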
Alternatively, you can: