"The system cannot find the path specified" error while running pyspark
I just downloaded spark-2.3.0-bin-hadoop2.7.tgz. After downloading, I followed the steps mentioned here: pyspark installation for windows 10. I used the command bin\pyspark to run Spark and got the error message:
The system cannot find the path specified
Attached are screenshots of the error message, my Spark bin folder, and my Path variable.
I have Python 3.6 and Java "1.8.0_151" on my Windows 10 system. Can you suggest how to resolve this issue?
For those on Windows who are still trying: what solved it for me was reinstalling Python (3.9) as a local user (c:\Users\<user>\AppData\Local\Programs\Python) and defining two environment variables, PYSPARK_PYTHON and PYSPARK_DRIVER_PYTHON, both set to:
c:\Users\<user>\AppData\Local\Programs\Python\python.exe
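The two variables above can be set persistently from a cmd.exe prompt; this is a sketch, and the Python path is an example to replace with your own install location:

```shell
:: Point PySpark at a specific Python interpreter (run in cmd.exe).
:: The path below is an example; substitute your actual Python location.
setx PYSPARK_PYTHON "C:\Users\%USERNAME%\AppData\Local\Programs\Python\python.exe"
setx PYSPARK_DRIVER_PYTHON "C:\Users\%USERNAME%\AppData\Local\Programs\Python\python.exe"
```

Note that `setx` only affects shells opened after it runs, so close and reopen the terminal before retrying `pyspark`.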
Fixing problems installing Pyspark (Windows)
Incorrect JAVA_HOME path
> pyspark
The system cannot find the path specified.
Open System Environment variables:
rundll32 sysdm.cpl,EditEnvironmentVariables
Set JAVA_HOME: System Variables > New:
Variable Name: JAVA_HOME
Variable Value: C:\Program Files\Java\jdk1.8.0_261
Also, check that SPARK_HOME and HADOOP_HOME are correctly set, e.g.:
SPARK_HOME=C:\Spark\spark-3.2.0-bin-hadoop3.2
HADOOP_HOME=C:\Spark\spark-3.2.0-bin-hadoop3.2
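The same variables can be set from the command line instead of the GUI dialog; a minimal sketch (the JDK and Spark paths are examples, so adjust them to your install locations):

```shell
:: Persistently set the variables for the current user (cmd.exe).
:: Paths are examples; point them at your actual JDK and Spark folders.
setx JAVA_HOME "C:\Program Files\Java\jdk1.8.0_261"
setx SPARK_HOME "C:\Spark\spark-3.2.0-bin-hadoop3.2"
setx HADOOP_HOME "C:\Spark\spark-3.2.0-bin-hadoop3.2"

:: Then, in a NEW terminal, verify they resolved:
echo %JAVA_HOME%
echo %SPARK_HOME%
```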
Important: double-check that these paths do not include the bin folder.
Incorrect Java version
> pyspark
WARN SparkContext: Another SparkContext is being constructed
UserWarning: Failed to initialize Spark session.
java.lang.NoClassDefFoundError: Could not initialize class org.apache.spark.storage.StorageUtils$
Ensure that JAVA_HOME is set to Java 8 (jdk1.8.0).
winutils not installed
> pyspark
WARN Shell: Did not find winutils.exe
java.io.FileNotFoundException: Could not locate Hadoop executable
Download winutils.exe and copy it to your Spark home bin folder:
curl -OutFile C:\Spark\spark-3.2.0-bin-hadoop3.2\bin\winutils.exe -Uri https://github.com/steveloughran/winutils/raw/master/hadoop-3.0.0/bin/winutils.exe