
spark-shell: The system cannot find the path specified

After I installed an Anaconda package, I can no longer start the Spark shell under Windows 7. Whenever I type spark-shell, the console answers with The system cannot find the path specified. and the Spark shell does not start.

This is my echo %PATH% :

C:\Program Files\Microsoft MPI\Bin\;C:\Program Files (x86)\Common Files\Intel\Shared Files\cpp\bin\Intel64;C:\Program Files (x86)\Intel\iCLS Client\;C:\Program Files\Intel\iCLS Client\;C:\windows\system32;C:\windows;C:\windows\System32\Wbem;C:\windows\System32\WindowsPowerShell\v1.0\;C:\Program Files\Intel\Intel(R) Management Engine Components\DAL;C:\Program Files (x86)\Intel\Intel(R) Management Engine Components\DAL;C:\Program Files\Intel\Intel(R) Management Engine Components\IPT;C:\Program Files (x86)\Intel\Intel(R) Management Engine Components\IPT;C:\Program Files\Lenovo\Fingerprint Manager Pro\;C:\Program Files (x86)\WinSCP\;C:\Program Files (x86)\Lenovo\Access Connections\;C:\Program Files\MiKTeX 2.9\miktex\bin\x64\;C:\Program Files\PuTTY\;C:\Program Files (x86)\Intel\UCRT\;C:\Program Files\Intel\UCRT\;C:\Program Files\Intel\WiFi\bin\;C:\Program Files\Common Files\Intel\WirelessCommon\;C:\Program Files\Microsoft SQL Server\130\Tools\Binn\;C:\Program Files\dotnet\;C:\Program Files\Anaconda3;C:\Program Files\Anaconda3\Scripts;C:\Program Files\Anaconda3\Library\bin;C:\Program Files (x86)\GtkSharp\2.12\bin;C:\Program Files\Git\cmd;C:\Program Files\TortoiseGit\bin;C:\Program Files\TortoiseSVN\bin;C:\Program Files (x86)\sbt\bin;C:\Program Files (x86)\scala\bin;C:\Program Files (x86)\Java\jre1.8.0_144\bin;C:\Program Files\Intel\WiFi\bin\;C:\Program Files\Common Files\Intel\WirelessCommon\;C:\Program Files (x86)\Graphviz2.38\bin\;C:\Program Files (x86)\sbt\bin;C:\Program Files (x86)\scala\bin;D:\Spark\bin;D:\Hadoop\bin

And the following echo %SPARK_HOME% :

D:\Spark

And the following echo %JAVA_HOME% :

C:\Program Files (x86)\Java\jre1.8.0_144

This is my java -version :

java version "1.8.0_144"
Java(TM) SE Runtime Environment (build 1.8.0_144-b01)
Java HotSpot(TM) Client VM (build 25.144-b01, mixed mode, sharing)

I have already tried reinstalling Java, without any success. There is a similar question here, but I don't see any wrong environment variables in my setup. So I really have no idea how to resolve this. Any thoughts?

After some testing, I found out that when I cd into %SPARK_HOME%\bin, I can actually execute spark-shell. It exits with an error message:

\Java\jre1.8.0_144\bin\java was unexpected at this time.

This error arises when executing the last line, "%~dp0spark-class2.cmd" %CLASS% %*, of Spark\bin\spark-submit2.cmd.
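The "was unexpected at this time" message is a cmd.exe parsing symptom that can be reproduced outside Spark: %JAVA_HOME% is expanded textually before a parenthesized block is parsed, so the ")" inside "(x86)" is taken as the end of the block. A minimal sketch of this pitfall (the paths are illustrative, not the exact lines from spark-class2.cmd):

```bat
@echo off
rem The ")" in "(x86)" closes the if-block early, so the leftover text
rem "\Java\jre1.8.0_144\bin\java" becomes an unexpected token:
set "JAVA_HOME=C:\Program Files (x86)\Java\jre1.8.0_144"
if defined JAVA_HOME (
  set RUNNER=%JAVA_HOME%\bin\java
)
```

Quoting the assignment as set "RUNNER=%JAVA_HOME%\bin\java", or using a JAVA_HOME path that contains no parentheses, avoids the parse error.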

UPDATE 1:

Changing %JAVA_HOME% from "C:\Program Files..." to "C:\PROGRA~1..." indeed partly solved the issue: spark-shell now seems to start. However, there are a lot of Access denied errors:

java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveSessionStateBuilder':
  at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$instantiateSessionState(SparkSession.scala:1053)
  at org.apache.spark.sql.SparkSession$$anonfun$sessionState$2.apply(SparkSession.scala:130)
  at org.apache.spark.sql.SparkSession$$anonfun$sessionState$2.apply(SparkSession.scala:130)
  at scala.Option.getOrElse(Option.scala:121)
  at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:129)
  at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:126)
  at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:938)
  at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:938)
  at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
  at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
  at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
  at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
  at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)
  at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:938)
  at org.apache.spark.repl.Main$.createSparkSession(Main.scala:97)
  ... 47 elided
Caused by: org.apache.spark.sql.AnalysisException: java.lang.RuntimeException: java.lang.RuntimeException: java.io.IOException: Access is denied;
  at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:106)
  at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:193)
  at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:105)
  at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:93)
  at org.apache.spark.sql.hive.HiveSessionStateBuilder.externalCatalog(HiveSessionStateBuilder.scala:39)
  at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog$lzycompute(HiveSessionStateBuilder.scala:54)
  at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog(HiveSessionStateBuilder.scala:52)
  at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog(HiveSessionStateBuilder.scala:35)
  at org.apache.spark.sql.internal.BaseSessionStateBuilder.build(BaseSessionStateBuilder.scala:289)
  at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$instantiateSessionState(SparkSession.scala:1050)
  ... 61 more
Caused by: java.lang.RuntimeException: java.lang.RuntimeException: java.io.IOException: Access is denied
  at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)
  at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:191)
  at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
  at sun.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source)
  at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source)
  at java.lang.reflect.Constructor.newInstance(Unknown Source)
  at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
  at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:362)
  at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:266)
  at org.apache.spark.sql.hive.HiveExternalCatalog.client$lzycompute(HiveExternalCatalog.scala:66)
  at org.apache.spark.sql.hive.HiveExternalCatalog.client(HiveExternalCatalog.scala:65)
  at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply$mcZ$sp(HiveExternalCatalog.scala:194)
  at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:194)
  at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:194)
  at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:97)
  ... 70 more
Caused by: java.lang.RuntimeException: java.io.IOException: Access is denied
  at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:515)
  ... 84 more
Caused by: java.io.IOException: Access is denied
  at java.io.WinNTFileSystem.createFileExclusively(Native Method)
  at java.io.File.createTempFile(Unknown Source)
  at org.apache.hadoop.hive.ql.session.SessionState.createTempFile(SessionState.java:818)
  at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:513)
  ... 84 more
<console>:14: error: not found: value spark
       import spark.implicits._
              ^
<console>:14: error: not found: value spark
       import spark.sql
              ^
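The "C:\PROGRA~1..." workaround above relies on Windows 8.3 short names, and the short form is not always PROGRA~1 (for example, "Program Files (x86)" typically shortens to PROGRA~2). cmd can print the actual short names; the lookup path below is an example:

```bat
rem List the 8.3 short names of top-level directories (the /x column):
dir /x C:\

rem Or resolve a specific path with the %~s modifier (inside a batch
rem file; use a single % on the interactive prompt):
for %%I in ("C:\Program Files (x86)\Java\jre1.8.0_144") do @echo %%~sI
```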

UPDATE 2:

Running spark-shell as an administrator works! However, this might be very unsafe, and I don't consider it a real solution.

Make sure that you have set your JAVA_HOME and SBT_HOME correctly; to be on the safe side, I also add them to the Path variable. To do this easily, I can recommend Rapid Environment Editor, a simple and nice tool for editing system variables. This approach made it work for me, as I was getting the same problem as you. An example would be:

JAVA_HOME set to C:\Program Files\Java\jdk1.8.0_151

SBT_HOME set to C:\Program Files (x86)\sbt\
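If you prefer the command line over a GUI editor, the two variables can also be set persistently from a cmd prompt with setx (the values below are the answer's example paths; setx only affects console windows opened afterwards, so reopen the prompt before retrying spark-shell):

```bat
setx JAVA_HOME "C:\Program Files\Java\jdk1.8.0_151"
setx SBT_HOME "C:\Program Files (x86)\sbt\"
```

Avoid rewriting Path itself with setx, as it silently truncates long values; edit Path through the System Properties dialog or a dedicated tool instead.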
