Compiling Apache Spark 1.1.0 for local mode on a Windows 8 x64 PC? See specific errors
I've explicitly installed the Java 8 JDK, the Scala IDE (a modified Eclipse), the normal Eclipse IDE, and sbt, though I don't really remember the order in which I installed them. I was somewhat confused about how to set up my environment variables with both the vanilla Eclipse IDE and the Scala IDE installed, as opposed to just the command-line tools, since these installations seem to organize the language environment differently than the command-line installations do. So I wasn't sure what to point the variables at.
My understanding is that the CLASS_PATH environment variable is for user-defined classes, of which my current project has none. Beyond this, I have the following environment variable setup. I am on Windows 8 x64.
SBT_HOME = C:\Program Files (x86)\sbt
JRE_HOME = C:\Program Files\Java\jre1.8.0_20
JAVA_HOME = C:\Program Files\Java\jdk1.8.0_20
Path = %JRE_HOME%\bin;%JAVA_HOME%\bin;C:\ProgramData\Oracle\Java\javapath;C:\Program Files (x86)\scala\bin;C:\Program Files (x86)\sbt\bin;...(only relevant paths shown)
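For what it's worth, a quick sanity check of this setup from a Windows command prompt might look like the following. This is only a sketch of verification commands, assuming the JDK path shown above; it is not part of the original question.

```shell
:: Confirm JAVA_HOME expands to the JDK (not the JRE) install directory
echo %JAVA_HOME%

:: Confirm the compiler resolves through PATH; "where" prints the full path
:: to javac.exe, or an error if it is not reachable from this shell
where javac
javac -version
```

If `where javac` fails in the shell that launches sbt, that shell will reproduce the "Cannot run program \"javac\"" error described below, regardless of what the GUI environment-variable dialog shows.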
Upon downloading Apache Spark 1.1.0, uncompressing it, entering the uncompressed top-level folder ~/(apache spark folder)/, and attempting to run sbt/sbt assembly, I receive the following error using the cygwin command prompt.
[error] (streaming-flume-sink/compile:compile) java.io.IOException: Cannot run program "javac": CreateProcess error=2, The system cannot find the file specified
[error] (core/compile:compile) java.io.IOException: Cannot run program "javac": CreateProcess error=2, The system cannot find the file specified
[error] Total time: 551 s, completed Oct 28, 2014 5:24:00 PM
So the error appeared to be that I can't run javac in cygwin. Well, I can't. So I tried the Windows prompt as well, where my javac command actually works (since the environment variables are set up), and I get the following error when again attempting `sbt/sbt assembly`.
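As an aside on the cygwin failure: cygwin inherits the Windows PATH but expresses it in POSIX form, so a missing `javac` there can be checked and patched from within cygwin itself. A minimal sketch, assuming the JDK location from the question (the `/cygdrive/c/...` path is cygwin's standard mount notation for `C:\`):

```shell
# Check whether javac is resolvable from the cygwin shell at all
command -v javac || echo "javac not on PATH in cygwin"

# If it is missing, append the JDK's bin directory for this session
export PATH="$PATH:/cygdrive/c/Program Files/Java/jdk1.8.0_20/bin"
command -v javac && javac -version
```

Putting the `export PATH=...` line in `~/.bashrc` would make the change persistent across cygwin sessions.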
C:\Program Files (x86)\spark-1.1.0>sbt\sbt assembly
'sbt\sbt' is not recognized as an internal or external command,
operable program or batch file.

>sbt/sbt assembly
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=256m; support was removed in 8.0
[info] Loading project definition from C:\Program Files (x86)\spark-1.1.0\project\project
[info] Loading project definition from C:\Users\jim\.sbt\0.13\staging\ec3aa8f39111944cc5f2\sbt-pom-reader\project
[warn] Multiple resolvers having different access mechanism configured with same name 'sbt-plugin-releases'. To avoid conflict, Remove duplicate project resolvers (`resolvers`) or rename publishing resolver (`publishTo`).
[info] Loading project definition from C:\Program Files (x86)\spark-1.1.0\project
[error] Expected symbol
[error] Not a valid command: /
[error] /sbt
[error] ^
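A note on what the log above shows: in cmd.exe, `sbt\sbt` fails because there is no `sbt` executable (with no extension) in the subfolder for Windows to run, while `sbt/sbt` launches the `sbt` launcher found on PATH and then hands it `/sbt assembly` as arguments, producing the "Not a valid command: /" error. Since `SBT_HOME\bin` is already on the PATH per the setup above, a sketch of an invocation that sidesteps both problems might be (hedged; this relies on the separately installed sbt launcher, not the `sbt/sbt` script bundled with Spark):

```shell
:: Windows cmd: run the sbt launcher from PATH inside the Spark source root,
:: letting it pick up the project definition from the current directory
cd "C:\Program Files (x86)\spark-1.1.0"
sbt assembly
```

The bundled `sbt/sbt` file itself is a bash script, so from a native Windows prompt it cannot be executed directly; it is only usable from a POSIX-style shell such as cygwin.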
But all kinds of variants of the same command don't work either, such as switching out slash types (`sbt\sbt assembly`).
I've also got spark in a folder: C:\Program Files (x86)\spark.
What in tarnation do I need to do to assemble/compile Spark? Thanks a million.
I ran into the same problem on Windows 8, and the following steps fixed the issue.