
spark-shell on a new Mac gives an error

I am trying to install Spark on a new MacBook. I can't run spark-shell and get the following error:

Failed to initialize compiler: object java.lang.Object in compiler mirror not found.
** Note that as of 2.8 scala does not assume use of the java classpath.
** For the old behavior pass -usejavacp to scala, or if using a Settings
** object programmatically, settings.usejavacp.value = true.

Exception in thread "main" java.lang.NullPointerException
at scala.reflect.internal.SymbolTable.exitingPhase(SymbolTable.scala:256)
at scala.tools.nsc.interpreter.IMain$Request.x$20$lzycompute(IMain.scala:896)
at scala.tools.nsc.interpreter.IMain$Request.x$20(IMain.scala:895)
at scala.tools.nsc.interpreter.IMain$Request.headerPreamble$lzycompute(IMain.scala:895)
at scala.tools.nsc.interpreter.IMain$Request.headerPreamble(IMain.scala:895)
at scala.tools.nsc.interpreter.IMain$Request$Wrapper.preamble(IMain.scala:918)
at scala.tools.nsc.interpreter.IMain$CodeAssembler$$anonfun$apply$23.apply(IMain.scala:1337)
at scala.tools.nsc.interpreter.IMain$CodeAssembler$$anonfun$apply$23.apply(IMain.scala:1336)
at scala.tools.nsc.util.package$.stringFromWriter(package.scala:64)
at scala.tools.nsc.interpreter.IMain$CodeAssembler$class.apply(IMain.scala:1336)
at scala.tools.nsc.interpreter.IMain$Request$Wrapper.apply(IMain.scala:908)
at scala.tools.nsc.interpreter.IMain$Request.compile$lzycompute(IMain.scala:1002)
at scala.tools.nsc.interpreter.IMain$Request.compile(IMain.scala:997)
at scala.tools.nsc.interpreter.IMain.compile(IMain.scala:579)
at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:567)
at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:565)
at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:807)
at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:681)
at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:395)
at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:38)
at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214)
at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:37)
at org.apache.spark.repl.SparkILoop.loadFiles(SparkILoop.scala:105)
at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:920)
at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)
at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:909)
at org.apache.spark.repl.Main$.doMain(Main.scala:68)
at org.apache.spark.repl.Main$.main(Main.scala:51)
at org.apache.spark.repl.Main.main(Main.scala)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:564)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

First, install Java 8 (you can keep Java 9 if that's what you have).

Then, in your .bash_profile, set JAVA_HOME as follows:

export JAVA_HOME=$(/usr/libexec/java_home -v 1.8)  
export PATH=$JAVA_HOME/bin:$PATH

Finally, add this:

export SPARK_LOCAL_IP="127.0.0.1"
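After reloading the profile, it's worth confirming that the right JDK is actually active. A minimal check (the exact 1.8 update number in the output depends on which build is installed):

```shell
# Reload the profile so the new exports take effect in this shell.
source ~/.bash_profile

# JAVA_HOME should now point at a 1.8 JDK under /Library/Java/JavaVirtualMachines.
echo "$JAVA_HOME"

# Should report: java version "1.8.0_..."
java -version
```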

Hope this helps. There is a nice trick for switching between different Java versions here:
Mac OS X and multiple Java versions — check out the answer by @Vegard.

This is my setting:

export JAVA_HOME=$(/usr/libexec/java_home -v 1.8)  
export PATH=$JAVA_HOME/bin:$PATH

export SCALA_HOME=/path/to/your/scala  
export PATH=$PATH:$SCALA_HOME/bin

export SPARK_HOME=/path/to/your/spark  
export PATH="$SPARK_HOME/bin:$PATH"  
export SPARK_LOCAL_IP="127.0.0.1"

I get the same error when trying to run spark-shell with Java 9.

Please try installing Java 8 and running spark-shell with JAVA_HOME set to /Library/Java/JavaVirtualMachines/jdk1.8.0.jdk/Contents/Home.
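If you'd rather not change your profile, you can also set JAVA_HOME for a single invocation. A sketch using macOS's java_home helper (this assumes a Java 8 JDK is installed, so that the -v 1.8 selector resolves):

```shell
# One-off run of spark-shell under Java 8; the JDK path is resolved by
# /usr/libexec/java_home rather than hard-coded, so it survives JDK updates.
JAVA_HOME=$(/usr/libexec/java_home -v 1.8) spark-shell
```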

For more information about JDK 9 support in Scala, see scala-dev issue #139.

In my case, it was the brew formula that changed JAVA_HOME, but only for spark-submit and the other Spark-related commands.

Check with brew info apache-spark to see the contents of the formula. In mine, openjdk@11 was pinned.

So I had to edit this Ruby file to use the current $JAVA_HOME, then brew reinstall apache-spark. After that the error was gone, and the Spark commands use whichever Java version I have currently selected with jenv.
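For reference, the inspect-and-patch workflow looks roughly like this (the exact dependency line inside the formula varies by Homebrew version, so check the brew info output before editing):

```shell
# Inspect the formula to see which JDK it pins as a dependency.
brew info apache-spark

# Open the formula (a Ruby file) in $EDITOR to adjust the Java dependency.
brew edit apache-spark

# Rebuild/relink using the edited formula.
brew reinstall apache-spark
```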
