
Spark-shell uses different version of Scala. Used homebrew to install both scala and apache-spark

I installed scala and apache-spark using Homebrew, which installed scala 2.12.4 and apache-spark 2.2.0. However, if you check spark-shell --version, it reports a different Scala version.

[Screenshot: version mismatch between scala and spark-shell]
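
The mismatch can also be confirmed from inside the REPL. A minimal check (the output below is illustrative of a stock Spark 2.2.0 build, not taken from the screenshot):

    // inside spark-shell: util.Properties reports the Scala library on Spark's
    // classpath, i.e. the version Spark was compiled with, not the Homebrew scala
    scala> util.Properties.versionString
    res0: String = version 2.11.8

    scala> spark.version
    res1: String = 2.2.0

Running scala -version in a separate terminal reports the Homebrew-installed 2.12.4, which is where the apparent conflict comes from.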

How do I set spark-shell to use the installed scala version? Is there a way to set the scala version used by apache-spark during the Homebrew installation?

TL;DR You cannot.

Two problems:

  • Spark (this is not really specific to Spark) will use the Scala version it was compiled with. The Scala version installed on the machine is not relevant at all.
  • Spark doesn't support Scala 2.12 yet, so recompiling against the installed Scala is not an option. The practical alternative is sketched after this list.
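
Since spark-shell's Scala cannot be changed, the workaround is to go the other way and match your own code to Spark's Scala. A minimal build.sbt sketch for an sbt project (the 2.11.12 patch version is an assumption; any 2.11.x release works with the _2.11 Spark artifacts):

    // build.sbt: pin the project to the Scala 2.11 line that Spark 2.2.0 ships with,
    // instead of the 2.12.4 installed by Homebrew
    scalaVersion := "2.11.12"

    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-core" % "2.2.0",  // %% appends the _2.11 suffix automatically
      "org.apache.spark" %% "spark-sql"  % "2.2.0"
    )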
