
Spark-shell uses a different version of Scala; Homebrew was used to install both scala and apache-spark

I installed scala and apache-spark using Homebrew, which gave me Scala 2.12.4 and Apache Spark 2.2.0. However, if you check spark-shell --version, it reports a different Scala version.


How do I set spark-shell to use the installed Scala version? Is there a way to choose the Scala version apache-spark uses during the Homebrew installation?
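For reference, one way to see the mismatch is to ask the running REPL for its own Scala version; this is a minimal sketch, and the example output is what a stock Spark 2.2.0 build would typically report:

    // Run inside spark-shell: reports the Scala version the REPL itself runs on.
    // This is the version Spark was compiled against, not the Homebrew-installed scala.
    scala.util.Properties.versionString
    // e.g. res0: String = version 2.11.8   (Spark 2.2.0 typically ships with Scala 2.11.x)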

TL;DR You cannot.

Two problems:

  • Spark (and this is not really specific to Spark) runs on the Scala version it was compiled with; the Scala compiler installed on the machine is irrelevant at runtime (see the sketch after this list).
  • Spark doesn't support Scala 2.12 yet, so recompiling it against your installed Scala is not an option.
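For the same reason, applications you build against this Spark must target the Scala version Spark itself was compiled with, not the one Homebrew installed. A minimal build.sbt sketch (the project name is hypothetical; the versions shown are what a stock Spark 2.2.0 distribution typically uses):

    // build.sbt -- pin the Scala version to the one Spark 2.2.0 was built with.
    name := "spark-app"
    scalaVersion := "2.11.8"  // must match Spark's Scala minor version (2.11.x), not Homebrew's 2.12.4
    // %% appends the Scala binary version (_2.11) to the artifact name,
    // so the dependency resolves against the matching Spark build.
    libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.2.0" % "provided"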
