Spark-shell uses a different version of Scala. Used Homebrew to install both scala and apache-spark
I installed scala and apache-spark using Homebrew, which installed scala 2.12.4 and apache-spark 2.2.0. However, if you check

spark-shell --version

it reports a different Scala version.
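You can also confirm this from inside the REPL. A minimal sketch (the exact version printed depends on your Spark build; Spark 2.2.0 typically ships the Scala 2.11 line):

// Paste into a running spark-shell session.
// scala.util.Properties is part of the Scala standard library.
println(scala.util.Properties.versionString)  // e.g. prints "version 2.11.8"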
How do I set spark-shell to use the installed Scala version? Is there a way to set the Scala version used by apache-spark during the Homebrew installation?
TL;DR You cannot.
Two problems:

1. spark-shell never uses a separately installed Scala. Spark ships with its own Scala compiler and library jars, so the version the shell reports is the one Spark was built with, regardless of what Homebrew installed.
2. Spark 2.2.0 is built against Scala 2.11 and does not support Scala 2.12, so you cannot point it at the Homebrew-installed 2.12.4.
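The same constraint applies to application code: pin your project's Scala version to the one your Spark was built with, not the Homebrew one. A minimal build.sbt sketch, assuming Spark 2.2.0 and its bundled Scala 2.11 line:

// build.sbt: match scalaVersion to the Scala that Spark 2.2.0 was built with
scalaVersion := "2.11.8"
// %% appends the Scala binary version suffix (_2.11) to the artifact name
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.2.0"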