
Where is Scala on a node with spark-shell installed?

I have Apache Spark installed on a cluster, and I can run spark-shell on the cluster's master node. That suggests Scala is installed on this machine. However, I can start neither sbt nor scalac. Is it possible to use the Scala that ships with Spark, and if so, how?

No, it is not. Spark bundles the Scala runtime libraries it needs (as jars inside its own distribution), but it does not ship the scalac compiler or sbt. You have to install those manually; see these links:

https://www.scala-lang.org/download/

https://www.scala-sbt.org/1.0/docs/Installing-sbt-on-Linux.html
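To see the distinction for yourself, a small shell sketch like the following can help. It checks for the Scala runtime jars that Spark bundles and for a standalone scalac on the PATH. The `SPARK_HOME` default of `/opt/spark` is an assumption; adjust it to your installation.

```shell
#!/bin/sh
# Sketch: distinguish "Scala runtime bundled with Spark" from
# "Scala toolchain installed system-wide".
# /opt/spark is an assumed default; set SPARK_HOME to your install.
SPARK_HOME="${SPARK_HOME:-/opt/spark}"

# 1. Spark's bundled Scala runtime lives as jars under $SPARK_HOME/jars.
if ls "$SPARK_HOME"/jars/scala-library-*.jar >/dev/null 2>&1; then
    echo "found Spark's bundled scala-library jar"
else
    echo "no scala-library jar under $SPARK_HOME/jars"
fi

# 2. The standalone toolchain (scalac, sbt) is a separate install.
if command -v scalac >/dev/null 2>&1; then
    echo "scalac is installed at $(command -v scalac)"
else
    echo "scalac is not installed - install it from scala-lang.org"
fi
```

Note that even if the bundled jars are present, they only give you the runtime classes spark-shell needs; compiling your own code still requires installing Scala or sbt from the links above.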
