
Where is scala on a node with spark-shell installed?

I have Apache Spark installed on a cluster. I can run spark-shell on the cluster's master node, so Scala must be present on that machine. However, I can start neither sbt nor scalac. Is it possible to get at the Scala that ships with Spark, and if so, how?
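(For context: spark-shell embeds its own Scala runtime and REPL via the jars bundled with Spark, which is why it runs without a standalone Scala installation. A quick way to see which Scala version Spark bundles, assuming SPARK_HOME points at the Spark installation; the version numbers shown are illustrative:)

    $ ls $SPARK_HOME/jars | grep scala-library
    scala-library-2.11.12.jar

    $ spark-submit --version
    ...
    Using Scala version 2.11.12, OpenJDK 64-Bit Server VM, 1.8.0_252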

No, it is not. You have to install them manually; see these links:

https://www.scala-lang.org/download/

https://www.scala-sbt.org/1.0/docs/Installing-sbt-on-Linux.html
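As a minimal sketch of a manual install on a Linux node (the 2.11.12 version here is illustrative; pick the release matching your Spark build, and note the download URL follows the tarball pattern used on the scala-lang.org download page):

    $ wget https://downloads.lightbend.com/scala/2.11.12/scala-2.11.12.tgz
    $ sudo tar xzf scala-2.11.12.tgz -C /opt
    $ export PATH=/opt/scala-2.11.12/bin:$PATH
    $ scalac -version
    Scala compiler version 2.11.12 -- Copyright 2002-2017, LAMP/EPFL

sbt is packaged separately; install it following the second link above.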
