
Run spark-shell from sbt

The default way of getting spark-shell seems to be to download the distribution from the website. Yet, this Spark issue mentions that it can be installed via sbt. I could not find documentation on this. In an sbt project that uses spark-sql and spark-core, no spark-shell binary was found.

How do you run spark-shell from sbt?
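For context, a minimal build.sbt for such a project might look like the sketch below. The versions are illustrative (chosen to match the 1.x-era SQLContext API used in the answer), not taken from the question:

// build.sbt -- illustrative sketch; adjust versions to your Scala/Spark setup
scalaVersion := "2.11.12"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.6.3",
  "org.apache.spark" %% "spark-sql"  % "1.6.3"
)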

From the following URL:

https://bzhangusc.wordpress.com/2015/11/20/use-sbt-console-as-spark-shell/

If you are already using sbt for your project, it's very simple to set up the sbt console to replace the spark-shell command. Let's start from the basic case. When you set up the project with sbt, you can simply run the console with sbt console.

Within the console, you just need to initialize a SparkContext and a SQLContext to make it behave like the Spark shell:

scala> val sc = new org.apache.spark.SparkContext("local[*]", "console")
scala> val sqlContext = new org.apache.spark.sql.SQLContext(sc)
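To avoid typing those two lines at the start of every session, sbt's initialCommands setting can run them automatically when the console starts. The following is a sketch (using sbt 0.13-style syntax, matching the era of the linked post), not part of the original answer:

// build.sbt -- sketch; these commands run automatically on "sbt console" startup
initialCommands in console := """
  val sc = new org.apache.spark.SparkContext("local[*]", "console")
  val sqlContext = new org.apache.spark.sql.SQLContext(sc)
"""

// stop the SparkContext cleanly when the console session ends
cleanupCommands in console := "sc.stop()"

With this in place, sbt console drops you into a REPL with sc and sqlContext already defined, which is essentially what spark-shell provides.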
