
Start Spark from a Scala shell

How can I start Spark from an sbt shell? I don't want to use the spark-shell command; I would like to use Spark together with the objects in my sbt project.

  • Add the Spark dependencies to build.sbt:
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.1.1"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.1.1"
  • Run the sbt console:

sbt console

  • Load the Spark session/context:
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._
val spark = SparkSession.builder().master("local").appName("spark-shell").getOrCreate()
import spark.implicits._
val sc = spark.sparkContext
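
To verify the session works, you can paste a quick smoke test into the console; the DataFrame contents below are arbitrary example data, not part of the original answer:

// Arbitrary example data, just to confirm the SparkSession is usable
val df = Seq(("alice", 1), ("bob", 2)).toDF("name", "id")
df.filter($"id" > 1).show() // prints a one-row table containing ("bob", 2)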

Or automate those commands with sbt's initialCommands setting, so they run every time the console starts:

initialCommands in console := s"""
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._
val spark = SparkSession.builder().master("local").appName("spark-shell").getOrCreate()
import spark.implicits._
val sc = spark.sparkContext
"""
