
Not found: value spark (SBT project)

Hi, I am trying to set up a small Spark application in SBT.

My build.sbt is:

import Dependencies._

name := "hello"

version := "1.0"

scalaVersion := "2.11.8"

val sparkVersion = "1.6.1"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion,
  "org.apache.spark" %% "spark-streaming" % sparkVersion,
  "org.apache.spark" %% "spark-streaming-twitter" % sparkVersion
)

libraryDependencies += scalaTest % Test

Everything works fine and SBT resolves all the dependencies, but when I try to use spark in my hello.scala project file I get this error: not found: value spark

My hello.scala file is:

package example
import org.apache.spark._
import org.apache.spark.SparkContext._

object Hello extends fileImport with App {
  println(greeting)
  anime.select("*").orderBy($"rating".desc).limit(10).show()
}

trait fileImport {
  lazy val greeting: String = "hello"
  var anime = spark.read.option("header", true).csv("C:/anime.csv")
  var ratings = spark.read.option("header", true).csv("C:/rating.csv")
}

Here is the error output I get:

[info] Compiling 1 Scala source to C:\Users\haftab\Downloads\sbt-0.13.16\sbt\alfutaim\target\scala-2.11\classes...
[error] C:\Users\haftab\Downloads\sbt-0.13.16\sbt\alfutaim\src\main\scala\example\Hello.scala:12: not found: value spark
[error]   var anime = spark.read.option("header", true).csv("C:/anime.csv")
[error]               ^
[error] C:\Users\haftab\Downloads\sbt-0.13.16\sbt\alfutaim\src\main\scala\example\Hello.scala:13: not found: value spark
[error]   var ratings = spark.read.option("header", true).csv("C:/rating.csv")
[error]                 ^
[error] two errors found
[error] (compile:compileIncremental) Compilation failed
[error] Total time: 3 s, completed Sep 10, 2017 1:44:47 PM

spark is initialized automatically only in spark-shell.

In your own code you need to initialize the spark variable yourself:

import org.apache.spark.sql.SparkSession
val spark = SparkSession.builder().appName("testings").master("local").getOrCreate

You can change the testings name to whatever you like. The .master option is optional if you want to run the code using spark-submit.
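
Putting this together, here is a minimal sketch of how Hello.scala could look with the session created explicitly. Note that SparkSession lives in the spark-sql module and was introduced in Spark 2.0, so the build.sbt above would also need "org.apache.spark" %% "spark-sql" at a 2.x version (it does not exist in 1.6.1). The app name, master, and CSV path are just the placeholders from the question and answer, and import spark.implicits._ is needed for the $"rating" column syntax:

package example

import org.apache.spark.sql.SparkSession

object Hello extends App {
  // Nothing creates `spark` for you outside spark-shell, so build it here.
  // "testings" and master("local") are the placeholders from the answer above.
  val spark = SparkSession.builder().appName("testings").master("local").getOrCreate()

  // Needed for the $"column" syntax used below.
  import spark.implicits._

  val anime = spark.read.option("header", "true").csv("C:/anime.csv")

  anime.select("*").orderBy($"rating".desc).limit(10).show()

  spark.stop()
}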
