Scala code not compiling in SBT - Eclipse Maven build
I am trying to compile a sample Spark Scala file through sbt, and have built a Maven project in the Eclipse IDE:
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf
object simpleSpark {
  def main(args: Array[String]): Unit = {
    // Windows path: backslashes must be escaped in a Scala string literal
    val logFile = "C:\\spark-1.6.1-bin-hadoop2.6\\spark-1.6.1-bin-hadoop2.6\\README.md"
    val conf = new SparkConf()
      .setAppName("Simple Application")
      .setMaster("local[2]")
      .set("spark.executor.memory", "1g")
    val sc = new SparkContext(conf)
    val logData = sc.textFile(logFile, 2).cache()
    val numHadoops = logData.filter(line => line.contains("Hadoop")).count()
    val numSparks = logData.filter(line => line.contains("Spark")).count()
    println("Lines with Hadoop: %s, Lines with Spark: %s".format(numHadoops, numSparks))
  }
}
The error says you have an illegal start of expression at set("spark.executor.memory",). Are you sure you set spark.executor.memory correctly in your actual code? If yes, can you show what you wrote in your .sbt file?
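Since the answer asks about the .sbt file: a minimal build.sbt for this program might look like the sketch below. The project name is hypothetical, and the Scala version is assumed from the fact that prebuilt Spark 1.6.x distributions target Scala 2.10 by default; adjust both to match your setup.

```scala
// build.sbt -- minimal sketch for a Spark 1.6.1 project (names/versions are assumptions)
name := "simple-spark"

version := "0.1"

// Spark 1.6.x prebuilt binaries are compiled against Scala 2.10 by default
scalaVersion := "2.10.6"

// %% appends the Scala binary version, resolving to spark-core_2.10
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.1"
```

With this in place, `sbt compile` should surface the real compiler error and line number if one remains in the source file.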