Spark - SQL : value implicits is not a member of org.apache.spark.sql.SQLContext
Hi, please find the code and corresponding errors below. Even though I have used the import statements, it still gives errors:
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql._
val sparkConf = new SparkConf().setAppName("new_proj")
implicit val sc = new SparkContext(sparkConf)
val sqlContext = new org.apache.spark.sql.SQLContext(sc)
import sqlContext._
import sqlContext.implicits._
val projects = sqlContext.read.json("/part-m-00000.json")
[error] /mapr/trans.scala:25: value implicits is not a member of org.apache.spark.sql.SQLContext
[error] import sqlContext.implicits._
[error]                   ^
[error] /mapr/ppm_trans.scala:28: value read is not a member of org.apache.spark.sql.SQLContext
[error] val projects = sqlContext.read.json("/mapr//part-m-00000.json")
I was able to compile the code by changing the following lines in build.sbt:
libraryDependencies ++= Seq(
"org.apache.spark" % "spark-core_2.10" % "1.4.0" % "provided",
"org.apache.spark" % "spark-sql_2.10" % "1.4.0",
"org.apache.spark" % "spark-sql_2.10" % "1.4.0",
"org.apache.spark" % "spark-mllib_2.10" % "1.4.0"
)
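This works because sqlContext.implicits was only added in Spark SQL 1.3 and sqlContext.read (DataFrameReader) in 1.4, so those compile errors usually mean an older spark-sql jar, or none at all, was on the compile classpath. With spark-sql_2.10 pinned to 1.4.0, a minimal end-to-end sketch looks like the following (the object name, app name, and JSON path are placeholders, not from the original post):

// Minimal sketch assuming spark-sql_2.10 1.4.0 is on the compile classpath.
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

object JsonReadExample {
  def main(args: Array[String]): Unit = {
    val sparkConf = new SparkConf().setAppName("new_proj")
    val sc = new SparkContext(sparkConf)
    val sqlContext = new SQLContext(sc)

    // Both of these resolve only against Spark SQL 1.3+/1.4+.
    import sqlContext.implicits._
    val projects = sqlContext.read.json("/part-m-00000.json")

    projects.printSchema()
    projects.show()
  }
}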