
Spark SQL: value implicits is not a member of org.apache.spark.sql.SQLContext

Hi, please find the code and the corresponding errors below. Even though I have used the import statements, I am still getting errors:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql._

val sparkConf = new SparkConf().setAppName("new_proj")
implicit val sc = new SparkContext(sparkConf)

val sqlContext = new org.apache.spark.sql.SQLContext(sc)
import sqlContext._
import sqlContext.implicits._

val projects = sqlContext.read.json("/part-m-00000.json")

[error] /mapr/trans.scala:25: value implicits is not a member of org.apache.spark.sql.SQLContext
[error] import sqlContext.implicits._
[error]                   ^
[error] /mapr/ppm_trans.scala:28: value read is not a member of org.apache.spark.sql.SQLContext
[error] val projects = sqlContext.read.json("/mapr//part-m-00000.json")

I was able to compile the code by changing the following lines in build.sbt:

libraryDependencies ++= Seq(
  "org.apache.spark"  % "spark-core_2.10"              % "1.4.0" % "provided",
  "org.apache.spark"  % "spark-sql_2.10"               % "1.4.0",
  "org.apache.spark"  % "spark-mllib_2.10"             % "1.4.0"
)
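
For context: SQLContext.implicits was only added in Spark 1.3.0 and SQLContext.read (the DataFrameReader entry point) in 1.4.0, so these two errors usually mean an older spark-sql artifact was on the compile classpath. Below is a minimal sketch of the whole program as it should compile against spark-sql_2.10 1.4.0; the object name NewProj, the printSchema() call, and sc.stop() are illustrative additions, while the JSON path is the one from the question.

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

object NewProj {
  def main(args: Array[String]): Unit = {
    val sparkConf = new SparkConf().setAppName("new_proj")
    val sc = new SparkContext(sparkConf)

    // sqlContext must be a val (a stable identifier) for the implicits import below
    val sqlContext = new SQLContext(sc)
    // implicits (Spark 1.3.0+) brings in .toDF() and Column conversions
    import sqlContext.implicits._

    // read (Spark 1.4.0+) returns a DataFrameReader; json() infers the schema
    val projects = sqlContext.read.json("/part-m-00000.json")
    projects.printSchema() // illustrative: print the inferred schema

    sc.stop()
  }
}

Note that spark-core can stay "provided" because spark-submit supplies it at runtime; if the job also runs on a 1.4.0 cluster, spark-sql and spark-mllib could be marked "provided" as well.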


