
44: error: value read is not a member of object org.apache.spark.sql.SQLContext

I am using Spark 1.6.1 and Scala 2.10.5. I am trying to read a CSV file through com.databricks. When launching the spark-shell, I use the lines below as well:

spark-shell --packages com.databricks:spark-csv_2.10:1.5.0 --driver-class-path path to/sqljdbc4.jar, and below is the whole code:

import java.util.Properties
import org.apache.spark.SparkContext
import org.apache.spark.SparkConf
import org.apache.spark.sql.SQLContext


val conf = new SparkConf().setAppName("test").setMaster("local").set("spark.driver.allowMultipleContexts", "true");
val sc = new SparkContext(conf)
val sqlContext = new SQLContext(sc)

import sqlContext.implicits._

val df = SQLContext.read().format("com.databricks.spark.csv").option("inferScheme","true").option("header","true").load("path_to/data.csv");

I am getting the error below:

error: value read is not a member of object org.apache.spark.sql.SQLContext, and the "^" in the error message points at "SQLContext.read().format".

I have tried the suggestions available on Stack Overflow as well as other sites, but nothing seems to work.

Writing SQLContext.read accesses the object (the companion), as if read were a static method, but read is not static.

You should use the sqlContext variable instead, because read is an instance method defined on the SQLContext class, not on its companion object.

So the code should be:

val df = sqlContext.read.format("com.databricks.spark.csv").option("inferSchema","true").option("header","true").load("path_to/data.csv")

Note also that the option name is inferSchema, not inferScheme, or the schema will not actually be inferred.
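The distinction can be sketched without Spark at all. In Scala, a method defined on a class exists only on instances of that class; the companion object has no such member, which is exactly what the compiler error reports. The names Ctx and Demo below are illustrative, not part of the Spark API:

```scala
// Minimal sketch (plain Scala, no Spark; Ctx stands in for SQLContext).
// `read` is defined on the class, so it is only reachable through an
// instance -- the companion object does not have a `read` member.
class Ctx {
  def read: String = "DataFrameReader" // stand-in for sqlContext.read
}
object Ctx // companion object: no `read` defined here

object Demo extends App {
  val ctx = new Ctx
  // Ctx.read        // would not compile: value read is not a member of object Ctx
  println(ctx.read)  // compiles: read is called on the instance
}
```

This mirrors the question: SQLContext.read() fails to compile, while sqlContext.read succeeds because sqlContext is an instance.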


 