44: error: value read is not a member of object org.apache.spark.sql.SQLContext
I am using Spark 1.6.1 and Scala 2.10.5. I am trying to read a CSV file through com.databricks. When launching the spark-shell, I use the line below as well:

spark-shell --packages com.databricks:spark-csv_2.10:1.5.0 --driver-class-path path to/sqljdbc4.jar

and below is the whole code:
import java.util.Properties
import org.apache.spark.SparkContext
import org.apache.spark.SparkConf
import org.apache.spark.sql.SQLContext
val conf = new SparkConf().setAppName("test").setMaster("local").set("spark.driver.allowMultipleContexts", "true");
val sc = new SparkContext(conf)
val sqlContext = new SQLContext(sc)
import sqlContext.implicits._
val df = SQLContext.read().format("com.databricks.spark.csv").option("inferScheme","true").option("header","true").load("path_to/data.csv");
I am getting the error below:

error: value read is not a member of object org.apache.spark.sql.SQLContext

with the "^" in the error message pointing at "SQLContext.read().format".
I did try the suggestions available on Stack Overflow, as well as on other sites, but nothing seems to be working.
SQLContext (capitalized) refers to the companion object, which only exposes static-style methods. read is an instance method defined on the class, not on the object, so you should call it on your sqlContext variable instead. The code should be:
val df = sqlContext.read.format("com.databricks.spark.csv").option("inferSchema","true").option("header","true").load("path_to/data.csv")

(Note also that the option name is inferSchema, not inferScheme as in your original code.)
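The object-vs-instance distinction behind the error can be illustrated with a minimal, self-contained sketch (a hypothetical Ctx type, not Spark's actual API): calling read on an instance compiles, while calling it on the companion object fails with exactly this kind of "value read is not a member of object" error.

```scala
// Hypothetical stand-in for SQLContext: `read` lives on the class (instances),
// while the companion object only holds factory-style helpers.
class Ctx {
  def read: String = "DataFrameReader" // instance member, like SQLContext#read
}

object Ctx {
  def getOrCreate(): Ctx = new Ctx // companion object: no `read` here
}

object Demo extends App {
  val ctx = Ctx.getOrCreate() // an instance, like `val sqlContext = new SQLContext(sc)`
  println(ctx.read)           // compiles: `read` is an instance method

  // println(Ctx.read)        // would NOT compile:
  //                          // "value read is not a member of object Ctx"
}
```

The same applies to Spark: new SQLContext(sc) gives you an instance whose read method returns a DataFrameReader, whereas the SQLContext companion object has no such member.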