value toDF is not a member of org.apache.spark.rdd.RDD
import org.apache.spark.{SparkConf, SparkContext}

def main(args: Array[String]) {
  val conf = new SparkConf().setMaster("local").setAppName("test")
  val sc = new SparkContext(conf)
  // set up the Spark SQL environment
  val sqlContext = new org.apache.spark.sql.SQLContext(sc)
  import sqlContext.implicits._
  val df1 = sc.makeRDD(1 to 5).map(i => (i, i * 2)).toDF("single", "double")
  sc.stop()
}
I have written "import sqlContext.implicits._", but it still does not work. The same code is fine in spark-shell. Why does it fail in this situation? I have seen many other ways to convert an RDD to a DataFrame, but most of my code has already been written with toDF(). How can I make toDF work?
The error:
Error:(25, 55) value toDF is not a member of org.apache.spark.rdd.RDD[(Int, Int)]
val df1 = sc.makeRDD(1 to 5).map(i => (i, i * 2)).toDF("single", "double")
^
toDF() was added in Spark 1.3. You must be using an older version of Spark (earlier than 1.3), which is why you are getting this error.
To resolve this, use Spark version 1.3 or above.
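When upgrading, also make sure the build pulls in the spark-sql artifact, since the implicit conversions that provide toDF live there rather than in spark-core. A minimal sketch of the dependency section (the version numbers below are only examples; use ones that match your cluster and are 1.3 or later):

```scala
// build.sbt -- example versions, adjust to your Spark release (must be >= 1.3)
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.6.3",
  "org.apache.spark" %% "spark-sql"  % "1.6.3"
)
```

With both artifacts on the classpath and `import sqlContext.implicits._` in scope, the original toDF call should compile unchanged.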