
value toDF is not a member of org.apache.spark.rdd.RDD

import org.apache.spark.{SparkConf, SparkContext}

def main(args: Array[String]) {
    val conf = new SparkConf().setMaster("local").setAppName("test")
    val sc = new SparkContext(conf)
    // requires the Spark SQL environment
    val sqlContext = new org.apache.spark.sql.SQLContext(sc)
    import sqlContext.implicits._
    val df1 = sc.makeRDD(1 to 5).map(i => (i, i * 2)).toDF("single", "double")
    sc.stop()
  }

I have written "import sqlContext.implicits._", but it still does not work. The same code works in spark-shell. Why does it not work in this situation? I have seen several other ways to convert an RDD to a DataFrame, but most of my code is already written with toDF(). What do I need to do to make toDF work? The error:

Error:(25, 55) value toDF is not a member of org.apache.spark.rdd.RDD[(Int, Int)]
val df1 = sc.makeRDD(1 to 5).map(i => (i, i * 2)).toDF("single", "double")
                                                  ^

toDF() was added in Spark 1.3, so you must be using an older version (earlier than 1.3); that is why you are getting this error.
To resolve this, use Spark version 1.3 or above.
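If you build with sbt, upgrading means bumping the Spark dependencies. A minimal build.sbt sketch, assuming an sbt project; the version numbers are illustrative, use whichever 1.3+ release matches your cluster:

// build.sbt -- illustrative versions; any Spark release >= 1.3 provides toDF
scalaVersion := "2.10.4"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.3.1",
  // spark-sql supplies SQLContext and the implicits that add toDF to RDDs
  "org.apache.spark" %% "spark-sql"  % "1.3.1"
)

You can also confirm which version the application actually runs against by printing sc.version right after creating the SparkContext.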
