
Not able to register RDD as TempTable

I am using IntelliJ and trying to read data from a MySQL DB and then write it into a Hive table. However, I am not able to register my RDD as a temp table. The error is "Cannot Resolve Symbol registerTempTable".

I believe this issue is due to a missing import, but I am not able to find out which one.

I have been stuck on this issue for quite a long time and have tried all the options/answers available on Stack Overflow.

Below is my code:

import java.sql.{Connection, DriverManager, ResultSet}
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.rdd.JdbcRDD
import org.apache.spark.sql.hive.HiveContext


object JdbcRddExample {

  def main(args: Array[String]): Unit = {
    val url = "jdbc:mysql://localhost:3306/retail_db"
    val username = "retail_dba"
    val password = "cloudera"

    val conf = new SparkConf().setAppName("JDBC RDD").setMaster("local[2]").set("spark.executor.memory", "1g")
    val sc = new SparkContext(conf)

    val sqlContext = new org.apache.spark.sql.SQLContext(sc)
    val hiveContext = new HiveContext(sc)
    import hiveContext.implicits._

    Class.forName("com.mysql.jdbc.Driver").newInstance

    val myRDD = new JdbcRDD( sc, () => DriverManager.getConnection(url, username, password),
      "select department_id,department_name from departments limit ?,?",
      0, 999999999, 1, r => r.getString("department_id") + ", " + r.getString("department_name"))

    myRDD.registerTempTable("My_Table") // error: Cannot Resolve Symbol registerTempTable

    sqlContext.sql("use my_db")
    sqlContext.sql("Create table my_db.depts (department_id INT, department_name String)")
  }
}

My build.sbt (I believe I have included all the required artifacts):

name := "JdbcRddExample"

version := "0.1"

scalaVersion := "2.11.12"

// https://mvnrepository.com/artifact/org.apache.spark/spark-core
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.3.1"

// https://mvnrepository.com/artifact/org.apache.spark/spark-streaming
libraryDependencies += "org.apache.spark" %% "spark-streaming" % "2.3.1" % "provided"

libraryDependencies += "org.apache.spark" %% "spark-streaming" % "2.3.1"

// https://mvnrepository.com/artifact/org.apache.spark/spark-hive
libraryDependencies += "org.apache.spark" %% "spark-hive" % "2.3.1" % "provided"

// https://mvnrepository.com/artifact/org.apache.spark/spark-streaming
libraryDependencies += "org.apache.spark" %% "spark-streaming" % "2.3.1" % "provided"

// https://mvnrepository.com/artifact/com.typesafe.scala-logging/scala-logging
libraryDependencies += "com.typesafe.scala-logging" %% "scala-logging" % "3.7.1"

// https://mvnrepository.com/artifact/org.apache.spark/spark-core
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.3.1"

libraryDependencies += "org.apache.logging.log4j" % "log4j-api" % "2.11.0"
libraryDependencies += "org.apache.logging.log4j" % "log4j-core" % "2.11.0"

// https://mvnrepository.com/artifact/org.apache.spark/spark-sql
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.3.1"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.3.1",
  "org.apache.spark" %% "spark-sql" % "2.3.1",
  "org.apache.spark" %% "spark-mllib" % "2.3.1",
  "mysql" % "mysql-connector-java" % "5.1.12"
)
// https://mvnrepository.com/artifact/org.apache.spark/spark-hive
libraryDependencies += "org.apache.spark" %% "spark-hive" % "2.3.1" % "provided"

// https://mvnrepository.com/artifact/org.apache.spark/spark-sql
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.3.1"

Please point me to the exact imports that I am missing, or suggest an alternate way. As I mentioned, I have tried all the solutions I could find and nothing has worked so far.

To use Spark SQL you need a DataFrame rather than an RDD: an RDD simply has no registerTempTable method (and in Spark 2.x even the DataFrame version was deprecated in favour of createOrReplaceTempView). No import will fix this. One quick workaround is to convert the RDD to a DataFrame, as in "How to convert rdd object to dataframe in spark"; a sketch of that follows.
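
A minimal sketch of that conversion, assuming Spark 2.x and using a hypothetical Department case class (not in your original code) so that toDF() can derive column names:

import java.sql.DriverManager
import org.apache.spark.rdd.JdbcRDD
import org.apache.spark.sql.SparkSession

// Hypothetical case class for the two selected columns; define it at the
// top level (not inside main), otherwise Spark cannot derive an encoder
case class Department(departmentId: String, departmentName: String)

object RddToDataFrameSketch {

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("JDBC RDD")
      .master("local[2]")
      .getOrCreate()
    import spark.implicits._ // enables .toDF() on RDDs of case classes

    val url = "jdbc:mysql://localhost:3306/retail_db"
    val username = "retail_dba"
    val password = "cloudera"

    val deptRDD = new JdbcRDD(spark.sparkContext,
      () => DriverManager.getConnection(url, username, password),
      "select department_id, department_name from departments limit ?,?",
      0, 999999999, 1,
      r => Department(r.getString("department_id"), r.getString("department_name")))

    val deptDF = deptRDD.toDF()                // RDD -> DataFrame
    deptDF.createOrReplaceTempView("My_Table") // Spark 2.x equivalent of registerTempTable
  }
}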

That said, it's recommended to skip the JdbcRDD entirely and read the JDBC data source through Spark SQL, as shown in the JDBC examples in the Spark SQL documentation. Sample code:

val dfDepartments = sqlContext.read.format("jdbc")
  .option("url", url)                        // jdbc:mysql://localhost:3306/retail_db
  .option("driver", "com.mysql.jdbc.Driver")
  .option("dbtable", "(select department_id, department_name from departments) t") // subquery runs on MySQL
  .option("user", username)
  .option("password", password)
  .load()                                    // returns a DataFrame
dfDepartments.createOrReplaceTempView("My_Table") // Spark 2.x replacement for registerTempTable
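
Since your original goal was to land the data in Hive: once the temp view is registered you can query it with SQL and write the DataFrame straight into a Hive table. A hedged sketch, assuming Hive support is enabled on your session (SparkSession.builder().enableHiveSupport() in Spark 2.x); the my_db.depts table name is taken from your question:

// Query the registered temp view
sqlContext.sql("select department_id, department_name from My_Table").show(5)

// Write the DataFrame into Hive; "overwrite" replaces existing data,
// use "append" to add to it instead
dfDepartments.write
  .mode("overwrite")
  .saveAsTable("my_db.depts")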
