
Insert records into a Hive table from Spark

I am trying to insert records into a Hive table from Spark (Scala). The values I am trying to insert are structured data that come as a case class.

Here is what I have:

case class rcd(
  request: Int,
  extract_id: Int)

Then:

DF.as[rcd].take(DF.count.toInt).foreach(e => { // DF is a DataFrame that contains the data

  if (<condition>) {
    ss.sql(s"""INSERT INTO mytable select $e.request ,'$e.extract_id'""")
  }
})

I am getting an error:

Undefined function: 'rcd'. This function is neither a registered temporary function nor a permanent function registered in the database 'default'.;

Also, what is the 'default' that Spark mentions in the error message above?

Any comments will be appreciated.

Did you register your UDF?

If not, please register it with the SQLContext's UDF registry:

sqlContext.udf.register("RCD", rcd)
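
For context, here is a minimal sketch of how UDF registration usually looks in Spark 2.x, assuming a SparkSession named ss with Hive support; only the name "RCD" comes from the answer above, and the function body and the usage line are hypothetical examples:

import org.apache.spark.sql.SparkSession

// Minimal sketch, assuming Spark 2.x with Hive support enabled.
val ss = SparkSession.builder()
  .appName("udf-register-example")
  .enableHiveSupport()
  .getOrCreate()

// Register a plain Scala function under the name "RCD".
// The body here is a hypothetical example that just formats the two fields.
ss.udf.register("RCD", (request: Int, extractId: Int) => s"$request-$extractId")

// The registered name can then be used inside SQL text, for example:
ss.sql("SELECT RCD(1, 2) AS formatted").show()

Note that udf.register only makes a Scala function callable by name from SQL text; it is a separate step from writing rows into a Hive table.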
