
Does Spark Support the WITH Clause?

I have a table employee_1 in Spark with attributes id and name (with data), and another table employee_2 with the same attributes. I want to load the data into employee_2 while increasing the id values by 1.

My WITH clause is shown below:

WITH EXP AS (SELECT  ALIASNAME.ID+1 ID, ALIASNAME.NAME NAME FROM employee_1 ALIASNAME)
INSERT INTO TABLE employee_2 SELECT * FROM EXP; 

Steps of execution:

I have a file (with data) at an HDFS location.

  1. Create an RDD based on the HDFS location.
  2. Load the RDD into a Hive temp table.
  3. Load from the temp table into the Hive target (employee_2), as sketched below.
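
A minimal sketch of such a pipeline, assuming a Spark 1.6 shell where sc is the SparkContext and sqlContext is a HiveContext; the HDFS path, delimiter, and temp table name are illustrative:

// Assumes a Spark 1.6 shell: sc = SparkContext, sqlContext = HiveContext
import sqlContext.implicits._

// 1. RDD from an HDFS location (path and comma delimiter are assumptions)
val rdd = sc.textFile("hdfs:///data/employee_1.csv")
  .map(_.split(","))
  .map(a => (a(0).toInt, a(1)))

// 2. RDD -> DataFrame -> Hive temp table
val tempDf = rdd.toDF("id", "name")
tempDf.registerTempTable("employee_tmp")

// 3. Temp table -> Hive target via the SQL INSERT (the step that does not load data here)
sqlContext.sql(
  "WITH EXP AS (SELECT id + 1 AS id, name FROM employee_tmp) " +
  "INSERT INTO TABLE employee_2 SELECT * FROM EXP")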

When I run it with a test program from the backend, it succeeds, but the data is not loaded: employee_2 is empty.

Note: if you run the above WITH clause in Hive, it succeeds and the data loads, but in Spark 1.6 it does not.

The WITH statement is not the problem; it's the INSERT INTO statement that's causing trouble.

Here's a working example that uses the .insertInto() style instead of the "INSERT INTO" SQL:

// Assumes a Spark 1.6 shell with a HiveContext available as sqlContext
import sqlContext.implicits._

val s = Seq((1, "foo"), (2, "bar"))
val df = s.toDF("id", "name")
df.registerTempTable("df")
sqlContext.sql("CREATE TABLE edf_final (id INT, name STRING)")

// Evaluate the WITH clause as a plain SELECT, then insert the result via the API
val e = sqlContext.sql("WITH edf AS (SELECT id + 1 AS id, name FROM df) SELECT * FROM edf")
e.write.insertInto("edf_final")

Another option is to use the df.write.mode("append").saveAsTable("edf_final") style.
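
A minimal sketch of that alternative, reusing the DataFrame e and the edf_final table from the example above:

// Append the rows of e to edf_final through the DataFrameWriter API
e.write.mode("append").saveAsTable("edf_final")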

Relevant SO: "INSERT INTO ..." with SparkSQL HiveContext
