
Hive MERGE command is not working in Spark HiveContext

I am running a Hive MERGE command through Spark HiveContext on Spark 1.6.3, but it fails with the error below.
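For reference, this is a minimal sketch of how such a statement is typically submitted through HiveContext (the SparkConf/HiveContext setup shown here is an assumption, not taken from the question; the MERGE text itself is the one in the driver log):

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.hive.HiveContext

    val conf = new SparkConf().setAppName("HiveMergeAttempt")
    val sc = new SparkContext(conf)
    val hiveContext = new HiveContext(sc)

    // The MERGE statement from the log below, handed straight to Spark's Hive parser
    hiveContext.sql(
      """MERGE INTO emp_with_orc AS T USING SOURCE_TABLE AS S
        |ON T.id = S.id
        |WHEN MATCHED AND (S.operation = 1) THEN UPDATE SET a = S.a, b = S.b
        |WHEN MATCHED AND (S.operation = 2) THEN DELETE
        |WHEN NOT MATCHED THEN INSERT VALUES (S.id, S.a, S.b)""".stripMargin)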

2017-09-11 18:30:33 Driver [INFO ] ParseDriver - Parse Completed
2017-09-11 18:30:34 Driver [INFO ] ParseDriver - Parsing command: MERGE INTO emp_with_orc AS T USING SOURCE_TABLE AS S 
ON T.id = S.id 
WHEN MATCHED AND (S.operation = 1) THEN UPDATE SET a = S.a,b = S.b 
WHEN MATCHED AND (S.operation = 2) THEN DELETE 
WHEN NOT MATCHED THEN INSERT VALUES (S.id, S.a, S.b)
2017-09-11 18:30:34 Driver [ERROR] HiveWriter - Error while executing the merge query.
org.apache.spark.sql.AnalysisException: cannot recognize input near 'MERGE' 'INTO' 'emp_with_orc'; line 1 pos 0
    at org.apache.spark.sql.hive.HiveQl$.createPlan(HiveQl.scala:318)
    at org.apache.spark.sql.hive.ExtendedHiveQlParser$$anonfun$hiveQl$1.apply(ExtendedHiveQlParser.scala:41)
    at org.apache.spark.sql.hive.ExtendedHiveQlParser$$anonfun$hiveQl$1.apply(ExtendedHiveQlParser.scala:40)
    at scala.util.parsing.combinator.Parsers$Success.map(Parsers.scala:136)
    at scala.util.parsing.combinator.Parsers$Success.map(Parsers.scala:135)
    at scala.util.parsing.combinator.Parsers$Parser$$anonfun$map$1.apply(Parsers.scala:242)
    at scala.util.parsing.combinator.Parsers$Parser$$anonfun$map$1.apply(Parsers.scala:242)
    at scala.util.parsing.combinator.Parsers$$anon$3.apply(Parsers.scala:222)
    at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1$$anonfun$apply$2.apply(Parsers.scala:254)
    at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1$$anonfun$apply$2.apply(Parsers.scala:254)
    at scala.util.parsing.combinator.Parsers$Failure.append(Parsers.scala:202)
    at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1.apply(Parsers.scala:254)
    at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1.apply(Parsers.scala:254)
    at scala.util.parsing.combinator.Parsers$$anon$3.apply(Parsers.scala:222)

I am not sure whether the MERGE command for ACID transactions is supported by Spark's HiveContext or not.

Any help would be appreciated.

To use the MERGE operation, you will need to execute it through Hive JDBC, because Spark SQL does not support MERGE so far.
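A minimal sketch of running the same MERGE against HiveServer2 over JDBC (the connection URL, credentials, and host are placeholders for your cluster; Hive-side MERGE also assumes the target table is a transactional/ACID table on a Hive version that supports MERGE):

    import java.sql.DriverManager

    // Standard HiveServer2 JDBC driver class
    Class.forName("org.apache.hive.jdbc.HiveDriver")

    // Placeholder connection details - replace with your HiveServer2 host, user and password
    val conn = DriverManager.getConnection(
      "jdbc:hive2://hive-server:10000/default", "user", "password")
    try {
      val stmt = conn.createStatement()
      // The MERGE from the question, executed by Hive itself instead of Spark SQL
      stmt.execute(
        """MERGE INTO emp_with_orc AS T USING SOURCE_TABLE AS S
          |ON T.id = S.id
          |WHEN MATCHED AND (S.operation = 1) THEN UPDATE SET a = S.a, b = S.b
          |WHEN MATCHED AND (S.operation = 2) THEN DELETE
          |WHEN NOT MATCHED THEN INSERT VALUES (S.id, S.a, S.b)""".stripMargin)
      stmt.close()
    } finally {
      conn.close()
    }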

Spark does not support UPDATE or DELETE statements, so the exception is expected behavior.
