
Hive create partitioned table based on Spark temporary table

I have a Spark temporary table spark_tmp_view with a DATE_KEY column. I am trying to create a Hive table from it (without first writing the temp table out to a Parquet location). What I tried to run is spark.sql("CREATE EXTERNAL TABLE IF NOT EXISTS mydb.result AS SELECT * FROM spark_tmp_view PARTITIONED BY(DATE_KEY DATE)")

The error I got is mismatched input 'BY' expecting <EOF>. I tried searching but still haven't been able to figure out how to do this from a Spark app, or how to insert data afterwards. Could someone please help? Many thanks.

PARTITIONED BY is part of the definition of the table being created, so it must come before ...AS SELECT..., see the Spark SQL syntax.
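A corrected statement could look like the following sketch. It assumes a datasource table (USING parquet) rather than a Hive-format EXTERNAL table, since Spark requires a LOCATION for EXTERNAL tables and Hive-format CTAS does not accept PARTITIONED BY; in a CTAS the partition column is listed by name only, with its type taken from the query:

```sql
-- Sketch: PARTITIONED BY moves before AS SELECT.
-- The partition column is named without a type; it is inferred from the query.
CREATE TABLE IF NOT EXISTS mydb.result
USING parquet
PARTITIONED BY (DATE_KEY)
AS SELECT * FROM spark_tmp_view;

-- Later loads can then target the same table, e.g.:
INSERT INTO mydb.result
SELECT * FROM spark_tmp_view;
```

With dynamic partitioning enabled (the default for datasource tables), the INSERT routes each row to the partition matching its DATE_KEY value.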

