
How to add a new column to the partition by clause in a Hive external table

I have an external Hive table which is filled by a Spark job and partitioned by (event_date date). I have now modified the Spark code and added one extra column, 'country'. In the data written earlier, the country column will have null values since it is newly added. Now I want to alter the PARTITIONED BY clause to partitioned by (event_date date, country string). How can I achieve this? Thank you!
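For reference, a rough sketch of the current and desired table definitions (the table name, non-partition columns, storage format, and location below are assumptions for illustration, not the actual DDL):

    -- Current definition (hypothetical names and location)
    CREATE EXTERNAL TABLE events (
      event_id STRING,
      payload  STRING
    )
    PARTITIONED BY (event_date DATE)
    STORED AS PARQUET
    LOCATION '/data/events';

    -- Desired partitioning after the change:
    -- PARTITIONED BY (event_date DATE, country STRING)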

Please try to alter the partition using the below command:

ALTER TABLE table_name PARTITION part_spec SET LOCATION path

part_spec:
    : (part_col_name1=val1, part_col_name2=val2, ...)
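For example, a minimal sketch assuming a table named events and a partition spec with both event_date and country (the table name, partition values, and path are placeholders):

    ALTER TABLE events PARTITION (event_date='2020-01-01', country='US')
    SET LOCATION '/data/events/event_date=2020-01-01/country=US';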

See the Databricks Spark SQL language manual for the ALTER TABLE command.
