
Spark/Hive: get the file location of an external table partition

I'm trying to get the file location of an external table's partition, computed at run time. Simply running `ALTER TABLE ... DROP PARTITION` wouldn't work, since the table is external and the underlying files are not deleted. The closest I can get is

 spark.sql(s"describe $tableName partition ($partitionBy=$partitionValue)")
But this fails when partitionValue is of type timestamp and is converted with toString before being passed to the statement above. Is there a way to reuse the same logic that spark.write.saveAsTable() uses to build the file path? Or is there another way to get the location of a partition's data at runtime?
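One workaround is to query `DESCRIBE FORMATTED`, which for a partition includes a `Location` row, and to quote the partition value so that a timestamp's string form is parsed as a literal. A minimal sketch, assuming a Hive-backed SparkSession; the table and column names here (`sales`, `event_ts`) are placeholders, not from the original question:

```scala
import org.apache.spark.sql.functions.col

// Hypothetical example values -- substitute your own.
val tableName = "sales"
val partitionBy = "event_ts"
val partitionValue = java.sql.Timestamp.valueOf("2021-01-01 00:00:00.0")

// Quoting the interpolated value lets Hive/Spark parse the
// timestamp literal instead of failing on the bare toString form.
val partitionLocation = spark
  .sql(s"DESCRIBE FORMATTED $tableName PARTITION ($partitionBy = '$partitionValue')")
  .filter(col("col_name") === "Location")   // the row holding the partition path
  .select("data_type")                      // DESCRIBE puts the value in this column
  .head()
  .getString(0)
```

This depends on the layout of `DESCRIBE FORMATTED` output (columns `col_name`, `data_type`, `comment`), which has been stable across recent Spark versions but is not a contractual API.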

Try something like:

show table extended FROM $tableSchema like '$tableName' partition ($partitionBy=$partitionValue)

See more here: https://spark.apache.org/docs/3.0.0/sql-ref-syntax-aux-show-table.html
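The `SHOW TABLE EXTENDED` output packs the partition metadata, including the path, into a single multi-line `information` column, so you still need to pick out the `Location:` line. A sketch of that extraction, again with placeholder names (`tableSchema`, `tableName`, etc.) and a quoted partition value:

```scala
// Assumes a Hive-backed SparkSession; all identifiers are placeholders.
val info = spark
  .sql(s"SHOW TABLE EXTENDED FROM $tableSchema LIKE '$tableName' " +
       s"PARTITION ($partitionBy = '$partitionValue')")
  .select("information")   // one multi-line string per matched partition
  .head()
  .getString(0)

// Find the "Location: ..." line and strip the label.
val partitionLocation: Option[String] = info.linesIterator
  .find(_.startsWith("Location:"))
  .map(_.stripPrefix("Location:").trim)
```

Like the `DESCRIBE FORMATTED` approach, this parses human-readable command output; for a programmatic alternative you could also go through the catalog (`spark.sessionState.catalog.getPartition(...)`), though that relies on an internal API.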
