I'm trying to determine the file location of an external table's partition at run time. A simple `ALTER TABLE ... DROP PARTITION` won't remove the data, since the table is external. The closest I can get is
spark.sql(s"describe $tableName partition ($partitionBy=$partitionValue)")
Try something like:

spark.sql(s"show table extended from $tableSchema like '$tableName' partition ($partitionBy=$partitionValue)")
See more here: https://spark.apache.org/docs/3.0.0/sql-ref-syntax-aux-show-table.html
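Putting it together, here is a minimal sketch of extracting the partition path from the `information` column that `SHOW TABLE EXTENDED` returns. It assumes a running Spark session with Hive support; the database, table, and partition names (`mydb`, `events`, `dt`) are hypothetical placeholders, and the parsing of the "Location:" line assumes the usual key: value layout of that column.

```scala
import org.apache.spark.sql.SparkSession

object PartitionLocation {
  def main(args: Array[String]): Unit = {
    // Assumes Hive support so external-table metadata is available.
    val spark = SparkSession.builder().enableHiveSupport().getOrCreate()

    val tableSchema = "mydb"            // hypothetical database name
    val tableName = "events"            // hypothetical table name
    val partitionBy = "dt"              // hypothetical partition column
    val partitionValue = "'2024-01-01'" // hypothetical partition value

    // SHOW TABLE EXTENDED yields a multi-line `information` string;
    // the partition's path appears on its "Location:" line.
    val info = spark
      .sql(s"show table extended from $tableSchema like '$tableName' partition ($partitionBy=$partitionValue)")
      .select("information")
      .head()
      .getString(0)

    val location = info
      .split("\n")
      .find(_.startsWith("Location:"))
      .map(_.stripPrefix("Location:").trim)

    println(location.getOrElse("partition location not found"))
  }
}
```

With the path in hand you can delete or move the underlying files yourself, which is what dropping a partition of an external table does not do for you.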