spark hive get external partition file location
I'm trying to get the file location of an external table partition computed at run time. Simply dropping the partition with ALTER TABLE wouldn't work, since the table is external. The closest I can get is
spark.sql(s"describe $tableName partition ($partitionBy=$partitionValue)")
Try something like:
show table extended FROM $tableSchema like $tableName partition ($partitionBy=$partitionValue)
See more here: https://spark.apache.org/docs/3.0.0/sql-ref-syntax-aux-show-table.html
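If that query is run through `spark.sql`, the partition path can then be pulled out of the `information` column it returns. A minimal sketch, assuming the column contains the usual `Location: <path>` line (the key name and the placeholder table/partition names are assumptions, not taken from the question):

```scala
// Hedged sketch: with a SparkSession in scope, the raw metadata text would
// come from something like (tableSchema, tableName, partitionBy and
// partitionValue are placeholders):
//
//   val info = spark.sql(
//     s"SHOW TABLE EXTENDED FROM $tableSchema LIKE '$tableName' " +
//     s"PARTITION ($partitionBy='$partitionValue')"
//   ).select("information").first().getString(0)
//
// The information string is a block of "Key: value" lines; this helper
// extracts the value of the "Location:" line, if present.
def extractLocation(information: String): Option[String] =
  information
    .split("\n")
    .map(_.trim)
    .collectFirst {
      case line if line.startsWith("Location:") =>
        line.stripPrefix("Location:").trim
    }
```

Parsing the text column is brittle compared to a typed API, but `SHOW TABLE EXTENDED` is the documented way to surface per-partition metadata in SQL, so a small extractor like this is a common workaround.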