Hive metastore location
Spark 1.3.1 supports Hive SQL. When I type show tables, it returns ABCD - 4 tables. My question is: where does Spark store these 4 tables? Is it the hive.metastore location? I have deleted the hive.metastore location, but show tables still shows ABCD.
I saw this in the documentation: "Users who do not have an existing Hive deployment can still create a HiveContext. When not configured by the hive-site.xml, the context automatically creates metastore_db and warehouse in the current directory."
So if you do not have any custom configuration, it should be in your working directory, under warehouse. You can change it like this:
val sparkConf = new SparkConf()
val sparkContext = new SparkContext(sparkConf)
val hiveContext = new org.apache.spark.sql.hive.HiveContext(sparkContext)
// $YOUR_LOCATION is a placeholder for the warehouse path string
hiveContext.setConf("hive.metastore.warehouse.dir", $YOUR_LOCATION)
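Equivalently (not part of the original answer, and the path below is only an example value), the same property can be set once in a hive-site.xml file placed on Spark's classpath (e.g. in the conf/ directory), so every HiveContext picks it up:

```xml
<!-- hive-site.xml: sets the Hive warehouse directory for Spark SQL.
     The value is an example; adjust to your environment. -->
<configuration>
  <property>
    <name>hive.metastore.warehouse.dir</name>
    <value>/user/hive/warehouse</value>
  </property>
</configuration>
```

When this file is present, Spark uses the configured location instead of creating metastore_db and warehouse in the current directory.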
I also saw this in the doc: "When working with Hive one must construct a HiveContext, which inherits from SQLContext, and adds support for finding tables in the MetaStore".
So I don't really understand how you could still see your tables after deleting your metastore; maybe they were cached somewhere in memory.
Edit:

What I understood of this: