
Creating Hive tables in an S3 bucket using Databricks

I want to set a global storage location up front for all the tables I create. For example:

CREATE TABLE table_name STORED AS PARQUET LOCATION 's3_bucket/db_name.db'

Right now I am repeating this in every table I create.

I am looking for a setting like

USE DATABASE 's3_database_address'

so that I can eliminate the repetitive commands.

Every managed table in Hive is created in the warehouse directory by default. You don't need to give a location unless you want the tables to point somewhere other than the default warehouse location. Read this: http://www.devgrok.com/2018/12/using-s3-hive-metastore-with-emr.html?m=1
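For illustration, a managed table created without any LOCATION clause simply lands under whatever warehouse directory the metastore is configured with (the table and column names below are made up):

-- Assumed example names; with no LOCATION clause the data is stored
-- under the default warehouse directory
CREATE TABLE sales (id INT, amount DOUBLE)
STORED AS PARQUET;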

CREATE DATABASE supports a LOCATION argument. If you then run USE database_name, new tables will be created under the database's custom location instead of the default one.
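A minimal sketch of that approach, assuming the cluster can already read and write the bucket (the bucket path, database name, and table name are placeholders):

-- Create the database once, anchored to the S3 bucket (placeholder path)
CREATE DATABASE IF NOT EXISTS db_name
LOCATION 's3://s3_bucket/db_name.db';

-- Switch to it; tables created afterwards inherit its location
USE db_name;

-- No LOCATION clause needed any more
CREATE TABLE table_name (id INT, name STRING)
STORED AS PARQUET;

With this in place, every table created while db_name is the current database is stored under the database's S3 location without repeating the LOCATION clause on each CREATE TABLE.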
