
Spark Scala: creating an external Hive table not working with location as a variable

I am trying to create a Hive external table from a Spark application, passing the location to the SQL command as a variable. The Hive table is not created and I don't see any errors.

 val location = "/home/data"
 hiveContext.sql(
   s"""CREATE EXTERNAL TABLE IF NOT EXISTS TestTable (id STRING, name STRING)
      |PARTITIONED BY (city STRING)
      |STORED AS PARQUET
      |LOCATION '${location}'""".stripMargin)

Spark's own table-creation API only supports managed tables, and even then there are severe restrictions: it does not support dynamically partitioned tables.

TL;DR: you cannot create external tables through Spark; Spark can only read them.
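
For contrast, a minimal sketch of the managed-table path this answer is describing, assuming an existing DataFrame df (the table name is illustrative; Spark 1.6 DataFrameWriter API):

 // A managed table: Spark owns both the metadata and the data files, which
 // are written under the Hive warehouse directory (no LOCATION clause).
 df.write.format("parquet").saveAsTable("managed_test_table")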

I'm not sure which version had this limitation. I am using Spark 1.6 and Hive 1.1.

I am able to create the external table; please follow the example below:

// Hive DDL requires TBLPROPERTIES to come last, after LOCATION.
val query = """CREATE EXTERNAL TABLE avro_hive_table
              |ROW FORMAT SERDE 'org.apache.hadoop.hive.serde2.avro.AvroSerDe'
              |STORED AS INPUTFORMAT 'org.apache.hadoop.hive.ql.io.avro.AvroContainerInputFormat'
              |OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.avro.AvroContainerOutputFormat'
              |LOCATION '/user/avro/applog_avro'
              |TBLPROPERTIES ('avro.schema.url'='hdfs://localdomain/user/avro/schemas/activity.avsc')""".stripMargin
val hiveContext = new org.apache.spark.sql.hive.HiveContext(sc)

hiveContext.sql(query)
val df = hiveContext.sql("select count(*) from avro_hive_table")
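
One caveat that also applies to the question's PARTITIONED BY table: even when the CREATE statement succeeds, an external partitioned table returns no rows until its partitions are registered in the metastore. A hedged sketch reusing the question's TestTable and /home/data location (the partition path is illustrative; with HiveContext these statements are passed through to Hive):

 // Scan the table's location and register any partition directories found
 // there (standard Hive behaviour, not Spark-specific).
 hiveContext.sql("MSCK REPAIR TABLE TestTable")
 // Or register a single partition explicitly:
 hiveContext.sql("ALTER TABLE TestTable ADD IF NOT EXISTS PARTITION (city='London') LOCATION '/home/data/city=London'")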
