
Correct hive metastore accessible in spark-shell but not spark-submit file.jar

When I run

spark-shell

in a Linux shell and then run:

spark.sql("show databases").show()

I get the correct list of databases (because I am connected to the correct metastore). Now, when I submit my jar with the following code:

Submitted via:

spark-submit file.jar

Jar code:

SparkConf conf = new SparkConf().setAppName("test");
SparkSession spark = SparkSession
            .builder()
            .config(conf)
            .enableHiveSupport()
            .getOrCreate();
spark.sql("show databases").show();

The only database listed is default, so it is connected to the wrong Hive metastore. I also tried adding .config("hive.metastore.uris", "thrift://localhost:9083") to my SparkSession builder, but got the same result. That URI is the same as the one in /etc/spark/conf/hive-site.xml, which is correct.

How can I fix this?

Try adding --files /etc/hive/conf/hive-site.xml to your spark-submit command.
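A minimal sketch of the suggested command, using the jar name from the question. The hive-site.xml path follows the answer; adjust it to wherever your cluster keeps the Hive client configuration:

```shell
# Ship hive-site.xml with the application so the driver and executors
# pick up the correct metastore URI instead of falling back to a local
# Derby metastore (which only shows the "default" database).
spark-submit \
  --files /etc/hive/conf/hive-site.xml \
  file.jar
```

Without a hive-site.xml on its classpath, Spark creates an embedded metastore in the working directory, which is why only default appears.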
