
Specify a valid path to the correct hive jars using $HIVE_METASTORE_JARS or change spark.sql.hive.metastore.version to 1.2.1

When I try to run spark-submit on a jar that uses HiveContext, I get the error below.

spark-defaults.conf has:

spark.sql.hive.metastore.version 0.14.0
spark.sql.hive.metastore.jars ----/external_jars/hive-metastore-0.14.0.jar
#spark.sql.hive.metastore.jars maven

I would like to use Hive Metastore version 0.14. Spark and Hadoop are on different clusters.
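For reference, my understanding is that spark.sql.hive.metastore.jars takes either builtin, maven, or a full classpath containing all of the Hive jars and the matching Hadoop client jars, not a single metastore jar. A sketch of the classpath form I think is expected (the directories below are placeholders):

spark.sql.hive.metastore.version 0.14.0
# Placeholder paths: all Hive 0.14.0 jars plus the matching Hadoop client jars
spark.sql.hive.metastore.jars /path/to/hive-0.14.0/lib/*:/path/to/hadoop/client/*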

Can anyone help me resolve this?

16/09/19 16:52:24 INFO HiveContext: default warehouse location is /apps/hive/warehouse
Exception in thread "main" java.lang.IllegalArgumentException: Builtin jars can only be used when hive execution version == hive metastore version. Execution: 1.2.1 != Metastore: 0.14.0. Specify a vaild path to the correct hive jars using $HIVE_METASTORE_JARS or change spark.sql.hive.metastore.version to 1.2.1.
    at org.apache.spark.sql.hive.HiveContext.metadataHive$lzycompute(HiveContext.scala:254)
    at org.apache.spark.sql.hive.HiveContext.metadataHive(HiveContext.scala:237)
    at org.apache.spark.sql.hive.HiveContext.setConf(HiveContext.scala:441)
    at org.apache.spark.sql.SQLContext$$anonfun$4.apply(SQLContext.scala:272)
    at org.apache.spark.sql.SQLContext$$anonfun$4.apply(SQLContext.scala:271)
    at scala.collection.Iterator$class.foreach(Iterator.scala:727)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
    at scala.collection.IterableLike$cla

try

import org.apache.hadoop.conf.Configuration

// `spark` here is the SparkContext (with a SparkSession, use spark.sparkContext.hadoopConfiguration)
val hadoopConfig: Configuration = spark.hadoopConfiguration
hadoopConfig.set("fs.hdfs.impl", classOf[org.apache.hadoop.hdfs.DistributedFileSystem].getName)
hadoopConfig.set("fs.file.impl", classOf[org.apache.hadoop.fs.LocalFileSystem].getName)

in your Spark code.
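A minimal, self-contained sketch of where that snippet fits, assuming Spark 1.x (where HiveContext exists); the object name, app name, and query are just placeholders:

import org.apache.hadoop.conf.Configuration
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

object HiveMetastoreApp {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("hive-metastore-app")
    val sc = new SparkContext(conf)

    // Register explicit FileSystem implementations so hdfs:// and file:// paths
    // resolve even when the Hadoop configuration comes from a different cluster.
    val hadoopConfig: Configuration = sc.hadoopConfiguration
    hadoopConfig.set("fs.hdfs.impl", classOf[org.apache.hadoop.hdfs.DistributedFileSystem].getName)
    hadoopConfig.set("fs.file.impl", classOf[org.apache.hadoop.fs.LocalFileSystem].getName)

    // Create the HiveContext only after the Hadoop configuration is set up.
    val hiveContext = new HiveContext(sc)
    hiveContext.sql("SHOW TABLES").show()

    sc.stop()
  }
}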
