Exception while running hive support on Spark: Unable to instantiate SparkSession with Hive support because Hive classes are not found
Hello, I am trying to use Hive with Spark, but when I try to execute the program it shows this error:
Exception in thread "main" java.lang.IllegalArgumentException: Unable to instantiate SparkSession with Hive support because Hive classes are not found.
Here is my source code:
package com.spark.hiveconnect

import java.io.File

import org.apache.spark.sql.{Row, SaveMode, SparkSession}

object sourceToHIve {
  case class Record(key: Int, value: String)

  def main(args: Array[String]) {
    val warehouseLocation = new File("spark-warehouse").getAbsolutePath

    val spark = SparkSession
      .builder()
      .appName("Spark Hive Example")
      .config("spark.sql.warehouse.dir", warehouseLocation)
      .enableHiveSupport()
      .getOrCreate()

    import spark.implicits._
    import spark.sql

    sql("CREATE TABLE IF NOT EXISTS src (key INT, value STRING) USING hive")
    sql("LOAD DATA LOCAL INPATH '/usr/local/spark3/examples/src/main/resources/kv1.txt' INTO TABLE src")
    sql("SELECT * FROM src").show()

    spark.close()
  }
}
Here is my build.sbt file:
name := "SparkHive"
version := "0.1"
scalaVersion := "2.12.10"
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.4.5"
// https://mvnrepository.com/artifact/org.apache.spark/spark-sql
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.4.5"
I also have Hive running in the console. Can anyone help me with this? Thank you.
Try adding:
libraryDependencies += "org.apache.spark" %% "spark-hive" % "2.4.5"
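For reference, the complete build.sbt might then look as follows, a minimal sketch assuming you stay on Spark 2.4.5 (all three Spark artifacts should use the same version, and the Scala binary version 2.12 must match what your Spark distribution was built with):

```scala
name := "SparkHive"

version := "0.1"

scalaVersion := "2.12.10"

// All Spark modules must share one version; mixing versions causes
// hard-to-diagnose classpath errors.
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.4.5"
libraryDependencies += "org.apache.spark" %% "spark-sql"  % "2.4.5"
// spark-hive provides the Hive classes that enableHiveSupport() requires.
libraryDependencies += "org.apache.spark" %% "spark-hive" % "2.4.5"
```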
The main problem is that the class `org.apache.hadoop.hive.conf.HiveConf` cannot be loaded. You can insert the following code to test:
Class.forName("org.apache.hadoop.hive.conf.HiveConf", true,
  Thread.currentThread().getContextClassLoader)
If the class is missing, the error will occur at this line.
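The check above can be wrapped in a small helper so you can probe the classpath without crashing the program. This is a minimal sketch; `ClasspathCheck` and `isClassPresent` are hypothetical names for illustration, not part of Spark:

```scala
object ClasspathCheck {
  // Returns true if the named class can be loaded by the current thread's
  // context class loader, false if it is absent from the classpath.
  // The `initialize = false` argument avoids running static initializers.
  def isClassPresent(name: String): Boolean =
    try {
      Class.forName(name, false, Thread.currentThread().getContextClassLoader)
      true
    } catch {
      case _: ClassNotFoundException => false
    }

  def main(args: Array[String]): Unit = {
    // HiveConf is only present when spark-hive (and its Hive
    // dependencies) are on the classpath.
    val present = isClassPresent("org.apache.hadoop.hive.conf.HiveConf")
    println(s"HiveConf present: $present")
  }
}
```

Running this before building the SparkSession tells you immediately whether the missing-dependency diagnosis is correct.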
To be precise, the root cause is that your pom probably does not include Hive support for Spark. If you build with Maven, check for the following dependency:
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-hive_2.11</artifactId>
    <version>2.4.3</version>
</dependency>
The class `org.apache.hadoop.hive.conf.HiveConf` is located in this artifact.