
In Spark I am not able to create a table using hive support

I'm trying to follow the examples from

Not able to find spark-warehouse directory

to create a table using hive support, but I keep getting this error message:

org.apache.spark.sql.AnalysisException: Hive support is required to CREATE Hive TABLE (AS SELECT);
'CreateTable `default`.`sales`, org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe, ErrorIfExists
+- Project [num#2]
   +- SubqueryAlias test
      +- View (`test`, [num#2])
         +- Project [1 AS num#2]
            +- OneRowRelation

  at org.apache.spark.sql.errors.QueryCompilationErrors$.ddlWithoutHiveSupportEnabledError(QueryCompilationErrors.scala:1270)
  at org.apache.spark.sql.execution.datasources.HiveOnlyCheck$.$anonfun$apply$4(rules.scala:438)

Below is my code:

import org.apache.spark.sql.SparkSession
import org.apache.spark.SparkContext
import java.io.File

val sc = new SparkContext("local[*]", "LoadFiles1") 
val sqlContext = new org.apache.spark.sql.SQLContext(sc)
val warehouseLocation =  new File("spark-warehouse").getAbsolutePath
val spark = SparkSession
    .builder()
    .appName("Spark Hive Example")
    .config("spark.sql.warehouse.dir", warehouseLocation)
    .enableHiveSupport()
    .getOrCreate()    

spark.sql("select 1 as num ").registerTempTable("test")
spark.sql("create table sales as select * from test")

It seems to me that you are missing a parameter. It worked for me once I added ("spark.sql.catalogImplementation", "hive") to the Spark config.

With these parameters, the code you wanted to run worked on my machine:

  import org.apache.spark.SparkConf
  import org.apache.spark.sql.SparkSession

  val conf = new SparkConf()
    .set("spark.driver.bindAddress", "127.0.0.1")
    .set("spark.sql.warehouse.dir", "/Users/19658296/csp-fp-snaphot/library/src/test/resources/warehouseLocation")
    // use the Hive catalog; this is the setting that enableHiveSupport() applies under the hood
    .set("spark.sql.catalogImplementation", "hive")

  val spark = SparkSession.builder.master("local[*]")
    .appName("testGetSnp")
    .config(conf)
    .getOrCreate
  spark.sparkContext.hadoopConfiguration.set("fs.defaultFS", "file:///")
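
To confirm that Hive support is really enabled, you can check the catalog setting and re-run the statements from the question against this session (a minimal check; createOrReplaceTempView replaces the deprecated registerTempTable used in the question):

  // should print "hive" rather than the default "in-memory"
  println(spark.conf.get("spark.sql.catalogImplementation"))

  spark.sql("select 1 as num").createOrReplaceTempView("test")
  spark.sql("create table sales as select * from test")
  spark.sql("select * from sales").show() // one row: num = 1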

Also try to read this answer; it looks related.
