
Unable to find Hive table in Spark2 Scala

I have the following function:

def getMet(spark: SparkSession): DataFrame = {
  // Note: this shadows the `spark` parameter; getOrCreate() returns the
  // existing session, so the builder call is redundant.
  val spark = SparkSession.builder().appName("AB Test").enableHiveSupport().getOrCreate()

  val dfMet = spark.sql(s"""SELECT
  10,
  cod_entrep as ID_ENTITE,
  cod_entrep_assu as ID_ENTITE_GARANTE,
  id_notification,
  objet_metier as REF_EXT_OBJET_METIER,
  case when typ_mvt = "S" then 1 else 0 end as TOP_SUPP,
  case when typ_mvt = "S" then dt_capt else null end as DT_SUPP
  FROM dev_lkr_send.pz_send_notification""")

  dfMet // return the DataFrame (without this the body returns Unit and does not compile)
}

Which returns:

exception caught: org.apache.spark.sql.AnalysisException: Table or view not found: pz_send_notification; line 10 pos 0

The table exists in Impala, and executing the same steps in spark2-shell works fine.

Any help?

I just found the solution: I had to pass the hive-site.xml file with the --files option in the submit command.
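A minimal sketch of the submit command the answer describes. The class name, jar name, and hive-site.xml path are placeholders, not from the original post (on many CDH installs the file lives under /etc/hive/conf):

```shell
# Ship the Hive client configuration with the job so the driver and
# executors can reach the Hive metastore instead of falling back to a
# local in-memory catalog (which is why the table was "not found").
spark2-submit \
  --class com.example.ABTest \
  --master yarn \
  --files /etc/hive/conf/hive-site.xml \
  ab-test.jar
```

Without hive-site.xml on the classpath, enableHiveSupport() creates a session backed by an empty local metastore, which matches the symptom of the table existing in Impala but not being visible to the job.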
