
Spark with Hive: Table or view not found

ApplicationMaster: User class threw exception: org.apache.spark.sql.AnalysisException: Table or view not found: "DB_X"."table_Y"

Spark session :火花会话:

  SparkSession spark = SparkSession
    .builder()
    .appName(appName)
    .config("spark.sql.warehouse.dir", "/apps/hive/warehouse")
    .enableHiveSupport()
    .getOrCreate();

Hive warehouse directory in hive-site.xml: /apps/hive/warehouse/

hadoop fs -ls /apps/hive/warehouse/
drwxrwxrwx   - root hadoop          0 2018-09-03 11:22 /apps/hive/warehouse/DB_X.db


hadoop fs -ls /apps/hive/warehouse/DB_X.db
none

The error is thrown here:

spark
   .read()
   .table("DB_X.table_Y");

In the Java application:

spark.sql("show databases").show()
default

In the interactive spark-shell:

spark.sql("show databases").show()
default
DB_X
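
Since the same query lists both databases from spark-shell but only default from the submitted Java application, it is worth confirming at runtime which catalog the application is actually using. A minimal diagnostic sketch using standard Spark 2.x API; the expected output values are assumptions about this setup:

import org.apache.spark.sql.SparkSession;

public class CatalogCheck {
  public static void main(String[] args) {
    SparkSession spark = SparkSession
      .builder()
      .appName("catalog-check")
      .enableHiveSupport()
      .getOrCreate();

    // Prints "hive" when Hive support is active; "in-memory" means
    // hive-site.xml was not picked up and only the default database exists.
    System.out.println(spark.conf().get("spark.sql.catalogImplementation"));

    // Databases visible to this session; DB_X should appear here.
    spark.catalog().listDatabases().show(false);

    spark.stop();
  }
}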

show create table table_Y:

CREATE EXTERNAL TABLE `table_Y`(
...
PARTITIONED BY (
  `partition` string COMMENT '')
...
    location '/data/kafka-connect/topics/table_Y'

Hadoop files:

hadoop fs -ls /data/kafka-connect/topics/table_Y
drwxr-xr-x   - kafka hdfs          0 2018-09-11 17:24 /data/kafka-connect/topics/table_Y/partition=0
drwxr-xr-x   - kafka hdfs          0 2018-09-11 17:24 /data/kafka-connect/topics/table_Y/partition=1

hadoop fs -ls /data/kafka-connect/topics/table_Y/partition=0
-rw-r--r--   3 kafka hdfs     102388 2018-09-11 17:24 /data/kafka-connect/topics/table_Y/partition=0/table_Y+0+0001823382+0001824381.avro
-rw-r--r--   3 kafka hdfs     102147 2018-09-11 17:24 /data/kafka-connect/topics/table_Y/partition=0/table_Y+0+0001824382+0001825381.avro
...

Everything works fine in spark-shell or hive-shell.

hive-site.xml from the Hive conf directory is copied to spark2/conf.

Using HDP 2.6.4.0-91 with Spark 2.2.

Any help?

Relocating the table using the HA name fixed the problem.
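
For reference, "relocating the table using the HA name" presumably means pointing the table (and its existing partitions) at the HDFS HA nameservice URI rather than a bare path resolved against a single NameNode. A hedged sketch of what that can look like from the same Spark session; the nameservice name "nameservice1" is an assumption, use the value from your cluster's hdfs-site.xml:

// Table-level location; new partitions will be created under the HA URI.
// "nameservice1" is an assumed HA nameservice name.
spark.sql("ALTER TABLE DB_X.table_Y "
    + "SET LOCATION 'hdfs://nameservice1/data/kafka-connect/topics/table_Y'");

// Existing partitions keep their old locations and may need the same change,
// one partition spec at a time (partition=0 shown; repeat for the others).
spark.sql("ALTER TABLE DB_X.table_Y PARTITION (`partition`='0') "
    + "SET LOCATION 'hdfs://nameservice1/data/kafka-connect/topics/table_Y/partition=0'");

The same ALTER TABLE statements can also be run from the Hive shell or beeline instead of through spark.sql.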
