
Hive table created with Spark not visible from HUE / Hive GUI

I am creating a Hive table from Scala using the following code:

  val spark = SparkSession
    .builder()
    .appName("self service")
    .enableHiveSupport()
    .master("local")
    .getOrCreate()

  spark.sql("CREATE TABLE default.TEST_TABLE (C1 INT)")

The table must have been created successfully, because if I run this code twice I receive an error saying the table already exists.

However, when I try to access this table from the GUI (HUE), I cannot see any table in Hive, so it seems it is being saved in a different path than the one used by Hive in HUE to get this information.
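To narrow down where the table actually went, it can help to inspect Spark's own view of the catalog right after creating the table. The snippet below is a minimal sketch reusing the `spark` session from above; note that if Spark does not pick up a hive-site.xml, it typically falls back to an embedded local metastore and a local `spark-warehouse` directory, which the Hive instance behind HUE would not see:

  // List the tables Spark's catalog can see in the default database
  spark.sql("SHOW TABLES IN default").show()

  // Print the warehouse location this SparkSession is writing to
  println(spark.conf.get("spark.sql.warehouse.dir"))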

Do you know what I should do to see the tables created from my code in the HUE / Hive web GUI?

Any help would be much appreciated. Thank you very much.

It seems to me you have not added hive-site.xml to the proper path. hive-site.xml has the properties Spark needs to connect successfully to Hive, and you should add it to this directory:

SPARK_HOME/conf/
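If copying hive-site.xml into that directory is not convenient, another option is to point the SparkSession at the Hive metastore explicitly. This is only a sketch: the thrift URI below is an assumption and must be replaced with the metastore host and port from your cluster's hive-site.xml:

  import org.apache.spark.sql.SparkSession

  val spark = SparkSession
    .builder()
    .appName("self service")
    // Hypothetical metastore URI; take the real value from your cluster's hive-site.xml
    .config("hive.metastore.uris", "thrift://metastore-host:9083")
    .enableHiveSupport()
    .getOrCreate()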

You can also add this file by using spark.driver.extraClassPath, giving the directory where the file exists. For example, in a pyspark submit:

/usr/bin/spark2-submit \
--conf spark.driver.extraClassPath=<directory containing hive-site.xml> \
--master yarn --deploy-mode client --driver-memory nG --executor-memory nG \
--executor-cores n myScript.py
