
Drop Hive Table If Exists Error in Spark

I have Hive, MySQL and Spark, with MySQL as the Hive metastore_db. I followed this guide to configure it ( http://www.cloudera.com/content/cloudera-content/cloudera-docs/CDH4/4.2.0/CDH4-Installation-Guide/cdh4ig_topic_18_4.html ). Running DROP TABLE works fine in the hive shell. But when I run hiveContext.hql("DROP TABLE IF EXISTS hivetesting") in spark-shell, I get the following error:

ERROR Hive: NoSuchObjectException(message:default.hivetesting table not found)
at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result$get_table_resultStandardScheme.read(ThriftHiveMetastore.java:27129)
at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result$get_table_resultStandardScheme.read(ThriftHiveMetastore.java:27097)
at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result.read(ThriftHiveMetastore.java:27028)
at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:78)
at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_get_table(ThriftHiveMetastore.java:936)
at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_table(ThriftHiveMetastore.java:922)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getTable(HiveMetaStoreClient.java:854)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:89)
at com.sun.proxy.$Proxy11.getTable(Unknown Source)
at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:950)
......

Every command runs successfully in the hive shell. Commands such as SHOW TABLES, CREATE TABLE src(id string) and SELECT COUNT(*) FROM src also work in spark-shell through hiveContext. The only failure is DROP TABLE IF EXISTS when the table does not exist; DROP TABLE runs successfully when the table does exist.

What can I do to get this resolved?

Any help is highly appreciated.

This problem doesn't exist in Spark 1.5.1; I'm able to successfully call

hiveContext.sql('DROP TABLE IF EXISTS test_table')
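
In a Scala spark-shell the same call looks roughly like this (a minimal sketch; test_table is just a placeholder name):

import org.apache.spark.sql.hive.HiveContext

// spark-shell already provides sc (the SparkContext); wrap it in a HiveContext
val hiveContext = new HiveContext(sc)

// sql() replaces the older hql() call; with IF EXISTS this no longer raises the metastore error on 1.5.1
hiveContext.sql("DROP TABLE IF EXISTS test_table")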

Spark is a rapidly evolving platform, and things change between versions.
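
If upgrading isn't an option right away, one possible workaround on older versions is to check the table list yourself before issuing the drop, so the metastore lookup never happens for a missing table. A rough sketch, assuming hiveContext already exists and using the hivetesting table name from the question (tableNames() is available on HiveContext from the Spark 1.3 line onwards):

// Only issue DROP TABLE when the metastore actually lists the table
if (hiveContext.tableNames().contains("hivetesting")) {
  hiveContext.sql("DROP TABLE hivetesting")
}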
