
Not able to drop hive table

I upgraded my CDH from 5.0.0 beta to 5.2.1 using Cloudera Manager. All services are running properly except the Hive metastore. I'm using MySQL for the Hive metastore.

I'm not able to drop a table from Hive. When I run a query like:

`drop table test;`

I get the following error:

hive> drop table test;
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:javax.jdo.JDOException: Exception thrown when executing query
        at org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:596)
        at org.datanucleus.api.jdo.JDOQuery.execute(JDOQuery.java:275)
        at org.apache.hadoop.hive.metastore.ObjectStore.deleteTableColumnStatistics(ObjectStore.java:6030)
        at org.apache.hadoop.hive.metastore.ObjectStore.dropTable(ObjectStore.java:813)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:108)
        at com.sun.proxy.$Proxy0.dropTable(Unknown Source)
        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table_core(HiveMetaStore.java:1386)
        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table_with_environment_context(HiveMetaStore.java:1525)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:106)
        at com.sun.proxy.$Proxy5.drop_table_with_environment_context(Unknown Source)
        at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Processor$drop_table_with_environment_context.getResult(ThriftHiveMetastore.java:8072)
        at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Processor$drop_table_with_environment_context.getResult(ThriftHiveMetastore.java:8056)
        at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
        at org.apache.hadoop.hive.metastore.TUGIBasedProcessor$1.run(TUGIBasedProcessor.java:110)
        at org.apache.hadoop.hive.metastore.TUGIBasedProcessor$1.run(TUGIBasedProcessor.java:107)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
        at org.apache.hadoop.hive.shims.HadoopShimsSecure.doAs(HadoopShimsSecure.java:502)
        at org.apache.hadoop.hive.metastore.TUGIBasedProcessor.process(TUGIBasedProcessor.java:118)
        at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:244)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
    NestedThrowablesStackTrace:
com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Unknown column AO.BIG_DECIMAL_HIGH_VALUE in field list

Reference link:

http://community.cloudera.com/t5/Interactive-Short-cycle-SQL/CDH-upgrade-from-4-7-to-CDH-5-2-hive-metastore-issue/td-p/20626

Can anyone suggest something?

I had exactly the same issue. I tried what was suggested in the reference link, but the problem persisted (even after I fixed all the errors I could see when running the scripts).

Finally, I searched for BIG_DECIMAL_HIGH_VALUE in the Hive metastore scripts, and found it in every hive-schema-0.1*.0.mysql.sql file, in the following CREATE statements:

CREATE TABLE IF NOT EXISTS `TAB_COL_STATS` (
 `CS_ID` bigint(20) NOT NULL,
 `DB_NAME` varchar(128) CHARACTER SET latin1 COLLATE latin1_bin NOT NULL,
 `TABLE_NAME` varchar(128) CHARACTER SET latin1 COLLATE latin1_bin NOT NULL,
 `COLUMN_NAME` varchar(128) CHARACTER SET latin1 COLLATE latin1_bin NOT NULL,
 `COLUMN_TYPE` varchar(128) CHARACTER SET latin1 COLLATE latin1_bin NOT NULL,
 `TBL_ID` bigint(20) NOT NULL,
 `LONG_LOW_VALUE` bigint(20),
 `LONG_HIGH_VALUE` bigint(20),
 `DOUBLE_HIGH_VALUE` double(53,4),
 `DOUBLE_LOW_VALUE` double(53,4),
 `BIG_DECIMAL_LOW_VALUE` varchar(4000) CHARACTER SET latin1 COLLATE latin1_bin,
 `BIG_DECIMAL_HIGH_VALUE` varchar(4000) CHARACTER SET latin1 COLLATE latin1_bin,
 `NUM_NULLS` bigint(20) NOT NULL,
 `NUM_DISTINCTS` bigint(20),
 `AVG_COL_LEN` double(53,4),
 `MAX_COL_LEN` bigint(20),
 `NUM_TRUES` bigint(20),
 `NUM_FALSES` bigint(20),
 `LAST_ANALYZED` bigint(20) NOT NULL,
  PRIMARY KEY (`CS_ID`),
  CONSTRAINT `TAB_COL_STATS_FK` FOREIGN KEY (`TBL_ID`) REFERENCES `TBLS` (`TBL_ID`)
) ENGINE=InnoDB DEFAULT CHARSET=latin1;

CREATE TABLE IF NOT EXISTS `PART_COL_STATS` (
 `CS_ID` bigint(20) NOT NULL,
 `DB_NAME` varchar(128) CHARACTER SET latin1 COLLATE latin1_bin NOT NULL,
 `TABLE_NAME` varchar(128) CHARACTER SET latin1 COLLATE latin1_bin NOT NULL,
 `PARTITION_NAME` varchar(767) CHARACTER SET latin1 COLLATE latin1_bin NOT NULL,
 `COLUMN_NAME` varchar(128) CHARACTER SET latin1 COLLATE latin1_bin NOT NULL,
 `COLUMN_TYPE` varchar(128) CHARACTER SET latin1 COLLATE latin1_bin NOT NULL,
 `PART_ID` bigint(20) NOT NULL,
 `LONG_LOW_VALUE` bigint(20),
 `LONG_HIGH_VALUE` bigint(20),
 `DOUBLE_HIGH_VALUE` double(53,4),
 `DOUBLE_LOW_VALUE` double(53,4),
 `BIG_DECIMAL_LOW_VALUE` varchar(4000) CHARACTER SET latin1 COLLATE latin1_bin,
 `BIG_DECIMAL_HIGH_VALUE` varchar(4000) CHARACTER SET latin1 COLLATE latin1_bin,
 `NUM_NULLS` bigint(20) NOT NULL,
 `NUM_DISTINCTS` bigint(20),
 `AVG_COL_LEN` double(53,4),
 `MAX_COL_LEN` bigint(20),
 `NUM_TRUES` bigint(20),
 `NUM_FALSES` bigint(20),
 `LAST_ANALYZED` bigint(20) NOT NULL,
  PRIMARY KEY (`CS_ID`),
  CONSTRAINT `PART_COL_STATS_FK` FOREIGN KEY (`PART_ID`) REFERENCES `PARTITIONS` (`PART_ID`)
) ENGINE=InnoDB DEFAULT CHARSET=latin1;

When I checked those tables in the Hive metastore database, I got something different:

mysql> desc TAB_COL_STATS;
+-------------------+--------------+------+-----+---------+-------+
| Field             | Type         | Null | Key | Default | Extra |
+-------------------+--------------+------+-----+---------+-------+
| CS_ID             | bigint(20)   | NO   | PRI | NULL    |       |
| AVG_COL_LEN       | double       | YES  |     | NULL    |       |
| COLUMN_NAME       | varchar(128) | NO   |     | NULL    |       |
| COLUMN_TYPE       | varchar(128) | NO   |     | NULL    |       |
| DB_NAME           | varchar(128) | NO   |     | NULL    |       |
| DOUBLE_HIGH_VALUE | double       | YES  |     | NULL    |       |
| DOUBLE_LOW_VALUE  | double       | YES  |     | NULL    |       |
| LAST_ANALYZED     | bigint(20)   | NO   |     | NULL    |       |
| LONG_HIGH_VALUE   | bigint(20)   | YES  |     | NULL    |       |
| LONG_LOW_VALUE    | bigint(20)   | YES  |     | NULL    |       |
| MAX_COL_LEN       | bigint(20)   | YES  |     | NULL    |       |
| NUM_DISTINCTS     | bigint(20)   | YES  |     | NULL    |       |
| NUM_FALSES        | bigint(20)   | YES  |     | NULL    |       |
| NUM_NULLS         | bigint(20)   | NO   |     | NULL    |       |
| NUM_TRUES         | bigint(20)   | YES  |     | NULL    |       |
| TBL_ID            | bigint(20)   | YES  | MUL | NULL    |       |
| TABLE_NAME        | varchar(128) | NO   |     | NULL    |       |
+-------------------+--------------+------+-----+---------+-------+
17 rows in set (0.00 sec)

mysql> desc PART_COL_STATS;
+-------------------+--------------+------+-----+---------+-------+
| Field             | Type         | Null | Key | Default | Extra |
+-------------------+--------------+------+-----+---------+-------+
| CS_ID             | bigint(20)   | NO   | PRI | NULL    |       |
| AVG_COL_LEN       | double       | YES  |     | NULL    |       |
| COLUMN_NAME       | varchar(128) | NO   |     | NULL    |       |
| COLUMN_TYPE       | varchar(128) | NO   |     | NULL    |       |
| DB_NAME           | varchar(128) | NO   |     | NULL    |       |
| DOUBLE_HIGH_VALUE | double       | YES  |     | NULL    |       |
| DOUBLE_LOW_VALUE  | double       | YES  |     | NULL    |       |
| LAST_ANALYZED     | bigint(20)   | NO   |     | NULL    |       |
| LONG_HIGH_VALUE   | bigint(20)   | YES  |     | NULL    |       |
| LONG_LOW_VALUE    | bigint(20)   | YES  |     | NULL    |       |
| MAX_COL_LEN       | bigint(20)   | YES  |     | NULL    |       |
| NUM_DISTINCTS     | bigint(20)   | YES  |     | NULL    |       |
| NUM_FALSES        | bigint(20)   | YES  |     | NULL    |       |
| NUM_NULLS         | bigint(20)   | NO   |     | NULL    |       |
| NUM_TRUES         | bigint(20)   | YES  |     | NULL    |       |
| PART_ID           | bigint(20)   | YES  | MUL | NULL    |       |
| PARTITION_NAME    | varchar(767) | NO   |     | NULL    |       |
| TABLE_NAME        | varchar(128) | NO   |     | NULL    |       |
+-------------------+--------------+------+-----+---------+-------+
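
The mismatch can also be confirmed directly from MySQL's catalog rather than eyeballing DESC output. A minimal sketch, assuming the metastore database is named `metastore` (substitute whatever name your installation uses):

```sql
-- Check whether the column the DDLTask error complains about exists anywhere.
-- 'metastore' is an assumed database name; adjust to your setup.
SELECT TABLE_NAME, COLUMN_NAME
FROM information_schema.COLUMNS
WHERE TABLE_SCHEMA = 'metastore'
  AND COLUMN_NAME IN ('BIG_DECIMAL_LOW_VALUE', 'BIG_DECIMAL_HIGH_VALUE');
```

An empty result set is consistent with the DESC output: the upgrade left both statistics tables on the old schema, missing the BIG_DECIMAL columns.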

Since both tables were empty, I simply dropped them and recreated them using the CREATE statements listed in the hive-schema-0.13.0.mysql.sql file.
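
A hedged sketch of that fix. This is only safe because the tables hold no rows; dropping them discards any existing column statistics, so verify first:

```sql
-- Confirm the tables are empty before dropping them.
SELECT COUNT(*) FROM `TAB_COL_STATS`;
SELECT COUNT(*) FROM `PART_COL_STATS`;

-- Neither table references the other, so drop order does not matter.
DROP TABLE `PART_COL_STATS`;
DROP TABLE `TAB_COL_STATS`;

-- Then re-run the two CREATE TABLE statements from
-- hive-schema-0.13.0.mysql.sql (reproduced above).
```

If the tables did contain statistics, the versioned upgrade scripts shipped alongside the schema files (the upgrade-*.mysql.sql scripts in the same directory) would be the more general route, since they migrate data rather than discarding it.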

After doing so, I could drop Hive tables again.
