
Not able to fetch results from a Hive transaction-enabled table through spark-sql

Background:

  • I am using HDP with Spark 1.6.0 and Hive 1.2.1

Steps followed:

Create the Hive table:

hive>
CREATE TABLE orctest (
  PROD_ID bigint,
  CUST_ID bigint,
  TIME_ID timestamp,
  CHANNEL_ID bigint,
  PROMO_ID bigint,
  QUANTITY_SOLD decimal(10,0),
  AMOUNT_SOLD decimal(10,0)
)
CLUSTERED BY (PROD_ID) INTO 32 BUCKETS
STORED AS ORC
TBLPROPERTIES ("orc.compress"="SNAPPY", "transactional"="true");

Insert a record into orctest:

hive>
insert into orctest values(1, 1, '2016-08-02 21:36:54.000000000', 1, 1, 10, 10000);

Try to access the orctest table from spark-shell:

scala>
val hiveContext = new org.apache.spark.sql.hive.HiveContext(sc)

val s = hiveContext.table("orctest")

Exception thrown:

16/08/02 22:06:54 INFO OrcRelation: Listing hdfs://hadoop03:8020/apps/hive/warehouse/orctest on driver
16/08/02 22:06:54 INFO OrcRelation: Listing hdfs://hadoop03:8020/apps/hive/warehouse/orctest/delta_0000005_0000005 on driver
java.lang.AssertionError: assertion failed
at scala.Predef$.assert(Predef.scala:165)
at org.apache.spark.sql.execution.datasources.LogicalRelation$$anonfun$1.apply(LogicalRelation.scala:39)
at org.apache.spark.sql.execution.datasources.LogicalRelation$$anonfun$1.apply(LogicalRelation.scala:38)
at scala.Option.map(Option.scala:145)
at org.apache.spark.sql.execution.datasources.LogicalRelation.<init>(LogicalRelation.scala:38)
at org.apache.spark.sql.execution.datasources.LogicalRelation.copy(LogicalRelation.scala:31)
at org.apache.spark.sql.hive.HiveMetastoreCatalog.org$apache$spark$sql$hive$HiveMetastoreCatalog$$convertToOrcRelation(HiveMetastoreCatalog.scala:588)
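For context (not in the original post): the delta_0000005_0000005 directory in the listing above is how Hive stores each transactional insert, and Spark 1.6's built-in ORC reader does not understand this ACID delta layout, which appears to be what trips the assertion in convertToOrcRelation. As a sketch, you can inspect the directories from the same spark-shell session; the warehouse path is copied from the log above and may differ on your cluster:

scala>
import org.apache.hadoop.fs.{FileSystem, Path}

// List the table directory to see the ACID delta_* folders left by the insert.
val fs = FileSystem.get(sc.hadoopConfiguration)
fs.listStatus(new Path("/apps/hive/warehouse/orctest")).foreach(st => println(st.getPath))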

Any help will be highly appreciated.

Try setting: hiveContext.setConf("spark.sql.hive.convertMetastoreOrc", "false")

This keeps Spark from converting the metastore ORC table into its native OrcRelation (the convertToOrcRelation path where the assertion fails above) and makes it read the table through the Hive SerDe instead.
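A minimal spark-shell sketch of the full workaround, assuming the same session as in the question (the table name and the conf key come from the post above; the rest is illustrative):

scala>
// Build the HiveContext as in the question.
val hiveContext = new org.apache.spark.sql.hive.HiveContext(sc)

// Disable the metastore ORC conversion *before* resolving the table,
// so Spark reads through the Hive SerDe path instead of OrcRelation.
hiveContext.setConf("spark.sql.hive.convertMetastoreOrc", "false")

val s = hiveContext.table("orctest")
s.show()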

