
Invalid cache type exception in Shark

I am trying to create a cached table in shark-0.8.0. Following the documentation ( https://github.com/amplab/shark/wiki/Shark-User-Guide ), I created the table as follows:

CREATE TABLE mydata_cached (
  artist string,
  title string,
  track_id string,
  similars array<array<string>>,
  tags array<array<string>>
)
ROW FORMAT SERDE 'org.openx.data.jsonserde.JsonSerDe'
TBLPROPERTIES('shark.cache' = 'MEMORY');

The table is created, and I am able to load data into it with the LOAD DATA command. However, when I try to query the table, even a SELECT COUNT(1) statement fails with the following error:

shark> select count(1) from mydata_cached;                                                
shark.memstore2.CacheType$InvalidCacheTypeException: Invalid string representation of cache type MEMORY
    at shark.memstore2.CacheType$.fromString(CacheType.scala:48)
    at shark.execution.TableScanOperator.execute(TableScanOperator.scala:119)
    at shark.execution.Operator$$anonfun$executeParents$1.apply(Operator.scala:115)
    at shark.execution.Operator$$anonfun$executeParents$1.apply(Operator.scala:115)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:233)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:233)
    at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:60)
    at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
    at scala.collection.TraversableLike$class.map(TraversableLike.scala:233)
    at scala.collection.mutable.ArrayBuffer.map(ArrayBuffer.scala:47)
    at shark.execution.Operator.executeParents(Operator.scala:115)
    at shark.execution.UnaryOperator.execute(Operator.scala:187)
    at shark.execution.Operator$$anonfun$executeParents$1.apply(Operator.scala:115)
    at shark.execution.Operator$$anonfun$executeParents$1.apply(Operator.scala:115)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:233)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:233)
    at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:60)
    at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
    at scala.collection.TraversableLike$class.map(TraversableLike.scala:233)
    at scala.collection.mutable.ArrayBuffer.map(ArrayBuffer.scala:47)
    at shark.execution.Operator.executeParents(Operator.scala:115)
    at shark.execution.UnaryOperator.execute(Operator.scala:187)
    at shark.execution.Operator$$anonfun$executeParents$1.apply(Operator.scala:115)
    at shark.execution.Operator$$anonfun$executeParents$1.apply(Operator.scala:115)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:233)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:233)
    at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:60)
    at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
    at scala.collection.TraversableLike$class.map(TraversableLike.scala:233)
    at scala.collection.mutable.ArrayBuffer.map(ArrayBuffer.scala:47)
    at shark.execution.Operator.executeParents(Operator.scala:115)
    at shark.execution.UnaryOperator.execute(Operator.scala:187)
    at shark.execution.Operator$$anonfun$executeParents$1.apply(Operator.scala:115)
    at shark.execution.Operator$$anonfun$executeParents$1.apply(Operator.scala:115)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:233)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:233)
    at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:60)
    at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
    at scala.collection.TraversableLike$class.map(TraversableLike.scala:233)
    at scala.collection.mutable.ArrayBuffer.map(ArrayBuffer.scala:47)
    at shark.execution.Operator.executeParents(Operator.scala:115)
    at org.apache.hadoop.hive.ql.exec.GroupByPostShuffleOperator.execute(GroupByPostShuffleOperator.scala:194)
    at shark.execution.Operator$$anonfun$executeParents$1.apply(Operator.scala:115)
    at shark.execution.Operator$$anonfun$executeParents$1.apply(Operator.scala:115)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:233)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:233)
    at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:60)
    at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
    at scala.collection.TraversableLike$class.map(TraversableLike.scala:233)
    at scala.collection.mutable.ArrayBuffer.map(ArrayBuffer.scala:47)
    at shark.execution.Operator.executeParents(Operator.scala:115)
    at shark.execution.UnaryOperator.execute(Operator.scala:187)
    at shark.execution.Operator$$anonfun$executeParents$1.apply(Operator.scala:115)
    at shark.execution.Operator$$anonfun$executeParents$1.apply(Operator.scala:115)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:233)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:233)
    at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:60)
    at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
    at scala.collection.TraversableLike$class.map(TraversableLike.scala:233)
    at scala.collection.mutable.ArrayBuffer.map(ArrayBuffer.scala:47)
    at shark.execution.Operator.executeParents(Operator.scala:115)
    at shark.execution.FileSinkOperator.execute(FileSinkOperator.scala:120)
    at shark.execution.SparkTask.execute(SparkTask.scala:101)
    at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:134)
    at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
    at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1312)
    at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1104)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:937)
    at shark.SharkCliDriver.processCmd(SharkCliDriver.scala:294)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:406)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:341)
    at shark.SharkCliDriver$.main(SharkCliDriver.scala:203)
    at shark.SharkCliDriver.main(SharkCliDriver.scala)
FAILED: Execution Error, return code -101 from shark.execution.SparkTask

According to the code on GitHub ( https://github.com/amplab/shark/blob/master/src/main/scala/shark/memstore2/CacheType.scala ), MEMORY is a valid option. I also tried the MEMORY_ONLY option, and it gave me the same error. Any suggestions or ideas on what is going wrong here?
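From the stack trace, the failure originates in CacheType.fromString. As a rough illustrative sketch only (not the actual Shark source, and the accepted values below are my assumptions about 0.8.0), a parser of that kind does a case-insensitive lookup and throws for any string it does not recognise:

// Illustrative sketch, not the real shark.memstore2.CacheType.
// Assumed accepted values for 0.8.0: NONE, MEMORY_ONLY, TACHYON.
object CacheTypeSketch extends Enumeration {
  val NONE, MEMORY_ONLY, TACHYON = Value

  class InvalidCacheTypeException(name: String)
    extends Exception("Invalid string representation of cache type " + name)

  // Resolve a table-property string to a cache type, rejecting unknown names.
  def fromString(name: String): Value =
    values.find(_.toString.equalsIgnoreCase(name))
      .getOrElse(throw new InvalidCacheTypeException(name))
}

// Under these assumptions, CacheTypeSketch.fromString("MEMORY") would throw,
// which matches the error shown above.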

Thanks, TM

It needs to be:

TBLPROPERTIES('shark.cache' = 'MEMORY_ONLY')
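
For example, applied to the table from the question, the full statement becomes (identical DDL, only the cache property value changes):

CREATE TABLE mydata_cached (
  artist string,
  title string,
  track_id string,
  similars array<array<string>>,
  tags array<array<string>>
)
ROW FORMAT SERDE 'org.openx.data.jsonserde.JsonSerDe'
TBLPROPERTIES('shark.cache' = 'MEMORY_ONLY');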

