

Transferring unroll memory to storage memory failed

I am getting this java.lang.AssertionError with Spark. The error message does not really explain what is causing it (at least to me). Any help on what triggers this error, and the steps to fix it, would be appreciated.

(screenshot of the java.lang.AssertionError stack trace)

Spark has to deserialize your data before the application can use it, and the memory used to hold data while it is being deserialized is referred to as "unroll memory." In your case, your executors likely lack sufficient RAM to fit the fully deserialized data. From the source code:

There are two reasons for store failed: First, the block is partially-unrolled; second, the block is entirely unrolled and the actual stored data size is larger than reserved, but we can't request extra memory

https://github.com/apache/spark/blob/9628aca68ba0821b8f3fa934ed4872cabb2a5d7d/core/src/main/scala/org/apache/spark/storage/memory/MemoryStore.scala#L260
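If the fully deserialized blocks genuinely do not fit, the usual mitigations are to give the executors more memory or to cache with a storage level that can spill to disk. A minimal Scala sketch of both; the app name, memory sizes, and input path below are placeholders, not taken from the question:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.storage.StorageLevel

object UnrollMemoryExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("unroll-memory-example")              // hypothetical app name
      .config("spark.executor.memory", "8g")         // assumed value; size to your cluster
      .config("spark.memory.storageFraction", "0.5") // share of unified memory kept for storage
      .getOrCreate()

    val df = spark.read.parquet("/path/to/data")     // placeholder input path

    // A storage level with a disk component lets blocks that cannot be
    // fully unrolled in RAM spill to disk instead of failing the store.
    df.persist(StorageLevel.MEMORY_AND_DISK)
    println(df.count())

    spark.stop()
  }
}
```

Storage levels with a disk component avoid the hard failure when unrolling, at the cost of slower reads for whatever spills to disk.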
