
Memory consumption of sparse arrays in Node.js

I have written a small program that produces arrays, which runs quite long (almost forever ;-)):

var results = [];
var i = 1;

while (true) {
  console.log(i++);
  results.push([]);
}

When, instead of an empty array, I create a sparse array of length i, the program crashes quite fast:

var results = [];
var i = 1;

while (true) {
  console.log(i);
  results.push(new Array(i++));
}

Actually I get up to i equal to 17424, then I get an error message telling me

FATAL ERROR: CALL_AND_RETRY_LAST Allocation failed - process out of memory
Abort trap: 6

and Node.js takes me back to the console. Since the only difference is that the second program produces "larger" empty arrays than the first one, this implies that an empty sparse array of length n takes up n times the space of an empty array of length 1.

Am I right about this (specifically with regard to Node.js)?

One more question: If I run

var results = [];
var i = 1;

while (true) {
  console.log(i);
  var temp = [];
  temp[i++] = i;
  results.push(temp);
}

then I get up to 1286175, and then it again crashes with:

FATAL ERROR: CALL_AND_RETRY_LAST Allocation failed - process out of memory
Abort trap: 6

Why does this behave differently than the other two options?

PS: I am using Node.js 0.12.0 to run this on OS X.

When you declare an array with a size

Array(1024);

you are asking it to allocate space for 1024 elements. It has to allocate this space up front, because this form of declaring an array is an optimization that states:

"I need you to reserve 1024 locations so that you aren't constantly resizing the array as I push more elements onto it". “我需要你保留1024个位置,这样你就不会随着我将更多元素推到它上而不断调整阵列大小”。

As you probably know, declaring an array with simply [] still allows you to push an unlimited number of elements onto it; however, the array is silently resized behind the scenes (most likely with something like memcpy()) to allow this behaviour.
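A rough way to see this from inside Node itself is to compare how much the heap grows when you push plain [] versus pre-sized new Array(i). The following is only a sketch (it is not part of the original question or answer); it assumes nothing beyond process.memoryUsage(), and the exact numbers will vary with the Node/V8 version and platform:

// Sketch: compare heap growth for empty vs. pre-sized arrays.
// heapUsed is noisy (GC can run in between), so treat the output as a rough estimate.
function heapGrowth(makeArray, count) {
  var results = [];
  var before = process.memoryUsage().heapUsed;
  for (var i = 1; i <= count; i++) {
    results.push(makeArray(i));
  }
  var after = process.memoryUsage().heapUsed;
  // "results" is still referenced here, so the arrays cannot be
  // collected before the second measurement.
  return after - before;
}

var n = 5000;
console.log('pushing []          :', heapGrowth(function () { return []; }, n), 'bytes');
console.log('pushing new Array(i):', heapGrowth(function (i) { return new Array(i); }, n), 'bytes');

The second figure should come out dramatically larger, because each new Array(i) gets a backing store for i elements even though nothing has been written to it.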

EDIT:

The reason you get many more iterations in your second example is that you are now using a sparse array. With a sparse array, doing

var arr = []
arr[1000000] = 1;

does not mean your array is now using memory for 1,000,000 entries. Contrast this with a dense array:

var arr = Array(1000000);

which explicitly tells the runtime to reserve an array that can store 1,000,000 entries in memory.
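As a small illustration (again a sketch of my own, not part of the original answer), you can watch this contrast from within Node using process.memoryUsage(): the dense array's backing store shows up in heapUsed right away, while the sparse one barely registers. The dense array here is kept to a modest length because the exact allocation heuristics differ between V8 versions:

// Sketch: a sparse array whose length is over one million vs. a dense
// array of only ten thousand slots. The sparse one stores a single
// element, so it barely moves the heap; the dense one reserves a real
// backing store. Exact numbers vary by Node/V8 version.
var before = process.memoryUsage().heapUsed;

var sparse = [];
sparse[1000000] = 1;            // length 1000001, but only one element stored

var afterSparse = process.memoryUsage().heapUsed;

var dense = new Array(10000);   // backing store reserved for 10000 slots

var afterDense = process.memoryUsage().heapUsed;

console.log('sparse, length ' + sparse.length + ':', afterSparse - before, 'bytes');
console.log('dense,  length ' + dense.length + ':', afterDense - afterSparse, 'bytes');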

Related StackOverflow question: https://stackoverflow.com/a/1510842/276949

V8, the JS engine in Node, uses 4 bytes for each element in a seemingly empty array. The best way to find this out for certain is to create empty arrays in Chrome and use the heap profiler to see how much additional space the array has taken up. See https://developer.chrome.com/devtools/docs/heap-profiling for details on how you can do this...
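If you want a ballpark figure from Node itself rather than from Chrome's profiler, a crude sketch (my own, under the assumption that the heapUsed delta is dominated by the arrays being allocated) is to allocate a batch of pre-sized arrays and divide the heap growth by the total number of elements. Whether the result lands near 4 or near 8 bytes per element depends on the build: 32-bit vs. 64-bit, and on much newer V8 versions whether pointer compression is enabled.

// Sketch: rough per-element cost of an "empty" pre-sized array.
// Allocate many modest arrays so the total is large enough that
// GC noise in heapUsed does not dominate the measurement.
var arrays = [];
var arraysCount = 200;
var elementsPerArray = 10000;

var before = process.memoryUsage().heapUsed;
for (var i = 0; i < arraysCount; i++) {
  arrays.push(new Array(elementsPerArray));
}
var after = process.memoryUsage().heapUsed;

var totalElements = arraysCount * elementsPerArray;   // 2,000,000 elements
console.log('approx. bytes per element:',
            ((after - before) / totalElements).toFixed(2));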
