
Maximum number of entries in Node.js Map?

I was making a large Map in Node.js v11.9.0 and it kept failing with "FATAL ERROR: invalid table size Allocation failed - JavaScript heap out of memory". My map's keys and values shouldn't be getting anywhere near Node's heap size, so I tried just making a map and inserting numeric keys and values into it:

var N = Math.pow(2, 26);
var map = new Map();
for (var i = 0; i < N; i++) {
  map.set(i, i + 1);
  if (i % 1e5 === 0) { console.log(i / 1e6); }
}

This program crashes Node after inserting roughly 16.6 million entries. That number seemed suspiciously close to 2^24, so replacing the logging above with if (i > 16777200) { console.log(i); }, I see that the program crashes immediately after successfully printing "16777215", which is one less than 2^24.
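
For reference, a sketch of that modified repro (same loop as above, only the logging changed; the exact crash point can vary with the Node/V8 version):

var N = Math.pow(2, 26);
var map = new Map();
for (var i = 0; i < N; i++) {
  map.set(i, i + 1);
  // Log only near the suspected boundary so the last successful insert is visible.
  if (i > 16777200) { console.log(i); }
}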

Question. Is there a documented limit on the number of entries in Node's Map close to 2^24? Is there any way to raise that limit?

(NB Running Node as node --max-old-space-size=4096 doesn't prevent the crash, since Node is using far less than 4 GB of RAM.)

(NB 2. I don't think this is a hash collision issue, since in my actual code the map contains (short-ish) strings rather than numbers.)

(NB 3. Running the above programs in Firefox's JavaScript console does not kill Firefox; Firefox keeps adding entries well past 30 million. However, Chrome crashes just like Node, so this is likely a V8 limitation.)

V8 developer here. I can confirm that 2^24 is the maximum number of entries in a Map. That's not a bug, it's just the implementation-defined limit.

The limit is determined by:

  • The FixedArray backing store of the Map has a maximum size of 1 GB (independent of the overall heap size limit)
  • On a 64-bit system that means 1 GB / 8 B = 2^30 / 2^3 = 2^27 ≈ 134M maximum elements per FixedArray
  • A Map needs 3 elements per entry (key, value, next bucket link) and has a maximum load factor of 50% (to avoid the slowdown caused by many bucket collisions), and its capacity must be a power of 2. 2^27 / (3 * 2) rounded down to the next power of 2 is 2^24, which is the limit you observe (see the quick check after this list).
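
A quick sanity check of that arithmetic in plain JavaScript (the constants below are the ones quoted above, not values read out of V8):

// 1 GB FixedArray cap, 8-byte slots on a 64-bit build,
// 3 slots per entry, 50% maximum load factor, capacity a power of 2.
const fixedArrayBytes = Math.pow(2, 30);
const slots = fixedArrayBytes / 8;                   // 2^27
const rawEntries = slots / (3 * 2);                  // ~22.4 million
const maxEntries = Math.pow(2, Math.floor(Math.log2(rawEntries)));
console.log(maxEntries);                             // 16777216 === 2^24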

FWIW, there are limits to everything: besides the maximum heap size, there's a maximum String length, a maximum Array length, a maximum ArrayBuffer length, a maximum BigInt size, a maximum stack size, etc. Any one of those limits is potentially debatable, and sometimes it makes sense to raise them, but the limits as such will remain. Off the top of my head I don't know what it would take to bump this particular limit by, say, a factor of two, and I also don't know whether a factor of two would be enough to satisfy your expectations.
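
For a couple of those limits, Node exposes the concrete values directly; a small sketch (the printed numbers vary across Node/V8 versions):

const { constants } = require('buffer');
console.log(constants.MAX_STRING_LENGTH); // longest String V8 will allocate
console.log(constants.MAX_LENGTH);        // largest Buffer Node will allocate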

What's interesting is that if you change your code to create two Map objects and insert into them simultaneously, they both crash at exactly the same point, 16.7 (million):

var N = Math.pow(2, 26);
var m1 = new Map();
var m2 = new Map();

for (var i = 0; i < N; i++) {
  m2.set(i, i + 1);
  m1.set(i, i + 1);

  if (i % 1e5 === 0) { console.log(m1.size / 1e6); }
}

There's something odd happening here when more than 2^24 entries are made in any given Map, not globally across all Map objects.

I think you've found a V8 bug that needs to be reported.

I wrote BigMap and BigSet classes that allow going beyond that limit while staying 100% compatible with, and built on, the standard Map and Set: I simply create new Maps (or Sets) when the limit is reached.

const kMaxSize = Math.pow(2, 24)

/* BigMap: public api, compatible with "Map" */
const BigMap = class {
  constructor (...parameters) { this.maps = [new Map(...parameters)] }

  set (key, value) {
    const map = this.maps[this.maps.length - 1]
    if (map.size === kMaxSize) {
      // Current underlying Map is full: start a new one and retry the insert.
      this.maps.push(new Map())
      return this.set(key, value)
    } else {
      return map.set(key, value)
    }
  }

  has (key) { return _mapForKey(this.maps, key) !== undefined }
  get (key) { return _valueForKey(this.maps, key) }

  delete (key) {
    const map = _mapForKey(this.maps, key)
    if (map !== undefined) { return map.delete(key) }
    return false
  }

  clear () { for (let map of this.maps) { map.clear() } }

  get size () {
    let size = 0
    for (let map of this.maps) { size += map.size }
    return size
  }

  forEach (callbackFn, thisArg) {
    if (thisArg) {
      for (let value of this) { callbackFn.call(thisArg, value) }
    } else {
      for (let value of this) { callbackFn(value) }
    }
  }

  entries () { return _iterator(this.maps, 'entries') }
  keys () { return _iterator(this.maps, 'keys') }
  values () { return _iterator(this.maps, 'values') }
  [Symbol.iterator] () { return _iterator(this.maps, Symbol.iterator) }
}

/* private functions */
function _mapForKey (maps, key) {
  for (let index = maps.length - 1; index >= 0; index--) {
    const map = maps[index]
    if (map.has(key)) { return map }
  }
}

function _valueForKey (maps, key) {
  for (let index = maps.length - 1; index >= 0; index--) {
    const map = maps[index]
    const value = map.get(key)
    if (value !== undefined) { return value }
  }
}

function _iterator (items, name) {
  let index = 0
  var iterator = items[index][name]()
  return {
    next: () => {
      let result = iterator.next()
      if (result.done && index < (items.length - 1)) {
        index++
        iterator = items[index][name]()
        result = iterator.next()
      }
      return result
    },
    [Symbol.iterator]: function () { return this }
  }
}

BigMap.length = 0

/* BigSet: public api, compatible with "Set" */
const BigSet = class {
  constructor (...parameters) { this.sets = [new Set(...parameters)] }

  add (key) {
    const set = this.sets[this.sets.length - 1]
    if (set.size === kMaxSize) {
      // Current underlying Set is full: start a new one and retry the insert.
      this.sets.push(new Set())
      return this.add(key)
    } else {
      return set.add(key)
    }
  }

  has (key) { return _setForKey(this.sets, key) !== undefined }

  delete (key) {
    const set = _setForKey(this.sets, key)
    if (set !== undefined) { return set.delete(key) }
    return false
  }

  clear () { for (let set of this.sets) { set.clear() } }

  get size () {
    let size = 0
    for (let set of this.sets) { size += set.size }
    return size
  }

  forEach (callbackFn, thisArg) {
    if (thisArg) {
      for (let value of this) { callbackFn.call(thisArg, value) }
    } else {
      for (let value of this) { callbackFn(value) }
    }
  }

  entries () { return _iterator(this.sets, 'entries') }
  keys () { return _iterator(this.sets, 'keys') }
  values () { return _iterator(this.sets, 'values') }
  [Symbol.iterator] () { return _iterator(this.sets, Symbol.iterator) }
}

/* private function (the original answer repeats _iterator here as well; it is identical to the helper above) */
function _setForKey (sets, key) {
  for (let index = sets.length - 1; index >= 0; index--) {
    const set = sets[index]
    if (set.has(key)) { return set }
  }
}

BigSet.length = 0
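
A short usage sketch for the BigMap above (not part of the original class; note that two nearly full Maps hold well over 1 GB of hash-table backing store, so a run like this may need a raised --max-old-space-size):

const big = new BigMap()
const limit = Math.pow(2, 24)
for (let i = 0; i < 2 * limit; i++) {
  big.set(i, i + 1) // transparently spills into a second underlying Map at 2^24
}
console.log(big.size)         // 33554432
console.log(big.get(123))     // 124
console.log(big.maps.length)  // 2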

I just got this after 48,408,186 elements:

RangeError: Map maximum size exceeded

In Node.js 17, with node --max-old-space-size=8192 script.js.

A regular object {} is doing a lot better.
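
As a hedged sketch: since the overflow here surfaces as a catchable RangeError rather than a fatal crash, one way to find the limit on a given build is to insert until the exception is thrown (with a generous --max-old-space-size as above, otherwise the heap limit may be hit first):

const m = new Map();
let i = 0;
try {
  for (;; i++) m.set(i, i + 1);
} catch (e) {
  // Per the report above: roughly 48 million entries and "Map maximum size exceeded" on Node.js 17.
  console.log(i, e.message);
}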
