Read a huge csv file and populate a Map in Javascript using d3
I am using d3 to load a huge CSV file which I later use for some processing. I want to load the file and populate a Map based on certain conditions.
The csv file that I have is like this:
h1 h2 h3 h4
1 A 3 4
2 A 1 6
1 B 5 7
2 C 8 19
and so on. There may be about 4M+ entries.
I want to populate a Map from this csv data. The map should be like this:
1 A : [3, 4]
2 A : [1, 6]
1 B : [5, 7]
2 C : [8, 19]
The key must be the combination of h1 and h2. The other columns are added as the value.
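For reference, the target structure described above could be built by hand like this (a minimal illustration; the keys are plain strings combining h1 and h2, and the variable name `target` is hypothetical):

```javascript
// Illustration of the desired Map: composite string keys ("h1 h2")
// mapping to arrays of the remaining numeric columns.
const target = new Map();
target.set("1 A", [3, 4]);
target.set("2 A", [1, 6]);
target.set("1 B", [5, 7]);
target.set("2 C", [8, 19]);

console.log(target.get("1 B")); // [5, 7]
```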
I was able to achieve this using the following code:
function makeKey(a, b) {
  return "" + a + " " + b;
}

function csvToMap(csv) {
  let csvMap = new Map();
  for (let i = 0; i < csv.length; i++) {
    let data = csv[i];
    let value = [];
    value.push(parseFloat(data["h3"]));
    value.push(parseFloat(data["h4"]));
    let key = makeKey(data["h1"], data["h2"]);
    csvMap.set(key, value);
  }
  return csvMap;
}

d3.csv(file_url, function(csv) {
  let csvMap = csvToMap(csv);
});
This was working perfectly for files of around 2M entries. But when the size is increased further, the page freezes.

Is there a more efficient way to do this?
Making a dictionary with 4M entries does not crash my computer:
var count = 4 * 1000 * 1000;
var map = {};
for (var i = 0; i < count; ++i) {
  map["" + i + "ABCDEF"[i % 6]] = [Math.random(), Math.random()];
}
//console.log(map);
console.log("done!");
With new Map():
var count = 4 * 1000 * 1000;
var map = new Map();
for (var i = 0; i < count; ++i) {
  map.set("" + i + "ABCDEF"[i % 6], [Math.random(), Math.random()]);
}
//console.log(map);
console.log("done!");
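Since building the Map itself is fast, the bottleneck is likely the intermediate array of 4M row objects that d3.csv materializes before your conversion runs. A minimal sketch of the alternative, assuming comma-separated input with the same h1–h4 header (the helper name `csvTextToMap` is hypothetical, and plain string splitting is used here instead of d3's parser, so it won't handle quoted fields):

```javascript
// Parse the CSV text line by line and populate the Map directly,
// without first building an array of row objects for every row.
function csvTextToMap(text) {
  const map = new Map();
  const lines = text.split("\n");
  // lines[0] is the header row (h1,h2,h3,h4); start at 1.
  for (let i = 1; i < lines.length; i++) {
    const line = lines[i].trim();
    if (!line) continue; // skip blank trailing lines
    const [h1, h2, h3, h4] = line.split(",");
    map.set(h1 + " " + h2, [parseFloat(h3), parseFloat(h4)]);
  }
  return map;
}

const sample = "h1,h2,h3,h4\n1,A,3,4\n2,A,1,6\n1,B,5,7\n2,C,8,19\n";
const m = csvTextToMap(sample);
console.log(m.get("1 A")); // [3, 4]
```

You could fetch the file as text and feed it to a function like this, trading d3's robust CSV handling for a much smaller peak memory footprint.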