
How to load "raw" row data into ag-grid

I'm dealing with a high-throughput problem. My goal is to display, at least in a Chrome browser, a grid composed of 1M rows.

These rows are dynamically fetched from a Python server running on the same machine. This server has already loaded the whole dataset into memory. Communication between the client (the browser) and the server (Python) takes place over a websocket. The grid has the option virtualPaging: true.
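For reference, the setup described above looks roughly like this (a sketch, not the original code: requestRowsOverWebsocket is a hypothetical helper, and the option names follow the virtual-paging datasource API of older ag-grid versions, so they may differ in yours):

var gridOptions = {
    columnDefs: columnDefs,   // one column definition per field, defined elsewhere
    virtualPaging: true,      // rows are fetched page by page as the user scrolls
    paginationPageSize: 100,  // assumption: the page-size option of that API
    datasource: {
        getRows: function(params) {
            // Ask the Python server for rows [startRow, endRow) over the
            // websocket; params.successCallback hands them to the grid.
            requestRowsOverWebsocket(params.startRow, params.endRow,
                                     params.successCallback);
        }
    }
};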

So far I've achieved fairly good performance loading pages of 100 rows each. Despite that, loading the whole 1M-row dataset at the start (thus without any remote fetching of rows) shows a significant improvement in scrolling (no "white rows" effect).

I want to achieve the same performance without storing the whole dataset in the browser's memory.

The first step I would try is to avoid some conversion steps. The client receives from the server an array of arrays, which means the row model on the server is "positional" (given a generic row r, r[0] is the element for the first column, r[1] for the second, and so on). But ag-grid's successCallback requires an array of objects, which means each row carries the column names as keys (given a generic row r, r["firstColumn"] is the element for the first column, r["secondColumn"] for the second, and so on).

The second representation is totally infeasible from the server's perspective, given the huge amount of memory wasted by the key-value mechanism. This forces a local conversion of each page received by the client:

client.appendCallback("message", function(message) {
    message = JSON.parse(message.data);
    switch (message.command) {
        case "GetRows":
            if (message.res.code == 0) {
                // Convert each positional row (array) into a keyed object,
                // since successCallback expects an array of objects.
                var bulk = [];
                var arr = message.res.data;
                for (var i = 0, len = arr.length; i < len; i++) {
                    bulk[i] = {"par1": arr[i][0], "par2": arr[i][1],
                               "par3": arr[i][2], "par4": arr[i][3],
                               "par5": arr[i][4], "par6": arr[i][5]};
                }
                _data.successCallback(bulk);
            }
            break;
        default:
            break;
    }
}, "ws");

What I need is a way to pass the rows to successCallback as arrays rather than as objects, skipping the conversion step entirely, like this:

client.appendCallback("message", function(message) {
    message = JSON.parse(message.data);
    switch (message.command) {
        case "GetRows":
            if (message.res.code == 0) {
                // Pass the positional rows straight through, no per-row conversion.
                _data.successCallback(message.res.data);
            }
            break;
        default:
            break;
    }
}, "ws");
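One direction that might make this possible (a sketch, and an assumption rather than a confirmed ag-grid guarantee: I'm assuming the grid never requires row data to be objects as long as every column supplies its own valueGetter): build the column definitions so that each column reads its cell straight out of the positional array.

var columnNames = ["par1", "par2", "par3", "par4", "par5", "par6"];
var columnDefs = columnNames.map(function(name, index) {
    return {
        headerName: name,
        // Read the cell value directly from the row array by position,
        // instead of looking it up by key through a field name.
        valueGetter: function(params) { return params.data[index]; }
    };
});

With column definitions like these, the handler above could call _data.successCallback(message.res.data) unchanged.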

Any help will be appreciated.

What about this?

Fix the pageSize at something like 100.

Since you use server-side paging, you have implemented your own datasource: so when you're asked to load data, load [and convert] something like 10,000 rows and store them in memory.

Then use your own intermediary paging: each time the grid asks for the next rows, either get them from memory or fetch the next 10k rows, and [convert and] return only the first hundred.

The [convert] part is your choice: place the conversion either when loading from the server or when serving the next 100 rows.
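A minimal sketch of that intermediary layer (all names here are illustrative, not from the original post; fetchBlockOverWebsocket stands in for whatever request mechanism you already have):

var BLOCK_SIZE = 10000;   // rows fetched from the server per request
var cacheStart = -1;      // index of the first cached row, -1 = empty cache
var cachedRows = [];      // up to BLOCK_SIZE rows, already converted if you choose to

function getPage(startRow, endRow, successCallback) {
    if (cacheStart !== -1 && startRow >= cacheStart &&
            endRow <= cacheStart + cachedRows.length) {
        // Cache hit: serve the requested 100-row slice from memory.
        successCallback(cachedRows.slice(startRow - cacheStart, endRow - cacheStart));
    } else {
        // Cache miss: fetch the next 10k block, cache it, then serve the slice.
        fetchBlockOverWebsocket(startRow, BLOCK_SIZE, function(rows) {
            cacheStart = startRow;
            cachedRows = rows;
            successCallback(cachedRows.slice(0, endRow - startRow));
        });
    }
}

This way the grid still sees 100-row pages, but only one in every hundred page requests actually crosses the websocket.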

If the amount of data is huge and you consider deploying this beyond your local computer, gzip-compressed data is supported transparently (by Angular or the browser, I don't know exactly which layer handles it).
