
How to load huge text data comprising more than 150MB in the browser

Log data comprising over 2,000,000 lines, with a potential size of over 150MB, is fetched from an API and fed to JavaScript. The intention is to show this log data in the browser and provide filter, search, and next/previous navigation options, like the Notepad++ application.

When such huge data is fed in, the search and filter options consume a lot of heap memory, and after 4 to 5 searches or filter operations the browser crashes. What frontend library should be used in order to achieve this? Does React have any library to show the log data with all these filtering, search, and navigation options?

I'm reading the log using Ajax, and the log data is stored in the variable **log_data**.

var log_loader = {
    init: function () {
        $.ajax({
            url: "/fetch_log_data_file",
            async: true,
            success: function (response) {
                // Split the raw response into one array entry per line
                log_data = response['data'].split('\n');
                update_filter_options();
            }
        });
    }
};
log_loader.init();

The fetched data is rendered in the browser using a pre tag for each line, as follows, in order to look elegant. Here the div_data variable accumulates the HTML, which is then rendered into LogDivision, the element that holds it.

var div_data = '';

// log_lines is the array index, elem is the line's text
$.each(log_data, function (log_lines, elem) {
    div_data += '<pre class="line_" id="line_' + log_lines + '">' + elem + '</pre>';
});

// Note: the line text is inserted without HTML escaping, so a line
// containing '<' or '&' will break the markup.
$('#LogDivision').html(div_data);

The above code works for 300,000 to 500,000 lines, but beyond that it takes a lot of time and sometimes runs out of memory. What open-source frontend or jQuery library should be used to load 2,000,000 lines of log data and also provide an option to search for text and show only those lines which contain the searched word?

This is not a question of the technology used.

It does not matter which framework, library, etc. you use. With this code the result will be the same.

The browser gets overloaded because you are injecting too many DOM elements. For the technology itself, processing this much data is nothing, but for the browser it is too much.

You have to work around this with virtual scrolling or load-on-scroll, as sketched below. This has its downside: you will not be able to find elements in the DOM until you have scrolled far enough for a given element to get loaded, which is bad if you want to perform filtering on the data.
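For illustration, a bare-bones virtual scroll could look something like this sketch, which renders only the lines visible in the viewport and uses a tall spacer to keep the scrollbar proportional. LINE_HEIGHT, render_visible, and the spacer element are assumptions for this sketch, not part of the question's markup; the container is assumed to have a fixed height and overflow:auto set in CSS.

var LINE_HEIGHT = 16;                          // assumed fixed row height in px
var $viewport = $('#LogDivision');             // scrollable container (fixed height, overflow:auto)
var $spacer = $('<div>')
    .css({ height: log_data.length * LINE_HEIGHT, position: 'relative' })
    .appendTo($viewport.empty());              // spacer keeps the scrollbar height correct

function render_visible() {
    var first = Math.floor($viewport.scrollTop() / LINE_HEIGHT);
    var count = Math.ceil($viewport.height() / LINE_HEIGHT) + 1;
    var last = Math.min(first + count, log_data.length);
    var html = '';
    for (var i = first; i < last; i++) {
        html += '<pre class="line_" id="line_' + i + '" style="position:absolute;' +
                'top:' + (i * LINE_HEIGHT) + 'px;margin:0">' + log_data[i] + '</pre>';
    }
    $spacer.html(html);                        // only a viewport's worth of DOM at any time
}

$viewport.on('scroll', render_visible);
render_visible();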

You can load the data without crashing the browser if you insert it on an interval, say 500 rows every 2 seconds, or something like that; see the sketch below.
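A minimal sketch of that chunked approach, reusing log_data and LogDivision from the question's code (CHUNK_SIZE and CHUNK_DELAY are just the example figures from above):

var CHUNK_SIZE = 500;       // rows per batch
var CHUNK_DELAY = 2000;     // ms between batches
var next_line = 0;

var chunk_timer = setInterval(function () {
    var end = Math.min(next_line + CHUNK_SIZE, log_data.length);
    var html = '';
    for (var i = next_line; i < end; i++) {
        html += '<pre class="line_" id="line_' + i + '">' + log_data[i] + '</pre>';
    }
    // append() adds to the container instead of rebuilding it each time
    $('#LogDivision').append(html);
    next_line = end;
    if (next_line >= log_data.length) {
        clearInterval(chunk_timer);
    }
}, CHUNK_DELAY);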

The thing is that it is not only a question of loading the data. Even if you load the data without a problem, the browser will be pretty slow when you perform a search on it.
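One way to keep search usable is to scan the in-memory log_data array instead of the DOM, and render only the matching lines. A rough sketch, where filter_log is a hypothetical helper, not an existing API:

// Hypothetical helper: search the array in memory, not the DOM nodes,
// and render only the lines that match.
function filter_log(term) {
    var html = '';
    for (var i = 0; i < log_data.length; i++) {
        if (log_data[i].indexOf(term) !== -1) {
            html += '<pre class="line_" id="line_' + i + '">' + log_data[i] + '</pre>';
        }
    }
    // If the match set is itself huge, this still injects too much DOM
    // and would need the same chunking or virtual-scroll treatment.
    $('#LogDivision').html(html);
}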

And depending on the client machine, even moving the browser window will cause refresh-rate problems and small delays.

All in all, you don't want to load this much data at once. You have to find a workaround.
