
how do I measure the load time of a large array?

I have one rather large array in a JavaScript widget I'm writing, and I want to find out whether it would be more efficient in terms of browser resources to leave it as one array or to break it into several smaller arrays. Can anyone recommend a method of testing the difference?

You could use Firefox or Chrome and do a console.log(new Date().toString()) before and after the different methods you are trying to compare. I don't think breaking up the routine will reduce the time, though.
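For short intervals, performance.now() gives sub-millisecond resolution and is steadier than comparing Date objects. A minimal sketch of timing the array fill this way (the 200000-element 'test' array mirrors the example below; the variable names are illustrative):

```javascript
// Record a high-resolution timestamp before and after the work.
const t0 = performance.now();

const arr = [];
for (let i = 0; i < 200000; i++) {
    arr.push('test');
}

const t1 = performance.now();

// Elapsed time in milliseconds, with fractional precision.
console.log('fill took ' + (t1 - t0).toFixed(2) + ' ms');
```

The same pattern works around any candidate approach you want to compare; just make sure both versions do equivalent work between the two timestamps.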

You can always use console.time('timerName') and console.timeEnd('timerName') to start a timer and measure the elapsed time between two points in your JavaScript code, then compare the results:

console.time('BigArray');
var arr1 = [];
for(var i=0; i<200000; i++){
    arr1.push('test');
}
console.timeEnd('BigArray');


console.time('SeveralArrays');
for(var i=0; i<200000; i++){
    var arr2 = ['test'];
}
console.timeEnd('SeveralArrays');

the output will be something like:

BigArray: 123ms
SeveralArrays: 456ms

Just splitting the arrays won't improve processing speed.

I suggest using Web Workers. They will offload the task to a separate thread, which could speed things up if you split the task into chunks. It's totally dependent on how many cores are available, though.

It will also prevent the UI from hanging if you're processing really huge sets of data. Very useful from a UX perspective.
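As a rough sketch of that approach (the chunk helper and the worker script name are illustrative, not from the original answers), the data could be split into chunks and each chunk handed to its own worker:

```javascript
// Sketch only: splitIntoChunks and 'process-worker.js' are hypothetical names.
// Splits an array into fixed-size pieces so each can be processed separately.
function splitIntoChunks(arr, size) {
    var chunks = [];
    for (var i = 0; i < arr.length; i += size) {
        chunks.push(arr.slice(i, i + size));
    }
    return chunks;
}

var data = [];
for (var i = 0; i < 200000; i++) {
    data.push('test');
}
var chunks = splitIntoChunks(data, 50000);

// In a browser that supports Web Workers, each chunk is processed off the
// main thread, so the UI stays responsive while the work runs.
if (typeof Worker !== 'undefined') {
    chunks.forEach(function (chunk) {
        var worker = new Worker('process-worker.js'); // hypothetical worker script
        worker.onmessage = function (e) {
            console.log('chunk done:', e.data);
            worker.terminate();
        };
        worker.postMessage(chunk);
    });
}
```

How much this helps depends on the number of available cores and on whether the per-chunk work is heavy enough to outweigh the cost of copying the data to each worker via postMessage.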

I have used Parallel.js before, and I like it. It provides a fallback for browsers that don't support Web Workers.

http://adambom.github.io/parallel.js/

