
Reduce requested file size or reduce number of browser calculations?

I have some data that I want to display on a web page. There's quite a lot of it, so I need to figure out the most efficient way of loading and parsing it. In CSV format the file is 244 KB; in JSON it's 819 KB. As I see it, I have three options:

  1. Load the web page and fetch the data in CSV format via an Ajax request. Then transform the data into a JS object in the browser (I'm using a built-in method of the D3.js library to accomplish this).
  2. Load the web page and fetch the data in JSON format via an Ajax request. The data is ready to use as is.
  3. Hard-code the data in the main JS file as a JS object. No need for any async requests.
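To make the client-side transform in option 1 concrete, here is a minimal sketch of the kind of CSV-to-objects conversion involved. In practice `d3.csv()` / `d3.csvParse()` does this (and correctly handles quoting and escaping); this naive version is only illustrative and assumes no quoted fields or embedded commas.

```javascript
// Naive CSV-to-objects transform: first line supplies the keys,
// each remaining line becomes one object (all values stay strings).
function csvToObjects(text) {
  const lines = text.trim().split("\n");
  const headers = lines[0].split(",");
  return lines.slice(1).map(line => {
    const values = line.split(",");
    const row = {};
    headers.forEach((h, i) => { row[h] = values[i]; });
    return row;
  });
}

const sample = "name,value\nalpha,1\nbeta,2";
console.log(csvToObjects(sample));
// yields [{name: 'alpha', value: '1'}, {name: 'beta', value: '2'}]
```

Note that every value comes back as a string, so numeric columns still need an explicit conversion afterwards — part of the per-row work option 1 pays for.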

Method one has the advantage of a smaller file, but the disadvantage of having to loop through all 2,700 rows of data in the browser. Method two delivers the data in its final format, so no heavy client-side processing is needed; however, the JSON file is much larger. Method three skips the extra request to the server, at the cost of a longer initial page load.

What method is the best one in terms of optimization?

In my experience, data-processing time in JavaScript is usually dwarfed by transfer time and the time it takes to render the display. Based on this, I would recommend option 1.

However, what's best in your particular case really does depend on your particular case -- you'll have to measure. It sounds like you already have all the code and data you need, so why not run a simple experiment to see which option works best for you?
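Such an experiment can be as simple as timing each strategy end to end. A minimal harness might look like this; in the browser you would wrap each loading strategy (the D3 CSV fetch, the JSON fetch, the inline data) in one of the async thunks, while here a toy parsing workload stands in so the sketch runs anywhere:

```javascript
// Time an async operation and report the elapsed milliseconds.
async function timeIt(label, fn) {
  const t0 = Date.now();
  await fn();
  const ms = Date.now() - t0;
  console.log(`${label}: ${ms} ms`);
  return ms;
}

// Toy stand-in for "fetch + parse 2700 rows"; in the real test this
// would be e.g. () => d3.csv("data.csv") or () => fetch("data.json").
function parseWork() {
  const lines = [];
  for (let i = 0; i < 2700; i++) lines.push(i + ",row" + i);
  return lines.join("\n").split("\n").map(line => line.split(","));
}

timeIt("parse 2700 rows", async () => parseWork());
```

Run each candidate a few times (and with the browser cache disabled) so one slow network round trip doesn't skew the comparison.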
