
What are the limits to pushing JavaScript performance?

I have been building a prototype page over the past few months that uses a lot of SVG and has a lot of elements in general. There is also a ton of data being processed both in JavaScript and server-side (lots of AJAX). There are thousands of event listeners on the page. It's pretty heavy, is the point.

One of the biggest hurdles to doing something like this in JS is the single-threadedness, which locks the page when I have to perform, say, 10 seconds of calculations. There are some strategies for remedying that, but until Web Workers are supported by IE there isn't much of an elegant solution. Also, the page can use upwards of 500MB of memory, which Chrome seems to struggle with at times.

What I'm wondering about is the feasibility of building something like this in JavaScript. My code is far from optimized, but let's just assume that the load this page handles now is what it requires - or let's say it requires more.

Let's also assume the user will be required to have at least a mid-range desktop to use the application.

Are people pushing JavaScript this hard? What are the limits to what it can be expected to handle, in terms of memory and CPU performance? How much should be done client-side versus server-side?

EDIT: I guess it was inevitable that everyone would misinterpret the question. I'm not asking for advice on how to optimize JS code. I'm asking how much processing and data it is reasonable to handle on the client. YES this is dependent on hardware, which I tried to address by specifying a mid-range desktop with the newest browsers, but really that's not the point. I want to know, conceptually, how powerful JavaScript is for doing heavy processing. Is it viable at all to do heavy processing in JavaScript?

I hope everyone gets it now. It's a ratio of server-side versus client-side. If I have to run a loop with 1,000,000 iterations, and ASSUMING there is no cost in the choice between doing X iterations in JS and Y iterations on the server, how much is it reasonable to expect JavaScript to handle?

1) Surely your thousands of event listeners could be consolidated through event bubbling. A single master event handler that dispatches to different subroutines based on the event target will perform better than numerous specific handlers.
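A minimal sketch of that delegation pattern, assuming a container element, a data-action attribute, and two handler functions that are all hypothetical names for illustration (older IE would need attachEvent instead of addEventListener):

```javascript
// One delegated listener on a common ancestor replaces thousands of per-element listeners.
var container = document.getElementById('chart'); // hypothetical container element

container.addEventListener('click', function (event) {
  var target = event.target;

  // Walk up from the clicked node until we find something that declares an action.
  while (target && target !== container &&
         !(target.getAttribute && target.getAttribute('data-action'))) {
    target = target.parentNode;
  }
  if (!target || target === container) {
    return; // nothing actionable was clicked
  }

  // Dispatch to a subroutine based on the target, instead of one handler per element.
  switch (target.getAttribute('data-action')) {
    case 'select-node':
      selectNode(target);   // hypothetical handler
      break;
    case 'open-details':
      openDetails(target);  // hypothetical handler
      break;
  }
}, false);
```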

2) "until Web Workers are supported by IE there isn't much of an elegant solution."

Au contraire, mon frère: freezing the browser can be mitigated by doing the processing in smaller chunks (I'd try to keep it under 100ms for each callback, if at all possible) and executing the next step after a timeout, which gives the browser a chance to update its state and process user input.
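A rough sketch of that chunking idea, with a hypothetical processInChunks helper and a time budget chosen to stay under the ~100ms target:

```javascript
// Process a large array in slices, yielding to the browser between slices so it
// can repaint and handle user input. `items`, `processItem`, and `onDone` are
// stand-ins for your real data and callbacks.
function processInChunks(items, processItem, onDone) {
  var index = 0;
  var CHUNK_BUDGET_MS = 50; // rough time budget per slice

  function step() {
    var start = new Date().getTime();
    while (index < items.length &&
           (new Date().getTime() - start) < CHUNK_BUDGET_MS) {
      processItem(items[index], index);
      index++;
    }
    if (index < items.length) {
      setTimeout(step, 0); // give the browser a turn, then continue where we left off
    } else if (onDone) {
      onDone();
    }
  }

  step();
}

// Usage: processInChunks(bigArray, doHeavyWork, function () { renderResults(); });
```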

3) If you have a huge number of elements, it sounds like the HTML5 Canvas element is a more appropriate solution than SVG.
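For illustration, a small sketch of why canvas scales better for raw element counts (the canvas id is hypothetical): each shape is just a draw call, not a DOM node the browser has to track.

```javascript
// 10,000 points drawn on a canvas: no DOM nodes, no per-shape listeners.
// With SVG, each of these would be an element the browser has to manage.
var canvas = document.getElementById('plot'); // hypothetical <canvas> element
var ctx = canvas.getContext('2d');

ctx.fillStyle = '#36c';
for (var i = 0; i < 10000; i++) {
  ctx.fillRect(Math.random() * canvas.width, Math.random() * canvas.height, 2, 2);
}
```

The trade-off is that you give up the SVG DOM: hit-testing and per-shape events have to be handled by hand.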

4) "My code is far from optimized"

Algorithmic optimizations make all the difference when you're pushing the limits like this.

5) DOM access is very expensive, so huge gains can be made by cleverly minimizing the number of DOM operations. Be sure you're not touching each element, one at a time. Better to reconstruct the whole mess and replace it all in one DOM manipulation.
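A hedged sketch of that "build it all, insert once" approach, using placeholder data and a hypothetical target list:

```javascript
var items = [{ label: 'alpha' }, { label: 'beta' }, { label: 'gamma' }]; // placeholder data
var list = document.getElementById('results');                          // hypothetical target

// Build everything off-DOM first; the fragment lives outside the document,
// so none of these appends trigger layout or repaint.
var fragment = document.createDocumentFragment();
for (var i = 0; i < items.length; i++) {
  var li = document.createElement('li');
  li.appendChild(document.createTextNode(items[i].label));
  fragment.appendChild(li);
}

list.innerHTML = '';          // clear the old contents once
list.appendChild(fragment);   // one DOM insertion instead of one per item
```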

A hurdle you can face, and can't actually do anything about, is the user's system. For a user still running a Pentium with 512MB of RAM and, to add insult to injury, IE6, web apps will grind. Another problem is the browser itself: the DOM is slow, so you should avoid touching it as much as possible.

What you can do is improve your code: find spots that eat up memory or do too much processing and break them down. For example, single-threadedness can currently be remedied by using timeouts and callbacks. One of my demos processes a very long loop two ways: one version does the work synchronously, and the other uses timeouts to simulate an asynchronous operation.

You could also off-load your data and processing to the server, making your client-side app a "thin client". The HTTP requests have a cost, but you can treat your server as a "second thread" while you do something else in your app. Games are a good example: compute scores, rankings, match-ups and everything else on the server. Don't let the client do that; just make the client a "display" for everything going on on the server.
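As a rough sketch of that split, assuming a hypothetical endpoint, payload, and showRankings rendering function:

```javascript
// Send the heavy work to the server and keep the UI responsive while waiting.
function computeOnServer(payload, onResult) {
  var xhr = new XMLHttpRequest();
  xhr.open('POST', '/api/compute-rankings', true); // hypothetical endpoint; true = async
  xhr.setRequestHeader('Content-Type', 'application/json');
  xhr.onreadystatechange = function () {
    if (xhr.readyState === 4 && xhr.status === 200) {
      onResult(JSON.parse(xhr.responseText));
    }
  };
  xhr.send(JSON.stringify(payload));
}

// The client just displays the result; the server did the expensive part.
computeOnServer({ matchId: 42 }, function (rankings) {
  showRankings(rankings); // hypothetical rendering function
});
```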

There are no limits that are written in stone.

What can be done on my computer, versus the machine I look up recipes on, versus my 4-year-old's netbook will differ. Memory, speed, etc. depend on the browser, CPU, RAM, and whatever else is running on the machine. I bet that if you ran your code on some other platforms it would freeze, and you would have to do the three-finger salute to kill the process.

  • Do smart event handling: detect clicks at a common ancestor through delegation rather than attaching a handler to every element.
  • Push as much as you can on the server for intensive processing.
  • Optimize code; make sure you are not updating the screen on every iteration of a loop (see the sketch after this list).
  • Combine/minimize http requests when possible.
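For the screen-update point above, a small sketch of the difference; data, renderRow, and the output element are hypothetical:

```javascript
var output = document.getElementById('output'); // hypothetical target element

// Anti-pattern: output.innerHTML += renderRow(data[i]) inside the loop makes the
// browser reserialize and reparse the element's markup on every iteration.

// Better: accumulate in memory, touch the screen once when the loop is done.
var parts = [];
for (var i = 0; i < data.length; i++) {
  parts.push(renderRow(data[i])); // data and renderRow are hypothetical
}
output.innerHTML = parts.join(''); // single screen update after the loop
```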
