
JavaScript and website loading time optimization

I know that the best practice for including JavaScript is to keep all the code in a separate .js file and let browsers cache that file.

But when we begin to use many jQuery plugins, each with its own .js file, and our functions depend on them, wouldn't it be better to dynamically load only the functions and the .js files required by the current page?

Wouldn't a page be faster if, when I only need one function, I load it dynamically by embedding it in the HTML with a script tag, instead of loading the whole .js file along with the plugins?

In other words, aren't there cases where there are better practices than keeping all our JavaScript code in a separate .js file?

One problem with having separate .js files is that they cause more HTTP requests.

Yahoo has a good best-practices guide on speeding up your site: http://developer.yahoo.com/performance/rules.html

I believe Google's Closure Library has something for combining JavaScript files and resolving dependencies, but I haven't looked too much into it yet, so don't quote me on it: http://code.google.com/closure/library/docs/calcdeps.html

There is also a tool called jingo ( http://code.google.com/p/jingo/ ), but again, I haven't used it yet.

It would seem at first glance that this would be a good idea, but in fact it would make matters worse. For example, if one page needs plugins 1, 2 and 3, a file would be built server-side with those plugins in it. Now, the browser goes to another page that needs plugins 2 and 4. This causes another file to be built; it is different from the first one, but it also contains the code for plugin 2, so the same code ends up being downloaded twice, bypassing the copy the browser already has.

You are best off leaving the caching to the browser, rather than trying to second-guess it. However, there are options to improve things.

Top of the list is using a CDN. If the plugins you are using are fairly popular, chances are they are already hosted on a CDN. If you link to the CDN-hosted plugins, then a first-time visitor to your site who has already visited another site using the same plugins from the same CDN will have them cached already.
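For example, jQuery itself can be loaded from Google's CDN; any visitor whose browser has already cached that exact URL from another site skips the download entirely (the version number here is just illustrative):

<script src="https://ajax.googleapis.com/ajax/libs/jquery/1.7.2/jquery.min.js"></script>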

There are, of course, other things you can do to speed your JavaScript up. Best practice includes placing all your script tags as close to the bottom of the document as possible, so as not to hold up page rendering. You should also look into lazy initialization: for anything that needs significant setup, attach a minimalist event handler that, when triggered, removes itself and sets up the real event handler.
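A minimal sketch of that lazy-initialization pattern, assuming jQuery is loaded; the element id and the expensive setup function are hypothetical:

function setUpRichEditor(el) {
  // ...the expensive real setup: toolbars, plugins, etc. (placeholder)
}

jQuery(function($){
  // .one() attaches a handler that jQuery removes after its first invocation,
  // so the heavy setup only runs if and when the field is actually focused.
  $('#comment-box').one('focus', function () {
    setUpRichEditor(this);
  });
});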

I keep separate files for each plug-in and page during development, but in production I merge-and-minify all my JavaScript files into a single JS file loaded uniformly throughout the site. My main layout file in my web framework (Sinatra) uses the deployment mode to automatically either generate script tags for all JS files (in order, based on a manifest file) or perform the minification and emit a single, querystring-timestamped script tag.
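The author's build runs in Ruby/Sinatra, but as a rough sketch of the manifest-driven merge step, here is the same idea in Node (file names are hypothetical; minification is left out):

// build.js - concatenate the files listed in a manifest, in order
var fs = require('fs');

var manifest = ['js/jquery.min.js', 'js/plugins.js', 'js/site.js'];
var bundle = manifest.map(function (f) {
  return fs.readFileSync(f, 'utf8');
}).join(';\n'); // the ';' guards against files that omit a trailing semicolon

fs.writeFileSync('public/all.js', bundle);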

Every page is given a body tag with a unique id, e.g. <body id="contact">.

For those scripts that need to be specific to a particular page, I either modify the selectors to be prefixed with the body id:

$('body#contact form#contact').submit(...);

or (more typically) I have the onload handlers for that page bail early:

jQuery(function($){
  if (!$('body#contact').length) return;
  // Do things specific to the contact page here.
});

Yes, including code (or even a plug-in) that may only be needed by one page of the site is inefficient if the user never visits that page. On the other hand, after the initial load the entire site's JS is ready to roll from the cache.

Network latency is the main problem.
You can get a very responsive page if you reduce the HTTP calls to one.

That means all the JS and CSS are bundled into the HTML page.
And if you can forget about IE6/7, you can embed the images as data:image/png;base64 URIs.
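For instance, a small image can be inlined so it costs no extra request (the base64 payload below is a truncated placeholder, not a real image):

<img src="data:image/png;base64,iVBORw0KGgoAAAANSUhEUg..." alt="logo">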

When we release a new version of our web app, a shell script minifies and bundles everything into a single HTML page. Then there is a second call for the data, and we render all the HTML client-side using a JS template library: PURE
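As a generic sketch of that flow (the author uses the PURE template library; this plain-jQuery version, with a hypothetical /data endpoint, just illustrates the second call and client-side rendering):

// fetch the data after the single HTML page has loaded, then build the DOM
$.getJSON('/data', function (items) {
  var $list = $('#news');
  $.each(items, function (i, item) {
    $list.append($('<li>').text(item.title));
  });
});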

Ensure the page is cached and gzipped. There is probably a size limit to consider:
we try to stay under 400 KB unzipped, and load secondary resources later when needed.

I would recommend joining common bits of functionality into individual JavaScript module files and loading them only on the pages where they are used, via RequireJS, head.js, or a similar dependency-management tool.

An example: if you use lightbox popups, contact forms, tracking, and image sliders in different parts of the website, separate these into four modules and load each only where needed. That way you optimize caching and make sure your site carries no unnecessary flab.
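A minimal sketch of one such module with RequireJS (module names and paths are hypothetical):

// modules/lightbox.js - one self-contained module per feature
define(['jquery'], function ($) {
  return {
    init: function () {
      // wire up the lightbox popups here (placeholder)
    }
  };
});

// Only on pages that actually use lightboxes:
require(['modules/lightbox'], function (lightbox) {
  lightbox.init();
});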

As a general rule it's always best to have fewer files rather than more. It's also important to work on the timing of each JS file, as some are needed BEFORE the page finishes loading and some AFTER (i.e., when the user clicks something).

See a lot more tips in the article 25 Techniques for Javascript Performance Optimization, which includes a section on managing JavaScript file dependencies.

Cheers, hope this is useful.

You can also try a service like http://www.blaze.io . It automatically performs most front-end optimization tactics and also couples in a CDN.

They're currently in private beta, but it's worth submitting your website to.
