Making Slow Websites Faster, Quickly

Intechnica recently hosted an event called Faster Websites, aimed at discussing with retailers some of the methods they can adopt to improve the performance of their online presence.

As part of the preparation for this event, we evaluated the websites of the potential attendees as well as those of the top 50 leading retail sites in the UK.

As expected, there was a wide range of results, from the very fast to the quite slow.

I had a look into the performance of some of the slower sites to see if there were any quick wins I could propose to improve their speed. I did a very limited investigation using WebPageTest, under what I assume were normal traffic conditions, and came up with the following observations.

Most follow general good practice

With very few exceptions, most were doing the obvious things (minifying JavaScript, compressing content, using a CDN, etc.). This suggested that there was unlikely to be a simple, configuration-based fix for the slowness.

Slowness was caused by client-side, not server-side, issues

None of the sites spent more than 0.5 seconds waiting for a server response, indicating that the servers were not struggling to return content. This is to be expected for a site homepage that is not under load.
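
As a quick cross-check of this kind of figure without a full WebPageTest run, the browser itself can report server response time. A minimal sketch using the standard Navigation Timing API, run from the console of a loaded page:

```javascript
// A minimal sketch of checking time to first byte from the browser
// console, using the standard Navigation Timing API. Run it on a
// fully loaded page.
const [nav] = performance.getEntriesByType('navigation');
if (nav) {
  // Time from sending the request to receiving the first response byte.
  const ttfb = nav.responseStart - nav.requestStart;
  console.log(`Time to first byte: ${ttfb.toFixed(0)} ms`);
}
```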

Very large page weights – especially JavaScript

Much of the slowness was caused by simple page weight issues. All of these sites were making well over 100 requests, with some making over 200.
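
A quick way to get this kind of count for any page is the Resource Timing API. A minimal sketch, run from the browser console:

```javascript
// A minimal sketch of auditing request count and page weight from the
// browser console, using the Resource Timing API. Note: transferSize is
// 0 for cached resources and for cross-origin resources that don't send
// a Timing-Allow-Origin header, so treat the total as a lower bound.
const resources = performance.getEntriesByType('resource');
const totalKB = resources.reduce((sum, r) => sum + (r.transferSize || 0), 0) / 1024;
console.log(`${resources.length} requests, ~${totalKB.toFixed(0)} KB transferred`);
```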

The largest chunk of this weight was images, as expected. As these are retail sites, there is an argument that high-quality imagery is essential for the business. However, one site was requesting close to 70 images totalling 3.5 MB of data. It would certainly be worth investigating whether these images could be compressed, loaded asynchronously, or simply removed.
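
For the asynchronous-loading option, one common pattern is to defer offscreen images until they are about to scroll into view. A minimal sketch using IntersectionObserver; the data-src markup convention is an assumption of the example, not something observed on the sites tested:

```javascript
// A minimal sketch of lazy-loading images with IntersectionObserver.
// It assumes each deferred image is marked up as <img data-src="...">
// with no src attribute, which is a convention of this example rather
// than anything observed on the sites tested.
const observer = new IntersectionObserver((entries, obs) => {
  for (const entry of entries) {
    if (entry.isIntersecting) {
      const img = entry.target;
      img.src = img.dataset.src; // start the download only now
      obs.unobserve(img);        // each image only needs loading once
    }
  }
}, { rootMargin: '200px' }); // start fetching just before images scroll into view

document.querySelectorAll('img[data-src]').forEach(img => observer.observe(img));
```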

Of more concern to me across all these slow-loading sites was the size and number of JavaScript files being requested. Sites were requesting over 40 distinct JavaScript files, and totals of 300 KB+ were common, with one site topping 600 KB of JavaScript content. In most cases this JavaScript had already been minified and compressed. In all these cases the use of JavaScript should be fully investigated and rationalised.
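
As a starting point for that rationalisation, the Resource Timing API can list a page's script requests by size. A minimal sketch, run from the browser console:

```javascript
// A minimal sketch of listing a page's script requests by transfer size
// (largest first), as a starting point for rationalising them. Sizes may
// read as 0 for cached or opaque cross-origin responses.
performance.getEntriesByType('resource')
  .filter(r => r.initiatorType === 'script')
  .sort((a, b) => b.transferSize - a.transferSize)
  .forEach(r => console.log(`${(r.transferSize / 1024).toFixed(1)} KB  ${r.name}`));
```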

CSS and even HTML files were similarly large (50 KB+) and could equally be rationalised.

Complex DOMs

Most of the slower sites had more complex DOMs, often topping 2,000 elements. This does not necessarily cause a problem, but when combined with complex JavaScript and content manipulation it can easily lead to slowdown.

In the examples I tested, this showed up in how long the startup (load) event took on some pages; in one case it took over 1.5 seconds. That points to a page that is far too complex and needs rationalising.
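
Both figures are easy to check. A minimal sketch, run from the browser console once the page has finished loading:

```javascript
// A minimal sketch of two quick complexity checks from the browser
// console: total DOM element count, and how long the page's load event
// handlers took to run. Run it after the page has finished loading,
// otherwise loadEventEnd will still be 0.
console.log(`DOM elements: ${document.getElementsByTagName('*').length}`);

const [nav] = performance.getEntriesByType('navigation');
if (nav) {
  const loadHandlerTime = nav.loadEventEnd - nav.loadEventStart;
  console.log(`load event handlers: ${loadHandlerTime.toFixed(0)} ms`);
}
```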

Third parties causing slowdowns

A couple of sites had their load times extended by waiting for third-party content (e.g. from Facebook). In one case this caused a 12-second delay.

As a site owner you really can’t let your performance rest in the hands of third-party content. Wherever possible, aim to load such content asynchronously, after the page itself has loaded.
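
A minimal sketch of that approach, deferring a third-party script until after the page’s own load event; the Facebook SDK URL is purely illustrative:

```javascript
// A minimal sketch of deferring a third-party script until after the
// page's own load event, so a slow third-party response cannot delay
// the page itself. The Facebook SDK URL is illustrative.
window.addEventListener('load', () => {
  const script = document.createElement('script');
  script.src = 'https://connect.facebook.net/en_US/sdk.js';
  script.async = true;
  document.body.appendChild(script);
});
```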

Any performance assessment should include checking how the page behaves when these third-party elements perform poorly.

Overall, my impression was that effort on these systems was still focussed mainly on server-side performance, while the state of the client side was largely ignored beyond following standard good practice. A more considered approach could easily speed these pages up dramatically (days, not months, of effort).
