TIPS TO SPEED UP YOUR WEBSITE PERFORMANCE

Optimisation for quicker page speed will be the talking point in the web design community in 2010. In November Google declared it a central issue for webmasters after spokesman Matt Cutts announced at an SEO conference that Google would change its algorithm in the near future.

To bring this point into focus for all webmasters, a new feature called Site Performance was launched as part of the Webmaster Tools section, which lists average download times for registered websites.

It is not only Google making this an issue that requires attention: usability data shows that user expectations of response times have risen over the years. According to a report by Forrester Research, two seconds is now the threshold users consider acceptable for a page to load. Amazingly, even speed differences of 100 milliseconds can adversely affect user behaviour. Leading frontend engineer Steve Souders states that at least 80% of page load time is down to non-HTML factors.

As a case study, I decided to redesign a site I own – rss001.com – to bring a number of speed-enhancing factors into play. RSS 001 is a feed directory that I launched in the spring. As it is just about to reach 10,000 unique visitors per month, with the potential for traffic to increase dramatically in 2010, I thought it was a good site to optimise for speed.
Below are some points to take into consideration.

Keep your CSS basic and tight

I’m sure you are all writing good quality CSS code, by which I mean that you are only using one id in each selector. You can use as many classes as you want in a selector, but the fewer the better. Try to limit the number of descendant selectors.

The jury is out on how much of a difference shorthand properties make to load times, but I always use them as a matter of routine anyway as they make for a cleaner CSS code page. I use shorthand properties as I work and then go back and refactor the code after I have finished the design.
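As an illustration (the selector and class names here are hypothetical, not taken from rss001.com), the same rule written with a deep descendant chain and longhand properties, and then written tightly with a single class and shorthand:

```css
/* Verbose: deep descendant chain and longhand properties */
#sidebar ul li a.feed-link {
  margin-top: 0;
  margin-right: 10px;
  margin-bottom: 1em;
  margin-left: 10px;
  font-family: Arial, sans-serif;
  font-size: 12px;
  line-height: 1.4;
}

/* Tighter: one class, shorthand margin and font properties */
.feed-link {
  margin: 0 10px 1em;
  font: 12px/1.4 Arial, sans-serif;
}
```

Both rules produce the same styling; the second is simply less for you to maintain and less for the browser to match against.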

Other CSS aspects to take note of:

• If you are writing for speed it is best to avoid the use of the :hover pseudo-class on non-anchor elements. This reputedly causes problems in IE 7 and 8 when a strict doctype is used.
• Use CSS sprites as this will reduce the number of server requests.
• Avoid using the * selector to reset the stylesheet as this can add up to two seconds to the download time. (If this were Wikipedia I would provide you with a reference for that, but it’s not, so I won’t. There is an article out there somewhere that goes into detail on this subject – Google it).
• It has been stated that using some advanced CSS selectors has an impact on performance. I’d like to see some hard facts backing this argument up, but I avoided using attribute and child selectors, as well as CSS 3 structural pseudo-classes, when redesigning rss001.com.
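On the sprites point above, a minimal sketch: assuming a hypothetical images/icons.png containing two 16px-square icons stacked vertically, both icons can be displayed from a single downloaded image by shifting the background position:

```css
/* One combined image (images/icons.png is hypothetical) instead of two requests */
.icon {
  width: 16px;
  height: 16px;
  background: url(images/icons.png) no-repeat;
}
.icon-rss  { background-position: 0 0; }     /* top icon in the sprite */
.icon-mail { background-position: 0 -16px; } /* icon starting 16px down */
```

However many icons the sprite holds, the browser makes one request for it rather than one per icon.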

Minify JavaScript and CSS

Removing unnecessary white space and comments creates a leaner CSS or JavaScript file. You could do this manually, but it would be a tedious and error-prone endeavour. If you are using a reputable CMS then the core files or a third-party module will give you the ability to do this automatically. If you are handcoding then take a look at the YUI Compressor and the Minify PHP-based project at Google Code.
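If you go the handcoding route, the YUI Compressor runs from the command line. A typical invocation looks something like this (the jar version number and the filenames are placeholders for your own):

```shell
java -jar yuicompressor-2.4.2.jar site.js -o site-min.js
java -jar yuicompressor-2.4.2.jar --type css site.css -o site-min.css
```

You then reference the -min files from your pages and keep the readable originals for editing.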

Keep CSS and JavaScript in one file

No, not in the same file – but one file each for CSS and JavaScript. This keeps the number of server requests down to a minimum.

Put JavaScript at the bottom

Placing JavaScript in the head will cause a page download delay in some browsers, as all of the content underneath has to wait until the JavaScript file has been fully downloaded. Placing the JavaScript at the bottom of the page prevents this from happening. This is fine as long as you are not using JavaScript to assist with or create your page layout. (Note: DO NOT put CSS at the bottom. Keep that in your head.)
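Putting the last two points together, a bare page skeleton looks like this (the css/site.css and js/site.js filenames are hypothetical stand-ins for your single combined files):

```html
<html>
<head>
  <title>Example page</title>
  <!-- the one combined stylesheet stays in the head so the page renders styled from the start -->
  <link rel="stylesheet" type="text/css" href="css/site.css" />
</head>
<body>
  <!-- page content here -->

  <!-- the one combined script goes last, just before the closing body tag -->
  <script type="text/javascript" src="js/site.js"></script>
</body>
</html>
```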

Use Gzip compression

Using Gzip can dramatically shrink the size of a file, by as much as two thirds. Apache uses the mod_gzip and mod_deflate modules for versions 1.3 and 2.x respectively. Again, any decent CMS will give the administrator easy access to turning on Gzip, but if you are handcoding then read the Apache online documentation.
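For Apache 2.x, assuming mod_deflate is loaded, a single directive in httpd.conf or .htaccess is enough to compress the text-based file types. This is a sketch only; check the MIME types against what your site actually serves:

```apache
# Compress HTML, CSS and JavaScript responses (requires mod_deflate)
AddOutputFilterByType DEFLATE text/html text/css text/javascript application/javascript
```

Images are already compressed formats, so there is no point running them through Gzip as well.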

Add an expires header

This tells the user’s browser how long to keep images and files in the browser cache. Yahoo recommends setting this to a month. Explaining expires headers is an article in itself, so please read your CMS documentation or the Yahoo or Google guide to browser caching.
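As a sketch of what this looks like on Apache with mod_expires enabled (the month-long lifetime follows the Yahoo recommendation above; adjust the types to suit your site):

```apache
# Cache static assets in the browser for a month (requires mod_expires)
ExpiresActive On
ExpiresByType image/png "access plus 1 month"
ExpiresByType image/jpeg "access plus 1 month"
ExpiresByType image/gif "access plus 1 month"
ExpiresByType text/css "access plus 1 month"
ExpiresByType application/javascript "access plus 1 month"
```

One caveat: once a file is cached this aggressively, you need to rename it (or change its URL) when you update it, or returning visitors will keep seeing the old version for up to a month.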

Ration your third-party modules

CMSs like WordPress and Drupal have thousands of third-party modules listed. There is a temptation to act like a child given free run in a sweetshop and to download and use as many as possible. Look at all the modules you have installed and ask yourself, “Do I really need this? Does this provide something vital to either the user or the administrator experience?” I can guarantee that you’ll find at least one module where the answer will be no. Removing the module will not only remove unnecessary dynamic code that could include database calls, but may also remove unnecessary JavaScript or CSS that comes with the module.

Cut down the size of your images

When saving images for the web in Photoshop, think carefully about what purpose the image is going to serve on the website. Many images do not require the close attention of users and are only there for decoration; if this is the case then keep the quality of the file down to a minimum.

Try to stick to 8-bit PNGs, as 24- and 32-bit PNGs can be quite hefty in size. If you do use 32-bit PNGs for their alpha transparency then you’ll run into the usual IE6 problems. Avoid using the AlphaImageLoader filter to fix the problem as it will really cause the download to lag in IE6. As an alternative take a look at the jQuery PNG Fix.

Once you have finished your images, run them through Yahoo’s free service, Smush It. This Yahoo application is wonderful. It utilises a number of command line tools, such as Pngcrush and Gifsicle, and allows the user to shrink images without loss of quality. Smush It is now integrated with the YSlow Firebug plugin.

So after its redesign and optimisation, how does RSS 001 perform? Using the Web Page Analyzer it is possible to quantify the page download time as 20.41 seconds on a 56K connection. Okay, so the vast majority of internet users in the Western world use broadband, but it is a useful comparison metric.
