Posted on 23rd July 2014
Nobody likes a slow website - in the fast-paced world we live in, users don't have time to sit and wait for your website to load. The average size of a web page from the top 1,000 websites is now 1.62MB (source). The larger the page, the longer it's going to take to load - and with more mobile devices in use than ever, you can't be sure your site is going to be viewed over a high-speed connection.
Users aren't the only ones who don't like to wait - according to a post published by Google, page speed is taken into account when ranking sites on search results pages.
No matter how you look at it, the simple fact is that speed is important. There are numerous factors that can degrade or improve the performance of your website - you'll find an overview of some of them below. How you actually go about implementing these optimisation techniques will vary depending on your website.
If you have any questions about anything I cover, or would like me to help improve the speed of your website, please contact me.
Choosing a suitable hosting environment is essential - and the best setup will depend on your scale and budget. As choosing a suitable setup is an in-depth subject itself, I'll just give a brief overview for now (expect another blog post at a later date).
Generally speaking, shared hosting (you know, the packages you see for a few pounds a month) often gives you poor performance - you're sharing a server with hundreds, if not thousands, of other sites all trying to use the same limited resources.
What you should be aiming for is a VPS / dedicated server with ample resources (disk, CPU, memory, bandwidth) for the needs of your site and expected traffic, located geographically as close to your target audience as possible. There are numerous VPS providers available today; I'd recommend Linode - they have data centres in the UK, USA and Japan, and recently upgraded the hardware that runs each VPS.
If your target audience spans multiple countries, the ideal setup for larger-scale sites is a server located in each country (or as close to it as possible), with a load balancer routing each request to the server nearest the user.
If your website has a CMS (WordPress, Statamic, Drupal etc), an e-commerce system or any other sort of back-end logic that the server needs to execute before delivering the output, the chances are there is room for optimisation there too.
How and what to optimise will vary greatly depending on the system / codebase that you're using. Some common areas to think about are reducing the number of database queries you make, optimising the ones you do make to only read / write exactly what you need to, avoiding loops within loops and ensuring any code within loops is kept to a minimum.
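As a sketch of the database point above (the tables and data here are purely hypothetical), the classic mistake is the "N+1" pattern - running one query per row in a loop when a single JOIN would read exactly what you need in one go:

```python
import sqlite3

# In-memory database with hypothetical "authors" and "posts" tables,
# purely for illustration.
db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE authors (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE posts (id INTEGER PRIMARY KEY, author_id INTEGER, title TEXT);
    INSERT INTO authors VALUES (1, 'Alice'), (2, 'Bob');
    INSERT INTO posts VALUES (1, 1, 'First'), (2, 1, 'Second'), (3, 2, 'Third');
""")

# Slow: one extra query per post to fetch its author (the "N+1" pattern).
posts = db.execute("SELECT id, author_id, title FROM posts").fetchall()
slow = [(title, db.execute("SELECT name FROM authors WHERE id = ?",
                           (author_id,)).fetchone()[0])
        for _, author_id, title in posts]

# Fast: a single JOIN reads exactly what is needed in one round trip.
fast = db.execute("""
    SELECT posts.title, authors.name
    FROM posts JOIN authors ON authors.id = posts.author_id
    ORDER BY posts.id
""").fetchall()

assert slow == fast  # same result - one query instead of four
```

With three posts the saving is trivial, but on a page listing hundreds of items the loop version means hundreds of round trips to the database.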
If the number of additional resources needed for a webpage can't be reduced, you can look at using a CDN to serve assets (serving files from different domains will increase the number that can be transfered in parallel - more on this below).
The use of off-the-shelf themes and plugins within content management systems such as WordPress, Joomla and Drupal can often cause an increased number of requests.
All major web servers and web browsers support GZIP; how you enable it on your server will vary based on the web server software you are using (eg Apache, NGINX, IIS). If you have access to a control panel, this may let you enable / disable compression.
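To give a feel for why this matters, here's a quick sketch of the kind of saving GZIP gives on markup (the HTML fragment is made up; repetitive markup like this compresses particularly well):

```python
import gzip

# A repetitive HTML fragment stands in for a typical page of markup.
html = ("<div class='row'><p>Lorem ipsum dolor sit amet.</p></div>\n" * 200).encode()

# Level 6 is a common default compression level for web servers.
compressed = gzip.compress(html, compresslevel=6)
saving = 100 - (len(compressed) * 100 // len(html))

print(f"{len(html)} bytes -> {len(compressed)} bytes ({saving}% smaller)")
```

Real pages won't compress quite this dramatically, but 60-80% reductions on HTML, CSS and JavaScript are typical.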
By setting up a separate domain or subdomain to serve your static assets from, you will reduce the amount of overhead (such as cookies being sent) with each request.
Due to high costs, using a CDN was once a privilege reserved for the biggest sites; however, the rise in the number of services available has meant pricing is falling and becoming more affordable for small & medium sized websites.
Why send more code than needed to your users? Keep your HTML code lean by carefully thinking about the structure and layout of the page. Make sure your CSS is well written - remove duplication and avoid overwriting rules again and again. Also, try removing unused rules - this can be tough to do depending on the structure of your site, but there are tools out there to help, such as Dust-Me Selectors.
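The idea behind those tools can be sketched very crudely - compare the class selectors defined in your CSS against the classes actually used in your markup (the stylesheet and markup here are hypothetical, and a real tool must handle far more than class selectors):

```python
import re

css = """
.button { color: red; }
.hidden { display: none; }
.legacy-banner { border: 1px solid; }   /* no longer referenced anywhere */
"""
html = '<a class="button">Buy</a> <div class="hidden"></div>'

# Class selectors defined in the stylesheet.
defined = set(re.findall(r"\.([\w-]+)\s*\{", css))

# Classes actually used in the markup.
used = set()
for attr in re.findall(r'class="([^"]*)"', html):
    used.update(attr.split())

print("unused selectors:", defined - used)  # {'legacy-banner'}
```

Real sites also use IDs, element and attribute selectors, and classes added by JavaScript, which is exactly why removing unused rules is "tough to do" and dedicated tools exist.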
Always have the KISS principle in your mind when coding; Keep It Simple Stupid.
Also, if you have references to files that no longer exist (returning a 404 error), these are wasted requests that stop other files from loading. There are plenty of tools that check for broken links and images on a website, such as Integrity, to help find wasted requests.
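A minimal sketch of what such a checker does for local assets - collect every `src` / `href` reference in a page and flag the ones with no matching file (the page and file list here are hypothetical; a real tool also makes HTTP requests to verify remote URLs):

```python
from html.parser import HTMLParser

# Files actually present on the (hypothetical) server.
existing = {"css/main.css", "img/logo.png"}

page = """
<link rel="stylesheet" href="css/main.css">
<img src="img/logo.png">
<img src="img/old-banner.png">
"""

class RefCollector(HTMLParser):
    """Collects every src/href attribute value seen in the markup."""
    def __init__(self):
        super().__init__()
        self.refs = []

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name in ("src", "href"):
                self.refs.append(value)

parser = RefCollector()
parser.feed(page)

broken = [ref for ref in parser.refs if ref not in existing]
print("requests that will 404:", broken)  # ['img/old-banner.png']
```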
By enabling browser caching in your web server software, you can control which of your files are stored locally, and for how long. Setting an expiry time on cached items is important - you may release an update to your CSS, but if a long cache expiry time has been set, users may not see that update for hours / days / weeks / months.
Again, this is a complex area in itself, and there are various caching-related headers you can send to a web browser, such as Last-Modified, ETag, Expires and Max-Age. The best way of configuring them will depend on the structure and content of your website.
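As an illustration, here is roughly what a server might attach to a static asset to say "cache this for a week" (the one-week figure is just an example - the right expiry depends entirely on how often the asset changes):

```python
import time
from email.utils import formatdate

ONE_WEEK = 7 * 24 * 60 * 60  # seconds

# Example caching headers for a static asset that rarely changes.
headers = {
    "Cache-Control": f"public, max-age={ONE_WEEK}",
    "Expires": formatdate(time.time() + ONE_WEEK, usegmt=True),
    "Last-Modified": formatdate(time.time(), usegmt=True),
}

for name, value in headers.items():
    print(f"{name}: {value}")
```

Cache-Control's max-age is the modern mechanism; Expires and Last-Modified are kept for older clients and for conditional revalidation.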
There are methods of overriding cache expiry limits (commonly known as cache-busting) - such as using a different file name each time you publish an update (eg my_style.v1.css, my_style.v2.css), or by including a query string on the end of a filename (eg my_style.css?v=2). The problem with the latter technique is that some web browsers won't cache a file at all if it contains a query string in the filename.
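Rather than bumping version numbers by hand, a common variant is to embed a short hash of the file's contents in its name, so the name changes automatically whenever the file does. A minimal sketch (the filename and contents are hypothetical):

```python
import hashlib
import tempfile
from pathlib import Path

def busted_name(path: Path) -> str:
    # Short content hash in the filename: it changes whenever the file
    # changes, so a long cache expiry can safely be set on each name.
    digest = hashlib.md5(path.read_bytes()).hexdigest()[:8]
    return f"{path.stem}.{digest}{path.suffix}"

css = Path(tempfile.mkdtemp()) / "my_style.css"
css.write_text("body { color: #333; }")
print(busted_name(css))  # eg my_style.1a2b3c4d.css (hash varies with content)
```

In practice this renaming is usually wired into a build step, with the templates updated to reference the hashed name.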
Always load your CSS at the top of the page (within the <head> tag) - this lets the browser style content as it renders. Load your JavaScript at the bottom of the page (just before the closing </body> tag); referencing script files in the <head> will stop the browser from rendering your page until they have loaded (blocking progressive rendering), so users would have to wait for them to download before anything is shown. Users are there to read your content - don't delay showing it unnecessarily.
Images are often a big contributor to large page sizes on websites, yet there are several methods that can be used to reduce their filesize.
Depending on the format and origin of an image, the file may contain unneeded metadata (such as the camera used, camera settings, date, location etc) - removing this will reduce the size of the file.
Another common issue when displaying images on the web is their dimensions. Images should be resized so their dimensions are no greater than the largest size at which they will be displayed. It is pointless serving a 1000px-wide image when it will only ever be shown at a maximum of 500px wide. By reducing the dimensions, the filesize will also be reduced, leading to faster loading times and lower bandwidth costs.
Content management systems (such as WordPress & Joomla) are often culprits when it comes to serving oversized images - they allow users to upload images straight from a camera and re-scale them within a content editor, without physically reducing their size.
Using an appropriate image format will also help keep filesizes to a minimum. As a guide, JPEG files should be used for photos and PNG, GIF or SVG should be used for graphics & icons. BMP and TIFF images should be avoided due to their large filesizes.
Once in an appropriate format and at sensible dimensions, images can be further optimised using software that aims to reduce the size of the file without any visible loss in picture quality. Such software and web services include ImageOptim, JPEGmini and TinyPNG.
You can help reduce the amount of time taken for a browser to render a fully loaded webpage by defining the dimensions of images (using CSS or the height & width HTML attributes). Web browsers will start to render a webpage before images have finished loading; if no dimensions are provided for an image, the web browser will have to redraw the entire page once the image becomes available. Don't be tempted to use this as a shortcut for scaling down the dimensions of your images, though.
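As a sketch (the filename and sizes here are made up), declaring the dimensions in the markup looks like this:

```html
<!-- Width and height declared up front, so the browser can reserve the
     space before logo.png arrives - no redraw when the image loads. -->
<img src="logo.png" width="200" height="80" alt="Company logo">
```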
Minifying HTML can be more challenging - the best method to use will vary based on your content management system (if using one at all). For my bespoke PHP projects, I use a customised version of this PHP class, which is also compatible with WordPress.
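To show the idea rather than any particular library, here is a deliberately naive minifier that just collapses whitespace between tags (a real minifier must special-case <pre>, <textarea>, inline scripts and conditional comments, which this sketch ignores):

```python
import re

def naive_minify(html: str) -> str:
    # Collapse runs of whitespace, then drop whitespace between tags
    # entirely. Illustration only - not safe for <pre> or inline scripts.
    html = re.sub(r"\s+", " ", html)
    return re.sub(r">\s+<", "><", html).strip()

page = """
<ul>
    <li>Home</li>
    <li>About</li>
</ul>
"""
print(naive_minify(page))  # <ul><li>Home</li><li>About</li></ul>
```

The indentation and newlines that make the source readable are pure bytes on the wire; minification strips them before serving, leaving your working copy untouched.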
Each time a browser comes across a redirect, that's another HTTP request it will have to make before it can serve the final content to the user. Redirect hopping can become an issue when there have been several versions of a website, each with a different URL structure and a new set of redirects added each time. Check your 301 redirects to ensure they are as direct as possible.
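Flattening a chain of redirects can be done mechanically - follow each rule to its final destination and rewrite the rule to point straight there (the URLs below are hypothetical; in practice the map would come from your .htaccess or server config):

```python
# A hypothetical redirect map: each old URL points at where it redirects to.
redirects = {
    "/old-page": "/2012-site/page",
    "/2012-site/page": "/2014-site/page",
    "/contact-us": "/contact",
}

def final_target(url: str) -> str:
    # Follow the chain until a URL no longer redirects anywhere.
    seen = set()
    while url in redirects:
        if url in seen:            # guard against redirect loops
            raise ValueError(f"redirect loop at {url}")
        seen.add(url)
        url = redirects[url]
    return url

# Rewrite every rule so each hop goes straight to its final destination.
flattened = {src: final_target(src) for src in redirects}
print(flattened)
```

After flattening, /old-page goes straight to /2014-site/page in one hop instead of two.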
Another source of unwanted redirects can be URL shorteners such as Bitly and TinyURL. When building backlinks to your site, always link directly to your site where possible.
There is plenty you can do to improve the speed of your website. Although some of the techniques I've mentioned may sound insignificant on their own, when implemented with others, you will see a considerable difference in the loading times for your website.
For further advice on website performance, or if you want to discuss how I can help you implement effective changes to improve the loading time of your website, get in touch.