The last area I want to examine in my series on WPO is the client side. Many people currently treat the browser as the only client, but I think clients like apps can benefit from the same ideas, as they are built on the same concepts nowadays, some even in HTML.
One especially interesting fact about browser optimization is that it is mostly built on guessing. It shares this trait with SEO, where it is even more true, since nobody really knows how search engines operate. As a result, most WPO tools report a “grade”. That’s right: as in school, you don’t know exactly what to do to get better, so you just try some more. However, there are several tools available that analyze your sites and give you pointers as to what you might want to fix or change to improve your grade. Two examples are YSlow by Yahoo and PageSpeed by Google, both of which check your sites against a catalog of best practices.
You can also use an aggregator like webpagetest.org or showslow.org to run various tests for you. They all offer checklists that tell you where your site violates best practices and how to fix it. What you do not get, however, is an estimate of how much your changes will affect page performance.
Requests waste time
Although we unfortunately do not know exactly how a site’s behavior will change when applying best practices, it is very clear that extra requests should be avoided at all costs. Requests cost time and money. There are two aspects of a request to look at:
- Connection time
- Payload transfer time
Of course, payload transfer never happens without a connection. Avoiding payload transfer can easily be achieved by two practices: compression and caching.
Compression is a piece of cake, but very often neglected. This stems from a time long past, when early versions of IE could not handle compressed responses. How you add compression varies, but it is usually very easy. I recommend letting your container do it (rather than coding it into your application). For Apache this is as simple as:
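A minimal sketch using mod_deflate (the directives are standard Apache; the exact list of MIME types to compress is your choice):

```apache
<IfModule mod_deflate.c>
  # Compress text-based responses; leave images alone, they are already compressed
  AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>
```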
Of course, making images as small as possible is a sensible idea too. A few days ago Kornel Lesinski published a neat tutorial on PNGs that work, which also takes care of the transparency issues in IE 6.
Activating caching is also easy, but requires a bit of thought to avoid delivering outdated content. While you can try to cache all your HTML, most people do not need to and should only cache truly static content. So if you have a folder managed by your CMS, like “export”, or a place where all your images go, the whole folder should get a far-future expiry. In Apache this is easy as well:
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresDefault "access plus 10 years"
</IfModule>
The only problem is that if you replace an image, it has to get a new name, otherwise it will not be downloaded again (we just declared the old version good for 10 years). The usual practice for solving this is to include a version number or timestamp in the file name. Problem solved. The process can be automated partially or fully, as discussed later.
The loop benchmark linked above shows how easily you can (or at least could) get this wrong. Smart use of the language might let the loop take 15 ms, incorrect usage 3203 ms (example: HTMLCollection, length=1000, looped 100 times, IE7).
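The usual culprit is reading the collection’s `length` on every iteration. A sketch of the pattern, using a plain array since there is no DOM outside the browser; with a real, live HTMLCollection the uncached form is what caused the slow numbers above:

```javascript
const items = Array.from({ length: 1000 }, (_, i) => i);

// Uncached: items.length is re-read on every pass. For a live
// HTMLCollection in old IE, each read re-queried the document.
let sumSlow = 0;
for (let i = 0; i < items.length; i++) sumSlow += items[i];

// Cached: one property read up front, then a plain counter comparison.
let sumFast = 0;
for (let i = 0, len = items.length; i < len; i++) sumFast += items[i];

console.log(sumSlow === sumFast); // prints "true": same elements visited
```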
Recently Google released an Apache module called mod_pagespeed. The idea behind it is that most best practices should simply be followed and cause no issues; some, however, are hard to apply upfront but easy to apply at runtime. That is the job of mod_pagespeed: it fixes your HTML, links, and cache configuration on the fly. It works best on unoptimized sites; since it is extra code being executed, it might slow already optimized sites down a bit. My usual advice: measure your results. Besides mod_pagespeed, there are several commercial solutions available, some of which also include CDN offerings.
Another kind of automatic fix is the HTML5 Boilerplate, a web site template that ships all the proven configuration, preconfigured and documented. I can highly recommend it for its ideas, especially the CSS part.
I do believe that employing best practices makes your web pages faster, but finding the 20% change that will improve your page by 80% is not that easy. Tooling support is still a bit limited, so it is left to us to experiment with ideas and check their effects. Browsers are getting faster day by day, so cheating and voodoo have a short lifespan. For the web, too, we need a clean and simple design. If you want some expert consulting, feel free to contact us. We can find out what makes your pages slow and what will give you the boost with the best return on investment on all layers: infrastructure, server software, and client side.
I hope you have enjoyed my short introduction on Web Performance Optimization. Happy Holidays!
My WPO series: