A while back I had a problem with a site I had inherited for quite a large client. It was built on Joomla 2.5 and was already a little out of date. The main problem was that the site wasn't really designed to be on Joomla, or any prebuilt CMS for that matter. It isn't that those frameworks aren't good; it's that the technical design of the site was so bad.
Part of a developer's job, when a client comes with an idea for a project, is to steer them in a direction whereby they get what they want but the website doesn't suffer because of it. This site, for one reason or another, had no technical driver, which meant that everything the client wanted, the client got, without question and without logic. The result was a complete mess of modules, packages, and a very confused framework. Pages loaded so slowly you could make a cuppa whilst waiting for the homepage. Not to mention it was running on a tiny server better suited to a one-page app than a big corporate site.
Usually, I would suggest rebuilding the site properly, or at least parts of it, to slim it down. The alternative was to put it on a server that could cope with the resource drain and traffic. However, as with all projects, the client didn't have the budget, or simply didn't want the extra cost. Plus, convincing them they needed to spend ten times as much on servers was not really an idea my company wanted to pitch, for risk of losing them as a client. So in came Varnish.
I decided that the quickest and most cost-effective way to give the client a usable site, without it costing them a penny, was to install some sort of server-side caching. That way the modules and packages that were all fighting for server resources didn't need to fight anymore.
For those reading this who don't know what Varnish is, here is a brief overview:
Normally, your browser sends a request to the server, Apache or Nginx works out what you asked for, and the response comes back. With a server-side caching system like Varnish, things go a little differently. The user still sends the exact same request, and the browser still behaves the same, but the request only reaches Apache or Nginx if the URL you asked for hasn't already been requested within a certain time limit. For example, let's assume the BBC website has Varnish installed and running. If I am the first person to view a page, my request works normally, as described above. However, if you then visit the same page within the cache's time limit, your request won't even make it through to Nginx/Apache: it is handled purely by Varnish, and you are served exactly the same content as me. Varnish blindly serves you, with only a minimal calculation based on your URL matching the one I requested. This greatly reduces the amount of work the server needs to do, as well as the response time: where it might have taken the server 300ms to work out what I wanted, it would take only, say, 30ms to work out what you wanted.
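The hit-or-miss decision described above can be sketched in a few lines of Python. This is only a toy illustration of the idea, not how Varnish is actually implemented: the cache is a plain dictionary keyed by URL, and the TTL value and function names are assumptions made up for the example.

```python
import time

# Assumed value for the example: seconds a cached response stays fresh.
CACHE_TTL = 120

# Toy cache: url -> (time stored, response body).
cache = {}

def fetch_from_backend(url):
    # Stand-in for the slow round trip to Apache/Nginx and the CMS.
    return f"<html>rendered page for {url}</html>"

def handle_request(url):
    entry = cache.get(url)
    if entry is not None:
        stored_at, body = entry
        if time.time() - stored_at < CACHE_TTL:
            # Fresh copy exists: serve it without touching the backend.
            return body, "HIT"
    # First request for this URL (or the copy went stale): do the real work.
    body = fetch_from_backend(url)
    cache[url] = (time.time(), body)
    return body, "MISS"
```

The first visitor to a URL pays the full cost and populates the cache; everyone who follows within the TTL gets the stored copy, which is exactly why the second request in the BBC example is so much faster than the first.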
Usually, Varnish is used for high-traffic, heavy-load websites like the BBC, where they get millions of hits a day and really need extremely efficient servers. My reasoning for using Varnish was a little different. I used it for completely the wrong reasons: I was trying to hide a mistake my predecessors had made, by making a terribly inefficient website more efficient. It worked, and I learnt a great deal about Varnish by doing it. However, my honest opinion is that the website should have been rewritten properly, and then Varnish would never have been needed.