
Varnish – Speed Up Your Mobile Website

Imagine that you have just written a post on your blog, tweeted about it and watched it get retweeted by some popular Twitter users, sending hundreds of people to your blog at once. Your excitement at seeing so many visitors talk about your post turns to dismay as they start to tweet that your website is down — a database connection error is shown.

Varnish Magic
Keep calm and try Varnish to optimize mobile websites. (Image source)

Or perhaps you have been working hard to generate interest in your startup. One day, out of the blue, a celebrity tweets about how much they love your product. The person’s followers all seem to click at once, and many of them find that the domain isn’t responding, or when they try to sign up for the trial, the page times out. Despite your apologies on Twitter, many of the visitors move on with their day, and you lose much of the momentum of that initial tweet.


These scenarios are fairly common, and I have noticed in my own work that when content becomes popular via social networks, the proportion of mobile devices that access that content is higher than usual, because many people use their mobile devices, rather than desktop applications, to access Twitter and other social networks. Many of these mobile users access the Web via slow data connections and crowded public Wi-Fi. So, anything you can do to ensure that your website loads quickly will benefit those users.

In this article, I'll show you how to use the Varnish Web application accelerator, a free and simple tool that makes a world of difference when a lot of people land on your website all at once.

Introducing The Magic Of Varnish

For the majority of websites, even those whose content is updated daily, a large number of visitors are served exactly the same content. Images, CSS and JavaScript, which we expect not to change very much — but also content stored in a database using a blogging platform or content management system (CMS) — are often served to visitors in exactly the same way every time.

Visitors coming to a blog post from Twitter would likely all be served exactly the same content: not only the images, JavaScript and CSS, but also the content that is created with PHP and with queries to the database before being served as a page to the browser. Each request for that blog post would involve not only the Web server that serves the file (for example, Apache), but also PHP scripts, a connection to the database, and queries run against database tables.

The number of database connections that can be made and the number of Apache processes that can run are always limited. The greater the number of visitors, the less memory available and the slower each request becomes. Ultimately, users will start to see database connection errors, or the website will just seem to hang, with pages not loading as the server struggles to keep up with demand.

This is where an HTTP cache like Varnish comes in. Instead of requests from browsers directly hitting your Web server, making the server create and serve the pages requested, requests would first hit the cache. If the requested page is in the cache, then it is served directly from memory, never touching Apache or the database. If the page is not in the cache, then the request is handed over to Apache as usual, whereupon Apache will create and serve the page, which is then stored in the cache, ready for the next request.

Serving a page from memory is a lot faster than serving it from disk via Apache. In addition, the page never needs to touch PHP or the database, leaving those processes free to handle traffic that does require a database connection or some processing. For example, in our second scenario of a startup being mentioned by a celebrity, the majority of people clicking through would check out only a few pages of the website — all of those pages could be in the cache and served from memory. The few who go on to sign up would find that the registration form works well, because the server-side code and database connection are not bogged down by people pouring in from Twitter.

How Does It Work?

The diagram below shows how a blog post might be served when all requests go to the Apache Web server. This example shows five browsers all requesting the same page, which uses PHP and MySQL.

When all requests go to the Apache Web server.

Every HTTP request is served by Apache — images, CSS, JavaScript and HTML files. If a file is PHP, then it is parsed by PHP. And if content is required from the database, then a database connection is made, SQL queries are run, and the page is assembled from the returned data before being served to the browser via Apache.

If we place Varnish in front of Apache, we would instead see the following:

If we place Varnish in front of Apache.

If the page and assets requested are already cached, then Varnish serves them from memory — Apache, PHP and MySQL would never be touched. If a browser requests something that is not cached, then Varnish hands it over to Apache so that it can do the job detailed above. The key point is that Apache needs to do that job only once, because the result is then stored in memory, and when a second request is made, Varnish can serve it.

The tool has other benefits. In Varnish terminology, when you configure Apache as your Web server, you are configuring a “back end.” Varnish allows you to configure multiple back ends. So, you might want to run two Web servers — for example, using Apache for PHP pages while serving static assets (such as CSS files) from nginx. You can set this up in Varnish, which will pass the request through to the correct server. In this tutorial, we will look at the simplest use case.
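As a sketch of that multiple back end idea (the host names, port numbers and file extensions here are assumptions, written in Varnish 3 syntax), the VCL might look like this:

```vcl
# Two back ends: Apache for dynamic pages, nginx for static assets.
# The nginx port (8081) is an assumption - adjust to your own setup.
backend apache {
  .host = "127.0.0.1";
  .port = "8080";
}

backend nginx {
  .host = "127.0.0.1";
  .port = "8081";
}

sub vcl_recv {
  # Route requests for static assets to nginx, everything else to Apache.
  if (req.url ~ "\.(css|js|png|jpg|gif)$") {
    set req.backend = nginx;
  } else {
    set req.backend = apache;
  }
}
```

Varnish caches the responses either way; the routing only decides which server does the work on a cache miss.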

I’m Sold! How Do I Get Started?

Varnish is really easy to install and configure. You will need root, or sudo, access to your server to install things on it. Therefore, your website needs to be hosted on a virtual private server (VPS) or the like. You can get a VPS very inexpensively these days, and Varnish is a big reason to choose a VPS over shared hosting.

Some CMSes have plugins that work with Varnish or that integrate it in the control panel, usually to make clearing the cache easier. But you can put Varnish in front of any CMS or any static website, without any particular integration with other systems.

I’ll walk you through installing Varnish, assuming that you already run Apache as a Web server on your system. I run Debian Linux, but packages for other distributions are available. (The paths to files on the system will vary with the Linux distribution.)

Before starting, check that Apache is serving your website as expected. If the server is brand new or you are trying out Varnish on a local virtual machine, make sure that you have configured a virtual host and that you can view a test page on the server in a browser.

Install Varnish

Installation instructions for various platforms are in Varnish’s documentation. I am using Debian Wheezy; so, as root, I followed the instructions for Debian. Once Varnish is installed, you will see the following line in the terminal, telling you that it has started successfully.

[ ok ] Starting HTTP accelerator: varnishd.

By default, Apache listens for requests on port 80, which is where incoming HTTP requests arrive. Because we want Varnish to sit in front of Apache, we need to configure Varnish to listen on port 80 and move Apache to a different port, usually 8080. We then tell Varnish where Apache is.

Reconfigure Apache

To change the port that Apache listens on, open the file /etc/apache2/ports.conf as root, and find the following lines:

NameVirtualHost *:80
Listen 80

Change these lines to this:

NameVirtualHost *:8080
Listen 8080

If you see only the following line, just change 80 to 8080 in the same way.

Listen 80

Save this file and open your default virtual host file, which should be in /etc/apache2/sites-available. In this file, find the following line:

<VirtualHost *:80>

Change it to this:

<VirtualHost *:8080>

You will also need to make this change to any other virtual hosts you have set up.
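If you have several virtual host files, a quick grep can confirm that nothing still points at port 80. This is just a convenience check, assuming Debian's Apache layout:

```shell
# List any directives still referring to port 80 under the Apache config.
grep -rn -e 'VirtualHost \*:80>' -e '^Listen 80$' /etc/apache2/
```

If this prints nothing, every virtual host has been moved to 8080.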

Configure Varnish

Open the file /etc/default/varnish, and scroll down to the uncommented section that starts with DAEMON_OPTS. Edit this so that it looks like the following block, which will make Varnish listen on port 80.

DAEMON_OPTS="-a :80 \
             -T localhost:1234 \
             -f /etc/varnish/default.vcl \
             -S /etc/varnish/secret \
             -s malloc,256m"

Open the file /etc/varnish/default.vcl, and check that the default back end is set to port 8080, because this is where Apache will be now.

backend default {
  .host = "127.0.0.1";
  .port = "8080";
}

Restart Apache and Varnish as root with the following commands:

service apache2 restart
service varnish restart
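Before testing in a browser, you can confirm that each daemon is listening where expected. A quick check using ss (or netstat -tlnp on older systems):

```shell
# Varnish should now own port 80, and Apache port 8080.
sudo ss -tlnp | grep -E ':(80|8080)\b'
```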

Check that your test website is still available. If it is, then you’ll probably be wondering how to test that it is being served from Varnish. There are a few ways to do this. The simplest is to use cURL. In the command line, type the following, replacing www.example.com with your own domain:

curl --head http://www.example.com/

The response should be something like Via: 1.1 varnish.

You can also look at the statistics generated by Varnish. In the command line, type varnishstat, and watch the hit rate increase as you refresh your page in the browser. Varnish refers to something it can serve as a “hit” and something it passes to Apache or another back end as a “miss.”

Another useful tool is varnishtop. Type varnishtop -i txurl in the command line, and refresh your page in the browser. This tool shows you which files are being served by Varnish.

Purging The Cache

Now that pages are being cached, if you change an HTML or CSS file, you won’t see the changes immediately. This trips me up all of the time. I know that a cache is in front of Apache, yet every so often I still have that baffled moment of “Where are my changes?!” Type varnishadm "ban.url ." in the command line to clear the entire cache.

You can also control Varnish over HTTP. Plugins are available, such as Varnish HTTP Purge for WordPress, which you can configure to purge the cache directly from the administration area.
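If you would rather handle purging over HTTP yourself, a common pattern (a sketch in Varnish 3 VCL; the ACL address is an assumption) is to accept PURGE requests from trusted addresses only:

```vcl
# Allow purge requests only from localhost - adjust to your own setup.
acl purgers {
  "127.0.0.1";
}

sub vcl_recv {
  if (req.request == "PURGE") {
    if (!client.ip ~ purgers) {
      error 405 "Not allowed.";
    }
    return (lookup);
  }
}

sub vcl_hit {
  if (req.request == "PURGE") {
    purge;
    error 200 "Purged.";
  }
}

sub vcl_miss {
  if (req.request == "PURGE") {
    purge;
    error 200 "Purged.";
  }
}
```

With this in place, a single page can be evicted with something like curl -X PURGE http://www.example.com/about/ from a trusted machine.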

Some Simple Customizations

You’ll probably want to know a few things about how Varnish works by default in order to tweak it. Configuring it as described above should cause most basic assets and pages to be served from the cache, once those assets have been cached in memory.

Varnish will only cache things that it considers safe to cache, and it might not cache some common things that you would expect it to. A good example is cookies.

In its default configuration, Varnish will not cache content if a cookie is set. So, if your website serves different content to logged-in users, such as personalized content, you wouldn’t want to serve everyone content that is meant for one user. However, you’d probably want to ignore some cookies, such as for analytics. If the website does not serve any personalized content, then the only cookies you would probably care about are those set for your admin area — it would be inconvenient if Varnish cached the admin area and you couldn’t see changes.

Let’s edit /etc/varnish/default.vcl. Assuming that your admin area is at /admin, you would add the following:

sub vcl_recv {
  if (!(req.url ~ "^/admin/")) {
    unset req.http.Cookie;
  }
}

Some cookies might be important — for example, logged-in users should get uncached content. So, you don’t want to eliminate all cookies. A trip to the land of regular expressions is required to identify the cookies we’ll need. Many recipes for doing this can be found with a quick search online. For analytics cookies, you could add the following.

sub vcl_recv {
  // Remove has_js and Google Analytics __* cookies.
  set req.http.Cookie = regsuball(req.http.Cookie, "(^|;\s*)(_[_a-z]+|has_js)=[^;]*", "");
  // Remove a ";" prefix, if present.
  set req.http.Cookie = regsub(req.http.Cookie, "^;\s*", "");
}

Varnish has a section in its documentation on “Cookies.”

In most cases, configuring Varnish as described above and removing analytics cookies will dramatically speed up your website. Once Varnish is up and running and you are familiar with the logs, you can start to tweak the configuration and get more performance from the cache.
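One common tweak, for example, is to enforce a minimum lifetime on cached objects. This sketch (Varnish 3 syntax; the two-minute value is an arbitrary assumption) overrides short TTLs sent by the back end:

```vcl
sub vcl_fetch {
  # Keep cacheable objects for at least two minutes,
  # even if the back end asks for a shorter TTL.
  if (beresp.ttl < 120s) {
    set beresp.ttl = 120s;
  }
  return (deliver);
}
```

A longer TTL means a higher hit rate, at the cost of visitors occasionally seeing slightly stale pages until the cache is purged.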

Next Steps

To learn more, go through Varnish’s documentation. You should understand enough of Varnish’s basics by now to try some of the examples. The section on “Achieving a High Hit Rate” is well worth a read for its simple tips on tweaking your configuration.




Rachel Andrew is a web developer, writer and speaker and one of the people behind the content management system, Perch. She is the author of a number of books including The New CSS Layout. She writes about business and technology on her own site at

  1. 1

    In Django, this is easily achievable using Memcache. Static files are served separately.

  2. 2

    I’m no expert on this matter, but I feel you can work a bit proactively and save some hassle by optimising your website to the extreme (use font icons and sprites to reduce HTTP requests, try to make the browser do all the drawing work instead of using .png’s and .jpg’s to draw the site ;-) and embrace the less-is-more philosophy. Sure it’s nice to show your tweets, your YouTube videos embedded, some RSS feeds and ‘latest comments’ that nobody cares about, but if you can, leave them out to reduce strain on your databases and clean up the users’ experience. Think about what you REALLY want them to see, and remove all the rest. Also, try to offer mirrors for big downloads (and even suggest them prior to your own website’s hosted file).

    Just my two cents. Good luck to all!

    • 3

      This is good advice, but ultimately the fewer server resources your website consumes, the more performant it will be. Varnish helps with that in a big way.

  3. 4

    I’d add two recommendations to this. Varnish does not support SSL, so put Pound ( in front of it to serve the cert.

    Second, since you’re reading SmashingMag, chances are that you use one of the CMSes for which I found a great github repo with pre-defined best practices:

    While the setup isn’t very complex, it still has quite a few moving parts, so I’d also recommend monit ( to monitor these processes.

  4. 5

    Tobias Hoffmann

    December 4, 2013 7:10 am

    This is all good and fine, but what does this have to do with specifically mobile websites?

  5. 7

    juliana jones

    December 4, 2013 8:53 am

    I read “Speed Up Your Mobile Website With Varnish”. Using Varnish on websites is a very good idea. Should this work with other types of websites?

    • 8

      Yes, this is not specific to just “mobile” websites. This will work on any website because it is installed and configured at the server level.

  6. 9

    Cool, where’s the Windows support?

  7. 11

    every mobile can use this?

  8. 12

    Excellent article thanks.
    For the beginners here – Caching Tutorial for Web Authors and Webmasters

  9. 13

    Thanks for writing this, Rachel.

    How is Varnish different from caching files with PHP? For example, I can have PHP check whether an HTML file with the specified name exists in the /caches directory and, if so, serve that. And if it doesn’t exist, the script can continue parsing and write the HTML file for further reference when done.

    • 14

      Dimitri, the main difference is the lack of Apache in the process chain. Your example would still need Apache in order to parse the PHP and decide whether or not to fetch the cached HTML file and serve it directly.

      I have done some benchmarking and stress-testing on Varnish compared to something close to your solution and, if accounted for error deviation, the difference was too small to make a statement either way. In almost all cases, the real bottleneck is the database – not the PHP that’s being parsed. If you already use your PHP caching solution, don’t expect any miracles from Varnish.

  10. 16

    The best “recipe” I have after years of administrating servers for PHP applications, and one that works 100% with WordPress, is Nginx + Apache + PHP-FPM + APC + Memcache. WordPress can benefit from two cache layers, one with APC for opcode caching and one with Memcache for object caching (preferably on the same machine, as it’s much faster).

    And also, use PHP 5.4 for better performance, as it’s faster than 5.3 or 5.2, and on WordPress it really proves this (I would stay away from 5.3 and 5.2 for other reasons, like security, too, as they haven’t been maintained for months now).

    Also, this implementation makes sure you’re served real time data all the time.

    The result: serving websites with an average of 700k page views per month per website with no problem (WordPress, Drupal, Magento and OpenCart websites on the server)

    On PHP 5.5, the recipe changes a bit, but there are no improvements on performance.

    I’ve got a Memcached “implemented” WordPress multisites network, and it can’t beat the previous “recipe”.

  11. 17

    How about a nice CDN so cacheable requests never enter your environment in the first place?

  12. 18

    Is there anyone running servers or significant web properties who is not doing significant caching? And the “serving a page from memory is a lot faster than serving it from disk” claim is not necessarily true. I would have agreed with that statement a while back, until I saw a comment from one of the authors of a WordPress caching plugin that memcached was only worth the effort in a load-balanced environment, and that it did not help on a single-host installation enough to matter. Skeptical and thinking that counterintuitive, I did a crude benchmark using tmpfs, and found the difference between a disk-based cache and a memory-based cache very, very minor (tested on Rackspace cloud servers). So cache by all means. That’s very important indeed. But don’t waste a lot of time on memory-based caching unless that is really needed.

