
HTTPS Everywhere With Nginx, Varnish And Apache

The web is moving toward using HTTPS encryption by default. This move has been encouraged by Google, which announced that HTTPS would be a ranking signal. However, moving your website to HTTPS is good for other reasons, too.


Rather than debate those reasons, this article assumes you have already decided to move to HTTPS. We’ll walk through how to move your website to HTTPS, taking advantage of Varnish Cache.


What’s The Problem With Varnish And HTTPS?

In previous articles on Smashing Magazine, I’ve explained how to use Varnish to speed up your website. For those of us who use Varnish and also want to move to HTTPS, there is a problem: Varnish doesn’t support HTTPS. If you make the move to SSL, configuring Apache to serve your website securely, then you lose the speed advantage of Varnish.

There is a relatively straightforward way to deal with this issue, and that is to stick something in between incoming SSL requests and Varnish, a layer that handles the secure connection and SSL certificates and then passes the request back to Varnish. For this task, we will use Nginx. You may know Nginx as a web server alternative to Apache, and it is. However, it can also be used as a proxy to handle and pass requests on to other services, which is what we are going to do here. In other words, we’re going to create a web server sandwich, with Varnish as the tasty cache-meat in the middle.

Where We Are And Where We Want To Be

I’m assuming you are in a similar situation to me and have a server — whether virtual or dedicated hardware — with a number of websites running on it. Some of those websites you want to make fully HTTPS, and perhaps some will remain HTTP for the time being.

Your current configuration would have every request on port 80 handled by Varnish. Varnish then decides, based on the rules added to your Varnish Configuration Language (VCL), whether to deliver a cached copy of the page or hand the request back to Apache for a new page to be created. Once the page hits Apache, the web server might need to pull information from the database or do other processing before delivering it.

By the end of this tutorial, we want to be in the following position:

  • Nginx will run on port 443 and handle incoming HTTPS requests, handing them off to Varnish.
  • Varnish will run on port 80 and handle incoming HTTP requests, including those from Nginx, delivering directly from cache or handing to Apache.
  • Apache will run on port 8080 and do what Apache does: deliver your website or application.
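For reference, the bottom of that sandwich is wired together by Apache’s Listen 8080 directive (in /etc/apache2/ports.conf on a default Ubuntu install) and a Varnish backend definition along these lines — a sketch, with host and port adjusted to match your own Apache:

```vcl
# /etc/varnish/default.vcl — Varnish's backend points at Apache on port 8080
backend default {
    .host = "127.0.0.1";
    .port = "8080";
}
```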

In this situation, Nginx becomes a proxy. It does no processing of your website, and it isn’t running PHP or connecting to your database. All it does is accept the HTTPS requests and pass them back to Varnish. Varnish then decides whether to hand back a cached copy or pass it back to Apache to get a fresh one, using the Varnish rules you already have.

My Example Environment

I’m going to work in Vagrant, using Ubuntu Trusty. My starting point is as described above, with Apache installed on port 8080, and Varnish 4 installed on port 80.

If you would like to follow along, you can download my environment from GitHub. Instructions on setting up are in the readme file.

I have two websites configured. If I visit those websites in a browser, Varnish will handle the request on port 80, either delivering the file from cache or passing it back to Apache.

At this point, it is useful to check which ports things are running on. SSH into Vagrant on the command line:

> vagrant ssh

Then, run netstat:

> sudo netstat -taupen

This will give you an output of ports, as well as information on which process is using them. You should find that Varnish is running on port 80 and Apache on 8080.



You can also check that Varnish is running normally and serving pages from the cache by running the following:

> varnishstat


If you reload your page in the web browser, you should see cache hits and misses.

If you are using my VCL from GitHub, I’ve added some code to the Varnish configuration that sends a HIT or MISS header to the browser. This means you can look at the headers being sent. You should see X-Cache: HIT if the page came from Varnish and X-Cache: MISS if it was served by Apache.
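You can check this header from the command line too. A quick sketch — the printf stands in for a real set of response headers; against the Vagrant box you would pipe from curl instead:

```shell
# On the live box you would run:
#   curl -sI http://smashing_ssl_one.tutorials.eoms | grep -i '^x-cache'
# Here we simulate the response headers to show what the filter extracts.
printf 'HTTP/1.1 200 OK\r\nX-Cache: HIT\r\nAge: 12\r\n' | grep -i '^x-cache'
```

A HIT line means Varnish served the page from cache; a MISS means the request went through to Apache.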

Viewing a HIT from Varnish in the headers

Installing Nginx

We can now install Nginx. On an Ubuntu system, this is as straightforward as issuing the following command:

> sudo apt-get install nginx

Nginx’s documentation has information on installing Nginx on a variety of systems, as well as packages for systems that do not include it in their package management. Remember that we are just using Nginx as a proxy, so you don’t need to worry about configuring PHP or MySQL support. Nginx won’t start by default, and currently it is unable to start because Varnish is already using port 80. If you were doing this process on a live server, you would be safe to run this step without any impact on your running websites.

Create Or Install An SSL Certificate

The next step is to set up our SSL certificate. Because we are working locally, we can create a “self-signed” certificate in order to test SSL connections.

To create a self-signed certificate for testing, first choose or create a directory to put it in. I’ve created an nginx directory in /etc/ssl. Then, run the command below to generate the key and certificate pair.

> sudo openssl req -x509 -nodes -days 365 -newkey rsa:2048 -keyout /etc/ssl/nginx/smashing_ssl_one.tutorials.eoms.key -out /etc/ssl/nginx/smashing_ssl_one.tutorials.eoms.crt

When you run this command you will be prompted for a series of questions. You can mostly put junk in these; however, when prompted for the “Common Name,” use the domain that you type in the URL bar to access your website on Vagrant. For me, this is smashing_ssl_one.tutorials.eoms.

Creating a self-signed certificate


If you look now in the folder you created, you should see two files, one with a .key extension and one with a .crt extension.
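If you’d rather skip the interactive prompts, you can pass the subject on the command line with -subj, and confirm the Common Name afterwards with openssl x509. A sketch using throwaway /tmp paths — substitute your own domain and key location:

```shell
# Generate a throwaway self-signed certificate non-interactively
openssl req -x509 -nodes -days 1 -newkey rsa:2048 \
  -subj "/CN=smashing_ssl_one.tutorials.eoms" \
  -keyout /tmp/demo.key -out /tmp/demo.crt 2>/dev/null

# Confirm the Common Name embedded in the certificate
openssl x509 -in /tmp/demo.crt -noout -subject
```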

On your live server, you would purchase a certificate from an issuing authority. You would then be given the key and certificate files and, rather than create them, you would place them on your server before following the next step.

Configure The SSL Websites In Nginx

With your self-signed or purchased SSL certificates in place, you can set up your websites in Nginx.

First, remove the default configuration file from /etc/nginx/sites-enabled. You can delete the default file or move it elsewhere.

We only need to configure websites that will be served over SSL; any other websites will continue to be served directly from Varnish on port 80. In my case, I’m going to configure smashing_ssl_one.tutorials.eoms. Wherever you see that domain in the steps below, you can replace it with your own live or local domain, if you are not using my example.

In /etc/nginx/sites-available/, create a configuration file named smashing_ssl_one.tutorials.eoms.conf.

In that file, add the following:

server {
  listen *:443 ssl;
  server_name smashing_ssl_one.tutorials.eoms;

  ssl on;
  ssl_certificate /etc/ssl/nginx/smashing_ssl_one.tutorials.eoms.crt;
  ssl_certificate_key /etc/ssl/nginx/smashing_ssl_one.tutorials.eoms.key;

  location / {

    proxy_pass            http://127.0.0.1:80;
    proxy_read_timeout    90;
    proxy_connect_timeout 90;
    proxy_redirect        off;

    proxy_set_header      X-Real-IP $remote_addr;
    proxy_set_header      X-Forwarded-For $proxy_add_x_forwarded_for;
    proxy_set_header      X-Forwarded-Proto https;
    proxy_set_header      X-Forwarded-Port 443;
    proxy_set_header      Host $host;
  }
}

The first line tells the server we are listening on port 443. This is the default port for HTTPS connections, just as port 80 is for HTTP. We then give the server name.

We set SSL to be on and then add the certificate and key that we created or installed, using a full file system path.

Under location, we use proxy_pass to pass the request back to port 80, where Varnish is waiting for it. We then set some headers, which will be passed through.

After adding this file, symlink the file in sites-available to sites-enabled. If you ever want to switch off the website, you can just delete the symlink. The following command will create a symlink on the command line:

> ln -s /etc/nginx/sites-available/smashing_ssl_one.tutorials.eoms.conf /etc/nginx/sites-enabled/smashing_ssl_one.tutorials.eoms.conf

Then, restart Nginx:

> sudo service nginx restart

If you see the output restarting nginx nginx, followed by [fail], the likely problem is a typo in your configuration. My usual mistakes are separating keys and values with a colon or forgetting the semicolon at the end of a line.

If Nginx fails to start, look at the log in /var/log/nginx/error.log because most problems are self-explanatory.

You will see [OK] if Nginx starts up successfully. Now, if you check to see what is running on which port, you should see that Nginx is now on port 443, Varnish still has port 80 and Apache 8080.

> sudo netstat -taupen

The big test is to now visit the website using https://. If you are using a self-signed certificate, then you will have to step through the warning messages — your browser is warning you that the certificate is issued by an unknown authority.

Firefox warns me the connection is untrusted


If you see your page served securely with the padlock in the URL bar, then you are now serving HTTPS via Nginx. If you check the HIT or MISS headers or run varnishstat on the command line, you’ll be able to check that pages are being served from Varnish and not hitting Apache each time.

A secure site using a self-signed certificate


Redirecting To SSL Using Varnish

Based on my own experience of doing this, you might want to tweak a few things.

If your website was running on HTTP and you want to run it on HTTPS, then you will need to redirect all HTTP requests. You can do this using Varnish. Varnish is on port 80, handling any non-SSL requests. What we want to do is ask Varnish to spot any request for our website and redirect it to HTTPS.

In your VCL file at /etc/varnish/default.vcl, add a subroutine as follows:

# handles redirecting from http to https
sub vcl_synth {
  if (resp.status == 750) {
    set resp.status = 301;
    set resp.http.Location = req.http.x-redir;
    return(deliver);
  }
}

Then, in the sub vcl_recv block, add this:

if ((req.http.host ~ "^(?i)smashing_ssl_one.tutorials.eoms") && req.http.X-Forwarded-Proto !~ "(?i)https") {
  set req.http.x-redir = "https://" + req.http.host + req.url;
  return (synth(750, ""));
}

You can view the full VCL, with this code included, on GitHub.

I am pattern-matching my domain and redirecting it to HTTPS with a 301 “moved permanently” code. So, now everything should be switched to SSL. Restart Varnish, and try to go to the HTTP version of the website and check that you are being redirected.

Another useful check is to use cURL on the command line. The following command will return only the headers of your request. You should see that you are getting a 301 when testing the HTTP URL.

> curl -I http://smashing_ssl_one.tutorials.eoms
Redirect 301 headers with curl


If you seem to be getting a lot of cache misses on your website, then it would be worth checking which cookies are being stripped by Varnish. Varnish doesn’t cache content with cookies because it assumes that this is personalized content. However, things like Google Analytics cookies should not make your content uncacheable. In my example VCL, I’m dealing with some common cookies, but look at Mattias Geniar’s post for a way to see which cookies are being sent to the back end so that you can deal with your unique examples.
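As a sketch of the kind of rule involved — the regular expressions here are illustrative, so adjust them to the cookies your site actually sets — a vcl_recv snippet that strips Google Analytics cookies might look like this:

```vcl
sub vcl_recv {
  # Remove Google Analytics cookies so they don't prevent caching
  set req.http.Cookie = regsuball(req.http.Cookie, "(^|; ) *__utm[a-z]+=[^;]+;? *", "\1");
  set req.http.Cookie = regsuball(req.http.Cookie, "(^|; ) *_ga[^=]*=[^;]+;? *", "\1");

  # If no cookies remain, drop the header entirely so Varnish will cache the page
  if (req.http.Cookie == "") {
    unset req.http.Cookie;
  }
}
```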

Grade A SSL

You’ve likely heard of the various compromises in OpenSSL. If you are going to all the trouble of running your websites on HTTPS, then make sure you aren’t vulnerable to any of these issues.

Once you have a live website using SSL, a great way to check is to use the SSL Server Test from Qualys SSL Labs. Add your domain name and wait for the test to run. The test checks for many common issues in SSL configurations — your aim is to pass with an A.

When I first ran this on a server with a similar setup to our example Vagrant installation — Ubuntu Trusty, Nginx, Varnish and Apache — I got a B rating, due to the server being vulnerable to the Logjam attack. The fix for this is detailed in “Weak Diffie-Hellman and the Logjam Attack.”

Back on your server, cd to the directory that you used to put or create SSL certificates, and run the following:

> openssl dhparam -out dhparams.pem 2048

This will create a file named dhparams.pem.
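You can confirm what was generated with openssl itself. The sketch below uses a small 512-bit throwaway file purely so it runs quickly; on your server, inspect the real 2048-bit dhparams.pem instead:

```shell
# Generate a small throwaway parameter file (use 2048 bits on a real server)
openssl dhparam -out /tmp/dhparams-demo.pem 512 2>/dev/null

# Inspect the parameters; the first line reports the bit length
openssl dhparam -in /tmp/dhparams-demo.pem -noout -text | head -n 1
```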

You can then add to your Nginx configuration the code detailed under “Nginx” on the “Weak Diffie-Hellman and the Logjam Attack” website.

server {
  listen *:443 ssl;
  server_name smashing_ssl_one.tutorials.eoms;

  ssl on;
  ssl_certificate /etc/ssl/nginx/smashing_ssl_one.tutorials.eoms.crt;
  ssl_certificate_key /etc/ssl/nginx/smashing_ssl_one.tutorials.eoms.key;
  ssl_dhparam /etc/ssl/nginx/dhparams.pem;
  ssl_protocols TLSv1 TLSv1.1 TLSv1.2;

  ssl_prefer_server_ciphers on;

  location / {

    proxy_pass            http://127.0.0.1:80;
    proxy_read_timeout    90;
    proxy_connect_timeout 90;
    proxy_redirect        off;

    proxy_set_header      X-Real-IP $remote_addr;
    proxy_set_header      X-Forwarded-For $proxy_add_x_forwarded_for;
    proxy_set_header      X-Forwarded-Proto https;
    proxy_set_header      X-Forwarded-Port 443;
    proxy_set_header      Host $host;
  }
}

Reload Nginx and retest your website. Once you have achieved an A rating, you can periodically check your website to make sure you still have that A.

Running SSL Test to check for any SSL issues


Check For Mixed Content Warnings

Your website may well have resources being loaded from other domains that are not HTTPS — this will cause a warning on your website. In many cases, the third party will have an HTTPS endpoint that you can link to. However, I had to remove the Lanyrd badges from my own website because the JavaScript was hosted only on HTTP.
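A rough way to hunt for offenders is to grep the page’s HTML for resources still loaded over plain HTTP. In this sketch the printf stands in for real page markup; against your live site you would pipe from `curl -s https://your-site/` instead:

```shell
# Find src/href attributes that still point at http:// resources.
# The https:// script below is correctly ignored by the pattern.
printf '<img src="http://cdn.example.com/a.png">\n<script src="https://ok.example.com/a.js"></script>\n' \
  | grep -oE '(src|href)="http://[^"]+"'
```

Anything this prints needs to be switched to an HTTPS URL, served from your own domain, or removed.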

Further Reading And Resources

I’ve added links to additional reading and other useful resources throughout this article.


If you know of any other helpful resources, or if you’ve followed these steps and found some extra piece of information, please add it to the comments. It will help out the next person doing it.

Excerpt image: Yuri Samoilov

(vf, ml, al)


Rachel Andrew is a web developer, writer and speaker and one of the people behind the content management system, Perch. She is the author of a number of books including The New CSS Layout. She writes about business and technology on her own site at

  1. 1

    Thanks for that article; I’ve just skimmed through it and there’s lots of good material in it.
    Although your solution of stuffing nginx in between port 443 and Varnish should work quite alright, I want to point out that you can achieve the same thing by defining a second virtual host on Apache that supports SSL. You can listen to multiple ports, and that saves you from installing another web server.
    But that’s just my 2 cents, and thanks again for the article!

    • 2

      That doesn’t achieve the same thing, as you then lose the benefit of Varnish as your requests are going directly to Apache. The aim of this is to keep the caching benefit while enabling use of TLS.

      • 3

        No you don’t. I meant keeping Varnish on Port 80, and Apache on Port 8080 like in your older article. Then another VirtualHost on Port 443 using Apache, with ProxyPass to Varnish. There’s even a default-ssl vhost on Apache which makes that easy.

        • 4

          There’s no technical reason why you can’t proxy back to Apache but I presume the performance would be awful, especially if you’re wanting to do https by default, which is what this article is about. You’re basically asking Apache to be a proxy server, which means it’s going to respond to every https request and then pass those requests back to Varnish.

          If your site is mostly http with a few pages served over https, you probably could get by with that setup but I would take Varnish out of the mix at that point, since it’d be guaranteed to be slower than just having Apache serve it.

          • 5

            I see your point. If we consider taking out Varnish, my idea is of course overly complex and unnecessary. My original comment was based on using Varnish, Apache and enabling HTTPS, and that’s also the basis of this article.

            The solution presented was to install another webserver, and proxy to varnish. I was just arguing, installing another webserver is not necessary as you can use Apache to proxy. Varnish themselves propose use of HAproxy, which is a dedicated proxy, and a good pick if you already have a website running and don’t want to start from scratch using another webserver, nginx here.

            Also I’d really like to see a benchmark of Apache using libevent, no mod_php and no AllowOverride, which is a fair comparison, since most performance benchmarks don’t take that into account when comparing, see this

          • 6

            Apache is mediocre at anything other than dynamic content which is why most system architects throw something like Varnish in front of it to serve static files.

            There’s been plenty of benchmarks comparing Apache to nginx. (Ex: ) PHP performance over PHP-FPM using either server is pretty close, though nginx is more lightweight with less overhead.

            For pure PHP performance, Apache using mod_php (mpm_prefork) is going to be fastest; it’s got the php interpreter built in, it doesn’t have to go through another traffic cop (php fpm or fast cgi) to serve the page. ( ) But that’s the only benchmark I’ve seen Apache win.

            However, for static files, if you have to load up the PHP interpreter just for serving an image, a high-traffic site will quickly burn through your max_clients and RAM (especially on a VPS). You’re right, the mpm and turning off looking for htaccess files can make a slight difference, but nginx still trounces Apache. mpm_event and mpm_worker are definitely better at serving static files but nowhere near as good as nginx or varnish. mpm_event was hyped as “faster” than nginx but it really didn’t improve on mpm_worker and in some cases was actually much slower.

            As always YMMV. nginx’s docs aren’t as good and the community isn’t as large. Unless your website gets a ton of traffic, php fpm and mpm_worker can be a decent combination. I have a few small to medium size websites running on a VPS in that very combination.

        • 7

          That doesn’t sound like a great plan for performance, but if you do it and it works well then please write it up and add a link. This is obviously not the only way to achieve this but after a bit of research and testing it seemed like a reasonable way to achieve it – for the time being anyway.

      • 8

        Why would you not just use the caching built into nginx and eliminate Varnish entirely?

        That’s how Cloudflare, MaxCDN, and my $dayjob at a SaaS provider use nginx.

        Nginx’s proxy_cache directives are extremely fast and quite flexible, especially as you can specify which variables from the request go into the proxy_cache_key.

  2. 9

    Great walkthrough and very helpful. I also want to switch to HTTPS, but my problem is how to pay for the certificates. I’ve got some sites running and it would be hard to sell my (small and mid-size business) customers a certificate for a few hundred Euros every year. And I don’t want to scare users with the browser warning of self-signed certs either.
    I think this is the biggest hurdle to switching the web to HTTPS at the moment.
    Any ideas?

    • 10

      Keep an eye on

      Making SSL free for everyone

    • 11

      Assuming you’re not running stores/payment gateways, you can purchase some of the more inexpensive options from a site like A basic certificate will run you in the neighborhood of $10/year.

      • 12

        I’ve used, they’re now owned by namecheap which is a solid registrar. I’d recommend them as well. But don’t be fooled, a $10 cert will work as well as a more expensive option. You’re more or less paying for extra “insurance.” IMO the only reason you’d go for a more expensive option is to secure multiple sub domains or for an extended validation (EV) cert (which will give you the green bar/padlock in many browsers, which generally users don’t really notice anyway as long as your SSL is working).

        The problem with SSL isn’t really certs IMO; it’s that you also need another IP address for every domain you host, unless you don’t care about older (but still used) versions of Android and Windows that don’t support SNI.

    • 15

      Markus Seyfferth

      September 17, 2015 7:27 pm

      There will be a new certificate authority; free, automated, and available soon:

    • 16

      Have a look at – a new certificate authority that will provide free certificates in the near future. Hopefully all browsers will support it!

    • 17

      By the time HTTPS everywhere becomes that important, this should well be live:

      It’s not some wishful thinking self-signed SSL thingie, this is an initiative by The Linux Foundation and will end up in all major browsers.

  3. 18

    There is simply no reason for a setup this complex and running three different web servers to serve one site.

    First of all, nginx is more than capable of running your application, and will do that better and faster than Apache 90% of the time. It can also reverse proxy to itself.

    Finally, nginx is a darn capable caching reverse-proxy:

  4. 20

    Why not go all in with nginx and throw out varnish and apache? It’d be a much simpler config for most of the gain. nginx already supports SSL termination, it’ll serve static files on its own just as fast and its proxy/fast_cgi cache is “good enough” for a lot of setups.

  5. 23

    Rather than running a whole nginx server (or rather a second nginx server, since most of my server setups with varnish have nginx behind them instead of Apache) I’ve been using Pound for SSL termination.

    However, in the future I might switch to using nginx for that: not simply for SSL, but because I could also use nginx as an HTTP/2 terminator.

  6. 24

    Frédéric Kayser

    September 19, 2015 9:48 am

    Unless you’ll update this page every time a new security weakness not covered by default Nginx settings is disclosed, it would be wise to point to a trusted reference that will. Mozilla did a pretty good job explaining how to configure TLS on servers:


  7. 25

    For Varnish Plus users, SSL/TLS support was added earlier this year. More info on the Varnish website:

  8. 26

    I might be missing something, but when I use this setup I have a lot of issues on dynamic content sites like WordPress.
    When I visit a site using HTTPS, a lot of content is still sent back using HTTP. Does someone have an idea on this issue?

  9. 29

    Good writeup. I bumped into this after configuring a similar setup with SNI and HTTP/2 support for a multi-domain Drupal installation, and doing a search because I was curious what info is already publicly available on this subject.

    I have one question for you:
    Why do the redirection of HTTP to HTTPS in Varnish instead of nginx? If nginx is sitting on the outside of the stack, is it not easier to do the redirecting right there?

  10. 30

    Hello everyone, I followed the guide above and it works fine, but I have 3 websites with Varnish as the backend and I am only able to redirect one of them to HTTPS. How is it possible to redirect 2 or 3 websites to HTTPS with nginx?

    thank you

  11. 31

    I have this same setup and I can’t get HTTP to redirect to HTTPS. I’m running Apache and Varnish for HTTP, and for HTTPS I have nginx in front to handle the SSL. I have an e-commerce store and I need all traffic redirected to HTTPS.


↑ Back to top