
HTTPS Everywhere With Nginx, Varnish And Apache

The web is moving toward using HTTPS encryption by default. This move has been encouraged by Google, which announced that HTTPS would be a ranking signal [1]. However, moving your website to HTTPS is good for other reasons [2], too.

Rather than debate those reasons, this article assumes you have already decided to move to HTTPS. We’ll walk through how to move your website to HTTPS, taking advantage of Varnish Cache.

What’s The Problem With Varnish And HTTPS?

In previous articles on Smashing Magazine, I’ve explained how to use Varnish to speed up your website [3]. For those of us who use Varnish and also want to move to HTTPS, there is a problem: Varnish doesn’t support HTTPS [4]. If you make the move to SSL, configuring Apache to serve your website securely, then you lose the speed advantage of Varnish.

There is a relatively straightforward way to deal with this issue, and that is to stick something in between incoming SSL requests and Varnish, a layer that handles the secure connection and SSL certificates and then passes the request back to Varnish. For this task, we will use Nginx. You may know Nginx as a web server alternative to Apache, and it is. However, it can also be used as a proxy to handle and pass requests on to other services, which is what we are going to do here. In other words, we’re going to create a web server sandwich, with Varnish as the tasty cache-meat in the middle.

Where We Are And Where We Want To Be

I’m assuming you are in a similar situation to me and have a server — whether virtual or dedicated hardware — with a number of websites running on it. Some of those websites you want to make fully HTTPS, and perhaps some will remain HTTP for the time being.

Your current configuration would have every request on port 80 handled by Varnish. Varnish then decides, based on the rules added to your Varnish Configuration Language (VCL), whether to deliver a cached copy of the page or hand the request back to Apache for a new page to be created. Once the page hits Apache, the web server might need to pull information from the database or do other processing before delivering it.

By the end of this tutorial, we want to be in the following position:

  • Nginx will run on port 443 and handle incoming HTTPS requests, handing them off to Varnish.
  • Varnish will run on port 80 and handle incoming HTTP requests, including those from Nginx, delivering directly from cache or handing them to Apache.
  • Apache will run on port 8080 and do what Apache does: deliver your website or application.

In this situation, Nginx becomes a proxy. It does no processing of your website, and it isn’t running PHP or connecting to your database. All it does is accept the HTTPS requests and pass them back to Varnish. Varnish then decides whether to hand back a cached copy or pass it back to Apache to get a fresh one, using the Varnish rules you already have.

My Example Environment

I’m going to work in Vagrant, using Ubuntu Trusty. My starting point is as described above, with Apache installed on port 8080, and Varnish 4 installed on port 80.

If you would like to follow along, you can download my environment from GitHub [5]. Instructions on setting up are in the readme file.

I have two websites configured. If I visit those websites in a browser, Varnish will handle the request on port 80, either delivering the file from cache or passing it back to Apache.

At this point, it is useful to check which ports things are running on. SSH into Vagrant on the command line:

> vagrant ssh

Then, run netstat:

> sudo netstat -taupen

This will give you an output of ports, as well as information on which process is using them. You should find that Varnish is running on port 80 and Apache on 8080.

Netstat output [6]

You can also check that Varnish is running normally and serving pages from the cache by running the following:

> varnishstat
varnishstat output [8]

If you reload your page in the web browser, you should see cache hits and misses.

If you are using my VCL from GitHub, I’ve added some code to the Varnish configuration that sends a HIT or MISS header to the browser, so you can look at the headers being sent. You should see X-Cache: HIT if the page came from Varnish and X-Cache: MISS if it was served by Apache.

Viewing a HIT from Varnish in the headers
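If you prefer the command line, you can check the same header with cURL. This is just a quick check using my example domain (swap in your own); run it a couple of times and you should see X-Cache change from MISS to HIT once the page has been cached:

> curl -sI http://smashing_ssl_one.tutorials.eoms | grep X-Cache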

Installing Nginx

We can now install Nginx. On an Ubuntu system, this is as straightforward as issuing the following command:

> sudo apt-get install nginx

Nginx’s documentation [10] has information on installing Nginx on a variety of systems, as well as packages for systems that do not include it in their package management. Remember that we are only using Nginx as a proxy, so you don’t need to worry about configuring PHP or MySQL support. Nginx won’t start automatically after installation, and it couldn’t start right now anyway, because Varnish is already using port 80. This means that if you were doing this on a live server, you could safely run this step without any impact on your running websites.
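If you want to reassure yourself that nothing has changed yet, re-run netstat and filter for the two services; at this stage only Varnish should appear, still bound to port 80:

> sudo netstat -taupen | grep -E 'varnish|nginx'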

Create Or Install An SSL Certificate

The next step is to set up our SSL certificate. Because we are working locally, we can create a “self-signed” certificate in order to test SSL connections.

To create a self-signed certificate for testing, first choose or create a directory to put it in. I’ve created an nginx directory in /etc/ssl. Then, run the command below to generate the key and certificate pair.

> sudo openssl req -x509 -nodes -days 365 -newkey rsa:2048 -keyout /etc/ssl/nginx/smashing_ssl_one.tutorials.eoms.key -out /etc/ssl/nginx/smashing_ssl_one.tutorials.eoms.crt

When you run this command you will be prompted for a series of questions. You can mostly put junk in these; however, when prompted for the “Common Name,” use the domain that you type in the URL bar to access your website on Vagrant. For me, this is smashing_ssl_one.tutorials.eoms.

Creating a self-signed certificate [11]

If you look now in the folder you created, you should see two files, one with a .key extension and one with a .crt extension.
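You can also give the certificate a quick sanity check from the command line, using the path and domain from my example (adjust both to match your own setup):

> ls /etc/ssl/nginx
> sudo openssl x509 -in /etc/ssl/nginx/smashing_ssl_one.tutorials.eoms.crt -noout -subject -dates

The subject should show the Common Name you entered, and the dates tell you when the certificate expires.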

On your live server, you would purchase a certificate from an issuing authority. You would then be given the key and certificate files and, rather than create them, you would place them on your server before following the next step.
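If you go down the purchased route, it’s worth confirming that the key and certificate actually belong together before pointing Nginx at them. One way to do this for an RSA key (the file names below are placeholders for whatever your issuer gave you) is to compare the modulus of each; the two hashes should be identical:

> sudo openssl x509 -noout -modulus -in /etc/ssl/nginx/your_domain.com.crt | openssl md5
> sudo openssl rsa -noout -modulus -in /etc/ssl/nginx/your_domain.com.key | openssl md5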

Configure The SSL Websites In Nginx

With your self-signed or purchased SSL certificates in place, you can set up your websites in Nginx.

First, remove the default configuration file from /etc/nginx/sites-enabled. You can delete the default file or move it elsewhere.
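On Ubuntu, the default website in sites-enabled is normally just a symlink to sites-available/default, so removing it is safe; move it instead if you would rather keep a copy around:

> sudo rm /etc/nginx/sites-enabled/default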

We only need to configure websites that will be served over SSL; any other websites will continue to be served directly from Varnish on port 80. In my case, I’m going to configure smashing_ssl_one.tutorials.eoms. Wherever you see that domain in the steps below, you can replace it with your own live or local domain, if you are not using my example.

In /etc/nginx/sites-available/, create a configuration file named your_domain.com.conf.

In that file, add the following:

server {
  listen *:443 ssl;
  server_name smashing_ssl_one.tutorials.eoms;

  ssl on;
  ssl_certificate /etc/ssl/nginx/smashing_ssl_one.tutorials.eoms.crt;
  ssl_certificate_key /etc/ssl/nginx/smashing_ssl_one.tutorials.eoms.key;

  location / {

    proxy_pass            http://127.0.0.1:80;
    proxy_read_timeout    90;
    proxy_connect_timeout 90;
    proxy_redirect        off;

    proxy_set_header      X-Real-IP $remote_addr;
    proxy_set_header      X-Forwarded-For $proxy_add_x_forwarded_for;
    proxy_set_header      X-Forwarded-Proto https;
    proxy_set_header      X-Forwarded-Port 443;
    proxy_set_header      Host $host;
  }
}


The first line tells the server we are listening on port 443. This is the default port for HTTPS connections, just as port 80 is for HTTP. We then give the server name.

We set SSL to be on and then add the certificate and key that we created or installed, using a full file system path.

Under location, we use proxy_pass to pass the request back to port 80, where Varnish is waiting for it. We then set some headers, which will be passed through.

After adding this file, symlink the file in sites-available to sites-enabled. If you ever want to switch off the website, you can just delete the symlink. The following command will create a symlink on the command line:

> sudo ln -s /etc/nginx/sites-available/smashing_ssl_one.tutorials.eoms.conf /etc/nginx/sites-enabled/smashing_ssl_one.tutorials.eoms.conf

Then, restart Nginx:

> sudo service nginx restart

If you see the output restarting nginx nginx, followed by [fail], the likely problem is a typo in your configuration. My usual mistakes are separating keys and values with a colon or forgetting the semicolon at the end of a line.

If Nginx fails to start, look at the log in /var/log/nginx/error.log because most problems are self-explanatory.
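You can also ask Nginx to check the configuration before restarting; it will usually point straight at the offending line:

> sudo nginx -t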

You will see [OK] if Nginx starts up successfully. Now, if you check to see what is running on which port, you should see that Nginx is now on port 443, Varnish still has port 80 and Apache 8080.

> sudo netstat -taupen

The big test is to now visit the website using https://. If you are using a self-signed certificate, then you will have to step through the warning messages — your browser is warning you that the certificate is issued by an unknown authority.

Firefox warns me that the connection is untrusted [13]

If you see your page served securely with the padlock in the URL bar, then you are now serving HTTPS via Nginx. If you check the HIT or MISS headers or run varnishstat on the command line, you’ll be able to check that pages are being served from Varnish and not hitting Apache each time.
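cURL is useful here, too. The -k flag tells it not to reject the self-signed certificate (drop the flag once you have a real one); with my example domain, you should see the X-Cache header on the HTTPS response as well:

> curl -kI https://smashing_ssl_one.tutorials.eoms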

A secure site using a self-signed certificate [15]

Redirecting To SSL Using Varnish

Based on my own experience of doing this, you might want to tweak a few things.

If your website was running on HTTP and you want to run it on HTTPS, then you will need to redirect all HTTP requests. You can do this using Varnish. Varnish is sitting on port 80, handling any non-SSL requests. What we want to do is ask Varnish to spot any request for our website and redirect it to HTTPS.

In your VCL file at /etc/varnish/default.vcl, add a subroutine as follows:

# handles redirecting from http to https
sub vcl_synth {
  if (resp.status == 750) {
    set resp.status = 301;
    set resp.http.Location = req.http.x-redir;
    return(deliver);
  }
}

Then, in the sub vcl_recv block, add this:

if ( (req.http.host ~ "^(?i)smashing_ssl_one.tutorials.eoms") && req.http.X-Forwarded-Proto !~ "(?i)https") {
  set req.http.x-redir = "https://" + req.http.host + req.url;
  return (synth(750, ""));
}

You can view the full VCL, with this code included [17], on GitHub.

I am pattern-matching my domain and redirecting it to HTTPS with a 301 “moved permanently” code. So, now everything should be switched to SSL. Restart Varnish, and try to go to the HTTP version of the website and check that you are being redirected.
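On Ubuntu Trusty, restarting Varnish to pick up the new VCL should just be a case of:

> sudo service varnish restart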

Another useful check is to use cURL on the command line. The following command will return only the headers of your request. You should see that you are getting a 301 when testing the HTTP URL.

> curl -I http://smashing_ssl_one.tutorials.eoms
Redirect 301 headers with cURL [18]

If you seem to be getting a lot of cache misses on your website, then it would be worth checking which cookies are being stripped by Varnish. Varnish doesn’t cache content with cookies because it assumes that this is personalized content. However, things like Google Analytics cookies should not make your content uncacheable. In my example VCL, I’m dealing with some common cookies, but look at Mattias Geniar’s post [20] for a way to see which cookies are being sent to the back end so that you can deal with your unique examples.
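As an illustration only (this is not the exact code from my example VCL), stripping a few well-known analytics cookies in vcl_recv, before Varnish makes its caching decision, looks something like this:

# Strip Google Analytics cookies so that they don't prevent caching
sub vcl_recv {
  if (req.http.Cookie) {
    set req.http.Cookie = regsuball(req.http.Cookie, "(^|;\s*)(__utm[a-z]+|_ga|_gat)=[^;]*", "");

    # If nothing useful is left, remove the Cookie header entirely
    if (req.http.Cookie ~ "^\s*$") {
      unset req.http.Cookie;
    }
  }
}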

Grade A SSL

You’ve likely heard of the various vulnerabilities discovered in OpenSSL and SSL/TLS. If you are going to all the trouble of running your websites on HTTPS, then make sure you aren’t vulnerable to any of these issues.

Once you have a live website using SSL, a great way to check is to use the SSL Server Test [21] from Qualys SSL Labs. Add your domain name and wait for the test to run. The test checks for many common issues in SSL configurations — your aim is to pass with an A.

When I first ran this on a server with a similar setup to our example Vagrant installation — Ubuntu Trusty, Nginx, Varnish and Apache — I got a B rating, due to the server being vulnerable to the Logjam attack. The fix for this is detailed in “Weak Diffie-Hellman and the Logjam Attack” [22].

Back on your server, cd to the directory that you used to put or create SSL certificates, and run the following:

> openssl dhparam -out dhparams.pem 2048

This will create a file named dhparams.pem.

You can then add to your Nginx configuration the code detailed under “Nginx” on the “Weak Diffie-Hellman and the Logjam Attack” website.

server {
  listen *:443 ssl;
  server_name smashing_ssl_one.tutorials.eoms;

  ssl on;
  ssl_certificate /etc/ssl/nginx/smashing_ssl_one.tutorials.eoms.crt;
  ssl_certificate_key /etc/ssl/nginx/smashing_ssl_one.tutorials.eoms.key;
  ssl_dhparam /etc/ssl/nginx/dhparams.pem;
  ssl_protocols TLSv1 TLSv1.1 TLSv1.2;
  ssl_ciphers 'ECDHE-RSA-AES128-GCM-SHA256:ECDHE-ECDSA-AES128-GCM-SHA256:ECDHE-RSA-AES256-GCM-SHA384:ECDHE-ECDSA-AES256-GCM-SHA384:DHE-RSA-AES128-GCM-SHA256:DHE-DSS-AES128-GCM-SHA256:kEDH+AESGCM:ECDHE-RSA-AES128-SHA256:ECDHE-ECDSA-AES128-SHA256:ECDHE-RSA-AES128-SHA:ECDHE-ECDSA-AES128-SHA:ECDHE-RSA-AES256-SHA384:ECDHE-ECDSA-AES256-SHA384:ECDHE-RSA-AES256-SHA:ECDHE-ECDSA-AES256-SHA:DHE-RSA-AES128-SHA256:DHE-RSA-AES128-SHA:DHE-DSS-AES128-SHA256:DHE-RSA-AES256-SHA256:DHE-DSS-AES256-SHA:DHE-RSA-AES256-SHA:AES128-GCM-SHA256:AES256-GCM-SHA384:AES128-SHA256:AES256-SHA256:AES128-SHA:AES256-SHA:AES:CAMELLIA:DES-CBC3-SHA:!aNULL:!eNULL:!EXPORT:!DES:!RC4:!MD5:!PSK:!aECDH:!EDH-DSS-DES-CBC3-SHA:!EDH-RSA-DES-CBC3-SHA:!KRB5-DES-CBC3-SHA';

  ssl_prefer_server_ciphers on;

  location / {

   proxy_pass            http://127.0.0.1:80;
   proxy_read_timeout    90;
   proxy_connect_timeout 90;
   proxy_redirect        off;

   proxy_set_header      X-Real-IP $remote_addr;
   proxy_set_header      X-Forwarded-For $proxy_add_x_forwarded_for;
   proxy_set_header      X-Forwarded-Proto https;
   proxy_set_header      X-Forwarded-Port 443;
   proxy_set_header      Host $host;

  }
}


Reload Nginx and retest your website. Once you have achieved an A rating, you can periodically check your website to make sure you still have that A.
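A reload is enough for Nginx to pick up the new SSL settings without dropping existing connections:

> sudo service nginx reload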

Running the SSL Server Test to check for any SSL issues [23]

Check For Mixed Content Warnings

Your website may well have resources being loaded from other domains that are not HTTPS — this will cause a warning on your website. In many cases, the third party will have an HTTPS endpoint that you can link to. However, I had to remove the Lanyrd badges from my own website because the JavaScript was hosted only on HTTP.
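A rough way to hunt for offenders from the command line is to fetch a page over HTTPS and grep for resources still referenced over plain HTTP; here is a quick sketch using my example domain (your browser’s developer console will also list any blocked or insecure requests):

> curl -ks https://smashing_ssl_one.tutorials.eoms | grep -o 'src="http://[^"]*'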

Further Reading And Resources

I’ve added links to additional reading throughout this article. For your reference, here are those links, plus some extra resources I’ve found useful.

HTTPS and SSL

  • “10 Reasons to Go HTTPS” [25]
  • Moz’s SEO tips for moving to HTTPS/SSL [26]
  • “Weak Diffie-Hellman and the Logjam Attack” [27]
  • Qualys SSL Labs’ SSL Server Test [28]
  • StartSSL [29]
  • Let’s Encrypt [30]

Varnish

  • Mattias Geniar on seeing which cookies Varnish strips from requests [31]
  • The Varnish documentation on performance [32]

If you know of any other helpful resources, or if you’ve followed these steps and found some extra piece of information, please add it to the comments. It will help out the next person doing it.

Excerpt image: Yuri Samoilov [33]

(vf, ml, al)

Footnotes

  1. http://googleonlinesecurity.blogspot.co.uk/2014/08/https-as-ranking-signal_6.html
  2. https://medium.com/so-now-you-know/10-reasons-to-go-https-a2cba5734bb6
  3. https://www.smashingmagazine.com/2013/12/speed-up-your-mobile-website-with-varnish/
  4. https://www.varnish-cache.org/docs/trunk/phk/ssl_again.html
  5. https://github.com/rachelandrew/smashing-ssl-tutorial
  6. https://www.smashingmagazine.com/wp-content/uploads/2015/08/01-netstat-opt.jpg
  7. https://www.smashingmagazine.com/wp-content/uploads/2015/08/01-netstat-opt.jpg
  8. https://www.smashingmagazine.com/wp-content/uploads/2015/08/02-varnishstat-opt.jpg
  9. https://www.smashingmagazine.com/wp-content/uploads/2015/08/02-varnishstat-opt.jpg
  10. http://nginx.org/en/docs/install.html
  11. https://www.smashingmagazine.com/wp-content/uploads/2015/08/04-self-signed-opt.png
  12. https://www.smashingmagazine.com/wp-content/uploads/2015/08/04-self-signed-opt.png
  13. https://www.smashingmagazine.com/wp-content/uploads/2015/08/05-exception-opt.png
  14. https://www.smashingmagazine.com/wp-content/uploads/2015/08/05-exception-opt.png
  15. https://www.smashingmagazine.com/wp-content/uploads/2015/08/06-secure-opt.png
  16. https://www.smashingmagazine.com/wp-content/uploads/2015/08/06-secure-opt.png
  17. https://gist.github.com/rachelandrew/6a2693d3cd6fb9268756
  18. https://www.smashingmagazine.com/wp-content/uploads/2015/08/07-headers-redirect-opt.png
  19. https://www.smashingmagazine.com/wp-content/uploads/2015/08/07-headers-redirect-opt.png
  20. https://ma.ttias.be/varnish-tip-see-cookies-stripped-vcl/
  21. https://www.ssllabs.com/ssltest/
  22. https://weakdh.org/sysadmin.html
  23. https://www.smashingmagazine.com/wp-content/uploads/2015/08/08-ssl-test-opt.png
  24. https://www.smashingmagazine.com/wp-content/uploads/2015/08/08-ssl-test-opt.png
  25. https://medium.com/so-now-you-know/10-reasons-to-go-https-a2cba5734bb6
  26. https://moz.com/blog/seo-tips-https-ssl
  27. https://weakdh.org/sysadmin.html
  28. https://www.ssllabs.com/ssltest/index.html
  29. https://www.startssl.com/
  30. https://letsencrypt.org/
  31. https://ma.ttias.be/varnish-tip-see-cookies-stripped-vcl/
  32. https://www.varnish-cache.org/docs/4.0/users-guide/performance.html
  33. https://www.flickr.com/photos/110751683@N02/13792583873/in/photolist-n1NAEe-9buWwJ-77FJT6-sT5zR-iBJRvX-i3pDMJ-8BQtUA-9yxNTv-9yxPiz-7DCD83-9yxPcz-9yxP9v-bocGSi-9yxPfa-9yxNQk-9yAPWf-a7bwJ4-bocJpx-aWroXa-sT5tv-aoeuM3-9yAQ35-aobKqk-5nMYnS-bocHAz-5JiBqz-f3jqwu-9yAQdf-aK4Wjx-8BMooz-sT5ni-a92bRJ-9Y6fdL-5EdvaF-5vWMjq-sT5YN-iBJSit-6yGU4d-5vWMQd-sT5Sv-99CgZM-qi1eYu-5vStBK-5vSt9R-5vStir-5vWMKY-5vWMUL-cMjVKG-5vWMkN-5vWM4S

Rachel Andrew is a web developer, writer and speaker and one of the people behind the content management system, Perch. She is the author of a number of books including The Profitable Side Project Handbook. She writes about business and technology on her own site at rachelandrew.co.uk.

  1. Thanks for that article; I’ve just skimmed through it and there’s lots of good material in it.
     Although your solution of stuffing Nginx in between port 443 and Varnish should work quite alright, I want to point out that you can achieve the same thing by defining a second virtual host on Apache that supports SSL. Apache can listen on multiple ports, and that saves you from installing another web server.
     But that’s just my 2 cents, and thanks again for the article!
    • That doesn’t achieve the same thing, as you then lose the benefit of Varnish because your requests are going directly to Apache. The aim of this is to keep the caching benefit while enabling use of TLS.
      • No, you don’t. I meant keeping Varnish on port 80 and Apache on port 8080, as in your older article, then adding another VirtualHost on port 443 using Apache, with ProxyPass to Varnish. There’s even a default-ssl vhost on Apache which makes that easy.
        • There’s no technical reason why you can’t proxy back to Apache, but I presume the performance would be awful, especially if you’re wanting to do HTTPS by default, which is what this article is about. You’re basically asking Apache to be a proxy server, which means it’s going to respond to every HTTPS request and then pass those requests back to Varnish.

          If your site is mostly HTTP with a few pages served over HTTPS, you probably could get by with that setup, but I would take Varnish out of the mix at that point, since it’d be guaranteed to be slower than just having Apache serve it.
          • I see your point. If we consider taking out Varnish, my idea is of course overly complex and unnecessary. My original comment was based on using Varnish and Apache and enabling HTTPS, and that’s also the basis of this article.

            The solution presented was to install another web server and proxy to Varnish. I was just arguing that installing another web server is not necessary, as you can use Apache to proxy. Varnish themselves propose using HAProxy, which is a dedicated proxy and a good pick if you already have a website running and don’t want to start from scratch with another web server (Nginx here).

            Also, I’d really like to see a benchmark of Apache using libevent, no mod_php and no AllowOverride, which is a fair comparison, since most performance benchmarks don’t take that into account; see http://dracony.org/stop-using-php-fpm-to-argue-using-nginx-vs-apache/
          • Apache is mediocre at anything other than dynamic content, which is why most system architects throw something like Varnish in front of it to serve static files.

            There have been plenty of benchmarks comparing Apache to nginx (e.g. http://systemsarchitect.net/apache2-vs-nginx-for-php-application/ ). PHP performance over PHP-FPM using either server is pretty close, though nginx is more lightweight with less overhead.

            For pure PHP performance, Apache using mod_php (mpm_prefork) is going to be fastest; it’s got the PHP interpreter built in, so it doesn’t have to go through another traffic cop (PHP-FPM or FastCGI) to serve the page (http://www.eschrade.com/page/why-is-fastcgi-w-nginx-so-much-faster-than-apache-w-mod_php/ ). But that’s the only benchmark I’ve seen Apache win.

            For static files, however, if you have to load up the PHP interpreter just to serve an image, a high-traffic site will quickly burn through your max_clients and RAM (especially on a VPS). You’re right, the MPM and turning off looking for .htaccess files can make a slight difference, but nginx still trounces Apache. mpm_event and mpm_worker are definitely better at serving static files but nowhere near as good as nginx or Varnish. mpm_event was hyped as “faster” than nginx but it really didn’t improve on mpm_worker and in some cases was actually much slower.

            As always, YMMV. nginx’s docs aren’t as good and the community isn’t as large. Unless your website gets a ton of traffic, PHP-FPM and mpm_worker can be a decent combination. I have a few small to medium-sized websites running on a VPS in that very combination.
        • That doesn’t sound like a great plan for performance, but if you do it and it works well then please write it up and add a link. This is obviously not the only way to achieve this but after a bit of research and testing it seemed like a reasonable way to achieve it – for the time being anyway.
      • Why would you not just use the caching built into nginx and eliminate Varnish entirely?

        That’s how Cloudflare, MaxCDN, and my $dayjob at a SaaS provider use nginx.

        Nginx’s proxy_cache directives are extremely fast and quite flexible, especially as you can specify which variables from the request go into the proxy_cache_key.
  2. Great walkthrough and very helpful. I also want to switch to HTTPS, but my problem is how to pay for the certificates. I’ve got some sites running, and it would be hard to sell my (small and mid-sized business) customers a certificate for a few hundred Euros every year. And I don’t want to scare users with the browser warning from self-signed certs either.
     I think this is the biggest hurdle to switching the web to HTTPS at the moment.
     Any ideas?
    • Keep an eye on https://letsencrypt.org/

      Making SSL free for everyone.
    • Assuming you’re not running stores/payment gateways, you can purchase some of the more inexpensive options from a site like ssls.com. A basic certificate will run you in the neighborhood of $10/year.
      • I’ve used ssls.com; they’re now owned by Namecheap, which is a solid registrar, so I’d recommend them as well. But don’t be fooled: a $10 cert will work as well as a more expensive option. You’re more or less paying for extra “insurance.” IMO the only reason you’d go for a more expensive option is to secure multiple subdomains or to get an extended validation (EV) cert (which gives you the green bar in many browsers, though users generally don’t notice it anyway as long as your SSL is working).

        The problem with SSL isn’t really certs IMO; it’s that you also need another IP address for every domain you host, unless you don’t care about older (but still used) versions of Android and Windows that don’t support SNI.
    • Markus Seyfferth (September 17, 2015, 7:27 pm):

      There will be a new certificate authority; free, automated, and available soon:

      https://letsencrypt.org
    • Have a look at https://letsencrypt.org – a new certificate authority that will provide free certificates in the near future. Hopefully all browsers will support it!
    • By the time HTTPS everywhere becomes that important, this should well be live:

      https://letsencrypt.org/

      It’s not some wishful-thinking self-signed SSL thingie; this is an initiative by The Linux Foundation and will end up in all major browsers.
  3. There is simply no reason for a setup this complex and for running three different web servers to serve one site.

     First of all, nginx is more than capable of running your application, and will do that better and faster than Apache 90% of the time. It can also reverse-proxy to itself.

     Finally, nginx is a darn capable caching reverse proxy:

     https://serversforhackers.com/nginx-caching/
  4. Why not go all in with nginx and throw out Varnish and Apache? It’d be a much simpler config for most of the gain. nginx already supports SSL termination, it’ll serve static files on its own just as fast, and its proxy/fastcgi cache is “good enough” for a lot of setups.
  5. Rather than running a whole nginx server (or rather a second nginx server, since most of my server setups with Varnish have nginx behind them instead of Apache), I’ve been using Pound for SSL termination.

     However, in the future I might switch to using nginx for that, not simply for SSL, but because I could also use it as an HTTP/2 terminator.
  6. Frédéric Kayser (September 19, 2015, 9:48 am):

     Hello,
     Unless you’ll update this page every time a new security weakness not covered by the default Nginx settings is disclosed, it would be wise to point to a trusted reference that will. Mozilla did a pretty good job explaining how to configure TLS on servers:
     https://wiki.mozilla.org/Security/Server_Side_TLS

     Cheers
  7. For Varnish Plus users, SSL/TLS support was added earlier this year. More info on the Varnish website:

     https://www.varnish-software.com/plus/ssl-tls-support
  8. I might be missing something, but when I use this setup I have a lot of issues on dynamic content sites like WordPress.
     When I visit a site using HTTPS, a lot of content is still sent back using HTTP. Does anyone have an idea about this issue?
  9. Good write-up, and very helpful. I bumped into this after configuring a similar setup with SNI and HTTP/2 support for a multi-domain Drupal installation, while searching to see what information is already publicly available on this subject.

     I have one question for you:
     Why do the redirection from HTTP to HTTPS in Varnish instead of in Nginx? If Nginx is sitting on the outside of the stack, is it not easier to do the redirecting right there?
  10. Hello everyone. I followed the guide above and it works fine, but I have three websites with Varnish as the back end, and I am only able to redirect one of them to HTTPS. How is it possible to redirect two or three websites to HTTPS via Nginx?

      Thank you
  11. I have this same setup and I can’t get HTTP to redirect to HTTPS. I’m running Apache and Varnish for HTTP, and for HTTPS I have Nginx in front to handle the SSL. I have an e-commerce store and I need all traffic redirected to HTTPS.
