
Why Static Website Generators Are The Next Big Thing

At StaticGen, our open-source directory of static website generators, we’ve kept track of more than a hundred generators for more than a year now, and we’ve seen both the volume and popularity of these projects take off incredibly on GitHub during that time, going from just 50 to more than 100 generators and a total of more than 100,000 stars for static website generator repositories.

Influential design-focused companies such as Nest and MailChimp now use static website generators for their primary websites. Vox Media has built a whole publishing system around Middleman. Carrot, a large New York agency and part of the Vice empire, builds websites for some of the world’s largest brands with its own open-source generator, Roots. And several of Google’s properties, such as “A Year In Search” and Web Fundamentals, are static.

StaticGen’s graph of growth over the last year.

Static websites are hardly new, going all the way back to the beginning of the web. So, why the sudden explosion in interest? What’s up? Why now?

When Static Was It

The first ever website, Tim Berners-Lee’s original home page for the World Wide Web, was static. A website back then was a folder of HTML documents that consisted of just 18 tags. Browsers were simple document navigators that would fetch HTML from a server and allow the end user to navigate them by following hyperlinks. The web was fundamentally static.

As browsers evolved, so did HTML, and gradually the limitations of purely static websites started to show.

Initially, websites were just plain unstyled documents, but soon they grew into carefully designed objects, with graphical headers and complex navigation. By that point, managing each page of a website as its own document stopped making sense, and templating languages entered the picture.

It also quickly became evident that reserving HTML for structure and CSS for style was not enough of an abstraction to keep the content of a website (the stories, products, gallery items, etc.) separate from the design.

Around the same time, SQL-based relational databases started going mainstream, and for many online companies, the database became the almost-holy resting place of all of their content, guarded by vigilant, long-bearded database administrators.

Desktop applications such as Dreamweaver and FrontPage offered solutions for building content-driven websites through WYSIWYG editors, where pages could be separated into reusable parts, such as navigation, headers and footers, and where content to some degree could be put in a database. In some ways, fatally flawed as they were, these were the original static website generators: building websites from templates, partials, media libraries and sometimes even SQL databases, and publishing them via FTP as static files. As late as 2004, I had the unique experience of working on a major content-driven website, with tens of thousands of pages spread across different editorial groups, all managed via Dreamweaver!

Even if Dreamweaver could, to some degree, integrate with a database, it had no content model, offering no sense of content being separate from design, each half being editable independently with the appropriate tools.

The most mainstream answer to these problems was the LAMP stack and CMSs such as WordPress, Drupal and Joomla. All of these played an incredibly important role in moving the web forward, enabling the Web 2.0 phenomenon, in which user-generated content became a driving factor for a lot of websites. Users went from following hyperlinks to ordering products, participating in communities and creating content.

Dynamic Problems

When I built my first dynamic website more than 15 years ago, I was following the original LAMP-stack tutorials from the MySQL documentation. When I realized that all of this stuff was going on every time someone visited a website that was built like this, it blew my mind!

A web server would load my code into a PHP interpreter, on the fly, and then open connections to a database, sending queries back and forth, using the data in templates and stitching together strings of text into an HTML document, tailor-made for the visitor at that moment. Amazing!

It was, admittedly, a bit less amazing when I visited the website a few years later and found the whole web page replaced with a message from a hacker who pointed out the security flaws in the configuration and was at least generous enough just to deface the website, rather than use it as a vehicle to spread malware.

This dynamic website architecture moved the web forward, but it also opened a can of worms. By a conservative estimate, more than 70% of today’s WordPress installations are vulnerable to known exploits (and WordPress powers more than 23% of the web). Just a few months ago, up to 12 million Drupal websites needed emergency patching, and any that weren’t patched within 7 hours of the exploit’s announcement should be considered infected with malware. Not a week goes by when I don’t follow a link from social media to a website that shows a “Database connection error.”

Scaling a dynamic website can be very expensive, and agencies that launch a campaign website or the like often have to overprovision drastically in order to guard against the website blowing up if it manages to go viral — or else they have to desperately scale it while trying to get it back online (something that never seems to happen during office hours).

We pay a huge price for the underlying complexity of dynamic code running on a server for every request — a price we could avoid paying entirely when this kind of complexity is not needed.

Dynamic Websites And Caching

To some degree, we tend to work around this by caching. No high-profile WordPress website would be capable of running without a plugin such as WP Super Cache. Large websites no doubt rely on proxy caches such as Varnish, Nginx and Apache Traffic Server in front of their websites.

Caching is notoriously difficult to get right, however, and even the most optimized dynamic website will normally be many times slower than a static solution.

This website, Smashing Magazine, is obviously run by one of the most performance-focused teams out there and is, in general, very heavily optimized for performance. So, I ran a small experiment for this article. Using HTTrack, I grabbed a copy of this website one level deep and then deployed the static version to Netlify, a static-hosting platform based on a content delivery network (CDN). I didn’t do anything to improve performance of the static version apart from simply deploying to a host with deep CDN integration.

Smashing Magazine is faster than most websites, but it serves all requests from a single data center.

I then ran some tests to see how this affected the time to first byte and the complete download time of the main index.html page. Here’s what Sucuri’s super-useful performance tool showed.

Even with a highly optimized dynamic website, the static version is more than six times as fast on average! Granted, not every static host will make this kind of difference, but for a dynamic website this level of CDN-based caching simply wouldn’t be possible without manual configuration, at least not without introducing really weird caching artifacts.

The exact same HTML served from a high-performance static host.

Caching and, more specifically, cache invalidation is extremely hard to get right with a dynamic website, especially the kind of distributed caching required to take full advantage of a CDN. With a WordPress website, there’s no guarantee that the same URL won’t return different HTML depending on whether the user is logged in, query parameters, ongoing A/B tests and so on. Keeping track of when a page needs to be invalidated in the cache is a complex task: Any change to a comment, global website setting, tag, category or other content in the database could lead to changes in the lists of related posts, index pages, archive, comment counters, etc.

Static websites are fundamentally different in this regard. They stick to a really simple caching contract: Any URL will return the same HTML to any visitor until the specific file that corresponds with that URL is explicitly updated.
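That contract is easy to honor mechanically. As a sketch (not any particular host’s implementation), a static file’s cache validator can be derived purely from its bytes, so a cached copy stays valid until the file itself is republished:

```python
import hashlib
import pathlib

def etag_for(path):
    # The tag depends only on the file's bytes: every visitor gets the
    # same response for a URL until the underlying file is replaced.
    return hashlib.sha256(pathlib.Path(path).read_bytes()).hexdigest()[:16]
```

A dynamic page has no equivalent shortcut, because its bytes can vary per visitor and per request.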

Working with this caching contract does impose constraints during development, but if a website can be built under these constraints, then the difference in performance, uptime and cost can be enormous.

The Modern Static Website Generator

In recent years, this alternative to the traditional dynamic infrastructure has gained ground. The idea of a static website generator is nothing new. Even WordPress’ largest competitor back in the day, Movable Type, had the option of working as a static website generator.

Google Trends for “static website generator”.

Since then, a lot of the constraints that made static websites lose out have fallen away, and today’s generators are modern, competitive publishing engines with a strong appeal to front-end developers.

More static website generators are released every week, and keeping up with developments can be hard. In general, though, the most popular static generators share the following traits.

Templating

Allowing a website to be split into layouts and includes to get rid of repetition is one of the basics of static website generators. There are myriad template engines to choose from, each with its own tradeoffs — some being logic-less, some inviting a mixture of template and code, all allowing you to get rid of duplicate headers, footers and navigation.
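As a toy illustration of the idea (using Python’s built-in string.Template rather than any real engine, with made-up layout and partials), the shared parts are defined once and each page supplies only its content:

```python
from string import Template

# Hypothetical layout and partials, defined once for the whole site.
LAYOUT = Template("<html><body>$header\n$content\n$footer</body></html>")
HEADER = "<nav>Home | About</nav>"
FOOTER = "<footer>(c) Example Site</footer>"

def render(content):
    # Every page gets the same header and footer without duplicating them.
    return LAYOUT.substitute(header=HEADER, content=content, footer=FOOTER)
```

Real template engines add includes, inheritance and escaping on top, but the repetition-removal principle is the same.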

Markdown Support

The rise of Markdown is likely one of the primary reasons why static website generators have become so popular. Few people would dream of writing all of their content in BBCode or pure HTML, whereas Markdown is very pleasant to work with, and Markdown editors for serious writing, note-taking and blogging seem to be exploding in popularity.

All of the major static generators support Markdown. Some swear by reStructuredText or an alternative markup format. In general, they all allow content developers to work by writing plain-text documents in a structured format.

This approach keeps content and design separate, while keeping all files as plain text. As developers, we’ve grown accustomed to an amazing suite of tools for working with plain text, so this is a huge step up from having all content dumped into a database as binary blobs.

Meta Data

Content rarely stands completely on its own. Readers will often want to know the author of a blog post, the date of the post, the categories it belongs to and so on.

Jekyll pushed the idea behind static site generators forward: a whole site powered by plain Markdown documents and templates.

When GitHub’s own Tom Preston-Werner wrote Jekyll to power his blog, he came up with a really interesting solution for representing meta data when working primarily with Markdown documents and templates: front matter.

Front matter is the bit of meta data, typically in YAML format, at the very top of a document:

---
title: Title of the document
categories:
  - Category A
  - Category B
---

# Actual content

This is the document

This makes annotation of single-file documents with meta data straightforward, and it lends a simple human-readable text format to all of the data that would normally be stored in a much more opaque format in a database.
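The mechanics are simple enough to sketch. This minimal version is not any particular generator’s implementation: real generators hand the block between the --- fences to a full YAML parser, while this sketch handles only simple key: value lines.

```python
def split_front_matter(text):
    # No opening fence means no front matter at all.
    if not text.startswith("---\n"):
        return {}, text
    # Split off the block between the first two "---" fences.
    _, raw, body = text.split("---\n", 2)
    meta = {}
    for line in raw.splitlines():
        if ":" in line:
            key, _, value = line.partition(":")
            meta[key.strip()] = value.strip()
    return meta, body.lstrip("\n")
```

The document’s meta data and its Markdown body come back as two separate values, which is exactly the split a templating step needs.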

Asset Pipeline

Front-end development today almost always involves several build tools and compilers. We want our assets to be minified and bundled. CSS preprocessors have gone from oddities to mainstream tools. And CoffeeScript and ECMAScript 6 transpiling have made compilers an integrated part of programming for the browser.

Most modern static website generators include an asset pipeline that handles asset compilation, transpiling, minification and bundling. Some are based on build tools, such as Grunt, Gulp and Broccoli, and let you hook into whole ecosystems of tasks and build steps. Others are more focused on streamlining a particular process or making sure a certain set of tools work well together without any complex configuration. Live browser refreshing when a file is saved has also become standard for many generators.
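As a crude sketch of a single pipeline step (real pipelines chain many such transforms, and production minifiers are far more careful), a CSS “minifier” might just strip comments and collapse whitespace:

```python
import re

def minify_css(css):
    # Strip /* ... */ comments, then collapse runs of whitespace.
    css = re.sub(r"/\*.*?\*/", "", css, flags=re.S)
    css = re.sub(r"\s+", " ", css)
    # Remove spaces around structural punctuation.
    return re.sub(r"\s*([{};:,])\s*", r"\1", css).strip()
```

The build tool’s job is to run steps like this over every asset and write the results next to the generated HTML.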

Putting It All Together

A static website generator typically comes with a command-line UI for building a website or running a local server with your website for development.

Jekyll, for example, comes with a jekyll build command that you run from within a folder containing the source files for a Jekyll project, and it then outputs a completely static website in a _site subfolder.

Here’s what a simple source folder looks like (a typical Jekyll layout):

_config.yml
_layouts/
  default.html
_posts/
  2015-11-02-welcome.md
css/
  main.css
index.html

Running jekyll build on a folder like this writes the finished website to the _site subfolder. That folder is a completely self-contained static website that can be uploaded to any static host or served from any normal web server.
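The build step itself is conceptually tiny. This is not Jekyll’s implementation, just a deliberately minimal sketch of what a build command does (the folder names and layout string are made up): read every source page, wrap it in a shared layout, and write plain files to an output folder.

```python
import pathlib

LAYOUT = "<html><body>{content}</body></html>"

def build(src="content", out="_site"):
    out_dir = pathlib.Path(out)
    out_dir.mkdir(parents=True, exist_ok=True)
    for page in sorted(pathlib.Path(src).glob("*.html")):
        # Each source page becomes one self-contained output file.
        html = LAYOUT.format(content=page.read_text())
        out_dir.joinpath(page.name).write_text(html)
```

Everything expensive happens here, at publish time, never per request.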

Why Now?

If all of this sounds pretty awesome, that’s because it is. But why is static website technology taking off now, and why did the early generators fail to make a dent in WordPress’ dominance? What’s changed? And how far can we take this?

Today’s generators play into a totally different ecosystem than their predecessors. Many of the constraints that made dynamic websites the best option for creating anything but the most basic online brochure have fallen away, although some remain.

The Browser Is Growing Up

When Tim Berners-Lee launched the first website of the World Wide Web, a browser was a simple document viewer that could display hypertext, links and little else.

Today, we’re finally in the process of burying the last browser that has been holding the web back (RIP Internet Explorer 8). The modern browser is an operating system in its own right, no longer merely displaying documents downloaded from the web, but capable of running full-fledged web applications, making external calls to any CORS-compatible API, storing data locally, opening WebSockets to streaming servers, and even handling peer-to-peer connections to other browsers via WebRTC.

With the maturation of browsers, many features that used to require dynamic code running on a server can be moved entirely to the client. Want comments on your website? Add Disqus, Isso or Facebook comments. Want social integration? Add Twitter or Facebook’s JavaScript widget to your website. Want real-time data updating live on your website? Add a squirt of Firebase. Want search? Add Swiftype. Want to add live chat support? Olark is there. Heck, you can even add an entire store to a static website with Snipcart.

The list goes on and on, as a whole ecosystem of purely browser-based add-ons to websites is emerging. Apart from that, modern web apps built with Ember.js, AngularJS or React are often deployed entirely as static websites and served directly from a CDN with a pure API back end that’s shared between the website’s UI and the mobile client.

The CDN Is Going Mainstream

When Akamai launched the first content delivery network in 1999, only the largest web properties in the world could afford to deliver their web assets from CDN edge nodes distributed all over the world. It wasn’t that long ago that CDNs were used only by companies at the scale of CNN and Facebook, rather than mere mortals.

While Akamai still has enterprise-level pricing, today anyone can sign up for Amazon AWS and put CloudFront on top of their website. Also, companies like Fastly, MaxCDN and CloudFlare offer CDN services at prices that even a small business can afford.

You could use a CDN with a dynamic website, but cache invalidation is one of those notoriously tricky problems in computer science. Getting the right balance between caching on edge nodes and running a dynamic system on a back end that is potentially doing ad-hoc computations on each request is tricky, to say the least.

A static website, on the other hand, can be readily deployed directly to a CDN and served straight from local caches near end users. Fiddling with the configuration still takes some time, and cache invalidation can be tricky, but it’s doable and can be completely automated with services such as Netlify.

Performance Is A Must

The explosion of mobile devices has changed the face of the web in many ways. More and more visitors are coming to the web from a mobile device, sometimes on a 3G connection. Never has performance been as important as it is now.

We all know this data: 57% of online visitors will abandon a page if it takes longer than 3 seconds to load. People used to be willing to wait up to 10 seconds, but expectations are way higher today. And on mobile, where there’s no multi-tasking and little else to do, waiting for a website to load is so frustrating that more than 4% of people report that they’ve physically thrown their phone while using a slow mobile website!

No matter how much you optimize a dynamic website for performance or how many thousands of dollars you throw at it, it will never give you the same basic performance guarantee as a well-tuned static website hosted right on a CDN for a few bucks a month. With performance constantly growing in importance, it’s no wonder that developers are looking for ways to pre-generate their HTML, instead of letting the server spend time and resources on generating a page for every single HTTP request.

Static website generation also eliminates a lot of performance concerns during the development process.

If you’re building a dynamic database-driven website, the efficiency of the database queries you’re making is extremely important because they’ll need to be fast enough to run once for every single HTTP request. Even if a solid caching layer lies on top of your website, there’s often a risk that some requests will effectively work as cache-busters, triggering worst-case queries in the back end and causing the whole system to grind to a halt.

With a static-generated website, it doesn’t matter much if pulling content into a template takes a few seconds more or less: That only happens when you publish, and there will never be a performance penalty for end users.

Build Tools Are Everywhere

Compilers and build tools used to be something that C and Java programmers worried about, not something you would wield while building a website. Now, for better or worse, that has completely changed.

Today, front-end developers have adopted build tools, package managers and various kinds of compilers and transpilers wholesale. Grunt was the first front-end build tool to go mainstream, and now most new projects will have build steps.

With this prevalence of build tools, static website generators feel like a much more natural part of the front-end toolkit, whereas the traditional PHP-based tools for dynamic websites are starting to feel strangely alien to the modern front-end workflow.

What’s Missing?

All of these forces have come together to create something like a perfect storm for static website generators, and it’s no wonder that more and more websites are being built statically.

It’s not all roses, though. Before static websites go fully mainstream, a few areas need to evolve.

Picking a static website generator and starting a project can still be a surprisingly rough experience the first time around. There are a lot of ins and outs and a lot of room for improvement in the tools, documentation and resources available.

While the ecosystem around static website generators is growing, it’s still a far cry from the mature theme marketplaces and support services of traditional dynamic platforms.

The biggest missing piece of the puzzle, however, is content editing. While working directly in Markdown in a text editor and pushing to GitHub is close to the ideal workflow for a front-end developer, it’s not something you’d get normal, non-technical end users to participate in.

Because of this, many websites built with static website generators currently end up being migrated to a dynamic CMS. There’s a huge need to bridge the gap between content editors and static website generation. Before that happens, static website generation will be reserved for a relatively small subset of today’s websites.

Some interesting “no-CMS” solutions are out there. The Verge has been using Google Sheets as a content layer for Middleman; StaticGen uses Gist and the GitHub API as a kind of database; and Carrot uses Contentful as a static CMS to let non-techies produce content for its statically generated websites.

Others are working on tackling the general problem of how to best mix static website generation and content editing, and the coming years will no doubt bring exciting new ways of working with content and publishing.

Systems such as Contentful, Prismic.io and GatherContent decouple the CMS layer from the actual website builder. This makes them really interesting tools for multi-channel content management, where you’re writing content not just for a particular website, but also for a mobile app, a Facebook page or a white paper. Publishing new content triggers a webhook in a build system; then, a static website generator runs the build and fetches data from the content API; and the result is pushed straight to a CDN.
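That publish-to-deploy flow reduces to plain function composition. In this sketch the three callables and the event field are hypothetical stand-ins: fetch_content would call the content API, build would run the static generator, and deploy would push the result to a CDN.

```python
def on_publish(event, fetch_content, build, deploy):
    # Triggered by the CMS webhook whenever an editor publishes content.
    entries = fetch_content(event["content_type"])  # pull fresh content
    site = build(entries)                           # regenerate the site
    return deploy(site)                             # push output to the CDN
```

Because every publish rebuilds from the API, editors never touch the generator directly; they only see the CMS.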

Another option for content editing is to work directly on the underlying repository.

Prose.io’s Markdown editor, which seamlessly integrates with GitHub’s API.

Prose.io has been around for a while now, integrating with GitHub’s API to give content editors a somewhat gentler UI in which to edit Markdown files in GitHub.

At Netlify, we’re working on an open-source CMS, without any lock-in to a particular static website generator, Git host or hosting platform. The goal is to make it work with almost all current static website generators, and we think it will be a great way to push the limit of what kind of website you can build within the constraints of modern static website technology.

Obviously, there will always be websites that are simply not a good fit for static generation — especially ones whose core content is a constantly updating feed or ones with an extremely high volume of content that relies heavily on search and filtering.

That being said, static website generators will continue to grow in capability and popularity. The infrastructure and ecosystem will keep maturing. And as the tools improve, we’ll see developers push the limit of what can be done with static websites.

At Netlify, we’re already starting to see large content-driven websites, with real-time search, multi-language regions and content, and private sections being built with static website generators and content APIs. With awareness of the importance of performance and security increasing, you can expect to see much more of this.

(al, ml, jb)



Matt Biilmann has been building developer tools, content management systems and web infrastructure for more than a decade. He is co-founder and CEO of Netlify, the premium static hosting platform. In his spare time he drinks Jazz and listens to Beer, while helping to organize the SF Static Web-Tech Meetup.

  1. 1

    Brian Rinaldi

    November 2, 2015 4:07 pm

    As you already know, I couldn’t agree more, which is why I have been talking heavily on this topic over the past year.

    If people are new to static site generators, my free O’Reilly ebook is a good primer –

    And for people looking to choose which engine they may want to use, my static site samples repository on GitHub may be helpful to see samples using various engines.

    • 2

      Definitely recommend Rinaldi’s O’Reilly ebook for a primer! And the Static Site Samples repo is a super-useful resource when comparing different static site generators!

    • 3

      Another resource for getting started with a static website generator, geared specifically towards blogging, is my Pluralsight course, Build a Better Blog with a Static Site Generator.

      Pluralsight is a paid site, but Mathias, if you would like to give away a few one-month memberships to your readers, I can probably swing it. I’m assuming you can see my email, but if not, I’m jeffa00 on Twitter.

  2. 4

    For a complete list of open-source flat-file CMSs, check out:

    • 5

      There’s a huge difference between a flat-file CMS and a static site generator.

      Flat-file CMSs are still dynamic websites; they’re just storing data on the file system instead of in a database. This can make them simpler to install, and it does give the benefit of having all the content in plain text, but it makes them significantly harder to scale than database-backed CMSs (it’s much harder to distribute a file system with good performance than a database), and it doesn’t make any difference in regard to security vulnerabilities or cacheability.

  3. 8

    Moving from enterprise back-end development in performance-sensitive applications into web development, it did not feel right to put sites on top of dynamic database queries that would return essentially the same content to every user and generate nearly identical HTML every time.

    Beyond the performance improvements and the avoided security issues and database maintenance, a static website allows for increased stability simply by versioning updates.

    Is there a bug in the latest release? Instead of digging into the source code or restoring the database, simply roll back to the last stable static version and stability is restored.

    No moving parts once a site is generated, other than serving it up.

  4. 9

    Mohammad Ashour

    November 2, 2015 5:02 pm

    One thing that has turned me off static website generators thus far is i18n. I have to work with bilingual sites, and while I’ve only looked into Jekyll, I found its i18n support lacklustre. That said, there’s something refreshingly simple about static generators, and I’m looking forward to the kinks being ironed out.

    • 10

      I’m not familiar with Jekyll’s i18n approach, but Middleman’s seems pretty solid – maybe you should take a look at their docs.
      I’m planning on using it for a multi-lingual site very soon, and judging by past experiences with WordPress and multiple-language plugins, which have been awful, this should be a welcome relief.

    • 11

      Mohammad, not sure if you’ve seen Octopress (Jekyll enhancements) Multilingual, but it may help you get there.

      – Bud
      {static is} The New Dynamic

  5. 12

    I like the idea of a static website generator, but as mentioned, it’s not always the best choice. I don’t think I would drop my tools or stop using various other tools. But that is more for apps; these generators would really suit quick building of blogs.

  6. 13

    Chad Campbell

    November 2, 2015 6:12 pm

    Great article, Matt!

  7. 14

    Jarrod Medrano

    November 2, 2015 7:29 pm

    I have been using and I’m liking it much more than Jekyll. It’s purely JS-based, meaning no Ruby! Also check out, it looks very promising.

    • 15

      Why would “no Ruby” be an advantage for a statically built website?

      • 16

        He may think that Ruby is at its sunset. Which appears to be true.

      • 17

        I think he uses a JavaScript / Node workflow for most things – with Grunt or Gulp, as many people do currently. If everything you use is in Node and you add Jekyll, you have to care about Ruby and Ruby versions. That’s a part of your environment you no longer need to care about if you use everything in the same language. Ruby especially was very sensitive to version incompatibilities and problems…

  8. 18

    Hi Matt, what a great and interesting read.
    So maybe I’m reading your article too much through my CMS-centered (Drupal) view, but I’m missing some aspects, the most obvious one being dynamic content.
    There is no way to generate data (like lists) based on, for instance, popular content.

    I get how this could work well for something like a blog, but when you start on bigger stuff you quickly start to incorporate a lot of external calls: calling services for commenting, search, social media, chat, etc. These external calls, in my experience, usually have a pretty negative effect on performance.

    Then there are some real showstoppers, like having users on your site, content aggregation, exposing your data through an API, etc.
    While it looks very promising, I think there is definitely a limit on what you can do, or at least for now.

    • 19

      JavaScript could be your answer here? If you need to make external calls then you’d be using some sort of API, and making all your DOM adjustments via JS?

      Obviously if you’re integrating into lots of other systems then maybe this isn’t a path you’d go down?

    • 20

      Thanks Boris!

      Of course I agree that there are situations where rendering HTML on a server with a traditional dynamic approach is the right way to go – no question about it. However, today it’s basically the default approach to just about every website, and I think it ought to fundamentally be the other way around.

      Apart from that, I think you would be surprised by just how far you can take a static approach. At Netlify we’re seeing large sites with i18n, 10k+ pages and complex search and filtering options being built completely statically.

      Sometimes we also see hybrid approaches, especially with people migrating from a dynamic to a static site, where they use our proxying features to set up a build process for a static site and then proxy any URL that doesn’t resolve to a static asset from our CDN servers to their original dynamic backend. This means they can gradually make as much of the site static as they feel makes sense, and leave some specific areas running against a dynamic backend…

      We’re working on getting some case studies put together on some of these projects, so stay tuned :)

      • 21

        Thanks for the reply, it makes a lot of sense to have dedicated technologies for specific use cases.

        But then two final questions, as I feel the editorial workflow will be much longer / more complicated.

        How about media handling, images in particular? One set-it-and-forget-it feature in a CMS would be image derivatives. So a CMS user would upload a big image to a story and hit save. That results in a big header image on the post itself, a thumbnail on the post overview and maybe a slightly bigger one in the ‘featured post’ section.

        Does this mean that in a static generator I have to go and resize 3 images manually in image-editing software, place them manually and so forth?

        In the same workflow, when the ‘featured post’ changes, I have to manually edit this block, put up the thumbnail, copy-paste the title and tagline, maybe change the author name to the author of the new featured post, go to the article, place the featured-post icon…

        In both cases this feels like moving from a one-click action to something that takes a couple of minutes.

        • 22

          That’s not the case at all. Typically you resize the images during the build process for the site.

          There are lots of different takes on this. For our blog at Netlify, we just upload a source image and then use our simple Jekyll Srcset extension to resize the images to thumbnails and create the needed versions for different pixel ratios, etc…

          It’s all just part of the build process, and you can use whatever image tools you like. A really simple example of this kind of processing is Exposé, a simple Bash-based static site generator for creating photo essays. You just feed it a folder with images and captions, and then it handles all the needed image manipulation…

          Another approach is to upload the images directly to a cloud service like Cloudinary, which will do all kinds of image manipulation on the fly…
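          To make the build-time approach described in the replies above concrete, here is a minimal sketch in plain JavaScript. The function name, file-naming scheme and width list are illustrative assumptions, not taken from any of the tools mentioned; it only shows the kind of string work a generator plugin does once the resized files exist:

```javascript
// Sketch: derive a srcset attribute value for a set of pre-rendered image widths.
// Assumes the build step has already written files like "hero-320.jpg",
// "hero-640.jpg", etc. next to the original "hero.jpg" (hypothetical naming).
function srcsetFor(filename, widths) {
  const dot = filename.lastIndexOf(".");
  const base = filename.slice(0, dot);
  const ext = filename.slice(dot);
  return widths
    .map((w) => `${base}-${w}${ext} ${w}w`)
    .join(", ");
}

// A template would then emit something like:
// <img src="hero-640.jpg" srcset="..." sizes="100vw">
console.log(srcsetFor("hero.jpg", [320, 640, 1280]));
// → "hero-320.jpg 320w, hero-640.jpg 640w, hero-1280.jpg 1280w"
```

          The actual pixel-pushing (resizing, compression) would be handled by whatever image tool the build process calls, e.g. ImageMagick; the generator only wires the results into the markup.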

        • 23

          Hi Boris,
          As Mathias said, there are a lot of tools out there to manage your build process and your assets (e.g. images). I have built a huge image-based website with Jekyll, and my workflow for images is very similar to a CMS one. I put the names of my images where I want them in my file (e.g. featured or in content), and my Grunt process minifies the images, resizes them into four different sizes (s, m, l, xl) and copies them to my S3 storage. I have nothing to do, apart from choosing beautiful images.
          You can take a look at my workflow with Grunt here and the final result

  9. 24

    Great article. We’re happy to see a whole ecosystem of static website hosting companies evolve. If you are interested in deployment that is as easy as ‘git push’, go check out

  10. 25

    Great read. I can remember using .shtml files quite heavily in my early years on the web (late 90s, early 2000s). Scripting languages simply weren’t available on many servers and if they were, cgi scripts were certainly frowned upon. Relational databases? Weren’t even a thing. I still remember one of my friends back in middle/high school proclaim “I’m going to create a website completely with perl/CGI!” The thought at the time was preposterous, much like someone who time warped in from the ’60s might see a cell phone now.

    I got into the CMS game relatively late. WordPress never really appealed to me; it was just something I had heard of people using. (At the time, people were also using Movable Type.) The way developers talked about it (and still do), it could simply do no wrong, with people blindly recommending it to everyone.

    People who don’t update their site that often could easily get by with a static site generator. Those same people tend to be in the DIY/WordPress camp, though, because it’s easy to set up and a “safe” choice (something they have heard about).

    And if you need to search a site for product information or relate various pieces of content together on the fly, a static site generator is just going to hold you back.

    There is no one-size-fits-all here. I don’t have much experience with static site generators, mainly because of the types of sites I work on. I find myself using Craft for most projects these days, but that’s only because of the work I do; anyone not in client services might prefer something vastly different.

    With the right integration into a traditional database CMS, technically nginx’s fastcgi_cache or Varnish could almost be considered “static site” generators (though that’s stretching it a bit!)
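    For readers curious what that comparison looks like in practice, here is a minimal sketch of an nginx full-page cache in front of a PHP backend. The cache path, zone name and socket path are illustrative assumptions, not from the comment above:

```nginx
# Cache rendered pages on disk; "pagecache" zone name and paths are examples.
fastcgi_cache_path /var/cache/nginx levels=1:2 keys_zone=pagecache:10m inactive=60m;

server {
    listen 80;

    location / {
        fastcgi_cache       pagecache;
        fastcgi_cache_key   "$scheme$request_method$host$request_uri";
        fastcgi_cache_valid 200 10m;   # serve cached copies for 10 minutes
        include             fastcgi_params;
        fastcgi_pass        unix:/run/php-fpm.sock;
    }
}
```

    With a long enough validity window and explicit purging on publish, this does behave much like a static site, which is the point of the comparison.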

  11. 26

    Great article. Please add --- to the top of the code example where you talk about front matter, as front matter options have to be surrounded by --- lines.
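    For readers unfamiliar with the convention this comment refers to: in Jekyll-style generators, front matter is a YAML block fenced by --- lines at the very top of the file. A typical example (the field values here are illustrative):

```yaml
---
title: "Why Static Website Generators Are The Next Big Thing"
layout: post
date: 2015-11-02
---
Markdown content for the page starts here.
```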

  12. 27

    Matteo Papadopoulos

    November 3, 2015 7:44 am

    We have used Middleman for a couple of years now and we are definitely happy with it. Then we felt the need to give our customers a tool that lets them freely manage content. That’s why DATO was born, a CMS for static websites. Now we have a dozen websites built with this tool… No more problems with performance, scalability or maintenance. Just an API and a backend (Ruby). Here are some details

  13. 28

    Keep an eye on this one; it’s set to be a great mix of a CMS with a static website generator:

  14. 29

    Hi Mathias,

    that was a nostalgia trip. I clearly remember the pre-Dreamweaver times, the Dreamweaver times (until MX) and the CMS age. Over the past 8 years I have mostly used TYPO3 and WordPress.

    But your assessment that complexity has its price is absolutely right. And it is one of the reasons I recently went to test the simple but elegant Kirby CMS.

    When we use a dynamic backend to create static frontends distributed via CDN, things will not get much easier. I fear the complexity will stay.


  15. 30

    Thanks for the great article. There is one thing, I can’t wrap my head around. You wrote:

    “Static websites are fundamentally different in this regard. They stick to a really simple caching contract: Any URL will return the same HTML to any visitor until the specific file that corresponds with that URL is explicitly updated.”

    What exactly stops ANY CMS from following the same simple caching contract and employing a caching mechanism which could compete in speed with a static website? The only thing it would have to do is check the requested URL, look up the corresponding page in the cache and deliver it. As long as the mentioned caching contract (1 URL = 1 page) is fulfilled, it would be no problem to store every page as one item in the cache, ready-made for delivery. And if somebody logs into the backend and changes the content of a page, the page is invalidated in the cache and will be re-rendered and put into the cache the next time the corresponding URL is requested.
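    The mechanism described here (one cache entry per URL, invalidated when the page is edited) can be sketched in a few lines of plain JavaScript; the class name is hypothetical, and `render` stands in for whatever expensive page-rendering step the CMS uses:

```javascript
// Sketch of the "1 URL = 1 page" caching contract described above.
// `render` is a stand-in for the CMS's (expensive) page-rendering step.
class PageCache {
  constructor(render) {
    this.render = render;
    this.cache = new Map(); // url -> ready-made HTML
  }

  get(url) {
    if (!this.cache.has(url)) {
      this.cache.set(url, this.render(url)); // render once, on first request
    }
    return this.cache.get(url); // every later hit is a plain lookup
  }

  invalidate(url) {
    this.cache.delete(url); // called when an editor saves the page
  }
}
```

    The catch is keeping the contract honest: any per-visitor or per-request variation in the output for a URL breaks the 1 URL = 1 page assumption, and with it the cache.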

    So the benefits of a static page (speed) and the benefits of a dynamic CMS (ease of use for the untechnical people) could be combined. Am I missing something?

    • 31

      You are correct. Hybrid solutions can be used effectively in DB-driven sites, and you can argue simple CDN use is an example of this.

      Some of these static/hybrid alternatives are more complicated than a traditional DB architecture. If I need “a primer for static site generators” [but which one?] or “use some sort of API [which sort?], make all your DOM adjustments via JS [how? any downsides?]”, or use “[a] bit of meta data, typically in YAML” then what is gained, apart from billable hours or book sales?

      If a static site works *out of the box* (a GitHub documentation site), then of course, use that. But if you have to start jumping through hoops to make it work, you are probably better off going the traditional route.

      On modern (including virtualized) servers, databases are not slow, especially for smaller sites (the important parts of the database end up in RAM anyway, including important cached queries).

      Such an architecture has only a few well-defined and understood points of failure.

      I think the resurgence of static sites comes from the fact that the web is (even more) ubiquitous, most sites get almost no traffic, and servers (instances) have become powerful, so a static solution works “90%” of the time.

      But if you are in the remaining 10%, save your conference and book fees and stick with WordPress, which of course is under active development.

    • 32

      Hello Jan,

      technically you are right – I think it’s more of a philosophical question:

      Let’s say you have chosen the dynamic approach (with PHP or similar) and you want some dynamic content on your index.html (e.g. “the most trending comments on some article”) – you would do this in PHP as well, just because it is the most obvious approach.

      Now you run into the caching problem (you violated the 1 URL = 1 page law).

      But if you use the static approach, you can’t even use PHP – you will have to integrate the dynamic content on the client side (using JavaScript and some API calls, jQuery, etc.)

      No caching problem here – the source of index.html remains the same.

      Of course, in the first (PHP) scenario you could still do the same: reject PHP and insert dynamic content via JavaScript – but as I said, it doesn’t feel right.
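      A minimal sketch of the client-side approach described here (the endpoint URL, element id and data shape are hypothetical): the static index.html stays byte-for-byte identical, and only the dynamic part is fetched and rendered in the browser.

```javascript
// Turn an array of comment objects from an API into an HTML fragment.
// Keeping the rendering a pure function makes it easy to test in isolation.
function renderTrending(comments) {
  return comments
    .map((c) => `<li>${c.author}: ${c.text}</li>`)
    .join("");
}

// In the browser, the static page would then do something like:
// fetch("/api/trending-comments")          // hypothetical endpoint
//   .then((res) => res.json())
//   .then((comments) => {
//     document.getElementById("trending").innerHTML =
//       "<ul>" + renderTrending(comments) + "</ul>";
//   });
```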

      • 33

        Hello Pascal,

        thank you for your reply. What you say makes a lot of sense. The mentioned hybrid solution is “weder Fisch noch Fleisch”, as we say in Germany (it’s neither fish nor meat: not really the one thing, but not exactly the other thing either). Technically possible, but maybe not a good idea.

        Have a nice day!

  16. 34

    Mauro Mandracchia

    November 3, 2015 2:53 pm

    I think that static website generators aren’t by themselves the next big thing.
    Mostly it’s about how the codebase influences the content of the website, and using the power of existing versioning tools is indeed an advantage.
    By themselves they are quite limited.

    IMO, they are the next big thing if they are contextualized in a microservice structure, so that’s why I built Monera –

    What do you think?

  17. 35

    How could you not mention Statamic? It’s a static site generator with a backend for content editors. While version 1 is nice, it isn’t as robust as it could be, and it seems that the developers are working on that for version 2, due out soon. Grav is another very similar option to Statamic, and it’s free, so testing it out is a no-brainer.

    *And before anyone jumps in: yes, I know about Kirby. It’s nice and it has a backend, but it didn’t keep up as well as Statamic or Grav and isn’t nearly as robust as those two options. That being said, it’s a great option as well.

    • 36

      Neither Statamic nor Kirby is a static site generator; they are flat-file CMSs. They are no easier (in fact, harder) to scale than database-backed CMSs, no more performant, and no less vulnerable to PHP exploits, etc…

      • 37

        Could you elaborate a bit on your answer?

      • 38

        Ramon Lapenta

        November 5, 2015 3:47 pm

        Statamic is a flat-file CMS with static-generator functionality. I’ve just tested my site using the cached version only (not static), and I get pretty good scores all over, so good that I don’t feel I need a static site at all.

      • 39

        Mathias, that’s actually not true. Statamic can generate a static site that’s even compatible with GitHub Pages. While running as a dynamic CMS it may not be more performant than some database-driven CMSs, but it is certainly faster than others. Everything falls on a spectrum.

        And while nothing can save you if your server is exploited, there are no opportunities for SQL injection, which eliminates a large portion of the penetration possibilities. Also, with your entire site version-controlled (from configuration and templates to content and users), scaling and replicating servers is very simple, and in the event your server is exploited, recovering is just a deployment away.

        • 40

          Curtis Blackwell

          November 5, 2015 5:46 pm

          what do you know? do you even flat-file, bro?

        • 42

          True, it’s quite hidden in the advanced-features part of the documentation (nothing about it in the docs on installing, etc.), but there is actually an option to output static HTML from the control panel now. The docs don’t really mention any way to script this (run it from the CLI, integrate it into a deployment process, trigger it when new content is added, etc.), but it’s definitely interesting to see that it’s being worked on :)

          Totally agree that having all the content as plain text in version control can in itself be a huge advantage over storing everything inside a database, and can really help a lot if you ever need to recover from being hacked!

      • 43

        Daniel Fowler

        November 5, 2015 4:08 pm

        Harder to scale? That’s hardly true… I manage a 2,000-page website with 450 users and a team of content editors on the backend, powered by Statamic. It’s a delight to work with, and a content editor only needs basic social-media skills to manage pages all across the site.

        The old clunker website I replaced was 6,000 pages and powered by WordPress. It was totally unmanageable on the backend.

        I’d wager to say that anyone who feels Statamic doesn’t scale has built their site very inefficiently and needs to get better at programming.

        • 44

          Daniel Fowler

          November 5, 2015 4:24 pm

          Sorry, I didn’t mean for this message to sound so defensive. I’m just a big fan of Statamic – been using it for 2 years now on really big websites. If cluttered with resource-intensive add-ons it might degrade the experience, but I use 3 add-ons for my 2k-page website and our speed isn’t an issue on the front or backend.


        • 45

          I’ve built 2 sites with Statamic. I love it and my clients love it – coming from Concrete5 and WP.

          Daniel, you should do a write-up on how to manage a site that large with Statamic. I bet you could write a book – I’d buy it :)

  18. 46

    Prabhuram Baskaran

    November 3, 2015 3:41 pm

    Static sites load faster, and so they help provide better SEO rankings.

    At MockFlow we have built a complete website platform for building static websites; check it out at

  19. 47


    Great article.
    I think static websites are a great solution for blogs or corporate websites, but for an e-commerce website I don’t think they’re the best fit. In my company we have developed an e-commerce solution for our customers with lots of rules to determine the price of a product: based on the country, the currency, the customer group, the customer, the geozone… Some customer groups can’t even see the same products…

  20. 48

    So everything old is new again.
    Back in the day, when a 1.5 Mbit connection to the Internet was something special, Microsoft released their “Proxy Server” to let commonly retrieved external websites be “cached” for multiple internal users to consume, improving the response time of their browsing experience.

    Now we’ve flipped that around, and we’re caching the outward bound website traffic, feeding periodically updated static data-sets to our customers/consumers. This to improve their browsing experience, and to improve the security of our sites and data.

    Maybe what we’ll finally learn in the end, is to not try to apply the latest fad and IT gee-gaw to every situation. Realtime data is nice for weather, and banking, etc… but realtime data is not needed for product catalogs… as Sears proved back in 1888.

    So now we have “Static Website Generators”… a fancy way of saying “caching outbound websites and data”.

