Progressive Enhancement Is Faster


The aim of republishing the original article by Jake is to raise awareness and support the discussion about the role of progressive enhancement within the community. We look forward to your opinions and thoughts in the comments section. – Ed.

Progressive enhancement has become a bit of a hot topic lately, most recently with Tom Dale conclusively showing it to be a futile act, but only by misrepresenting what progressive enhancement is and what its benefits are.

You shouldn’t cater to those who have deliberately disabled JavaScript, unless of course you have a particular use case there, e.g. you’re likely to get significant numbers of users with the Tor Browser, which comes with JS disabled by default for security. If you do have that use case, progressive enhancement helps, but that’s not its main benefit.

You should use progressive enhancement for the reasons I covered a few weeks back. However, my article was called out on Twitter (the home of reasonable debate) as just “words on a page” and incompatible with “real-life production”.

Of course, the article is an opinion piece on best practice, but it’s based on real examples and actual browser behaviour. However, I wanted to find more tangible “real-world” evidence. Turns out I was staring at it.

I Love Tweetdeck

I don’t have Chrome open without a Tweetdeck tab. I use it many times a day, it’s my Twitter client of choice. It does, however, depend on JavaScript to render more than a loading screen.

Twitter used to be the same, but they started delivering initial content as HTML and enhancing from there to improve performance. So they deliver similar data and cater for similar use cases, which makes them perfect for a real-world comparison.

The Test

To begin, I:

  • Set up a Tweetdeck account with six columns,
  • Closed all other apps (to avoid them fighting for bandwidth),
  • Connected to an extremely fast wired network,
  • Used the Network Link Conditioner to simulate a stable 2 Mbit ADSL connection and
  • Cleared the browser cache.

I recorded Tweetdeck loading in a new tab, then cleared the browser cache again and recorded six Chrome windows loading the equivalent content on Twitter (here’s the launcher if you’re interested).

Now, 2 Mbit may sound slow, but I’ve stayed at many hotels where I’d have done dirty, dirty things for anything close to 2 Mbit. The maximum broadband in the area I live is 4 Mbit unless you can get fibre. On mobile, you’ll be on much lower speeds depending on signal type and strength.

The Results

Here are the two tests playing simultaneously (note: they were executed separately):

02.080s Tweetdeck renders loading screen. So Tweetdeck gets past the blank screen first.
02.150s Twitter renders three “columns” of tweets, only 70ms after Tweetdeck shows its loading screen.
02.210s Twitter renders six columns of tweets. Twitter has delivered the core content of the page.
04.070s Tweetdeck renders six empty columns. Twitter is downloading background images.
05.180s Tweetdeck renders its first column of tweets.
06.070s Tweetdeck delivers another column.
06.270s …and another.
08.030s …and another.
08.230s …and another.
10.120s …and another, and that’s all the core content delivered by Tweetdeck. Tweetdeck is fully loaded at this point, whereas Twitter continues to load secondary content (trends, who to follow etc).
14.210s Twitter finishes loading secondary content.

So Twitter gets the core content on the screen 7.91 seconds earlier than Tweetdeck, despite six windows fighting for resources. For the first bit of core content, Twitter gets it on screen 3.03 seconds sooner.

Twitter takes 4.09 seconds longer to finish loading completely, but this includes a lot of secondary content and heavy background imagery that Tweetdeck doesn’t initially show. On Twitter, the core content is prioritised.

That’s with an empty cache, but what about a full one? Here’s the same test, but run a second time without emptying the browser cache:

00.060s Tweetdeck renders loading screen. So Tweetdeck gets past the blank screen first, again, but much sooner.
00.090s Twitter renders the first “column” of tweets.
01.010s Twitter renders second “column”.
01.110s Twitter renders third “column”.
01.190s Twitter renders fourth “column”.
01.200s Tweetdeck renders six empty columns.
01.230s Twitter renders fifth “column”.
01.240s Twitter renders sixth “column”. Twitter has now delivered all core content.
02.100s Tweetdeck renders first column.
02.160s Tweetdeck renders second column.
02.260s Tweetdeck renders third column.
03.030s Tweetdeck renders fourth column.
04.010s Tweetdeck renders fifth and sixth columns. Tweetdeck has now delivered all core content.
04.050s Twitter finishes downloading secondary content.

So, with a full cache, Twitter beats Tweetdeck to all core content by 2.77 seconds, and first core content by 2.01 seconds.

Which test represents the “real-life” case? Well, something in between the two. As you browse the Web you’ll knock resources out of your cache, but the site will also cause cache misses through rapid deployments which change the URLs of resources. Current best practice is to combine and minify your CSS/JS files and give them a unique URL, so whenever you need to make a change, no matter how small, the URL changes and the client has to download the new resource. Roll on HTTP2, I say.
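
For example, a revved asset reference usually looks something like this (the file names are invented for illustration):

    <!-- Content-hashed file names: any change to the CSS/JS produces a new
         URL, so old copies can be cached for a long time but are never
         served stale -->
    <link rel="stylesheet" href="/assets/all.min.3f2a9c.css">
    <script src="/assets/all.min.8b1d7e.js" async></script>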

Is This Test Fair?

Well, no. It’s real life, and as such it has the uncertainty of real life. But those are two expertly-built sites that offer similar content.

The test is unfair to Tweetdeck because:

  • A lot of the XHR requests they make could be rolled into one, improving performance despite JS reliance (assuming this wouldn’t have large overhead on the server)

The test is unfair to Twitter because:

  • Six separate requests are made for markup, whereas a true progressive competitor would use one. This incurs the HTTP overhead and repetition of common markup.
  • It undergoes 6x the style calculations and layouts compared to Tweetdeck (because Twitter is running in six separate windows).
  • I zoomed out all the Twitter windows so more content was visible, so the six windows have the paint overhead of scaling.

Of course, Tweetdeck gets away with it because it’s a tab I always have open, so I don’t go through the startup cost often. This is extremely rare for a website, even for those that claim to be “apps”.

You shouldn’t bet your performance on being a perma-tab. Twitter did this, but it turned out people shared and followed links to individual tweets and timelines, and the startup cost made them feel incredibly slow. They fixed this with progressive enhancement.

The Results Aren’t Surprising

Here’s what a progressively enhancing site needs to download to show content:

  • some HTML
  • CSS

…and those two download pretty much in parallel. The HTML will get a head start, because the browser needs to read the HTML to discover the CSS.

The whole of the CSS will block rendering to avoid FOUC, although if you want to put your foot to the floor you can inline the CSS for top-of-page content, then include your link[rel=stylesheet] just before the content that isn’t covered by the inlined CSS.
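
As a rough sketch of that pattern (selectors and file names are invented):

    <head>
      <style>
        /* Inlined rules covering only the top-of-page content */
        .masthead { background: #292f33; color: #fff; }
        .timeline { max-width: 36em; margin: 0 auto; }
      </style>
    </head>
    <body>
      <div class="masthead">…</div>
      <div class="timeline">…</div>
      <!-- Styles for everything further down arrive here, so they don't
           block the render of the content above -->
      <link rel="stylesheet" href="/css/rest-of-page.css">
      <aside class="trends">…</aside>
    </body>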

Rendering isn’t blocked by all of your HTML: the browser can update the rendering as it receives more markup. This works with gzipped HTML too. Check out the full WHATWG spec (warning: it’s 2.6MB); although the markup is huge, it gets to first render really quickly.

Ideally you’d have your script in the head of the page loading async, so it gets picked up by the parser early and enhances the page as soon as possible.
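
Something like this (the file name is invented):

    <head>
      <!-- Found early by the parser, downloaded in parallel with the rest
           of the page, and executed without blocking the HTML below it -->
      <script src="/js/enhancements.js" async></script>
    </head>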

If a site does its content-building on the client based on API calls, such as Tweetdeck, here’s what it needs to download:

  • All HTML (although small)
  • CSS
  • JavaScript
  • JSON (API call)

The HTML, CSS and JavaScript will download pretty much in parallel, but the JSON download cannot start until the JavaScript has downloaded, because the JavaScript triggers the request.
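
In other words, the critical path on the client looks something like this (the endpoint and renderColumns() are invented):

    // None of this can run until the whole script has downloaded and parsed;
    // only then does the request for the actual content begin.
    var xhr = new XMLHttpRequest();
    xhr.open('GET', '/api/columns.json');
    xhr.onload = function () {
      renderColumns(JSON.parse(xhr.responseText));
    };
    xhr.send();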

The JavaScript on one of these pages can be significant; for instance, Tweetdeck’s scripts total 263k gzipped. As Adam Sontag points out (using Mike Taylor as his medium), that’s the size of a few images we’d use on a page without thinking twice. But we would think twice if those images blocked core content from downloading and displaying, which is the case with the JavaScript on these sites. Hey, images even progressively render as they download, whereas JavaScript can’t do anything until the whole file is there.

Getting something on screen as soon as possible really improves the user experience. So much so that Apple recommends giving iOS apps a static bitmap “Launch Image” that looks like the first screen of your app. It’s a basic form of progressive enhancement. Of course, we can’t do that on the Web, but we can do better: we can show actual core content as soon as possible.

I’m open to examples to the contrary, but I don’t think a JS-rendered page can beat a progressively enhanced page to rendering core content, unless the latter is particularly broken or the content can be automatically generated on the client (and generating it is faster than downloading it).

I’m keen on the progressive enhancement debate continuing, but can we ditch the misrepresentation and focus on evidence?


Jake is a Developer Advocate at Google who's keen on Web performance. He developed Sprite Cow to help ease the pain of sprite sheets. Jake started a blog way after blogs stopped being cool.

  1.

    Concise and useful. Thanks!

  2.

    One optimization you can do with single-page apps is to include the initial JSON request in the body of the HTML you are downloading (this is done by Bustle and Discourse, for example). This means you only have to download HTML/CSS and JavaScript; no JSON is then requested until you change page.
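
    A rough sketch of that bootstrapping pattern (the element id, data and renderPage() are invented):

      <script id="bootstrap-data" type="application/json">
        {"tweets": [{"id": 1, "text": "Hello"}]}
      </script>
      <script>
        // The first render uses the embedded data; no extra JSON round trip
        var initial = JSON.parse(
          document.getElementById('bootstrap-data').textContent);
        renderPage(initial);
      </script>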

  3.

    I look at Progressive Enhancement (PE) more as a philosophical approach to design than as a restrictive methodology.

    PE forces me to focus on the content and the people I expect to use it. Ideally, I want everybody to use it. Starting with this ideal, as I begin to add on features, graphics, and other stuff, I acknowledge a group of users who may not be able to use the content. At this point, I have to make a decision to either relinquish these “enhancements” or accept the loss of this particular demographic.

    Of course, this becomes a challenge for design. When there is pressure to design in such a way as to reach the maximum number of people, PE forces me to acknowledge those groups I will not reach. The use of Javascript (JS) is the most challenging aspect of PE, which, in my opinion, is the biggest source of distress among designers surrounding this issue.

    Should it matter if a user disables JS intentionally or JS is disabled by default? I do not believe so.

    There are simply too many sites using JS in a malevolent way, or, at least, suspiciously. Pages may load in the background automatically. JS links send you to a page that is not identified in the link text. If you visit the newly revamped Sitepoint.com site and pull up an article with embedded links, you will find that the link’s URL does not show in the browser bar until you right-click your mouse.

    I choose to use one of my browsers with JS disabled at all times just for this reason. The surprising thing is that the same pages will load noticeably faster. This leads me to further embrace the PE philosophy. If adding JS causes page loads to measurably decrease in speed (Flickr is a good example for this point), then is this “enhancement” worthwhile?

    This is what PE does. It forces a designer to acknowledge what is happening with each layer of tech added to the design, whether you like it or not (the latter being the primary source of disdain for PE). Often, people do not want to see the truth. In design, PE helps reveal some of the truth related to content and users.

    There is nothing wrong with PE. PE offers a challenge. Professional designers are challenge masters, not challenge slaves.

    •

      Some great points. I feel many designers do not have PE in their thought process in the conceptualizing and design phases. This is usually the result of designers knowing very little front-end development, which could lead into the discussion that every web designer should have a solid foundation in front-end development, but that’s another argument and one where I’m not sure what side I’m on.

      There are also arguments that PE does not belong in the design phases, that it could stifle some designers? But it seems necessary, and it’s the world we live in. These ARE design choices and they are philosophical, and designers should, at least, be considering them over the course of the project.

  4.

    Another poorly written SM post weakly challenging what we all know to be true: progressive enhancement is dead.

    Javascript is the most widely supported language of all time. It is the language of the web whether you like it or not.

    Firefox removed the option to disable javascript as there are plenty of other ways to be secure online.

    If you are really concerned about security then feel free to run your browser in a sandbox. There are a number of ways to do this.

    Here is a well written article that makes a great case for what is already true:
    http://tomdale.net/2013/09/progressive-enhancement-is-dead/

    •

      Java is certainly the best language for web development. As an SEO specialist I can only recommend not to put vital or important information into Java, as SEs very often do not comprehend it in the right way. I’m currently working on a casino SEO site (it’s an SEO site, not a casino!) and I encountered some problems because of this.
      The article is awesome, thanks!

      •

        What does Java have to do with this? This is about Javascript; they’re not even remotely the same thing and never have been.

    •

      I think Tom Dale did not mean that PE is against JavaScript. And it isn’t. As always, the truth is somewhere between the two: these ‘tools’ and techniques are here for us to make the user experience the best it can be. I have been using a lot of JS lately, but I have also created sites which basically don’t have much JS, like my blog for instance.
      There is room for ‘classic’ sites and modern apps on the web. The latter category will gain more and more terrain due to the decline of classic PCs, but the former will probably never disappear, because sometimes all you need is to display the content and nothing more.
      But feature-rich apps will obviously have a place on the internet, due to the fact that user habits slowly come to require them. Only time will tell us the right way.
      Fighting about this is like the fight over which framework is better – all are probably perfect for the purpose they have been created for.
      Let me give you a simple example: Prism.js is a widely known, quality code-highlighting tool perfectly built by Lea Verou. But it requires JS. A friend mentioned that he would need something similar, but without JS. So I built one. I won’t link it, because this is not the purpose here (go to my blog, you’ll find it among my projects if you are interested), but now we have tools for both.

      My point is that we have a big number of tools; let’s just use them wisely. There is no such thing as PE being dead because of JS. We shouldn’t even touch this.

    •

      Speaking of poorly-written, your comment contains only blind assertion and no reasoning. You also suggest progressive enhancement is about people with JS disabled, which is odd, because I say that’s not what PE is about, in bold, right at the start of the article.

      How did you conclude my article was poorly written when it’s clear you didn’t read past the title?

  5.

    The comparison is unfair. And this is where most people get confused. A website and a web app should not be compared. I consider Twitter more of a website than a full-blown web app. I believe all content-driven websites need to be enhanced with JavaScript, but to make the assumption that a web app should be PE-driven is just futile.

    The app is built in JavaScript, so it should never have to cater for the few who choose to have it disabled. A web app is built for interaction and long use, as opposed to reading a quick article. Performance is another topic altogether, but a large web app that is only 245kb in size isn’t something I think we can complain about. I have built large apps where the JS file amounts to just over 100kb gzipped, including templates.

    •

      You’re using “app” as an excuse. What is a web app? Why is Tweetdeck an app and Twitter not?

      I’ve seen developers define an app as something that requires JavaScript to render, and defend sites that require JavaScript to render because “they’re an app”. It’s complete circular nonsense.

      “Web app” is a marketing term like “Web 2.0”; it has no meaningful technical definition and shouldn’t be used in technical debate.

      •

        Isn’t an app something that performs a specific function or role? Since Tweetdeck performs the function of Twitter management, I think it’s fair to refer to it as an app.

  6.

    When I started programming, the internet was in its infancy (consumer wise), there was no CSS, Netscape ruled and JavaScript had yet to be invented. Hell, IE was only a rumour. Google was just a misspelling of a number.

    Yet we developed applications that people could use. It was revolutionary. It changed the world [of our users].

    It seems today we are obsessed about the technology that drives the web, rather than delivering the value customers desire most. Do I really need to enable JavaScript, CSS and Images to view my bank balance? What about typing this comment? What about reading the news? Or perhaps JavaScript is a necessity to view photos of my family? How about my health record?

    The bottom line is that JavaScript isn’t the be-all and end-all of the web. It only serves to enhance our experience of the web. We spend so much time making it “pop” that we have no time left to deliver the value to our users. Not only that, but our insistence on relying on JavaScript makes our applications less secure, less accessible and more costly to test and maintain. [After all, you do authenticate and authorise AJAX requests, right? And you do validate inputs on the server side, right? You do test this every time you make a change, right?]

    I have always seen progressive enhancement as a philosophical process to first develop the application as if we were in the 90s, meaning that it still works when JavaScript, Images and even CSS is disabled. That way I can support any browser that supports the HTTP protocol. This approach has distinct advantages in that my focus is on accessibility and content. Then systematically enhance the experience of the user depending on the capabilities of their browser. That way if I ever run out of time, my application may not necessarily “pop”, but it serves its primary purpose.

    Don’t get me wrong, I love a well designed interactive website. I just believe that we are too obsessed with technology and have forgotten who we are doing this for in the first place.

  7.

    To me, all websites are web applications; they just vary in their intricacy and capabilities. From a simple static page to large scale entries that embrace form and function in traditional SDLC methodologies.

    HTML, CSS and Javascript are the 3 kings of web development. The first 2 are living standards which are ever-evolving, but they constitute the static end of what we render: structure and presentation. We need Javascript to manage the behavior of the user with our interface. It is the handler logic that puts function into our form.

    Much of the progressive web is guideline and not hard, fast rule. I’ve never personally been a fan of progressive enhancement, but I’ve loved Responsive Web Design principles and Mobile First methodologies. They are device agnostic and change our traditionalist mindset to see the web for what it is. To embrace constraint and capability in what must be one of the forefront fields when it comes to change.

    Everyone and their grandmother wants to coin the next phrase, and our perception of the web paradigm will ebb and flow with that tide. We just need to see what has worth in its words, offering us a timeless wisdom, and what amounts to nothing more than snake oil (sounding great in our ears, but when put into practice, we find that it makes for a more regressive than progressive experience).

  8.

    I totally agree with the subject title. It also helps to make content accessible to all users as a starting point; after all, that is what the HTML5 spec is heavily built to allow. All HTML5 content should be accessible without enhancements or responsive design.

    However, statements like “You shouldn’t cater to those who have deliberately disabled JavaScript” are like saying you shouldn’t cater for people who have deliberately chosen non-1024 x 768 displays. Progressive enhancement is about getting the foundation right and thus making no assumptions … the HTML5 foundation, when used properly, should allow all users to access content. Progressive enhancement should then be used to add additional enhancements … progressive enhancements should in no way make decisions about pre-enhanced content, as such enhancements should be secondary considerations.

    I also agree that progressive enhancement (when done correctly) should provide the best loading performance (albeit maybe not experience – that would be unique to the target user). I often disable JavaScript these days because of the amount of crap that loads, especially when using multiple browser tabs.

    Adobe Flash suffered from products made by cruddy developers, and JavaScript has always suffered (and still does) from the same difficulty. The problem itself is poor development, and sadly, the end-user problems that result from that are just the same. I used to develop in Adobe Flash using progressive enhancement techniques, and I know only a few others that did … JavaScript suffers the same problem. As soon as a technology forms a large class/library of features that can easily be dragged/dropped, it is doomed to be abused.

    Progressive enhancement requires planning to be done correctly, but developer culture at the moment is still about pushing drag/drop features when a technology gets popular, and it saddens me that all the good HTML5 has brought to the table is being abused by needless JavaScript libraries and the like (when I say that, I mean in the majority of use cases).

  9.

    Interesting article – I particularly follow how Twitter’s use of an initial static layout proves the usefulness of PE to Twitter, and therefore proves the argument that PE is not dead.

    I find the comparison with Tweetdeck somewhat shady though – the old adage of using statistics to prove whatever you like!

    I consider that Twitter can gain advantage from an initial download because they can architect their solution by engineering their entire approach from the database up.

    Tweetdeck would have to read the Tweets via the Twitter API, so if they chose to download static HTML first, they’d just be transplanting the Tweet download from the client to the server. This would actually be less efficient for Tweetdeck.

    So again my take is :-

    Twitter == Argument proved
    Twitter vs. Tweetdeck comparison == Pointless Window Dressing

    I’ll now don my flameproof Internet trousers ;)

    Mark Rabjohn
    Integrated Arts Ltd

  10.

    Because I don’t see any reference in the articles/comments to any kind of content consumer beyond a human user, I’ll mention that I think it’s important to remember that systems which rely on scraping (you know, Google, Facebook, etc.) require a good amount of server-rendered HTML for anything remotely resembling success. The various comments that talk about this being 2013 and why care about users who don’t have JS are extremely short-sighted. My clients care deeply about search and the ability to socially share their content. Sure, some application states are irrelevant outside of the context of what a user may be doing, but many (like, say, product detail views) are not. Architects really need to be pragmatic, choosing the best solution for the functional and business requirements. Leave ideology out of it.

  11.

    Both are good solutions to a problem, but PE gives more flexibility and lets you do magical things such as http://www.youtube.com/webrenovators (this sample is a demonstration of PE to provide the richest possible experience for both desktop and mobile users)

  12.

    From my perspective, PE isn’t just about designing for old browsers or disabled JavaScript. It’s about safely designing your content for the future. It’s absurd to think that the only programs that will use your HTML are web browsers. Just think about how much of an impact mobile devices had on web development when they entered the scene. Mobile devices completely changed the way that we look at the process of developing and designing HTML. PE is a response to that impact.

    There will be future devices that want to render your HTML content. What if these devices don’t even have the capability to run Javascript? That means your Javascript-dependent website is “Dead, Baby”.

    Progressive Enhancement gives you the best possibility of “future-proofing” your content. Period.

