
Lightening Your Responsive Website Design With RESS

Editor’s Note: This article features just one of the many solutions for creating high-performance mobile websites. We suggest that you review different approaches such as Building A Responsive Web App1, Improving Mobile Support2 and Making Your Websites Faster3 before choosing a particular solution.

This article explains how to use RESS (responsive design with server-side components) to make significant performance and reach improvements to a website for both mobile and desktop devices alike. This technique requires just a few lines of code, some simple configuration and no ongoing maintenance.

Your website will change from one that works on desktops, tablets and smartphones to one that works on almost anything anywhere and loads faster in all cases. It’s hard to over-emphasize the importance of this, but if you need a good case study, read about what happened to YouTube4 when Google lightened its pages (spoiler: entire new territories opened up to it).

The following three screenshots show a sample website with increasing levels of RESS optimization being applied to it and the resulting overall page sizes:

iPhone screenshot, original: 1,027 KB

iPhone screenshot, partial optimization: 253 KB

iPhone screenshot, full optimization: 153 KB

The discerning reader will note that the screenshots all look the same, and that’s the whole point: the optimization procedure we’ll apply keeps the appearance of the website intact while significantly improving the user experience through smaller page sizes. In fact, the screenshots are not exactly identical (view them at native size), but the slight differences in image quality tend to go unnoticed on phone screens.

Language issues aside, making a website truly accessible worldwide usually comes with two problems:

  • Diversity of devices
    Remember what the first two W’s in WWW stand for? A truly global website should work on any device your customers use, worldwide, across varying connectivity standards.
  • Constrained connectivity
    Many places are not well connected to the Internet, whether in terms of capacity, data plans, etc. This problem affects people living in modern well-connected cities as much as people in less developed rural areas.

This article will help your website break through this glass ceiling (or floor) and enable everyone everywhere to reach your content with equal ease. Note that we’ll be using open-source software in conjunction with a device-detection library to achieve this goal; there are many device-detection solutions out there (such as WURFL, OpenDDR11 and DeviceAtlas — I was involved in developing the latter one).

We’ll demonstrate three levels of optimization, each building on the previous one, but each of which can be implemented separately, too. These optimizations will address the following:

  1. Reduce image payload (the biggest effect),
  2. Reduce JavaScript and CSS payload,
  3. Further optimize based on bandwidth detection.

Before we begin, let’s confirm why this process is necessary.

Background On Responsive Web Design And RESS Link

Responsive Web design is fast becoming the preferred method for making a website mobile-friendly, and with good reason. For many websites, it achieves a reasonable balance between mobile-friendliness and ease of implementation.

But design isn’t just about appearance. The main problem we all struggle with in Responsive Web design is that, by default (when simply showing and hiding containers with display: none;), all devices are sent the same payload. Devices with low-resolution screens receive the same images as high-resolution devices, even though they cannot display them at native resolution. This is inefficient at best and market-limiting at worst: the result is a page that works well only on high-end, well-connected devices with generous data plans.

Many attempts have been made to solve this problem, with varying levels of effectiveness. Probably the most promising approach is to have the browser determine the most appropriate image version to fetch. The W3C has an ongoing initiative12 to standardize an approach, but this likely won’t be commonplace in browsers within the next year or so. The candidate solutions, the new <picture> element and the new srcset attribute for the <img> element, each have their own issues and benefits. Rather than discuss them here, I suggest reading Jason Grigsby13’s “The Real Conflict Behind <picture> and @srcset14.” In the meantime, polyfills such as Boris Smus’s srcset-polyfill16 will mimic some of the proposed attribute’s behavior.

We’ll describe a relatively straightforward approach to minimizing page weight in RWD using a touch of server-side magic. We’ll use device detection on the server to optimize images when they’re requested by the browsers by reducing their dimensions to the optimum size.

Some people use Src17 (née TinySRC) and other image-resizing services, but doing this yourself is just as easy, and then you’ll have total control over the result. Yes, this does exchange one external dependency for another, but it also enables you to optimize much more than images. We’ll use PageSpeed18, an open-source project from Google, for this purpose.

PageSpeed is a Web server module that applies multiple best practices to a website without forcing developers to change their workflow. These optimizations include combining and minifying JavaScript and CSS files, inlining small resources, removing unused metadata from each file, and re-encoding images to the most efficient format available to the user. PageSpeed is available for both Apache and NGINX. We’ll take advantage of a little-used image-resizing feature of PageSpeed.
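As a concrete sketch, the relevant Apache configuration might look like the following. The directive names come from the mod_pagespeed documentation; the particular filter selection shown here is an illustrative assumption, not the article’s exact setup:

```apacheconf
# Turn PageSpeed on for this server (sketch).
ModPagespeed on

# Image optimization: re-encode, strip metadata and, crucially for
# this article, resize images to match their width/height attributes.
ModPagespeedEnableFilters rewrite_images,resize_images

# Combine and minify CSS and JavaScript files.
ModPagespeedEnableFilters combine_css,rewrite_css
ModPagespeedEnableFilters combine_javascript,rewrite_javascript
```

Many of these filters are enabled by default in the "core" filter set, so in practice little or no explicit configuration may be needed.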

The resulting weight savings are dramatic and require very little work. If you follow the steps in this article, you should be able to cut image weight many times over with just a few easy steps, four lines of code and one line of configuration.

This article assumes that you are using the Apache Web server and are comfortable with some light PHP coding, but the techniques will work with NGINX19 also, and any programming language.

The Website Link

Our sample website is based on Twitter’s open-source Bootstrap20 Web framework, to save the world from my design skills. I created a single-page “website” for a fictional mobile phone store. The page is visually rich, with an industry-average breakdown21 of HTML, images and JavaScript. This page is based on a lightly modified version of Bootstrap’s carousel template22. Here is the page in its entirety, as you would see it in a desktop browser:

Screenshot: the full page as seen in a desktop browser

The website has an approximately industry-average payload that breaks down as follows:

Component                     Size on disk
Images                        941 KB (73%)
JavaScript (mostly minified)  159 KB
CSS                           170 KB
Total                         1,281 KB

Instructions Link

Step 1 Link

Install PageSpeed. This is best done by following Google’s instructions23. The installation process will usually activate the module for the default website, but you might need to ensure that it works with your virtual hosts24, if they’re configured. Basically, you just have to add a line to each one, or get them all to inherit from the default configuration server-wide.
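As a sketch of the two options just mentioned, using directive names from the mod_pagespeed documentation (the DocumentRoot path is a placeholder):

```apacheconf
# Option A: have every virtual host inherit the server-wide
# PageSpeed configuration with a single directive:
ModPagespeedInheritVHostConfig on

# Option B: enable PageSpeed explicitly inside each virtual host:
<VirtualHost *:80>
    DocumentRoot /var/www/example
    ModPagespeed on
</VirtualHost>
```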

Next, restart your Web server.

Step 2 Link

Set up a device-detection library. We’ll use DeviceAtlas here purely as an example; in its case, you’ll need to enter your license key in the DeviceAtlasCloud/Client.php file after unpacking the ZIP file. We can then use DeviceAtlas to determine the optimal size for images.

Step 3 Link

Copy DeviceAtlas’ PHP file to a directory where it is executable by the Web server. In this case, I’ve created a directory in the root of the website, named DeviceAtlasCloud. Enter the following code at the top of your HTML file or website template to set up a couple of variables that we can use throughout the page. If you’re using a different solution, then the syntax will vary but the same properties should be available.

  include 'DeviceAtlasCloud/Client.php';              // load the client
  $results = DeviceAtlasCloudClient::getDeviceData(); // fetch properties
  $props = $results['properties'];                    // store the result
  // Set $width to the device's display width, or "" if unknown.
  $width = (isset($props['displayWidth'])) ? $props['displayWidth'] : "";

The impact of fetching these properties should be no more than a few milliseconds if everything is set up correctly, because most solutions cache the data.

Step 4 Link

The final step is to give all of your images that might need resizing a width attribute, set to use the $width variable:

<img src="" width="<?php echo $width; ?>" alt="image description" />

Thus, images will now have their width attribute set from the $width variable, which is set automatically to the maximum display width for each device. PageSpeed will notice the width="…" attribute on each image and scale the image down if necessary, replacing the image’s src attribute with a reference to a resized version of the same file. There is no need to set the height attribute, because PageSpeed automatically retains the aspect ratio. Resized images are cached, so there isn’t any significant impact on the server. Refer to the PageSpeed configuration notes below for more fine-grained control of this cache.
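To make the mechanism concrete, here is roughly what the rewrite looks like on the wire. The file name, dimensions and hash below are invented; the URL pattern follows PageSpeed’s image rewriter, which encodes the target dimensions and a content hash into the rewritten file name:

```html
<!-- What the PHP template emits for a 320-pixel-wide device: -->
<img src="images/phone.jpg" width="320" alt="image description" />

<!-- Roughly what PageSpeed rewrites it to (name and hash are
     illustrative); the referenced file is a cached, resized copy: -->
<img src="images/320x240xphone.jpg.pagespeed.ic.AbC123dEf4.jpg"
     width="320" alt="image description" />
```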

Add this variable width tag only to images that will need to be resized for each device; that is, don’t add it to images that are already small enough (such as bullets and icons). Manually resizing images that require special attention might also be wise. Also, be aware that the width attribute for each image will need to play nice with any CSS used for image styling. Anyone using a content management system (CMS) might have to use a different technique for this, depending on how much access the CMS gives them to the underlying HTML.

Background images might require a different approach, depending on how they are implemented on the website. Still, PageSpeed will read inline style="…" tags.
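For inline styles, the same $width variable can be applied directly. A hedged sketch (the class name and image URL are illustrative):

```html
<!-- PageSpeed reads inline style attributes, so an image sized
     this way can also be resized per device: -->
<div class="hero" style="width: <?php echo $width; ?>px;
     background-image: url('images/banner.jpg');">
  ...
</div>
```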

Result Link

With those changes made, it’s time to see how the website looks. To measure the impact of this step, I tested the download speed of our website for different devices and network speeds, using the ever-useful Charles Proxy25 as well as real devices that I forced into various network configurations to reduce available bandwidth.

Before making the changes, the page’s overall size was 1,027 KB, regardless of device (i.e. a dynamic range of exactly 1.0). The breakdown is as follows:

Component   Size on disk  Size over network
Images      941 KB        941 KB (73%)
JavaScript  159 KB        55 KB
CSS         170 KB        27 KB
Total       1,281 KB      1,027 KB

This is the over-the-network size of the page, thanks to gzip compression from Apache. The RWD makes the page look OK on small screens, but it isn’t an efficient use of network resources because the original images are about five times wider (in pixels) than the average phone screen and, hence, impossible to display at full resolution without panning. In fact, the sheer size of the page means that it doesn’t even finish loading on some devices.

After following the above steps, the payload breakdown for an iPhone is this:

Component Size over network
Images 177 KB
JavaScript 51 KB
Total 253 KB

So, the page has dropped to one quarter of its original size, with images accounting for the majority of this reduction. On devices with lower-resolution screens, further gains are possible; the same page on a Nokia feature phone (Nokia 6230) now has a total image weight of just 89 KB, a large saving compared to the original. Importantly, the before and after websites have no perceptible difference to the user; the image data that was removed cannot easily be seen on a mobile screen.

After following the four steps outlined here, the page’s weight now varies between 1,027 KB and about 150 KB. In other words, the page has gone from having a dynamic range factor of exactly 1.0 to a decent 6.8, with the image payload shrinking from 941 KB to just 80 KB. This will have a huge impact on real-world customers:

  • Much faster page-loading times lead to better engagement and fewer drop-offs.
  • A smaller impact on the user’s data plan leads to more return visits.
  • Wider device and network compatibility leads to improved reach.

Here’s the loading-time effect on a Retina iPhone using 3G and 2.5G networks:

Device and network Before After
iPhone 3G 14s 6s (2.3× faster)
iPhone GPRS 2m 30s 35s (4.3× faster)

The results on Android devices are similar. On lower-end devices with smaller screens, more dramatic improvements are possible because the image-resizing gains are greater.

Going Further, Part 1: JavaScript And CSS Link

So far, we’ve looked only at the main source of page bloat: images. But screen size shouldn’t be the sole factor in our methods; user contexts and constraints demand a broader multi-device publishing strategy, and many additional optimizations are possible. For example, if you know that the requesting device doesn’t support JavaScript or rich CSS, then ditching them might make sense. This is quite straightforward to accomplish.

If we add another line of PHP to the top of our HTML file, we can do a little more.

$highEndDevice = (isset($props['browserRenderingEngine'])
    && in_array($props['browserRenderingEngine'],
                array('Gecko', 'Trident', 'WebKit', 'Presto')));

We determine whether a device is high-end based on its rendering engine. This is a crude rule, but good enough to demonstrate the point. Based on this variable, we can now selectively include some resources only if they’re beneficial. In this case, if the device appears to be low-end, we’ll jettison the CSS and JavaScript, because low-end phones will struggle with both the file size and the rendering of this CSS, and they usually won’t run the JavaScript:

<?php if ($highEndDevice): ?>
    <link href="css/bootstrap.css" rel="stylesheet">
    <link href="css/bootstrap-responsive.css" rel="stylesheet">
    <link href="css/additional.css" rel="stylesheet">
<?php endif; ?> 

And now the JavaScript:

<?php if ($highEndDevice): ?>
    <!-- Le javascript
    ================================================== -->
    <!-- Placed at the end of the document so the pages load faster -->
    <script src="js/jquery.js"></script>
<?php endif; ?>

This saves a further 72 KB (quite a reduction from 253 KB!) and actually makes the page look better on low-end devices (see the screenshot at the end of this article), in addition to being quicker to render. Now, when viewed on a low-end device, the page’s weight is as follows:

Component Size over network
Images 50 KB
JavaScript 0 KB
Total 53 KB

This means that our simple RWD page has gone from a fixed size of about 1 MB to a highly varying one that goes as low as about 50 KB, or 20 times smaller than the page we started with. Not a bad result for fewer than 10 lines of code. The net result is that our page is now viewable on almost anything, anywhere, from televisions to feature phones — and quickly! You might not be targeting televisions and feature phones, but now they will come to you.

Note: In testing this approach on real devices, HTML5’s doctype tag did not cause problems. The page loaded on pretty much every device I tried.

Going Further, Part 2: Connectivity Detection Link

Most of the weight savings so far hinge on certain low-end mobile devices not needing full-resolution images, rich styling or JavaScript. This set of techniques is highly effective, but we can do more to cater to our users and extend our reach. Desktop devices sometimes need help, too. Anyone who has used a laptop over airport Wi-Fi or in a poorly connected country knows just how frustrating viewing large pages is.

Connectivity detection comes to the rescue here. The idea is that, if you can detect the connectivity available to the requesting browser, then you can apply similar image-compression techniques dynamically according to the client’s available bandwidth. This can make a huge difference to the browsing experience, at some cost to page fidelity: If we detect a poor connection, then images can be aggressively compressed without reducing their pixel size. The result is a page that loads much faster, with only a slight impact on the experience — the page’s layout and overall appearance will be preserved. Depending on the compression levels, many people won’t even notice.

The W3C is working on making bandwidth information available to the browser, but the Network Information API26 is still in draft status and so won’t be widely deployed in the near future.

In the meantime, we can use some server-side capabilities. DeviceAtlas incorporates a feature to do exactly this, enabling the developer to make useful decisions about what to send to the client when bandwidth is limited. For this example, we’ll do something simple yet effective: If the bandwidth available to the device is detected to fall below a certain threshold, then we will redirect the browser to a different virtual host. This virtual host will be served by the same Web server and will serve exactly the same page, but it will trigger a different set of options for PageSpeed. In this case, we’ll change the image-compression level from its default to something much lower — say, 20%. At this level, images will still be very recognizable and will fit the page’s layout but will be many times smaller in bytes.

What follows is a simple way to achieve this, intended as a quick example of what is possible rather than a definitive technique. First, add a new virtual host to your server’s configuration, and configure PageSpeed to use different settings for it. Then, restart your Web server. I set up two virtual hosts for this, one for normal traffic and one for low-bandwidth clients. Note that the DocumentRoot is the same in each case: the exact same HTML is served for both virtual hosts.

<VirtualHost *:80>
    ServerAdmin webmaster@localhost
    DocumentRoot /var/www/
    ModPagespeed on
</VirtualHost>

<VirtualHost *:80>
    ServerAdmin webmaster@localhost
    DocumentRoot /var/www/
    ModPagespeed on
    ModPagespeedImageRecompressionQuality 20
</VirtualHost>

Next, add the connectivity-checking code to your website’s template. This is what switches the virtual host if connectivity appears to be poor:

require_once 'DeviceAtlasNPC.php';        // check network performance
$deviceAtlasNPC = new DeviceAtlasNPC();   // instantiate NPC
$quality = $deviceAtlasNPC->getQuality(); // test network performance
$path = $_SERVER['SCRIPT_NAME'];

switch ($quality) {
    case DeviceAtlasNPC::HIGH_QUALITY:
        if ($_SERVER['HTTP_HOST'] == '') {
            header("Location:" . $path);
            exit;
        }
        break;
    case DeviceAtlasNPC::MEDIUM_QUALITY:
        if ($_SERVER['HTTP_HOST'] == '') {
            header("Location:" . $path);
            exit;
        }
        break;
    case DeviceAtlasNPC::LOW_QUALITY:
        if ($_SERVER['HTTP_HOST'] == '') {
            header("Location:" . $path);
            exit;
        }
        break;
}

Yes, the initial redirection to the low-bandwidth website does affect the loading time, but this penalty is far outweighed by the net savings in doing so, particularly for users with constrained bandwidth; high-bandwidth users should be almost unaffected. The cost of this redirection on a slow GPRS connection is approximately one second, but the resulting savings can add up to minutes.

Overall Results Link

So, let’s look at how this all comes together with the various optimizations, device types and connectivity options. The following table sums it all up.

                      Original website     Add image resizing   Add adaptive JS and CSS
                      Size      Load time  Size      Load time  Size      Load time
Desktop (high-speed)  1,027 KB  3s         921 KB    3s         921 KB    3s
Desktop (56k modem)   1,027 KB  3m 27s     921 KB    3m 14s     921 KB    2m 05s
iPhone 3G             1,027 KB  14s        253 KB    7s         253 KB    6s
iPhone GPRS           1,027 KB  2m 30s     253 KB    40s        253 KB    40s
Feature phone (2G)    1,027 KB  n/a        203 KB    35s        87 KB     25s

                      Original website     Add network adaptivity
                      Size      Load time  Size      Load time
Desktop (high-speed)  1,027 KB  3s         921 KB    3s
Desktop (56k modem)   1,027 KB  3m 27s     227 KB    40s
iPhone 3G             1,027 KB  14s        153 KB    5s
iPhone GPRS           1,027 KB  2m 30s     153 KB    25s
Feature phone (2G)    1,027 KB  n/a        25 KB     12s

These loading times are all worst-case scenarios, tested with an empty cache. Subsequent page loads as you traverse the website will feel much faster.

With all of these optimizations in place, we now have a more sensible responsive design that scales dynamically in richness and size, from 1 MB all the way down to 25 KB, all in. The website incorporates the best of RWD and server-side optimizations to yield a dynamic range factor of over 40.

The website is now responsive to multiple factors, not just one: screen size, device capabilities and network performance. As a result, the reach of this page has extended from desktop and smart devices in well-connected locations to almost everything, everywhere, regardless of connection type. Apart from getting our image size tags to populate dynamically, we didn’t have to do much work to make this happen.

Website before optimization on various devices:

Desktop: 1360 × 768 pixels, page size 1,027 KB

iPhone: 320 × 480 pixels, page size 1,027 KB

Nokia 6300: 240 × 320 pixels, page size 1,027 KB

Website after optimization on various devices:

Desktop: 1360 × 768 pixels, page size 1,027 KB

iPhone: 320 × 480 pixels, page size 153 KB

Nokia 6300: 240 × 320 pixels, page size 25 KB

Circling back to the page linked at the top of this article, Chris Zacharias, speaking of his experience optimizing YouTube’s page weight, says the following33:

“[Previously], entire populations of people simply could not use YouTube because it took too long to see anything… By keeping your code small and lightweight, you can literally open your product up to new markets.”

By using some of the techniques outlined in this article, you might be able to achieve similar results for your website.

Notes Link

PageSpeed has a couple of other useful tricks up its sleeve35.

Many thanks to Jonathan Heron36 of McCannBlue37 for reviewing this article.

(al, il)


Ronan Cremin is the CTO of dotMobi. He focuses on building tools and services for mobile web applications, such as the award-winning goMobi and DeviceAtlas products. Mr. Cremin also represents dotMobi at the World Wide Web Consortium (W3C). Mr. Cremin has more than a dozen years of experience in mobile web and internet sectors, including stints in leading markets like Japan with NTT DoCoMo, Europe with Vodafone, 3 and Orange, and the USA with AOL. Prior to dotMobi, he focused on product management in the mobile application sector at Valista. Before Valista, he was at Critical Path, where he managed email and directory products. Ronan holds a BE in Electronic Engineering from University College, Dublin.

  1. 1

    Very interesting technique… Is there an existing open source alternative to DeviceAtlas ?

    • 2

      Yes, OpenDDR ( is the nearest open source equivalent but doesn’t include an integrated client-side option and connectivity analysis tool.

    • 3

      There’s OpenDDR — I know they started with a fork of WURFL’s database from when it was still open-source and hosted on SourceForge. Apparently they faced some sticky legal issues, because the maintainers of WURFL claimed that they couldn’t do that (although every open-source license I know of says you can). Not sure how big they are now, but they do seem to be growing (although probably more slowly than a commercially maintained one). They have SDKs for Java and .NET too, which is nice.

      Edit: Looks like the author beat me to it :-)

      • 4

        WURFL Maintainer here. I can confirm that the openDDR thing isn’t solved yet.

    • 5


There are some RESS resources for mobile and tablet devices here

    • 6

      Hi, Luca Passani of WURFL (and ScientiaMobile) here. Since you ask about open-source device detection, WURFL is very arguably still Open-Source, in that it can be downloaded from our site with the AGPL license (endorsed by both FSF and OSI). This includes the source code.
      In a nutshell, you can use the API free of charge if you are willing to release your application’s complete code base with an AGPL compatible license (this is good ol’ RMS-style “copyleft” extended to software that runs on a web server).

      Of course, we understand that this is not very viable for many organizations (particularly commercial ones). Those organizations can acquire the same software (plus valuable goodies such as frequent updates) under a different commercial license. This is the foundation of our business model and the one that allows ScientiaMobile to pay for resources that keep the repository updated, which is why people love it.

      Finally, we also offer the WURFL Cloud, which includes a free offer for hobbyists and those who are simply experimenting. This comes with support for all major (and quite a few minor) languages.

      Thank you

      • 7

        @Luca, not sure I understand the paragraph: “In a nutshell, you can use the API free of charge if you are willing to release your application’s complete code base with an AGPL compatible license (this is good ol’ RMS-style “copyleft” extended to software that runs on a web server).”.

        If the code that connects to the API (free of charge or subscription based) is MIT licensed or similar your code can be private and licensed or sold without problems.

        @Ronan thank you for the article! I would add the fact that you should also have discussed about content strategy, not the files (CSS, JS) but the actual content from the page. It’s important for people to understand that they can help their RWD implementation with Server-Side detection (API or locally) in so many ways.

        @Andrew thanks for pointing out Mobile Detect, we’re working hard to keep it up to date so I understand all the work companies like DeviceAtlas, ScentiaMobile, 51Degrees, HandsetDetection, etc. put into their platforms to provide a wide array of Server-Side features.

        • 8

          @Serban I very much agree that the actual content of the page can (or should) be altered per device, but wanted to tackle the basics first. I think that devices like Google Glass and the various emerging smart watches at one end, and TVs and in-car browsers at the other, may force a change in thinking here. The use cases differ radically. A one-size-fits-all approach to page content is very appealing but may have natural limits.

          Watch out for the next article!

        • 9

          @Serban, You cannot sell MIT code for a living. I know that Richard Stallman has said that you can, but this is only an academic hypothesis because nobody pays money for something that is also available for free and it is virtually impossible to run a business on it (and believe me, with years of WURFL experience I know what I am talking about). Granted, companies like RedHat and MySQL have done it, but that’s software with hundreds of thousands of users. More specific kinds of software are harder to monetize with a purely FOSS approach.
If you are talking about reselling WURFL under an AGPL license (regardless of the fact that some parts are not AGPL), it means that you are passing the obligation to open-source their complete code base on to your customers. If one is running a real business, licensing WURFL from ScientiaMobile is a much better option for quite a few reasons.
          Developers should consider the fact that we release the source code as a feature, and not as an incentive not to pay for WURFL. We have granted discounted (and even free licenses) to no-profits and other non-commercial institutions in the past.

About keeping code private, you would be surprised to learn how powerful AGPL is at penetrating concrete walls and putting a copyleft on things that would normally be considered private under the regular GPL. Not by coincidence, all the IP lawyers I’ve met (on both sides of the Atlantic) recommend that companies sail miles away from anything AGPL.

  2. 10

    Joey van Dijk

    October 8, 2013 7:15 am

A nice and complete setup that works! ;)
Images are the biggest culprit when optimizing your website for multiple devices, and therefore I wrote a responsive images solution that works with pure JavaScript on the client. It reacts to onload, resizes and DOM updates. for a demo and for more information.


    • 11


      Thanks for the comment.

While I agree that your solution fetches appropriately sized images, it does require that all of these image variants be created in advance. With PageSpeed, all of this heavy lifting is done for you automatically.

      Different sites will have different publishing workflows, of course, so it’s important to pick the one that works best for you.

  3. 12

    Before anyone says ‘hey, this totally breaks the REST paradigm because now you’re no longer serving the same resource all the time’, actually, it’s still compatible with RESTful approaches: the User-Agent header in the request becomes part of the input, so consistency is still maintained for a given input.

  4. 13

It’s great that this technique makes the website responsive to more than just screen size: device capabilities and network performance too, all of which improve page speed, a key UX factor that is not always considered by designers because it falls in the realm of development.

  5. 14

    Melvin Thambi

    October 8, 2013 6:31 pm

    Interesting article, will give a try and let you know the result. Kudos for your great effort :)

  6. 15

A free Windows alternative to Charles Proxy is

  7. 16

    This seems to be a very long article that basically says “compress and rescale your images” and “don’t load files you don’t need”.

    I thought that was common sense (and common knowledge).

    Or did the advent of broadband make people forget that page size and load time is important?

    • 17

      Common sense isn’t that common ;-)

      Plus the article describes actual techniques, not just what to do.

    • 18

      Clearly you consider yourself informed and therefore this article is not aimed at you and at the `uninformed` instead! I personally wasn’t aware of these specific techniques although I agree it is obvious that we should be optimizing to reduce pageload/bandwidth where possible ;)

      Thanks Ronan!

  8. 19

I did a lot of optimizing of images and resource usage for an MVC-based site recently. Using RESS dramatically reduced the load on bandwidth, devices and servers. I am a huge proponent of it: with all major US carriers moving to essentially metered Internet, scaling down page sizes seems more critical now than ever. In .NET, I’m creating tailored CSS and JS packages for the features that each browser/device can use.

  9. 20

    I’ve been using a similar technique with php mobile-detect and a simple resize/cache script (it needs to work on servers where pagespeed isn’t available), but there is always an area where I get stuck – what size the images should be scaled to for different devices.

    Am I missing something here? Using the maximum width of the screen doesn’t make sense to me and seems like a big drawback of this method. It works fine when all images are 100% of the screen’s width, but what about the pictures of the phones that are missing from the comparative images?
    If each of those images was originally 600px wide and I view the page on a screen that is 1200px wide, what happens then? Is the width attribute set to 1200px, so that the image is enlarged, with an increased file size, and then constrained by CSS to the area it lives in?
    This would be the same on anything other than a mobile device, where you would probably want the images at 100% width.

    Please fill me in as this is something I am wrangling with at the moment and can’t find a solution I’m happy with :(

    •


      Thanks for the comments.

      I think that the main point of resizing to the device screen width is that you know it can’t display any more detail than this so it’s a good saving vs. the original, even if the image is styled to be less than 100% width within the page.

      With PageSpeed in particular I’m pretty sure that it doesn’t do any scaling *up* of images before sending because that would go against its philosophy (and is easily done by the browser in any case).
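      The rule described here, cap the target size at the screen width but never scale up, can be sketched in one line (the function name is illustrative):

```javascript
// Never upscale: the target width is the smaller of the original
// image width and the device's screen width. Upscaling would only
// increase the file size without adding any detail.
function targetWidth(originalWidth, screenWidth) {
  return Math.min(originalWidth, screenWidth);
}
```

      So a 600px-wide original viewed on a 1200px screen is left at 600px, while a 2000px original sent to a 320px-wide phone is cut down to 320px.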

  10.

    Ilincescu Mihai

    October 12, 2013 10:10 am

    Does this PageSpeed module from Google work only on Apache servers, or is there something for IIS and the Microsoft/Windows world?

  11.

    Barbara Bermes

    October 15, 2013 7:53 am

    Hi Ronan,

    Great article. Thank you.

    A few questions regarding connectivity detection and this blog post (client-side detection via media queries, e.g. Ilya’s reply in the comments).

    Do you apply server-side connectivity detection ($quality = $deviceAtlasNPC->getQuality();) only on the first load of the page or on each page request? What are the $quality intervals that you check against, and how granular are they?

    •


      Thanks for the comments.

      The $quality intervals are configurable but the default is 1 in every 10 page loads for each client.

      The granularity is 3 levels, approximately equivalent to real-world GPRS / 3G / WiFi rates. We toyed with making them finer-grained but we think this is good enough for the vast majority of use cases.

      The really nice thing about this technique is that it works on practically any device, even the lowest-end feature phone. This is important because it is these very devices that tend to need the most help, not the latest smartphone.
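      A minimal sketch of the sampling-and-bucketing scheme described above (the thresholds and function names are assumptions for illustration; the real library’s values are configurable):

```javascript
// Measure network quality on roughly 1 in every 10 page loads per client.
function shouldMeasure(pageLoadCount) {
  return pageLoadCount % 10 === 0;
}

// Three coarse levels, approximately GPRS / 3G / WiFi. The kbps
// thresholds are illustrative assumptions, not the library's values.
function qualityBucket(kbps) {
  if (kbps < 100) return 'low';     // GPRS-like
  if (kbps < 2000) return 'medium'; // 3G-like
  return 'high';                    // WiFi-like
}
```

      Coarse buckets keep the result stable between samples, which matters because mobile throughput fluctuates a lot from one measurement to the next.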

  12.

    For now, the best PHP library I have found for implementing a RESS solution is Detector (GitHub:

    It’s based on some robust code : “The server-side portion of Detector is based upon modernizr-server by James Pearce (@jamespearce) and the browser-detection library ua-parser-php. Detector utilizes Modernizr for its client-side, feature-detection support.”

    It’s also easy to use and install.


  14.

    A great article, very thorough and informative, thanks :)

  15.

    Christian Krammer

    December 12, 2013 3:37 am

    Device detection is flawed from the ground up, in my eyes. The minute you set up such a library, it is already outdated. Apart from that, something like array(‘Gecko’, ‘Trident’, ‘WebKit’, ‘Presto’) really gives me the shivers. Merely using the WebKit engine doesn’t make a device a high-end device.

    In my eyes, feature detection is still the way to go. I like the way the BBC does it with the “cutting the mustard” technique, classifying devices into “old” and “new”. That’s also how I did it lately. It works quite well, because this way you can also assume that a good level of advanced CSS support is available and you don’t need to deal with oldIE or the like. Just assume “old” and “new” and that’s it.
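    For reference, the BBC’s “cutting the mustard” test is only a few lines of JavaScript; written here as a function of the global and document objects so it can be exercised outside a browser:

```javascript
// "Cutting the mustard": a browser counts as "new" if it supports
// these three APIs; otherwise it gets the basic, unenhanced experience.
function cutsTheMustard(global, doc) {
  return 'querySelector' in doc &&
         'localStorage' in global &&
         'addEventListener' in global;
}
```

    In a real page this would be called as cutsTheMustard(window, document), typically to decide whether to load the enhanced JavaScript bundle at all.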

    •

      Thanks for the comments Christian.

      > The minute you set up such a library it is already outdated.

      The device-detection libraries certainly do need to be kept up to date, but the good ones are updated daily and stay current for all intents and purposes, meaning that they successfully recognise 99+% of devices.

      > Apart from that: something like array(‘Gecko’, ‘Trident’, ‘WebKit’, ‘Presto’)” really gives me the shivers.

      I agree, and said so in the article, “This is a crude rule but good enough to demonstrate a point.”

      > In my eyes feature detection is still the way to go.

      Feature detection is a good way to detect features, but it isn’t going to help you lighten your site, which was the point of this article. Note that JS-derived screen properties are next to useless, though they will presumably improve:

      One other point: many of the feature detection libraries do their own UA-based logic behind the scenes e.g.

  16.

    Another good Charles Proxy alternative is HTTP Debugger Pro:

