Website Performance: What To Know and What You Can Do


Website performance is a hugely important topic, so much so that the big companies of the Web are obsessed with it. For the Googles, Yahoos, Amazons and eBays of the world, slow websites mean fewer and less happy users, and thus lost revenue and reputation.

In your case, annoying a few users wouldn’t be much of a problem, but if millions of people are using your product, you’d better be snappy in delivering it. For years, Hollywood movies showed us how fast the Internet was: time to make that a reality.

Even if you don’t have millions of users (yet), consider one very important thing: people are consuming the Web nowadays less with fat connections and massive computers and more with mobile phones over slow wireless and 3G connections, but they still expect the same performance. Waiting for a slow website to load on a mobile phone is doubly annoying because the user is usually already in a hurry and is paying by the byte or second. It’s 1997 all over again.


Performance is an expert’s game… to an extent. You can do innumerable things to make a website perform well, and much of it requires in-depth knowledge and tedious testing and research. I am sure a market exists for website performance optimization, much like the one that exists now for search engine optimization. Interestingly, Google recently announced that it will factor performance into its search rankings, so this is already happening. That said, you can do a lot of things without having to pay someone to point out the obvious.

Know Your Performance Blockers

Performance can be measured in various ways. One way is technical: seeing how fast a page loads and how many bytes are transferred. Another is perceived performance, which ties into usability testing. This can only be measured by testing with users and seeing how satisfied they are with the speed of your interface (e.g. do they start clicking on your JavaScript carousel before it is ready?).

The good news (and hard truth) about performance is that 80 to 90% of poor performance happens in the front end. Once the browser gets the HTML, the server is done and the back-end developer can do nothing more. The browser then starts doing things to our HTML, and we are at its mercy. This means that to achieve peak performance, we have to optimize our JavaScript, images, CSS and HTML, as well as the back end.

So here are the things that slow down your page the most.

External Resources (Images, Scripts, Style Sheets)

Every time you load something from another server, the following happens:

  1. The browser opens up the Internet’s address book and looks up the number associated with the name of the server that’s holding the things you want (i.e. its DNS entry).
  2. It then negotiates a delivery.
  3. It receives the delivery (waiting for all the bytes to come in).
  4. It tries to understand what was sent through and displays it.

Every request is costly and slows down the loading of the page. Matters are made worse by browsers loading things in chunks (usually four at a time) rather than all at the same time. This is akin to ordering a product from a website, choosing the cheapest delivery option and not being at home between 9:00 am and 5:00 pm. If you include several JavaScript libraries because you like a certain widget in each, then you’ll double, triple or even quadruple the time that your page takes to load and display.
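
As a back-of-the-envelope illustration (the latency figure is a made-up example, not a measurement), you can model why many requests hurt: with only four parallel connections, resources load in "rounds", so twenty separate files cost roughly five full round trips, while one combined file costs just one:

```javascript
// Toy model of request cost: browsers of this era open roughly four
// parallel connections per host, so resources load in "rounds".
// LATENCY_MS is an assumed example value, not a real measurement.
const LATENCY_MS = 100;
const PARALLEL = 4;

function roughLoadTime(resourceCount) {
  // Each round of up to four requests costs at least one round trip.
  return Math.ceil(resourceCount / PARALLEL) * LATENCY_MS;
}

console.log(roughLoadTime(20)); // 20 files: 5 rounds
console.log(roughLoadTime(1));  // 1 combined file: 1 round
```

The model ignores bandwidth and DNS lookups entirely, but it shows why cutting the request count is usually the single biggest win.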

Scripts

JavaScript makes our websites awesome and fun to use, but it can also make for an annoying experience.

The first thing to know about scripts that you include in a document is that they are not HTML or CSS; the browser has to call in an expert to do something with them. Here is what happens:

  1. Whenever the browser encounters a <script> block in the document, it calls up the JavaScript engine, sits back and has a coffee.
  2. The script engine then looks at the content in the script block (which may have been delivered earlier), sighs, complains about the poor code, scratches its head and then does what the script tells it to do.
  3. Once the script engine is done, it reports back to the browser, which puts down its coffee, says good-bye to the script engine and looks at the rest of the document (which might have been changed, because the script may have altered the HTML).

The moral of the story is to use as few script blocks as possible and to put them as far down the document as possible. You could also use clever and lazy JavaScript, but more on that later.

Images

Here is where things get interesting. Optimizing images has always been the bane of every visual designer. We build our beautiful images in Illustrator, Photoshop or Fireworks and then have to save them as JPG, GIF or PNG, which changes the colors and deteriorates the quality; and if we use PNG, then IE6 arrives as the party-pooper, not letting us take advantage of PNG’s cool features.

Optimizing your images is absolutely necessary because most of the time they are the biggest files on the page. I’ve seen people jump through hoops to cut their JavaScript down from 50 KB to 12 KB and then happily use a 300 KB logo or “hero shot” in the same document. Performance needs you!

Finding the right balance between visual loss and file size can be daunting, but be grateful for the Web preview tool, because we didn’t always have it. I recall using Photoshop 4 and then Photoshop with the Ulead SmartSaver, for example.

The interesting thing about images, though, is that after you have optimized them you can still save many more bytes by stripping unnecessary data from the files and running them through tools that further compress the images losslessly. The bad news is that many such tools are out there, and you’ll need different ones for different image formats. The good news is that tools exist that do all of that work for you, and we will come back to this later. For more advanced optimization techniques, take a closer look at Smashing Magazine’s articles Clever JPEG Optimization Techniques, PNG Optimization Guide and Clever PNG Optimization Techniques.

Simple Tools You Can Use Now To Improve Performance

All of those companies that obsess about page performance offer tools that allow you to check your own website automatically and make it easy to work around problems.

Test Your Performance

The first thing to do is find out how your website can be optimized. Here are three great tools (among others that crop up all the time) to use and combine.

Yahoo’s YSlow

YSlow is a Firebug add-on from Yahoo that allows you to automatically check your website for performance issues. The results are ranked like American school grades, with A being the best and F being the worst. The grades are cross-linked to best practice documentation on the Yahoo performance pages. You can test several settings: “classic YSlow,” which is targeted to Yahoo-sized websites, “YSlow 2” and “small site or blog.” Results are listed clearly and let you click through to learn.

YSlow Smashing mag overall grade.

In the components view, YSlow lists all of the issues it has found on your website and how serious they are:

Smashing Magazine on YSlow components view.

The statistics view in YSlow gives you all information in pie charts:

Smashing Magazine in YSlow - statistics.

The tools section in YSlow offers a lot of goodies:

  • JSLint
    Checks the quality and security of your JavaScript by running it through JSLint.
  • All JS
    Shows all JavaScript code in a document.
  • All JS Beautified
    Shows all JavaScript code in a document in an easy-to-read format.
  • All JS Minified
    Shows all JavaScript code in a document in a minified format (i.e. no comments or white space).
  • All CSS
    Shows all CSS code in a document.
  • All Smush.it
    Automatically compresses all of your images (more on this later).
  • Printable View
    Creates a printable document of all of YSlow’s results (great for showing to a client after you’ve optimized the page!).

YSlow tools.

Google’s Page Speed

Like YSlow, Page Speed by Google is also an add-on for Firebug. Its main difference is that it does a lot of the optimization for you and provides the modified code and images immediately.

Smashing Magazine on Google Page Speed.

Page Speed’s other extra is that it monitors the overall activity of your page, allowing you to see when a document loads other resources after it has been loaded and to see what happens when a user rolls over elements or opens tabs and menus that load content via AJAX.

Smashing Magazine Page Speed Activity.

Be careful with this feature, though: it hammers your browser quite hard.

AOL’s WebPageTest

Rather late to the game, AOL’s WebPageTest is an application with some very neat features. (It is also available as a desktop application, in case you want to check Intranets or websites that require authentication.)

WebPageTest allows you to run tests using either IE8 or IE7 from a server in the US or the UK, and it allows you to set all kinds of parameters, such as speed and what to check for:

Pagetest web page performance test.

Once you have defined your parameters and the testing is completed, you will get in-depth advice on what you can do to optimize. You’ll get:

  • A summary,
  • Detailed results,
  • A performance review,
  • An optimization report,
  • The content breakdown,
  • The domain breakdown,
  • A screenshot.

Web page performance test results.

One very cool feature of WebPageTest is the visual recording you get of how long it takes for page elements to show up on screen for users. The following screenshot compares the results of this blog, Ajaxian and Nettuts+:

Web page visual comparison.

You can even create a video of the rendering, which is another very cool thing to show clients.

Once you get the test results, it is time to fix any problems.

Use Image Sprites

Image Sprites were first discussed in an article published by Dave Shea and based on the work of Petr Stanicek. They have been covered extensively here before, but understanding their full benefit is important before you start using them:

  • All of your images will be available as soon as the main image has loaded (no flickering on roll-overs or other annoyances).
  • One HTTP request is made, instead of dozens (or hundreds, in some cases).
  • Images have a much higher chance of staying cached on the user’s machine because they are contained in a single file.
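
To illustrate how a sprite replaces many images with one (this sketch is mine, not from the article; the 16-pixel icon size and the vertically stacked layout are assumptions), every icon shares the same background image and differs only in its `background-position` offset, which can be computed mechanically:

```javascript
// Sketch: generate the CSS rule for the nth icon in a sprite where
// icons are stacked vertically, each ICON_SIZE pixels tall (assumed).
const ICON_SIZE = 16;

function spriteRule(selector, index) {
  // Same image for every icon; only the vertical offset changes.
  const offsetY = -(index * ICON_SIZE);
  return `${selector} { background: url(sprite.png) no-repeat 0 ${offsetY}px; }`;
}

// Hypothetical icon names:
console.log(spriteRule('.icon-home', 0));
console.log(spriteRule('.icon-mail', 2));
```

One HTTP request for `sprite.png` then serves every icon on the page, which is exactly the benefit listed above.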

Shea’s article points out a lot of cool resources for creating CSS Sprites but misses one that was released not long ago. Sprite Me was produced by Google (under the supervision of Steve Souders) and allows you to create Sprites automatically from any website, even via a bookmarklet. It analyzes the images on a page and offers you various options before generating the Sprite and CSS for you.

Here’s a video of Steve showing Sprite Me in action:

Sprite Me on eBay.

Optimize Your Images

You know now that Page Speed can automatically optimize your images. Another way to do this is with Yahoo’s Smush.it, a set of image optimization tools that analyze your images, create the smallest possible versions and send you a ZIP file of them all.

You can use Smush.it directly in the browser or automatically from YSlow. The website tells you how many bytes you can save by optimizing your images. This is yet another cool thing to show potential clients when pitching for a job.

Yahoo! Smush.it™

Smush.it optimization results.

Collate Scripts and Load Scripts on Demand

As noted, try not to spread your <script> nodes all over the document, because the browser stops whenever it encounters one. Instead, insert them as far down in the document as possible.

You could even collate your scripts automatically in one single include using back-end scripts. Edward Eliot wrote one of these in PHP a while ago. It lets you create a single JavaScript include for all of your scripts and one for your CSS files, and it even versions them for you.

JavaScript can be added dynamically to the page after the page has loaded. This technique is called “lazy loading,” and several tools are available to do it. Jan Jarfalk has one to lazy load jQuery plug-ins.

Some JavaScript libraries let you import only what you really need, instead of bringing in the whole singing-and-dancing library. YUI, for example, has a configurator that allows you to pick and choose what you need from the library and either gives you a single URL where you can get the different scripts or creates a JavaScript that loads them on demand:

The YUI Configurator

Notice that a tab tells you what the overall size of the library will be.

The main trick in lazy loading is to dynamically create script nodes with JavaScript after the page has loaded and only when they are needed. I wrote about that two years ago on 24ways, and it has been a best practice for displaying badges and widgets for a long time now.
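
The core of that trick looks something like this (a minimal sketch; real loaders add callbacks and error handling, and the `doc` parameter is my addition so the function can be exercised outside a browser):

```javascript
// Minimal lazy loader: create a script node only when the code is
// actually needed, instead of blocking the initial page load.
function loadScript(src, doc) {
  const d = doc || document; // injectable for testing; defaults to the browser
  const script = d.createElement('script');
  script.type = 'text/javascript';
  script.src = src;
  // Appending to <head> starts the download without blocking parsing.
  d.getElementsByTagName('head')[0].appendChild(script);
  return script;
}

// Usage in a browser, e.g. only when a widget is first used (URL made up):
// button.onclick = function () { loadScript('widget.js'); };
```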

Use Network Distributed Hosting

If you use a JavaScript library or the CSS that comes with it, make sure to use the hosted versions of the files. In the case of YUI, this is done for you if you use the configurator, and you can pick either Yahoo’s or Google’s network.

For other libraries, there is a Google code repository of AJAX libraries. This is useful for a few reasons:

  • Visitors to your website will get the JavaScript and CSS from the server that is as geographically close to them as possible and won’t have to wait for your server to send the information from around the globe.
  • There is a high probability that these servers are faster than yours.
  • Visitors who have visited other websites that use the same includes will already have them on their computers and won’t need to load them again.
  • You save on bandwidth and can easily upgrade the library by changing the version number of the include.

While you probably wouldn’t be able to afford distributed hosting for your own files, Coral makes an interesting offer to distribute your data onto a network of servers for an affordable $50 a month.

Watch Some Videos

If you want to see how some of this works, check out the following videos, especially Nicole Sullivan’s, which shows some very cool CSS tricks:

Follow The Leaders

To learn more about website performance, here are some resources and people to follow. (Be warned: some of the content is technically tough.)



An international Developer Evangelist working for Mozilla in the lovely town of London, England.

  1. I like the SpriteMe thingy, this always was a shitty job.

  2. Not wishing to state the obvious, but surely the simplest solution is to add a mobile stylesheet stripping out all the extraneous crap that people with tiny screens don’t need to see.

  3. I have been using smush.it on images for a while now, and find it saves loads on file size. Does using valid HTML/XHTML with properly nested tags etc. help a page to load and render faster? I have heard this rumoured but never read any solid proof of it.

  4. @mike for those mobiles that support them (which are not many), yes. But you still load the main document…

  5. Great article! I’ll make sure to consider most of the optimization methods mentioned. I could imagine that performance optimization will lose even more importance in the long run… the Internet (as well as mobile Internet technology) will become faster and faster anyway.

  6. Great post. Read the books “High Performance Web Sites” and “Even Faster Websites” by Steve Souders: an amazing read and a thorough insight into website optimisation techniques and methods.
  7. I highly recommend Aptimize WAX (http://www.aptimize.com/how-it-works).

  8. @Grant – I can’t imagine invalid HTML/XHTML/CSS NOT making things slower, to be honest: not only does the browser have to load the HTML/CSS and figure out how to render it, it ALSO has to guess and assume a lot of extras because of things like missing end tags. Granted, leaving the quotes off attribute values maybe wouldn’t matter, but missing end tags would certainly make the browser take longer to render, since it has to assume where you wanted the end tag to be. Besides, it’s just good practice anyway ;)

     Another thing to add to the CSS section: instead of using nasty CSS hacks to get things to work in real browsers AND IE, use a separate stylesheet for IE and put it inside conditional comments in your ‘head’ element. That way, other browsers won’t even see or bother to download the IE-specific CSS, and they won’t have to deal with possibly invalid (or at least very confusing) CSS! Personally, on some sites I’ve even had two IE stylesheets, one named ie6.css and one named ie+.css… though it’s been a while since I’ve cared about IE6 working perfectly ;) And don’t forget to minify both your JavaScript and CSS! Yahoo has a great tool for that!

  9. Nice post. YSlow and PageSpeed are good tools. The CSS and JS are a challenge because of plugins in WP.

  10. Thanks for this excellent treatment of the various tools available for analysis. I’ve been writing a series on web performance on my blog as well: http://luke.ehresman.org/blog/series/313/web-performance
  11. Good article Chris!

      You’re right that there are some obvious website performance optimization techniques. Turning on gzip compression, for example, is a no-brainer and can have a significant impact. But even some of the well-known techniques are not easily applied, and involve considerations and choices to be made.

      If you run a high-traffic site and are serious about UX/conversion optimization, you need an expert: someone who has all the knowledge and knows how to apply it to that particular site. You need a solid analysis to find the problems and understand how they are related to each other. For example, JavaScript optimization is not trivial: where to place the JS in the document, which technique to use to load external JS without breaking the site’s functionality and hurting the user experience… Solving the load-time problems and implementing the *best* solutions needs to be done carefully, step by step.

      So, yeah, I agree that there is often some low-hanging fruit, but hiring an expert to lead the optimization every step of the way is definitely a wise decision for large e-commerce and media sites.

  12. Love the article. SpriteMe is really interesting, nice job. Thanx.

  13. Chris Heilmann on Smashing! Glad to see you write here… =]

  14. Excellent article… cheers to SM.

  15. Very useful information, thank you dude!

  16. Great stuff. I needed an overview :-)
  17. I think one of the most important things is the thing with the requests! We are at a point where the request overhead sometimes takes more time than the resource being loaded!

      And this is something the Internet itself cannot resolve with its speed ;).

      I like it when I get all those green points in PageSpeed and YSlow :).

  18. I haven’t tried it myself yet, but “Web Optimizer” seems to do many of the tweaks you suggest fully automated. I will definitely have a look at it for my next project.
      http://www.web-optimizer.us/

  19. It ‘burns my cookies’ that your news feed still puts all your ads up front and I have to wade through them to get to your article. Will you ever change this or should I just discontinue subscribing??? I subscribe to dozens of other feeds and yours is the only one doing this. Please be more considerate.

      thx, Art

  20. Great post, thanks!

      What I’d really like to know is at which point Google will consider the page loaded for the sake of its rankings. Will that be HTML only? Or HTML+CSS? Or HTML and all files referenced in the document? Will it follow @import directives in CSS or run JavaScript that inserts more script tags to load further scripts?

      You could potentially cheat the system by using small ‘loader’ files for larger CSS and JS files if it didn’t check that.
  21. The Sprite article link (“extensively here before”) points to the wrong website. I guess it should be:
      http://www.smashingmagazine.com/2009/04/27/the-mystery-of-css-sprites-techniques-tools-and-tutorials/

      Excellent article, thanks!

  22. Yes, image sprites are helpful and I use them regularly.

      HOWEVER (and this is a big one)

      All of that extra space used by your image sprites can and will send your browser’s memory usage through the roof if you aren’t careful. If you are working on a memory-sensitive site/application, keep this trade-off in mind!

  23. No word about automated tools for website performance optimization like Aptimize or WEBO Site SpeedUp – http://www.web-optimizer.us/

  24. Thiago A. Villa Menezes (January 6, 2010): What… A… Wonderful… USEFUL POST!
  25. Nice resources! Thanks for sharing.

  26. For my bachelor thesis, I’ve written File Conveyor.

      File Conveyor is a daemon written in Python to detect, process and sync files. In particular, it’s designed to sync files to CDNs. Amazon S3 and Rackspace Cloud Files, as well as any Origin Pull or FTP Push CDN, are supported.

      Check it out! :)

      Also as part of my bachelor thesis, I wrote the Drupal CDN Integration module, which will come in handy if your site is running Drupal.

  27. Awesome collection! Thanks for sharing.

  28. Vaibhav | Programming Kid (January 6, 2010): Great post. I use a combination of Firebug, Yahoo YSlow and Google Page Speed to rectify page load times and performance issues. I feel they are very strict, and at times the solutions are a bit too much. If you want an A from YSlow, the page would consist of nothing but text – or a replica of Google!

  29. Great article showcasing some awesome tools – thanks for this. I use YSlow for all of my projects as it gives great information on everything. I’ve used sprites on one or two projects but always find that different browsers parse the code for sprites slightly differently and I get cut-off images, etc. It’s a pain to optimize your website, but you’re right – when you have millions of users coming to your website, that’s when you should REALLY worry.

      Until then, don’t have a billion animated GIFs on your website – take it easy on the people still visiting your website – but you don’t necessarily have to take all these extra steps to fully optimize it, because the server load won’t be as huge a part of the equation.
  30. Thierry Koblentz (January 6, 2010): In the “Use Network Distributed Hosting” section, I’d add that the maximum number of simultaneous requests is per server, so hosting assets on different servers should increase parallel downloads. And what about hosting static assets on cookieless domains? The Google tool mentions both techniques (http://code.google.com/speed/page-speed/docs/request.html).

      @gLaNDix: using conditional comments with external style sheets for IE increases the number of HTTP requests, so I don’t see how that would help…

  31. My brain just exploded. This article is one of the most useful I’ve seen. Thanks!

  32. Heya Chris…

      Great article!

      I am kinda surprised you didn’t mention data: URIs in this post. A data URI is a great way to turn all your images into one HTTP request, since the images are encoded into the CSS file.

      I have found that it reduces HTTP requests drastically and works great with modern browsers (unfortunately it doesn’t work with IE6 etc.).

  33. Yes, a useful article indeed.

      It’s funny, because just this morning I was reading the final chapter of the Smashing Book, where it says you 2 guys never compromised on the quality and size of the images in the SM articles, and funnily enough just a few hours later I see (myself) for the first time images with bad quality :)

      However, content is important and your content is always good.
  34. Another option for script “lazy loading” (aka “on-demand loading”) is LABjs.

      In fact, LABjs is specifically designed to replace your script tags and load your scripts in parallel in a more efficient fashion while your page loads (though it can also be used to grab scripts on demand at a later time). LABjs loads all scripts in parallel (as much as the browser will allow), but uses some “tricks” to make sure that it executes them in the proper order if you specify that there are execution-order dependencies.

      For instance:

      $LAB
      .script(“framework.js”).wait()
      .script(“plugin1.js”)
      .script(“plugin2.js”,”plugin3.js”)
      .wait(function(){
      // initialize framework and plugins
      });

  35. This is a fabulous post. Very comprehensive. It will benefit me, and others I’m sure, greatly. Thank you for all of your hard work.

  36. “80 to 90% of poor performance happens in the front end”

      As a software engineer who started on the front end and then moved heavily into server-side technologies, I can tell you this is simply not true. Although server-side technologies are used primarily to cache content like HTML, CSS, JavaScript and images, the bottlenecks I see are mostly related to third-party services and too many XHRs tied into a slow back end. Buffering, caching and optimizing DB queries, and optimizing programming patterns are much more efficient at speeding up your Web application than trying to optimize the front end. You’ll get a greater gain in performance, which doesn’t excuse lazy client-side scripting and markup, but I would focus more energy on letting creatives/devs do what they do best, and let technologies like Memcached (or a service like Akamai) do the heavy lifting when it comes to performance boosts.
  37. Hmmm. Very nice article!!

  38. Can I use lazy loading for scripts that create content? I tried it for a widget on my blog using the method Google Analytics uses (createElement(‘script’)), and that worked, but the problem was that it did not adjust itself well to the existing content: content below the widget was coming up first, and then there was not much space left for the widget. Placement was also a problem.

  39. You’ve pretty much forgotten to mention Firebug’s “Net” tab; it also allows live watching of HTTP requests, headers, loading times, etc. And it does the same job as Google’s tool when it comes to viewing live AJAX calls.

      YSlow + Firebug’s Net tab = 100% win :)

  40. Really one great article!

      Thanks, nice people!

  41. In the section “Use Network Distributed Hosting”, I’m surprised you didn’t mention anything about “Amazon CloudFront”: http://aws.amazon.com/cloudfront

  42. Erm… nothing new, and it’s missing many important things. Not a good article.

  43. Great tuts. Thanks for sharing.
  44. A great read, and a point that should be kept in the mind of every website owner. I find Yahoo’s YSlow works best for me. However, I know that Chris is a Yahoo guy, but could you do a bit less of the obvious promotion? Of course that is only a thought; nothing that hurdles the readability or the awesomeness of the post. :) Cheers

  45. I don’t know if anybody else tested Smashing Magazine in YSlow; the screenshots you provided give it an overall grade of A, mine gives it a C with overall performance at 78. Weird.

      I am currently cleaning up (optimising) our company website, and this article came just when I needed it. Great article. A lot of people come on here to bitch, but not everybody is a web superhero, and it is good to refresh the brain once in a while to get rid of the bad habits. You can still have a creative-looking, image/script-intensive site and still make it perform well.

      Just my two cents anyway.

      Cheers

  46. Great article Chris! I already knew YSlow but will try the others to compare. Btw, I saw you at the DevDays in Amsterdam and the YQL talk was really good as well. Congrats!

  47. Thanks for all that!

      I knew about YSlow and SpriteMe, but it’s still interesting.

      For YSlow on Smashing Magazine, I got a C with 79…
      … and “Grade E on Reduce the number of DOM elements: 1741 DOM elements” – shame on you :p

  48. Thank you for posting such great resources. I was actually in the process of optimizing a site the other day and these will definitely help me in my endeavor.
  49. Excellent article. Too bad this site chooses to ignore the needs of those of us who would rather share articles via email.

  50. I had no idea about SpriteMe, Smush.it or Coral – I’m knocked out. Thanks.

  51. Sprites you will have to do manually (don’t overuse this technique). Then you will have to balance the number of HTTP requests. After that you will have to optimize your DOM elements.

      All this after you purchase a good hosting plan :D

  52. Don’t forget to mention that people should give up external JavaScript resources and host their own web analytics software. I often see web pages (even famous so-called tech sites) that take forever to load just because they use some external JS library. Sometimes their sites don’t even load because the other (external) server is down. Not professional. Just to save bandwidth? Absurd!

      Same thing with Google Analytics, for example. Often it’s not available and the page is searching and searching and searching. As an end user I wouldn’t want to wait. Best is to host EVERY single thing on your own server and not try to fetch external content at all.

  53. The article implies that you received an A, but I did the YSlow test on Smashing Magazine and you received a C. It just seems a little misleading, that’s all.

  54. PunyPNG is a really great online image compressor:
      http://www.gracepointafterfive.com/punypng

      It goes above and beyond Yahoo’s Smush.it.
  55. Duncan McDougall (January 7, 2010): I recommend the developer tools in Safari. They give you a cracking breakdown of your page’s elements, showing the order in which files are downloaded, how long they take and with what latency. A real eye-opener into the parallel downloads problem.

  56. Excellent post. I think website performance is one of the most crucial, overlooked parts of web development, and many of the points mentioned here are simple practices that can have a great effect on a site. Good stuff.

  57. Nicely written.

  58. Also, do take a look at the latest in Google Web Toolkit’s arsenal (which can be used with any website, not just GWT-based sites):

      http://code.google.com/webtoolkit/speedtracer/

  59. Eko Setiawan – camp26 (January 8, 2010): Thank you, I got a lot of knowledge from this. I will put it into practice.

  60. Great article! It was the first time I had heard of Smush.it. Fantastic tool.
  61. 61

    Improving Website Performance must be a priority when planning websites. This post is really useful and as a designer I am concerned about speed and usability. Thanks

    0
  62. 62

    Kartlos Tchavelachvili

    January 10, 2010 5:21 am

    Thanks for the resources

    0
  63. 63

    Excellent article!

    As for mobile users, back end developers can use PHP classes to determine the browser type ( as I have done ) and then serve the appropriate content. Determining the browser type client -side is useless, since all the content has to be loaded anyways. I think there is a lot to be said in optimizing web performance by keeping all the content calculating on the back end ( where it belongs ).

  64.

    Zlatan Halilovic

    March 18, 2010 6:25 pm

    I really feel like I’ve just read a book.

    Thank you Christian for writing such an amazing article :)

  65.

    “in-depth knowledge and boring testing and researceh.” spelt research wrong.

  66.

    This is a really interesting article. Some of the things mentioned here (like combining js files and serving images from cookieless domains) can be done on the fly by the site itself, so you don’t have to do it manually each time you make a change. There are packages that you just add to your site and they take care of the performance improvements, such as
    http://www.codeproject.com/KB/aspnet/CombineAndMinify.aspx

  67.

    A presentation on “Performance Related Changes and their User Impact” by the Google and Bing teams reveals that site performance/speed does matter. They delivered a very thorough and comprehensive presentation on site performance. You may visit here to see what they are saying:
    http://ready2seo.blogspot.com/2011/04/site-performance-increase-your-site.html

    Thanks.

  68.

    Related to this great article I have developed a website worth algorithm that you can try here:
    http://www.webuka.com
    Keep up the good work!

  69.

    I tried using a ‘media=mobile’ stylesheet for my website and tested it in Opera Mini on my mobile, but that browser by default uses the ‘media=screen’ one. It only uses the mobile style sheet if you switch to ‘mobile’ browsing in the advanced settings. It seems that for the time being it is best not to rely on ‘media=mobile’ to give mobile users a better experience, but rather to optimise your ‘screen’ style sheet as best you can to support all major browsers and screen sizes.

  70.

    Adding a mobile stylesheet is only one part of the larger goal of increasing the performance of the site. I agree that mobile stylesheets should have been mentioned in the article, but the focus was increasing performance for all users.

  71.

    With a mobile stylesheet, mobile users still need to download all the content and then get it stripped out. This is not good for performance anyway, and bad for users who pay per byte. In terms of performance, it’s less about visual clarity, and more about getting the bytes out there as fast as possible.

  72.

    State.The.Obvious???

    January 6, 2010 1:46 pm

    What a crapload… lets put all the obvious here then… use a css reset, don’t use images, don’t use flash, don’t use gif, don’t think, don’t learn, don’t be creative…

  73.

    @DesignLovr – Ironically, I’ve personally noticed a lot more articles online here semi-recently about optimizing websites … more than I remember seeing in the days of dial-up … and I just got a letter from Cox the other day saying they’re upping their speeds again :) …

    granted, it’s still important! it’s kinda like microsoft adding more and more bells and whistles to Windows since machines are getting faster and faster … just because you have more resources doesn’t mean they need to be wasted! ;) … just imagine how fast sites will be if optimization plays an important role in the development of a site AND internet speeds keep increasing!!! :D

  74.

    That is so not true with YSlow2. My own portfolio: http://icant.co.uk/ gets an A!

  75.

    The sad part is that in some cases it doesn’t matter how fast the broadband gets; that isn’t the bottleneck. More often than not it is the latency of the round trips to the servers that makes pages slow (which is why several JS files are slower than a single one with all of them merged). This makes things like sprites all the more important (though you’ll be able to get away with larger and larger files).

    The latency can’t get any better in a lot of cases as it is constrained by the speed of light over fiber and the only way to reduce it is to move the content closer to the user (aka CDN) or to reduce the impact by eliminating as many round trips as possible.

  76.

    If you go into the Webmaster Tools section, they have exposed some information about performance which may provide some hints. That data looks to be collected from users with the Google Toolbar and appears to cover everything up to the onLoad event firing (full page loaded, including HTML, CSS, JS and images, and from real users, so no cheating for a Googlebot).

  77.

    When it comes to rankings, you’re competing against sites with similar content and presumably similar audiences. So if your site attracts a lot of users with 56k modems, and your page takes 10 seconds to load because of that, then competing sites are probably in the same situation. I agree that it’s not perfect, though.

  78.

    Oh really? Well, that’s clever, but now that leaves me thinking about sites whose users are more likely to be on slower connections. I dunno… sites about modem troubleshooting or something. Will they be adversely affected in their PageRank, even if their server is blisteringly fast to a broadband user?

  79.

    @Mark, with all of the Ph.D.s at Google, I bet they considered that people with slower connections will download more slowly. And rather than taking only the download time from the user, they would take the file sizes and response times from the server as well to make educated estimated speed rankings… at least I would hope so…

  80.

    And there you answered it yourself – the IE forking is a pain with data URIs (you can send mhtml to IE to get the same for the whole document – explained here: http://www.phpied.com/data-uris-mhtml-ie7-win7-vista-blues/). I am not a big fan of forking for browsers so I did not add it yet. This may not be the last performance article here, so we can come back to that :)
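    A minimal sketch of the data-URI technique itself; the base64 string is the well-known 1×1 transparent GIF, used here as a stand-in for any small image:

```javascript
// Inline a tiny image as a data URI so it ships inside the stylesheet or
// markup instead of costing its own HTTP request. The bytes below are a
// 1x1 transparent GIF, hard-coded for illustration.
const base64Gif = "R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7";
const dataUri = "data:image/gif;base64," + base64Gif;

// Usable directly in CSS, e.g.:
//   background: url("data:image/gif;base64,R0lGODlh...");
```

    The trade-off is that the inlined bytes are not cached separately from the CSS, so the technique pays off mainly for small, frequently-used images.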

  81.

    Smashing Editorial

    January 6, 2010 11:18 am

    Please understand that we actually need these ads to keep this magazine alive. To make the publication of this article possible and make it available for free, we need resources and financial support. This is why we have ads – actually, we would love to get rid of them, but we can’t, because otherwise Smashing Magazine won’t be viable.

  82.

    Times change, load times get shorter :)

  83.

    Smashing Editorial

    January 6, 2010 11:21 am

    Thank you, the link was updated!

  84.

    I am a hacker; I didn’t even know automated tools exist :) At Yahoo we train engineers to do the right thing from the start, which saves time and money in the long run.

  85.

    That hurts. I am tired of the argument “you are an xyz guy, why do you talk about products from company xyz”. Well, because when you work at xyz, that is what you work with and know, and you should only write about things that you have used and verified for yourself. We have enough “look at this awesome widget” or “this tool does all the work for you, you don’t need to know anything” articles, and IMHO this is why the web is the mess that it is now.

    If you watch any of Steve Souders’ presentations on performance – and he is the main expert in the field – you will see that he mentions all the Yahoo products I mentioned, too, and he now works for Google – who the world always sees as our biggest rival.

    In this article I mentioned Google’s page speed, Google’s SpriteMe, AOL’s product and explained the main issues of performance. I could have made a walk-through of the performance section of the Yahoo Developer Network and told you this is where all the information is and omitted the rest of the resources. Then and only then you’d have the right to claim “obvious promotion”. The same goes for the YUI configurator – I could have explained Dojo’s packing system or the MooTools configurator, but I am very happy that YUI can be retrieved from Yahoo’s *and* Google’s CDN, something I wanted to point out – companies work together on this.

    This is a very common problem in web design/development and blogging: we are happy to complain about big companies not following the same standards and best practices that we consider necessary to get our job done, but when they do, and they offer information and products, the first response of the “geek in us” is that they are a big company and evil and will probably only try to rip us off. Ever wondered why a lot of companies never bother to show up at web design conferences and share their findings?

    This makes my job as a standards evangelist a lot harder, as I need to convince people in the company that giving information and research results to the community is a great idea, because we can help the market as a whole become better. Feedback like this is a killer argument for the company to tell me to focus on the inside instead.

  86.

    Enlighten us. :)

    gzip, data URIs, multiCurl, memcaching… I might write an article about those, but as an overview of performance I consider them too “code” for Smashingmag.

  87.

    I strongly disagree. Chris did an outstanding job and I appreciate all of his hard work. Thank you, Chris!

  88.

    Since the comment was taken in a light it was never meant in, I would request that you kindly remove it. I understand what you said in your reply and agree completely. I mentioned that I myself love YSlow (I guess you missed that).
    Talking about Yahoo is only natural for a guy at Yahoo, just as talking about Google is natural for Eric Schmidt. I know that.
    I talked about the usability and the awesomeness of the post, and yet you happened to miss it all and pick out one line I said about you promoting Yahoo. Of all the things I said, you picked the one negative issue.
    To end my rant, kindly remove my comments from this article.
    As always, I will be an ardent reader of Smashing Mag and say stuff when I like it and say it out loud when I do not. If that hurts the author, they are honestly requested to delete the comment.

  89.

    Relax, Inspiring Pixel (if that is your real name): Chris was just defending himself against the claim of self-promotion, regardless of your intent. Many others may have had the same impression, so he’s entitled to address the issue without any harm or disrespect to your comment.

    Chris, great job on this. Definitely a lot here that’s beyond my current knowledge, so I’ll have to delve deeper into the material one day when any performance issues come up.

  90.

    Agustin Amenabar

    January 7, 2010 6:53 am

    That depends on how the code is written. If it is properly structured, separating content from layout, a separate style sheet for mobile is the way to go. Most times the biggest images are used for backgrounds, so if they are not present in the mobile CSS, the mobile browser will not request them, because they are not in the style sheet it is rendering.

    Also, you must keep in mind the strengths of your server. The servers I work with are usually very far away from the public that visits the sites, so latency is high, but download speed is excellent once started. So I try to keep the requests down and work with larger files.

  91.

    Agustin Amenabar

    January 7, 2010 6:57 am

    ‘media=mobile’ and ‘media=handheld’ do not work properly; one must employ much more complicated detection systems, especially for modern smartphones, but once implemented you have huge control over what to show to whom.

  92.

    Yes and no. I agree with Google Analytics having been very slow (they fixed that lately when they changed it) but I disagree strongly with hosting JavaScript libraries yourself as the best solution for everybody.

    Getting YUI from Yahoo’s servers, or any other library via the Ajax Libraries API from Google, will deliver the library files from a server geographically closest to the end user, packed and optimized in all the ways we can manage. If this is slow in your neck of the woods, that is one thing, but it doesn’t make self-hosting a best practice. The other benefit is that if a user has been on a site that uses this library before, it will already be cached for them instead of loading – let’s say jQuery – for every single site you go to. What drives me crazy is when people use 5 different libraries because they like one widget in each. But that is another article that is actually… well, I am not saying anything.

  93.

    Well, in fact you missed one big thing: specify sizes for block elements (img, div…) to prevent browser reflows.

    A browser reflow happens when the browser has to change the layout because, for instance, it didn’t know the size of an image; once the image loads, it has to rearrange the whole layout to make it fit.

    When this happens, it really hurts the user. It’s just like the page is telling him: “don’t use me yet, I’m still loading”, even if what’s loading is not absolutely required.
    It can also lead to click errors if the reflow happens right when the user is clicking.

  94.

    Good points, Joey. These are perceived-performance issues, which is a topic worthy of its own article. Maybe I will interview some people about it.

  95.

    Ah now I get it :) Actually the screenshot of YSlow was for http://icant.co.uk and not for smashingmag.

  96.

    Marco’s post does conjure up a few good questions/points:
    1- If you (the web developer) use Google Analytics on a website and the analytics servers are slow, subsequently reducing the performance of your site, does Google penalize you for poor performance even though the performance is impacted, in part, by their own service?

    2- If you use YUI or another library cache service, and the server determined to be closest to you is unresponsive, does the service then fail over to the next closest? If so, a) how many failovers until it fully fails, and b) is this still faster than downloading the entire library from the original host server?

    3- Unless I am misunderstanding how YUI and other such services work, wouldn’t there be at least one, if not more, added DNS queries to resolve the library servers? Even so, I would imagine there was ample benchmarking to determine that, all else being equal, the cached library service is faster than the standard alternatives.

    4- Finally, I think it is important to keep in mind that a “best practices” article such as this one is no replacement for critical thinking and research skills. Mr. Heilmann, and any other authority in any other industry, can only present tools and ideas that should/can be considered for a web project. I can think of very few instances where we can say, “it is ALWAYS better to do … x, y, & z”, because what we choose to employ should be dictated by the needs of the project.

    BTW, the article was helpful and appreciated.
