
How To Automate Optimization and Deployment Of Static Content

A lot of traffic between users and your site comes from the static content you’re using to set up the user interface, namely layout graphics, stylesheets and JavaScript files.

This article shows a method to improve the delivery of static content for a web platform. Further, it will show you a way to automate the deployment of these files, so you can deliver them with the least effort but with maximum performance.

This tutorial will take some time to set up, but it’s going to save you hours of work in the future and will improve your page speed significantly.


1. Why do I need this?

There are several approaches to optimizing the delivery of content.
Some use on-the-fly compression via the server itself or a scripting language, which costs performance and does not optimize the structure and content of the files.

The method shown here prepares the files once: it merges and optimizes the code of the CSS and JavaScript files before they are compressed, which makes their delivery even faster.

  • Most browsers download only two files from one host at once. If a page needs to load more files from one domain, they get queued.
  • More files to transfer mean more requests to the server, more traffic and more load on the server. For the users of the platform, this means longer loading times.
  • The more steps you need to deploy these files, the more room for mistakes there is.
  • Deployment is boring. It’s far more exciting to invest some time once to set up a reusable automation than to waste time doing the same copy/paste/upload actions over and over again.

Compare these two screenshots, which show the same content before and after the optimization.
The apricot-colored parts of the bars stand for the status “in queue” while loading the page.

Before optimization
The “in queue” status means nothing less than wasted time for the user.

In this example, the loading time was reduced by 33%, the transferred data size was reduced by 65% and the number of requests to the server even by 80%.

After optimization
By using CSS sprites and merged CSS and JavaScript files, there is no queue for loading the basic static content.

2. How to improve your static content

Besides caching, there are some principles to make the whole setup of static content more efficient right from the start of development.

  • Use CSS sprites for your layout graphics. This not only saves you a lot of traffic and loading time; once you are used to maintaining your graphics like this, you’ll notice that it can be much more comfortable to have your layout elements in, e.g., a single Photoshop file.
  • Don’t blow up the loading time with your CSS and JavaScript files. Combine the files of each kind into one single file, minify them (e.g. by removing line breaks and other unnecessary characters) and compress them using gzip to make the payload even smaller.
    This is covered in this tutorial.
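To make the idea of minification concrete, here is a deliberately naive Ruby sketch. It is not the tool used later in this tutorial (Juicer and the YUI Compressor do this far more thoroughly and safely); it only illustrates what “removing unnecessary characters” means for CSS:

```ruby
# Naive CSS minifier for illustration only: strips comments,
# collapses whitespace and trims space around punctuation.
def minify_css(css)
  css.gsub(%r{/\*.*?\*/}m, '')        # remove /* ... */ comments
     .gsub(/\s+/, ' ')                # collapse runs of whitespace
     .gsub(/\s*([{};:,])\s*/, '\1')   # trim space around punctuation
     .strip
end
```

For example, `minify_css("body {\n  color: red; /* note */\n}")` yields `body{color:red;}`. A real minifier also handles strings, url() values and edge cases this sketch ignores.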

3. Automating the deployment using Ruby

This automation is written in the Ruby language, so you need to have Ruby installed on your development computer. The installation is pretty simple on Windows, and Mac OS X even ships with Ruby.

Please keep in mind: the scripts are completely independent of the language you’re using to run your site! You can also use them to deploy content for, e.g., a PHP project.

If you’re not used to working on the command line, the next steps might look cryptic to you. Don’t hesitate: clench your teeth and take care that you always pass the correct paths of the files to deal with. You only have to go through this once. When you’re finished, all you need to do is type one single command to start the whole process.

To make it easier for you, this is the folder structure used for this tutorial:

+ css/
  - fonts.css
  - grids.css
  - layout.css
  - reset.css
  - static.css
  - static.min.css (generated)
  - static.min.css.gz (generated)

+ js/
  - framework.js
  - gallery.js
  - plugin1.js
  - plugin2.js
  - start.js
  - static.js
  - static.min.js (generated)
  - static.min.js.gz (generated)

+ deploy/
  - batch
  - ftp.rb
  - gzip.rb

index.php (or something similar)

Requirements for this tutorial

Don’t worry: besides Ruby, you won’t need anything that you probably don’t already have:

  • A simple text editor to save the source code.
  • A command line tool, like Windows’ cmd.exe or Mac OS X’s Terminal, to call the scripts.
  • An FTP account to try out the upload script.
  • A project to enhance.
  • Optional: the Firefox add-on Firebug (or something similar for other browsers), to see the enhancements afterwards.

Juicer

Christian Johansen created a tool called Juicer, which enables you to merge and minify CSS and JavaScript files. To install it on Windows, simply type

gem install juicer

in the command line. When you’re using Mac OS X, use

sudo gem install juicer

After the successful install, Juicer will ask you to extend it with YUI Compressor and JSLint for JavaScript compression and verification. You do this by typing

juicer install yui_compressor

and after that

juicer install jslint

in the command line.

Preparing your files

The good thing is, you don’t have to change anything in your present files. But to make sure the files are merged in the order you want, you need to set up two additional files.

Let’s assume you have four CSS files and five JavaScript files:

<link rel="stylesheet" href="./css/reset.css" type="text/css" media="screen" />
<link rel="stylesheet" href="./css/fonts.css" type="text/css" media="screen" />
<link rel="stylesheet" href="./css/grids.css" type="text/css" media="screen" />
<link rel="stylesheet" href="./css/layout.css" type="text/css" media="screen" />

<script type="application/javascript" src="js/framework.js"></script>
<script type="application/javascript" src="js/plugin1.js"></script>
<script type="application/javascript" src="js/plugin2.js"></script>
<script type="application/javascript" src="js/gallery.js"></script>        
<script type="application/javascript" src="js/start.js"></script>

Create a new JavaScript file called static.js with the following content (Juicer reads the @depends directives from a comment block):

/**
 * @depends framework.js
 * @depends plugin1.js
 * @depends plugin2.js
 * @depends gallery.js
 * @depends start.js
 */

After that, create a CSS file static.css with this content:

@import url("reset.css");
@import url("fonts.css");
@import url("grids.css");
@import url("layout.css");

Now you’re ready to run Juicer in your command line.

For Javascript:

juicer merge -i --force ./js/static.js

The parameter -i means that the merging process won’t be cancelled if JSLint finds errors in your JavaScript code. The parameter --force means that older versions of the minified file will be overwritten.

For CSS:

juicer merge --force ./css/static.css

As a result you will see two newly generated files named static.min.js and static.min.css. You probably want to know if they still work for your site, so go ahead and test them by replacing the old bunch of link and script tags in your HTML header with the two new ones.

GZip Compression

When your minified files work fine, you can go on to compression. If you get JavaScript errors or your CSS layout looks weird, you should re-check the order of the files to merge.

Below you see a small Ruby script that saves a gzipped copy of a file.

require 'zlib'

file_to_zip = ARGV[0]

puts "Gzipping #{file_to_zip}..."

file_name_zip = "#{file_to_zip}.gz"

# write the gzipped copy next to the original file, 'wb') do |f|
  gz =
  IO.foreach(file_to_zip) { |line| gz.write(line) }
  gz.close
end

puts "Gzipped version saved as #{file_name_zip}"

Save it as gzip.rb and call it like this:

ruby gzip.rb ../js/static.min.js

When you look at your folder, you’ll notice a new file called static.min.js.gz. Now do the same for the CSS file:

ruby gzip.rb ../css/static.min.css

Important: make sure that your server provides the right content encoding and content type information to the clients, so they understand that these files are gzipped content.
You can do this in many ways. For Apache, an example for the .htaccess file would be:

<FilesMatch "\.js\.gz$">
  AddType text/javascript .gz
  AddEncoding x-gzip .gz
</FilesMatch>

<FilesMatch "\.css\.gz$">
  AddType text/css .gz
  AddEncoding x-gzip .gz
</FilesMatch>

FTP Upload

You’re on the home stretch. Now you have all the improved files together and need to upload them to your host. Again, here’s a little Ruby script that does exactly that for you.

require 'net/ftp'

ftp_host = ARGV[0]
ftp_user = ARGV[1]
ftp_password = ARGV[2]

localfile = ARGV[3]   # e.g. "../js/static.min.js.gz"
remote_dir = ARGV[4]  # e.g. "www/js"

ftp =, ftp_user, ftp_password)

puts "FTP - Status: #{ftp.status}"

puts "FTP - Go to directory: #{remote_dir}."
ftp.chdir(remote_dir)

puts "FTP - Uploading file: #{localfile}..."

files = ftp.list
puts "FTP - Your file was uploaded here:"
puts files

ftp.close


Save it as ftp.rb, put in the credentials of your host, change the destination directory if necessary, and then call it like this ( here stands for your FTP host, admin/mysecret for your credentials):

ruby ftp.rb admin mysecret ../js/static.min.js www/js

Which stands for:
ruby ftp.rb ftp-host ftp-user ftp-password file-to-upload ftp-destination-folder

The minified Javascript file should be uploaded now. Repeat this for the CSS file and the gzipped versions of both.

Please note that the folders on the FTP server that the files are copied to must already exist.
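If you don’t want to create those folders by hand, you could extend the upload script. The helper below is a hypothetical addition (it is not part of the script above): it walks through each level of the remote path and tries to create it, ignoring the permanent-error reply an FTP server sends when a directory already exists.

```ruby
require 'net/ftp'

# Hypothetical helper: list every intermediate level of a remote path,
# e.g. "www/js" -> ["www", "www/js"].
def intermediate_dirs(path)
  parts = path.split('/')
  (1..parts.size).map { |i| parts[0, i].join('/') }
end

# Try to create each level; an "already exists" reply raises
# Net::FTPPermError, which we deliberately ignore here.
def ensure_remote_dirs(ftp, remote_dir)
  intermediate_dirs(remote_dir).each do |dir|
    begin
      ftp.mkdir(dir)
    rescue Net::FTPPermError
      # directory exists (or cannot be created) - continue
    end
  end
end
```

You would call `ensure_remote_dirs(ftp, remote_dir)` right after `` and before changing into the directory.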

Putting it all together

You’ve got all the pieces of this puzzle, and now you can finally put them together by grouping the commands in a batch file. On Windows it would look like this:

@echo off

echo -------- MERGING JS FILES --------

call juicer merge -i --force ../js/static.js 

echo -------- FINISHED MERGING JS FILES --------

echo -------- MERGING CSS FILES --------

call juicer merge --force ../css/static.css

echo -------- FINISHED MERGING CSS FILES --------

echo -------- COMPRESSING FILES --------

ruby gzip.rb ../js/static.min.js

ruby gzip.rb ../css/static.min.css

echo -------- FINISHED COMPRESSING FILES --------

echo -------- UPLOADING FILES --------

ruby ftp.rb admin mysecret ../js/static.min.js www/js

ruby ftp.rb admin mysecret ../js/static.min.js.gz www/js

ruby ftp.rb admin mysecret ../css/static.min.css www/css

ruby ftp.rb admin mysecret ../css/static.min.css.gz www/css

echo -------- FINISHED UPLOADING FILES --------

Save it in a file called e.g. batch.bat, insert your FTP host and credentials and call it like this:

cd deploy
batch.bat

On Mac OS X you can do it like that:

#!/bin/sh
echo -------- MERGING JS FILES --------

juicer merge -i --force ../js/static.js 

echo -------- FINISHED MERGING JS FILES --------

echo -------- MERGING CSS FILES --------

juicer merge --force ../css/static.css

echo -------- FINISHED MERGING CSS FILES --------

echo -------- COMPRESSING FILES --------

ruby gzip.rb ../js/static.min.js

ruby gzip.rb ../css/static.min.css

echo -------- FINISHED COMPRESSING FILES --------

echo -------- UPLOADING FILES --------

ruby ftp.rb admin mysecret ../js/static.min.js www/js

ruby ftp.rb admin mysecret ../js/static.min.js.gz www/js

ruby ftp.rb admin mysecret ../css/static.min.css www/css

ruby ftp.rb admin mysecret ../css/static.min.css.gz www/css

echo -------- FINISHED UPLOADING FILES --------

Save it in a file called e.g. batch, insert your FTP host and credentials and call it like this:

cd deploy
sh batch

And the content gets merged, minified, compressed and uploaded in one rush.
In addition to the CSS and JavaScript files, it would be a good idea to upload the graphics file for the CSS sprites too. Just add an additional call of the upload script in the batch file; it can be used with any file type.
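If you’d rather maintain a single cross-platform script instead of one batch file per operating system, the same sequence can be sketched in Ruby itself. The host, user and password below are placeholders, just like in the batch files:

```ruby
# deploy.rb - hypothetical cross-platform alternative to the batch files.
# "" / "admin" / "mysecret" are placeholders for your own host.
STEPS = [
  "juicer merge -i --force ../js/static.js",
  "juicer merge --force ../css/static.css",
  "ruby gzip.rb ../js/static.min.js",
  "ruby gzip.rb ../css/static.min.css",
  "ruby ftp.rb admin mysecret ../js/static.min.js www/js",
  "ruby ftp.rb admin mysecret ../js/static.min.js.gz www/js",
  "ruby ftp.rb admin mysecret ../css/static.min.css www/css",
  "ruby ftp.rb admin mysecret ../css/static.min.css.gz www/css"
]

def run_steps(steps)
  steps.each do |cmd|
    puts "-------- #{cmd} --------"
    system(cmd) or abort("Step failed: #{cmd}")  # stop at the first error
  end
end

# run_steps(STEPS)   # uncomment to execute all steps
```

A side benefit over the batch files: `system(cmd) or abort(...)` stops the whole run as soon as one step fails, so a broken merge never gets uploaded.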

4. In your template

Further development and bug fixing are hard to do in a minified file. But you can add an if clause to your templates to vary the referencing of your files, depending on your environment.

Take the original files for your local development and the minified ones for your online version. Additionally, you should check whether the user’s browser supports gzipped content (most modern browsers do).

An example for PHP:

if ($_SERVER["SERVER_NAME"] == "localhost") {
  // local development server - load all files
} else {
  if (substr_count($_SERVER['HTTP_ACCEPT_ENCODING'], 'gzip')) {
    // production system - client accepts gzip
  } else {
    // production system - client does not accept gzip
  }
}
And an example for Ruby on Rails:

<% if RAILS_ENV == 'development' %>
  <%# local development server - load all files %>
<% else %>
  <% if self.request.env['HTTP_ACCEPT_ENCODING'].match(/gzip/) %>
    <%# production system - client accepts gzip %>
  <% else %>
    <%# production system - client does not accept gzip %>
  <% end %>
<% end %>

5. Get even faster with subdomains

As mentioned before, most browsers only download two files per host simultaneously. You can bypass that rule by creating some subdomains for your static content and referencing the files through them.

For example, create a subdomain on your web hosting account called, say,, which points to the css folder of your site.

Do this for all of your folders containing static content and use the subdomains to reference your resources.

This way, the rule won’t be applied and all files can be downloaded at the same time.

6. Conclusion

If you’ve got all of this working for one of your projects, you have surely recognized the advantages for you as a developer and for the users as the consumers of your content.

These examples don’t claim to be perfect. But if you haven’t yet tried to automate recurring tasks in your daily workflow as a developer, you hopefully have an idea now of what that could look like.

Always keep in mind that computers were invented to spare you the boredom of repetitive, simple tasks. So save your time for more important things.

Of course, this doesn’t have to be the end of the line. If you’re interested in the automatic deployment of whole applications, you should have a look at Capistrano. Again, it’s written in Ruby, but you can also use it for PHP platforms.

Further Resources



Christian Bäuerlein recently obtained his degree in Media System Design. He is passionate about Web development and creating all kinds of projects with JavaScript, PHP, Ruby and Java. You can follow him on this journey through his Twitter profile.

  1. 1

    Heh. Writing an article on optimization and failing to notice the JavaScript bottleneck. Good job. You should move scripts to the bottom of the page to avoid scripts hindering the download of other files.
    Furthermore, if you’re going to advocate a technique, why not actually reference it by name? Using subdomains for content is called a Content Delivery Network (CDN).

  2. 2

    This looks interesting, let me give it a try.

    Thanks :)

  3. 3

    In my view it’s very important to optimize the delivery of contents. Also i like the Atomization technique. Nice tutorial Christian, Thanks for sharing.

    DKumar M.

  4. 4

    Armig Esfahani

    July 19, 2009 1:11 am

    pretty interesting… thank you for the post..

  5. 5

    I agree with most, except the sub domains.

    Creating sub domains will increase the number of parallel downloads, but reduce the speed of DNS lookups. The more domains the more lookups.

    Most modern browsers do 6 to 8 parallel downloads. IE7 does 4.

    Therefore, depending on your visitors, you might be better off using one domain for every 6 http requests. You should not count the html file, since it will need to be downloaded by itself first, and then tell the browser what else is needed.

    Also your css images will not start downloading until the CSS file has been downloaded by the browser. Most large websites (yahoo, google) use inline CSS, because it reduces http requests and speeds up the image downloads.

  6. 6

    Emil Bonsaksen

    July 19, 2009 1:20 am

    Great article and great tips on optimizing!

  7. 7

    Very interesting stuff. I’m going to give this a shot first thing Monday. Thanks!

  8. 8

    There is a project called Web Optimizer that aims to automate all client-side improvements for websites. The main features are: very simple installation on the server, compression of the website’s resources, support for some popular CMS/CMF systems (e.g. Joomla, Drupal, WordPress, Symfony, …) and much, much more.

  9. 9

    Well, i always use gzip and Inline css and JavaScript. I just file_get_contents(…); them with PHP. And I use sprites ofcourse…

  10. 10

    Nice Tutorial. Greets from MSD =)

  11. 11

    great article, this is what exactly i’m going to do next week :)

  12. 12

    Great and useful tips… thanks

  13. 13

    sudo gem install juicer
    Building native extensions. This could take a while…
    ERROR: Error installing juicer:
    ERROR: Failed to build gem native extension.

    /System/Library/Frameworks/Ruby.framework/Versions/1.8/usr/bin/ruby extconf.rb install juicer
    can’t find header files for ruby.

    Gem files will remain installed in /Library/Ruby/Gems/1.8/gems/hpricot-0.8.1 for inspection.
    Results logged to /Library/Ruby/Gems/1.8/gems/hpricot-0.8.1/ext/hpricot_scan/gem_make.out

    Installing Juicer didn’t work on Mac OS X 10.5.7 – any ideas? I’m totally new to Ruby.

  14. 14

    @Frederic: Two possible solutions:
    – Install XCode like mentioned here

    – Try to update Ruby Gems with this command: gem update –system – see

  15. 15

    Sunny Singh

    July 19, 2009 3:16 am

    GZIP and sprites are definitely the easiest to do, for me at least.

    Not sure about inline CSS, since it can’t get cached but in most cases (page specific css) can go between style tags. Also I have one main css file “uecore.css” so everything is referenced from one file such as resets and template styles.

    This form should really have a website input by the way, most blogs do it and it’s useful, why not smashing magazine?

  16. 16

    Oh god what is the world coming to? RoR in Smashing Magazine Examples without PHP examples. *sigh*

  17. 17

    Jasmin Halkić

    July 19, 2009 3:42 am

    Nice tips, thanks!

  18. 18

    very helpful! Thanks!

  19. 19

    Oliver Schrenk

    July 19, 2009 5:41 am

    Nice article, but you should rely on rsync instead of f t p. It offers synchronizing over a secure connection and best of all only syncs what is necessary, which is helpful if you have big projects.

  20. 20

    Hi Andy,
    your remark about the JS bottleneck is absolutely correct. But please mind that this article is not a complete round-up of all optimization techniques; it just shows an optimization method for the file contents itself. Further possibilities (like the placement of JavaScript, expiring headers and ETags) can be found via the section “Further Resources”, e.g. the Yahoo Best Practices.

    A CDN usually depends on a distributed system, not only on a subdomain.

  21. 21

    As far as I know, subdomains can’t prevent the browser from queueing, because the content of both subdomains is coming from the same server. Browsers load two files simultaneously from the same server, not from the same domain.

  22. 22

    Why are you doing this manually? Rails can do this out of the box for you during runtime.
    Well the css and js part that is.

  23. 23

    This article is really interesting! I definitely want to give it a try. Thanx!

  24. 24

    You’re right, but AFAIK Rails only merges the CSS/JS files; it does not optimize their content.

    I personally call this batch file as a callback of an SVN commit, so it is automated within my workflow and my server doesn’t have to bother about this.

  25. 25

    Fantastic article and very practical. I have a large site that I can’t wait to optimize this week and better improve its performance and responsiveness. Thanks for posting such a very applicable solution to this topic, Christian!

  26. 26


    This is a great post. I’m going to try these Ruby scripts on a project I’m working on right now. Previous methods of optimization seemed so tedious and manual but this has actually created a nice workflow. Also I’ve heard of Capistrano but only thought it was for Rails deployments. But I’m going to take a look at that as well.

  27. 27

    Great and interesting article. But i think that for deployment capistrano is way better.
    Thanks :)

  28. 28

    Wonderful! That can help improve my PHP project. Thank you.

  29. 29


    July 19, 2009 7:59 pm

    Which is better – creating real subdomains or using .htaccess to rewrite a directory into a subdomain ( into I would assume the latter is a bit easier to maintain, but not sure about the performance.

  30. 30

    Am I the only one who couldn’t install yui-compressor by default? I had to edit the installer script to go to the new site to grab it from there.

  31. 31

    For those of you looking for yui compressor you can use tinyoptimizer, it uses the same engine but gives you a nice windows interface. Handles js and css and also has a png crusher in it.

    As for the article above I would like to also recommend that people use an alternate domain and not just a sub domain. Depending on your cookie structure you can still pass cookies for static files on the same domain. If you use a completely different domain cookies will not be passed with the files. For more info on this topic

    Lastly if you are looking for a cheap CDN without the complication or cost of others. You can try TinyCDN. It leverages Amazon but gives you a nice console to manage your files, gzip and score them for optimization.


  32. 32

    Aaron Peters

    July 20, 2009 12:07 am

    It’s good that SM writes about methods and techniques for speeding up load times.
    However, a few remarks are in place here.

    1) Versioning
    This article misses out on one very important and relevant topic: Expires Headers.

    As per the Yahoo! Performance Rules (link), you should set far future Expires Headers on static content that does not change often, like CSS and JS files.
    As a result, visitors who return to the site will not have to request these files from the server, because they are loaded from the browser cache.
    When you need to change the JS file (or CSS), you simply upload it with a different file name (e.g. combined.v2.1.js) and change all the references to the JS file (the links in the HTML).

    Don’t use ?v123456 to apply versioning, because some proxies like Squid don’t cache files with parameters.

    2) Gzip compression
    Site owners who have access to the Apache config should best use mod_deflate (or mod_gzip for Apache <2), and not .htaccess.
    Link tip: Steven Stoyanov explains these and other Gzip serving methods for those on shared hosting on Sitepoint.

    – IE7 does 2 parallel downloads per host, not 4 (note: IE7 actually does 4 on HTTP 1.0, but most servers use HTTP 1.1)
    – Yahoo and Google indeed use inline CSS, but Yahoo at least uses external CSS too.
    The rationale is that the inline CSS speeds up the initial load time, and the external CSS is loaded after the full page is loaded, so users will have that CSS in the browser cache and ready for subsequent pages.

    The maximum # of parallel downloads is per domain, not per server.
    Read the Steve Souders’ blog post for a nice roundup and details.

    If you want to have your site visitors enjoy the benefits of maximum parallel downloads, you need to use real subdomains. See the above mentioned points and links about parallel downloads.

    • 33

      Hiren Khambhayta

      March 31, 2011 3:04 am

      thanks for your further guideline on helen’s query as I had the same question.
      Thanks any way

  33. 34

    @ArneTR: No it is not. The mentionend article is about dynamic file merging/compressing in php.

    This article here is about optimizing/deploying the contents of static files with automated scripts and uses completely different techniques to achieve this.

  34. 35

    I think having a gzipped version of a CSS and JS is a pretty wrong practice. Apache, Lighttpd and IIS also have a built in gzip compressor. The webserver will cache the gzipped content and this way you can forget about the compression at all. Your webserver will take care of it.

    Apache configuration:
    AddOutputFilterByType DEFLATE text/css text/javascript application/x-javascript application/javascript

    With static HTML content or if you can accept some overhead you can add some more mime types: text/html text/plain text/xml

    This way the HTML markup (and your plain text and XML files) will be gzipped and you can speed up your site more.

    Please note, that you need to turn the apache deflate module on.

    I think this is the right way to do some compression on your static content. And this way doesn’t need any backend code modification and I think it’s safer (you can care about really old browsers in the webserver configuration)

  35. 36

    You’re right, this is another proper solution. But the one shown here has some advantages:
    – Merging of files
    – Optimization of files (e.g. with YUI-compressor)
    – Even validation of js code with JS-Lint
    – You can replace the FTP class e.g. with an Amazon S3 class, to upload it on a S3 bucket. Together with the cloud front service you can maintain a real Content Delivery Network with that (if anyone wants the code of the S3 class, let me know).

  36. 38


    July 20, 2009 1:43 am

    @Aaron Peters: Thanks for your answer. I understand that HTTP requests follow a per-server basis. However, I don’t really get how a browser can tell the difference between two subdomains that point to the same server. If it cannot, then multiple requests should be sent, shouldn’t they?

  37. 39

    what is the tool used to log HTTP GET requests in the screenshots ?

  38. 40

    @ttalbot That’s the Network tab of Firebug

  39. 41

    Josh Minnich

    July 20, 2009 6:41 am

    This is a very good idea. I’ll have to look into this more on my next project.

  40. 42

    Can you provide an example using SSL ftp?

  41. 43

    You can use another FTP lib like or you try this code snippet: I haven’t tested it myself, but you should do something like this:
    – Copy the code from the given URL below require 'net/ftp' (line 1) and replace with (line 10).

  42. 44

    On high traffic sites (1k+ uniques a day) mod_deflate is a better choice than gzip.

    Also Minify ( is definitely worth a look: it combines and minifies all your CSS and JS files on the server when they’re requested and then caches them for future requests. So you get the comfort of easy-to-use files and a logical structure without the negative effects of excess HTTP requests and bulky file sizes.

  43. 45

    Using FTP is definitely a bad practice, please use SFTP instead.

  44. 46

    Even with all this i’d have to go a long way before i could sort out the mess i have to work with. Great article.

  45. 47

    Richard Castera

    July 20, 2009 4:11 pm

    Great Article! Thanks!

  46. 48

    @Chris Great article by the way. The juicer, and the scripts are pretty good solution to automate the css/js deploy process. When you serving your CSS/JS from S3 it’s good a solution to pre-gzip your content. But any other cases I think it’s so much better to let your webserver do this thing.

  47. 49

    The subdomain tip is a huge one! Will mos def use it for my blog to come.

  48. 50

    Jonathan Kupferman

    July 23, 2009 8:48 pm

    Excellent guide; anything that helps people improve their website’s performance is a win for everyone. Aaron pointed out something incredibly important that I think should be stressed: far-future Expires headers. This is generally one of the simplest and easiest improvements, since it usually only requires adding a line or two to a web server config file, and those configs can easily be found online. Expires headers are especially important on sites where users generally visit multiple pages, since on the second and third pages they should be loading most assets from cache. Regardless of how compressed/minified your CSS/JavaScript/images are, it is almost always faster to load them from cache than to re-download them, because of internet latency.

  49. 51

    @Chris Thanks, it’s working now! ;)

  50. 52

    Mathew Davies

    August 4, 2009 7:09 pm

    5. Get even faster with subdomains

    There’s a DNS lookup overhead and may not always work.

  51. 53

    Drupal already combines, compresses, and caches. You only need to add the subdomain trick if you have lots of other files filling the queue.

  52. 54

    @peter: Drupal doesn’t have data:URI and CSS Sprites. Web Optimizer does

  53. 55

    Of course this is proven time and again since this article was authored. My question is how to best handle that with SSL. If you mix secure and unsecure content, browsers warn, and viewers get nervous.

    One doesn’t want to buy SSL certificates for 4 domains (,,, Even if you do with two domains (, and, you are now up to two SSL certificates, and two ip addresses. Is this the cost of doing business, or is there a better way to mix and match SSL with static content?

