
The Ultimate Guide To A/B Testing

A/B testing isn’t a buzz term. A lot of savvy marketers and designers are using it right now to gain insight into visitor behavior and to increase conversion rates. And yet A/B testing is still not as well known as other Internet marketing subjects, such as SEO, Web analytics and usability. People just aren’t as aware of it: they don’t completely understand what it is, how it could benefit them or how they should use it. This article is meant to be the only guide to A/B testing you will ever need.

What Is A/B Testing?

At its core, A/B testing is exactly what it sounds like: you have two versions of an element (A and B) and a metric that defines success. To determine which version is better, you subject both versions to experimentation simultaneously. In the end, you measure which version was more successful and select that version for real-world use.

This is similar to the experiments you did in Science 101. Remember the experiment in which you tested various substances to see which support plant growth and which suppress it? At different intervals, you measured the growth of the plants as they were subjected to different conditions, and in the end you tallied the increase in height of the different plants.

[Image: A/B testing example]

A/B testing on the Web is similar. You have two designs of a website: A and B. Typically, A is the existing design (called the control), and B is the new design. You split your website traffic between these two versions and measure their performance using metrics that you care about (conversion rate, sales, bounce rate, etc.). In the end, you select the version that performs best.
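
To make the traffic split concrete, here is a minimal sketch in plain JavaScript of how a tool might bucket visitors 50/50 and keep the assignment stable across visits. The cookie name "ab_variation" is a made-up placeholder, and a real tool adds weighting, bot filtering and reporting on top of this:

    // Assign the visitor to variation A or B, reusing any earlier assignment.
    function getVariation() {
      var match = document.cookie.match(/(?:^|; )ab_variation=([^;]*)/);
      if (match) {
        return match[1]; // Repeat visitor: keep the variation they saw before.
      }
      var variation = Math.random() < 0.5 ? 'A' : 'B';
      // Persist the assignment for 30 days so repeat visits stay consistent.
      document.cookie = 'ab_variation=' + variation +
        '; max-age=' + 30 * 24 * 60 * 60 + '; path=/';
      return variation;
    }

Persisting the assignment in a cookie is also what keeps repeat visitors from seeing the page change under them, a point we will return to in the do’s and don’ts.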

What To Test?

Your choice of what to test will obviously depend on your goals. For example, if your goal is to increase the number of sign-ups, then you might test the following: length of the sign-up form, types of fields in the form, display of privacy policy, “social proof,” etc. The goal of A/B testing in this case is to figure out what prevents visitors from signing up. Is the form’s length intimidating? Are visitors concerned about privacy? Or does the website do a bad job of convincing visitors to sign up? All of these questions can be answered one by one by testing the appropriate website elements.

Even though every A/B test is unique, certain elements are usually tested:

  • The call to action’s (i.e. the button’s) wording, size, color and placement,
  • Headline or product description,
  • Form’s length and types of fields,
  • Layout and style of website,
  • Product pricing and promotional offers,
  • Images on landing and product pages,
  • Amount of text on the page (short vs. long).

Create Your First A/B Test

Once you’ve decided what to test, the next step, of course, is to select a tool for the job. If you want a free basic tool and don’t mind fiddling with HTML and JavaScript, go with Google Website Optimizer. If you want an easier alternative with extra features, go with Visual Website Optimizer (disclaimer: my start-up). Other options are available, which I discuss at the end of this post. Setting up the core test is more or less similar for all tools, so we can discuss it while remaining tool-agnostic.

You can set up an A/B test in one of two ways:

  • Replace the element to be tested before the page loads
    If you are testing a single element on a Web page—say, the sign-up button—then you’ll need to create variations of that button (in HTML) in your testing tool. When the test is live, the A/B tool will randomly replace the original button on the page with one of the variations before displaying the page to the visitor.
  • Redirect to another page
    If you want to A/B test an entire page—say, a green theme vs. a red theme—then you’ll need to create and upload a new page on your website. For example, if your home page is example.com/index.html, then you’ll need to create a variation at another URL, such as example.com/index-b.html. When the test runs, your tool will redirect some visitors to the alternate URL. A sketch of both approaches follows this list.
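
As a rough illustration of the two set-ups, here is a sketch that reuses the getVariation helper from the earlier snippet. The element ID "signup-button", the variation copy and the "/index-b.html" URL are all placeholders, not anything a particular tool prescribes:

    var variation = getVariation(); // cookie-based helper from the earlier sketch

    // Method 1: swap the tested element in place. This must run once the
    // element exists in the DOM (tools typically do it before the page is
    // rendered, to avoid a visible flicker).
    if (variation === 'B') {
      var button = document.getElementById('signup-button');
      if (button) {
        button.textContent = 'Get Started Free'; // variation copy
      }
    }

    // Method 2: redirect bucket-B visitors to the page hosting the variation.
    if (variation === 'B' && window.location.pathname === '/index.html') {
      window.location.replace('/index-b.html');
    }

Using replace() rather than assigning location.href keeps the original URL out of the browser history, so the back button doesn’t bounce the visitor between the two versions.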

Once you have set up your variations using one of these two methods, the next step is to set up your conversion goal. Typically, you will get a piece of JavaScript code, which you copy and paste onto a page that represents a successful test when a visitor arrives there. For example, if you have an e-commerce store and you are testing the color of the “Buy now” button, then your conversion goal would be the “Thank you” page that is displayed to visitors after they complete a purchase.

As soon as a conversion event occurs on your website, the A/B testing tool records the variation that was shown to the visitor. After a sufficient number of visitors and conversions, you can check the results to find out which variation drove the most conversions. That’s it! Setting up and running an A/B test is indeed quite simple.
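
The goal snippet itself can be as small as a tracking pixel. The following sketch, with a made-up "/ab/convert" collector endpoint, shows the idea; it would be pasted only on the “Thank you” page and reports which variation the converting visitor belonged to:

    // Conversion goal snippet: report the visitor's variation to the tool.
    (function () {
      var match = document.cookie.match(/(?:^|; )ab_variation=([^;]*)/);
      if (!match) return; // Visitor wasn't part of the test.
      var beacon = new Image();
      beacon.src = '/ab/convert?variation=' + encodeURIComponent(match[1]) +
                   '&t=' + new Date().getTime(); // cache-buster
    })();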

Do’s And Don’ts

Even though A/B testing is super-simple in concept, keep some practical things in mind. These suggestions are a result of my real-world experience of doing many A/B tests (read: making numerous mistakes).

Don’ts

  • When doing A/B testing, never ever wait to test the variation until after you’ve tested the control. Always test both versions simultaneously. If you test one version one week and the second the next, you’re doing it wrong. It’s possible that version B was actually worse but you just happened to have better sales while testing it. Always split traffic between two versions.
  • Don’t conclude too early. There is a concept called “statistical confidence” that determines whether your test results are significant (that is, whether you should take the results seriously). It prevents you from reading too much into the results if you have only a few conversions or visitors for each variation. Most A/B testing tools report statistical confidence, but if you are testing manually, account for it with an online calculator (or see the sketch after this list).
  • Don’t surprise regular visitors. If you are testing a core part of your website, include only new visitors in the test. You want to avoid shocking regular visitors, especially because the variations may not ultimately be implemented.
  • Don’t let your gut feeling overrule test results. The winners in A/B tests are often surprising or unintuitive. On a green-themed website, a stark red button could emerge as the winner. Even if the red button isn’t easy on the eye, don’t reject it outright. Your goal with the test is a better conversion rate, not aesthetics, so don’t reject the results because of your arbitrary judgment.
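
For the curious, the arithmetic behind “statistical confidence” is, in its simplest form, a two-proportion z-test. The following back-of-the-envelope version is an illustration, not any specific tool’s exact method:

    // z-score comparing the conversion rates of control (A) and variation (B).
    // |z| above 1.96 corresponds to roughly 95% confidence.
    function zScore(convA, visitorsA, convB, visitorsB) {
      var pA = convA / visitorsA;
      var pB = convB / visitorsB;
      var pPooled = (convA + convB) / (visitorsA + visitorsB);
      var standardError = Math.sqrt(pPooled * (1 - pPooled) *
                                    (1 / visitorsA + 1 / visitorsB));
      return (pB - pA) / standardError;
    }

    // Example: 120 conversions from 2,000 visitors vs. 156 from 2,000
    // gives z ≈ 2.25, i.e. significant at the 95% level.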

Do’s

  • Know how long to run a test before giving up. Giving up too early can cost you, because you might have gotten meaningful results had you waited a little longer. Giving up too late isn’t good either, because poorly performing variations could cost you conversions and sales. Use a calculator to determine exactly how long to run a test before giving up (a rough formula appears in the sketch after this list).
  • Show repeat visitors the same variations. Your tool should have a mechanism for remembering which variation a visitor has seen. This prevents blunders, such as showing a user a different price or a different promotional offer.
  • Make your A/B test consistent across the whole website. If you are testing a sign-up button that appears in multiple locations, then a visitor should see the same variation everywhere. Showing one variation on page 1 and another variation on page 2 will skew the results.
  • Do many A/B tests. Let’s face it: chances are, your first A/B test will turn out a lemon. But don’t despair. An A/B test can have only three outcomes: no result, a negative result or a positive result. The key to optimizing conversion rates is to do a ton of A/B tests, so that all positive results add up to a huge boost to your sales and achieved goals.
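
On the question of duration, a common rule of thumb for roughly 95% confidence and 80% power puts the required visitors per variation at about 16 × p(1 − p) ÷ δ², where p is the baseline conversion rate and δ is the absolute difference you want to detect. A sketch of the estimate (an approximation, not a substitute for your tool’s calculator):

    // Rough number of days a 50/50 test must run to detect a given lift.
    function testDurationDays(baselineRate, relativeLift, dailyVisitors) {
      var delta = baselineRate * relativeLift;            // absolute difference
      var variance = baselineRate * (1 - baselineRate);
      var perVariation = 16 * variance / (delta * delta); // visitors per version
      return Math.ceil(2 * perVariation / dailyVisitors);
    }

    // Example: a 5% baseline rate, a hoped-for 20% relative lift and
    // 1,000 visitors a day works out to about 16 days.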

Classic A/B Testing Case Studies

Here are some case studies to give you an idea of how people test in the wild.

Writing Decisions: Headline Tests on the Highrise Sign-Up Page
37signals tested the headline on its pricing page. It found that “30-Day Free Trial on All Accounts” generated 30% more sign-ups than the original “Start a Highrise Account.”

[Image: 37signals A/B test]

“You Should Follow Me on Twitter Here” (Dustin Curtis)
This much-hyped split-test involved testing multiple versions of a call to action for Twitter followers. Dustin found that “You should follow me on Twitter here” worked 173% better than his control text, “I’m on Twitter.”

[Image: “You should follow me on Twitter here” button]

Human Photos Double Conversion Rates
A surprising conclusion from two separate A/B tests: putting photos of people on a website can as much as double conversion rates. Scientific research backs this up, suggesting that we are subconsciously drawn to images of people.

[Image: Human photos vs. generic icon]

Google Website Optimizer Case Study: Daily Burn, 20%+ Improvement (Tim Ferriss)
A simple variation that gave visitors fewer options to choose from resulted in a 20% increase in conversions. The winning version was also much lighter on detail and text, and so easier on the eye than the control.

[Image: Home page screenshot]

Two Magical Words Increased Conversion Rate by 28%
The words “It’s free” increased the clicks on this sign-up button by 28%, illustrating the importance of testing call-to-action buttons and how minor changes can have surprisingly major results.

[Image: “It’s free” screenshot]

Changing the Sign-Up Button from Green to Red
Along with its other A/B tests, CareLogger increased its conversion rate by 34% simply by changing the color of the sign-up button from green to red!

[Image: Green vs. red button]

Single Page vs. Multi-Step Checkout
If you run an online store, it is quite common to see visitors abandon the purchase process at checkout. This A/B test found that a single-page checkout process completes sales much better than a multi-page one.


"Mad Libs" style form increases conversion 25-40%17
Defeating conventional wisdom, in this A/B test it was found out that a paragraph-styled form with inline input fields worked much better than traditional form layout. Though the result was probably specific to their offering as it wasn’t replicated in another, separate A/B test18.


Complete Redesign of Product Page Increased Sales by 20%
A software company redesigned its product page to give it a modern look and added trust-building elements (such as seals, guarantees, etc.). End result: it managed to increase total sales by 20%. This case study demonstrates the effect of design on sales.


Marketing Experiments Response-Capture Case Study: Triple-Digit Increase in Conversions
Through a series of A/B tests, the mailing-list opt-in rate was improved by 258%. The focus was on removing all distractions and asking the visitor for nothing but an email address. To motivate visitors to complete their full profile, the landing page offered an Amazon gift card (which was itself split-tested).


Tools For A/B Testing

A number of tools are available for A/B testing, with different focuses, price points and feature sets.

Resources For Deep-Diving Into A/B Testing

If you’ve read this far, then A/B testing has presumably piqued your interest. Here, then, are some cherry-picked resources on A/B testing from across the Web.

Get Ideas for Your Next A/B Test

Introductory Presentations and Articles

The Mathematics of A/B Testing




Paras Chopra is founder of Visual Website Optimizer, the world's easiest A/B testing tool. Used by thousands of companies worldwide across 75+ countries, it allows marketers and designers to create A/B tests and make them live on websites in less than 10 minutes.

  1. 1

    Interesting case studies; I will look at implementing some of that on sites I am building.
    Good post, keep it up

  2. 2

    Anas Nakawa

    June 24, 2010 2:25 am

    “no words can describe the excellent post !”
    thanks a lot

  3. 3

    Martin Leblanc

    June 24, 2010 2:34 am

    Great guide, Paras!

  4. 5

    Emily Smith

    June 24, 2010 2:40 am

    I really liked these case studies – make for great reading and inspiration.

  5. 6

    Andrew Ingram

    June 24, 2010 2:46 am

    How can it be an ultimate guide without even mentioning local minima/maxima in passing?

    And I disagree with the notion that you should never let artistic vision trump the results of a test. Artistic vision often includes knowledge of where things will be going in the future and you shouldn’t let a paltry 5% improvement get in the way of that.

    • 7

      Paras Chopra

      June 24, 2010 3:18 am

      The issue of minima and maxima is not crucially important, in my opinion. That’s because A/B testing is a methodology that lets you do any type of test you want to perform. Though I agree you may want to do both small changes and radical changes, in order to see whether you are truly optimizing toward the global maximum.

      Well, artistic sense over test results is a subjective opinion! Though my own preference is to always let data speak for itself.

      • 8

        Interesting, I think this comes down to the debate between the marketer and the designer. And it will always be a great debate. There is value on both sides. It’s finding common ground and a compromise between them both. Ultimately – the data should speak for itself…and ultimately, it’s the data that SHOULD guide the designer to understand what should be kept in mind for the future .. not just in the way of aesthetics but also when taking usability, conversions and overall success into account.

      • 9

        This sounds a bit like someone with a hammer trying to turn every problem into a nail. A/B testing is great for certain very narrow, *but critical* channels of user experience — also for domains that are more or less traditional and fixed in their expected format. At the level of strategy, over-all site orientation and high-level goals, though, A/B tests are the equivalent of using a narrow beam flashlight to look for something that you don’t really know the size or contours of, somewhere in North America, in the dark (their search is ultra-focused, binary and random or grouped around the current location). It has its place and we need good tools, but please don’t make it out to be the answer for all things UX just because it’s your baby. (By the way, I think A/B testing is a mighty fine hammer, there just needs to be some balance in its application and the evangelism surrounding it.)

  6. 10

    Sparsh Gupta

    June 24, 2010 3:14 am

    Great Article

  7. 11

    HTML Artists

    June 24, 2010 2:54 am

    Interesting… very useful article.
    Thanks for sharing, Paras

  8. 12

    Mohanad Najeeb

    June 24, 2010 2:54 am

    Great article!
    How about more UX case studies? I think that would be great.


    • 13

      Anne Holland

      July 6, 2010 3:51 pm

      Paras – Thanks for mentioning our site as one of your favorites for more testing info. Currently we have 65 A/B and multivariate testing case studies, complete with creative samples and results data, up on our site for your surfing pleasure. We add new tests weekly, too.

      The case studies include B2B tests, ecommerce tests, email tests, lead generation form tests… you name it! That’s our job as testing journalists – enjoy!

  9. 14

    John Skinner

    June 24, 2010 3:02 am

    Good to see coverage of this.

    However, we should be careful not to overstate the results of a single test – the effects of a change can be very audience/context specific.

    The example used to demonstrate that “a single-page checkout process completes sales much better than a multi-page one” was, in my opinion, flawed.

    As well as changing the checkout from a multi-page to a single page process, the test also significantly changed how ‘guest’ customers were treated.

    In the first example, many first-time customers may have assumed that they would have to create an account, due to the page layout and the copy in the panel headers. It is not immediately clear that you can buy as a ‘guest’. In the second example, this potential barrier has been removed. Instead, returning customers are asked for a login, whilst it is much clearer that new customers can continue without registering.

    • 15

      Paras Chopra

      June 24, 2010 3:27 am

      John, great comments. I will go even so far as to say that you should never trust the results of a case study. Yes, case studies give ideas and inspiration for A/B testing, but applying them without doing any testing on one’s own website is a grave mistake. As you said, results depend on a ton of factors, which include context, product offering, brand value, etc. So, all A/B case-study results should be taken with a pinch of salt.

  10. 16

    Sanchit Thakur

    June 24, 2010 3:24 am

    Great Post Paras!

  11. 17

    Jean-Francois Monfette

    June 24, 2010 5:39 am

    This is a very good post about A/B testing. I go to whichtestwon every week to try to see if I could pick the winning test. Sometimes results are surprising.

    Tools: There is a website called whichmvt that has a very nice table comparing the features and pricing of many A/B testing tools (including yours). It’s worth a look for anyone searching for a tool to do A/B tests.

    Good luck with your startup and keep on A/B testing !

    • 18

      Paras Chopra

      June 24, 2010 6:21 am

      Jean-Francois, thanks for wishing me well! Yes, it is an excellent site (though I didn’t include it because “multivariate testing” may have confused some readers).

  12. 19

    Great article. Nice balance of theory and practice, without making the content too simple or too complex. Maybe next time discuss multivariate testing?

    • 20

      Thanks a lot – your answer solved all my problems after several days of struggling

  13. 21

    Nice article. I particularly like the case studies presented. GREAT JOB!

  14. 22

    That’s a great article indeed, easy to follow and very informative. Thanks!

  15. 23

    Aplos Systems

    June 24, 2010 8:08 am

    Great article. Very informative, and you gave some great showcases.

  16. 24

    Teylor Feliz

    June 24, 2010 8:39 am

    Very good and informative article.

    Thanks Paras Chopra and SM

  17. 25

    Have you seen ? It seemed to be missing from the article. We use it at our company for A/B testing in our funnels and elsewhere.

    Cool article Paras

  18. 26

    In the don’ts section, the third bullet point says “Don’t surprise regular visitors.” How would someone be able to implement that?

    Well, I guess browser cookies might be helpful, as you can check whether a visitor to your site is a regular or a new visitor. What are some other ways to implement it, though?

    • 27

      Paras Chopra

      June 24, 2010 9:32 am

      Yes, most A/B testing tools store the variation displayed to a visitor in a cookie, so that the same variation is shown on repeat visits.

  19. 28

    April Sayler

    June 24, 2010 8:54 am

    I conducted a test with Google Website Optimizer just yesterday and am implementing it into our CMS now. What a coincidence that you posted this article today. Thanks much!

  20. 29

    Eric Goldman

    June 24, 2010 9:15 am

    Great article – thanks for sharing. I would like to add just two thoughts:
    1) Inbound Marketing and Marketing Automation tools also provide A/B Testing capabilities. Of course they aren’t free, but if you go down the road of Inbound Marketing Automation (IMA), you will get the capability as a “free” add on to the lead management, qualification and nurturing functions.
    2) As IMA tools also track people’s digital footprints around your site as they interact with your landing pages, in addition to the conversion metrics you get for the A/B test, you can also determine which pages held visitors’ interest and which ones resulted in them leaving the site. In other words, you also gain insight into your prospects’ behavior, which adds depth to the results of your A/B tests.

    • 30

      Paras Chopra

      June 24, 2010 9:46 am

      Eric, interesting point. Can you name a few marketing automation tools that have A/B testing integrated? I’ll admit that I don’t have enough information on this front.

