
A Roadmap To Becoming An A/B Testing Expert

A/B testing, also known as split testing, is the method of pitting two versions of a landing page against each other in a battle of conversion. You test to see which version does a better job of leading visitors to one of your goals, like signing up or subscribing to a newsletter. You can test two entirely different designs for a landing page or you can test small tweaks, like changes to a few words in your copy.

Running A/B tests on your website can help you improve your communication with visitors and back up important design decisions with real data from real users. With the multitude of tools available (detailed later), split testing has become easy for even non-technical people to design and manage.




When To Start Testing

Start testing only when you have enough visitors and enough conversions to run the test in a timely manner (a conversion happens when someone completes one of your goals). What does that mean?

The ideal number of visitors will vary according to your typical conversion rate. Plan on at least 1,000 visitors for each variant, and 150 conversions for each variant. For some websites this might take four hours to complete, for others an entire month.

To find out exactly how many visitors you’d need to run a test, plug a few basic metrics into Evan Miller’s sample-size calculator.
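
If you’re curious about the arithmetic behind such a calculator, the conventional normal-approximation power calculation for comparing two proportions can be sketched in a few lines of Python. The baseline rate and minimum detectable lift below are illustrative, and the formula is the textbook approximation, which may differ slightly from any particular calculator’s implementation:

```python
import math
from statistics import NormalDist


def sample_size_per_variant(base_rate, min_effect, alpha=0.05, power=0.8):
    """Visitors needed in each variant to detect an absolute lift of
    `min_effect` over `base_rate`, at significance `alpha` with the
    given statistical power (normal-approximation formula)."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)  # two-sided test
    z_beta = z.inv_cdf(power)
    p1, p2 = base_rate, base_rate + min_effect
    p_bar = (p1 + p2) / 2
    n = (z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
         + z_beta * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2 / min_effect ** 2
    return math.ceil(n)


# A 5% baseline conversion rate, looking for an absolute lift to 7%:
print(sample_size_per_variant(0.05, 0.02))
```

Note how quickly the requirement grows as the effect you want to detect shrinks; this is why low-traffic websites struggle to get conclusive results from small tweaks.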

You could run a successful business with 80 visitors a month, but you wouldn’t be able to run a statistically significant A/B test. Don’t start A/B testing until you’ve done some marketing: get a steady flow of people to your website before doing any optimization.

Keep in mind that you don’t have to build your product before starting A/B tests. You can test a splash page and find out how future customers respond to planned features and pricing tiers.

Your First Test

For your first A/B test, keep it simple. Tweak your existing landing page to get your toes wet. Focus on low-hanging fruit:

  • copy in h1 and h3 headings;
  • copy in call-to-action buttons (for example, “Try it free” versus “Sign up” versus “Get started”);
  • the color, size and position of call-to-action buttons;
  • a short page versus a long page (by hiding long sections).

You can run multiple variations at one time, so dream beyond just two tweaks. You can also run multiple A/B tests at one time, but focus on one at first to get the hang of it.
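
The mechanics of serving variants are simple: each visitor gets assigned to exactly one variant, and the assignment must stay stable across visits so that nobody sees two versions of the page. A minimal sketch in Python, assuming hypothetical visitor IDs and experiment names (real tools like VWO handle all of this for you):

```python
import hashlib


def assign_variant(visitor_id, experiment, variants):
    """Deterministically bucket a visitor into one variant.

    Hashing the (experiment, visitor) pair keeps the assignment stable
    across visits and independent across concurrently running tests."""
    digest = hashlib.sha256(f"{experiment}:{visitor_id}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]


# The same visitor always lands in the same bucket for this experiment:
cta = assign_variant("visitor-42", "cta-copy",
                     ["Try it free", "Sign up", "Get started"])
```

Using a hash rather than a random draw is a common design choice: it needs no stored state, yet still splits traffic roughly evenly across variants.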

Run the tests anywhere from a couple of days to a month. Your A/B testing tool will declare the statistically significant winner. Once a winner has been declared, make the winning variant part of your permanent website by updating the underlying code.
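
Under the hood, a “statistically significant winner” declaration typically comes down to a two-proportion test on the conversion counts. A rough sketch of the arithmetic using only Python’s standard library (the visitor and conversion numbers below are made up for illustration; your tool may use a somewhat different statistical method):

```python
from statistics import NormalDist


def two_proportion_p_value(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test comparing two conversion rates.

    Returns the p-value: the probability of seeing a difference at
    least this large if the variants actually convert identically."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = (p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b)) ** 0.5
    z = (p_b - p_a) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))


# 150 conversions from 1,000 visitors vs. 190 from 1,000:
p = two_proportion_p_value(150, 1000, 190, 1000)  # below the usual 0.05 threshold
```

A p-value below 0.05 is the conventional cutoff for calling the result significant, which is why both adequate traffic and adequate test duration matter.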

Then, clear the way for more A/B tests.

Where To Go From Here

Low-hanging fruit is a perfect place to start, but A/B testing isn’t all about that. Sure, testing button colors and heading copy will improve your conversion rate, but think beyond how a page looks. Think outside the box:

  • Highlight features versus benefits.
    Are you pitching features of your product in a checklist? Try illustrating the benefits of the product by describing a dream scenario for the customer.
  • Accelerate page-loading.
    Keep the landing page super-simple, and make it load in under a second.
  • Show a big video with someone talking.
    Try removing the big screenshot and dropping in a big video that demonstrates your product, perhaps one in which you’re talking to the camera. Make a personal connection.
  • Survey new customers.
    Talk to new customers to see what made them sign up and what has been most valuable to them so far. Highlight these in a test.
  • Find out what confuses new customers.
    Ask new customers what confused them about the website or what questions they had that were left unanswered. Incorporate your answers into an A/B test.
  • Add testimonials.
    Use social proof in the form of testimonials. Try putting an avatar or company logo next to each name.
  • Change the writing style of headings and main content.
    Change the style of headings and content on your blog to see how it affects newsletter subscriptions. Try writing two versions of the same article.
  • Experiment with pricing tiers and your business model.
    Change your pricing structure, even if only on your public-facing page, to see how potential customers respond.
  • Make the sign-up form super-short.
    Remove any unnecessary steps in your sign-up form.
  • Radically change the design.
    Try a different approach with your landing page, like what Campaign Monitor did with its big modal for new visitors.

Remember that conversion isn’t a one-time deal. When you say that you want more registrations, what you’re really saying is that you want more lifetime customers. When you say that you want more podcast listeners, you’re really saying that you want a larger dedicated audience that won’t stop telling their friends about you.

Monitor how your A/B tests affect your long-term goals.

Tools To Use

To run A/B tests, I recommend using VWO (Visual Website Optimizer). You won’t need to edit your website’s code and redeploy for each test; instead, you use the tool’s WYSIWYG editor.


With VWO, you can do split URL testing, which tests two completely different pages, as well as multivariate testing, which tests more than two variants at a time — think of it like A/B/C/D/E/F/G testing. VWO uses statistical significance to declare the winner. A new version of the software is coming out soon, too.
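
Multivariate testing simply crosses every element you are varying, so the number of cells multiplies quickly. A toy illustration (the headings, button labels and colors below are invented):

```python
from itertools import product

# Three page elements, two options each:
headings = ["Ship faster", "Build with confidence"]
buttons = ["Try it free", "Get started"]
colors = ["green", "orange"]

# Every combination becomes one variant: 2 x 2 x 2 = 8 cells,
# each of which needs enough traffic to reach significance on its own.
variants = list(product(headings, buttons, colors))
```

This combinatorial growth is why multivariate tests demand far more traffic than a simple A/B test.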

Optimizely is another simple WYSIWYG tool, and Google Analytics Content Experiments is a solid free option. If you’re looking to A/B test an email campaign, use Campaign Monitor.

To set up your first test with VWO, install the code before the closing </head> tag on each page that you’re going to test.

The only other step is to define your goals. Complete this sentence: “I want more…” Perhaps you want more people to sign up for your free trial, subscribe to your newsletter, download your podcast or buy something from your store. Your goal will determine which variant in the test is the winner.

Taking It Too Far

A/B testing is not a silver bullet. Optimizing the conversion rate will make a good landing page better, but it won’t fix a product or company that has fundamental problems.

Narrowly focusing on A/B testing will turn your customers into mere data points. Your customers are not conversions that you push down a funnel. They are real people with real problems, and they are looking to your website for a solution.

Don’t sell out your visitors for short-term gain. Putting a giant red “Try it free!” button will increase conversions, but your business won’t be any better off in 12 months. Keep the long game in mind.

The goal of A/B testing isn’t to mislead potential customers into buying something they don’t want, but rather to clarify how you should communicate a product’s benefits and to make sure that customers understand the language you’re using.

As long as you’re confident that the product is great, then just use A/B testing to tweak how you present it to customers. If you know that the product is helpful, then don’t try to manipulate visitors through your funnel. Always put the product first.

Finally, don’t ignore your gut. You can use data to back up your instinct, but rely on your experience in the industry. Don’t become 100% data-driven.




Kevin runs an A/B testing service to help business owners improve their conversions. He also created an app to track your iPhone use called Moment.

Comments

  1.

    Thanks for the article Kevin, but I am afraid that most of your audience will find that just tweaking colors or copy will not produce any meaningful results for their businesses. Let me explain…

    We’ve found that a successful approach to A/B testing is really dependent on the type of company you’re operating and product that you’re offering. Small cosmetic changes to the UI or copy often result in equally small changes to click-thru or conversion rates, and so these A/B tests require relatively greater levels of statistical power in order to achieve significance.

    For mega-traffic companies like Google or Amazon, these kinds of tests are worth the cost of testing because a sub-1% lift still contributes substantially to their bottom line.

    But for everyone else, ‘shallow’ A/B tests of a button color or call to action will often yield inconclusive results. Here’s an article from the founder of GrooveHQ detailing such an experience:

    If you’re running a small or medium business – or even a larger one that does not have the scalable testing practices of a tech giant like Amazon in place – testing deeper changes to the product, UI layouts or entire UX workflows is what moves the needle. This is what we’re now calling ‘empathic A/B testing’: tests designed with empathy for users.

    If you ask the questions: What changes can I make to my product or website that would motivate my users to take the actions I want them to take? What are they looking for? What do they care about? And why? More often than not, I think you’ll find that the answer is not ‘a different button color’.

    In the end, A/B testing is really a very unsophisticated way of answering the question ‘What works better?’ because you are sending a fixed proportion of your users to a suboptimal variant for the duration of the test. We’ve done a lot of research into better solutions to this problem, and have found that a dynamic approach using a learning algorithm always leads to faster results and higher average conversion rates. You can read more about that here:

    •

      Kevin Holesh

      July 11, 2014 5:33 pm

      Thank you for your thoughts, Zac. I definitely agree that small tweaks will sometimes lead to small results. I only suggested these copy and button color tweaks as a very first test to get your feet wet with A/B testing. This by no means is a permanent solution and the only thing you should focus on.

      Your attitude of empathy is a great one. “What will make this better for my customer?”

  2.

    Martin Spierings

    July 11, 2014 3:46 pm

    Great article. More companies should implement and execute this to improve their sites and applications.

    One thing that would have made this article better is (a link to) some technical examples. How would you do this in popular web-frameworks, how would you do this in AngularJS for example. Or with your iOS apps?

  3.

    Amazing! I really like the way you presented the details in your post. I appreciated and enjoyed it a lot. Thanks

    •

      Kevin Holesh

      July 12, 2014 3:18 pm

      I’m happy you liked the article. Thank you for the kind words :-)

  4.

    Any decent Email Service Provider offers A/B testing, not just Campaign Monitor.

    •

      Kevin Holesh

      July 12, 2014 3:19 pm

      There are definitely other options. In my experience, Campaign Monitor is by far the best A/B testing experience, which is why I recommended it in the post.

  5.

    Thanks! Perhaps it’s time for me to look into the Split plugin for WordPress?

  6.

    Jordi Griell Barnes

    July 13, 2014 1:30 pm

    I do A/B testing for pharmaceutical companies, and segmentation and targeting is the key to meaningful reports.

    We use ClickThroo because it is server-side, but it has no multivariate testing like VWO does, and the licensing is also more expensive. VWO, on the other hand, is client-side and cheaper; we only stick with ClickThroo because my boss says that server-side data is more accurate.

    •

      Kevin Holesh

      July 13, 2014 7:10 pm

      Hi Jordi! Thanks for your thoughts.

      Server-side A/B testing can be much more powerful and very accurate, but it is usually much harder to implement. You definitely need technical skill for that.

      A regular old business owner could do some basic A/B testing with VWO.

  7.

    This is a great overview and a wonderful road map. Thanks for sharing.

  8.

    Joe Wojciechowski

    July 15, 2014 4:22 pm

    I am a big proponent of testing, especially when the tests are conducted correctly and have measurable results. Far too often have I been in situations where the mindset is, “Just change X and we’ll see what happens.”

  9.

    Daniel Kemeny

    July 17, 2014 2:08 pm

    Great article. We love VWO and use it a lot for A/B testing, for simple tests (changing the copy in the CTA) as well as revolutionary ones, like exploding the product page into pieces and reconstructing brand-new layouts for each variation with jQuery. The new UI is much better: a bit difficult to get used to after the previous one, but the great new features make the testing possibilities limitless. Thanks to VWO, our department constantly comes up with CR% wins, which means a lot of £££ for the company. And yes, sometimes you have to leave the logic behind.

    “Don’t try to understand UX! It’s like understanding every single person.”

  10.

    Do not, I repeat DO NOT just run a test for ‘a couple of hours’ as suggested in the article even if you have ‘enough’ conversions.

    Your site might convert completely differently Monday morning compared to Thursday afternoon, so if you don’t let a test run to take into account days/weeks of business then you’re making assumptions about the winner.

    Ignore the VWO ‘you have a winner’ messaging and let the tests run for a proper length of time.

    Yes, test and get interested by trying stuff out (and making mistakes – I know I did at the start), but don’t make massive decisions on changing your site on the whim of a computer algorithm.

    •

      Good call pointing that out. Looks like the article was updated to “a couple of days” or more.

      We actually recommend thinking in weeks. We run tests for a minimum of a full week, and try full weeks after that. If you have “too much” traffic, throttle it so you can run your test long enough.

  11.

    Dennis van der Heijden

    August 1, 2014 12:20 am

    Hi Kevin, I hope before next article you can take a look at for your A/B testing needs :-)

  12.

    Hey, just noticed that you are suggesting Google Analytics Content Experiments. Some of us are using Google Sites… Sadly, they do not work together. I’d like to share my workaround with those of you who are in the same boat as me.

    50/50 Traffic Routing Methodology for Google Site AB Testing

    Just wish their tools would play nicely together already! urggg!!

  13.

    Wayne Carrigan

    December 4, 2014 7:59 pm

    Good article. I absolutely recommend getting a copy of Chris Goward’s book, You Should Test That!

    It’s a must-read for anyone wanting to learn A/B testing and conversion optimization.

  14.

    No one can be 100% data-driven, simply because not everything can be tested in all circumstances. If the data is incomplete, interpretation is needed to fill the gaps and make connections. Even so, it is always better to have evidence. I’m surprised how comfortable people are with assumptions, even when they are easy enough to test. I think if something can be measured, it should be, and should bear on decision-making. If your guts says your new redesign is great but people are not completing the task on the site or not finding something, maybe it’s not that user-friendly.

    The industry is not nearly data-driven enough. Many people don’t understand being data-driven. Some are just not great at numbers. Others, who practice soft decision making, are threatened by it. Being data-driven means knowing what data to use and what to use it for. It means being adept at turning data into information. It doesn’t threaten design. Design leads to behavior change, and behavior is represented in data. So, data and design are inextricably linked.

    Here’s another case of taking it too far: plenty of people are so confident in – dare I say in love with – their product, that they do not see the flaws in it. I’ve literally seen sites where it’s not clear how to sign up.

    Testing and optimization helps the product shine, makes it easy to learn about, easy to buy, and easy to use.

  15.

    In the end, some things just won’t make a noticeable difference. A good example is the color of buttons and styles of text. In this case, research may be the answer. For example, we all know the size and style of text that is optimal for ease of reading or for grabbing attention. Similarly, we all know complementary color schemes and the psychology and impact of certain color hues on our decision process. There are thousands of resources online. An A/B testing process may turn out to be a waste of time in this case.

    Every business owner’s objective should be ease of use of content and design first and foremost. A/B tests are only worth it if you have established a way to measure your findings and have two drastically different but well-thought-out approaches that could work, all things considered equal. For example, when coming up with a logo at Helprace, we decided that we wanted to convey trust and calmness. That’s why green and blue were used in the logo. Not much thought went into this decision, simply because of the amount of information on this subject available on the web.

