Posts Tagged ‘Testing’

We are pleased to present below all posts tagged with ‘Testing’.

Comprehensive Review Of Usability And User Experience Testing Tools

Usability and user experience testing is vital to creating a successful website, and even more so if it’s an e-commerce website, a complex app or another website with a definite ROI. Running your own user tests to find out how users interact with your website and where problems might arise is entirely possible.

Verify

But using one of the many existing tools and services for user testing is a lot easier than creating your own. Free, freemium and premium tools are out there, with options for most budgets. The important thing is to find a tool or service that works for your website and then use it to gather real-world data on what works and what doesn’t, rather than relying purely on instinct or abstract theories.

Read more...

Reliable Cross-Browser Testing, Part 1: Internet Explorer

In a perfect world, cross-browser testing would be straightforward. We would download a legacy version of a browser, run it, and be able to instantly test our pages and scripts without a single care in the world. The reality of cross-browser testing, though, is very different. Issues such as runtime conflicts when running multiple versions of the same browser and inaccurate third-party testing tools mean we can spend hours just evaluating whether a testing set-up is anywhere near reliable.

screenshot

I’m a user-interface developer at AOL (yes, we’re not dead yet!), and in this multi-part post I’ll take you through the exact set-up we use to accurately test content that could be viewed by millions of users with a very diverse set of browsers. This set-up is similar to the one used by some of my colleagues at Opera, Mozilla and Google, so, fingers crossed, we’re doing this optimally.

Read more...

Review Of Cross-Browser Testing Tools

At some point in the future, the way that all major browsers render Web code will likely be standardized, which will make testing across multiple browsers unnecessary as long as the website is coded according to Web standards. But because that day is still a way off (if it will really come at all), testing your design in current browsers as well as legacy browsers is a necessary part of any project.

Screenshot

The old-school way to test code was to load your website on as many computers as you could find, using as many different combinations of browsers and operating systems as possible. That was fine if you had access to a bunch of different computers (and had some time to kill). But there are much more efficient ways to test across browsers, using either free or commercial Web services and software. In this article we review some of the most useful ones.

Read more...

Multivariate Testing 101: A Scientific Method Of Optimizing Design

In a previous article on Smashing Magazine, I described A/B testing and various resources related to it. I have also covered the basics of multivariate testing in the past, but in this post I’ll go deeper into the technical details of multivariate testing, which is similar to A/B testing but with crucial differences.

In a multivariate test, a Web page is treated as a combination of elements (including headlines, images, buttons and text) that affect the conversion rate. Essentially, you decompose a Web page into distinct units and create variations of those units. For example, if your page is composed of a headline, an image and accompanying text, then you would create variations for each of them. To illustrate, let’s assume you make the following variations.
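The decomposition above can be sketched in a few lines of code. This is a minimal illustration only: the element names and variation labels below are hypothetical, not taken from the article, and a full-factorial test like this one tries every combination, which is why multivariate tests need far more traffic than A/B tests.

```python
from itertools import product

# Hypothetical variations for each page element (labels are invented
# for illustration; they are not from the article).
headlines = ["Original headline", "Benefit-driven headline"]
images = ["product photo", "photo of people", "no image"]
texts = ["short copy", "long copy"]

# A full-factorial multivariate test pairs every variation of every
# element with every other, so the combinations multiply: 2 * 3 * 2.
combinations = list(product(headlines, images, texts))
print(len(combinations))  # 12 distinct page versions to test
```

Because the number of versions grows multiplicatively, adding even one more variation per element quickly inflates the traffic needed to reach statistical significance for each combination.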

Read more...

Multivariate Testing in Action: Five Simple Steps to Increase Conversion Rates

Attention spans on the Web have been decreasing ever since Google arrived and changed the rules of the game. Now, with millions of results available on any topic imaginable, the window in which to grab a visitor's attention has shrunk significantly (in 2002, the BBC reported it to be about nine seconds). Picture yourself browsing the Web: do you go out of your way to read the text, look at all the graphics, and try to thoroughly understand what the page is about? The answer is most likely a straight "no." Bombarded with information from all sides, we have become like spoiled kids, not paying enough attention to what a Web page wants to tell us.

A/B testing example

We make snap decisions on whether to engage with a website based on whatever we can make out in the first few (milli)seconds. The responsibility for making a good first impression lies with designers and website owners. Given that the window of opportunity to persuade a visitor is really small, most designs (probably including yours) do a sub-optimal job because the designer in you thinks in terms of aesthetics. However, most websites do not exist just to impress visitors. Most websites exist to make a sale. Whether it is to get visitors to subscribe to the blog feed, or to download a trial, every website ultimately exists to make a sale of some kind.
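If a website exists to make a sale, the question behind any test is whether one variation really converts better than another or the difference is just noise. As a hedged sketch (a standard statistical approach, not a method prescribed by this article, and with invented numbers), a two-proportion z-test compares the conversion rates of two variations:

```python
import math

def z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided two-proportion z-test for conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p = (conv_a + conv_b) / (n_a + n_b)            # pooled conversion rate
    se = math.sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))     # two-sided p-value
    return z, p_value

# Illustrative numbers: A converts 90/1000 visitors, B converts 120/1000.
z, p = z_test(conv_a=90, n_a=1000, conv_b=120, n_b=1000)
print(round(z, 2), round(p, 4))
```

With these made-up figures the p-value falls below the conventional 0.05 threshold, so the difference would usually be treated as significant rather than noise; with smaller samples, the same rates might not be.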

Read more...

Test Usability By Embracing Other Viewpoints

As Web technology improves, users expect Web-based widgets to be useful, content to be relevant and interfaces to be snappy. They want to feel confident navigating a website and using its functionality. They crave being able to get things done with little friction and on demand. And demand they do.

Various layouts, any of which might work


People are picky. When a website gives them problems, they are less inclined to use it. From a design perspective, testing for a good user experience entails making improvements based as much on critical feedback as on design expertise. As long as your website is around, offering a good user experience is critical. And like the website itself, improving the user experience doesn’t end when the website launches.

Read more...

In Defense Of A/B Testing

Recently, A/B testing has come under (unjust) criticism from different circles on the Internet. Even though this criticism contains some relevant points, the basic argument against A/B testing is flawed. It seems to confuse the A/B testing methodology with a specific implementation of it (e.g. testing red vs. green buttons and other trivial tests). Let’s look at different criticisms that have surfaced on the Web recently and see why they are unfounded.

Screenshot

Jason Cohen, in his post titled Out of the Cesspool and Into the Sewer: A/B Testing Trap, argues that A/B testing produces a local maximum, while the goal should be to reach the global maximum. For those who don’t know the difference between a local and a global maximum (or minimum), think of the conversion rate as a function of the different elements on your page. It’s like a surface in space where every point represents a variation of your page; the higher a point is, the better that variation converts.
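The local-versus-global distinction can be made concrete with a toy sketch (purely illustrative; the "conversion curve" below is invented, not data from either article). Picture a curve with two peaks: greedy step-by-step testing, the analogue of tweaking one variation at a time, climbs whichever hill it starts on and stops there, even when a taller peak exists elsewhere.

```python
def conversion(x):
    # Made-up conversion curve: a small peak near x=2 and a taller
    # peak near x=8, each shaped like a narrow triangle.
    return 0.10 * max(0.0, 1 - abs(x - 2)) + 0.25 * max(0.0, 1 - abs(x - 8))

def hill_climb(x, step=0.1):
    # Greedy search: keep moving only while the neighbouring
    # variation converts better (the A/B-testing analogy).
    while conversion(x + step) > conversion(x):
        x += step
    while conversion(x - step) > conversion(x):
        x -= step
    return x

# Starting near the small hill, greedy search stops at the local
# maximum (~0.10), even though a global maximum (~0.25) exists.
local_peak = hill_climb(1.5)
global_peak = hill_climb(7.5)
print(round(conversion(local_peak), 2), round(conversion(global_peak), 2))
```

The critique, then, is about where you start and how you search, not about the idea of testing itself; bolder redesigns correspond to jumping to a different hill before resuming incremental tests.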

Read more...

↑ Back to top