Why User Testing Beats A/B Testing All Day Long

Nicole Dieker

Follow any conversion marketing agency’s blog and you’ll see the same point hammered over and over: if you want to know whether something works, A/B test it.

To optimize conversion rates, you’ve got to run A/B tests on just about everything.

Then, you’ve got to take the winners of your A/B tests and run them in new A/B tests.

(Otherwise, how will you know whether a red CTA button really converts better than an orange one? Or a green one?)


A/B testing can feel like opening a series of infinite boxes; you complete one test and it opens right up into the next one


But with all this talk about A/B testing, it’s like we’re forgetting the other big way to get user feedback on your website.

Some experts suggest it’s more effective than A/B testing at showing you how to improve your site and drive up your conversion rates.

This method helped SuperOffice increase conversion rates by 50 percent. 

It helped the Great Orchestra of Christmas Charity increase conversion rates by 420 percent.

Can you guess what it is?

(I’m pretty sure you already know what it is. It’s in the title of this post, after all.)

The answer: user testing.

Or, as online marketing expert Steven Macdonald writes in a guest post for UserTesting:

For instant, low cost and actionable feedback, usability testing reigns supreme, and it never fails to deliver.



How User Testing Differs From A/B Testing

We know you don’t really need a long explanation of what user testing is vs. what A/B testing is. Ott Niggulis at ConversionXL sums it up nicely:

A/B testing is done on a live website with real visitors who have no idea that they are part of a test.

Usability testing, on the other hand, usually involves people observing recruited testers complete a given set of tasks on the website.

User observation can take place in person, or it can be remote, online—recording users’ screens and voices as they narrate their thought process out loud.

Now that we’ve got that out of the way, let’s look at a few specific ways in which user testing is different from A/B testing:

User testing can often be done with a small sample size. 

How many people do you need to A/B test your site before you get valuable data?

With user testing, you can often solve a website problem with just a handful of testers.

Jim Ross at UX Matters explains:

The number of participants you need depends on the type of testing you’re doing.

If your goal is to test a design to find problems, then fix them through an iterative design process, it’s better to test with five or six participants at a time, over several rounds of testing.

If your goal is to assess the usability of a user interface and you’re gathering metrics, you’ll need a larger number of participants.

Or, as Scott Baldwin of Yellow Pencil puts it:

For a moderated test you can get decent test results with somewhere between five and seven users.

For unmoderated tests you often purchase a batch of sessions at a fixed cost, but these can be limitless.
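The “five or six participants” guidance above traces back to Jakob Nielsen’s research on problem discovery, which models the share of usability problems found by n testers as 1 − (1 − L)^n, where L is the fraction of problems a single tester uncovers. Here’s a quick sketch of that curve, using the 31 percent average L Nielsen reports (an assumption; your own projects may differ):

```python
# Nielsen & Landauer's problem-discovery model: the expected share of
# usability problems found by n testers is 1 - (1 - L)^n, where L is the
# fraction of problems a single tester uncovers. L = 0.31 is the average
# Nielsen reports across his studies; treat it as an assumption, not a law.

def problems_found(n_testers, L=0.31):
    """Expected fraction of usability problems uncovered by n testers."""
    return 1 - (1 - L) ** n_testers

for n in (1, 3, 5, 10, 15):
    print(f"{n:2d} testers -> about {problems_found(n):.0%} of problems found")
```

The curve flattens fast: five testers already surface roughly 85 percent of the problems, which is why adding more participants mostly means watching the same issues come up again.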

Curious about the difference between moderated and unmoderated tests?

Keep reading.



User testing gets you results quickly.

If you run a moderated test, you get results by the end of the testing session.

If you run an unmoderated test, you can often get results back within hours.


Now that’s way faster than waiting for results of an A/B test

Of course, you need to do the prep work involved in developing thoughtful questions and tasks for your user testers, but you’ll start to see results right away.



User testing is about asking questions.

Steven Macdonald describes some of the questions he set up for his user testers, such as:

“Look at our home page for five seconds and then look away. What do we do?”

Ask smart questions and you’ll get honest answers, which you can then use to improve your website.



User testing is about task completion. 

You can’t control whether a prospective customer goes all the way through your sales funnel, but you can ask your user testers to complete the task and give you feedback on the process.

As Niggulis writes:

Usability testing involves setting a series of tasks for people to complete and noting down any problems that they encounter. It really is as simple as that.

As simple as that!

If you aren’t already convinced that it’s time to incorporate user testing into your conversion optimization strategy, let’s take a look at how user testing outperforms its best friend, the A/B test.


They’re both great…but maybe one is a little better


How User Testing Outperforms A/B Testing

Obviously, no one is suggesting that you stop A/B testing. 

However, if you’re not adding user testing in the mix, you’re missing out on opportunities to increase conversion rates in a way that A/B tests just aren’t capable of doing.

As Ott Niggulis writes:

Jakob Nielsen was able to show in an analysis that the mean increase of conversion rates with usability optimization was 87%.

(He considered 66 studies with before and after measurements in which the usability of websites was optimized with the help of usability tests).

On average, the conversion rate could almost be doubled.

While that analysis was done ~5 years ago and the average level of website usability has gone up since then, the point remains valid.

Steven Macdonald states that “user testing has helped form the basis of a successful conversion rate optimization strategy which increased conversion rates from 0.39% to 1.61%.”

And, as we noted above, user testing is faster and requires fewer tests for useful data.

Consider that in your performance metric, too.
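To see why “fewer testers” matters, consider the traffic an A/B test needs before its result means anything. Here’s a rough sketch using the textbook per-variant approximation n ≈ 16·p(1−p)/d² (two-sided alpha of 0.05, 80 percent power); the baseline rate and lift are illustrative numbers, not figures from this article:

```python
# Rough visitors-per-variant estimate for an A/B test, using the common
# approximation n ~ 16 * p * (1 - p) / d^2 (two-sided alpha = 0.05,
# power = 0.80). The 2% baseline and 2.5% target below are made-up
# illustrative figures, not data from any study cited here.

def ab_sample_size(baseline, expected):
    """Approximate visitors needed per variant to detect the change."""
    p = (baseline + expected) / 2      # pooled conversion rate
    d = abs(expected - baseline)       # minimum detectable difference
    return round(16 * p * (1 - p) / d ** 2)

print(ab_sample_size(0.02, 0.025))  # roughly 14,000 visitors per variant
```

Against roughly fourteen thousand visitors per variant, five recorded user sessions start to look like a bargain.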



How to Use Your User Testers

When you run A/B tests, you know that the people coming to your website are, at the very least, potential customers.

Sure, they might have stumbled onto your well-designed landing page by accident, but most of the people who visit your website have some specific reason for looking you up, which is why their A/B testing data is so valuable.

These people are your product’s potential users.

But what about the people who sign up for user testing gigs?

How can you trust their feedback, when they might not even know what your product is about?

Can KlientBoost trust someone who’s never heard of PPC ad agencies to successfully test its landing page?

Can you trust random Internet people to successfully test your website in a way that drives conversion?


“Sure, I’ll give you feedback”


Well, if you’re curious about what kinds of people sign up for these jobs: I’ve been a user tester.

With UserTesting, in fact.

I picked up user testing as a way to make a little extra money, and got to visit websites, answer questions, and comment on which parts of the site I found confusing and unclear.

So if you don’t mind having someone like moi test your sites, you’ll probably be fine with your own user testers.

There are two big ways to work with user testers, by the way.

1) The first way is to hire your own people, bring them in, and do what are called “over the shoulder” or “moderated” tests as your users navigate your website.

As Scott Baldwin notes, this is a great way to pick users who match your buyer personas, so you can see how your ideal customer actually responds to your website.

Advantages: You can ask questions and have a conversation about the process as it is happening.

Disadvantages: People often perform differently when they know they’re being watched, and you might unwittingly guide your users toward choices they wouldn’t have made if they were testing your site at home.

2) The second way is to run an unmoderated test, which is pretty much what it sounds like: users test your website without your direct moderation.

This often means outsourcing your user testing to a site like—you guessed it—UserTesting.com. Or TryMyUI, or Userbrain, or any of the numerous other options out there.

Do some research before you pick a testing site to make sure it can offer what you need.

Consider asking around for recommendations.

Advantages: You get your site reviewed by people like me (and other people who want to make a little extra cash).

Disadvantages: You have less of the one-to-one feedback that you get with over-the-shoulder testing.

Also, some people who are doing this just to make a little extra cash will whip through the tests as quickly as possible, without thinking too carefully about their responses.

This could be to your benefit, though; after all, your prospective customers are going to visit your site and make similar quick decisions.

There are, of course, a number of other ways to solve the user testing problem, such as bringing user testers into your company and recording them while you watch from another room.

You could also use screen capture tools or analytic tools to watch people test your site from a distance.

Figure out what way works for you.

Once you have your user testers, how can you best use them? 

Heather McCloskey at UserVoice has a great list of ways to utilize your user testers, including the five-second test we referenced earlier:

When the five seconds are up, ask the subject some questions about what they saw and what they can recall.

You should have the questions prepared in advance, and will want to stick to the important things such as:

whether they could tell what the point of the screen was (i.e. “What was this site about?”),

if they understood the site/app (i.e. “What can you do on this page?”),

and if it was obvious what and where the call-to-action was (i.e. “How do you think you would sign up for a free trial?”).

She stresses focusing on open-ended questions to get the best answers:

Some of the best feedback you’re going to get from your users is the feedback you never expected to hear.

If the sole purpose of your user testing is to validate your existing ideas and opinions, you may miss out on important feedback that identifies new problems.

If you ask your subjects multiple choice questions, you may have already steered the responses into how YOU see the situation, and that’s not much more useful than taking the test yourself.

McCloskey also suggests conducting your user tests in a real-world situation:

Building a recipe app?

See how well it works on an iPad while someone is actually cooking, wet hands and apron included.

Read her full list for more tips.

You can also facilitate a strong user test by asking users to complete a task, such as “sign up for our mailing list,” or by simply letting users explore your site and watching what they do.

How often should you implement user testing as a conversion optimization strategy? 

As Matt Downey of 45royale writes:

It should bob and weave its way throughout every aspect of the project cycle.

If that sounds daunting, remember this: it won’t take you that long to figure out how your users are interacting with your website.

We referenced Jakob Nielsen earlier, but to quote him directly:

As you add more and more users, you learn less and less because you will keep seeing the same things again and again.

There is no real need to keep observing the same thing multiple times, and you will be very motivated to go back to the drawing board and redesign the site to eliminate the usability problems.


Once you’ve finished your turn at the drawing board, guess what: it’s time to user test again

Yes, we know that dumps you back into the infinite loop of testing, but you were going to be there anyway with your A/B tests, so why not add in some user testing and maximize your conversions?

P.S. Did you like what you read here? Gained some insight? :) Tweet and post this to your peeps to share the wealth.