Q: We’re redesigning our site, but there’s internal debate about whether the new design is any better than the current site. What’s the best way to determine which one is better?
Rather than continue to debate internally, you should get data from your end-users so you can make an informed decision. The guy down the hall has an opinion, and so do you, but your users’ responses matter a lot more.
Many techniques can help you figure out which design is better, but we’ll discuss two here. Both are forms of comparative (or “competitive”) evaluation: the first is usually called competitive usability testing, and the second is better known as A/B testing.
First, you could run a competitive test based on the think-aloud methodology. In this test, a user performs tasks with Version A of the design (or site, etc.) while thinking out loud. You make note of what the user liked and didn’t like, why they failed to complete certain tasks, where they got lost or confused, etc. Then the user does the same thing with Version B of the design while you observe. At the end, you ask the user which version they preferred and why. Do this with other representative users and look for patterns in the responses. The winning design usually emerges (“8 out of 10 users preferred Version A, and Version A was easier to use”), along with plenty of ideas on how to improve it.
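The analysis of these sessions can stay very simple. Here is a minimal sketch (in Python) of tallying preferences and task completion across participants; the session data, field names, and task count are made up purely for illustration:

```python
# Illustrative only: tallying results from comparative think-aloud sessions.
# The records below are hypothetical, one per participant.
from collections import Counter

sessions = [
    {"preferred": "A", "tasks_completed": {"A": 5, "B": 3}},
    {"preferred": "A", "tasks_completed": {"A": 4, "B": 4}},
    {"preferred": "B", "tasks_completed": {"A": 3, "B": 5}},
]
TOTAL_TASKS = 5  # number of tasks each participant attempted per version

preferences = Counter(s["preferred"] for s in sessions)
for version in ("A", "B"):
    completed = sum(s["tasks_completed"][version] for s in sessions)
    rate = completed / (TOTAL_TASKS * len(sessions))
    print(f"Version {version}: preferred by {preferences[version]} of "
          f"{len(sessions)} users, task completion {rate:.0%}")
```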
Second, if the new design is already implemented, you could run a live test in which some percentage of visitors to your site get the new design while the rest see the usual design. You look at web analytics data and metrics you care about (for example, conversion) to decide whether the new design is “better”.
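To make the mechanics concrete, here is a rough sketch of a live split test: visitors are deterministically bucketed into the current or new design, and once enough data has accumulated the two conversion rates are compared with a simple two-proportion z-test. The function names, traffic split, and counts are assumptions for illustration; in practice your analytics or experimentation platform usually handles both steps for you:

```python
# Sketch of a live A/B test: assignment plus a basic significance check.
# Visitor IDs and conversion counts below are hypothetical.
import hashlib
import math

def assign_variant(visitor_id: str, new_design_share: float = 0.5) -> str:
    """Deterministically bucket a visitor so they always see the same version."""
    digest = hashlib.sha256(visitor_id.encode()).hexdigest()
    bucket = (int(digest, 16) % 10_000) / 10_000  # roughly uniform value in [0, 1)
    return "new" if bucket < new_design_share else "current"

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Return (z, two-sided p-value) for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

print(assign_variant("visitor-12345"))  # "new" or "current", stable per visitor

# Hypothetical counts after the test has run for a while:
z, p = two_proportion_z_test(conv_a=230, n_a=5000, conv_b=290, n_b=5000)
print(f"z = {z:.2f}, p = {p:.3f}")  # a small p-value means the difference is unlikely to be chance
```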
We recommend using both of these techniques together (the first during the design prototype phase and the second after implementation). But either technique is better than none.
A few tips for competitive testing: use the same tasks and the same metrics for both versions; ask users “why?” whenever you can; and, if participants see both versions, alternate the presentation order (half see Version A first, the other half see Version B first) so order effects don’t skew the results. It’s also important to have users compare designs at the same level of fidelity and interactivity: don’t test static wireframes against a live site and expect valid results. If the new design exists only as wireframes, you could create static wireframe pages from your live site so the comparison is fair.
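Counterbalancing the order is as simple as alternating assignments down your participant list. A trivial sketch (participant names are placeholders):

```python
# Illustrative counterbalancing: spread "A first" and "B first" evenly
# across participants so order effects cancel out.
participants = ["P1", "P2", "P3", "P4", "P5", "P6"]

for i, participant in enumerate(participants):
    order = ("A", "B") if i % 2 == 0 else ("B", "A")
    print(f"{participant}: test Version {order[0]} first, then Version {order[1]}")
```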
Expero offers a training course on Competitive Usability Testing for people who want to learn the techniques in depth.