Q: We run think-aloud usability tests at my company, but some people here don’t pay attention to the results because we don’t get much “hard” data. Do we need to run different tests?
A well-executed think-aloud study can yield useful quantitative data (for example, how often a particular event or behavior occurs) as well as qualitative data. Before you invest in more costly testing techniques, we recommend trying to collect and analyze the quantitative data you may be missing from your existing think-aloud studies.
Start by counting how often key events occur during the study sessions (a user fails to complete a specific task, a user goes to the Help section, and so on). Also ask users to rate the usefulness, ease of use, or appeal of particular features, as well as the overall user interface. Suddenly you have hard data to report: "Users rated the feature a 3.4 out of 5 on usefulness"; "70% of users tried to access Help when creating an account."
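This bookkeeping can be as simple as a spreadsheet or a few lines of script. Here is a minimal sketch in Python, assuming you log one record per session; the field names and the session data below are hypothetical stand-ins for your own study logs:

```python
# Hypothetical per-session records from a think-aloud study.
sessions = [
    {"completed_task": False, "used_help": True,  "usefulness_rating": 3},
    {"completed_task": True,  "used_help": True,  "usefulness_rating": 4},
    {"completed_task": True,  "used_help": False, "usefulness_rating": 3},
    {"completed_task": False, "used_help": True,  "usefulness_rating": 2},
    {"completed_task": True,  "used_help": False, "usefulness_rating": 5},
]

n = len(sessions)
failure_rate = sum(not s["completed_task"] for s in sessions) / n
help_rate = sum(s["used_help"] for s in sessions) / n
avg_rating = sum(s["usefulness_rating"] for s in sessions) / n

print(f"{failure_rate:.0%} of users failed to complete the task")
print(f"{help_rate:.0%} of users tried to access Help")
print(f"Average usefulness rating: {avg_rating:.1f} out of 5")
```

The point is not the tooling but the habit: decide before the sessions which events you will count, so every facilitator tallies the same things.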
A more advanced but very helpful step is to quantify what the users said during the study. This is “content analysis,” a technique from the communication and psychology fields. In content analysis, you categorize users’ comments (for example, as positive or negative), then you count. This yields even more hard data: “78% of users’ comments about creating an online account were negative, and only 22% were positive”; “Overall, 32% of users’ negative comments about the site focused on account creation, which suggests that this area needs a lot of improvement”.
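To make this concrete, here is a minimal sketch of the counting step of content analysis, again in Python. It assumes each comment has already been hand-coded with a topic and a valence; the topics, codes, and counts are hypothetical examples, not data from a real study:

```python
from collections import Counter

# Hypothetical hand-coded user comments: one topic and one valence each.
coded_comments = [
    {"topic": "account creation", "valence": "negative"},
    {"topic": "account creation", "valence": "negative"},
    {"topic": "account creation", "valence": "positive"},
    {"topic": "search",           "valence": "positive"},
    {"topic": "search",           "valence": "negative"},
]

# Valence split within a single topic.
account = [c for c in coded_comments if c["topic"] == "account creation"]
neg_share = sum(c["valence"] == "negative" for c in account) / len(account)
print(f"{neg_share:.0%} of comments about account creation were negative")

# Which topics attract the most negative comments overall.
negatives = [c["topic"] for c in coded_comments if c["valence"] == "negative"]
for topic, count in Counter(negatives).items():
    print(f"{count / len(negatives):.0%} of negative comments concerned {topic}")
```

The hard part is the coding, not the arithmetic: have a second person independently code a sample of the comments so you can check that your categories are being applied consistently.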