User testing: Just do it. It’s not difficult. Here’s how.

User testing (sometimes called usability testing) is simply having real people test drive your website or app for the purpose of identifying problems and opportunities for improvement. It is not difficult, but it does require effort. And based on my experience, most businesses or agencies are either not doing it, or if they are, they’re not doing it often enough. User testing — whether formal or informal — should be part of the web or app design and development process in an intentional way. And once a site or app launches, it should never be “set it and forget it.” User testing should be done periodically to make sure all is well. It’s like getting the oil changed on your car — user testing every 3-6 months is a very good practice.

Why?

Quite simply, user testing prevents problems and improves business outcomes. In my Twelve Dictums of UX Design, the number one dictum is that you are not the user. As soon as you know how something is supposed to work, you can’t see it from the viewpoint of someone who doesn’t. There’s no better illustration of this than the infamous story of the $300 million button.

Hear this: In all the years and the many, many user testing sessions that I or my team have conducted, I have never once failed to learn something valuable that can improve a website or app. Often those learnings are significant.

How to do it

I said it wasn’t difficult and that’s true. You could say it’s not rocket surgery. (That’s borrowed from the title of a great little book, Rocket Surgery Made Easy: The Do-It-Yourself Guide to Finding and Fixing Usability Problems by Steve Krug. That book is the source of much of the information in this post. I highly recommend it for anyone wanting to go deeper.) 

There are numerous user testing methodologies, each with its own place and purpose. But in general, they fall into two groups: moderated and unmoderated. Moderated testing requires someone who conducts or directs the test and a testing participant, working together in real time, side by side (in person or virtually). Moderated testing is the topic of this post. Unmoderated, or self-directed, testing is a blog topic for another day.

While not difficult, it does take knowledge to do it right. Whether you do formal or informal testing, you should follow the same basic guidelines. Here’s what you do.

“Formal” Testing Steps

  1. Focus on what specific business outcomes or other goals you want to evaluate. What’s most important to the site owner? What’s most important to people using the site? Where do those two overlap? Narrow this down to a list of 3-4 desired outcomes.
  2. Formulate several tasks that testing participants would need to accomplish to get to the desired outcome(s). For example, a task could be to locate a certain product on a website and purchase it. You might include 5-7 tasks, but you want to limit testing to what someone can accomplish in about 30 minutes at most. Create a moderation guide that lists the tasks. Have this document reviewed by the team or client to make sure everyone has the same expectations about what will be tested.
  3. Recruit participants. According to the Nielsen Norman Group, five people will uncover about 85% of the issues (there’s a quick sketch of the math behind that figure just after these steps). Therefore, for formal testing you’ll need five participants (or ten if you test both desktop and mobile, which is always a good idea). The participants should not be part of the team working on the project. They should match the personas of typical users of the site or app. Offer to pay them a stipend for their time. Your budget will dictate, but $25-50 for half an hour is a reasonable starting place.
  4. Prepare to record the testing sessions before you begin, and test your setup ahead of time. You don’t want to waste participants’ time futzing with technology. For remote testing you can use Zoom or another remote screen sharing app that allows for recording. Or there are more sophisticated tools such as Lookback or UserTesting that let you make notes within the timelines of the recordings, among other helpful features. (Both can be used for either moderated or unmoderated testing.)
  5. When you start your session with the participant, it’s very important to coach them before they begin. Help them feel at ease. Assure them that they can’t make a mistake. Tell them that you are not testing them but rather you are testing the website or app, and you know ahead of time that it’s not perfect, so they should expect to run into unintended roadblocks to accomplishing the tasks you will give them. It’s very important to instruct them to “think out loud” as they go about trying to accomplish the tasks so you know what they are thinking. You can warn them that thinking out loud may feel unnatural, and because of that you may remind them again from time to time. Tell them that even though you are recording their voice and actions, you won’t post the recording on the internet (if that’s true) and that the recordings will only be used by you and your team to understand flaws in the website or app. (If you do plan to post the recording or share it in other ways, you may want to get formal permission to record them and have them sign a release.) It’s a good idea to follow a written guide for this participant coaching. Steve Krug offers free scripts for this purpose on his website.
  6. Begin by letting the tester get familiar with the computer or mobile device. Then ask them to glance over the first page and make any general observations they have. Don’t prompt them too much: ask what they think the site or app is about, have them scroll up and down if applicable and share some initial observations, but tell them not to click on anything yet. This may or may not yield anything valuable, but it gets them comfortable with the process and lets you remind them to think out loud.
  7. Start with the first task on your moderation guide. Ask them to perform the task (e.g., “Locate product X and purchase it.”). Avoid the temptation to answer their questions about how to do it — that’s the purpose of the test, to see how users accomplish tasks without help. Just remind them to think out loud. If they seem pressured or embarrassed about not accomplishing a task, remind them you are testing the website or app, not them, and if something is not clear it’s the website’s fault, not theirs. Proceed this way through all the tasks. Don’t get bogged down in note taking to the extent that it distracts you or the participant. But do take brief notes when the participant makes key observations. Jot down the time of those comments in the recording timeline so you can easily find them when you watch back later.
  8. Wrap things up with general questions if desired. Here are some of my favorites:
    • What was the best thing about this site or app?
    • What was the worst?
    • If you had a magic wand, what’s the one thing you would do to improve it?
    • (If relevant:) You mentioned that _____ didn’t work the way you thought it should. How do you think it should have worked or how would you fix it?
[Photo: a user testing (or usability testing) session]

That’s it — thank them and send them on their way.
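
A quick aside on the “five participants” guidance in step 3: that 85% figure comes from a simple problem-discovery model often attributed to Nielsen and Landauer, which assumes each participant uncovers roughly 31% of the problems in an interface on average. Here’s a minimal sketch of that arithmetic, assuming the commonly cited 31% rate; your real discovery rate will vary.

```python
# A sketch of the model behind the "five users find ~85% of the issues" rule of thumb.
# Assumption: each participant uncovers about 31% of the usability problems on average
# (the figure commonly cited with this model); your actual discovery rate will vary.

def share_of_problems_found(n_participants: int, discovery_rate: float = 0.31) -> float:
    """Expected fraction of problems found by n participants: 1 - (1 - L)^n."""
    return 1 - (1 - discovery_rate) ** n_participants

for n in (1, 3, 5, 10):
    print(f"{n:2d} participant(s) -> ~{share_of_problems_found(n):.0%} of problems found")

# 5 participant(s) -> ~84%, which is where the "85%" rule of thumb comes from.
```

Note the diminishing returns: going from five to ten participants buys relatively little, which is why five per platform is usually enough.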

It’s a good practice to consider the first test to be a “test of the test.” In other words, you may decide to modify some of the tasks slightly based on how well the first test goes so that the results of the remaining tests are more valuable. Sometimes you even need to fix things that are broken on the website or app itself based on the first test, especially if you’re testing something that’s still in development. Don’t worry that the tasks for the next four tests may be slightly different from the first one. You will learn valuable lessons, I promise.

After finishing five tests (or ten with both desktop and mobile), your job is to identify a handful of key learnings. What are the top 3-5 issues that most participants ran into? Check back over your notes, listen to the recordings, and find the patterns. This may sound daunting, but I assure you that the top 3-5 problems (or rather, “opportunities for improvement”) will rise to the surface like cream. Focus on those key opportunities and ignore most of the rest. Summarize those issues in a brief document for the client or the team. Include a description of each issue (with screenshots if that’s helpful) along with your suggestions (or the testing participants’ suggestions) for how to improve them. The main purpose of user testing is not to solve usability problems but to identify them, so offering suggestions for how to fix the problems is optional. Inevitably, potential solutions will suggest themselves, but there may be other, better solutions as well. That’s the job of the UX designer, but observations from the person who moderates the testing sessions are often very helpful.
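
If it helps to make the pattern-finding concrete, here is one lightweight, purely illustrative way to tally issues across sessions. It assumes you tag each observation with a short issue label while reviewing your notes and recordings; the participants and labels below are invented for the example.

```python
# A purely illustrative tally of recurring issues across user testing sessions.
# Assumption: while reviewing notes/recordings, each observation gets a short issue label.
# The participants and labels below are made up.

from collections import Counter

session_notes = {
    "P1": ["checkout button hard to find", "search results unclear"],
    "P2": ["checkout button hard to find", "jargon on pricing page"],
    "P3": ["search results unclear", "checkout button hard to find"],
    "P4": ["jargon on pricing page"],
    "P5": ["checkout button hard to find", "search results unclear"],
}

# Count how many participants hit each issue (patterns across people, not one-off gripes).
issue_counts = Counter(issue for issues in session_notes.values() for issue in set(issues))

for issue, count in issue_counts.most_common(5):
    print(f"{count}/{len(session_notes)} participants: {issue}")
```

A spreadsheet works just as well; the point is simply to rank issues by how many different participants hit them, so the top handful is obvious to everyone on the team.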

“Informal” Testing

For informal testing you follow the same steps outlined above but scale the whole process back in whatever way is appropriate for the situation. The principle is that any testing is better than no testing at all.

Here are some ways the process might be simplified for informal testing:

  • You could use two participants (or four if both desktop and mobile are being tested). 
  • Test only one or two tasks, which may take only a few minutes.
  • Offer participants a $10 Starbucks or Amazon gift card instead of a $50 stipend. 
  • You might not need to record the sessions. With only a couple tasks and a couple people, you can probably take notes as issues pop up.

Regardless of how you simplify the process, the critical steps not to leave out are defining the tasks (#2 above) and coaching the testing participants (#5 above).

Try it. You’ll be amazed at what you learn.

The adage “test early and often” is the UX equivalent of “an ounce of prevention is worth a pound of cure.” If you employ the process outlined in this post, I guarantee you will learn something valuable that will improve your website or app, reduce user frustration, and contribute to better business outcomes.

If you want help with user testing, please let me know!



2 thoughts on “User testing: Just do it. It’s not difficult. Here’s how.”

  1. Thanks, John.
    How does this compare to beta testing with a group of clients or patients? We just launched a new website – the plan was to start small and simple, but the owner of the practice started adding content that he thought was important. The beta test was well received (with some minor bugs to work out). But now the site loads slow and our analytics are lagging behind our old site over the past month. A technical industry, but I am not a big fan of redundancy and content viewers will not read. I wish we had a group of participants to review and provide comments – that would have helped me keep the site more simple. Fingers crossed. It still is one of the best in our industry, just cumbersome.

    Thanks for the great read.

    • Hi Les, glad you found the post helpful. To answer your question about comparing this type of user testing to “beta testing,” it all depends on how you go about the beta testing. User testing could be called beta testing if it’s done prior to a site launch. Sometimes there’s a process of sending links out for people to review and provide feedback prior to the launch of a site — and I’ve seen that called “beta testing.” Nothing wrong with that at all — it can be very valuable. Any testing of any kind with real users is always valuable. However, if you want the most value from user testing, it’s best to employ a methodology like the one I described. Hope that’s helpful!

