*This article has been updated since it was originally published in 2017*

Want to avoid usability testing disasters before they happen? There are two simple ways to make sure your usability test delivers useful, accurate insights:

  1. Write a solid test plan.
  2. Run a pilot test.

This article explains how these two simple actions can help you avoid failure and get the most out of your usability testing sessions.

“In preparing for battle I have always found that plans are useless, but planning is indispensable.” – Dwight D. Eisenhower

Planning is a critical part of the usability testing process. Without planning, your testing activity runs the risk of disorganisation at best and misleading or useless results at worst. Spending time planning will reduce the risks of the testing going off the rails and enhance the potential for great insights. This article explores the reasons for planning your usability testing process and just what you should think about.

1. Write a usability test plan

Why plan?

The Eisenhower quote above suggests that it is the act of planning that is important, not the plan itself. Carefully considering the needs of the situation, in this case usability testing, and allowing for as many variables as you can (there are always more) matters more than the document you produce. The plan itself may turn out to be “useless”, as Eisenhower put it, but the act of sitting down and thinking about what you need to achieve and how you are going to do it is essential to success. You will never have infinite resources for a testing project, and even if you are blessed with vast resources, using them wisely lets you save the excess for future activities.

What should your test plan cover?

There are four critical things you need to consider when running usability testing activities of any size or scale. These considerations are:

  • What are your research goals?
  • What tools and methods will you use to achieve them?
  • Who are you going to test with?
  • What resources do you need to achieve those goals?

 

The very first thing to consider is the goal of your research. Understanding what you want to achieve will determine how you achieve it.

- What are your research goals? Are you trying to increase conversion? Are there suspected problem areas with your site or system? Do you have disagreement in your team about a particular design element? Are you trying to determine why users are abandoning at a particular point in a process?  Decide what outcomes are important (check the project and business goals too).

- What test artefacts or deliverables will you produce after testing? Are you going to create a test report, issues register in Excel, or maybe a presentation with video clips?

- What is your methodology? What tools and methods are you going to use to achieve your goals? Examine your research goals and test assumptions, then design an approach to meet them. We often use a range of tools and methods beyond simple scenario-based task testing, such as exploring first page impressions and visual hierarchy, pre- vs post-use impressions, brand alignment via semantic differential scales, or mixing IA testing in tools like Maze with usability testing of a site or prototype.

- Who are your users and how many participants should you test with? The general rule is 5 participants from each unique user group you are testing with.  Avoid friends, family and co-workers as they are often not reflective of your actual target audience and may have knowledge that your users typically do not have. We generally recommend using a professional market recruitment agency to find participants as they are worth their weight in gold.
 
- Test measures. How are you measuring success? We often measure task efficiency using a scale: easy/medium/hard/fail.  You may look at error rates if testing a system. Task completion time might be important (although there is a trade-off here as users can’t think aloud when being timed). Measuring subjective user satisfaction is also usually important.
 
- Write test scenarios. This goes back to your testing objectives. What is the purpose of the testing? The tasks you ask your participants to complete should reflect those goals. Be sure to write tasks in context, i.e., include a sentence of context before the actual task, and make sure your scenarios are not leading.
 
- How much time do you plan to spend on each task? It’s important to prioritise your tasks. You won’t have time for everything, so make sure you’re targeting your high priority goals first, based on the most common goals of the user group. If testing websites with more sensitive content, e.g., domestic violence, we recommend adding another 15 minutes to your sessions so you don't appear unempathetic to your research participant by rushing them.
 
- What constraints are there to the testing or limits to scope? Are you dependent on a critical system or person? Are results due at a certain time? These constraints will influence your approach and deliverables. If you're testing a prototype to improve the design quickly, your report can be simpler than if you're testing to support a business case.
 
- Location & equipment. Where are you testing? Via Teams/Zoom? In person? Are you testing onsite? What equipment do you need? Finding a way to record your sessions can help streamline your analysis, communicate findings, and allow your stakeholders to unobtrusively observe sessions. We like to use Zoom because it lets us change the participant's name to protect their privacy.
 
- Testing team. Who will moderate test sessions? Often it is good to have two moderators to ensure you stay fresh and provide different perspectives. Will you have observers logging data? Who is writing up the results?
 
- Paperwork & materials. What paperwork do you need? There are often a bunch of forms required, and there are many templates available. Before starting any recording, you will need an informed consent form. A moderator guide is useful for keeping each session consistent and recording results. Task cards are also handy for participants to refer back to during in-person testing.
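The “5 participants per user group” heuristic mentioned above traces back to Nielsen and Landauer's problem-discovery model, which estimates the share of usability problems found by n participants as 1 − (1 − λ)^n, where λ is the probability that a single participant uncovers a given problem (commonly cited as roughly 0.31). A minimal sketch of the diminishing returns; the function name and default rate are ours, not from a library:

```python
def problems_found(n, problem_rate=0.31):
    """Estimated proportion of usability problems uncovered by n
    participants, per Nielsen & Landauer's model: 1 - (1 - lambda)^n."""
    return 1 - (1 - problem_rate) ** n

# Each additional participant finds fewer *new* problems
for n in (1, 3, 5, 10):
    print(f"{n} participants: ~{problems_found(n):.0%} of problems found")
```

With λ ≈ 0.31, five participants surface roughly 85% of problems, which is why a second round with a fresh five is usually better value than ten in one round.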
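For test measures like the easy/medium/hard/fail scale described above, a few lines of code can turn your session logs into per-task summaries. A hedged sketch; the log format, task names, and `summarise` helper are invented for illustration:

```python
from collections import Counter

# Hypothetical session log: (participant, task, rating) tuples
ratings = [
    ("P1", "find_pricing", "easy"),
    ("P2", "find_pricing", "hard"),
    ("P3", "find_pricing", "fail"),
    ("P1", "checkout", "medium"),
    ("P2", "checkout", "easy"),
    ("P3", "checkout", "easy"),
]

def summarise(log):
    """Count ratings per task and compute the completion rate
    (any rating other than 'fail' counts as completed)."""
    per_task = {}
    for _, task, rating in log:
        per_task.setdefault(task, Counter())[rating] += 1
    return {
        task: {
            "counts": dict(counts),
            "completion_rate": 1 - counts["fail"] / sum(counts.values()),
        }
        for task, counts in per_task.items()
    }

for task, stats in summarise(ratings).items():
    print(task, stats)
```

However you tally them, decide the measures and thresholds before testing starts so the numbers answer your original research questions.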

 

Don't forget a checklist

Create a checklist for all the items above. Getting the basics right can help you concentrate on the important things - the testing itself and making the participants feel comfortable on the day. The less you are worried about forms, technology, and timings, the better your test session will be.

2. Run a pilot test

There are a couple of key things you can do to ensure a smooth test day. The first is to run a pilot session at least one day before your first test. Get a participant (they don’t need to meet the brief exactly, but it's best if they are somewhat on spec) and run them through the test plan exactly as if you were running a normal session.

This will do a few things for you:

  • Test your timings and prioritise tasks if the session runs too long.
  • Ensure the scenarios are clear and understood by participants.
  • Confirm the test methods you are using will give you the results you planned.
  • Identify technology issues with your testing environment or the system you are testing (e.g., bugs you need to be aware of if the site is not live yet).

Running the pilot at least a day in advance gives you enough time to make any tweaks or adjustments you need, and to create the final versions of tools like the moderator’s guide.

It is best to prepare everything the day before testing so you aren’t scrambling the next morning. A checklist can really help - by capturing all the details in one place, it frees up your mental space. Just follow the checklist and focus on the session itself.

Oh, and the last thing on our internal checklist is often overlooked but very important.

Make sure you get a good night’s sleep!

Having a plan and running a pilot might sound simple, but they’re your best defence against usability test failures. These steps will help you catch issues with your test before you're live with participants, saving you stress and ensuring smoother sessions.

You’ll find even more guidance on writing a Usability Test Plan inside our UX Training Portal – and templates too, so you can get started right away!