Analyze user experience issues on your website that are lowering your conversion rate.
Plan out your usability test: how it will be run (online or in person), what you are trying to learn, and how you'll gather the information you need.
Based on Usability.gov’s excellent materials, here are the elements that a plan should cover:
- Scope. What exactly are you testing? The whole site or only parts of it? The navigation, navigation and content, or just content? In this step, you spell out what exactly you are going to test.
- Purpose. Identify the concerns, questions, goals, and hypotheses you are going to test. These could include questions like: Can users easily find info about product X? Is our store locator feature easy to use? Is the check-out procedure easy to use?
- Schedule and location. When and where are you going to be running the test? In-person or remote/online?
- Participants. How many people will participate in the test, and do they need to come from a certain demographic?
- Scenarios. How many tasks and scenarios will your participants go through, and what exactly are they? This is where you formulate concrete questions and actions, such as "find product X and fill out the order form."
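To keep all of these elements in one place, the plan can be written up as a simple checklist before you recruit anyone. A minimal sketch in Python; the field names and values here are purely illustrative, not a prescribed format:

```python
# A hypothetical usability test plan as a plain dictionary.
# All names and values below are made-up examples.
test_plan = {
    "scope": "checkout flow and store locator only",
    "purpose": [
        "Can users easily find info about product X?",
        "Is the check-out procedure easy to use?",
    ],
    "schedule_location": "remote sessions, first week of next month",
    "participants": {"count": 5, "profile": "newsletter subscribers"},
    "scenarios": [
        "Find product X and fill out the order form.",
        "Use the store locator to find the nearest store.",
    ],
}

# Quick completeness check: every plan element should be filled in.
required = {"scope", "purpose", "schedule_location", "participants", "scenarios"}
missing = required - test_plan.keys()
print("Plan complete" if not missing else f"Missing elements: {missing}")
```

Even if you never write it as code, running through the same five fields on paper catches gaps (like forgetting to decide on participant criteria) before the first session.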
Start testing early in the product lifecycle; the site doesn't even need to be live yet. Adopting a small-changes approach is much cheaper too: rather than overhauling everything in one huge lump after waiting too long, plan to make small incremental changes throughout the lifespan of your site or product.
If you have a niche or specialized product, recruit testers that match your ideal customer profile from your own network, like your blog readers, newsletter subscribers, Twitter followers, and friends of friends.
If your ideal customer is essentially ‘everyone’, using remote testing services can work well. However, if your ideal customer is more specific, these services usually won’t suit you as their testers often won’t be familiar with your niche. For example, testers recruited via usertesting.com couldn’t figure out the purpose of a site that helps bloggers build courses online because they lacked industry knowledge, making their feedback largely useless.
Invite no more than 5 users per test group, and run several small tests rather than one big, elaborate one. For instance, use the first 5 testers to uncover your biggest issues, then use the next 10 users to find the rest of the problems. With larger groups you run the risk of learning less, because you'll keep seeing the same things over and over again.
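The "5 users" guideline reflects diminishing returns: each additional tester tends to rediscover problems earlier testers already hit. Under the commonly cited modeling assumption (popularized by Jakob Nielsen) that a single tester uncovers roughly 31% of the problems, the expected share found by n testers is 1 − (1 − 0.31)^n. A quick sketch:

```python
# Expected share of usability problems found by n testers,
# assuming each tester independently finds ~31% of them
# (the classic figure from Nielsen & Landauer's model).
def share_found(n, per_tester=0.31):
    return 1 - (1 - per_tester) ** n

for n in (1, 3, 5):
    print(f"{n} testers -> {share_found(n):.0%} of problems found")
```

With these assumptions, 5 testers already surface roughly 85% of the problems, which is why several small rounds beat one large one: fix what the first group found, then a fresh small group can probe what remains.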
Reward testers for taking their time by giving them free products or services, coupons, or $25 Amazon gift cards, especially when you test within your own network.
Conduct over-the-shoulder usability tests in person, with a dedicated room, a working computer with screen recording enabled, and a notepad and pen for jotting down your observations.
- Give the test subjects a list of tasks you want them to do on the site, such as "browse around," "look for product X," or "find something you like and buy it," and observe them as they work.
- Ask testers to comment on everything they do and to think out loud.
- Clarify a task if the tester has trouble understanding it.
- Ask any extra questions you might have.
- Consider a more formal lab setting with cameras if you also want to film the participants' facial expressions. Keep in mind, however, that people may change their behavior when they know they are being watched.
- Listen to your users' feedback with an open mind, without trying to prove a specific outcome or confirm a hypothesis you set before the testing.
Conduct remote usability testing online via a tool like UserTesting.com, Validately, or TryMyUI to save time and resources.
This is especially useful if you have a global user base, as it’s cheaper, faster, and easier to get feedback across time zones.
- Record audio and on-screen activity via the screen-capture software that comes with these tools.
- Keep tasks completable within ~20 minutes to fit the software's usual video-length limit.
- Obtain the video recordings after the session ends to see and hear exactly what was going on.
- Be aware of the limits of the technology: you can't control the quality of the feedback as you would in person. For example, a user might spend 10 minutes on something unimportant, or run into a technical glitch that has nothing to do with the test.
Gather all the data from your testing, highlight main points, compare how each of the testers did on each specific task, and create a list of identified usability problems that need a fix.
Look for common comments and issues that multiple testers ran into. Rewatch the recorded sessions to identify clear trends among testers and to put their struggles in context, such as who these people were and their skill level. Just because one or two participants had trouble with something doesn't mean you should immediately rush to change it.
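One lightweight way to separate trends from one-offs is to tally, per identified issue, how many testers ran into it, then prioritize the issues several testers hit. A hypothetical sketch; the issue names and session notes below are made up:

```python
from collections import Counter

# Hypothetical per-tester observations pulled from session recordings.
sessions = {
    "tester_1": ["unclear pricing", "missed store locator"],
    "tester_2": ["unclear pricing"],
    "tester_3": ["unclear pricing", "confusing checkout button"],
    "tester_4": ["missed store locator"],
    "tester_5": ["unclear pricing"],
}

# Count how many testers hit each issue; most frequent first.
tally = Counter(issue for issues in sessions.values() for issue in issues)
for issue, hits in tally.most_common():
    print(f"{hits}/{len(sessions)} testers: {issue}")
```

An issue 4 out of 5 testers hit goes to the top of the fix list; one that a single tester stumbled on may just reflect that individual's context or skill level.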
Don’t go into usability testing expecting a specific outcome or looking only for evidence that supports your hypothesis. Keep an open mind and listen to your users’ feedback. See what the results say, do your best to learn from them, and your site will be that much better for it.
For instance, a usability test revealed that prospects on a UK-based travel agency’s site were often confused about the prices: they weren’t sure whether prices were per person or per night, or whether flights and hotels were included. Ironically, the prices were so low that the company was harming its own conversion rate. Clarifying the pricing and what it included helped the agency increase its site conversion rate by 19%, adding an impressive £4 million to its annual revenue.