Business Benefits
Understand how your target audience thinks about and interacts with your product.
Determine which part of your product you want to test and define a specific test objective.
User testing should be specific to a particular feature, use case, or part of the product experience. "I want to watch users in my application" is too general; "I want to watch first-time users interact with new feature X to perform task Y" or "I want to see users over age 50 unbox and assemble my product" are more specific and testable.
Recruit test subjects who will help you achieve your test's objectives.
For example, if you want to test how brand-new users will interact with your dashboard, consider asking prospective customers on your email list. Existing customers also make great test participants if they fall into the segment you're targeting: power users vs. newbies, users from big companies vs. solopreneurs, and so on.
If you can't find your ideal test participants, don't worry too much: simply seeing your product through the eyes of someone who doesn't work at your company provides a crucial fresh perspective. Friends, family, and office neighbors are all fair game.
Aim for 5-15 test participants.
Five participants will surface many of the scenarios, challenges, and questions you would encounter at scale, and results tend to get repetitive beyond 15 participants.
Design a variety of test scenarios that give testers enough information to complete a task or series of tasks with your product without leading them.
Share as little information as possible so the test fairly assesses the situation without influencing your subjects.
Create a testing process that minimizes variance. Design your testing environment so that each participant sees, hears, and experiences as nearly the same thing as possible.
Script your test introduction, scenario descriptions, and questions, and keep your prototype or application consistent across sessions to minimize the variance your test subjects experience.
After a subject has completed a test scenario or task, ask follow-up questions designed to produce open-ended responses.
For example:
- I noticed you did X; can you explain why?
- What did you think of the experience of using this feature?
- What was unclear about performing this task?
- What questions do you have?
Compile and analyze the data created by your tests, focusing on patterns, unexpected outcomes, and ancillary learnings.
- When compiling your data, look for patterns: did several testers have trouble with a particular part of your app or product? Did they ask similar questions?
- Since user testing often reveals problems and opportunities beyond what you were directly testing, be sure to comb through your data for unexpected user-feedback gold!