Conduct unmoderated remote user testing

Contributors

@ben-labay-2


Business Benefits

Learn how people navigate your site, whether they can find a particular product or feature, and where there are areas of friction.


Choose a testing tool like TryMyUI, UserTesting, Validately, or Userlytics.

Define a scenario that leads users to the area of the website you want to test.

For example: “You’re in the market for a handheld vacuum cleaner. You see an ad for a Dyson product, click on it, and land on this URL.”

Based on the content of the page, define your user intent: informational, navigational, commercial, or transactional.

  • Informational: the user wants to learn about a product.
  • Navigational: the user wants to search and browse a site.
  • Commercial: the user wants to do pre-purchase research.
  • Transactional: the user wants to buy.

Based on your defined intent, create tasks for users.

For example, if you want to know how discoverable your product comparison tool is, the task could be, “Find two hand vacuums priced under $100 and compare their features.”

You could also create a:

  • Broad task: “Find the selection of available handheld vacuum cleaners.”
  • Specific task: “Compare two handheld vacuum cleaners priced under $100, and choose one to purchase.”
  • Funnel completion task: “Purchase the handheld vacuum cleaner you selected.”

Reword leading tasks and instructions like “Use the product comparison tool to choose a hand vacuum cleaner,” which point users to the feature you want them to discover on their own.

Recruit 5–10 testers for each device you want to test.

Use a service or research agency to identify a panel of testers, recruit site visitors to participate, or find users within your own network or company.

Hold one practice run per device to troubleshoot and resolve issues with your test process and technology.

Review the test recording, data, and other session details. Did any of the task language confuse your testers? Did any part of the scenario?

Launch the test to all users.

Watch the session recordings to document areas of friction and record questions that users ask themselves.

Which tasks, or parts of a task, took longer than expected? Where did users have to go back? What copy was unclear? Where did users verbally express confusion or frustration?

Count how many times the issues you identified occurred.

Convert percentages in your report to “_ out of _” statements.

For example, “4 out of 5 tablet users had trouble finding the product comparison tool.” Percentages give a false impression of statistical precision.
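The tallying and reporting steps above can be sketched in a few lines of Python. The session notes and issue labels here are hypothetical examples; in practice they would come from your own annotations of the recordings.

```python
from collections import Counter

def report_issues(sessions):
    """Count how many testers hit each issue and phrase each result
    as an "x out of y" statement instead of a percentage."""
    total = len(sessions)
    counts = Counter()
    for issues in sessions:
        for issue in set(issues):  # count each issue at most once per tester
            counts[issue] += 1
    return [f"{n} out of {total} testers {issue}"
            for issue, n in counts.most_common()]

# Hypothetical notes: one list of observed issues per tester session.
sessions = [
    ["had trouble finding the product comparison tool"],
    ["had trouble finding the product comparison tool", "misread the price filter"],
    ["had trouble finding the product comparison tool"],
    ["misread the price filter"],
    ["had trouble finding the product comparison tool"],
]

for line in report_issues(sessions):
    print(line)
# "4 out of 5 testers had trouble finding the product comparison tool"
# "2 out of 5 testers misread the price filter"
```

Counting each issue once per tester keeps the statements honest: a tester who struggled with the same element three times still counts as one person affected.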