Use the ResearchXL Framework to come up with more winning tests

Contributors

@paul-boag


Business Benefits

Develop more winning tests that have a higher impact on your bottom line.


Start with heuristic analysis to identify areas of interest.

Assess each page on the site that is critical to conversion for the following:

  • Relevancy: Does the page meet user expectations in terms of content and design? How can it match what they want even more?
  • Clarity: Is the content or offer on this page as clear as possible? How can it be simplified or made clearer?
  • Value: Does the page communicate value to the user? Can it be communicated better? Can user motivation be increased?
  • Friction: Is anything on the page causing doubts, hesitations, and uncertainties? What makes the process difficult? How can it be simplified?
  • Distraction: Is there anything on the page that doesn’t help users take action? Does anything unnecessarily draw attention?
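To keep findings consistent across pages, the five criteria can be captured in a simple structured checklist. A minimal Python sketch (the page URL and note below are hypothetical examples, not part of the framework):

```python
# Minimal sketch of a per-page heuristic assessment record.
# The five criteria mirror the list above.

CRITERIA = ["relevancy", "clarity", "value", "friction", "distraction"]

def new_assessment(page_url):
    """Return an empty assessment: one findings list per criterion."""
    return {"page": page_url, "findings": {c: [] for c in CRITERIA}}

def add_finding(assessment, criterion, note):
    """Record an observation under one of the five criteria."""
    if criterion not in CRITERIA:
        raise ValueError(f"Unknown criterion: {criterion}")
    assessment["findings"][criterion].append(note)

# Usage with a hypothetical page and finding:
a = new_assessment("/pricing")
add_finding(a, "clarity", "Plan names do not explain what each tier includes")
```

Forcing every note under one of the five criteria is what keeps the audit free of the "random comments" warned against below.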

Avoid random comments, strictly stick to assessing pages for the above criteria, and document your findings.

Conduct cross-browser testing, cross-device testing, and speed analysis, and fix all technical bugs discovered.

  • Cross-browser and cross-device testing: Open up Google Analytics and go to Audience > Technology > Browser & OS report. Apply device segments, narrow your selection down to a specific browser version, find out if particular browsers convert less than others, and brainstorm possible reasons why.
  • Speed analysis: Open up Google Analytics and go to Behavior > Site Speed > Page Timings and turn on the Comparison to spot slower pages. Starting with the ones that have the most traffic, mark down all URLs that take longer than 10 seconds to load, and click the individual URLs for a list of all issues found.
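Once the Page Timings report is exported (for example as a CSV), flagging slow URLs is easy to script. A sketch, assuming a CSV export with `Page` and `Avg. Page Load Time (sec)` columns already ordered by traffic — the column names are assumptions for illustration, not a fixed Google Analytics export format:

```python
import csv
import io

SLOW_THRESHOLD_SEC = 10  # per the guideline above

def flag_slow_pages(csv_text):
    """Return URLs whose average load time exceeds the threshold,
    preserving input order (e.g. already sorted by pageviews)."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [row["Page"] for row in reader
            if float(row["Avg. Page Load Time (sec)"]) > SLOW_THRESHOLD_SEC]

# Hypothetical export:
sample = """Page,Avg. Page Load Time (sec)
/home,3.2
/catalog,12.5
/checkout,10.8
"""
print(flag_slow_pages(sample))  # ['/catalog', '/checkout']
```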

Analyze your web analytics data to determine the what, where, and how much of any problems you’re facing.

You need to know in advance what you want to learn and what you are going to do or change based on the answer. If nothing, then it’s not worth testing. Before the analysis, make sure your data is accurate and that everything important is being tracked.

Start with an analytics health check if your Google Analytics setup was done by someone other than yourself. In particular, make sure it gathers all the data needed to make an informed decision about any test you run.

Perform mouse tracking analysis to gain better insights into what your users do and how they behave on your site.

Use heat map sample sizes of at least 2000-3000 views per page/screen before fully trusting any results.

A few useful types of heat maps and mouse tracking analysis to look at include:

  • Click maps: Visual representations of where people click. Click maps help identify images or text that people think are links or want to be links.
  • Scroll maps: Scroll maps show you how far people scroll down and are useful for prioritizing content and identifying where you need to tweak your design.
  • User session replays: Similar to user testing, but without the script or audio. Session replays give you valuable insights into how your actual visitors interact with your site.
  • Form analytics: Not exactly mouse tracking, but mouse tracking tools like Hotjar, Inspectlet, and Contentsquare offer this feature. Form analytics tools analyze form performance down to individual form fields, and form optimization is a key part of CRO.

Use tools like Hotjar or Qualaroo to conduct on-site surveys and customer surveys, to gain qualitative insights on your customers.

Use either exit surveys or on-page surveys to survey on-site users. Ask survey questions about the particular job the page is meant to perform, and find out whether users experience any fears, uncertainties, and doubts (FUDs) or friction while on the specific page.

Send email surveys to recent, first-time buyers to determine whether they experienced any friction during the buying process. Aim for 100-200 responses. Don’t ask yes/no questions and avoid multiple choice whenever possible.

Conduct user testing to determine how actual people interact with your website and their thought process while using it.

Don’t ask for testers’ opinions, just observe what they do and pay attention to what they say and experience. Include 3 types of tasks in your test protocol:

  • A specific task.
  • A broad task.
  • A funnel completion.

Conduct copy testing to learn how your audience perceives your copy.

Recruit 15-20 people from your audience, formulate research questions about your copy, conduct interviews, and compensate panelists. Alternatively, use a tool like Wynter to do this for you. Look for answers to the questions:

  • What does your headline make your audience feel?
  • Do they care about the arguments you’re making?
  • Which benefits are they most interested in?
  • Do they even understand the copy in a specific paragraph?
  • After reading everything, what remains unclear and what objections do they have?

Add all identified issues to a master action sheet and allocate them to buckets.

Allocate every issue identified into one of these 5 buckets:

  • Test. Obvious opportunities to shift behavior, expose insight, or increase conversions.
  • Instrument. Anything analytics-related. This can involve fixing, adding, or improving tag or event handling on the analytics configuration. Instrument both structurally and for insight into the pain points you’ve identified.
  • Hypothesize. This is where you add any pages, widgets, or processes you’ve found that just aren’t working well but can’t identify a single, clear solution. Use these to brainstorm hypotheses and test plans driven by evidence and data.
  • Just Do It – JFDI. Issues that can be easily fixed or changed with minimal effort or are micro-opportunities to increase conversions. Items in this bucket can either be deployed in a batch or as part of a controlled test.
  • Investigate. Anything you need to ask questions about or do further digging, such as findings requiring testing with particular devices or more information to triangulate the problem.

Mark every issue with a star rating, based on ease of implementation and potential impact.

Use a scoring system from 1 to 5. 1 indicates a minor issue with low potential revenue or conversion value, but still worth fixing, and 5 indicates a critically important issue likely to have a significant impact if fixed.

Give more weight to issues that affect a large portion of your visitors, like problems on high-traffic pages or in the checkout funnel.
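One way to combine the star rating with reach is a simple weighted score. A sketch — the traffic-share multiplier is an illustrative choice of weighting, not a formula prescribed by the framework:

```python
def priority_score(star_rating, traffic_share):
    """star_rating: 1-5 per the scale above.
    traffic_share: fraction of visitors affected (0.0-1.0).
    Higher scores surface issues that are both severe and widely seen."""
    if not 1 <= star_rating <= 5:
        raise ValueError("star rating must be between 1 and 5")
    if not 0.0 <= traffic_share <= 1.0:
        raise ValueError("traffic share must be between 0 and 1")
    return round(star_rating * (1 + traffic_share), 2)

# A severe checkout issue seen by 40% of visitors outranks
# an equally severe issue on a low-traffic page:
print(priority_score(5, 0.40))  # 7.0
print(priority_score(5, 0.05))  # 5.25
```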

Add every issue you’ve identified to a spreadsheet and start with the things that make an immediate positive impact.

Create a spreadsheet with 7 columns:

  • Issue: What is the problem you have identified?
  • Bucket: Which bucket did you allocate the problem to (see step 8)?
  • Location: Where on the site is the problem?
  • Background: Any additional context relating to the problem.
  • Action: What specific next step needs taking?
  • Rating: How urgent is the problem (see step 9)?
  • Person Responsible: Who is going to be responsible for addressing the problem?
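The seven-column sheet is easy to start as a plain CSV. A minimal sketch, with hypothetical issue data — any spreadsheet tool works just as well:

```python
import csv
import io

# The seven columns described above.
COLUMNS = ["Issue", "Bucket", "Location", "Background",
           "Action", "Rating", "Person Responsible"]

def write_action_sheet(issues):
    """Serialize issue dicts to CSV text, highest rating first,
    so the sheet starts with the highest-impact items."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=COLUMNS)
    writer.writeheader()
    for issue in sorted(issues, key=lambda i: i["Rating"], reverse=True):
        writer.writerow(issue)
    return buf.getvalue()

# Hypothetical entries:
sheet = write_action_sheet([
    {"Issue": "Coupon field distracts in checkout", "Bucket": "Test",
     "Location": "/checkout", "Background": "High exit rate on step 2",
     "Action": "A/B test hiding field behind a link", "Rating": 5,
     "Person Responsible": "Ana"},
    {"Issue": "Broken event tracking on signup", "Bucket": "Instrument",
     "Location": "/signup", "Background": "No data since March",
     "Action": "Fix the event tag", "Rating": 4,
     "Person Responsible": "Ben"},
])
print(sheet.splitlines()[1])  # the highest-rated issue comes first
```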