Understand the behavior of users and improve your A/B testing process.
List the pages you would like to test and develop hypotheses about how they can be improved. Design alternative versions of the pages to use in your experiments. Define the experiment goals that will let you evaluate the results, for example purchase conversion rate, bounce rate, and form submission rate.
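The three example goals can be computed as simple per-session ratios. A minimal sketch, assuming each session is represented as a dict with hypothetical fields `purchased`, `pages_viewed`, and `form_submitted` (real analytics exports will use their own schemas):

```python
def experiment_metrics(sessions):
    """Compute the example goal metrics for one page variant."""
    n = len(sessions)
    if n == 0:
        return {"conversion_rate": 0.0, "bounce_rate": 0.0, "form_submission_rate": 0.0}
    return {
        # Purchase conversion rate: share of sessions that ended in a purchase.
        "conversion_rate": sum(s["purchased"] for s in sessions) / n,
        # Bounce rate: share of sessions that viewed only a single page.
        "bounce_rate": sum(s["pages_viewed"] == 1 for s in sessions) / n,
        # Form submission rate: share of sessions that submitted the form.
        "form_submission_rate": sum(s["form_submitted"] for s in sessions) / n,
    }
```

Computing each goal per variant, from the same session data, keeps the comparison between versions consistent.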
Tools such as Hotjar, CrazyEgg, and Matomo offer A/B testing and session recording capabilities. Mouseflow is an alternative session replay tool that integrates with A/B testing tools like Optimizely or Unbounce.
Launch the experiments and evaluate the main goals, such as purchase conversion rate, bounce rate, and form submission rate, using the A/B testing tools. Use the results to determine which version of the page performed better.
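"Worked better" should mean a statistically significant difference, not just a higher raw number. One common way to check this for conversion-style metrics is a two-proportion z-test; the sketch below uses only the standard library and is an illustration, not the specific method any of the tools above implements:

```python
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Compare conversion counts of two page variants.

    conv_a/conv_b: number of converting sessions; n_a/n_b: total sessions.
    Returns (z, p) for a two-sided two-proportion z-test.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the null hypothesis of no difference.
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF.
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p

# Hypothetical numbers: 120/1000 conversions on A vs. 150/1000 on B.
z, p = two_proportion_z_test(120, 1000, 150, 1000)
```

If `p` falls below your chosen significance level (commonly 0.05), the difference between the variants is unlikely to be random noise.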
Use session recordings to better understand how users behave on the underperforming pages.
- Analyze how far users scroll down the page. Check whether any crucial information is missing above the fold.
- Analyze how users interact with large blocks of text. Do they pause and take time to read the text or just scan it and scroll down further?
- Analyze whether any elements in the navigation distract users. Do they leave the page prematurely by clicking on a link?
- Analyze how the users interact with the videos and images. Do they view the product pictures in the gallery? Do they click to watch the product video? How long do they watch the video?
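Observations like scroll depth are easier to act on when aggregated across many recordings. A hedged sketch, assuming you can export each session's maximum scroll depth as a fraction of page height (0.0 to 1.0; replay tools use their own export formats):

```python
def scroll_depth_summary(max_depths, thresholds=(0.25, 0.5, 0.75, 1.0)):
    """For each threshold, the share of sessions that scrolled at least that far.

    max_depths: one max-scroll fraction per recorded session.
    """
    n = len(max_depths)
    if n == 0:
        return {t: 0.0 for t in thresholds}
    return {t: sum(d >= t for d in max_depths) / n for t in thresholds}
```

A sharp drop between two thresholds points to where users lose interest, which suggests where above-the-fold content or page length should be reexamined.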
Document your findings based on session recording observations and develop additional hypotheses for testing.
Brainstorm the hypotheses with your team to come up with new experiment ideas. Launch new experiments based on these ideas and verify your hypotheses using session recordings.