Improve user experience and engagement on low-traffic sites.
Test ideas that are backed by previous data or research.
Don’t test random ideas; this is rarely fruitful and can waste resources. Aggregated patterns, whether derived from your own research, UX studies, or projects like Good UI, can be great starting points for prioritization.
Use customer feedback, net promoter scores, and sales to listen to the voice of your customers.
Perform a more in-depth analysis of your funnel and exit pages.
Before planning your A/B test, use this research to develop a more specific target.
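As a sketch of the kind of basic funnel analysis meant here, the snippet below takes per-step visitor counts (the step names and numbers are hypothetical, not from the text) and computes step-to-step conversion and drop-off rates, which is often enough to spot the leakiest page:

```python
# Illustrative funnel analysis: step names and counts are hypothetical.
funnel = [
    ("Landing page", 10_000),
    ("Product page", 4_200),
    ("Cart", 1_100),
    ("Checkout", 640),
    ("Purchase", 410),
]

def funnel_report(steps):
    """Return (step, visitors, conversion-from-previous-step, drop-off) rows."""
    rows = []
    prev = None
    for name, count in steps:
        conv = count / prev if prev else 1.0
        rows.append((name, count, conv, 1.0 - conv))
        prev = count
    return rows

for name, count, conv, drop in funnel_report(funnel):
    print(f"{name:<14} {count:>6}  conv {conv:6.1%}  drop-off {drop:6.1%}")
```

The step with the largest drop-off is a natural candidate for the "more specific target" above.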
Use tools like Formisimo to analyse the performance of your forms.
Invest resources in micro-conversions for steady, incremental wins.
For example, look at pages per visit, clicks, and traffic sources.
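One way to get metrics like pages per visit and click rate out of raw event data is a simple aggregation; the event log below is made up purely for illustration:

```python
from collections import defaultdict

# Hypothetical raw events: (session_id, event_type). Real data would come
# from your analytics export; these values are illustrative only.
events = [
    ("s1", "pageview"), ("s1", "pageview"), ("s1", "click"),
    ("s2", "pageview"),
    ("s3", "pageview"), ("s3", "pageview"), ("s3", "pageview"), ("s3", "click"),
]

def micro_conversion_stats(events):
    """Pages per visit and the share of sessions with at least one click."""
    pages = defaultdict(int)
    clicked = set()
    for session, kind in events:
        if kind == "pageview":
            pages[session] += 1
        elif kind == "click":
            clicked.add(session)
    sessions = set(pages) | clicked
    pages_per_visit = sum(pages.values()) / len(sessions)
    click_rate = len(clicked) / len(sessions)
    return pages_per_visit, click_rate

ppv, cr = micro_conversion_stats(events)
print(f"pages/visit: {ppv:.2f}, sessions with a click: {cr:.0%}")
```

Tracked over time, these micro-conversion numbers give you the "steady wins" signal even when purchase volume is too low to test directly.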
Consider running tests at a lower statistical-significance threshold.
For example, a test run to 95% significance might need 350k visits per variation, about 70 days of testing, while the same test at 80% significance might need only 300k visits, about 60 days.
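The trade-off between the significance threshold and test duration can be explored with the standard two-proportion sample-size formula. This is a sketch: the baseline rate, detectable lift, power, and daily traffic below are assumed values for illustration, not figures from the text:

```python
import math
from statistics import NormalDist

def sample_size_per_variation(p_base, lift, alpha=0.05, power=0.80):
    """Sample size per variation for a two-sided two-proportion z-test."""
    p2 = p_base * (1 + lift)
    p_bar = (p_base + p2) / 2
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # critical value for alpha
    z_beta = NormalDist().inv_cdf(power)           # critical value for power
    num = (z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
           + z_beta * math.sqrt(p_base * (1 - p_base) + p2 * (1 - p2))) ** 2
    return math.ceil(num / (p_base - p2) ** 2)

daily_visits_per_variation = 5_000  # assumed traffic split
for alpha in (0.05, 0.20):          # 95% vs 80% significance
    n = sample_size_per_variation(p_base=0.03, lift=0.10, alpha=alpha)
    days = n / daily_visits_per_variation
    print(f"alpha={alpha}: {n:,} visits/variation, about {days:.0f} days")
```

Loosening alpha shrinks the required sample, and therefore the number of testing days, at the cost of a higher false-positive rate.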
Test only your top pages to focus your efforts and resources.
Test radical changes to avoid inconclusive test results.
Launch changes and simply compare before and after results using time series analysis.
Tools like Causal Impact can help estimate effects on changes that can’t be run via controlled experiments. Otherwise, simply launch changes based on your research or intuition and keep an eye on the time series data to make sure things are improving. Tie this together with qualitative guardrails to make sure you’re not breaking the user experience with any new change. For example, session replay, event tracking, and heat mapping can provide useful insights and even quality assurance for CRO.
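When a full Causal Impact model is more than you need, a minimal before/after comparison on the time series might look like the sketch below; the daily conversion counts are invented for illustration:

```python
from statistics import mean, stdev

# Hypothetical daily conversion counts; the change launched after day 14.
before = [41, 38, 44, 40, 39, 42, 45, 37, 40, 43, 41, 39, 44, 42]
after  = [47, 49, 44, 51, 48, 46, 50, 52, 45, 49, 48, 51, 47, 50]

def before_after_summary(before, after):
    """Mean shift and a rough effect size (shift / pre-period stdev)."""
    shift = mean(after) - mean(before)
    return shift, shift / stdev(before)

shift, effect = before_after_summary(before, after)
print(f"mean shift: {shift:+.1f} conversions/day ({effect:.1f} pre-period SDs)")
```

Note that a naive comparison like this does not control for trend or seasonality, which is precisely what Causal Impact's counterfactual forecast adds, so treat it as a monitoring signal rather than proof of causation.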