Make better business decisions by increasing clarity and confidence in your data.
Auditing your analytics forces you to question the integrity and quality of your data, and it pushes you into conversations about what you’re collecting in the first place. Plenty of things at the account level deserve scrutiny. For example, is your analytics data sampled? The question may be simple, but the answer determines how you deal with your data. Go through your account report-by-report at the highest level, question whether the data makes sense on an intuitive level, and never stop questioning the integrity of your data.
For example, it’s not uncommon for your Google Analytics ecommerce data to differ slightly from that of your ecommerce software. This can happen for a variety of reasons, but you want to make sure the two are within a reasonable degree of accuracy (at least 95% similar). The more similar the data is across different systems, the more solid it is and the more confident you can be in it.
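That cross-system check is easy to make concrete. Here is a minimal sketch of comparing revenue totals from two systems against a 95% similarity threshold; the figures and the `similarity` helper are illustrative, not real data or a standard metric:

```python
# Sketch: compare revenue totals from two systems and check whether
# they fall within a 95% similarity threshold. Numbers are made up.

def similarity(a: float, b: float) -> float:
    """Return the smaller total as a fraction of the larger one."""
    if a == 0 and b == 0:
        return 1.0
    return min(a, b) / max(a, b)

ga_revenue = 98_400.00         # hypothetical Google Analytics total
platform_revenue = 101_250.00  # hypothetical ecommerce platform total

score = similarity(ga_revenue, platform_revenue)
print(f"Similarity: {score:.1%}")
print("Within tolerance" if score >= 0.95 else "Investigate the discrepancy")
```

The same comparison works for transactions, order counts, or any other metric both systems record.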
A lack of correct cross-domain or subdomain tracking can fundamentally fracture your view of the customer journey. Checking for correct tracking starts at the highest levels and goes all the way through your Google Analytics setup. For example, there are a variety of questions to explore within your property settings:
- Is the default URL correct?
- Are your referral exclusions set up correctly (for example, is PayPal excluded if you use it to process payments)?
- Is enhanced link attribution turned on?
- Are Demographics and Interest reports enabled?
- Is Google Search Console properly linked?
Some common technical issues that are relatively easy to fix but hard to spot include:
- Missing tracking code on certain pages, resulting in self-referrals or unrecorded visits.
- Missing tracking code on 404 and 500 error pages.
- setDomainName missing on subdomains, resulting in the referring keyword not being tracked and two visitor sessions being recorded.
- iFrame banner tracking causing double cookies.
- URL fracturing in behavior reports (the same page reported under multiple URL variants).
- Missing manual tags in emails, newsletters, RSS, social, Google Products, and Google News.
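On the last point, manual campaign tags are just query parameters appended to destination URLs. A minimal sketch of tagging a newsletter link with the standard library; the `add_utm` helper and the campaign values are invented for illustration:

```python
# Sketch: append Google Analytics UTM campaign parameters to a link
# destined for an email newsletter. The utm_* values are illustrative.
from urllib.parse import urlencode, urlsplit, urlunsplit, parse_qsl

def add_utm(url: str, source: str, medium: str, campaign: str) -> str:
    """Merge utm_* parameters into the URL's existing query string."""
    parts = urlsplit(url)
    query = dict(parse_qsl(parts.query))
    query.update({"utm_source": source,
                  "utm_medium": medium,
                  "utm_campaign": campaign})
    return urlunsplit(parts._replace(query=urlencode(query)))

tagged = add_utm("https://example.com/sale", "newsletter", "email", "spring_sale")
print(tagged)
```

Merging into the existing query string (rather than blindly appending) avoids duplicate parameters if a link was already tagged.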
Question the importance of the metrics you’re tracking. Oftentimes you’ll come to the sad conclusion that you can’t act upon the data you’re collecting. While this doesn’t negate the power of exploratory data analysis, it does bring the inherent value of certain metrics into question. Always ask yourself, “What does this data mean for the business?” It’s easy to overestimate the importance of a given micro-conversion or micro-metric.
Asking yourself what you can do to get more meaningful results moves you up the digital analytics maturity curve, as you go from questions like “Can I even trust these numbers?” to “What data tracking should be in place for optimal insights and actionability?” Some questions you might ask at this level include:
- Have you mapped out meaningful events for a more comprehensive view of customer behavior?
- Do you have defined outcomes or conversion points, KPIs, and dashboards?
- Are your data sources merged in a way that you can view a larger part of the full customer journey?
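On the last point, merging data sources often comes down to joining records on a shared user identifier. A minimal pure-Python sketch; all field names and records here are invented for illustration:

```python
# Sketch: join web-analytics sessions with CRM records on a shared
# user ID to see more of the customer journey. All data is invented.
analytics_sessions = [
    {"user_id": "u1", "pages_viewed": 12, "source": "organic"},
    {"user_id": "u2", "pages_viewed": 3,  "source": "email"},
]
crm_records = [
    {"user_id": "u1", "lifetime_value": 540.0},
    {"user_id": "u3", "lifetime_value": 120.0},
]

# Index CRM rows by user ID, then attach lifetime value to each session.
crm_by_id = {row["user_id"]: row for row in crm_records}
journey = [
    {**session,
     "lifetime_value": crm_by_id.get(session["user_id"], {}).get("lifetime_value")}
    for session in analytics_sessions
]
for row in journey:
    print(row)
```

A session with no CRM match keeps a `None` lifetime value, which itself is a useful signal: those are visitors your other systems know nothing about.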
Keep in mind that deciding what to track is a balancing act, not a copy/paste recipe. Tracking everything may overwhelm a team better served by focusing on a few key events, while tracking too conservatively may hide key elements of customer behavior.
Pushing out an analysis or report with errors in it both leads to bad business decisions and undermines the credibility of the analyst or analytics team. Start with common sense: Do the numbers look weird? Ask a colleague to QA the data briefly to detect anomalies. Then run a more systematic check:
- Pull up each query and segment in the tool you created it in and walk back through what’s included.
- Re-pull the data using those queries/segments and compare the results against the dataset you used for the analysis.
- Proofread the report for poor grammar, typos, or inadvertently reversed labels.
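The re-pull step above amounts to diffing two exports metric by metric. A minimal sketch that flags any metric drifting beyond a small tolerance; the metric names, values, and 1% threshold are made up for illustration:

```python
# Sketch: compare a fresh data pull against the figures used in the
# analysis and flag mismatches. Metric names and values are invented.
original = {"sessions": 10_500, "transactions": 312, "revenue": 98_400.0}
repulled = {"sessions": 10_498, "transactions": 330, "revenue": 98_400.0}

TOLERANCE = 0.01  # flag anything off by more than 1%

for metric in original:
    a, b = original[metric], repulled[metric]
    drift = abs(a - b) / max(abs(a), abs(b), 1)  # relative difference
    status = "OK" if drift <= TOLERANCE else "CHECK"
    print(f"{metric:>12}: {a} vs {b} -> {status}")
```

Small drift (a session or two) is normal between pulls; a metric flagged CHECK is worth walking back to its query or segment definition.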