Develop an analysis framework and choose the right tooling

Business Benefits

Develop a methodology for working with analysis tools in your organization, and choose the right tools based on hard criteria that support it.


[Conduct end-user interviews](https://cxl.com/institute/playbook/conduct-user-interviews/) with stakeholders such as Product Owners, Business Analysts, and Team Leads to determine analytical requirements based on business needs.

Force stakeholders to rank all business analytical needs by urgency, importance, and business impact. For example, “average page views per month” is not as impactful, urgent, or important for the business as “subscription churn per plan type for the last 12 months, segmented by signup date to the product and monthly/yearly plans”.

[Determine the right model of tooling](https://cxl.com/institute/playbook/choose-an-analytics-tool/) by assessing each need based on self-service feasibility.

Here you will assess each business need in the context of how easily it can be enabled in a self-service analytics model for your organization.

Use a ranking system, such as “ABC” (where A = easily accessible), to rank each business need.

Example: “subscription churn per plan type for the last 12 months, segmented by signup date to the product and monthly/yearly plans” is ranked as “B = moderately accessible, with some training”

Create a list of all the business analytical needs you collected as rows in a single spreadsheet, ranked by those three factors (urgency, importance, and business impact).

Sort the list by business impact, then by importance and urgency (the Eisenhower Matrix). Spend 80% of the time you have allocated to this initiative on bringing the top 20% of items to reality; invest the remaining 20% of your time on the other 80%.
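If you want to prototype this prioritization outside of a spreadsheet, the sketch below shows one way to sort and split the list. The needs, the 1-3 scoring scale, and the scores themselves are hypothetical; adapt them to whatever ranking system you agreed on with your stakeholders.

```python
# Minimal sketch: prioritize analytical needs by business impact, then importance and
# urgency, and split the sorted list into a "focus" top 20% and a "later" bottom 80%.
needs = [
    # Hypothetical scores on a 1-3 scale (3 = highest)
    {"need": "Subscription churn per plan type, last 12 months", "impact": 3, "importance": 3, "urgency": 2},
    {"need": "Average page views per month", "impact": 1, "importance": 1, "urgency": 1},
    {"need": "Trial-to-paid conversion by signup channel", "impact": 3, "importance": 2, "urgency": 3},
]

# Sort descending: business impact first, then importance, then urgency.
ranked = sorted(needs, key=lambda n: (n["impact"], n["importance"], n["urgency"]), reverse=True)

cutoff = max(1, round(len(ranked) * 0.2))  # top 20% of items
focus_items = ranked[:cutoff]              # gets ~80% of your allocated time
later_items = ranked[cutoff:]              # gets the remaining ~20%

for item in focus_items:
    print("Focus:", item["need"])
```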

Determine the most common self-service model that is suitable for your organization

Based on the entire list of your stakeholders’ analytical needs, determine which self-service model is most appropriate, using your ranking system (“ABC” or otherwise).

Example: nearly 90% of your analytical needs are categorized as “B = moderately accessible, with some training”.
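To find the most common model, simply tally the rank assigned to each need. A quick sketch, again with hypothetical ranks:

```python
from collections import Counter

# Hypothetical ABC ranks assigned to each analytical need
ranks = ["B", "B", "A", "B", "C", "B", "B", "A", "B", "B"]

counts = Counter(ranks)
most_common_rank, count = counts.most_common(1)[0]
print(f"{most_common_rank}: {count / len(ranks):.0%} of needs")  # e.g. "B: 70% of needs"
```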

Create a list of all the features you need for the self-service model as rows in a spreadsheet.

Each self-service model will require different types of tools. An org that only does complex, analyst-accompanied analysis work, where stakeholders receive reports created by a central team, might opt for a SQL-based reporting solution directly on the data warehouse, while another org might choose a lightweight tool such as Redash to give users across the company an easy way to create simple analyses themselves.

Cluster the features into categories such as “Financials”, “User Enablement”, “Data Visualisation”, etc.

Example: the feature “users can easily share reports with other users in the org” belongs in the “User Enablement” category.

Find the tools that fit your self-service scenario, add them as columns to your spreadsheet, and rank how well each one fulfills each feature

For each tool, rate in each cell how well it fulfills the feature, for example: 0 = not at all, 1 = somewhat, 2 = fully.

Example: Heap Analytics scores “2” for “users can easily share reports with other users in the org” because it has a good report-sharing feature.
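Once the matrix is filled in, you can total the scores per tool, optionally weighting the features that matter most to your stakeholders. A minimal sketch with made-up tools, features, scores, and weights:

```python
# Minimal sketch: total each tool's feature scores (0 = not at all, 1 = somewhat, 2 = fully).
# The tools, features, scores, and weights below are illustrative only.
scores = {
    "Tool A": {"report sharing": 2, "self-serve dashboards": 1, "SQL access": 2},
    "Tool B": {"report sharing": 1, "self-serve dashboards": 2, "SQL access": 0},
}

# Optional weights for the features your stakeholders care about most.
weights = {"report sharing": 2, "self-serve dashboards": 1, "SQL access": 1}

totals = {
    tool: sum(weights.get(feature, 1) * score for feature, score in features.items())
    for tool, features in scores.items()
}

# Print tools from highest to lowest total score.
for tool, total in sorted(totals.items(), key=lambda t: t[1], reverse=True):
    print(tool, total)
```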

Set a meeting and align with your stakeholders on which tool best fits all criteria.

Your desired outcome for the meeting should be a decision on which tool you will then “test drive.”

Run a simple proof-of-concept with the tool you selected on a smaller subset of your business/product.

If possible, go beyond a limited 7/14-day trial and get at least one month of hands-on experience. The test drive should involve as many of your stakeholders as possible, but MUST include the users who will work with these tools daily (not just C-level executives or managers).

Repeat the alignment meeting and proof of concept if the tool doesn’t meet expectations; otherwise, purchase and implement your selected tool.

Some possible reasons for a “bust” could be:

  • The required budget is higher than expected;
  • Customer support was not good;
  • The tool’s interface was poor;
  • Feature X didn’t fulfill the functionality you intended.
