
Drawing the line between the reporting and analysis layers: The true nature of Dashboards


Analysis vs Reporting

There has always been a heated debate as to whether:

  • Analysis and reporting should be separate layers
  • Dashboards should be considered “reporting” or rather fall into their own category (“dashboarding”?)

Both dilemmas are at play in some good blog posts by Gary Angel (“Coupling Analytics and Reporting”), Avinash Kaushik (“Your team’s Data Capture, Data Reporting and Data Analysis effort allocation is 15%, 20%, 65%”) and Tim Wilson (“Why I don’t put recommendations on dashboards”). But I think both can be resolved more effectively by simply bringing a human element into the picture.

Differentiating between dashboards, thoughtful (ad hoc) reporting and in-depth analysis may not serve the analyst’s purpose when his/her role is considered in isolation.

An analyst will require alerts and status updates to help him define the priorities of his analysis efforts, and data visualization tools to explore the data and identify patterns. But those alerts, updates, and visualizations become an indivisible part of the pre-existing analysis environment: a single place to go in search of answers or insights.

Now, a strong need for a separate reporting and insight delivery layer arises when other stakeholders come into play as consumers of metrics and insights (i.e., as decision makers). And this will invariably be the case unless:

  • The analyst is working on his own (becoming the single “consumer” of his output)
  • The data subject to analysis or reporting falls within the highly operational sphere. A need for real-time or near-real-time updates and decisions does away with the distinction, as there is no time to look at the data in a paused, reflective manner (akin to watching your speed and fuel gauge while driving).

How would that data delivery layer take shape? Without any doubt, it should focus on performance rather than the mere availability of data (no matter how well visualized). But, beyond that, why not let every organization choose the most appropriate balance between scalability and manual input?

Consider the following options:

  • Status updates: this is the traditional scorecard. It can be updated automatically the moment new data arrives (e.g., at the start of the month for monthly data)

  • Status updates + quick first answer: this is the basic dashboard, where KPIs are accompanied by second-level metrics, tables and graphs that bring dimensions into the picture as a pre-defined first answer to whatever questions may arise from the KPI’s status

  • Status updates + quick first answer + processed answers: this is the edited dashboard (the analyst’s input comes into play asynchronously), where data and visualizations are subsequently accompanied by words (see the sketch after this list)

  • Workflow-powered dashboard: where not only comments are added, but also suggested actions and the people responsible for carrying them out

  • Ad hoc, manual report: a summary prepared by the analyst to highlight the “what”, “why” and “how” of the data analyzed for a given period. It could take the form of an infographic, or combine words and visualizations in whatever way gets the message across most effectively.
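
To make the contrast between the first and third options concrete, below is a minimal Python sketch of how the two update paths might be decoupled. The KpiTile model and the refresh/annotate functions are hypothetical, for illustration only: the data refresh is fully automated, while the analyst’s words are attached asynchronously.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class KpiTile:
    """One tile on a scorecard or dashboard (hypothetical model)."""
    name: str
    value: Optional[float] = None
    updated_on: Optional[date] = None
    analyst_note: Optional[str] = None  # option 3: words arrive asynchronously

def refresh(tile: KpiTile, new_value: float, as_of: date) -> None:
    """Option 1: the scorecard updates itself the moment new data arrives."""
    tile.value = new_value
    tile.updated_on = as_of
    tile.analyst_note = None  # commentary written for the old data is now stale

def annotate(tile: KpiTile, note: str) -> None:
    """Option 3: the analyst's interpretation is attached later,
    decoupled from the data refresh itself."""
    tile.analyst_note = note

# Data lands first, words follow:
tile = KpiTile("Monthly conversion rate")
refresh(tile, 0.042, date(2015, 2, 1))
annotate(tile, "Dip reflects the end of the January promotion.")
```

The point of the sketch is simply that the automated path and the human path write to the same tile on different clocks; the more a flavor leans on the second path, the less it scales without more analysts.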

5 flavors of performance-driven reporting

In the image above, the more rounded the icon, the more it depends on the analyst’s input for every data update, and the less scalable it becomes at enterprise level.

What are your thoughts? Would you favor one of these options for a particular reason?


Sergio Maldonado

Founder & Chairman at Sweetspot. Author, speaker on analytics, marketing technology, privacy compliance. JD, LLM (Internet law). Once a dually-admitted lawyer. Father of three. I love surfing and cooking.

