How to overcome the big data trust factor


If data is collected, yet marketers can’t trust it enough to make decisions around it, is it really worth all the effort? If anomalies, gaps and discrepancies are constantly rearing their ugly heads, how can marketers be expected to lay their jobs on the line making decisions around data that may or may not be accurate? We’ve heard these questions countless times and with good reason.

So to dig a bit deeper into this subject, we partnered with Hub’Scan and SmartCurrent to discuss common data accuracy challenges and share tips on how to safeguard your data analysis and reporting practices to ensure the highest standard of data quality. After all, marketers should feel comfortable and confident when using their reports to make better decisions.

What causes mistrust in data?

A number of factors can make marketers uneasy: your numbers may not line up, or you may find unexpected drops or wacky outliers that raise a red flag. But does this necessarily mean you’ve found inaccurate data? The first step is always to consider the factors that could be at play:

  • Perhaps this isn’t actually poor quality data, but a real pattern of behavior that could tell you something about your customers or experience (awesome!)
  • Incongruent results between data collection systems may simply be the result of different measurement definitions. Frustrating as this may be, teams should double check how information is gathered by each system to verify the accuracy of the data. A common example of this is Facebook Ads vs. Google Analytics. The data doesn’t line up because the two tools calculate their metrics differently. Wondering why? Check out State of Digital’s explanation.
  • A data collection issue. In this case teams can collaborate and take steps to locate and amend the problem. This is where tagging solutions and audits come into play.

Each situation above is an opportunity for learning and improvement. But unless you audit your data quality on an ongoing basis, you cannot guarantee it, and data consumers within your organization cannot feel entirely confident consulting the data and using it to make strategic business decisions.

Creating a virtuous cycle of trust

In our webinar, Judah Phillips, founder of SmartCurrent and analytics pioneer, shared his ideas on how to create a virtuous cycle of trust in data. He noted that the implementation of both proactive and reactive best practices, or this virtuous cycle, will allow organizations to tackle data quality and trust at both ends of the spectrum.

A Proactive Approach

The pursuit of higher data quality starts with a proactive approach.

This part of the cycle is aimed at safeguarding one’s data collection efforts from the start. The data layer can be extremely complex and inconsistencies may not stand out to the human eye, especially as business needs grow and change. To make data collection sustainable and accurate, especially in fast-moving and dynamic environments, Judah suggests running automated data audits on a regular basis.
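To make the idea of an automated audit concrete, here is a minimal sketch of one in Python. The tag names and regex patterns below are purely illustrative assumptions, not a real vendor configuration, and a production audit tool like Hub’Scan does far more than this. The sketch simply checks each page’s HTML against a set of expected tag patterns and reports what’s missing:

```python
import re

# Hypothetical audit rules: each maps a tag name to a pattern that
# should appear in every page's HTML. These patterns are illustrative
# placeholders, not a real tagging spec.
AUDIT_RULES = {
    "analytics_snippet": re.compile(r"gtag\('config',\s*'[A-Z0-9-]+'\)"),
    "pixel": re.compile(r"fbq\('init',\s*'\d+'\)"),
}

def audit_page(html):
    """Return the names of expected tags missing from a page's HTML."""
    return [name for name, pattern in AUDIT_RULES.items()
            if not pattern.search(html)]

def audit_site(pages):
    """pages: dict mapping url -> html. Returns {url: [missing tags]}
    containing only the pages that failed the audit."""
    report = {}
    for url, html in pages.items():
        missing = audit_page(html)
        if missing:
            report[url] = missing
    return report
```

Scheduled to run nightly against a crawl of the site, a check like this surfaces broken or missing tags before they quietly poison weeks of reporting, which is exactly the failure mode a manual audit is too slow to catch.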

A full manual data audit, while thorough when human error doesn’t creep in, will likely take weeks, if not months, to complete. Perhaps you’ve considered sampling as a more time-efficient alternative. In practice, however, a sample is often not representative enough to ensure a high standard of data collection, and may even mask sneaky surprises that come back to bite you down the road.

Judah suggests taking the safest and most reliable path: employing a tag management solution to automate the process and save significant amounts of time. Some auditing platforms, like Hub’Scan, even have built-in tag correctors that make it easier to spot and fix tag issues instantly.

This approach is doubly beneficial: data consumers can rest assured that data quality is audited consistently, and analysts gain the freedom to spend their time doing what they do best: analyzing data rather than manually auditing and fixing tags or broken links.

A Reactive Approach

On the other side of the cycle we have a reactive approach to data quality.

This approach allows humans who are attempting to make decisions and act on their data to use their unique experience, knowledge, collective intelligence and expertise to spot anomalies and lapses in data quality.

The most common method of facilitating this approach is through executive-level reporting.

The whole point of a dashboard is to present accurate data that enables decision-making, and stakeholders need to trust that data in order to act on it. But dashboards can also be used to catch anomalies early. Reporting solutions can deliver data with enough context for business stakeholders to interpret it properly and spot irregularities, and they can provide collaborative features that let those individuals react and rebuild trust in the data.

For example, when an executive spots an unforeseen drop in a key metric on their report, they can flag the issue and request verification from an analyst. The analyst can then get back to them with further information:

  • If a data collection issue is uncovered, the analyst can explain the steps taken to resolve it and restore data quality, perhaps through a data audit.
  • Or perhaps the analyst will run further analysis and discover that the anomaly is not an anomaly at all, but simply reflects an abrupt change in consumer behavior caused by a recent marketing action. They could then share a written explanation of what occurred, or annotate an interactive timeline with details on its impact on the metric.
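The kind of “unforeseen drop” an executive might flag can also be surfaced automatically before anyone eyeballs the report. As a minimal sketch (not how any particular reporting product works), here is a simple statistical check that flags any point deviating sharply from the recent baseline of a metric; the window and threshold values are arbitrary assumptions you would tune to your own data:

```python
from statistics import mean, stdev

def flag_anomalies(series, window=7, threshold=3.0):
    """Flag indices where a value deviates more than `threshold`
    standard deviations from the mean of the preceding `window`
    points. A simple illustrative check, not a substitute for a
    full reporting solution."""
    flags = []
    for i in range(window, len(series)):
        baseline = series[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        # Skip perfectly flat baselines (sigma == 0) to avoid
        # flagging every tiny fluctuation.
        if sigma > 0 and abs(series[i] - mu) > threshold * sigma:
            flags.append(i)
    return flags

# A week of stable daily sessions followed by a sudden collapse:
daily_sessions = [100, 102, 98, 101, 99, 103, 100, 40]
print(flag_anomalies(daily_sessions))  # the final point is flagged
```

Whether the flagged point turns out to be a tagging failure or a genuine shift in customer behavior is exactly the question the analyst then investigates, which is the human half of the reactive approach described above.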

To reactively manage data quality and build trust, executives should never leave an outlier unquestioned; it’s vital that they raise their hands any time they doubt a data point. After all, it may not be a quality issue at all, but rather an insight that could help them optimize their strategy. And a reactive approach can only succeed when teams collaborate, share knowledge and take advantage of collective expertise.

[Image: Sweetspot quality assurance dashboard]

Closing the loop

To really put the virtuous cycle in place in your organization, you need a culture that supports both proactive and reactive approaches to building trusted data collection and reporting systems. After all, you want your data to be accurate and actionable, and if stakeholders don’t find it trustworthy, you might as well not have any data at all. So focus on actionability, run audits regularly, and generate reports that help you keep data quality in check. And don’t forget that it’s a cycle for a reason: ensuring data quality is a continuous endeavor that requires constant attention in a dynamic environment.

Check out the full webinar for Judah’s ideas on how to create a virtuous cycle for data trust.


Holly McKendry

Sweetspot Marketing Director. Wakeboarder & travel enthusiast. Communication Studies graduate of Texas State University, San Marcos.

