The Importance of Data Quality to Your Business Analysts

Following data quality best practices requires establishing metrics for the cost of errors so that your business can approach customer data analysis strategically. Both false positives (linking unrelated records) and false negatives (leaving related records unlinked) can damage your business's reputation and represent unrealized marketing opportunities. Uncaught errors can lead to poor business decisions, unnecessary expenditures, and customer dissatisfaction. Some errors are more costly than others, so your metrics must weigh both an error's likelihood and its cost. The business use case, which describes how an outside actor (a customer or partner) achieves a specific goal with your business, is the most important guiding principle for data quality. Accurate, actionable data helps your business analysts deliver more targeted, personalized customer service, increases the ROI of your technology investments, and drives sustained results across teams and platforms.
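Weighing likelihood against cost amounts to computing an expected cost for each error type. The sketch below illustrates the idea in Python; the per-error costs and error rates are hypothetical placeholders, not figures from any real deployment.

```python
# Minimal sketch of an expected-cost metric for record-linkage errors.
# All costs and rates below are hypothetical placeholders; substitute
# figures from your own business case.

FALSE_POSITIVE_COST = 50.0   # e.g., a mailing sent to a wrongly merged account
FALSE_NEGATIVE_COST = 200.0  # e.g., a lost upsell to an unrecognized customer

def expected_error_cost(records: int, fp_rate: float, fn_rate: float) -> float:
    """Expected dollar cost of linkage errors across a batch of records."""
    return records * (fp_rate * FALSE_POSITIVE_COST + fn_rate * FALSE_NEGATIVE_COST)

# Two candidate matching configurations: a stricter matcher trades
# false positives for false negatives, and vice versa.
print(expected_error_cost(100_000, fp_rate=0.001, fn_rate=0.010))  # strict matcher
print(expected_error_cost(100_000, fp_rate=0.008, fn_rate=0.002))  # loose matcher
```

Comparing the two outputs against each other, rather than chasing a zero error rate, is what makes the analysis strategic.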

Measuring Data Quality Best Practices

A paper from the MIT Information Quality program, “An Evaluation Framework for Data Quality Tools,” presents four measures for data quality best practices: completeness, accuracy, consistency, and relevancy. Comprehensive internal processes that check and recheck data can measure completeness and consistency. Verifying accuracy requires comparison with both external and internal data sets. Evaluating relevancy depends on your business’s goals and objectives. Effective data quality tools must reject inaccurate data when it is detected, and must resolve or reject inconsistent data depending on its context.
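To make the first two measures concrete, here is a minimal sketch of completeness and consistency checks over a small customer table. The field names, sample records, and the state-to-zip lookup are illustrative assumptions, not part of the MIT framework.

```python
# Minimal sketch of completeness and consistency checks, assuming a
# simple list-of-dicts customer table; field names are illustrative.

records = [
    {"id": 1, "email": "a@example.com", "state": "AR", "zip": "72201"},
    {"id": 2, "email": None,            "state": "AR", "zip": "90210"},
]

REQUIRED_FIELDS = ("email", "state", "zip")

def completeness(rows):
    """Share of required fields that are populated across all rows."""
    filled = sum(1 for r in rows for f in REQUIRED_FIELDS if r.get(f))
    return filled / (len(rows) * len(REQUIRED_FIELDS))

def consistency(rows, valid_zips_by_state):
    """Share of rows whose zip code agrees with their state."""
    ok = sum(1 for r in rows if r["zip"] in valid_zips_by_state.get(r["state"], set()))
    return ok / len(rows)

print(completeness(records))                             # 0.833...
print(consistency(records, {"AR": {"72201", "72202"}}))  # 0.5
```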

Asking questions relevant to the business use case can improve your system’s data quality best practices:

  • What data sets have the biggest impact on business processes?
  • What data points should be corrected first?
  • What data cleansing criteria will return the greatest benefit for the cost? (See the sketch following this list.)
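The last question lends itself to a rough benefit-per-cost ranking. The sketch below is a minimal illustration; the rules and dollar figures are hypothetical placeholders you would replace with estimates from your own operation.

```python
# Minimal sketch of prioritizing cleansing criteria by benefit per unit
# of cost; the rules and figures are hypothetical placeholders.

cleansing_rules = [
    {"name": "standardize addresses", "benefit": 40_000, "cost": 5_000},
    {"name": "dedupe phone numbers",  "benefit": 15_000, "cost": 1_000},
    {"name": "verify emails",         "benefit": 12_000, "cost": 4_000},
]

# Rank the rules so the highest-return cleansing work happens first.
for rule in sorted(cleansing_rules, key=lambda r: r["benefit"] / r["cost"], reverse=True):
    print(f'{rule["name"]}: {rule["benefit"] / rule["cost"]:.1f}x return')
```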

Your business analysts should be at the forefront of your efforts to leverage data quality tools, master data management systems, and data governance for a disciplined marketing strategy that delivers results. That’s why it’s important to take a long-term, strategic approach to data quality rather than constantly reacting to problems as they arise. Fixing mistakes after the fact can cost far more than the upfront investment in data quality tools that catch errors in advance. Inaccurate data can mean missed opportunities and decisions based on wrong information, exposing your business to legal and financial risk.

Customizing Data Quality Best Practices

In record linking, there is always a balance between false negative and false positive errors, and business context is the key to striking it. For marketers and business analysts, the goal is to eliminate false negatives: they want to link every account they can so that their contact lists and sales campaigns are as efficient as possible. They care less about false positives, since losing a valid contact is worse than keeping a spurious one.

In other industries, false positives are the bigger problem; an example would be attaching the wrong address or phone number to an account. In healthcare, that mistake can deliver personal data to the wrong person, violating HIPAA privacy requirements.
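One common way to express this trade-off is a match-score threshold: lowering it catches more true links (fewer false negatives) at the price of more spurious ones (more false positives). The sketch below uses a crude name-similarity score; the scoring function, records, and thresholds are illustrative assumptions, not the actual matching logic of any particular product.

```python
# Minimal sketch of the false-positive / false-negative trade-off in
# record linking; the score and thresholds are illustrative.

from difflib import SequenceMatcher

def match_score(a: dict, b: dict) -> float:
    """Crude name-similarity score between two customer records."""
    return SequenceMatcher(None, a["name"].lower(), b["name"].lower()).ratio()

a = {"name": "Jon Smith"}
b = {"name": "John Smith"}
score = match_score(a, b)  # ~0.95

# A marketing team tuned against false negatives links aggressively...
MARKETING_THRESHOLD = 0.80
# ...while a healthcare system tuned against false positives demands near-certainty.
HEALTHCARE_THRESHOLD = 0.98

print(score >= MARKETING_THRESHOLD)   # True: records linked
print(score >= HEALTHCARE_THRESHOLD)  # False: records kept separate
```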

Profiling incoming data helps to avoid costly surprises. If analysis of a new data set shows a high error rate, it likely contains even more uncaught errors. In that case, you must decide whether to accept the data set in spite of the errors, perform additional cleansing, or reject it outright. Rejecting a poor-quality data set may cost less in the long run than using it. So think of data quality tools as an integral part of your business strategy: they help you avoid embarrassing and costly errors and give your business analysts the resources they need to drive long-term, sustainable growth.
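As a minimal illustration, the profiling step can be reduced to measuring an observed error rate and routing the data set accordingly. The validator and the accept/cleanse/reject thresholds below are hypothetical placeholders.

```python
# Minimal sketch of profiling an incoming data set before accepting it;
# the validator and triage thresholds are hypothetical placeholders.

import re

def profile(rows):
    """Return the share of rows failing a basic email validation check."""
    email_ok = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")
    errors = sum(
        1 for r in rows
        if not r.get("email") or not email_ok.match(r["email"])
    )
    return errors / len(rows)

def triage(error_rate):
    """Decide what to do with a data set based on its observed error rate."""
    if error_rate < 0.02:
        return "accept"
    if error_rate < 0.10:
        return "cleanse further"
    return "reject"

rows = [{"email": "a@example.com"}, {"email": "not-an-email"}]
print(triage(profile(rows)))  # error rate 0.5 -> "reject"
```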

Black Oak Analytics employs High Performance Entity Resolution (HiPER), an entity identity information management (EIIM) system that uses entity matching and resolution to help you identify and market to a targeted audience more efficiently.

If you are interested in learning more about how your business can more effectively follow data quality best practices, contact Black Oak Analytics today.
