Ten mistakes to avoid in data quality management

By I Huntly, Rifle-Shot Performance Holdings


Published in:

Electricity+Control March 2014 (pages 4 – 6)

Enquiries: Ian.Huntly@rsph.co.za




The corporate data universe consists of numerous databases linked by countless data interfaces. While the data continuously moves about and changes, the databases and the programs responsible for data exchange are endlessly redesigned and upgraded. Typically, this dynamic results in information systems getting better while data quality deteriorates. This is unfortunate, since quality is what determines data's intrinsic value to businesses and consumers.

Yet we tolerate enormous inaccuracies in databases and accept that most of them are riddled with errors, while corporations lose millions of dollars because of flawed data. Even more disheartening is the continuous growth in the magnitude of data quality problems, fostered by the exponential increase in the size of databases and the further proliferation of information systems.

The following are the ten mistakes to avoid in data quality management.


Inadequate staffing of data quality teams

A data quality management team must include IT specialists and business users, and it also needs dedicated data quality experts.


Hoping data will simply ‘get better’

One of the key misconceptions is that data quality management will improve by itself as a result of general IT advancements. The only way to address the data quality management challenge is through a systematic, on-going programme that assesses and improves existing data quality levels.


Lack of data quality management assessment

Assessment is the cornerstone of any data quality management programme. It describes the state of the data and advances understanding of how well the data supports various processes.


Narrow focus

Data quality management programmes should focus equally on all of the company’s data.


Bad metadata

Data quality management programmes should start with extensive data profiling, which establishes what the data actually contains rather than what the documentation claims.
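
Profiling of this kind can be sketched in a few lines of Python. The `customers` list below is a hypothetical extract standing in for rows from a production table; the point is that null rates, distinct counts and top values are the raw facts from which corrected metadata is built.

```python
from collections import Counter

def profile_column(records, column):
    """Profile one column: null rate, distinct count and the most
    frequent values found in the actual data."""
    values = [row.get(column) for row in records]
    non_null = [v for v in values if v not in (None, "")]
    counts = Counter(non_null)
    return {
        "null_rate": 1 - len(non_null) / len(values),
        "distinct_values": len(counts),
        "top_values": counts.most_common(3),
    }

# Hypothetical extract standing in for rows from a production table.
customers = [
    {"country": "ZA"}, {"country": "ZA"}, {"country": ""},
    {"country": "UK"}, {"country": None},
]
print(profile_column(customers, "country"))
# Reveals, for instance, a 40 % null rate that the documented
# metadata may never have mentioned.
```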


Ignoring data quality management during data conversions

The ideal data conversion project begins with data analysis, comprehensive data quality assessment and data cleansing.


Winner-loser approach in data consolidation

The correct approach to data consolidation is to view it in the same light as data cleansing: merge the best attributes from each duplicate record, rather than declaring one record the winner and discarding the rest.
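
A minimal sketch of that attribute-by-attribute merge, using two hypothetical customer duplicates: for each field the first usable value wins, so good data from a 'losing' record survives instead of being discarded wholesale.

```python
def consolidate(duplicates):
    """Merge duplicate records field by field, keeping the first
    non-empty value found for each attribute."""
    merged = {}
    for record in duplicates:
        for key, value in record.items():
            if value not in (None, "") and merged.get(key) in (None, ""):
                merged[key] = value
    return merged

# Two hypothetical copies of the same customer, each partially complete.
crm_copy = {"name": "J. Smith", "phone": "", "email": "jsmith@example.com"}
billing_copy = {"name": "John Smith", "phone": "+27 11 555 0101", "email": ""}
golden = consolidate([crm_copy, billing_copy])
# The merged record carries the phone number only the billing copy had
# and the e-mail address only the CRM copy had.
```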


Inadequate monitoring of data interfaces

The solution to interface monitoring is to design programs that sit between the source and target databases and analyse the interface data before it is loaded and processed.
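
Such a screening program can be as simple as the sketch below. The rules shown (for an imagined payments interface) are assumptions; the pattern is that rows failing any rule are quarantined with the names of the rules they broke, instead of being loaded into the target database.

```python
def screen_batch(rows, rules):
    """Validate an interface batch before loading: rows failing any
    (name, predicate) rule are quarantined with their failure list."""
    loadable, quarantined = [], []
    for row in rows:
        failed = [name for name, check in rules if not check(row)]
        if failed:
            quarantined.append((row, failed))
        else:
            loadable.append(row)
    return loadable, quarantined

# Hypothetical rules for a payments interface.
rules = [
    ("has_id", lambda row: bool(row.get("id"))),
    ("positive_amount", lambda row: row.get("amount", 0) > 0),
]
batch = [{"id": "A1", "amount": 120.0}, {"id": "", "amount": -5.0}]
loadable, quarantined = screen_batch(batch, rules)
```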


Forgetting about data decay

Recurrent data quality management assessment and sample comparison against trusted sources provide information about the rate of decay and show the categories of data that are most prone to quick decay.
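
The sample comparison itself is straightforward, as the sketch below shows with invented customer data: match a sample of stored records against a trusted source and measure the share of mismatching values per field, which is a direct estimate of how fast each category of data decays.

```python
def decay_rates(stored, trusted, fields):
    """Compare a sample of stored records against a trusted source and
    return the mismatch share per field."""
    rates = {}
    matched = [key for key in stored if key in trusted]
    for field in fields:
        mismatches = sum(
            1 for key in matched
            if stored[key].get(field) != trusted[key].get(field)
        )
        rates[field] = mismatches / len(matched) if matched else 0.0
    return rates

# Hypothetical customer sample versus a trusted reference source.
stored = {
    "c1": {"name": "Ann Dube", "address": "12 Oak Rd"},
    "c2": {"name": "Ben Nel", "address": "3 Elm St"},
}
trusted = {
    "c1": {"name": "Ann Dube", "address": "98 Pine Ave"},  # she moved
    "c2": {"name": "Ben Nel", "address": "3 Elm St"},
}
print(decay_rates(stored, trusted, ["name", "address"]))
# Here addresses decay faster than names.
```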


Poor organisation of data quality metadata

Design a comprehensive data quality metadata model that organises assessment results so they are easy to find and use.
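
One possible shape for such a model, kept deliberately small, is sketched below; the field names are assumptions, but the idea is that each entry records what was checked, where, when, and with what result.

```python
from dataclasses import dataclass, field

@dataclass
class QualityMetadata:
    """One entry in a data quality metadata store (hypothetical schema)."""
    table: str
    column: str
    rule: str
    last_assessed: str          # ISO date of the latest assessment
    error_rate: float           # share of records failing the rule
    remarks: list = field(default_factory=list)

entry = QualityMetadata(
    table="customer", column="email", rule="valid_format",
    last_assessed="2014-02-28", error_rate=0.03,
    remarks=["rate doubled since previous assessment"],
)
```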


Take note:

  • Corporations lose millions of dollars because of flawed data.
  • Data quality determines data’s intrinsic value to businesses and consumers.
  • Data quality can be managed through simple but skilled intervention.