97 per cent of companies’ data fails to meet basic standards, says report


Approximately 97 per cent of companies’ data fails to meet basic quality standards, according to a report published in the Harvard Business Review.

Data quality is a critical issue, and if this report is to be believed, the vast majority of data is of such poor quality that it might not be worth using.

Using a scale of 0-100 to indicate data quality – 100 being the highest – the researchers asked managers to score their data against 10-15 data attributes.

Their findings included the following:

  • Only 3 per cent of data was of an acceptable quality.
  • Almost half of the records had at least one critical error, which would impact work.
  • Data quality scores varied widely, ranging from 0 to 99 per cent.

The researchers did a simple calculation to show how much extra money poor-quality data can cost.


They use something called the “rule of ten”, which they explain as follows:

“Suppose you have 100 things to do and each costs $1 when the data are perfect.

“If all the data are perfect, the total cost is 100 x $1 = $100.

“If 89 are perfect and 11 are flawed, the total cost is 89 x $1 + 11 x $10 = $199.”

They add, however, that in reality, the operational cost is far higher, as the rule of 10 does not account for lost customers, bad decisions, or reputational damage.
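To make the arithmetic concrete, here is a minimal Python sketch of the rule-of-ten calculation using the article’s illustrative figures. The helper function and its parameter names are hypothetical, introduced only to restate the worked example above, and the dollar amounts are the article’s assumptions rather than real operating data.

```python
def rule_of_ten_cost(total_records: int, flawed_records: int,
                     unit_cost: float = 1.0, error_multiplier: float = 10.0) -> float:
    """Total cost when each flawed record costs ten times as much as a clean one."""
    clean_records = total_records - flawed_records
    return clean_records * unit_cost + flawed_records * unit_cost * error_multiplier

# All 100 records perfect: 100 x $1 = $100
print(rule_of_ten_cost(total_records=100, flawed_records=0))   # 100.0

# 89 perfect, 11 flawed: 89 x $1 + 11 x $10 = $199
print(rule_of_ten_cost(total_records=100, flawed_records=11))  # 199.0
```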

The researchers add: “These results should scare all managers everywhere.

“Even if you don’t care about data per se, you still must do your work effectively and efficiently.

“Bad data is a lens into bad work, and our results provide powerful evidence that most data is bad. Unless you have strong evidence to the contrary, managers must conclude that bad data is adversely affecting their work.”