
Data quality is often said to be the single most important factor determining success or failure: the better the quality of your data, the more likely you are to make good decisions.

In business, as in life, nothing is guaranteed. But decision-makers are increasingly demanding good-quality information: they know the volume of data now being collected is colossal, and that it ought to yield useful insights if only it were captured and presented better.

Citibank, for example, has told media agencies to improve their data quality if they want the bank’s business.

The bank is willing to spend more on digital advertising, but expects marketers to prove the value of their data first.

Speaking at the 4As Data Summit earlier this week, Michael Wexler, director of digital insights and marketing effectiveness for Citibank, was quoted by AdExchanger.com as saying: “If you’re not up on your tagging and keeping track of [multiple data sources] as a business, those things will really kill you.”

Another organisation calling on providers to improve their data quality is the US government.

In a report, the US Government Accountability Office criticised the Centers for Medicare & Medicaid Services (CMS), the agency within the Department of Health and Human Services (HHS) that administers the Medicaid program.

Medicaid errors relating to data quality have cost the government an estimated $36 billion, and the GAO says it found that the usefulness of the data provided by the sources is “limited because of issues with completeness, accuracy, and timeliness”.

The GAO adds that CMS “has not fully developed its plans to ensure the quality of T-MSIS data, and its plans for using these data for oversight purposes remain preliminary”. (T-MSIS is the Transformed Medicaid Statistical Information System, the national repository for state Medicaid data.)

The effort to redesign or replace Medicaid IT systems in 30 US states is ongoing and, by the GAO’s account, in some disarray. That may give the incoming Republican administration of Donald Trump more leverage in political negotiations with Democratic supporters of the country’s current health system, popularly known as Obamacare.

Meanwhile, a nonprofit consortium of companies and organisations called XBRL US has published its third set of data quality standards.

The standards were formulated by XBRL US’s Data Quality Committee and are freely available.

The organisation says the rules are mainly designed to help public companies identify and correct errors in their Securities and Exchange Commission filings.

Campbell Pryde, CEO of XBRL US, says it’s “critical” for public company filers to make sure they are providing the best quality financials.

According to Experian, one of the largest credit checking agencies in the world, plain old human error is often to blame for poor data quality.

The company conducted a survey of more than 800 data professionals at large companies in the UK, US and France.

In the resulting report, called The Data Advantage: How Accuracy Creates Opportunity, Experian concluded that good quality data leads to cost savings, increased efficiency and many other business benefits.

Joel Curry, UK managing director of Experian, says: “Data quality clearly remains high on the agenda as organisations worldwide increasingly see data as a strategic asset – indeed, data quality strategies are now in place across the vast majority of organisations worldwide.”

Experian’s survey found that “more than 99 per cent” of organisations now have a data quality strategy in place, and yet a perhaps surprisingly high 94 per cent of organisations still “suffer from common data errors”.

The most common data errors Experian found were:

  • Incomplete or missing data – 55 per cent
  • Out-dated information – 45 per cent
  • Duplicate data – 43 per cent
  • Typos – 32 per cent

Experian lists a number of negative impacts of poor-quality data, as well as the corresponding positives of having good-quality data, but those are things that most people could probably guess.

The methods for managing contact data accuracy used – or suggested – by the companies Experian questioned were as follows (a rough code sketch of one such back-office cleaning pass appears after the list):

  • Use a dedicated point-of-capture software tool – 40 per cent
  • Use dedicated back-office software to clean contact data – 40 per cent
  • Measure response and return-to-sender rates – 36 per cent
  • Carry out data analysis using Excel spreadsheets – 40 per cent
  • Work through databases manually line by line – 27 per cent
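
None of these approaches is exotic. As a purely illustrative sketch (not Experian’s tooling, and using invented contact records and an arbitrary cut-off date), a simple back-office cleaning pass in Python with pandas might flag the four common error types from the earlier list along these lines:

```python
# Hypothetical back-office cleaning pass: flag incomplete, out-dated,
# duplicate and typo-ridden contact records in a small pandas table.
import pandas as pd

contacts = pd.DataFrame([
    {"name": "Ada Lovelace", "email": "ada@example.com",  "last_verified": "2016-11-01"},
    {"name": "Ada Lovelace", "email": "ada@example.com",  "last_verified": "2016-11-01"},  # duplicate row
    {"name": "Joe Bloggs",   "email": None,               "last_verified": "2015-02-14"},  # missing email
    {"name": "Jane Doe",     "email": "jane@examplecom",  "last_verified": "2016-08-30"},  # typo in domain
])

contacts["last_verified"] = pd.to_datetime(contacts["last_verified"])

report = pd.DataFrame(index=contacts.index)
report["missing_email"] = contacts["email"].isna()                    # incomplete or missing data
report["out_of_date"]   = contacts["last_verified"] < "2016-01-01"    # out-dated information (arbitrary cut-off)
report["duplicate"]     = contacts.duplicated(keep="first")           # duplicate data
report["bad_email"]     = ~contacts["email"].fillna("").str.contains( # crude typo check on the email format
    r"^[^@\s]+@[^@\s]+\.[^@\s]+$", regex=True
)

print(contacts.join(report))
```

A dedicated point-of-capture or back-office tool would of course do far more (address verification, fuzzy matching, suppression files), but even a crude pass like this makes errors of the four common types visible before they reach a campaign.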
