Why Poor Data Quality is Holding Back Big Data

Remember the old adage, “Garbage in, garbage out”?

We can’t expect an algorithm to deliver accurate insights if it is fed low-quality data. Yet all too often, poor data quality is only recognised as an issue when the algorithm starts delivering obviously dud results. The rest of the time, the vast majority of enterprises are content to put up with incomplete, inaccurate, duplicate or otherwise low-quality data, the effects of which surface in any number of ways: a minor inconvenience here, a laggy system there, and an ongoing reliance on guesswork, workarounds and manual processes.

But when it comes to the real impact of bad data, that’s just the tip of the iceberg.  

The Hidden Cost of Poor-Quality Data

Strong data governance helps companies get a handle on their data, produce better insights and improve compliance. It lowers risk and helps them adapt more readily to market change. So it is alarming that the UK government estimates businesses spend 10–30% of their revenue handling data quality issues – an inherently fixable problem.

Data quality is also an issue that’s not going away any time soon. Gartner estimates that a third of businesses are investing heavily in data-reliant initiatives like AI that streamline processes, enable faster decision-making and confer a competitive edge. But low data quality could seriously stall these efforts. Poor-quality insights – or a lack of trust in those insights – result in reduced revenue, decreased productivity, operational inefficiency and missed business opportunities, as well as increased exposure to regulatory, security and compliance risks.

What’s more, with some sectors experiencing several years’ worth of digital transformation in just a few months during the pandemic, many companies are now more reliant on data but less able to handle it.  

Raising Data Standards

The path to good data governance starts with a targeted overhaul of data practices to examine where methods are outdated and to ensure data is standardised. However, a longer-term approach is also required in the form of a data governance strategy. Faced with huge volumes of varied data types, organisations need a comprehensive data governance framework that takes into account the context, requirements and business value of the data. Additionally, it creates clear lines of accountability when it comes to maintaining accurate, consistent, timely data that conforms to all necessary regulations.
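What “accurate, consistent, timely” means in practice is often codified as automated quality rules. The sketch below illustrates the general idea only – the rule names, field names and thresholds are assumptions for illustration, not any specific governance framework:

```python
# Illustrative data-quality rules of the kind a governance framework
# might enforce. Field names ("id", "email", "updated") and the
# 365-day freshness window are hypothetical examples.
import re
from datetime import date, timedelta

RULES = {
    # Completeness: all required fields must be present and non-empty.
    "completeness": lambda rec: all(rec.get(f) for f in ("id", "email", "updated")),
    # Validity: the email field must look like an email address.
    "validity": lambda rec: bool(
        re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", str(rec.get("email", "")))
    ),
    # Timeliness: the record must have been updated within the last year.
    "timeliness": lambda rec: rec.get("updated", date.min)
    >= date.today() - timedelta(days=365),
}

def audit(record):
    """Return the names of the rules a record fails (empty list = clean)."""
    return [name for name, check in RULES.items() if not check(record)]

good = {"id": 1, "email": "ops@example.com", "updated": date.today()}
stale = {"id": 2, "email": "not-an-email", "updated": date(2015, 1, 1)}
print(audit(good))
print(audit(stale))
```

Codifying rules like these gives the “clear lines of accountability” mentioned above something concrete to attach to: a failed rule can be routed to a named data owner rather than left to whoever notices.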

Everyone’s Responsible for Good Data

Considering the effects of poor data quality can be felt right through the enterprise, data quality should be an enterprise-wide priority. Data governance, therefore, needs a cross-functional approach. This should begin at the top – senior execs are the ones capable of driving change across business functions. Next, educate employees to understand how the data they touch is used more widely. Tracking and resolving data quality issues fast is also important: seeing data quality being taken seriously encourages everyone in the organisation to be more accountable.

AI Loves the Jobs You Hate

AI has been transformational for big data projects, and now it can provide a much-needed boost to data standards reform. Bots can clean up data – optimising, de-duplicating, cataloguing, handling metadata and so on. These tools help ensure rapid ROI, dramatically reducing the time it takes to transform unruly data.
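The de-duplication step these tools automate can be sketched in a few lines. This is a minimal, assumed illustration of the technique – normalise records so near-duplicates match, then keep one record per key – not the workings of any particular product:

```python
# Minimal sketch of automated de-duplication. The record fields
# ("name", "email") and the choice of "email" as the match key are
# hypothetical examples.

def normalise(record):
    """Standardise casing and whitespace so near-duplicates compare equal."""
    return {k: " ".join(str(v).split()).lower() for k, v in record.items()}

def dedupe(records, key="email"):
    """Keep the first record seen for each normalised key value."""
    seen, clean = set(), []
    for rec in records:
        norm = normalise(rec)
        if norm[key] not in seen:
            seen.add(norm[key])
            clean.append(norm)
    return clean

raw = [
    {"name": "Jane Smith", "email": "jane@example.com"},
    {"name": "JANE  SMITH", "email": "Jane@Example.com "},  # near-duplicate
    {"name": "Ravi Patel", "email": "ravi@example.com"},
]
print(dedupe(raw))  # two records remain
```

Real-world matching is fuzzier than exact key equality (typos, nicknames, transposed fields), which is where ML-based entity resolution earns its keep – but the normalise-then-match shape is the same.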

Why Quality is a Competitive Differentiator

If you asked any business to highlight the digital transformation steps it is prioritising in the year ahead, the reply would almost certainly include data-intensive projects – analytics, cloud, AI, RPA, ML. Ensuring data integrity will be key to the success of these initiatives. No matter what stage of digital maturity a business is at, it can take practical steps to boost data quality and, as a result, improve decision-making. In the words of Gartner’s Melody Chien, “Good quality data provides better leads, better understanding of customers, and better customer relationships. Data quality is a competitive advantage.”


About the Author

Martin Williams is the UK head of Infostretch, a digital engineering services firm focused on helping organisations accelerate their digital initiatives from strategy and design to development, testing, implementation and data intelligence. Backed by Goldman Sachs Asset Management and Everstone Capital, Infostretch’s custom digital engineering solutions leverage deep technical expertise, agile methodologies, and cutting-edge development to create a scalable digital roadmap. Martin is responsible for planning, development and client delivery for the UK.

Featured image: ©Quardia
