AI projects are faltering as CDOs grapple with poor data quality

Chief data officers say they can't maintain consistent data quality, and that it's affecting AI outcomes


Only a third of businesses are making meaningful progress in AI adoption, with many pinning the blame on poor data quality. More than two-thirds (68%) of respondents said data quality was their top challenge, according to the Ataccama report. Meanwhile, four in ten struggle to maintain consistent data quality, directly hindering AI outcomes.

The report noted that trust is a critical factor when leveraging data in daily operations. Without it, organizations face inefficiencies, poor decision-making, and the risk of compliance failures, ultimately limiting their ability to achieve business objectives. However, the report found most organizations still struggle to achieve it.



"Untrusted data erodes every decision it informs. Without real insights into data quality, businesses risk cascading failures, from unreliable AI outputs to stalled growth," said Krishna Cheriath, chief digital officer at Thermo Fisher Scientific. "Trust must permeate every layer — data, models, and decisions."

Bad data leads to bad insights, the report noted, affecting decision-making, slowing down operations, and wasting valuable resources. It also jeopardizes compliance, leaving organizations vulnerable to regulatory and financial risks, and diminishes ROI.

Knowledge gaps are hampering digital progress

Knowledge gaps around data trust and governance are slowing progress, the report found, with a lack of unified standards leading to inconsistency.

Without guidelines for data formats, definitions, and validation, CDOs find it hard to establish a centralized system of control. A third of organizations experience processing delays because too many barriers stand in the way of integration. "Fragmented systems bleed efficiency and inflate costs," said Andrew Foster, chief data officer at M&T Bank.

The report also found that legacy systems are still a major barrier to innovation, with CDOs finding that outdated systems are ill-equipped to handle increasing data volumes. Many systems are designed to provide periodic data updates, for example, rather than continuous, real-time streams. It's this area in particular that's causing serious trouble for CDOs ramping up AI adoption, with only a third of enterprises reporting meaningful progress on this front.

The report called for new national standards for data quality in the UK, and suggested that the proposed National Data Library could play a key role in bolstering national data governance benchmarks and best practices. These standards would ensure clear compliance guidelines while supporting the UK's pro-innovation regulatory goals. "The report makes one thing clear: AI initiatives rely on a foundation of trusted data," said Jay Limburn, chief product officer at Ataccama.

"Without addressing systemic data quality challenges, organisations risk stalling progress. The UK's approach shows how aligning data trust principles with national standards and infrastructure modernization can deliver tangible results."