Data quality isn’t a new problem for marketers. But the consequences of leaving it unaddressed are increasingly evident. Gartner estimates that on average, poor data quality costs each organization upwards of $12.9 million annually due to misleading insights, poor decisions, and, ultimately, wasted resources.
When we commissioned this research, we simply wanted to understand the extent of the data quality problem for marketers and, where possible, its root causes. However, we weren’t fully prepared for how deep and widespread the issue is. That CMOs estimate, on average, 45% of the data their teams use is incomplete, inaccurate, or outdated is a red flag. Almost half of the data marketers rely on is effectively worthless.
This red flag is made even more alarming by the rise of AI analytics tools. Marketing-specific AI tools have evolved rapidly over the past few years, and they will continue to revolutionize the way marketers work with data.
However, if you are working with flawed and poor-quality data, the most advanced AI analytics in the world will still only give you flawed and poor-quality insights. The simple truth is that if you don’t fix the quality of your data, any investment in AI tools is at best a waste of money and at worst a recipe for disaster, with teams acting on flawed and misleading insights.
That said, perhaps this warning couldn’t have come at a better time. Precisely because of the advancements in AI, this represents an opportunity for marketers to push the issue of data quality to the forefront of business objectives. And some CMOs clearly recognize this. According to the research, 30% see improving data quality as the most important lever they can pull when it comes to improving performance.
The silver lining, of course, is that AI isn’t just limited to data analysis. AI tools can also help marketers improve data quality by automating processes, checks, and alerts. Approached correctly, businesses now have the chance to finally fix their data and, in doing so, extract the most value from AI-based analytics, not to mention from the huge investments companies have already poured into their data and cloud infrastructures.
Ultimately, those businesses that do acknowledge the importance and invest in the quality of their data will be the ones who succeed in this new AI-driven era of marketing. Those who don’t will be the ones left behind.
On average, CMOs estimate that 45% of the data their teams use to drive decisions is incomplete, inaccurate, or outdated.
30% of CMOs see improving data quality as the biggest lever they can pull to improve marketing performance.
31% of CMOs believe that the biggest problem with their data is that it is incomplete.
Top 3 marketing performance levers:
Top areas needed to improve data quality:
On average, CMOs estimate that 45% of the data marketers use to drive decisions is of poor quality. Importantly, not a single CMO thought the data driving their marketing decisions was more than 75% complete, accurate, and up-to-date.
That should set off alarm bells.
If the information guiding your decisions is wrong or missing, you’re essentially navigating with a faulty map. Every campaign plan, budget choice, and performance report risks doing more harm than good.
In the age of AI, poor-quality data becomes even more dangerous. As Vincent Spruyt, Global Chief Product Officer at KINESSO, puts it: “If you couldn’t automate without AI, you cannot automate with AI.” AI doesn’t fix cracks in your foundation; it accelerates them. Feed it incomplete, inconsistent data, and it will produce flawed insights faster than ever.
And the problem goes beyond bad decisions. When data quality is low, trust erodes. And once trust is gone, people stop using the data altogether. Decisions revert to gut feel, and your analytics investment becomes an expensive ornament.
Data quality refers to the overall health and reliability of your marketing data. This means making sure your data is accurate, complete, consistent, unique, timely, and valid. When your data meets these standards, it becomes a trusted foundation for making informed decisions, running effective campaigns, and uncovering key insights.
On the other hand, poor-quality data can lead to costly mistakes, like sending promotions to outdated contacts or making critical strategic choices based on flawed information. These errors can damage customer trust, inflate budgets, and undermine the performance of your marketing efforts.
Simply put, data quality is the gold standard for ensuring that your data is fit for purpose. It’s what guarantees that your marketing data is reliable enough to act on with confidence.
Unsurprisingly, 30% of respondents say improving data quality would have the biggest positive impact on marketing performance, making it the most prominent area for improvement. This is followed by automating data workflows (22%) and improving data democratization (21%).
Marketing teams are keenly aware that focusing on data quality as a whole would significantly improve their performance, an urgency likely catalyzed by the evolution of AI-powered analytics tools that depend on clean, high-quality data. However, the low levels of data quality reported across respondent demographics imply that many teams are still in the early stages of addressing it.
Nevertheless, it is promising that respondents are naming the problem and acknowledging data quality as a complex issue to address, rather than ignoring it. It points to a broader shift in marketing operations. Marketers are prioritizing foundational fixes that require cross-team collaboration, like setting up processes for cleaning, aligning, and validating data. It's a sign of maturity in data strategy: get the basics right across the pipeline first, then scale smarter.
Asked what would have the biggest impact on performance, teams showed different priorities based on how automated their marketing data operations are.
Interestingly, only marketing teams with high levels of automation are most concerned with data quality as a whole. Teams with low levels of data automation (those still in the manual data-wrangling stage) are more focused on access and automation, i.e., getting all their data in one place and automating the data collection process. In the early stages of automation, it makes sense that this foundational step takes priority over data quality: data quality can only be an issue once you have access to your data.
This points to the notion that while data quality is a key issue, if you’re still wrangling data manually, automation is the first priority. Without it, quality efforts won’t scale, and you’ll spend significantly more time fixing problems than preventing them.
For this analysis, we grouped respondents by the proportion of key data operations areas they have automated across seven areas: data access, gathering, cleaning, validation, integration, reporting, and analysis.
Those with less than a third of these areas automated were classed as low automation, those with one-third to two-thirds automated as medium automation, and those with more than two-thirds automated as high automation.
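For illustration, the grouping rule above can be expressed as a short function. This is a sketch of the report's stated methodology, not its actual analysis code, and the area labels are assumed shorthand:

```python
# The seven data-operations areas surveyed in the report.
AREAS = ["access", "gathering", "cleaning", "validation",
         "integration", "reporting", "analysis"]

def automation_tier(automated_areas):
    """Classify a respondent by the share of the seven areas they automate."""
    share = len(set(automated_areas) & set(AREAS)) / len(AREAS)
    if share < 1/3:
        return "low"       # less than a third automated
    elif share <= 2/3:
        return "medium"    # one-third to two-thirds automated
    return "high"          # more than two-thirds automated
```

For example, a respondent automating only access and reporting (2 of 7 areas) falls into the low-automation group.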
Across industries and regions, data quality is widely seen as the leading marketing performance lever. Financial services report the highest figure at 66%, as regulatory demands make accuracy essential. Consumer packaged goods follow at 40%, where clean, consistent data is needed to manage complex product lines and distribution networks.
Three industries stand out for not placing data quality at the top. Marketing agencies put automation first at 26%, reflecting the need to streamline processes across varied client datasets. Technology companies split evenly between transformation and quality at 27% each, showing a sector balancing the challenge of structuring data with the push to raise quality standards. eCommerce leads with internal data skills at 27%, indicating that the data and platforms are in place but performance gains depend on teams being able to interpret and act on information.
By region, data quality leads everywhere, but the second priority varies. In the UK, 26% chose improving access and democratization, reflecting a focus on how teams reach and use data. In the US and DACH, 26% selected workflow automation, pointing to a drive to move data more quickly and efficiently through the pipeline.
Oddly, despite admitting that 45% of their data is poor quality, and touting the improvement of data quality as the most powerful lever to drive marketing performance, 85% of respondents said they trust the quality and completeness of their data. That contradiction suggests a troubling norm: marketers know the data is flawed, but they’ve grown used to working around it.
[Chart: a 100% stacked bar with a segment for each response option; each segment’s width equals its share of respondents.]
For years, the issue of data quality has been something of an open secret in marketing. While teams know improving data quality would drive better results, the task is a daunting one. Coupled with uncertainty about where to begin, this has led many to put it off in favor of less sustainable short-term wins, like building another dashboard instead of fixing the data pipeline underneath it. Data quality has been talked about as a problem area for so long that the alarm bells have faded into background noise. Low-quality data is accepted as the cost of doing business.
This makes data quality marketing’s biggest blind spot, hiding in plain sight. Data quality is so deeply embedded in the day-to-day that many teams no longer question it. However, this complacency won’t last. The rise of AI in analytics is forcing marketing teams to confront data quality head-on. Not only is high-quality data the fuel for effective AI models, but AI-powered tools now make it faster and easier to clean, transform, and govern data at scale.
When asked what the biggest problem was with the quality of their data, the top answers were completeness (31%) and consistency (26%), with these two issues combined accounting for more than half of respondents.
That completeness ranks so high is concerning, as it’s an early and important part of the data pipeline. This tells us that almost a third of marketing teams don’t have access to all the data they need, and suggests issues with teams being unable to connect to and extract data from all the platforms they use. Needless to say, incomplete data will drastically skew any insights you hope to draw. Fixing this usually comes down to decent automation and monitoring. With automated pipelines, validation checks at the point of ingestion, and alerts when something’s missing, you can stop incomplete data before it warps your reporting.
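As a minimal sketch of what a validation check at the point of ingestion might look like, the snippet below flags rows with missing required fields before they reach reporting. The schema, field names, and sample records are illustrative assumptions, not any specific tool's API:

```python
# Illustrative completeness check run at ingestion time.
REQUIRED_FIELDS = ["campaign_id", "date", "spend", "impressions"]  # assumed schema

def completeness_issues(rows):
    """Return a list of (row_index, missing_fields) for rows failing the check."""
    issues = []
    for i, row in enumerate(rows):
        missing = [f for f in REQUIRED_FIELDS if row.get(f) in (None, "")]
        if missing:
            issues.append((i, missing))
    return issues

rows = [
    {"campaign_id": "c1", "date": "2025-04-01", "spend": 120.0, "impressions": 5400},
    {"campaign_id": "c2", "date": "2025-04-01", "spend": None, "impressions": 3100},
]

# Alert on gaps before the data warps your reporting:
for idx, missing in completeness_issues(rows):
    print(f"Row {idx} is missing: {', '.join(missing)}")
```

In a real pipeline, the alert would feed a monitoring channel rather than a print statement, but the principle is the same: catch incomplete records at the door.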
Consistency ranking as such a salient data quality issue indicates that businesses are failing to accurately clean and harmonize their data in a way that allows them to compare apples to apples across data sources. That means they’re struggling to standardize things like date formats, currencies, naming conventions, or even, on a more fundamental level, how metrics are defined. Automation can help alongside a robust data governance strategy and various tools such as data dictionaries, mapping, and transformation tools. But it starts with cross-team agreement on the standards and processes you’ll follow.
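A minimal sketch of that kind of harmonization step is shown below: normalizing source-specific date formats and currencies so spend can be compared like-for-like. The per-source format mapping and exchange rates are hypothetical; in practice they would come from an agreed data dictionary and a live FX feed:

```python
from datetime import datetime

# Hypothetical per-source conventions; real pipelines would load these
# from a data dictionary agreed on across teams.
DATE_FORMATS = {"ads_platform": "%m/%d/%Y", "crm": "%d.%m.%Y"}
FX_TO_USD = {"USD": 1.0, "EUR": 1.08, "GBP": 1.27}  # illustrative rates

def harmonize(record, source):
    """Normalize a record to ISO dates and USD so sources compare like-for-like."""
    return {
        "date": datetime.strptime(record["date"], DATE_FORMATS[source]).date().isoformat(),
        "spend_usd": round(record["spend"] * FX_TO_USD[record["currency"]], 2),
    }

harmonize({"date": "04/01/2025", "spend": 100.0, "currency": "EUR"}, "ads_platform")
# → {"date": "2025-04-01", "spend_usd": 108.0}
```

The design point is that the mappings live in one shared place, so every team transforms data by the same rules instead of reconciling formats downstream.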
The biggest data quality issues change depending on how automated a team is. Once again, we can see a data maturity curve beneath these responses. Teams with less automation struggle with foundational gaps, while those further along deal with scale and structure. At low automation, the priority is simply pulling all the data together quickly enough to be usable. At medium and high automation, teams focus on ensuring the system stays clean, reliable, and scalable as data flows grow more complex. This involves more complex tasks which depend on automation, such as de-duplicating data, aligning formats and definitions to be consistent, and putting checks in place to ensure accuracy.
Some data quality challenges cut across every market and sector, but others are shaped by local regulations, industry norms, and the complexity of the data environment. In DACH, the biggest struggle is keeping data consistent across platforms, a sign that even when datasets are complete, mismatched formats and standards can undermine their usefulness.
By contrast, both the UK and US cite completeness as their primary blocker, pointing to a more fundamental problem: data that’s missing or only partially captured. In the US, concerns about uniqueness and consistency also rank high, suggesting a broader fragmentation problem where duplicated or conflicting records creep into analysis.
Industry patterns tell a similar story. Marketing agencies grapple most with consistency, which isn’t surprising given the need to reconcile multiple client and platform feeds into a unified view. For CPG, retail, eCommerce, and finance, completeness dominates the list of concerns, likely reflecting long data pipelines and heavy reliance on third-party sources where drop-off can easily occur. Media and entertainment stands apart, reporting the highest concern for uniqueness. This implies that duplicate audience records or campaign data are skewing performance tracking.
Good data quality is key to making data-driven decisions that will improve marketing performance. Without a solid foundation of accurate data, teams cannot hope to gain accurate insights.
Improving marketing performance isn’t about pulling a single lever; it’s about recognizing how tightly everything is linked. Data quality depends on access. Access relies on automation. Automation only works when teams are aligned around shared processes. These aren’t standalone problems; they’re all part of the same ecosystem.
Marketing teams may well be turning a blind eye to their poor data quality because it feels insurmountable. The truth is, improving data quality isn’t one big task that keeps getting pushed to the bottom of the to-do list; it’s a gradual, ongoing overhaul of the entire, interconnected data pipeline. The most effective teams don’t treat data quality as a one-off project to fix, but as an ongoing discipline to manage. They choose to invest in structure over shortcuts. Because when the foundations are solid, everything else works better: reporting gets faster, decisions get smarter, and teams stop spending their time untangling yesterday’s mess.
Invest in the fundamentals: consistent naming, clear ownership, automated flows, and taxonomies that make sense across platforms and teams. These are the foundations that make everything else possible. You don’t need to put out every fire. You just need to stop building things that burn.
In terms of priorities around data governance, access and ownership top the list, with 28% saying this is their biggest focus for the coming year. Again, this indicates that teams are struggling to establish clear responsibilities and roles around data. Before they can move forward, marketing teams need to step back and decide exactly who controls which data and processes. Without clear responsibility, even the best tools can’t prevent gaps or misuse.
Monitoring is the second-highest priority, chosen by 24% of respondents who want stronger oversight to catch errors early and keep quality from slipping. This focus reflects a promising shift from reactive clean-up to proactive prevention, where teams aim to spot anomalies before they impact reporting or decision-making. Security follows at 16%, showing that compliance pressures, evolving privacy laws, and the reputational risk of breaches remain high on the agenda.
This research was commissioned by Adverity to explore the current state of marketing data quality, the challenges teams face, and the levers they believe will most improve performance.
The report is based on a survey of 200 CMOs across the US, UK, Germany, Austria, and Switzerland. Respondents were evenly split between B2C brands and marketing agencies. All data was collected in Q2 2025.
Lee McCance, CPO at Adverity, brings 20+ years of product leadership from roles at GroupM, Essence, and McAfee. He’s now spearheading Adverity’s expansion into AI-powered, customer-centric data analytics solutions.
Lily Johnson is a Content Manager at Adverity, where she leads the creation of research reports, long-form editorial, and thought leadership on topics ranging from data governance to retail media and AI in marketing. She also produces Adverity’s The Undiscovered Metric podcast, bringing expert voices into the conversation around data and marketing. With seven years’ experience in B2B content marketing, she’s helped shape content strategies across the SaaS, retail, and events sectors.
Tom Rennell is Head of Content & Communications at Adverity, where he leads the team responsible for all brand, editorial, and external messaging across the company’s owned channels. With over a decade of experience in content strategy, communications, and storytelling, Tom has shaped messaging for global organizations ranging from Alibaba to the United Nations.
Respondents came from small and midsize businesses as defined by Gartner: small businesses are organizations with an annual revenue of between 20 million and 50 million USD, while midsize enterprises are organizations with an annual revenue of between 50 million and 1 billion USD.