How "Good Enough" Quality is Eroding Trust in Your Big Data Insights
Results from Precisely’s 2019 Enterprise Data Quality Survey
Precisely’s 2019 Enterprise Data Quality survey explores the challenges and opportunities facing organizations looking to improve data quality across the enterprise as data volumes grow and new technologies emerge.
Precisely polled 175 respondents, 69 percent of whom work for organizations with over 1,000 employees. Participants represented a range of industries, with the largest share coming from Financial Services (25%), and held positions ranging from CDO to Data Analyst, with the majority in data-focused roles (29%).
Good data isn’t good enough anymore
There is a disconnect around understanding, confidence, and trust in the data and how it informs business decisions.
72 percent of respondents rated the quality of the data used to run their business as good or better, and 69 percent stated that their leadership/C-suite trusts data insights enough to base business decisions on them. Yet they also reported that only 14 percent of stakeholders had a very good understanding of the data, and that less than 60 percent of the data was well understood by stakeholders.
More than 70 percent also reported that sub-optimal data quality negatively impacted business decisions, and almost half attributed untrustworthy results or inaccurate insights from analytics to poor-quality data being fed into systems such as AI and machine learning.
Understanding Data Across the Organization
- 14% – very good understanding
- 48% – good understanding
- 29% – partial understanding
- 6% – minimal understanding
- 3% – very little or no understanding
Download this Precisely report to see highlights from the survey as well as a deeper look at the full report results.