As the C-suite members who carry out a business’s overarching goals, executives should do everything in their power to stay mindful of how their organization manages data quality. Even a slight degradation in data quality can impose significant costs and consume substantial person-hours to resolve, especially if the problem is ignored and allowed to escalate into a catastrophic issue. Read on to learn why data quality management matters, what you and your organization can do to rectify errors and keep them from becoming unnecessarily burdensome when they occur, and how to develop best practices for handling data.
What is Data Quality Management?
Data quality management is a set of practices, tools, and administrative capabilities used to ensure that the data an organization produces is consistently accurate, complete, and reliable, providing high-quality information throughout the entire data lifecycle. Executives who fail to recognize its importance will find their key decision-making abilities hampered and their data incapable of producing dependable analyses, and may even put their future business success at risk.
Executive Framing: Why Data Quality Matters
1. Data Quality is Contextual but Always Business-Critical
Properly managed, the data an organization possesses holds a wealth of knowledge critical to achieving actionable results. What is less widely understood is that high-quality data must be fit for purpose, and that fitness for purpose cannot be left as an afterthought. When executives incorporate strong data governance policies into the entire data collection and storage process, they can execute complex processes, including risk mitigation, data integration for software development, analytics, and master data management, with relative ease. Executives should also regularly evaluate their quality management processes to create benchmarks for comparison and to ensure that their data remains consistent in quality and reliability.
Depending on the type of business involved, a single error can cause either a minor annoyance or immense difficulty. For firms like banks that rely on completely accurate data, even one incorrect transaction can have a knock-on effect: payments fail to reach their intended accounts, customers become dissatisfied, and the bad data damages the bank’s general reputation. Conversely, a medical business using an error-filled database can still derive accurate diagnoses by employing sophisticated algorithmic software. The meaning of the phrase “fit for purpose” can vary wildly between companies. Still, one constant for executives to remember is that business performance and risk are directly tied to data quality. Keep a close eye on data quality to improve business performance and minimize risks to the company, as any failure in data quality can result in significant costs, including lost time, money, and clientele, and even damage to the company’s public reputation.
2. Quality Failures = Strategy Failures
Executives looking for concrete examples of how a failure in data quality can lead to grander strategic failures should keep three scandals of the last decade firmly in mind: the Volkswagen emissions scandal, now known as Dieselgate, which cost the company more than 25 billion dollars; the combined fines of more than £280 million levied on British Airways and Marriott for mishandling data and failing to comply with regulations; and the Amazon Web Services (AWS) S3 outage that cost affected businesses over 150 million US dollars after a single typo made during routine maintenance. Each incident is instructive for executives seeking to prevent issues arising from fraudulent data practices, failure to comply with government regulations, and even genuine accidents that carry substantial financial costs.
Volkswagen acted deliberately, colluding with fellow automakers Audi, Porsche, Daimler, and BMW to avoid competing on technology designed to reduce emissions. Volkswagen and Audi even went so far as to deliberately install vehicle software intended to provide false readings and fool official emissions testing equipment. As a result, Volkswagen paid roughly 2 billion euros in regulatory fines on top of more than $25 billion in other costs, and the lack of data integrity caused lasting reputational damage for every automaker involved.
Data failures at British Airways (BA) and Marriott were directly responsible for the hefty fines levied against the companies by the UK Information Commissioner’s Office (ICO) under the European Union’s General Data Protection Regulation (GDPR). In 2018, both organizations suffered data breaches that compromised the records of hundreds of thousands of BA customers and millions of Marriott guests. Marriott was fined £99 million for failing to adequately secure, and promptly report, the compromised data, while British Airways was fined £183 million after the ICO’s investigation determined that poor security standards caused its breach.
Sometimes, even genuine errors and accidents can result in a substantial loss of revenue, as Amazon’s costly 2017 mistake, which led to losses of approximately $150 million or more across affected businesses, demonstrates. While data executives are doubtless familiar with Denial of Service (DoS) attacks, a simple mistyped command can have similar effects. While attempting to temporarily remove a small number of servers from the S3 billing subsystem during routine maintenance, an Amazon engineer entered one of the command’s inputs incorrectly, taking far more servers offline than intended. From this simple typo, a torrent of digital chaos unfolded across the internet, directly impacting major websites built on Amazon Simple Storage Service (S3), business and otherwise, including Medium, Netflix, Quora, Reddit, Trello, and Spotify, to name a few.
The errors left users unable to use workplace efficiency software, watch episodes on streaming services, access valuable news and information, and much more. This cascading effect, which cost various businesses over 150 million US dollars, stemmed from three factors exposed by the initial typo: the mistyped command removed many more servers than anticipated; Amazon’s guardrails and safety checks were not designed to stop capacity from dropping below the minimum the subsystems needed to keep running; and the S3 recovery tooling itself depended on S3 functioning correctly in the first place.
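The guardrail gap described above is easy to illustrate. The sketch below is a hypothetical, simplified example (not Amazon’s actual tooling, whose internals are not public) of a maintenance command that refuses to remove capacity once doing so would breach a minimum-availability floor:

```python
# Hypothetical capacity guardrail: refuse any maintenance request that would
# take more servers offline than a configured availability floor allows.
def remove_capacity(active_servers, to_remove, min_fraction_online=0.9):
    """Return the set of servers that may safely be removed, or raise if the
    request would drop availability below the configured floor."""
    requested = set(to_remove) & set(active_servers)
    remaining = len(active_servers) - len(requested)
    if remaining < min_fraction_online * len(active_servers):
        raise ValueError(
            f"refusing to remove {len(requested)} of {len(active_servers)} "
            f"servers: would breach the {min_fraction_online:.0%} availability floor"
        )
    return requested

servers = [f"s3-node-{i}" for i in range(100)]
print(sorted(remove_capacity(servers, ["s3-node-1", "s3-node-2"])))  # small removal: allowed
try:
    remove_capacity(servers, servers[:50])  # oversized removal: blocked
except ValueError as exc:
    print("blocked:", exc)
```

The design point is that the check runs before any server is touched, so a typo in the input list fails loudly instead of cascading.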
While it might initially seem like these failures arose from purely technical issues or human errors, these three examples underscore the importance of data quality management. The deliberate deceit of regulatory agencies, tardiness and non-compliance with legal statutes, and causing a four-hour internet outage are all strategic-level consequences that derive from a failure to understand how data malfunctions can be costly for a business in both financial and reputational matters.
3. The True Cost of Fixing It Late
By failing to embed data quality controls where data enters a company’s systems, executives set themselves up for massive quality control (QC) problems and extensive monitoring processes, all of which lead to unnecessary expenses. Rather than investing in costly QC and monitoring or applying a late-stage data fix, executives can save their organizations considerable time and money by incorporating data quality controls during data collection and analysis, maintaining company standards, and deriving actionable goals from the results. These concepts are good business practices in their own right, and incorporating data quality monitoring throughout also aligns neatly with a focus on long-term return on investment (ROI) and with lean business principles, such as minimizing waste while maximizing the customer’s perceived benefit, adopting a customer-centered approach, and promptly identifying and resolving inefficient processes.
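As a concrete illustration of embedding quality controls at the point of collection, the sketch below validates records as they enter a system, so bad data is rejected or quarantined before it reaches downstream reports. The field names and rules are illustrative assumptions, not a standard schema:

```python
# A minimal sketch of validation at ingestion: each record is checked for
# completeness, format, and plausibility before being accepted.
import re
from datetime import date

def validate_customer_record(record):
    """Return a list of quality problems; an empty list means the record passes."""
    problems = []
    for field in ("customer_id", "email", "signup_date"):
        if not record.get(field):
            problems.append(f"missing required field: {field}")
    email = record.get("email", "")
    if email and not re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", email):
        problems.append(f"malformed email: {email!r}")
    signup = record.get("signup_date")
    if isinstance(signup, date) and signup > date.today():
        problems.append(f"signup_date in the future: {signup}")
    return problems

good = {"customer_id": "C-001", "email": "a@example.com", "signup_date": date(2023, 5, 1)}
bad = {"customer_id": "", "email": "not-an-email", "signup_date": date(2023, 5, 1)}
print(validate_customer_record(good))  # []
print(validate_customer_record(bad))   # two problems reported
```

Checks like these are cheap to run at collection time; the same problems found months later in an analytics pipeline are far more expensive to trace and repair.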
Business Benefits of Data Quality Management
Enable Confident Decision-Making
Executives must prioritize good data stewardship and governance to establish confidence in data quality. Dashboards for data pipelines may provide a valuable means of representing any insights derived by a business. Still, data quality control is necessary for executives to be confident that the dashboard is operating using high-quality data. Without information being derived from metrics designed to compile and measure data accurately, the value of a given dashboard may be little more than a visually pleasing representation of data that cannot be relied upon.
Improve Operational Efficiency and Focus
By instituting cross-company standards for data quality management, a business demonstrates data quality’s importance while allowing executives to define key performance indicators (KPIs) and use dashboards to visualize data effectively, identify genuine pain points, and pinpoint problem areas. Implementing data quality standards throughout the information-gathering and storage processes transforms disorganized reports into a time-saving system that regularly delivers measurable, actionable results.
Unlock Organizational Alignment and Scale
In one case study, a healthcare company’s thirteen corporate units were fragmented and operating on separate data standards. The executive hired to improve efficiency and accuracy implemented a set of shared data definitions and revised the company’s data quality standards to improve the speed and accuracy of retrieved information. The company was able to move from presenting clients with 150-200 page PDFs to data visualizations without compromising the strength of the analysis. With this new underlying infrastructure for data quality management and a unified reporting system, the business doubled its valuation. The changes also supercharged the business’s existing expertise, enabling faster time-to-insight and allowing actionable items and valuable insights to be derived more quickly for decision-making purposes.
How Executives Should Approach Data Quality
Think of Data as a Strategic Asset
Far from being the sole domain of a company’s IT department, data quality demands executive attention: mismanagement, and the issues that arise from it, can negatively impact the business on multiple levels. By keeping data quality management at the forefront of strategic planning, executives can ensure that risk mitigation measures are working as expected, that data processes are operating efficiently and error-free, that innovation can be easily tracked, and that actionable items can be produced for employees and customers alike.
Support Data Governance with Resources and Authority
If the role of Chief Data Officer is currently unoccupied or in need of revitalization, establishing or re-establishing this position is of utmost importance. Without someone in place to implement proper data stewardship practices and allocate funds for company-wide data governance initiatives, a business exposes itself to unnecessary risk, including fines for non-compliance, delays in providing critical data, and even potentially leaving the institution vulnerable to cyberattacks, hackers, and other malicious actors.
Set Clear KPIs Around Data Quality
To quote the late, great Dr. George D’Elia, a master of survey design and a long-time pollster, “Without data, yours is just another opinion.” While stated a bit bluntly, this sentiment is crucial for executives when determining performance metrics for data quality, for without quality information, any attempt to accurately measure a data system or make improvements has already undermined itself. In particular, executives should demand clarity on vital data metrics, including the business impact of sub-par data, the overall accuracy percentage, and the overall completeness percentage.
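As a simple illustration of how the accuracy and completeness percentages mentioned above can be made concrete, the sketch below computes both over a small sample. The field names and the trusted reference source are assumptions for the example; real implementations would draw on a system of record:

```python
# Two common data quality KPIs, computed directly from records.
def completeness(records, required_fields):
    """Share of required cells that are actually populated."""
    total = len(records) * len(required_fields)
    filled = sum(
        1 for r in records for f in required_fields if r.get(f) not in (None, "")
    )
    return filled / total if total else 1.0

def accuracy(records, reference, key="id"):
    """Share of records that exactly match a trusted reference source."""
    matches = sum(1 for r in records if reference.get(r[key]) == r)
    return matches / len(records) if records else 1.0

records = [
    {"id": 1, "country": "DE"},
    {"id": 2, "country": ""},
    {"id": 3, "country": "FR"},
]
reference = {
    1: {"id": 1, "country": "DE"},
    2: {"id": 2, "country": "US"},
    3: {"id": 3, "country": "FR"},
}
print(f"completeness: {completeness(records, ['id', 'country']):.0%}")  # 83%
print(f"accuracy:     {accuracy(records, reference):.0%}")              # 67%
```

Tracked over time, these two percentages give executives a trend line rather than an opinion, and a threshold below which the business impact of sub-par data can be escalated.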
Embed Data Quality in Culture and Strategy
Investing in data quality management is far from a one-time process for the C-suite; it should instead be considered an area that requires continued focus and attention. Data quality management should not be a static system that relies on occasional updates; it must evolve to meet the specific needs of a given company. To remain effective and efficient at gathering valuable data, the data quality standards must be self-sustainable, provide actionable data that is measurable over time, and be incorporated into data processes throughout the business.
Closing Argument
As executives, it’s time to take proactive steps towards enhancing your data quality management. Assess your current data practices and identify areas for improvement. Invest in training and tools that promote data integrity and accuracy. By prioritizing data quality, you not only protect your organization from risks but also unlock the full potential of your data-driven initiatives. Don’t wait until it’s too late—start building a strong foundation for your company’s future today!
Frequently Asked Questions (FAQs)
Why is data quality management important for our organization?
Data quality management is crucial because it ensures accurate, reliable, and timely data. Poor data quality can lead to costly mistakes, misguided strategies, and lost opportunities.
What are the main risks associated with poor data quality?
Poor data quality can result in financial loss, damaged reputation, regulatory non-compliance, and decision-making based on incorrect information. It can also lead to inefficient operations and wasted resources.
How can we assess the current quality of our data?
Conduct data audits and quality assessments to measure accuracy, completeness, relevance, and consistency. Regularly analyze your data sources to identify any discrepancies or areas for improvement.
What tools or technologies can help improve data quality?
Consider investing in data quality management software, data cleansing tools, and analytics platforms that provide real-time insights and monitoring capabilities.
How does improved data quality lead to increased ROI?
High-quality data enhances decision-making processes, optimizes operations, and increases customer satisfaction, contributing to improved performance and profitability in data-driven initiatives.
What steps can we take to foster a culture of data quality in our organization?
Encourage all employees to take responsibility for data quality, provide training on best practices, and establish clear policies and procedures for data management. Leadership support is essential for cultivating this culture.
How often should we review our data quality management practices?
Regular reviews should be scheduled at least quarterly, but the frequency may vary depending on the volume and nature of your data. Continuous monitoring is advised to keep data quality in check.
What are some common challenges in data quality management?
Common challenges include data silos, inconsistent data formats, lack of governance, and insufficient training. Addressing these issues requires a coordinated effort across the organization.
Can we achieve data quality overnight?
No, achieving high data quality is an ongoing process that requires commitment, resources, and regular evaluation. It’s essential to establish a long-term strategy for sustained improvement.
What role do employees play in maintaining data quality?
All employees are integral to maintaining data quality. They should be vigilant about data entry, adhere to established protocols, and report inconsistencies or errors promptly. Encouraging a sense of ownership can significantly enhance data quality.