Measuring Data Quality for Ongoing Improvement: A Data Quality Assessment Framework
- 9h 5m
- Laura Sebastian-Coleman
- Elsevier Science and Technology Books, Inc.
- 2013
The Data Quality Assessment Framework shows you how to measure and monitor data quality, ensuring quality over time. You'll start with general concepts of measurement and work your way through a detailed framework of more than three dozen measurement types related to five objective dimensions of quality: completeness, timeliness, consistency, validity, and integrity. Ongoing measurement, rather than one-time activities, will help your organization reach a new level of data quality. This plain-language approach to measuring data can be understood by both business and IT; it provides practical guidance on how to apply the DQAF within any organization, enabling you to prioritize measurements and report effectively on results. Strategies for using data measurement to govern and improve the quality of data, and guidelines for applying the framework within a data asset, are included. You'll come away able to prioritize which measurement types to implement, knowing where to place them in a data flow and how frequently to measure. Common conceptual models for defining and storing data quality results for purposes of trend analysis are also included, along with generic business requirements for ongoing measurement and monitoring, including the calculations and comparisons that make measurements meaningful, reveal trends, and detect anomalies.
- Demonstrates how to leverage a technology-independent data quality measurement framework for your specific business priorities and data quality challenges
- Enables discussions between business and IT with a non-technical vocabulary for data quality measurement
- Describes how to measure data quality on an ongoing basis with generic measurement types that can be applied to any situation
About the Author
Laura Sebastian-Coleman, a data quality architect at Optum Insight, has worked on data quality in large health care data warehouses since 2003. Optum Insight specializes in improving the performance of the health system by providing analytics, technology, and consulting services. Laura has implemented data quality metrics and reporting, launched and facilitated Optum Insight's Data Quality Community, contributed to data consumer training programs, and led efforts to establish data standards and manage metadata. In 2009, she led a group of analysts from Optum and UnitedHealth Group in developing the original Data Quality Assessment Framework (DQAF), which is the basis for Measuring Data Quality for Ongoing Improvement.
An active professional, Laura has delivered papers at MIT's Information Quality Conferences and at conferences sponsored by the International Association for Information and Data Quality (IAIDQ) and the Data Governance Organization (DGO). From 2009 to 2010, she served as IAIDQ's Director of Member Services.
Before joining Optum Insight, she spent eight years in internal communications and information technology roles in the commercial insurance industry. She holds the IQCP (Information Quality Certified Professional) designation from IAIDQ, a Certificate in Information Quality from MIT, a B.A. in English and History from Franklin & Marshall College, and a Ph.D. in English Literature from the University of Rochester (NY).
In this Book
- Data
- Data, People, and Systems
- Data Management, Models, and Metadata
- Data Quality and Measurement
- DQAF Concepts
- DQAF Measurement Types
- Initial Data Assessment
- Assessment in Data Quality Improvement Projects
- Ongoing Measurement
- Requirements, Risk, Criticality
- Asking Questions
- Data Quality Strategy
- Directives for Data Quality Strategy
- Functions of Measurement: Collection, Calculation, Comparison
- Features of the DQAF Measurement Logical Model
- Facets of the DQAF Measurement Types