The Data Quality Assessment Framework shows you how to measure and monitor data quality, ensuring quality over time. You'll start with general concepts of measurement and work your way through a detailed framework of more than three dozen measurement types related to five objective dimensions of quality: completeness, timeliness, consistency, validity, and integrity. Ongoing measurement, rather than one-time activities, will help your organization reach a new level of data quality. This plain-language approach to measuring data can be understood by both business and IT, and it provides practical guidance on how to apply the DQAF within any organization, enabling you to prioritize measurements and effectively report on results. Strategies for using data measurement to govern and improve the quality of data, and guidelines for applying the framework within a data asset, are included. You'll come away able to prioritize which measurement types to implement, knowing where to place them in a data flow and how frequently to measure. Also included are common conceptual models for defining and storing data quality results for trend analysis, along with generic business requirements for ongoing measuring and monitoring, including the calculations and comparisons that make the measurements meaningful and help you understand trends and detect anomalies.
• Demonstrates how to leverage a technology-independent data quality measurement framework for your specific business priorities and data quality challenges
• Enables discussions between business and IT with a non-technical vocabulary for data quality measurement
• Describes how to measure data quality on an ongoing basis with generic measurement types that can be applied to any situation
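As a minimal sketch of the kind of ongoing measurement the description above refers to, the following Python example computes a completeness ratio and compares it against a historical baseline to flag anomalies. The field name, the 3-sigma threshold, and the history store are assumptions made for illustration, not definitions taken from the book.

```python
# Illustrative sketch only: a simple completeness measurement compared
# against a historical baseline to flag anomalies. Field names and the
# 3-sigma threshold are assumptions, not taken from the DQAF itself.
from statistics import mean, stdev

def completeness_ratio(records, field):
    """Share of records whose `field` is populated (non-null, non-empty)."""
    populated = sum(1 for r in records if r.get(field) not in (None, ""))
    return populated / len(records) if records else 0.0

def is_anomalous(current, history, sigmas=3.0):
    """Flag the current result if it falls outside `sigmas` standard
    deviations of earlier measurements (requires a few prior runs)."""
    if len(history) < 2:
        return False
    mu, sd = mean(history), stdev(history)
    return sd > 0 and abs(current - mu) > sigmas * sd

# Example run: measure completeness of a hypothetical `customer_email`
# field on each load and compare it with results from earlier loads.
history = [0.97, 0.96, 0.98, 0.97]            # prior completeness ratios
batch = [{"customer_email": "a@x.com"}, {"customer_email": None}]
current = completeness_ratio(batch, "customer_email")
print(current, is_anomalous(current, history))
```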
Morgan Kaufmann Publishers and the Object Management Group™ (OMG) have joined forces to publish a line of books addressing business and technical topics related to OMG's large suite of software standards. OMG is an international, ...
This book will appeal to data quality and data management professionals, especially those involved with data governance, across a wide range of industries, as well as academic and government organizations.
The best books occupy precious desk space, dog-eared and highlighted. By this standard, Danette McGilvray's book, Executing Data Quality Projects: Ten Steps to Quality Data and Trusted Information™, will be absolutely ravaged, ...
Master techniques in:
• Data profiling and gathering metadata
• Identifying, designing, and implementing data quality rules
• Organizing rule and error catalogues
• Ensuring accuracy and completeness of the data quality assessment ...
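As a minimal sketch of what a data quality rule and an error catalogue might look like in code, the example below defines two simple rules and logs failing records. The rule names, fields, and catalogue structure are assumptions made for illustration, not techniques quoted from the book.

```python
# Illustrative sketch only: simple data quality rules plus an error
# catalogue that records which rule each failing record violated.
import re
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class Rule:
    name: str
    check: Callable[[dict], bool]   # returns True when the record passes

@dataclass
class ErrorCatalogue:
    entries: list = field(default_factory=list)

    def log(self, rule, record):
        self.entries.append({"rule": rule.name, "record": record})

# Hypothetical rules for a customer record (not from the book).
rules = [
    Rule("email_format",
         lambda r: bool(re.match(r"[^@\s]+@[^@\s]+\.[^@\s]+$", r.get("email", "")))),
    Rule("age_in_range",
         lambda r: isinstance(r.get("age"), int) and 0 <= r["age"] <= 120),
]

catalogue = ErrorCatalogue()
records = [{"email": "a@x.com", "age": 34}, {"email": "bad-address", "age": 200}]
for rec in records:
    for rule in rules:
        if not rule.check(rec):
            catalogue.log(rule, rec)

print(catalogue.entries)   # two failures, both for the second record
```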
This book offers an overview of why data governance is needed, how to design, initiate, and execute a program, and how to keep the program sustainable.
This book draws the strong connection between concepts in data science and process engineering that is necessary to ensure better quality levels, and it takes you through a systematic approach to measuring holistic quality with several case ...
Data Quality: The Accuracy Dimension is about assessing the quality of corporate data and improving its accuracy using the data profiling method.
This 20th anniversary edition includes a series of detailed case study interviews by David Whitford, Editor at Large, Fortune Small Business, which explore how organizations around the world have been transformed by Eli Goldratt's ideas.
In this comprehensive book, Rupa Mahanti provides guidance on the different aspects of data quality with the aim of enabling readers to improve data quality.