Publications from consultants range from lightweight ‘teasers’ designed to whet the appetite to informative teaching material. DNV GL’s RP 0497, ‘Data quality assessment framework’ (DQAF), a 40-plus-page free download, falls in the middle of the spectrum. DQAF’s target audience is customers, consultants and the data quality community at large.
The basis for DNV GL’s approach is the ‘stringent’ ISO 8000-8 data quality standard that requires ‘complete definitions for both metadata and the conceptual model.’ The principle behind the ISO standard is to ‘evaluate data as correct (good) or incorrect (bad).’ This obvious principle is illustrated with a singularly uninformative graphic.
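The good/bad principle amounts to checking each record against its metadata definitions. A minimal sketch of the idea (the schema, field names and units below are invented for illustration; the RP does not prescribe any implementation):

```python
# Hypothetical sketch of an ISO 8000-8-style 'correct (good) vs incorrect
# (bad)' evaluation: a record is judged against metadata definitions
# (field names, declared types, units). All names here are invented.
from dataclasses import dataclass

@dataclass
class FieldDef:
    name: str
    dtype: type
    unit: str            # unit of measure, part of the metadata definition
    required: bool = True

SCHEMA = [
    FieldDef("well_id", str, "n/a"),
    FieldDef("depth", float, "m"),
    FieldDef("pressure", float, "bar"),
]

def evaluate(record: dict) -> bool:
    """Return True ('good') only if every required field is present
    and of the declared type; otherwise False ('bad')."""
    for f in SCHEMA:
        if f.name not in record:
            if f.required:
                return False
            continue
        if not isinstance(record[f.name], f.dtype):
            return False
    return True
```

Under this scheme, `evaluate({"well_id": "W-1", "depth": 2450.0, "pressure": 310.5})` passes, while a record with a missing field or a string where a number is declared is flagged ‘bad.’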
DNV GL is primarily a classification and technical assurance company and its view of data quality veers towards ‘the impact of data quality on operations.’ This is assessed using tools for risk analysis ‘such as bowtie models, risk matrices, and fault tree analysis.’ Some may consider this scope creep from mainstream data quality. Likewise, the coupling of information security to data quality makes for an extremely broad field of study. DNV GL claims that ‘a high level of maturity of data quality is generally associated with higher levels of information security.’
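Of the risk tools cited, fault tree analysis is the most readily computable. A minimal sketch, assuming independent basic events (the event names and probabilities are invented for illustration and do not come from the RP):

```python
# Minimal fault-tree arithmetic, assuming independent basic events.
# AND gate: all inputs must fail. OR gate: any one input failing suffices.
from math import prod

def and_gate(probs):
    """P(all inputs fail) = product of input probabilities."""
    return prod(probs)

def or_gate(probs):
    """P(at least one input fails) = 1 - product of (1 - p)."""
    return 1.0 - prod(1.0 - p for p in probs)

# Illustrative top event 'operational decision made on bad data': it
# occurs if a sensor drifts, OR if both the automated validation check
# and the manual review miss an error. Probabilities are made up.
p_sensor_drift = 0.01
p_validation_miss = 0.10
p_review_miss = 0.20

p_top = or_gate([p_sensor_drift,
                 and_gate([p_validation_miss, p_review_miss])])
```

Here `p_top` works out to about 0.03, the kind of figure that would then be placed on a risk matrix or bowtie diagram.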
DQAF takes a DAMA-esque approach, proposing a framework for ‘checking that the quality of a data source matches the criteria appropriate for its context.’ In addition, a data quality maturity framework is proposed for corporate self-assessment and improvement.
Full-blown data quality assessment calls on a raft of further RPs and publications: DNV GL’s ‘DQA for sensor systems and time series data’ (not publicly available), ISO 8000-8, the data quality maturity models of Loshin and CMMI, the W3C’s data on the web best practices, ISO 31000 risk management, DNVGL-RP-0496 cyber security resilience management and ISO 27000 information security management! More formalism is evidenced in the terminological definitions taken from ISO/IEC 11179-1:2004.
In this reviewer’s opinion, DQAF illustrates the difficulty of an abstract, multi-domain approach to data quality. In navigation data, for instance, quality pitfalls tend to lurk deep inside the data and may require a deep understanding of the domain to spot and fix. The high-level meta-model and KPI-style approach to data maturity could lead to a false sense of ‘information security.’
© Oil IT Journal - all rights reserved.