More from the SMi E&P data management conference

Records management and disasters. Standardizing Ppdm. ENI, KOC on data management.

Speaking at the SMi E&P Data Management conference in London earlier this year, Paul Duller (Tribal) and Alison North (AN Information) provided a salutary tale of the importance of information management. In 2010, a natural gas pipeline operated by Pacific Gas & Electric exploded in San Bruno, a suburb of San Francisco, killing eight people and causing considerable property damage.

The US National Transportation Safety Board determined that inadequate quality assurance in a 1956 line relocation project was a likely cause, along with inadequate maintenance. North was called in as an expert witness because of her experience with old paper records. Her investigations showed that the GIS system that PG&E used for its integrity management program contained ‘inadequate and misleading information.’ The paper records showed that pipe which the GIS system reported as ‘seamless’ in fact had a longitudinal seam, and so should have been subject to a more stringent maintenance program. North emphasised the risks that companies run from such ‘dark data’ of uncertain provenance that can show up in a legal discovery process. Similar risks are inherent in other corporate data sources such as email.

Robert Best (Petroweb) asked ‘why standardize a standard?’ It turns out that there are good reasons to standardize a Ppdm implementation, and Petroweb, along with OpenSpirit, TGS and OilWare, is working on tools (data load, export, triggers etc.) and techniques for a robust Ppdm deployment. While vanilla Ppdm provides the data model and some guidelines, it does not say how primary keys are defined, how well coordinates should be stored, or how coordinate reference systems (CRSs) are defined. And while the Ppdm AREA can be used to define a hierarchy of state and county (or country and block), this is hard to implement. Ppdm is kicking off a work group on implementation standards that is to develop a reference implementation.
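By way of illustration (and not part of Best’s presentation), the following minimal Python sketch shows the kind of convention such a work group might codify: a deterministic primary key format and a check that an area hierarchy is well formed. The key format, field names and hierarchy rules are assumptions made for the example, not the actual Ppdm schema or Petroweb’s tooling.

    # Hypothetical sketch of an implementation convention: a deterministic
    # primary key format and a simple AREA-style hierarchy check. Names and
    # rules are illustrative only, not the actual Ppdm schema.

    def make_area_key(area_type: str, name: str) -> str:
        """Build a predictable primary key, e.g. 'COUNTY:HARRIS'."""
        return f"{area_type.upper()}:{name.strip().upper().replace(' ', '_')}"

    # Allowed parent types for each area type in this illustrative hierarchy.
    VALID_PARENTS = {
        "COUNTY": {"STATE"},
        "STATE": {"COUNTRY"},
        "BLOCK": {"COUNTRY"},
        "COUNTRY": {None},
    }

    def check_hierarchy(areas: dict) -> list:
        """Return keys of areas whose parent breaks the hierarchy rules."""
        problems = []
        for key, rec in areas.items():
            parent = rec.get("parent")
            parent_type = areas[parent]["type"] if parent else None
            if parent_type not in VALID_PARENTS.get(rec["type"], set()):
                problems.append(key)
        return problems

    areas = {
        make_area_key("COUNTRY", "UK"): {"type": "COUNTRY", "parent": None},
        make_area_key("BLOCK", "211/18"): {
            "type": "BLOCK", "parent": make_area_key("COUNTRY", "UK")},
    }
    print(check_hierarchy(areas))  # [] means the hierarchy is consistent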

ENI’s Paul Richter likened data management to cleaning up a messy child’s room. No sooner has it been done than you are back to square one. Enter data ‘utopia,’ where things stay nice and tidy all the time. This can be achieved with a ‘robust modular framework’ that fits the business’s needs. First, identify your master database and develop a method to reconcile CRS and units of measure (UoM) across different data stores and applications. Then develop a strategy for reconciliation and data maintenance. ENI is working with One Virtual Source (OVS) on such a system, using data quality metrics to trigger clean-up processes. Richter is to present more on this project at next month’s PNEC.
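The following minimal Python sketch illustrates the general idea of a quality metric driving a clean-up process. It is not ENI’s or OVS’s implementation; the well record fields, the completeness score and the threshold are assumptions made for the example.

    # Minimal sketch (not ENI's or OVS's actual code) of a data quality metric
    # triggering a clean-up process on a master well table.

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class Well:
        name: str
        crs: Optional[str]        # coordinate reference system, e.g. "EPSG:4326"
        depth_uom: Optional[str]  # unit of measure for depths, e.g. "m" or "ft"

    def completeness(wells: list) -> float:
        """Fraction of wells with both CRS and depth unit populated."""
        ok = sum(1 for w in wells if w.crs and w.depth_uom)
        return ok / len(wells) if wells else 1.0

    def maybe_trigger_cleanup(wells: list, threshold: float = 0.95) -> None:
        score = completeness(wells)
        if score < threshold:
            # A real framework would queue a reconciliation workflow here;
            # this sketch just flags the offending records.
            for w in wells:
                if not (w.crs and w.depth_uom):
                    print(f"flag for clean-up: {w.name}")
        print(f"completeness = {score:.0%} (threshold {threshold:.0%})")

    wells = [Well("A-1", "EPSG:23031", "m"), Well("A-2", None, "ft")]
    maybe_trigger_cleanup(wells)  # flags A-2, reports 50% completeness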

KOC’s Hussain Zaid Al Ajmi described a similar approach, a ‘front end’ integration and data QC project in a multi-vendor environment. Dispersed data has been cleansed, ‘stringent’ naming and UoM/CRS conventions applied, and everything copied to a new OpenWorks master project database, a.k.a. KOC’s single source of truth. The plan is now to promote the master to a ‘central authenticated data bank’ and add automated workflows and processes. BP’s Project Chile is reported as taking a similar approach.
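For illustration only, a short Python sketch of the kind of normalization such a clean-up implies: a hypothetical naming convention and conversion of depths to a single unit of measure before loading to a master project. It does not reflect KOC’s actual conventions or the OpenWorks API.

    # Illustrative sketch only, not KOC's conventions or the OpenWorks API:
    # enforce a (hypothetical) well naming convention and convert depths to
    # metres before loading to a master project.

    import re

    FT_TO_M = 0.3048  # exact international foot to metre conversion

    def normalize_name(raw: str) -> str:
        """Upper-case, collapse whitespace/underscores to single hyphens."""
        return re.sub(r"[\s_]+", "-", raw.strip()).upper()

    def to_metres(value: float, uom: str) -> float:
        """Convert a depth to metres; only 'm' and 'ft' handled here."""
        if uom.lower() in ("m", "metre", "meter"):
            return value
        if uom.lower() in ("ft", "feet", "foot"):
            return value * FT_TO_M
        raise ValueError(f"unknown unit of measure: {uom}")

    print(normalize_name(" kuwait  well_001 "))  # KUWAIT-WELL-001
    print(round(to_metres(10000, "ft"), 1))      # 3048.0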

