About 80 attended the 11th SMi E&P Information and Data Management conference in London last month. Schlumberger's Eric Abecassis' presentation introducing Petrel Data in Context sparked an interesting discussion on the merits or otherwise of indexing and cataloging versus ‘Google-style’ search. Classifying E&P documents is a very difficult task that ‘would take for ever!’ Hence the automated tagging approach of Data in Context/MetaCarta.
However, Caspar Schoorl and Karen Blohm from data management specialist Fugro Data Solutions came down on the index/catalog side of the debate, particularly in the context of mergers and acquisitions and the need to combine different corporate data systems. For Fugro, metadata use has increased significantly since 2000.
Al Kok (Saudi Aramco) likewise argued in favor of cataloguing, advocating ‘right-size’ metadata capture which can be manual or automated. Saudi Aramco's Exploration Legacy Data project ran from 2002 to 2006 and included document scanning and metadata capture to the database. Workflow and processes have been developed for indexing and data loading, assuring ‘rule-based metadata management.’ Search can now be by metadata, full text and/or GIS-based criteria. A data governance program monitors roles, responsibilities and data policies.
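The combined metadata/full-text/GIS search Kok describes can be illustrated with a minimal sketch. This is not Aramco's implementation; the document records, field names and coordinates below are all invented for illustration.

```python
# Hypothetical catalog entries: structured metadata, free text and a location.
docs = [
    {"id": "RPT-001", "basin": "Basin A", "doc_type": "well report",
     "text": "Final well report with composite log", "x": 51.2, "y": 19.8},
    {"id": "RPT-002", "basin": "Basin B", "doc_type": "seismic survey",
     "text": "2D seismic acquisition report", "x": 37.5, "y": 24.1},
]

def search(docs, metadata=None, full_text=None, bbox=None):
    """Return ids of documents passing optional metadata, text and spatial filters."""
    hits = []
    for d in docs:
        # Metadata filter: every requested key/value pair must match exactly.
        if metadata and any(d.get(k) != v for k, v in metadata.items()):
            continue
        # Full-text filter: naive case-insensitive substring match.
        if full_text and full_text.lower() not in d["text"].lower():
            continue
        # Spatial filter: point-in-bounding-box test, bbox = (xmin, ymin, xmax, ymax).
        if bbox:
            xmin, ymin, xmax, ymax = bbox
            if not (xmin <= d["x"] <= xmax and ymin <= d["y"] <= ymax):
                continue
        hits.append(d["id"])
    return hits
```

A real catalog would back this with database indexes and a spatial engine, but the principle — intersecting the three filter types over one document index — is the same.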
Martin Turner presented Hess' PPDM-based master data store, federated GIS ‘data nodes’ and data governance procedures. Hess is also a ‘cataloguer,’ leveraging a modified FGDC metadata standard in a move from paper-based maps to a lightweight, thin-client web browser accessing data in a federation of a PPDM well master database, Tobin Land Suite, SAP, seismics and production data. Federation is achieved through a combination of ETL, SDE, ArcGIS Server and web publishing tools. Geodatabase governance uses a simple model that ‘expects users to think, that's their job.’ Data is spatialized in the global Hess GIS Geodatabase and stored using the GIS ‘node’ concept. A GIS data node is a combination of GIS data and application files—all stored in a common folder structure. ‘Node keepers’ determine folder structure and manage data and access. Nodes transform loose data concepts into ‘something that is useful to Hess corporate.’ Turner reports that ‘attitudes towards metadata capture and GIS data management are changing.’
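The ‘data node’ idea — GIS data and application files in a common folder structure under a named keeper — can be sketched as follows. The folder layout, node name and keeper are invented; Hess' actual conventions are not public in this article.

```python
from pathlib import Path
import tempfile

# Assumed node layout: one subfolder per content type (names are illustrative).
NODE_LAYOUT = ["geodatabase", "mxd", "layer_files", "docs"]

def create_node(root: Path, name: str, keeper: str) -> dict:
    """Create a node's standard folder tree and return its registry entry."""
    node_dir = root / name
    for sub in NODE_LAYOUT:
        (node_dir / sub).mkdir(parents=True, exist_ok=True)
    # The 'node keeper' is recorded so access and layout questions have an owner.
    return {"name": name, "path": str(node_dir), "keeper": keeper}

root = Path(tempfile.mkdtemp())
registry = [create_node(root, "gom_exploration", keeper="jsmith")]
```

The point of the pattern is that a node is discoverable and governable as a unit: a registry of nodes plus a keeper per node is enough to answer "what GIS data do we have and who owns it?"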
Tarun Chandrasekhar (Neuralog) outlined Pemex' deployment of a PPDM-based data store in the form of NeuraDB. Pemex has developed well data lifecycle processes and a ‘satisfactory’ division of labor between IT and the business. Pemex' well log library dates back over 90 years and includes a huge Mylar and paper archive. Today, Pemex' digital well log repositories span physical documents, network drives, application databases, DMS and custom and commercial well log repositories. NeuraDB is now the hub of a well log ‘knowledge factory,’ capable of supporting the log lifecycle from field logs through processing and interpretation. Work order and data quality management are supported via the PPDM database. IT has been involved in the project to assure data QC. This has resulted in the publication of a ‘Guide for Certification of Analog and Digital Geophysical Data’ by Pemex' IT/Operations unit. Pemex' drilling department is now custodian of the well log repository, and uses ‘well researched’ QC practices to provide quality processed data for Pemex' interpreters.
Achim Kamelger described OMV's ‘ISIS’ project, a master data store built around Schlumberger's Seabed data model. OMV's acquisition of Romanian oil company Petrom resulted in ‘too many databases,’ with multiple links, synchronization issues and the need to duplicate data across applications. Proliferating Petrel projects led to multiple sources of ‘almost the same’ data. Directional surveys in Excel required a huge editing effort prior to loading into Petrel, which then produced multiple trajectories!
ISIS is a set of federated databases with master data in a shared master data store (MDS). The MDS holds master data for structured, unstructured and spatial data. Dataflows are driven by processes and standards. OpenSpirit and ProSource are currently under test. Schlumberger was the development partner. A proof-of-concept MDS was built using the Seabed data model accessing data in GeoFrame, OpenWorks, ArcGIS and a DMS (Documentum). The idea was to ‘buy not build’ and to ‘configure not program.’
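The core of a master data store like the one behind ISIS is a single master record per entity, cross-referenced to its representations in each federated source system. The sketch below illustrates that idea only; the well identifier, system names and local ids are invented, and this is not Schlumberger's Seabed implementation.

```python
# Master registry: one entry per real-world entity (here, a well).
master_wells = {}

def register_master(uwi, name):
    """Create the authoritative master record for a well."""
    master_wells[uwi] = {"name": name, "sources": {}}

def link_source(uwi, system, local_id):
    """Record where the same well lives in a federated source system,
    so applications can resolve 'almost the same' duplicates to one master."""
    master_wells[uwi]["sources"][system] = local_id

register_master("WELL-A", "Example well")          # illustrative only
link_source("WELL-A", "OpenWorks", "OW-001234")    # invented local ids
link_source("WELL-A", "GeoFrame", "GF-98765")
```

With such cross-references in place, a trajectory loaded into Petrel can be traced back to exactly one master well — addressing the ‘multiple sources of almost the same data’ problem Kamelger describes.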
‘Hard and soft facts’ emerged from the proof of concept. The key is that you need to deal with people issues and to focus on the end user's needs. OMV has developed a cookbook with Schlumberger addressing issues such as naming conventions. Scaling up from the proof of concept was a potential pitfall. This was addressed by user acceptance testing, service level agreements and ‘SP3R2’ (standard processes, roles and responsibilities). OMV underestimated users' resistance to the DMS. If a secretary does not support the DMS, the boss will not use it!
Dong's Fleming Rolle noted a ‘disconnect’ between data rights management and access/copy control, especially with regard to data ‘outside the firewall’ used in data rooms and partner meetings. Today, you can take all the logs from a new well on a $50 SD card, or a 3D survey on a 250GB portable drive! We need more rigor regarding ‘informal’ copying and sharing. The problem is, if you deploy ‘ultimate security,’ people won't be able to do their jobs and they will probably find ways around the security system anyhow. Large data sets are protected with ASP/DSP, VPN and Citrix/ThinAnywhere solutions. But what about the CEO's PowerPoint presentation to shareholders? Dong has developed a data ownership model that defines data types and ownership roles for ‘advisor,’ ‘strategy owner’ and ‘business owner.’ The model is being integrated into corporate ‘stage gate’ workflows, so that the company knows where data is at any point in time. Dong is using ISO 27001 and 27002. Rolle recommends that this 40-page document be read and understood by IM and IT. It deals with HR issues, encryption and access control.
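Dong's ownership model — data types mapped to named ownership roles — is simple enough to sketch. The data types and assignees below are invented examples, not Dong's actual assignments.

```python
# Hypothetical data ownership model: each data type carries the three roles
# named in the talk ('advisor', 'strategy owner', 'business owner').
ownership_model = {
    "well logs": {
        "advisor": "petrophysics lead",
        "strategy owner": "subsurface manager",
        "business owner": "asset manager",
    },
    "seismic surveys": {
        "advisor": "geophysics lead",
        "strategy owner": "exploration manager",
        "business owner": "asset manager",
    },
}

def owner(data_type, role):
    """Look up who holds a given ownership role for a data type."""
    return ownership_model[data_type][role]
```

Wiring such a lookup into stage-gate workflows is what lets the company answer, at any gate, who is accountable for a given data set.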
The conference generally reflected industry consolidation around master data, GIS, data quality and governance. Not exactly rocket science or, as one observer put it, ‘Where are the new ideas?’
This article is an abstract from The Data Room’s Technology Watch report from the SMi E&P Information and Data Management conference. More information and samples of this subscription-based service from www.oilit.com/tech and email@example.com.
© Oil IT Journal - all rights reserved.