14th PNEC E&P Data Conference, Houston

Shell on seismic data management, Petrel’s evolving role and R5000 deployment—its ‘most complex project ever.’ Continental Resources deploys a PPDM/NeuraDB master data system. ExxonMobil struggles with global seismic inventory. Southwestern Energy teams with Petris on data quality.

According to Cora Poché, Shell has petabytes of data online and shelves of cartridges. But despite a global seismic data management policy and standards, projects are sometimes executed with incomplete data. Poor audit trails mean that data can still get lost and may be re-purchased. Shell has been working to improve the situation since 2008 with a global, map-enabled index of all of Shell’s seismics. Company policy for data preservation, remastering and vectorization has been published as the Shell E&P Global Seismic Data Management Manual. The project started in the US and was going global when the oil price cratered in 2009 and the effort was scaled back. Poché observed a strong inverse correlation between the oil price and discretionary spend on data management!

David McMahan outlined how Continental Resources got started with a PPDM-based master data system spanning G&G, vendor data, physical inventory and workflow improvement. Continental buys data from IHS, HPDI and MJ Systems, which is used, inter alia, in Geographix projects. NeuraDB, an ‘out-of-the-box’ PPDM database from Neuralog, was deployed in a SQL Server/ESRI ArcSDE environment. This was extended with data loading tools, an EDMS and web service updates from IHS Enerdeq and HPDI. The five million-well dataset is now updated nightly. Continental is now planning integration with a new accounting system and a transition to the PPDM 3.8 data model. All of the new data loaders are now available in the ‘shrink wrap’ version of Neuralog, which will likewise migrate to the 3.8 database later this year.
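
Neither Continental’s loaders nor the vendor feeds are public, but the nightly refresh pattern itself is simple. The Python sketch below upserts vendor well headers into a PPDM-style WELL table on SQL Server; the fetch step, DSN and sample records are hypothetical placeholders standing in for the IHS Enerdeq/HPDI web services.

```python
# Minimal sketch of a nightly well-header refresh into a PPDM-style WELL
# table. The fetch step and DSN are placeholders; the real Enerdeq/HPDI
# feeds and Continental's loaders are proprietary.
import pyodbc

def fetch_vendor_wells():
    """Stand-in for a vendor web-service call: (uwi, name, lat, lon)."""
    return [
        ("35003210660000", "SMITH 1-23", 36.5921, -98.8842),
        ("35003210670000", "JONES 2-14", 36.6103, -98.9011),
    ]

def upsert_wells(rows):
    cn = pyodbc.connect("DSN=ppdm")  # SQL Server, per the deployment described
    sql = """
    MERGE well AS t
    USING (VALUES (?, ?, ?, ?))
          AS s (uwi, well_name, surface_latitude, surface_longitude)
       ON t.uwi = s.uwi
    WHEN MATCHED THEN UPDATE SET
         well_name = s.well_name,
         surface_latitude = s.surface_latitude,
         surface_longitude = s.surface_longitude
    WHEN NOT MATCHED THEN
         INSERT (uwi, well_name, surface_latitude, surface_longitude)
         VALUES (s.uwi, s.well_name, s.surface_latitude, s.surface_longitude);
    """
    with cn:  # pyodbc commits on clean exit from the block
        cur = cn.cursor()
        for row in rows:
            cur.execute(sql, row)

if __name__ == "__main__":
    upsert_wells(fetch_vendor_wells())
```

Scheduled via SQL Server Agent or cron, a job of this shape keeps a multi-million-well master current overnight.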

Schlumberger’s data guru Steve Hawtin has been looking at the impact of the Data Management Association’s (DAMA) writings and activity on the oil and gas vertical. DAMA has published a set of data management best practices in the form of a Data Management Dictionary (2008) and the Guide to the Data Management Body of Knowledge (2009). DAMA has carved data management up into functions and elements and laid down guidelines for data governance, where Hawtin sees a pressing need in the upstream. But E&P differs from DAMA’s core constituency (financial services) in that it adopts a ‘buy’ rather than ‘build’ policy and is mostly confronted by integration issues. Here DAMA gives some good pointers, although not all are applicable. Hawtin believes that attempts to leverage the DAMA approach to reference data and master data management failed in E&P in the 1990s with Epicentre. In respect of metadata, Hawtin believes ‘our definition is completely different!’ While the majority of the DM-BOK is a valuable resource, E&P data managers need to be aware of such conflicts.

Like Shell, ExxonMobil is fretting over its global seismic data inventory and is working on improving access to its massive 3D datasets, as Jim Blackwell explained. The system will provide a world map-based front-end to a database of proprietary and vendor data, with metadata links to other databases of field tapes, survey notes, legal rights etc. Following an evaluation of third-party solutions, ExxonMobil decided to extend its own proprietary database in a major in-house development leveraging ArcGIS/SDE and the PetroWeb application.
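
The talk gave no schema details, but the shape of such an index record is easy to picture. The dataclass below is purely illustrative, with hypothetical field names, not ExxonMobil’s design: a survey outline drives the map front-end, while the remaining fields link out to the tape, document and entitlement stores.

```python
# Purely illustrative survey-index record; field names are hypothetical.
# The outline polygon drives the map front-end; the remaining fields are
# metadata links out to tape, document and legal-rights databases.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class SurveyIndexRecord:
    survey_name: str
    acquisition_type: str                 # "2D" or "3D"
    ownership: str                        # proprietary vs. vendor/spec
    outline_wkt: str                      # WGS84 polygon for the map layer
    field_tape_ids: list = field(default_factory=list)    # field-tape database
    survey_note_docs: list = field(default_factory=list)  # scanned survey notes
    legal_rights_ref: Optional[str] = None                # entitlements database
```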

John Deck described Southwestern Energy’s data quality framework, a joint development with Petris Technology. Prior to the project, Southwestern was managing its data using a combination of Excel, SQL Server and folders. Deck was brought in to ‘sort this out’ and to improve data management of applications such as Petra, Kingdom, OpCenter and Property Master. The project centers on the creation of a single source of truth, common naming conventions and data governance. One early win was the use of Dynamic Drilling’s Apollo Dart WITSML aggregator to replace manual data entry from emailed deviation surveys. Petris’ DataVera has been deployed for quality assurance and currently profiles some 30 data types. Cleansed data is moved from application data stores into a new PPDM database. Other relevant standards include AOGC, API and AAPG. How do you know when quality is good enough? Deck suggests this is a balance ‘between perfection and a major screw-up such as drilling a well in the wrong place!’
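
DataVera’s rule language is proprietary, but a profiling pass reduces to running typed checks over each record and tallying the failures. The sketch below applies three such rules to well headers; the record layout, API-number format and the Arkansas-flavored coordinate window are illustrative assumptions, not Petris’ actual rule set.

```python
# Illustrative data-quality profiling in the DataVera spirit; record layout
# and rule thresholds are assumptions, not Petris' actual rules.
import re

API_10 = re.compile(r"^\d{10}$")  # 10-digit API number: state+county+sequence

def profile_well(rec):
    """Return the list of rule violations for one well-header record."""
    issues = []
    if not API_10.match(rec.get("api", "")):
        issues.append("malformed API number")
    lat, lon = rec.get("lat"), rec.get("lon")
    if lat is None or not (33.0 <= lat <= 36.5):    # Arkansas-ish sanity window
        issues.append("surface latitude out of range")
    if lon is None or not (-94.7 <= lon <= -89.6):
        issues.append("surface longitude out of range")
    if not rec.get("datum"):
        issues.append("missing geodetic datum")
    return issues

wells = [
    {"api": "0304712345", "lat": 35.42, "lon": -92.11, "datum": "NAD27"},
    {"api": "47-123-45",  "lat": 97.0,  "lon": -92.0,  "datum": ""},
]
for w in wells:
    print(w["api"], profile_well(w) or "clean")
```

Violation counts per rule also give Deck’s ‘good enough’ question a quantitative answer: quality is tracked as a trend rather than an anecdote.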

Hector Romero observed that Schlumberger’s Petrel has evolved from Shell’s tool of choice for static modeling into a complete workbench for petroleum geology. But Petrel data management is problematic, with an ‘explosion’ of multiple project copies and no audit trail. ‘Forgiving’ ASCII import is prone to bad or incomplete data. Geodetic integrity is also a challenge, and users risk ‘loss of context’ as data migrates. In 2007, senior data managers got together to introduce standards around Petrel in Shell. This led to the implementation of the Petrel reference project and better geodetic control. Shell’s mainstream interpretation environment is Landmark’s OpenWorks/SeisWorks. This links to geodata in ArcSDE via OpenSpirit and the Petrel reference project. Shell’s ‘Epicure’ middleware is also used to write back to OpenWorks, along with context/attributes ‘as far as possible.’ Jeremy Eade joined Romero to describe how Landmark has built plug-ins for both Petrel and Landmark’s PowerHub/PowerExplorer for direct connectivity between the two environments. Landmark’s R5000 data environment is the Shell standard and PowerExplorer is the data managers’ tool of choice.
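
What a reference-project gate might look like in practice: the sketch below refuses an ASCII point file that lacks a declared CRS or whose coordinates fall outside the project’s area of interest. The file layout, EPSG tag and bounds are hypothetical assumptions; this is not Shell’s Epicure and not a Petrel API.

```python
# Hypothetical pre-load gate for 'forgiving' ASCII imports: reject files
# with no declared CRS or with points outside the project AOI. Layout,
# CRS tag and bounds are illustrative assumptions.
MIN_X, MIN_Y, MAX_X, MAX_Y = 400_000, 6_500_000, 600_000, 6_700_000  # AOI, m

def check_ascii_points(path, expected_crs="EPSG:23031"):
    """Return the reasons to reject an x/y file before it reaches Petrel."""
    problems = []
    with open(path) as f:
        header = f.readline()
        if expected_crs not in header:       # no declared CRS: refuse to guess
            problems.append("header does not declare expected CRS")
        for n, line in enumerate(f, start=2):
            parts = line.split()
            if len(parts) < 2:
                problems.append(f"line {n}: incomplete record")
                continue
            try:
                x, y = float(parts[0]), float(parts[1])
            except ValueError:
                problems.append(f"line {n}: non-numeric coordinates")
                continue
            if not (MIN_X <= x <= MAX_X and MIN_Y <= y <= MAX_Y):
                problems.append(f"line {n}: point outside project AOI")
    return problems
```

Failing files go back to the user instead of silently populating an interpretation project with mis-positioned data.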

Petrosys’ Volker Hirsinger provided an overview of the state of the art in seismic master data management. Seismics has evolved from a 2D ‘frontier’ tool into the geologists’ equivalent of an MRI scan! But alongside the high end, companies still need to handle legacy data. Keeping seismic data in perpetuity means managing media and metadata, and is not always allocated the required budget. Geoscience Australia was forced to abandon its ambitious transcription effort through lack of funds. Ancillary documents, navigation data, velocities, processing workflows and multiple coordinate systems make for diverse data sets and a serious knowledge management issue. There is often too much focus on ‘managing’ large volumes of field data while other key data sets are neglected. Increasingly, workstation vendors ‘corner’ segments of the marketplace; for instance, much subsalt data in the Gulf of Mexico is ‘locked’ to Paradigm. International oils now run multiple workstations to cater for such local differences, increasing the seismic data manager’s workload.

Peggy Groppell described R5000 deployment as ‘Shell’s largest and most complex technical project undertaken,’ particularly in the light of a directive to extend the OpenWorks/Linux environment to Windows. All of Shell’s client software is now 64-bit Windows, with Linux servers. Shell’s 123DI flagship subsurface interpretation tool was originally developed at Bellaire in 1985. This has evolved through ‘nDI’ and now ‘GeoSigns,’ Shell’s R5000-based ‘next generation’ system. GeoSigns comprises tens of millions of lines of C++ and Fortran, and would be too much work to port completely to Windows. Shell opted to leverage Nokia’s Qt to provide cross-platform functionality between Linux and 64-bit Vista. Other porting problems included broken symbolic links on Windows and password issues in the Windows Wallet. Whatever the root cause of a port problem, Groppell observed, ‘the developers get the blame!’
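
The symbolic-link issue is a classic of Linux-to-Windows ports: code that assumes POSIX symlink semantics breaks when Windows denies link creation to unprivileged users or delivers links as plain files. A minimal illustration of the defensive pattern, sketched in Python rather than GeoSigns’ C++, and not Shell’s actual fix:

```python
# Illustrates the portability gap, not Shell's remedy: on Windows, symlink
# creation may be denied to unprivileged users, so degrade to a copy.
import os
import shutil

def portable_link(src, dst):
    """Prefer a symlink; fall back to a file copy where links are unavailable."""
    try:
        os.symlink(src, dst)
    except OSError:                    # e.g. unprivileged Windows, FAT volumes
        shutil.copy2(src, dst)
```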

This article is an abstract of a longer Technology Report produced by Oil IT Journal’s publisher, The Data Room. More from www.oilit.com/tech.

© Oil IT Journal - all rights reserved.