SMi Data Management 2004

A glance at the agendas of successive data management conferences may give the impression that ‘plus ça change, plus c’est la même chose’*. In reality, companies have made significant progress in data and information management. Data management theory has been understood for some years now; increasingly it is being put into practice. Initiatives include top-down standardization of applications, data and systems as practiced by ExxonMobil, the measurement of conformance with corporate nomenclature standards (Statoil), the application of Deming’s quality management techniques to data capture (Saudi Aramco) and the integration of structured databases with unstructured documents (EnCana). In the past it has been fashionable to associate data management with outsourcing. However, none of the oil companies presented papers involving outsourced projects—all are very ‘hands-on’ in managing their corporate data assets.


Steve Zelikovitz described how Exxon and Mobil managed their merger and created a new standardized global IT infrastructure. During the merger the watchwords were ‘KTBR’ (keep the business running) and ‘standardize’ on either Exxon or Mobil solutions. New ‘solutions’ were not an option. Savings of some $100 million per year in computing expense were achieved thanks to reduced complexity. The ‘absence of standards has created a mess – we are still mining the landfill for data nuggets’. Repositories abound because applications have driven data architecture (or rather the lack thereof). Cheap data storage has created exponential growth in data volumes – ‘we are awash in data’. In 2000 the company experienced 30-40% per annum data growth, costing $600-700k per TB to manage. An archival/deletion strategy was implemented. So far a total of 70TB of data has been archived or deleted—‘increasing the needle to hay ratio’.


The DTI stores key documents in its internal ‘Matrix’ DMS (developed around Tower Software’s TRIM product), as Stewart Robinson explained. The system stores documents as digitally signed, legally admissible evidence. PON 9, a new Petroleum Operations Notice, has been written to fit in with the idea of a National Archive. The oil and gas division of the DTI seems to be a trailblazer in e-government. Data reporting is to be done through a web browser, with metadata in Dublin Core. The UK Oil Portal will only accept digitally signed XML and PDF documents. Logging on to the repository enforces XML-based cataloguing – ‘you can’t deposit a document without cataloguing it’.
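To give a flavor of the Dublin Core cataloguing described above, here is a minimal sketch in Python of what a metadata record attached to a deposited document might look like. The Dublin Core namespace is the real one; the document title, operator name and other field values are purely illustrative, and the exact schema the Oil Portal enforces is not described in the talk.

```python
import xml.etree.ElementTree as ET

# Dublin Core element-set namespace (dc:title, dc:creator, dc:date, ...)
DC_NS = "http://purl.org/dc/elements/1.1/"
ET.register_namespace("dc", DC_NS)

def dublin_core_record(fields: dict) -> ET.Element:
    """Build a minimal Dublin Core metadata record as XML.

    `fields` maps Dublin Core element names (title, creator, date, ...)
    to their values.
    """
    record = ET.Element("metadata")
    for name, value in fields.items():
        ET.SubElement(record, f"{{{DC_NS}}}{name}").text = value
    return record

# Illustrative catalogue entry for a deposited document (values invented)
record = dublin_core_record({
    "title": "Well completion report",
    "creator": "Example Operator Ltd",
    "date": "2004-03-15",
    "type": "Text",
    "format": "application/pdf",
})
xml_bytes = ET.tostring(record, encoding="utf-8")
```

The point of mandatory cataloguing is visible in the shape of the record: every deposit carries machine-readable title, creator, date and format fields, so the repository can be searched without opening the documents themselves.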


Malcolm Fleming presented the DEAL Data Registry, which is intended to relieve companies of the legal burden of data retention, and the National Hydrocarbons Data Archive, a new research-oriented subset of UK data managed by the BGS. Two trials have been completed: one on the DEAL Data Registry (cores) and one on data archival for the Hutton Field. The DEAL Data Registry (DDR) was launched in September 2003 with funding from UKOOA, the DTI and CDA. The DDR will catalogue well cores, cuttings, reports and logs, and 2D and 3D seismic surveys. A separate initiative, the National Hydrocarbons Data Archive (NHDA), will contain a select subset of license data – small enough to be economically manageable, and large enough to be useful.


According to Landmark’s Laura Schwinn, industry has to do more with fewer people. A declining workforce and increasingly complex reservoirs are forcing E&P productivity improvements. In 1960 there were 1.6 million oil and gas workers. By 2020 there will be a mere 100,000. This implies that 7% annual growth in productivity will be required to keep pace. Schwinn’s infomercial touched on Flare’s catalogue and Landmark’s Decision Space Portal. Schwinn cited a couple of data management war stories, one from the North Sea involving a well collision due to a mis-identified well trajectory.


Eldar Bjørge provided an update on Statoil’s ongoing data improvement effort. Statoil’s data management system tags approved data with quality control and context information (QCC) before storage in the corporate ‘results’ database (CDB). Bjørge warns that it is hard to strike a balance between capturing enough attributes and keeping capture easy. Compliance metrics show the system is working: in June 2003 some 5% of picks were in conformity with the standard; by December 2003 this had risen to 92% – ‘quite an amazing change’. Field and prestack seismic data are stored in an offline tape archive. Raw data is stored in PetroBank and interpretations in the CDB. Statoil has also implemented ‘SPV’, a low cost tool for seismic volume archival on an IBM TSM robot. The system is now being rolled out to Statoil’s Global Exploration unit.
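The QCC tagging idea can be sketched in a few lines of Python. All of the field names below are assumptions for illustration only; the talk does not specify Statoil’s actual attribute set, and the trade-off it describes is precisely how many such attributes to require.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class QCCTag:
    """Quality-control and context attributes attached to approved data.

    Field names are illustrative, not Statoil's actual schema: enough
    attributes to give context, few enough to keep capture easy.
    """
    approved_by: str
    approval_date: date
    project: str
    confidence: str  # e.g. "high" / "medium" / "low"

@dataclass
class WellPick:
    """An interpretation result destined for the corporate results database."""
    well: str
    horizon: str
    depth_m: float
    qcc: Optional[QCCTag] = None  # untagged picks are not yet approved

def approve(pick: WellPick, tag: QCCTag) -> WellPick:
    """Tag a pick with QCC attributes before storage in the results DB."""
    pick.qcc = tag
    return pick

# Hypothetical example: a pick is approved and tagged on its way to storage
pick = approve(
    WellPick(well="34/10-A-1", horizon="Top Brent", depth_m=2650.0),
    QCCTag("A. Interpreter", date(2003, 12, 1), "Tampen review", "high"),
)
```

The compliance metric quoted in the talk (5% rising to 92%) would then simply be the fraction of stored picks whose tag conforms to the corporate standard.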

Saudi Aramco

Ibrahim Al-Ghamdi explained how Saudi Aramco is digitizing legacy datasets for improved accessibility and applying quality methods to improve the data capture process. Al-Ghamdi’s presentation emphasized Saudi Aramco’s focus on data management and data quality. Aramco’s paper seismic archive is being scanned to TIFF in a ‘push for accessibility’ and to eliminate ‘muda’, or waste. Aramco is assigning significant resources to data cleanup by addressing the sources of data capture problems. Al-Ghamdi insists that ‘fix a data point and you have just fixed that point—fix a process and you have eliminated future problems’. Al-Ghamdi advocates dual data entry. This may be expensive, but is recommended because it increases data quality and identifies problems with individual data clerks, who can be coached or moved on to other tasks. Quality stems from the identification of people errors and process errors – one should ‘map and challenge’ processes constantly.
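The dual data entry technique is simple to sketch: the same record is keyed twice by independent clerks and the two entries are compared field by field, with any disagreement sent back for review. The record layout and values below are invented for illustration.

```python
def dual_entry_discrepancies(entry_a: dict, entry_b: dict) -> dict:
    """Compare two independent keyings of the same record.

    Returns the fields where the two clerks disagree, mapped to the pair
    of conflicting values. An empty result means the entries match and
    the record can be accepted; anything else goes back for review.
    """
    fields = entry_a.keys() | entry_b.keys()
    return {
        f: (entry_a.get(f), entry_b.get(f))
        for f in fields
        if entry_a.get(f) != entry_b.get(f)
    }

# Illustrative: the same (invented) record keyed by two clerks
clerk_1 = {"well": "HRDH-100", "top_depth_ft": 6120, "recovery_pct": 98}
clerk_2 = {"well": "HRDH-100", "top_depth_ft": 6210, "recovery_pct": 98}

conflicts = dual_entry_discrepancies(clerk_1, clerk_2)
# 'top_depth_ft' disagrees -- a likely keying error caught before it
# ever reaches the database
```

Tracking which clerk’s entries are most often overruled is what enables the coaching (or reassignment) Al-Ghamdi describes.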


Glenn Mansfield (Flare) and Pete Paragreen (Centrica) presented Flare’s ‘Raptor’ system built to capture operations and production data from Centrica’s gas storage facility. Centrica Storage operates the North Sea Rough Field which holds around 76% of the UK’s storage capacity—around 10% of peak demand. Raptor is a knowledge-information-data (KID) store with embedded workflow. Users log on to see what they have to do next or to drill down for complex queries. A leak management reporting and tracking system was embedded in Raptor as part of the UK HSE/UKOOA drive to reduce hydrocarbon release.


Thierry Gregorius reported on Shell’s global ESRI-based GIS infrastructure, SAP Enterprise Portal and a new Microsoft .NET development standard. Gregorius cited Waldo Tobler’s law: ‘everything is related to everything else, but near things are more related than distant things’. Shell’s GIS professionals form a GIS Technical Advisory Panel (TAP) and are involved throughout the E&P data lifecycle. Example uses include permit maps, spill prediction, environmental protection and forecasts of exploration success. Raster images get attention! Satellite imagery can be superimposed on a structural geological interpretation. Many of Shell’s explorationists have become expert GIS users. Shell’s rationalization and new worldwide organization, steered by HQ, has brought global standards and a global infrastructure. Three IT super centers (USA, Holland and Malaysia) support Windows 2000 desktops with Microsoft .NET as the development standard. This is ‘quite a change in mindset’ for many in Shell. GIS layers are hyperlinked to documents in the DMS through the metadata. A Shell global UID system was described as ‘work in progress’. A GIS-enabled web front end lets users mine data from multiple databases – in geology, reservoir engineering etc. This works across the SAP Portal and public data sources. Shell is designing its spatial data infrastructure to offer ‘joined-up’ GIS. Gregorius reports that the IT infrastructure is easy, but the combined catalogues are hard to deploy.


Helen DeBeer explained how EnCana is linking structured data in corporate databases to unstructured data in document management systems. The idea is simple: use data from structured databases to build pick lists which are then used to classify documents. EnCana uses this technique to link its Seitel EDM seismic data management system (structured data) with unstructured data in Open Text’s LiveLink document management system and in-house developed Oracle datastores. EnCana has built such systems for tracking exploration opportunities, for managing IT projects and to build a seismic survey/navigation and inventory database. In all cases the same philosophy is used: metadata management is the key, adding industrial-strength search to LiveLink’s limited capabilities.
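The pick-list technique can be sketched quickly. The sketch below uses an in-memory SQLite table to stand in for the structured database of record; the table name, survey names and document fields are all invented for illustration, not EnCana’s actual schema.

```python
import sqlite3

# Stand-in for the structured database of record (schema and data invented)
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE surveys (survey_name TEXT PRIMARY KEY)")
conn.executemany(
    "INSERT INTO surveys VALUES (?)",
    [("Peace River 3D",), ("Deep Basin 2D",)],
)

def survey_pick_list(conn) -> list:
    """Derive the document-classification pick list from the structured DB."""
    return [row[0] for row in conn.execute("SELECT survey_name FROM surveys")]

def classify_document(doc_title: str, survey: str, pick_list: list) -> dict:
    """Attach survey metadata to a document, enforcing the pick list.

    Because the vocabulary comes from the structured database, documents
    and database records end up sharing the same searchable metadata.
    """
    if survey not in pick_list:
        raise ValueError(f"{survey!r} is not a recognised survey name")
    return {"title": doc_title, "survey": survey}

picks = survey_pick_list(conn)
doc = classify_document("Acquisition report.pdf", "Peace River 3D", picks)
```

The payoff is that a single query on a survey name can now return both the structured records and every document classified against it.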


ENI has built web portals for technical users (Landmark’s Team WorkSpace) and knowledge workers (SAP Enterprise Portal). Antonio Carlini described how the portals hide infrastructure complexity from users and offer worldwide access to IT facilities in Milan and Houston. Screen sharing between two remote workers has proved very popular. A ‘semantic search engine’ powered by Invention Machine’s Cobrain has proved ‘very powerful’. Like Statoil, ENI uses Schlumberger’s results DB to store project summaries. The portal has been linked to an external Finder database and a warehouse outside of Milan. Some 21 ‘workflows’ have been developed and are to be deployed ‘massively’. The keys to project success: leave data where it is, no ‘revolutions’, and a modular architecture.


Another SMi ‘regular’, Duncan McKay, updated attendees on ConocoPhillips’ digital data room preparation. ConocoPhillips is involved in a major post-merger disposal program focusing on non-core assets (low revenue, high G&A maintenance). The North Sea unit continues to develop its scanning workflows and was able to create a dataset of 26,000 files and 14GB of data in a record two months for a recent disposal. The intent is to produce a polished product, ‘it’s a sales job after all’. The digital data room is a facet of ConocoPhillips’ CROP initiative. The campaign for reduced paper is not ‘don’t use paper’, just ‘don’t store paper’.


Database consultant Niall Young has been involved in two UK DTI-sponsored projects: Vantage, an offshore ‘passport’ and digital training record, and EEMS, a system for reporting environmental emissions. The LOGIC Vantage People on Board (POB) system experienced poor take-up. The Environmental Emissions Monitoring System (EEMS) is a 12 year old system to which all companies are required to submit returns. It is to be web-enabled and linked with the DTI Oil Portal as part of the new digital signature initiative, using SOAP, web services, XML Schemas and Oracle’s XML database extension, XDB.
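As a rough illustration of the SOAP-based reporting mentioned above, the sketch below wraps an emissions return in a SOAP 1.1 envelope using Python’s standard library. The SOAP envelope namespace is the real one; the payload element names, operator and figures are entirely invented, since the EEMS message schema is not described in the talk.

```python
import xml.etree.ElementTree as ET

# SOAP 1.1 envelope namespace
SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"
ET.register_namespace("soap", SOAP_NS)

def emissions_return_envelope(operator: str, year: int, co2_tonnes: float) -> bytes:
    """Wrap an (invented) emissions-return payload in a SOAP 1.1 envelope."""
    envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")
    ret = ET.SubElement(body, "EmissionsReturn")  # payload schema illustrative
    ET.SubElement(ret, "Operator").text = operator
    ET.SubElement(ret, "Year").text = str(year)
    ET.SubElement(ret, "CO2Tonnes").text = str(co2_tonnes)
    return ET.tostring(envelope, encoding="utf-8")

# Hypothetical annual return
msg = emissions_return_envelope("Example Operator Ltd", 2003, 125000.0)
```

In the real system an XML Schema would validate the payload and a digital signature would be applied before submission through the Oil Portal; both steps are omitted here.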

This article has been abstracted from an 11 page report produced as part of The Data Room’s Technology Watch report service. If you would like to evaluate this service, please contact

* The more things change, the more they stay the same.


© Oil IT Journal - all rights reserved.