SMi Data & Information Management

The fourth SMi Data and Information Management conference was held in London this month. As chairman Neil McNaughton remarked, the conference was particularly rich in oil company case histories. Both Statoil and Shell Brunei presented papers which illustrated the ‘new wave’ of data management - whereby applications are used to clean and QC data prior to storage in a new breed of corporate data store. Other significant contributions came from Enterprise Oil - on spatially enabling the E&P portal - and from Shell-Exxon joint venture NAM on another emerging theme - the use of metadata, ‘data about data’, to manage documentary and other information across the enterprise.

Conoco

Duncan McKay gave a follow-up paper on the fate of the Saga North Sea document assets that Conoco UK acquired in 2000. Hays RSO is now image-enabled and is used by Conoco’s explorationists as a ‘one stop shop’ for locating documents. Conoco is looking to migrate to Open Text’s LiveLink to add full-text search to its document management, even though the software is expensive and does not allow for floating licenses. McKay’s experience of acquisitions and mergers has led him to advocate planning for data rooms - and even setting aside a permanent room for the purpose. Data is a critical asset for Conoco: while public data can be managed in a shared environment like CDA, good management of proprietary data is a source of competitive advantage.

Shell

Erik van Kuijk (Shell) commented that the ‘missing link’ in document and information management is a ‘catalogue of catalogues’ allowing for a centralized search, adding that Shell is currently prototyping such a meta catalogue. Van Kuijk believes that the basic metadata can be captured in 42 attributes. Shell plans to share this categorization with other oil companies in the near future, through CDA and POSC.
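
Shell’s 42 attributes were not enumerated in the talk, but the sketch below gives a hypothetical illustration of the kind of uniform entry a ‘catalogue of catalogues’ could hold, allowing one centralized search across source systems. All field names are invented for illustration.

```python
# Illustrative sketch only: the talk does not list the 42 attributes, so the
# fields below are hypothetical examples of the kind of metadata a
# 'catalogue of catalogues' might record for each catalogued item.
from dataclasses import dataclass

@dataclass
class CatalogueEntry:
    source_system: str      # the catalogue the item lives in
    item_id: str            # identifier within that source catalogue
    title: str
    data_type: str          # e.g. 'well log', 'report', 'seismic survey'
    country: str
    confidentiality: str    # access classification

# A centralized search runs over these uniform entries rather than
# querying each of the underlying catalogues directly.
def search(entries, term):
    return [e for e in entries if term.lower() in e.title.lower()]
```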

Statoil

Eldar Bjorge gave an update on Statoil’s upstream IS/IT strategy project ‘SCORE’, which has been running since 1998. SCORE is currently extending to the drilling arena with the NaviBoB project. SCORE has resulted in Statoil’s choice of OpenWorks as project data store and Slegge for the Corporate Data Store (CDS). Statoil’s objective is a ‘robust data management environment.’ To achieve this, some 40 standards are in daily use within Statoil. These cover around 100 data types in 10 disciplines, with standard definitions of the most important data types and standard data flows. Data management processes are organized through the web portal. The aim is to populate the CDS with ‘used, quality controlled data.’ Population is performed at project milestones and data is flagged and documented as it enters the CDS. New projects can then kick off with validated data and interpretations. Statoil’s Slegge (also known as the Schlumberger EPDS or the ‘iStore’) currently holds around 30 data types. Schlumberger is also developing an EPDS browser, ‘iSurf’, providing table and map browsing of data in the iStore.
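
A minimal sketch, not Statoil’s actual implementation, of the milestone-driven population pattern described above: only quality-controlled data is promoted, and each item is flagged and documented on entry to the CDS. Field and function names are assumptions.

```python
# Hypothetical sketch of milestone-driven population of a corporate data
# store: data is promoted from the project store only at a milestone, and
# is flagged and documented on entry.
from datetime import date

def promote_to_cds(cds, project_items, milestone, approver):
    for item in project_items:
        if not item.get("qc_passed"):
            continue  # only 'used, quality controlled data' is promoted
        item["cds_flags"] = {
            "milestone": milestone,       # e.g. 'pre-drill review'
            "promoted_on": date.today().isoformat(),
            "approved_by": approver,      # documents who validated the data
        }
        cds.append(item)
```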

Paras

Alan Smith (Paras) recognizes three types of data repository: the ‘bank vault’ - single company; the ‘marketing’ e-commerce data sales portal; and the ‘club’ - members only, à la CDA or DISKOS. Smith drew from Paras’ experience - much of it in South America - to analyze the benefits of the different repositories. He concludes that though costs can be reduced through sharing resources and cleaning up data jointly, such benefits may not always be realized. In the ‘clubs’, many members still duplicate data and, because of the expense, few have destroyed their own copies. Moreover, the clubs have proved to be slow movers. Smith’s vision of the future is of minimal data duplication: contractors and data ‘publishers’ will offer online access to client companies’ workstations. This is to be achieved through web services (UDDI, XML and SOAP). Smith asked ‘what’s holding up web services?’, suggesting that the desire to ‘touch and hold’ one’s own data may be a factor. On bandwidth, Smith doubted whether this would grow quickly enough to match the rapidly expanding demands of upstream bulk data delivery.
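
To illustrate the web services plumbing Smith refers to, the sketch below shows a client workstation posting a SOAP request to a data publisher’s service. The endpoint, namespace, operation name and well identifier are all invented for illustration.

```python
# Hedged sketch of a SOAP request from a client workstation to a
# contractor's online data 'publishing' service. Everything named here
# (URL, namespace, operation, well ID) is hypothetical.
import urllib.request

envelope = """<?xml version="1.0" encoding="utf-8"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <GetWellData xmlns="http://example.com/epdata">
      <WellId>NO-15-9-19</WellId>
    </GetWellData>
  </soap:Body>
</soap:Envelope>"""

req = urllib.request.Request(
    "http://example.com/epdata/service",      # hypothetical endpoint
    data=envelope.encode("utf-8"),
    headers={"Content-Type": "text/xml; charset=utf-8",
             "SOAPAction": "http://example.com/epdata/GetWellData"},
)
# response = urllib.request.urlopen(req)      # returns an XML payload
```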

Enterprise Oil

Chris Jepps (Exprodat) gave an overview of Enterprise Oil’s Web/GIS vision. Enterprise’s early work with Landmark on Open Explorer proved the usefulness of GIS as an integration technology. Subsequent work focused on making GIS easier to use - by loading pre-built maps into ArcView, for instance. But take-up of the GIS tools was low, mainly because of the underlying UNIX platform. Meanwhile, portal technology was evolving, so Enterprise (with contractor Exprodat) built the Enterprise Rapid portal, which became WoW (Web OpenWorks), later sold to Landmark. To enhance the usability of the GIS front end, Enterprise deployed ESRI’s Internet Map Server (ArcIMS). But the vanilla front-end web browser supplied by ESRI was not up to much, so Enterprise re-wrote the front-end HTML viewer in JavaScript. The lightweight ArcIMS GIS fulfils 90% of Enterprise’s business needs, and ArcIMS deployment is effective in terms of license costs, network traffic and ‘knowledge sharing.’ In the future, Enterprise plans greater integration of mapping with other applications and an accelerated rollout of ArcView 8, which should remove the need for Shapefile caching.
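
The sketch below illustrates the request/response pattern behind such a thin ArcIMS client: the viewer posts an ArcXML request over HTTP and receives back a reference to the rendered map image. The element details are simplified and the server URL is illustrative, not Enterprise’s deployment.

```python
# Simplified sketch of the ArcXML round trip a lightweight ArcIMS viewer
# performs. Request structure follows the general ArcXML pattern but is
# abridged; the servlet URL is an assumption.
import urllib.request

arcxml = """<?xml version="1.0"?>
<ARCXML version="1.1">
  <REQUEST>
    <GET_IMAGE>
      <PROPERTIES>
        <ENVELOPE minx="-2" miny="50" maxx="8" maxy="62"/>
        <IMAGESIZE width="800" height="600"/>
      </PROPERTIES>
    </GET_IMAGE>
  </REQUEST>
</ARCXML>"""

req = urllib.request.Request(
    "http://mapserver.example.com/servlet/com.esri.esrimap.Esrimap",
    data=arcxml.encode("utf-8"),
    headers={"Content-Type": "text/xml"},
)
# The response is an ArcXML document referencing the rendered map image,
# which the JavaScript viewer then fetches and displays.
```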

Brunei Shell

Paul van der Kooy told how Shell Brunei’s data managers lacked feedback from users on the poor quality of its corporate data. Shell decided to implement a data quality visibility program that monitored data quality by recording requirements, urgency and other metrics. A data table shows targets and the degree to which these were met or missed. Color coding of this information allows visual analysis of Data Quality Performance Indicators (DQPI). Another data issue in Shell Brunei was that the ‘last step’ of interpretation - populating the corporate database - was often forgotten. The old system involved cutting and pasting log data to produce a summary report for the corporate database - a complex and error-prone process. The solution was to move the CDB to the heart of the workflow. Recall now constitutes the core of the work process and is used to generate accurate and timely TRAPIS reports. The CDB ‘is not dead’ - it is enhanced by bi-directional data flow to and from Recall.
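
The talk did not detail how the DQPIs are computed; the following sketch assumes a simple score-versus-target scheme (including an invented 80% amber threshold) to show how the color coding could work.

```python
# A minimal sketch, assuming a simple scoring scheme: each data type's
# quality score is compared with its target and mapped to a traffic-light
# color for the DQPI table. Thresholds and data are invented.
def dqpi_color(score, target):
    if score >= target:
        return "green"            # target met
    if score >= 0.8 * target:     # 80% threshold is an assumption
        return "amber"            # close to target
    return "red"                  # underperforming

targets = {"well logs": 95, "deviation surveys": 90}
scores = {"well logs": 97, "deviation surveys": 68}

for data_type, target in targets.items():
    print(data_type, dqpi_color(scores[data_type], target))
```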

Troika/FileTek

Jill Lewis (Troika) and Sias Oosthuizen (FileTek) presented FileTek’s StorHouse RM, a combination of relational database and hierarchical storage management system. FileTek has been working with data transcription specialists Troika to develop a seismic archival system capable of storing and recovering seismic data on a trace-by-trace basis. A paper by Richard Summers of startup Infoarchitectural Dynamic Technology described futuristic databases whose structure would learn and evolve from queries. Watch out for the internet-enabled ‘True Spatial’ database currently in development as Interspace. This will roll out next year and will provide 2D and 3D mapping using this technology on a ‘cellular Linux’ database.
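
Trace-by-trace recovery implies an index that maps individual traces to their location in the archive. The sketch below illustrates the principle with an invented schema; it is not StorHouse RM’s actual design.

```python
# Sketch of the idea behind trace-by-trace retrieval: a relational index
# maps each trace to its byte range in the archived file, so a subset of
# traces can be recovered without restoring the whole dataset. Schema and
# field names are illustrative.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE trace_index (
    survey_id TEXT, line_no INTEGER, trace_no INTEGER,
    file_id TEXT, byte_offset INTEGER, trace_length INTEGER)""")

def read_traces(db, survey_id, line_no, first, last):
    rows = db.execute(
        """SELECT file_id, byte_offset, trace_length FROM trace_index
           WHERE survey_id=? AND line_no=? AND trace_no BETWEEN ? AND ?
           ORDER BY trace_no""",
        (survey_id, line_no, first, last)).fetchall()
    # Each row tells the storage layer exactly which bytes to recall
    # from near-line media, rather than reading the full archive file.
    return rows
```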

BHP Billiton

According to Nick Larcombe, acquisitions and mergers are making our business move at a greater speed than the data lifecycle. BHP’s Technical Information Systems Strategy Projects ran from 1995 to 1997 and covered the subsurface ‘value chain’ through to asset disposal. The projects looked at technical applications, data support, processes and organization. The long-term data management goals derived from this were ‘to improve BHP’s E&P data management by providing on-line access in each regional office to essential technical data relevant to that office, validated, completed and up to date’. BHP Algeria has been a test bed for these techniques, which have been deployed with help from Venture Information Management. A project steering group carried out a data management survey of the asset team, using a questionnaire to determine the workflow and processes - for example, what data went into Finder, and what data ended up on diskettes in someone’s desk drawer! Interviews were conducted and recorded, and a results matrix of Data Type against Storage Location was established. 87 data management ‘issues’ were identified and grouped into categories. The study provided a powerful tool to identify bottlenecks, a business case for improvement projects, a holistic assessment of the impact of changes and a baseline for measuring future improvements. Less tangible benefits included improved relationships with the asset team. Larcombe concluded that culture is much more important than standard software in improving the overall work process.
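
The results matrix lends itself to a simple reconstruction: tally, for each data type, every storage location reported in the survey, and flag data types living in more than one place. All values below are invented, not BHP’s survey data.

```python
# Illustrative reconstruction of a Data Type vs Storage Location matrix
# from survey responses. Responses here are invented examples.
from collections import defaultdict

responses = [
    ("well logs", "Finder"),
    ("well logs", "diskette in desk drawer"),
    ("seismic navigation", "Finder"),
    ("interpretation grids", "shared drive"),
]

matrix = defaultdict(set)
for data_type, location in responses:
    matrix[data_type].add(location)

# Data types stored in more than one place are candidate 'issues':
# duplication, or data held outside the managed environment.
issues = {dt: locs for dt, locs in matrix.items() if len(locs) > 1}
```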

POSC

Paul Maton (POSC) stressed the value of information sharing, citing the Open Source movement. A residual problem is the lack of a common understanding of the semantics and context of data and information. While data management has driven down the time spent looking for data, data quality remains a problem. In 2001 POSC (based on work done by Shell) released a new ‘very simplified’ version of Epicentre and the Software Integration Platform (SIP). They ‘cleaned out lots of inheritance complications’ and have published a draft specification for comment. POSC is also working with BGS and DEAL on a PipelineML. Maton presented what was described as an ‘emerging 2002 program’ as follows:

- Continue XML work

- Continue practical well log standards

- Pilot Web Services

- External collaboration

Maton concluded with a plug for the DTI/IBM/POSC web services initiative.

NAM

According to Alessandro Allodi, NAM has some 1,200 different information sources. To manage and access these in a coherent way, metadata (‘information about information’) is required. Allodi has developed a sophisticated and adaptable metadata management system to leverage metadata gathered in NAM’s ‘DIANA’ system. NAM’s data management vision is that ‘all the information that a business professional needs must be available at any time, any place in the appropriate format and with a known quality level.’ Central to this is the creation of the Information Asset Register - the metadata warehouse. This combines the best of object-oriented and relational data modeling to make for ‘stability in a world of changing data and business models’. The development, using IBM’s WebSphere, Enterprise Java Beans (EJB) and XML data exchange, took a mere two months to complete - but much longer to get buy-in.
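
A hedged sketch of what an Information Asset Register entry might look like: NAM’s actual schema is not described in the talk, so the XML elements below are hypothetical examples of ‘information about information’ exchanged as XML.

```python
# Hypothetical Information Asset Register entry; element names and values
# are invented for illustration.
import xml.etree.ElementTree as ET

entry = ET.fromstring("""
<information_asset>
  <name>Groningen production reports</name>
  <source_system>DIANA</source_system>
  <format>PDF</format>
  <owner>Production geology</owner>
  <quality_level>verified</quality_level>
  <last_updated>2001-11-30</last_updated>
</information_asset>""")

# A business professional's query runs against these registry entries,
# not against the 1,200 underlying sources themselves.
print(entry.findtext("source_system"), entry.findtext("quality_level"))
```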

IHS Energy

Jan Roelofsen (IHS Energy - Petroconsultants) wants us to ‘make more of geological data.’ Roelofsen’s goal is to provide information management tools to help the basin analyst understand the timing and migration of petroleum in the context of the structural evolution of the reservoir. This is done with technology developed by Petroconsultants on behalf of ENI (AGIP) since 1998. The project involved creating an information system capable of replacing drafted petroleum systems charts (such as that due to Kingston, 1980). The resulting product, Petroconsultants’ Basin Analysis and Sedimentary Environments (BASE), is an Access database coupled to a GIS ‘play map’ and a query builder that generates Basin Evolution Diagrams. Ages are managed through equivalence tables using a fine timescale with ~1my steps.
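
The equivalence-table idea can be illustrated as follows: named stratigraphic units map onto a fine numeric timescale in roughly one-million-year steps, so events from different charts become comparable. The ages shown are rounded illustrations, not BASE’s actual tables.

```python
# Sketch of an age-equivalence lookup: named units expand into discrete
# ~1my bins on a common numeric timescale. Ages are approximate.
equivalence = {
    "Albian":    (113, 100),   # (start Ma, end Ma), rounded
    "Aptian":    (125, 113),
    "Barremian": (129, 125),
}

def to_steps(unit, step_my=1):
    start, end = equivalence[unit]
    # expand the named unit into discrete bins for charting
    return list(range(start, end, -step_my))

print(to_steps("Albian")[:5])   # [113, 112, 111, 110, 109]
```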


© Oil IT Journal - all rights reserved.