SMi E&P Data and Information Management

SMi’s fourth annual conference was re-titled “Data and Information Management” this year to reflect expanded coverage of document, unstructured data and information management. Attendance was up significantly, from 55 last year to 80, with attendees from South Africa, Brunei, North America, Norway, France and Germany. The data management debate has advanced, both in the direction of lightweight data management and in terms of more heavyweight corporate data management – where the true cost (and value) of doing it right has been recognized, at least by Shell’s Expro unit.

Exprodat director Bruce Rodney, speaking of Enterprise Oil’s Global E&P Information Portal, set the scene with the observation that data management has proved an intractable problem. People want everything, but still ‘can’t find stuff.’ Rodney breaks data management into light and heavy components – and distributes these tasks to the front and back ‘offices.’

Back Office

The back office provides the long-term corporate memory; the front office provides what is needed to make a decision. Such decoupling allows the problems to be solved separately. The front office exposes information through a web/GIS paradigm and offers a return path to the archive. The goal is to “find it fast and capture the value”. Enterprise’s Regional Access to Portal Information and Data (RAPID) is an ArcIMS-based development utilizing Exprodat’s Project Documentor (see PDM Vol. 5 N° 8).

The Data Room

PDM Editor Neil McNaughton proposed a scheme for integrating vertical (E&P) applications with horizontal tools such as Office Automation, GIS and document management. Today, web-driven corporate IT is making new demands on domain-specific applications. Corporate-wide document management systems and web portals require a much larger vision of what IT is trying to achieve. Current thinking is that techniques such as APIs and middleware (COM and CORBA) have their place in departmental-level and domain-specific computing.

Uber-schema

But as IT scope expands, these tightly coupled techniques necessitate an unrealisable IT schema (the ‘uber-schema’) of the whole enterprise. Modern integration centres on three concepts – limiting application coupling, sharing metadata and XML-based messaging. A restrained amount of data sharing leads to the concept of the Corporate Metadata Store. Although it is still early days, the portability of XML makes it a promising vehicle for E&P data replication, sharing and exchange.
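
By way of illustration, here is a minimal sketch (in Python) of what such loosely coupled, XML-based messaging against a shared metadata store might look like. The tag names, well identifier and metadata keys are invented for the example, not taken from any presented specification.

# Hypothetical illustration of XML-based messaging between loosely coupled
# applications that share only metadata -- names and tags are invented.
import xml.etree.ElementTree as ET

# A toy 'Corporate Metadata Store': applications agree on these keys only.
metadata_store = {"well/30-16-A1": {"datum": "KB", "crs": "ED50 / UTM 31N"}}

def build_message(well_id, top_name, depth_m):
    """Wrap a picked formation top in a self-describing XML message."""
    msg = ET.Element("formation_top", attrib={"well": well_id})
    ET.SubElement(msg, "name").text = top_name
    ET.SubElement(msg, "depth", attrib={"uom": "m"}).text = str(depth_m)
    for key, value in metadata_store.get("well/" + well_id, {}).items():
        ET.SubElement(msg, "meta", attrib={"key": key}).text = value
    return ET.tostring(msg, encoding="unicode")

def consume_message(xml_text):
    """A receiving application parses the message; no shared API is required."""
    msg = ET.fromstring(xml_text)
    return msg.get("well"), msg.findtext("name"), float(msg.findtext("depth"))

print(consume_message(build_message("30-16-A1", "Brent", 2850.0)))

The point of the sketch is that sender and receiver are coupled only through the message and the shared metadata keys, not through each other’s internal schemas.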

GeoQuest

GeoQuest’s Bruce Sketchley reckons that this year, some 2,000 terabytes of seismic data will have to be ‘managed.’ This includes all processed, in-house, actively managed data but excludes field data. The exponential growth of seismic data volumes is dubbed the seismic ‘tsunami’. To avoid being swept away, Schlumberger advocates ‘virtual services’ for data management. In Aberdeen for instance, BP has an in-house ‘window’ to off-site hosted applications. Schlumberger’s data centers in Stavanger, London, Aberdeen and Paris will allow for similar offerings across the EU, while high bandwidth will facilitate remote access from Milan, Pau and Madrid. Business models for the new ASP-enabled software and data management are still ‘work in progress’.

Norsk Hydro

Duncan McKay, now with Conoco, was mandated by Norsk Hydro last year to ‘sell himself,’ along with the rest of the recently acquired North Sea assets of Saga Petroleum. A virtual data room, with a scanned and indexed set of Saga’s North Sea library, was opened on 5th May 2000. Conoco signed on 19th July 2000 and the deal was done by 1st August. All paper data was bar-coded and scanned onto a set of CDs. Even electronic data was printed, scanned and output to TIF. Though counterintuitive, this was considered a pragmatic solution.

The ‘Big List’

Saga’s documents were catalogued into one big flat file (the ‘Big List’) and Alchemy software was used to access the files. Two data rooms were built, each with a Sun Enterprise Server and 10 Centra 2000 workstations. At the end of the project, four and a half tonnes of paper were shredded.

Spectrum

The scanning and infrastructure contractor for Hydro’s project was Spectrum, whose Richard Stowe followed up with other IM projects. One involved the OCR of 2,800 scanned geotechnical reports (500,000 pages) with output to Alchemy CDs; another, an Open Text LiveLink installation, hooked into the company’s Finder (GeoQuest) database. Stowe considers that take-up of electronic document management (EDM) has been slow, but that the leverage gained from web access to managed documents is changing this. Questioned on whether the cost of such operations was justified by storage space savings, Stowe replied that the cost of lost opportunities far outweighed such considerations.

£6 million man!

Erik van Kuijk, head of subsurface data management with Shell’s Expro North Sea unit, is going to be a popular man. He has convinced his bosses that data management really is important, and has obtained £6 million of funding for 19 projects. These, developed with help from Flare Consultants, set out to provide a firm infrastructure for Expro’s Knowledge, Information and Data (KID) environment. The projects are pragmatic; one concerns rationalizing naming standards, of which Shell currently has seven!

Too hot to handle

Van Kuijk offered a new slant on data management, introducing the concept of data ‘entropy’. Data can be more or less static – with ‘cold,’ unchanging data residing in the corporate data store and ‘hot’ data deployed in projects. Van Kuijk suggests that project data can be too hot to handle – or rather to manage – and that this should be recognized. Also, continuing the thermodynamic analogy, over time even cold data will tend to an anarchic state; it deteriorates and requires ‘energy’ input to maintain its quality.

Publish or freeze!

Data types fall naturally into more or less hot states. But the degree of short-term activity can also vary – thus exploration tends to generate a very ‘hot’ state, while production is somewhat cooler. Publishing is defined as making project-level deliverables available at corporate level as new reference data. The cost of publishing – the energy required to move from hot to cold – depends on the ‘entropy’ of the activity. The process involves cooling data down, for instance by ‘adding metadata.’ It costs more to publish exploration data than production data.

Expert System

A second Enterprise project, the DART acreage evaluation system, was presented by Exprodat’s Gareth Smith. The expert system, for quick-look evaluation of exploration opportunities in the North Sea, leverages Enterprise’s corporate database. This is based on Open Explorer and other information management tools such as Open Journal and Exprodat’s own Project Documentor. DART offers ArcView GIS-based access and retrieval of data, which is then piped to Microsoft Access where business rules can be applied. The system talks to other databases such as Arthur Andersen, Asset Geoscience (Target), DTI, LIFT, UKOOA and Enterprise’s prospect inventory. The business rules involve a scoring system – blocks with 3D seismic coverage score highly, as do those near existing facilities.
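
The business rules amount to a simple additive scoring scheme. A back-of-the-envelope sketch of how such rules might be expressed follows; the criteria, weights and block data are invented for illustration and are not taken from the DART system itself.

# Hypothetical quick-look acreage scoring in the spirit of DART's business
# rules -- criteria, weights and block data below are assumptions.
def score_block(block):
    score = 0
    if block.get("has_3d_seismic"):
        score += 3            # blocks covered by 3D seismic score high
    if block.get("distance_to_facilities_km", 999) < 25:
        score += 2            # proximity to existing facilities is rewarded
    if block.get("nearby_discoveries", 0) > 0:
        score += 1
    return score

blocks = [
    {"name": "16/7b", "has_3d_seismic": True, "distance_to_facilities_km": 12},
    {"name": "22/30c", "has_3d_seismic": False, "nearby_discoveries": 2},
]
for b in sorted(blocks, key=score_block, reverse=True):
    print(b["name"], score_block(b))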

DEAL

Common Data Access CEO Malcolm Fleming provided an update on the DEAL project’s status. Currently, seven data vendors display their products on the DEAL website, with the vast majority (26,000 products) coming from the IHS Energy stable. DEAL has 400 registered users, including around 100 oil companies, plus 440 ‘anonymous’ users. There have been some 2,300 logins over the last four months, with 420 in January 2001. Registered users hail from 17 countries, 90% outside the UK.

SGI

Guy Gueritz, EAME Marketing Manager with Silicon Graphics (SGI), offered a tally of worldwide Reality Centers. Heavy-duty users (with four centers apiece) include ExxonMobil, PDVSA and Halliburton, with around 50 other centers at locations throughout the world.

Virtual Insight

Gueritz introduced SGI’s ‘Virtual Insight’, described as a combination of information management, application service provision and a data warehouse. Virtual Insight has been used during reservoir modeling – to store metadata and processing parameters from pre-processing through modeling to post-processing. Virtual Insight is a ‘process-oriented data management solution’ and allows visualization users to ‘keep track and back track’. It runs with Magic Earth and Landmark’s visualization software and deploys its own independent Oracle database.

Hampton

Hampton Data Services’ consultant Robert Casalis de Pury believes that new technology on the horizon will revolutionize the way we search for data. While the web paradigm allows many data types to be made visible and linked to data in the corporate data store, de Pury believes it has its limitations. Hampton believes that peer-to-peer storage à la Napster is about to revolutionize information access. The idea is that spare storage and compute cycles are an untapped resource in the organization and could make up a huge, distributed storage and search engine.

Next Page

Hampton deploys software from Next Page to achieve this. Described as an e-content platform, NXT3 is modular middleware that offers distributed text processing. NXT3 was originally developed for the publishing industry. Content is ‘syndicated’ to other peer servers so that users ‘see’ it locally. In reality, content may be replicated or just indexed, depending on performance requirements. NXT3 can link to existing structured and unstructured data and document repositories. The software also provides entitlement management and triggers to push reports out daily, weekly, etc. NXT3 can work with Access, Oracle, SQL, Word, PDF, Excel and so on. An XML-based content network protocol and control network adaptors are deployed. De Pury noted that ‘peer-to-peer’ could mean contractor to client.
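
As a rough illustration of the replicate-or-index trade-off, a peer might decide per content item whether to pull a full local copy or hold only a searchable index entry. This is a generic sketch under assumed thresholds and fields, not Next Page’s own logic or the NXT3 API.

# Generic sketch of a 'syndicate' decision: replicate small or heavily used
# content locally, otherwise only index it on the remote peer.
# Thresholds and the ContentItem fields are assumptions, not NXT3 behaviour.
from dataclasses import dataclass

@dataclass
class ContentItem:
    name: str
    size_mb: float
    hits_per_week: int

def syndication_mode(item, max_local_mb=50, hot_threshold=20):
    if item.size_mb <= max_local_mb or item.hits_per_week >= hot_threshold:
        return "replicate"   # copy content to the local peer for fast access
    return "index"           # keep content remote, hold only a search index

for item in [ContentItem("well_report.pdf", 4.2, 35),
             ContentItem("seismic_volume_meta.xml", 620.0, 2)]:
    print(item.name, "->", syndication_mode(item))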


© Oil IT Journal - all rights reserved.