Schlumberger 2005 Tech Symposium, Las Vegas

For the marketing department, it was ‘Petrel, Petrel, Petrel’. Schlumberger Information Solutions’ flagship has carved out a huge piece of the interpretation market in a remarkably short time—4,500 licenses are projected for 2006. But there is a lot of other stuff still going on under the Schlumberger hood. ExxonMobil’s Elvis impersonator and CIO, Steve Comstock, called for better Petrel data management. The boys from Aspen Tech stole the ‘digital oilfield’ show. And the struggle for a coherent information management system goes on—with the promise of data nirvana—RSN!

Keynote speaker Dan Miklovic of the Gartner Group observed that IT means different things to different people. For the CIO, ‘IT is ERP’. For the rest of us, IT means technical computing, which may sit outside the control of corporate IT and present a ‘challenge’ to the CIO. Key aspects of the digital oilfield are E&P data integration and visualization. The solutions will come from ‘grid computing, high performance visualization and sensor networks’, all of which currently sit at different points on the Gartner ‘Hype Cycle’ curve. Miklovic noted a new ‘asset focus’, as witnessed by the Aspen Tech/SIS partnership (see page 7). He also noted that oil and gas has ‘struggled’ with IT standardization: ‘Other industries have done better’. Following up on this theme in the Q&A, Miklovic opined that PIDX has been ‘reasonably successful’ but that POSC and PPDM are ‘struggling to gain momentum’. Referring to the Hype Cycle, Miklovic put POSC in the ‘trough of disillusionment’ and PPDM just past the ‘peak of expectations’, with folks starting to ask ‘where’s the ROI?’

ExxonMobil

An Elvis impersonator then burst upon the scene to announce, ‘My name is Steve Comstock. I work for ExxonMobil and my job SUCCS!’ After extricating himself from the sequined suit, ExxonMobil’s CIO allowed himself a brief boast on the theme of Exxon’s profitability ‘leadership’. This stems from ‘business principles, ethics, safety, controls, and standards’. What is learned in one place is applied everywhere. Operational excellence is key to Exxon’s success. People should work with their core skills. Geoscientists should not have to ‘figure out how a computer works’. The increasing dependence on technical computing across Exxon’s operations raises the possibility that IT becomes the ‘weakest link’. Exxon is working hard to mitigate this risk through its Common Environments—UTCS(G) for geology and UTCS(E) for engineering. Hence the Standard Unit of Computing Complexity (SUCC) metric, which helps Exxon reduce IT components and complexity. SUCC is obtained by multiplying together the number of networks, databases and applications. Exxon uses Microsoft to ‘reduce complexity’. IT support is also key, with standard knowledge sharing, technology, innovation and (more) operational excellence. Comstock likes to ask his own questions in the Q&A. This time his curveball was for Ihab Toma—‘What are we doing to avoid the data quality/management trap with Petrel?’ Toma assured Petrel users that SIS is ‘definitely not going to re-create the same level of complexity. SIS plans transparent data access and ease of use for Petrel’.
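For illustration, here is a minimal sketch of the SUCC arithmetic as Comstock described it, using made-up environment counts rather than Exxon’s own figures:

```python
# Illustrative only: the SUCC metric as described above, computed for
# hypothetical environment counts (not ExxonMobil's actual numbers).
def succ(networks: int, databases: int, applications: int) -> int:
    """Standard Unit of Computing Complexity = networks x databases x applications."""
    return networks * databases * applications

# Halving any one dimension halves the complexity score.
print(succ(networks=3, databases=12, applications=40))  # 1440
print(succ(networks=3, databases=6, applications=40))   # 720
```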

Seismic to simulation video

Aside from the asset team fiction we covered in our editorial on page 2, the Petrel video had real-world endorsements from Lukoil (Galina Shevelova), Apache (Mike Bahorich), SCM (Dave Hamilton), Anadarko (John Nieto) and Murphy (Chris Whitmee). SIS president Olivier le Peuch emphasized how Petrel could help with data handover issues—a ‘step change’ in the integration paradigm. SIS is further building on Petrel to achieve a ‘single dynamic earth model.’ Petrel offers rapid, and even automated, updates when new data arrive. Petrel extensions are ‘co-developed’ with partners including OpenSpirit, Microsoft, Intel and HP. Whitmee described how Petrel helped Murphy author a 3,000-page document on the Indonesia Kikeh development in only six weeks.

Model-centric

SIS’ Rod Laver described Petrel as moving from ‘data centric’ to ‘model centric’ computing. Petrel is best known as a geologic modeler, but now offers 3D and 2D seismics, Ant Tracking and uncertainty analysis, all served up from a ‘consistent, easy to use interface’. The Petrel Process Manager captures the complete workflow, which can be re-run with a few mouse clicks as required. SIS is bullish about the number of Petrel users—licenses are predicted to grow to 4,500 by 2006, with 80 software developers working on the project. One churlish questioner ventured that Petrel was ‘incomplete—there is no seismic inversion, no petrophysics’. Laver stated that petrophysics will be available via connectivity with other software and that SIS is currently investigating in-house and third-party tools for seismic inversion.

ChevronTexaco

ChevronTexaco’s (CTC) manager of information protection, Rich Jackson, applied a ‘lifecycle IM concept’ to the security of CTC’s exploding data volumes and communications traffic. ‘From the security standpoint, point-to-point solutions are pointless.’ Like Comstock, Jackson advocates ‘standard components’, data architecture and integrity, business process optimization and automation. Storage is the next ‘battleground’ for security. Unfortunately, CTC’s IT strategy has been vaunted as providing information ‘at any time, in any place, on any device’. This is ‘not great for security’. ‘Making high-quality decisions while sitting on the beach is important to CTC,’ Jackson noted ironically.

Petrel risk

Martyn Beardsall (SIS) stated that, for reserve estimation, ‘One deterministic answer is not enough’. To date, risk analysis has been ‘one dimensional’ à la @Risk—‘not appropriate’ for a 4D, dynamic spatial problem. Petrel includes tools for investigating uncertainty everywhere—from seismic processing, through facies distribution, to economics. Petrel 2004 introduces ‘case management’ of many ‘equiprobable’ representations of the earth model. Simulations can be run on depth conversions and viewed as movies showing the oil/water contact moving up and down. The Process Manager is used to capture and replay scenarios while tweaking inputs. Used this way, Petrel is a powerful tool—but this is no video game, and there is a limit to the number of dialog boxes a person can face in a day’s work!
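To make the ‘many equiprobable cases’ idea concrete, here is a minimal Monte Carlo sketch in Python, with hypothetical numbers and no claim to be Petrel’s implementation, showing how uncertainty in average velocity translates into a spread of depths for a fluid contact:

```python
# Sketch of 'equiprobable realizations' applied to depth conversion:
# sample an uncertain average velocity, convert a mapped two-way time to
# depth, and report the spread. All values are hypothetical.
import random

TWT_S = 2.100            # two-way time to the mapped contact, seconds
V_MEAN, V_SD = 2500, 75  # average velocity to the contact, m/s

depths = []
for _ in range(1000):                   # 1,000 equiprobable cases
    v = random.gauss(V_MEAN, V_SD)      # one velocity realization
    depths.append(v * TWT_S / 2.0)      # depth = velocity * TWT / 2

depths.sort()
p10, p50, p90 = depths[99], depths[499], depths[899]
print(f"Contact depth P10/P50/P90: {p10:.0f} / {p50:.0f} / {p90:.0f} m")
```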

Petrel Data Management

As Comstock’s question implied, Petrel data management is the ‘big issue’ in this community. How do you define and implement naming conventions, units of measure, depth references etc.? Today’s partial answer is to use existing systems like GeoFrame, OpenWorks and Finder to cleanse and manage data. The ProSource results capture manager and Petrel Project Builder will soon allow ‘cleansed’ projects to be built for Petrel. So data management in Petrel is now possible, given resources and rigor. But there remains plenty of scope for screwing things up!
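To illustrate the kind of rules implied here, the following sketch normalizes a well name, a unit of measure and a depth reference before data goes into a project. Field names and conversion choices are hypothetical and do not describe ProSource or Project Builder behavior:

```python
# Hedged illustration of pre-load cleansing: naming convention, unit of
# measure and depth reference brought to a common standard.
FT_TO_M = 0.3048

def cleanse_marker(raw: dict) -> dict:
    name = " ".join(raw["well"].split()).upper()                        # naming convention
    depth_m = raw["depth"] * FT_TO_M if raw["unit"] == "ft" else raw["depth"]  # units
    depth_tvdss = depth_m - raw.get("datum_elevation_m", 0.0)           # depth reference
    return {"well": name, "depth_m_tvdss": round(depth_tvdss, 2)}

print(cleanse_marker({"well": " well 31/2-a ", "depth": 10500,
                      "unit": "ft", "datum_elevation_m": 25.0}))
```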

Decide!

A different approach to data analysis was presented by the Decide! folks. Decide! operates at several levels—collecting raw data from SCADA feeds in the Data Hub before cleansing and resampling it for consumption by Finder or FieldView. Data is conditioned by removing outliers, de-noising and aggregating to a time interval ‘truly representative of the underlying data’. A rule editor and task launcher allow for post-processing such as material balance computation or gas allocations. Traditional petroleum engineering workflows load daily data on a monthly basis; Decide! shows cleansed, hourly data in real time. Once a significant amount of data has been collected, Decide!’s neural nets and self-organizing maps are used to mine the data and perform multi-variate analysis, projections and clustering.
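The conditioning steps can be sketched roughly as follows: outlier removal, de-noising with a moving average, then aggregation to a coarser interval. Thresholds and window sizes here are illustrative and are not Decide!’s own algorithm:

```python
# Sketch of conditioning a raw SCADA-style series: drop outliers (by deviation
# from the median), smooth with a moving average, aggregate to a coarser interval.
from statistics import median, mean

def condition(samples, window=5, max_dev=3.0, agg_len=60):
    med = median(samples)
    spread = median(abs(x - med) for x in samples) or 1e-9
    kept = [x for x in samples if abs(x - med) / spread <= max_dev]   # drop outliers
    smoothed = [mean(kept[max(0, i - window + 1): i + 1])             # de-noise
                for i in range(len(kept))]
    return [mean(smoothed[i:i + agg_len])                             # aggregate
            for i in range(0, len(smoothed), agg_len)]

raw = [1020 + (i % 7) for i in range(240)] + [5000]   # hypothetical rate data plus a spike
print(condition(raw)[:3])
```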

Open Spirit

Clay Harter (OpenSpirit) showed how OS connectors serve project data through the OS integration framework and support ‘CRUD’—create, read, update and delete. OS has added capabilities to ArcView to connect to data in Landmark’s OpenWorks and Schlumberger’s GeoFrame. Selection criteria can be saved in an XML file and OS can synchronize data stores at regular intervals. OS is used to bring cultural data into Petrel. Surprisingly, Petrel cannot see data in other Petrel projects. Petrel can ‘consume’ data from an OS server, but does not serve OS data itself. Harter said that to do this would require a Petrel API from SIS, which ‘may happen at some time in the future.’
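As a generic illustration of that pattern (saved selection criteria driving a periodic copy between data stores), the sketch below uses hypothetical names throughout and is not the OpenSpirit API:

```python
# Generic data-sync pattern: load selection criteria from XML, then copy
# matching items from a source store to a target store at regular intervals.
import time
import xml.etree.ElementTree as ET

CRITERIA_XML = "<selection><project>GoM-2005</project><type>horizon</type></selection>"

def load_criteria(xml_text: str) -> dict:
    root = ET.fromstring(xml_text)
    return {child.tag: child.text for child in root}

def sync_once(source_read, target_write, criteria):
    for item in source_read(criteria):   # read matching items from the source store
        target_write(item)               # create/update them in the target store

def run_sync(source_read, target_write, interval_s=3600, cycles=3):
    criteria = load_criteria(CRITERIA_XML)
    for _ in range(cycles):              # synchronize at regular intervals
        sync_once(source_read, target_write, criteria)
        time.sleep(interval_s)
```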

Roadmap

Dean Quigley (SIS) described how real time information management is used to monitor and improve operational processes. A survey commissioned by SIS found that ‘IM should become more intuitive’ and that ‘technology will remain a differentiator’, allowing companies to extract information from huge data volumes. SIS’ information stack now rests on the Seabed database, with OpenSpirit providing pervasive data access. Data management is now synonymous with ProSource and is reflected in product names like ProSource.Results.Manager etc. The legacy database products SeisDB, LogDB, Finder and Drilling DB are all to be rolled into Seabed. And SIS has agreed to make at least one manifestation of the Seabed data model public. ResultsManager can scan and spatialize Petrel projects and capture and store 3D models to Seabed. But these are so far just snapshots, not an application-independent format. The idea is to be able to ‘see everything in the enterprise from within a single window or map view’. ProSource Corporate Manager, the first component of a replacement for Finder Manager, is now out on ‘restricted release.’ But Quigley hastened to add, ‘There is no end of life to Finder’. By 2006, SIS will have a complete IM offering with ‘next generation’ log, document and image management via a link with Documentum. Whether GeoFrame moves to Seabed is ‘under discussion’. Finally, in the Q&A, ‘Ocean’ was described as ‘.NET services for Petrel’, a slight downplay from last year’s key role for Ocean.

This article has been taken from a longer report produced as part of The Data Room’s Technology Watch Reporting Service. More from tw@oilit.com.

© Oil IT Journal - all rights reserved.