GeoQuest's GeoForum 98 - sun, surf and … automation. (April 1998)

Francis Mons, Vice President of GeoQuest EAME, welcomed around 450 customer delegates from 79 oil companies, together with some 100 GeoQuest personnel, to the 1998 edition of GeoForum, GeoQuest's European technology forum. Mons mused that the sum of the production from all the oils present would exceed that of OPEC. Given that three of the best hotels in Cannes had been taken over for the event (with posted room rates of around $650 per day), the cost of the event was probably worth a few days' OPEC production too. PDM brings you the highlights.

Keynote from Thierry Pilenko, GeoQuest's new president.
Pilenko gave a top level view of where GeoQuest is going today. The target horizon for Pilenko's talk was 2005-2010, and the objective - "Automated Reservoir Optimisation". In the same breath, and following each mention of the word "automation", Pilenko stressed, as if to reassure the assembled mouse-clickers, that "We don't want to remove creativity, just to eliminate the tedious aspects of the interpretation and optimisation process". Top level talks between service companies and oils have led GeoQuest to set themselves a very high objective. In the above time-frame, the aim is to halve finding costs by upping discovery size, diminishing the number of dry holes and reducing drilling costs. The real challenge, Pilenko stated, is "to do this with existing staffing levels by maximising user potential".
New paradigm
The new paradigm for reservoir optimisation will involve more new players from sister companies in the Schlumberger group. To date, GeoQuest has provided a "small piece of the cake". To extend this to real time, new sensors and actuators will have to be developed and integrated into the software environment. Among the new players will be Schlumberger's customers and third party providers. Offering access to the GeoQuest software environment has been a "major strategic decision". Open access will be provided through standards, allowing third parties to develop their own ideas on a common software platform which will integrate the new automated process environment. In this context, Pilenko affirmed that GeoFrame is already offering such services to third parties and is "100% POSC compliant". (Editor's note - for those of you who have escaped PDM's editorial line on POSC "compliance", we would encourage you to refer to PDM Vol 3 N° 2).
An example of how all this will hang together is the move from today's interpretation environment - still essentially the 3D two-way travel time seismic cube - to the near-term objective of the velocity cube and depth cube deliverables. Further down the road, interpreters will be working directly on the "geology cube" which will integrate the seismic-derived information with corporate knowledge utilising all the interpretation skills available. This geology cube will also incorporate quantitative measures of the uncertainty in its spatial attributes and will be delivered with tools to manage and audit such uncertainty.
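One way to picture such a deliverable - and this is our illustration, not a GeoQuest data structure - is a voxel grid that carries a best-estimate attribute and its uncertainty side by side, so that the uncertainty can be "managed and audited" like any other attribute:

```python
import numpy as np

# Hypothetical "geology cube": each voxel carries a best-estimate attribute
# (here porosity) and a quantified spatial uncertainty (standard deviation).
# Illustrative only - not a GeoQuest data structure.
nx, ny, nz = 4, 4, 3
porosity    = np.full((nx, ny, nz), 0.22)                               # best estimate per voxel
porosity_sd = np.random.default_rng(0).uniform(0.01, 0.08, (nx, ny, nz))  # uncertainty per voxel

# "Managing and auditing" uncertainty could be as simple as flagging voxels
# too poorly constrained to support a drilling decision.
poorly_constrained = porosity_sd / porosity > 0.2
print(f"{poorly_constrained.mean():.0%} of voxels need more data")
```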
Economic space
Once the play types have been defined and associated with certain seismic attributes, these can be studied throughout what then becomes the "prospect cube"; as well paths are developed, and potentially modified during drilling, this in turn becomes the "planning cube". Real time input to the planning cube will come from surface mounted sensors recording information transmitted from the drill bit. This will provide continuous information gathering, incorporated into the model in real time, and allowing for changes in the well trajectory. Another example of such real-time adjustment is the possibility that a field could be discovered and appraised in a single operation. All this leads Pilenko to another geometrical environment, one in which the interpreter navigates "economic space" and where dollar values can be computed in real time for optimal drilling decisions such as dog-leg severity.
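To make the "economic space" idea concrete - with entirely invented figures, since Pilenko gave none - the running calculation might look something like this:

```python
# Toy "economic space" calculation - invented figures, for illustration only.
# As the bit advances, each candidate trajectory adjustment is scored in
# dollars: the value of extra reservoir exposure minus the drilling cost
# penalty of a sharper dog-leg.

def trajectory_value(extra_pay_m, dogleg_deg_per_30m,
                     usd_per_pay_m=50_000, dogleg_penalty_usd=80_000):
    """Net dollar value of a proposed course change (hypothetical model)."""
    return extra_pay_m * usd_per_pay_m - dogleg_deg_per_30m * dogleg_penalty_usd

# Real-time update: a new downhole reading suggests 12 m of extra pay is
# reachable with a 3 deg/30 m dog-leg.
print(trajectory_value(extra_pay_m=12, dogleg_deg_per_30m=3))   # 360000
```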
Automation key
Pilenko's top level analysis of the way in which Schlumberger's product line has evolved over the years showed how the company began with what were essentially tools and services. The next phase in Schlumberger's development followed from this as the ever increasing data volumes generated by the tools needed managing - leading to the development of data management solutions, the xxDB product line and of course, Finder. Then followed - or perhaps follows, because this is where we are today - the integration of all of this within the GeoFrame environment. The next phase, slotted for the first decade of the 21st century, will involve the automation of many of these processes and workflows. The business impact of each of these phases has been considerable, but Pilenko anticipates that it is the last, automation phase which will offer the largest return on investment. "Automation will be more important than anything we have done in the past" claimed Pilenko.
ProMIS
Vlad Skvortsov introduced the concept of the Automated Information Factory (AIF). Production engineers require information from different sources and disciplines, often stored at different locations, to different quality standards and versions, and perhaps including paper data. Ideally the same data should serve for studies at various timescales - reservoir (long term), field management (monthly) and well management (daily). The AIF intends to merge all these scales of observation in a Pilenko-esque datastore centred on Finder, but integrating web access and the use of Oil Field Manager PC-based end-use tools.
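A crude sketch of what merging those timescales means in practice - our own illustration, with invented rates, not a Finder schema:

```python
from statistics import mean

# Illustrative only: the same observations serve three timescales -
# daily well management, monthly field management, long-term reservoir study.
daily_rate_bpd    = {"1998-03-01": 1200, "1998-03-02": 1180, "1998-03-03": 1210}  # well scale
monthly_average   = {"1998-03": mean(daily_rate_bpd.values())}                    # field scale
reservoir_history = {"1998": sum(daily_rate_bpd.values())}                        # long-term input

print(monthly_average)   # {'1998-03': 1196.66...}
```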
Finder is presented in the role of a data hub, with data sources and sinks such as real-time field data, paper-based data sources, third party digital data, office applications and analytical software. The AIF (aka ProMIS) wraps all this up into a single data source, with automatic data capture and loading. Skvortsov claims that today we spend 90% of the available time on data preparation and only 10% on analysis; tomorrow, ProMIS is set to reverse this. These tools rely on the extendibility of Finder, using Oracle technology to constitute an application database. In other words there will not be one massive database for all applications.
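In code, the hub-with-automatic-capture idea might be sketched like this - again an illustration of the concept, not the ProMIS implementation:

```python
# Hypothetical data-hub sketch in the spirit of ProMIS: heterogeneous sources
# register a loader, and the hub captures their output automatically.
# Names and behaviour are invented for illustration.

class DataHub:
    def __init__(self):
        self.loaders = {}
        self.store = {}   # stands in for the Finder/Oracle application database

    def register_source(self, name, loader):
        self.loaders[name] = loader

    def capture_all(self):
        # "Automatic data capture and loading": pull every registered source.
        for name, loader in self.loaders.items():
            self.store[name] = loader()

hub = DataHub()
hub.register_source("field_sensors", lambda: {"well_A": 1200})
hub.register_source("paper_records", lambda: {"well_B": "scanned log, 1987"})
hub.capture_all()
print(hub.store)
```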
SEM - revisited
Ian Bryant of Schlumberger Doll Research described how the new reality of the SEM is being developed. The basic problem is that the reservoir is unevenly and under-sampled. As an example, the area actually sampled by logs in an oilfield may be as little as 0.0001%. The impact of this depends on reservoir geometry. For flat lying beds you may get away with relatively low sample density; for a labyrinth type reservoir this is unlikely to hold true. Current 3D models honour some of the data but introduce a new problem: an implicit confusion between real data and interpretation. There is a requirement to visualise what we know and where we know it, and what and where we don't. Typically, costly processing and interpretation may be performed on some datasets such as well logs or 3D seismics, but neither high resolution log information nor 3D stratigraphic information actually gets into the model. At a well, complex reservoir information may be collapsed to binary sand/shale voxels, while seismic information may be reduced to top and bottom of reservoir. "If a picture is worth a thousand words, an image is worth as many wiggles".
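Bryant's sampling figure is easy to sanity-check with rough, assumed dimensions - the numbers below are ours, not his:

```python
import math

# Back-of-envelope check of the "0.0001% sampled" claim, with assumed figures:
# 50 wells, logs reading ~0.8 m around each borehole, over a 10 km x 10 km field.
field_area_m2  = 10_000 * 10_000
wells          = 50
log_radius_m   = 0.8
logged_area_m2 = wells * math.pi * log_radius_m ** 2

fraction = logged_area_m2 / field_area_m2
print(f"{fraction:.1e}")   # ~1.0e-06, i.e. about 0.0001% - the same order as Bryant's figure
```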
Deja vu?
So is the SEM just visualisation software revisited? No, according to Bryant, who describes the use of the SEM in the "validation gauntlet", whereby a fast simulator is used to predict and match iteratively, yielding a number of models that fit the data along with a measurement of the associated uncertainty. This was demonstrated with a video of a 3D view of a fault-block in Statoil's Gullfaks field. In one window a ray-traced seismic model was compared with the recorded seismic data; in another, the input geological model could be tweaked and the impact of such adjustments viewed in real time on the seismic simulator. The demonstration was sufficient to impress some Statoil personnel in the audience, who may have had second thoughts about the deal they have just struck with Landmark for an enterprise-wide computing solution.
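Stripped of the ray-tracing, the "validation gauntlet" is an iterate-and-keep-what-fits loop; a toy version (our stand-in simulator, not Schlumberger's) looks like this:

```python
import random

# Abstract sketch of the "validation gauntlet": perturb a geological model,
# run a fast forward simulator, keep the models that match the recorded data
# within tolerance, and report how many survive (a crude uncertainty measure).
# The "simulator" below is a stand-in, not a seismic ray-tracer.

random.seed(0)
observed_twt_s = 2.30   # e.g. a two-way travel time picked from the data

def fast_simulator(depth_m, velocity_ms):
    return 2 * depth_m / velocity_ms   # two-way time for a single flat reflector

accepted = []
for _ in range(1000):
    model = {"depth_m": random.uniform(2000, 2600),
             "velocity_ms": random.uniform(1900, 2300)}
    if abs(fast_simulator(**model) - observed_twt_s) < 0.01:
        accepted.append(model)

print(f"{len(accepted)} of 1000 trial models fit the data")
```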
Petrotech
The client side of the outsourcing story was presented by Paul Blair of BG E&P. The decision to outsource was made as part of the de-merger of British Gas which resulted in the creation of BG E&P. Prior to this, British Gas's E&P effort was organised into "resource intensive" asset based teams. A modern enough business paradigm, you might think, but there are no sacred cows in the cultural revolution of BPR. At de-merger, E&P was to downsize by 45% in a move to a functionally-based organisation designed for better utilisation of resources. The lead role in the "Petrotech" outsourcing project was awarded to SAIC. This arrangement came about from BG's desire to have "centralised control through a single point of contact with the primary partner".
Balanced
Working under SAIC are GeoQuest, Landmark and other contractors. Petrotech is described as a partnership, based around a core of service level agreements. A risk/reward cost model is used and cost savings are shared between BG and the providers. Performance is measured regularly by a "Balanced Business Scorecard". Petrotech has been up and running for one year now. Blair described outsourcing as a "major, non-trivial task". The first three months were a transition phase, with BG staff retained to assist with the process. Subsequent to this transition, BG has experienced an (unplanned) 100% staff turnover, with many taking the voluntary retirement package which was on offer. Blair suggested that a retention bonus might have been a better ploy than paying people to quit!
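The risk/reward model boils down to splitting savings (or overruns) against an agreed baseline - the figures below are invented for illustration:

```python
# Invented figures: a risk/reward model shares savings against an agreed
# baseline cost between the client and the service providers.
baseline_cost = 10_000_000   # agreed annual baseline (hypothetical)
actual_cost   = 8_500_000    # what the outsourced service actually cost
client_share  = 0.6          # split agreed in the service level agreements

saving = baseline_cost - actual_cost
print(f"BG keeps {saving * client_share:,.0f}, providers keep {saving * (1 - client_share):,.0f}")
```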
Major findings after the first year
* Expectation levels were unrealistic
* There was confusion over the respective roles of the legacy corporate IT services and Petrotech. The latter was blamed for some of the failings of the former.
* Recruiting and retaining high quality staff proved tough
* Keeping members of the "old guard" in key positions was considered a mistake
* Friction was generated between the different "cultures"
* New technology was introduced "too slowly" - a new approach is slotted for 1998
Sensitive issue
On the positive side, the new professional approach - notably in cartography - overcame some scepticism on the part of the user community. Additionally, substantial cost savings have been reported - as much as 40% - although the situation before the change was such that the baseline has been hard to establish. Outsourcing is a sensitive issue for E&P personnel and Blair was probed with questions from the floor as to the overall efficiency and gains accruing from the outsourcing effort. Blair opened up and stated that the outsourcing decision was taken at "a high level" in the organisation and that not all the results have been positive. "We have lost expertise - outsourcing is a balancing act. In some areas the service is not as good as it was before the outsourcing initiative." The main positive point to date has been the cost saving.
GISWeb
Agip, in cooperation with GeoQuest EAME, have developed a web browser for Finder/Enterprise and other data sources. The main design constraints were to provide access to geographically dispersed data throughout AGIP's world wide operations, over low bandwidth links, and to access heterogeneous E&P data stores. The technology involves a three-tier structure, with the GISWeb Java client talking through CORBA links and "dispatch middleware" to CORBA data servers grafted onto a variety of standard E&P data stores. The technology can be connected to any type of E&P data store by associating a CORBA driver, which allows the existing datastore to remain unchanged. Servers have been developed for Petroconsultants' IRIS21, AGIP's Forall environment and GeoFrame.

Maps are drawn vectorially on the client screen and can be zoomed, panned and selected. Intelligent scale-sensitive data transmission economises on bandwidth and ensures reasonable performance even over low bandwidth internet connections. Apart from the Java client, another version of the tool exists as a Finder Smartmap client. This works in a similar way to GeoQuest's GeoWeb product except that it is not limited to a single pre-defined Finder map, and access to foreign data is facilitated.

Herve Ganem described the job of converting third party data stores to run as CORBA data servers as "relatively simple", emphasising that the data in the original data stores required no modification for this technology to work. Paul Haines (GeoQuest's Head of Data Management Product Planning) told PDM "These local-level developments are of considerable interest to us in the GeoQuest software division. They provide us with feedback on customer requirements and deliver working prototypes of software modules. We track such efforts closely and will integrate the results of such efforts into our product line if client demand is there".
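The scale-sensitive transmission Ganem described can be illustrated very loosely - feature names and thresholds invented - as a server-side filter that only ships what is worth drawing at the client's current map scale:

```python
# Loose illustration of "intelligent scale-sensitive data transmission":
# the server only sends map features meaningful at the client's current scale,
# saving bandwidth on slow links. Feature names and thresholds are invented.
features = [
    {"name": "licence block", "min_scale": 1_000_000},
    {"name": "pipeline",      "min_scale":   250_000},
    {"name": "wellhead",      "min_scale":    50_000},
]

def features_for_scale(scale_denominator):
    """Return only the features worth drawing at this map scale."""
    return [f["name"] for f in features if scale_denominator <= f["min_scale"]]

print(features_for_scale(1_000_000))   # zoomed out: ['licence block']
print(features_for_scale(50_000))      # zoomed in: all three features
```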


© Oil IT Journal - all rights reserved.