Kjell Eric Ostdahl, two weeks into his presidency of Schlumberger Information Solutions, gave a competent performance, imagining what E&P will look like ten years from now. Ostdahl noted that connecting information has not added value; instead it has produced information overload. Now we need to ‘combine technology with process, best practice and business integration,’ to ‘connect workflows’ and generate high quality prospects in ‘days rather than months’. Service companies will work with oils to improve the quality of interpretation, especially through the judicious application of 4D seismics. Instrumented wells updating a shared earth model will raise recovery to around 70%. Finding and development costs fell by 25% in 2000 and will halve by 2014. Single wells will reach multiple targets, and seismics while drilling will look ahead of the bit. Onshore operation centers will take real time feeds from static and dynamic models. Time to first oil will be halved.
On the SIS application front, Petrel is to be the core of SIS’ ‘seismic to simulation’ workflow—expanded to include seismics and ‘forward simulation’. Work is underway with ChevronTexaco on a new flow simulator, ‘Project Intersect’, which will integrate with Petrel. The SIS Business Consulting Group, formed one year ago, now has around 100 consultants working on asset optimization, QHSE and HR. The consulting business unit will help make SIS a ‘thought leader’ and ‘turn the industry around’.
Too much G&G!
Ostdahl believes that SIS has been ‘way too focused on G&G.’ But this year has seen a renewed interest in production with new software tools—especially Decision Team’s production data mining. Moving further midstream and downstream, SIS has initiated an alliance with Aspen Technology. Schlumberger’s application portfolio is to consolidate around an ‘open framework’: a single shared earth model (Seabed) with an open application programming interface, used by SIS to develop its own applications.
A round table discussion led by the Economist’s Vijay Vaitheeswaran, author of the book ‘Power to the People,’ offered an upscale panel with representatives from the IEA, NNPC, Saudi Aramco, Oxy, Shell, Pemex and Total. Vaitheeswaran described a situation where an energy crisis and rising prices were reported as having ‘a serious effect on industry.’ This was back in 1704 when charcoal prices had doubled in a 70 year period! Thus was born the coal-fueled industrial revolution. Such events and, by implication, the situation today, reflect ‘a confluence of crisis, technology and entrepreneurship’.
Oxy CIO Don Moore categorizes the i-field as MOP ‘modern oilfield practices.’ These comprise virtual teams with a ‘plug and play’ workforce, ‘drill to earth model’ practices and the quest for the ‘perfect’ well production index. Knowledge networks, communities of practice and consistent E&P processes ‘bother some people’. But overall, the consistent visualization model across all assets and common look and feel ‘ease access to technical data’. Oxy is also reaping the benefits of the ‘convergence’ of geoscience and engineering tools. Standard software from multiple vendors aims to provide ‘best of breed’ tools for the job.
Shell CIO Graeme Henderson asks ‘will a new technology still be around in 10 years?’ Can staff adapt? Can it be deployed? Over the next decade, Henderson sees a move from wired to wireless, from fixed capacity to on-demand commoditization of systems, from dumb, stand-alone devices to smart integrated systems. These will enable the fully digital smart field, supported by the ‘loop’ of real time measurement, adapting the model and controlling the field. In one heavy oil steam injection project, micro-seismic monitors are used to aid drilling decisions on multi-branch intelligent completions—integrating e-SCADA over the vMBus. Elsewhere, IT’s move to ‘utility computing’ will have ‘a remarkable impact on seismic processing’. Visualization will scale up to always-on ‘vast walls’ supporting worldwide collaboration.
Schlumberger’s Ihab Toma reported an IBM study which found that ‘industry wastes $15 billion per year on bad decisions’. These lead to oversize facilities, dry wells and other costly gaffes. SIS’ solution is to offer global connectivity and support ‘best-of-breed’ niche applications. These should be usable ‘without breaking the workflow.’ Citing the Cambridge Energy Research Associates’ Digital Energy study, Toma stated that all the CERA goals for increased production from i-field deployment have been achieved or surpassed in individual projects. But still, relatively few projects use all available technology. Fewer than one in ten wells are drilled dynamically with wellbore design and/or target adjusted during drilling. While a large amount of real time data is collected, little is done with it.
Toma outlined SIS’ technology blueprint which has the Microsoft .NET-based ‘Ocean’ as the framework for development and production. Linux-cluster-based GigaViz is now integrated with GeoFrame (which remains SIS’ flagship ‘qualification’ tool and SIS’ largest software cost center). According to Toma, 70% of oil and gas companies already run Linux clusters. In 2005, Petrel will have a GigaViz server running over OpenSpirit. Complete seismic workflows are being built into Petrel including the new Ant horizon tracking. Petrel will be ‘Ocean-compliant’ next year. There will also be a real time WITSML feed to Petrel ‘real soon now’. Toma also spoke about Schlumberger’s alliance with AspenTech for field-to-facilities modeling—although this topic was somewhat under-represented elsewhere at the Forum.
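WITSML, the real time drilling data standard mentioned above, carries log curves as XML. By way of illustration only—this is not SIS or Petrel code, and the fragment below is a heavily simplified, hypothetical WITSML-style document with namespaces and most mandatory elements omitted—a consuming application could pick up curve mnemonics and data rows along these lines:

```python
import xml.etree.ElementTree as ET

# Hypothetical, simplified WITSML-style log fragment. A real WITSML 1.x
# document carries a schema namespace, units of measure metadata and
# many more attributes; this sketch keeps only the bare skeleton.
WITSML_FRAGMENT = """
<logs>
  <log uidWell="W-001" uidWellbore="WB-01">
    <logCurveInfo><mnemonic>DEPT</mnemonic><unit>m</unit></logCurveInfo>
    <logCurveInfo><mnemonic>ROP</mnemonic><unit>m/h</unit></logCurveInfo>
    <logData>
      <data>1500.0,12.3</data>
      <data>1500.5,11.8</data>
    </logData>
  </log>
</logs>
"""

def parse_log(xml_text):
    """Return (curve mnemonics, rows of floats) from a simplified log."""
    log = ET.fromstring(xml_text).find("log")
    mnemonics = [ci.findtext("mnemonic") for ci in log.findall("logCurveInfo")]
    rows = [[float(v) for v in d.text.split(",")]
            for d in log.find("logData").findall("data")]
    return mnemonics, rows

mnemonics, rows = parse_log(WITSML_FRAGMENT)
```

In a real time feed, such fragments would arrive incrementally from a WITSML server rather than as a static string.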
Peter Kapteijn (Shell) proselytized enthusiastically for real time production optimization. Shell’s ‘value loop’ involves a circle of four ‘assets’: physical, data, models and decisions. A significant value loop for Shell is the Smart Field (SF) built around an integrated Data and Control Architecture (DACA) linking real time data with asset teams. Today the SF is hampered by masses of inconsistent data, poor models, spasmodic optimization efforts, laborious decision making and poor uncertainty handling. This leads to ‘reactive rather than proactive’ reservoir management. Ten SF pilots are underway including the deepwater Gulf of Mexico Mars field, Brunei’s ‘East Asset’ and the Norwegian Ormen Lange gas field which is ‘completely smart*’. Kapteijn reports that all Shell’s smart field tests have beaten CERA’s digital oilfield forecasts. Better still, the cost of ‘smartness’ is dropping all the time. Ten years ago, a distributed control system (DCS) cost $15-25k/well; today it is around $5k. Shell’s integrated services platform will be built with components from Invensys (ArchestrA), SIS (Ocean) and Microsoft (Longhorn)**.
Kapteijn concluded his presentation with a compelling ‘Nintendo’ simulation performed for Shell by the Delft Technical University. A video showed water injection and oil production monitored and modeled in real time. A ‘dynamic mode’ allows for optimization, changing settings of valves as encroachment is anticipated. Such actively managed flood patterns can lead to 20% more recovery and 50% less water to dispose of. The more complex the reservoir, the better this works. Kapteijn asked, ‘Can industry afford not to do this?’
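The principle behind the Delft demonstration—act on the model’s forecast of water encroachment rather than wait for water to show up at the producer—can be caricatured in a few lines. This is a toy sketch, not the Delft simulation: the one-well ‘reservoir’, the breakthrough numbers and both valve policies are invented, and it only illustrates the reduced-water side of the claim.

```python
def simulate(policy, steps=50):
    """Toy one-injector/one-producer flood. Water cut rises with
    cumulative injection; 'policy' maps the model's predicted
    next-step water cut to a valve setting. Returns (oil, water)."""
    cum_inj = 0.0
    oil = water = 0.0
    valve = 1.0
    for _ in range(steps):
        rate = 10.0 * valve                                   # injection rate
        cum_inj += rate
        # Invented response: breakthrough after 200 units injected.
        wcut = min(0.9, max(0.0, (cum_inj - 200.0) / 400.0))
        oil += rate * (1.0 - wcut)
        water += rate * wcut
        # The 'model' forecasts next step's water cut; policy reacts to it.
        predicted = min(0.9, max(0.0, (cum_inj + rate - 200.0) / 400.0))
        valve = policy(predicted)
    return oil, water

reactive = lambda pred: 1.0                          # ignores the forecast
proactive = lambda pred: 0.3 if pred > 0.2 else 1.0  # choke before encroachment

oil_r, water_r = simulate(reactive)
oil_p, water_p = simulate(proactive)
```

Running both policies shows the proactive loop producing substantially less water for disposal—the essence of ‘proactive rather than reactive’ reservoir management.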
Schlumberger lead data modeler Jay Hollingsworth unveiled the new Seabed data model architecture. Seabed represents a shift from ‘data-centric’ to ‘i-enabled’ processes. Hollingsworth’s brief was to supply a single logical data model that would support ‘everything’. Seabed is a ‘configurable’ database for Schlumberger’s next generation applications and information management. Unlike existing E&P databases, Seabed will leverage Oracle’s technology to the full—particularly the extensions available in Oracle 9 and 10. All new Schlumberger developments will deploy Seabed. But the technology itself is ‘not for sale’. Notwithstanding this, Seabed was presented elsewhere at the Forum as a future replacement for Finder. Today, products like SeisManager and ReservoirManager already use the new database.
In designing Seabed, Schlumberger took the best of POSC’s Epicenter and the PPDM Association’s model. The result is a modular design, separable into stand-alone units by domain, by level of detail and by function. Uncertainty is ‘completely embedded’—forming a ‘third dimension’ in the Seabed data model.
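Seabed’s actual schema is proprietary and Oracle-based, so the following is a hypothetical sketch of the general idea only: ‘embedding’ uncertainty means a measured quantity is stored together with its error model, rather than as a bare number. All type and field names here are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class UncertainValue:
    """A quantity carried with its uncertainty, not as a bare float."""
    value: float
    std_dev: float               # one-sigma uncertainty
    distribution: str = "normal"

@dataclass
class HorizonPick:
    well: str
    horizon: str
    depth_m: UncertainValue      # depth carries its own uncertainty 'dimension'

pick = HorizonPick("W-001", "Top Brent", UncertainValue(2150.0, 12.5))

# Any consumer of the model can derive a confidence range directly.
low = pick.depth_m.value - 2 * pick.depth_m.std_dev
high = pick.depth_m.value + 2 * pick.depth_m.std_dev
```

The payoff of this design is that uncertainty propagates with the data through every application that reads the model, instead of living in a side spreadsheet.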
Steve Darell (Iron Mountain) described the ‘tight partnership’ between SIS and Iron Mountain which has resulted in eSearch V2.6, the current release. This resulted from the marriage of Iron Mountain’s Open RSO and Schlumberger’s AssetDB. The web-enabled tool offers search and ordering by multiple criteria—well, geographic area, permit etc.—and interfaces with multiple data sources. Full text ‘Google-like’ searching across all digital assets is now possible. Information is stored independently of its location. Total’s home-grown DocXplo asset management system has also contributed to the product’s evolution. An API allows for integration with third party portals and applications such as Documentum.
Dean Quigley presented Schlumberger’s IT roadmap to the future which targets ‘wider integration with downstream systems’. The foundation is the aforementioned Seabed, with Ocean supporting the drilling and production tools. A souped-up version of the OpenSpirit middleware will assure connectivity with legacy applications—particularly GeoFrame. Today’s ‘silos’—Finder, LogDB and SeisDB—will be replaced with more ‘open’ systems built around Seabed and OpenSpirit data access. The plan is for ‘smooth evolution’ rather than abrupt change. Finder will transition ‘when you choose to’ and it will be a ‘one button push’ operation, ‘providing Finder has not been customized!’.
Mike Hauser presented ChevronTexaco’s (CTX) vision for the ‘i-field.’ Even on a high-end Gulf of Mexico asset people still spend 50 to 80% of their time manipulating data before making a decision. Hauser believes we have to ‘fix processes so as not to be inundated by data’. Today, we have lots of information from automation systems and down hole feeds and ‘we don’t do anything with it’. CTX’s i-field initiative sets out to help assets identify opportunities, to link solutions to assets and fill technology gaps with R&D. Hauser also finds the CERA numbers conservative and sees the i-field as the ‘next wave for a major change in our industry’. CTX does not believe in ‘pilots’: these do not garner sufficient commitment for success. i-field successes have been reported from the MP41 GOM platform—equipped with wireless technology—and an artificial intelligence (AI) test on the Cymric steam injection project in California where an AI scheduler has standardized complex procedures on the 1Y field. CTX’s partners include Schlumberger, SAIC and Microsoft—described by Hauser as ‘companies who have kind of got it!’
Amy Howell showed how the SIS/AspenTech alliance is making possible combined modeling of the reservoir and production facilities. The Avocet Field Asset Modeler takes fluid compositional forecasts from the simulator and feeds AspenTech’s Hysys facilities model. Real-world development options can be leveraged to maximize NPV over the life of a field.
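The coupling Howell described—a reservoir forecast feeding a facilities model, with the constrained result rolled up into field economics—can be sketched loosely as follows. This is not Avocet or Hysys code: every function, rate and constraint below is invented to show the shape of the loop, in which facilities capacity, not the reservoir alone, sets the deliverable rate.

```python
def reservoir_forecast(year):
    """Invented reservoir side: declining oil rate (stb/d) and a
    gas-oil ratio (scf/stb) that rises as the field matures."""
    oil_rate = 10_000.0 * 0.88 ** year
    gor = 800.0 + 120.0 * year
    return oil_rate, gor

def facilities_limit(oil_rate, gor, gas_capacity=8e6):
    """Invented facilities side: cap oil production so that produced
    gas fits within plant gas-handling capacity (scf/d)."""
    gas = oil_rate * gor
    return oil_rate if gas <= gas_capacity else gas_capacity / gor

def field_npv(years=10, price=60.0, discount=0.10):
    """Roll the coupled forecast up into a discounted revenue figure."""
    npv = 0.0
    for year in range(years):
        oil, gor = reservoir_forecast(year)
        deliverable = facilities_limit(oil, gor)   # facilities constrain oil
        npv += deliverable * 365 * price / (1 + discount) ** year
    return npv
```

In this toy, the gas constraint binds in the middle years of the forecast, so a development option that debottlenecks gas handling would change the NPV—the kind of trade-off the combined reservoir-and-facilities model is meant to expose.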
* The Norsk Hydro-operated Ormen Lange may be ‘smart’ but the Financial Times described it recently as a ‘byword for inconsistency in reserves booking’!
** A surprise, as Longhorn, Microsoft’s next operating system, is due for release towards the end of 2006!
This article has been taken from a longer report produced as part of The Data Room’s Technology Watch Reporting Service. More from email@example.com.
© Oil IT Journal - all rights reserved.