In the special ‘Boldness in E&P’ session, Tullow Oil Uganda’s Shane Cowley stated that Tullow’s fundamental premise is that ‘teamwork is the key.’ A well drilled in the 1930s showed reservoir, seal, shale and shows, but nobody believed that the threshold could be met. A World Bank-funded aeromagnetic survey proved a significant deep basin. This was later refined with an airborne full tensor gravity gradiometry survey that ‘identified most structures we have now drilled.’ The now proven theme of hanging wall anticlines (‘pizza slice’ plays) is ‘a somewhat unconventional play’ of sediment against basement. The play has been extended into Ethiopia and Kenya, where Tullow has one major discovery and ‘an acreage position the size of England.’
Austria-based Joanneum Research Institute’s Johannes Amtmann is building a seismic attribute database for effective literature research. The project has compiled the attributes described in interpretation software documentation (CGG Geovation, Kingdom, OpendTect, Petrel, ProMAX) and in publications; these have been classified and stored in a database. The initial work was delivered to OMV as a Microsoft Access database, now migrating to a ‘Seismic Attribute’ web resource.
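Amtmann did not describe the schema itself; a minimal sketch of how such an attribute catalogue might be organized (the fields, categories and query below are illustrative assumptions, not the project’s actual design) could look like this:

```python
from dataclasses import dataclass, field

@dataclass
class SeismicAttribute:
    """One catalogue entry: an attribute as documented in a software package or paper."""
    name: str                                       # e.g. 'RMS amplitude'
    category: str                                   # e.g. 'amplitude', 'frequency', 'geometric'
    software: list = field(default_factory=list)    # packages implementing it
    references: list = field(default_factory=list)  # literature citations

catalogue = [
    SeismicAttribute('RMS amplitude', 'amplitude',
                     software=['Petrel', 'OpendTect'],
                     references=['Taner et al., 1979']),
    SeismicAttribute('Coherence', 'geometric',
                     software=['Kingdom', 'OpendTect'],
                     references=['Bahorich & Farmer, 1995']),
]

def by_category(cat):
    """Simple literature-research query: all attributes in a category."""
    return [a.name for a in catalogue if a.category == cat]

print(by_category('geometric'))  # -> ['Coherence']
```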
A Schlumberger Techlog booth presentation called for more rigorous cement evaluation in deep offshore drilling. Deepwater wells with thick casing and lightweight mud present problems for conventional cement bond log (CBL) tools. A new flexural attenuation tool and specialist processing can sense through the cement to the formation. Combined with ultrasonic acoustic impedance, this gives a clearer identification of the top of cement, along with potential channeling issues and free pipe. The presenters suggested that the API RP96-B3 cement evaluation, which relies on subjective evaluation, could be beefed up.
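The processing itself was not detailed, but the underlying principle, cross-plotting flexural attenuation against ultrasonic acoustic impedance to discriminate solid, liquid and gas in the annulus, can be sketched as follows. The cut-off values here are invented for illustration, not vendor figures:

```python
def classify_annulus(flexural_atten_db_cm, impedance_mrayl):
    """Toy solid/liquid/gas discrimination behind casing.

    Real tools cross-plot flexural wave attenuation against
    ultrasonic acoustic impedance; the thresholds below are
    illustrative assumptions only.
    """
    if impedance_mrayl < 0.3:                            # very low impedance: gas
        return 'gas (free pipe?)'
    if flexural_atten_db_cm > 1.0 and impedance_mrayl > 2.6:
        return 'solid (cement)'
    if flexural_atten_db_cm < 0.5:
        return 'liquid (possible channel)'
    return 'indeterminate - flag for review'

# Scan a (hypothetical) depth log and look for the top of cement,
# i.e. the shallowest depth where 'solid' first appears.
log = [(1500, 0.2, 0.25), (1600, 0.4, 1.8), (1700, 1.4, 3.1)]
for depth_m, atten, z in log:
    print(depth_m, classify_annulus(atten, z))
```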
Unicamp, Brazil’s Denis Schiozer has been investigating production optimization and the economics of high-end inflow control valves. A literature review of intelligent wells and optimization revealed some unfair comparisons and confusing terminology. A simple reservoir simulation model was developed to check the viability of inflow control valves (ICVs) against simpler on/off valves. A genetic algorithm was used to generate hundreds of cases and compare net present value (NPV) for different oil prices and water disposal costs. This gave some unexpected results: not all ICV deployments pay back the cost of deployment, and several cases have the same NPV with different total production. Schiozer concluded that NPV optimization is a complex process.
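Neither the simulator nor the genetic algorithm settings were described; a minimal sketch of the approach, with a stand-in cash-flow function in place of the reservoir simulator, might look like this (all numbers are illustrative assumptions):

```python
import random

OIL_PRICE = 80.0      # $/bbl - one of the sensitivities varied in the study
WATER_COST = 2.0      # $/bbl water disposal cost - the other sensitivity
ICV_CAPEX = 5e6       # assumed cost of deploying the valves

def npv(settings):
    """Stand-in for a reservoir simulation run: maps valve settings
    (0=closed .. 1=open, one per valve) to discounted cash flow.
    A real study would call the simulator here."""
    oil = sum(1e5 * s for s in settings)            # toy oil response
    water = sum(4e4 * s * s for s in settings)      # toy water response
    return oil * OIL_PRICE - water * WATER_COST - ICV_CAPEX

def evolve(n_valves=4, pop=50, gens=40):
    """Simple genetic algorithm: selection, crossover, mutation."""
    population = [[random.random() for _ in range(n_valves)] for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=npv, reverse=True)
        parents = population[:pop // 2]             # keep the fitter half
        children = []
        for _ in range(pop - len(parents)):
            a, b = random.sample(parents, 2)
            child = [random.choice(g) for g in zip(a, b)]   # per-gene crossover
            i = random.randrange(n_valves)
            child[i] = min(1, max(0, child[i] + random.gauss(0, 0.1)))  # mutation
            children.append(child)
        population = parents + children
    return max(population, key=npv)

best = evolve()
print('best settings:', [round(s, 2) for s in best], 'NPV:', round(npv(best)))
```

Re-running with different OIL_PRICE and WATER_COST values reproduces the kind of sensitivity comparison Schiozer described, including cases where the optimum NPV does not cover ICV_CAPEX.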
Clifford Allen (Halliburton) is working on a data system for the management of intelligent completions. The idea is to combine SCADA data with flow assurance, decline analysis, allocation and well test data in a hierarchical data structure. While conventional SCADA devices have a relatively standard interface, downhole instrumentation is more complex. Fields now have two or more sets of RTUs/PLCs and dual SCADA systems. Polling devices across these systems is hard, so data usually goes to the historian. But even here, differing time stamps make it hard to get data into enterprise reporting.
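The time-stamp problem Allen described, two historians recording the same well on different clocks and sample rates, is typically handled by aligning the feeds onto a common time base. A minimal pandas sketch (tag names, rates and the 60 s tolerance are hypothetical):

```python
import pandas as pd

# Two historian extracts for the same well, on different clocks and rates.
surface = pd.DataFrame(
    {'whp_psi': [1210, 1205, 1198]},
    index=pd.to_datetime(['2012-06-04 00:00:03', '2012-06-04 00:00:33',
                          '2012-06-04 00:01:03']))
downhole = pd.DataFrame(
    {'bhp_psi': [4820, 4815]},
    index=pd.to_datetime(['2012-06-04 00:00:10', '2012-06-04 00:01:10']))

# Align on the surface time base, taking the latest downhole value
# no older than 60 seconds, for enterprise reporting.
merged = pd.merge_asof(surface.sort_index(), downhole.sort_index(),
                       left_index=True, right_index=True,
                       tolerance=pd.Timedelta('60s'))
print(merged)
```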
Halliburton’s answer is its Asset Optimization Service (AOS), an ‘open source’ system that handles all data interfaces and adds ‘complex algorithms.’ A screen mimics the well from BHA to surface. Data hooks connect to third-party devices for hydraulic control. AOS, sold as a service offering, can optimize production leveraging all the available data. In the Q&A, Allen was asked about the status of the IWIS downhole data standard, which was supposed to address such issues. He responded that while Petrobras is pushing strongly for IWIS, it is ‘hard to see who will pay for such a fundamental shift.’
Mosab Nasser (Maersk Oil) is critical of seismic inversion models that are ‘thrown over’ to interpreters and geologists who often don’t understand them. Nasser advocates using rock physics to build the reservoir model. A case study of a West African submarine fan involved multiple possible geological scenarios, which were triaged using seismic amplitudes and rock physics. The ‘3D close the loop’ (3D CtL) process involves rigorous rock physics modeling of the reservoir and forward modeling to a synthetic seismic section. This is compared with the 3D data and the differences are explained in terms of fluid content. The process is run from Schlumberger Petrel along with an in-house developed ‘Mod2Seis’ that computes the elastic response. Iteration aligns the model with the data, but, Nasser warns, ‘it may not be right!’ The technique has led Maersk to change its geological concepts, losing channels and gaining faults. Another caveat is that the conditions need to be right; these techniques ‘would not work in the pre-salt.’
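Mod2Seis itself is proprietary, but the kernel of any close-the-loop workflow is forward modeling a synthetic from the elastic model and differencing it against the observed data. A toy 1D convolutional analogue, with invented elastic values and a synthetic ‘observed’ trace standing in for the real 3D data:

```python
import numpy as np

def reflectivity(vp, rho):
    """Normal-incidence reflection coefficients from an elastic profile."""
    z = vp * rho                                    # acoustic impedance
    return (z[1:] - z[:-1]) / (z[1:] + z[:-1])

def ricker(f=30, dt=0.002, n=101):
    """Ricker wavelet of peak frequency f (Hz)."""
    t = (np.arange(n) - n // 2) * dt
    a = (np.pi * f * t) ** 2
    return (1 - 2 * a) * np.exp(-a)

# Elastic model from rock physics (values are illustrative).
vp = np.array([2500., 2500., 3100., 3100., 2800.])  # m/s
rho = np.array([2.20, 2.20, 2.45, 2.45, 2.30])      # g/cc

synthetic = np.convolve(reflectivity(vp, rho), ricker(), mode='same')

# 'Close the loop': compare with the observed trace; here a noisy
# placeholder, in practice the 3D seismic at the well location.
observed = synthetic + np.random.normal(0, 0.01, synthetic.size)
residual = observed - synthetic
print('RMS misfit:', np.sqrt(np.mean(residual ** 2)))
```

Iterating on the elastic model (fluid content, lithology) to shrink the residual is the loop Nasser describes; his caveat stands, a good fit ‘may not be right.’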
Another model building workflow was presented by Statoil’s Xavier van Lanen and Jan-Arild Skjervheim. The idea is an automated process from geomodeling through to simulation that can be updated as new data arrives. The model is conditioned to seismic data and represents and propagates uncertainty. The process spins off multiple fluid flow simulations for history matching, which then update the model. Simulation, point estimate and ensemble methods are all run under a workflow manager; the output is a set of forecast production profiles. Subsequent seismic and well data can be used to update depth surfaces and faults. The ‘base case’ model runs on a PC, which fires off multiple realizations on the cluster. The workflow manager is used to test sensitivities and scenarios with ‘smart’ workflows. The process can be very compute intensive, with, for example, 28 CPU hours per realization for a 20 million cell geo grid and 4 million cell simulation grid. But the results are worth it, showing the effect of structural uncertainty on production profiles. The automated workflow allows Statoil to pursue otherwise unachievable analysis of a field. Whole-loop workflows, through an assisted history match and back to the structural model, are now feasible, and the prior model can be compared with the update and against production.
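Statoil’s workflow manager is in-house, but the general pattern, spawning realizations in parallel, scoring each against production history and retaining the best-matching ensemble for forecasting, can be sketched as follows (the simulator call is a placeholder for the 28 CPU-hour cluster job):

```python
import random
from concurrent.futures import ProcessPoolExecutor

HISTORY = [950., 900., 870., 830.]            # observed rates (placeholder data)

def simulate(seed):
    """Placeholder for one geomodel + flow-simulation realization.
    In the Statoil workflow this is a multi-hour cluster job."""
    rng = random.Random(seed)
    decline = rng.uniform(0.03, 0.08)          # structural/property uncertainty
    return [1000. * (1 - decline) ** t for t in range(len(HISTORY))]

def mismatch(profile):
    """History-match objective: squared error against observed rates."""
    return sum((p - h) ** 2 for p, h in zip(profile, HISTORY))

if __name__ == '__main__':
    seeds = range(100)                         # 100 realizations
    with ProcessPoolExecutor() as pool:        # stand-in for the cluster
        profiles = list(pool.map(simulate, seeds))
    scored = sorted(zip(seeds, profiles), key=lambda sp: mismatch(sp[1]))
    posterior = scored[:10]                    # best matches, kept for forecasting
    print('best seeds:', [s for s, _ in posterior])
```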
Several papers addressed computer-aided interpretation. Eric Suter of Norway’s IRIS Research Institute has developed a geometrical transform that ‘stretches and squeezes’ internal bed geometries at will while dissociating rock properties from bed geometry. The transformation acts on formation boundaries (no grids are involved) with intelligent linking across faults. An automated fault update facility inserts new faults, avoiding a lengthy manual update. The idea is an evergreen earth model that can be updated while drilling. The absence of grids means that complex multi-z surfaces such as recumbent folds can be modeled with properties intact. In the Q&A, Jean-Laurent Mallet noted an apparent similarity between this technique and his own UVT transform as used in Paradigm’s Skua.
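Suter’s transform operates on boundaries rather than grids; in one dimension the idea reduces to re-mapping sample positions between old and new formation tops while the rock properties ride along unchanged. An illustrative numpy sketch (the depths and porosities are invented):

```python
import numpy as np

def restretch(depths, props, old_tops, new_tops):
    """Piecewise-linearly map sample depths from the old boundary
    framework to the new one; properties stay attached to their
    samples. A 1D toy version of a grid-free stretch-and-squeeze."""
    new_depths = np.interp(depths, old_tops, new_tops)
    return new_depths, props                   # properties are untouched

depths = np.linspace(1000, 1200, 5)            # sample depths in the old model
porosity = np.array([0.18, 0.21, 0.25, 0.22, 0.15])
old_tops = [1000, 1100, 1200]                  # boundaries before the update
new_tops = [1000, 1130, 1210]                  # boundaries after new well data

nd, phi = restretch(depths, porosity, old_tops, new_tops)
print(np.round(nd, 1), phi)                    # geometry moved, properties intact
```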
Steen Agerlin Petersen (Statoil/TUD) wants to go back in time, to when the dinosaurs were around. Current static seismic cube interpretation works, but it ignores geological processes and leaves a lot of the information in the seismic data untapped. Enter ‘earth recursion,’ or data restoration/model restoration (DR/MR). DR/MR moves back and forward in time, laying down and eroding sedimentary units and adding diagenesis, to end up with what is observed in the seismics. The DR/MR modeler can fault and fold beds and switch between reflectivity and seismics. This leads to dual ‘contexts,’ reality and simulation, that develop over time and intersect at the present, where the model meets reality in the form of rock properties, seismics and logs. Petersen is now working to extend the method to planning and flow simulation. A North Sea example showed how interpretation was facilitated by going back in time and interpreting the seismics ‘as the dinosaurs would have seen them!’ DR/MR is claimed to be a great integration workflow for people and disciplines.
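A crude numerical analogue of ‘going back in time’ is flattening the seismic on an interpreted horizon, so that beds are viewed in approximately their depositional position. A minimal sketch, nothing like the full DR/MR machinery, with stand-in data:

```python
import numpy as np

def flatten(section, horizon):
    """Shift each trace so the picked horizon becomes flat, i.e. view
    the section roughly as it was at the horizon's depositional time.
    section: (n_traces, n_samples); horizon: pick per trace, in samples."""
    datum = int(horizon.max())
    out = np.zeros_like(section)
    for i, pick in enumerate(horizon.astype(int)):
        out[i] = np.roll(section[i], datum - pick)  # simple shift; real
    return out                                      # restoration resamples

section = np.random.rand(4, 50)                # stand-in seismic section
horizon = np.array([20, 22, 25, 23])           # interpreted horizon picks
flat = flatten(section, horizon)               # 'as the dinosaurs saw it'
```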
On the exhibition floor we saw the latest release of Landmark’s DecisionSpace Desktop, with a rather compelling capability for planning and executing multiple horizontal wells for unconventional development. The idea is to move from a blind ‘factory drilling’ concept to a more adaptable approach, targeting shale sweet spots with a holistic integration of all available information, from real time data to GeoEye satellite imagery. For Landmark, this means spanning the traditional Engineering Data Model/OpenWorks divide. These two data sources will soon share a logical data model, currently being developed in an internal project named ‘Common Ground.’
Schlumberger’s Petrel 2012 demos were spectacularly well attended. Petrel 2012 includes just about anything you could think of—with new functionality integrating seismic processing with interpretation for ‘seismic-driven reservoir modeling.’ Even the Schlumberger/Chevron developed ‘Intersect’ high-end reservoir flow simulator is now accessible from Petrel. All of the above and more is now tied together with Petrel Studio—with data stored in a ‘Studio Knowledge Database.’ Schlumberger’s GUI specialists are working on an Office 2010-style ‘ribbon’ interface for the next major release.
© Oil IT Journal - all rights reserved.