Around 7,500 attended the 90th Society of Petroleum Engineers Annual Technical Conference and Exhibition in Amsterdam in October. The opening plenary on ‘affordable energy’ was a rather lacklustre debate that touched on climate change, growth and sustainability, a ‘triple dilemma’ with no easy solution. The IEA’s Christian Besson observed that energy supply is increasing while demand is dropping, a situation that may last a while. ExxonMobil’s Neil Duffin spoke of a ‘chain reaction’ as the falling oil price impacts major projects. So far this is a short-term phenomenon affecting mostly smaller companies, although all are seeking more efficiencies. Technip’s Philippe Barril sees integrating teams and relocating work away from high-cost environments as one solution. On the question of standards, Duffin called for ‘standards that work,’ not ones where ‘changing one spec means that we are no longer within the standard!’ Barril concurred that standardized design can help cut costs, as would reducing paperwork and bureaucracy. Besson saw carbon capture and storage as essential to moderating climate change, but Duffin warned that high costs and uncertain regulation are problematic.
Another session reflected on the oil and gas industry’s public image. According to Deborah Shields (Colorado State University), the lack of a ‘social license’ for oil and gas development (especially fracking) is making things difficult. This is a ‘wicked problem’ with no easy solution. Shields cited the work of Lausanne’s sustainable energy systems unit and Braden Allenby’s book. The oil and mining industries have ‘spent years turned inwards, speaking to themselves.’ What is needed is a change of tone in communications with government and the public. Pete Smith (Aberdeen University) pitched in with more climate doom and gloom. Since the failure in 2000 of the climate change agreement, greenhouse gas emissions have accelerated. We now need more investment in energy efficiency and in power plants that sequester CO2. Unabated emissions through 2030 will make for ‘overshoot’ and require an expensive negative transition. Abatement must start now, with more nuclear and renewables and a phase-out of fossil fuels by 2050. For Smith, the oil and gas industry is ‘alongside the tobacco and arms industries in its level of negative perception.’ Yes, this is the SPE!
A special session on aging assets in the North Sea heard from ConocoPhillips’ John Hand on the venerable Ekofisk field, which has been producing for 40 years and will likely go on for another 40 (that’s beyond 2050!). Recovery has risen from 15% to 70%, and production has caused the sea floor to subside by nine meters, sinking structures and buckling wells. Injection is necessary but causes ‘water weakening’ in the overburden. To see where the water is going, a permanent fiber optic/satellite link provides 4D seismic monitoring. More fiber provides downhole surveillance, ‘listening’ to wells. 3D geomechanical models help with understanding overburden and wellbore stresses. A culture of performance and continuous improvement means that high-end technologies are being applied to smaller and smaller drilling targets. The ‘integrated operations’ approach transfers easily to smaller fields.
Notwithstanding the politics, industry continues to advance on multiple technology fronts. Shell provided an update on assisted history matching with 4D seismic. Currently this frequently fails for lack of information. Enter ‘model maturation,’ a way of including prior information, such as faults and aquifers, that is otherwise left out of the model. History matches on local gridblocks pinpoint model flaws and update the model. The approach is now used across Shell’s worldwide operations.
A Decision Strategies presentation busted the ‘myth’ of sweet spot exploration. For high-variability reservoirs, exploring for sweet spots is ‘inefficient and destroys value.’ While service companies are keen to sell techniques to identify sweet spots, it is better to follow a methodology that proves a project viable and to avoid ‘gaming’ exploration with unrepresentative wells. Later in the development cycle, operators can home in on sweet spots to assure early cash flow. But a thorough ‘value of information’ analysis should be applied to additional techniques because ‘some aren’t worth the expense.’
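The ‘value of information’ test amounts to simple decision arithmetic: pay for extra data only if the expected value of the decision made with the information exceeds the expected value without it, net of the data’s cost. A minimal sketch, with all probabilities and dollar figures hypothetical:

```python
# Value-of-information sketch. All figures ($MM, probabilities) are invented
# for illustration; a real VOI study would use the project's own economics.

def expected_value(p_success, value_success, value_failure):
    # Expected monetary value of drilling given a success probability.
    return p_success * value_success + (1 - p_success) * value_failure

# Base case: drill without the extra survey (30% chance of a $100MM success,
# otherwise a $20MM dry-hole loss).
ev_without = expected_value(0.3, 100.0, -20.0)

# With a $5MM survey that (idealized) perfectly discriminates good acreage:
# drill only on a positive result (probability 0.3), walk away otherwise.
ev_with = 0.3 * expected_value(1.0, 100.0, -20.0) + 0.7 * 0.0 - 5.0

voi = ev_with - ev_without  # positive -> the survey is worth buying
```

With these made-up numbers the survey adds value; a less discriminating survey, or a cheaper dry hole, can easily flip the sign, which is the presenter’s point that ‘some aren’t worth the expense.’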
The thorny topic of reserves reporting was addressed in a joint SGS Horizon/University of Houston presentation analyzing recent SEC reporting guidance. Over the last five years, most SEC comments have revolved around the reporting of proven undeveloped reserves (PUDs) and how to interpret ‘materiality’ or ‘significance’ in reporting. PUDs must be developed within five years of reporting, an issue of some importance in North American shale plays where some reserves will likely have to be de-booked. ‘Undeveloped’ acreage must have a realistic program for its development in the ‘near term,’ i.e. within three years. Such ‘simple’ requirements are not met by many companies.
A joint presentation from BP, Shell, Total and the University of Houston described the use of reservoir simulation in estimating (and reporting) reserves. The 2009 SEC modernization of reserves reporting allowed for the use of ‘a grouping of technologies which may include computing,’ opening the door for reservoir simulators to be used in reserves estimation. Previously the SEC only allowed for a deterministic approach. Modeling includes volume calculations, selection of analogs and decline curve analysis, all combining to offer ‘reasonable certainty’. A new framework is proposed for ‘evidence-based reserves classification,’ a systematic approach to assuring that model-based reserves estimates meet standards of reasonable certainty. The framework includes determination of a production mechanism, evidence for static and dynamic reservoir performance, history match, analogs, sensitivities and documentation.
In the digital energy session, a Halliburton paper showed how data-driven predictive analytics can be used to estimate downhole temperatures while drilling. Here a ‘support vector machine’ running atop a Hadoop file system was used to ‘disentangle’ the complex relationships between various drilling parameters (RPM, weight on bit, mud flow rate) and formation temperature. The approach works in deviated wells but fared less well on horizontal wells. For Halliburton, the oilfield’s digital revolution is unfinished; big data and analytics will be the next phase of the digital transformation.
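Halliburton’s Hadoop-based stack is proprietary, but the underlying regression idea can be sketched with an off-the-shelf support vector regressor. Everything below (the parameter ranges and the synthetic temperature relationship) is invented for illustration, not taken from the paper:

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(0)
n = 200

# Synthetic drilling parameters (ranges are illustrative only).
rpm  = rng.uniform(50, 200, n)    # rotary speed
wob  = rng.uniform(5, 30, n)      # weight on bit, klbf
flow = rng.uniform(300, 900, n)   # mud flow rate, gpm

# Invented 'true' formation temperature (degC): warmer with RPM and WOB,
# cooled by circulation, plus measurement noise.
temp = 60 + 0.1 * rpm + 0.8 * wob - 0.02 * flow + rng.normal(0, 1.0, n)

# Fit an RBF support vector regressor on standardized features.
X = np.column_stack([rpm, wob, flow])
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=100.0)).fit(X, temp)

pred = model.predict(X)
rmse = float(np.sqrt(np.mean((pred - temp) ** 2)))  # in-sample fit quality
```

A production system would of course validate on held-out wells; the paper’s finding that horizontal wells fit poorly suggests the parameter–temperature relationship changes regime there.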
Another Halliburton presentation has it that digital energy is at a ‘strategic inflection point’ amid a challenging competitive environment. Factory drilling represents a fundamental shift from exploration to production. Enter ‘operational intelligence’ that ‘tracks the small stuff and handles the operational graffiti,’ leaving professionals to do their stuff. OVS Group and Platts also got a plug.
Technical data management (TDM) now underpins Shell’s wells, reservoir and facilities management (Wrfm) program. Wrfm sets out to maximize production from existing assets; TDM addresses issues such as data being hard to find and poor data ownership. Following a pilot in three assets, TDM is now being rolled out globally, along with critical data catalogues and data quality standards. Shell now employs TDM subject matter experts, while a ‘lean’ data management facility in Asia can be called on for peak-load handling.
If everything were done sequentially, it would take two years to drill and complete a typical 26-well pad in Shell Canada’s Groundbirch shale play. Enter Simops (simultaneous operations), with up to eight frac jobs per day. All enabled by an acronym soup of Simops, Concops (concurrent operations) and Mopo, a matrix of permitted operations that leverages Wwims, a ‘wells worksite instructions manual!’
A ConocoPhillips study of 15 years of injection into shale formations on the Norwegian continental shelf (NCS) has implications for shale development. Norway has long injected well cuttings into low-permeability shales for disposal. The good news is that fracs can be induced with relatively low injected volumes, and that injecting increasingly large volumes with a period of shut-in can create secondary fracs around the primaries. The bad news is that much of the fracturing is aseismic, meaning that microseismic monitoring may give only a very partial picture of frac formation.
While it was something of a sales pitch masquerading as a paper, Thinklogical made a reasonable case for combining fiber optic communications with its keyboard-video-mouse extender to facilitate remote operation of control rooms and real-time operating centers.
Kuwait Oil Co. presented results from its Sahala/Sabriyah digital oilfield pilot. Here a model update and ranking methodology has been developed to optimize the waterflood using a 1.4-million-cell model. The approach is said to simplify engineers’ workflows and facilitate onboarding of young professionals.
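KOC’s update-and-ranking methodology is not spelled out in the talk summary, but the ranking step in such workflows is typically a misfit calculation: score an ensemble of model realizations against observed production and sort best-first. A generic sketch, with all data synthetic:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical observed water cut over 24 reporting periods.
history = np.linspace(0.1, 0.6, 24) + rng.normal(0, 0.02, 24)

# Ensemble of 50 simulated model responses to the same injection schedule
# (here just the trend plus model-to-model scatter).
realizations = np.linspace(0.1, 0.6, 24) + rng.normal(0, 0.1, (50, 24))

# Sum-of-squares misfit of each realization against the history.
misfit = np.sum((realizations - history) ** 2, axis=1)

# Rank realizations best-match-first; top models drive the flood update.
ranking = np.argsort(misfit)
```

Real implementations weight the misfit by measurement uncertainty and by well or phase, but the sort-by-misfit core is the same.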
A joint presentation from Chevron and the University of Southern California/CiSoft showed how time series ‘shapelets,’ a ‘new kind of wavelet,’ are used to predict equipment failure from oilfield sensor data. Time series data from electrical submersible pumps is used to predict failure with a ‘process-oriented event model.’ The approach (like many before it) faced problems with failed or failing meters, addressed by using timestamps of the ‘last good (meter) scan.’ The approach is said to be ‘faster than machine learning.’
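A shapelet is a short, discriminative subsequence; a sensor trace is flagged when its minimum sliding-window distance to the shapelet falls below a threshold, which is why matching is cheap compared with training a full classifier. A minimal sketch (the spike-shaped shapelet and the two pump traces below are invented, not Chevron’s data):

```python
import numpy as np

def min_dist(series, shapelet):
    # Minimum Euclidean distance between the shapelet and every
    # equal-length sliding window of the series.
    m = len(shapelet)
    return min(np.linalg.norm(series[i:i + m] - shapelet)
               for i in range(len(series) - m + 1))

# Hypothetical shapelet: a brief current spike seen before pump failures.
shapelet = np.array([0.0, 0.5, 2.0, 2.0, 0.5, 0.0])

healthy = np.zeros(50)                              # flat trace, no spike
failing = np.zeros(50)
failing[20:26] = [0.0, 0.5, 2.0, 2.0, 0.5, 0.0]     # spike embedded at t=20

threshold = 1.0
alarm_failing = min_dist(failing, shapelet) < threshold  # raises an alarm
alarm_healthy = min_dist(healthy, shapelet) < threshold  # stays quiet
```

In practice shapelets are learned from labeled failure histories and the threshold is tuned to trade false alarms against missed failures.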
A group led by YPF presented on fault diagnosis of progressive cavity pumps. Real-time pump sensor data trends were compared with a fault database. The approach has been proven on simulated faults and will now be tested in the field.
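The matching step in such a scheme can be sketched as a nearest-neighbor lookup against a library of fault signatures. The fault names, signature vectors and observed trend below are all hypothetical placeholders, not YPF’s database:

```python
import numpy as np

# Hypothetical fault signature library: each entry is a feature vector of
# sensor trend slopes (e.g. torque, discharge pressure, flow) for a known
# failure mode of a progressive cavity pump.
fault_db = {
    "rod_wear":    np.array([ 0.8, -0.1,  0.0]),
    "gas_lock":    np.array([-0.2, -0.9,  0.4]),
    "stator_fail": np.array([ 0.9,  0.6, -0.7]),
}

def diagnose(trend):
    # Return the fault whose signature is nearest (Euclidean) to the trend.
    return min(fault_db, key=lambda k: np.linalg.norm(fault_db[k] - trend))

# Observed real-time trend vector from a (simulated) misbehaving pump.
obs = np.array([-0.25, -0.85, 0.35])
diagnosis = diagnose(obs)
```

Field deployment would add a no-match threshold so genuinely novel faults are escalated to an engineer rather than forced into the nearest known class.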
© Oil IT Journal - all rights reserved.