EAGE 2017, Paris

Doom and gloom as plenary hears the lower for longer mantra. Seismic industry in peril! Spectrum Exploration: ‘now is the time for ultra-deep water exploration.’ BP’s automated life-of-field workflows. Ikon on geophysics in drilling. DMAKS and sedimentological metadata. Total’s Metis fantasy seismics and Alternative Subsurface Data wireline testing facility. Schlumberger models in the cloud. Efficient, portable code on GPUs. Inside track on Halliburton’s digital transformation.

The plenary EAGE Forum’s theme, ‘new collaboration models in exploration,’ was a tough one since exploration has already tried pretty much all conceivable forms of collaboration. The panel therefore focused on the more pressing issue, survival, in what is now considered to be a ‘lower for longer’ oil price scenario.

ENI’s Luca Bertelli, while warning that ‘price forecasts are always wrong,’ opined that ‘low’ would likely last through 2018 and force a complete industry ‘reset.’ Bertelli is skeptical of unconventionals. A few shale operators are ‘making skinny margins in the best basins.’ Much has been made of efficiency gains, but these have now plateaued and there is little room for further improvement. In fact, costs are creeping up again. It is getting harder to justify all this to shareholders. Conventionals remain a better bet, especially when an unconventional mindset is applied to shorten exploration-to-production cycle times. Companies should select simple, conventional assets where infrastructure is in place, cooperate with geophysical contractors on multi-client surveys and ‘respect roles without pampering each other.’ ENI is working towards integration across all geoscience disciplines and is leveraging its in-house 10 petaflop compute power to mitigate risk with proprietary algorithms. Poster child for the near-field approach is the Nooros Egyptian gas development.

Jean-Georges Malcor could probably have done with a bit more ‘pampering’ from CGG’s clients. Malcor complained of unfair competition from companies with lax HSE standards. Massive cost reductions (50, 60, even 80%!) are all very good but are they sustainable? Geophysics is a very capital intensive industry and contractors can’t invest with a six month horizon. Senior-level talk of collaboration is not translating into procurement, which is all about lower price rather than value for money. Malcor is a relative newcomer to oil and gas. He initially expected a dynamic industry, but ‘not at all, ideas take too long to adopt!’ CGG now has 60 petaflops of compute power and ‘proposals for 100PF are on my desk!’

Eric Oswald (ExxonMobil) offered a tale of non-cooperation at Shell’s expense! ExxonMobil’s Liza oilfield in Guyana was the result of plugging ahead on a ‘super high risk’ play that partner Shell decided to exit. The discovery well was followed by a ‘rush’ 17,000 sq. km survey, performed by CGG, the largest proprietary survey Exxon had ever done. Speedier decision making made for first oil in under four years from the final investment decision. ‘We wondered why it took so long before!’

EAGE has made it hard to interact with speakers. Questions are now posed through a clunky app. This proved a waste of time anyhow, as our questions, (a) why does ENI use in-house petaflops instead of CGG’s excess capacity, and (b) how much of the Liza ‘risk’ did Exxon manage to offload to CGG and other suppliers, were not deemed fit to ask.

Malcor observed that while geophysics was an important part of the ‘food chain’ it was nowhere in the ‘value chain’. ‘You buy the survey and after, we disappear!’ He further wondered if it might be possible to ‘contribute a survey in return for an equity position in a future discovery.’

Schlumberger’s Ashok Belani confirmed that neither they nor CGG had made any money from seismics for a long time, ‘land seismics is essentially defunct!’ Multi-client surveys are sustaining the industry to some extent, but current levels of competition mean that ‘even multi-client will be unsustainable.’

Oswald observed that in 2040, oil and gas is still expected to be the dominant energy source. But this expectation will be hard to fulfil. We need new commercial models, innovation and capital. There are almost 10 million sq. km of ultra-deepwater (>3,000m) areas with over 3km of sediment. We need to figure out how to make these viable.

Returning to the collaboration topic, Belani described the depressing offshore situation. Of the 70 vessels in 2013, 1/3 are retired/junked, 1/3 idle/stacked. The remaining 1/3 are in the water with 70% working on in-house multi-client ‘for survival’ and the remainder on proprietary surveys. There is no real sign of anything changing ‘before 2020 or beyond.’ ‘No business can survive 13 years down.’

There is no call for further investment in boats, equipment or R&D. ‘Deepwater has a role, but it is not a pretty picture.’ On the positive side, Belani sees a ‘world of innovation opening-up with data,’ with inspiration coming from ‘GAFA*’ which are ‘doing unbelievable things.’ The cloud brings efficient, elastic HPC opportunities to seismic processing. Great things are happening in data science and machine learning. Seismic data in libraries is to undergo ‘innovative high-performance processing,’ reducing cycle times.

Recent activity in Mexico shows how collaboration with the regulator has made very advanced surveys available and shortened cycle times. With the cloud it will ‘technically be possible to process seismics in real time.’ This will enable ‘interpretation-based acquisition’ and make more deepwater viable. In the US Permian basin, seismics is now used to find sweetspots, although questions as to economic viability remain.

Howard Leach agreed: seismic technology has been a huge enabler for BP. But exploration is now shifting away from the frontiers and the next big resource, back to ‘basins that we know.’ This requires a shift from towed streamer to niche processing. On the plus side, rig rates are down, so we can drill more. All BP seismic is acquired by, and 90% of it processed by, contractors.

Neil Hodgson’s (Spectrum Exploration) paper was an impassioned and counter-intuitive argument in favor of ultra-deepwater exploration. The received view is that water depths of over 3,000 m are too deep to drill, that they are too expensive, high risk, and anyway there are no reservoirs, traps or mature source rock. For Hodgson these are ‘all lies that are poisoning our industry’s future.’ Total’s Raya 1, drilled in 3,411m of water, proved that this is technically feasible. Today’s $200,000 drill ship day rates mean that these wells are ‘getting really cheap’ and ‘we will be drilling in 4,000 m of water by 2020.’ In the Atlantic, source rocks are ‘just about everywhere.’ But best of all, when you look at a cross section of the Atlantic basin in depth (as opposed to time), the whole ocean becomes a huge turbidite stratigraphic trap with a ‘zero chance of failure.’ $40-50 oil represents a great time to go looking for these massive prospects. ‘If you give engineers a billion barrels in 3,000 m of water they will find a way of developing it cheaply.’

Tom Hance described BP’s automated workflow for the integration of 4D seismics with history matching. 4D AHM (assisted history match) performs cell-by-cell computation of 4D properties, using a batch process to compare synthetic seismograms with 4D data. Using data from BP’s Life of Field permanent array on the West of Shetland Clair field, Hance showed how the approach is used to tweak parameters such as fault transmissibility for a decent match. The functionality was implemented in BP’s go-to integration tool, DGI’s CoViz 4D.
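
BP’s 4D AHM implementation lives inside CoViz 4D and was not shown in detail. Purely as a minimal sketch of the cell-by-cell idea (the function names, the forward_model callable and the candidate-parameter loop below are our own illustration, not CoViz 4D’s API), the comparison might look something like this:

    import numpy as np

    def cell_misfit(synthetic_4d, observed_4d, weights=None):
        # Per-cell squared residual between the synthetic 4D attribute
        # (forward-modelled from the simulation) and the observed 4D data,
        # both on the same (nx, ny, nz) grid.
        residual = synthetic_4d - observed_4d
        if weights is None:
            weights = np.ones_like(residual)
        return weights * residual ** 2

    def objective(synthetic_4d, observed_4d, weights=None):
        # Scalar objective for the assisted history match: lower is better.
        return float(np.sum(cell_misfit(synthetic_4d, observed_4d, weights)))

    def run_ahm(candidates, forward_model, observed_4d):
        # Batch loop over candidate parameter values (e.g. fault transmissibility
        # multipliers). In a real AHM each candidate triggers a reservoir
        # simulation plus a 4D seismic forward model.
        scores = {c: objective(forward_model(c), observed_4d) for c in candidates}
        return min(scores, key=scores.get), scores

The value of the automation lies in driving the batch of simulation runs and 4D comparisons, rather than in the arithmetic itself.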

On the afternoon of the first day of the conference there was a total electricity failure in the lecture area. The lectures were rescheduled for the following day, but not all speakers showed up. A big fail for Paris’ main exhibition center and for the EAGE, unfortunately.

An audience of half a dozen or so showed up for Sophie Cullis’ (Leeds University) talk on metadata approaches and their effects on deep-marine system analysis. Unfortunately, this was less about sharable metadata and more of a quasi-commercial plug for the use of DMAKS, the ‘Deep-Marine Architectural Knowledge Store.’ DMAKS is a relational database with controls to ensure ‘consistent data entry.’ This is said to overcome barriers to research deriving from the variety of classification schemas, interpretations and a ‘terminological minefield.’ Leeds has standardized and captured facies types from the peer-reviewed literature, which are stored in hierarchical and spatial context.

As the seismic business withers, some geophysicists are applying their know-how to help drillers work in difficult terrain such as Myanmar. Here, as Ikon’s Alex Edwards related, Ophir Energy was trying to get a better understanding of pore pressures. Earlier wells in Myanmar had gone ‘horrendously wrong,’ with shallow gas kicks and lost wells. Paradoxically, other deep wells were drilled with no overpressure at all. Ikon has modeled the basin’s evolution through time, rolling in drillers’ reports, geothermal measurements and seismic velocities. These are delivered as pore pressure gradient forecasts for use in well planning.
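
Ikon’s basin-modelling workflow was not disclosed, but the textbook route from seismic velocity to a pore pressure estimate is Eaton’s ratio method. A minimal sketch follows; the exponent of 3 is the usual default and the numerical values are invented for illustration:

    def eaton_pore_pressure(obg, p_hydro, v_obs, v_normal, exponent=3.0):
        # Eaton's method: pore pressure gradient (same units as the input
        # gradients, e.g. psi/ft) from the ratio of the observed interval
        # velocity to the normal compaction trend velocity at that depth.
        return obg - (obg - p_hydro) * (v_obs / v_normal) ** exponent

    # Example: 1.0 psi/ft overburden, 0.465 psi/ft hydrostatic gradient and an
    # observed velocity 10% slower than the normal trend implies overpressure.
    pp = eaton_pore_pressure(obg=1.0, p_hydro=0.465, v_obs=2700.0, v_normal=3000.0)
    print(f"Estimated pore pressure gradient: {pp:.3f} psi/ft")  # about 0.61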

In these days of doom and gloom it was nice to see that someone is looking to the future of land seismics. Total’s Metis concept targets foothills acquisition with a swarm of drones delivering ‘biodegradable’ geophone darts. Robotic self-driving vehicles do the shooting and data is transmitted wirelessly back to camp, leveraging a communications airship (what else!). Partners in the Metis concept are Flying Whales and Wireless Seismic. When the Metis crew is through, the camp HQ, built entirely with edible material, is cooked up for a big feast (only kidding!)

More prosaically, Total has opened a real-world test center in Pau, France for calibrating logging tools. The Alternative Subsurface Data facility is a three-story building housing a 9 m high silo in which large, standard octagonal rock slabs can be stacked to provide a testbed for third party logging devices. Logs can be certified to ISO 17025/Cofrac standards. The ASD was developed in collaboration with SEMM Logging to provide Total with its own independent verification of logging contractors’ claims and to open up the market to smaller regional logging companies.

In the high-performance computing session, James Hobro outlined Schlumberger’s deployment of models to the cloud (an inside track on Delfi perhaps - see page 1). Cloud computing (mainly seismic modelling) brings resource elasticity. A 1,000 hour job running on a single core can be run in an hour with the simple expedient of using 1,000 cores in the cloud! In seismic imaging, however, scaling is limited by contention (of cores for main memory, or of blades for the network) and by ‘incoherency,’ delays as parts of the system wait for each other. Tests of finite difference wave equation modelling leveraged C++14 code, Intel’s Threading Building Blocks and MPI. Hobro spoke of the need to change from sequential to ‘actor-based’ programming, an asynchronous approach comparable to ‘kids programming in Scratch.’
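
Hobro’s code is C++14 with TBB and MPI and was not shown. Purely as a language-agnostic sketch of the asynchronous idea, processing each shot’s modelling result as soon as it arrives rather than waiting on a global barrier, something like the following (with stand-in functions in place of the real solver and imaging steps) captures the spirit:

    from concurrent.futures import ProcessPoolExecutor, as_completed

    def model_shot(shot_id):
        # Stand-in for a finite difference forward model of one shot; in a
        # cloud deployment this would be a solver instance on its own node.
        return shot_id, f"wavefield for shot {shot_id}"

    def image_shot(result):
        # Stand-in for the imaging/summation step applied to one shot's output.
        shot_id, _wavefield = result
        print(f"imaged shot {shot_id}")

    if __name__ == "__main__":
        shots = range(1000)  # cf. the '1,000 cores for an hour' elasticity argument
        with ProcessPoolExecutor() as pool:
            futures = [pool.submit(model_shot, s) for s in shots]
            # Asynchronous, actor-like consumption: handle results in completion
            # order instead of making every shot wait for the slowest one.
            for fut in as_completed(futures):
                image_shot(fut.result())

The point is the scheduling style rather than the toy code: downstream work starts as soon as any upstream result is ready, which is what keeps an elastic pool of cloud cores busy.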

Tilman Steinweg (Karlsruhe Institute of Technology) described the challenge of writing efficient seismic code for the GPU. While many leverage Nvidia’s proprietary CUDA, Steinweg was looking for a hardware-independent solution. Enter LAMA, the Fraunhofer Institute’s library for accelerated math applications, an open source solution that runs across CPU, Nvidia GPU and Xeon Phi. A benchmark performed on the Jülich Supercomputing Centre’s JURECA Nvidia Tesla-based supercomputer showed good speedup over a comparable CPU-only machine.

We got a one-on-one with Landmark in their very modest booth, where we heard an exposé on Halliburton’s ‘digital transformation.’ The company claims to have transformed itself and is now offering advice and services to clients. There is a difference between digitization and digitalization. Digitization has been an ongoing development since the 1970s. Digitalization is more recent and implies business transformation. In the USA, you can take a photo of a check with your iPhone and the money is in your account. Likewise the Amazon store has ‘no lines, no checkout, no cash, seriously.’ In Norway, there are now no fences; goats are corralled with geofencing and electric collars, ‘doing sheepdogs out of business.’

How will this translate to oil and gas? It will be through the widespread application of stuff like computer vision, data science and machine learning. Digital E&P ‘aligns the field with the boardroom’ and helps ‘translate decisions into actions.’ Landmark’s OpenEarth is a poster child for the transformation, to which DevOps best practices and DecisionSpace code have been contributed. The idea is to allow users to share code (e.g. for automated fault identification) with the community while retaining their intellectual property. This is ‘not to be confused with a marketplace’ (read Schlumberger’s Ocean!). Poster child for OpenEarth is Anadarko. In a production context, the Landmark Field Appliance provides connectivity from IoT/field sensors into the cloud (see the lead in this issue). Like GE, Halliburton is now proposing ‘utilization-based’ maintenance on its pumps and downhole tools, leveraging a digital twin. Elsewhere, Halliburton’s IoT cameras and sensors deployed at the well site detect fugitive emissions and intruders.

* Google, Amazon, Facebook, Apple.

