EAGE 2016, Vienna

‘This time is different.’ Even with $100+ oil, discoveries failed to match consumption. Now, sub-$50 oil hits the service sector hard. The downturn may be an opportunity for something new - machine learning, automated interpretation, fractals and high-end computing. Aramco floats upstream ‘W3 Prov’ standard. Schlumberger struts its stuff while Halliburton is a no-show!

The 2016 meeting of the European Association of Geoscientists and Engineers, held earlier this year in Vienna, Austria, was a subdued affair. In the opening plenary, Joseba Murillas (Repsol) offered an even-handed but bleak view of the current industry situation. Exploration budget cuts mean lost jobs, especially in the US and in the service sector. National oil companies and producing countries are feeling the squeeze. For consumers the news is good. Refiners are doing well and, at least for integrated oils, ‘still paying our salaries!’ Older employees tend to think this is ‘just another cycle.’ Others believe ‘this time is different’ and envisage a ‘deep transformation of the industry.’

Heiko Meyer (Wintershall) observed that CO2 and local pollution in megacities were key issues although coal is ‘hard to beat on price’ and oil is ‘hard to replace in transport, despite electrics.’ ‘Our children are less tolerant to our industry.’ ‘We need to fight for natural gas as a transition fuel.’ Murillas asked how many oils were working in renewables.

Most have some activity here although, as Tim Dodson (Statoil) said, ‘We struggle with this. We are in competition with renewables but also need to be part of the action.’ Statoil has a separate renewables business, which is capital intensive but made up of smaller projects. This all feeds back into ‘what’s different this time’ as renewables get competitive. Oil and gas execs need to do a better job of explaining our primary task. Only 4% of oil goes to power generation and much of the rest is used as feedstock that does not create any emissions. Meyer observed that ‘industry missed a trick with Fukushima.’ Instead of a dash for gas, the Japanese built 40 coal-fired power stations.

The discussion then turned to the future of exploration. Luca Bertelli (ENI) thinks it will be hard to make exploration ‘sustainable.’ Although the oil price ‘can’t stay this low for long,’ it may not get back to $100. Is $30 to $50 an anomaly? Bertelli thinks that the anomaly was the last five years of super-high prices: ‘In 2006, $30/40 was the norm, industry was making money and everyone was happy!’ What has changed is the cost base. The oil price is down 70% but costs are only down 25%, in part because of a portfolio mix that includes more high-cost deepwater plays.

Unsurprisingly, 2016 discoveries are running at a six-decade low, at around one third of consumption. But even in 2013/14, when global exploration spend was at a peak, ‘we did not deliver.’ ‘Industry can’t discover the equivalent of worldwide consumption, even at $100.’ There was a crisis in exploration before the price drop. We need to rethink how we explore, to find new basins, new ideas and reset our costs.

Ceri Powell (Shell) sees light at the end of the tunnel. Shell has rebased costs dramatically, with a 50% year-on-year reduction in the Gulf of Mexico.

Dodson was less sanguine, observing that the fields developed at $100 were mostly ‘3rd and 4th quartile 1970/80s discoveries, with short to no plateaus,’ representing ‘no more than a blip on the curve.’ ‘We are all struggling with our resource base, our projects are at the wrong end of the cost curve. Unconventionals will never be high quality assets. More and better new ideas are needed.’

Jean-Georges Malcor (CGG) is focused on cost reduction. CGG is now 50% down on cost compared to 2013. But, ‘is this sustainable? Nobody is making money, not even covering the cash cost.’ ‘Our shareholders don’t like our capital intensive industry with little visibility. We need to work together for better visibility. Five years for me is game over!’

On the ‘new ideas’ front Powell cited Shell’s ‘heartlands’ activity in mature, well-known basins since 2005, driven in part by a low-tech ‘rejuvenation opportunity now’ (RON) program that replaces interpretation workstations with ‘Mylar and colored pencils.’ The RON approach combines global exploration savvy with deep local knowledge. Elsewhere, the low cost environment has allowed for huge basin-wide 3D acquisition such as the Sarawak Broadseis 3D.

The theme of doing more with less is seen by many as an opportunity to take a closer look at the data with a variety of novel(ish) techniques. For the (mostly) geophysicists of the EAGE this means applying machine learning to seismic interpretation. Anders Waldeland (University of Oslo) has used machine learning to automate the identification of salt bodies in seismic reflection data from a variety of 3D attributes. A simple ‘nearest mean’ metric, trained on data from inside a known salt plug, was used to classify the volume. A plot of coherence vs. grey level co-occurrence matrix was used to determine the salt boundary. A North Sea dataset was interpreted successfully using a Gulf of Mexico-derived classifier. The basic technique is not exactly new; one reference dates back to 1973 (EarthDoc 85125).
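
The mechanics are simple enough to sketch. Below is a minimal, hypothetical illustration of nearest-mean classification on per-voxel attribute vectors; attribute names, array shapes and the toy data are our own and this is not Waldeland’s code. The mean attribute vector of a labeled salt region is the ‘training,’ and a voxel is classed as salt if it lies closer to that mean than to the background mean.

    import numpy as np

    def nearest_mean_salt_mask(attributes, salt_mask, background_mask):
        """Classify every voxel as salt / not-salt by nearest class mean.

        attributes      : (nx, ny, nz, n_attr) array of seismic attributes
                          (e.g. coherence, GLCM texture) -- illustrative only.
        salt_mask       : boolean array marking a known salt plug (training data).
        background_mask : boolean array marking known non-salt voxels.
        """
        attr = attributes.reshape(-1, attributes.shape[-1])
        mu_salt = attr[salt_mask.ravel()].mean(axis=0)
        mu_back = attr[background_mask.ravel()].mean(axis=0)
        # Euclidean distance to each class mean; the nearest mean wins.
        d_salt = np.linalg.norm(attr - mu_salt, axis=1)
        d_back = np.linalg.norm(attr - mu_back, axis=1)
        return (d_salt < d_back).reshape(attributes.shape[:-1])

    # Toy usage: random 'attributes' with a brighter central blob standing in for salt.
    cube = np.random.rand(64, 64, 64, 2)
    cube[20:40, 20:40, 20:40] += 1.0
    known_salt = np.zeros(cube.shape[:-1], bool); known_salt[25:35, 25:35, 25:35] = True
    known_back = np.zeros(cube.shape[:-1], bool); known_back[:10, :10, :10] = True
    mask = nearest_mean_salt_mask(cube, known_salt, known_back)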

Muhammed Shafiq (Georgia Institute of Technology) has evaluated five ‘perceptual and non-perceptual’ measures of textural dissimilarity to develop a 3D ‘gradient of texture’ metric. The perceptual measures are ‘consistent with human perception’ and, in tests on a North Sea dataset, proved both computationally more efficient and better at delineating salt domes than the non-perceptual ones (EarthDoc 85267).
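
For readers who want to experiment, the non-perceptual end of this spectrum is easy to reproduce with off-the-shelf tools. The sketch below is our own illustration, not Shafiq’s code: it computes a grey level co-occurrence matrix texture attribute in sliding windows over a 2D section with scikit-image, then takes the difference between adjacent windows as a crude ‘gradient of texture.’

    import numpy as np
    from skimage.feature import graycomatrix, graycoprops

    def glcm_contrast_map(section, win=16, levels=32):
        """Windowed GLCM contrast over a 2D seismic section (illustrative only)."""
        # Quantize amplitudes to a small number of grey levels.
        q = np.digitize(section, np.linspace(section.min(), section.max(), levels)) - 1
        q = q.clip(0, levels - 1).astype(np.uint8)
        out = np.zeros((q.shape[0] // win, q.shape[1] // win))
        for i in range(out.shape[0]):
            for j in range(out.shape[1]):
                patch = q[i*win:(i+1)*win, j*win:(j+1)*win]
                glcm = graycomatrix(patch, distances=[1], angles=[0, np.pi/2],
                                    levels=levels, symmetric=True, normed=True)
                out[i, j] = graycoprops(glcm, 'contrast').mean()
        return out

    section = np.random.randn(256, 256)          # stand-in for a seismic section
    texture = glcm_contrast_map(section)
    # Crude texture 'gradient': dissimilarity between laterally adjacent windows.
    gradient = np.abs(np.diff(texture, axis=1))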

Hendrik Paasche (Helmholtz Institute) used data-driven inversion of near surface geotechnical data (logs, seismic, radar) to link shear wave velocity and sleeve friction (a measure of soil strength). Data-driven concepts based on fuzzy sets, with no prior knowledge, did the trick, at least on a site-specific basis. Elsewhere, your mileage may vary. This is not exactly a killer app, but it did provide some insight into a ‘weak and as yet unrecognized physical link between electromagnetic wave propagation and sleeve friction’ (EarthDoc 85040).
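
Paasche’s machinery is beyond the scope of a news item, but the flavor of a fuzzy-set, prior-free mapping can be conveyed with a zero-order fuzzy model: Gaussian membership functions partition shear wave velocity into overlapping classes, each class carries the mean sleeve friction of its training samples, and a prediction is the membership-weighted blend. A hypothetical sketch on synthetic data, not the published method:

    import numpy as np

    def fit_fuzzy_model(vs, friction, n_sets=3):
        """Fit Gaussian fuzzy sets over Vs and a mean friction per set (toy model)."""
        centers = np.linspace(vs.min(), vs.max(), n_sets)
        width = (vs.max() - vs.min()) / n_sets
        # Membership of each training sample in each fuzzy set.
        mu = np.exp(-0.5 * ((vs[:, None] - centers[None, :]) / width) ** 2)
        # Per-set output = membership-weighted mean of observed sleeve friction.
        set_values = (mu * friction[:, None]).sum(axis=0) / mu.sum(axis=0)
        return centers, width, set_values

    def predict(vs_new, centers, width, set_values):
        mu = np.exp(-0.5 * ((vs_new[:, None] - centers[None, :]) / width) ** 2)
        return (mu * set_values).sum(axis=1) / mu.sum(axis=1)

    # Synthetic co-located measurements standing in for the real site data.
    vs = np.random.uniform(100, 400, 200)                    # shear wave velocity, m/s
    friction = 0.02 * vs + np.random.normal(0, 2, vs.size)   # sleeve friction, arbitrary units
    model = fit_fuzzy_model(vs, friction)
    estimate = predict(np.array([150.0, 300.0]), *model)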

Xavier Refunjol showed how Swift Energy has used simultaneous inversion and neural networks trained on log and core data to generate impedance, porosity and TOC volumes in the Eagle Ford shale play, studying the lateral and vertical variability of reservoir quality. Eight wells were used to train the system and results were validated by eliminating one well at a time and comparing the predicted and measured logs (EarthDoc 85014).
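
The leave-one-well-out validation is worth spelling out, since it is the honest way to test such predictions: each well in turn is held back, the model is trained on the rest, and the prediction is compared with the measured log. A generic scikit-learn sketch follows; the features, targets, well count and network architecture are invented for illustration and do not reproduce Refunjol’s workflow.

    import numpy as np
    from sklearn.model_selection import LeaveOneGroupOut
    from sklearn.neural_network import MLPRegressor
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    # X: seismic/log attributes per sample, y: target (e.g. TOC), wells: well id per sample.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(800, 4))
    y = X @ np.array([0.5, -0.2, 0.1, 0.3]) + rng.normal(0, 0.1, 800)
    wells = rng.integers(0, 8, 800)          # eight training wells

    scores = []
    for train, test in LeaveOneGroupOut().split(X, y, groups=wells):
        model = make_pipeline(StandardScaler(),
                              MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=2000))
        model.fit(X[train], y[train])
        # Compare predicted and measured values for the held-out well.
        scores.append(model.score(X[test], y[test]))
    print(np.mean(scores))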

Data management and standards were rather downplayed at this year’s conference. However, an interesting contribution from Aqeel Al-Naser (Manchester University and Saudi Aramco) showed how the World Wide Web Consortium’s ‘W3 Prov’ standard can be used to tag subsurface data objects with a provenance audit trail. Typical workflows span multiple interpretation systems and should ideally carry provenance metadata throughout. W3 Prov is a graph-based data model of information about entities (e.g. a horizon), activities (e.g. an interpretation) and agents (e.g. the interpreter). The prototype was implemented across a seismic-to-simulation workflow spanning Paradigm Echos, Petrel, Gocad and Aramco’s GigaPowers flow simulator. A mouse-over event pops up a text box with provenance information such as the date of interpretation and the name of the interpreter (EarthDoc 85402).
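
The PROV entity/activity/agent triple is simple to picture in code. The toy model below, written in plain Python as our own illustration rather than Al-Naser’s prototype, records that an interpreter produced a horizon through an interpretation activity and answers the kind of query that sat behind the mouse-over pop-up. All identifiers are hypothetical.

    from dataclasses import dataclass, field

    @dataclass
    class ProvRecord:
        # One provenance assertion in the spirit of W3C PROV:
        # entity (a data object), activity (what produced it), agent (who did it).
        entity: str
        activity: str
        agent: str
        attributes: dict = field(default_factory=dict)

    store = [
        ProvRecord(entity="horizon:TopBalder",
                   activity="interpretation:2016-03-14",
                   agent="interpreter:j.smith",
                   attributes={"tool": "Petrel", "date": "2016-03-14"}),
    ]

    def provenance_of(entity_id):
        """What a mouse-over pop-up might show for a tagged subsurface object."""
        return [r for r in store if r.entity == entity_id]

    for rec in provenance_of("horizon:TopBalder"):
        print(f"{rec.entity}: interpreted by {rec.agent} on {rec.attributes['date']}")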

High performance computing progress was reported by Hui Liu (University of Calgary) who used a 32k core IBM Blue Gene/Q supercomputer to accelerate large-scale reservoir simulations. A novel scheme allows simulations to be parallelized such that the simulator has linear scalability. Reservoir simulations ‘can be accelerated thousands of times using thousands of CPU cores.’ The huge memory bandwidth of such computers also allows extremely large reservoir models to be computed (EarthDoc 85027).
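
Liu’s scheme is not described in enough detail here to reproduce, but the basic pattern behind such scalability - splitting the reservoir grid across processes and exchanging only thin ‘halo’ layers each time step - can be sketched with mpi4py. This is a generic illustration under our own assumptions, not the paper’s code; run it with mpiexec across several ranks.

    import numpy as np
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank, size = comm.Get_rank(), comm.Get_size()

    # Each rank owns a slice of a 1D pressure grid -- a toy stand-in for a
    # domain-decomposed reservoir model. Two ghost cells hold neighbour values.
    n_local = 1000
    p = np.zeros(n_local + 2)
    p[1:-1] = np.random.rand(n_local)

    left = rank - 1 if rank > 0 else MPI.PROC_NULL
    right = rank + 1 if rank < size - 1 else MPI.PROC_NULL

    for _ in range(100):
        # Halo exchange: only one value crosses each process boundary per step,
        # which keeps communication cheap and the scaling close to linear.
        comm.Sendrecv(p[1:2], dest=left, recvbuf=p[0:1], source=left)
        comm.Sendrecv(p[-2:-1], dest=right, recvbuf=p[-1:], source=right)
        # Explicit diffusion-style update of the interior cells.
        p[1:-1] += 0.25 * (p[:-2] - 2.0 * p[1:-1] + p[2:])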

Michele De Stefano (Schlumberger) has borrowed a fractal generation technique used in the computer graphics/gaming industry to provide ‘pseudo-realistic’ topographies and three-dimensional geophysical models. Applications for the technique include simulating datasets for testing inversion algorithms, interpolation and upscaling. The diamond-square algorithm (DSA) was devised in 1980 by Loren Carpenter of the then Lucasfilm company (EarthDoc 85405).
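
The diamond-square algorithm itself fits in a few dozen lines. A minimal numpy version (our own, purely illustrative) is below: each pass fills square centers from corner averages (the ‘diamond’ step), then edge midpoints from their in-grid neighbours (the ‘square’ step), halving the random perturbation each time to produce fractal roughness.

    import numpy as np

    def diamond_square(n, roughness=1.0, seed=None):
        """Return a (2**n + 1) square fractal surface via the diamond-square algorithm."""
        rng = np.random.default_rng(seed)
        size = 2 ** n + 1
        z = np.zeros((size, size))
        z[0, 0], z[0, -1], z[-1, 0], z[-1, -1] = rng.uniform(-1, 1, 4)  # seed corners
        step, scale = size - 1, roughness
        while step > 1:
            half = step // 2
            # Diamond step: centre of each square = mean of its four corners + noise.
            for i in range(half, size, step):
                for j in range(half, size, step):
                    z[i, j] = (z[i - half, j - half] + z[i - half, j + half] +
                               z[i + half, j - half] + z[i + half, j + half]) / 4.0 \
                              + rng.uniform(-scale, scale)
            # Square step: each edge midpoint = mean of its in-grid neighbours + noise.
            for i in range(0, size, half):
                for j in range(half if i % step == 0 else 0, size, step):
                    nbrs = [z[a, b] for a, b in ((i - half, j), (i + half, j),
                                                 (i, j - half), (i, j + half))
                            if 0 <= a < size and 0 <= b < size]
                    z[i, j] = sum(nbrs) / len(nbrs) + rng.uniform(-scale, scale)
            step, scale = half, scale * 0.5    # halve cell size and noise amplitude
        return z

    topography = diamond_square(8, seed=42)    # a 257 x 257 pseudo-realistic surface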

Several papers addressed hardware speedup with a variety of accelerators, a field where Intel now challenges Nvidia with its Xeon Phi coprocessor. Gerard Gorman (Imperial College) observed that a) seismic imaging is hard, b) hardware is complex and c) parallel programming ain’t easy either. But computing is changing ‘like it has never changed before,’ with a plethora of different architectures, such that it is hard to find people to run the show and code is costly to optimize. What is needed is code modernization: high-level abstractions that preserve performance. This is achieved with code generators for domain specific languages, without which high-level languages are ‘slow and expensive.’ Gorman gave a ‘shameless plug’ for his LCS-Fast consultancy, which is bringing together open source software developers and seismologists to develop a domain specific language for seismic imaging. Part of the picture is Firedrake, an auto-coder that lets an end user ‘write Python to run on 10k processors.’ SymPy was presented as a domain specific language for finite difference algorithms, allowing rapid, cross-hardware development that ‘crucifies’ legacy, hand-tuned code! The work was funded by Intel and BG Group.
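
To make the DSL idea concrete, here is a minimal SymPy fragment (ours, not Gorman’s) showing the symbolic step such tools automate: state a derivative and let the library derive the finite difference stencil that a code generator would then turn into optimized C or GPU code.

    import sympy as sp

    x, h = sp.symbols('x h')
    u = sp.Function('u')

    # Ask SymPy for a centred finite-difference approximation to d2u/dx2.
    stencil = u(x).diff(x, 2).as_finite_difference([x - h, x, x + h])
    print(sp.simplify(stencil))
    # Equivalent to (u(x - h) - 2*u(x) + u(x + h))/h**2 -- the classic
    # 3-point stencil, derived symbolically rather than hand-coded.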

Daniel Grünewald introduced Fraunhofer’s ACE asynchronous constraint execution codebase for reverse time migration at ‘extreme scale.’ Today, a single shot may be too big for a single GPU device. The answer is to parallelize with Fraunhofer’s ACE Communicator GPI 2.0, which supports both Nvidia GPUs and the Intel Xeon Phi.

Lin Gan (Tsinghua University) described the speedup of reverse time migration using multiple K40 GPU cards. The best results with the K40s were 28 times faster than an OpenMP implementation on dual Intel E5-2697 CPUs, although less optimization effort went into the Intel code (EarthDoc 84814).

Gabriel Fabien-Ouellet (INRS-ETE) has also used GPUs to speed up seismic inversion, but with a twist. Instead of Nvidia’s proprietary CUDA programming language, the open source OpenCL was used to allow for heterogeneous clusters. Tests on large clusters with nodes built from Intel CPUs, Nvidia GPUs and the Xeon Phi confirmed the 80x supremacy of the GPU. But OpenCL made for ‘a better usage of the computing resources available using a single source code for a multitude of devices.’ SeisCL, an open source code base for full waveform inversion, will be available on Github ‘real soon now’ (EarthDoc 84811).
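
The portability argument is easy to demonstrate: with OpenCL the same source runs on whatever device the node exposes. A trivial pyopencl sketch follows - a generic single-source kernel of our own devising, not SeisCL code - that will execute unchanged on a CPU, GPU or accelerator with an OpenCL driver.

    import numpy as np
    import pyopencl as cl

    # One kernel source, any OpenCL device.
    KERNEL = """
    __kernel void axpy(const float a,
                       __global const float *x,
                       __global float *y)
    {
        int i = get_global_id(0);
        y[i] = a * x[i] + y[i];
    }
    """

    ctx = cl.create_some_context()          # picks whatever device is available
    queue = cl.CommandQueue(ctx)
    prog = cl.Program(ctx, KERNEL).build()

    n = 1_000_000
    x = np.random.rand(n).astype(np.float32)
    y = np.random.rand(n).astype(np.float32)

    mf = cl.mem_flags
    x_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=x)
    y_buf = cl.Buffer(ctx, mf.READ_WRITE | mf.COPY_HOST_PTR, hostbuf=y)

    prog.axpy(queue, (n,), None, np.float32(2.0), x_buf, y_buf)
    out = np.empty_like(y)
    cl.enqueue_copy(queue, out, y_buf)
    assert np.allclose(out, 2.0 * x + y)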

A sign of the times no doubt, Halliburton/Landmark was absent from the 2016 EAGE, leaving the floor open to arch-rival Schlumberger. We spent some time on the booth and heard about recent developments with the Petrel Guru, a workflow management plug-in that is also available in Techlog, Intersect and even the Next training environment. Guru can be branded with a company’s own logo and configured to reflect local best practices. The Guru offers advice on checking data quality during data transfer, although Schlumberger is keen to emphasize that this does not cannibalize its own Innerlogix quality toolset.

Schlumberger continues to enhance its Blue Cube hardware bundle-cum-cloud solution. Petrel was demonstrated running on an iPad (!) over the conference Wi-Fi, served from the Schlumberger cloud in Aberdeen. Blue Cube is delivered in partnership with Dell EMC as either a private or remote SaaS offering. Schlumberger claims 250 internal users and the system is said to ‘work as well as a workstation.’ Western Geco’s multi-client data now also streams from the cloud. Behind the scenes, Linux KVM provides virtualization while HP’s RGS technology adds remote, ‘thin client’ visualization. The UK Oil and Gas Authority uses the solution to stream seismic data.

Finally, Gaynor Paton (Geoteric) presented the results of a curious investigation into color blindness in seismic interpretation. Twenty-three individuals, five of whom had some form of color deficiency, were invited to carry out various tasks such as determining fault orientation on different data sets with and without color-coded orientation. Unsurprisingly, the study found that the use of color helped ‘standardize interpretation.’ The tools used in the study included Vischeck and Konan’s ColorDX, which was used to simulate color blindness. The point of the study escaped us. Could this be a very soft sell for Geoteric’s colorful aids to the interpreter? Surely not!


© Oil IT Journal - all rights reserved.