EAGE 2014 Conference and Exhibition, Amsterdam

European Association of Geoscientists and Engineers plenary on ‘dislike’ of industry by investors and society. Unconventionals growth hard to sustain. How to ‘outsource your credibility.’ Statoil’s ‘big’ 4D seismic. ENI’s musical seismics. Modeling nano-scale flow. Hardware/software round-up.

The plenary EAGE forum, ‘Experiencing the energy, doing more with less’ offered panel members an opportunity to opine on pretty much any subject of their choosing. Which they did. Mike Daley (BP/Oxford University) introduced the panel saying that the oil and gas industry was engaged in ‘feeding the world with energy.’ We are at an ‘interesting moment’ with growth in oil and gas consumption but an increasing dislike of industry from investors and society. There is also a notion that everything is harder than it used to be. But actually ‘things have always been difficult.’

Woodmac’s Andrew Latham observed that costs have risen sharply. It now takes $4 to add a barrel to company reserves (it used to be $1) and full cycle returns are down to some 12%. Investors don’t like this when compared to the risk involved. We are seen as an industry that is struggling to perform. As conventionals get harder (pace Mr. Daley) unconventionals are where there is most growth. But this comes at the price of intense effort and our license to operate is not a given. Unconventionals in the sweet spots of the best plays are very attractive. Elsewhere (i.e. for the majority), the economics are ‘pretty marginal at best.’ Companies are wondering how to address complex marginal plays without hiring ‘legions of geoscientists’ because there is not the income growth to support this. Yes, that is what he told the geoscientists to their faces!

Wouter van Dieren (Instituut voor milieu en systemen analyse, a Dutch environmental think tank) added that complexity is not just about technology but about policy too. We depend on fossil fuels but society’s new value system is becoming dominant and challenging.

Shell’s Ceri Powell suggested that it’s all about brainpower. We need more innovation and technology. She proposed a novel metric—not so much petaflops but brain cells/barrel. We need early derisking before drilling, with for example CSEM or country-wide 3D seismic surveys. Shell’s ‘rejuvenate opportunities now!’ workshops eschew high tech for old-style Mylar and colored pencils and are ‘hugely successful.’

José-Luis Alcover (Repsol) proposed the ‘3P/3C’ squeeze on the business: price volatility, pressure from investors and politics; contracts, competition and costs. We are up against US natural gas prices, short term investors and NIMBYs. Industry may be more competitive but there is always the risk that we are destroying value. Many aggressive exploration and development bookings result in write-offs. Perhaps it will be the smaller, nimble companies who will survive.

Marc Blaziot (Total) agreed that today the problem is one of resource quality, both in terms of finding and producing costs and in terms of social and environmental footprint. The industry is less accepted than it used to be, ‘even to our own, younger colleagues.’ Doing more with less is a pivotal question for the majors, where costs are too high and returns too low. Some recent statistics from Petoro showed that the average rate of drilling penetration had doubled in the last 30 years.

Mario Ruscev (Baker Hughes) revealed that 60-70% of completions are uneconomic, suggesting that ‘we don’t know what we are doing.’ The biggest social issue in the US is the number of trucks going round, ‘people can’t take it anymore.’

In the rather contrived Q&A (anonymous written questions filtered by the moderator) someone suggested that unconventionals mainly delivered returns to ‘service companies and acreage speculators.’ Ruscev responded, ‘I wish!’ In one recent frac job, Baker had ‘20 guys bidding against us.’ ‘We bid against folks who don’t need to pay for guys like me!’

Van Dieren has worked with Shell on the social acceptability of oil and gas projects in sensitive areas. He has little time for Greenpeace, who are ‘dangerous people.’ A decade ago, exploration in Holland’s estuary area was halted by conservationists. Van Dieren managed to convince stakeholders that their perceptions were wrong, presenting the opportunities that production would bring to communities. He also told Shell to ‘remain silent’ as it was not a credible party. His approach worked and there is now gas production from the area, along with an €800 million fund for ‘sustainable biodiversity.’ The trick is for oils to ‘outsource their credibility.’ Oils need to create a new common value system that embraces wind and solar. Belief in climate change in industry is low even though ‘it is coming and all major institutes believe this. You are running into a wall you cannot see.’

Notwithstanding all of the above, industry ploughs on with some pretty fancy technology. A talk from Mark Thompson showed how Statoil is doing quantitative 4D/time lapse seismic analysis, storing its massive datasets in a Teradata appliance. Statoil’s 4D seismic reservoir monitoring group has acquired a huge amount of data in the last 17 years, and the volumes are increasing constantly, with repeat surveys now carried out every six months. Data management is a big headache. Thompson got some time on a Teradata ‘big data’ appliance and used it to standardize and automate 4D data ‘munging.’ The parallel, shared-nothing architecture and SSD storage are set to ‘make data managers unemployed.’ Now, instead of computing and storing attribute volumes, a user defined function creates them on the fly, straight from the data warehouse. Many ad-hoc investigations are now feasible, from survey repeatability analysis to seismic reservoir pressure studies. ‘Put all your data in the same place and it should be easier to integrate people. Wake up, it’s the 21st century.’
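For readers unfamiliar with the repeatability analysis mentioned above: 4D repeatability between a base and a monitor survey is conventionally measured with the NRMS metric, which is exactly the kind of attribute that can be computed on the fly rather than stored. A minimal Python sketch (our own illustration, not Statoil’s user defined function):

```python
import numpy as np

def nrms(base, monitor):
    """Normalized RMS difference between a base and a monitor trace.

    NRMS = 2 * RMS(monitor - base) / (RMS(base) + RMS(monitor)).
    It is 0 for identical traces and 2 for opposite-polarity traces.
    """
    rms = lambda x: np.sqrt(np.mean(np.square(x)))
    return 2.0 * rms(monitor - base) / (rms(base) + rms(monitor))

# A toy 30 Hz trace sampled at 2 ms
t = np.arange(500) * 0.002
trace = np.sin(2 * np.pi * 30.0 * t)
print(nrms(trace, trace))    # → 0.0 (perfect repeat)
print(nrms(trace, -trace))   # → 2.0 (anti-correlated)
```

Run per trace pair across two vintages, this yields a repeatability map without any intermediate attribute volume being stored.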

Schlumberger’s Darrell Coles’ talk was likewise on the topic of big data. A modern seismic survey can have hundreds of millions of shots and trillions of samples. Schlumberger is working on a tool to mine the ‘data deluge’ using guided Bayesian statistics to maximize the information in recorded data. The approach can be used to plan a survey or to cherry pick the most informative data for processing.
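Guided Bayesian survey design of this general kind is usually framed as sequential experimental design under a linear-Gaussian model: each candidate observation is scored by its expected information gain and the posterior covariance is updated after each pick. The sketch below is our own greedy D-optimal illustration of the idea, not Schlumberger’s implementation; the function name and parameters are assumptions.

```python
import numpy as np

def greedy_design(G, sigma2, prior_cov, k):
    """Greedy Bayesian (D-optimal) design for the linear model d = G m + noise.

    At each step, pick the candidate row g with the largest expected
    information gain, proportional to log(1 + g^T C g / sigma2), then
    shrink the posterior covariance C with a rank-one (Kalman-style) update.
    """
    C = np.array(prior_cov, dtype=float)
    chosen, remaining = [], list(range(G.shape[0]))
    for _ in range(k):
        gains = [np.log1p(G[i] @ C @ G[i] / sigma2) for i in remaining]
        best = remaining[int(np.argmax(gains))]
        g = G[best]
        Cg = C @ g
        C = C - np.outer(Cg, Cg) / (sigma2 + g @ Cg)  # posterior update
        chosen.append(best)
        remaining.remove(best)
    return chosen, C

# Three candidate measurements: rows 0 and 1 are duplicates
G = np.array([[1.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
chosen, post = greedy_design(G, sigma2=0.25, prior_cov=np.eye(2), k=2)
print(chosen)  # → [0, 2]: the duplicate measurement is skipped
```

The toy run shows the ‘cherry picking’ effect described in the talk: redundant data carries little extra information, so the algorithm selects diverse observations first.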

Alistair Crosby (BP) observed that our capacity to simulate now outstrips our ability to build the models required for seismic imaging. Modeling is a hand-crafted activity that is holding back progress. BP has been working on a new method for rapid model development by generating synthetic stratigraphy from facies templates and structural morphing. Multiple realizations give different facies distributions. Running the process tens of thousands of times gives a synthetic seismic section that is then morphed into the true geological structure.
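The last step, turning a stratigraphic model into synthetic seismic, rests on standard 1D convolutional modeling: reflection coefficients derived from an impedance profile are convolved with a wavelet. A minimal sketch under that assumption (illustrative only, not BP’s code):

```python
import numpy as np

def ricker(f, dt, length=0.128):
    """Zero-phase Ricker wavelet with peak frequency f (Hz)."""
    t = np.arange(-length / 2.0, length / 2.0, dt)
    a = (np.pi * f * t) ** 2
    return (1.0 - 2.0 * a) * np.exp(-a)

def synthetic_trace(impedance, f=30.0, dt=0.002):
    """Convolutional 1D synthetic: reflection coefficients from an
    acoustic impedance profile, convolved with a Ricker wavelet."""
    imp = np.asarray(impedance, dtype=float)
    rc = (imp[1:] - imp[:-1]) / (imp[1:] + imp[:-1])  # reflectivity series
    return np.convolve(rc, ricker(f, dt), mode="same")

# One 'realization': a simple three-layer impedance model
imp = np.concatenate([np.full(100, 4.0e6), np.full(100, 6.0e6), np.full(100, 5.0e6)])
trace = synthetic_trace(imp)
```

Repeating this over thousands of facies realizations, then warping the resulting section onto the true structure, is the essence of the workflow Crosby described.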

Much more fun was Paolo Dell’Aversana’s (ENI) presentation on seismic data analysis with digital music technology. Dell’Aversana thinks that by transforming seismic recordings into audible MP3 files we can leverage Shazam-like digital music technology to perform pattern recognition. Combining sound with imagery should lead to better interpretation: our brains are made for multisensory perception. SEG-Y files are converted to MIDI and frequency shifted. We listened to the deep rumble of boiling Hawaiian lava. Next the MIDI file was played back through a digital piano, sounding like a rather good morceau of Stockhausen.
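The core trick of sonification is the frequency shift: seismic energy lives around 10-100 Hz, below comfortable hearing, so compressing the time axis multiplies every frequency into the audible band (a 30 Hz wavelet played 200 times faster becomes a 6 kHz tone). The toy sketch below writes a sped-up trace to a WAV file; it is our own simplified illustration using resampling, not Dell’Aversana’s MIDI workflow, and all names and parameters are assumptions.

```python
import numpy as np
import wave

def sonify(trace, seismic_dt=0.002, speedup=200, rate=44100, path="trace.wav"):
    """Make a seismic trace audible by time compression: the real-world
    duration is divided by `speedup`, so every frequency is multiplied
    by `speedup`. Writes a 16-bit mono WAV and returns the sample count."""
    x = np.asarray(trace, dtype=float)
    peak = np.abs(x).max()
    x = x / (peak if peak > 0 else 1.0)        # normalize to [-1, 1]
    # Number of audio samples after compressing the time axis
    n_out = int(len(x) * seismic_dt / speedup * rate)
    t_out = np.linspace(0, len(x) - 1, n_out)
    y = np.interp(t_out, np.arange(len(x)), x)  # resample onto audio grid
    with wave.open(path, "wb") as w:
        w.setnchannels(1)
        w.setsampwidth(2)
        w.setframerate(rate)
        w.writeframes((y * 32767.0).astype(np.int16).tobytes())
    return n_out
```

A long seismic record section squeezed this way plays in a few seconds, which is what makes audio pattern matching on whole surveys plausible.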

The morning session on high performance computing for geoscience was probably not very representative of mainstream oil and gas HPC. A presentation from the SEISCOPE II consortium demonstrated reverse time migration on the open source Valhall 3D dataset on an IBM BlueGene/Q at France’s IDRIS/CNRS HPC center. No geophysical gathering would be complete without a presentation on the use of field programmable gate arrays (FPGA) to accelerate seismic imaging. The paper from the SINBAD II consortium found that, ‘while further development is needed in order to realize the potential for acceleration inherent in the platform, our preliminary results give us reason to be optimistic.’ Not very compelling when you think that Oil IT Journal reported back in 2003 that the now defunct Starbridge’s FPGA-based supercomputer was to provide a ‘100 fold speed up over conventional microprocessor-based machines.’

We attended an informative demo of Schlumberger’s CoreFlow service. This uses argon ion tomography to capture core structure. Digital simulations can then be performed to estimate stuff like unsteady state relative permeability and plan development. We asked if enough was known about the rock physics of unconventionals to support such an approach and received a gratifying, ‘Your question is to the point.’ The semi quantitative analysis can give relative values. But, ‘Industry doesn’t really understand the physics of flow in these nanometer-scale reservoirs.’

Our trawling around the exhibition floor brought the following. Paradigm continues work on its ‘Epic’ open data infrastructure (OilITJ October 2013), now with interfaces to Witsml, Prodml, Ppdm and Resqml data sources. A prime time release is planned for 2015.

Oracle’s offering in the big seismic data space was unveiled on the Westheimer booth, where its ISDS data management system is coupled to an Oracle 12c/Exalogic appliance for performant access to seismic trace data. Landmark was showing its Zeta Analytics technology on an EMC data appliance running a Pivotal/Greenplum database.

Fraunhofer has renamed its high-end parallel file system (formerly FhGFS) ‘BeeGFS’ and spun this activity out into a new ‘ThinkParQ’ unit. BeeGFS is a head-on competitor for Hadoop. Meanwhile the latest release of Fraunhofer’s GPI 2.0 HPC development environment adds support for Nvidia GPUs. GPI removes the complexity of adapting programs for different parallel environments.

Ikon Science was showing ‘JiFi,’ its joint impedance and facies seismic inversion. JiFi offers a quick way of determining low frequency background models using all available data. The ‘mixed discrete and continuous inversion’ approach is said to correctly capture the physics of the inversion problem.

UK-based Big Data Partnership helped Teradata configure its seismic big data solution, used by Statoil to access full prestack data sets (see our May 2014 lead). Shell was loudly touting its own in-house developed interpretation software, 123DI/GeoSigns. One ‘selling point’ is that in-house development offers ‘control over the whole IT infrastructure, to ensure that all the right flavors of Windows, Linux and Oracle work together.’ More from the EAGE’s EarthDoc site.

© Oil IT Journal - all rights reserved.