Society of Exploration Geophysicists 2017, Houston

A century after the first seismic patent, the industry is a little shaky as ‘technology delivers more barrels than the world is ready to consume.’ We report on Shell’s production-induced sea floor monitoring. PGS, ‘oil and gas falling behind in HPC.’ UC San Diego, reviving a ‘greatly diminished’ CSEM industry. OpenGeoSolutions, ‘beware the AI hype!’ BP, ‘is deepwater dead?’ Halliburton, fixing ‘unrealistic’ DAS expectations. PCA, ‘Bayesian statistics fundament of all risking.’

Surviving the catastrophe that was Harvey proved something of a distraction from the other catastrophe that is the parlous state of the geophysical business. While the SEG made the right decision by holding its annual convention a couple of weeks after the major flood, as witnessed by the 7,000-plus attendees, the plenary sessions were somewhat curtailed, to ‘give more time to visit the exhibition.’ SEG president Bill Abriel noted that 2017 represents a century of seismic prospecting that began with Reginald Fessenden’s 1917 patent application for a ‘method and apparatus for locating ore bodies.’ Despite downturns and the graying workforce, SEG demographics are looking OK, albeit with a slight trough in the 35-55 age group. The long-term push to globalize the Society has borne fruit. Today, only 22% of corporate members are US-based and the 27,000 individual members hail from 128 countries. Abriel highlighted the SEG/Halliburton-backed Evolve education program for young professionals. Evolve is also a certification program, ‘but not licensing!’ Looking forward to next-generation exascale, cognitive computing and machine learning, Abriel observed that geophysicists are ‘natural owners’ of high performance computing. The 2018 SEG convention in Anaheim will be a ‘big data/analytics joint venture with Silicon Valley.’

ExxonMobil’s Steve Greenlee recalled that only a few years ago, the perception was that oil resources were limited. Now technology ‘delivers a lot more barrels than the world is ready to consume.’ Although not much conventional oil has been discovered in the last 15 years, ‘unconventionals have changed the game.’ Is this a lasting phenomenon? ExxonMobil’s recent Permian basin acquisition has brought ‘exposure’ to 6 billion barrels at a supply cost estimated by the IEA at $40-45 (Exxon’s figure is lower). But the big issue is depletion. The IEA sees a decline from 95 to 35 million bopd by 2040. Meanwhile, demand is forecast to rise, ‘so we need to find the equivalent of the whole of today’s production again by 2040.’ For this, the world needs a ‘healthy’ geophysical industry, but one that is ‘capable of supporting a low cost of supply portfolio,’ which we understand to mean cheap! North American capex is shifting towards unconventionals with, typically, a lower geophysical intensity. While this might be seen as a liability for geophysics, unconventionals are ‘really hard to explore,’ so there is a real opportunity here for geophysical innovation.

For conventional/deepwater exploration, ‘today’s technology is not good enough. Seismics takes too long to process and interpret.’ Also, regulations need to be based on sound science. Here the IAGC is working on advocacy with regard to marine life and regulations. A web search for seismic and marine mammals returns ‘lots of misinformation.’ ‘Houston we have a PR problem!’ In the Q&A, Greenlee was quizzed on the sustainability of shale. He observed that, as elsewhere, there will be cycles related to the oil price. But some of the lower cost, high quality shale projects will last for decades and will impact the market for years. Other high cost projects will not come back for any foreseeable time. Regarding shale outside of the USA, Greenlee observed that it was ‘remarkable that there is no [production] as of yet.’ Abriel added that geophysics needs to get closer to engineering in unconventionals, breaking down the silos and getting more integrated. Greenlee concluded that there is a real issue with the health of the geophysical industry. ‘Everything we do depends on a healthy sustainable geophysical industry, but there is no easy answer to the low price problem.’

The special session on the ‘road ahead’ showed that there is some life left in the geophysical dog. There was a good turnout for Paul Hatchell’s (Shell) presentation on ‘seafloor deformation monitoring for geohazards, depleting fields and underburden expansion.’ Ever since the spectacular sinking of the Ekofisk production platform in the early 1980s, industry has been aware of the need for sub-cm/year accuracy in monitoring surface deformation. The ‘gold standard’ approach for monitoring was developed by the Scripps Institution of Oceanography and Statoil in 2010 and can measure millimeter changes in thousands of meters of water. The stations were deployed by ROVs across the Ormen Lange field, but the technique is expensive. Shell is now working with Sonardyne on autonomous recorders that can measure every hour for 10 years! Their low cost means that 175 have been deployed across the field and show subsidence of 2 cm/year.
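By way of illustration only (this is not Shell’s or Sonardyne’s actual processing), a subsidence rate of this kind can be recovered by fitting a linear trend to a pressure-derived seafloor depth time series, as in the hypothetical Python sketch below.

```python
# Hypothetical illustration only: estimate a seafloor subsidence rate by
# fitting a straight line to a depth time series derived from pressure
# recordings. Not Shell's or Sonardyne's actual processing.
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(0, 10, 1.0 / (365 * 24))   # hourly samples over ten years
true_rate_cm_per_yr = 2.0                    # rate quoted in the talk
depth_cm = true_rate_cm_per_yr * years + rng.normal(0.0, 0.5, years.size)

# Least-squares linear fit: the slope is the subsidence rate in cm/year
rate, offset = np.polyfit(years, depth_cm, 1)
print(f"Estimated subsidence rate: {rate:.2f} cm/year")
```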

PGS’ Sverre Brandsberg-Dahl promised that he would not talk about big data and analytics in the cloud, which was a relief to some. High performance computing has been used in seismic imaging for decades, with the pendulum swinging between different programming models, specialized equipment and now, the promise of commodity HPC services from Google and Amazon. Despite the push for high-end full elastic inversion and reverse time migration, these are not yet routine. The reasons? Turnaround time won the battle and model uncertainty is eating the cake, both of which make it hard to justify fancier physics. To date, algorithmic complexity has been matched by increasing compute power, even for surveys of hundreds of terabytes and up. But new acquisition techniques like continuous shooting and recording and irregular spatial sampling are breaking the classic ‘embarrassingly parallel’ compute paradigm. Brandsberg-Dahl believes that the weather forecasting community does better and has done a great job of explaining the business benefits of HPC, linking the cost of disasters (like Harvey) to the cost of what they are doing. The UK Met Office has just spent £100 million on a new computer for a claimed ‘£2 billion in benefits.’ ‘We need to make the same argument.’ But HPC is at a crossroads today. Will the ‘unlimited’ flops that the cloud promises be enough? Or is a paradigm shift needed? Brandsberg-Dahl hopes that seismic imaging does not jump onto the commodity hardware of the cloud. For the geophysicist, the compute platform is a competitive differentiator, for both service companies and oils.
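For readers unfamiliar with the term, the ‘embarrassingly parallel’ paradigm simply means that each shot gather can be processed independently and the partial images summed. The hypothetical Python sketch below (toy data and a placeholder imaging operator, not any PGS code) shows the pattern that continuous, blended shooting breaks: once shots overlap in time and space, this per-shot independence disappears.

```python
# Minimal, hypothetical sketch of classic per-shot parallelism in seismic
# processing: each shot gather is 'migrated' independently and the partial
# images are summed. Continuous/blended acquisition breaks this independence.
from multiprocessing import Pool
import numpy as np

N_SHOTS, NX, NZ = 64, 200, 100   # toy survey dimensions (illustrative only)

def migrate_shot(shot_id: int) -> np.ndarray:
    """Stand-in for a single-shot migration; returns a partial image."""
    rng = np.random.default_rng(shot_id)
    gather = rng.normal(size=(NX, NZ))   # placeholder for a real shot gather
    return np.abs(gather)                # placeholder for the imaging operator

if __name__ == "__main__":
    with Pool() as pool:                 # shots are independent: no communication needed
        partial_images = pool.map(migrate_shot, range(N_SHOTS))
    image = np.sum(partial_images, axis=0)   # stack partial images into the final image
    print(image.shape)
```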

Leonard Srnka (UC San Diego) traced the use of marine controlled source electromagnetic (CSEM) prospecting back to around 1999. Since then, EMGS has conducted over 540 surveys. CSEM surveying peaked in 2008, then dropped to some 20 surveys/year. Today, the CSEM industry is ‘greatly diminished’ due to fundamental physics limits, the non-uniqueness of solutions and ‘unrealistic expectations*.’ CSEM measures resistivity; it is not a direct hydrocarbon detector. It has suffered from competition from enhanced seismic imaging and from the downturn. Has the value of the technology been realized? Not according to Srnka. ‘Dry holes have no CSEM response but discoveries do!’ Moreover, ‘no false negatives of any size have been reported after 800+ surveys.’ Other opportunities lie in petrophysical joint inversion and 4D surveys. CSEM should be great for mapping gas hydrates. In Japan, ‘seismic is no longer the preferred method for hydrate research.’ State of the art is represented by EMGS’ ‘Deep Blue’ CSEM, a joint venture with Statoil and Shell. More from UCSD’s Marine EM Lab.

Gregory Partyka (OpenGeoSolutions) has no qualms about the big data/analytics approach and plans to leverage artificial intelligence and ‘do seismic interpretation and reservoir simulation all at the same time.’ This is a challenge because no single database offers the sensitivity and scale to span all the data types involved. Interacting with today’s data and databases takes too much effort. Moreover, there are inevitably gaps in the skill sets required as we enter the world of big data. The technology behind self-driving cars could provide breakthroughs in seismic processing and interpretation. One word of warning on the big data approach, ‘beware of being dependent on the use of powerful tools in inexperienced hands.’ But if AI is used right we are promised more ‘think time’ and ‘aha’ moments! Another warning, ‘beware the hype.’ In practice, the processing building blocks will stay the same and need to be automated ‘gently.’ AI should make it easier to investigate seismic scenarios. OGS provides a library of pre-rendered images accessible through a browser incorporating ‘motion/animation.’ Suddenly we were watching a commercial, a demo in fact! Partyka shamelessly tweaked frequency, Euler curvature, azimuth, noise, spectral decomposition and other obscure attributes before concluding with more praise of AI. Partyka suggested that SEG should curate a library of training images.

BP’s Scott Michell asked, in the current oil price environment, is deepwater dead? Current thinking is somewhere between ‘lower for longer’ and ‘low for ever!’ The Gulf of Mexico creaming curve shows a mature basin, the big stuff has been discovered. But now that infrastructure is in place, pipelines need filling up. That’s our job! From 2000 to 2015, deepwater seismic imaging algorithms saw a lot of refinement. But did this make much difference? Velocity remains the fundamental issue and manual picking of top salt often gives the wrong model. Top salt topography may be very rugose and impossible to pick. Michell intimated that in some cases, automated top salt picking beats manual interpretation. In any event, ‘we need better images more quickly for the Gulf of Mexico to be economic.’ And we need a better low frequency acquisition source. Enter BP’s Wolfspar low frequency seismic source. Longer offsets and lower frequencies will help ‘beat the interpreter.’

Brian Hornby (Halliburton) sees distributed acoustic sensing (DAS) with downhole fiber optic cables as the way forward for high-end borehole geophysics. Borehole seismics has evolved from checkshots, through offset VSP and now, 3D VSP imaging. But the acquisition geometry is awkward and image quality, in the early days (2001), was poor. Overoptimistic feasibility models led to unrealistic expectations. Today things are better with anisotropic velocity models and reverse time migration. DAS is the new kid on the block. Deploying DAS is ‘free’ if the well already has fiber, although signal to noise is poor compared to a geophone. In the future, DAS will be a routine supplement to 3D seismics, but we need ‘fast-deployable 10k foot arrays.’

Patrick Connolly (PCA) sees seismic inversion as transitioning from deterministic to probabilistic methods, with Bayesian statistics as the ‘fundament of all risking.’ Stochastic inversion gives a range of equi-probable models, a non-unique solution. But ‘for the past twenty years we have chosen to ignore this and just select one.’ It’s time to take this seriously. AVO studies reduce uncertainty when geological priors such as bed thickness are added to the mix. A lot of software is available for this: Shell’s proprietary Promise, CSIRO’s Delivery, the GIG consortium’s PCube, Ikon’s JiFi and BP/Cegal’s Odisi. Why has take-up been so slow? Because collaboration across many disciplines is involved. But the software is available and ‘dimensionality is not a problem, this is not a compute issue.’
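As a toy illustration of Bayesian risking (not drawn from any of the packages Connolly listed, and with made-up numbers), the sketch below updates a geological prior for reservoir presence with an assumed likelihood of observing an AVO anomaly, yielding a posterior chance of success.

```python
# Toy Bayesian update for prospect risking: combine a geological prior with
# the likelihood of an observed AVO anomaly under 'pay' and 'no pay' models.
# Purely illustrative; all probability values are assumed.
prior_pay = 0.30          # geological prior probability of reservoir presence

# Probability of observing this AVO anomaly under each hypothesis (assumed)
p_obs_given_pay = 0.70
p_obs_given_no_pay = 0.20

# Bayes' rule: P(pay | obs) = P(obs | pay) P(pay) / P(obs)
evidence = p_obs_given_pay * prior_pay + p_obs_given_no_pay * (1 - prior_pay)
posterior_pay = p_obs_given_pay * prior_pay / evidence
print(f"Posterior probability of pay: {posterior_pay:.2f}")   # 0.60
```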

* CSEM’s ‘unrealistic expectations’ were in part fueled by Schlumberger’s Dalton Boutte’s 2004 claim that CSEM ‘could replace seismic’ and also by Srnka himself, who the following year stated that ‘CSEM methods may prove to be the most important geophysical technology for imaging below the seafloor since the emergence of 3D reflection seismology.’

