Before we were thrown out (see last month’s editorial) we managed to capture the opening session of the 2014 Society of Exploration Geophysicists’ annual convention held in Denver last month. SEG president Don Steeples put current membership at around 30,000, with 37% in the US. Membership is currently ‘stagnant to falling slightly.’ While there have been over a million downloads of abstracts from Geophysics (the SEG’s flagship publication), its companion, The Leading Edge, is declining in popularity.

Next up was Tom Petrie, banker and pundit who has appeared on both PBS and Fox. Petrie ran through the gist of his book, ‘Following Oil,’ which traces the rise of US shale production and its implications for US energy independence. Unconventionals are the ‘grand disruptor’ that has halted a four-decade-long production decline and put paid to shibboleths like peak oil. There are currently some 14 unconventional plays in North America, with the potential for huge expansion of the resource base. Economic disruption is following in the shape of a reduced trade deficit and restored manufacturing competitiveness, especially in petrochemicals. On the technical front, shale has led to dramatic evolution in drill rig design and to improvements in geosteering, frac and proppant make-up and, for the geophysicists, microseismic advances that have ‘transformed the Bakken.’ On the environmental front, the US, despite not being a signatory to the Kyoto accord, is now ‘one of the most compliant countries in the world,’ and this has been achieved ‘thanks to private capital.’
DrillingInfo CEO Allen Gilmer took to the mike stating there was ‘no escape from analytics and big data!’ While these are ‘overused terms,’ seismics is the ‘granddaddy’ of big data, especially with passive monitoring. Microseismics allows operators to compute fractured rock volumes in real time, stopping or changing parameters to optimize the treatment. Elsewhere, multivariate statistics on big data can bridge geology and engineering to identify engineering best practices … and also to identify operators that are not so good. One spent $100 million sans any added value, another spent $150 million to add 1.5 bopd of production. ‘The best operators are miles better than the worst.’ On the topic of costs, Gilmer ventured that $50 was the mid-range breakeven cost for shale operators but that for some, this could rise to the ‘$60-$100 range.’
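Gilmer did not go into the arithmetic, but one common back-of-envelope way to turn a located microseismic event cloud into a ‘fractured rock volume’ is to take the volume of its convex hull. A minimal sketch, with invented event coordinates (real-time workflows filter events by magnitude and location confidence and use more sophisticated volume estimates):

```python
# Toy estimate of stimulated/fractured rock volume from a microseismic event
# cloud, using the convex hull of the located events. Illustrative only.
import numpy as np
from scipy.spatial import ConvexHull

rng = np.random.default_rng(42)
# Hypothetical event locations (x, y, z) in metres around one treatment stage
events = rng.normal(loc=[0.0, 0.0, -2500.0],
                    scale=[150.0, 60.0, 40.0],
                    size=(500, 3))

hull = ConvexHull(events)
print(f"events: {len(events)}, hull volume: {hull.volume / 1e6:.2f} million m^3")
```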
Author Chunka Mui segued clumsily into driverless cars, now positioned as a solution to the 4 million US car crashes and 33 thousand deaths per year. Worldwide there are 1.3 million road deaths per year that could be avoided by ‘taking humans out of the loop.’ There is an ‘industry-wide arms race’ to develop driverless cars. Uber got a plug as another ‘killer app’ but Mui sees even more potential in combining driverless technology with Uber. There would be losers of course, car dealerships and auto insurers and … oil and gas too! According to Google’s Sergey Brin, the timeline for all this is 2017/18, in other words, ‘real soon now.’ The SEG did not solicit questions from the floor, but chair Rutt Bridges did a great job of asking (and answering) more questions than were really necessary.
Speaking in the ‘Road Ahead’ session, Ion’s Christoph Stork described a perfect (seismic) world of million-channel/shot surveys, but with a correspondingly unworldly price tag. Compromise is necessary, through ‘custom’ acquisition. The new name of the game is ‘compressive sensing,’ tuning acquisition to geological objectives and making smart compromises. Stork called on academia for better tools for survey-design risk reduction and a better understanding of noise. Synthetic model data is key to acquisition modeling, to ‘show where attributes can be trusted.’
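Stork did not go into the mathematics, but the compressive-sensing argument is that a signal which is sparse in some basis can be recovered from far fewer, suitably randomized measurements than conventional sampling would require. A toy sparse-recovery sketch follows, using a random measurement matrix and iterative soft thresholding; it is the textbook recipe, not Ion’s workflow, and all numbers are invented:

```python
# Toy compressive-sensing demo: recover a sparse signal from undersampled,
# randomized measurements via ISTA (iterative soft thresholding).
import numpy as np

rng = np.random.default_rng(0)
n, m, k = 256, 64, 5                       # signal length, measurements, non-zeros
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.normal(size=k)

A = rng.normal(size=(m, n)) / np.sqrt(m)   # random 'acquisition' matrix
y = A @ x_true                             # undersampled measurements

# ISTA for min ||Ax - y||^2 + lam * ||x||_1
lam = 0.05
step = 1.0 / np.linalg.norm(A, 2) ** 2     # 1 / (largest singular value)^2
x = np.zeros(n)
for _ in range(500):
    x = x - step * (A.T @ (A @ x - y))                     # gradient step
    x = np.sign(x) * np.maximum(np.abs(x) - lam * step, 0) # soft threshold

print("relative recovery error:",
      np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```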
The iconoclasts were out in force. Art Weglein argued against the ‘inclusive’ view that primaries and multiples are both signal and ‘should be migrated.’ All current reverse time methods fail Claerbout’s test of source/receiver coincidence. Multiples ‘are never involved in imaging and inversion.’ Sergey Fomel argued that the unpopular topic of time migration deserves further research as a means of avoiding the velocity problem, the ‘elephant in the room’ for depth migration. Fomel is skeptical that the velocity problem can be solved. BP’s John Etgen was beyond skeptical. Despite ‘umpteen’ papers on full waveform inversion and all the wide-azimuth, coil-shooting bells and whistles, subsalt imaging in the complex Gulf of Mexico still fails. Studies on synthetic data show that as little as 5% of the subsurface can be imaged. ‘It is the velocity model that is killing us,’ as small errors in the salt model rapidly degrade the migration. Modern acquisition is fit for purpose, but our models are inadequate. Etgen, a keen amateur astronomer, sees hope in adaptive optics, a technique for correcting for the distortion of starlight by the earth’s turbulent atmosphere. His suggestion? Use a similar approach, numerically ‘reshaping’ the wavefield in the vicinity of a high-contrast interface, ‘riding along with the waves to see where they are having trouble.’
CGG’s Sam Gray offered a more measured view of current seismic methods. ‘Big’ (structural) imaging and ‘little’ (rock property) imaging are on convergent paths. Broadband ‘big’ may include a stratigraphic component and ‘little’ imaging of unconventional targets may benefit from migration. But fractures occur on a centimeter scale, quite different from seismics. ‘We will never get the centimeter spacing required.’ Tomorrow’s rock property investigators ‘will need to know a lot of stuff and some.’ Seismics needs help (from academia) because ‘the little problems are harder than the big ones.’ On the ‘little’ front, seismic attributes continue to fascinate, and multiply. Kurt Marfurt offered a historical overview from Balch’s 1971 color sonogram to today’s high-end attribute analytics from companies like FFA/Geoteric and Marfurt’s own published work on multi-attribute cluster analysis. ‘Generative topographic mapping’ also ran, a neural net approach to finding a proxy for petrofacies. Headwave’s interactive pre-stack environment also got a plug.
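For readers unfamiliar with the genre, multi-attribute cluster analysis boils down to grouping samples by several attributes at once and treating the groups as candidate facies proxies. The sketch below uses plain k-means on invented attribute values; Marfurt’s published workflows and generative topographic mapping are considerably more sophisticated:

```python
# Toy multi-attribute clustering: group samples by a handful of seismic
# attributes and use the clusters as rough (petro)facies proxies.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
# Hypothetical attribute table: rows = samples, columns = attributes
# (e.g. envelope, coherence, curvature), synthesized as two loose groups.
group_a = rng.normal([1.0, 0.2, -0.5], 0.3, size=(300, 3))
group_b = rng.normal([-0.8, 0.9, 0.4], 0.3, size=(300, 3))
attrs = np.vstack([group_a, group_b])

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(attrs)
print("cluster sizes:", np.bincount(labels))
```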
OptaSense has carved itself quite a niche in the fiber optic monitoring arena and helped out with Shell’s evaluation of different kinds of fiber for use in distributed acoustic sensing. The problem with fiber is that it is sensitive to sound waves travelling along the axis of the fiber, much less so to perpendicular arrivals. This precludes its use as a regular, horizontally-deployed surface cable. OptaSense has extended fiber’s directional sensitivity with a helically wound arrangement which fared well in Shell’s tests. Fiber generates massive amounts of data in a short time; current systems generate a terabyte in a few hours. This is processed in the field to a manageable SEG-D dataset. The raw data is then chucked!
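The directionality issue is easy to quantify under the usual simplifying assumption that a fiber’s P-wave response varies as cos² of the angle between the arrival direction and the local fiber tangent. The sketch below compares a straight fiber with a helically wound one at an assumed 30° wrap angle; the numbers are purely illustrative and have nothing to do with Shell’s or OptaSense’s actual designs:

```python
# Back-of-envelope look at why a straight fibre is nearly 'deaf' to broadside
# arrivals and how helical winding trades axial sensitivity for broadside
# response. Assumes P-wave response ~ cos^2(angle to local fibre tangent).
import numpy as np

def response(wave_dir, tangents):
    """Mean cos^2 of the angle between the wave direction and fibre tangents."""
    wave_dir = wave_dir / np.linalg.norm(wave_dir)
    return np.mean((tangents @ wave_dir) ** 2)

# Straight fibre along x
straight = np.array([[1.0, 0.0, 0.0]])

# Helically wound fibre: unit tangents around one wrap, 30 degree wrap angle
alpha = np.radians(30.0)                       # angle between fibre and cable axis
phi = np.linspace(0, 2 * np.pi, 360, endpoint=False)
helix = np.column_stack([np.full_like(phi, np.cos(alpha)),
                         np.sin(alpha) * np.cos(phi),
                         np.sin(alpha) * np.sin(phi)])

axial = np.array([1.0, 0.0, 0.0])              # wave travelling along the cable
broadside = np.array([0.0, 0.0, 1.0])          # wave arriving perpendicular

for name, fibre in [("straight", straight), ("helical", helix)]:
    print(name, "axial:", round(response(axial, fibre), 3),
          "broadside:", round(response(broadside, fibre), 3))
```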
Simplivity’s pitch is to replace the whole storage/server/network stack with its OmniCube ‘in-house cloud.’ The system comes with data rationalization tools reported to bring major storage savings through de-duplication across all locations. Petrobras has deployed dual cubes to replicate data from an FPSO to its onshore HQ.
Nvidia’s Index technology has been used by Landmark to bring an 8x speed-up to full wavefield modeling. Complex deep water or unconventional plays can be modeled in ‘minutes or hours.’ Index can provide remote compute horsepower for tablet or laptop thin clients. Index came out of the Mental Ray/iRay technology acquired by Nvidia. Shell is reported to be an enthusiastic user. Headwave/Hue uses the technology to provide management and exploratory analysis of terabyte pre-stack seismic datasets. Hue is now also marketing its proprietary compression technology (as used in HueSpace) to third-party developers. This is said to be 25x faster than existing technology, offering smaller files and better signal to noise.
Fraunhofer’s Franz-Joseph Freund is skeptical of the GPU approach: ‘GPUs are limited by the size of the cards and by PCI bandwidth. Direct CPU to memory access is much faster.’ Fraunhofer’s PV4D data visualization engine uses this approach in a new hexahedron viewer that scales to terabyte datasets. PV4D is delivered as a toolkit for third-party developers.
According to Ikon Science, ‘today’s seismic inversion is not trusted by modelers.’ Ikon’s ‘Ji-Fi’ technology sets out to change this with Bayesian inversion that operates on a per-facies basis, using prior information from RokDoc’s rock properties library.
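Ikon did not publish the details here, but the facies-conditioned Bayesian idea can be illustrated with a toy calculation: each facies carries its own impedance prior, and a noisy impedance estimate from seismic is converted into posterior facies probabilities via Bayes’ rule. All numbers below are invented and this is not the Ji-Fi algorithm itself:

```python
# Toy facies-conditioned Bayesian update: per-facies impedance priors plus a
# noisy impedance observation give posterior facies probabilities.
import numpy as np
from scipy.stats import norm

# Hypothetical per-facies impedance priors (mean, std) and prior proportions,
# as might come from well data or a rock-property library.
facies = {"shale":      (7500.0, 400.0, 0.6),
          "brine sand": (6800.0, 350.0, 0.3),
          "gas sand":   (6200.0, 300.0, 0.1)}

obs, obs_std = 6500.0, 250.0       # impedance from seismic, with its uncertainty

post = {}
for name, (mu, sigma, prior) in facies.items():
    # Likelihood of the observation under this facies (prior spread + noise)
    like = norm.pdf(obs, loc=mu, scale=np.hypot(sigma, obs_std))
    post[name] = prior * like

total = sum(post.values())
for name, p in post.items():
    print(f"P({name} | obs) = {p / total:.2f}")
```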
Schlumberger is introducing a cloud computing offering for power users of Petrel, Techlog and Omega. Algorithms are said to run up to 1000x faster than on a local machine. The Schlumberger cloud was originally developed for service use. The Geosphere geosteering service has been using the cloud for a year or so. The petrotechnical cloud also provides remote virtual machines for field offices running Petrel. The multi-OS offering is now available for wider industry use.
On the esoteric hardware front we spotted Green Revolution, which offers liquid-cooled enclosures for compute clusters. The approach allows for 30% overclocking sans fans, air conditioning or raised floors. Supermicro is an OEM. GR’s biggest installation comes in at 10 megawatts. GR is used by CGG and Down Under Geo.
On the Wipro booth, Landmark founding father Royce Nelson, representing his startup Dynamic Measurement, showed us his ‘natural source’ electromagnetic technique. Where controlled-source EM pumps around a thousand amps into the ground, ‘natural source’ EM (lightning) provides around 30,000 amps per strike. DM uses the North American lightning detection network to triangulate strikes and map peak current, claimed to relate to subsurface geology. The US Gulf coast is well endowed, with around 60 strikes/yr/km², and ‘90% of fields show up as lightning anomalies.’ DM was awarded a US patent last year for the technology. More from the SEG.
© Oil IT Journal - all rights reserved.