Society of Exploration Geophysicists, Denver

Shale gas, Tarantola memorial, DHI Consortium, Geoprobe on Windows 7, Paradigm Skua 2011.

Shale gas is the theme of the year. The AAPG in New Orleans, the SPE in Florence and now the SEG in Denver all devoted their plenary opening sessions to the unconventional boom. We will spare you another blow-by-blow account, although the SEG did pull off a coup, getting US Secretary of the Interior Ken Salazar onstage, dressed up in designer jeans and bolo tie.

The industry’s argument goes: 1) gas is abundant and good, almost green; 2) coal is bad, but has a strong lobby; and 3) more needs to be done to reassure the public about frac fluids. All the speakers thought that they were playing to the oil gallery. Except that this was the geophysics gallery! Geophysics did not even get a mention until, answering a questioner, Range Resources president Jeff Ventura stated that in the Marcellus shale, 3D seismics is used to assess rock ‘frackability,’ although such use ‘is not widespread.’

In the Special Session on the ‘Road Ahead,’ the future turned out to be pretty much what is already happening. Ed Biegert, Shell, enumerated a multiplicity of remote sensing technologies, winding up with a plug for Shell’s ‘LightTouch,’ a sniffer technology that dates back to 2003. Craig Beasley (WesternGeco) asked, after a century of seismics, ‘what’s left to do?’ The answer is ‘more of what we are already doing.’ In other words, multi/wide-azimuth and anisotropic acquisition with 30,000-channel onshore ‘super crews’ all amount to a ‘quiet revolution’ in land acquisition. Simultaneous source techniques like WesternGeco’s own ‘SimSrc’ and BP’s ‘ISS’ have brought a tenfold productivity gain. ‘Today, new acquisition is driven by requirements, not cost.’ On the processing front, full waveform inversion can now produce a 40 Hz subsalt image, an ‘amazing’ improvement over the last 5 years. The main remaining challenge is aliasing, even though ‘you may not realize it until you try to do new stuff with data.’

There was a good turnout for the special session in memory of Albert Tarantola,* professor at the University of Paris and seismic processing maverick-cum-luminary. From a background in astrophysics and general relativity, Tarantola turned to inverse theory and to one of the most difficult inverse problems in the geosciences, the ‘adjoint’ method of seismic inversion. This led to the incorporation of ‘a priori’ models of ‘acceptable’ geological structure, multiple realizations and a ‘pragmatic’ way of integrating geostatistical and geophysical data that underpins much of modern seismic processing. The new approaches implied compute horsepower way beyond what was available at the time, leading to early proof-of-concept tests on a Connection Machine 2. But this work ran up against the ‘curse of dimensionality,’ prompting the development of a new ‘smart’ Monte Carlo approach. One of Tarantola’s seminal papers was described as a ‘litany of failures!’ But even these were ‘tremendously instructive’ and influential. Tarantola could be irritating to colleagues, but he managed to get to the core issue and test folks’ convictions. His legacy was evidenced by the SEG’s five sessions on full waveform inversion.
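For readers who never sat through one of Tarantola’s lectures, the a-priori-plus-Monte-Carlo idea he championed can be sketched in miniature as a toy one-parameter traveltime inversion. Everything below (the prior, the noise level, the proposal step) is an invented illustration, not anyone’s production code.

```python
import math
import random

def posterior_sample(traveltimes, depths, n_steps=20000, seed=42):
    """Toy Metropolis sampler: infer slowness s from t = s * depth + noise,
    with a Gaussian 'a priori' model on s (Tarantola's idea in miniature).
    All numbers are illustrative assumptions."""
    rng = random.Random(seed)
    prior_mean, prior_std = 0.5, 0.2   # a priori slowness (s/km), assumed
    noise_std = 0.01                   # assumed data error (s)

    def log_post(s):
        # log posterior = -data misfit - distance from the prior model
        misfit = sum((t - s * d) ** 2 for t, d in zip(traveltimes, depths))
        return (-misfit / (2 * noise_std ** 2)
                - (s - prior_mean) ** 2 / (2 * prior_std ** 2))

    s, lp, samples = prior_mean, log_post(prior_mean), []
    for _ in range(n_steps):
        cand = s + rng.gauss(0.0, 0.005)             # random-walk proposal
        lp_cand = log_post(cand)
        if math.log(rng.random()) < lp_cand - lp:    # Metropolis accept/reject
            s, lp = cand, lp_cand
        samples.append(s)
    return samples[n_steps // 2:]                    # discard burn-in
```

With noise-free synthetic data from a true slowness of 0.4 s/km (e.g. `depths = [1.0, 2.0, 3.0]`, `times = [0.4 * d for d in depths]`), the retained samples cluster around 0.4, pulled only marginally toward the 0.5 prior. The ‘multiple realizations’ Tarantola argued for are exactly this sample cloud, rather than a single best-fit answer.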

We attended a great presentation on the SMT booth by Mike Forest, who chairs the Rose & Associates Direct Hydrocarbon Risk Consortium. Forest, previously with Shell Oil, traced the 40-year history of direct hydrocarbon indicators, starting with a ‘health warning:’ bright spots, flat spots, AVO** anomalies and so on do not actually indicate the presence of hydrocarbons! Shell coined the ‘bright spot’ term back in 1960, in the face of considerable skepticism. There followed a ‘peak and valley’ period, swinging from optimism to pessimism. Management support and the digital revolution brought better data and significant wins in a 1970 lease sale, notably the 750 million barrel Eugene Island 330 field, identified with the amplitude/background plot, using Aubrey Bassett’s ‘Payzo’ program.

These early successes were followed by pitfall-induced failures and the realization that, for instance, a 10% gas saturation gives much the same reflection coefficient as an 80% saturation, a problem that persists today. Successes including the Gulf of Mexico Bullwinkle, Popeye and Tahoe fields confirmed the general usefulness of the techniques. This led to the establishment of the Rose & Associates DHI Risk Analysis Consortium*** in 2001 with 35 members. The consortium is developing a systematic and consistent work process for DHI analysis, not a ‘silver bullet.’ For Forest, the three top issues for DHI are ‘calibration, calibration and calibration!’ You need to check the whole processing sequence, look at the gathers and tie everything to rock physics. Pitfalls include wet sands (23% of all failures in the Consortium’s database), geopressure, low saturation gas and hard rock above sand. One example from offshore Barbados shows a huge flat event, but the gas was long gone! Recent DHI successes include the billion barrel discovery in Uganda’s Lake Albert, Ghana’s Jubilee field and (perhaps) McMoRan’s Gulf of Mexico Davy Jones discovery.
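The fizz-gas pitfall Forest described falls straight out of Gassmann fluid substitution: a Reuss (harmonic) average of the pore-fluid modulus is dominated by the most compressible phase, so a little gas softens the rock almost as much as a lot. The sketch below illustrates this with entirely invented rock and fluid properties; it is a textbook-style toy, not the Consortium’s methodology.

```python
def reflection_coeff(gas_saturation):
    """Normal-incidence reflection coefficient for a shale-over-gas-sand
    interface, via a toy Gassmann fluid substitution. All properties are
    illustrative assumptions (moduli in GPa, densities in kg/m3)."""
    k_min, rho_min = 36.0, 2650.0        # quartz grain
    k_dry, mu, phi = 3.0, 3.0, 0.30      # soft, porous sand frame
    k_brine, rho_brine = 2.5, 1000.0
    k_gas, rho_gas = 0.04, 200.0
    z_shale = 2800.0 * 2400.0            # overlying shale impedance (Vp * rho)

    sg = gas_saturation
    # Reuss average of the fluid modulus: even a small gas fraction
    # collapses k_fl, which is the root of the low-saturation pitfall.
    k_fl = 1.0 / (sg / k_gas + (1.0 - sg) / k_brine)
    # Gassmann: saturated bulk modulus from the dry frame
    num = (1.0 - k_dry / k_min) ** 2
    den = phi / k_fl + (1.0 - phi) / k_min - k_dry / k_min ** 2
    k_sat = k_dry + num / den
    rho = (1.0 - phi) * rho_min + phi * (sg * rho_gas + (1.0 - sg) * rho_brine)
    vp = ((k_sat + 4.0 * mu / 3.0) * 1e9 / rho) ** 0.5
    z_sand = vp * rho
    return (z_sand - z_shale) / (z_sand + z_shale)
```

With these made-up numbers, the brine-filled sand reflects at roughly −0.12, while 10% and 80% gas saturations both produce a strong bright spot in the −0.24 to −0.28 range, easy to separate from the wet case but very hard to tell apart from each other on a stack.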

Landmark was showing off a port of Geoprobe to 64-bit Windows 7 on a 6xHD, 11 megapixel display from Mersive. Geoprobe is now a true ‘big data’ solution on both Windows and Linux, as witnessed by a 200 million triangle surface demo. A new data format supports multi-CRS data across NAD zones, leveraging the OpenWorks R5000 data infrastructure. Geoprobe data can now either reside in memory or be streamed from disk, always at full resolution.

Paradigm was pitching its 2011 release, which is set to redefine interpretation around the Skua flagship. The company is also leveraging its Epos 4 data infrastructure, with ‘preview technology’ for ‘tera size’ data and models. Distributed, secure computing, Microsoft Windows ports and HPC with GPU support also featured. Today’s interpreters are overwhelmed with attributes and need systems engineered for multi-attribute interpretation, co-visualization and holistic workflows spanning cross sections to property models, gridding and flow simulation. Data management is a new focus for the 7,500 Epos users who are going ‘from terabytes to petabytes in prestack data roaming.’

The best-laid plans of mice, men and the marketing department do go awry. It behooves us, as the intrepid reporters that we are, to note that on the WesternGeco corner of the Schlumberger booth, a demo of very high-end Gulf of Mexico seismics was running, not on Petrel and Windows, which as you know is capable of displaying ‘all the Norwegian data,’ but on GoCad and RedHat Linux! Can’t WesternGeco afford a Petrel license? Whatever the answer, the Paradigm folks on the booth opposite were amused.


** amplitude vs. offset



© Oil IT Journal - all rights reserved.