Saudi Aramco’s 100,000-trace seismics and 7 petabyte data center

Editor Neil McNaughton attends a mind-boggling presentation at the Las Vegas SEG. Processing multi-petabyte surveys forces a rethink of workflows and QC. Field tests and visual shot inspection are old hat, replaced with automated QC and interpretation. ‘Noise’ is no more; it’s all signal now.

I have a confession to make. I have, in a previous life, destroyed data. I knew I was doing it at the time, but I just could not help myself. Out in the field with a state-of-the-art hard disk and 21-track recording setup, we were concerned over noise levels and did what was considered, circa 1980, a ‘test.’ This involved fiddling around with a few different patterns of geophone arrays for input to the 48-trace system and visually inspecting the results. Like thousands of seismologists before (and after) us, we decided that a largish array gave us significantly better signal.

Even then I knew that there was something wrong with summing in the field. But my own guilt was mitigated by the knowledge that the data would be summed and visually inspected a lot more before the processing shop was through with it.

Fast forward 30 years to the 2012 meeting of the Society of Exploration Geophysicists this month in Las Vegas (full report in the December issue of Oil IT Journal) and two presentations from Saudi Aramco’s Peter Pecholcs and Brian Wallick that debunked once and for all the concepts of ‘stack’ and ‘noise.’

These were not theoretical talks. Aramco presented the results from three surveys conducted by WesternGeco using a 100,000-trace land seismic acquisition system. The idea was to abandon arrays in favor of point sources and receivers. Pecholcs observed that, ‘in the old days, the array gave us a good feeling, you could see data.’ But today the clever stuff is done in the computer center, ‘giving the signal some respect.’

The largest survey produced 165 billion traces and three petabytes of data. Processing this data abundance required some 7 petabytes of storage in the data center and a rethink of processing workflows and quality control. With such large volumes, ‘you can’t just do stuff (like changing a decon operator) over.’ The current practice of visual inspection of a 15 dB plot is ‘no good any more.’ A technique evolved around fast-track volumes followed by pilot processing of a mere 55 terabyte/9 billion trace data subset. Substantial up-front 2D field testing failed to give a clear-cut indication of the potential of the mega survey. But the survey went ahead regardless and produced spectacular 3D results. Pecholcs recommends you ‘don’t waste money on 2D tests.’
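As a back-of-the-envelope check of my own (not a figure from the talks), dividing the quoted volumes out gives a feel for the per-trace payload, assuming plain 32-bit samples and no header overhead:

    # Rough, illustrative arithmetic on the quoted survey size.
    # Assumes 4-byte samples and ignores trace headers -- my
    # assumptions, not figures from the presentations.
    traces = 165e9            # traces in the largest survey
    raw_bytes = 3e15          # ~3 petabytes of field data
    bytes_per_trace = raw_bytes / traces
    samples_per_trace = bytes_per_trace / 4   # 4 bytes per 32-bit sample

    print(f"~{bytes_per_trace/1e3:.0f} kB per trace, "
          f"~{samples_per_trace:.0f} samples per trace")
    # -> roughly 18 kB and some 4,500 samples per trace.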

To process these huge data volumes, computer technology needs to scale and QC needs to be improved; difference plots ‘just don’t hack it.’ Moreover, with good spatial sampling, what used to be thought of as ‘noise’ is actually coherent signal, scattered from a shallow low-velocity layer. This was the first time the top ten meters of sand had been imaged.
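To give an idea of what automated QC at this scale might look like, here is a minimal sketch of my own, not Aramco’s workflow: compute a simple per-trace statistic (RMS amplitude) and flag outliers against a robust threshold, rather than eyeballing shot gathers. The statistic, the threshold and the function name are all illustrative assumptions.

    import numpy as np

    def flag_anomalous_traces(gather, n_mad=5.0):
        """Flag traces whose RMS amplitude is a robust outlier in the gather.

        gather : 2-D array of shape (n_traces, n_samples).
        Returns a boolean mask of suspect traces. Illustrative only --
        the statistic and threshold are arbitrary choices, not anything
        described in the Aramco papers.
        """
        rms = np.sqrt(np.mean(gather ** 2, axis=1))        # per-trace RMS
        med = np.median(rms)
        mad = np.median(np.abs(rms - med)) + 1e-12         # robust spread
        return np.abs(rms - med) > n_mad * mad

    # Tiny usage example: a synthetic 48-trace gather with one dead
    # and one anomalously strong trace.
    rng = np.random.default_rng(0)
    gather = rng.normal(size=(48, 2000))
    gather[7] *= 0.0
    gather[23] *= 20.0
    print(np.flatnonzero(flag_anomalous_traces(gather)))   # [ 7 23]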

In the Q&A, Pecholcs was asked if ‘visual’ QC could usefully be replaced with statistical methods. He opined that what was needed was ‘just good geophysics, no neural nets to complicate our lives, go back to geophysics 101.’ Finer sampling means that you see what ‘noise’ actually is. But the scary thing is that the signal ‘is so complex that no time domain method can flatten an event, and no one can build a velocity model.’

Aramco’s project is for an ‘integrated broadband acquisition-to-rock mechanics’ methodology. Brian Wallick took over to address the interpretation and reservoir characterization aspects. The aim is for data that is easy to interpret; ‘data should interpret itself.’ One major problem is intra-bed multiple contamination, but as this is better sampled, it is easier to remove. Bandwidth has increased from 8-30 Hz to 3-45 Hz, bringing better lateral continuity and data that is much more amenable to autotracking. Aramco has only scratched the surface with this data set; there is lots more to do in terms of rock physics and ‘impedance fidelity.’

Just in case you don’t see the trend, the Aramco presentations were followed by Shell’s Guido Baeten, who described tests performed by BGP on the new HP/Shell/PGS wireless sensor network, with a million-channel capacity!

All in all, the race for traces represents a whole new set of challenges and opportunities. Tomorrow’s processing and interpretation will require more rigor and automation right across the workflow. More from Pecholcs’ and Wallick’s abstracts.

Follow @neilmcn

© Oil IT Journal - all rights reserved.