Society of Exploration Geophysicists 79th Annual Meeting, Houston

The ‘Road Ahead’ Session heard about ‘non-coherent’ seismic acquisition, robot ‘dragonflies,’ the end of Moore’s Law and the race to the exaflop, and new processing technology from Aramco. StatoilHydro reported on twenty years of hugely successful sea bed acquisition. ExxonMobil unveiled its DHI/AVO cook-book, which has lifted the probability of exploration success from 30% to 50%. Chris Liner quantified the carbon capture and storage problem. Petrosys demoed OGC-compliant web map services.

In the excellent SEG Forum on the Road Ahead, Guus Berkhout (Delft/Delphi) argued that the current race for ever higher trace counts and exponentially growing data volumes is both unsustainable and unnecessary. Instead of conventional shooting at regular intervals, Berkhout argues for ‘non-coherent’ seismic acquisition: the same number of shots gives better results if fired ‘incoherently,’ with overlapping shots ‘de-blended’ in processing. The results as presented are spectacular. Further gains are to be had by migrating multiples (‘they are sources too!’) and by considering the underside of reflectors, an ‘inverse data set.’ Berkhout used the analogy of the Energy Internet (a.k.a. Smart Grid) to suggest that swarms of micro robots will be used to deploy sensors on the sea bed—or robot ‘dragonflies’ that will land briefly to record a ‘non-coherent’ shot!
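To make the idea concrete, here is a minimal numerical sketch of blended, ‘incoherent’ acquisition and its ‘pseudo-deblending’ adjoint, assuming a toy model of one trace per shot and made-up firing times. It illustrates the concept only and is not the Delphi group’s code.

```python
# Minimal sketch of 'blended' (simultaneous source) acquisition and
# pseudo-deblending, the adjoint of the blending operator. Toy model:
# one trace per shot, random ('incoherent') firing times.
import numpy as np

rng = np.random.default_rng(0)
n_shots, n_t = 8, 500                      # 8 shots, 500 samples each

# Synthetic single-shot records: a spike 'reflection' plus weak noise
shots = 0.05 * rng.standard_normal((n_shots, n_t))
for i in range(n_shots):
    shots[i, 100 + 10 * i] += 1.0          # a dipping 'event'

# Incoherent acquisition: shots fire at random, overlapping times and
# are summed into one continuous record (the blending operator).
record_len = 4 * n_t
fire = rng.integers(0, record_len - n_t, size=n_shots)
blended = np.zeros(record_len)
for i, t0 in enumerate(fire):
    blended[t0:t0 + n_t] += shots[i]

# Pseudo-deblending: apply the adjoint, i.e. cut the blended record back
# into shot gathers at the known firing times. Interfering shots appear
# as incoherent noise, to be attenuated in subsequent processing.
pseudo = np.stack([blended[t0:t0 + n_t] for t0 in fire])

err = np.linalg.norm(pseudo - shots) / np.linalg.norm(shots)
print(f"relative blending-noise level after pseudo-deblending: {err:.2f}")
```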

Juan Meza (Lawrence Berkeley National Lab—formerly with ExxonMobil) noted that ‘computing is changing more rapidly today than ever before.’ 2004 saw the end of Moore’s Law as we knew it, after 15 years of exponential growth: while transistor counts kept on growing, clock speeds flattened because of power dissipation. Today it is the number of cores per chip that is doubling every 18 months, rather than clock frequency. This is having a huge impact on supercomputing, whose whole architecture is about to change. The PC/COTS paradigm is no longer driving HPC, as witnessed by RoadRunner, a petaflop machine introduced in 2008 with 6,000 AMD Opterons and 13,000 IBM Cell BEs. Another petaflop machine, the Cray XT5 Jaguar, has 10,000 Opteron cores. The TOP500 graph shows exponential growth since 1994; by extrapolation, we may see an exaflop machine by 2020. What does this mean for programming and applications? Another TOP500 metric is concurrency. As core counts rise, clock speeds may actually decrease, while machines handle millions of concurrent threads and offer inter- and intra-chip parallelism. With chips such as the IBM Cell, GPUs, Sun’s Niagara 2 and Intel’s network processors, ‘the processor is the new transistor.’ Meza forecasts that, following this period of rapid change, Intel will continue to be a market leader and HPC will stabilize on a new architecture and new programming techniques. The change will be like Exxon’s move from Cray to clusters. MPI will persist because of its installed base, but we will see ‘MPI+’ with the arrival of PGAS languages, CUDA, and ‘auto-tuning*,’ programs that write programs to search across an ‘optimum space.’ Long term, we can expect Peter Kogge’s DARPA ‘Exascale’ program to pay off, but this implies solving problems such as the performance flattening of current architectures and power consumption. Meza questioned whether 100MW of power for a billion-node machine is feasible, noting that while PC market investment is declining, embedded processor investment is on the up. HPC salvation may come from iPhone/MP3 player technology and its focus on minimizing power consumption. More from https://hpcrd.lbl.gov/html/FTG.html.
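As an illustration of the ‘auto-tuning’ idea, the sketch below searches a small optimization space, the tile size of a blocked matrix multiply, and keeps the empirically fastest variant. The kernel and tile sizes are our own illustrative choices, in the spirit of real auto-tuned libraries such as ATLAS and FFTW rather than any LBNL code.

```python
# Toy 'auto-tuner': benchmark several tile sizes for a blocked matrix
# multiply and keep the fastest one on this particular machine.
import time
import numpy as np

def blocked_matmul(A, B, tile):
    """Blocked matrix multiply with a given tile size."""
    n = A.shape[0]
    C = np.zeros((n, n))
    for i in range(0, n, tile):
        for j in range(0, n, tile):
            for k in range(0, n, tile):
                C[i:i+tile, j:j+tile] += A[i:i+tile, k:k+tile] @ B[k:k+tile, j:j+tile]
    return C

def autotune(n=512, tiles=(32, 64, 128, 256)):
    A = np.random.rand(n, n)
    B = np.random.rand(n, n)
    best, best_t = None, float("inf")
    for tile in tiles:                       # exhaustive search of the space
        t0 = time.perf_counter()
        blocked_matmul(A, B, tile)
        elapsed = time.perf_counter() - t0
        print(f"tile={tile:4d}  {elapsed:.3f} s")
        if elapsed < best_t:
            best, best_t = tile, elapsed
    return best

if __name__ == "__main__":
    print("best tile size on this machine:", autotune())
```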

Tim Keho of Saudi Aramco’s EXPEC R&D unit outlined a ‘new era’ for land seismic, addressing the near-surface challenge. Exploring in Saudi Arabia involves low-relief structures beneath near-surface karsts (up to 600 m), sand dunes and scarps that are ‘hard, if not impossible to model.’ Elsewhere there are sand dunes up to 500 feet in height. Traditional solutions to static correction are passé—they don’t work, and it is easy to ‘lose’ low-relief structures in near-surface ‘noise.’ Current approaches include fast autopickers and imaging—but ‘what can you image in the near surface?’ Keho claimed that Aramco’s new solution, to be presented at next year’s SEG, ‘turns the problem around and treats the whole near surface issue as an imaging problem.’ Microgravity was an also-ran. Aramco’s most challenging problem is not PSDM but ‘statics,’ which must now be treated as an imaging issue. A similar approach was described recently by the University of Houston’s Arthur Weglein in the Houston Chronicle**!
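For readers unfamiliar with ‘statics,’ here is a minimal sketch of the conventional elevation-static correction that, per Keho, falls short beneath karst and dunes. The datum, replacement velocity and elevations are made-up illustrative values, and weathering-layer and uphole-time corrections are ignored.

```python
# Minimal sketch of a conventional elevation-static correction: the kind
# of 'traditional' fix that struggles with complex near-surface geology.
# All numbers below are assumed, illustrative values.

def elevation_static_ms(elevation_m, datum_m, v_replacement_ms):
    """Time shift (ms) to move a source or receiver down to the datum,
    ignoring weathering-layer and uphole-time corrections."""
    return -1000.0 * (elevation_m - datum_m) / v_replacement_ms

datum = 200.0            # seismic reference datum (m), assumed
v_repl = 2500.0          # replacement velocity (m/s), assumed
for station, elev in [("source", 265.0), ("receiver", 231.0)]:
    print(f"{station}: static = {elevation_static_ms(elev, datum, v_repl):+.1f} ms")
```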

Felix Herrmann (UBC Seismic Lab for Imaging and Modeling—SLIM***) agreed with Berkhout that acquisition costs and processing turnaround times are impediments to the modern seismic workflow. Moreover, ‘Moore’s law is coming to an end, we can no longer compute ourselves out of this mess.’ Today’s sampling is too pessimistic: acquisition can be optimized by shooting more sparsely and ‘filling in the blanks,’ adjusting sampling to subsurface complexity. New maths, the Johnson-Lindenstrauss lemma**** and ‘incoherent’ random simultaneous-source acquisition mean that we are on the cusp of a breakthrough in seismic imaging. Sparse is cheap and promises faster turnaround.
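The following toy example, with arbitrary dimensions of our own choosing rather than SLIM’s parameters, shows the Johnson-Lindenstrauss effect Herrmann invokes: a random projection to a much lower dimension approximately preserves pairwise distances, which is why sparse, randomized sampling can retain the information needed for imaging.

```python
# Toy demonstration of the Johnson-Lindenstrauss lemma: projecting
# high-dimensional vectors onto a random low-dimensional subspace
# approximately preserves pairwise distances. Dimensions are arbitrary.
import numpy as np

rng = np.random.default_rng(1)
n, d, k = 40, 5000, 300            # 40 vectors, ambient dim 5000, target dim 300

X = rng.standard_normal((n, d))
P = rng.standard_normal((k, d)) / np.sqrt(k)   # random projection matrix
Y = X @ P.T                                    # compressed representation

def pdist(Z):
    """All pairwise Euclidean distances between the rows of Z."""
    diff = Z[:, None, :] - Z[None, :, :]
    return np.sqrt((diff ** 2).sum(-1))[np.triu_indices(len(Z), 1)]

ratio = pdist(Y) / pdist(X)
print(f"distance distortion: min {ratio.min():.3f}, max {ratio.max():.3f}")
# Ratios typically stay within roughly 10% of 1.0 here, despite the
# ~17x reduction in the number of stored values.
```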

Our virtual ‘talk of the show’ award goes to Mark Thomson (StatoilHydro) for his presentation on two decades of ocean bottom seismic experimentation. StatoilHydro has acquired 62 ocean bottom (OB) surveys since 1989. Acquisition began on Gullfaks and Statfjord, with increasingly technology-intensive techniques—3D, 4D repeat surveys, shear wave sources and, most recently, focused seismic monitoring with fiber optics on the seabed. A survey on the Tommeliten field demonstrated the feasibility of imaging through a gas cloud with shear wave data. Surveys have identified wells located in the wrong reservoir compartment, pressure build-up and non-sealing faults. The ROI for the technique is ‘huge.’ Data volumes have risen steadily over the years, but processing time has stayed constant at about one year per survey. Focused seismic monitoring uses Optoplan’s sensors and is a facet of Norway’s push to ‘Integrated Operations,’ allowing for near real-time use of the information. Ocean bottom acquisition is moving towards densely sampled seismic ‘carpets,’ a digital fiber-optic oil field and a seismic ‘cloud’ of autonomous nodes. Ocean bottom techniques have informed conventional acquisition such as wide-azimuth and dual-streamer shooting. Data from the seismic ‘cloud’ is now transferred to the office in hours and routed to stakeholders for QC and analysis.

Following Charles MacFarlane’s hugely entertaining jacket-slicing act on the Schlumberger booth, Alex Ross provided an update on GeoFrame and Petrel interaction, offering insights into Schlumberger’s differentiation of the two toolsets. GeoFrame targets very big, multi-data-type projects, illustrated by a 15,000 sq. km project with 180GB of seismic accessible from a high-end workstation. A new ‘Send to Petrel’ option produces a Petrel ‘.zgy’ file of seismic data along with a zip file of the interpretation, which can be picked up as a complete project in Petrel. Ross concluded that GeoFrame has ‘many years before it,’ and Schlumberger is still adding new features. Integration with Petrel is easy, with a reduced risk of changes in formats and units. Ross recommends ‘staying with a single vendor solution.’

Most other vendors would no doubt concur—although they may differ as to which single vendor to choose. SMT was showing the new Kingdom Geomodeling option, leveraging JOA’s patented Jewel Suite gridding technology. This claims superiority over [Petrel’s] pillar gridding, with better cell sizing, geometry and distribution. SMT’s new workflow runs from interpretation in Kingdom, back and forth to Geomodeler and, optionally via Jewel Suite, into the simulator.

On the Petrosys booth, Paul Jones provided an entertaining talk on ‘Maps in the age of Twitter.’ Jones showed how OGC-compliant web map services (WMS) can be used to produce a wide variety of maps on multiple devices. Petrosys offers a ‘publish to WMS’ button to enable such functionality, allowing industry-specific maps to be mashed up with public-domain data such as OneGeology, the Gaia web map system and the Canadian Geoscience Knowledge Network, or with commercial services such as Valtus.
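For the curious, a ‘publish to WMS’ workflow boils down to serving standard OGC GetMap requests like the one sketched below. The endpoint URL and layer name are placeholders of our own invention, not Petrosys’ or OneGeology’s actual services; only the parameter names are standard WMS 1.3.0.

```python
# Minimal sketch of an OGC WMS 1.3.0 GetMap request. Endpoint and layer
# name are hypothetical placeholders; parameter names follow the spec.
from urllib.parse import urlencode

WMS_ENDPOINT = "https://example.com/wms"       # placeholder service URL
params = {
    "SERVICE": "WMS",
    "VERSION": "1.3.0",
    "REQUEST": "GetMap",
    "LAYERS": "prospect_outlines",             # hypothetical published layer
    "STYLES": "",
    "CRS": "EPSG:4326",
    "BBOX": "25.0,45.0,30.0,55.0",             # minlat,minlon,maxlat,maxlon for EPSG:4326
    "WIDTH": "800",
    "HEIGHT": "600",
    "FORMAT": "image/png",
    "TRANSPARENT": "TRUE",
}
print(f"{WMS_ENDPOINT}?{urlencode(params)}")
# Any WMS client (desktop GIS, browser, mobile device) can issue the same
# request and overlay the result on public-domain layers such as OneGeology.
```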

Chris Liner (University of Houston) gave an insightful presentation on the geological sequestration of CO2. Worldwide energy consumption is measured in ‘quads’ (10¹⁵ BTU). In 2005 some 436 quads were consumed, producing 27 gigatonnes of CO2. The forecast for 2030 is 680 quads—the number tracks population growth—with a concomitant rise in CO2 emissions to 43 gigatonnes. To put this in context, the CO2 vented in 2005 is roughly equal to four times the world’s natural gas production! There are therefore ‘huge economic/infrastructure costs to carbon capture and storage (CCS).’ The current Norwegian Sleipner CCS test is capturing around 1 million tonnes/year.
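A quick back-of-envelope check of these figures follows; the CO2 density and 2005 world gas production used in the volume comparison are our own assumptions, flagged in the comments.

```python
# Back-of-envelope check of the figures quoted above. The quad-to-CO2
# intensity is implied by the 2005 numbers; CO2 density and 2005 world
# gas production are outside assumptions.
QUADS_2005, CO2_GT_2005 = 436, 27        # from the presentation
QUADS_2030 = 680

intensity = CO2_GT_2005 / QUADS_2005     # ~0.062 Gt CO2 per quad
print(f"implied intensity: {intensity*1000:.0f} Mt CO2 per quad")
print(f"2030 emissions at the same intensity: {QUADS_2030 * intensity:.0f} Gt")
# -> ~42 Gt, consistent with the 43 Gt quoted.

# Volume comparison (assumptions: CO2 density ~1.9 kg/m3 at surface
# conditions; 2005 world gas production ~2.8 trillion m3).
co2_volume_m3 = CO2_GT_2005 * 1e12 / 1.9
gas_production_m3 = 2.8e12
print(f"vented CO2 / gas produced (by surface volume): "
      f"{co2_volume_m3 / gas_production_m3:.1f}x")
# -> of the order of 4-5x, in line with the 'roughly four times' quoted.
```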

Henry Posamentier (Chevron) demonstrated the use of Paradigm’s Gocad to study the seismic expression of depositional systems and to predict lithology. Pattern recognition is the key, as Posamentier showed with spectacular comparisons of the Albertan Cretaceous with aerial photos of the Mississippi flood plain—‘taking Vail into the 21st Century.’ An impressive seismic firework display.

Bill Fahmy described ExxonMobil’s direct hydrocarbon indicator (DHI) best practices, noting that a) there are no silver bullets and b) an AVO anomaly does not necessarily mean hydrocarbons are present. DHI is all about the application of technology and fundamental scientific thought. Despite the caveats, the technique is hugely successful: worldwide, prospects without DHI show a 30% probability of success; with the technique this jumps to 50%! ExxonMobil established internal guidelines for DHI in 1997 and these are continually revised in the light of experience. The company does its own controlled amplitude and phase processing, with bandwidth balancing to compensate for wavelet changes with offset. Prospects are evaluated with a DHI quality vs. confidence matrix, calibrated against analogs and historical data.
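To see what the jump from 30% to 50% means for a drilling program, here is a trivial sketch; the portfolio size and well cost are hypothetical.

```python
# Simple illustration of what the quoted probabilities imply for a
# drilling program. The 30% and 50% figures are from the talk; the
# portfolio size and well cost are hypothetical.
n_prospects, well_cost_musd = 10, 50

for label, pos in [("without DHI", 0.30), ("with DHI analysis", 0.50)]:
    expected_discoveries = n_prospects * pos
    wells_per_discovery = 1.0 / pos
    print(f"{label:18s}: {expected_discoveries:.1f} expected discoveries, "
          f"{wells_per_discovery:.1f} wells (~${wells_per_discovery * well_cost_musd:.0f}M) per discovery")
```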

* links/0911_1.

** links/0911_2a and 2b.

*** links/0911_3.

**** links/0911_4.

This article is an abstract from The Data Room’s Technology Watch from the 2009 SEG Convention. More from www.oilit.com/tech.

© Oil IT Journal - all rights reserved.