Future Computing

At the Leading Edge’s crystal-ball gazing session on Future Computing, presentations came from Hampson-Russell, Schlumberger, IBM, the University of Southern California and ChevronTexaco. The ‘seers’ concur that the future will see fast, ubiquitous computing – maybe with less ‘Nintendo geology’ and better exploitation of the ‘human computer’.

Brian Russell (Hampson-Russell) commented on trends in seismic survey size, multi-component analysis, pre-stack depth migration (PSDM), time-lapse seismics and visualization – all of which are extremely compute-intensive. As hardware costs tumble, Russell’s future computing has the laptop PC as the computer of choice, running Windows or Linux (for power users). One problem is finding top-notch programmers – it’s hard to interest youngsters in geophysics. Russell favors a ‘new focus’ on object-oriented programming and a move away from ‘Nintendo geology’ – with the application of more diagnostic science.

Beasley

Schlumberger Fellow Craig Beasley wants to go beyond more data and faster machines, and into ‘better resolution’. Today’s seismic acquisition can involve around ½ TB/day. On the hardware front, Beasley notes that in terms of compute power per $ ‘we are beating Moore’s law’. In 1972 an offshore survey cost $6,000 to process – the equivalent of $12,000 at today’s prices. The same survey could be processed for $1,000 today. The future will see ubiquitous high-performance computing through technologies like GRID computing – commoditized CPU cycles. The future will also bring ‘bullet-proof’ computing (not just ‘fault-tolerant’), pervasive 3D visualization, wireless computing and continuous field monitoring.
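As a rough sketch of the real-terms cost drop Beasley describes, using only the figures quoted above (the inflation multiplier is implied by the $6,000/$12,000 pair, not taken from a published index):

    # Beasley's quoted figures, as reported above
    cost_1972_nominal = 6_000    # USD, cost to process an offshore survey in 1972
    cost_1972_real    = 12_000   # USD, the same cost expressed in today's prices
    cost_today        = 1_000    # USD, cost to process the same survey today

    inflation_factor = cost_1972_real / cost_1972_nominal   # ~2x, implied deflator
    real_reduction   = cost_1972_real / cost_today           # ~12x cheaper in real terms
    print(f"Implied inflation factor: {inflation_factor:.0f}x")
    print(f"Real-terms processing cost reduction: {real_reduction:.0f}x")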

Klepacki

IBM researcher David Klepacki described IBM’s ambitious ‘Blue Gene’ Linux supercomputer. Blue Gene M is a switch-less, massively parallel machine with 80,000 dual PowerPC nodes. IBM also has plans for a one-million-processor Blue Gene P, deliverable by 2009.

Neumann

Ulrich Neumann is director of the Integrated Media Systems Center (IMSC) at the University of Southern California – a $10 million/year facility for studies in streaming media, perception and cognitive modeling, computing and virtual reality. For Neumann, ‘we are infovores’ and require rich visual stimulation to ‘get our endorphins going’. IMSC studies the design of systems that harness engineering and science information to exploit the brain’s processing power. IMSC’s ‘Remote Media Immersion’ fuses internet browsing with an immersive theatre experience, delivered as a 45 Mbit/s stream of HD video with 12 channels of audio. The technology could be used by doctors for remote operations, or to facilitate negotiations ‘as if you were in the same room.’ In the future, ‘widespread tele-presence will replace email’.

Paul

Don Paul (ChevronTexaco) believes that the future will see even more integrated oil companies and more ‘reality’ in reservoir modeling. Other objectives are to augment the human ‘computer’ and to manage IT and business together as an ‘ecology.’ Paul observes ‘exponential’ growth in desktop network traffic at ChevronTexaco (CT) – currently around 55 GB/month. CT’s global IT spend splits as follows – 34% upstream, 53% downstream and 13% corporate and finance. Growth is seen in finance, security, communications, supply chain, video and visualization, real-time and ‘simulation of all kinds’. IT is ‘no longer just number crunching’.

This article is abstracted from a 24-page illustrated report on the 2002 SEG Convention, produced as part of The Data Room’s Technology Watch service. For more information, email tw@oilit.com.

© Oil IT Journal - all rights reserved.