SEG Dallas 2003

The mood at this year’s SEG convention was downbeat (again!). Oil companies are still not spending on geophysics. Salvation may come from new technology. 4D (time lapse) seismics continues to get good press while 4C (multi-component) has yet to prove its merits. Despite this, there was a plethora of developments in 4C hardware from companies such as Thales (since bought by CGG), Input/Output and ABB. The general impression is that if you are going to equip your ‘e-fields’ with permanent sensors, they might as well be 4C. By the time you have recorded a significant amount of data, someone will have figured out what to do with it all! Computer-aided seismic data mining, automated 3D voxbody tracking and facies classification are all the object of much current research. R&D spend by oils continues to decline. While the service sector takes up some of the slack, payback time comes in the technical sessions, with more and more ‘infomercials’ passed off as science.

For WesternGeco president Dalton Boutte, geophysics is the ‘alpha and omega’ of the reservoir, although the end of wireline is not yet nigh—‘We are Schlumberger after all!’ Boutte emphasized the need for timely delivery of survey results. Results from the first ‘Q-on-Q’ survey comparison for Statoil were available 10 days after the end of acquisition and showed a ‘clear 4D signal’ from water injection. R&D spend continues its decline: the five majors spent $4bn on R&D in 1990, a figure that has since halved. Service sector spend over the same time frame has doubled, from $0.5bn to $1bn.

Legacy is history

Shell E&P America president Raoul Restucci offered encouragement to seismic contractors since 4D seismics needs high quality data, not legacy or spec data. Shell’s real time operations from New Orleans are a success, with up to nine wells around the world controlled from the center. Restucci praised Landmark’s DecisionSpace as key to the integration of G&G software. Shell is ‘very committed’ to 4D. But 4C (multi-component) is ‘not yet mature—too slow and costly’. On the demographic front, training and mentorship are proving essential in the face of a collapse in geoscience degree intake.


China National Petroleum Corp. (CNPC) unit BGP has embarked on an ambitious program to develop an integrated seismic processing and interpretation environment, ‘GeoEast’. Due for release at year-end 2004, GeoEast will leverage BGP’s GRISYS seismic processing and GRIStation interpretation systems. The project is CNPC’s largest software project, with 300 developers. The BP Center for Visualization at the University of Colorado is working on an Interactive Drilling Planner for a consortium including BP, ChevronTexaco, Exxon and Shell. The tool, to be commercialized by Earth Decision Sciences (EDS), offers ‘proactive’ well planning by anticipating uncertainty in formation tops and offering realistic well path updates. EDS is extending its flagship GoCad suite to ‘full featured’ 3D seismic interpretation with VolumeExplorer. Features include GeoProbe-like ‘cursor’ functionality, horizon gridding, feature extraction and fault picking. New functionality in GeoModeling Corp.’s VisualVoxAT includes waveform picking, fault interpretation and seismic facies classification with neural nets.

N-dimensional database

Helical Systems’ HHArchive is an innovative storage system for large multi-dimensional data sets using technology similar to SDE/SDO. In fact, the brains behind Helical Systems’ Self Defining Structure (SDS) technology is Herman Varma, who was also involved in Oracle SDO and helped ESRI embed HHCode in SDE. Helical has partnered with TeraRecon to provide a data visualization bundle for HHArchive leveraging the ISO/TC211 standard for model and metadata. Paradigm was showing ‘across-the-board’ integration through Epos, which offers dynamic search, retrieval and publishing, ‘translating’ from one domain to another such that petrophysical ‘language’ is accessible, say, to a geophysicist. Also new are ‘SeisEarth’, a 2D/3D line-based interpretation system, ‘iMap’, a mapping tool, and a geocellular model viewer, FlowView. Schlumberger’s ‘next generation’ seismic interpretation system offers ‘smart’, automated structural interpretation with ‘ant tracking’ autopicking, along with a new vehicle for distributing interpretation results: an ‘Adobe Acrobat’ for seismic interpretation. Starting with a coherency volume, the software sends out ‘ant’ agents to seek out best paths along faults. The ants ‘deposit pheromones’ as they go; more ants follow and strengthen the fault pick. TeraBurst and HP demonstrated data sharing and high resolution graphics over the internet using TeraBurst’s video to optical (V2O) and video to data (V2D) products.
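The ant-colony idea can be illustrated with a toy sketch. The code below is an illustrative guess at the general approach (Schlumberger’s actual algorithm is proprietary); it assumes a 2D coherency section where low coherency marks discontinuities, and sends ants downwards, biased towards discontinuous or already pheromone-marked cells:

```python
import numpy as np

def ant_track(coherency, n_ants=32, n_steps=30, deposit=1.0, evaporation=0.1, seed=0):
    """Toy ant-colony fault tracker. Ants start at the most discontinuous
    (lowest-coherency) cells, walk downwards preferring discontinuous or
    pheromone-marked cells, and deposit pheromone as they go. Weakly
    reinforced paths fade through evaporation."""
    rng = np.random.default_rng(seed)
    ny, nx = coherency.shape
    pheromone = np.zeros_like(coherency, dtype=float)
    # seed one ant at each of the n_ants lowest-coherency cells
    flat = np.argsort(coherency, axis=None)[:n_ants]
    starts = np.column_stack(np.unravel_index(flat, coherency.shape))
    for y0, x0 in starts:
        y, x = int(y0), int(x0)
        for _ in range(n_steps):
            pheromone[y, x] += deposit
            # candidate moves: the three cells below (faults taken as ~vertical)
            cands = [(y + 1, x + dx) for dx in (-1, 0, 1)
                     if y + 1 < ny and 0 <= x + dx < nx]
            if not cands:
                break
            # attractiveness = discontinuity plus pheromone already laid down
            scores = np.array([(1.0 - coherency[cy, cx]) + pheromone[cy, cx]
                               for cy, cx in cands])
            probs = scores / scores.sum() if scores.sum() > 0 else None
            y, x = cands[rng.choice(len(cands), p=probs)]
        pheromone *= 1.0 - evaporation  # evaporate between ants
    return pheromone
```

Thresholding the returned pheromone map would then yield the strengthened fault picks.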


Adic’s Pathlight VX makes disk drives ‘look like’ tape. The system can be deployed to replace tape drives on acquisition systems with disks without re-writing software. Legacy tape backup systems can be replaced with disks—at 1TB/hour—or the disk-based system can be a rapid front end backup prior to staging to tape storage. DataDirect’s S2A3000 Fibre Channel/Serial ATA storage appliance gives 1.5 Gbytes/sec bandwidth and 0.25 Petabyte Serial ATA storage per unit. DataFrameworks’ software provides an interface between applications and storage resources, imposing consistent naming conventions across file systems, and offers a tree-structure interface to storage resources. FileTek’s StorHouse GeoCube seismic data storage and management system provides for direct SEG-Y data load to workstation. IBM was showing its new 3592 tape cartridge with 100-300GB capacity and 40MB/sec transfer rate. A 3494-L22 robot holds 180 cartridges and also works in STK silos. LSI Logic has also entered the Serial ATA (SATA) space and claims the first enterprise Serial ATA storage system leveraging cheap disks and a Fibre Channel interface. Ovation’s latest PetaSite entry-level configuration with two SCSI drives comes at around $100k. Orad Inc.’s graphic card cluster-based visualization hardware is being tested by Shell. Originally developed for broadcast, Orad uses ‘standard’ ATI or NVidia graphics cards and up to 2GB of texture memory. SpectraLogic’s 7950 Tape Library is a new multi-format tape library with 500TB per cabinet. Five cabinets can share a common robot. Spinnaker Networks’ SpinServer NAS solutions scale from 2 to 11PB. Trango is re-writing its Manager software using Microsoft’s .NET framework. A web front end allows for map and text search of seismics by line name, date shot, fold or source.


IBM’s Grid-based processing promises ‘CPU cycles on demand’ leveraging United Devices’ GRID MP. Landmark’s ProMagic Server connects ProMax 3D processing software with Magic Earth’s GeoProbe so that interpretation software can be used in processing. Accessing pre-stack data ‘brings the power of interpretation software to the processor’. Dave Roberts (BP) showed GeoProbe use in ‘public relations’ on Algerian assets. The big picture includes surface facilities, 3D seismic and well paths. Core photos pop up at scale and in situ. Landsat imagery, flowlines and facilities are displayed, including photos of construction work in progress. Zooming out, the whole of Algeria becomes visible—topography, roads (ex Michelin map) and surface geology. For Roberts this made for ‘one of the most impressive images of my career’. Integrating all this into GeoProbe is ‘work in progress’. Regional interpretation surfaces and reservoir simulation results will complete the picture ‘real soon now’. A plea to vendors: ‘make it easier to do this—so you don’t have to mess around with pixels’. Also of note is a new video showing use of GeoProbe on BP’s Wytch Farm field.

Data mining

Richard Uden of Rock Solid Images described a seismic data mining technique that ‘seeks to establish patterns in data’ and to ‘provide understanding of the parameters controlling the data being mined’. The process uses progressive steps, gradually reducing the volume of data samples being mined. Lithology, porosity and fluid type can be ‘mined’ without visual inspection of large 3D datasets. Christoph Arns (Australian National University) showed how digital X-ray CT images of rock can be used to compute formation factor, permeability and elastic properties.
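The ‘progressive reduction’ idea can be sketched as a chain of attribute filters. This is a hypothetical illustration only: the actual Rock Solid Images workflow, attributes and cutoffs are not published, so the impedance and porosity fields below are invented for the example.

```python
import numpy as np

def progressive_mine(samples, filters, verbose=True):
    """Apply a sequence of named boolean predicates to an attribute table,
    shrinking the set of seismic samples at each step."""
    kept = samples
    for name, predicate in filters:
        kept = kept[predicate(kept)]
        if verbose:
            print(f"after '{name}': {len(kept)} samples remain")
    return kept

# hypothetical attribute table: acoustic impedance and predicted porosity
rng = np.random.default_rng(1)
samples = np.zeros(10000, dtype=[("ai", float), ("por", float)])
samples["ai"] = rng.normal(7000, 1000, 10000)
samples["por"] = rng.uniform(0.0, 0.35, 10000)

# each step mines a smaller volume than the last
pay = progressive_mine(samples, [
    ("low impedance", lambda s: s["ai"] < 6000),
    ("high porosity", lambda s: s["por"] > 0.25),
])
```

The end result is a small candidate set that can be inspected or classified without visualizing the full 3D volume.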


Matthew Hill (IBM) described a seismic data mining technique, PetroSpire, supporting identification of seismic features of interest. The technique is used to screen large volumes of seismic data in an exploration context. Data features are classified by texture or variance. The prototype system centers on a novel indexing technique for seismic data which supports tasks such as finding zone intervals between horizons and faults.
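Blocked sample variance makes a minimal texture attribute. The sketch below is an assumption about the general approach (PetroSpire’s actual indexing scheme is not described in detail); it labels windows of a 2D section as smooth or textured:

```python
import numpy as np

def variance_texture(section, win=4, thresh=0.01):
    """Classify non-overlapping win x win windows of a 2-D seismic section
    by their sample variance: 0 = smooth, 1 = textured/discontinuous."""
    ny, nx = section.shape
    ny2, nx2 = ny // win, nx // win
    # group the section into (ny2 x nx2) blocks of win x win samples
    blocks = section[:ny2 * win, :nx2 * win].reshape(ny2, win, nx2, win)
    var = blocks.var(axis=(1, 3))
    return (var > thresh).astype(int)
```

A screening system could index these per-window class labels rather than raw samples, so that queries like ‘find textured zones between two horizons’ reduce to lookups.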


Mike Glinsky (BHP Billiton) described the emergence of Java as a ‘serious numerical and 2D graphics language’ and the ‘general acceptance’ of open source maintenance agreements. Java’s floating point speed now rivals that of Fortran. BHP leverages Seismic Unix and proprietary modules for wavelet-based lithofacies identification and stochastic inversion. An open source Java-based data viewer was presented. Landmark’s Dave Hale is also working on computer-aided seismic interpretation, leveraging a ‘space-filling, feature-aligned mesh’ which makes global segmentation of 3D seismic images feasible. Unlike local event tracking or region growing methods, these work with the entire image. Hale speculates that ‘further automation is possible’; in fact, for 3D images automated techniques may improve on human interpretation.


The Standards Committee began with an ‘action replay’ of last year’s discussion on the use of the EPSG survey data embedded into the SEG standards. The latest recommendation is that this dataset be hosted by the SEG with care taken to synchronize the standard with other mirrors. A platform-independent SQL version of the EPSG database is published this month.


Draft proposals are under examination for fixes/updates to SPS and SEG-D. The consensus is that SEG-D is probably beyond a ‘fix’ – and that there is a better case for a new SEG-‘x’ (presumably SEG-E) format – to move away from the ‘shot domain’ focus of prior standards. At the same time such a new format will be designed to integrate modern computer data standards and will likely move away from the tape focus of the old standards.

This article has been abstracted from a 25 page illustrated report produced as part of The Data Room’s Technology Watch report service. If you would like to evaluate this service, please contact


© Oil IT Journal - all rights reserved.