The floor plan of your regular trade show has the major vendors occupying the plushy carpeted central ‘high ground’ and the assorted minnows, dot orgs and consortia stuck out around the exhibit floor’s periphery. At the 2007 European Association of Geoscientists and Engineers (EAGE) Conference, held this month in London’s Docklands, the major vendors were joined by seven major oils—playing a newly ostentatious recruitment role. The oil and gas high ground is very busy these days. But you have to ask, how many really big stands should a trade show allow?
Hardware
In the recent past, pre-stack data was confined to the ‘silo’ of the processing house. Seismic processors would take the vast data volumes recorded in the field (some 300 terabytes (TB) for a Gulf of Mexico 3D survey) and process them down to a few hundred GB for interpretation. But interpreters interested in amplitude vs. offset, or in other pre-stack indicators of the presence of hydrocarbons, are increasingly leveraging pre-stack data—stressing data storage, network bandwidth and project data loading times. According to a NetApp estimate, there are about 70 petabytes (PB) of upstream data stored on spinning disks today. Experience suggests this is probably already an underestimate; if it isn’t, it soon will be.
Scalable Graphics
Accessing considerable data volumes is not just a pre-stack issue. Performant visualization of a few hundred GB of stacked data requires considerable hardware gymnastics. One such offering was on show on the Paradigm booth. French start-up Scalable Graphics was showing a cluster-based data service for visualization of 400 GB datasets distributed across eight machines. GOCAD connects to the cluster, which renders the data and serves it up to the client workstation over 100 MB Ethernet. A 64-node machine with 1.2 TB bandwidth is in development. Doing the same kind of thing on a single client requires data decimation (see the sketch below). The cluster solution works at full data resolution and offers a roaming frame rate of 60 frames per second. The overall result is performance akin to GeoProbe on a large shared memory architecture—without the data-to-memory load times.
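For the curious, here is a minimal sketch (ours, not Scalable Graphics’ code) of what ‘data decimation’ on a single client amounts to, and why a cluster that keeps the full-resolution cube is attractive. The array size and stride are arbitrary stand-ins.

```python
# Minimal sketch: a single workstation must decimate the cube to fit it in
# memory, losing detail; a render cluster keeps the full-resolution data
# distributed across nodes and streams rendered frames to the client.
import numpy as np

def decimate(volume: np.ndarray, stride: int = 4) -> np.ndarray:
    """Keep every `stride`-th sample along each axis so the cube fits locally."""
    return volume[::stride, ::stride, ::stride]

# A toy 'seismic cube' standing in for a few hundred GB of stacked data.
full = np.random.rand(256, 256, 256).astype(np.float32)   # ~67 MB here
local_view = decimate(full, stride=4)                       # 1/64 of the samples

print(f"full resolution : {full.nbytes / 1e6:.0f} MB")
print(f"decimated view  : {local_view.nbytes / 1e6:.0f} MB")
```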
Results management
Hats off to Schlumberger for a ‘vendor-independent’ presentation, by Statoil’s Cathrine Gunnesdal, of the use of ProSource Results Manager (RM) for capturing interpretation results from applications including Landmark’s OpenWorks along with Schlumberger’s own Eclipse reservoir flow modeler. Statoil uses RM to capture projects at ‘decision gates’ such as a recommendation to drill, following a basin modeling exercise or prior to a license application. A high-level Statoil governance ruling obligates knowledge workers to clean up their projects before storage in RM. According to Gunnesdal, standard nomenclature has been the key to success. Project data is kept for a year before deletion. ProSource is also used to create and QC OpenWorks projects and, in particular, to track which seismic interpretation goes with which Eclipse model—this is ‘impossible in normal workflow.’ The benefits to Statoil include an ‘awareness of doing things right,’ and assured data management and quality through a proactive approach and strict nomenclature.
Seabed logging
It’s nearly three years since Dalton Boutte made the bold claim that seabed logging ‘could replace seismics’ (OITJ Oct 04) and we thought we’d see how things were progressing. EMGS claims market leadership with five crews and 250 projects to date, ‘more than all our competitors combined.’ This survey count still leaves the technology some way behind seismics. But that didn’t stop PGS from picking up UK-based MTEM just after the show for a cool $275 million.
Geocap
First announced in 2000, Geocap was founded by Olav Egelend, formerly with Technoguide/IRAP. The geo-visualization toolbox has gained some traction, with use by the UN in Law of the Sea arbitration. Geocap offers map calibration, color table management and customization through scripting. Visualization leverages the open source Kitware data model. Geocap’s seismic display capability was put to good effect by partner Roxicon, whose ‘Seismic Super Survey’ sets out to emulate PGS’ Mega Survey success in the Norwegian OCS.
SEG D Rev 3
The SEG standards committee met at the EAGE to progress the SEG-D Rev 3 tape standard. This looks beyond conventional seismics to new data types including passive ‘interferometry’ and seabed EM. The committee is planning a web service for software conformity testing. SEG-D has not embraced XML, but with a more rigorous lock-down of bit positions it should be easier to translate header information into a tagged format (see the illustrative sketch below).
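As a purely illustrative sketch (the field names, offsets and formats below are hypothetical, not the SEG-D Rev 3 layout), translating a fixed-position binary header into a tagged format becomes a mechanical exercise once bit positions are locked down:

```python
# Hypothetical fixed-layout header mapped to XML tags. The offsets and
# field names are invented for illustration; they are not SEG-D Rev 3.
import struct
import xml.etree.ElementTree as ET

# (name, byte offset, struct format) -- illustrative only.
FIELDS = [
    ("file_number",    0, ">H"),   # unsigned 16-bit, big-endian
    ("channel_sets",   2, ">B"),   # unsigned 8-bit
    ("sample_rate_us", 3, ">H"),   # unsigned 16-bit
]

def header_to_xml(header: bytes) -> str:
    """Translate a fixed-layout binary header into a tagged (XML) record."""
    root = ET.Element("GeneralHeader")
    for name, offset, fmt in FIELDS:
        (value,) = struct.unpack_from(fmt, header, offset)
        ET.SubElement(root, name).text = str(value)
    return ET.tostring(root, encoding="unicode")

# Example: a 5-byte header built with the same layout.
raw = struct.pack(">HBH", 1234, 8, 2000)
print(header_to_xml(raw))
```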
New OpenWorks data model
Landmark is counting down to the OpenWorks R5000 release, which includes a new data model. This will bring new efficiencies for data managers and, in particular, heralds enhancements to the treatment of coordinate reference systems—an ‘area of frustration’ for users (see the illustration below). The new OW data model introduces multi-project management, security and the ability to subset data for distribution to partners. Landmark acknowledges that this will be a major disruption and that customers will need (and get) help with migration. Landmark has also come fully into the OpenSpirit fold in order to extend its DecisionSpace infrastructure and dev kit to third party data store access (see page 4).
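As a generic illustration of why coordinate reference systems cause frustration (this is not the OpenWorks API; the library and EPSG codes are our own, arbitrary choices), interpreting the same latitude/longitude pair under two different CRSs puts a position in measurably different places:

```python
# Generic CRS illustration using pyproj: the same lon/lat pair, projected
# under two different CRS assumptions, yields different eastings/northings.
from pyproj import Transformer

lon, lat = 2.0, 56.0  # a North Sea location, geographic coordinates

ed50_utm31  = Transformer.from_crs("EPSG:4230", "EPSG:23031", always_xy=True)
wgs84_utm31 = Transformer.from_crs("EPSG:4326", "EPSG:32631", always_xy=True)

x1, y1 = ed50_utm31.transform(lon, lat)    # treated as ED50
x2, y2 = wgs84_utm31.transform(lon, lat)   # treated as WGS 84
print(f"ED50  / UTM 31N : {x1:.1f}, {y1:.1f}")
print(f"WGS84 / UTM 31N : {x2:.1f}, {y2:.1f}")
# The two results differ by hundreds of metres in northing: enough to
# misplace a well if project CRS metadata is lost or mislabeled.
```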
IBM
IBM had an impressive smorgasbord of IT hardware on display—from its 15 teraflop Blue Gene ‘petascale’ machine to its Deep Computing Visualization (DCV) offering. DCV, like the Scalable Graphics solution above, offloads graphics to dedicated hardware and now supports a Windows XP client so that Petrel users can benefit. IBM was also showing a rather obscure technology leveraging the Cell Broadband Engine (BE) used in the new Sony PlayStation. The ‘evolutionary computing’ technique is applied to seismic analysis and pattern recognition, with the system ‘writing its own software.’ Chevron and Statoil are said to be trialing the box. IBM also expects to have a GPU-based compute offering next year. By heading the TOP500 list (see page 9), IBM has a good claim on HPC leadership. By building high-end interconnect into the hardware, IBM claims very high sustained performance—280 TF out of a theoretical 360 TF. Commodity-based clusters usually max out at around 10% of their notional peak.
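The arithmetic behind the claim, using the figures quoted above (the 10% commodity figure is the rough rule of thumb cited, not a measured benchmark):

```python
# Sustained-to-peak efficiency implied by the quoted figures.
peak_tf, sustained_tf = 360.0, 280.0
blue_gene_efficiency = sustained_tf / peak_tf   # ~0.78
commodity_efficiency = 0.10                     # 'around 10% of notional peak'

print(f"Blue Gene sustained/peak : {blue_gene_efficiency:.0%}")   # ~78%
print(f"Commodity cluster        : {commodity_efficiency:.0%}")   # 10%
print(f"Ratio                    : {blue_gene_efficiency / commodity_efficiency:.1f}x")
```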
Ovation
Ovation’s Data Stewardship program, which hosts and refreshes companies’ E&P data, has met with modest success: five US clients have signed up and the program manages CGG’s 120 TB multi-client library.
NetApp
NetApp is evolving its offering from hardware to application support. Upstream applications run across Oracle and flat files, which NetApp consolidates onto a single system, simplifying backup procedures. NetApp clients include Shell (4 PB), Aramco (2½ PB) and Petrobras (5 PB). NetApp systems are also sold by IBM.
Headwave
Headwave is now selling its pre-stack data access technology independently of Petrel as a stand-alone ‘pre-stack for interpreters’ system. The compression-based technology offers access to terabyte-sized datasets without requiring a storage cluster (see the sketch below). The system targets pre-stack interpretation or processing.
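A generic sketch of the idea (our illustration, not Headwave’s algorithm): store the volume as independently compressed bricks so that only the region of interest is ever decompressed. Brick size and codec are arbitrary choices here.

```python
# Generic brick-wise compression: pull a small region of a large volume
# without decompressing (or holding) the whole dataset. Not Headwave's code.
import zlib
import numpy as np

BRICK = 64  # samples per brick edge (arbitrary)

def compress_bricks(volume: np.ndarray) -> dict:
    """Split a 3D float32 volume into BRICK^3 chunks and compress each one."""
    bricks = {}
    nx, ny, nz = volume.shape
    for i in range(0, nx, BRICK):
        for j in range(0, ny, BRICK):
            for k in range(0, nz, BRICK):
                chunk = np.ascontiguousarray(volume[i:i+BRICK, j:j+BRICK, k:k+BRICK])
                bricks[(i, j, k)] = (chunk.shape, zlib.compress(chunk.tobytes()))
    return bricks

def read_brick(bricks: dict, origin: tuple) -> np.ndarray:
    """Decompress a single brick on demand."""
    shape, blob = bricks[origin]
    return np.frombuffer(zlib.decompress(blob), dtype=np.float32).reshape(shape)

vol = np.random.rand(128, 128, 128).astype(np.float32)
store = compress_bricks(vol)
roi = read_brick(store, (64, 0, 0))          # only this brick is decompressed
assert np.array_equal(roi, vol[64:128, 0:64, 0:64])
```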
Zeh
Zeh is expanding its software lineup with Horizon’s ‘GIMS’ geological data management package. GIMS originated as a front end to Fugro-Robertson’s SE Asia dataset. The data viewer and service combo displays well spots with drill-down to a PDF of, say, a palynology report. ‘Worksets,’ collections of links to data, are used to create farmout CD distributions. Zeh is working to integrate GIMS with its SeisInfo package.
Sun reforms oil and gas unit
Sun has reformed its recently disbanded oil and gas unit and is regaining market share with its x86 workstations and the ‘Thumper’ 24 TB storage system. Dual AMD processor workstations are used by Devon, BP, ConocoPhillips, Chevron and WesternGeco. The top-flight U40 workstation with dual NVIDIA 5600 graphics is in the process of certification with Landmark. Sun’s acquisition of SeeBeyond has given it an SOA offering. SeeBeyond is used by BP for global identity management.
This article is a summary of a longer, illustrated report produced as a part of The Data Room’s Technology Watch Service. For more information please email info@oilit.com.
© Oil IT Journal - all rights reserved.