March 2004

I-O buys Concept Systems

Input-Output has paid a total of around $47 million for UK-based Concept Systems. Concept’s seismic navigation QC and design systems will augment I/O’s ‘full wave’ seismic initiative.

I/O President and CEO Bob Peebler has extended Input/Output’s ‘full wave’ seismic offering (Oil ITJ Vol. 8 N° 12) with the acquisition of UK-based Concept Systems Ltd. (CSL). Concept provides software, systems and services for towed streamer, seabed and land seismic acquisition. Stafford-based I/O paid $36 million in cash and 1.68 million I/O shares, a total value of approximately $47 million.


Peebler said, ‘The synergies between CSL and I/O are compelling. The company is positioned for rapid growth and is an exceptionally strong strategic fit that will help us drive out logistics costs and shorten cycle times in field operations’.


CSL MD Alistair Hay added, ‘We have made significant investments in imaging from the seabed; 4D life-of-field seismic; and better integration of land acquisition operations through software. By combining I/O’s equipment and technology with Concept’s software and services, we will deliver a high-value added solution that will change the way seismic data is acquired, managed, and processed.’


CSL provides integrated planning, navigation and data management solutions to seismic contractors acquiring 2D, 3D, and 4D data. CSL software is installed on the majority of towed streamer vessels worldwide and has rapidly become an integral part of both redeployable and permanent seabed acquisition systems.


CSL is a key solutions provider for BP’s Valhall permanent seabed reservoir monitoring project in the North Sea and the company should be well positioned to benefit from rapid growth as 4D, life-of-field seismic projects are increasingly pursued by the major oil companies.

Saudi Aramco

With great timing, CSL announced that Chinese geophysical contractor BGP has bought CSL’s ‘Gator’ navigation solution for deployment on a Saudi Aramco ocean bottom cable (OBC) project. This is CSL’s first Gator sale in China and the company’s first deal with BGP, which is rapidly emerging as a major player in the seismic industry worldwide.

Command & control

Gator integrates vessel positions in real time into a central ‘command and control’ data management system. Gator is used fleet-wide by a majority of OBC-capable contractors.

Core ‘disappointed’

Core Laboratories reports record revenues of $405 million for 2003. But the company is seeking to sell or downsize its reservoir management operation.

Core Lab reports 2003 revenues at an all-time high of $405 million – up 11% from 2002. President David Demshur commented, ‘Operations generated record levels of cash from operations and free cash flow. We used this cash to create shareholder value through stock repurchases and invested $23 million in capex.’


Core reports that higher finding and development costs for natural gas on the shallow shelf of the Gulf of Mexico and crude oil in North America have caused oil companies to increase spending on optimizing production and recovery factors.

Reservoir Management

Core is ‘disappointed’ in the performance of its loss-making reservoir management operations, especially specialized geophysical and seismic-related services. The company is evaluating options to improve the profitability and cash flow from these operations, including downsizing or selling the business. This will not have an effect on Core’s integrated, engineering-related projects and multidisciplinary reservoir studies which generated over $10 million revenue and $1.6 million profit in 2003.

A million miles of spaghetti eaten every day!

Oil IT Journal editor Neil McNaughton attends the Semantic Web Interest Group—part of the World Wide Web Consortium’s Technical Plenary. He discovers a brave new world of real-time knowledge capture in action. In the midst of IT decision makers debating semantics and ontologies, he finds himself a bit out of his depth. Braving traumatic early memories of an off-topic question, he nonetheless takes on the W3C SWIG with his old hobbyhorse of units of measure.

Way back, my dad, a biology teacher, parked me (age 10) and my brother (5) in an ornithology lecture while he went off for a drink with the organizers. I listened dutifully until the Q&A session and then—as a pre-nerd—concentrated on formulating my query. Questions came on the types of birds migrating to and fro, the ground speeds attained, birds’ ages and sociology. I was about to formulate my own ersatz twitcher’s query when I was pre-empted by my brother. He asked, causing me instant mortification and immediately entering the family annals, ‘Did you know that a million miles of spaghetti are eaten every day?’


This early trauma (for me—my brother was well pleased with his instant recognition) has made me wary of the off-topic remark. And yet, it is the editorialist’s lot to be permanently on the brink of the gaffe—the question that reveals that you haven’t been paying attention, that you are in the AGM of the local real-estate collective by mistake (yes that has happened too!), or that you just haven’t a clue about the matters being discussed.


This month we were lucky to get an invitation to attend the Semantic Web Special Interest Group (SWIG)—part of the World Wide Web Consortium’s (W3C) 2004 Technical Plenary held in Cannes, France. This is a yearly event, shared between the US and Europe, with a ‘who’s who’ of the computing world in attendance. Early in the proceedings, my neighbor whispered to me that this was ‘where the future of IT was being decided’. I was impressed.


Not having brought my laptop with me, I was in a vanishingly small minority. The IT decision makers unfolded a range of PCs and other mobile devices. The hotel was equipped with a wireless network and soon everyone was busy reading the news, tunneling into their network or whatever. Once all were up and running, surprise, surprise, about 40% of the SWIG were using Apple Macs—showing the success of Apple’s strategy of deploying a Unix OS.


The meeting began with a bit of self-congratulation on the fact that the Semantic Web project had just delivered its first standards—the Resource Description Framework (RDF) and Web Ontology Language (OWL). More of which later. As the meeting got underway, I was struck by the frenetic activity of the participants. This was not the laid-back mouse clicking of the web surfer in search of divertissement. This was elbow-jostling, ten finger typing at breakneck speed. It was just as well that I had left my machine at home—for every twenty keystrokes forward I need at least five back arrows.
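For readers who have not met them, RDF describes resources as subject-predicate-object ‘triples’. A minimal, purely illustrative RDF/XML fragment (every name here is hypothetical) might describe a seismic survey like this:

```xml
<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
         xmlns:ex="http://example.org/seismic#">
  <!-- Subject: the survey; predicates: operator and water depth -->
  <ex:TowedStreamerSurvey rdf:about="http://example.org/seismic#Survey42">
    <ex:operatedBy rdf:resource="http://example.org/seismic#BGP"/>
    <ex:waterDepth
        rdf:datatype="http://www.w3.org/2001/XMLSchema#float">75.0</ex:waterDepth>
  </ex:TowedStreamerSurvey>
</rdf:RDF>
```

Note that the water depth is strongly typed as a float—but nothing says whether it is in meters or feet, a point we return to below.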


What was going on? Turns out that this was audience participation of a semantic nature. While the speaker spoke and the PowerPoint bullets flew, the assembled IT stewards were collectively composing a web log (a blog) of the proceedings. One indeed was a scribe—but all were participating—with asides, quips and references to topics under discussion located and posted in real time. In fact you can read these proceedings online. They are, as you might expect, rather impenetrable. But as Marshall McLuhan said, ‘it’s the medium not the message, stupid!’


The media, that should be—because there are many. From IRC through blogs to wikis, there is a range of exciting new technologies which turn the passive ‘teacher and class’ paradigm into a dynamic, collaborative knowledge-creation effort. Arguably this is the first new thing in KM since the invention of PowerPoint. In so far as these technologies are ‘semantic’, you could also say that the SWIG eats its own dog food. There is an XML/RSS feed (see Oil ITJ Vol. 9 N° 1) of the proceedings for syndication.

Data types

As for me, being computer-less and completely out of my depth, it seemed like an ideal moment to ask the ‘million miles of spaghetti question’. The debate was moving around the higher planes of RDF graphs, ontologies and serializations but every now and then, reference to typing was made. The SWIG, like the XML community, is concerned that strong typing should be built into its specifications.

Units of measure

Seizing a lull in the conversation I dropped the big one: ‘Data typing is important, but what about units of measure?’ IT looks after its own with ‘strong typing’ of data pigeonholes, then throws caution to the wind when a value of ‘3.75’ is stored as a floating point representation without recording whether the length is in feet, miles or meters. This causes spacecraft to crash, bridges to fall down and wells to be drilled in the wrong place (maybe I already wrote that editorial). Since one aim of the SWIG is machine-readable catalogues, it would seem prudent to make a UOM description and discovery mechanism part of the specification.
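To make the point concrete, here is a minimal, purely illustrative sketch—hypothetical code, not any particular standard—of the difference between a bare float and a value that carries its unit of measure with it:

```python
# Illustrative sketch only: a bare float is ambiguous, a (value, unit)
# pair is self-describing. All names here are hypothetical.

# Conversion factors to a canonical unit (meters).
TO_METERS = {"m": 1.0, "ft": 0.3048, "mi": 1609.344, "km": 1000.0}

class Length:
    """A length that records its unit of measure alongside its value."""
    def __init__(self, value, unit):
        if unit not in TO_METERS:
            raise ValueError("unknown unit: " + unit)
        self.value = value
        self.unit = unit

    def in_unit(self, unit):
        """Convert via the canonical unit (meters)."""
        meters = self.value * TO_METERS[self.unit]
        return meters / TO_METERS[unit]

# The same '3.75' means very different things depending on the unit:
casing = Length(3.75, "ft")
print(round(casing.in_unit("m"), 4))   # 1.143
```

A bare `3.75` stored in a database column tells the next application nothing; the pair `(3.75, "ft")` does.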

W3C vs. ISO?

The answer was a reflective, ‘Yes this has cropped up before. Maybe this has more to do with ISO than W3C’. There appeared to be little enthusiasm for action. Behind the scenes, as my subsequent researches suggest, this could be the tip of the iceberg—in that W3C and ISO seemingly don’t get along. Can you believe that? Two standards orgs that don’t get along? And impacting the future of both IT and engineering big-time? One thing is for sure, many end users of this future technology, including oil companies, are conspicuously absent from the W3C and may be missing out on the ‘future of IT’.

A2D SilverWire—web-services log delivery

A2D’s new web-services log delivery mechanism is deployed in Landmark’s PowerExplorer.

TGS-Nopec unit A2D Technologies is now offering its Log-Line Plus well log data repository through a web services delivery mechanism. The new ‘SilverWire’ web service exposes A2D’s log data for remote query and access. The web services paradigm ensures standards-based, machine independent, remote access.


Queries can be based on common well header and log metadata attributes and/or spatial extent. SilverWire also supports user authentication and e-commerce: users can purchase a variety of log products directly from A2D.


Landmark is an early adopter of SilverWire technology, which is now embedded into its GIS-based data management tool, PowerExplorer. PowerExplorer leverages ESRI web technology for spatial data display and analysis, including quality map generation in GISView using MapObjects Java, with support for cgm, dxf, dgn and zgf export formats. Dynamic ad-hoc queries, industry-standard reports leveraging JCPageLayout, bubble maps and GeoFrame support complete the picture.

LAS data

A2D claims to have amassed one of the largest databases of high quality, key LAS digital well log data for the Gulf of Mexico, Gulf Coast, and Permian Basin regions. There are currently over 1.5 million logs in the database with over 60,000 new additions each month.

New version of Recon ported to Onyx4

Austin Geomodeling’s flagship embraces knowledge management and extended interoperability.

Austin Geomodeling (AGM) has just released a new version of its 3-D geological interpretation solution, Recon. Recon is described as a ‘next-generation’ 3-D geological interpretation package that lets geoscientists and engineers visualize and interpret well log and seismic data in an integrated 3-D model. The latest release introduces new workflows for advanced 3-D surface modeling, presentation quality graphics and compatibility with RMS, GoCAD, Petrel, Petrosys, and Eclipse.


Knowledge management is supported through 2-D and 3-D hyperlinks to external documents and a ‘free-form’ interval database architecture supports generic well interval datatypes. Recon has also been ported to SGI’s Onyx4—scalable to 64 CPUs, 128 GB of memory, and 32 graphics pipelines.

MMS re-awards data processing contract

After a few months of hesitation, the MMS has awarded its digital log contract to A2D.

Following the kerfuffle last year (Oil ITJ Vol. 8 N° 9) when the US Minerals Management Service (MMS) awarded a well log data processing contract to A2D Technologies and then hastily withdrew it, the contract has finally been re-awarded for real.


The withdrawal was due to a ‘misinterpretation’ by the MMS of internal procedures, forcing a re-evaluation of all bids on the contract. After what was described as ‘an extensive procedural review process’, the MMS has re-awarded the exclusive multi-year digital well log data processing contract.

Official contractor

A2D, a subsidiary of TGS-Nopec, will act as the MMS’ official contractor for well log data processing and hosting of proprietary and public well log data assets. A2D will work directly with companies operating in the Outer Continental Shelf (OCS) to collect and distribute complete, clean, processed well log data to the public.

Workstation ready

A2D will process digital vector log data to the MMS’ data specifications and will also supply the MMS with Workstation Ready (WSR) data for internal use. A2D’s WSR format is used by exploration companies and is claimed to offer efficiencies in data management and interpretation activities.

POSC kicks-off global well ID project

POSC is proposing a globally unique well identifier—with support from Shell and ConocoPhillips.

The Petrotechnical Open Standards Consortium (POSC) is asking for public comments on its proposal for a global, unique well identifier. The project was sparked off at last year’s PNEC Data Management conference and has resulted in a discussion document that is now available for comment. Eleven volunteers from POSC, oil and service companies have contributed to the initiative, which proposes a solution to the ‘long-standing global, unique well identifier problem’. POSC has extended the public comment period to end March for Oil ITJ readers.

Web service

Current thinking is that a ‘Well Identity Service’ will be made available to the industry addressing various aspects of well and wellbore identities. The authors of the document believe that the heart of the matter is the lack of consistent definitions and best practices, as well as the lack of a single point of contact for registration, query, and conflict resolution. Comments to date indicate strong industry support for the initiative.
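To illustrate what such a service might look like—this is pure speculation on our part, not POSC’s design, and every name below is hypothetical—a registry needs three behaviors: registration, query and conflict resolution:

```python
# Purely illustrative sketch of a 'Well Identity Service': a single
# point of contact for registration, query and conflict resolution.
# Nothing here reflects POSC's actual proposal; all names hypothetical.
import itertools

class WellRegistry:
    def __init__(self):
        self._next = itertools.count(1)
        self._by_key = {}     # (operator, local_name) -> global id
        self._by_id = {}      # global id -> registration record

    def register(self, operator, local_name):
        """Return a global ID. Re-registering an already-known well is
        a conflict, resolved here by returning the existing ID."""
        key = (operator, local_name)
        if key in self._by_key:
            return self._by_key[key]          # conflict: well already known
        gid = "WELL-%06d" % next(self._next)  # globally unique identifier
        self._by_key[key] = gid
        self._by_id[gid] = {"operator": operator, "name": local_name}
        return gid

    def lookup(self, gid):
        """Query: resolve a global ID back to its registration record."""
        return self._by_id.get(gid)

reg = WellRegistry()
a = reg.register("Shell", "34/10-A-12")
b = reg.register("Shell", "34/10-A-12")   # same well, same ID back
print(a == b)                              # True
```

The hard part in practice is of course not the code but agreeing on what constitutes ‘the same well’—which is exactly the definitional gap the POSC document identifies.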

Open Source spatial from PPDM

Presented at the ESRI PUG, the new PPDM project may prove a ‘Trojan’ for Open Source GIS.

At the ESRI PUG, the PPDM Association kicked off the fourth round of its spatial-enabling initiative: Spatial IV—a.k.a. PPDM ‘Lite’. Spatial III was an ambitious extension of previous spatial projects that enabled all PPDM objects, provided a sample ESRI Geodatabase implementation and researched further integration technologies. Spatial III was implemented by Nexen, Talisman and Geoscience Australia.


The new project sets out to create a lightweight data model which can act as a data collector for data stored in full-blown PPDM implementations and in other repositories such as document management systems and pipeline datastores. The ‘Lite’ model will then serve as the basis for a new GIS-enabled product. Target users in large and small companies are those seeking to integrate multiple datasets into a spatial front-end.

Open Source

The project will leverage previous database implementations on Oracle and the open source PostgreSQL, with Quest Software’s Toad used as schema browser. Spatial components are being developed for ESRI SDE and for PostGIS, another open source development from Refractions Research.

ENI leverages CoBrain search in Portal

Invention Machine’s semantic processing offers answers to oil knowledge workers.

Invention Machine’s ‘CoBrain’ knowledge management tool has been embedded in ENI’s technical users’ Portal (Oil ITJ Vol. 9 N° 2). The semantic search engine uses a simple subject-action-object grammar which has proved ‘very powerful’ according to ENI’s Antonio Carlini. CoBrain is an enterprise knowledge-mining solution that enables corporations to ‘capture, share, and leverage intelligence’.


CoBrain’s ‘semantic processing’ transforms text into a ‘sophisticated index’ that reveals ‘precise answers to user questions’. CoBrain creates ‘Knowledge Bases’ from the Internet, Intranet, Lotus Notes repositories, corporate servers, and corporate databases accessible via ODBC. CoBrain can also access ‘deep’ web sites such as Patent Office databases.


CoBrain leverages research performed in the 1950s by Genrich Altshuller, whose ‘TRIZ’ theory provides systematic solutions to technological problems. Invention Machine is working with customers like Shell, Saipem, BP and ConocoPhillips on improving mooring and drilling systems and next-generation drilling technologies.

ISA Bridges Z-Map to ESRI gap

ISA has bridged the gap between Landmark’s Z-Map contouring and ESRI’s GIS.

ISA’s ‘A2Z Bridge’ gives PC-based GIS users the ability to tap into the surface modeling capabilities of Landmark’s Z-MAP Plus system. A2Z Bridge connects the Unix world to the PC with a straightforward interface that supports user selection of data from both Z-Map projects on a Unix workstation and Arc shapefiles on the PC. Gridding and contouring can be carried out with full Z-MAP Plus functionality through the ZCL scripting language (part of all Z-Map releases). The A2Z client can be deployed either as a stand-alone program or as an extension to ESRI’s ArcMap. The Unix server component can run either from a GUI or as a batch process waiting on requests from client processes.


On the Unix-side, the server dynamically generates and executes ZCL scripts. Once gridding and/or contouring are complete the system will return the resulting grid and contours, in compatible formats, to the Arc project where the user is working.

Virtual Geoscience Workbench

Imperial College’s model of ‘discontinuous processes’ targets sedimentation and multi-phase flow.

The Earth Science department of Imperial College, London has been awarded a £470,000 grant to build a ‘Virtual Geoscience Workbench (VGW) for Discontinuous Systems’. VGW will study granular dynamics, packing and heap stability using discrete element modeling (DEM), finite discrete elements (FEMIDEM), complex system modeling and continuum modeling.

Collision dynamics

VGW will model complex shape collision dynamics, interaction between particles, fluid coupling, fracturing and fragmenting particles at ‘pseudo-static’ geological timescale systems. VGW will be ‘open source’ and easy to customize. The tool will be demonstrated using a sedimentary rock process model - from genesis to brittle deformation and include sedimentation, avalanching, compaction, diagenesis, multi-phase flow through granular media, faulting and jointing. Continuum Resources is a technology partner in the project.

Landmark rolls-out Engineer’s Desktop

New integration platform combines drilling software with production and economics.

Landmark Graphics Corp. has just released its ‘Engineer’s Desktop’ (ED), a new integration platform for drilling and completions and production operations. ED is a suite of well design, real-time operations, field surveillance and economic tools in a common data management environment.


Landmark president Andy Lane said, ‘As the digital oil field becomes a reality, integration of key workflows across drilling, production and economics will enable customers to optimize their field operations. Landmark is the only company to offer this breadth of data coverage and application capabilities on one integrated platform.’

Drilling Desktop

ED combines Landmark’s Drilling Desktop (DD—see Oil ITJ Vol. 8 N° 10) with production and economics. New capabilities allow cross-domain analysis enabling drilling engineers to achieve better wellbore placement, greater levels of efficiency and reduced drilling costs. DD features new data management, visualization, engineering and query capabilities.


Murray Roth, executive VP, Systems and Marketing added, ‘ED brings integration to engineering and operations, in the same way that OpenWorks succeeded in improving the workflows of the geoscience community. Real-time links will enable operators to drill safer, more effective wellbores, track production, reduce downtime, promote rapid intervention design, and validate decisions with economic analysis—driving overall well performance and field profitability.’

Fugro-Jason’s Reservoir Characterization

Seismically-derived reservoir properties are used to model reservoir and fluid flow.

Fugro-Jason makes a bold claim for its new Reservoir Characterization and Modeling (RC&M) technology – described as the ‘first true integration’ of seismically-derived reservoir property information with the static reservoir model and subsequent flow simulation.


Fugro-Jason has observed the move to closer integration between geoscience and reservoir engineering, and industry acceptance of seismic inversion as the optimal source of rock property information – the basis of reservoir characterization.


RC&M leverages Jason’s 3DiQ rock property toolbox. Static geocellular models are upscaled using FastTracker. RC&M promises an updateable subsurface model consistent with all measured data during the full development of a field.

GIS data-on-a-plate for explorationists

Governments in the UK and New Zealand are offering free digital geo-data to explorationists.

Governments at opposite ends of the globe are leveraging geographical information systems to entice operators to take up new acreage. As a part of the UK’s 22nd licensing round, announced earlier this month, the UK DTI has posted an ESRI Shapefile of the available acreage on the DEAL Data Registry download page.

New Zealand

The Institute of Geological & Nuclear Sciences (GNS) has launched a web-based information service for New Zealand’s oil industry. The Petroleum Data Query Map is described as a ‘one-stop-shop’ for information about New Zealand’s petroleum fields and license areas.

New entrants

The GIS-based system offers information on license holders, work plans, wells drilled and seismic data acquired. Short reports of the main findings of work undertaken can also be downloaded.

Enerdex and Infogistics team

Startup sets out to apply ‘intelligent search’ to the growing mountain of corporate information.

Energy Data Express—a.k.a. Enerdex—is a London-based start-up that sets out to offer ‘new and innovative’ data management solutions for organizations in the energy sector. Enerdex is in the process of adapting natural language software from Edinburgh-based Infogistics, a specialist in text-mining and document retrieval.

1 Petabyte/day

Enerdex reports that corporate users world-wide will create over one Petabyte of data per day and that knowledge workers achieve only two hours of productive work in the day—the rest of the time is spent ‘trawling, compiling, or routing information to complete their tasks’. This figure will drop even further as the amount of data increases, unless smart tools can be employed to efficiently retrieve, route, and present users with the information they need to achieve their tasks.


Enerdex technology allows dispersed corporate information repositories to be merged into one ‘advanced knowledge discovery system’. Intelligent search simplifies navigation and fast retrieval of documents by gathering clustered groups of relevant documents from multiple sources. These solutions are already supporting organizations in the legal, HR, and government domains.

ESRI Petroleum User Group 2004

The PUG is the place to be for upstream GIS specialists. Attendance was around 800, up from around 600 last year. ArcGIS 9 is due for imminent release, including a new geoprocessing environment (GUI and command line-based), new 3D extensions and componentized tools for developers building enterprise GIS solutions. Extended functionality comes from Safe Software’s extract, transform and load (ETL) tools and the Maplex labeling engine.

Data ownership roles and replication paths are critical if ‘balkanized’ data management is to be avoided. Replication got a high profile, and metadata is gaining recognition as the key to enterprise GIS—and indeed to interoperability in general. The pipeline data model wars seem to have calmed down some. It’s not that they have been won—there is a recognition that multiple data models exist, and that software should be capable of handling the different flavors.

A half-day geodetics workshop was organized by the Americas Petroleum Survey Group to spread the geodetic gospel to the ignorant. If you think that latitude and longitude tell you where you are—think again, and read our report from the workshop (next article).

ESRI’s head of software development, Scott Morehouse, stated that ESRI’s goal is to build information systems by providing a high-level programming and information model. A generic GIS framework minimizes application-specific engineering and allows domain specialists and users to configure systems. For ESRI, the Geodatabase is the ‘open’, generic model for geodata. Deployment can leverage .NET, Java, WebServices or HTML. Interoperability is best achieved by a loosely coupled architecture and ArcGIS 9 is an extensible, componentized solution. Developers can embed the ArcGIS Engine into their own applications. Morehouse believes in ‘accepting and adopting standards that work’, citing ISO, OGC, web services and APIs.


John Calkins (ESRI) demonstrated ‘push button’ ArcGIS installation by end users without system administrator privileges. This uses the Microsoft Installer, encapsulates all options and packages, and deploys via Microsoft’s Systems Management Server (SMS). Calkins then demonstrated the personal Geodatabase with a coal bed methane study of the Powder River Basin. The demo rolled in mining data from the Bureau of Land Management and 29,000 wells—all packaged in a Microsoft Access Geodatabase. ArcGIS 9 now supports a raster data type—scanned well logs can be stored in the database. Data from IHS Energy’s SQL Server repository of US wells and other public sources was rolled in on the fly. Weather data can also be subscribed to and integrated into the map. Safe Software’s Feature Manipulation Engine (FME) has been embedded in ArcGIS 9, opening up a range of file formats including AutoCAD DWG, GeoGraphix, Intergraph and MapInfo tab files. The Maplex labeling engine supports gapped, labeled contours. Displays can now be paused during map drawing—to add or remove layers without waiting for the wrong map to be drawn.


A new GUI supports geoprocessing (also scriptable in Python, VB or Java Script and legacy AML). Workflows can be packaged for re-use and used to ‘document tradecraft’. Other applications include basin modeling and high consequence area (HCA) pipeline studies. An oil and gas geoprocessing video is available on the ESRI website.


Ken Hood described an ExxonMobil project to capture and preserve the results of E&P farm-in opportunities. Traditionally this information spans a multitude of databases and data types. ExxonMobil has built a 3D framework where the basic data element is the reservoir compartment. This is captured as a geolocated outline, along with chronostratigraphic unit and field assessment data. The system allows for aggregation at field, prospect, region and business unit levels. GIS based presentations support opportunity analysis and selection of multiple undrilled prospects, partially explored structures and multiple stacked pays.


Tracy Thorliefson (Eagle Information Mapping) holds that ‘stovepipe solutions to pipeline GIS are not the best approach to interoperability’. Operators have a range of other applications which need to share data—Maximo, SAP Plant, AFE databases etc. All of these have their own repositories and contain data overlapping that held in the pipeline database. Other key data sources include DOT-regulated daily operations, scheduled inspections, leak detection and so on. All this makes for ‘balkanized’ data management, data islands, duplication and inconsistencies. How do you get all systems to talk to each other? First decide who owns the data—you need one owner for each data type. Next, make sure data is captured once and for all—as near to the source as possible. Finally, develop service-oriented systems and promulgate controlled replication. GIS should only own geo information. ‘Data owner’ systems replicate content to ‘data consumer’ systems.
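These three rules can be sketched in a few lines of hypothetical, in-memory code: one writer per data type, capture at the source, controlled replication outward to read-only consumers:

```python
# Sketch of the 'one owner per data type' replication pattern: the
# owning system is the only writer, consumer systems receive read-only
# copies. Purely illustrative; all names are hypothetical.

class DataOwner:
    """Sole writer for one data type (e.g. pipeline centerlines)."""
    def __init__(self, data_type):
        self.data_type = data_type
        self._records = {}
        self._subscribers = []

    def subscribe(self, consumer):
        self._subscribers.append(consumer)

    def update(self, key, value):
        # Capture once, at the source, then replicate outward.
        self._records[key] = value
        for c in self._subscribers:
            c.receive(self.data_type, key, value)

class DataConsumer:
    """Holds replicas; never writes back to the owner."""
    def __init__(self):
        self.replica = {}

    def receive(self, data_type, key, value):
        self.replica[(data_type, key)] = value

gis = DataOwner("centerline")
sap = DataConsumer()
gis.subscribe(sap)
gis.update("line-7", {"length_km": 42.5})
print(sap.replica[("centerline", "line-7")])  # {'length_km': 42.5}
```

The point of the pattern is that an update made at the owning system propagates everywhere, while a consumer can never introduce a conflicting copy.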

GIS in BHP Billiton

According to Katya Casey and Robert Graham, BHPB is ‘very serious about metadata’ and uses XML style sheets and a form-based interface with pick lists for metadata capture. BHPB wants to ‘educate’ vendors to supply data in the most current SDE format, along with good metadata. Shapefiles are not good enough. Vendors are willing, and in the future BHPB will mandate metadata supply. BHPB’s GIS rolls in ArcGIS, Petrosys, OpenSpirit, ERMapper and the BHPB Portal. Geodynamics’ Spatial Search Engine is used for spatial/text-based search. Today the big paradigm shift is to the Arc Internet Map Server, with vendors supplying data as services. IHS Energy, PetroWeb, WesternGeco, TGS-Nopec, PGS, Fairfield, Veritas and PricewaterhouseCoopers are all providing data services to the BHPB web server. A future project will involve the creation of a ‘corporate metadata store’.


John Stigant (Devon Energy) gave an entertaining account of real-world GIS (a.k.a. surveying) with Devon’s geodetic due diligence on a Syrian license. Maps of key well locations showed up to 50m variation. A geodetic campaign set out to verify well locations and block boundaries. The International Terrestrial Reference Frames (ITRF) framework was used for triangulation. All topographic information including station photos was loaded into a geodatabase. The result, all of Devon’s applications—ER Mapper, GeoFrame and Geographix—now operate with the same spatially referenced data.


Malcolm Ross (Landmark Graphics) demonstrated the use of ArcGIS to display plate tectonic reconstructions—with 3D displays in ArcScene and ArcGlobe. An ambitious ‘whole earth’ modeling system integrates Geomark petroleum systems data, climate simulation data and orographic (mountain building) effects to match oil isotopic signatures with palaeotemperatures.

C-K Associates

Perry Lopez (C-K Associates) showed GIS usage on a pipeline environmental impact study in South Louisiana involving alligator counts, beach elevations, bird nesting, vegetation and land loss/gain. The pipeline track was overlain on a photo mosaic of ‘before’ and ‘after’ images—along with a second control swath. The study showed a significant amount of land loss over five years, but the control area lost as much as the pipeline corridor.


Deloitte & Touche’s PetroView is migrating from MapInfo to ESRI. Geodynamics’ text and spatial search engine (SSE) now includes field outline data. Information Builders’ WebFocus front end now sports an ArcIMS extension providing a map-based interface to corporate data. MJ Harden has embedded PipeView in ArcGIS and now presents a graphical interface to risk assessment data and US DOT compliance. New Century Software’s GAS HCA Analyst also targets DOT compliance with an ArcGIS 8.3 extension to identify high consequence areas. OGM’s LandSlam is a hosted service for communication between operators and land personnel—replacing paper-based updates of title information. LakeView Technology’s OmniReplicator synchronizes multiple databases including SDE data. OpenSpirit’s new SDE write functionality lets non-spatial applications share positional information with mapping applications. Petris WINDS GIS Data Viewer lets field operators redline maps and integrate GPS data input. Petrosys’ map SDE connector displays shapefile data on Petrosys maps.

This article has been abstracted from a 20-page report produced as part of The Data Room’s Technology Watch report service. If you would like to evaluate this service, please contact The Data Room.

Geodetic wake-up call from APSG

A seminar by the Americas Petroleum Survey Group stressed dangers of poor geodetic calculations.

Jim Cain and John Stigant of the Americas Petroleum Survey Group held a half day workshop on Geodetics, Datums and Projections as applied to the oil industry. Cain laid down the basics of geodetics for geodetically-naïve end users or programmers. Cain’s clear message is ‘latitude and longitude are not a unique representation of a point on the earth’s surface—but vary according to the geographical coordinate reference system used’.

Lost in space!

Or, putting it in plain English, ‘given latitude and longitude, you do not know where you are’. Errors of the order of a hundred meters or more can come from misunderstandings of geodetic datum. Other pitfalls await the unwary when 3D data is projected onto a 2D map. Stigant ran through the plethora of map projections available—with case histories of projection disasters. ‘People get it wrong all the time—there is a constant need for education, you should always re-check software to make sure that scale factor and declination have been handled properly.’
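Cain’s warning is easy to demonstrate in a few lines of code. The sketch below (pure Python; the three-parameter NAD27-to-WGS84 shift and ellipsoid constants are standard published values, and the Houston coordinates are merely illustrative) interprets one latitude/longitude pair on two different datums and measures how far apart the resulting ground positions are:

```python
import math

def geodetic_to_ecef(lat_deg, lon_deg, h, a, f):
    """Convert geodetic coordinates on a given ellipsoid to ECEF (meters)."""
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    e2 = f * (2 - f)                      # first eccentricity squared
    n = a / math.sqrt(1 - e2 * math.sin(lat) ** 2)
    x = (n + h) * math.cos(lat) * math.cos(lon)
    y = (n + h) * math.cos(lat) * math.sin(lon)
    z = (n * (1 - e2) + h) * math.sin(lat)
    return x, y, z

# Published ellipsoid constants and three-parameter NAD27 (CONUS) -> WGS84 shift
CLARKE_1866 = (6378206.4, 1 / 294.9786982)
WGS84 = (6378137.0, 1 / 298.257223563)
DX, DY, DZ = -8.0, 160.0, 176.0          # datum shift, meters

lat, lon = 29.76, -95.37                 # an illustrative point near Houston

# The same lat/lon read as NAD27: compute ECEF on Clarke 1866, shift to WGS84 frame
x1, y1, z1 = geodetic_to_ecef(lat, lon, 0.0, *CLARKE_1866)
x1, y1, z1 = x1 + DX, y1 + DY, z1 + DZ

# The same lat/lon read as WGS84
x2, y2, z2 = geodetic_to_ecef(lat, lon, 0.0, *WGS84)

error = math.dist((x1, y1, z1), (x2, y2, z2))
print(f"Same lat/lon, two datums: {error:.0f} m apart on the ground")
```

The position error varies with location; in the continental US it is typically tens of meters, and elsewhere in the world datum mismatches can exceed the hundred meters mentioned above.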


Beware of software that allows an ‘unknown’ datum—a geodetic anathema. Wells pose a delicate problem—a deviation survey records a vast amount of metadata which may or may not be available in a trade context. One recurrent problem is the over-zealous re-application of a correction—stressing the need for audit trails.


Pipelines pose another type of mapping problem—accurate knowledge of ellipsoid height and height above mean sea level is required to avoid pumping uphill at unexpected pressure! Another scare story involved a production facility accidentally built on a neighboring property—resulting in a $12 million windfall for the owner! Stigant advises checking and double-checking your survey data. If in doubt—get help! Done well, geodetics can be a low-cost, high competitive-advantage activity. Done badly, it has huge risk potential. The APSG is ready to give this presentation to corporations on request.
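The uphill-pumping trap comes from confusing ellipsoidal (GPS) height with height above mean sea level, which differ by the local geoid undulation: H = h − N. A minimal sketch, with entirely hypothetical station values, shows how a pipeline that looks downhill in raw GPS heights can in fact run uphill:

```python
# Hypothetical survey values for two pipeline stations (meters).
# h = ellipsoidal (GPS) height, N = geoid undulation at the station.
stations = {
    "A": {"h": 52.0, "N": 28.0},
    "B": {"h": 47.0, "N": 18.0},
}

# Orthometric height (height above mean sea level): H = h - N
H = {name: s["h"] - s["N"] for name, s in stations.items()}

gps_drop = stations["A"]["h"] - stations["B"]["h"]    # +5 m: looks downhill
true_drop = H["A"] - H["B"]                           # -5 m: actually uphill

print(f"GPS heights suggest a {gps_drop:+.1f} m drop from A to B")
print(f"Against mean sea level the 'drop' is {true_drop:+.1f} m")
```

With a 10 m change in geoid undulation along the route, the apparent 5 m fall becomes a 5 m climb—exactly the surprise pressure scenario described above.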

Folks, facts, orgs, et cetera

News from C&C Reservoirs, Seitel, Explora, CDA, ConocoPhillips, ExxonMobil, Baker and Fugro.

C&C Reservoirs has appointed John Anderson, Jose Guzman, Richard Footitt and Suvimol Maingarm to its geoscience team.


Seitel has hired Randall Stilley as president and CEO, subject to approval of the bankruptcy court. Stilley is a past president of Weatherford International’s oilfield services division.


Waterous & Co. has named Adrian Goodisman as MD of its Houston office. Goodisman was with Ziff Energy Group and Phillips Petroleum.


The US Department of Energy has named Mark Maddox as acting Assistant Secretary of the Office of Fossil Energy. Maddox replaces Carl Michael Smith who resigned recently from the position.


Houston-based start-up Explora Seismic Processing (ESP) has announced its management line-up: Adrian Lillico, president; Rory McArthur, VP new business development; and Efrain Melendez, VP geophysical operations.


The Australian government has just created a portal to state, territory and federal government geoscience agencies.


Greg Jonassen is to manage Aveva Group Plc.’s new office in Calgary to market plant engineering lifecycle solutions to Canadian industry.


Common Data Access (CDA) has just kicked off a new ‘Value Study’ to be masterminded by Paras Consulting. CDA is also undertaking an exploration data disposal study for publication under the UKOOA banner.


ConocoPhillips has given $500,000 to the Society of Petroleum Engineers’ Foundation. The SPE is engaged in a $5 million fundraising drive to ‘transform SPE’s Internet capabilities’.


ExxonMobil vows to ‘stay conservative’ on drilling and is to cut $1 billion in costs in 2004. From 1999 to 2003, ExxonMobil’s work force declined 18% to 88,000, following the merger and productivity gains from new technology.


Baker Hughes Inc. has appointed Larry Brady to its board. Brady is chairman and CEO of Unova Inc.


Following its acquisition by Fugro last year, Robertson Research has changed its name to Fugro-Robertson.


Guy Gueritz has been hired by Linux Networx to market cluster-based solutions to the oil and gas business in the EAME region. Gueritz was previously with SGI.

Oildex, electroBusiness team on e-business

New grouping sets out to integrate electronic invoicing and workflow systems in North America.

TransZap unit Oildex is teaming with electroBusiness to offer North American producers an integrated e-commerce solution spanning Oildex SpendWorks and electroBusiness e-Business Utility systems. The combined offering extends e-commerce including electronic invoicing and financial e-documents to some 600 new vendors.


Oildex president Peter Flanagan said, ‘The alliance enables Canadian energy producers to save money using the Oildex SpendWorks system, easily processing and paying bills received through the extensive electroBusiness network of digitally-enabled Canadian vendors.’


Oildex SpendWorks provides oil companies with web-based tools that simplify the process of ordering and paying for goods and services. By replacing paper invoices with digital information, engineers, accountants and procurement managers save time and money, improving profitability.


Oildex Connect is a web-based suite of financial and operational workflow tools that deliver real-time decision support information to energy companies. Oildex Connect provides one of the largest data exchanges in the industry and includes digital and scanned invoicing, owner/producer relations data posting, check stub reporting, crude oil data exchange, gas data exchange, production and sales volume reporting and joint interest billing.

PIDX hiccup, new e-commerce standard

The API PIDX standard has overcome some compatibility issues and is ready for e-business.

Andy Ross of Digital Oilfield, co-chair of the PIDX Business Messages Workgroup (BMW), has provided Oil IT Journal with this progress report. ‘The PIDX BMW maintains PIDX’s standards for electronic commerce. In 2003 an initiative was undertaken to incorporate several new elements into PIDX’s existing XML standards (RP 3901, version 1.0).

OFS Portal

Specifically, OFS Portal schemas were harmonized into 3901, a CustodyTicket schema was added, and several schema design guidelines were incorporated. Unfortunately it was discovered at the PIDX Fall 2003 Conference in Houston that the proposed schema revisions were not backwards compatible with version 1.0.


In January 2004, Schlumberger hosted a meeting of the Messages Workgroup to resolve the incompatibility issues. The group worked through the issues element by element, and attribute by attribute. Once approved, version 1.1 will be posted on the PIDX website.’

Service sector anticipates 2004 pick-up

Despite consensus on a need for consolidation, CGG and PGS have no merger plans as yet...

CGG’s operating profit for 2003 was 10.6 million Euros, down from 61.6 million in 2002. Robert Brunck, chairman and CEO, declared, ‘While the market in 2003 continued to suffer from the combined effects of very slow demand and persistent overcapacity, CGG has further reduced its debt, delivered positive earnings before exceptional items and preserved its technological leadership. Only a consolidation of our industry followed by a significant reduction in capacity, particularly in marine, will bring the changes needed to allow the actors of the industry to reach a satisfactory level of return. This is the reasoning behind our stake in competitor PGS.’ In a webcast Brunck reiterated his call for consolidation between PGS and CGG, stating that, ‘The companies have agreed to deliver a clear message to the market in the near future.’


PGS president Svein Rennemo did not share Brunck’s enthusiasm for a joint statement when questioned during the PGS 4th quarter conference call. Rennemo denied that such an agreement had been reached, stating that, ‘Any accommodation—be it with CGG or another company—can be entertained, but must satisfy our criteria.’ PGS posted a net loss of $815 million (down from $1.2 billion in 2002) on revenues of $1.1 billion (up 12%). PGS expects industry overcapacity to remain for the short to medium term but believes there is an opportunity for higher pricing of technology.


Fugro’s 2003 net result was €32.4 million, down from €60.2 million in 2002. President and CEO Gert-Jan Kramer commented, ‘In a year in which incidental factors resulted in disappointing financial results, Fugro made an important strategic step with the acquisition of Thales GeoSolutions. The integration process is now almost finished and we have a strong starting position for 2004. The strengthened market position of Fugro combined with the sound development of the order book are positive signs.’ Fugro reports that oil and gas sector clients will be cranking up investment by some 4-6% in 2004. Positive developments are expected from deepwater projects, particularly in the Gulf of Mexico and West Africa. Good capacity utilization is also anticipated for the activities in the Middle East, the Caspian Sea and Asia. The company described as ‘delightful’ the fact that several of the largest oil companies will now use $20 (instead of $16) per barrel to determine the feasibility of new investments.

Iron Mountain

Richard Reese, Iron Mountain chairman and CEO said, ‘2003 was another solid year for us. Revenues exceeded $1.5 billion for the first time in our history and we doubled our presence in Europe with the strategic acquisition of Hays. We see tremendous opportunities before us and we are focusing our efforts on capturing them.’


Kelman Technologies

2003 operating revenue was $24 million, down 14% from 2002; the company posted a $341,000 loss for the year. Kelman president David Richard described 2003 as, ‘A difficult period for the seismic processing industry. Clients’ activity was more weighted to exploitation—requiring less seismic information.’ Kelman has taken steps to reduce operational costs and expects savings will be reflected in the second and subsequent quarters of 2004. Richard sees an ‘improved operational climate’ for 2004 and believes the company is ‘well placed for the future’. ‘Our Houston, Denver and Oklahoma offices have been refurbished and expanded. The current high and comparatively stable commodity prices are expected to encourage our clients to expand their 2004 capital exploration budgets. We expect that the combination of cost savings, expanded operational capacity and increased seismic spending by our clients will reward KTI in 2004.’

Veritas DGC

Veritas posted revenues for the six months ending January 31st 2003 at $252 million – down 4% on the same period of 2002. This translated into a net loss of $12 million (against a profit of $6 million). Outgoing chairman and CEO, Dave Robson said, ‘Although I am leaving, I’m confident that the management team will continue to emphasize technology, operational excellence and positive cash flow.’

Scandpower to distribute Tecplot RS

Scandpower Petroleum Technology is worldwide distributor for Tecplot’s reservoir modeling post-processor.

Scandpower Petroleum Technology has been selected by Tecplot as worldwide distributor for its Reservoir Simulation (RS) post-processing and plotting solution. Tecplot RS is an integrated plotting environment for oil reservoir modeling, co-developed by ChevronTexaco and Tecplot. See last month’s Oil IT Journal for more on the latest release of Tecplot.


Tecplot product manager Don Roberts said, ‘We expect Scandpower’s industry expertise, global coverage, and contacts will grow Tecplot RS’ share of the post-processing market.’


Scandpower is also responsible for the distribution and maintenance of other reservoir simulation tools such as ECLPost—originally developed by Norsk Hydro. ECLPost is a collection of routines for pre- and post-processing of simulation models and results, integrated into a neat and fast toolbox for reservoir engineers working with ECLIPSE reservoir simulations.


Scandpower director of reservoir technology Erik Ackles added, ‘Tecplot RS complements ECLPost, providing a comprehensive, cost-efficient solution for enhancing the reservoir engineer’s workflow.’


The price of Tecplot RS begins at US$3,500 for a perpetual single-user license on Windows, Linux, Mac OS X, or UNIX. Multi-platform network licenses are also available. In addition to distribution by Scandpower PT, Tecplot RS will be sold directly by Tecplot.

InterAct web solution positions BG laterals

Schlumberger’s downhole imaging and real-time communications enable adaptive multilateral well design.

Schlumberger’s Oilfield Services unit has won BG Group’s ‘Innovation Award’ for a real-time well placement solution in BG’s North Sea Minerva-Hub development. Problems arose when the first development well flowed at a rate well below expectations so a new, adaptive multilateral design was chosen for subsequent wells.


Schlumberger’s InterACT communications link was key to this multi-discipline intervention. Real-time positioning of the laterals was achieved through remote geoVision borehole imagery and azimuthal density neutron logging while drilling (LWD). Schlumberger’s InterACT is a combination of software and communications technology that enables near real-time transmission of well log image data from the well site to the office.


Schlumberger’s well placement manager, Ian Tribe said, ‘The real-time images were critical in planning the next stage of the well. LWD saved three bit trips to recover memory data.’ Last year, BG deployed Schlumberger’s Inside Reality immersive well planning and geosteering environment at its Interactive Visualization Center, located in Reading, UK. The system is used to plan wells using seismic, geological and dynamic reservoir models interactively, in ‘true virtual reality’.


Presenting the award, BG VP Jon Wormley said, ‘Our strategic relationship with Schlumberger enabled the project team to quickly re-design the Minerva well concept to cope with reservoir uncertainty. With Schlumberger, we identified applicable technologies and solutions to the challenges we faced. Winning the award is a tribute to the teamwork displayed by the staff of both companies in delivering an appropriate solution to the Minerva development drilling program.’

BP cranks-up Wellogix business

E-commerce solution links operators and service providers through hosted service.

BP is to pilot the use of Houston-based Wellogix’ web-enabled business process management software as a component of its e-transformation ‘e-Trans’ process at its San Juan Basin operations in Colorado and New Mexico. BP is working with key suppliers, Hanover Compression, Flint Energy Services, and Wil-Tech, Inc. to determine the functionality, value, and potential for full scale implementation throughout its North American operations.


Wellogix’ technology (see Oil ITJ Vol. 8 N° 5) automates interactions between operators and service providers by connecting financial and technical data sources. Wellogix software is delivered as a managed service—insulating users from technology change and providing paperless tools tailored to individual businesses.


BP’s Unit Leader Richard Morrison said, ‘Wellogix software will enhance the integration of our field operations and our supply chain management. The tool will provide an electronic link between our work order system and our key suppliers, streamlining the order to invoice to payment cycle. Improved collaboration with the supplier community is critical to meeting future E&P demands for commodities and services.’


Wellogix CEO Ike Epley added, ‘Our software will simplify and enhance BP’s current purchase-to-pay process and will have a positive impact on many facets of their business, including reconciling field tickets to negotiated contracts, introducing compliance features, and establishing online key performance indicators with suppliers. By integrating BP’s existing planning and scheduling software applications with our internet based software, we are introducing a new business model for BP’s onshore operations.’

Communications solutions proliferate

New solutions from M2M Corp. and Honeywell illustrate blurring of boundaries in SCADA.

Two new alliances are set to increase deployment options for remote facilities and large scale production assets. M2M Data Corporation has teamed with satellite telecoms operator Stratos Global Corp. to deliver ‘turnkey’ remote automation solutions. M2M, a provider of internet-based Supervisory Control and Data Acquisition (SCADA) services will leverage Stratos’ infrastructure to offer end-to-end remote automation solutions.


The solution combines M2M’s hosted monitoring and control services, and Stratos’ satellite-based communications to provide Internet connectivity in remote areas. The solution overcomes obstacles, such as lack of communications options, inflexible poll/response architecture, and high development and maintenance costs while providing round-the-clock network monitoring support and reporting.


An asset-scale communications solution has also been announced by Honeywell Process Solutions and Symbol Technologies. The alliance will provide wireless asset management solutions, integrating Honeywell’s asset management software with Symbol’s ruggedized mobile computing, wireless infrastructure and advanced data capture devices.

Baker Hughes licenses Drillworks

Baker Hughes is to offer Knowledge Systems’ Drillworks for high-end well design.

Baker Hughes has signed a world-wide deal with Houston-based Knowledge Systems Inc. (KSI) for the provision of its Drillworks software for real-time geopressure and wellbore stability analysis. Baker Hughes will embed Drillworks into its new technology offering to help operators overcome the two leading causes of downtime—geopressure- and wellbore stability-related problems.


Baker geological advisor Mike Reese said, ‘Drillworks has been the basis of our wellsite PressTEQ Pressure Management Services for over ten years, letting us dynamically integrate a wide range of data and make accurate pressure analyses. This helps our customers make real-time, risk-reducing decisions at the wellsite’.


Baker Hughes will use the Drillworks Onsite component to perform real-time pore pressure and fracture gradient analysis at the wellsite. Onsite leverages the Wellsite Information Transfer Standard (WITS) to communicate with other rig-site real-time data systems.
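WITS level 0, the simplest flavor of the standard, exchanges ASCII records bracketed by ‘&&’ and ‘!!’ delimiters, each line carrying a four-digit record/item code followed by its value. A minimal reader might look like the following sketch (the sample record and item codes are illustrative, not taken from the specification):

```python
def parse_wits0(block: str) -> dict:
    """Parse one WITS level 0 record: '&&' opens, '!!' closes,
    and each line is a 4-digit record+item code followed by its value."""
    values = {}
    for line in block.strip().splitlines():
        line = line.strip()
        if line in ("&&", "!!") or not line:
            continue
        code, value = line[:4], line[4:]
        # Key by (record id, item id) so codes stay unambiguous
        values[(code[:2], code[2:])] = value
    return values

# Illustrative record (item codes here are examples only)
sample = """&&
01083650.2
011312.5
!!"""

rec = parse_wits0(sample)
print(rec)   # {('01', '08'): '3650.2', ('01', '13'): '12.5'}
```

A real rig-site consumer would map the (record, item) pairs to mnemonics such as bit depth or rate of penetration via a lookup table.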


Baker Hughes’ OASIS drilling optimization group will also use the Drillworks 2004 suite to analyze seismic, drilling and geologic data to help operators avoid drilling trouble and reduce expensive non-productive time. The Drillworks 2004 Professional Suite will be combined with Baker Hughes’ new downhole technology to provide operators with leading wellbore stability and drilling optimization services.


KSI COO James Webster concluded, ‘Baker Hughes understands the impact that geopressure and wellbore stability analysis can have on a drilling program. By implementing the industry leading Drillworks 2004 Suite, Baker Hughes will help operators worldwide minimize costly problems and downtime while improving drilling performance and safety.’

Maximo 5.0—Dashboard and KPIs

MRO Software’s Maximo now offers managers more control over corporate assets.

MRO Software, Inc. has just released version 5.0 of its flagship Maximo asset management solution which now supports Key Performance Indicator (KPI) reporting and a performance management ‘Dashboard’. Maximo users now have immediate access to personalized metrics of asset performance.


Users can customize Dashboard layout and content with dynamic graphs, gauges and lists to display role-related information. Operational goals can be established and monitored in what is described as a ‘strategic, enterprise-wide approach’ to asset management. The Maximo Dashboard lets users react to potential problems and opportunities.


The new KPI Manager includes a range of pre-defined KPIs such as Mean Time to Fail, Actual-to-Plan Variance, and Preventive Maintenance (PM) Work Orders Overdue. Custom KPIs can be added to align users’ metrics with organizational goals.
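KPIs of this kind are simple aggregates over maintenance records. A hypothetical sketch (field names and figures invented for illustration, not drawn from Maximo) of two of the metrics mentioned:

```python
from datetime import date

# Hypothetical maintenance history for one asset: hours run between failures
uptimes_hours = [1200, 950, 1430, 880]
mean_time_to_fail = sum(uptimes_hours) / len(uptimes_hours)

# Hypothetical preventive-maintenance work orders: (id, due date, closed?)
pm_orders = [
    ("PM-101", date(2004, 2, 1), True),
    ("PM-102", date(2004, 2, 15), False),
    ("PM-103", date(2004, 3, 20), False),
]
today = date(2004, 3, 1)
pm_overdue = sum(1 for _, due, closed in pm_orders if not closed and due < today)

print(f"Mean Time to Fail: {mean_time_to_fail:.0f} h")
print(f"PM work orders overdue: {pm_overdue}")
```

The value of a product like Maximo lies less in the arithmetic than in keeping such figures current and role-targeted on the Dashboard.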


Maximo product marketing director Rich Caplow said, ‘The flexibility our customers gain from the variety of KPIs allows them to have the critical metrics available to run their business and make decisions based on real-time information.’ Maximo is a component-based software solution leveraging Sun’s Java 2 Platform, Enterprise Edition (J2EE).


Science Applications International (SAIC) recently included Maximo as part of its Integrated Services Management Center (ISMC) offering. SAIC now offers Maximo as a component of its managed services portfolio. SAIC provides a variety of outsourced IT solutions to the energy sector.

Earnings doubled

MRO Software reported ‘solid performance’ for 2003—with a doubling of earnings. Oil sector clients now include BP, China National Offshore Oil Company and Kerr-McGee.

XML for e-gas? No thanks!

EU spec for gas e-business shuns modernity!

The latest release, V3.0, of the Edig@s format is instructive in that it shows how legacy, ‘vanilla’ EDI still underpins gas e-business. Edig@s evolved from the 1983 GasNet protocol and is used by major EU transmission companies for data exchange. The Edig@s protocol continues to develop—with no indication that XML is likely to impact the standard in the foreseeable future. To ‘speak’ Edig@s you send messages like BGM+34+01+9'DTM+158:19970911:102'DTM+159:.. OK, it’s not pretty, but apparently the 20 page spec does everything the gas folks need.
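For the curious, pulling such a message apart is mechanical: segments are terminated by apostrophes, data elements separated by ‘+’ and components by ‘:’. A minimal parser sketch (ignoring EDIFACT niceties such as the ‘?’ release character, and using a fragment modeled on the example above):

```python
def parse_edifact(message: str):
    """Split a vanilla EDIFACT-style message into segments, data elements
    and components, using the default separators (' + :)."""
    segments = []
    for raw in message.rstrip("'").split("'"):
        tag, *elements = raw.split("+")
        segments.append((tag, [e.split(":") for e in elements]))
    return segments

# Illustrative fragment modeled on the Edig@s example in the text
msg = "BGM+34+01+9'DTM+158:19970911:102'"
for tag, elements in parse_edifact(msg):
    print(tag, elements)
# BGM [['34'], ['01'], ['9']]
# DTM [['158', '19970911', '102']]
```

Ugly or not, the terseness is the point: the whole grammar fits in a few lines, which is part of why vanilla EDI has proved so durable.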

Digital Human Model for Osaka Gas

UGS PLM’s ‘Jack’ is used to optimize human interactions with living and working environments.

Leading Japanese energy company Osaka Gas Co. is using EDS unit UGS PLM Solutions’ Digital Human Model (DHM) in a futuristic project that creates virtual living spaces. The DHM is a component of Osaka’s Comfort, Usability, Performance, Safety (CUPS) system to be launched in April 2004.


UGS’ DHM—‘Jack’ to his friends—enables users to position biomechanically accurate digital humans of various sizes in virtual environments, assign them tasks and analyze their performance. Jack digital humans can tell engineers what they can see and reach, how comfortable they are, if they’re getting hurt and other ergonomics-related information.


Osaka research fellow Masaru Hotehama said, ‘Jack lets us evaluate physical functions such as posture and eyesight. Jack integrates with our proprietary behavioral algorithms—offering a high degree of flexibility in the development process, enabling us to import and run programs from external sources. We model cardio-load to check an environment’s impact on elderly people and can run executables on Jack that we develop independently with Visual Basic.’


UGS VP Ted McFadden added, ‘Our digital human model, Jack, can adapt to the requirements of this demanding application. The Osaka Gas team has successfully deployed Jack in a unique application, achieving extraordinary results’. CUPS was developed with support from the Japanese government.

Stop Press - EDS just divested UGS PLM to a venture capital group.

Divestco announces GeoCarta Tools

A new ESRI-compliant front-end merges legacy GIS tools and exposes public datasets.

Calgary-based Divestco Inc. has released a new mapping product, GeoCarta Tools for ESRI’s ArcGIS. GeoCarta evolved from earlier work done by both MSI Capture and AnGIS, two companies purchased by International Datashare Corp. – itself subsequently bought by Divestco.


GeoCarta offers dynamic filtering, report and query building and is based on an ‘open systems philosophy’ which enables the tool to run against a wide variety of public and proprietary data. Divestco offers an alternative to using proprietary data with the introduction of Data Packs consisting of industry-standard reports, graphs, queries and wizards. These offer clients easy access to public data hubs without learning ArcGIS modules. GeoCarta Tools can easily be deployed as a standard ArcGIS 8.x extension; by encapsulating the tools in a single extension, an organization can deploy them without needing to modify existing projects.


Divestco VP Shannon Niemi said, ‘Divestco has gained extensive experience in working with a wide variety of oil and gas data and applications, as well as the latest ESRI product offerings. It is with this broad knowledge base that Divestco, in consultation with clients, now provides easy access to a range of oil and gas specific data through ESRI’s ArcGIS 8.x.’ GeoCarta Tools is slated for official release April 1, 2004.

Oil gets Maxamine workout

Oil IT Journal won an EDS-funded trial of Maxamine’s analysis. With mixed but illuminating results.

Oil IT Journal’s website won a free site analysis from Maxamine as part of a promotional effort from Maxamine’s patron, EDS. Maxamine analyzes websites in a straightforward, hands-off manner and evaluates site structure—particularly how easily a site can be accessed and maintained—along with potential compliance and risk pitfalls and accessibility.


Maxamine analyzes the three components of a site—traffic, structure and content—either from a scan of the public website—or possibly offline for corporate sites. The starting point of the analysis is a web map—which can be overlain with traffic analyses—from httpd logs or other sources. A range of customizable reports can then be run on the site to investigate metrics such as accessibility, company ‘image’, search engine optimization and link integrity.
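Link-integrity checking of the kind Maxamine automates can be approximated with the Python standard library: collect anchor targets from a page and compare them against an inventory of known pages. A toy sketch (the page content and site inventory are invented for illustration):

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect href targets from anchor tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# Hypothetical page and site inventory
page = '<p><a href="/index.htm">Home</a> <a href="/old.htm">Archive</a></p>'
known_pages = {"/index.htm", "/about.htm"}

collector = LinkCollector()
collector.feed(page)
broken = [l for l in collector.links if l.startswith("/") and l not in known_pages]
print("Broken internal links:", broken)   # Broken internal links: ['/old.htm']
```

As the article notes below, the gap between ‘could be scripted quite easily’ and a maintained, benchmarked report is exactly where a commercial service earns its keep.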

Good news

First the good news. The site scored well—a 65% overall rating put us bang in the middle of the 100 corporate and government sites Maxamine uses as a benchmark. We scored best in page ‘weight’—a measure of download speed. We didn’t do too badly on link integrity—the site is regenerated programmatically every month, so we have had time to work on such issues.

Bad news

We didn’t do so well on redundant file control—one of those things that you mean to fix but never get around to. And to our shame, we scored zero on search metadata coverage and accessibility.

Was it worth it?

A lot (but not all) of what Maxamine does could be done fairly easily with standard site management tools—but there is a big difference between ‘quite easily’ and actually doing it. Maxamine scores with a low-maintenance (and quite low cost—around $1,500 for a one-off, but less if you contract with them) and high quality ‘in-your-face’ analysis of your web site—warts and all. Maxamine’s presentation makes it easy to see where things are going wrong—and where to concentrate limited development resources to maximum effect.
