December 1997

POSC's business objects vs. Open Spirit Alliance - the race for Plug and Play (December 1997)

The quest for the IT holy grail of application interoperability is hotting up with the near-simultaneous publication of POSC's Request for Technology (RFT) entitled 'Interoperability and Business Objects' and the upbeat announcement of the Open Spirit Alliance at last month's SEG conference in Dallas.

POSC's RFT seeks to "extend the POSC specifications to include higher level business-oriented objects with agreed data attributes and operations for use as application components." The Open Spirit Alliance is to build on Shell’s in-house Spirit II development to provide the Open Spirit E&P Component Framework, an application independent software platform enabling the "plug and play of software applications across the E&P lifecycle".


Clay Harter from Chevron, speaking on behalf of the Open Spirit Alliance (OSA), described this as a move from today's "bloated applications" to a more modular computing environment with slimmed-down applications talking to data stores through an OSA middleware layer using business objects. OSA has assured funding of $200,000 from existing partners over an initial two year period. Harter described Chevron's IT strategy as buy AND build - with Chevron's in-house developments currently integrated with commercial applications through Chevron's Object Integration Server. The latter is to be replaced with the OSA E&P framework when it is available. Steve Jennis from Prism - the prime development and marketing contractor for the project - described the role of the E&P framework as "neutral, industry standard middleware for E&P".


The POSC Request for Technology (RFT) defines 6 levels of software interoperability, and in true C programming style numbers them from 0 to 5. The RFT aims at implementing level "3" (the fourth level) in a two to three year timeframe. This level allows applications to share process and presentation objects through a common interface and implements plug and play for process objects and data objects. The RFT asks submitters to provide high level business objects such as a well, seismic trace, coordinate system etc. These are to be assembled into a working prototype system demonstrating the technology's feasibility in the domain of seismic interpretation.
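To make the idea concrete, here is a purely hypothetical sketch of what a "well" business object might look like, written in Java for illustration; the RFT itself publishes no interface, so every name below is invented. The point is that agreed attributes and operations live behind an interface, so any compliant application can use the object without knowing how the data is stored.

```java
// Hypothetical business object interface - all names invented for illustration.
// Agreed data attributes and operations sit behind the interface; applications
// never see the underlying storage.
interface Well {
    String name();                // agreed attribute: well identifier
    double totalDepth();          // agreed attribute: total depth in metres
    double[] deviationSurvey();   // agreed operation: measured depth samples
}

// One possible in-memory implementation of the interface.
class SimpleWell implements Well {
    private final String name;
    private final double totalDepth;
    private final double[] survey;

    SimpleWell(String name, double totalDepth, double[] survey) {
        this.name = name;
        this.totalDepth = totalDepth;
        this.survey = survey;
    }
    public String name() { return name; }
    public double totalDepth() { return totalDepth; }
    public double[] deviationSurvey() { return survey.clone(); }
}
```

A second implementation backed by a project database could be swapped in without touching application code - which is what "plug and play" for data objects amounts to.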


The POSC specification defines the lowest level technology to be used as the Object Management Group's (OMG) CORBA specification. The OMG itself is working on various higher level Business Object Frameworks, but these have progressed slowly, and POSC accordingly distances the RFT from this work. The advantage of this approach is that the E&P BOF will not have to wait on the OMG to finalize its specifications; the downside is that if and when the OMG gets around to defining other BOFs, they may look very different from the E&P specification. Already the POSC legacy of Epicentre and Express forms the core of the RFT.

A race?

The relationship between the OSA and POSC's RFT may or may not really be a race. In fact Jennis stated that the OSA will be responding to POSC's RFT - this must make them an attractive candidate for POSC, having already obtained their funding and started off the marketing effort in quite a big way. The POSC RFT on the other hand makes no mention of Open Spirit. Some members of the OSA (Chevron, Elf, Prism) are cited in the POSC RFT as having contributed resources to a team that developed a preliminary BOF architecture earlier this year.

And the DAE?

PDM readers may be surprised to see Prism crop up again in the role of E&P middleware provider; after all, it was only in October that Prism unveiled their POSC compliant Data Access and Exchange product. The answer, according to those in the know, is that the DAE will be a POSC specific layer used by compliant applications, whereas the Open Spirit middleware will be more generic. In France they have a saying, "qui peut le plus, peut le moins" - he who can do more can do less. If Open Spirit can talk to anything, then it is a little hard to see what role the DAE will play in the Open Spirited enterprise. Prism explained to PDM that the DAE will play the same role in the OSA as that played by the proprietary APIs to GeoFrame and OpenWorks. The DAE will talk to Epicentre databases.

Focussed initially on the subsurface interpretation domain (but planned to embrace the full E&P lifecycle), the OSA platform will support E&P data stores from Landmark, Schlumberger and POSC. Distribution and communication will be via OMG/CORBA ORBs and associated CORBA services. The OSA consists of a service-based architecture made up of two architectural layers - a Business Object Facility (BOF) and an E&P Framework.

Business Objects

The BOF is designed to comply with the emerging OMG Standard and consists of implementations and extensions of CORBA services, both existing standardized services and additional services which are currently in the submission and evaluation process.

The E&P Framework provides both generic E&P components (such as co-ordinate transformations) and components specific to subsurface interpretation. The components can be logically grouped into several subsystems: GUI/application, 2D graphics, 3D graphics and data. The GUI/application components are developed in Java to provide cross platform portability and to support web based applications. The generic 2D graphics components will be provided via the Carnac product from INT Inc. PrismTech is extending and adding to these components to provide integration with the data framework. These components are being implemented using a combination of Java and C++. Generic 3D graphics technology will be provided by the OpenGL graphics library and the OpenInventor product. Again PrismTech will provide integration with the data framework.

Roll out

The data framework itself will (initially) provide data servers for the following data types: Wells, Seismic, Interpretations, Velocity Models, and Culture. Interfaces to these data types are defined in OMG/IDL. These IDL interfaces provide application developers with access to data independently of the data store, location, or the implementation language of the data access technology. This will be accomplished using CORBA ORB technology and by implementing a design pattern that isolates the specific data access code. Work is currently underway on Version 1 of the platform, which will be available from Prism in beta form in 1Q98. V1 will include support for well and seismic data, initial 2D graphics components, data access to/from Landmark data stores, and BOF capabilities. The first public roll-out of the technology is scheduled for the New Orleans SEG in 1998.
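The "design pattern that isolates the specific data access code" can be sketched as follows - in Java rather than IDL, and with all class names invented, since the actual Open Spirit interfaces are not published in this article. The application codes against a neutral server interface; a store-specific adapter hides where and how the data actually lives.

```java
// Hypothetical sketch of the store-isolation pattern; all names invented.
// Applications depend only on the neutral interface, never on a vendor store.
interface WellServer {
    java.util.List<String> wellNames();
}

// Adapter for one imaginary vendor data store. In a real system this
// would call that vendor's access library behind the scenes.
class VendorAWellServer implements WellServer {
    public java.util.List<String> wellNames() {
        return java.util.Arrays.asList("A-1", "A-2");
    }
}

// Adapter for a second imaginary store.
class VendorBWellServer implements WellServer {
    public java.util.List<String> wellNames() {
        return java.util.Arrays.asList("B-7");
    }
}

class WellLister {
    // Application logic written once, independent of the data store.
    static int countWells(WellServer server) {
        return server.wellNames().size();
    }
}
```

With CORBA, the interface would be declared in IDL and the adapters could even live in different processes and languages, which is the point of defining the data types in OMG/IDL.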


Spirited Business Alliances (December 1997)

The Open Spirit Program is sparking off some strategic alliances and corporate maneuvers already. Shell Services Company (SSC) has taken a 10% equity stake in Prism and two senior SSC executives have been appointed to the Prism Board.

For an initial two year period SSC will provide personnel and equipment to work on Open Spirit and other data management solutions for the E&P and Manufacturing Industries. Speaking of these initiatives, Scott Reeves, SSC's Manager of Subsurface IT Services stated "Currently many service providers are attempting to solve the E&P data problem with point solutions. We believe to really solve this problem a new approach is required that is enabled by new technologies. By combining this technology and our E&P business knowledge into our Veristream services offering we will present a unique and compelling value proposition to anyone who has E&P data management problems".

Java Beans

Another hook-up concerns the relationship between OSA and Interactive Network Technologies (INT), whose Carnac C++ and Java 2D Graphics Toolkit core technology has been selected by OSA to provide a 2D graphics toolkit bringing "the next generation in flexible high-performance, cross-platform, multi-source graphics". With both C++ and Java APIs, Carnac is a powerful tool for developing graphics applications. As an illustration, INT uses Carnac to produce GeoBeans, its own JavaBean based library of E&P visualization components for the display of seismic, logs, contours, maps and XY-plots, which will be used for the development of Java based E&P applications.
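For readers unfamiliar with the JavaBeans conventions such components follow, the essentials are a no-argument constructor plus matched get/set property pairs, which is what lets builder tools discover and wire a bean. The bean below is invented for illustration; INT's actual GeoBeans API is not published in this article.

```java
// Invented example bean following the JavaBeans conventions: serializable,
// no-arg constructor, and getX/setX accessor pairs for each property.
class LogDisplayBean implements java.io.Serializable {
    private String curveName = "GR";
    private double minValue = 0.0;
    private double maxValue = 150.0;

    public LogDisplayBean() { }   // no-arg constructor, per the spec

    public String getCurveName() { return curveName; }
    public void setCurveName(String curveName) { this.curveName = curveName; }

    public double getMinValue() { return minValue; }
    public void setMinValue(double minValue) { this.minValue = minValue; }

    public double getMaxValue() { return maxValue; }
    public void setMaxValue(double maxValue) { this.maxValue = maxValue; }
}
```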


A trip to the moon, an objective view of E&P computing (not) and a plea for a 'BOBM'. (December 1997)

PDM’s editor Neil McNaughton discusses the difference between entertainment and invention and wonders in which camp emerging business objects should be considered. He concludes that although ‘business benefits’ are cited in abundance, a real business model for the deployment of this technology is conspicuously absent.

This month I'd like to ask the question "how do you evaluate the promises of jam tomorrow that various technologies appear to make?" The starting point for this quest is the observation that when Jules Verne wrote his book "De la Terre à la Lune" he was writing an entertainment, and that although he pre-dated NASA and Neil Armstrong by 104 years, he cannot really be said to have stolen their thunder. I mean you do not "invent" something by just writing about it. If you do not agree with this then I must be a pretty clever chap, because I have just dreamed up a system of oil exploration that uses robots crawling around on the seabed recording, processing and interpreting away. When they directly detect hydrocarbons they whip out a laser beam and get drilling. The fused wall of the borehole neatly avoids the need for pipe, and specialized go-bots trundle down logging and perforating promising zones using gourmet sniffer RFTs to "taste" the fluids before micro-nuking in the main perfs. Production is achieved by satellite using the "beam me up Scotty" technology familiar to Trekkies all over the world.


I could go on but you get the point. In reality, many IT presentations today are a mixture of fantasy and reality as we all know. Along with the rest of the software industry, we are as likely to hear descriptions of vaporware, designed to spread fear, uncertainty and doubt (FUD) in the minds of buyers and competitors alike, as to have a vendor appear with a product that just works! New technologies, from the increasingly integrated but isolated environments that vendors are offering, to the futuristic object-oriented environments promised in this month's lead, are all "marketed" long before they are up and running.


What is often unsaid is the time and effort required before one or other of these technologies will bear fruit. In the commercial world mission-critical aspects of the vaporware may be overlooked in the current version of the software, and a wait of many years may be in store before the required functionality is there. Elsewhere - in the "open standards" environment, the initial specifications are so loosely drawn up that it may be quite hard to determine when the job has been done. Often the deliverables of one project turn into the specifications of the next, with all participants living and working in a never-never land in so far as producing anything workable goes. But most are well aware of these issues and have got used to expecting less from the product than the salesman promises, or only using a tiny subset of what the standardization organization comes up with.


I'd like now to add another dimension to the question, one that did not exist when NASA was developing the Saturn V: the business model. NASA's business model was simple to non-existent: send a man to the moon, pay for it with tax dollars. Because this was later deemed not to be politically correct, history tacked on "invent the transistor and discover Teflon" but I digress. The Apollo program was without a recognizable business model, but no-one would get away with this today. Nowadays we need to show business benefits, cost savings and jump through hoops before we get a cent for a new project. But while we offer jam tomorrow in the form of interoperability and cost savings and so on, these are really only anticipated business benefits and are never developed into a true business model. The assumption is that if it is good technically, then the business will follow.

More bloat

In the IT world at large there is a "standards" war raging between the Microsoft and Java camps. References to "bloatware" in the context of Java and objects generally refer to Microsoft's hegemony and products such as MS Word. But these references seem misplaced in E&P. Could the E&P bloatware referred to be applications such as those from our favorite vendors - GeoQuest, Landmark and the like? Surely not, because they are all fully on board the E&P open systems movement aren't they? Without getting bogged down in that, I had better make my point before the turkey gets cold.

Who pays?

Objects are undoubtedly becoming an important aspect of E&P computing and the existence of a reliable underlying E&P business object facility with a standardized interface would allow many more software components to interoperate than is possible today. Also it should be simpler to build more reliable systems with less (or no?) data duplication and which are easier to maintain. But what would this mean for the existing marketplace? Imagine that 2-3 years down the road it is all up and running. What are GeoQuest and Landmark supposed to do with their bloatware? Where does their market share go? Does anyone really believe that the future will be a world of E&P specialists assembling their own applets out of freeware, and that the infrastructure will be maintained by a standards organization? Let's get real and have someone explain - as I say, assuming all the technical issues are resolved - how all this is supposed to fit together commercially. Who pays for what? Who will own what? We've got business objects, let's have a business object business model - a BOBM please and soon.


LYNX announces GIS group (December 1997)

Lynx Information Systems UK announces the setting up of a specialist Geographical Information System (GIS) Group in London. The aim of this Group is to support products developed by Lynx and to assist clients using ESRI's ArcInfo/ArcView applications for exploration data management.

Each of the Lynx "Exploration Advisers" covers a certain geographical area and consists of a variety of exploration and production related maps linked to well/field databases, well logs and seismic sections, all in digital format and managed from ArcView. Text, tables and figures are also produced digitally and have been incorporated into the Adobe Acrobat application. The GIS Group will be developing "Advisers" for other exploration areas throughout the world (Iraq, Libya, West Africa, Western Siberia, Venezuela, Algeria and SE Asia are available), and helping companies incorporate their own proprietary information into the existing information pool.


SEG Standards committee report (December 1997)

PDM reports on the Society of Exploration Geophysicists’ (SEG) Technical Standards Committee’s recent meeting.

The SEG has (at last) begun to publish their standards on their website, where you will find the definitive specs for SEG D, the Record Oriented Data Encapsulation format (RODE), the Shell Processing Support format for land 3-D surveys and the proposed polarity convention for Vibroseis. Current activity for the standards committee includes the Ancillary Data Standard (ADS), a collaboration between the SEG and the United Kingdom Offshore Operators Association intended to provide a uniform method of recording all ancillary field data. The ancillary data types may be extended to include standard names for processing sequences and tape/volume cataloguing. This subcommittee is in the process of finalizing the specification, and would welcome input from interested parties asap. A standard means of mapping the SEG seismic tape standards to CD-ROM is also underway - one which may be generalized to all byte-stream media. While RODE can do this and more, there is a consensus that there is a place for a simpler mapping of tape to disk type formats than RODE.

SEG D Rev 2 has been out for around a year now with little or no adoption as yet. The committee sees a push from the client side as necessary to help take-up. The main advantage of Rev 2 is the ability to write a standard tape label. This has advantages in the context of high density media, where a tape label should allow content identification in seconds rather than minutes or even hours of spinning. The RODE format was criticized for its total flexibility, and it has been suggested that there should be a more standard manner of writing RODE that could be endorsed by the SEG. Proposals for this are to be published to the website for comments.


New Chairman for CDA, CEO sought and hard copy phase awarded to Spectrum and Robertson (December 1997)

The UK's Common Data Access (CDA) data repository chairman Dave Overton is bowing out and returning to the Amoco fold to take up his 'real' job again. John Foot (BP) will be taking over.

Both Dave and John are loaned from their companies to the CDA consortium, and they will soon be joined by a new appointee in the form of a full-time CDA Chief Executive Officer. This position will be advertised in the new year.

CDA has (finally) signed the contract for Hard Copy Services with lead contractor QC Data and subcontractors Spectrum and Robertson Research. This appointment, already announced back in May 1997, has been 6 months in finalizing as the detailed arrangements have been thrashed out with the subcontractors and the DTI.


Under this new contract, Spectrum will catalogue, rationalize and scan hard copy well logs and well reports from the UK Continental Shelf (UKCS) as a service to CDA. The work to be performed under this contract will be carried out in the Department of Trade and Industry's premises in Edinburgh and is said to constitute "a vital step in the management of oil exploration information resources and towards the fulfillment of the overall objectives of CDA". Over a two year period, Spectrum will build a complete master hard copy set of well logs and associated well reports to include the 7,000 wells drilled in the UKCS to date.

DTI enthuse

Rationalization of multiple copies stored by oil companies will allow 2,000,000 items of hard copy to be reduced to approximately 250,000. A unique and complete master hard copy data set will then be available, yielding significant cost and efficiency benefits.

In a letter to CDA, John Brooks of the DTI states "The DTI is enthusiastically behind the proposals of CDA to establish the hard copy initiative operations within the DTI Core Store in Edinburgh. The Department views this as an important project for the entire UKCS. The cataloguing and easy availability of data will prove crucial in the coming years as the industry looks for more subtle traps and innovative ideas." The system will become available to users approximately six months after the start of the project. At the start of the project, Robertson Research International Limited will provide a set of well log images and scanned reports from over 2,000 released wells in the UKCS. Robertson will also be providing their geological expertise to facilitate the project initialization.

30 members

CDA was originally formed as the Common Industry Data Access Initiative (CIDAI) consortium by over 30 UK-based Oil and Gas Companies to set up a shared repository for UKCS data. The principal objective of CDA is to establish a common, high quality store for UKCS primary petro-technical data; an objective which is fully supported by the Department of Trade and Industry. QC Data will manage the hard copy project, and will be responsible for the integration of the hard copy data into the existing digital well log system. QC Data’s existing contract with CDA has been expanded to include the management of the hard copy well data.


PI (ERICO) enhances digital offerings and scanned image database (December 1997)

Erico - PI's UK arm - is to introduce the Scanned Image Database, comprising some 30 GB of well data from the Department of Trade and Industry.

Erico has been a service provider in the North Sea for 24 years. Erico described its activity over the last 3-4 years to PDM as a move from vertical, hierarchical drill-down through its data to horizontal linkage across different datasets. PI (Erico's parent company since 1990) promotes the P2000 relational database product as its flagship software. Unlike some competing products, P2000 imposes a large number of constraints and integrity checks on the data that goes into it. The downside is that, because of these tests and other constraints on the data, loading can take quite a while. But the upside is that when populated, the database is clean and should be surprise-free from then on. P2000 is built around the PPDM data model, with extensions developed by PI. The domain focus is primarily well and stratigraphic data.

DTI dataset

A recent addition to Erico's product line is the DTI Scanned Image Database, which is now claimed to be the most comprehensive database of its kind in the industry. Scanned composite, velocity and mud logs are available for all 2800 UKCS released wells. In addition, scanned sonic and formation density logs are available for all wells released since January 1997 and will shortly be available for all released wells. The data includes logs, seismic sections, engineering data and completion reports. These are scanned and indexed and represent around one million A4 pages and over 30 GB of data. Overall PI (Erico) has around 75GB of data online.

Web access

PI is convinced of the future of the web and browser technology as its data browsing and delivery mechanism of choice. A re-vamped website allows for browsing of header information, with viewing of real data scheduled for early 1998. PI supplies Arthur Andersen with its dataset for inclusion in AA's PetroView product. PI's development environment includes Oracle of course, Access, Visual Basic - in particular the MapInfo OCX - and the Norwegian product PixTools for bitmap viewing.


Corelab to offer seismic processing through Scott Pickford Houston Unit (December 1997)

Scott Pickford, now part of the Corelab group of companies, is offering seismic data processing services from its Houston office.

In what is described as the "continuing convergence" of the Corelab companies, Scott Pickford will be offering its specialist processing services utilizing a core ProMAX system. Scott Pickford highlight their experience in processing large volumes of data from difficult areas such as the sand dune terrains of the Sahara. Both 2D and 3D datasets have already been processed from Algeria and Tunisia, and Scott Pickford have developed specialist solutions including proprietary refraction statics routines. Other interpretive processing is on offer using Scott Pickford's proprietary software: Velit for depth processing and IC2 for attribute analysis.


GeoFrame, POSC and compliance revisited (December 1997)

GeoQuest reveals its plans for making GeoFrame ‘POSC compliant’ and extending the data model to other E&P domains.

At last month's POSC meetings in Dallas, Schlumberger-GeoQuest's product line was in the spotlight at the first of a new "Supplier Workshop Series". The idea behind the workshops is for a supplier member of POSC to lead a half-day workshop following each POSC Member General Meeting. Larry Denver (GeoQuest) traced the development of GeoFrame from V 1.0 in '93 to V 4.0, due to be released in 1999. By that date GeoFrame should incorporate inter-well imaging, production analysis, simulation and drilling - in other words integrating many of the domains currently covered by Oilfield Manager (OFM). Interest was expressed in the availability of a GeoFrame developers' toolkit, which is currently being shipped to "key" clients. Thirty companies have already purchased the development toolkit. As for the holy grail of plug and play, as we have previously discussed here in PDM, this will only be a reality for applications developed in the GeoFrame environment. In the POSC environment, the migration to "full" Epicentre is obviously of interest. GeoQuest stated that this would occur over the next "two to four years".


Najib Abusalbi (GeoQuest) provided some information on the implementation of Epicentre in GeoFrame. Some 20% of the 600 plus entities in GeoFrame are extensions including many derived attributes (with stored procedures). This is partly due to "historical usage" but there are areas where the corresponding Epicentre attribute has not been used. GeoQuest has already implemented its own Business Objects (see this month's PDM lead) which provide application access to the physical data model - avoiding direct SQL table queries. Some of this work is fed back to POSC.
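The idea of a business object standing between the application and the physical data model can be sketched as below. This is a hedged illustration only: the schema, class names and the derived attribute are all invented, not GeoQuest's actual implementation.

```java
// Invented stand-in for a physical database row that applications would
// otherwise reach with a direct SQL table query.
class BoreholeRecord {
    final double topDepth;
    final double bottomDepth;
    BoreholeRecord(double topDepth, double bottomDepth) {
        this.topDepth = topDepth;
        this.bottomDepth = bottomDepth;
    }
}

// Hypothetical business object: applications call methods instead of
// issuing SQL, and derived attributes are computed behind the interface
// (in a real system, perhaps by a stored procedure).
class BoreholeBusinessObject {
    private final BoreholeRecord record;   // stands in for the physical row
    BoreholeBusinessObject(BoreholeRecord record) { this.record = record; }

    // Stored attribute, surfaced through the object.
    double topDepth() { return record.topDepth; }

    // Derived attribute: computed on demand, not stored.
    double grossInterval() { return record.bottomDepth - record.topDepth; }
}
```

The payoff is that the physical tables can change (or migrate toward "full" Epicentre) without breaking applications, since they only ever see the object's methods.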


Current GeoFrame architecture does not use the POSC Data Access and Exchange layer and GeoQuest has essentially gone for a proprietary DAE, although the PRISM LightSIP DAEF (see October PDM lead) is under evaluation. Bulk data in GeoFrame is not handled through the POSC defined Frames concept, but rather through the POSC DAE Bulk Data Access Library (BDAL) specifications. GeoQuest uses the notion of POSC compliance very widely in its marketing effort and came under fire for the potential "confusion" that this might cause. POSC is therefore very interested in a compliance verification process, while GeoQuest appeared reluctant to submit themselves to such a test.

PDM comment: GeoQuest like to talk about POSC compliance because they perceive a marketing advantage - and will very likely customize or open up POSC type entry points to their POSC-committed oil company clients. What is less likely is that such plug and play facilities will be offered to GeoQuest's competitors. This is understandable in the commercial world, and reflects the fact that, while the technicalities of interoperability have been investigated in great depth, the existence of a business model that might support interoperability is a rather naive assumption. In other not so far removed fields, such as UNIX or the current Netscape vs. Microsoft courtroom battle, interoperability has either proved a myth, or is the centerpiece of out-and-out commercial warfare. Rather than coming up with some minimalist POSC compliance testing schema, it might be worthwhile to develop even a theoretical business model for how interoperability could be made to work commercially. After all, we are all in it for the money!


Geophysics on CDROM review (December 1997)

PDM reviews the Elsevier Science Seismic Exploration on CD-ROM which contains the most frequently used volumes from two book series, Geophysical Publication's 'Handbook of Geophysical Exploration' and Elsevier's 'Advances in Exploration Geophysics', as well as a large collection of related abstracts.

The product is described as "a formidable reference library available at your fingertips". Installing and setting up is near instantaneous, requiring a mere 7MB on the user's hard disk to access the 182 MB of data on the CD-ROM. A quick visit to the help file and you get the general idea of how to use the search tools, which allow you to search and browse the reference books and a large collection of recent abstracts from the literature. Everything you see can be printed out or cut and pasted to other applications. Elsevier claim that the CD-ROM was developed to be 'user-friendly' and that the information can be easily accessed without prior training or referral to complex manuals. In essence this is true. The interface is efficient and hassle free and all the functionality of the program is available with minimum effort.


Klaus Helbig and Sven Treitel co-authored the CD-ROM and Helbig describes it as "a new way of getting geophysical reference books to the individual", allowing the individual researcher access to a "private reference library". In his introduction to the CD, entitled "The Future of Books and Journals", Helbig makes a quirky argument in favor of the printed word (and the modern "printed" media - the CD-ROM) over the "evanescent" recording of the Internet. It is true indeed that the way in which URLs are suppressed or moved around on the internet detracts from the permanence of the archive, but it is a shame perhaps that no pointers are included on the CD to geophysical sites on the net. I looked up tape formats on the CD and accessed an erudite article on the basics of magnetic recording, and an image (of rather poor quality) of a sample recording format. Some pointers to the SEG would have been useful - or even a cross reference to the SEG's Georom series.

Standard sought

Overall the usefulness of this product can be judged from the contents of the volumes that make up the CD. If you have found either of the titles useful in the past, then the compact access offered by the CD-ROM should be a timesaver. At PDM we already have the SEG's Georom, the AAPG Bulletin on CD ROM and a couple of other reference volumes that are regularly used, and what we'd like for Christmas is a) a CD Jukebox to put them all in and b) a standard interface for full text search across multiple reference works. Maybe we should start up our own committee…


GeoGraphix announces GES97, Prizm v2.1 and data partnering program with Platte River. (December 1997)

Three new software components have been released by GeoGraphix - Landmark's PC software arm.

The GeoGraphix Exploration System (GES) 97 promises enhanced deviated well functionality, expanded support for international map projections, open data access to the GES database and better integration with the other products in the GeoGraphix product line. Prizm V2.1 lets users present log interpretation results in map view and allows for the calculation of curve statistics such as average and total net pay, sand count, porosity, and Sw. These can then be gridded and plotted in GES97. Improved multi-well crossplotting functionality is also claimed. A new addition to the GES97 Workbench is ResMap, which allows earth scientists and petroleum engineers to incorporate production and pressure data into geologic interpretations, property evaluations and field operations. ResMap offers data plotting in formats such as pie and bubble charts, in what is described as presentation-quality basemap display.

Point and click

Users have point and click access to mapped data so that for instance a click on a well can bring up log interpretations, production history, formation top data etc.

GeoGraphix has teamed with Platte River Digital Cartography to resell digital data as part of its expanding data partnership program. This enables GeoGraphix clients to buy third party data directly from GeoGraphix for use in the GeoGraphix Exploration System. The program is augmented by the GeoGraphix "QuickStart" service offering, which includes digitizing, data loading, network and system setup and other consulting services. GES97 also paves the way for improved integration with Landmark's OpenWorks database. Through a combined Landmark/GeoGraphix effort, a Windows NT utility is under development to allow the easy migration of well data between OpenWorks and GES97 projects.

Oil Company IT spend rankings (December 1997)

Informatiques Magazine (France) has just published a study, performed by UK-based Spikes Cavell, investigating IT spend at the top 500 European companies. PDM offers an analysis of the oil and gas sector's IT spend.

With their permission we have extracted the figures for the main EU oil companies and present them in the table below. It is much easier to say why such studies are flawed than to draw meaningful conclusions from such broad-brush data. For instance, what is an oil company? Some of these figures include large downstream or even non-oil and gas components. What is an employee?


A company that posts a low IT spend may have achieved this by outsourcing a whole service such that the IT component is hidden in an asset's budget. Notwithstanding the caveats, we present these figures for your perusal.



Company             Employees   IT spend (M$)   IT spend/employee ($)   Turnover/employee (k$)   Turnover (M$)
..                  136 646     ..              2 963                   ..                       38 047
..                  123 046     ..              3 772                   ..                       47 256
British Gas         54 754      ..              4 048                   ..                       13 618
..                  41 803      ..              4 282                   ..                       25 921
..                  9 994       ..              4 336                   ..                       7 423
..                  91 544      ..              4 342                   ..                       32 758
..                  58 150      ..              5 563                   ..                       57 168
Elf Aquitaine       89 500      ..              7 626                   ..                       39 367
Norsk Hydro         32 353      ..              7 995                   ..                       12 870
..                  1 296       ..              9 774                   ..                       1 009
..                  18 616      ..              12 122                  ..                       17 913
Royal Dutch Shell   104 000     1 583           15 224                  1 060                    110 192
..                  13 653      ..              17 652                  1 401                    19 122
..                  ..          ..              25 405                  1 996                    1 191
..                  55 425      ..              8 936                   ..                       30 275

Some correlation can be seen between headcount and turnover, and between IT spend and turnover. But a graph of IT spend per employee against turnover shows very little correlation. This may be because IT spend is generally disconnected from headcount: just because you work for a company with a high turnover and a big IT budget, it doesn't necessarily follow that you will have a whopper of a workstation on your desk. This will come as no surprise to the beleaguered data management departments throughout the E&P world. Overall, IT expenditure per employee ranges from around $3,000 per annum to over $25,000.
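
The weak relationship is easy to check from the table itself. The sketch below is an editorial illustration (not part of the Spikes Cavell study): it computes a plain Pearson correlation between IT spend per employee and turnover for the rows where both figures are quoted, and cross-checks that Shell's per-employee figure multiplied by its headcount reproduces the quoted $1,583M IT budget.

```python
# Back-of-envelope check on the Spikes Cavell figures quoted in the table.
# Pairs are (IT spend per employee in $, turnover in M$); company names are
# omitted where the survey table left rows unlabelled.
rows = [
    (2_963, 38_047), (3_772, 47_256), (4_048, 13_618), (4_282, 25_921),
    (4_336, 7_423), (4_342, 32_758), (5_563, 57_168), (7_626, 39_367),
    (7_995, 12_870), (9_774, 1_009), (12_122, 17_913), (15_224, 110_192),
    (17_652, 19_122), (8_936, 30_275),
]

def pearson(xs, ys):
    """Plain Pearson correlation coefficient, no external libraries."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

spend_per_emp = [row[0] for row in rows]
turnover = [row[1] for row in rows]
r = pearson(spend_per_emp, turnover)
print("IT spend/employee vs turnover: r = %.2f" % r)

# Cross-check: Shell's 104,000 employees at $15,224 apiece should reproduce
# the quoted IT budget of roughly $1,583M.
shell_it_spend_musd = 104_000 * 15_224 / 1e6
print("Implied Shell IT spend: $%.0fM" % shell_it_spend_musd)
```

As the article notes, the interesting point is not the exact coefficient but how little of the per-employee spread the turnover figures explain.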

SAP's secret

The comparison with the world at large places the oil business at the mid-to-low end of the range in terms of the percentage of gross revenues spent on IT. Banks and financial services top the chart, with up to 9% of turnover spent on IT. Interestingly, some software and services companies such as SAP and CAP Gemini have a very small IT expenditure relative to their own turnover, around 1% in both cases. Perhaps they know something we don't!

GeoQuest data management sales to Siberia (December 1997)

GeoQuest has made sales of data management software and services to two administrations in Western Siberia.

GeoQuest will provide software, hardware, training and technical support to the Geological Committee of the Administration of the Khanty-Mansiysk Autonomous Okrug. The contract calls for the installation of GeoQuest's Finder integrated data management system, including the SeisDB seismic trace management and archival extension and the AssetDB physical data management and ordering system. GeoQuest will load all the seismic and well log data in the region. New data collected from contractors operating in the area will be loaded and archived as it is delivered to the center. Located in Khanty-Mansiysk City, the data management center will be operational by the end of the year.

World Bank

Another similar sale involves an $8.4 million contract to deliver data management software and services to TomskNeftegas, designed to optimize the exploration of the company's oil fields. The two-year contract, funded by the World Bank, calls for GeoQuest to provide data management software, hardware and training to set up two fully synchronized databases linking the Western Siberian sites of Tomsk and Strezhevoy. A production data management system will also be set up in Strezhevoy to handle data from 16 fields in the area. In addition, GeoQuest Reservoir Technologies will conduct a major field study of the Sovietskoye field.

"Information from the synchronized databases will help TomskNeftegas engineers make day-to-day operating decisions and provide data for the field development plan," said Jorgen Rasmussen, GeoQuest district manager for the CIS and Eastern Europe. TomskNeftegas will use GeoQuest's Finder data management system, including the LogDB curve archival extension, to support data management operations. GeoQuest will also provide integrated geological, geophysical, petrophysical and reservoir simulation software, workstations and computer systems. Integrated workstation software to be installed includes the CPS-3 mapping and surface modeling system, ECLIPSE reservoir simulation, the GeoFrame Petrophysics reservoir characterisation system and StratLog geological interpretation software.

PGS and MR-DPTS awarded Petrobras Contract (December 1997)

PGS Data Management and MR-DPTS have been awarded a contract by Petrobras to preprocess and input data to Petrobras' new GeoBank data management system.

GeoBank is PGS' data management solution built around IBM's PetroBank data repository. MR-DPTS will remaster 250,000 9-track and 3480 legacy seismic tapes to new 3590 high density media and prepare the metadata for PetroMaster, PetroBank's index system. At the same time, PGS is to scan 725,000 paper and microfiche records to optical disk for nearline access. With the opening up of the Brazilian E&P scene, the intention is that the data store will develop over time into a fully fledged national data repository, with multiple client sites throughout Brazil. Petrobras is currently preparing for a licensing round in January, so there is something of a rush on to populate the database before the onslaught of the "January Sales".

IBM names new General Manager for European Petroleum (December 1997)

IBM has just appointed Bill Miller as General Manager Europe for the Petroleum and Process Business Unit. Miller replaces Gordon Philipson who is leaving IBM and returning home to the US.

Miller was previously Vice President of Marketing for the North American Petroleum and Process unit. His old boss, Erwin Staudt, IBM's Global General Manager for the sector, said of Miller "his in-depth knowledge of the sector, his creativity and energy make Bill the ideal person to direct our team and help our clients enter the world of e-business". The IBM press release does not make clear quite what this "e" business is. Energy? No, you old fuddy-duddies, we are all in the electronic business now, by golly.

Oil-Link boosts capacity (December 1997)

Oil-Link, which describes itself as 'another Yahoo, just for oil and gas', has moved to a considerably faster server with DS3 (fibre optic) connectivity.

In addition, Oil-Link has been redesigned and now boasts over 2,600 oil and gas websites listed in 47 different categories. Oil-Link claims to be the largest such directory on the internet and a "significant resource for oil and gas employment, with 60-100 new job openings being posted each month". Oil-Link receives approximately 6,000 user visits monthly from over 45 different countries, including representatives of every major oil & gas and service company. The rationale behind Oil-Link is that, since finding information on the web is becoming more and more difficult with the exponential growth in the number of websites, a dedicated oil and gas search engine is required.

Low S/N

The low S/N ratio was recently demonstrated by an Alta Vista query for the keyword "oil" that returned 1,000,000 responses. At the same time, universal directories such as Yahoo! are said to be far from comprehensive for verticals like oil and gas. Oil-Link services include:

Bookstore - In conjunction with, Oil-Link maintains a bookstore exclusively for oil and gas. Here you can browse the latest titles and purchase books directly from - secure, online, real-time.

Employment - A place for locating or posting jobs, resumes and consultant information.

Industry News - In conjunction with NewsPage, daily headlines from 31 categories of oil and gas news.

Oil-Link is provided free of charge and does not require registration (a great boon if, like us at PDM, you can never remember your password). Listings in the directory are free and can be submitted online, though you must have a page on the web to get listed. Oil-Link is financed by what is described as a modest fee for Premium Listings, together with paid banner advertising.

© 1996-2021 The Data Room SARL All rights reserved. Web user only - no LAN/WAN Intranet use allowed. Contact.