May 1998

Landmark signs $28 million deal with Statoil (May 1998)

Landmark makes major sale of software and services to Norway's Statoil

Landmark has pulled off a huge deal with Statoil, the Norwegian state oil company, for the provision of a broad range of its integrated software, data transcription, project data management, support, training, and professional consulting services. The six-year contract covers Statoil's worldwide exploration and production business groups. Landmark will work with other Statoil-selected vendors to integrate their applications with the OpenWorks environment, using the open industry standard POSC Epicentre. Robert P. Peebler, Landmark president and CEO, said, "Landmark is extremely pleased to form this strategic alliance with Statoil that is a key element of aligning their organization to realize their worldwide business goals. By changing to Landmark's open and scalable integrated environment, Statoil is leveraging its extensive sources of existing data while laying the foundation for the future."

6,000 licenses

Randi Grung Olsen, Statoil senior vice president E&P, added, "We have chosen to work with Landmark to ensure the implementation of a technical computing environment across multiple disciplines including geophysics, geology, petrophysics, and reservoir engineering. Our goal is to develop new integration synergies through technological developments and industry standards to further improve our decision-making processes and value creation for Statoil as a company." Statoil requires flexibility to manage and integrate subsurface data types from a variety of data sources. Specific areas of technological emphasis include 3D visualization and interpretation, interpretive seismic processing, petrophysical analysis, and dynamic reservoir modeling. Their new open and integrated environment will be based on OpenWorks, described as "the industry's most widely deployed and used project data management system with more than 6,000 licenses worldwide".

Epicentre-Based CDS

Randi Grung Olsen continued, "Statoil is committed to industry standards such as POSC so that our personnel have both a robust and open environment now and in the future. We selected OpenWorks because it provides us the best integration environment for our projects, and Landmark is committed to working with us on our future Epicentre-based Corporate Data Store development." Landmark will work closely with Statoil to develop, implement, and deploy the most effective workflow processes for consistent usage by geoscientists, reservoir engineers, and IT personnel around the world. "While technology is an underlying driver and enabler for our industry, it is just as important to address the processes across multiple disciplines and departments, as well as prepare and train personnel to new, innovative approaches in working together," continued Peebler.

Click here to comment on this article

If your browser does not work with the MailTo button, send mail to with PDM_V_2.0_199805_1 as the subject.

Ivey and Pennington go as PI(Dwights)/Petroconsultants inaugurates 'Office of CEO' (May 1998)

Contrary to prior indications, Charles Ivey will not be CEO of the newly grouped PI/Dwights/Petroconsultants (PIDP) subsidiaries of Information Handling Services (IHS). Chris Meyer, President of IHS and Chairman of PIDP, has just announced the formation of what is described as the 'Office of the CEO', destined to 'bring PID and Petroconsultants together under a combined management team'.

It will be a crowded office by the sound of things, with the following inmates -

Jean Christophe Fueg, currently vice president of Petroconsultants, is promoted to president of PIDP's Geneva operation.

Keith Neal, currently managing director of PID's UK subsidiary, is promoted to executive vice president in charge of PID/Petroconsultants' sales and marketing outside of North America. Neal will also continue as head of PI (Erico), the group's UK operation.

Dave Noel, previously executive vice president of PID, is President of Worldwide Technology and Product Development for PIDP.

David Richard, current president of PID's Canadian subsidiary, is now President of PIDP's combined North American Operations.

Mark Rose, currently senior vice president in charge of PID's Legacy and Business Archives operations, is also promoted to executive vice president in charge of US sales and marketing. In a parallel move within the Petroconsultants' organization, Dale Pennington, previously vice president of sales and marketing, is understood to have been ousted. Meyer also confirmed the retirement of Christian Suter, CEO of Petroconsultants, later this month. Meyer said Suter "played a key role in our recent acquisition of PI/Dwights. I look forward to continuing a close working relationship with Christian, who will continue to assist IHS Group as chairman emeritus of the combined PIDP group."


Miles Baldwin, VP of Corporate Development with IHS told PDM that the move to "virtual office" of the CEO was a key part in IHS's strategy. IHS is giving selected domain specialists from different subsidiaries group-level responsibility for policy. This will allow IHS to implement, for instance, a homogenous IT policy throughout the group, built on the best practices of each of the subsidiaries. Baldwin continued "We are excited about the new management of our Energy Information Group. We feel that the Internet has changed the way the information handling business operates, and that we are ready to reap the benefits of these innovations. The management changes are going to leverage the knowledge of all of our subsidiaries and will optimize the synergy of our acquisitions".


Editorial - Is the Web still news? You bet! and for a while to come. (May 1998)

PDM's editor Neil McNaughton investigates the future of web-delivered data and the impact of such technologies on end users and vendors to conclude that this is just a beginning.

Three suppliers have announced new or enhanced web-based data delivery mechanisms this month. But this is really only the beginning. Think what could be done. Many businesses both inside and outside of E&P will be radically changed by this developing means of communicating. Here at The Data Room we studied the development of many of these technologies - particularly the imminent growth of voice telephony over the net - a phenomenon that has been described eloquently as "the death of distance". Voice telephony over the internet could be one of the first examples of a major paradigm shift as a large chunk of the market moves from telephone companies to Internet Service Providers and other new ventures. So how does this relate to E&P and what developments should we expect as distance dies and bandwidth cost tends to zero? The answer is of course that we don't know but that we should be prepared for anything. Before we look at the direct impact on E&P data management, I'd like to digress a bit further, and imagine how some domestic data management might be affected by these changes.

100 GB chez-vous

You may not realize this but you probably have about a hundred gigabytes of digital data in your CD collection. If you are like me, you have a hard time managing this data set. Your daughter may appropriate a few discs for her personal use at a far-off university, and another family member may just leave your favorite Brahms concerto face down in the dust as Sepultura's latest is thrust into the slot. Well, it's all digital, all in the same format, so why not buy a terabyte of storage and manage the data properly? Decide on a metadata schema, with information on personal preferences and commentaries on the performances; you might even like to start a CD management standards organization. In reality, you probably feel as I do that this is an unlikely development, although not inconceivable. And after all, it would leave you with the sort of hassles that you are probably trying to get away from when you come home from work: classification, retrieval software, metadata management and the like.


No, what is a much more attractive paradigm is data-on-demand. Imagine no CDs at all. Just choose what you want to hear on a screen, with some nifty purpose-built software checking your preferences, pushing magazine articles and offering suitable snippets of Bach, Boulez or Barry Manilow as your fancy takes you. Developments such as PI's PetroDirect, Petroconsultants' PetroNET21 and IBM's Surf and Connect are all clearly set to improve on existing services offered by these providers. But they may also be the first signs of a more general move away from in-house data management. Just as we may in the near future be throwing away our CD collections and going for a music-on-demand business model, we may quite soon be able to throw away huge chunks of our in-house data management. There is a fairly substantial step to be taken before companies store their most confidential and proprietary data off-site with a commercial data vendor. But some Norwegian companies already do this with PetroBank. The whole thing becomes a matter of contract, security and data quality.

No more projects!

Such developments have a huge potential spin-off for oil company IT. The problem today, in general, is that we have good applications and poor data management. Well, this moves all the data management to a third party. In fact I wouldn't mind betting that some companies will take the all-off-site route before they are through with fully populating their own in-house systems. It may be the general realization that implementing and maintaining an in-house worldwide data warehouse is simply beyond the capacity of even a major E&P company that pushes us to outsource. The corollary of all this is that the above developments from the data vendors are not toys or simple enhancements of existing products; they may herald a paradigm shift in the way we will do business, and merit very close observation and input from E&P IT. We can browse the data today, but soon this data will be pushed into our projects through subscription services. Heck, we may not even have projects; just sit down at the workstation, fire up the interpretation software and generate 'em prospects. The downside of this, of course, is that PDM will only have about two subscribers… Maybe I should re-write this editorial!


Houston hosts intensive Data Management Week (May 1998)

Data management's star was in the ascendant in Houston last month with the Geoshare AGM, the PPDM and POSC member meetings, the PNEC Data Integration Conference and the inaugural Independent Consulting Services Geoshare short course.

Mark Robinson (GeoQuest), speaking at the Second PNEC Data Integration and Management Conference (PNEC is a division of Philip C. Crouse and Associates Inc.), estimated that around 90% of the oil produced today is managed through Excel, a situation which he described as frightening (a position with which PDM is in strong agreement - see our January editorial). GeoQuest are working on integrating much of the SCADA production metering systems with Finder/Enterprise, and showed video footage of the water front moving through a Nigerian reservoir, obtained from data stored in FinderPRO, the new production extension.

Back door data management

Marion Eftink (Unocal) introduced the concept of "back-door" data management. In theory, data should be managed from the instant it arrives in the exploration department. It should be catalogued and cleaned-up so that, for instance, everyone calls the same thing by the same name. In practice, the power users in the asset teams often get first crack of the whip, and before you can say "data management" there are multiple copies of the same data, with different naming conventions in just about every system within the company. Enter "back door" data management. Instead of attempting the impossible policing of data up-front, Unocal's system uses a GIS front end to the various data stores and implements a system of cross-referencing the different appellations through look-up tables.
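The cross-referencing idea is simple enough to sketch in a few lines of Python. Everything below (system names, local well names, the canonical identifier scheme) is a hypothetical illustration of the look-up table approach, not Unocal's actual implementation:

```python
# "Back-door" data management sketch: instead of policing names at the point
# of entry, a cross-reference table maps each system's local well name onto
# a single canonical identifier at query time. All names here are invented.

CROSS_REFERENCE = {
    # (source system, local name) -> canonical well identifier
    ("seismic_ws", "N/30-6 X1"): "NO-30/6-X-1",
    ("log_db", "30_6_X_1"): "NO-30/6-X-1",
    ("mapping_gis", "30/6-X-1"): "NO-30/6-X-1",
}

def canonical_name(system: str, local_name: str) -> str:
    """Resolve a system-specific well name to its canonical identifier."""
    try:
        return CROSS_REFERENCE[(system, local_name)]
    except KeyError:
        raise KeyError(f"No cross-reference for {local_name!r} in {system!r}")

def gather_copies(canonical: str):
    """List every (system, local name) pair referring to the same well."""
    return [(sys_, name) for (sys_, name), canon in CROSS_REFERENCE.items()
            if canon == canonical]
```

A GIS front end would then call something like `gather_copies` to present every copy of a well's data, whatever each underlying store happens to call it.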

Mobil TCS Update

Mobil's Michael T. Farley gave an update on the linkage between GeoQuest's Finder and Landmark's OpenWorks. As we have revealed previously in PDM, the chosen route to link these two "POSC compliant" environments is… Geoshare! Mobil contracted their first Geoshare half link development to Landmark in 1992 and have since acquired considerable experience in this type of work. This has not been entirely without pain. One problem was that vendor support for Geoshare half links can be lukewarm at times. Other, more fundamental problems, such as ambiguous data model mappings and even the odd bug, have been known to surface, usually with fairly catastrophic results. Today, the majority of Mobil's E&P "primary" data types transit successfully through the Geoshare half links. Work is still in progress on some of the "secondary" data types, such as DST/RFT, which are not currently supported in Geoshare. Farley praised Landmark's support in this effort, with bugs sometimes fixed within 24 hours. Geoshare implementation remains a non-trivial task, but is "better than ASCII transfer". Problems encountered on the project include Geoshare's cryptic error handling and the absence of well status symbol codes in the standard. Current projects include more round-trip testing of data transfer, implementation with OpenWorks 5.0 and Finder 8.5, and SQL links to StrataModel. Mobil's Technical Computing Strategy (TCS) is currently a $110 million project over three years. Farley insisted that the TCS was "an added value project not a cost-saving initiative".


User Group head-count. (May 1998)

PDM offers a head count of attendees at the various user group meetings as a rough and ready measure of the popularity of the different technologies.

Geoshare, POSC and PPDM are to a limited extent complementary technologies, but they are mainly competing ways of achieving interoperability. Each one has its own data model, for a start. To have three user meetings in the same place in the same week gave us a rough and ready way to measure the relative popularity of the technologies. The scorecard is as follows:



Geoshare AGM

POSC Houston member meeting

PPDM Houston member meeting
This actually reflects unfairly on Geoshare, which mustered some 200 attendees via their sponsorship of the PNEC conference. There is, anyhow, no real front runner in this race!


PI/Dwights announces new version of P2000 and roll out of PetroDirect (May 1998)

PI have announced version 2.0 of the P2000 E&P data management system and also a new on-line data service - PetroDirect.

P2000, the PPDM-based data management system designed to store well, production, seismic and lease data, is now installed in a number of oil company client sites such as Marathon, Amerada Hess and Chevron. On the service side, it is the main repository for all of PI/Dwights' (PI/D) massive North American and North Sea data sets. Intriguingly, the product is also used by Landmark's parent company Halliburton! PI (ERICO), PI's UK subsidiary, recently completed the migration of its entire European Well DataBank - a database of well attributes which includes tops, directional surveys, checkshots, log engineering, wireline and DST test data - into P2000. Data quality control is an important facet of P2000, with, for instance, the maintenance of consistent units of measure throughout the model. "P2000 provides us and our clients with a consistent, clean, high quality data set based on an industry leading data model, the Public Petroleum Data Model (PPDM)," says Colin Gray, manager of Data Management Services. "For example, P2000 is being used by ourselves and our clients to routinely handle large volumes of well and production data. PI/D in the US recently loaded its entire well and production data (some 2.8 million wells) from a mainframe into P2000 in just 4 weeks!"
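Unit-of-measure consistency of the kind claimed for P2000 usually comes down to converting everything to a single storage unit at load time. The sketch below is a generic illustration of that idea; the field names, record shape and the choice of metres as the storage unit are assumptions, not P2000's actual schema:

```python
# Load-time unit normalization sketch: incoming records carry their own
# depth units; the loader converts everything to one storage unit before
# the record enters the database. The 0.3048 m/ft factor is exact.

TO_METRES = {"m": 1.0, "ft": 0.3048}

def normalize_record(record: dict) -> dict:
    """Return a copy of the record with depth fields stored in metres."""
    factor = TO_METRES[record["depth_unit"]]   # unknown units fail loudly
    out = dict(record)
    for field in ("top_depth", "base_depth"):
        out[field] = round(record[field] * factor, 3)
    out["depth_unit"] = "m"
    return out

well = {"well": "X-1", "top_depth": 8202.0, "base_depth": 9150.0,
        "depth_unit": "ft"}
print(normalize_record(well))   # depths now 2499.97 m and 2788.92 m
```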

Round the clock

PI/D are also launching PetroDirect, the new online delivery service for exploration data. Customers can now review, order and receive comprehensive value-added exploration data sets using the Internet. "PetroDirect will supply a better and speedier service to customers by providing a wide range of exploration data types, 24 hours a day, 365 days a year, throughout the world," says Keith Neal, Managing Director. He adds, "PetroDirect will provide direct access to value-added data from several thousand released wells in Northwest Europe, as well as data from a substantial number of wells from around the world."

E-commerce "reality"

The online service consists of a browser linked across the Internet to PI’s integrated information server, which makes available digital wireline log data, well attribute data including formation tops, checkshots, directional surveys, core, DSTs, other test data, and also the company's reinterpreted formation pressure data. Customer information, data availability and delivery are handled by the information server. The browser contains a map-based catalogue of the Northwest European datasets including all UK and Irish data, in addition to data from other countries worldwide, and is intuitive to use. After completing security checks, the customer simply selects an area of interest from the map, then identifies the wells of interest and the required data types. Upon confirmation by the customer of the order, the information server then collates and delivers the data across the Internet immediately or at a customer’s specified time. Says Neal, "Electronic commerce through the Internet is now a reality and we see the online delivery service as a natural extension of our existing services. PI is investing considerable resources into providing IT services to customers of which the online delivery service is one. We have a program to add many other complementary data sets, such as the DTI scanned image libraries including indexed well reports. This year will also see the addition of significant volumes of seismic data. Users will immediately realize the benefits in using this service, in preference to the outmoded and inefficient data delivery systems that are prevalent in the industry today." More info from PI (ERICO)’s website at


First Geoshare Course (May 1998)

PDM's editor was invited along to the first edition of Independent Consultant Services' Geoshare short course run by Geoshare guru Bill Sumner.

Pretty well all I knew about Geoshare before the course was that it worked with half links. I had also heard of the "n-squared" problem which is the rapid rise in the number of point-to-point data links needed as the number of data sharing applications increases. The Geoshare solution is to define a standard file format to which all applications can read or write data through what is termed a "half link". The standard file format chosen for Geoshare is similar to the Schlumberger DLIS format. In fact DLIS, Geoshare and the SEG RODE data formats all share the same underlying technology, the API's Recommended Practice 66 (RP-66) format. In a sense, the RP-66 standard could be thought of as a portable operating system for this whole family of data exchange technologies.
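The arithmetic behind the "n-squared" problem takes only a couple of lines to demonstrate. Counting one-way translators for direct pairwise exchange against half links to a shared format:

```python
def point_to_point_links(n: int) -> int:
    """One-way translators needed for direct exchange between n applications."""
    return n * (n - 1)

def half_links(n: int) -> int:
    """Adapters needed when all n applications share one exchange format."""
    return n   # one half link (reader/writer pair) per application

for n in (3, 10, 30):
    print(n, point_to_point_links(n), half_links(n))
# 3 applications:  6 translators vs 3 half links
# 10 applications: 90 translators vs 10 half links
# 30 applications: 870 translators vs 30 half links
```

Hence the appeal of the intermediate format: the cost of adding one more application stays constant instead of growing with the size of the existing suite.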

RP 66

Geoshare works by defining a universal data model (much in the way that POSC and PPDM have done). A half link is built using an application's programming toolkit (API), or with a good knowledge of the application's internal data format, and writes data out of the application into the Geoshare data model. The reverse occurs when the data is read by another application. The use of the RP66 format, which takes care of the low-level stuff, such as how floating point numbers are stored, means that a Geoshare data file can be written to tape, stored on disk, or even sent around the world using the Internet. The actual file structure is in two parts. First comes the Parameter Data. This describes the data to follow in the subsequent Frames, which contain the actual information. A single Geoshare file may contain 15,000 well logs, 17 gigabytes of 3D seismic, and so on. Writing a Geoshare sender is said to be much easier than writing a receiver. "Footprinting" problems, where data is altered or lost in translation, may be associated with the transfer into and out of the Geoshare data model.
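The two-part layout (parameter data describing the frames that follow) can be sketched with a toy serializer. To be clear, this bears no relation to the actual RP66 byte encoding, which is far more involved; the header format, channel names and use of 32-bit floats are all invented for illustration:

```python
# Toy two-part serialization mimicking the parameter-data-then-frames
# layout: a header declares the channels, then each frame holds one row
# of samples. Purely illustrative; real RP66 encoding is very different.
import struct

def write_dataset(channels, frames) -> bytes:
    """Pack a header naming the channels, then one frame per sample row."""
    header = ",".join(channels).encode("ascii")
    blob = struct.pack(">I", len(header)) + header        # parameter data
    for row in frames:                                    # frame section
        blob += struct.pack(f">{len(channels)}f", *row)
    return blob

def read_dataset(blob: bytes):
    """Recover channel names and frames from the packed layout."""
    (hlen,) = struct.unpack_from(">I", blob, 0)
    channels = blob[4:4 + hlen].decode("ascii").split(",")
    frames, offset, width = [], 4 + hlen, 4 * len(channels)
    while offset < len(blob):
        frames.append(struct.unpack_from(f">{len(channels)}f", blob, offset))
        offset += width
    return channels, frames

chans, rows = read_dataset(write_dataset(["DEPT", "GR"], [(1000.0, 55.5)]))
```

The point of the pattern is that a receiver can interpret the frames using only the parameter section, which is why writing a sender (which knows its own data) is easier than writing a receiver (which must cope with whatever the parameter section declares).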

For those who would like a fast track to building Geoshare half-links, Sumner's company, Independent Consulting Services provides a toolkit in the form of the GeoBasic product. This contains a wide variety of low-level building bricks for constructing Geoshare half links.

The one-day course offers an excellent introduction to the standard and I'm sure that if my livelihood depended on it, I could now be out there writing half-links along with the best of them. More info on the Geoshare course and GeoBasic from Bill Sumner on


Common Data Access appoints CEO, and moves away from all-online data storage (May 1998)

Malcolm Fleming, formerly with Kestrel and latterly senior consultant with Paras, has been appointed CEO of the UK's National Data Repository, CDA.

This is a new position and it is the first time that CDA has appointed a full time Chief Executive. Previously member company appointees performed the management of CDA. John Foot, the CDA chairman said "The appointment of a CEO will bring new drive and focus to CDA as it embarks upon the next phase of its development as the UK's common, industry-owned petrotechnical data repository." Another CDA appointment concerns Brian Lucken who will be chairman of the Hard Copy Sub-Committee. Lucken, Data Administrator with Phillips Petroleum Company (UK) replaces Conoco's Isobel Elmslie.

Paul Duller (Amerada Hess) speaking at the PESGB's annual Data Management Seminar revealed further details of the next (seismic) phase of CDA. The driver for this latest phase is the DTI's release of oil company seismic data - a first for the UK, which has had no release mechanism for seismic data until now. CDA is no longer attempting to build a repository for seismic data, which is considered too large a dataset. The new strategy is to establish links to contractors’ data banks, on line or near line, to tape storage companies and oil company data stores. CDA's seismic phase will be fully virtual.


Landmark OEM agreement with Decisioneering heralds Aries Decision Suite (May 1998)

Decisioneering’s Crystal Ball to serve as core simulation technology for Landmark's new Aries Decision Support Package

Founded in 1986, Denver-based Decisioneering Inc., supplier of an Excel plug-in providing Monte-Carlo risk-analysis and simulation, has signed an OEM agreement with Landmark, to package its Crystal Ball software with Landmark’s upcoming ARIES Decision Suite. Scheduled for imminent release, the ARIES Decision Suite is aimed at oil and gas asset managers and comprises the ARIES Decision Tree and ARIES Simulation. "Crystal Ball technology is a strategic necessity for Landmark as we expand our market share into large, multi-national oil and gas companies," said Landmark’s marketing director Randy Smith.

A single-user Crystal Ball license retails at around $500, and this is a widely used, horizontal software tool. Clients include a dozen major oil companies and Landmark's main competitor, Schlumberger. A demonstration version of the product is available at Decisioneering's website. While Monte-Carlo simulation (whose use in E&P goes back a couple of decades) is not exactly rocket science, the strength of a product such as Crystal Ball is its intuitive use and tight integration with the ubiquitous Excel. Further tight integration with the ARIES suite should make for a seductive tool for the E&P planner.
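For readers unfamiliar with the technique, the kind of calculation a tool like Crystal Ball automates inside a spreadsheet can be sketched in plain Python: sample the uncertain inputs, recompute the output cell many times, and read risk off the resulting distribution. The distributions, parameter ranges and volumetric formula below are illustrative assumptions, not anything from Crystal Ball or ARIES:

```python
# Monte-Carlo sketch of a volumetric reserves estimate: sample uncertain
# inputs, recompute recoverable volume many times, report percentiles.
import random
import statistics

random.seed(42)   # reproducible runs

def recoverable_mmbbl() -> float:
    """One trial: recoverable oil in millions of barrels (toy model)."""
    area = random.triangular(2_000, 6_000, 3_500)        # acres
    net_pay = random.triangular(50, 200, 100)            # feet
    recovery = random.uniform(100, 400)                  # bbl per acre-foot
    return area * net_pay * recovery / 1e6

trials = sorted(recoverable_mmbbl() for _ in range(10_000))
p10 = trials[int(0.10 * len(trials))]
p50 = statistics.median(trials)
p90 = trials[int(0.90 * len(trials))]
print(f"P10 {p10:.0f}  P50 {p50:.0f}  P90 {p90:.0f} MMbbl")
```

The spread between the P10 and P90 outcomes, rather than any single "best estimate", is what the planner takes to the decision tree.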


Petroleum Data Manager Interoperability Survey (May 1998)

PDM has polled industry leaders on their visions of how interoperability of E&P applications may become a reality, perhaps within a three-year time frame.

We have been somewhat overwhelmed by the response to our request for information (and opinion) on the subject of interoperability that we launched after last month's issue. We asked a number of leading protagonists of data modeling, E&P computing and interoperability specialists the following questions.

Where are we today with regard to E&P data modeling and software interoperability?

How could interoperability be achieved in a reasonable time frame - say less than 3 years?

What is your position on business objects?

What role does the deployment of a large data model such as Epicentre play in interoperability?

Are there any other routes to interoperability than using a vendor API?

We expected a couple of paragraphs from our respondents, but we were deluged by lengthy and well-argued essays on the subject. So instead of grouping all the replies into a single article, we will be publishing them separately. The first contribution appropriately comes from David Archer, CEO of the Petrotechnical Open Software Corporation (POSC) whose workings have been directed towards that elusive goal of interoperability for nearly a decade now.


The Road to Interoperability (May 1998)

David Archer, CEO of the Petrotechnical Open Software Corporation (POSC), kicks off the interoperability debate with the state of play as seen from this leading standards body.

Interoperability can mean different things to different people. To try to establish some common ground in this complex field, POSC took an in-depth look into industry perceptions of interoperability (see side box). I think that the prevailing view, when people talk about interoperability, is a meaning somewhere at or around Level 4 or 5. To achieve this degree of interoperability across the board will clearly take some time. At the same time a distinction must be made between interoperability across different vendors' platforms and intra-operability within the products of a single vendor. We have already seen evidence of some degree of intra-operability between products within the main vendors' frameworks.


To address the question as to how interoperability might be achieved in a reasonable (three year) time frame, I believe that we will begin to see the capability for Level 2 and 3 interoperability relatively soon; perhaps by early 1999 in the case of the Open Spirit work and the first POSC specifications for Interoperability and Business Objects. The capability for Levels 4 and 5 will evolve over the next 2-4 years as the industry builds new generations of integrated products. I like to say that we're moving to a new generation where we are "building integrated systems in addition to integrating built systems". We'll only see Level 4-5 interoperability if the key E&P application suppliers build on open standards that broadly support these levels of sharing. Such functionality will have to be offered independently of whether they have such capabilities within their own product lines. POSC is very much a supporter of the notion of business objects. Our Interoperability and Business Object Initiative has as its focus sets of E&P Business Objects, along with enabling architectures. We believe that the success of this initiative will accelerate the industry's ability to build, deploy and use interoperable applications.


The next question concerns the deployment of Epicentre as a route to interoperability. As we have long known, common terminology is important for information sharing. A comprehensive life-cycle data model such as Epicentre is a fundamental enabler of this sharing. But it is not sufficient to permit the level of integration (3-5) that the industry requires. Mapping data objects to Epicentre gives us a basis for sharing common concepts between data objects, applications, systems and users. Therefore, Epicentre deployment might be looked at as a necessary, but not sufficient, requirement. Finally you ask if there are other routes to interoperability than vendor APIs. By "Vendors" I assume you mean E&P technical application vendors as opposed to middleware vendors. Certainly the use of vendor APIs improves our ability to share environments and data, and supports application interaction. If these APIs were built on a common basis (with a common data model, a common object architecture including common concepts and interfaces) then the ability to put together interoperable systems (approaching level 4) across product lines would be at hand.


Notwithstanding this, vendors most likely will continue to develop APIs at various levels to support application development. The key issue is how "Open" they are and how much these APIs share common building blocks, so that applications built on one vendor's API can interact appropriately with applications built on another. Basing a vendor's API on open standards (such as Epicentre, POSC Business Objects, etc.) will be a major step forward. We believe that this is where the main focus should be over the next few years. There is an alternate (but not mutually exclusive) idea that middleware vendors will provide computing environments (including APIs) that G&G application vendors can adopt as the foundation of their computing platforms. This is a sound idea if the application vendors see -- as they have already seen in hardware, operating systems, database technology, GUI tools and other areas -- that developing all aspects of the underlying computing infrastructure is not their core competency. Rather they would layer their internal APIs and applications on third party platforms built on industry standard architecture and object definitions.


What is Interoperability? (May 1998)

POSC, as part of the Interoperability and Business Objects Request For Technology, have had a work group look into and delineate what is meant by interoperability.

The team (from quite a number of companies) defined six levels of interoperability as follows:

Level 0 Applications must access the same data store to avoid the data reformatting overhead.

Level 1 End-users can run the same application against different data stores using standardized intermediate data servers.

Level 2 Actions taken in one application can dynamically cause actions to take place in another.

Level 3 Applications can share process or presentation objects/servers such as GUIs. Users can control aspects of multiple applications through a common interface.

Level 4 System integrators can create virtual applications, which have components from different vendors and may be customized to end user work processes.

The POSC work group concluded that to talk about interoperability as though it were a single, "one-size-fits-all" concept is fairly misleading.


98A release of ECLIPSE suite announced (May 1998)

GeoQuest has just announced a new release, '98A', of the ECLIPSE reservoir simulation software.

98A includes the following applications:

ECLIPSE Simulators - ECLIPSE 100, ECLIPSE 300 and ECLIPSE 500

Pre-Processors - EDIT, FILL, GRID, PSEUDO, PVTi, VFPi and SimOpt

Post-Processors - GRAF, RTView and FloViz.

New to the ECLIPSE suite is SimOpt, which optimizes the history-matching phase of reservoir simulation studies. SimOpt allows the simulator to calculate the sensitivity of the simulation results to changes in the values of particular parameters, and offers an interface for managing the sensitivity analysis and results. "The key contribution of SimOpt is that it allows you to assess, in a fraction of the time it has previously taken, the validity of your history match," said George Steinke, vice president of Marketing and Business Development for GeoQuest.

Another new application released with 98A is FloViz, an advanced three-dimensional visualization package. Spatial data input to and output from the ECLIPSE simulators can be displayed with FloViz. This new visualization capability reveals aspects of the simulation model not readily apparent from conventional two-dimensional displays.
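The sensitivity calculation that SimOpt manages can be illustrated with a toy example. SimOpt computes these gradients inside the simulator itself; the finite-difference sketch below (the simulate function and all its numbers are made up for illustration) only conveys the idea of ranking history-match parameters by how strongly the mismatch responds to them:

```python
def simulate(perm: float) -> float:
    """Toy stand-in for a reservoir simulator run: returns the squared
    mismatch between observed and simulated production (hypothetical)."""
    observed = 100.0
    simulated = 80.0 * perm ** 0.5   # dummy response to a permeability multiplier
    return (simulated - observed) ** 2

def sensitivity(param: float, h: float = 1e-6) -> float:
    """Central-difference sensitivity of the mismatch to one parameter."""
    return (simulate(param + h) - simulate(param - h)) / (2 * h)

# A large |sensitivity| flags a parameter worth adjusting in the next
# history-matching iteration; a small one can be left alone.
print(sensitivity(1.0))   # roughly -1600: the mismatch falls as perm rises
```

In a real study the "parameter" would be a permeability or pore-volume multiplier per region, and the payoff is exactly the one Steinke describes: ranking candidate adjustments without running a full simulation per guess.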


One of the advanced features of ECLIPSE is a new parallel option, which enables the computations involved in running a single ECLIPSE model to be divided among a number of different processors on a parallel architecture computer. By splitting the work and arranging for the calculations on the different processors to run in parallel, the total run time for the model can be reduced substantially. "For the user with access to parallel architecture hardware, it dramatically reduces run times without sacrificing either robustness or the full range of available features in ECLIPSE," said Steinke. The parallel option can be run on the IBM RS/6000 SP system, the Silicon Graphics Onyx2 or Origin2000 systems and on the Sun Microsystems Enterprise Servers. In tests, a one-million-cell fully implicit black oil model was run "in a few hours". Before the release of the parallel option to ECLIPSE, this simulation would have taken many days.
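The domain-decomposition idea behind the parallel option can be sketched in a few lines. In the sketch below the per-cell "computation" is a dummy placeholder, not reservoir physics; the point is only the shape of the scheme: deal the grid cells out to worker processes, compute the pieces in parallel, then recombine the partial results:

```python
from multiprocessing import Pool

def solve_subdomain(cells: list[int]) -> float:
    """Stand-in for the flow calculation on one processor's share of the
    grid (in ECLIPSE this would be the real per-timestep solve)."""
    return sum(c * 0.001 for c in cells)   # dummy per-cell computation

def run_parallel(n_cells: int, n_procs: int) -> float:
    # Deal the cells round-robin into one chunk per worker process.
    chunks = [list(range(i, n_cells, n_procs)) for i in range(n_procs)]
    with Pool(n_procs) as pool:
        partials = pool.map(solve_subdomain, chunks)   # runs concurrently
    return sum(partials)                               # combine subdomain results

if __name__ == "__main__":
    print(run_parallel(1_000_000, 4))
```

The real scheme is harder than the sketch suggests because neighboring subdomains must exchange boundary values every timestep, which is why the vendor's claim of no loss of robustness is notable.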


Business Objects Under Construction (May 1998)

The Petrotechnical Open Software Corporation (POSC) has just announced the consortia that have responded to its Interoperability/Business Objects Request For Technology (RFT).

Four responses have been received from consortia, three of which essentially comprise members of the Open Spirit Alliance (OSA - see PDM Vol. 2 N° 12). Of the three OSA-derived submissions, one covers the low-level architectural principles that will provide the services for deploying the distributed objects. Intriguingly, this is the only submission that involves Schlumberger. Next up the complexity chain (or down the abstraction road) is another OSA submission, again of a highly technical nature, concerned with the representation of interpretation objects using "a canonical geometric representation" which is "a discrete topological representation such as a polyline or trimesh". The third OSA submission, like that from the Omega consortium, is more recognizable as an E&P object-oriented project. The OSA Business Object Framework submission includes business objects designed to support seismic interpretation, including three "collections": wells, culture and seismic. Also provided are object viewers and a consistent way of dealing with coordinate systems.
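A "discrete topological representation such as a polyline or trimesh" is simpler than it sounds. The hypothetical data structures below (our own illustration, not the OSA submission's actual schema) show the essential point: a trimesh stores shared vertices plus triangles that index into them, so the topology travels with the geometry and applications can exchange the object without loss:

```python
from dataclasses import dataclass

@dataclass
class Polyline:
    """Discrete representation of e.g. a fault trace: ordered (x, y, z) nodes."""
    points: list[tuple[float, float, float]]

@dataclass
class TriMesh:
    """Discrete representation of e.g. a horizon surface: shared vertices
    plus triangles that index into them."""
    vertices: list[tuple[float, float, float]]
    triangles: list[tuple[int, int, int]]   # vertex indices, one triple per face

# Vertices 1 and 2 are shared by both triangles: connectivity is explicit,
# not something each application must re-derive from raw coordinates.
horizon = TriMesh(
    vertices=[(0, 0, -2000.0), (1, 0, -2001.0), (0, 1, -1999.0), (1, 1, -2000.5)],
    triangles=[(0, 1, 2), (1, 3, 2)],
)
print(len(horizon.vertices), len(horizon.triangles))   # 4 2
```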


The wild-card submission from the Omega consortium sets out to provide an object-oriented framework for developers working in the field of 3D geological modeling. OMEGA is a technology-transfer project intended to recycle technology developed in the aerospace and automobile industries for use in E&P, and is part-funded by the European Union under the ESPRIT program. The OMEGA (Object-Oriented Methods and development Environment for Geoscience Applications) group comprises Matra Datavision, TNO, Sintef, Beicip-Franlab, Volumetrix and BRGM. Omega makes use of the "Business Data Object" concept: objects such as a horizon, fault or stratigraphic unit encapsulate both data and associated processes. The object-oriented nature of these data objects gives them a chameleon-like behavior for the end user: the same object will appear to each user (geophysicist, geologist, reservoir engineer etc.) with the appropriate domain-specific representation. Omega partners are convinced that the interoperability inherent in this technology will change the way in which users work with their data and interact with each other. More info from
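The chameleon behavior can be sketched as follows. The Horizon class, its attributes and the "Top Brent" values are illustrative assumptions, not Omega's actual design; the point is one shared object presenting a different face per discipline:

```python
class Horizon:
    """Hypothetical Business Data Object: one shared object, several
    domain-specific 'faces' for different end users."""

    def __init__(self, name: str, twt_ms: float, depth_m: float):
        self.name = name
        self.twt_ms = twt_ms      # two-way time, as the geophysicist sees it
        self.depth_m = depth_m    # depth, as the geologist or engineer sees it

    def view(self, discipline: str) -> str:
        """Return the representation appropriate to the user's discipline."""
        views = {
            "geophysicist": f"{self.name}: {self.twt_ms} ms TWT",
            "geologist": f"{self.name}: {self.depth_m} m TVDSS",
        }
        return views[discipline]

top = Horizon("Top Brent", twt_ms=2100.0, depth_m=2450.0)
print(top.view("geophysicist"))   # Top Brent: 2100.0 ms TWT
print(top.view("geologist"))      # Top Brent: 2450.0 m TVDSS
```

Both users manipulate the same underlying object, so an edit made in one domain view is immediately consistent with the other.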

IBM, Landmark??

Conspicuous by their absence from the POSC RFT submitters are both Landmark and IBM. An IBM spokesperson told PDM that IBM is following the POSC interoperability initiative closely but is also developing its own object framework through the San Francisco project and the recently announced Enterprise JavaBeans. This new technology from IBM currently supports only the financial side of enterprise computing. This is because a) that is (naturally) where the big IT money is, and b) the relative simplicity and commonly agreed definitions of business objects in the financial field mean that OO technology has already gained some acceptance there. Conversely, the complexity of E&P objects and the small user base of the E&P IT community are the main technological risks for the POSC effort.


OpenSpirit demos and forms SIG (May 1998)

OpenSpirit, the prime respondent to the POSC interoperability RFT, will be exhibiting and demonstrating software at the 60th European Association of Geoscientists and Engineers (EAGE) Annual Conference and Technical Exhibition, 8-12 June in Leipzig, Germany.

Demos will be conducted by engineers from Chevron Corporation and PrismTech and are targeted at both interpreters (end users) and application developers. The demos will include:

Multi-Well Cross-Plot - versions that run on both UNIX and PC.

Log Browsing Applet - a simple web-deployed log browser.

Surface Operations - an illustration of a 'virtual' application that uses a third-party module in conjunction with existing Landmark or GeoQuest applications. Performs surface processing and seismic attribute analysis and display; works with IESX and SeisWorks via CORBA servers.

Rapid - a simple seismic viewer that shows high performance display of successive slices through a 3D volume; accesses IESX and SeisWorks data via CORBA servers.

Well Data Access from Excel - an Excel spreadsheet with macros to allow the direct access of well data (e.g. well headers, well picks) via a CORBA well object server (shows CORBA/ActiveX integration and links to MSOffice applications).

The new OpenSpirit Special Interest Group (OSIG) web pages have now also gone live. OSIG membership costs just $1,000 per annum per company site. More from


Paras announces the Knowledge Management Collaboration (May 1998)

Paras, the Theseus Institute and POSC, with sponsorship from Schlumberger-GeoQuest, have announced the Knowledge Management Collaboration (KMC) project.

The KMC is described as "an ambitious schedule of individual company workshops, group discussions and in-depth research on the role and application of knowledge management in the oil exploration and production industry". KMC's project manager, Paras' Hamish Wilson, stated: "an industry-wide dialogue is the only effective way of funding and managing the resources needed to undertake such an ambitious and important program." Theseus is an international management institute in Sophia Antipolis, France, where a knowledge management competence center has recently been inaugurated. Charles Despres and Danièle Chauvel of Theseus will focus this expertise on issues affecting the KMC. The KMC's goals are to:

Explore and define excellent knowledge management practice in complementary businesses and consider how this can be applied to E&P

Enable participating companies to make evidence-based decisions on the role of knowledge management in E&P

Help each company choose the best technique and service provider

The KMC will start up in May 1998 and is seeking additional sponsors. Companies interested, and with $25,000 to chip in, are invited to contact either Tamzin Howard of Paras on +44 (0) 1983 528700, or Struby Overton of POSC at +1 (713) 784-1880.


© 1996-2021 The Data Room SARL All rights reserved. Web user only - no LAN/WAN Intranet use allowed. Contact.