March 1999

'Open' Strategy to the fore at GeoQuest Forum 2000. (March 1999)

Some three hundred attendees at the Paris GeoQuest Forum 2000 conference heard of GeoQuest's latest technological offerings. Of particular note was the new 'open' strategy, with a keynote address on Open Spirit, and a demonstration of remote data acquisition and management from the well site.

At the Paris Forum 2000 GeoQuest confirmed the opening-up of their software products to third party developers and other vendor datastores. The open strategy comprises three components:

The Finder/Enterprise database for the corporate data store,

The GeoFrame Development Kit, allowing integration of third party software at the project database level,

OpenSpirit at the application level, for integration with other major applications.

GeoQuest acknowledge this as a change in strategy 'driven by industry needs'. In fact, over the last year, GeoQuest have moved from being something of a laggard to the forefront of the openness field, particularly with the backing they are now giving to OpenSpirit.

commercial pressure

Ax Hesterman from Shell gave the keynote presentation on OpenSpirit as seen from Shell's perspective. Hesterman's presentation offered an interesting insight into the commercial pressures behind OpenSpirit. While both PGS and GeoQuest have responded to Shell's integration position by offering to join the OpenSpirit Alliance and port their apps to the new framework, Landmark only offered to use OpenSpirit as a means of integrating in-house developed applications.


GeoQuest's ambitious targets for the deployment of OpenSpirit are:

Interoperability demonstration using commercially targeted applications at the SEG, Nov 1999

Products ready for the commercialization cycle, Dec 1999

Complete OpenSpirit implementation over the next 2-3 years.

Virtual wellsite geology

Following Pilenko's focus on automation, announced at last year's Cannes Forum (see PDM Vol. 3 N 4), the Oilfield Services group participated in a proof-of-concept demonstration with a live hook up to the Villejust test well in the Paris Basin. Jean Marc Soler demonstrated real time data communications allowing the wireline unit to send log data during acquisition to LogDB and GeoFrame, with control of remote operations through an InterAct video linkage.


Such data transmission is claimed to be secure, and is part of Schlumberger's commercial offering today.

This technology will help to ensure that the 'wellsite' geologist will actually witness the logging operation instead of supping at the local café. Forum 2000 may have been a success in 1999, but will pose a problem for GeoQuest marketeers in the year 2000. What will they call the Forum then?

Click here to comment on this article

If your browser does not work with the MailTo button, send mail to with PDM_V_3.3_9903_1 as the subject.

Petroconsultants - there's trouble at t'mill*! (March 1999)

A report in the Geneva-based journal Le Temps describes the difficulty of restructuring a traditional European business.

IHS Energy is having a hard time restructuring its European unit, Petroconsultants, according to a report by Roland Rossier writing in the local newspaper Le Temps. Firing CEO Jean Christophe Fueg and handing over the management function to IEDS has ruffled feathers amongst Petroconsultants employees, who regard the merger with IEDS as a 'dismembering' of the scouting group. Fueg has been replaced by Jan Roelofsen, who described Fueg's ousting as due to "friction with the IHS management". One difficulty facing IHS management is inherent in the nature of the scouting business. Whereas a restructuring oil and gas company may fire 'expendable' G&Gs, much of the value of Petroconsultants' scouting asset lies with the people.

* translation - "there is trouble at the mill".


Bruce Rodney, from UK-based consultancy Exprodat, analyses current trends in data management. (March 1999)

In this guest editorial, Bruce Rodney argues that the current economic climate mandates a new approach to data management. Bruce compares heavyweight and lightweight approaches to the problem and argues in favor of greater flexibility.


The persistently low oil price has forced enormous changes throughout the industry in the past year. Companies are looking for ways to cut costs across the board, and budgets for 1999 have been slashed. Where do data and information management (IM) fit into this world of contracting service budgets? There is a real and increasing tension between the requirements of data management, which concern the longer-term preservation of data assets and the building of corporate information and knowledge, and those of the business, which is increasingly short-term and decision-focussed, with a use-and-discard approach to project data. The following current and predicted trends characterize this tension:

Data/Information Trends

Companies are assessing their entire IM strategies. They are looking for more flexible data access and purchase models, faster delivery to the business user and tighter integration with their existing systems. There is also an increasing trend towards the remote provision of data and applications/analysis, where the information purchased and acquired resides with, and is managed by, vendors, and is not physically duplicated at the client site. Data are increasingly becoming a commodity that can be bought when and if needed, allowing database population to take place on demand. Data vendors will standardize their offerings to better integrate with other data and software vendors, and data delivery via the Web will increase.

Technology Trends

There is a move away from large, central, ‘big bucket’ corporate repositories to more distributed ‘leave-it-where-it-is’ systems. Aiding the fragmentation of corporate databases is the increased use of project databases, which have improved rapidly in terms of coverage and application integration. There is a trend towards the acceptance of ‘visual integration’ vs. actual integration, i.e. applications that allow access to distributed data in a way that is transparent to the business user. The need to model the world in a structured fashion (relational databases with vast data models containing thousands of explicit attributes) is declining.


Unstructured information (in documents) is becoming more integrated into E&P workflows. The data management application market is dividing into two categories. Heavyweight data management supports physical asset management and corporate data (e.g. Finder/xxxDB, PetroBank MDS, PetroVision). Lightweight data management supports subsurface field or prospect business processes, i.e. application, project or regional data management (e.g. GeoFrame, OpenWorks OpenExplorer, and PetroBank PDS). Companies are looking to apply technologies and processes that can reduce their IM costs. There is a growing acceptance that Web technologies can provide greater flexibility when it comes to integrating corporate information.

Organization & Process Trends

The centralized approach to data management, where services are provided by a central department, is less justifiable today given the pressures on costs. Looser, asset-focussed distributed data management organizations will increase. Formal data management systems with precise definitions of roles and responsibilities and exhaustively documented procedures and standards, i.e. a 100% ‘belt-and-braces’ approach, will be replaced with less formal ‘80% solutions’ that meet business needs differently.

IM framework

Data management needs to be integrated into an IM framework, including electronic document management systems, knowledge management, GroupWare and the Web. Managing data (at least the heavyweight component) may well be an integral part of managing the business, but it can no longer be considered a core business in itself. It is therefore an attractive target for cost reduction through outsourcing.

Data/Information Management Improvement

What companies will require in the next few years is to manage the tension between short-term business activities and longer-term data management. The simplest ‘solution’ is to scale down current data management systems in line with required cost reductions. But this may result in a failure of the basic data management commitment - to provide data of known quality on demand to business users - and the deterioration of data assets over time.


What may be required is a system that borders on organized chaos; a flexible, 80% solution that is squarely focussed on the business. Exactly what shape this could take, and what the quantitative level of savings and tangible added value would be, will vary from company to company. Companies can undertake IM organization, architecture and strategy reviews to improve their IM service provision.


Shared Earth Model nears with GeoFrame 3.5 (March 1999)

Property3D and LPM now integrated with GeoFrame offering what is claimed as the 'First Complete Reservoir Modeling Workflow on an Integrated Platform'.

Property3D is a three-dimensional geological modeling package allowing users to create property models, perform statistical analysis and determine reservoir connectivity. Results from Property3D can be used in FloGrid for upscaling to generate reservoir property descriptions for reservoir simulation. GeoQuest's LPM software is a surface-based mapping application for mapping reservoir properties within a reservoir. Zone maps can be guided by relationships between log data and seismic attributes. Both deterministic and stochastic methods are available.


Property3D and LPM are part of GeoQuest's push for a seamless interpretation to simulation workflow. This begins with seismic interpretation, moves through the three-dimensional property model, and ends in the fluid-flow model in FloGrid. LPM can be used to generate two-dimensional property maps based on seismic and well log information.


These property maps can then be leveraged by Property3D or directly by FloGrid to enhance the reservoir model.

Larry Denver, vice president of Marketing for GeoQuest, claims that "This reservoir characterization system lets engineers and geoscientists bridge the gap between the geological and geophysical interpretation and the reservoir simulation model."


Calgary-based PPDM Association seeks volunteers for new Advisory Council. (March 1999)

The Council will help the Board 'identify the strategic priorities for information management that can be addressed by the Association’s products and services'. Suitable candidates will be 'senior executives of organizations for whom E&P information is a vital part of their business and who are respected by the industry, as offering valuable non-partisan advice'. The Council has no direct authority or responsibility in the Association, and only gives non-binding advice to the Board. If you feel that you fit the bill, talk to David Fisher at 403-691-4227 or


People on the move (March 1999)

Andrew Sitek has joined Kelman Seismic Processing as Processing Geophysicist. Andrew was previously with Amoco and Paradigm. IHS has jockeyed its top brass: Dave Noel is now CEO of IHS Energy's information business, Keith Neal has been appointed CEO of data management services, and Susan Whitbread has been appointed CEO of the economics and consulting business. Coherence Technology Company has promoted Evelyn Medvin to Vice President, Interpretation Services. Medvin joined CTC in 1997; she was previously with Cities Service Oil, Occidental, GeoQuest and Landmark. Another CTC promotion goes to Tim Rondstadt, who is now Vice President of Sales and Marketing. Tim was with Amoco, Zycor, Landmark and GeoQuest before joining CTC.


GeoGraphix Ships GeoGraphix Release 99.1 (March 1999)

New version of Landmark affiliate GeoGraphix' interpretation suite promises improved integration with GES, SeisVision and Prizm.

GeoGraphix has announced the immediate availability of GeoGraphix Release 99.1, which offers GeoGraphix users the capability to dynamically share SeisVision and GES well data. Changes made to one interpretation are automatically updated in the other interpretation.


Bob Peebler, president of Landmark Graphics Corporation states "Release 99.1 is another tremendous step for GeoGraphix in our goal to achieve full integration across our product lines. With each six-month incremental release, our users are going to experience a higher level of integration in their workflows and interpretations."


Part of Release 99.1, SeisVision v.4.5 includes new functionality. SeisVision users can now interpret multiple 3D surveys in a single project. In addition, multiple versions of the dataset are also allowed with the SeisVision v.4.5 release, so that interpreters can now compare different versions of the data – unmigrated with migrated, or unfiltered with filtered data.

6 months

Also provided with the new version are new tools for mapping, depth converting 2D interpretations, and merged 2D/3D project interpretations. Release 99.1 is the second consecutive GeoGraphix release to occur in the newly adopted "Integrated Timed Release Cycle". The Integrated Timed Release Cycle involves a synchronized six-month release period of GeoGraphix products. More from


Troika hosts PESGB SEG-Y website. (March 1999)

The new version of the SEG-Y standard is now available for consultation on the Troika and Associates website.


Kerm's Korner free newsletter. (March 1999)

Assiduous scribe Kerm Yerman is offering a free newsletter with extensive coverage of the Canadian and International Oil & Gas Industry. For details


Paradigm - new open forums for software users (March 1999)

Paradigm Geophysical now offers users of Echos and Ergos a forum for the exchange of views on its products and technology issues. Check the forum out on


Epicentre V2.2.3 Release Preview (March 1999)

POSC has announced proposed extensions to the Epicentre data model to be released as version V2.2.3 in June 1999.

The enhancements will cover new reference data, the incorporation of gravity and magnetics (this work was performed as part of the BGS/Ark Geophysical FieldBank Project), and new well classes and reference data developed in conjunction with the UK Department of Trade and Industry and as part of the WIME project (see the article by Nigel Goodwin in this issue). More information from


Caesar Systems PetroVR 1.4 (March 1999)

Caesar Petroleum Systems has released Version 1.4 of Petroleum Ventures & Risk (PetroVR), its project evaluation and decision support system.

Some highlights of the latest version of Caesar Systems' PetroVR are:


Decisions are time-based and can be integrated with a project schedule.

Spider plots allow analysis of the impact of an input parameter on project economics.

Portfolio Manager is enhanced to allow flexible and intuitive portfolio analysis. More from


CMPT to offer helpdesk for EU funding (March 1999)

UK-Assist is a new service from the Center for Marine and Petroleum Technology to help UK-based companies access EU research grants.

CMPT is to initiate a helpdesk service, UK-Assist, to help UK-based companies benefit from European Union funding for hydrocarbon technology R&D under the recently announced Fifth Framework Program. CMPT's Irene Hepburn said "The aim is to maximize the success of UK-led hydrocarbons technology development projects in gaining European funding". More from


Epicentre and POSC/CAESAR, Part Two of Three (March 1999)

Nigel Goodwin of Essence Associates continues his analysis of upstream data modeling with a look at the concept of inheritance

What is inheritance? As we concluded in last month's article, the data modeling world distinguishes two types of inheritance. Epicentre and POSC/Caesar both use variants of the EXPRESS data definition language which lets the programmer define inheritance of attributes and relationships. This is inheritance at the entity-type level.

A simple example of entity-type inheritance from Epicentre is the entity-type 'activity' and the sub entity-type 'well activity'. 'Well activity' inherits all the attributes and relationships of 'activity', but in addition it has a relationship to 'well', so you can say which well the activity is being performed on. Standard relational database management systems do not support inheritance. Indeed, it appears that even Oracle 8i does not support inheritance. Implementing entity-type level inheritance requires modern technology, such as that provided by PrismTech, or "projecting" the data model and implementing it in more traditional relational DBMSs.
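To make the idea concrete, here is a minimal sketch in Python rather than EXPRESS (Python classes express the same sub-typing directly). The 'activity' and 'well activity' entity-types are from Epicentre; the attribute names and the projection function are our own invention for illustration:

```python
from dataclasses import dataclass
from datetime import date

# Entity-type inheritance: WellActivity picks up every attribute of
# Activity automatically, and adds one relationship of its own.
@dataclass
class Activity:
    name: str
    start: date

@dataclass
class WellActivity(Activity):
    well_id: str  # the extra relationship: which well the activity is on

# A crude "relational projection": flatten the inherited attributes into
# one row per sub-type, as one would when targeting a plain relational DBMS.
def project(a: WellActivity) -> dict:
    return {"name": a.name, "start": a.start.isoformat(), "well_id": a.well_id}

job = WellActivity(name="logging run", start=date(1999, 3, 1), well_id="W-101")
row = project(job)
```

The point of the projection is that a conventional relational table has no notion of sub-typing, so the inherited columns must be written out explicitly for each sub-type.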

data level

The second type of inheritance is at the data level. In POSC/Caesar, there is a class of equipment called ‘pump’ and another called ‘centrifugal pump’. These are held as data in the equipment class entity type. A centrifugal pump is a special case of a pump. Everything we can say about pumps (e.g. they usually have a maximum flow rate) can also be said about centrifugal pumps. Inheritance can be used to manage this information in a more compact form: we don’t have to repeat the fact that all subclasses of pump may have a maximum flow rate, we just define it for the superclass ‘pump’.
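Because the class hierarchy is itself data, data-level inheritance can be mimicked by walking up the superclass chain at query time. A minimal sketch, assuming the pump classes from the article (the 'equipment' root and the resolution function are our own additions):

```python
# Reference data: each equipment class names its superclass (None at the root)
# and carries only the attribute definitions introduced at its own level.
superclass = {"centrifugal pump": "pump", "pump": "equipment", "equipment": None}
attributes = {
    "equipment": {"manufacturer"},
    "pump": {"max_flow_rate"},            # defined once, on the superclass only
    "centrifugal pump": {"impeller_diameter"},
}

def inherited_attributes(cls):
    """Collect attribute definitions from cls and all its superclasses."""
    attrs = set()
    while cls is not None:
        attrs |= attributes.get(cls, set())
        cls = superclass[cls]
    return attrs
```

A resolver like this is the kind of 'on the fly' trick the article mentions below; caching its results works well precisely because the reference data library changes slowly.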

Express lacking

This is inheritance at the data level, and surprisingly, there is no capability in EXPRESS to achieve this, because EXPRESS, like SQL, is dealing with the entity-type level. Data level inheritance is common knowledge amongst the POSC/Caesar community, although it is not explicitly mentioned in the specification.

When implementing POSC/Caesar in old technology, inheritance at the entity-type level can be catered for by methods equivalent to the Epicentre relational projection, although POSC/Caesar do not at present (and have no plans to) define a relational projection. Inheritance at the data level must be managed using other tricks, such as managing data inheritance ‘on the fly’ or using caching techniques, which can be quite successful because the reference data library, by its nature, does not change frequently.

business domains

The wellbore and its associated activities and equipment form one business domain where Epicentre and POSC/Caesar meet up. A current POSC project, WIME, has just released version 1.0 of its specifications. WIME defines drilling and well operations activity and equipment types. The project used concepts from both Epicentre and POSC/Caesar, although it leans more to the latter. The results are being integrated into the next version of Epicentre.

Oracle 8?

PDM has already touched upon the object extensions available in Oracle 8 which are to be deployed in project Synergy (PDM Vol 4 N 2). How much will be gained from the new technology depends on whether Oracle 8 can handle inheritance at the data level, whether it can manage documents (both as binaries and broken down into their graphical elements), and whether it can provide units of measure utilities. If the move is from a traditional SQL-type DDL to something similar to EXPRESS, then the benefits are not likely to be great. However, if it is a move towards a full engineering DBMS, including units of measure handling and inheritance at the data level, then this would help application developers bring their products to market more rapidly.


A good data model should be implementable with acceptable performance and should cater for a range of implementation technologies. It must model the business needs of the domain adequately and should be flexible enough to cater for changing business needs - including expansion. It should not be any more complex than necessary and should be suitable for ad-hoc queries by computer literate end users. Finally, the model should be coherent, unambiguous, and have a common style throughout.


IHS to team with GIS specialist Geodynamic Solutions Inc. (March 1999)

Houston-based Geodynamic Solutions, the GIS specialists, are to provide their expertise in GIS front end development to IHS Energy. The goal is to have 'a seamless, enterprise-wide GIS and data management solution for the petroleum industry'.

IHS Energy Group has signed a letter of intent to form a strategic alliance with Geodynamic Solutions Inc. (GSI), a petroleum industry leader in GIS software technology. IHS Energy Group's P2000 (petroNet21) will provide the integrated data management system designed around the PPDM-based PIDM database. GSI will be responsible for developing and supporting an enterprise GIS product suite and will also offer additional services such as consulting, training, custom application development, and database solutions. "We look forward to combining our database management capabilities with Geodynamic Solutions, an industry leading GIS technology solution provider," said Dave Noel, CEO of Worldwide Information Services for IHS Energy Group.


"GIS technology and large, integrated datasets need to be accessible to the entire enterprise," said Kirk Barrell, president and CEO of Geodynamic Solutions, Inc. "The relationship between our two companies will allow us to provide a seamless, GIS and data management solution to the industry. GIS technology integrated with IHS Energy's well and production data provides a real advantage to companies wanting to operate more efficiently. It is critical for customers to have access to accurate, comprehensive and up-to-date information, and to have the technology to access and analyze that information. GIS technology is rapidly emerging as the primary tool for searching and analyzing spatial data in the petroleum industry."

Fall 1999

Geodynamic Solutions' enterprise GIS product suite, to be released in the fall of 1999, is being developed with software technology from the Environmental Systems Research Institute (ESRI). It will consist of GIS applications utilizing ArcView, MapObjects, MapObjects Internet Map Server, and the Spatial Database Engine. More from the website on


Coastal buys in to Powerhouse (March 1999)

Coastal Oil & Gas has awarded GeoQuest a five-year, $2.9 million contract for PowerHouse E&P data management services and Finder software.

GeoQuest will team with Geco-Prakla, a division of Schlumberger, and Data Logic Services, a division of IHS Energy Group, to provide E&P data management in support of Coastal's U.S. and international operations. The agreement provides for the outsourced management of Coastal's seismic, well and log data through the GeoQuest Data Management Center (DMC) in Houston.

Makes sense

"With today's market conditions, it makes more sense than ever for oil and gas companies to look for more cost-effective ways of doing business," says Stan Wedel, vice president, GeoQuest North America. "The PowerHouse service was designed to make an impact on the operations of our customers through its ability to leverage critically skilled personnel, best practices and the DMC facility to deliver a high-quality, yet cost-effective data management solution to the industry. We welcome Coastal Oil & Gas Corporation into our growing family of users for PowerHouse data management services."


Conference Report - Knowledge Management in Oil and Gas (March 1999)

The First Conferences' Knowledge Management in Oil and Gas event was well attended, with over 100 delegates. PDM offers the highlights and an insight into current practices in Knowledge Management.

Kent Greenes and Chris Collison (BP) presented BP's attempts to "learn faster than the competition". Where the rubber hits the road, this is achieved through BP "Connect", a web-based yellow pages service which allows projects to be monitored before and after they are undertaken. "After Action Reviews", where lessons are learned and recorded from a project, are a key component, and the Yellow Pages allow employees to post their experiences to form "communities". So far there are 170 communities and the website has had 4 million hits since 1/1/99. It is interesting to note that material posted is not checked for accuracy!

not k-management

Projects from Statoil, Enterprise and Shell de-emphasized the knowledge management component. Experience has shown that excessive focus on theory tends to hinder uptake. Thus knowledge management becomes "knowledge networking" for Statoil, "learning networks" for Shell, and the knowledge worker is re-assigned to the position of "knowledge minder" with Enterprise.

Martin Vasey and Ken Pratt, BG Technology, presented the "Technology Bank", an Intranet which merges in-house information with outside feeds such as the Financial Times and the Economist. Current and legacy in-house reports are available on-line as are competitor intelligence, BG technology reports, business units' databases and a technology, skills and contractor database. Technology deployed includes RetrievalWare from Excalibur. This allows for both structured (database) and unstructured (full text) storage and retrieval.

No buy-in

Arjan van Unnik (Shell) originally had limited success and 'no buy-in' for knowledge management within the organization. KM concepts were therefore re-packaged in two more palatable ways. One, the Community of Practice (CoP), is typically a large, distributed group of users spread across the organization, while the other, the Distributed Project Team, comprises a small team with a clear deliverable. Cost per community is of the order of several $100k per annum. Van Unnik claims this is a lightweight solution, "there is no point in doing more, we are going for the low-hanging fruits". Some of these concepts have been discussed at the San Francisco Workshop, where Shell and 7 other companies share CoP experiences and data, and are developing tools for measuring CoP performance. Shell's main technologies are Altavista Forum and the Mezzanine document management product.

no capture

Intriguingly, there is no systematic capture of corporate knowledge, no strong measures in place to keep knowledge within the company.

fair CoP

More CoP tales were related by John Keeble from Enterprise Oil - Enterprise is aiming to remove time and geography as barriers to communication by setting up a knowledge community. Four pilot projects will show benefits within a six month time frame. Some recommendations from Keeble:

KM is all about people. Technology can make it fail, but will only contribute to success.

Remove KM theory and language - and then remove some more!

Confidentiality - "Don't put anything on the web that you wouldn't put on the company notice board".

Use good consultants, for Enterprise this was Arthur Andersen, but Keeble told PDM "the final choice came down to the individuals on offer from the consultant".

PDM Comment

What is Knowledge Management? There are as many definitions as there are protagonists. We suggest that it is best understood as a Trojan horse for management consultants. What the Trojan horse carries into the organization is whatever the consultant currently has on offer. This may be a counsel-oriented offering (coaching of teams in communicating with each other). It may be some theory-oriented novelty or a hybrid offering incorporating an IT component. Finally it may be just the dressing-up of a prosaic IT service (Intranet or email) into a "K-M initiative". Intriguingly, success stories from Statoil and Shell both involved the dressing down of Intranet-based projects from grand K-M projects back into a more straightforward categorization.


Another important categorization and a key to understanding the positioning of KM in the organization is the rediscovery of matrix management by the KM industry. As management fads have come and gone, one persistent trend is the separation of the business into Asset and Discipline related chains of command. Thus an organization may have a drilling manager for a geographical asset - the North Sea, and also a head of drilling at headquarters who holds corporate technical responsibility. Few organizations can afford to staff a full matrix and this is where the KM practitioners come in with the offer of a Community of Practice. This is a virtual community with members working in Assets, but contributing their knowledge to the Community of Practice through the Intranet.

While almost anything can be passed off as knowledge management, we offer the following distillation of the essence of the current KM project.

The essential KM project

The basics of KM are a) capturing 'knowledge' from employees and b) distributing it around the organization. The theory of what Knowledge really is can be rather painful. There is an attempt to define knowledge as intrinsically different from 'data' or 'information'. This then raises the level of debate above the level of populating the corporate database and opens up the field to allow the company intranet to become the vehicle for distributing knowledge.


Of course some of these distinctions are quite artificial. The breakdown into data, information and knowledge may actually reflect the IT behind the solution more than the intrinsic nature of the problem.


Calgary Raster Logs Revival (March 1999)

While other parts of the data management world fret over objects and hi-tech, the Calgary marketplace is getting fired up over low-tech well log raster images. Vendors are rushing to sign up customers amongst Calgary's 400 plus oil and gas companies.

There are about 330 000 wells in western Canada, almost all of which have wireline logs in the public domain. These have previously been available as either hardcopy prints or microfiche, but what has changed recently is the arrival of the depth-registered raster log image.


This hybrid technology - mid-way between dumb images and digital logs - offers a significant business benefit. The image effectively captures all the information on the original log, with none of the risks involved in digitizing or vectorizing. Depth registration also allows log intervals to be called up from the database intelligently.
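Depth registration amounts to a calibration between depth and pixel row. A toy sketch of how an application might locate a log interval inside a registered image (the simple two-point linear registration is our assumption; real log images may need piecewise calibration to handle varying print scales):

```python
def depth_to_row(depth, top_depth, top_row, bottom_depth, bottom_row):
    """Linear two-point registration: map a depth to a pixel row."""
    frac = (depth - top_depth) / (bottom_depth - top_depth)
    return round(top_row + frac * (bottom_row - top_row))

def interval_rows(d1, d2, top_depth, top_row, bottom_depth, bottom_row):
    """Row range covering the requested depth interval, ready to crop."""
    r1 = depth_to_row(d1, top_depth, top_row, bottom_depth, bottom_row)
    r2 = depth_to_row(d2, top_depth, top_row, bottom_depth, bottom_row)
    return min(r1, r2), max(r1, r2)
```

With the row range in hand, serving "the logged interval from 1100 to 1200 m" is just a crop of the stored image, which is what makes the database retrieval 'intelligent' without any vectorizing.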


Calgary's fiber network infrastructure contributes to the success of this technology with the image stored on a central server, allowing the customer to retrieve the log from the workstation. Logs can also be delivered on CD or DVD.


The main vendors are:

QC Data - delivery of images via the Accumap application and their "Accumap Opportunity Network" (exclusive fiber connection).

International Datashare Corporation - delivery over one of three competing fiber networks.

MJ Systems - images from their fiche library, delivered on CD or DVD. (MJ Systems is now collaborating with geoLogic - see last month's PDM.)


The big advantages of raster log images are:

1. Every log on every well is "instantly" available (not just the usual suite).

2. There is no interpretation or error in the process of capturing the log image and no worries about smoothing, splicing errors or the wrong sample rate.

The drawbacks:

1. Limited application support (so far) for interpretation using these images.

2. The images are raster, not vector, so you can't do cross-plots, petrophysical computations, rescaling, stretch/shrink, etc.

More information is available from the vendor websites.


Veritas Gold promises 'instant access to industry data'

Veritas GeoServices Ltd., a Veritas DGC company, has announced the implementation of the Veritas Gold (GeoServices On-Line Data) browser, an Internet-based GIS application allowing immediate access to geophysical industry data.

Utilizing a multi-theme map-based GIS interface with Internet browsing technology, Veritas Gold enables clients to view and query public data sources such as well spots, well headers, land sale notices and seismic sale data (including metadata and visual data examples).


Optional data sets include pipeline, DEM, environmental, well tickets, land sale results and leased crown data. Clients can also access satellite imagery and digital elevation displays, request survey audits and order hard copy maps directly from Veritas on-line.


Gold is based on the Public Petroleum Data Model Association's PPDM data model. Dave Berard, Coordinator of On-Line Services with Veritas explained, "A natural progression from this public data model was to enable clients to view, access and manage their proprietary seismic data archived at Veritas. Clients have the optional ability to privately display other proprietary data such as land holdings through a secure layer within Veritas Gold."


Berard went on, "Veritas Gold is an Internet application that employs a true virtual computing model. Clients do not need to make an additional investment in hardware or software to take advantage of the advanced capabilities of Veritas' data management resources."


Initially available for the Canadian market, Veritas Gold will be targeted at additional markets at a later date.



Windows NT in Petroleum Computing dissected at Pohlman International

A new report from Pohlman investigates current trends in E&P operating systems

The Pohlman report offers an analysis of Windows NT from an oil industry perspective and involved a survey of executives, decision makers, users, and IT professionals.

Oils & vendors

Coverage includes both oils and vendors. The report is said to be vendor neutral and examines the current deployment of NT, its strengths and weaknesses, product availability on the platform, market forces and trends and a forecast of future developments.


The Report is available as an internet-delivered PDF electronic book or on paper for $1,995.00. A discount is available for orders placed before April 30, 1999.


XoX Corp. remodels board and posts profit. (March 1999)

Thanks to a $5.75 million deal struck with GeoQuest, hi-tech 3D geometry specialists XoX are showing revenue growth.

XoX, the 3-D geometric software specialists, have posted 1998 net revenues of $2,241,389. Most of the revenue hike comes from the deal struck with GeoQuest, which selected the XoX SHAPES geometric engine as the core technology for its Shared Earth Model (see PDM Vol 3 N 3). XoX states that geosciences will continue to represent the main revenue stream in fiscal year 1999.

cost cutting

Both operating expenses and R&D have been significantly reduced in a cost cutting exercise, and changes have been made to the XoX board. Mr. Mark Senn has been promoted from VP Operations to Executive Vice President and COO, while Dr. Pradeep Sinha is resigning as CEO, CTO and member of the Board. XoX Chairman Steve Liefschultz announced that XoX is to open up shop in Houston and has appointed Tim Ryan, previously with Paradigm Geophysical, as VP Sales. Currently 12 geoscience applications companies are developing applications based on the SHAPES geometric computing system.


Landmark rolls out Continuity Cube replacement - PostStack ESP. (March 1999)

Following the protracted courtroom battle with Coherence Technology over the use of the Continuity Cube (see past PDMs), Landmark have bounced back into the coherence/continuity arena with the PostStack ESP product, which analyzes not continuity, nor coherence, but 'similarity' of stacked seismic data.

Designed to "unravel complex faulting patterns and detect subtle stratigraphic features" Landmark's new software PostStack ESP (Event Similarity Prediction) is part of Landmark’s integrated PostStack technology suite.

"Most companies use only a tiny fraction of the data they have available to make multi-million dollar decisions about their exploration and production activities," said Bob Peebler, Landmark president and CEO. "The PostStack technology suite can fundamentally improve basin analysis and prospect generation, as well as greatly enhance the development and production of petroleum reservoirs, through access to subtle, yet critical, information concealed in seismic data."

PostStack ESP, designed to replace Landmark’s Continuity Cube software, is an improved productivity tool for interpreters working in highly faulted areas to unravel complex faulting patterns. It can also detect subtle stratigraphic features such as reefs and channels on seismic data.

PostStack ESP operates on the seismic data itself, yielding a high-resolution and impartial image of seismic features, free from interpretive bias. These diverse images help to support several phases of the E&P process, facilitating interpretation and ensuring a more accurate and reliable interpretation of the data at every stage of a project.
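Similarity attributes of this kind generally compare each trace with its neighbours, with low similarity flagging faults or stratigraphic edges. The sketch below shows one common generic measure - normalized cross-correlation of two traces - purely by way of illustration; Landmark's actual ESP algorithm is proprietary and not described in the release.

```python
# Generic trace-similarity measure (normalized cross-correlation).
# Illustrative only - not Landmark's proprietary ESP algorithm.
import math

def similarity(trace_a, trace_b):
    """Return 1.0 for traces of identical shape, lower values where
    neighbouring traces diverge, e.g. across a fault."""
    dot = sum(a * b for a, b in zip(trace_a, trace_b))
    norm = math.sqrt(sum(a * a for a in trace_a) *
                     sum(b * b for b in trace_b))
    return dot / norm if norm else 0.0

print(similarity([1, 2, 3], [1, 2, 3]))  # → 1.0 (identical traces)
print(similarity([1, 2, 3], [3, 2, 1]))  # < 1.0 (dissimilar traces)
```

Mapped over every trace pair in a stacked volume, a measure like this yields a 'similarity cube' in which discontinuities stand out - the general idea behind coherence/continuity products.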


© 1996-2021 The Data Room SARL All rights reserved. Web user only - no LAN/WAN Intranet use allowed. Contact.