December 2001

GrandBasin for Shell

Shell Exploration and Production Company is outsourcing its data management to Landmark’s GrandBasin e-business unit. PetroBank and Surf and Connect will provide data storage and access to Shell’s two terabytes of seismic data.

Houston-based Shell Exploration and Production Company has awarded Landmark Graphics’ e-business unit GrandBasin a three-year, outsourced data management contract. The deal covers the management of over two terabytes of Shell’s seismic data using the PetroBank data management system, and the web-based Surf and Connect data browser, both recently acquired from PGS.


Dave Lesar, chairman and CEO of Halliburton, Landmark’s parent company, said, “This data management contract is an example of the great partnerships that result from organizations like Halliburton and Shell E&P working to leverage advanced technology. Shell’s decision to outsource its data management to GrandBasin is a clear indication of the confidence that they have in PetroBank and Landmark’s data management.”


The contract provides for the outsourced management of all of Shell’s US affiliate’s data to Landmark unit GrandBasin. Landmark president John Gibson said, “Shell recognized the strategic importance of the PetroBank vision many years ago by actively participating in its architecture and development. Outsourcing E&P data management to Landmark will give Shell improved access to the right data at the right time and will enable it to focus on its core business.”

Surf and Connect

GrandBasin’s data management tools, including Surf and Connect, will be used to support Shell’s data flows and work processes, to aggregate data and manage projects dynamically. New developments in corporate data management technology from GrandBasin and Landmark will feed back into the partnership.

Change of tack

The outsourcing of Shell’s data management to Landmark represents a significant shift from previous attempts by Shell to provide similar services to the industry at large. As reported in the earliest editions of PDM (Vol. 1 N° 2), Shell originally planned to offer upstream data management support - including PetroBank - through its Shell Services International ‘Veristream’ unit. Shell’s New Orleans Gulf of Mexico unit was also an early PetroBank adopter in-house (PDM Vol. 4 N° 5).

Veritas-PGS merger

Seismic companies Veritas DGC and Petroleum Geo-Services are to merge, creating the second largest seismic contractor in the world.

Houston-based geophysical acquisition and processing specialist Veritas is merging with Norwegian Petroleum Geo-Services (PGS) in what is seen as a “merger of equals”. The as yet unnamed new entity, with annual sales of around $1.5 billion, will be headquartered in Houston.

27 seismic centers

The combined companies will have 21 marine crews, 8 visualization centers and 27 seismic processing centers throughout the world and a 400,000 square kilometer library of modern 3D seismics. Annual cost savings of $35 million are anticipated from increased operational efficiencies and consolidated research and development. Dave Robson (Veritas CEO) and Reidar Michaelsen (PGS CEO) argue that ‘growing global demand’ for seismic data, and ‘ongoing industry consolidation,’ justified the merger.

60/40 split

The deal will be a paper swap, with a 60/40 split for PGS and Veritas shareholders. The new company will be the second largest seismic outfit after Western Geco. The deal is dependent on shareholder and regulatory approvals.

Oil Information Technology Journal

Editor Neil McNaughton tells of plans for a re-launch of Petroleum Data Manager, reviews this month’s coverage of upstream IT, and reflects on what makes an E&P standard.

This is the last edition of Petroleum Data Manager. But fear not, we are not abandoning you! Petroleum Data Manager is to re-launch as Oil Information Technology Journal. Why? The change is simply to better reflect the true scope of our coverage. Petroleum Data Manager has never just been about data management. We have always tried to give you the big picture.

Identity crisis

But in this day and age, it is not enough to supply content; packaging is equally important. We recently carried out a study of PDM readers, both casual and confirmed. This was achieved by in-depth anecdotal collection, chance observations and a couple of phone calls. We concluded that we have a problem of identity. Casual observers have been seen dismissing our publication as ‘just advertising’ - which it is most definitely not. Others have taken a far too literal interpretation of PDM’s name, thinking, for example, that as ‘knowledge workers’ they are not concerned by data management.


This situation has frustrated us for some time, and has only been bearable thanks to a sustained effort of... denial. But something has come up that has rendered this strategy untenable. Our very modest success over the last couple of years has led the French fiscal authorities to make increasingly ludicrous demands on us. So we have to act. I see three choices - exile (far too pro-active for our denial-prone corporate strategists), throwing in the towel (but what would your megalomaniac editor do without an audience?) and getting a larger readership.


The move to Oil IT Journal is intended to more accurately reflect our expanded coverage, and to bring in lots of new readers for whom the ‘Data Management’ tag may be a turnoff. But for those of you who are data managers - and I count myself amongst you - fear not. Our coverage of data management and IT infrastructure will continue unabated. In fact my humble opinion is that our coverage of these matters is actually getting better, and to demonstrate this, I’d like to walk you through some significant developments we cover in this issue of PDM.


From the bottom up, as it were, we see how CGG is moving from tape-based systems to high capacity disks and gigabit Ethernet to support its high performance PC clusters. Similar technology is being deployed by SeiScan to offer highly granular access to seismic data. And Conoco is building a ‘format neutral’ archive for project data. We also report on significant developments on the Statoil Slegge project which are leading Schlumberger to re-tool its data management infrastructure.

Standards Review

Also new in this issue is an extensive review of upstream data standards. This is one of the most researched pieces we have ever done, and for us one of the most illuminating. One tends to think of standards bodies as august organizations that grind slow and fine. But the reality is that standards come and go, much like dot coms in a boom and bust. They compete for pieces of the action, and they are not always aware of what each other is up to.


I would like to be able to provide a pithy summary of what standards bodies are really about. But sometimes, the more closely you observe something, the more complex it seems to get. I’ll just offer up a few random observations and let you draw your own conclusions. Politics, in many guises, plays a significant role in standards. Traditionally, at least in the upstream, most standards have come from either Schlumberger or the not-for-profits. But a couple of years ago a new breed of standards body was formed by two groups of commercial companies. One is definitely deceased, the other is moribund, and neither has made any usable standards public.


My next observation is that the remaining standards orgs have a very different attitude towards life. Take the SEG standards committee for instance, and the new SEG-Y Rev. 1. This was ‘nearly final’ in August 1998 and is still not official. This delay is not an accident. A lot rides on a new SEG standard. As Alan Faichney, SEG standards committee chairman told PDM, the SEG aims “not to be first, but to be certain.”


At a different point in the standards spectrum we have most of the current XML initiatives. These seem to generate instant ‘standards’ which represent, not agreement between different groups of users, but just one possible way of using a new tool. A ‘disposable’ standard if you like. These short-fused standards initiatives have the advantage that they leverage the latest technology, unlike ‘conservative’ standards like the SEG-Y Rev 1.

If it ain’t bust?

My last rambling observation on standards came from a couple of workers involved in the migration of legacy, ASCII or binary standards to XML. Both questioned the dash for XML solutions to problems which have already been solved. Far from being luddites, these observers had noted that the new XML protocols start out looking like panaceas and paradigm shifts. As they move to become industrial strength solutions, and incorporate the hard stuff like scalability and security, they lose some of their shine. The process of moving from a ‘disposable’ standard to an industrial-strength deployment may take so long that technology will likely have moved on by take-up time. Is this the true enigma of the standards development process? That the development and approval cycle time means that you can only ever have old tech standards? Happy New Year.

PDM Interview - Kjetil Tonstad, Statoil

Kjetil Tonstad tells PDM how the Slegge project has turned into the Statoil Corporate Data Store. The CDS is to be the ultimate repository of data which has been QC’d and approved within OpenWorks projects.

PDM We have heard quite a lot about the Slegge project over the last few months but have never quite understood its relationship to POSC Epicentre and Landmark’s PetroBank. Please enlighten us.

Tonstad – The Slegge project, a Corporate Data Store for interpretation results, was originally co-owned by Statoil, Norsk Hydro, NPD and PGS Data Management. Following the acquisition of PGS Data Management by Landmark, we were keen to push the partially completed Slegge project forward. After discussions with all interested parties, we awarded a contract for the further development of Slegge to Schlumberger. Because Landmark still owns the Slegge name, the new appellation for Slegge is the “CDS” or Corporate Data Store.

PDM – How does the CDS differ from the PetroBank MDS?

Tonstad – The CDS represents a move from ‘data container’ to ‘business support’. The aim of the CDS is to preserve the results of our interpretations. Information is extracted from a project and stored in the CDS with extra security, QC and context. Data management requirements, not applications are the drivers. The CDS is decoupled from the Project Data Store.

PDM – The PDS is still OpenWorks?

Tonstad – Yes. Our PetroBank Master Data Store is physically the same as that used by the DISKOS consortium. Data from here is loaded into our OpenWorks PDS for cleanup and interpretation. The CDS has effectively moved from ‘below’ the projects to ‘above’, avoiding the bottleneck of loading to a CDS before the PDS. It also moves data management up the value chain, from infrastructure to business support.

PDM – You don’t mind sharing what is also a public data repository for your MDS?

Tonstad – We have full confidence in the security and entitlements of PetroBank.

PDM – Statoil used to be a strong supporter of POSC and Epicentre. What technology will underlie the CDS?

Tonstad – We are much less prescriptive in terms of technology than in the past. Statoil defines required functionality and Schlumberger implements with such technology as is deemed appropriate. Our focus is now on organizing the workflows around the different data stores. We are importing some constraints from Slegge back into OpenWorks, so that standard nomenclature is available in terms of drop down lists of formation names etc. This has proved an important contribution to interpreter productivity – the use of templates means that a geologist only ‘sees’ one gamma ray log – not twenty! Projects are front-loaded with explanations of roles and process, and a delivery plan is agreed upon, along with minimum audit milestones which are tracked by central data management. It works!

Schlumberger EPDS and the Federator

PDM got a sneak preview of new technology that Schlumberger is developing to fulfill Statoil’s requirements. The Schlumberger E&P Data Store (not Finder!) and Federator (not Open Spirit!) will be commercialized once the Slegge project is completed.

The key objective of Schlumberger’s new Corporate E&P Data Store (EPDS) is to store data and interpretation results of known quality. Such data is stored in the EPDS along with tags identifying its quality level as well as why it was created. The philosophy behind the EPDS changes some of the traditional ways of managing E&P data.


The EPDS uses applications not only for interpretation, but also for data quality control, cleanup and management. Quality assurance is now performed from within the interpretation project. Quality assured data is sent on to the EPDS for storage. This may be interpreted data, in the form of horizons and maps, but also data such as well logs or seismic volumes, that have gone through the QA/QC process. The EPDS will capture approval workflows by tagging data items as they are edited and approved. Each data item has a new set of configurable attribute-value pairs that can be tailored to a particular company’s requirements, so that quality levels can be recorded as ‘unknown’, ‘good’, ‘bad’ or ‘QC’d’ etc. Context, such as the source of data (e.g. contractor name), can be tagged. The client-defined system is dynamic and can evolve with time making the EPDS an information data store rather than a database.
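The attribute-value tagging described above can be sketched as follows. The quality levels and the idea of context tags come from the description; the class and method names are hypothetical, not the actual EPDS API:

```python
from dataclasses import dataclass, field

# Quality levels as cited in the article; the real EPDS vocabulary is
# client-defined and can evolve over time.
QUALITY_LEVELS = {"unknown", "good", "bad", "QC'd"}

@dataclass
class DataItem:
    """A stored item (horizon, log, volume) with configurable tags."""
    name: str
    tags: dict = field(default_factory=dict)

    def tag(self, attribute, value):
        # Enforce the controlled vocabulary for the 'quality' attribute only;
        # other attributes (source, approver...) are free-form context.
        if attribute == "quality" and value not in QUALITY_LEVELS:
            raise ValueError(f"unknown quality level: {value}")
        self.tags[attribute] = value

# Hypothetical usage: tag an interpreted horizon on its way into the store.
horizon = DataItem("Top_Brent_horizon")
horizon.tag("quality", "QC'd")
horizon.tag("source", "contractor: PGS")
horizon.tag("approved_by", "asset geologist")
```

The point of the sketch is that quality and provenance travel with the data item itself, rather than living in a separate application database.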


The underlying database is ‘POSC-based’. Access is performed at the ‘conceptual level’ using the new Schlumberger EPDS Federator. The Federator makes the EPDS database-independent. This will allow Schlumberger to plug into existing Finder installations with the same technology.

Not Open Spirit!

However, the Federator, even though it uses business objects, is not OpenSpirit. OpenSpirit reads and writes data to applications for interoperability. The EPDS is focused on data management, on ‘rich data objects’ with the possibility of ‘fixing’ data. The EPDS is furthermore designed to be configurable by the end users (or at least by their programmers). One objective of the Statoil CDS project is to be able to define new business objects, without coding, in less than 100 hours. The Federator has already been demonstrated and used to manage data in Finder, OpenWorks, GeoFrame and the CDS. The EPDS will be productized once Schlumberger has met its obligations to Statoil.

Intellex Report GOM data bundle

A pre-populated database of Gulf of Mexico well and lease information is included with the latest release of software from Energy Graphics Inc.

Houston-based Energy Graphics has released a new version of its Gulf of Mexico (GOM) dataset and mapping bundle. Intellex Report V 5.0 (IR 5) provides reporting and GIS capabilities for a range of GOM data types.


Enhancements to IR5 include color lease maps, weekly scout check, improved lease and production reporting, and a new ‘e-zine’ to leverage IR 5’s reporting functionality. IR 5 is delivered with Energy Graphics’ proprietary GOM database, which is updated monthly. Other datasets are available at extra cost.


FieldBase is a new GOM field and reservoir database containing information on over 1200 fields, 8600 productive sands and 25,000 fault-block reservoirs. FieldBase is based on data from the Minerals Management Service, U.S. Geological Survey and the U.S. Department of Energy, as well as technical articles in professional and trade journals.

Texas state

Other databases of well information are available for GOM federal waters and offshore Texas state waters. More from

Geographics Release 2001.1

The latest release of the Discovery Suite from GeoGraphix offers more speed and functionality.

Improvements to system throughput, stability and concurrent access performance are claimed for the new release of the Discovery integrated interpretation suite from Landmark unit GeoGraphix. Tests show a two to tenfold speedup over previous versions of the software.

R2001.1

The R2001.1 release includes over ninety enhancements to the system, ranging from simplified user interfaces to streamlined workflows. The new release also offers new tools for depth-domain seismic interpretation, new interactive log analysis features and improved map-making and editing.


Landmark president John Gibson said, “Over the next few years, we believe the industry will see a shift in the way that independent companies or smaller exploration groups apply geoscience technologies, one that will require easier-to-use, value-based, high-performance systems.” GeoGraphix provides integrated geophysical, geological and petrophysical software for Microsoft Windows. More from

SGI and Landmark

Landmark and SGI are to market ‘Reality Centers’ pre-loaded with Magic Earth’s GeoProbe interpretation software.

Landmark Graphics Corp. is teaming with Silicon Graphics (SGI) to jointly market GeoProbe software and SGI Reality Center visualization facilities. GeoProbe was developed by Magic Earth, recently acquired by Landmark parent group Halliburton.


Landmark president John Gibson said, “Landmark is pleased to partner with SGI in this joint marketing effort. GeoProbe and the SGI Reality Center will help our customers locate new reserves and identify by-passed plays. Customer interest in GeoProbe and SGI systems is exceeding our initial expectations.”


SGI CEO Bob Bishop added, “We have 550 Reality Centers around the world and 120 in the energy sector. The oil industry’s future lies in the visualization of large, complex datasets, and in collaboration to improve decision-making and reduce risk. We are working with Landmark to enable this paradigm shift in the use of visualization infrastructures.”


Under the marketing agreement, Landmark will provide the overall project leadership, application software and software-related consulting services. SGI will provide the hardware, display system and consulting services associated with setting up the environment.

IP award for Foinaven VR tour

A VR tour of BP’s FPSO has won Max & Co. and SAIC an IP award for IT.

Aberdeen-based design and communications agency Max & Co. has won the Institute of Petroleum’s information technology award for its extranet-based virtual tour of BP’s West of Shetland Foinaven floating production storage and offloading vessel (FPSO). Developed in association with SAIC Ltd. and sponsored by BP, the VR ‘tour’ familiarizes personnel, and enhances safety on and around the vessel.


The software offers panoramic, fully immersive photography and interactivity to give a realistic view of the vessel’s environment, enabling potential hazards to be identified and managed effectively. Users can navigate around the vessel and access information for virtual risk assessment and safe work planning.


SAIC’s Ben Tye said, “Our relationship with BP encourages identification and application of new technology in support of best practice, particularly in the communication of health, safety and the environment.”

Proxis KM suite from Wellogix

New knowledge management software from Wellogix offers protection against the ‘brain drain’ of retiring experts.

Houston-based e-business software house Wellogix is rolling out its new Proxis Suite of knowledge management tools. Proxis is a web-based set of data source independent applications, designed to improve corporate decision-making and performance through instant, ubiquitous access to stored expertise. The new software facilitates corporate-wide implementation of best practices, policies, and processes.


Wellogix VP Bill Chikirivao said, “Complex industries face a ‘brain-drain’ within the next five years, as veterans depart at a faster rate than new employees can be trained. The Proxis suite mitigates knowledge loss by helping companies transfer knowledge from retiring experts and make it permanent and internally ubiquitous via Intranets. Our new line of knowledge management products and services helps turn knowledge processes into competitive advantages.”


Proxis integrates technical and financial data, making technical-to-business connectivity a reality. The tools also support multiple rule sets to address large organizations in which best practices vary by business unit, geographical region or mission-critical job function.

Hosted solution

Proxis is available as hosted Internet applications or installed as Intranet software within a company’s firewall. Implementation requirements are minimal. A browser and network access are all that is required. More from

Reservoir modeling webcast

Spotfire’s DecisionSite data mining has been used to optimize horizontal well placement.

Spotfire DecisionSite was featured in a webcast by reservoir analyst Richard Reese of Houston-based Oil & Gas Consultants, Inc. Reese showed how Spotfire’s horizontal technology (see PDM Vol. 6 N° 9) helped determine potential wellbore tracks for horizontal infill wells in a producing oil field.

Geocellular model

Using Spotfire’s analytical capability, Reese showed how petrophysical and seismic data, processed and formatted to a geocellular model, is integrated with time-dependent production data, to optimize horizontal wellbores.


Sensitivity analysis was performed at several potential infill drilling locations. The DecisionSite analysis led to a new 2,000 barrel per day oil well in an old field with average production of just 200 barrels per day.

Management buys DPTS from Lason

DPTS’ management has bought the company from parent group Lason International. A joint venture deal with the KPS Group has been struck.

The management of DPTS has acquired the company from troubled Lason International Inc., which filed for Chapter 11 protection earlier this month. The DPTS management team participated in the purchase and will be continuing in their current roles.


MD Alan Jepson said, “We have funding available to enable us to develop our successful data management software and services. We look forward to maintaining the same high quality service that we provide to all our customers”.

KP Seismic

In a separate deal, DPTS has joined forces with Calgary-based KPS Group (formerly KP Seismic) to provide data management services and solutions in the EAME region. Services include data cataloging, capture, digitizing, databasing and management services, and proprietary software products. More from

SeiScan to integrate FileTek StorHouse

A new seismic data management system from SeiScan will offer row-level access to seismic data.

FileTek, Inc. has signed a memorandum of understanding with SeiScan GeoData Ltd. for sales and marketing of FileTek’s StorHouse and Cetera products. These will be integrated with SeiScan’s Magma and GeoServe technologies.


The new system stores seismic data on a common depth point/trace-by-trace basis. This offers users direct access to their data, eliminating the need for complex polycut processes during retrieval. In e-commerce environments, speed of data delivery is said to be critical.
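Trace-by-trace storage with direct access can be illustrated with a toy index keyed by common depth point (CDP). The byte layout below is invented for the example and bears no relation to the actual StorHouse or Magma formats:

```python
import struct

def build_index(traces):
    """Store traces contiguously and map CDP number -> (offset, length).

    traces: iterable of (cdp_number, [sample values]).
    Returns (index, store) where store is the packed byte blob.
    """
    index, blobs, offset = {}, [], 0
    for cdp, samples in traces:
        # Invented record layout: little-endian int32 CDP + float32 samples.
        blob = struct.pack(f"<i{len(samples)}f", cdp, *samples)
        index[cdp] = (offset, len(blob))
        blobs.append(blob)
        offset += len(blob)
    return index, b"".join(blobs)

def fetch_trace(store, index, cdp):
    """Row-level access: read one trace without scanning the whole line."""
    offset, length = index[cdp]
    blob = store[offset:offset + length]
    n_samples = (length - 4) // 4
    unpacked = struct.unpack(f"<i{n_samples}f", blob)
    return unpacked[0], list(unpacked[1:])
```

The design point is that retrieval cost depends on the trace requested, not on the size of the archive, which is what makes trace-granular e-commerce delivery feasible.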


SeiScan’s Chairman Robert Pettit said, “We are delighted to bring StorHouse technology into the E&P market place. The system is the only near-line storage solution to provide row-level granularity of access to both geoscientific and commercial data from one repository.”

Upstream IT standards review

PDM brings you a review of standards body activity with exclusive reports from the American Petroleum Institute, the European Petroleum Survey Group, Geoshare, LAS, Open Spirit, the SEG, WITS and of course POSC and PPDM. When we last reviewed standards (PDM Vol. 4 N° 5), we counted six different organizations in the upstream technical and business space. Since that time, one is officially deceased, one moribund, one quiescent and another has ‘gone commercial’. Life in the standards space can be dangerous! But some dot-orgs are alive and kicking whether they are conservative, reflecting well established best practices, or like the XML brigade, cutting edge and speculative. PDM reports on the survivors and the ‘also-rans’.


Tena Allain heads up the American Petroleum Institute’s (API) ComProServ task group which has been established to investigate standards for financial transactions. ComProServ standards support the specification and execution of complex products and services. Allain told PDM how ComProServ’s twenty member companies have come up with XML-based standards for thirteen transactions, along with a common library and schema.

Cement jobs

ComProServ works to describe complex services such as well casing programs or cement jobs at a data and work order level. Such information will then be traded back and forth between operators, contractors and suppliers. ComProServ began by looking at work done in the chemical and electronic industries (RosettaNet) and plans to re-use this work.


The API work has its roots in previous EDI-based standards, used for invoicing of drilling and geophysical services. The EDI standards are free-form and text-based, so the move to XML allows more structure, context and metadata to be included. According to Allain, most US majors use the PIDEX EDI standards for services and joint interest billing, but there was less take-up for these standards among the smaller independents. The move to XML may increase take-up for the smaller players.

If it ain’t broke…

The ‘migration’ of EDI based standards that are in everyday use to embryonic XML standards is a highly charged subject. As they say in Texas, ‘if it ain’t broke, don’t fix it.’ Allain said that EDI is still going strong. Many users do not even deploy the latest versions of the EDI standards. The advent of low cost bandwidth has significantly reduced the costs of EDI transactions. On the other hand, XML is not all that cheap – particularly when the cost of an Enterprise Application Integration platform such as Tibco is included in the calculation.
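The difference between a flat EDI record and an XML transaction is essentially one of explicit context. A minimal sketch, using invented element names rather than the actual ComProServ schema:

```python
import xml.etree.ElementTree as ET

# Build a toy work-order document. Every value carries explicit context:
# element names, units of measure and attributes that software can
# validate, which a positional flat-text EDI record cannot express.
order = ET.Element("WorkOrder", attrib={"currency": "USD"})
service = ET.SubElement(order, "Service", attrib={"type": "cementing"})
ET.SubElement(service, "WellName").text = "Example Well 1"
ET.SubElement(service, "CasingSize", attrib={"uom": "in"}).text = "9.625"

xml_text = ET.tostring(order, encoding="unicode")
```

The trade-off the article describes is visible even here: the XML form is self-describing and schema-checkable, but also far more verbose than the equivalent delimited EDI line.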


BP’s Roger Lott heads up the European Petroleum Survey Group (EPSG). Lott told PDM how the EPSG started out as an informal meeting of senior survey personnel from the major North Sea operators. In the mid-1980s, the group pooled internally compiled data on worldwide coordinate systems. This grew into a significant resource, which was later leveraged by POSC in the Epicentre data model.


EPSG standards have seen take-up outside of the oil and gas industry – particularly by satellite image specialist SPOT, which rolled the POSC/EPSG work into the GeoTIFF standard for geodetically referenced bitmap imagery. The EPSG work has been further leveraged through the Open GIS Consortium, extending the EPSG’s work into the Geographical Information System community at large. The EPSG data set is now a de facto worldwide reference and is available online as an Access database from the EPSG website.

ISO 19111

The EPSG is currently working on a new version of its specification, which will become the ISO 19111 geodetic standard. This will be available in the next release of the EPSG database – Version 6.1. The EPSG plays something of a proselytizing role with respect to the GIS industry – in the hope that promotion of the EPSG standards will help its members. Other EPSG work includes the issuance of ‘guidance notes’ – best practices for the positioning business.
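At heart, the EPSG data set is a lookup table keyed by numeric code. The codes below are genuine EPSG codes, but the records are radically simplified compared with the real database, which models datums, ellipsoids and projection parameters:

```python
# Drastically simplified EPSG-style registry for illustration only.
EPSG = {
    4326:  {"name": "WGS 84", "kind": "geographic 2D", "unit": "degree"},
    4230:  {"name": "ED50", "kind": "geographic 2D", "unit": "degree"},
    23031: {"name": "ED50 / UTM zone 31N", "kind": "projected",
            "unit": "metre"},
}

def describe(code):
    """Return a one-line description of a coordinate reference system."""
    rec = EPSG[code]
    return f"EPSG:{code} - {rec['name']} ({rec['kind']}, {rec['unit']})"
```

The value of the shared registry is exactly this indirection: two parties exchanging positioned data need only quote a code, not re-negotiate datum and projection definitions.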


Jack Gordon (Conoco), who chairs the Geoshare user group told PDM that “Geoshare is still in use, judging by the 3000 hits per week to the Geoshare website”. Most visitors are looking at the data model itself. Geoshare is a stable data transfer environment with few requests for changes. There have been proposals for an XML-based version of Geoshare; this is still under discussion. Gordon believes that the current Geoshare model, based on the API RP66 protocol, still has merit. The standard offers a sophisticated environment, with built-in support for units of measure (UOM). It is efficient for binary data and is ‘self-documenting.’ But Gordon acknowledges “XML is a wonderful tool – it is great to be able to read the ASCII information.” In any event, future XML standards should build on prior work such as Geoshare’s UOM standards.

Finder to OpenWorks

One significant usage of Geoshare is to connect Finder to Landmark’s OpenWorks. An Excel mapping of Finder and OpenWorks attributes is available on the Geoshare website. Who uses Geoshare today? According to Gordon, Conoco does along with Anadarko and Burlington. From Schlumberger’s ongoing support for the standard, one suspects that the technology still has currency within GeoQuest, but if there are any big Geoshare projects out there, everyone must be sworn to secrecy! Geoshare is calling for papers for the April PNEC conference (see page 10) which follows the AGM.


LAS chairman Kenneth Heslop told PDM that version 3 of the Log ASCII Standard (LAS) is currently under consideration by several administrations as the required standard for data submission. LAS 3.0 handles well logs, core analysis, well tests, and deviation surveys. One current LAS project involves creating a new code system to replace the old API codes and content rules. The new log codes incorporate up to date tool mnemonics and will provide for new codes as new tools are developed.

Content rules

LAS 3.0 content rules let government regulators and users specify minimum acceptable content. The LAS ‘Certify’ program will be enhanced to check an LAS 3.0 file to ensure compliance with both the LAS standard and specific content rules. Work on these two aspects of LAS 3.0 will continue in 2002. A Windows version of Certify is now available for download from the website. This version supports all versions of LAS, and has expanded checking and reporting. Future versions of this application will also support content rule compliance.
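The kind of check a compliance program performs can be sketched by verifying that the mandatory LAS sections are present. This is a toy version, assuming the standard LAS convention that sections begin with a `~` flag; the real Certify program validates far more, including content rules:

```python
# Mandatory LAS section flags: ~Version, ~Well, ~Curve and ~ASCII data.
REQUIRED_SECTIONS = ("V", "W", "C", "A")

def check_sections(las_text):
    """Return the list of mandatory section flags missing from the file."""
    found = {
        line.strip()[1].upper()
        for line in las_text.splitlines()
        if line.strip().startswith("~") and len(line.strip()) > 1
    }
    return [s for s in REQUIRED_SECTIONS if s not in found]

# A minimal, structurally complete LAS fragment for the demo.
sample = """~Version
 VERS.  2.0 : CWLS LOG ASCII STANDARD
~Well
 STRT.M  1670.0 :
~Curve
 DEPT.M : depth
~ASCII
 1670.0
"""
missing = check_sections(sample)
```

A regulator's content rules would then be layered on top of this structural check, rejecting files whose mandatory sections are present but incomplete.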

Open Spirit

Open Spirit CTO Clay Harter gave PDM an update on Open Spirit’s activity. But first we popped the question: Is Open Spirit (OS) really a standard? Harter points out that OS publishes business object definitions on its website. While these “could be implemented” by a third party, Harter admitted that without the OS license and runtimes from Landmark and Schlumberger, this is unlikely to be a practical proposition. The drift away from the POSC business objects standards began when OS developers realized that the division of labor between specification and deployment was not tenable. Harter summed up “‘Standard’ is a bit misleading – OS is rather an ‘interoperability solution.’ We would love it if OS became the de-facto standard for upstream interoperability.” OS is talking to POSC about finding an appropriate way of presenting and describing these aspirations.

Business objects?

OS has ‘borrowed heavily’ from POSC Epicentre and Business Objects and still believes that developing and evolving business object standards should be an open process and that other companies and standards bodies should be involved. This might involve cycling new business objects submitted by OS developers through a POSC-like approval process.

Late binding

OS Release 2.2 introduces new domain objects and extends current objects to the latest releases of GeoFrame and OpenWorks. This buffers OS users from changes in the underlying data stores thanks to a ‘late binding’ interface. Late binding uses the same techniques as the dynamic link library (DLL), prevalent in the Windows environment.
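Late binding of this kind is easy to illustrate: the adapter for a data store is located by name at run time, so the calling code is never linked against any one store's API. All module and function names here are hypothetical:

```python
import importlib
import sys
import types

def get_store(name):
    """Resolve a data-store adapter at run time, like loading a DLL export.

    The caller knows only the adapter's name and the 'connect' convention;
    it is never compiled against the adapter itself.
    """
    module = importlib.import_module(name)   # resolved at call time
    return module.connect                    # bound by name, not link time

# Register a stand-in adapter module so the demo is self-contained.
demo = types.ModuleType("openworks_adapter")
demo.connect = lambda: "OpenWorks session"
sys.modules["openworks_adapter"] = demo

connect = get_store("openworks_adapter")
```

When the underlying store's release changes, only the adapter module needs updating, which is the buffering effect the OS interface claims.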

Future directions

OS 2D grids will be extended to tri-mesh and point sets. Object richness will be extended in the subsurface space and into drilling and production arenas. OS is looking at Microsoft’s .NET technology to leverage legacy (UNIX) data stores. OS is also working with ESRI to interface with ArcView.


POSC chairman David Archer told PDM that 2002 should see a renewal of interest in POSC’s flagship database Epicentre. This centers around work being done for Statoil (see page 3 of this issue) and for the China National Oil Co., big Epicentre users.


POSC’s XML work is ‘maturing’ – with a move to reusable XML ‘components.’ POSC has worked with PPDM on the joint reference values project, resulting in an agreement with PPDM for a cross-posting of related URLs on organizations’ websites. POSC is also working to ‘reduce confusion’ with other workgroups and is sharing personnel with Geoshare. Phase I of the Practical Well Log Standards was completed in 2001. Phase II will continue into 2002 and involves the rationalization of mnemonic data. The Shared Earth Model is now an EU-funded research project, with work being done by the IFP and PDS.

Web services

Reflecting on POSC’s evolving technology focus, Archer said, “We have been through generations of technology – component, virtual data stores, unbundled and distributed architectures. In fact CORBA is still too tight – things need to be unbundled further – especially the stuff that we do not control such as security and authentication. All this calls for a web services approach, leveraging technologies like XML, SOAP and WSFL. Another tendency is the move towards Open Source – we are making the POSC Universal Units Converter available as Open Source.”


The planning process is ongoing, but the current focus is on architecture – the framework – as well as content. Well Header ML is likely to be leveraged by CDA and the MMS. Development of Well Log ML (which ultimately should replace the LAS) continues, and Production ML will be extended to incorporate joint venture reporting – this has application both within the enterprise (Shell is looking to use this internally) and for regulatory reporting.

Other plans include looking into how a well lifecycle approach could keep track of well data throughout its history. The plan is to use something analogous to an automobile chassis number to uniquely identify a well, and to track associated data as it moves between partners and contractors. More work will be performed internally using web reference services to provide a data model dictionary – with dynamic updates of POSC data model changes available on the web. POSC may develop its own Petroleum Industry Data Dictionary (PIDD) with an evolving record of well log mnemonics and industry reference data.

POSC has had a difficult year, especially with a reduced member base due to mergers. These have brought some newcomers into the fold – such as Paradigm. The Indian ONGC also joined in 2001. The relationship with PPDM is aimed at “avoiding duplication and conflict – by ensuring that both organizations are aware of what the other is doing.”


PPDM President Scott Beaugrand told PDM of his success in bringing Saudi Aramco into the PPDM fold. Aramco’s data modeling and application development group is looking to standardize the data model for use with off-the-shelf software. The plan is to ensure that the 50 years of Saudi data will be accessible in a standard, non-proprietary data model.


At the PPDM AGM this year, Woodside Petroleum presented the results of the first ‘real-world’ implementation of the PPDM Spatial I project. Spatial I uses an intersection table to decouple metadata in PPDM form from a generic spatial database. A new PPDM-related project, ‘Spatial II’, has been initiated by ESRI. Spatial II will build a PPDM-based ESRI ‘geodatabase’ for the oil and gas industry. Like POSC, PPDM is working to leverage its data modeling work with XML, and has registered XML data exchange specifications with the repository. Beaugrand believes that there is increasing recognition of the importance of data modeling and that funds are available for good projects. Data exchange, software applications and other services based on the PPDM model have reduced user and supplier costs and improved the effectiveness of information technology.
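The intersection-table idea is simple to sketch. The following is a hypothetical illustration (run through Python’s built-in sqlite3), not the actual PPDM schema – all table and column names are invented – showing how a pure cross-reference table lets the metadata schema and the spatial store evolve independently of one another:

```python
# Hypothetical Spatial I-style decoupling: the xref table holds only
# identifiers, so neither the well (metadata) table nor the spatial
# feature table needs to know about the other's columns.
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE well (well_id TEXT PRIMARY KEY, well_name TEXT);
CREATE TABLE spatial_feature (feature_id INTEGER PRIMARY KEY, x REAL, y REAL);
-- the intersection table: pure cross-references, no domain columns
CREATE TABLE well_spatial_xref (well_id TEXT, feature_id INTEGER);
""")
db.execute("INSERT INTO well VALUES ('W1', 'Discovery-1')")
db.execute("INSERT INTO spatial_feature VALUES (10, 151.2, -33.9)")
db.execute("INSERT INTO well_spatial_xref VALUES ('W1', 10)")

# Metadata and geometry are joined only through the intersection table.
row = db.execute("""
    SELECT w.well_name, f.x, f.y
    FROM well w
    JOIN well_spatial_xref x ON x.well_id = w.well_id
    JOIN spatial_feature f ON f.feature_id = x.feature_id
""").fetchone()
print(row)  # ('Discovery-1', 151.2, -33.9)
```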

Version 3.6

Activity in PPDM’s core business area of data modeling continues with the release of PPDM V3.6 this month. The new release adds to the seismic and information management modules and enhances the stratigraphy module. New additions to the seismic module provide for lifecycle data management, from acquisition through processing and interpretation, including transactions and brokerage. Data encryption, error tracking and tape copying are now supported in PPDM Version 3.6. Entitlements to data, information and products are described in a new module and cost center summaries and pointers allow users to integrate their business and financial databases effectively.


The Society of Exploration Geophysicists (SEG) Technical Standards Committee (TSC) pretty well lays down the law in terms of how geophysical data gets recorded. SEG TSC chairman Alan Faichney of Concept Systems provided this update. The new SEG-Y format (Rev. 1) incorporates multi-component data and integrates the European Petroleum Survey Group (EPSG) geodetic reference data. A final draft specification was published in the Leading Edge in August 2001. Subsequent to this publication, one astute reader pointed out that the multi-component specification omitted multi-component source data. This issue is to be addressed by the committee, and a final revision will be out shortly. Faichney notes that the ‘tone and structure’ of the new revision has been accepted.

Format review

The Shell Positioning Format (SPS) is aimed at land seismic acquisition and solves the problems inherent in using P1 with multi-component acquisition. SPS is widely used by recording system manufacturers and software companies involved in land acquisition. SEG-D is also under review, although again this mainly concerns equipment manufacturers as all data is now transferred to SEG-Y immediately. The new SEG-D will facilitate recording data direct to disk. GXF is a new gravity and magnetics format for gridded data. It has emerged as a de-facto standard and will likely be accepted as a recommendation. Again, the EPSG positional resource will be embedded. Faichney, as SEG TSC chair, has been co-opted onto the EPSG.


The original Well Information Transfer Standard (WITS) specification evolved out of Amoco’s Critical Drilling Facility, Chevron and Mobil’s Data Centers, Statoil’s Drilling Automation and Real Time and other proprietary initiatives. The original WITS format is binary, efficient but not very portable. John Shields (Baker Hughes Inteq) presented a new initiative, WITSML at the POSC fall member meeting in London. BP and Statoil got together to update the WITS standard for ‘right-time, seamless flow of information’ between operators and service companies. BHI, Halliburton and Schlumberger are participating. WITSML will connect to OpenWorks and GeoFrame through an application programming interface (API). In January 2002 there will be commercial offerings from the major service companies and at least one smaller software house. WITSML is web-based, and follows the W3C XML guidelines. This was a ‘good decision’ in terms of web services - it is platform and language independent. The API can be prototyped with Visual Basic. Shields noted that the logs and real time objects are not very standard XML. WITSML ‘looked at’ POSC WellLogML and tried to be ‘close to’ POSC. The real time specification is ‘self-configuring and self-describing’.
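Because WITSML follows the W3C XML guidelines, any standard XML parser in any language can consume it – which is the substance of the ‘platform and language independent’ claim above. The fragment below is a made-up, WITSML-flavored example (the element names are invented; the real schemas are defined by the WITSML workgroup), parsed with Python’s standard library:

```python
# Hypothetical WITSML-style fragment; element and attribute names are
# illustrative only. Plain W3C XML means the document is readable by
# off-the-shelf tooling on any platform.
import xml.etree.ElementTree as ET

doc = """<well uid="W-123">
  <name>Example-1</name>
  <operator>ExampleCo</operator>
</well>"""

root = ET.fromstring(doc)
print(root.get("uid"), root.findtext("name"))
```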

Standards on the web

Log ASCII Standard (LAS)

American Petroleum Institute (API-PIDEX)

Society of Exploration Geophysicists (SEG)

Well Information Transfer Standard


Open Spirit Corporation

European Petroleum Survey Group

Petrotechnical Open Software Corporation

Public Petroleum Data Model Association

The ‘also-rans’

The Oracle ‘Synergy’ project (PDM Vol. 4 N° 5), once a major Epicentre development, is now deceased, reflecting both Oracle and Statoil’s changing priorities. BizTech (formerly COM) for Energy appears moribund, at least judging from its public website. Interestingly, both these projects were heralded by their protagonists as a new breed of service-company funded standards initiatives with more focus and fiscal stability! On the product data/construction and engineering front, the POSC/CAESAR organization also appears to have gone extremely quiet. This reflects both the difficulty of replacing proprietary technologies with standards and the hegemony of the large construction companies.

This review was abstracted from a Technology Watch report by The Data Room. The Data Room produces around 15 in-depth Technology Watch reports per year, along with a detailed annual summary. For more information on The Data Room’s Technology Watch service, fax +33 1 4623 0652 or call +33 1 4623 9596.

Pakistan data management for LMK

Landmark/GrandBasin’s PetroBank and Surf and Connect will be deployed to manage Pakistan’s 10 terabyte national data repository.

The Pakistan government has awarded Landmark affiliate LMK Resources the exclusive rights to market and promote all of Pakistan’s E&P data. The five-year contract involves the archival and management of 10 terabytes of seismic and well data on behalf of the Ministry of Petroleum and Natural Resources in Islamabad. The repository will leverage Landmark’s PetroBank data management system and web-based Surf and Connect data browser.


Petroleum Minister Usman Aminuddin said, “By implementing PetroBank in Pakistan, we join the select group of countries that have their E&P data available on line. We hope to further boost our oil and gas potential by giving companies fast and efficient access to valuable data.” Landmark acquired a controlling stake in LMK Resources (previously Mathtec) earlier this year (see PDM Vol.6 N°3).


Halliburton president Dave Lesar added “This is indicative of the expansion we are seeing in our data management business, driven by Landmark’s acquisition of PetroBank earlier this year.” G.A. Sabri, head of Pakistan’s Petroleum Directorate concluded “Our relationship with LMK Resources has been one of mutually rewarding technology partners. The self-financing data management model is a success story and can be easily tailored for implementation in other countries to provide efficient and cost-effective access to their E&P data.”

New technology key to petroleum future

A new study from Sun Microsystems and CERA concludes that slowing demand and increasing price volatility will make for survival of the technological fittest.

A new study, ‘Global Oil Trends 2002’ by Cambridge Energy Research Associates (CERA) and Sun Microsystems claims that “astute application of technology will be a key determinant of success for the oil industry.” Companies will be under continuous pressure to improve their financial performance, as the industry contends with revenue swings of a dramatic magnitude. Technology will be crucial in the drive to reduce costs and improve margins in both the upstream and the downstream markets.


CERA President Joseph Stanislaw said, “Technological progress in the upstream means prospects can be found, and reservoirs produced, that would otherwise be uneconomic or invisible. The rise in upstream costs since 1996 is already being reversed. Upstream costs in non-OPEC countries are expected to fall by an average of 3% per year to 2010, from almost $9 per barrel, to little more than $7.”


Larry Rice, energy sector manager for Sun Microsystems added, “As the industry looks for ways to use technology, improving data and knowledge management are important areas of focus. Technology has increased flexibility in the downstream industry which is responding to changing regulatory and market conditions. Shorter response times and adaptable operations have contributed to significant reductions in operating costs.”


According to the CERA report, world proven reserves of crude oil as of January 2001 have grown to 1,027 billion barrels (79.2% in OPEC countries), up 11 billion barrels from 2000. US proven reserves increased by 0.8 billion barrels, mostly associated with ongoing development in the Gulf of Mexico deep water and on Alaska’s North Slope. A characteristic of the oil market in recent years is exceptional volatility. In 1998 the annual average price of Arab Light in nominal terms was $12.30, but by 2000 it had risen to an average of $26.75, an increase of 117%.

Vendors adopt Open Spirit platform

OpenSpirit has got new support from a dozen software houses for its integration platform and an endorsement from TotalFinaElf’s researchers.

A dozen or so upstream software vendors have announced that they are to develop OpenSpirit-enabled versions of their software. OpenSpirit allows developers to write to a single application programming interface (API) that supports data access to Landmark’s OpenWorks, Schlumberger’s GeoFrame and, most recently, Finder. OpenSpirit’s shareholders are Chevron, Shell and Schlumberger.


TotalFinaElf (TFE) recently lent its support to the integration platform. Philippe Chalon, VP Information Systems for TFE said “TFE has chosen OpenSpirit as the software integration platform for its internal R&D. OpenSpirit lets us deliver the fruits of our R&D projects to our geoscientists in a timely manner. Through OpenSpirit, these can be integrated with the major E&P applications.”


One typical OpenSpirit development is from NuTec Energy Services. Version 4.2 of NuTec’s Prima uses OpenSpirit to link directly to GeoFrame and OpenWorks, improving data access, workflow and integration of NuTec products with other applications.

Organik KM tool for GTS-Geotech

GTS-Geotech has bought Orbital Software’s Organik knowledge management software for its worldwide IT support operations.

GTS-Geotech has licensed Orbital Software’s Organik knowledge management software to bolster support services to its oil and gas clients. Geotech uses Organik to leverage the expertise of its worldwide team of consultants to resolve clients’ IT problems. Organik now forms part of Geotech’s ExPert intranet, giving users access to expert knowledge sources, a Help Desk system and IT and geoscience applications. ExPert lets Geotech apply best practices, reduces duplication of effort and improves efficiency and productivity. Information can be shared internally and distributed externally to customers and partners.


Alin Farah, Geotech MD said, “Our consultants and engineers need quick and reliable access to information in an accessible, comprehensible form. We expect to see demonstrable value within just a few weeks of implementation.” Organik provides a question and answer forum to efficiently share expertise. Patented ‘people-profiling’ technology creates and dynamically updates user profiles and builds an extensive knowledge database that records previously asked questions and answers for future use.


Orbital Software COO Brian Gray said, “This is a clear sign of the value now attached to the management of intellectual capital in global businesses. Organik is set to become Geotech’s primary communications tool.” Geotech, a privately owned group, provides IT support and consultancy services to oil and gas companies worldwide. Headquartered in the UK, GTS-Geotech has operational centers in London, Aberdeen, Houston and the United Arab Emirates.

AVS/Express illuminates streamlines

Advanced Visual Systems AVS/Express 6 has been updated with new 3D lighting and rendering. The software now handles massive 2 giga-element data sets.

Advanced Visual Systems’ AVS/Express now offers an “illuminated lines” visualization technique. By combining stream lines and particle animation with advanced texture display techniques, the generic data visualization software blends 3D lighting, motion and 3D flow structure in a single view.

Batch rendering

The handling of transparent objects is improved by a new, two-pass method. The UNIX version offers off-screen rendering for ‘windowless’ batch operation. Data management has been enhanced. AVS/Express 6 increases the size limit for arrays on 64-bit platforms to 2 giga-elements. The latest release also sports a thoroughly revamped MPEG generator, readers for NetCDF and Plot3D formats and a utility for combining series of 2D image files into a 3D volume.


AVS CTO Jeff Tingle said, “This release of AVS/Express adds features that appeal to the broadest possible user base while continuing to provide cutting-edge new visualization techniques for our most sophisticated users. Our 11-year tradition of industry-leading visual quality and graphics performance has been enhanced with new usability and interface components that continue to provide state of the art visualization techniques to our global customer base.”


AVS/Express users with current licenses and maintenance agreements will receive an upgrade package automatically by mail.

Conference diary for early 2002

PDM brings you the essential upstream IT and data management conferences for the first months of 2002.

Feb 11-12: E&P Data Management 2002, SMi Conferences. Steve Warner, (44) 20 7827 6052
Mar 4-6: ESRI Petroleum User Group. Andrew Zolnai, (1) 909 793 2853
Mar 11-13: AAPG Conference and Exhibition. (1) 918 560 2617
Mar 19-21: SIS EU Forum. (1) 713 513 2000
Apr 1-4: SIS US Forum, New Orleans. (33) 1 4600 3800
Apr 3-4: Landmark City Forum. (1) 281 560 1000
Apr 9-10: Landmark City Forum. (47) 5183 7000
Apr 15-16: Philip Crouse and Associates. Phil Crouse, (1) 214 841 0044
Apr 17-18: Landmark City Forum. (44) 1932 829 999

Marketing deal for ISA and Eurotech

Eurotech and ISA have struck a reciprocal marketing deal for software and services to the oil and gas business in Europe and Australasia.

Eurotech Computer Services (Eurotech) and Integrated Solutions Australasia (ISA) have formed a strategic alliance to offer a range of products and services to their existing customers and the broader IT community.


Eurotech’s oil and gas related IT services will now be available to ISA’s Australia and Far East operations and ISA’s database management products and services will be provided in Europe. ISA’s GeoBrowse GIS-based data visualization tool was introduced to the European market at the EAGE conference this year (see PDM Vol. 6 N° 7).

Seismic Vessels

Eurotech was formed in 1993 to provide IT services to the oil and gas industry, offering systems and storage integration, tape technology and consulting services. Eurotech also provides hardware and IT support on-board seismic vessels. The company has offices in the UK and Norway.

New report highlights terror risk

A report from consultants Utilis outlines the risks to energy and utilities from terrorists and saboteurs. IT and physical infrastructure vulnerabilities are covered.

Utilis Energy, located in New York and London, has just completed a study on Energy Infrastructure Security. While the study’s focus is primarily the downstream/utilities arena, the report has relevance to all sectors of the oil and gas business. The report offers an overview of different threats to the energy business such as terrorism, sabotage and hacking.

IT vulnerabilities

Around 20 pages of the 100-plus page report are devoted to IT threats and countermeasures. A section on cyber security outlines basic system vulnerabilities such as weak password security, poor backup strategies and hacker attacks. An appendix provides additional material and recommendations on matters such as specific Windows and UNIX security issues.


The report includes three case studies, including how Cantor Fitzgerald managed to carry on trading in the aftermath of the attack on the World Trade Center. The report is available from Utilis at a cost of $995.

People on the move

This month’s movers hail from A2D Technologies, DigiRule, the API, Petris and FileTek.

David Armitage, founder and former CEO of GeoGraphix has joined A2D Technologies as CEO. In 1994, following Landmark’s acquisition of GeoGraphix, Armitage founded Qubit, a consumer technology firm providing innovative Internet access devices for the home.


Adel Takla has joined DigiRule’s sales staff as borehole seismic representative. Takla also runs his own company – BSC.


Lee Raymond, chairman and CEO of ExxonMobil has been elected to a one-year term as chairman of the board of the American Petroleum Institute (API).


Petris Technology has appointed Walt Rosenbusch as VP of e-Government. Rosenbusch was previously Director of the Minerals Management Service (MMS) at the U.S. Department of the Interior, where he managed the federal mineral revenue and OCS leasing and administration programs.


Jeffrey Maskell has been appointed manager of oil and gas client services by FileTek, with responsibility for the SeiScan GeoData relationship (see page 5). Maskell previously worked with DPTS, Petroconsultants, and Petroleum Exploration Computer Consultants.

Seismic Unix R 35 out

The Center for Wave Phenomena at the Colorado School of Mines has released V 35 of its Open Source seismic processing software.

The Center for Wave Phenomena at the Colorado School of Mines has released the 35th version of its Seismic Unix (SU) Open Source seismic processing software. The latest release of SU includes modules for wavelet transform analysis, modeling, dip move-out and migration in transversely isotropic media. New methods of handling three-component data, a genetic algorithm for static correction and 3D common offset migration code have been developed. Contributors to the open source project hail primarily from the Center for Wave Phenomena, but also from many other universities and commercial organizations such as Talisman Energy, the USGS, PGS and Landmark Graphics.

CGG center showcases gigabit network

PDM visited CGG’s new Redhill processing center where a new high-performance 100 terabyte storage area network from LSI Logic feeds data to CGG’s 1,000-processor PC-cluster compute engine. Data flows over what is claimed to be Europe’s first 2 Gbit/s network.

PDM visited CGG’s new processing center in Redhill, UK for the roll-out of a new, high performance storage system supplied by LSI Logic Storage Systems. LSI’s Metastor enterprise storage system, as supplied to CGG, comprises 100 terabytes of disk storage along with what is claimed as Europe’s first 2 Gigabit Fiber Channel storage area network (SAN).


CGG’s IT Manager Laurent Delorme explained that high-performance seismic processing at Redhill increasingly relied on high-speed disk-based data storage. The majority of CGG’s storage is now LSI, but CGG also deploys disks from Sun, SGI and Network Appliance. CGG appreciates LSI because ‘they know the geophysical business.’ CGG also benefits from direct access to LSI – without going through a distributor – gaining early access to, and influence on, new technology.

Never ending

Storage optimization, the ‘never ending story’ is part of an ongoing effort to enhance and optimize CGG’s productivity. Today’s seismic acquisition typically collects around 5TB of data in one project. During data processing, this data volume expands to around 50TB per project. Such intermediate data was previously stored on cartridges, but is increasingly moving to disk through hierarchical storage management systems. CGG may have as many as 50 projects current at any given time – representing around 2.5 Petabytes of data. A single project is collapsed to around 0.2 TB once processed and delivered to a client for interpretation on the workstation or visualization center.
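The volumes quoted can be checked with simple back-of-envelope arithmetic (figures as quoted in the article, decimal units assumed):

```python
# Back-of-envelope check of CGG's quoted data volumes.
raw_per_project_tb = 5          # acquired per project
working_per_project_tb = 50     # ~10x expansion during processing
concurrent_projects = 50        # projects current at any one time
delivered_per_project_tb = 0.2  # collapsed deliverable per project

# Working data online at any one time, in petabytes
total_pb = concurrent_projects * working_per_project_tb / 1000
print(total_pb)  # 2.5 petabytes
```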


To improve load-balancing, reduce data spooling and improve fault tolerance, CGG’s IT architecture is decoupling storage from the compute engines. This is leading to a migration to a new Distributed Migrated File System (DMFS) paradigm. Interoperability of Fiber Channel hardware remains an issue; indeed, the whole field of distributed file system controllers is still in its infancy.


The current CGG solution relies on large volume Fiber Channel disks and a SAN Fabric using switches from Brocade and Q-Logic.


Today, CGG’s 1000-processor PC-Clusters are only used for ‘embarrassingly parallel’ code. Tomorrow, the DMFS architecture will move more and more code from expensive NUMA machines to the clusters. CGG claims 7 teraflops of processing power worldwide.

Virtual network

CGG offered an amusing insight into its high bandwidth, transatlantic ‘virtual network’. Sending a 5TB processing dataset over the ‘pond’ is still prohibitively expensive. The solution adopted by CGG to move BP’s seismics from Houston to Aberdeen? Send a rack full of disks via DHL!

Conoco to create project archival system

Conoco is building a project archive management system for its worldwide exploration effort. Prime contractor is Exprodat, which is also developing an interface between the archive and Conoco’s geographic information system.

Conoco has selected Exprodat Technology Inc. to develop a Project Archive Management System for use within Conoco’s exploration departments worldwide. Exprodat is to build a web-based project archival system which will manage information from multiple applications, databases and documents. Archives may be created in either vendor-specific native formats, or in industry-standard neutral formats.


Pat Meroney, Solutions Analyst for Conoco Inc., said, “We wanted a solution that addressed the business value of the archive, not the IT side. In addition, we get the option to archive in neutral formats, which gives the archives a potential lifespan of decades. Systems on the market today focus on copying bytes from disk to tape.”


Exprodat president Bruce Rodney added “Metadata is the key to this project and our goal is to create rich data descriptions, so that Conoco will never have to restore a project just to see what’s on the tape”.

GIS integration

The system will integrate with Conoco’s Geographical Information Systems (GIS) so that archived projects can be browsed spatially. The distinguishing requirement of the system is the rich business description of the project’s contents that remains online once data is archived.


The Project Archiver is being developed for deployment within Conoco during the first quarter 2002. The application will be commercialized for sale to third parties later in the year. Exprodat’s US unit Exprodat Technology Inc. was set up to leverage Exprodat’s technology, originally developed for the Web Open Works data browser, WOW. WOW was sold to Landmark Graphics Corp. earlier this year.

© 1996-2024 The Data Room SARL. All rights reserved. Web user only - no LAN/WAN Intranet use allowed.