15 years of PNEC—Part 1, 1997-2001

Full PNEC report next month. But meanwhile we thought we’d celebrate the 15th anniversary of Phil Crouse’s data management conference with a three-part walk down memory lane. Part 1 covers interoperability, outsourcing, ‘back door’ data management, e-commerce and application service provision, and GIS-based data management. It all sounds rather modern in fact!

1997 Geoshare, although Schlumberger-sponsored, is a truly open environment. Development kits can be obtained by any third party for a modest sum. The essence of Geoshare is that an application can output or receive data through a ‘half link’ built on an application-independent data model. Geoshare makes sense in today’s ‘best of breed’ multi-vendor environment.

Of course Geoshare is not magic. Conoco’s Jack Gordon emphasized the care necessary to ensure data integrity during transfer, especially with topographic datum shifts, or when two applications have fundamentally different views of data representation. The deployment and regular use of Geoshare is not for the faint-hearted.

Cindy Pierce described the outsourcing program underway in Conoco. Conoco’s radical approach involves not only outsourcing the task, but the people too. Full time Conoco employees are taken on by the contractor (GeoQuest). The new look E&P department no longer considers activities such as data management as core business.

Janet Rigler (BP Houston) debunked a widely held belief concerning the way the E&P asset team does its business. The shared earth model should enable interpreters to include new information as new wells are drilled. Well, it doesn’t work like that. The limiting factor is data management. Applications are licensed on different machines and data must be moved around constantly.

1998 Mark Robinson (GeoQuest) estimated that around 90% of the oil produced today is managed through Excel!

Marion Eftink (Unocal) introduced the concept of ‘back-door’ data management. In theory, data should be managed from the instant it arrives in the exploration department, catalogued and cleaned-up so that everyone calls the same thing by the same name. In practice, power users in the asset teams often get first crack of the whip, and before you can say ‘data management’ there are multiple copies of the data, with different naming conventions in just about every system within the company. Enter ‘back door’ data management. Instead of attempting impossible policing of data up-front, Unocal’s system uses a GIS front end to the various data stores and implements a system of cross-referencing different appellations through look-up tables.

1999 Rene Calderon presented a novel product, Petris Winds—a system that is configured to browse a corporation’s data in situ across a variety of data stores. Winds builds its own meta-data view of the enterprise by ‘spidering’ the data overnight.
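The overnight ‘spidering’ idea amounts to walking the data stores in place and recording metadata without moving the data itself. The following minimal sketch uses a filesystem walk to make the point; it is an assumption about the general technique, not Petris Winds’ actual mechanism.

```python
# Sketch of overnight metadata 'spidering': crawl a data store in situ
# and build a metadata index (path, size, last-modified) -- the data
# itself stays where it lives.
import os
import time

def spider(root: str) -> list[dict]:
    """Walk a directory tree and record metadata for every file found."""
    index = []
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            stat = os.stat(path)
            index.append({
                "path": path,
                "size": stat.st_size,
                "modified": time.ctime(stat.st_mtime),
            })
    return index
```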

Gayle Holzinger (Shell) described the deployment of PetroBank for offshore 2D data delivery to the workstation. This has cut the data loading cycle time from ‘up to’ six months down to a couple of days.

2001 PetroWeb’s David Noel traced the evolution of thinking on E&P data access, from data models to the present situation of multiple databases and multiple clients. Accepting the status quo, Noel advocates the use of ESRI’s ArcView as ‘doing a great job of integrating internal and external data sources.’ Non-proprietary map layers and external databases can be managed by third parties.

Vidar Andresen traced PetroBank’s history starting as IBM’s development for the Norwegian national data repository, Diskos, to its deployment, via PGS, at sites all over the world. Landmark is now re-branding PetroBank, which also underpins the ‘Grand Basin’ e-business unit.

Geodynamics’ GIS-based data management system was the subject of no fewer than three presentations, featuring authors from Kerr McGee Oil and Gas, BJ Services, and Enron. Enron’s Mark Ferguson described how GIS was used to match pipeline capacity to demand. Enron has integrated Petrolynx GIS software with security and access control through Sun’s Java Web Start plug-in, leveraging Tibco’s ‘hub’ middleware.

Innerlogix’ Dag Heggelund’s thesis is that the ‘next generation’ of web applications will be built using an XML-based object technology combining data and stylesheets for data translation into and out of proprietary environments. Heggelund downplays the importance of standards, saying these would be ‘nice to have’, but are unlikely to materialize. The key is the XML/XSL technology that allows for interoperability in a non-standard world. SOAP is also significant in that it circumvents the battle of the CORBA, COM and Java orbs. Heggelund quotes—’well logs should have portable web addresses,’ ‘today’s ‘fat’ applications know too much,’ ‘a data item should not know where it lives, have no knowledge of who uses it and not know where it comes from,’ and ‘Bill Gates is not going to implement Open Spirit!’
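Heggelund’s data-plus-stylesheet idea can be illustrated with a small translation between two vendor XML schemas. In practice an XSL stylesheet would drive the transform; the sketch below codes it by hand with Python’s standard library, and both schemas are invented for illustration.

```python
# Translating a well log between two hypothetical vendor XML schemas,
# in the spirit of the XML/XSL interoperability argument: the data
# carries structure, and a transform bridges the proprietary formats.
import xml.etree.ElementTree as ET

vendor_a = "<log><wellName>A-1</wellName><curve>GR</curve></log>"

def a_to_b(xml_a: str) -> str:
    """Translate vendor A's log schema into vendor B's (both assumed)."""
    src = ET.fromstring(xml_a)
    dst = ET.Element("WellLog", name=src.findtext("wellName"))
    ET.SubElement(dst, "Curve").text = src.findtext("curve")
    return ET.tostring(dst, encoding="unicode")

print(a_to_b(vendor_a))
# -> <WellLog name="A-1"><Curve>GR</Curve></WellLog>
```

Neither side needs to agree on a standard schema; each only needs a transform into and out of its own format, which is exactly Heggelund’s ‘interoperability in a non-standard world.’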

In the discussion on application service provision (ASP), Geonet’s Bill Micho described ASP as a ‘tough marketplace.’ Geonet provides ASP for some 50 applications. GeoQuest also offers an ASP-based monthly subscription service to its products, and Landmark’s Grand Basin subsidiary does likewise, although Graham Merikow said the move to ASP was not like ‘flipping a switch’.

Addressing the integration question, Bill Quinlivan (GeoQuest) observed that ‘life is ten times harder for a receiver than a sender of data.’ Also, the difficulty of a solution may be unrelated to the urgency of the problem. Geoshare, Open Spirit et al. may be ‘hammers looking for nails.’ Quinlivan advocates evaluating a link development project by comparing cost of development with cost of use. This is represented as a ‘cost-performance frontier.’ Further optimization is achieved by arbitration and selection of cost-effective link technologies according to constraints such as development budget and/or cost of use. Visit PNEC on www.oilit.com/links/1105_43 and check out The Data Room’s coverage of 15 years of the show on www.oilit.com/links/1105_44.
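Quinlivan’s development-cost versus cost-of-use comparison can be sketched numerically. The candidate links and all figures below are invented for illustration; only the selection logic, pick the cheapest total-cost option that fits the development budget, reflects the argument.

```python
# Hypothetical 'cost-performance frontier' comparison: each candidate
# link technology has a one-off development cost and a per-use cost.
# Names and numbers are invented for illustration.
links = [
    {"name": "Geoshare half-link", "dev_cost": 120, "use_cost": 5},
    {"name": "Open Spirit adapter", "dev_cost": 200, "use_cost": 2},
    {"name": "flat-file export", "dev_cost": 10, "use_cost": 20},
]

def best_link(expected_uses: int, dev_budget: float) -> str:
    """Cheapest total-cost link that fits within the development budget."""
    feasible = [l for l in links if l["dev_cost"] <= dev_budget]
    return min(feasible,
               key=lambda l: l["dev_cost"] + l["use_cost"] * expected_uses)["name"]

print(best_link(expected_uses=3, dev_budget=150))   # few uses: cheap link wins
print(best_link(expected_uses=50, dev_budget=150))  # heavy use: amortize development
```

The same data link can be the right or the wrong choice depending on how often it will actually be exercised, which is Quinlivan’s point about matching solution difficulty to problem urgency.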

This article originally appeared in Oil IT Journal 2011 Issue # 5.
