PNEC 2006, Houston

The Petroleum Network Education Conferences (PNEC) Data and Knowledge Integration Conference continues to thrive, with some 370 attendees this year, almost half of them oil company employees. The highlight of the 2006 PNEC was Nancy Stewart’s (Wal-Mart CIO) address (see this month’s editorial). Kerr-McGee (now Anadarko) has been working on unstructured data management and using AJAX technologies to enhance its users’ experience. Shell continues to enhance (and measure) data quality. Metrics also underpin Burlington Resources’ (now ConocoPhillips) application portfolio rationalization. Finally, OpenSpirit has been quick to jump on the Google Earth bandwagon and offers the popular GIS front end as a data browser for OpenSpirit-enabled data sources.

Paul Haines reported on Kerr-McGee’s program to realize the value of its unstructured data. For Haines, ‘There is no magic in data management, it’s just work and it can be hard to quantify its ROI.’ Kerr-McGee’s data management vision is of a single doorway to data, with priority given to unstructured data. A standards-based, high-level corporate taxonomy has been implemented and roles and responsibilities assigned to business users and data ‘gatekeepers.’ Data acquisition goes through the gatekeeper before archival. The corporate taxonomy is held in OpenText’s LiveLink. Moving files into the EDMS has brought the benefits of the standard taxonomy, search, version control and document management, with positive spin-off for Sarbanes-Oxley and records and information management (RIM) compliance. WellArchiver, an application developed in-house, manages well files and metadata capture. Search and retrieval leverage WellExplorer (Geologix) and NitroView (Exprodat). The ILX Viewer (InnerLogix) spiders the Kerr-McGee repository for well log files and also provides access to CoreLab’s off-site data store.
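
A minimal sketch of the gatekeeper idea might look like the following Python. The taxonomy nodes, field names and validation rules here are our own invention for illustration, not Kerr-McGee’s WellArchiver internals:

```python
# Hypothetical sketch of gatekeeper-style metadata capture before archival.
# Taxonomy nodes, fields and rules are illustrative, not Kerr-McGee's.
from dataclasses import dataclass
from datetime import date

CORPORATE_TAXONOMY = {
    "well/logs", "well/reports", "well/core", "seismic/navigation",
}

@dataclass
class WellFileRecord:
    api_number: str        # 10-digit API well identifier
    taxonomy_path: str     # node in the corporate taxonomy
    source_path: str       # location of the raw file
    acquired: date

def gatekeeper_check(rec: WellFileRecord) -> list[str]:
    """Return a list of problems; an empty list means 'OK to archive'."""
    problems = []
    if rec.taxonomy_path not in CORPORATE_TAXONOMY:
        problems.append(f"unknown taxonomy node: {rec.taxonomy_path}")
    if not (rec.api_number.isdigit() and len(rec.api_number) == 10):
        problems.append(f"malformed API number: {rec.api_number}")
    return problems
```

The point of routing acquisition through such a check is that files only reach the archive already tagged against the standard taxonomy, which is what makes the downstream search and RIM compliance benefits possible.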

Nexen

Wes Baird (Data Matters) outlined Nexen’s ongoing data management project. The data and process landscape reveals many point-to-point connections, proprietary data in many areas and a lack of consistent business rules. The plan is to move from ad-hoc to scheduled processes with shared data, leveraging the Carnegie Mellon capability maturity model (CMM). This starts with interviews to capture business rules. Early results show a ‘data hell’ of in-house PPDM, OpenWorks and IHS data stores, a ‘reference hell’ of inconsistent naming, and a ‘security hell’ of access constraints (tables, rows, roles). Baird described similar hells for interface, process and maintenance. Baird’s solution involves data ‘chains,’ simple tables showing data, schema, server, process, roles and responsibilities and what links to what. Nexen has now implemented a PPDM data model, captures metadata monthly and is in the process of capturing business rules and linking everything together into a ‘repository ready for questions.’
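
One way to picture Baird’s ‘chains’ is one row per hop from source to consumer, queryable for impact analysis. The column names and sample chain below are assumptions, not Nexen’s schema:

```python
# Hypothetical sketch of a data 'chain' table: one row per hop so the
# chain can be queried, e.g. 'what breaks if this dataset breaks?'
from dataclasses import dataclass

@dataclass
class ChainLink:
    dataset: str       # e.g. "ppdm_well"
    schema: str        # e.g. "PPDM 3.x"
    server: str        # host holding the data
    process: str       # job that moves or derives it
    owner: str         # role responsible for this hop
    feeds: str | None  # dataset this link populates; None at chain end

chain = [
    ChainLink("ihs_well_header", "IHS", "srv-vendor", "weekly_load", "data_mgmt", "ppdm_well"),
    ChainLink("ppdm_well", "PPDM 3.x", "srv-master", "nightly_sync", "data_mgmt", "openworks_well"),
    ChainLink("openworks_well", "OpenWorks", "srv-geo", "project_build", "geoscience", None),
]

def downstream_of(name: str) -> list[str]:
    """Walk the chain to list everything fed, directly or indirectly."""
    hits = [link.feeds for link in chain if link.dataset == name and link.feeds]
    return hits + [d for h in hits for d in downstream_of(h)]

print(downstream_of("ihs_well_header"))  # ['ppdm_well', 'openworks_well']
```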

Philip Lesslar, Shell

Managers and users lack feedback on the severity of data quality problems, so Shell is going for a single rolled-up data quality KPI per organizational unit, addressing the problem of declining energy levels as data goes through its lifecycle. Various dispersed data quality efforts were grouped into Shell’s EPiQ project, which resulted in the development of Shell’s IQM tool. IQM offers query management, procedures and global metric sets, developed with local businesses to ensure take-up. EPiQ shows color-coded quality metrics along with trend indicators. The project is aligned with Shell’s global standard taxonomies. Change management remains an issue.
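
A rolled-up KPI of this kind could be as simple as a weighted average of per-metric scores with a traffic-light color and a trend flag. The weights and thresholds below are invented for illustration, not Shell’s IQM settings:

```python
# Illustrative rollup of several data quality metrics into one
# per-unit KPI with color coding and a trend indicator.
def rollup_kpi(scores: dict[str, float], weights: dict[str, float]) -> float:
    """Weighted average of 0-100 metric scores for one organizational unit."""
    return sum(scores[m] * w for m, w in weights.items()) / sum(weights.values())

def traffic_light(kpi: float) -> str:
    return "green" if kpi >= 90 else "amber" if kpi >= 70 else "red"

unit_scores = {"completeness": 92.0, "validity": 81.0, "uniqueness": 97.0}
weights = {"completeness": 0.5, "validity": 0.3, "uniqueness": 0.2}

kpi = rollup_kpi(unit_scores, weights)          # 89.7
last_period = 84.0
trend = "up" if kpi > last_period else "down"
print(f"KPI {kpi:.1f} ({traffic_light(kpi)}, trend {trend})")
```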

IHS

Steve Cooper (IHS) outlined the conclusions from a recent survey of 50 IHS clients, which found that data volumes are doubling every 6 to 12 months. Customers are creating master data stores using PPDM 3.7, which is emerging as the industry-standard data model worldwide. Data movement is being revolutionized by web services, a ‘game changer,’ and by data exchange standards built around XML. These can be simple wrappers around existing data servers that let customers go in and grab just what they need. Cooper gave the example of the Accenture xIEP applet developed around SAP’s NetWeaver. ‘Business process mapping, workflow engines and the IM framework are coming together.’
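
The ‘simple wrapper’ notion can be sketched as a thin web service sitting in front of an existing store and returning only the requested records as XML. The endpoint, element names and stand-in store below are our assumptions, not an IHS API:

```python
# Minimal sketch of an XML web service wrapper over an existing data
# store. Route and schema are hypothetical.
from flask import Flask, Response, request
import xml.etree.ElementTree as ET

app = Flask(__name__)
WELLS = {"42-501-20001": {"name": "SMITH 1", "status": "PRODUCING"}}  # stand-in store

@app.route("/wells")
def get_wells():
    root = ET.Element("wells")
    for uwi in request.args.get("uwi", "").split(","):
        if uwi in WELLS:
            w = ET.SubElement(root, "well", uwi=uwi)
            ET.SubElement(w, "name").text = WELLS[uwi]["name"]
            ET.SubElement(w, "status").text = WELLS[uwi]["status"]
    return Response(ET.tostring(root, encoding="unicode"), mimetype="text/xml")

# A client GET to /wells?uwi=42-501-20001 grabs just that well as XML.
```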

OpenSpirit

Clay Harter (OpenSpirit) asked, ‘Could Google Earth (GE) be used in the oil and gas business?’ GE Enterprise rolls in shape files and raster images through Fusion, which blends the GE database with data on in-house servers. Google’s KML is an XML format for GIS data – points, polylines etc. – from static or dynamic sources. A zipped version (KMZ) embeds raster imagery. OpenSpirit (OS) has leveraged its integration framework to tie into GE by dynamically creating KML/KMZ from OS sources for consumption by GE. Harter showed a movie demo of GE in action and a new OpenSpirit Web (OSW) product. OSW browses OS data sources as HTML lists. A button allows for KML creation and visualization in a GE client. OS performs the transform to WGS84, the geographic coordinate system GE assumes. GE fills the need for a lightweight, easy-to-use 3D browser and can double as an OS data selector.
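
The core of the trick – reproject to WGS84, then emit KML placemarks – is straightforward to sketch. The example below assumes the pyproj library and a UTM zone 15N source coordinate system; the real product does this inside the OS framework:

```python
# Sketch of the data-to-Google-Earth pipeline: reproject well locations
# to WGS84 and write KML placemarks. Sample well and CRS are invented.
from pyproj import Transformer

wells = [("SMITH 1", 565000.0, 3290000.0)]  # name, easting, northing (UTM 15N)
to_wgs84 = Transformer.from_crs("EPSG:32615", "EPSG:4326", always_xy=True)

placemarks = []
for name, x, y in wells:
    lon, lat = to_wgs84.transform(x, y)     # GE expects lon/lat on WGS84
    placemarks.append(
        f"<Placemark><name>{name}</name>"
        f"<Point><coordinates>{lon:.6f},{lat:.6f},0</coordinates></Point></Placemark>"
    )

kml = ('<?xml version="1.0" encoding="UTF-8"?>'
       '<kml xmlns="http://www.opengis.net/kml/2.2"><Document>'
       + "".join(placemarks) + "</Document></kml>")

with open("wells.kml", "w") as f:           # opens directly in Google Earth
    f.write(kml)
```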

AJAX in Oil and Gas

David Terribile described Kerr-McGee’s use of ‘Web 2.0’ technologies, a.k.a. AJAX, to tie its diverse data sources together. Kerr-McGee has built a master data store, a cheap, simple structure that exposes master well headers with pointers to raw data locations. One ‘quick win’ application lets a user enter an API number and retrieve the corresponding DIMS UWI. Kerr-McGee leverages Oracle Dynamic Collections (ODC) and AJAX components for ‘serious’ data management. An example of an AJAX component is the DbNetGrid, which with ‘six lines of code’ makes a highly interactive user interface. The grid component is deployed as a front end to Kerr-McGee’s geopolitical database and used to filter queries with drop-down pick lists pre-populated from standard values. Kerr-McGee’s WellArchiver and WellExplorer apply a similar philosophy, with widgets for printing and export to HTML, Word or Excel. AJAX has been a key enabler in adding GUI functionality and firing up other apps in context. Asked where Kerr-McGee stood on the ‘buy not build’ scale, Terribile answered that this project was more of a configuration exercise: ‘In fact there is less configuration here than for a “normal” GeoFrame installation. We’re not AJAX/CSS experts. The displays come out with a professional look and feel because of the components.’
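
The server side of the ‘quick win’ lookup could be as small as a single endpoint that an AJAX component calls to swap an API number for a UWI without a page reload. The route and stand-in table below are hypothetical, not Kerr-McGee’s implementation:

```python
# Hypothetical server side of the API-number-to-UWI 'quick win' lookup,
# returning JSON for an AJAX component to consume.
from flask import Flask, jsonify

app = Flask(__name__)
API_TO_UWI = {"4250120001": "DIMS-000123"}  # stand-in for the master data store

@app.route("/api/uwi/<api_number>")
def lookup_uwi(api_number: str):
    uwi = API_TO_UWI.get(api_number)
    if uwi is None:
        return jsonify(error="unknown API number"), 404
    return jsonify(api=api_number, uwi=uwi)

# The page's AJAX call hits this URL and drops the returned UWI into the
# grid in place, with no full page reload.
```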

ConocoPhillips

Dan Shearer reported how Burlington Resources (BR, now part of ConocoPhillips) challenged its IM specialists to raise geoscience productivity by 10%. Multiple corporate acquisitions had caused a software application explosion. A Global Tech Review (performed with Landmark) added some software to fill gaps and turned off maintenance on specialty software. Savings were put into a kitty for subsequent lease of specialty software in ASP mode if needed. OpenIT’s application usage monitoring technology showed Burlington that although folks said, ‘we use that package all the time,’ it was actually last used nine months earlier! Maintenance costs were cut by 36% in 2003 over 2002, with a reduced data management effort. Application status was classified in terms of currency of use. Geoscientists can now ‘shop’ from Burlington’s own list of 250 approved applications. Burlington has evolved from a ‘cost conscious cult’ to a ‘disciplined value cult,’ targeting shorter project lifecycles with a 3D earth representation and by preserving analyses, so that a study can ‘pop up’ if a subsequent oil price rise makes it economic. Burlington sets aside $14,000 per geoscientist per year for training, has joined Nautilus and hosted a ‘creative solutions conference.’ A data SWAT team has been formed, half geoscientists and half software engineers.
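
Classifying applications ‘in terms of currency of use’ amounts to bucketing each package by how recently metering saw it launched. The thresholds and categories in this sketch are our own, not OpenIT’s or Burlington’s:

```python
# Illustrative usage-based classification: bucket each application by
# the date metering last saw it launched. Thresholds are invented.
from datetime import date, timedelta

def classify(last_used: date, today: date = date(2006, 6, 1)) -> str:
    age = today - last_used
    if age <= timedelta(days=90):
        return "active - keep on maintenance"
    if age <= timedelta(days=270):
        return "occasional - candidate for ASP lease"
    return "dormant - drop maintenance"

# The 'we use that all the time' package, last launched nine months ago:
print(classify(date(2005, 9, 1)))  # dormant - drop maintenance
```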

This article is taken from a 10-page report produced as part of The Data Room’s Technology Watch reporting service. More from tw@oilit.com.

© Oil IT Journal - all rights reserved.