SMi E&P Data and Information Management, London

Around 80 attended the 9th SMi E&P Data and Information Management Conference in London early this month. We report here on the importance of data in ExxonMobil’s IT and the true cost of a terabyte of disk storage; a customer survey by Schlumberger; Landmark’s timid introduction of new E&P middleware bridging the structured/unstructured data divide; BHP on early experiences with Google’s Enterprise Service for Oil and Gas; Brunei Shell on data overload from the ‘smart field;’ Petrolink’s CADI database for Pemex and a look to the future of e-commerce from OFS Portal.

Steve Comstock (ExxonMobil) polled the audience to ascertain that it was split roughly 50/50 between IT and the business. ExxonMobil views data management as an IT/business partnership. Multiple single-point solutions must be integrated into an enterprise framework to enable data sharing. XOM’s upstream company also has to integrate with its global IT. A PriceWaterhouseCoopers study found that 62% of data is ‘managed’ in Excel spreadsheets by end users. This means that there is no enterprise-level control over data, which ‘may have reporting implications.’ For Comstock, data is so important to the business that it should be ‘elevated to the same level as reserves.’


In ExxonMobil, global IT (GIT) has a say in infrastructure, applications, data architecture and support. Exxon’s Common Computing System (CCS) comprises network, platform (Linux, Unix, the new Vista ‘challenge’), applications and ‘user experience’. The CCS ‘has data at its core.’ Companies should beware of ‘data management amateurs.’ Comstock encourages the training of career professionals and the education of non data management professionals in proper ‘data hygiene.’


The Society of Petroleum Engineers has recognized IT as a critical part of the business and intends to formalize IT training. Today, at Texas A&M, all the programming you need to get an engineering degree is Fortran and Cobol. The result is that some people coming into oil companies ‘don’t know what they are doing.’ All of which contributes to ‘data entropy,’ with the Excel spreadsheet as part of the vicious circle. Poor management is a ‘tax’ on the organization. Good management delivers a data ‘dividend,’ allowing the business to make money from its data asset.


Online geophysical data is growing by leaps and bounds. While one talks of terabytes online, the cost of hardware is a ‘drop in the bucket’ compared with what it takes to manage and serve the data. It may only cost $50,000 to add a terabyte, but it takes $500,000 per year to manage it! Other data issues are self-inflicted: a recent North Sea audit found 250,000 copies of the same business data on the LAN. Comstock noted XOM’s $10-30 million SAP spend, saying, ‘you could do a lot with this in the upstream!’
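The arithmetic behind Comstock’s point can be sketched as follows. The figures are the speaker’s round numbers, not a costing model, and the function name is invented for illustration.

```python
# Sketch of the disk-storage cost arithmetic cited in the talk:
# hardware is a one-off ~$50,000 per terabyte, but managing and
# serving that terabyte runs ~$500,000 per year.

HW_COST_PER_TB = 50_000          # one-off purchase, USD
MGMT_COST_PER_TB_YEAR = 500_000  # annual management/serving cost, USD

def storage_tco(terabytes: int, years: int = 5) -> int:
    """Total cost of ownership: hardware plus annual management."""
    return terabytes * (HW_COST_PER_TB + years * MGMT_COST_PER_TB_YEAR)

print(storage_tco(1))  # 2550000 — management dwarfs the hardware
```

Over five years, management costs account for roughly 98% of the total, which is the ‘drop in the bucket’ argument in numbers.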


Donna Garbutt (Schlumberger Information Solutions) reported on a survey of SIS clients that found the main production data management application to be Microsoft Excel. SIS sees efficiency gains by moving to a ‘true’ production management system. There is a break in data flow from the SCADA world to longer term business requirements. The survey also found that production monitoring, water injection, deliverability analysis, well test, production loss monitoring, surface water handling, artificial lift, sand, gas lift and zonal allocation were all ‘priority problem workflows.’ In the Q&A, SIS was taken to task for ‘not understanding production accounting and advanced process control.’ Garbutt countered, ‘We are very active in these areas—particularly with ProdML and the TietoEnator pilot.’


David Holmes described Landmark’s research into unstructured data management, portals and data integration. This has determined that security is critical. Holmes expressed surprise that customers tolerate multiple identity management offerings, preferring a corporate identity management model. Holmes criticized the Google Appliance, which works fine on a file server. But point it at OpenWorks or GeoFrame and you’ll find that it doesn’t understand entitlements. So anyone with access to Google can suck out the whole database! Landmark’s research concluded that vendors need to create ‘hooks’ for seamless integration of unstructured data.
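The entitlement gap Holmes described can be sketched in a few lines: an appliance-style crawler indexes every record it can reach, so access control has to be re-applied at query time. All names below are invented for illustration; neither OpenWorks nor GeoFrame exposes such an API.

```python
# Hypothetical entitlement-aware search layer. The crawler has indexed
# everything; the search function re-checks per-user entitlements so a
# query cannot 'suck out the whole database'.

index = [
    {"id": "well-001", "text": "deviation survey", "acl": {"geo_team"}},
    {"id": "well-002", "text": "deviation survey", "acl": {"drilling"}},
]

def search(query: str, user_groups: set) -> list:
    """Return only the hits the user is entitled to see."""
    return [
        doc["id"]
        for doc in index
        if query in doc["text"] and doc["acl"] & user_groups
    ]

print(search("deviation", {"geo_team"}))  # ['well-001'] — not both wells
```

An appliance without this filter step returns both records to any user, which is the behavior Holmes criticized.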

New middleware?

Vendors need to adopt a truly open approach and provide interfaces to their datastores, middleware and applications. Notwithstanding the ongoing beauty contest as to who is the ‘openest’ of them all, vendors (‘that includes us’) need to be called to account. Holmes unveiled Landmark’s new middleware offering that targets the data integration problem. Here, middleware exposes services available from published SDKs. Enterprise services provide security and ‘contextually aware’ search of upstream data. ‘Standards-based’ identity management also features. Java APIs provide common interfaces to data stores.

Pemex’ CADI

Nick Baker described Petrolink’s work with Pemex’ Poza Rica Altamira (PRA) asset offshore Veracruz. Petrolink’s drilling and production database and management information system has been customized and localized as Pemex’ ‘CADI’ and is used as a reporting and data quality assurance preprocessor for data loaded to Pemex’ corporate @ditep data store. The Petrolink system was translated into Spanish by local consultants with industry experience. All data types are entered for full reporting, metadata is added in a controlled workflow and the results are served to all stakeholders. Intranet and extranet access is available for home/travel and entitled third party access. CADI runs on a Windows 2003 Server with Lotus Domino Server. A folder structure lets managers look at a workflow for rig scheduling and see available executive reports, flight details, production reports and sales. CADI is now to extend its coverage to the whole Norte region.

BHP Billiton

Katya Casey described BHP Billiton’s integrated working environment built around a spatial data infrastructure, getK, a Documentum-based knowledge system, Landmark’s Engineering Data Model and ‘eWell,’ a Schlumberger Decision Point application for aggregation of well related information. Casey noted that all vendors say they are ‘integrated.’ But this can mean different things: visual integration on a 3D canvas, a portal, or GIS-based integration of sparse data.


GIS is the ‘best thing that ever happened to us’ and has been deployed in BHP’s ‘EarthSearch’ portal that attacks all the above data sources via SOA-based middleware. The portal leverages a taxonomy-driven E&P metadata layer which embeds business rules for critical information. BHP is planning middleware for its entire application portfolio, working with OpenSpirit and other vendors to extend the middleware footprint. BHP is an early adopter of Google’s enterprise service for oil and gas, which has been found to offer good scalability. Google Earth lets you blend web services feeds and sits at the top of BHP’s stack alongside SAP and enterprise search.

Brunei Shell

Femi Adeyemi described Brunei Shell’s Digital E&P Business which supports high end smart, ‘snake’ wells, real time data integration, remote monitoring and valve control. Shell’s smart field ‘value loop’ brings challenges of data overload—engineers are again spending more time accessing and manipulating data than on ‘real surveillance’—analysis, interpretation and action. Security is an issue in the real time data environment and a high tech firewall separates the control system from the office domain. Shell’s data driven models use real time data for 24/7 surveillance and optimization. Exprodat’s Information Quality Metrics (IQM) tool is used for data QA. IQM’s asset data ‘traffic lights’ have raised data quality’s profile for many Shell assets.
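A ‘traffic light’ metric of the kind IQM presents per asset can be sketched as below. The field list and thresholds are invented for illustration; the actual Exprodat rules are not public.

```python
# Generic red/amber/green data-quality indicator based on record
# completeness. Fields and thresholds are hypothetical stand-ins.

REQUIRED = ("well_name", "datum", "td", "spud_date")

def completeness(record: dict) -> float:
    """Fraction of required fields that are populated."""
    present = sum(1 for f in REQUIRED if record.get(f) not in (None, ""))
    return present / len(REQUIRED)

def traffic_light(record: dict) -> str:
    """Map a completeness score onto a red/amber/green status."""
    score = completeness(record)
    if score >= 0.9:
        return "green"
    if score >= 0.6:
        return "amber"
    return "red"

well = {"well_name": "A-1", "datum": "KB", "td": 3200, "spud_date": ""}
print(traffic_light(well))  # amber — one of four required fields missing
```

Rolling such per-record scores up to asset level gives the dashboard view that, per Adeyemi, raised data quality’s profile with management.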

OFS Portal

According to OFS Portal’s Bill le Sage, upstream oil and gas spent $265 billion in 2006. E-commerce has had a tough beginning in oil and gas. Le Sage thinks that we are now at ‘the end of the beginning.’ It has been a lot harder than anyone thought but today, PIDX standards have really taken off. For the oil and gas supply chain, data means catalogs—but these are rarely used for sourcing in the upstream. One advantage of PIDX XML documents is that they allow capture of transactional (spend) information. Today almost all business transactions are PIDX standard documents.
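The point about XML documents capturing spend can be illustrated with a minimal sketch. Element names below are simplified stand-ins, not the actual PIDX invoice schema.

```python
# Minimal sketch of how a structured XML transaction document carries
# line-level spend data that later feeds spend analysis. Element names
# are illustrative only, not PIDX-conformant.
import xml.etree.ElementTree as ET

inv = ET.Element("Invoice", number="INV-1001")
for desc, qty, price in [("Drill bit, 8.5in PDC", 2, 12500.0),
                         ("Mud logging, day rate", 5, 3000.0)]:
    line = ET.SubElement(inv, "LineItem")
    ET.SubElement(line, "Description").text = desc
    ET.SubElement(line, "Quantity").text = str(qty)
    ET.SubElement(line, "UnitPrice").text = str(price)

# Spend analysis is then a simple aggregation over the transaction set.
total = sum(float(li.find("Quantity").text) * float(li.find("UnitPrice").text)
            for li in inv.iter("LineItem"))
print(total)  # 40000.0
```

A free-text invoice offers no such hook for aggregation, which is why structured transaction documents matter for the spend-analysis wins le Sage describes.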


OFS Portal is now focusing on ‘eIPP’ electronic invoicing, procurement and payment. In the upstream, the real value is in complex products and services. One supplier catalogs 800 kinds of drill bits. SAP can’t handle ‘non explicit’ purchases which may be only a couple of lines of text. OFS Portal is working with SAP on such issues. Le Sage sees big wins from spend analysis and reduced invoice processing costs.

This article has been taken from an 11 page report produced by The Data Room’s Technology Watch program. More information and sample reports from


© Oil IT Journal - all rights reserved.