IQPC Knowledge and Data Management Asia

Data management successes from Petronas, ADNOC, Shell, ADCO and Total Indonesia.

Che Zan Yassin described Petronas’ knowledge management (KM) initiatives spanning upstream and downstream (petrochemicals). These include a magazine, ‘The Pipeline,’ a website portal for the domestic Petroleum Management Unit, a KM familiarization program, communities of practice (COP) and lessons learned reviews (LLR). These share failures as well as successes. Techniques learned from Rolls-Royce have led to interviews with experts by KM facilitators—these are recorded and put online. Training is crucial since 42% of Petronas employees have under three years’ experience. KM success is evaluated by LLRs on the projects themselves, by tracking website click-throughs and video usage, and through feedback from monitoring and coaching. Lotus Notes is the basis for most current KM work, although other tools are being evaluated.


ADNOC’s E&P Information System (EXPRIS) was presented by Mohamad Abu El-Ezz. ADNOC wanted a common data model for the upstream and selected Schlumberger’s Finder back in 1994, a ‘pioneering project’ believed to be one of the first to achieve such broad coverage in a single database. EXPRIS’ design principles are to ‘honor business rules, honor the earth and honor reality.’ As Finder matured, there was less need for customization. In 1999, ADNOC moved to a standard version of Finder, reducing customization and interfacing with Business Objects, OFM and FinderWeb. To date, over 90% of ADNOC data has been captured and preserved, and data preparation time has been cut from 4-6 months to weeks.


Paul Helm (HP), a geophysicist by training, has been ‘moving downstream’ and into the digital oilfield. Here a single SCADA system can generate 10,000 alerts per day. These are impossible to handle as such, so neural net or stochastic processing is performed on the data to predict failure etc. Helm says this is an established technique but ‘we are only now starting to get the value’. In one case, an EU gas producer was paying $15 million/year in tax on back allocated gas volumes. The simple expedient of a $300k meter eliminated the discrepancy, and the tax bill! One of HP’s supermajor clients avoided replication by leaving its data in the Historian. Helm advises that such solutions are ‘brittle’ and recommended an online transient data store, built around HP’s ‘Zero Latency Enterprise’ (ZLE) technology, a joint venture with CapGemini. ZLE Hubs store enterprise information which is then accessible through web services publish and subscribe mechanisms. ZLE leverages EAI middleware including J2EE, Tuxedo and CORBA. HP itself is a keen consumer of KM and actually mandates use of knowledge systems by management. ‘Brute force is the only way to ensure take-up.’ This comes in the form of the year-end appraisal where ‘metrics drive behavior’. A neat KM example involves background scanning of corporate email to produce a ‘knowledge map’ showing, for example, an RFID ‘cluster’ of people who make up an informal social network. Helm notes also that various SCADA protocols are increasingly being replaced with wireless IP.
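HP’s actual alert-processing pipeline is not described in detail. As an illustration only, the ‘10,000 alerts per day’ problem can be sketched with a simple rolling statistical baseline that suppresses raw readings and surfaces only significant deviations (a stand-in for the neural net or stochastic processing Helm refers to; all names and thresholds here are hypothetical):

```python
from collections import deque
from statistics import mean, stdev

class AlertFilter:
    """Hypothetical sketch: score each sensor reading against a rolling
    baseline and raise an alert only on a large statistical deviation."""

    def __init__(self, window=20, threshold=3.0):
        self.window = deque(maxlen=window)  # recent readings
        self.threshold = threshold          # z-score cut-off

    def significant(self, reading):
        # While history is still filling, record silently.
        if len(self.window) < self.window.maxlen:
            self.window.append(reading)
            return False
        mu, sigma = mean(self.window), stdev(self.window)
        self.window.append(reading)
        if sigma == 0:
            return reading != mu
        return abs(reading - mu) / sigma > self.threshold

# A steady signal with one injected spike: only the spike surfaces.
f = AlertFilter(window=20, threshold=3.0)
readings = [100.0 + 0.1 * (i % 5) for i in range(200)]
readings[150] = 250.0  # simulated equipment anomaly
alerts = [i for i, r in enumerate(readings) if f.significant(r)]
print(alerts)  # only reading 150 exceeds the baseline
```

In this toy run, 200 raw readings collapse to a single actionable alert; a production system would add per-tag baselines and predictive models, but the reduction principle is the same.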


Femi Adeyemi described how Shell has bought into the digital oilfield as a key component of its operational excellence model. Shell is ‘going global’ with processes, workflows, applications and data management, and is looking to standardize infrastructure. A big effort is put into information quality, with a virtual activity group for data management.


Zinhom Wazir showed how ADCO has enhanced the interface between Schlumberger’s Finder and Eclipse to support consistent terminology. ADCO noted a lack of ‘standard operating procedures’ (SOP) for the model building process. The Finder-Eclipse interface covers well data, PVT, SCAL and 3D geological models. A separate production injection database (PIES) is also used to build Eclipse models. ADCO plans to extend the interface to include real-time data, Maximo etc.

Total Indonesia

Sugimin Harsono told how, following Total’s thirty years of operations in the Mahakam delta, its data situation was getting out of hand. Multiple databases meant it was not easy to know which one was right. Data known to be bad was not deleted. Data management was perceived as ‘dull’ and there was little corporate awareness of the problems. Total initiated a ‘triple C’ approach (communication, cooperation and consultation) in its Target 3000 data revamp. Total’s production data management system (PDMS) is migrating from an in-house solution to a Schlumberger-based package. This leverages FieldView, Finder, OFM and other components. A FieldView to Finder (F2F) link was written for the project. The hub of the system is Schlumberger’s DecisionPoint, used for reporting, data loading (with ‘portlets’) and to configure personal workspaces. The PDMS is a component of Total’s Operational Excellence Information Systems, which will help monitor processes, identify causes of failure and deliver continuous improvement.


© Oil IT Journal - all rights reserved.