PNEC Data Integration 2013, Houston

Chevron’s ‘next generation’ data management. Shell on ‘lean’ data management and harmonizing applications. Noah, EnergyIQ, ExxonMobil and IHS on data specifics in the unconventional space. ExxonMobil on regression testing the data asset. XTO on SOX. Professionalizing data management.

The 17th edition of the PNEC data integration conference was introduced by Cindy Crouse with a short reminder of PNEC founder Phil Crouse’s life. Cindy and the team proved able conference organizers, managing a successful and well-attended conference with a record 580 head count and some 43 presentations.

Chevron’s Jennie Gao outlined the requirements of a modern upstream interpretation shop. These include data integration between multiple geoscience applications and datastores. Top priority is data access between Petrel, Epos and OpenWorks R5000. However, commercial tools only support selected data types and file exchange is limited, as vendors offer more ‘import’ than ‘export.’ Keeping data in sync between these three core applications is challenging. OpenWorks is used as a master data store for well data and ‘gold standard’ interpretations, from which data is pushed out to Petrel and Epos. OpenSpirit Copy Manager and Schlumberger’s InnerLogix are deployed. Chevron has developed a complex system to handle well symbology and coordinate reference systems across the different platforms. TOAP, Tibco OpenSpirit’s Petrel plug-in, provides bi-directional exchange with OpenWorks. The solution is maintained by Chevron’s ‘next generation’ data management team with help from Larsen & Toubro Infotech. A wry comment from the floor suggested that, ‘25 years on, and we’re back to Geoshare!’
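
By way of illustration, here is a minimal sketch of the push-from-master pattern described above. The table layout, connections and sync routine are hypothetical stand-ins, not the OpenWorks schema or the OpenSpirit Copy Manager/InnerLogix APIs:

```python
# Illustrative only: a one-way 'push from master' sync of well headers,
# standing in for the OpenWorks -> Petrel/Epos flow described above.
# Table and column names are hypothetical, not the vendors' schemas.
import sqlite3

master = sqlite3.connect(":memory:")
consumer = sqlite3.connect(":memory:")
for db in (master, consumer):
    db.execute("CREATE TABLE well (uwi TEXT PRIMARY KEY, name TEXT, td REAL)")

# 'Gold standard' records live in the master store only.
master.executemany("INSERT INTO well VALUES (?,?,?)",
                   [("42-123-00001", "A-1", 3050.0), ("42-123-00002", "A-2", 2980.5)])

def push_from_master(src, dst):
    """Copy every master well header into the consumer project, overwriting drift."""
    rows = src.execute("SELECT uwi, name, td FROM well").fetchall()
    dst.executemany("INSERT OR REPLACE INTO well VALUES (?,?,?)", rows)
    dst.commit()
    return len(rows)

print(push_from_master(master, consumer), "well headers pushed")
```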

Karl Fleischmann (Shell E&P America) has been investigating what technical data management has to learn from manufacturing. The ‘lean’ approach to manufacturing can be applied to upstream data. Much of today’s work is unnecessary—data quality standards may be set too high and data (especially seismic) is in ‘overproduction,’ as the same work is repeated. As an example, Fleischmann cited Shell’s attempt to build a ‘well book,’ an overview of a set of wells in a field. This used to be an ‘incredibly arduous task’ taking six months to create a 600-page book for the North Sea Nelson field. To IT this seemed like a good candidate for automation à la manufacturing. But this was harder than it appeared. To properly understand the process, Shell embedded a data management group in the Nelson asset, which helped compile the well book, hand-collating documents etc., until ‘they really understood the process.’ Next, ‘value stream mapping’ was used to high-grade process improvement opportunities and implement a first-pass improvement with better project management and deployment of a corporate data store, but as yet no automation. The results were spectacular. Where previously it took five employees six months to handle one field, the same team managed 74 fields in half the time using a ‘highly scripted’ process. The scripts, which are recipes for standard procedures rather than computer code, make sure that (only) key data is captured quickly. In the Q&A, Fleischmann said that the process was good at identifying elements of the workflow for further automation—a concept that has support from Shell’s CEO.

Shell’s data guru, Matthias Hartung, emphasized the need for ‘trusted data’ across many domains, from geotechnical, through HSE, emergency response and compliance. But current schedule-driven culture means that while there is no time to ‘do it right,’ there is always time to do it over! The result is that data management is not at the maturity level it should be. This is impacting novel forms of exploration and development such as unconventionals, with their very rapid cycle times. Data management is lacking in standards and data managers’ career paths are unclear. Yet there are sustainable solutions and successes—Hartung cited the Standards Leadership Council and advocated data management education and accreditation. In Shell, technical data management is now established as a globally managed technical discipline and Shell is developing or harmonizing its data standards, applications and architecture. Hartung suggests two urgent steps. One, professionalize data management and turn it from ‘Cinderella’ to an enduring beauty. Two, develop an academic curriculum including agreed-upon standards and data ownership.

Judging by Fred Kunzinger’s (Noah Consulting) presentation, data management in the unconventional space is more of an opportunity than a problem. Unconventional is different. Developing a shale play mandates an integrated approach unlike the old exploration, development and production stage gates. Challenges for unconventional exploration center on supply chain and logistical efficiency—think Six Sigma and ‘lean’ as above. Operators have to schedule multiple operating rigs, manage a complex patchwork of land holdings, truck water and produce the oil and gas. Unconventional exploration is also forcing operators to abandon information silos in favour of integrated, real-time systems. All the usual gotchas of data management, such as differing well identifiers and nomenclature, need to be fixed to enable the new ‘assembly line.’ Unconventionals represent a true ‘big data’ problem with great benefits to be gained from integrating large data sources to identify sweet spots and optimize completions in a continuous improvement process. A relatively small investment in IT can provide a significant return for a company drilling 1,000 Bakken wells per year at around $10 million a pop.
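
To put the ‘relatively small investment’ claim in context, a back-of-the-envelope calculation; the assumed efficiency uplift is ours, not the presenter’s:

```python
# Back-of-the-envelope economics for a 1,000 well/year Bakken program.
wells_per_year = 1_000
cost_per_well = 10e6          # ~$10 million a pop, per the presentation
annual_drilling_spend = wells_per_year * cost_per_well   # $10 billion/year

assumed_uplift = 0.001        # illustrative 0.1% efficiency gain (our assumption)
value_of_uplift = annual_drilling_spend * assumed_uplift  # $10 million/year
print(f"${annual_drilling_spend/1e9:.0f} bn drilling spend, "
      f"${value_of_uplift/1e6:.0f} mm from a 0.1% gain")
```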

EnergyIQ’s Steve Cooper, in a joint presentation with TGS/Volant and Perigon, outlined a project performed for another unconventional player, Hess Corp. Again, speed is the name of the game. Data loading has been partly automated. ‘Drop boxes’ are used for QC before loading to Hess’ Technically validated database (OITJ February 2010). Perigon’s iPoint toolset and TGS’ Envoy viewer also ran.
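
For illustration, a minimal sketch of the drop-box pattern, in which incoming files are screened before promotion to the load area; the naming convention and checks are assumptions, not Hess’, EnergyIQ’s or Perigon’s actual QC rules:

```python
# Hypothetical drop-box QC: only files that pass basic checks are promoted
# from the drop box to the load area; the rest are quarantined for review.
import re
import shutil
from pathlib import Path

DROP, LOAD, QUARANTINE = Path("dropbox"), Path("to_load"), Path("quarantine")

def qc_pass(path: Path) -> bool:
    # Assumed convention: files are named <UWI>_<datatype>.las, e.g. 4212300001_GR.las
    return bool(re.match(r"^\d{10}_[A-Z]+\.las$", path.name)) and path.stat().st_size > 0

def sweep():
    for f in DROP.glob("*"):
        target = LOAD if qc_pass(f) else QUARANTINE
        target.mkdir(exist_ok=True)
        shutil.move(str(f), str(target / f.name))
```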

Gbolade Ibikunle returned to the ‘lean’ manufacturing approach to data management in a presentation of Shell Nigeria’s ongoing integrated drilling and production review project, which leverages Shell’s well book concept (see above). Data from multiple sources, including Recall, Landmark’s EDM and OpenWorks, and Schlumberger’s Oilfield Data Manager, is all linked in. Workflow control leverages Adobe digital signatures for document sign-off. Deployment has involved a mixture of encouragement (with branded T-shirts), enforcement and annual review. Engineers now spend much less time looking for data and production is up. Quality data and good collaboration between developers and users were keys to success.

Many North American unconventional plays are found in proven oil and gas provinces with a long, richly documented history. Extracting information from such historical data can be challenging. As Stephan Auerbach (ExxonMobil) and Cindy Cumins (IHS) asked, ‘Is legacy data a graveyard or a treasure trove?’ ExxonMobil is using IHS’ electronic document custody service to get a handle on its huge document archive. Millions of hard copy documents and other media are at or near end of life. These represent a rich heritage from 50 plus acquired companies which captured documents with what was at the time the ‘latest’ technology. Exxon and IHS are now working to unlock value from a 25 million item data set covering the US lower 48, representing over 3 million wells. The process has uncovered significant hidden data of interest to US unconventional plays. Old well logs, not available to competitors, revealed hazardous igneous intrusions, allowing ExxonMobil to walk away from the deal. Scanned images have been geotagged in a ‘rigorous’ workflow. Hard copy is triaged for relevance before metadata capture to Iron Mountain’s Accutrac records management system.
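
As an illustration of what such a capture workflow might retain per scanned item, a hypothetical metadata record; the field names are ours, not the Accutrac or IHS schema:

```python
# Illustrative metadata record for one scanned legacy document; the fields are
# assumptions about what a geotagged capture workflow would retain.
from dataclasses import dataclass, asdict
from typing import Optional

@dataclass
class ScannedDocument:
    item_id: str               # archive barcode or box/item number
    doc_type: str              # e.g. 'well log', 'scout ticket', 'core report'
    well_api: Optional[str]    # API/UWI, if the item ties to a single well
    latitude: Optional[float]  # geotag captured during the scanning workflow
    longitude: Optional[float]
    source_company: str        # which of the 50-plus heritage companies it came from
    relevant: bool             # outcome of the relevance triage step

doc = ScannedDocument("BX1041-017", "well log", "3301500123",
                      47.81, -103.25, "heritage company (acquired 1987)", True)
print(asdict(doc))
```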

ExxonMobil’s CT Gooding asked why more companies did not perform regression testing to maintain the quality of their data asset. The answer is that regression testing is hard, requires multiple tools and the underlying systems ‘morph constantly.’ The solution is to automate testing on sample data sets generated from user input and metadata constraints. One use case is checking data sync between engineering and geoscience databases. A test record is injected into the two schemas, which are then compared with an automated SQL query ‘to see if anything breaks.’ Gooding recommends a dedicated test environment that includes tests on user roles and identities: ‘don’t run a test with super user privileges.’ The test environment itself can be generated automatically with triggers inside the production database.
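
A minimal sketch of the inject-and-compare test described above, using in-memory SQLite tables as stand-ins for the engineering and geoscience schemas; the table layout and reconciliation query are assumptions, not ExxonMobil’s actual tooling:

```python
# Sketch of a cross-database regression test: inject a known test well into
# both schemas, then run a reconciliation query 'to see if anything breaks.'
# Schemas are stand-ins; a real test would target the engineering and
# geoscience databases through their own interfaces.
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
  CREATE TABLE eng_wells (uwi TEXT PRIMARY KEY, surface_lat REAL, surface_lon REAL);
  CREATE TABLE geo_wells (uwi TEXT PRIMARY KEY, surface_lat REAL, surface_lon REAL);
""")

TEST_WELL = ("TEST-0001", 29.7604, -95.3698)   # synthetic record, never a real well

def inject(cur, table):
    cur.execute(f"INSERT INTO {table} VALUES (?,?,?)", TEST_WELL)

def mismatches(cur):
    # Rows present in one schema but not the other, or present with differing values.
    return cur.execute("""
      SELECT e.uwi FROM eng_wells e
      LEFT JOIN geo_wells g ON g.uwi = e.uwi
      WHERE g.uwi IS NULL OR e.surface_lat != g.surface_lat OR e.surface_lon != g.surface_lon
      UNION
      SELECT g.uwi FROM geo_wells g LEFT JOIN eng_wells e ON e.uwi = g.uwi
      WHERE e.uwi IS NULL
    """).fetchall()

cur = db.cursor()
inject(cur, "eng_wells")
inject(cur, "geo_wells")        # the sync process under test would normally do this
assert mismatches(cur) == [], "regression: schemas out of sync"
print("eng/geo schemas agree on the injected test record")
```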

Eileen Mahlow (XTO Energy) recalled the times when companies were told that in-house custom development was deprecated, heralding a move to commercial off-the-shelf (COTS) software. Then came Sarbanes-Oxley (SOX) and a suite of new controls on accounting and other systems. COTS tools and accepted practices were not generally aligned with the new SOX regulations. For instance, software should not need full privilege to run and users’ roles should align with their responsibilities. Other issues such as software defect and change management may likewise expose companies to SOX issues. What is the answer? Mahlow advised vendors to do a better job of keeping code separate from configuration files and to improve entitlement management. Oil companies need to improve how they administer privileges: ‘don’t let just anyone access the dbadmin account.’ Another gotcha is the ability of an unauthorized user to bypass access controls through direct access with tools like Toad.
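
To illustrate the role-alignment point, a minimal sketch of a deny-by-default, least-privilege role mapping; the role names and privileges are hypothetical and not tied to any particular COTS product:

```python
# Illustrative role-to-privilege mapping enforcing least privilege: the
# application account can read and write but cannot alter schema or grant
# rights; only a separately controlled admin role can. Role names and
# privilege sets are assumptions for the sake of the example.
ROLE_PRIVILEGES = {
    "app_service":   {"SELECT", "INSERT", "UPDATE"},   # what the software needs to run
    "field_analyst": {"SELECT"},                        # read-only reporting
    "db_admin":      {"SELECT", "INSERT", "UPDATE", "DELETE", "ALTER", "GRANT"},
}

def authorize(role: str, action: str) -> bool:
    """Deny by default; a direct-access tool like Toad gets no extra rights."""
    return action in ROLE_PRIVILEGES.get(role, set())

assert authorize("app_service", "UPDATE")
assert not authorize("app_service", "ALTER")      # the app should not need full privilege
assert not authorize("field_analyst", "DELETE")
```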

One PNEC plenary session was devoted to the subject of professional data management and competency training and testing. Oil IT Journal has already reported on PPDM and UK CDA’s initiatives to formulate a data management training program, backed up with certification and testing. PPDM CEO Trudy Curtis pushed the boat out a little further, comparing PPDM, now the ‘Professional Petroleum Data Management Association,’ with the SEG and SPE. CDA CEO Malcolm Fleming likewise called for the establishment of a professional association with its own journal, annual conference, workshops and seminars. ‘Data managers need their own club.’ In a similar vein, Omar Akbar outlined Saudi Aramco’s plans for the DMBoard, an ‘operators only’ club that aims to succeed in upstream data management where the vendors have failed. Akbar stated that, ‘We don’t want to be controlled by solutions, we want to be controlled by the business.’ Early work is to focus on a ‘reference model’ of industry terminology.

Our PNEC coverage continues next month with more presentations on unconventional data management, on emerging solutions to managing Petrel data and on Pioneer’s, err, pioneering use of data virtualization. Meanwhile, visit the PNEC home page.

© Oil IT Journal - all rights reserved.