12th SMi E&P Information and Data Management, London

Data managers hear more on Chevron’s data and IT standardization and from Repsol on keeping it ‘open’ and ‘people-focused.’ ENI warns of lost geodetic know-how. Aramco scales up its data effort to match a drilling hike. GDF Suez and Total offer insights into their data quality programs. AF Engineering and Preem Petroleum show how data quality is assured in the refinery.

Around 100 showed up for the 12th SMi E&P Information & Data Management Conference held last month in London. Chairman Floyd Broussard (Schlumberger) noted how the industry has evolved from the days of unmanaged data collections, through a stage of conservancy and project building, to data ‘curation’ and ‘shaping the information landscape for self service by users through quality tools and automation.’

Peter Breunig (Chevron) described data management as ‘not cool, but critical.’ Data was a key aspect of an IT transformation project kicked off by Chevron’s new CIO, Louie Ehrlich, in 2009. The project identified Chevron’s ‘top ten’ data types, which were targeted for a major clean-up program over the next decade, along with the start of a data maintenance program. This connects with Chevron’s ITC ‘utility’ services spanning upstream, ERP and business units.

Chevron is working with an upstream IM ‘shaping curve’ that helps understand objectives and provides ‘swim lanes,’ with details of how to achieve and roll out projects. Chevron is working on standardized systems of record, search (text and spatial) and on IM career paths—there will be no more ‘dipping in and out’ of geotechnical and IM careers. The project sets out initially to answer the ‘15 or so’ top enterprise questions, ‘what’s my production’ etc. via data standards. Flexibility is needed to balance the ‘architectural review’ approach with the ‘killer application’ that does not conform.

Malcolm Fleming (Common Data Access) outlined a new UK data initiative: a National Data Repository (NDR). The NDR is to house petrotechnical and cultural data and metadata, along with license and production sharing agreements and regulations on data preservation and reporting. A ‘virtual’ model is planned for the NDR, leveraging existing vendor and archive sites. There will be no central location and there will be duplication—it will be a ‘chaotic’ system that will need to be fixed. CDA is working with the DECC on catalog standardization and exchange. The vision is of an ‘affordable, science-driven selection policy for preservation.’ Fleming warned, ‘It is unlikely that the UK will ever allocate sufficient financial resources to preserve all North Sea data—there are hard decisions to come.’

Repsol’s Augustin Diz acknowledged that we need structured databases, but these need to be integrated by design—which is currently not the case. Managing ESRI Shapefiles, for instance, is complex and Repsol’s users have moved to the simplicity of Google Earth. There is more happening in operations than in G&G, but we are still not doing surveillance with business rules. We need interpretation tools that broadcast their models. Social tools should be built in to support access. We need to ask what users would like to say about their models and interpretations—make it useful and easier to remember what they did. Diz advocated ‘embracing’ open publication. In Repsol, a team was using SharePoint to share official tops and integrate workflows. Well logs were dumped into SharePoint because this was seen as better than ‘putting them into the database.’ IT came along behind and captured this material into databases without losing the SharePoint facility. SharePoint was thus considered a vehicle for IM improvement. Diz advocated thinking ‘people first’ and asking how best to create value. You should also be prepared to switch between learning and teaching modes. G&G folks should spend a month or two in IT.

Tarun Chandrasekhar (Neuralog) and Steve Jaques (Laredo Energy) presented Laredo’s ‘iOps’ data environment. iOps comprises a NeuraDB log database, a GIS-based web portal and access to third party data from the Texas RRC, IHS and P2ES. The ‘glue’ behind the system is a PPDM 3.8 master data management system that collates data across the components. iOps provides integration with interpretation tools such as Petra and SMT Kingdom. The iOps system is understood to be one of the first Microsoft SQL Server-based PPDM implementations.
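
Since PPDM 3.8 is a relational model, access to the well master behind a portal like iOps ultimately comes down to SQL against tables such as WELL. The following minimal Python sketch shows such a query over SQL Server via pyodbc; the connection details, field filter and column selection are illustrative assumptions (column names follow the public PPDM well table), not details of the Laredo implementation.

import pyodbc

# Placeholder connection string and credentials; not Laredo's actual setup.
conn = pyodbc.connect(
    "DRIVER={SQL Server};SERVER=ppdm-host;DATABASE=PPDM38;"
    "UID=reader;PWD=changeme"
)
cursor = conn.cursor()

# The PPDM well master lives in the WELL table, keyed by UWI.
cursor.execute(
    """
    SELECT UWI, WELL_NAME, SURFACE_LATITUDE, SURFACE_LONGITUDE, SPUD_DATE
    FROM WELL
    WHERE ASSIGNED_FIELD = ?
    ORDER BY SPUD_DATE DESC
    """,
    ("SOME_FIELD",),  # hypothetical field name
)
for uwi, name, lat, lon, spud in cursor.fetchall():
    print(f"{uwi}  {name}  ({lat}, {lon})  spudded {spud}")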

Mario Marco Fiorani (ENI) described the parlous state of coordinate reference system (CRS) information across the industry saying, ‘We are losing the knowledge of handling cartographic parameters. In ENI there are only a couple of people left who understand datum shift and CRS data management.’ To fix this and to support its GIS and business intelligence applications, ‘MyGIS Explorer’ and ‘InfoShop,’ ENI has performed a quality assessment of its coordinate data. ENI is working with OpenSpirit and is now ‘nurturing’ its remaining cartographic skills.
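
To illustrate the know-how Fiorani is worried about, the sketch below uses pyproj (our choice of library, not a tool named in the talk) to shift a nominal North Sea position from the ED50 datum (EPSG:4230) to WGS 84 (EPSG:4326). The same latitude and longitude numbers refer to ground positions roughly a hundred meters apart, which is why CRS metadata and datum-shift skills matter.

from pyproj import Transformer

# A nominal North Sea position quoted against the ED50 datum (illustrative).
lon_ed50, lat_ed50 = 2.0, 56.0

to_wgs84 = Transformer.from_crs("EPSG:4230", "EPSG:4326", always_xy=True)
lon_wgs84, lat_wgs84 = to_wgs84.transform(lon_ed50, lat_ed50)

# The shift is on the order of 100 m; the exact figure depends on the
# transformation pyproj selects, so treat it as indicative only.
print(f"ED50  : {lat_ed50:.6f}N {lon_ed50:.6f}E")
print(f"WGS84 : {lat_wgs84:.6f}N {lon_wgs84:.6f}E")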

Ahmed Al-Otaibi described how a three-fold hike in drilling (146 wells in 2008, up from 44 in 2001) has created data management challenges for Saudi Aramco. Exploration and delineation drilling footage has risen from 4 million feet in 2004 to 8 million in 2009 with a similar hike in development. Aramco has developed a data governance framework for its well data with roles, responsibilities, and policies. The company has its own data model, ‘not Epicentre, not PPDM,’ and its own seismic standards.

Well data tracking has automated the process of assuring data completeness in drilling, wellsite geology and wells databases. Traffic lights and a dashboard track progress across diverse in-house and outside data producers. The new system can simultaneously track 400 wells with minimal manual intervention. Proactive data tracking from wellsite to corporate database has resulted in a 13-fold productivity gain. The system tracks well headers, well bores, deviation surveys, logs, cores, tops, tests, lithology, ROP and VSPs and supports document tracking and loading along with QC of document metadata and secure access control.
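
A traffic-light completeness check of this kind can be sketched in a few lines of Python. The data item list comes from the talk; the thresholds and the code itself are assumptions for illustration, not Aramco's implementation.

# Required deliverables per well, per the data types listed above.
REQUIRED_ITEMS = {"header", "wellbore", "deviation_survey", "logs", "cores",
                  "tops", "tests", "lithology", "rop", "vsp"}

def traffic_light(received):
    """Return green/amber/red based on how many required items have arrived."""
    fraction = len(received & REQUIRED_ITEMS) / len(REQUIRED_ITEMS)
    if fraction == 1.0:
        return "green"
    if fraction >= 0.7:   # assumed threshold
        return "amber"
    return "red"

wells = {
    "WELL-001": {"header", "wellbore", "deviation_survey", "logs", "cores",
                 "tops", "tests", "lithology", "rop", "vsp"},
    "WELL-002": {"header", "wellbore", "logs"},
}
for uwi, items in wells.items():
    print(uwi, traffic_light(items))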

An in-house developed data quality management (DQM) process tabulates data quality metrics such as completeness, consistency, validity and uniqueness. A very successful data services system (DSS) was developed in collaboration with Petris and has now been commercialized. DSS acts as a hub for seismic data from field acquisition, processing and specialized processing, and as a gateway to project databases.
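
The four metrics named in the talk can be computed straightforwardly once the rules are agreed. The toy Python below tabulates them over a few made-up well header records; the specific rules (coordinate ranges, allowed status codes) are our assumptions.

records = [
    {"uwi": "W1", "lat": 26.1, "lon": 49.8, "status": "producing"},
    {"uwi": "W2", "lat": None, "lon": 50.1, "status": "producing"},
    {"uwi": "W2", "lat": 25.9, "lon": 50.1, "status": "unknown_code"},
]
total = len(records)

# Completeness: no missing values in the record.
completeness = sum(all(v is not None for v in r.values()) for r in records) / total

# Validity: populated coordinates fall inside their legal ranges.
def coords_valid(r):
    return (r["lat"] is not None and -90 <= r["lat"] <= 90
            and r["lon"] is not None and -180 <= r["lon"] <= 180)
validity = sum(coords_valid(r) for r in records) / total

# Uniqueness: no duplicate well identifiers.
uniqueness = len({r["uwi"] for r in records}) / total

# Consistency: status values drawn from an agreed reference list (assumed).
allowed = {"producing", "shut-in", "abandoned"}
consistency = sum(r["status"] in allowed for r in records) / total

print(f"completeness {completeness:.0%}, validity {validity:.0%}, "
      f"uniqueness {uniqueness:.0%}, consistency {consistency:.0%}")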

Another in-house developed system, ‘Geo-Knowledge Management’ (GKM) captures and manages knowledge of exploration assets—performing version control, providing connectivity to knowledge repositories and databases, and offering reporting and GIS-based search. The GKM is now enhanced with a master data catalog—further aiding search through ‘centralized and consistent master data across all repositories.’

Finally, and most important, come people and professional development. This is assured by in-house and vendor-supplied training and mentoring on G&G data and document management, data quality, GIS and database administration. There is also a professional development program for young talent, and others can enroll in more specialist programs. In answer to a question, Al-Otaibi said that data management is separate from IT; exploration data management is done by geoscientists.

Future data challenges include pre-stack seismic data and interpretations. Here Aramco is working on data governance and QC procedures before rolling out a system that supports data flows from acquisition through pre-stack interpretation and beyond.

David Lloyd observed that GDF Suez’ recent growth has seen a doubling in Aberdeen and London for the Cygnus development, the largest Southern North Sea project in 20 years. GDF Suez’ vision is of tightly integrated information systems and robust processes running on state-of-the-art technology and databases, leveraging quality, data and GIS. Lloyd noted that ‘the technology deployed in schools and universities has not yet reached the oil company coal face.’

Technologies such as 2 TB disks, Intel Nehalem CPUs, solid state RAID, ‘Light Peak,’ Nvidia Tesla 20, faster RAM and USB 3 are all coming ‘real soon now!’ But what are we doing about it?

Standard methodologies including PRINCE2, ITIL v3 and MOF can help—but they need a good ‘pitch’ to avoid analysis paralysis and rejection. Projects need to ‘fit’ with the E&P industry and avoid an ‘IT crowd’ label. Lloyd recommends going for a minimum number of clearly defined processes with demonstrable ROI. Business analysis provides ‘eyes and ears’ into the department to find out geologists’ and geophysicists’ requirements. Information systems project management can be tailored to the industry, leveraging the best bits of PRINCE2 rather than re-inventing the wheel. Documentation is required—a minimum of project mandate, brief, initiation document, presentation, risk log, RFC, end stage and end project reports. Everyone needs to know what is expected.

GDF Suez is currently three months into the project and has drafted high level definitions in Visio. An online business management system is being deployed using corporate standards. A risk assessment matrix of severity vs. likelihood has been established and people hired to fill roles. RPS/Paras’ Alan Smith gave a parallel talk on the use of psychometric testing to fill key roles in projects like this. In answer to a question, Lloyd described GDF Suez’ UK business as a ‘devolved environment, although this is starting to change. E&P data management processes are now emerging from Paris. We may end up somewhere in the middle of the central/devolved spectrum like BP.’
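
On the risk assessment matrix mentioned above, a toy severity-times-likelihood scoring might look like the Python below; the five-point scales and banding thresholds are assumptions for illustration, not GDF Suez’ actual matrix.

# Assumed five-point scales for severity and likelihood.
SEVERITY = {"negligible": 1, "minor": 2, "moderate": 3, "major": 4, "critical": 5}
LIKELIHOOD = {"rare": 1, "unlikely": 2, "possible": 3, "likely": 4, "almost_certain": 5}

def risk_band(severity, likelihood):
    """Multiply the two scores and band the result (thresholds assumed)."""
    score = SEVERITY[severity] * LIKELIHOOD[likelihood]
    if score >= 15:
        return "high"
    if score >= 6:
        return "medium"
    return "low"

print(risk_band("major", "possible"))   # 4 x 3 = 12 -> medium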

Pascal Colombani (Total E&P) asked, ‘What does a data quality program involve?’ The need for such a program became clear when an internal study revealed that ‘users were struggling to find reservoir data,’ in part because of different geographic coordinates across CDA, the survey department and the OpenWorks master database—up to six different sources of positioning information and four different locations for a single well. Another driver was the need to avoid individual DIY data management in Excel. Data can also be key in HSE, as was the case in one unit where a ‘lost’ well had been sidetracked to avoid an abandoned nuclear tool—a potentially dangerous situation that was fixed by the quality effort.
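
A simple cross-source check is enough to expose this kind of positioning conflict. The Python sketch below compares a well’s location as reported by different sources against a tolerance; the coordinates, source list and 30 m tolerance are illustrative assumptions, not Total’s configuration.

import math

def distance_m(p1, p2):
    """Approximate ground distance in metres between two (lat, lon) points."""
    lat1, lon1 = p1
    lat2, lon2 = p2
    dlat = (lat2 - lat1) * 111_000
    dlon = (lon2 - lon1) * 111_000 * math.cos(math.radians((lat1 + lat2) / 2))
    return math.hypot(dlat, dlon)

# Same well, positions reported by different sources (hypothetical values).
sources = {
    "CDA": (57.1234, 1.2345),
    "survey_dept": (57.1234, 1.2349),
    "openworks": (57.1251, 1.2345),
}
reference = sources["survey_dept"]
for name, pos in sources.items():
    d = distance_m(reference, pos)
    flag = "OK" if d < 30 else "CHECK"   # 30 m tolerance is an assumption
    print(f"{name:12s} {d:7.1f} m  {flag}")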

The cornerstone of quality management is data ownership, which is also the big challenge. People will happily spend 50% of their time building their own data set—but will balk at the extra 10% effort required for validation. Colombani warned against regarding data management and quality as a ‘project.’ These initiatives need ongoing funding to be ‘sustainable.’ Data management has not always been given the consideration it deserves in the past. But Total is steadily building a compelling argument for doing it right. In Indonesia, a sustained data management and quality effort has seen the creation of a new, QC’d reference database which has contributed to a ‘huge production hike.’ This was achieved by visibility (for the first time) of a complete, quality dataset that has meant drilling better wells, enhancing injection, and optimization based on a better reservoir model.

Christer Öhbom (AF Engineering) cited the ‘data commandments’ of Preem Petroleum’s Tore Carrick: 1) data is always wrong but decisions must be right, 2) data must be visualized to understand its origin and context, and 3) data has endless life while systems come and go. Preem Petroleum has 20,000 measurement points in its refineries. With an MTBF of once in 30 years per point, that works out at around two faulty signals per day. Often these are hard to understand and may trigger inappropriate action by operators. To combat this, Preem recommends the ISO 9000 7.5.2 ‘special processes’ methodology, which states that ‘if results cannot be verified you need to QA the process.’ For Preem, this has been done by DNV for 800 flow meters and other tags. QA is now built into Preem’s Quality Information System (QIS). Data flows from historians, PLC/DCS and databases through QIS and on to users/applications. QIS leverages a ‘logical process model’ of input and output streams and values. Tag agents aggregate data from minutes to hours. When an anomaly is detected, a warning goes to the data owner.
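
The failure-rate figure is simple arithmetic, and the tag-agent behavior can be sketched in the same spirit. The Python below checks the back-of-envelope number and shows a toy minute-to-hour aggregator that warns the data owner on an anomaly; the expected-range rule is our assumption, not Preem’s QIS logic.

# 20,000 points, each failing on average once in 30 years.
points = 20_000
mtbf_years = 30
faults_per_day = points / (mtbf_years * 365)
print(f"expected faulty signals per day: {faults_per_day:.1f}")   # ~1.8, i.e. ~2

def hourly_mean(minute_values, expected_range=(0.0, 100.0)):
    """Aggregate one hour of minute readings; return (mean, warning_or_None)."""
    mean = sum(minute_values) / len(minute_values)
    lo, hi = expected_range
    warning = None if lo <= mean <= hi else f"mean {mean:.1f} outside {expected_range}"
    return mean, warning

mean, warning = hourly_mean([98.0] * 55 + [250.0] * 5)   # a few wild readings
if warning:
    print("notify data owner:", warning)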

This article is an abstract from a Technology Watch Report produced by The Data Room. For more information on Technology Watch, visit oilit.com/tech or email tw@oilit.com.

This article originally appeared in Oil IT Journal 2010 Issue # 3.
