The 19th PNEC petroleum data integration conference, held earlier this year in Houston, largely kept up its tradition of informational value, with most presentations providing insightful disclosure of upstream data management practices. Our reading of the proceedings also suggests a shift in focus from data management per se to a broader, holistic view of the business-data-IT triangle, with even a sortie or two into the digital oilfield space.
Jim Seccombe showed how BHP Billiton, with help from Halliburton and Infosys, ‘reverse engineered’ a data management solution around its existing decentralized wellbore data infrastructure spanning Petrel, Petra and OpenWorks. Project ‘O’Weld’ centralized well data using scripts that crawl servers and gather target files into a central Recall database. The project included much data de-duplication, clean-up and promotion of key data, and is now kept up to date with nightly cron jobs. Other functionality includes the generation of workstation-ready curves leveraging Recall’s Raven data validation engine. Landmark/Petris Winds Enterprise adds comprehensive search functionality.
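A server-crawling harvest of the kind described might be sketched as below. This is a minimal illustration, not the O’Weld implementation: the share paths, file extensions and the (name, size) de-dupe key are all assumptions, and loading into Recall itself is proprietary and out of scope.

```python
import shutil
from pathlib import Path

# Hypothetical source shares and staging area; the real O'Weld
# scripts and paths were not disclosed in the presentation.
SOURCES = [Path("/data/petrel"), Path("/data/openworks")]
STAGING = Path("/stage/recall_inbox")
WANTED = {".las", ".dlis"}  # well-log file types to promote

def crawl(sources=SOURCES, staging=STAGING):
    """Walk each share, copy target file types into the staging area,
    skipping crude duplicates keyed on (file name, size)."""
    seen = set()
    copied = []
    staging.mkdir(parents=True, exist_ok=True)
    for root in sources:
        for f in root.rglob("*"):
            if f.suffix.lower() not in WANTED or not f.is_file():
                continue
            key = (f.name.lower(), f.stat().st_size)
            if key in seen:  # already harvested an identical-looking file
                continue
            seen.add(key)
            shutil.copy2(f, staging / f.name)
            copied.append(f.name)
    return copied
```

In production such a script would run from the nightly cron jobs the talk mentions, with real duplicate detection (checksums rather than name/size) and curve validation downstream.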
Petroweb’s Brandon Schroeder introduced a ‘bigger, better, faster’ implementation of the PPDM data model that leverages a modern ‘big data’ back end. The problem with current PPDM deployments is that data volumes are expanding beyond the capacity of ‘traditional’ approaches such as denormalization and database tuning. Petroweb has been investigating a NoSQL solution using Hadoop, MongoDB and Solr. The tools were easy to set up, fast, scalable and, as open source software, free. Petroweb has added spatial indexing and search to Solr to expose PPDM content via a GIS. Schroeder observed that the findings are not specific to Petroweb. Others who have implemented home grown PPDM solutions could leverage the technology.
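The spatial search added to Solr could be exercised with a filter query like the one built below. This is a sketch only: Solr’s standard ‘geofilt’ filter syntax is real, but the field name and the query are invented, and Petroweb’s actual index schema is not public.

```python
def geo_query(text, lat, lon, km, sfield="location"):
    """Build Solr query parameters combining full-text search with a
    radial geo filter (Solr's standard 'geofilt' filter query).
    The 'location' field name is illustrative; a real PPDM-backed
    index would define its own spatial field."""
    return {
        "q": text,
        "fq": f"{{!geofilt sfield={sfield} pt={lat},{lon} d={km}}}",
        "rows": 10,
    }
```

A GIS front end would pass the map-window center and radius here and send the parameters to Solr’s select handler.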
Mike Slee (Addax Petroleum) and Peter Black (EnergySys) presented an innovative, cloud-based data management solution. Sinopec-owned Addax has interests in Europe, the Middle East and Africa, presenting particular data and reporting requirements. Following recent mergers, Addax has consolidated its disparate spreadsheet-based production reporting to a hosted solution from EnergySys. The authors are scathing with regard to current oil and gas industry standards and standards bodies, whose production often ‘languishes unread.’ Even ‘successful’ standards such as Witsml are ‘notable for their rarity rather than their impact.’ Addax is, however, a Petroweb, and therefore a PPDM, user. The EnergySys approach is to leverage the generic, Microsoft/SAP-backed OData protocol, which provides ‘straightforward integration’ of cloud and on-premises systems, bringing together information from a variety of data sources. While the cloud itself does little to fix the problem of data silos, OData is ‘revolutionary’ in that it provides ‘cloudbusting’ technology, offering the data manager with some coding skills the ability to connect multiple desktop applications. Excel PowerQuery got a plug, as did Petroweb’s GIS data viewer, which also uses OData.
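What a coding-literate data manager might do with such a feed can be sketched in a few lines. The ‘value’ array is the standard envelope of an OData v4 JSON response; the entity set and field names (‘Wells’, ‘WellName’, ‘DailyOil_bbl’) are invented for illustration and do not reflect the EnergySys or Petroweb schemas.

```python
import json

def parse_feed(payload: str):
    """Return the entity list from an OData v4 JSON response;
    entities live under the standard 'value' key."""
    doc = json.loads(payload)
    return doc.get("value", [])

# A mocked-up response of the shape an OData endpoint would return.
sample = json.dumps({
    "@odata.context": "https://example.com/odata/$metadata#Wells",
    "value": [
        {"WellName": "A-1", "DailyOil_bbl": 1250},
        {"WellName": "A-2", "DailyOil_bbl": 980},
    ],
})
wells = parse_feed(sample)
total = sum(w["DailyOil_bbl"] for w in wells)
```

A real client would of course issue an HTTP GET against the service, typically narrowing the result server-side with standard OData query options such as $filter and $select.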
Petrobras’ Renan de Jesus Melo and Halliburton’s Ricardo Álvares dos Santos presented another use of OData, here to link Petrobras’ proprietary seismic trace database with Landmark’s OpenWorks. Landmark DecisionSpace Integration Server is used to consolidate and match data from the two environments and an OData link feeds up to a web-based client.
Notwithstanding the standards skeptics, Noble Energy’s Vijay Chitiyala and Oracle’s Carl Schuckenbrock teamed on a PIDX standard-based solution to manage Noble’s growing global inventory management challenge. The aim was to create a trusted and consistent foundation for engineering procurement and spend analysis. The adoption of HTS/ECCN product codes also facilitates international trade. The system comprises Oracle E-Business Suite and Product Hub along with Pilog’s master data management. To date Noble has cleansed and enriched over 30,000 items to PIDX and UNSPSC standards.
Scott Sitzman (ConocoPhillips) and Ryan Hamilton (SpatialEnergy) presented another cloud-based solution, this one for the storage and management of spatial imagery. Much needs to be done up front to ensure that imagery is correctly georeferenced and usable by applications. But the benefit of global access to secure, hosted imagery has meant a 40% hike in delivery speed and over 3.5 million maps drawn every month from the 100 terabyte hosted set. Workflow integration feeds imagery into ConocoPhillips’ desktop applications, which include Geographix, Petrel, Esri, Kingdom and Petra.
Alberto Dellabianca and Elio Rancati presented Eni’s OSIsoft PI-centric interpretation of the digital oilfield theme. The authors observed that many digital oilfield initiatives have failed due to unrealistic goals and overarching master solutions to complex processes. Eni initially deployed a minimal solution built with standard tools from OSIsoft and OVS. This was then augmented using PI AF to create a set of templates for wells and other assets and to implement exception based surveillance of ESPs and rotating equipment. The OVS workflows now include interaction with GAP and Eclipse. GUI development leverages PI Web parts and Coresight in Sharepoint. Eni is now working to apply the framework at the enterprise level with the deployment of PI Asset Analytics to embed standardized real time KPIs in the global templates.
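The spirit of exception-based surveillance can be shown in a few lines: only wells whose readings breach their template limits surface to the engineer. This is an illustrative sketch, not Eni’s implementation; PI AF templates are configured in the product rather than coded, and the tag names and limits below are invented.

```python
# Hypothetical per-asset limits, standing in for a PI AF well template.
# (tag: (low limit, high limit); None means no limit on that side.)
LIMITS = {
    "esp_motor_temp_C": (0, 120),
    "intake_pressure_bar": (15, None),
}

def exceptions(readings):
    """Return (tag, value, reason) for each reading outside its limits,
    so surveillance only raises wells that need attention."""
    out = []
    for tag, value in readings.items():
        lo, hi = LIMITS.get(tag, (None, None))
        if lo is not None and value < lo:
            out.append((tag, value, "below low limit"))
        elif hi is not None and value > hi:
            out.append((tag, value, "above high limit"))
    return out
```

Applying one such template across hundreds of ESPs is what lets a small surveillance team watch a large asset base, which is the point of Eni’s standardized templates and KPIs.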
Marcos Pérez (Petrolink) presented work done with Pemex on the use of the (not-so-rare) Witsml standard to consolidate data formats from multiple service providers in non-conventional frac operations. In the frac van, data access was limited to a single serial data port. Petrolink built a splitter box that replicated the RS232 data to TCP/IP ports. The ASCII data was then reformatted to vanilla Wits prior to ingestion by a Petrolink aggregator. Fully qualified Witsml data was then shared with HQ via a satellite link. The solution consolidated wellhead data from Schlumberger, Halliburton, Weatherford and CalFrac. Closed-loop real-time surveillance of fracking operations has optimized proppant use, reduced water use and helped manage fluid disposal.
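The reformatting step starts with parsing Wits level 0 records, which might look like the minimal sketch below. It assumes the common Wits 0 framing (records delimited by ‘&&’ and ‘!!’ lines, each data line a four-character record/item code followed by its value); the actual item codes and any vendor quirks would have to be confirmed against each provider’s feed, and the upstream serial-to-TCP replication is hardware, not code.

```python
def parse_wits0(block: str):
    """Parse one Wits level 0 record into a {code: value} dict.
    Records are framed by '&&' (start) and '!!' (end) lines; each
    data line is a 4-character record/item code plus its value."""
    items = {}
    in_record = False
    for line in block.splitlines():
        line = line.strip()
        if line.startswith("&&"):
            in_record = True
        elif line.startswith("!!"):
            in_record = False
        elif in_record and len(line) > 4:
            items[line[:4]] = line[4:]
    return items
```

An aggregator would map the resulting codes onto Witsml objects and units before forwarding the data over the satellite link.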
Chris Josefy and Omar Khan described El Paso Energy’s oilfield of the future, which seeks to break from current ‘less than perfect’ engagement strategies in which the business seeks a solution to a pressing issue while supplying a ‘fully formed solution,’ engaging IT as a mere ‘order taker and lagging partner.’ EP Energy set up a team of engineers, managers and IT specialists to figure out ‘how we should be doing this.’ The result is Well 360, an OVS*-based portal that offers a single view of well data from disparate systems and partially automates well surveillance and other workflows such as artificial lift optimization. A quick win from the project was the elimination of ‘wasteful’ morning meetings that were eating up an hour per day of lease operators’ time.
Mara Abel of the Brazilian INF/UFRGS research establishment, with help from software boutique Endeeper, has performed an ontological analysis of PPDM’s lithology/core data model. The objective is software interoperability by ‘making apparent the meaning of geological objects represented in the model.’ This quasi-philosophical approach involves asking deep questions as to the meaning of ‘reservoir,’ ‘what is a rock?’ and so on. Ontological analysis is claimed to clarify the modeler’s intent and highlight conflicts of interpretation. The idea is to produce small models with a limited number of formally defined entities and attributes that facilitate data integration. Abel examined some 49 PPDM lithology tables using the Ontoclean process. The results, as far as we can tell, suggest that some geological concepts defy ontologically rigorous classification. Other ‘successful’ classifications may be rather hard for database traditionalists (or geologists) to follow!
While ‘what is a rock?’ may be a hard one, PPDM’s ‘what is a well’ (Wiaw) has gained traction in ‘information-driven’ data management, according to James Pipe (PPS) and Gary Winningham (Chevron). The authors eschew current siloed modeling and advocate a holistic, top-down approach leveraging a robust information model and data dictionary to align source systems with the business. A three-step method starts with a logical model of the business, then maps this to applications and workflows, before finally plugging in the data sources. For Chevron, this meant refactoring existing systems around a modified and extended implementation of Wiaw.
Sean Sanders, with help from Halliburton, outlined Energy XXI’s seismic data management solution which includes a NetApp FAS 6210 Tier 1 storage unit atop an IBM/Tivoli storage manager. Troika’s newly launched data management suite is used to perform data clean-up and metadata collection prior to loading to Landmark’s PetroBank.
More Landmark technology was cited in Hussein Al-Ajmi’s presentation where he showed how KOC uses PowerExplorer as an entry point to an extensive back end including OpenWorks, Finder, LogDB, eSearch and ArcSDE spatial data stores.
As a parting shot we have to give a black mark to the presenter who chopped the product name off his slides. Not really in the PNEC spirit! More from PNEC Conferences. Also, mark your diaries for the 20th edition, to be held in Houston from 17 to 19 May 2016.
* One Virtual Source.
© Oil IT Journal - all rights reserved.