2013 PNEC Data Integration, Houston—Part II

Pioneer on data virtualization. RoQC, Petrel data management and the oxymoron. Tiandi’s data factory. Petronas Carigali’s data doctor. Talisman on well planning in DecisionSpace. More …

Our best paper award for the 2013 PNEC goes to Martha Gardill for her presentation of Pioneer’s four-year journey to data virtualization. Pioneer’s activity in unconventional exploration has seen the deployment of multiple ‘best of breed’ applications leveraging different architectures and data integration mechanisms. The downside of best of breed is that the onus falls on the operating company to bring it all together. This can be tricky as the authoritative source of data may not be evident and there may be overlap in application capability. At the start of the program in 2009, Pioneer evaluated three options—direct access to data stores, a data warehouse, or a ‘self service’ data system. The latter was chosen partly because of application scalability issues and partly because of compliance concerns around multi-user access to data. The solution, Pioneer’s ‘self serve data system’ (SSDS), leverages Composite Software’s data virtualization engine alongside Tibco’s ActiveMatrix BusinessWorks business automation platform. SSDS has democratized data access for Pioneer’s users. Power users can develop views, access applications and use reporting tools. Most use cases are for read-only access but the Tibco messaging bus also allows for update. In the Q&A, Gardill revealed that the biggest challenge was finding the right data virtualization tool.
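
For readers unfamiliar with the concept, the sketch below illustrates the core data virtualization idea, a single query spanning separate data stores, using Python’s built-in sqlite3 module with attached in-memory databases. Table and column names are invented for illustration; neither Composite’s engine nor Pioneer’s SSDS schema is shown.

```python
# Minimal illustration of the data virtualization idea: one SQL query spanning
# two separate data stores. Uses sqlite3 attached databases as stand-ins; all
# table and column names are invented and do not reflect Pioneer's SSDS schema.
import sqlite3

con = sqlite3.connect(":memory:")                    # plays the 'virtual' layer
con.execute("ATTACH DATABASE ':memory:' AS wells")   # stand-in for source A
con.execute("ATTACH DATABASE ':memory:' AS prod")    # stand-in for source B

con.execute("CREATE TABLE wells.header (uwi TEXT PRIMARY KEY, name TEXT)")
con.execute("CREATE TABLE prod.monthly (uwi TEXT, month TEXT, oil_bbl REAL)")
con.execute("INSERT INTO wells.header VALUES ('42-001-00001', 'Well A')")
con.execute("INSERT INTO prod.monthly VALUES ('42-001-00001', '2013-05', 1250.0)")

# In a virtualization layer a 'power user' would publish this join as a
# reusable, mostly read-only view; here we simply run the federated query.
query = """
    SELECT h.uwi, h.name, m.month, m.oil_bbl
    FROM wells.header AS h
    JOIN prod.monthly AS m USING (uwi)
"""
for row in con.execute(query):
    print(row)   # ('42-001-00001', 'Well A', '2013-05', 1250.0)
```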

Ian Barron (RoQC Data Management) asked, ‘Is Petrel data management still an oxymoron?’ Schlumberger’s Petrel interpretation flagship has been ‘well received by users but reviled by the data management community.’ The problem is that with Petrel, it is too easy to get data in and copy it around. Soon, nobody knows which is the correct version of the truth. The current dogma is that all Petrel items need quality flags and metadata, but no user does this.

However, things are changing as Schlumberger gears up for ‘serious’ data management. Current ‘straw man’ functionality allows standards to be embedded in Petrel deployments, helping to find and fix non-compliant data. In Petrel Studio, standards can be broadcast to local projects. Third party tools such as Blueback’s project tracker can help de-dupe project data and check coordinate reference systems. RoQC’s eponymous toolset further enhances data audit, looking for unlikely or missing values. Barron concluded that Petrel data management has come a long way in the past year and foresaw further improvement in the very near future.
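
The flavor of such rule-based auditing, hunting for missing metadata and unlikely values, can be sketched in a few lines of Python. The fields, rules and thresholds below are invented illustrations, not RoQC’s or Blueback’s actual checks.

```python
# Illustrative rule-based audit of project data items: flag missing metadata
# and 'unlikely' values. Field names and thresholds are invented, not RoQC's.
items = [
    {"name": "WELL_A_TOPS", "crs": "ED50 / UTM 31N", "kb_elev_m": 32.0, "quality_flag": "checked"},
    {"name": "WELL_B_TOPS", "crs": None, "kb_elev_m": -999.25, "quality_flag": None},
]

def audit(item):
    issues = []
    if not item.get("crs"):
        issues.append("missing coordinate reference system")
    if item.get("quality_flag") is None:
        issues.append("no quality flag set")
    kb = item.get("kb_elev_m")
    if kb is None or kb < -100 or kb > 200:    # crude 'unlikely value' range
        issues.append(f"unlikely KB elevation: {kb}")
    return issues

for item in items:
    for issue in audit(item):
        print(f"{item['name']}: {issue}")
```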

The rapid ‘factory drilling’ approach to unconventional development in the US is impacting data management, as Tiandi Energy’s Richard Ward outlined. Working for Hess Corporation, Tiandi has developed a streamlined approach (a.k.a. the data factory) to gathering and consolidating legacy data, much of which has been given a new lease of life as source rocks are re-evaluated for their reservoir potential. Unconventional legacy data is a ‘back to the future’ problem—the Bakken shale was first drilled in the 1920s. Core data is of particular interest to unconventional operators—for total organic carbon (TOC) evaluation and rock mechanical studies. Rather than seeking to capture all this diverse information in ‘the mother of all databases,’ Tiandi has developed a simple transmission standard so that teams working on core data records can capture the required information, which is then consolidated and loaded to Perigon’s iPoint. The workflow allows Tiandi to process ‘hundreds or thousands’ of wells per month. The actual standard transmission format may be as simple as an Excel spreadsheet.
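
A stripped-down view of such a transmission standard is an agreed set of column headings plus a consolidation script, as in the sketch below. The column names and CSV format are assumptions for illustration; Tiandi’s actual template may be an Excel workbook and the load into Perigon’s iPoint is not shown.

```python
# Sketch of a 'data factory' consolidation step: validate per-team core data
# files against a minimal column standard and merge them for bulk loading.
# Column names are invented; the real template may be an Excel workbook.
import csv
import glob

REQUIRED = ["uwi", "depth_top_ft", "depth_base_ft", "toc_wt_pct", "analysis_lab"]

def consolidate(pattern="core_data_*.csv"):
    rows, rejects = [], []
    for path in glob.glob(pattern):
        with open(path, newline="") as f:
            reader = csv.DictReader(f)
            missing = [c for c in REQUIRED if c not in (reader.fieldnames or [])]
            if missing:
                rejects.append((path, f"missing columns: {missing}"))
                continue
            rows.extend(reader)
    return rows, rejects

if __name__ == "__main__":
    rows, rejects = consolidate()
    print(f"{len(rows)} core records consolidated, {len(rejects)} files rejected")
    # A subsequent step would load the consolidated records into iPoint.
```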

Daljit Singh described a fairly common situation at Petronas Carigali, whose legacy production information management system (PIMS) was showing its age. A decade after its introduction, PIMS users were reporting discrepancies between reconciled string and sand production volumes and between OFM and PIMS production figures. A 2010 data quality study found that many application modules were out of date, that data ownership was unclear and that many users still preferred Excel. Moreover, management was overconfident in data quality. Petronas has now developed a ‘data doctor’ audit tool to pinpoint data anomalies. The data doctor was developed with Oracle SQL/PLSQL, JavaScript and HTML. Petronas’ legacy data has now been cleansed and the tool is used on current flow data in a continuous data quality management effort. The ultimate aim is a single central database, with managed data ownership and standards and a dedicated data quality team to eliminate data errors at source.
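
The kind of check the data doctor performs can be illustrated with a toy reconciliation of string-level against sand-level volumes. The real tool runs in Oracle SQL/PLSQL; the Python below is only a sketch with invented names, figures and tolerance.

```python
# Toy version of a 'data doctor' check: reconcile string-level production
# against the sum of its sand-level allocations. Names, figures and the
# tolerance are invented; the real tool runs in Oracle SQL/PLSQL.
string_volumes = {"A-01L": 1500.0, "A-02S": 820.0}      # bbl/month per string
sand_volumes = {
    "A-01L": {"Sand-1": 900.0, "Sand-2": 590.0},        # sums to 1490, off by 10
    "A-02S": {"Sand-3": 820.0},
}

TOLERANCE = 0.005   # 0.5 percent relative discrepancy allowed

for string, reported in string_volumes.items():
    allocated = sum(sand_volumes.get(string, {}).values())
    if reported and abs(reported - allocated) / reported > TOLERANCE:
        print(f"{string}: string volume {reported} vs sand total {allocated} "
              f"({abs(reported - allocated):.1f} bbl discrepancy)")
```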

Fred Schwering described how Talisman Energy was ‘breaking down the silos’ between geosciences and engineering with Landmark’s DecisionSpace collaborative well planning application. DecisionSpace needs feeding with data—surface data from ArcGIS, subsurface G&G data and drilling specs for well path planning. A rather complex workflow combines GIS data into a ‘feasibility layer’ showing possible pad locations. Geoscience interpretation from Petrel and drilling specs are imported via OpenSpirit, using OpenWorks as a staging database. Once everything is in the same place, DecisionSpace generates ‘really accurate’ drilling-ready well plans. The approach supports expensive, complex unconventional operations. DecisionSpace ‘knowledge nuggets,’ ad hoc wellbore annotations, are used to flag anticipated drilling hazards.
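
The feasibility layer idea, overlaying surface constraints to retain candidate pad locations, can be illustrated on a toy grid. The constraints and values below are invented and the sketch stands in for, rather than reproduces, the ArcGIS/DecisionSpace workflow.

```python
# Toy 'feasibility layer': overlay surface constraints on a small grid and
# keep cells that pass all of them as candidate pad locations. Constraints
# and grid values are invented; not the ArcGIS/DecisionSpace workflow itself.
GRID = 5   # 5 x 5 cells

slope_pct     = [[2, 3, 8, 12, 4], [1, 2, 3, 9, 5], [2, 2, 2, 3, 3],
                 [6, 4, 2, 1, 1], [9, 7, 3, 2, 2]]
in_floodplain = [[0, 0, 1, 1, 0], [0, 0, 0, 1, 0], [0, 0, 0, 0, 0],
                 [1, 0, 0, 0, 0], [1, 1, 0, 0, 0]]
road_dist_km  = [[0.5, 0.8, 1.2, 2.0, 2.5], [0.4, 0.6, 1.0, 1.8, 2.2],
                 [0.3, 0.5, 0.9, 1.5, 2.0], [0.6, 0.7, 1.1, 1.6, 2.1],
                 [1.0, 1.2, 1.4, 1.9, 2.4]]

def feasible(i, j):
    return (slope_pct[i][j] <= 5            # buildable terrain
            and not in_floodplain[i][j]     # outside exclusion zone
            and road_dist_km[i][j] <= 1.5)  # close enough to infrastructure

candidates = [(i, j) for i in range(GRID) for j in range(GRID) if feasible(i, j)]
print("candidate pad cells:", candidates)
```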

Ernie Ostic gave a thinly veiled commercial for IBM’s InfoSphere metadata workbench as a means of tracking information ‘lineage.’ In general, the value of information degrades with time as crucial details of its provenance may be lost. This becomes critical in the event of a safety incident—an HSE report may be available, but is it up to date and authoritative? Data lineage needs to be ‘ingrained’ in the enterprise culture. InfoSphere empowers users to capture lineage and reduce information latency.
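
A bare-bones notion of lineage capture, attaching source, timestamp and upstream documents to a report so that its freshness and provenance can at least be questioned, might look like the sketch below. This is not InfoSphere’s data model; class and field names are invented.

```python
# Bare-bones lineage record: who produced a document, from what, and when,
# so 'is it up to date and authoritative?' can at least be asked of the data.
# Not IBM InfoSphere's data model; names and the 90-day rule are invented.
from dataclasses import dataclass, field
from datetime import datetime, timedelta

@dataclass
class LineageRecord:
    document: str
    source_system: str
    produced_at: datetime
    derived_from: list = field(default_factory=list)   # upstream documents

    def is_stale(self, max_age_days=90):
        return datetime.now() - self.produced_at > timedelta(days=max_age_days)

report = LineageRecord(
    document="HSE_incident_report_2013-04.pdf",
    source_system="document management system",
    produced_at=datetime(2013, 4, 12),
    derived_from=["field_inspection_log_2013-04.xls"],
)
print(report.document, "stale:", report.is_stale())
```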

Petrosys’ Volker Hirsinger reported on a thorough test drive of various cloud-based data storage options available to smaller multinationals. While the regular internet is OK for smaller documents, sharing large SEG-Y files or large databases is harder to do without an IT-supported WAN. Cloud-based storage is an attractive proposition. But not all clouds are equal. They may not behave as advertised and some lack an intuitive interface. SharePoint repositories can be hard to set up and have proved unstable. The ‘cloud’ may be a complex ecosystem of multiple stakeholders making for multiple potential points of failure. Public clouds like Dropbox are much faster than private clouds but may have file size restrictions. Legal issues and security are further concerns. In the end, Petrosys uses multiple cloud-based workflows in different contexts—along with FTP and USB/disk data transfer. More from PNEC.

© Oil IT Journal - all rights reserved.