PNEC Data Integration Conference 2011 Houston

Highlights from this year's PNEC—Paradigm beyond the ‘gold’ data store, RasGas’ ‘DMX’ data management accelerator, managing Saudi Aramco’s petabyte disks, Shell’s data distribution engines, Petrosys’ data audits, Idea Leadership Company on managing data ‘cultures,’ Exco’s IT transformation, Shell’s spatialization program and Petronas’ ‘PiriGIS’ information system.

Judging from the number of presentations at the 15th Petroleum Network Education Conferences’ (PNEC) Data Integration and Information Management event in Houston last month, Landmark and Petris have cornered the market for data management application software. In-house software development continues apace, mostly focused on customizations of PPDM. Noah/Hess and Schlumberger/CDA made valiant attempts to put a ‘dollar value’ on data management.

Paradigm’s Jean-Claude Dulac thinks that industry needs to go beyond the ‘gold’ data repository. Why should we expect a ‘single version of the truth’ when all our data has errors and uncertainties? The reality is that measurement is imprecise, and that picks may differ between interpreters. Deciding which is ‘right’ and approving one source over another makes establishing a gold data store contentious and costly. Uncertainty is everywhere—in measured data, in processing and in modeling parameters. Dulac advocates the use of Bayesian probabilities: ‘We need to change the data management paradigm—instead of using manpower to qualify data, use the computer to find all the versions of the truth, and to find out what data most affects the outcomes.’ This enables the interpreter to focus on the most relevant information. Probabilities need to be analyzed in the review process. Everyone needs to think in terms of probabilities and review probabilistic maps, tops and log computations. Dulac suggests that ‘gold plus probability equals a platinum repository.’ The Q&A sparked an enthusiastic debate on the possibility of extending PPDM to manage uncertainty.
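
By way of illustration, here is a minimal sketch of the probabilistic approach Dulac advocates, assuming independent Gaussian pick errors and a flat prior, in which case the posterior is simply the inverse-variance weighted average of the versions. The function and numbers are ours, not Paradigm’s.

```python
# Combine several interpreters' picks of the same formation top instead
# of electing a single 'gold' value. Depths and sigmas are invented.

def combine_picks(picks):
    """Fuse (depth, sigma) pick versions into a posterior (mean, sigma).

    Independent Gaussian errors and a flat prior make the posterior
    mean the inverse-variance weighted average of the picks.
    """
    weights = [1.0 / sigma ** 2 for _, sigma in picks]
    total = sum(weights)
    mean = sum(w * depth for w, (depth, _) in zip(weights, picks)) / total
    return mean, total ** -0.5

picks = [(2134.0, 3.0), (2139.5, 5.0), (2131.0, 8.0)]  # (m, 1-sigma m)
depth, sigma = combine_picks(picks)
print(f"posterior top: {depth:.1f} m +/- {sigma:.1f} m")
```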

Mark Priest from Qatari LNG operator RasGas explained how the company is transitioning from development to operations. The transition includes a data management acceleration project (DMX) which aims for ‘no data left behind’ after the development phase ends. DMX also sets the scene for surveillance-based operations. Early in the project, it was realized that RasGas had ‘many more schematics than wells,’ ‘inconsistent’ formation tops and ‘a million files scattered around the place in non-communicating systems.’ All this in the face of a projected 80-plus years of field life.

The project is not technology driven; the idea is to make DMX sustainable with built-in data quality checks, security and a desire to minimize the time that subject matter experts are expected to devote to the project. Data ‘judo’ was used to gain control over Excel, preserving the tool’s power for the end user while eliminating its use as a data repository. RasGas sees data quality as improving with use as there are ‘more eyes on the data.’ The project is expected to deliver efficiency gains of around 4,000 person-hours per year. RasGas shareholder ExxonMobil provided advice to the project. Questioned on the ‘solvability’ of the data management problem, Priest noted that 20 years ago, data management was an ‘office function’ at best. But thanks to the efforts of PNEC’s Phil Crouse and others, ‘We are beginning to get a handle on this. Information is power and is a key differentiator.’ Priest wants future RasGas engineers to look back on DMX and say, ‘Wow, we are standing on the shoulders of giants!’

Saudi Aramco’s seismic data volumes are currently growing at around 45% per year. Jawad Al-Khalaf reported that today, 50,000-trace acquisition is commonplace and the trend is for seismic interpretation on pre-stack data. Aramco’s Expec data center had around 7.5 petabytes online last year—this has now grown to 9 petabytes, posing ‘a big challenge for IT.’ Paradoxically, storage space is underutilized—with about 1.5 PB unused. Too much data is mirrored, making for higher costs in terms of floor space, energy, maintenance and backup. ‘Terabytes are cheap, administration and usage is where the costs are.’ Aramco is looking to more data compression and deduplication where appropriate—and for better use of performant disks across applications including its Disco (seismic processing) and Powers (flow modeling) tools.
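
On the deduplication front, the basic technique is simple enough to sketch: files whose contents hash identically need only one physical copy online. The sketch below is our illustration, not Aramco’s tooling, and the root path is invented.

```python
# Group files by SHA-256 of their contents; any group with more than
# one member is a deduplication candidate.
import hashlib
import os
from collections import defaultdict

def find_duplicates(root):
    """Map content digest -> list of paths, keeping only duplicates."""
    groups = defaultdict(list)
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            digest = hashlib.sha256()
            with open(path, "rb") as f:
                for chunk in iter(lambda: f.read(1 << 20), b""):
                    digest.update(chunk)
            groups[digest.hexdigest()].append(path)
    return {d: ps for d, ps in groups.items() if len(ps) > 1}

for digest, paths in find_duplicates("/data/seismic").items():
    saved = sum(os.path.getsize(p) for p in paths[1:])
    print(f"{digest[:12]}: {len(paths)} copies, {saved:,} bytes reclaimable")
```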

Aramco is working on breaking down workflows into processes, applications and data types (PAD) and documenting its business processes. This has been achieved for disk-hungry applications including Matlab, Paradigm’s GeoDepth, Hampson-Russell, GeoFrame, OpenWorks and Petrel. There is a 2.3 petabyte workspace for seismic data in Disco. Targeting and cleaning up the big disk users resulted in an immediate 250 terabyte space saving. Aramco is now working on data mirroring, native Oracle compression and Documentum disk optimization. A data life cycle policy has been developed for each PAD. The company is also evaluating ‘housekeeping’ tools to automate life cycle policy execution, optimize disk use and storage management. The intent is to develop or buy a robust storage management system to help arbitrate what is needed online and offline, and how data should flow from SAN systems for processing, on to NAS for longer-term use and eventually offline.
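
By way of illustration, a minimal sketch of the kind of life-cycle rule such a housekeeping tool might execute, recommending a storage tier from a file’s last-access age. The thresholds, tier names and path are invented; a real policy would be tuned per PAD.

```python
# Recommend a storage tier from last-access age: SAN for active data,
# NAS for longer-term use, offline beyond that. Thresholds are invented.
import os
import time

DAY = 86400

def recommend_tier(path, hot_days=90, warm_days=365):
    """Return 'SAN', 'NAS' or 'offline' based on last-access age."""
    age_days = (time.time() - os.path.getatime(path)) / DAY
    if age_days < hot_days:
        return "SAN"      # active processing workspace
    if age_days < warm_days:
        return "NAS"      # longer-term online use
    return "offline"      # candidate for tape/archive

# e.g. recommend_tier("/data/segy/survey_2004.sgy") might return "offline"
```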

Hector Romero outlined Shell’s journey from well data stores to ‘data distribution engines.’ Shell’s corporate data challenge is to manage the ‘official’ version of a log along with competitor data of variable quality, and to serve it all up within the constraints of entitlements. Back in 2000 Shell’s data situation was ‘complex.’ By 2005 data management solutions were implemented around Landmark’s Corporate Data Store (CDS) and Petris’ Recall. Shell now has around five million wells in the CDS and is developing standards, naming conventions and processes for data audit, vendor management, search, project building and quality. The system serves data of ‘known,’ rather than ‘best,’ quality. Users can judge whether it is fit for purpose. Connectors between the CDS and Recall have been developed with help from Landmark and Petris. Shell’s ‘data distribution engines’ serve data to projects from the corporate stores. Data cleansed by users can be written back to the CDS. Most recently Shell is leveraging Recall LogLink to ‘auto push’ data from Recall to OpenWorks. A second data path uses Landmark’s Advanced Data Transfer tool (ADT) to populate OpenWorks from the CDS. Shell’s users like what has been done and want more, in particular an expansion of the data footprint to Landmark’s Engineering Data Model to add casing points, perforations etc. Romero is now also trying to bring in information from Shell’s plethoric production data sources. Unstructured data such as well reports, scout tickets and links to documents makes for something of a grey area. Here Shell uses Landmark’s PowerExplorer to fire off complex queries leveraging Shell’s Portal and EP Catalog.
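
In outline, a distribution engine serves an entitlement-filtered subset of the corporate store to a project database, with cleansed data flowing back. The sketch below illustrates the pattern generically; the store layout, field names and entitlement rule are our inventions, not the Recall LogLink or ADT mechanics.

```python
# Serve only entitled wells from a corporate store to a project database.
# Store layout, UWIs and the entitlement rule are all illustrative.

def distribute(corporate_store, project_db, entitled_areas):
    """Copy wells the project is entitled to into its working database."""
    for well in corporate_store:
        if well["license_area"] in entitled_areas:
            project_db[well["uwi"]] = well  # project-local working copy

corporate = [
    {"uwi": "42-501-20001", "license_area": "BLOCK-A", "quality": "known"},
    {"uwi": "42-501-20002", "license_area": "BLOCK-B", "quality": "known"},
]
project = {}
distribute(corporate, project, entitled_areas={"BLOCK-A"})
print(sorted(project))  # only the entitled well reaches the project
```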

Volker Hirsinger described how Petrosys has been working with the Victoria (Australia) Department of Primary Industries on ‘tracking change and handling diversity in a database.’ Many apparently static items in a database actually change over time. Such information can be captured in a PPDM well version table which has audit history, data source and interpreter fields. This provides a minimal form of change tracking. Petrosys has extended PPDM to allow a full audit history. A typical use might be generating a ‘single version of the truth’ from different source databases. Here Petrosys has developed a method of promoting a ‘best version’ to the corporate database.
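
A minimal sketch of the idea, with invented field names rather than actual PPDM columns: each version row carries provenance and audit fields, and a ‘best’ version is promoted by a source-preference rule, falling back to recency.

```python
# Each version row carries provenance/audit fields; a 'best' version is
# promoted by source preference, then recency. Field names are invented.
import datetime
from dataclasses import dataclass

@dataclass
class WellVersion:
    uwi: str
    surface_lat: float
    surface_lon: float
    source: str                        # originating database or vendor
    interpreter: str                   # who created this version
    row_changed_date: datetime.date    # audit timestamp

def promote_best(versions, preferred_sources):
    """Pick the version to promote: preferred source first, then newest."""
    rank = {s: i for i, s in enumerate(preferred_sources)}
    return min(versions, key=lambda v: (rank.get(v.source, len(rank)),
                                        -v.row_changed_date.toordinal()))

versions = [
    WellVersion("42-501-20001", 29.7604, -95.3698, "vendor_a", "jsmith",
                datetime.date(2010, 3, 14)),
    WellVersion("42-501-20001", 29.7605, -95.3699, "corporate", "akhan",
                datetime.date(2009, 7, 2)),
]
print(promote_best(versions, ["corporate", "vendor_a"]))
```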

John Eggert, who heads up The Idea Leadership Company, has put his finger on a data management pain point: the communications ‘gap’ between management and technologists. This ‘social’ problem stems from the fact that techs lack people skills and are convinced that management does not understand what they do. Non-techs, on the other hand, have unrealistic expectations and like to take ‘data-free’ decisions. Co-author Scott Tidemann (Petrosys) has developed a program to help, focused on communicating around data management: ‘Communication skills impact project success as much as technical skills.’ Structured approaches like PDMP project management and PRINCE2 can help too.

Rob Thomas described Exco’s IT transformation in support of its shale gas exploration effort. Shale gas wells come in fast and furious and mandate efficient IT systems. Analysis performed by co-author Jess Kozmann (CLTech) located Exco in a data maturity matrix and identified a ‘change trajectory’ for improvement. This involved a move away from Excel/Access usage toward better data links and standardized tools. The outcome is that Exco now has portal-based access to its 10 terabytes of drilling and completion data, has a handle on its application portfolio and has embarked on a three-year project to develop a corporate PPDM-based upstream data store.

Roy Martin reported on how Shell has spatialized data in its CDS and OpenWorks repositories. Spatialization involves transforming positional data into a GIS representation, usually in ArcSDE or a file geodatabase. Shell has formalized and streamlined the process in a dedicated global project. Landmark’s PowerHub, PowerExplorer Spatializer and Safe’s FME toolset were used with ‘minimal customization.’
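
At its simplest, spatialization maps tabular positions into GIS geometries. The sketch below emits well-known text (WKT) points to stay dependency-free; production pipelines such as Shell’s write to ArcSDE or a file geodatabase via the tools named above. The well record is invented.

```python
# Turn tabular well locations into WKT point geometries ready for
# loading into a GIS. The well record below is invented.

def spatialize(wells):
    """Yield (uwi, WKT point) pairs from well surface locations."""
    for w in wells:
        yield w["uwi"], f"POINT ({w['lon']:.6f} {w['lat']:.6f})"

wells = [{"uwi": "42-501-20001", "lon": -95.3698, "lat": 29.7604}]
for uwi, wkt in spatialize(wells):
    print(uwi, wkt)   # 42-501-20001 POINT (-95.369800 29.760400)
```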

Zukhairi Latef described how Petronas has maximized the value of its subsurface data with GIS integration using an ESRI ArcSDE/ArcGIS Server with raster imagery in ERDAS Imagine. The PiriGIS system has its own spatial data model (a PPDM/ESRI blend) with EPSG geodetics and ISO TC211 metadata. Spatializing tools include OpenSpirit and Safe’s FME. Data integration has proved key to Petronas’ exploration success. More from www.oilit.com/links/1106_39.

© Oil IT Journal - all rights reserved.