PNEC Data Integration Conference, Houston

Although some companies are at what one observer described as the ‘Alfred E. Neuman’ stage of data management (Excel and shared drives), most are making some headway. We report on data initiatives from Saudi Aramco, Highmount, Marathon and Shell, on Teradata’s brave new data world, and on new developments from Noah Consulting, Schlumberger, LogicalCat and Halliburton. We also hear two ‘boomers’ expound on ‘generational prejudice’ and how to entice Gen-Y into the E&P fold.

Phil Crouse’s PNEC Data Integration Conference got a respectable turnout of around 250 attendees from 22 countries. Not bad in the face of the recession and the swine flu epidemic. As we reported last month, PNEC saw the commercial launch of Petris’ WINDS OneTouch E&P ‘knowledge portal.’ Petris’ technology got a pretty good endorsement from Saudi Aramco’s Turki Al-Ghamdi, who described deployment in the context of lifecycle seismic data management. Aramco has analyzed the seismic workflow in terms of a long-term asset cycle (LTAC) that embeds shorter-term operational cycles from acquisition through processing and interpretation. The LTAC approach is improving communication between the ‘producer’ and ‘consumer’ cycles. LTAC includes a data governance process, reports, QC and an authoritative data store. Underpinning LTAC is Petris’ WindsEnterprise, described as a ‘plug-in,’ metadata-driven, vendor-independent integration platform. The system complements Aramco’s Oracle and Documentum environments, avoiding ‘costly migration from legacy systems.’ The ‘data services solution’ (DSS), conceived by Aramco, deploys ‘smart’ business objects which allow new data types and workflows to be added on the fly. Al-Ghamdi concluded by recommending a division of labor whereby ‘the vendor handles the technology and the company handles the business.’
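
The notion of metadata-driven business objects can be pictured with a short sketch. The following Python fragment is purely illustrative, not Petris’ or Aramco’s actual API: a new data type is declared as metadata in a registry, and records are validated against that declaration rather than against hard-coded classes.

    # Hypothetical sketch of a metadata-driven registry in the spirit of the
    # 'smart' business objects described above. Class and field names are
    # invented for illustration.
    from dataclasses import dataclass, field

    @dataclass
    class BusinessObjectType:
        name: str                                        # e.g. 'SeismicSurvey'
        attributes: dict = field(default_factory=dict)   # attribute -> type name
        source_system: str = ""                          # e.g. 'Documentum', 'Oracle'

    class ObjectRegistry:
        """Registry that lets new data types be added 'on the fly' as metadata."""
        def __init__(self):
            self._types = {}

        def register(self, obj_type: BusinessObjectType):
            self._types[obj_type.name] = obj_type

        def validate(self, type_name: str, record: dict) -> bool:
            # A record is valid if it supplies every attribute the metadata declares.
            declared = self._types[type_name].attributes
            return all(attr in record for attr in declared)

    registry = ObjectRegistry()
    registry.register(BusinessObjectType(
        name="SeismicSurvey",
        attributes={"survey_name": "str", "acquisition_year": "int", "crs": "str"},
        source_system="Documentum"))
    print(registry.validate("SeismicSurvey",
                            {"survey_name": "S-123", "acquisition_year": 2008, "crs": "EPSG:4326"}))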

Tina Warner described Highmount E&P’s ‘Incentive’ data environment, which couples an IHS PIDM database with data QC from Schlumberger’s Innerlogix unit. The corporate PIDM repository receives nightly updates ‘pushed’ through the firewall that respect in-house ‘preferred’ data. Innerlogix’ data QC tools automate quality data delivery to users, replacing a previously inefficient manual process. Innerlogix has turned the PIDM database into a ‘trusted system of record’ and provided the business units with tools and a strategy for data management. Data flows from PIDM 2.5 to an OpenWorks master and on to Petra and OpenWorks projects. Warner noted some ‘friction points,’ such as inadvertently overwriting data that had already been ‘fixed.’ Highmount estimated that the cost of a failure to implement would have been around $3 million a year. The project took around nine months to implement.
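
The ‘preferred data’ behavior Warner described can be illustrated with a small sketch. The following Python is a hedged approximation, assuming a simple per-attribute preferred flag; the attribute names are invented and the actual PIDM/Innerlogix mechanics are not shown.

    # Nightly vendor updates are merged into the corporate repository, but
    # attributes flagged as in-house 'preferred' are left untouched.
    def merge_nightly_update(master: dict, vendor: dict, preferred_attrs: set) -> dict:
        """Return the updated master record for one well."""
        merged = dict(master)
        for attr, value in vendor.items():
            if attr in preferred_attrs:
                continue            # keep the in-house 'preferred' value
            merged[attr] = value    # otherwise accept the vendor update
        return merged

    master = {"uwi": "42-123-45678", "spud_date": "2008-03-01", "operator": "Highmount"}
    vendor = {"uwi": "42-123-45678", "spud_date": "2008-03-02", "operator": "HIGHMOUNT E&P"}
    print(merge_nightly_update(master, vendor, preferred_attrs={"operator"}))
    # spud_date takes the vendor value, operator keeps the in-house preferred value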

Petrosys’ Alec Kelingos noted that ‘data is worthless if you don’t know where it is located.’ The oil and gas industry lags in terms of coordinate reference system (CRS) management. V16 of Petrosys’ mapping package comes bundled with the European Petroleum Survey Group’s ‘well-known text’ rendering and unique IDs. All 200 Petrosys client sites have been spidered to identify the CRSs in use. Kelingos recommends using a cross-discipline team of geodesists, IT and users. Pitfalls abound: units of measure, round-off errors and novel, undefined CRSs. Petrosys’ CRS crawler provides an audit report. ‘Spatial needs a unique data management strategy.’
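
As an aside, the EPSG ‘well-known text’ and unique-ID plumbing can be illustrated with an open-source library. The sketch below uses pyproj (not Petrosys’ own code) to resolve a CRS definition to its EPSG code and WKT; a definition that resolves to no EPSG authority would be flagged as one of the ‘novel undefined’ cases Kelingos warned about.

    # Small CRS audit helper, roughly in the spirit of the crawler described above.
    from pyproj import CRS

    def audit_crs(crs_text: str) -> dict:
        """Resolve a CRS definition and report its EPSG code and WKT, if any."""
        crs = CRS.from_user_input(crs_text)
        authority = crs.to_authority()   # e.g. ('EPSG', '32614'), or None if unidentified
        return {
            "input": crs_text,
            "epsg": authority[1] if authority else None,   # None -> 'novel undefined CRS'
            "name": crs.name,
            "wkt": crs.to_wkt(),
        }

    for candidate in ["EPSG:4326", "+proj=utm +zone=14 +datum=NAD27 +units=m"]:
        report = audit_crs(candidate)
        print(report["epsg"], report["name"])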

Two ‘boomers,’ Schlumberger’s Richard Johnston and Dwight Smith, did a good job of stepping into the shoes of ‘Gen-Y’ to report on ‘generational prejudices’ and a future world where collaboration (at a distance) is a given. Combining Twitter, YouTube and resources such as the Schlumberger Expert Directory might allow multi-million-dollar prospects to be developed on Facebook from a cell phone! The ‘millennials’ will see a paradigm shift from our map and hierarchy-based displays to multi-dimensional displays, artificial intelligence agents, shared search and automatic translation.

Hector Romero presented Shell E&P Co.’s well log data environment, which builds on another Petris product, the Recall petrophysical lifecycle data management system. Header data comes from Shell’s corporate data store to a Recall staging database for edit and curve selection, and then on to OpenWorks, Techsia’s Techlog and the Recall master. Shell’s ‘EPICURE’ quality rules are applied en route. A variety of reformatting strategies allow log data to be browsed in Landmark’s PowerExplorer, and a lot of work has gone into automating data loading. Some Shell units handle thousands of new curves per month. Shell is now working on tight integration between Techlog and Recall.
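
To give a flavor of what staging-time quality rules look like, here is a minimal Python sketch. The checks and thresholds are invented for illustration and are not Shell’s actual EPICURE rules.

    # Curve-level checks applied before a log curve is promoted from staging.
    def check_curve(curve: dict) -> list:
        """Return a list of quality findings for one log curve before loading."""
        findings = []
        if not curve.get("mnemonic"):
            findings.append("missing mnemonic")
        if curve.get("units") not in {"m", "ft", "API", "ohm.m", "us/ft"}:
            findings.append(f"unrecognised units: {curve.get('units')}")
        values = curve.get("values", [])
        nulls = sum(1 for v in values if v is None or v == -999.25)
        if values and nulls / len(values) > 0.5:
            findings.append("more than 50% null values")
        return findings

    curve = {"mnemonic": "GR", "units": "API",
             "values": [45.2, 60.1, -999.25, 72.3]}
    print(check_curve(curve) or "curve passes staging checks")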

Niall O’Doherty gave another enthusiastic plug for Teradata’s visionary environment for seismic data. O’Doherty envisages the development of a ‘Google Earth for seismic data’ leveraging Teradata’s spatially registered data structure and a new logical data model. The benefit would be an environment that supports both analytics and data management, pushing the boundary such that users could ask any question any time. O’Doherty advocates a ‘data refinery’ with a redefined demarcation of what is done in the database vs. what is done in the application. Teradata’s poster child is eBay, which uses Teradata to process its 50TB/day incremental data flow. To get traction, Teradata has a lot of legacy systems and ideas to displace. O’Doherty noted that, ‘In the mind of an expert there are many established ideas. But in the mind of a beginner there may be a few new ones!’
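
The ‘data refinery’ demarcation is easiest to see in code: the heavy aggregation is expressed in SQL and runs inside the database, and the application only sees the summary. The sketch below uses SQLite as a stand-in for Teradata, with hypothetical table and column names.

    # In-database aggregation: the scan and GROUP BY happen in the engine,
    # the application receives only one summary row per survey.
    import sqlite3

    QUERY = """
    SELECT survey_id,
           COUNT(*)     AS n_traces,
           AVG(rms_amp) AS mean_rms,
           MAX(rms_amp) AS peak_rms
    FROM   trace_attributes
    WHERE  inline BETWEEN ? AND ?
    GROUP BY survey_id
    """

    def survey_summary(conn, inline_min, inline_max):
        return conn.execute(QUERY, (inline_min, inline_max)).fetchall()

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE trace_attributes (survey_id TEXT, inline INTEGER, rms_amp REAL)")
    conn.executemany("INSERT INTO trace_attributes VALUES (?, ?, ?)",
                     [("S1", 100, 1.2), ("S1", 101, 1.5), ("S2", 100, 0.9)])
    print(survey_summary(conn, 100, 200))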

Shari Bourgeois outlined how Marathon’s ‘Midas’ well master data solution was developed, leveraging HP’s ‘IQM’ data quality methodology. The Midas ‘golden’ database includes workflows for AFE, peer review, rig on site and well spud. Midas uses database triggers to initiate workflows of ‘mindboggling’ complexity. Marathon had a ‘three-day fist fight’ over thorny questions such as ‘what is a well?’ VP sponsorship was necessary to convince skeptical asset teams of the project’s value.
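
A trigger-initiated workflow of the kind Bourgeois described can be sketched in a few lines. The example below uses SQLite as a stand-in for the corporate RDBMS and an invented schema; it is not Marathon’s Midas design, merely the pattern: an update to the well master enqueues a workflow for downstream processing.

    # An update that records a spud date fires a trigger which queues the
    # 'well spud' workflow for downstream jobs to pick up.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
    CREATE TABLE well (uwi TEXT PRIMARY KEY, name TEXT, spud_date TEXT);
    CREATE TABLE workflow_queue (uwi TEXT, workflow TEXT,
                                 queued_at TEXT DEFAULT CURRENT_TIMESTAMP);

    CREATE TRIGGER on_well_spud AFTER UPDATE OF spud_date ON well
    WHEN OLD.spud_date IS NULL AND NEW.spud_date IS NOT NULL
    BEGIN
        INSERT INTO workflow_queue (uwi, workflow) VALUES (NEW.uwi, 'well_spud');
    END;
    """)

    conn.execute("INSERT INTO well (uwi, name) VALUES ('42-123-45678', 'Example 1')")
    conn.execute("UPDATE well SET spud_date = '2009-05-01' WHERE uwi = '42-123-45678'")
    print(conn.execute("SELECT uwi, workflow FROM workflow_queue").fetchall())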

Noah Consulting’s Shannon Tassin proposed an enterprise architecture for business intelligence and real-time information quality management. This addresses the problem of multiple data silos and real-time data that is ‘stuck’ in the operational historian. Noah advocates a ‘rationalized metadata repository,’ a.k.a. the ‘glue that binds.’ A further recommendation is for federated master data management (MDM) rather than a single, centralized solution.
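
One way to read the ‘glue that binds’ idea is as a cross-reference registry that maps each silo’s local identifiers onto a master identifier while leaving the silos’ data where it is. The Python below is a hedged sketch of that federated pattern, with invented system and entity names rather than Noah’s actual design.

    # Federated-style registry: silos keep their records, the repository holds
    # only the cross-references between local IDs and a master ID.
    class CrossReferenceRegistry:
        def __init__(self):
            self._xref = {}   # (system, local_id) -> master_id

        def link(self, system: str, local_id: str, master_id: str):
            self._xref[(system, local_id)] = master_id

        def master_id(self, system: str, local_id: str):
            return self._xref.get((system, local_id))

        def siblings(self, master_id: str):
            """All (system, local_id) pairs that refer to the same real-world entity."""
            return [key for key, mid in self._xref.items() if mid == master_id]

    registry = CrossReferenceRegistry()
    registry.link("historian", "TAG-7701", "WELL-0001")
    registry.link("well_master", "42-123-45678", "WELL-0001")
    print(registry.siblings("WELL-0001"))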

Rail Atay Kuliev (Saudi Aramco) observed that, with around 10 million bopd production, ‘a percentage point is a lot.’ Aramco’s production data was previously stored in multiple, isolated legacy systems. Correcting bad data was complex and no real-time data was available. Now production data is served centrally from a web-based ‘multisystem’ for use by engineers and management. Real-time data is now accessible from centralized production allocation and includes reliable on/off time information. A single field was used as a pilot in 2007, which boosted confidence in the system. Subsequent projects were segregated into phases and areas. The same team was used throughout the life of the project, which helped with knowledge sharing.
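
The value of reliable on/off time information is easy to show with a toy allocation. The sketch below allocates a day’s field production to wells in proportion to test rate times uptime; the figures and method are illustrative only, not Aramco’s allocation procedure.

    # Toy production allocation weighted by test rate and uptime.
    def allocate(field_volume: float, wells: list) -> dict:
        """wells: list of dicts with 'name', 'test_rate' (bopd) and 'uptime' (0..1)."""
        weights = {w["name"]: w["test_rate"] * w["uptime"] for w in wells}
        total = sum(weights.values())
        return {name: field_volume * wt / total for name, wt in weights.items()}

    wells = [{"name": "A-1", "test_rate": 5000, "uptime": 1.0},
             {"name": "A-2", "test_rate": 5000, "uptime": 0.5}]   # down half the day
    print(allocate(6000, wells))   # A-1 gets 4000 bopd, A-2 gets 2000 bopd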

Jess Kozman, formerly with Schlumberger and now heading up Carbon Lifecycle Technology Consulting, outlined the state of play with Schlumberger’s analysis of oil and gas companies’ data management ‘maturity.’ Until relatively recently, ‘mid-tier’ US oil companies did not really have a data ‘issue.’ But they sure do now, with hundreds of PCs and complex data workflows. Analysis with Schlumberger’s data management maturity model shows some at the Level 1 stage, a.k.a. ‘Alfred E. Neuman data management!’ They likely deploy Excel and shared drives and have no best practices. Progress up the data maturity matrix is possible but often interrupted as a small, lower-maturity company is swallowed up by a larger, higher-maturity acquirer. Schlumberger found that in the couple of years following a merger, from 20 to 70% of data management staff were ‘lost.’ Kozman noted that the next round of M&A will focus on PC-based companies, which will be forced into more structured data management environments. In the Q&A it emerged that Schlumberger has not yet convinced any majors to do enterprise-level surveys, so the data set is skewed away from the supermajors. Outside of oil and gas, the medical, military and intelligence sectors have more data and higher maturity.

Bryan Hughes’ LogicalCat software startup is engaged in ‘fighting entropy with vendor-neutral search.’ LogicalCat’s technology targets mid-size companies, which may not have OpenWorks and likely only have PCs. They are unlikely to have a master database and may in fact have a couple of hundred de facto systems of record! Projects (SMT, Petra, GeoGraphix, SeisWare) make up the fundamental ‘document.’ As such they can, with the right technology, be indexed and ‘googled.’ Shapefiles remain the de facto GIS standard. Hughes likens the approach to the ‘e-discovery’ of the legal profession. The result is near real-time reporting that can be tied into tools like Spotfire for analytics.
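
The ‘project as document’ idea can be prototyped with a simple crawler and inverted index. The Python below is a rough sketch, not LogicalCat’s product: it walks a file tree, treats files with assumed project or shapefile extensions as documents, and indexes the tokens in their names for search.

    # Walk the disk, treat recognised project files and shapefiles as documents,
    # and build a simple inverted index over tokens in their names.
    import os
    from collections import defaultdict

    PROJECT_MARKERS = {".shp", ".gxp", ".ksp"}   # assumed extensions, illustrative only

    def build_index(root: str) -> dict:
        index = defaultdict(set)   # token -> set of document paths
        for dirpath, _dirs, files in os.walk(root):
            for fname in files:
                if os.path.splitext(fname)[1].lower() in PROJECT_MARKERS:
                    doc = os.path.join(dirpath, fname)
                    for token in fname.lower().replace("_", " ").replace("-", " ").split():
                        index[token].add(doc)
        return index

    def search(index: dict, query: str) -> set:
        hits = [index.get(t, set()) for t in query.lower().split()]
        return set.intersection(*hits) if hits else set()

    # e.g. search(build_index("/data/geoscience"), "eagle ford")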

Charlotte Burress described the migration of Halliburton’s Baroid drilling chemicals unit’s legacy knowledge management community of practice (CoP) to a Microsoft Office SharePoint Server (MOSS) environment. Each Baroid product has its own community where members can ask questions or learn on demand. Baroid’s ‘KM 1.0’ environment was built with Plumtree Portlets. Baroid wanted to move away from this top-down, ‘busy’ portal of links, images and collaboration tools towards a ‘Web 2.0’ mentality à la Wikipedia, MySpace, LinkedIn or Flickr. The process involved a two-way conversion, of the site itself and of the users. The spec called for enhanced usability and the ability to find anything in less than two seconds. Burress emphasized that ‘content management is a process, not a software package.’ The development was done on ‘vanilla’ MOSS, not the high-end ‘Enterprise’ edition with more bells and whistles.

This article is an abstract from The Data Room’s Technology Watch from the 2009 PNEC. More information and samples from www.oilit.com/tech.


© Oil IT Journal - all rights reserved.