PNEC Data and Information Integration—Amsterdam and Houston

This report combines coverage of two PNEC conferences held earlier this year in Amsterdam and Houston and brings highlights of presentations from Shell, ConocoPhillips, Anadarko, RasGas and Pioneer. As the oils ramp up their service-oriented architecture (SOA) efforts, a call is made for independent guidance from a third party, an 'OASIS for oil and gas.'

The Amsterdam and Houston PNEC meetings totaled over fifty papers and panel discussions so the following is an eclectic report of some highlights. Sushma Bhan’s presentation covered Shell’s holistic approach to R&D knowledge management (KM). Collecting and serving information from Shell’s ‘walking encyclopedias’ is not always easy. But the effort is worth the trouble as R&D holds much of the company’s knowledge. The holistic approach to KM involves R&D portals, technical publishing, Livelink, the E&P catalog and Shell’s Sitescape community of practice (COP). The COP is managed by subject matter experts and global coordinators. Shell holds monthly seminars where researchers present the fruits of their studies. ‘Legacy knowledge’ is captured from ex-chief geologists co-opted to the R&D team. Technology support is provided by Autonomy, MetaCarta, the Flare Catalog and Invention Machine’s GoldFire semantic search.

Perfect search

Alessandro Allodi noted that Shell's PDO unit produces more information in a day than a person can read in a lifetime. Shell is searching ... for the perfect search engine! The (somewhat intractable) problem involves a balance between recall (ensuring all relevant documents are retrieved) and precision (the relevancy of what is returned). Allodi compared keyword-based and Boolean querying and ranking methods, where precision remains a problem. Tagging and metadata are an excellent though expensive approach. GIS is great, but Google's ranking techniques may be less appropriate in the corporate environment. On the horizon (two to three years out) we can expect improvement as technology embeds heuristics and information theory. The heuristic approach is already deployed in MetaCarta's GIS search engine.
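To make the trade-off concrete: precision is the share of retrieved documents that are relevant, while recall is the share of relevant documents that are retrieved. A minimal sketch in Python (illustrative only, not Shell's code):

```python
def precision_recall(retrieved, relevant):
    """Compute precision and recall for one query.

    retrieved -- set of document ids returned by the engine
    relevant  -- set of document ids actually relevant to the query
    """
    hits = retrieved & relevant  # true positives
    precision = len(hits) / len(retrieved) if retrieved else 0.0
    recall = len(hits) / len(relevant) if relevant else 0.0
    return precision, recall

# Returning more documents tends to raise recall at the expense of precision.
p, r = precision_recall(retrieved={1, 2, 3, 4, 5}, relevant={2, 5, 9})
print(f"precision={p:.2f} recall={r:.2f}")  # precision=0.40 recall=0.67
```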

Chesapeake

Irina Tucker described Chesapeake's migration of its geoscientists' role assignments and operational hierarchy from a plethora of Excel spreadsheets to a GIS-based information system. Chesapeake's 'Team Table' now offers map-based management of teams, audit trails and historical data showing how the company was organized in the past, offering interesting possibilities for correlating organizational strategy with the bottom line. Team Table applications include Hyperion System 9 master data management and ESRI GIS.
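Historical queries of this kind typically rely on date-effective records. A minimal sketch of the idea, with names and fields invented for illustration (this is not Chesapeake's schema):

```python
# Hypothetical date-effective assignment records, showing how a 'Team Table'
# can answer who covered which team on any given date.
from datetime import date

assignments = [
    {"person": "J. Doe", "team": "Barnett",
     "start": date(2005, 1, 1), "end": date(2006, 6, 30)},
    {"person": "J. Doe", "team": "Fayetteville",
     "start": date(2006, 7, 1), "end": None},
]

def team_on(person, when):
    """Return the team a person was assigned to on a given date."""
    for rec in assignments:
        if rec["person"] == person and rec["start"] <= when \
                and (rec["end"] is None or when <= rec["end"]):
            return rec["team"]
    return None

print(team_on("J. Doe", date(2005, 12, 1)))  # Barnett
```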

ConocoPhillips

Dede Schwartz outlined ConocoPhillips' Alaska unit's integration of its geoscience, drilling and production processes into a 'neutral' Oracle database. The Alaska Technical Database (ATDB) holds 7,900 wells and is growing at around 100 wells per year. The vendor-neutral ATDB is a single point of access and a control point for well identifiers. Schwartz described the unique identifier as a huge benefit in bringing different systems together. The ATDB acts as a hub for data transfer between applications, allowing apps, which are 'not eternal,' to retire or upgrade gracefully. The ATDB also houses value-added data, which is otherwise often isolated and hard to integrate. The ATDB was initially loosely based on Finder. Data extraction is performed by 'the duct tape guy' whose Perl scripts 'can extract data from anything.' According to Schwartz, the secret to good data management is, 'Just say no to Excel!'
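The benefit of a single control identifier is easiest to see as a mapping from each application's local well id to the authoritative one. A hypothetical sketch, with all names invented (not the ATDB schema):

```python
# Hypothetical sketch of a hub keyed on a unique well identifier (UWI).
atdb_wells = {
    "UWI-0001": {"name": "EXAMPLE WELL 1", "spud": "2006-03-14"},
}

# Each application keeps its own local id; the hub maps them to the UWI.
app_id_map = {
    ("geoscience_app", "G-778"): "UWI-0001",
    ("drilling_app", "DRL-2231"): "UWI-0001",
}

def resolve(app, local_id):
    """Translate an application-local well id to the authoritative UWI."""
    return app_id_map[(app, local_id)]

# Data moving between applications joins on the UWI, so an app can be
# retired without orphaning its well records.
print(atdb_wells[resolve("drilling_app", "DRL-2231")]["name"])  # EXAMPLE WELL 1
```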

Anadarko

John Pomeroy described Petris' work on streamlining Anadarko's well data delivery. The initial Recall deployment in 2003 was a success, but a large ongoing drilling program swamped the operator, compromising data quality. Data was being delivered piecemeal to multiple stakeholders and it was hard to ensure that digital data sets were clearly identified. Direct delivery to asset teams 'escaped' corporate data management. Petris' solution rationalized and automated data delivery to a unified central source. Service companies log onto Anadarko's website for the correct well identifier before upload. Anadarko's 3 TB database now holds data from 300,000 wells. The Recall Autoloader was used to merge huge data sets from acquisitions. The solution is deemed SOX-compliant as acquisition artifacts are captured. 'Back door' data management has been eliminated and consistent naming implemented.
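One way to picture the identifier lookup step is as a gate on incoming deliveries. A hypothetical sketch, with identifiers and naming convention invented for illustration:

```python
# Hypothetical pre-upload check: the service company looks up the
# authoritative identifier before a log is loaded.
AUTHORITATIVE_WELLS = {"42-201-30255": "EXAMPLE UNIT 14-3"}  # invented entry

def validate_delivery(well_id, filename):
    """Reject a delivery whose well identifier is not on the corporate list."""
    if well_id not in AUTHORITATIVE_WELLS:
        raise ValueError(f"{filename}: unknown well identifier {well_id}")
    return f"{well_id}_{filename}"  # consistent, identifier-first naming

print(validate_delivery("42-201-30255", "run1_density.las"))
```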

RasGas

Qatari LNG exporter RasGas has developed a production surveillance database, as described by Brian Richardson. The project started badly with a vendor that failed to deliver the goods, at which point RasGas opted for an in-house development with help from The Information Store (iStore). iStore leveraged the OSIsoft PI API to turn the raw historian data into something that was 'understandable for users.' PI data now streams into Oracle tables, with overnight unit of measure (UOM) conversions and the like. The system has replaced multiple Excel spreadsheets, introduced standard units and achieved a complex aggregation of tag information. RasGas is now planning 'date-effective' tag management for historical analysis.
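The overnight UOM pass amounts to applying a standard conversion to each tag's readings before they are stored. A minimal sketch, with tags and factors chosen for illustration (not RasGas/iStore code):

```python
# Illustrative unit of measure conversions applied to historian readings
# before they land in relational tables.
UOM_CONVERSIONS = {
    ("degF", "degC"): lambda v: (v - 32.0) / 1.8,
    ("psi", "kPa"): lambda v: v * 6.894757,
}

def normalize(value, source_uom, target_uom):
    """Convert a raw historian value to the standard unit for storage."""
    if source_uom == target_uom:
        return value
    return UOM_CONVERSIONS[(source_uom, target_uom)](value)

row = ("TRAIN1.SEP.TEMP", normalize(212.0, "degF", "degC"))
print(row)  # ('TRAIN1.SEP.TEMP', 100.0)
```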

The IM elephant

Duncan Stanners described Shell Canada's first attempt to tame the IM 'elephant' as a complete failure. The early Livelink EDMS implementation led to multiple repositories and copies of data. The feeling was that users couldn't find what they needed and didn't know what was there. Shell's information specialists recognized the problem but lacked a telling argument to sell an IM improvement project to management. With help from Flare consultant Alan Bayes, Stanners developed a 'risk assessment matrix' plotting the likelihood of an incident against the severity of the consequences. Rather than focusing on cost savings, Shell looked at real events. In one oil sands project Shell failed to put the correct protocols in place governing its relationship with the engineering contractor, resulting in two truckloads of paper being delivered and 'costing millions to sort out!' The risk matrix persuaded management to devote $4 million to solving the problem. A Shell analysis supported the IM framework, putting people first, then process, then tools. Shell also noted that one size does not fit all: IM support for a deepwater well is different from that required in a Peace River CBM development involving 200 identical wells. Flare's E&P Catalog was deployed to manage access to Livelink. The key learning? Standard reference data is king!
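A risk assessment matrix of this kind scores each scenario as likelihood times severity; high scores make the case for investment. A minimal sketch with assumed scales (not the actual Shell/Flare matrix):

```python
# Illustrative three-by-three risk matrix: likelihood of an incident
# against the severity of its consequences.
LIKELIHOOD = {"rare": 1, "possible": 2, "likely": 3}
SEVERITY = {"minor": 1, "serious": 2, "major": 3}

def risk_score(likelihood, severity):
    """Score = likelihood x severity; higher scores argue for IM spend."""
    return LIKELIHOOD[likelihood] * SEVERITY[severity]

# An event like the oil sands paper-handling incident sits in the
# high-scoring corner of the matrix.
print(risk_score("likely", "major"))  # 9
```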

Pioneer

Pioneer, according to Carol Tessier, is betting on web services for data delivery. Users want to pull more and more disparate data together; G&G want to see engineering, financial and land data. Expectations are growing, driven by Web 2.0 consumer mash-ups and RSS feeds. With help from IHS and Schlumberger, Pioneer is moving to data services, applications and portal-based data search. Web services mean that 'the IT standards wars are over.' The POSC/Energistics 'meta catalog' is the key to integration. This approach offers 'more flexibility than a data warehouse.' Pioneer's One Map application (developed by Schlumberger) plugs into IHS Enerdeq and PI data as well as collecting data from Landmark's TOW and Pioneer's financial system. Autonomy search is also provided as a web service. Tessier suggested that more industry involvement in this activity would be beneficial, with an independent third party to provide guidance, in other words an 'OASIS for oil and gas.'
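A portal built on data services boils down to fetching from each service and merging on a common well identifier. A minimal sketch with placeholder endpoints (the real Enerdeq and partner APIs are not reproduced here):

```python
# Sketch of portal-style aggregation over data services. The endpoints are
# placeholders, invented for illustration.
import json
from urllib.request import urlopen

SERVICES = {
    "production": "https://example.com/production/wells",  # placeholder URL
    "land": "https://example.com/land/wells",              # placeholder URL
}

def fetch_well(uwi):
    """Pull one well's records from each service and merge them on its id."""
    merged = {"uwi": uwi}
    for name, base in SERVICES.items():
        with urlopen(f"{base}/{uwi}") as resp:  # assumes a JSON response
            merged[name] = json.load(resp)
    return merged
```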

Miscellaneous

David Archer described Petris' work for Saudi Aramco on information management for large, long-lived oil and gas assets. This leverages Petris' Dynamic Common Model in an 'Authoritative Data Store' of application-independent business objects. The platform is accessible through web services which assure lossless data movement. Guy Holmes (SpectrumData) warned of the technology gap between new Fibre Channel-based storage media and legacy SCSI. This is shaping up to be 'a right royal mess.' The issue of what data to keep, and what to refresh and remaster, exercised speakers at the Amsterdam panel session. For some, the cost of deciding what to keep was greater than the cost of just copying the lot! One major's retention policy was signed off by the CEO; unused interpretation data is destroyed according to the retention policy.

This report is a short version of a Technology Watch report from The Data Room—more from info@oilit.com.

© Oil IT Journal - all rights reserved.