6th PNEC Data Integration Conference

Phil Crouse’s 6th PNEC Data Integration, e-Commerce and Data Management conference, held in Houston, was probably one of the best yet from the technical standpoint. BP, Shell, Marathon, Conoco and Saudi Aramco all presented insightful papers on their data management and IT. No project is complete these days without a new XML-based ‘language’ – we spotted new MLs from Innerlogix and PDVSA, and three from PPDM! Panel discussions revealed that there is still a lot to be said about data management – indeed, much of what was said echoed the early PNEC debates. Has anything changed? Is data management a lost cause, destined to remain under-supported and under-funded? Or, as we manage exponentially growing volumes of data, maybe we are doing something right! As someone said – ‘it works!’

Web seismic delivery

Greg Hess (Kelman) thinks that although the Internet failed to fulfill its promise during the ‘false dawn’ of the dot-com era, the systematic application of IT focused on the end user can change the way we work. We are moving towards an ‘e-topia’ where we will have more free time! Such nirvana is underpinned by intelligent data management software that ‘learns’ from user workflows and anticipates needs by pre-loading data into the cache. Standards ‘are like a toothbrush – we all have one, but nobody wants to use someone else’s.’ Kelman is working with PetroWeb to offer seismic data delivery to the desktop. Seismic line ‘ends and bends’ are loaded into the portal, offering remote browsing of Kelman’s 9 TB near-line seismic archive.
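
By way of illustration, here is a minimal sketch of the pre-loading idea Hess describes – a cache that records which item a user requests after another and warms itself with the most likely successor. The paper does not describe Kelman’s actual implementation; the class, names and toy fetch function below are our own assumptions.

```python
from collections import Counter, defaultdict

class PrefetchingCache:
    """Hypothetical predictive cache: learns request sequences from the
    user's workflow and pre-loads the most likely next item."""

    def __init__(self, fetch):
        self.fetch = fetch                      # callable: item_id -> data
        self.cache = {}                         # item_id -> data
        self.successors = defaultdict(Counter)  # item_id -> Counter of next requests
        self.last = None

    def get(self, item_id):
        # Record the observed transition from the previous request.
        if self.last is not None:
            self.successors[self.last][item_id] += 1
        self.last = item_id
        if item_id not in self.cache:
            self.cache[item_id] = self.fetch(item_id)
        self._prefetch(item_id)
        return self.cache[item_id]

    def _prefetch(self, item_id):
        # Warm the cache with the item most often requested next.
        ranked = self.successors[item_id].most_common(1)
        if ranked and ranked[0][0] not in self.cache:
            self.cache[ranked[0][0]] = self.fetch(ranked[0][0])

# Toy usage: the fetch function stands in for a near-line archive read.
cache = PrefetchingCache(fetch=lambda line_id: f"<traces for {line_id}>")
cache.get("line-001"); cache.get("line-002"); cache.get("line-001")
```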

Portal of portals

Joe Kostecka presented Marathon’s use of PetroWeb as a ‘portal of portals.’ The dual-level browser has PetroWeb as the (PPDM-based) repository of metadata describing information stored in the ‘second-level’ portals. The development leverages ESRI’s SDE and Exprodat/Landmark’s Web OpenWorks (WOW). IHS Energy data is hosted from Denver and accessed through a P2000 browser. Another new browser from Exprodat, ‘Web Geolog,’ is used to access this non-Oracle repository. Other components include the Tobin Land Suite. A PetroWeb link to Open RSO provides access to ‘bar-coded stuff’ in the file room and to under-utilized assets such as studies and other documents. Data must be ‘sanitized,’ but the effort is worthwhile.

Wireless production data

Guido Urdaneta (Zulia University, Venezuela) described PDVSA’s test bed for remote, wireless access to production data. The system allows field engineers to access SCADA and other data from a mobile phone or PDA. Real-time data from OLE for Process Control (OPC) sources is translated into an XML-derived Simple Display Definition Modeling Language (SDDML).
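
The SDDML schema itself is not reproduced in the paper; as a hedged illustration, the sketch below turns OPC-style tag readings into a small XML display document of the general kind described. All element and attribute names are invented.

```python
import xml.etree.ElementTree as ET
from datetime import datetime, timezone

def to_display_xml(readings):
    """Translate (tag, value, unit) readings into an illustrative,
    SDDML-like XML display document for a phone or PDA."""
    root = ET.Element("display", device="pda",
                      generated=datetime.now(timezone.utc).isoformat())
    for tag, value, unit in readings:
        item = ET.SubElement(root, "reading", tag=tag, unit=unit)
        item.text = str(value)
    return ET.tostring(root, encoding="unicode")

# Example: two real-time readings as they might arrive from an OPC source.
print(to_display_xml([("WHP-102", 1850.4, "psi"), ("FLOW-7", 412.0, "bbl/d")]))
```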

Web Archive Manager

Bruce Rodney presented Exprodat’s work on Conoco’s ‘Gold Database.’ No commercial software met Conoco’s requirements. Conoco is talking to Landmark about including a WOW-compatible project archival module within OpenWorks. Conoco and Exprodat will offer users selection of all data from a single web browser – the Web Archive Manager.

Finder best practices

Jairo Freyre explained that, following BP’s takeover of Amoco, Arco Permian and Vastar, several massive well datasets required merging. The ‘Best Process’ of automated matching of 12- and 14-digit API numbers generated over 1 million ‘Best Wells’ from 2 million input wells. The project involved customization of the Finder toolset with Best forms and Best code. Data from GES, GCS, IHS, Energy Graphics and the MMS were merged successfully. The paper gives considerable detail on the Finder customizations performed.
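
The paper details the Finder-specific ‘Best’ customizations; the generic step they rest on is normalizing API well numbers to a common key before grouping. A minimal sketch, with invented sample data (a 14-digit API adds a two-digit event code to the 12-digit number):

```python
from collections import defaultdict

def api_key(api, digits=12):
    """Strip punctuation from an API well number and truncate to a
    common length, so 12- and 14-digit forms of the same well match."""
    clean = "".join(ch for ch in api if ch.isdigit())
    return clean[:digits]

def merge_wells(sources):
    """Group well records from several vendor datasets by API key,
    yielding one candidate 'best well' group per key."""
    groups = defaultdict(list)
    for source, records in sources.items():
        for rec in records:
            groups[api_key(rec["api"])].append((source, rec))
    return groups

# Two vendors, same well, different API formatting: one group results.
wells = merge_wells({
    "vendor_a": [{"api": "42-501-20123-00-00", "name": "SMITH 1"}],
    "vendor_b": [{"api": "425012012300", "name": "SMITH #1"}],
})
```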

Information Integration

Schlumberger’s ‘Noble Goal’ is to provide one official answer to a query, along with its known risk. Nagib Abusalbi admits, ‘We are not there yet.’ Current tools offer access to a range of data sources, along with as much contextual and provenance information as possible. This lets users view data and fix problems – the idea being that the more data gets seen and used, the more it gets fixed. Abusalbi described available solutions in terms of an integration spectrum – from integration, through visualization, to consolidation into a single data store. Schlumberger’s solution offers uniform access to data in a variety of information sources, but to achieve this ‘we need to work together.’

Loose Data Integration

Dag Heggelund (Innerlogix) defines loose integration as integration that does not require modifications to existing software. It currently costs around $250,000–$300,000 to develop an interface to a data store. To reduce such costs, Innerlogix has developed an XML-based Extensible Data Connectivity Language (XDCL) for loose integration. XDCL promises no-code development of data drivers: it can retrieve lists of primary data objects within a data store, and allows further drill-down by building a table of contents for each primary data source.
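
XDCL itself is not published in the paper, so the sketch below illustrates the ‘no-code driver’ idea with an invented descriptor: the XML declares which tables and fields constitute the primary data objects, and one generic routine serves any store so described. Tags, attributes and the SQLite demo are all assumptions.

```python
import sqlite3
import xml.etree.ElementTree as ET

# Invented connectivity descriptor in the spirit of XDCL.
DESCRIPTOR = """
<datastore name="wells_db">
  <object name="well" table="well">
    <field name="uwi"/><field name="well_name"/>
  </object>
</datastore>
"""

def list_objects(conn, descriptor_xml):
    """Generic 'driver': reads the descriptor and returns the primary
    data objects it declares -- no store-specific code required."""
    contents = {}
    for obj in ET.fromstring(descriptor_xml).iter("object"):
        cols = ", ".join(f.get("name") for f in obj.iter("field"))
        rows = conn.execute(f"SELECT {cols} FROM {obj.get('table')}").fetchall()
        contents[obj.get("name")] = rows
    return contents

# Demo against a throwaway in-memory store.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE well (uwi TEXT, well_name TEXT)")
conn.execute("INSERT INTO well VALUES ('42501201230000', 'SMITH 1')")
print(list_objects(conn, DESCRIPTOR))
```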

Data Exchange

Rick Taylor demonstrated the new PPDM XML-based data exchange format by generating an XML document from a PPDM 3.6 database in Microsoft Access. Queries are built with a variety of freeware tools such as Java ‘Cool Beans,’ Xerces and Xalan. This work has resulted in several ‘languages’ for modeling business associate, seismic metadata and product information.
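
The exchange documents themselves are defined by PPDM; purely as an illustration of the export step, here is a sketch that reads a simplified, invented business associate table and wraps each row in XML – the real PPDM 3.6 table and the published exchange schema are more elaborate.

```python
import sqlite3
import xml.etree.ElementTree as ET

def export_business_associates(conn):
    """Wrap each row of a simplified business associate table in XML.
    Table, column and element names are illustrative, not the
    published PPDM 3.6 model or exchange schema."""
    root = ET.Element("business_associates")
    query = "SELECT ba_id, ba_name FROM business_associate"
    for ba_id, name in conn.execute(query):
        ba = ET.SubElement(root, "business_associate", id=ba_id)
        ET.SubElement(ba, "name").text = name
    return ET.tostring(root, encoding="unicode")

# Demo with an in-memory stand-in for the Access database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE business_associate (ba_id TEXT, ba_name TEXT)")
conn.execute("INSERT INTO business_associate VALUES ('BA-1', 'Acme Oil')")
print(export_business_associates(conn))
```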

ASP

A panel of representatives from Petris, Landmark and Schlumberger discussed the future of Application Service Provision (ASP), which has lost some of its shine in the wake of the dot-com bust. The technology does have applications – perhaps in data mining and clean-up. Stakeholder issues must be considered – data vendors, software developers, oil companies and so on. One problem is that many companies are reluctant to see their data go off-site, though BP in Aberdeen and PDVSA are counter-examples. One foreign user complained that Internet performance was not up to supporting ASP.

Recall Spatial

Mairead Boland (Brunei Shell) outlined Brunei Shell’s deployment of Recall ‘Spatial’ as its corporate well data management tool. Data management procedures and pre-built templates for frequently used logs underpin the new strategy. QC audits check that all data required for a plot is available. Recall Spatial offers browsing, data management and extensive plotting capabilities. Recall was demonstrated running on a Linux laptop.

BP

Greg Jones (BP) stated that as recently as 1999, BP had no worldwide standards for data, but rather a plethora of different data stores – OpenWorks, the visualization ‘Hives,’ Merak Peep, Geolog, IHS’ P2000 and Landmark’s DSS. BP now has an approved data management strategy – the ‘Virtual Managed Data Environment.’ In May 2000 several BP personnel attended the 4th PNEC Data Integration Conference and came away convinced of the reality of web-based access and data management.

Shell E&P Co.

Shell E&P uses OpenWorks, SeisWorks and its own 123DI interpretation software. In 2000 Shell commissioned from Schlumberger a gap analysis of data management software from PGS (PetroBank), GeoQuest (Finder) and Landmark (OpenWorks). Cora Poché revealed that for Shell, PetroBank was ‘the best’; it is already used by Norsk Shell and Expro in Aberdeen. Poché called on companies to work together on a seismic data clean-up project, along the lines of the MMS’ Gulf of Mexico well data clean-up initiative. Shell is working to unify work processes for loading data and to synchronize PetroBank with its IDC LogBase. Shell links its units and the MDS over a dedicated 100 Mbps network. Software delivery is a mixture of on-site installation and ASP hosting by GrandBasin. All Shell data is to be moved to GrandBasin and loaded on demand.

Saudi Aramco

James Richard told how Saudi Aramco’s data managers have become ‘confused’ by POSC, PPDM, Geoshare and other ‘solutions.’ Meanwhile users complain that finding and accessing data is still an issue, as is data quality. Aramco’s analysis determined that the problems derived from poor workflow architecture – an ‘uncontrolled’ workflow ran from data producers to consumers. The solution is to cut the direct link from producer to consumer and to ensure that data is managed centrally. The first ‘SeisServer,’ controlling the seismic workflow in the processing and interpretation space, was demonstrated in 1999. It was ‘very successful’ and is now being extended.

KM Shell E&P Co.

For Gayle Holzinger, we are in a ‘river of information’ – ‘Knowledge Management is about giving folks canoes and compasses!’ Successful KM requires systematic implementation. Shell uses LiveLink and SiteScape for document management, discussion and remote collaboration. Shell KM projects include a project lifecycle document management initiative, a petrophysical ‘vortal,’ a producing operations community, an operations work environment and ‘a plethora’ of other content-driven KM pilots.

Federating Databases

BP inherited a complex environment through its recent acquisitions. Chris Legg outlined BP Houston’s search for a replacement for Amoco’s DataVision and the ex-BP/Arco EXSCI. BP aimed to offer GIS access to a federation of databases. After internal investigation and discussions with other oil companies, BP narrowed the field to Finder, Petrosys’ dbMap and ‘one other’ web-based tool. dbMap was chosen over Finder because it runs on both PC and Unix and its output map quality ‘was better.’ dbMap is now undergoing a staged rollout across the US.

Mergers and data management

Flemming Rolle suggested that companies should prepare themselves for mergers by getting rid of in-house developed software. Mairead Boland disagreed – Shell develops its own software ‘because it does a good job.’ Jan Hay (citing Oil IT Journal) drew attention to the emerging ISO document management standard – proper implementation of records management would save huge costs in a merger. The thorny issue of outsourcing was raised – in-house development ‘is no longer an option.’ Bob Decker suggested that there were perhaps too many standards organizations, and that choosing and implementing a standard was consequently getting harder. Con Caris said the situation was exactly the same in the Australian mining industry. Chris Bearce ventured that, despite the difficulty, ‘the whole thing is actually working – even though it’s stressful.’ Boland was more positive – asset integration and HSE issues will force better DM practices. Bill Baski suggested that we are moving forward – from data models to service companies able to provide a seismic ‘dial tone’ from a plug in the wall. Chuck Rinehart agreed with Boland, saying we should sell data management on ROI, not on cost. Tom Anderson concluded that senior management ‘does not and never will care about data management’ – it will be ‘a long slog through the mud.’

This article is abstracted from a 10 page report on the PNEC Data Integration Conference produced as a part of The Data Room’s Technology Watch Service. For more information on this service send mail to tw@oilit.com.

© Oil IT Journal - all rights reserved.