SMi Data and Information Management 2003

Around 60 delegates attended the fifth SMi E&P Data and Information Management conference. Shell’s Discovery data clean-up program continues to dominate the UK data management agenda, with an update from Erik van Kuijk and an insight into how security issues are being managed from Richard Mapleston. Aspects of the Discovery work are being recycled (through POSC and the UK DTI) into (yet?) another joint industry-government data initiative. The UK Pilot Data Initiative sets out to reduce the time for ‘hot’ data to move into the public domain and to simplify the location of, and access to, data. Somewhere between Discovery and Pilot, the technologists have spotted an opportunity to promote ‘web services’ – direct computer-to-computer interaction. Flare Consultants apply risk analysis to investment in information management, while a presentation from Instant Library looked into disaster recovery in IT and document stores. Exprodat continues to develop its GIS-focused approach to E&P data access.

Stuart Robinson reported back from the 4th National Data Repository meeting held in Stavanger last year. Robinson defines a National Data Repository (NDR) as a place “where data sets can be shared amongst partners, regulatory bodies and other interested parties”. Various rationales for establishing an NDR have been invoked – cost reduction, national archival and the requirement to attract new entrants. Robinson asked whether NDRs deliver business value – a debatable point. Strict business value in terms of cost savings may be hard to achieve, but may not really be necessary in view of the greater overall benefits. A one-size-fits-all approach definitely does not work for NDRs. Differences in data release legislation, data ownership, culture and oil province maturity make for different objectives and approaches. Oil company users should realize that the NDR is here to stay and will benefit those who ‘get involved’.

The Pilot Data Initiative

CDA MD Malcolm Fleming described the UK Government’s Pilot Data Initiative, established last year to develop a data access, storage and National Archive model for UK data. Participants include the DTI, UKOOA, BGS, CDA, service companies and consultants. The project sets out to resolve ‘deficiencies’ in existing data legislation and management – in general, there is ‘confusion around ownership, rights, obligations and liabilities of license data’. The solution integrates existing repositories (DEAL, CDA) and proposes a new National Archive. The resulting ‘life-cycle’ model envisages holding data in a repository during the ‘active phase’ of its lifecycle. Upon license relinquishment the data moves into a National Archive, to be maintained by the BGS. This move relieves licensees of their obligation to keep data ‘in perpetuity.’ Concomitant with this program is an overall reduction in the release period from the current five years to four. A trial using data from Kerr-McGee’s Hutton field is currently underway.
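By way of illustration, the life-cycle model can be sketched as a simple state transition. The names and the release-period check below are illustrative assumptions, not part of the Pilot specification:

```python
from dataclasses import dataclass
from enum import Enum


class Phase(Enum):
    ACTIVE = "active"      # held in an existing repository (CDA/DEAL) during the license
    ARCHIVED = "archived"  # moved to the BGS-maintained National Archive


@dataclass
class LicenseDataSet:
    name: str
    acquisition_year: int
    phase: Phase = Phase.ACTIVE

    def relinquish_license(self) -> None:
        # On relinquishment, data moves to the National Archive, relieving
        # the licensee of the obligation to keep it 'in perpetuity'.
        self.phase = Phase.ARCHIVED

    def is_public(self, current_year: int, release_period: int = 4) -> bool:
        # Release period reduced from five years to four under the initiative.
        return current_year - self.acquisition_year >= release_period
```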

Landmark

Jon Lewis regards IT as a lever for value creation. For Landmark, the hosting of data and application software represents a new business model, as witnessed by Landmark’s new UK data portal ukcsdata.com. When operational, the portal will provide access to multiple commercial and other data sources. It will be of particular benefit to new entrants, supplying comprehensive data through new leasing arrangements, lowering barriers to entry and broadening the market. Deployed in-house, hosting centers become ‘hubs’ or, in Shell’s terminology, ‘MegaCenters’. The latest release of Landmark’s Surf and Connect leverages ESRI’s technology to simultaneously view Norwegian wells in DISKOS, UKCS wells out of CDA and in-house data in GeoFrame or OpenWorks. ENI now deploys a Landmark portal in-house at its Milan-based hosting center, supporting workers in Europe and Kazakhstan with some 140 application types. Lewis believes that companies the size of ENI and Shell are ‘a market unto themselves’ – with the critical mass required for internal implementation.
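Such simultaneous multi-store viewing amounts to federated data access. A minimal sketch of the idea follows – the interface and function names are hypothetical, not Landmark’s actual API:

```python
from typing import Dict, List, Protocol


class WellSource(Protocol):
    """Common interface a portal might impose on heterogeneous stores."""
    def list_wells(self, area: str) -> List[str]: ...


def federated_view(sources: Dict[str, WellSource], area: str) -> Dict[str, List[str]]:
    # Query each registered store (e.g. DISKOS, CDA, OpenWorks) and
    # return one merged view, keyed by the store the wells came from.
    return {name: src.list_wells(area) for name, src in sources.items()}
```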

Shell

Erik van Kuijk – head of Shell Expro’s Discovery data clean-up project – retraced the philosophy that Shell’s North Sea unit has developed around data management. The tenets of van Kuijk’s approach are the ‘KID’ (Knowledge, Information, Data) continuum and ‘entropy’ – the notion that management involves ‘cooling down’ data and business processes from chaotic initial states. This approach has now been extended to ‘software portfolio entropy’, where van Kuijk notes that “the consistency and connectivity of every software portfolio deteriorate over time unless effort is spent”. Expro has rationalized its software portfolio into loosely coupled domains (subsurface, wells, production etc.). These are progressively being linked together into ‘active workflows’ which are constrained and coupled through the use of a consistent catalogue of data attributes. Catalogues are ‘hard-wired’ into the workflows so that users ‘don’t have to worry about the consistent terminology’. There is nonetheless an element of prescription – people will be ‘forced to adopt and their behavior measured’.
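A minimal sketch of what ‘hard-wiring’ a catalogue into a workflow step might look like – the attribute names are assumed examples, not Shell’s actual catalogue:

```python
# Attribute names approved in the corporate catalogue (assumed examples).
APPROVED_ATTRIBUTES = {"well_name", "spud_date", "total_depth"}


def publish_to_workflow(record: dict) -> dict:
    # Reject any record using terminology outside the catalogue, so
    # downstream workflow steps see only consistent attribute names.
    unknown = set(record) - APPROVED_ATTRIBUTES
    if unknown:
        raise ValueError(f"Not in the corporate catalogue: {sorted(unknown)}")
    return record
```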

Paras

Alan Smith outlined Paras’ new ONCE concept of ‘one-time’ data management: new ways of working and consistent, correct data – all of it economic and efficient! The business driver behind ONCE is the aging workforce, which means that more is being asked of fewer people. Paras has applied the ONCE methodology in a project for Premier Oil, where various domains have benefited from the ONCE rationalization. HR reporting, for instance, has been streamlined, and information is now available through a ‘self-service’ web portal. Similar rationalization is in progress for production reporting, where ‘spreadsheet-based’ reporting has been eliminated.

ExxonMobil

Dawn De La Garza described how ExxonMobil is ‘learning to swim in a sea of data’ – nuggets of knowledge are being lost in an ocean of data. For ExxonMobil, the answer is to capture key interpretations and to archive or delete unnecessary data. Standards are important too: De La Garza cited issues with different workstation formats, but sees help coming from projects such as POSC’s Practical Well Log initiative and WITSML for rig site data – a ‘great effort’. ExxonMobil’s Upstream Technical Computing organization was formed to deliver and support a standard technology system offering seamless, integrated technical IT for all upstream professionals. ExxonMobil personnel can now work on projects irrespective of their location. Procedures, people and technology have been standardized, and central services are leveraged for data entry and loading. While there are no data management ‘silver bullets,’ standards can be the ‘silver lining’ – facilitating partner, vendor and service company interaction.
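WITSML packages rig site data as XML for computer-to-computer exchange. The fragment below is a simplified, hypothetical illustration of the idea; the real WITSML schema uses namespaces and a far richer element set:

```python
import xml.etree.ElementTree as ET

# Simplified, WITSML-style fragment (illustrative only).
RIG_SITE_XML = """
<wells>
  <well uid="w-001">
    <name>21/10-5</name>
    <operator>ExampleCo</operator>
  </well>
</wells>
"""

root = ET.fromstring(RIG_SITE_XML)
for well in root.findall("well"):
    print(well.get("uid"), well.findtext("name"), well.findtext("operator"))
```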

Schlumberger

Today, multiple, inconsistent in-house stores cohabit with external data centers. In this heterogeneous world, Schlumberger’s Steve Hawtin recommends understanding ‘what works for you’ – and defining long-term goals. Schlumberger Information Solutions is ready to help out with consultancy services, documentation of workflows and planning. Key to the SIS offering is the Master Data Store – now redefined as ‘a collection of repositories holding approved data and connecting business processes’.
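A minimal sketch of that definition – the repository structure and the ‘approved’ flag are illustrative assumptions:

```python
from typing import Iterable, Optional


class Repository:
    """One store in the collection."""
    def __init__(self, records: dict):
        self._records = records

    def get(self, key: str) -> Optional[dict]:
        return self._records.get(key)


class MasterDataStore:
    """'A collection of repositories holding approved data': queries
    return the first approved version found across the collection."""
    def __init__(self, repositories: Iterable[Repository]):
        self._repositories = list(repositories)

    def get_approved(self, key: str) -> Optional[dict]:
        for repo in self._repositories:
            record = repo.get(key)
            if record and record.get("approved"):
                return record
        return None
```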

Web services

Shell’s Richard Mapleston described the particular security challenges of moving upstream data. Key requirements are for secure, shared access to documents and applications. Shell is experimenting with web services as a means of providing secure, automated information flow, both internally and with external partners. In 2001 a pilot was initiated with the UK DTI and IBM. Last year Shell Expro implemented web services in its internal Discovery.com portal, and for 2003, upstream web services will be implemented with SAP portals. Mapleston believes that security issues can be successfully managed by adopting and trusting common standards. Shell is working on ‘Pathfinder’ developments testing digital signatures. Consultant Niall Young sketched a hypothetical use of web services interaction between oil companies and government. A complete e-government implementation would, however, be complex and expensive. More realistic would be a web service ensuring well header nomenclature consistency between operators, CDA and the DTI. Young believes that standard data validation, such as the assignment of DTI well numbers, should be delivered via generic web services.
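A minimal sketch of the kind of generic validation service Young describes, written here as a plain HTTP endpoint rather than the SOAP stack of the day; the well-number pattern is a simplified assumption, not the DTI’s actual numbering rules:

```python
import re
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import parse_qs, urlparse

# Assumed, simplified pattern for a UK well designation (quadrant/block-well,
# e.g. '21/10-5'); the real DTI numbering rules are fuller than this.
WELL_NUMBER = re.compile(r"^\d{1,3}/\d{1,2}[a-z]?-\d+\w*$")


class ValidationHandler(BaseHTTPRequestHandler):
    # GET /validate?well=21/10-5 returns 'valid' or 'invalid'.
    def do_GET(self):
        well = parse_qs(urlparse(self.path).query).get("well", [""])[0]
        verdict = b"valid" if WELL_NUMBER.match(well) else b"invalid"
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(verdict)


if __name__ == "__main__":
    HTTPServer(("localhost", 8000), ValidationHandler).serve_forever()
```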

Exprodat

Gareth Smith reckons that around 80% of E&P data has a spatial component, and E&P professionals have come to expect map views of their data. But GIS implies a considerable overhead in spatial data set maintenance, and Smith believes that ‘few companies realize the full business value of their spatial data and systems’. Exprodat therefore proposes a ‘cookbook’ GIS strategy: provide simple tools for the masses and focus on web-based ArcIMS deployment before the desktop. Content provision is important and should expose high-value corporate data and trusted data sources such as DEAL and PetroView.

Instant Library

Paul Duller described disasters and their consequences before offering guidelines for prevention and recovery. Risk assessment maps the likelihood of a disaster against its severity; risks are documented and prioritized. Information loss prevention relies on standards like BS 5454 for library storage. Duller stressed the practicalities of data preservation: data should be stored on upper floors (not in the basement), racking that diverts falling water should be used, and assets should be kept away from water pipes and damp. While disasters are extraordinary events, they are ‘sufficiently frequent and similar to be amenable to planning and prevention’.
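The likelihood-versus-severity mapping lends itself to a simple scoring exercise; the scales and example risks below are illustrative assumptions:

```python
# Score each documented risk as likelihood x severity (1-5 scales assumed)
# and prioritize the highest-scoring ones for prevention planning.
risks = [
    {"risk": "basement flood", "likelihood": 3, "severity": 5},
    {"risk": "burst pipe above racking", "likelihood": 2, "severity": 4},
    {"risk": "localized damp", "likelihood": 4, "severity": 2},
]

for r in sorted(risks, key=lambda r: r["likelihood"] * r["severity"], reverse=True):
    print(f"{r['risk']}: score {r['likelihood'] * r['severity']}")
```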

Flare Consultants

Paul Cleverley showed how information management (IM) can be integrated into the corporate risk control framework. The same consequence/likelihood mapping as proposed by Duller can be applied to information management. Cleverley’s IM risks include poor learning, knowledge loss, poor information quality and accessibility, and infrastructure failure. Flare’s work with Shell Expro on the Discovery data clean-up project has demonstrated the importance of a common vocabulary in tying together the different parts of the E&P workflow. Catalogue-defined terminology can be further leveraged by ‘process management’ to ensure that ‘key products are published to the right places’. Once E&P terminology has been ‘embedded’ in the system, graphical drill-down across multi-domain data sets becomes possible. Virtual team working between regional offices has also been enabled, and key remarks and audit trails are now published to the corporate ‘memory’.

This article has been abstracted from a 16-page illustrated report produced as part of The Data Room’s Technology Watch service. For more information email info@oilit.com.


© Oil IT Journal - all rights reserved.