IQPC & SMi host London data conferences

Two data management conferences in London in the same week might seem like too much of a good thing. Or maybe this shows how data and knowledge management are recognized as business critical. And, with oil at $60 a barrel, folks are desperately trying to assure and leverage their data assets. The incumbent conference organizer, SMi, is the one to beat, with around 100 attendees. But newcomer IQPC made a good first-time showing, with around 40 present and quality papers and attendees. Both shows' high entry fees certainly 'eliminate the frivolous', and perhaps some of the practitioners in the process. Our highlights from IQPC's E&P Knowledge and Data Management conference include Repsol's social network analysis, an update on ExxonMobil's UTCS and 'Information Orientation' work at Statoil. From SMi's E&P Data and Information Management conference we report on BG's first digital gas field and bring you an update on Shell's standards-based IT and enterprise GIS.

Oil IT Journal editor Neil McNaughton kicked off IQPC's Knowledge and Data Management conference with a talk on '10 years of data management—from data management to managed data.' Recently there has been a step change in how a 'standard' can be defined in IT terms. New technologies offer on-the-fly data validation. XML can be used to package, for instance, seismic trace data with unambiguous positional information, as in OpenSpirit's 'managed' seismic data format. Data validation can also be applied to HTML, and McNaughton showed some interesting validation metrics from vendor and oil company home pages. The SOAP infrastructure that underpins WITSML can be used to QA data flows between drilling rig and visualization center. But it is in the context of the digital oilfield that these technologies will come into their own, along with emerging standards for deploying machine-understandable taxonomies such as the W3C's Simple Knowledge Organization System (SKOS).
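By way of illustration, a minimal sketch of what a machine-understandable SKOS taxonomy looks like, built here with Python's rdflib. The namespace and concept names are invented for the example, not taken from any presentation.

```python
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF, SKOS

# Hypothetical E&P taxonomy namespace (invented for this example)
EP = Namespace("http://example.com/ep-taxonomy#")

g = Graph()
g.bind("skos", SKOS)
g.bind("ep", EP)

# Two concepts, one narrower than the other
g.add((EP.Seismic, RDF.type, SKOS.Concept))
g.add((EP.Seismic, SKOS.prefLabel, Literal("Seismic", lang="en")))
g.add((EP.TraceData, RDF.type, SKOS.Concept))
g.add((EP.TraceData, SKOS.prefLabel, Literal("Seismic trace data", lang="en")))
g.add((EP.TraceData, SKOS.broader, EP.Seismic))

print(g.serialize(format="turtle"))
```

The point of SKOS is exactly this machine readability: the broader/narrower relations can be validated and navigated by software rather than interpreted by a reader.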

Social Network Analysis

Repsol YPF’s Augustin Diz described the ‘medieval’ stage of data collection where ‘everything is preserved but relatively little is used’. The result is that knowledge decreases over time. Repsol now has 15 widely-used E&P communities of practice (CoP). CoPs may be perceived as outside the ‘process’, and issues that are not of local significance may receive poor middle management support. Nevertheless, Repsol has published its ‘better’ practices and is benefiting from the new possibilities of ‘social network analysis’ (SNA). Using data from CoPs and network traffic, it is possible to find out who people really talk to—their boss, or someone on the other side of the world. Repsol is now performing process analysis and K-mapping to support and leverage future SNA.
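Repsol's tooling was not described in detail, but the heart of SNA is straightforward graph analysis. The hypothetical sketch below, using Python's networkx with an invented message log, derives 'who really talks to whom' and ranks people by betweenness centrality, a common measure of the informal brokers who connect groups.

```python
import networkx as nx

# Hypothetical message log: (sender, recipient) pairs from CoP forums and email
edges = [("ana", "boss"), ("boss", "ana"), ("ana", "carlos"),
         ("carlos", "ana"), ("dmitri", "carlos"), ("erik", "carlos"),
         ("erik", "dmitri")]

g = nx.DiGraph()
g.add_edges_from(edges)

# Betweenness centrality highlights the brokers the org chart does not show
scores = nx.betweenness_centrality(g)
for person, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{person}: {score:.2f}")
```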

Statoil

Liv Maeland described three $100 million IM projects. BRA, a major finance/admin and SAP migration project, was completed in 2000; the SCORE E&P technical computing project ran from 1998 to 2001. Today, Statoil is finishing up its third project, ‘collaboration@statoil.com’, a ‘major major’ project introducing comprehensive metadata across G&G.

Information Orientation

Maeland cited Prof. Marchand’s (IMD Lausanne) work on ‘Navigating Business Success’. Marchand’s ‘Information Orientation’ research investigates ‘soft factors’ such as how the interaction between people, information and technology affects business performance.

Saudi Aramco

Loc Vo’s presentation covered Saudi Aramco’s plans for data management in the I-field. Making sense of the huge amount of data collected during a field’s lifetime is a major facet of the I-field. Aramco has complete field histories and plenty of locations for pilot testing. Challenges include large volumes of production data that are hard to optimize and a limited infrastructure for staging and transmitting real-time (RT) data. Critical technologies are still under development and there is a lack of experienced people. Data access is an issue across disparate IT systems; when data is needed for decision making, it may not be there! The reality is that, while it is not an impossibility, the I-field requires serious investment. But the potential pay-off is tremendous in terms of enhanced asset value.

ExxonMobil

Jake Booth showed an internal ExxonMobil promotional video for its upstream technical computing system (UTCS) ‘concept car.’ The idea is simple: to bring data to where you need it, when you need it, without manual intervention. Access control is through role-based security and smart card single sign-on; the card allows the system to recognize a user’s role and provide access to entitled data. Users find data through a map-based interface (SharePoint Portal), drilling down, monitoring wells and setting triggers. The plan is to replace data silos with integrated, asset-oriented workflows. UTCS is also a key component of the digital oilfield and will support real-time trending of well performance, automated analysis and alerts, expanding the use and value of RT data. Automation will detect problems and send alerts to engineers by wireless, so that everyone sees the same information at the same time. A shared earth environment brings everything into the collaborative environment of XOM’s large visualization center. Steve Comstock, CIO and chief architect of the UTCS, claims it is ‘unique among competitors; all talk about it, none have achieved it.’ The video concluded with a disclaimer along the lines of ‘your mileage may vary’ as Comstock drove off into the sunset in, not a concept car, but a 1947 Chrysler ‘Woody’!
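UTCS internals were not disclosed, but the ‘set triggers, get alerts’ pattern is easy to illustrate. The minimal Python sketch below, with invented well names, metrics and thresholds, compares incoming real-time readings against user-defined triggers and emits the alerts that would be pushed to engineers.

```python
from dataclasses import dataclass

@dataclass
class Trigger:
    well: str
    metric: str   # e.g. 'gas_rate_mmscfd' (invented metric name)
    low: float    # alert when the reading drops below this value

def check_triggers(readings, triggers):
    """Compare the latest real-time readings with user-set triggers and
    return alert messages to push to engineers' wireless devices."""
    alerts = []
    for t in triggers:
        value = readings.get((t.well, t.metric))
        if value is not None and value < t.low:
            alerts.append(f"ALERT {t.well}: {t.metric}={value} below {t.low}")
    return alerts

# Invented well, reading and threshold for illustration
readings = {("W-101", "gas_rate_mmscfd"): 38.5}
print(check_triggers(readings, [Trigger("W-101", "gas_rate_mmscfd", 42.0)]))
```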

Round tables

IQPC’s conference left plenty of time for panel discussions and round tables, and these really took off with some great exchanges of views. Santos reported that it used to have little or no DM and that it was necessary to ‘educate’ management with IM and data metrics, a risky strategy with the potential of embarrassing management regarding the state of corporate data. But now ‘they understand’: the drafting department performs GIS data management and the librarians manage taxonomies.

Santos

Santos uses Datalogix/Innerlogix for QC and data metrics. The company pays attention to training and has separated technical and scientific workflows ‘so that folks don’t just learn how to push buttons.’ Santos wants to avoid people drifting into exploration through a software package.

Shell

Shell also stresses training, with its own institutions teaching topics like the theory behind synthetic seismogram generation. There is a general feeling in the industry in favor of tightening rules and of data policing. Although Shell has deployed software for G&G field sheets, folks didn’t use them: ‘they can’t be bothered to enter a hundred or so parameters.’ But from management there is a strong drive to ‘get these field sheets filled in.’ This will be achieved by domain specialists pre-populating some fields and performing data QC.

NOCs

While the national oil companies (NOCs) may possess very rich data assets, their needs differ from those of the majors. Aramco has 30-40 years of production data and PVT analyses and implements strict data access controls, which can be a problem for expatriates as they may not realize what is there. Kuwait Oil likewise provides fine-grained control over what data an individual sees; people may only see a small piece of the whole data asset.

ExxonMobil

ExxonMobil is constantly seeking ways of achieving meaningful coupling between business initiatives and computing. Approved projects are carried through to ‘gate 2’ and companies are ‘told to commit monies and people.’

Overload

Danish DONG has experienced ‘initiative overload’ in the last few years and is cutting a few initiatives out. Statoil is trying to structure data management with its ECIM initiative and is to sponsor a data management master’s degree program in Norway.

~~~

SMi

At SMi’s E&P Data and Information Management conference, BG Group’s Karen Moore and Schlumberger’s Steve Miller described an e-field development on BG’s Tunisian Miskar field. This presented specific challenges with 400 operated wells, gas allocations, scheduling and blending. Although very high data volumes were coming in from the wells, little use was being made of the data. The field proved a good test bed for a standards-based production data management system.

Miller

BG’s production engineers previously relied mainly on Microsoft Excel as a production management tool. Excel spreadsheets were emailed onshore, copied, cut and pasted into other documents for reporting. ‘Data gymnastics’ was the norm. One engineer had 27 spreadsheets to perform daily reporting. SCADA data was not used or even exposed to decision makers. Well performance and nodal analysis tools were used but these suffered from a big ‘data disconnect’.

Business case

Moore and Miller developed a business case that demonstrated direct benefits from improved data management, not only to the field but also to the asset and to BG corporate. The system now takes data flows from a new Emerson Delta V DCS/SCADA network to OSIsoft’s PI data historian and Schlumberger’s Avocet repository for allocation and sales. Data then flows on to Schlumberger’s Decide for analysis and reporting.

Performance curve

Wells have pre-defined performance curves stored in the historian, which are compared with daily production. Spreadsheets have given way to graphical reporting from PI, and the performance curves link to Avocet DM. The Tunisia implementation is work in progress and will go live in the next few weeks. Moore admits that BG is a late adopter of some of these technologies and has been slow to embrace the digital gas field. Tunisia is a pilot for a global rollout.
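BG and Schlumberger did not publish the comparison logic, but the idea of checking daily production against a stored performance curve can be sketched as follows. The curve points, tolerance and units are illustrative assumptions, not field data.

```python
import bisect

def expected_rate(curve, day):
    """Linear interpolation on a pre-defined (day, rate) performance curve."""
    days = [d for d, _ in curve]
    i = bisect.bisect_left(days, day)
    if i == 0:
        return curve[0][1]
    if i == len(curve):
        return curve[-1][1]
    (d0, r0), (d1, r1) = curve[i - 1], curve[i]
    return r0 + (r1 - r0) * (day - d0) / (d1 - d0)

def underperforming(curve, day, actual, tolerance=0.10):
    """Flag a well whose daily rate falls more than `tolerance` below its curve."""
    return actual < expected_rate(curve, day) * (1.0 - tolerance)

# Invented curve (day, MMscf/d) and daily observation
curve = [(0, 50.0), (180, 46.0), (365, 40.0)]
print(underperforming(curve, 200, 38.5))  # True: more than 10% below the curve
```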

Shell’s GIS

Shell’s enthusiasm for corporate GIS has, if anything, increased over the two years since Thierry Gregorius first presented Shell’s GIS data ‘Swiss army knife.’ Shell uses GIS as a lightweight integration mechanism for corporate data, so that it can be used across silo boundaries to visualize facilities, the subsurface, proximity to infrastructure, legal geo-constraints: just about anything.
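As a hypothetical illustration of what ‘lightweight GIS integration’ buys, the sketch below uses Python’s shapely to answer a typical cross-silo question, whether a facility lies within a given distance of infrastructure. The coordinates and the 250 m buffer are invented.

```python
from shapely.geometry import Point, LineString

# Invented projected coordinates (metres): a well head and a pipeline segment
well = Point(532100.0, 6781400.0)
pipeline = LineString([(531800.0, 6780900.0), (532600.0, 6782000.0)])

# The kind of lightweight proximity query GIS enables across silo boundaries:
# is this facility within 250 m of existing infrastructure?
distance = well.distance(pipeline)
print(f"Distance to pipeline: {distance:.0f} m")
print("Within 250 m buffer:", distance <= 250.0)
```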

Standards

Standards can bridge the gap between heavyweight applications, portals and lightweight GIS integration. Shell is quite a way along the path of standardizing its IT infrastructure, especially with its heavily locked-down desktop. This was initially ‘very painful’; users could not even set their own wallpaper! But the effort is now paying off and it is possible to deploy software worldwide across Shell’s 11,000 PCs with ease.

Metadata

Standards-based metadata underpins Shell’s data sharing. Tools crawl and index documents, which can now be ‘spatialized’ with tools like MetaCarta. Gregorius was less than sanguine about XML as a data panacea, seeing a lot of marketing spin in the way XML is used to put ‘lipstick on the data pig’. More generally, IT offers only partial support for the wide-reaching infrastructure Shell is trying to achieve: someone only has to update a piece of software in one place to cause things to ‘fall over’. Standards are undoubtedly the answer, but in GIS particularly their uptake is slow, and almost non-existent in oil and gas.
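MetaCarta’s geoparsing resolves place names with natural language processing; a much cruder flavor of document ‘spatialization’, pulling explicit coordinate strings out of text with a regular expression, can be sketched in a few lines of Python. The pattern and the sample sentence are illustrative only.

```python
import re

# Matches decimal latitude/longitude pairs such as '34.6N, 10.8E'
COORD = re.compile(
    r"(\d{1,2}(?:\.\d+)?)\s*([NS])[,\s]+(\d{1,3}(?:\.\d+)?)\s*([EW])")

def spatialize(text):
    """Return signed (lat, lon) tuples for coordinate strings found in text."""
    points = []
    for lat, ns, lon, ew in COORD.findall(text):
        points.append((float(lat) * (1 if ns == "N" else -1),
                       float(lon) * (1 if ew == "E" else -1)))
    return points

# Illustrative sentence and coordinates, not actual field data
print(spatialize("The platform lies at 34.6N, 10.8E offshore Tunisia."))
```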

Crisis

Gregorius believes that the industry is in an interoperability crisis. One problem is that GIS data standards are often run by academic groups who may lack a sense of urgency. But there is hope. New York had been talking about an enterprise GIS for years before 9/11. After the attack, the system was in place within a week.

This article has been taken from reports produced as part of The Data Room’s Technology Watch Reporting Service. For more information on this service please email tw@oilit.com.


© Oil IT Journal - all rights reserved.