SMi E&P Data Management, London

BG on managing ‘annoying’ Petrel data. Shell, ‘upstream data is different, avoid buy vs. build blindness.’ Big data capability and JBoss/Teiid data virtualization for Landmark’s re-vamped DecisionSpace. Repsol transforms with information. ExxonMobil, ‘give end users data management tools.’ OMV formalizes its information management following reorganization. GDF Suez’ SharePoint-based ‘one stop’ information shop.

Chairman Fleming Rolle (Dong), introducing the first speaker, BG Group’s Sherwin Francis, observed that ‘few have cracked the (ubiquitous) data management problem around Schlumberger’s Petrel.’ As Francis explained, Petrel’s file-based data structure and absence of a database make central control challenging. Users can create projects at will, duplicating projects and reference data. Such flexibility is fine for users but a nightmare for data managers. Petrel project management tools exist: Schlumberger’s own ProSource and the Petrel project reference tool. The Innerlogix data cleansing suite can also be used to manage projects and a new Petrel Studio database tool is currently in pilot. Third party tools such as Blueback’s project tracker are also under study.

Petrel project managers must choose between various scenarios and data duplication strategies. These include a single master reference project, multiple connected reference projects or many stand-alone projects. This kind of challenge did not exist in the days of OpenWorks or Geoframe. BG currently uses a Seabed/Recall master database and Innerlogix to create Petrel master and reference projects from which individual working projects are derived. In the future, Studio will sit between Seabed and individual projects, feeding them directly in what is described as more like the old Geoframe/OpenWorks type of scenario. Francis wondered why Schlumberger had not provided this functionality years ago. Further improvement should be achievable using Blueback’s Project Tracker, which gives a good view of shared storage through scheduled scans, populating a SQL database and providing a geographical representation, pinpointing inconsistencies in data and reference relationships, monitoring use and managing disk space. The effort has been worthwhile for BG, which now has an automated process for managing projects globally and fewer unused projects and duplicates.

For instance, Blueback Tracker found six different Petrel versions in use at one small asset where around 180 projects were compressed to about 20. Keys to success were management support, a rational directory structure and the implementation of interpretation standards, best practices and archival. It is also a good idea not to run more than one or two Petrel versions.
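By way of illustration only, here is a minimal Python sketch of the kind of scheduled scan-and-index pass that such project tracking relies on: walking shared storage for Petrel project files and recording what it finds in a small SQL database. This is not Blueback’s or Schlumberger’s code; the ‘.pet’ extension, the share path and the table schema are assumptions made for the example.

```python
"""Illustrative scan of shared storage for Petrel projects.
Not Blueback Project Tracker; the share path, the .pet extension
and the schema are assumptions for the sketch."""
import os
import sqlite3
from datetime import datetime, timezone

SHARE = r"\\fileserver\petrel_projects"   # hypothetical shared storage root
DB = "project_inventory.sqlite"

def scan(share=SHARE, db=DB):
    con = sqlite3.connect(db)
    con.execute("""CREATE TABLE IF NOT EXISTS projects (
                     path TEXT PRIMARY KEY,
                     name TEXT,
                     size_mb REAL,
                     last_modified TEXT,
                     scanned_at TEXT)""")
    now = datetime.now(timezone.utc).isoformat()
    for root, _dirs, files in os.walk(share):
        for fname in files:
            if fname.lower().endswith(".pet"):        # assumed Petrel project file
                full = os.path.join(root, fname)
                st = os.stat(full)
                con.execute(
                    "INSERT OR REPLACE INTO projects VALUES (?, ?, ?, ?, ?)",
                    (full,
                     fname.lower(),
                     st.st_size / 1e6,
                     datetime.fromtimestamp(st.st_mtime, timezone.utc).isoformat(),
                     now))
    con.commit()
    # Crude duplicate heuristic: the same project name appearing in several places.
    for name, copies in con.execute(
            "SELECT name, COUNT(*) FROM projects GROUP BY name HAVING COUNT(*) > 1"):
        print(f"possible duplicate: {name} ({copies} copies)")
    con.close()

if __name__ == "__main__":
    scan()      # in practice this would run from a scheduler, e.g. nightly
```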

Francis added that onboarding users by explaining the benefits of the structured approach to Petrel projects was key. An observer agreed that the key to data management success is participation. Data managers should not sit on the sidelines. They need to know what the team is doing and talk daily with end users. ‘Petrel is a very annoying tool. Blueback’s tools are a way of getting users out of trouble, but they are not a solution.’

Johan Stockmann offered Shell’s perspective on data architecture and management. The upstream is different. While the aim is, as in other verticals, for trusted data, this needs to be set against the ‘large volumes of expensive, idiosyncratic E&P data.’ There are, moreover, ‘high levels of uncertainty’ and, importantly, ‘no single version of the truth—forget it!’ Enterprise architecture is designed to accommodate such multiple parallel truths. Shell’s EA is loosely based on the Open Group’s framework (TOGAF). The ‘system agnostic’ data model is defined down to a level that will support automation. Key data definitions are owned by senior staff with a methodological mindset and a great network.

Stockmann was scathing of ‘buy vs. build blindness’ and of the ‘local lobby’ which has it that ‘central standards are too expensive’. Quality is not the only problem. Today, solution providers need to include a fully specified data model and knowledge as to where data will come from. Web services are OK but you still need to understand sources, creators and usage—‘it all needs to be thought through.’ On the E&P ‘difference,’ iterative workflows are particularly demanding of data management, and multi-dimensional data is not well suited to relational databases or models. And real time sensor data makes for ‘interesting architectures.’ While Shell is an active supporter of standards, both industry-specific and horizontal, ‘no single standard supports all our needs’.

Far too much time is spent on data migration, search and clean-up—even by senior execs who shouldn’t be futzing with spreadsheets at midnight! By doing the architecture right, Stockmann expects to reduce the data overhead by 66%.

According to Chaminada Peries, Landmark’s software unit has had its head down for the past couple of years but is now about to release new products under the DecisionSpace banner. These will include workflow tools, dev kits, ‘big data’ capabilities and, because desktop deployment ‘is no longer an option,’ cloud functionality. The new framework will ‘acknowledge and work with third party apps’ (Petrel was not specifically mentioned but that is what we understood).

Landmark is trying to regain the enterprise high-ground with a solution that, it is claimed, reduces data duplication and the dependency on files and application data stores, while improving security and data audit. This is to be achieved thanks to ubiquitous data virtualization leveraging the open source JBoss/Teiid virtualization engine. This underpins the DecisionSpace data server with connectors to Petrel, Recall, PPDM, SAP and other industry staples.

Data is retrieved as OData services, a ‘powerful and open protocol.’ Halliburton is a contributor to the OData community. Also on offer are a ‘unified API’ for text and spatial search across all data, GIS integration and Hadoop-based business intelligence. Stand-alone apps like WOW and PowerExplorer have been re-tooled to the web platform to take advantage of the ‘new reality’ of the cloud. DecisionSpace is now presented as a ‘platform as a service.’ Developers write plug-ins to the cloud-based infrastructure. More from Landmark.
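For readers unfamiliar with OData, the short Python sketch below shows roughly what consuming such a feed looks like. The service root, entity set and field names are invented for the purpose; only the query options ($filter, $select, $top) and the JSON ‘value’ envelope are standard OData conventions, and nothing here should be read as Landmark’s published API.

```python
"""Generic OData consumer sketch. The endpoint and entity/field names
are hypothetical; only the OData query conventions are real."""
import requests

BASE = "https://example.com/decisionspace/odata"   # hypothetical service root

def list_wells(country="NO", max_rows=25):
    # OData system query options push filtering and projection to the server.
    params = {
        "$filter": f"Country eq '{country}'",
        "$select": "WellName,SpudDate,TotalDepth",
        "$top": str(max_rows),
    }
    resp = requests.get(f"{BASE}/Wells", params=params,
                        headers={"Accept": "application/json"}, timeout=30)
    resp.raise_for_status()
    # OData JSON responses wrap the entity collection in a 'value' array.
    return resp.json().get("value", [])

if __name__ == "__main__":
    for well in list_wells():
        print(well.get("WellName"), well.get("SpudDate"), well.get("TotalDepth"))
```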

Pat Merony and Gema Santos Martin presented Repsol’s ‘Transform with information’ initiative. This blends people, processes, tools and a ‘lean/kaizen’ approach. At the heart is an in-house developed ‘GeoSite’ portal providing a single point of access to information. Repsol has elevated information management to an equal footing with other upstream disciplines like geology, geophysics and reservoir engineering. A comprehensive service catalog has been drafted leveraging concepts from DAMA, CDA and PPDM. Attention has also been paid to career paths, with potential promotion to data advisor and/or architect roles, and to competency development. Asked if candidates were ‘knocking at your door now?’ Santos Martin replied modestly, ‘Not yet but they are not busting down the door to get out!’

Keith Roberts (ExxonMobil) has been working in the upstream for 33 years and remembers the days of the drawing office and typing pool. Things changed with the arrival of the PC and desktop applications that let people do stuff for themselves. Exxon’s aim is to do the same for data management—putting it in the hands of the end user.

The data manager’s role will have to change in the face of trends like the big crew change (already a reality for Exxon), which is driving productivity demands, increasing data volumes and pressure on costs. The future will see greater technology integration: ‘Petrel is just the start of it.’ The plan then is to ‘give users the tools they need for data management and get out of their way.’ Companies also need to move on from data ‘schlepping,’ shoveling stuff around, to adding value through data science and forensics. Exxon is looking at the cloud on a case-by-case basis but has concerns regarding bandwidth and security.

Juergen Mischker traced OMV’s three year journey to formalize its information management following a major reorganization in 2011. This saw the introduction of corporate information governance and data management disciplines—both anchored in E&P (rather than in IT). IM is now a key process owned by a senior E&P VP. A new data enhancement project has improved the quality and quantity of well data.

David Lloyd described how GDF Suez is supporting its growing operations with a one stop shop, The Portal, providing access to information, documents and data. Users are presented with a customizable SharePoint-based web client. SharePoint has its critics, but it makes for an affordable solution and provides basic document management functionality out of the box.

Deployers can buy third party web parts to plug SharePoint’s gaps. Lloyd advocates a ‘reasonable’ information architecture with drill down by function, asset, project and well. SharePoint TeamSites is used for calendars, meetings, workflows and ‘presence management,’ showing when people are online.

Data cleanup was a prerequisite. On just one shared drive, there were 2.5 million files (almost half a million duplicates) in 184,000 folders. Getting rid of the mess can even reduce your carbon footprint!
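The Python sketch below illustrates the sort of content-hash duplicate hunt that typically precedes such a clean-up. It is generic, not GDF Suez’s actual tooling, and the share path is a placeholder.

```python
"""Generic duplicate-file finder: hash file contents and group identical files.
The share path is a placeholder; the approach is illustrative only."""
import hashlib
import os
from collections import defaultdict

SHARE = r"\\fileserver\shared_drive"   # placeholder for the shared drive

def sha256_of(path, chunk=1 << 20):
    """Hash a file in 1 MB chunks so large files do not exhaust memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

def find_duplicates(root=SHARE):
    by_hash = defaultdict(list)
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            full = os.path.join(dirpath, name)
            try:
                by_hash[sha256_of(full)].append(full)
            except OSError:
                pass   # skip unreadable files rather than abort the scan
    return {h: paths for h, paths in by_hash.items() if len(paths) > 1}

if __name__ == "__main__":
    dupes = find_duplicates()
    wasted = sum(os.path.getsize(p) for paths in dupes.values() for p in paths[1:])
    print(f"{len(dupes)} sets of identical files, ~{wasted / 1e9:.1f} GB reclaimable")
```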

But the real challenge is convincing users of the merits (and obligations) of good information management. Other components of GDF Suez’ IM solution include Flare’s E&P Catalog, Livelink and OSIsoft PI. The Portal also opens up to vendor data from Whatamap, Hannon Westwood, IHS, DEAL, CDA and others. Unified communications and video chat with Microsoft Lync also ran. SharePoint may not be the best user interface around but it provides flexibility. GDF has three developers working on customization. There are issues with SkyDrive and various Office vintages. TeamSites can ‘spiral out of control.’ All of which need managing. There is ‘a lot going on backstage.’ GDF uses PRINCE2 and ‘agile’ methods. It has been a ‘massive scary journey but fun!’ More from SMi Conferences.

© Oil IT Journal - all rights reserved.