2016 ECIM Data Management conference, Haugesund

Teradata on internet of things. NPD on delivering data in the downturn. Sirius students ‘skeptical’ of industry. ConocoPhillips ‘cuts down hedge’ between data managers and business. Statoil’s ‘Gold Finder.’ Diskos gets Whereoil API. Cegal/Iron Mountain’s big data cloud solution. Schlumberger’s Studio World Map for Diskos. TNO on North Sea Data Management Forum. ILAP, more ...

The 2016 ECIM data management conference held up well in the face of the severe industry downturn with 220 attendees and some 70 presentations. In his introductory keynote, Teradata’s Niall O’Doherty asked whether the ‘internet of things’ was something that matters or just a ‘big eye roll.’ IT is terrible at naming things, something that leads to confusion for users and especially for management. For some, marketing literature has simply been revisited with a search-and-replace of ‘big data’ with ‘IoT.’ Sensors have been around for a very long time but their data has not been very accessible. For the IoT to be transformational we need to look at systems as a whole, both IT and operations technology, although IT/OT cultural differences make this hard. IoT success in other industries comes more from good data curation than from fancy new math. One Teradata poster child is Siemens, whose ‘Sinalytics’ IoT initiative was the keynote feature of the 2016 Teradata Universe. The buy-not-build issue is also germane to the IoT. In other words, as O’Doherty asked, ‘are you an oil company or a data integration company?’

Bente Nyland, head of the Norwegian Petroleum Directorate, noted that cost cutting means that many projects and new technology developments have been halted, although she wondered if all the cost cutting was really necessary. It is now particularly important to manage business information through the hiatus in activity. Norway has an unparalleled record in the fifty years since its first licensing round. Data management is a cornerstone of the Norwegian model, which promotes competition on the use of, rather than access to, data. The NPD’s data deliverables are the Fact Pages, the Fact Maps and the data delivered via the Diskos consortium. NPD also produces regular Resource Reports showing major undiscovered oil and gas resources.

The falling oil price has hit Norway hard. Recently there has been a slight recovery, leading to a hope for stable prices ‘a little higher than today.’ The sudden shift from growth to cost cutting has meant that projects, people and whole companies disappear in acquisitions and mergers. This means a loss of expertise, stiff competition for funds and more short-termism. Data re-use is likely to be important in the future and we need to avoid losing focus. Today we take the availability of reliable data for granted.

The NPD has laid the ground rules to encourage good data management. The authorities have a legal right to ask for data in a usable form and to mandate long-term reporting and data retention. Despite ‘how to’ guidelines, the NPD is concerned about under-reporting, reporting errors, data loss and degradation, and declining competency. We need to prioritize the retention of old data to be in a position to reap its future value. So, ‘clean up your data stores and work for the long term.’ Data management underpins long-term value creation and the NPD is maintaining its focus on ‘simple, smart solutions’ based on open standards under the Diskos umbrella.

David Cameron of the Sirius SSI at the University of Oslo observed that in 2020, the oil price could be anything between $20 and $100. The world may even ‘turn away from oil.’ But in all events, we need to retain expertise in oil and gas ‘even if it is just for maintenance, safety and decommissioning!’ Unfortunately, oil and gas has shot itself in the foot, again! In 2013, there were 420 applicants for the petroleum technology course at Norway’s NTNU. By 2016, this had dropped to 31. The University of Stavanger has only 12 applicants for 25 places, reflecting a ‘profound skepticism to oil, gas and heavy industry.’

Universities have a hard time keeping two masters happy. On the one hand, academia measures researchers’ output by the volume of ‘papers with fancy set algebra;’ on the other, industry sponsors often doubt the business value of such efforts. Sirius is the ‘center for scalable data access in the oil and gas domain.’ Collaborative R&D initiatives include the Trondheim Integrated Operations Center and DrillWell in Stavanger. Today much time is wasted finding and accessing data. The answer lies in part in cloud computing, ‘where oil and gas computing is going.’ Cameron ended with a reprise of his Optique presentation at Intelligent Energy, demonstrating natural language access to various commercial and public data sources for ‘quick and easy’ ad-hoc requests. Components include ABS, a modeling language for distributed software systems, and the EU Envisage/HATS project, ‘a potential game changer for the cloud and a deliberate EU attempt to counter Google and to provide users with the tools to keep Google honest.’ One Sirius finding of note is that computers find ‘negation detection’ in natural language hard, let alone the double negative, ‘when no means yes!’

Kristine Karoliussen (ConocoPhillips) has ‘jumped over the hedge’ from the business of geoscience into the data management department. Data managers are expected to provide easy access to data, but the reality is more complex. Data management does not ‘do itself.’ Changes in corporate tools (a move to OpenWorks 5000, the introduction of Studio) may overlook prior data management art. It’s all very well to say ‘just load the well data,’ but this implies locating surveys, metadata and context. The business also tends to overlook reporting, data QC, maintaining the ‘gold’ databases and keeping OpenSpirit up and running during critical well operations. ConocoPhillips has put together a data management advisory team to handle such issues. Procedures and standards are complex. It would be great to have a common standard, a Google-like portal for data access shared by operators and contractors. Times are challenging, but this may be an opportune moment to rethink how we do things and to ‘cut down the hedge,’ providing geoscientists with more insight into and interaction with data managers.

A presentation from Frode Jansen Lande on Statoil’s ‘Gold Finder’ data discovery system demonstrated that effective solutions can be developed in-house with minimal (400 hours) effort. Statoil, like most large companies, has data scattered across many systems and information silos. Gold Finder mimics a file browser with single sign-on and provides a user-configurable results tree view. Master data means that the retrieved results are ‘richer than any single system.’ The web app leverages a single flat database table and runs on Oracle and AngularJS. The system was well received and went straight into production. Users of, say, Recall are even finding stuff in their own systems! Gold Finder has revealed data busts, strange names and duplicates. Statoil is now working to improve connectivity with document systems and to add Solr/Lucene for full-text search and indexing. The ensuing discussion revolved around the relative merits of buy vs. build, with a representative of the Norwegian vendor community pointing out that this type of functionality was already available in the marketplace, with map-based search too. Statoil is nonetheless well pleased with Gold Finder, which has ‘succeeded where many earlier portal projects have failed.’
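The presentation gave no implementation detail beyond the single flat table, so the following is a minimal sketch of what such a cross-silo search table might look like. Here sqlite3 stands in for the Oracle back end and all table, column and system names are invented for illustration:

```python
# Sketch of a 'Gold Finder'-style flat search table (hypothetical schema;
# sqlite3 stands in for Oracle, names and rows are illustrative only).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE gold_finder (
        item_name     TEXT,   -- master-data name, e.g. a well designation
        source_system TEXT,   -- e.g. 'Recall', 'OpenWorks', 'Documentum'
        item_type     TEXT,   -- e.g. 'log', 'well', 'report'
        location      TEXT    -- link back into the source system
    )""")
conn.executemany("INSERT INTO gold_finder VALUES (?,?,?,?)", [
    ("31/2-1", "Recall",     "log",    "recall://logs/31_2-1"),
    ("31/2-1", "OpenWorks",  "well",   "ow://wells/31_2-1"),
    ("31/2-1", "Documentum", "report", "dctm://docs/31_2-1_final"),
])

# One query spans every silo; the front end would group the hits
# into the user-configurable tree view by system and type.
for hit in conn.execute(
        "SELECT source_system, item_type, location FROM gold_finder "
        "WHERE item_name LIKE ?", ("31/2-1%",)):
    print(hit)
```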

Eric Toogood and Elin Aabo Lorentzen of the Norwegian Petroleum Directorate gave an update on Norway’s Diskos national data bank. Diskos technology is provided by CGG, Kadme and Evry. The low oil price means that Diskos is losing members to mergers, acquisitions and departures, raising costs for those that remain. The industry is also losing skilled data managers. Legacy data remains a problem. Much data is on now-obsolete 9-track tapes in storage. Remastering these is a complex process requiring investment and specialist knowledge. Public data in Diskos has been a success and NPD is working with CGG to make public data from relinquished areas available at release date. A programming interface (API) for Diskos is finally seeing the light of day, based on the Kadme Whereoil REST API. The ‘teething troubles’ with the Diskos seismic module have now been fixed and a lot of the data has been loaded. The trade module is proving more challenging. Production data is working well thanks to Epim’s MPRML machine-to-machine reporting with its Schematron-based data validation. Currently data is still submitted on physical media. The plan is to move to online data load, possibly leveraging an open source technology stack from Netflix. Currently SEG-D 3.1 is mandatory for reporting. A new SEG-U ‘unknown’ format was mooted, ‘something that blends SEG-D with SEG-Y into a common shared format for acquisition and processing.’
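MPRML itself is not reproduced in the talk, but the Schematron approach it uses is a general pattern of declarative business rules over an XML report. The sketch below illustrates the idea with lxml; the rule and report structure are invented, not the actual MPRML schema:

```python
# Hedged sketch of Schematron-style validation; the rule and the report
# layout are made up for illustration and do not reflect real MPRML.
from lxml import etree, isoschematron

SCHEMATRON = etree.fromstring(b"""
<schema xmlns="http://purl.oclc.org/dsdl/schematron">
  <pattern>
    <rule context="productionReport/volume">
      <assert test="number(.) &gt;= 0">volumes must be non-negative</assert>
    </rule>
  </pattern>
</schema>""")

validator = isoschematron.Schematron(SCHEMATRON)

report = etree.fromstring(
    b"<productionReport><volume>-42.0</volume></productionReport>")

print(validator.validate(report))  # False: the negative volume fires the rule
print(validator.error_log)         # machine-readable failure details
```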

A joint presentation from Stein Sigbjornsen (ConocoPhillips) and Arve Osmundsen (Cegal) introduced a ‘big data’ seismic archive solution built on the new Iron Mountain Cloud. The new tool features a catalog and map interface with editing functions and plugins for Petrel and ArcGIS. Data is stored on EMC Isilon with replication across dual sites. Data can be cropped to an area of interest for workstation loading. In the Q&A, Sigbjornsen revealed that after migrating to the archive, the original tapes were destroyed, ‘a world first’ according to session chair David Holmes (EMC).
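As an illustration of the cropping step, here is a hedged numpy sketch that keeps only the traces whose midpoints fall inside a rectangular area of interest. The coordinates and trace counts are made up; real tooling would read them from SEG-Y trace headers:

```python
# Illustrative AOI crop before workstation loading (synthetic data only).
import numpy as np

n_traces = 1_000
rng = np.random.default_rng(0)
x = rng.uniform(500_000, 510_000, n_traces)       # trace eastings (m)
y = rng.uniform(6_700_000, 6_710_000, n_traces)   # trace northings (m)
traces = rng.standard_normal((n_traces, 2000))    # 2000 samples per trace

# Rectangular AOI: keep only traces whose midpoint falls inside it.
xmin, xmax, ymin, ymax = 502_000, 505_000, 6_702_000, 6_706_000
inside = (x >= xmin) & (x <= xmax) & (y >= ymin) & (y <= ymax)

cropped = traces[inside]
print(f"kept {cropped.shape[0]} of {n_traces} traces for loading")
```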

Odd Inge Thorkildsen unveiled Schlumberger’s Studio WorldMap for Diskos. Diskos holds some 322TB of public data that is not necessarily ‘application ready.’ The offering builds on Schlumberger’s Petrel Studio/WorldMap products, now plumbed into Diskos for transparent access to data in the repository or in-house. Search now spans all public NPD data and data hosted by Schlumberger. SWM for Diskos is refreshed with monthly scripts that index and spatially enable Diskos data into the Schlumberger cloud in Stavanger.

Statoil’s Robert Skaar asked why oils accept paying twice for data, once to acquire it and again to extract it from proprietary software for re-use. Skaar heads the Integrated lifecycle asset planning (Ilap) steering committee. Ilap, a.k.a. ISO 15926 Part 13, sets out to standardize scheduling data and terminology for megaprojects. The idea is to avoid having to re-key data from Primavera into SAP or between engineering contractors’ software tools. So far, some 800 terms have been standardized and packaged into an XML data transfer standard. Adapters allow users to leverage in-house software tools, which are not changed. The idea is to offer users asset tracking functionality along the lines of Amazon’s ‘where’s my stuff.’ Ilap has support from ConocoPhillips, ENI, Statoil, EPIM and POSC Caesar. An Ilap draft international standard has been submitted to the IOGP ISSC.
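The actual ISO 15926-13 schema and the 800-term dictionary are not shown in the presentation, so the snippet below is purely hypothetical: a toy XML transfer document of the kind an Ilap adapter might emit from Primavera and consume into SAP, with every element name invented for illustration:

```python
# Hypothetical ILAP-style transfer document; element names are illustrative,
# not the real ISO 15926-13 vocabulary.
import xml.etree.ElementTree as ET

plan = ET.Element("schedule", attrib={"project": "FieldDevelopmentX"})
act = ET.SubElement(plan, "activity", attrib={"id": "A-100"})
ET.SubElement(act, "term").text = "InstallTopsides"   # standardized term
ET.SubElement(act, "plannedStart").text = "2017-03-01"
ET.SubElement(act, "plannedFinish").text = "2017-06-30"

# An adapter serializes from one planning tool and deserializes into
# another, leaving both in-house tools unchanged.
print(ET.tostring(plan, encoding="unicode"))
```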

TNO’s Stephan Gruijters provided an update on the North Sea Data Management Forum, a low-key international collaboration that is working to ‘make reporting companies’ lives easier’ by providing common reporting standards across all circum-North Sea nations. TNO has surveyed North Sea regulators and found a desire for cooperation but also many subtle differences in data definitions. The participating regulators are interested in building a business case and possibly moving to a more formal arrangement. Gruijters wound up with a demo of the ‘North Sea Data Portal,’ combining web services from four countries. This currently leverages Esri technology. In the Q&A, Gruijters acknowledged that an open data/Inspire-based approach would be possible in the future.
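A hedged sketch of the portal pattern described, aggregating national web services into one GeoJSON feature collection, follows; the endpoint URLs are placeholders, not the real services behind the demo:

```python
# Merge features from several national services (placeholder URLs only).
import json
import urllib.request

ENDPOINTS = {
    "NO": "https://example.no/wells.geojson",
    "UK": "https://example.uk/wells.geojson",
    "NL": "https://example.nl/wells.geojson",
    "DK": "https://example.dk/wells.geojson",
}

def fetch(url):
    with urllib.request.urlopen(url, timeout=30) as resp:
        return json.load(resp)

merged = {"type": "FeatureCollection", "features": []}
for country, url in ENDPOINTS.items():
    for feature in fetch(url).get("features", []):
        feature.setdefault("properties", {})["country"] = country
        merged["features"].append(feature)

print(len(merged["features"]), "features merged")
```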

Pernille Hammernes and Darren Kearney (Statoil) presented on the new digital reality in our industry. Statoil’s big data and automation effort has been underway for some years. Lately ‘digitalization’ has landed on the top table and has sparked a year-long ‘sprint’ for ownership of the analytics function. Statoil is evolving local initiatives into corporate opportunities for generating competitive advantage by bringing all stakeholders and discipline advisors together in a global calibration effort. The aim is to ‘use all the data.’

Petrel Studio and Documentum were highlighted as key components of Statoil’s digitalization effort. So far, some fifty improvement initiatives have been proposed and are being costed and prioritized. The initiative is driven from an ‘analytical factory,’ where an innovation lab finds out what’s new and promising and tests it on business use cases. If they work, a new solution is productized and handed over to IT for deployment. Statoil is also working to plug the data science/analytics skills gap. Digitalization represents the golden age of IT and data management and an opportunity to understand new technology, take more bets and build a digital competency in phase with the business.

One foundational project involves making data available for analytics by aggregating and consolidating diverse data sources. Statoil adopts a ‘cloud first’ approach. NoSQL and data virtualization technology herald a move ‘from silo to the enterprise,’ leveraging Statoil’s long-established data management organizations in both business and IT. Current initiatives in the IoT, big data, automation and robotics feed into a ‘stepping up’ phase before ‘transforming industry.’

Shell’s Lars Gaseby wound up proceedings with a couple of quotes from industry thought leaders. Michele Goetz (Forrester): ‘demand for data and expectations are high but what the business wants is ambiguous.’ Jelani Harper (Dataversity): ‘trends in data management emerge rapidly but mature slowly.’ Gaseby added that some changes may be almost invisible: cloud computing is used without being seen in Shell. Mark your diaries for next year’s ECIM on 11-13 September 2017 in Haugesund.

© Oil IT Journal - all rights reserved.