ECIM 2013—Haugesund, Norway

ConocoPhillips’ new paradigm of 4D seismic and Optowave. Statoil raises ‘big data’ questions for Diskos. Cairn on the state of play in data education. Shell/Enthought’s Geosearch. Schlumberger on ‘lean’ data management. Pemex’ 50TB ACN databank. DNV on the EU Optique semantic project.

The Norwegian Expert Center for Information Management's (ECIM) annual E&P data management conference was held earlier this year. To Oil IT Journal's great regret we were unable to attend the show, but the organizers kindly provided access to the presentations from which this report is derived.

Henning Lillejord and Stein Sigbjørnsen (ConocoPhillips) described 4D as driving a paradigm shift in seismic data acquisition and processing. The super-major's Ekofisk field has had an extraordinary history and was given a new lease of life when the topsides were physically jacked up in 1989 and a massive secondary recovery program initiated. This has been tracked with four conventional seismic surveys from 1989 to 2008, when a value of information study was used to justify a 15-year drilling and monitoring campaign including yearly 'life-of-field' (LoF) 4D surveys. Four-component LoF data is used to image the central part of the field beneath the gas cloud. A single survey over the 200 km of ocean-bottom cable produces 35TB of data—streamed ashore over a dedicated 1Gbps fiber link. Between surveys the array is used passively as an 'Optowave' seismic activity monitoring system, with a snapshot image taken every 10 seconds and transferred to the University of Bergen and Norsar for seismic event detection. Around 20GB of data is collected per day, creating interesting data management issues for ConocoPhillips, its contractors and the Norwegian authorities.
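
For a sense of scale, here is a rough back-of-envelope calculation in Python using the figures quoted above: 35TB per survey, a dedicated 1Gbps link and around 20GB per day of passive data. Transfer overheads and duty cycles are ignored; this is an illustration, not a description of ConocoPhillips' actual workflow.

```python
# Back-of-envelope arithmetic on the Ekofisk LoF data volumes quoted above.
# Figures from the talk: 35 TB per survey, a 1 Gbps shore link and around
# 20 GB/day of passive Optowave data. Overheads and duty cycles are ignored.

TB = 1e12                    # bytes
GB = 1e9                     # bytes
LINK_BYTES_PER_S = 1e9 / 8   # 1 Gbps expressed in bytes per second

survey_bytes = 35 * TB
transfer_days = survey_bytes / LINK_BYTES_PER_S / 86400
print(f"One 35 TB survey at 1 Gbps: ~{transfer_days:.1f} days of continuous streaming")

passive_tb_per_year = 20 * GB * 365 / TB
print(f"Passive Optowave monitoring: ~{passive_tb_per_year:.1f} TB/year")
```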

Sivert Kibsgaard (Statoil) offered another slant on growing data volumes with the shift to managing a prestack databank. Statoil has around 1.5PB of data online and some 10PB on tape. The Snorre LoF project generates 300TB every six months. Storage costs are a critical issue: under Diskos pricing, offline storage enjoys a 40x price advantage over online. The cost of backing up and replicating a multi-petabyte Diskos is scary. The 'big data' issue is also forcing a second look at Norway's 'Yellow Book' seismic reporting regulations.
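
To make the 40x figure concrete, the short Python sketch below runs the arithmetic on the volumes Kibsgaard quoted (1.5PB online, 10PB on tape, 300TB of Snorre LoF data every six months). The per-terabyte rate is an arbitrary placeholder, not a Diskos tariff.

```python
# Illustrative cost arithmetic on the volumes quoted above: 1.5 PB online,
# 10 PB on tape, 300 TB of new Snorre LoF data every six months, and a 40x
# offline-vs-online price advantage per Diskos. The per-terabyte rate is an
# arbitrary placeholder, not a Diskos tariff.

ONLINE_PER_TB = 100.0                 # hypothetical annual cost per TB online
OFFLINE_PER_TB = ONLINE_PER_TB / 40   # 40x cheaper offline

online_tb, tape_tb = 1_500, 10_000
growth_tb_per_year = 2 * 300          # Snorre LoF alone

mixed = online_tb * ONLINE_PER_TB + tape_tb * OFFLINE_PER_TB
all_online = (online_tb + tape_tb) * ONLINE_PER_TB
print(f"Current online/tape split: {mixed:,.0f} cost units per year")
print(f"Same 11.5 PB held online:  {all_online:,.0f} cost units per year")
print(f"LoF growth adds {growth_tb_per_year} TB/year, "
      f"{growth_tb_per_year * ONLINE_PER_TB:,.0f} units/year if kept online")
```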

Vitaly Voblikov presented the data environment that Lukoil Overseas is building to support its expanding worldwide interpretation, drilling and production activity. Lukoil has around 150TB of online data to manage and is working on a data infrastructure that combines public data sources (Iris 21/Enerdeq), application data stores (Petrel, Kingdom, Geolog and Roxar RMS) and an in-house developed BasPro corporate database. Kadme's Whereoil is used as a front end and search portal to all of the above. Lukoil is now working to capture interpretation results in an automated process that identifies projects created in a specific application, tags them with contextual information relating to contracts and licenses, and performs data QC. Project synchronization across Lukoil's international operations is achieved with Quantum's StorNext technology.
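
The sketch below illustrates, in schematic Python, the kind of automated capture step described: identify application projects on disk, tag them with license context and run a basic QC check. The file signatures, fields and rules are hypothetical and do not reflect Lukoil's or Kadme's actual implementation.

```python
# Schematic sketch of an automated interpretation-capture step: find projects,
# tag them with contract/license context and flag missing metadata. All paths,
# file signatures and rules below are hypothetical illustrations.

from dataclasses import dataclass, field
from pathlib import Path

@dataclass
class ProjectRecord:
    path: Path
    application: str                 # e.g. "Petrel", "Kingdom", "Geolog", "RMS"
    tags: dict = field(default_factory=dict)
    qc_issues: list = field(default_factory=list)

APP_SIGNATURES = {".pet": "Petrel", ".tks": "Kingdom"}   # assumed file markers

def scan_projects(root: Path) -> list[ProjectRecord]:
    """Walk a project share and identify projects by file signature."""
    records = []
    for p in root.rglob("*"):
        app = APP_SIGNATURES.get(p.suffix.lower())
        if app:
            records.append(ProjectRecord(path=p, application=app))
    return records

def tag_and_qc(rec: ProjectRecord, license_lookup: dict) -> ProjectRecord:
    """Attach contract/license context and flag projects lacking it."""
    rec.tags["license"] = license_lookup.get(rec.path.parts[-2], "UNKNOWN")
    if rec.tags["license"] == "UNKNOWN":
        rec.qc_issues.append("no license context found")
    return rec
```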

Eldar Bjørge described how Statoil is refocusing its data strategy to achieve the elusive goal of 'quality data and active ownership of the data asset.' Despite hundreds of best practices and how-to guidelines per data type, it remains hard to connect the data management function to the geosciences, and onboarding new personnel is difficult. Data governance is harder than it looks, as data owners tend to delegate complex tasks down the hierarchy to local resources, with uncertain results. Statoil is working to fix these issues with better linkage between data management and geoscience, a new IM/IT architecture and a clarification of data ownership roles and responsibilities. The new push includes an extension into document management and a life-cycle approach to data quality and retention—including automatic deletion. A key component of the new order is the allocation of sufficient time for data capture and QC—now a part of every project's deliverables.

Elisabeth Hegle (Cairn) provided an update on the state of play in data management training in Norway. Back in 2008, ECIM sponsored the first year of what was to be a three-year IM program at the University of Stavanger. Unfortunately, the university did not continue with the second module of the program. ECIM has now turned to Aberdeen University, whose online MSc in information management appears to fit the bill.

A joint presentation from Shell's Dan Petcu and Enthought CEO Eric Jones outlined 'data management friendly' workflows. Shell has leveraged Enthought's Python programming toolset in 'Geosearch,' a multi-application, multi-database front end for Shell's in-house developed NDi Explorer. Geosearch goes beyond data monitoring and query with the ability to push data back to applications.
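
The toy Python sketch below illustrates the general 'query across sources, push results back' pattern that Geosearch is said to implement. The classes and methods are invented for illustration and do not reflect Shell's NDi Explorer or Enthought's actual APIs.

```python
# Toy illustration of a federated query front end that can also push edited
# records back to a source. All names and structures here are hypothetical.

class DataSource:
    """Common interface over an application data store or database."""
    def __init__(self, name, records):
        self.name = name
        self._records = records          # list of dicts in this toy example

    def query(self, **criteria):
        return [r for r in self._records
                if all(r.get(k) == v for k, v in criteria.items())]

    def push(self, record):
        """Write a (possibly corrected) record back to the source."""
        self._records.append(record)

def federated_query(sources, **criteria):
    """Run the same query over several sources and tag results by origin."""
    hits = []
    for src in sources:
        hits.extend({**r, "_source": src.name} for r in src.query(**criteria))
    return hits

# Usage: query two toy stores for a well, then push a reconciled record back.
petrel = DataSource("petrel_store", [{"well": "A-1", "td_m": 3100}])
corp = DataSource("corporate_db", [{"well": "A-1", "td_m": 3090}])
results = federated_query([petrel, corp], well="A-1")
corp.push({"well": "A-1", "td_m": 3100, "note": "reconciled"})
```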

Gary Murphy (Schlumberger/InnerLogix) provided an elegant exposé/commercial on the merits of the manufacturing model for controlling upstream data quality. The current InnerLogix was inspired by the QC methods used in manufacturing. In an effort to improve the product, Schlumberger has taken a second look at how quality is controlled in manufacturing and how these concepts could be applied to E&P data. The study homed in on statistical process control (SPC), which involves flowcharting the production process, measuring different parameters and using 'Pareto glitches' to backtrack through the process and pinpoint the root causes of defects. Murphy suggested that QC in the future will see 'more aggressive' sampling of E&P data at appropriate points over time and the use of SPC to reduce variability by identifying and tracking the root causes of glitches. He also suggested that companies should work with their suppliers to produce quality-enabled applications and exchange formats.
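
The short Python example below shows the two SPC ideas in play, a Pareto ranking of defect causes and 3-sigma control limits on a sampled error rate, applied to made-up data quality figures. It is a generic illustration of the technique, not InnerLogix code.

```python
# Generic SPC illustration for E&P data quality: a Pareto ranking of defect
# root causes and 3-sigma control limits on a sampled error rate. All counts
# and samples below are invented for illustration.

from collections import Counter
from statistics import mean, stdev

# Pareto analysis: which root causes account for most data 'glitches'?
defects = Counter({"missing datum/CRS": 42, "bad well name": 31,
                   "unit mismatch": 11, "duplicate record": 7, "other": 4})
total = sum(defects.values())
cumulative = 0
for cause, n in defects.most_common():
    cumulative += n
    print(f"{cause:20s} {n:3d}  cumulative {100 * cumulative / total:5.1f}%")

# Control chart: flag batches outside mean +/- 3 sigma of the baseline
# error rate measured at a given sampling point in the data flow.
error_rates = [2.1, 1.9, 2.3, 2.0, 2.2, 5.8, 2.1, 1.8]    # % defective per batch
mu, sigma = mean(error_rates[:5]), stdev(error_rates[:5])  # baseline batches
for i, x in enumerate(error_rates):
    if abs(x - mu) > 3 * sigma:
        print(f"batch {i}: {x}% defective is out of control -> trace root cause")
```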

Simone Andre described how Halliburton helped Pemex build a Petrobank-based seismic data repository, the ‘Acervo Geofísico Nacional,’ now a cleansed and reviewed 50 terabyte upstream dataset.

DNV’s Christian Hansen lifted the lid on the EU ‘Optique’ project, running under the auspices of the EU’s 7th Framework program. One use case from Statoil is to leverage semantic technology to answer the following type of geosciences questions—’Within this area of interest, return the wellbores that penetrate chronostrat unit C1 and return information about the lithostratigraphy and the hydrocarbon content.’ The Optique project is headed up by semantic specialist Fluid Operations of Walldorf, Germany. Others in the €14 million semantic junket include Siemens, Statoil, Schlumberger, Oracle, IBM, Halliburton, Microsoft and EMC. More from Optique and from Fluid. Visit ECIM.

