Cold Lake, big data

ExxonMobil’s Canadian acquisition brings a multi-million-document archive and real-time ‘big data.’ Papers presented at SMi (London) and the Microsoft GEF show how ‘everything is changing’ in data management.

At Cold Lake in Alberta, ExxonMobil unit Imperial Oil is developing a huge oil sands resource using cyclic steam stimulation, steam-assisted gravity drainage and other more exotic technologies. Today, Cold Lake produces 140,000 bbls/day of bitumen from four major plants and 4,600 wells. Ongoing drilling from ‘megapads’ targets a doubling of production and generates information at a rate that is stressing Exxon’s data management to the limit.

Two presentations this month offered an unprecedented insight into how Exxon’s long-established data infrastructure is adapting to the ‘big data’ era of continuous drilling. Speaking at the SMi E&P data management conference in London, Jim Whelan described how ‘everything is changing’ in data and information management. ‘We used to have well-defined data sources, but acquisitions and mergers, especially in non-conventionals, have meant that we now have a wealth of information in archives that are more or less well managed.’ Real-time ‘big data’ is swamping established processes.

Speaking at the 2013 Microsoft Global Energy Forum in Houston, Exxon’s Bret McKee offered a new slant on information management in this high-throughput environment, leveraging Microsoft SharePoint. Here, an information framework developed with help from Access Sciences is used to register incoming documents in SharePoint according to enterprise keywords from a standard list.
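The registration step above can be sketched in outline. This is a minimal, hypothetical illustration of matching an incoming document against a controlled keyword list; the names (`STANDARD_KEYWORDS`, `tag_document`) and the keywords themselves are invented for illustration and are not Exxon’s taxonomy or SharePoint’s actual API.

```python
# Hypothetical sketch: assign enterprise keywords to an incoming document
# by matching its title and text against a standard, controlled list.
STANDARD_KEYWORDS = [
    "bitumen", "cyclic steam stimulation", "SAGD",
    "well log", "megapad", "facilities",
]

def tag_document(title: str, body: str) -> list[str]:
    """Return the standard keywords found in a document (case-insensitive)."""
    text = f"{title} {body}".lower()
    return [kw for kw in STANDARD_KEYWORDS if kw.lower() in text]

print(tag_document("Megapad 7 SAGD well log review", "Bitumen rates are up."))
# → ['bitumen', 'SAGD', 'well log', 'megapad']
```

In a real deployment the matched terms would be written to the document’s metadata fields so that search and records-management policies can act on them; the value of the approach lies in the keywords coming from one standard list rather than ad hoc user tags.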

But the ‘back office,’ as described in Whelan’s talk, is a mighty enterprise data infrastructure based around a corporate database and well-established procedures. These seek to capture and QC incoming data into the ‘InfoStore’ before it gets used and abused by the projects. Exxon’s environment includes Recall, Petrel, PetroWeb, ESRI, SharePoint and many proprietary applications, all connected by ‘StanLAN,’ a standardized network. The Cold Lake use case worked because of the time spent on data up front, notably on scanning around 2.5 million legacy documents. Users are now very happy, but the project will need ongoing resources to assure sustainability and combat data ‘entropy.’

The next major task is hooking up all the real time data coming from Cold Lake’s drilling—here Witsml has been mooted. Control system integration is also on the cards. A ‘digital energy’ environment is under construction at Cold Lake linking the Honeywell TPS/Experion PKS control system via secure StanLAN network devices from Enterasys. Exxon is developing a SharePoint site for its Cold Lake programmatics group, tasked with migration to a new ‘General managed environment.’ More on Cold Lake, and on SMi E&P Data Management and the 2013 GEF.
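To give a flavor of the Witsml route mooted above: a Witsml log is an XML document whose curve definitions can be read with any standard XML library. The snippet below is a hand-made, simplified example in the style of Witsml 1.4.1.1 (using the standard `http://www.witsml.org/schemas/1series` namespace), not actual Cold Lake data.

```python
# Minimal sketch: extract curve mnemonics from a simplified Witsml log.
import xml.etree.ElementTree as ET

WITSML_NS = "http://www.witsml.org/schemas/1series"

SNIPPET = f"""<logs xmlns="{WITSML_NS}" version="1.4.1.1">
  <log uidWell="W-001" uidWellbore="WB-001" uid="L-001">
    <nameWell>Example well</nameWell>
    <logCurveInfo uid="lci-1"><mnemonic>DEPT</mnemonic><unit>m</unit></logCurveInfo>
    <logCurveInfo uid="lci-2"><mnemonic>ROP</mnemonic><unit>m/h</unit></logCurveInfo>
  </log>
</logs>"""

def curve_mnemonics(xml_text: str) -> list[str]:
    """Return the mnemonic of each curve declared in a Witsml log document."""
    root = ET.fromstring(xml_text)
    ns = {"w": WITSML_NS}
    return [m.text for m in root.findall(".//w:logCurveInfo/w:mnemonic", ns)]

print(curve_mnemonics(SNIPPET))  # → ['DEPT', 'ROP']
```

In practice a Witsml server streams such documents over a web-service API, so the parsing step would sit behind a polling or subscription loop rather than a literal string.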


© Oil IT Journal - all rights reserved.