SPE Digital Energy, The Woodlands, Texas part II

The second part of our report from Digital Energy hears from IBM on big data. Multiple presentations from Kuwait Oil and partner Halliburton on the ‘intelligent digital field.’ Schlumberger on running compute intensive Petrel and Matlab jobs on the cluster. And again, from KOC, IDF ‘smart flows.’

IBM’s approach to big data was sketched out by Mike Brulé. Oils are currently unable to accommodate exponentially exploding data volumes. The first-principle, physics-based approach to analysis has been a ‘stumbling block’ in the partnership with IT, which wants to do everything with data mining. But big data represents an opportunity to combine both. Brulé proposed a complex architecture operating at ‘mega to exa scales’ and at speeds from slow loop to real time. Real time streaming data ‘tuples’ feed a complete modeling and simulation environment. This spans acoustics, microseismic, ‘OpenCV’ for visualization and, of course, Hadoop. IBM already has three Hadoop implementations in oil and gas and is reporting analytical speed-ups from ‘days to seconds.’ The approach combines full physics models where appropriate, e.g. for pipeline optimization, with empirical/statistical models where the physics is less well understood. The techniques have been successful in slugging avoidance and corrosion mitigation. IBM’s flagship project is Statoil’s environmental monitoring program. Another use is in exploration bidding, where data can never be analyzed fast enough. Here a data cockpit ‘brings data together’ for the decision makers. In the Q&A, doubts were expressed on two counts: the difficulty of maintaining models, and that of propagating and comprehending uncertainty across multiple models. One observer asked, ‘Is this why the upstream is reticent to apply these techniques?’
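
By way of illustration only (this is not IBM’s architecture), the hybrid physics-plus-data idea can be sketched as a first-principles model whose residuals are corrected by a simple statistical fit. All names, units and numbers below are invented for the sketch.

```python
# A minimal sketch, assuming a single-phase pipeline and synthetic historian data,
# of combining a physics model with an empirical residual correction.
import numpy as np

def physics_pressure_drop(flow_rate, length=1000.0, diameter=0.3,
                          density=850.0, friction_factor=0.02):
    """Darcy-Weisbach pressure drop (Pa) for single-phase flow in a pipe."""
    area = np.pi * (diameter / 2) ** 2
    velocity = flow_rate / area
    return friction_factor * (length / diameter) * 0.5 * density * velocity ** 2

# Hypothetical measurements: the real system deviates from the idealized physics.
rng = np.random.default_rng(0)
flow = rng.uniform(0.05, 0.25, 200)                       # m3/s
measured = physics_pressure_drop(flow) * 1.08 + rng.normal(0, 2000, 200)

# Fit an empirical (statistical) correction to the physics model's residuals.
residual = measured - physics_pressure_drop(flow)
coeffs = np.polyfit(flow, residual, deg=2)

def hybrid_model(flow_rate):
    """Physics prediction plus data-driven residual correction."""
    return physics_pressure_drop(flow_rate) + np.polyval(coeffs, flow_rate)

print(hybrid_model(0.15))   # corrected pressure-drop estimate, Pa
```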

Kuwait Oil (KOC) and supplier Halliburton pulled off something of a coup by dominating the last day of the conference with multiple presentations on KOC’s intelligent digital field (IDF) project. Halliburton’s Doug Johnson gave an overview of the Sabriyah IDF infrastructure that was designed to ‘turn data to actionable information.’ Data from instrumented wells and mobile sensors such as multiphase meters is consolidated at Emerson RTUs at the well site. A WiMax canopy connects field units to the corporate network and on to the collaboration center. Here, pre-defined workflows cover use cases from geology through to operations. KOC’s tool set includes integrated production models, AI/neural networks and expert rules. There are multiple ways of getting at the data; the idea is to present them all and let the ‘smart user’ decide. One smart workflow covers production ‘gains and losses.’ Operators are shown a well’s true potential and have tools that tell them how it should be operating. These include real time pattern analysis and alerts for particular well events. Neural net model prediction has been shown to give an accurate 30-day forecast of oil rate and water cut. Users can also run ‘what if’ scenarios to check what happens if pump or choke settings are changed. The system has been in use for two years, turning data into something that can be intuitively understood. In the Q&A, Johnson was asked about the data infrastructure and underlying ‘plumbing.’ He acknowledged that this is an entirely ‘data driven’ activity and that workflows are highly dependent on the historian. But the data infrastructure was all in place before Halliburton came to the party. A KOC rep added that Finder and other repositories were used to capture multiple data types but that there was ‘no common integration platform.’ This just ‘does not exist.’
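
To make the forecasting and ‘what if’ idea concrete, here is a minimal sketch, not KOC’s system: a small neural network trained on synthetic well features to predict oil rate and water cut 30 days out, then queried for a choke-change scenario. The feature names, units and data are assumptions for illustration.

```python
# Hypothetical 30-day-ahead forecast of oil rate and water cut from well features.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
n = 500
# Assumed features: choke setting (%), tubing head pressure (bar), ESP frequency (Hz).
X = np.column_stack([
    rng.uniform(20, 60, n),
    rng.uniform(50, 150, n),
    rng.uniform(40, 60, n),
])
# Synthetic targets: oil rate (bopd) and water cut (%) 30 days ahead.
y = np.column_stack([
    30 * X[:, 0] - 5 * X[:, 1] + 10 * X[:, 2] + rng.normal(0, 50, n),
    0.5 * X[:, 1] + rng.normal(0, 2, n),
])

model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0),
)
model.fit(X, y)

# 'What if' scenario: open the choke from 40% to 50% and compare forecasts.
base = np.array([[40.0, 100.0, 50.0]])
scenario = np.array([[50.0, 100.0, 50.0]])
print("base forecast (oil rate, water cut):", model.predict(base)[0])
print("choke +10% forecast:               ", model.predict(scenario)[0])
```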

Harish Goel (KOC) delved deeper into the IDF with a talk on diagnostics and optimization. A complex injection program is run from the North Kuwait collaboration center. The key issue is more about managing water than producing oil. KOC uses Halliburton’s DecisionSpace for Production to build its production workflows. The front end user interface is said to be key to take-up. Lots of AI is embedded in the system, leaving engineers free to do their job. Statistical and analytical tools embedded in the smart workflows filter and correlate data and perform Monte Carlo analyses and Pareto plots. One case study of artificial lift optimization resulted in a 420 bopd hike from one well and there are ‘many similar stories.’ Goel wound up by contrasting traditional linear workflows, which take days and produce ‘fragmented’ information, with collaboration mode, where simultaneous action on the same workflow and data is possible for all stakeholders. Lessons learned included the need for ‘multi disciplinary people.’ Change management and an active partnership with IT are also required. In the Q&A, Goel added that the current program is a pilot on two smart wells. The plan is to convert all wells to smart completions over the next decade. Multiple interacting AI-derived models make for lots of ‘moving parts’ and complexity. KOC has handled this with a ‘top down,’ iterative approach and with ample resources; up to 250 people were working on the project at its peak. Goel was asked if the IDF closed the automation loop. The answer is no, not for now; KOC is waiting until the system and its people mature.
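
The flavor of the Monte Carlo and Pareto analysis said to be embedded in the smart workflows can be sketched as follows. This is illustrative only; the gain distributions and loss categories are invented, not KOC’s figures.

```python
# Monte Carlo uncertainty on an artificial-lift gain, plus a Pareto ranking of
# hypothetical production-loss causes.
import numpy as np

rng = np.random.default_rng(2)

# Monte Carlo: uncertainty on the incremental rate from a lift optimization.
pump_gain = rng.normal(450, 80, 10_000)       # bopd, assumed mean and spread
downtime_loss = rng.uniform(0, 60, 10_000)    # bopd lost to intervention downtime
net_gain = pump_gain - downtime_loss
p10, p50, p90 = np.percentile(net_gain, [10, 50, 90])
print(f"net gain P10/P50/P90: {p10:.0f}/{p50:.0f}/{p90:.0f} bopd")

# Pareto ranking of (hypothetical) monthly loss causes, largest first.
losses = {"ESP trips": 1200, "gas-lift issues": 900, "choke limits": 800,
          "well tests": 300, "flowline pigging": 150}
ranked = sorted(losses.items(), key=lambda kv: kv[1], reverse=True)
total = sum(losses.values())
cumulative = 0
for cause, barrels in ranked:
    cumulative += barrels
    print(f"{cause:16s} {barrels:5d} bbl  cum {100 * cumulative / total:5.1f}%")
```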

Dzevat Omeragic described an interesting Petrel workflow to automate geomodel updates. Schlumberger’s researchers are performing geomodeling at borehole scale, inter alia to simulate log responses. This activity leverages a library of physics models running as a service on a high performance compute cluster. The library is shared with Petrel and other applications, notably Matlab. Current well placement workflows are limited to point by point inversion. Using the HPC compute service, it is possible to perform full 3D inversion. Changes in a pick propagate to the model geometry and a solver computes new locations for the pillar nodes. The HPC facility is currently a research project for internal use. But ‘there is no technical reason why the system could not be used by third party applications.’
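
A much-simplified sketch of the underlying idea, that a change in a well pick propagates to nearby model geometry, is shown below. This is not Schlumberger’s solver; it is just a distance-weighted vertical adjustment of pillar-node depths, with all geometry and parameters assumed.

```python
# Propagate a pick change at a well to nearby pillar-node depths (illustrative only).
import numpy as np

def update_pillar_tops(pillar_xy, pillar_z, well_xy, delta_z, radius=500.0):
    """Shift pillar-top depths near a well by a distance-tapered fraction of delta_z."""
    dist = np.linalg.norm(pillar_xy - well_xy, axis=1)
    weight = np.clip(1.0 - dist / radius, 0.0, 1.0)   # linear taper to zero at 'radius'
    return pillar_z + weight * delta_z

# Hypothetical 10 x 10 grid of pillar nodes on a flat horizon at 2000 m depth.
xx, yy = np.meshgrid(np.linspace(0, 1000, 10), np.linspace(0, 1000, 10))
pillar_xy = np.column_stack([xx.ravel(), yy.ravel()])
pillar_z = np.full(len(pillar_xy), 2000.0)

# The interpreter moves a pick 15 m deeper at a well located at (400, 600).
new_z = update_pillar_tops(pillar_xy, pillar_z, np.array([400.0, 600.0]), delta_z=15.0)
print("max node adjustment:", (new_z - pillar_z).max(), "m")
```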

In a second presentation, Harish Goel outlined the KOC IDF’s ‘report and distribution’ smart flow for assigning roles in real time. This smart flow ensures that users in different teams who impact the workflow record their actions. The system generates alarms and keeps track of where everyone is in the workflow. GUI widgets (tickets) show job number, status and days overdue. Supervisors can see quickly if there are gas lift or ESP tickets outstanding. An optimization ticket is generated when an engineer chooses an optimization point. This is routed to stakeholders and the system tracks what actions are taken. The approach is said to minimize the risk of wrong or dangerous requests. Going forward, continuing enhancements to the IDF program are bootstrapped with the report and distribution workflow. As in any major project, many of the technologies are relatively untested. The system lets KOC track who did what and what worked. ‘The big picture is that we want to build a knowledge base of what works in what circumstances.’ More from Digital Energy.
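
A minimal sketch, not KOC’s implementation, of the ticket-style tracking described above follows: each workflow action gets a ticket carrying a job number, status and owner, with a history of who did what and a check for overdue items. All names and dates are hypothetical.

```python
# Illustrative ticket data structure for a report-and-distribution style workflow.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Ticket:
    job_number: str
    kind: str                      # e.g. "gas lift", "ESP", "optimization"
    owner: str
    due: date
    status: str = "open"
    history: list = field(default_factory=list)

    def record(self, actor, action):
        """Log who did what, building up a record of what was tried."""
        self.history.append((date.today(), actor, action))

    def days_overdue(self, today=None):
        today = today or date.today()
        return max((today - self.due).days, 0) if self.status == "open" else 0

# Hypothetical usage: an optimization ticket routed to a production engineer.
t = Ticket("JOB-0042", "optimization", "prod. engineer", due=date(2013, 4, 1))
t.record("prod. engineer", "selected new optimization point")
print(t.job_number, t.status, "days overdue:", t.days_overdue(date(2013, 4, 10)))
```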

© Oil IT Journal - all rights reserved.