Upstream Intelligence Data Driven Drilling & Production Conference, Houston

Anadarko DPAT drilling automation. Total D-WIS drilling interop proposal. Anadarko diffusive time-of-flight. Repsol’s EarthSpy, the answer to ‘plummeting’ upstream profitability. LNS Research defines the digital twin. Red Hat OpenShift Energy Commons. More on OSDU. EarthPeel seamless geoscience in the cloud.

Dingzhou Cao from Anadarko’s advanced analytics and emerging technologies unit presented Anadarko’s real-time drilling (RTD) journey at the Upstream Intelligence Data Driven Drilling & Production conference earlier this year. Anadarko’s first-generation RTD analytics system leveraged a physics/rule-based approach, ingesting Witsml drilling data into a StreamBase (now a Tibco unit) real-time data mart for exploitation with MapR. The system proved poor at recognizing drilling states in real time. The second-generation RTD system is AI-based, built on a Google BigQuery database and a machine learning pipeline running on Kubernetes. Tibco StreamBase was again deployed for complex event processing, with results captured to MongoDB.
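
By way of illustration, here is a minimal sketch of one such pipeline stage: streaming drilling records are scored by a model and persisted to MongoDB. All names (drilling_stream, score_record) and the threshold logic are hypothetical stand-ins, not Anadarko’s code, and a local MongoDB instance is assumed.

```python
# Hypothetical sketch of a second-generation RTD pipeline stage: score
# streaming drilling records with a trained model and persist results.
# Names and thresholds are illustrative stand-ins, not Anadarko's code.
from pymongo import MongoClient

def drilling_stream():
    """Stand-in for a StreamBase/Kafka feed of Witsml-derived records."""
    yield {"well": "W-1", "time": "2019-06-01T00:00:00Z", "spp_psi": 2950.0}
    yield {"well": "W-1", "time": "2019-06-01T00:00:01Z", "spp_psi": 3010.0}

def score_record(rec):
    """Placeholder scorer; the real system used a deep learning model."""
    return "rotate" if rec["spp_psi"] > 3000 else "slide"

client = MongoClient("mongodb://localhost:27017")  # assumed local results store
results = client["rtd"]["drilling_states"]

for rec in drilling_stream():
    rec["state"] = score_record(rec)   # tag each record with a drilling state
    results.insert_one(rec)            # capture to MongoDB, as in the talk
```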

The new system leverages a convolutional neural net and ‘semantic segmentation’ (pattern recognition) to distinguish between rotate and slide (directional) drilling states. A U-Net deep learning model can recognize drilling states from limited measurements (of standpipe pressure). Latterly, the system has undergone a ‘digital transformation’ and is now rolled up into the ‘DPAT’ drilling program automation tool and a Google cloud-based ML pipeline. The only drawback is that ‘the engineers love Spotfire and Excel’. On the plus side, DPAT has automated the whole process in the cloud on a StreamBase high-availability architecture.
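
For readers unfamiliar with the approach, the following is a toy 1D U-Net in PyTorch that labels each sample of a standpipe pressure trace as rotate or slide, i.e. the semantic-segmentation formulation described above. The architecture and layer sizes are illustrative only; the production network’s details have not been published.

```python
# Toy 1-D U-Net for per-sample drilling-state segmentation from a single
# standpipe-pressure channel. Sizes are illustrative, not the real model.
import torch
import torch.nn as nn

class TinyUNet1D(nn.Module):
    def __init__(self, channels=1, classes=2):  # classes: rotate vs. slide
        super().__init__()
        self.enc = nn.Sequential(                # encoder at full resolution
            nn.Conv1d(channels, 16, 3, padding=1), nn.ReLU(),
            nn.Conv1d(16, 16, 3, padding=1), nn.ReLU())
        self.pool = nn.MaxPool1d(2)              # downsample by 2
        self.mid = nn.Sequential(                # bottleneck
            nn.Conv1d(16, 32, 3, padding=1), nn.ReLU())
        self.up = nn.ConvTranspose1d(32, 16, 2, stride=2)  # upsample back
        self.dec = nn.Sequential(                # decoder after skip concat
            nn.Conv1d(32, 16, 3, padding=1), nn.ReLU(),
            nn.Conv1d(16, classes, 1))           # per-sample class logits

    def forward(self, x):
        e = self.enc(x)                          # skip-connection source
        m = self.mid(self.pool(e))
        u = self.up(m)
        return self.dec(torch.cat([u, e], dim=1))

# One batch of 8 pressure traces, 256 samples each:
logits = TinyUNet1D()(torch.randn(8, 1, 256))    # -> shape (8, 2, 256)
states = logits.argmax(dim=1)                    # 0=rotate, 1=slide per sample
```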

Darryl Fett, from Total’s US E&P Research & Technology unit, floated a proposal for a drilling and wells interoperability standards (D-WIS) initiative. The aim is to establish recommended practices and standards that enable interoperability between all components, equipment and systems used in oil and gas well construction. Such seamless data exchange will reduce cost, increase efficiency and improve safety, and effective well data management will pave the way for high-end applications in drilling automation. Fett takes inspiration from other initiatives such as the SPE’s DSATS, Norway’s NORCE/DD-Hub and the IADC. He proposes a ‘systems engineering’ approach, coupled with advanced telemetry, data analytics and automation, with a strong focus on the interfaces between components, systems and processes, and on a ‘plug and play’ capability. The objective is to provide decision makers with the best answer rather than just more data. Fett argued for a shift from proprietary systems to an ‘open’ mentality, breaking down the silos. Data ownership is not a new problem, but it is solvable. Contractual and legal issues need to be addressed, as does an economic model that ensures value to all stakeholders. The good news is that the technical part is not very challenging.
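
The ‘plug and play’ idea can be made concrete with a short sketch: a common interface that any rig component could implement, so that consumers interoperate with it without bespoke, per-vendor integration. This is entirely illustrative; D-WIS has not published such an interface, and all names below are invented.

```python
# Illustrative sketch of 'plug and play' interoperability: a common
# contract that a D-WIS-style standard might define for rig components.
# All names are hypothetical; no such interface has been published.
from abc import ABC, abstractmethod

class RigComponent(ABC):
    """Common contract any vendor component would implement."""

    @abstractmethod
    def describe(self) -> dict:
        """Self-describing metadata: vendor, capability, units."""

    @abstractmethod
    def read(self) -> dict:
        """Current measurements in standard units."""

class MudPump(RigComponent):
    """A hypothetical vendor implementation of the standard interface."""

    def describe(self):
        return {"vendor": "AnyCo", "capability": "flow", "units": "L/min"}

    def read(self):
        return {"flow_rate": 2300.0}

# A consumer works with any compliant component, regardless of vendor:
for component in [MudPump()]:
    print(component.describe(), component.read())
```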

Anadarko’s Sathish Sankaran sketched a spectrum of modeling styles, from the full-physics, high-resolution simulator, through upscaled reduced-order models, approximate-physics (streamline) models and hybrid data-driven/physics models, to, finally, physics-free, ML-based data-driven models. Anadarko has tested a random forest ML approach on a deep-water Gulf of Mexico field, combining a reduced-order model with machine learning. Fine-scale training simulations were used to determine an optimal production strategy with a significant speedup (~30x–40x) over a full-field simulation. Another test, on an onshore unconventional asset, used a ‘diffusive time of flight*’ model combined with flowing material balance and non-parametric regression to calculate true well performance and to forecast from routinely measured data.
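
The proxy-model workflow can be sketched as follows: train a random forest on a modest number of fine-scale simulation runs, then screen many candidate production strategies against the fast surrogate rather than the full-field simulator. The data, control-vector dimensions and ‘toy physics’ target below are invented for illustration.

```python
# Hypothetical proxy-model sketch: fit a random forest to a handful of
# fine-scale simulation runs, then screen candidate strategies against
# the fast surrogate instead of the full-field simulator.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Stand-in training data: each row is a control vector (e.g. choke
# settings for 5 wells); the target stands in for simulated cumulative
# production from a fine-scale run (toy physics, invented here).
X_train = rng.uniform(0.0, 1.0, size=(200, 5))
y_train = X_train.sum(axis=1) - 0.5 * (X_train ** 2).sum(axis=1)

proxy = RandomForestRegressor(n_estimators=200, random_state=0)
proxy.fit(X_train, y_train)

# Screen 100,000 candidate strategies in seconds -- the kind of large
# speedup over rerunning the simulator for each candidate.
candidates = rng.uniform(0.0, 1.0, size=(100_000, 5))
best = candidates[np.argmax(proxy.predict(candidates))]
print("best control vector:", best.round(2))
```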

Sankaran concluded that although these approaches work, the industry ‘lacks champions’ for the new technology. He cited a Boston Consulting Group study that found energy to have the highest percentage of ‘digital laggards’ of all industries and one of the lowest percentages of digital champions.

* See also the Texas A&M work on diffusive time-of-flight on OnePetro.

Repsol too has drunk deeply of the machine learning digital Kool-Aid. Raul Cabrera-Garzon, presenting on behalf of Francisco Ortigosa, qualified the transformation of Repsol’s geoscience as a ‘redefinition’ of how Repsol works in exploration. Digital transformation is moreover framed as the answer to ‘plummeting profitability’ in the upstream business. The cloud and a ‘democratization’ of technology are enabling a shift from qualitative seismic interpretation to analysis based on data-rich, quantitative input. Repsol’s trademarked EarthSpy illumination technology is ‘changing the seismic processor experience’ and speeding high-end offshore data processing. Google Cloud Vision also ran.

Industry observers puzzled by the arrival of the ‘digital twin’ in an industry with more simulators than you can shake a stick at should benefit from Joe Perino’s (LNS Research) attempt at a taxonomy. While the marketers promote digital twins as an easy path to industrial transformation, the challenge is understanding exactly what one is! There is no commonly accepted definition or architecture in the process industry. The DT is said to unify the three simulator families of advanced process control, simulation and operator training. The DT replicates the physical process and pinpoints ‘previously undetected or unexplained patterns, meanings, anomalies, and discrepancies’. To deploy a DT, Perino recommends setting realistic expectations, working with an enterprise architect and, in his words, ‘get your data engineering right’. A platform is preferable to point solutions but, warns Perino, ‘avoid lock-in’.

John Archer (Red Hat) reprised his PNEC presentation on open source in the upstream with shout-outs to oil and gas success stories chez BP, Equinor and ExxonMobil. Archer enumerated some data science challenges. Many efforts are stuck on the desktop, as data is often of dubious or inconsistent quality, incomplete, or lacking metadata standards and pedigree. Most companies are still maturing their AI work in a ‘very crowded and fast-moving space’. What’s more, ‘no one wants to be called a citizen data scientist!’ Red Hat’s OpenShift Kubernetes community now has an energy special interest group with BP and Saudi Aramco onboard. The OpenShift platform provides machine learning workflows for data scientists. Interested parties may like to sign up for the upcoming OpenShift Commons gathering on AI/ML (October 28, 2019, San Francisco). The Red Hat Open Data Hub and NooBaa, an open source multi-cloud object gateway, also ran.

Johan Krebbers (Shell) fleshed out the OSDU (open subsurface data universe) project, which will host a new generation of applications, now under development, that enable flexible workflow orchestration. Under the hood is a microservices-driven Kubernetes architecture with an HTML5 GUI (including 3D support). Both physics-based and data-driven applications will feature, ‘exploiting machine learning wherever possible’. Current (legacy) Windows and Linux apps will also be supported in the ‘game changing, desperately needed’ platform. Data is central to the OSDU initiative and will not be left to one company, ‘that has never worked’. Krebbers’ presentation was made before Schlumberger announced its involvement in OSDU.
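
To give a flavor of the microservices pattern OSDU targets, here is a hypothetical REST client querying a well-data search service. The host, path, payload and response shape are invented for illustration; they are not the actual OSDU API.

```python
# Hypothetical illustration of the microservices pattern OSDU targets:
# a client querying a well-data search endpoint over REST. The URL,
# path and payload below are invented; this is not the real OSDU API.
import requests

OSDU_HOST = "https://osdu.example.com"   # hypothetical deployment
query = {"kind": "well", "query": 'basin:"Permian"', "limit": 10}

resp = requests.post(f"{OSDU_HOST}/api/search/query",
                     json=query,
                     headers={"Authorization": "Bearer <token>"},
                     timeout=30)
resp.raise_for_status()
for record in resp.json().get("results", []):  # assumed response shape
    print(record.get("id"))
```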

Ananya Roy and Jeshurun Hembd presented startup EarthPeel and its vision for ‘instant access to geoscience data’. EarthPeel promises a cloud-native architecture, modular components and well-defined REST APIs. Partners are sought for pilot consulting contracts.

More from Upstream Intelligence.


© Oil IT Journal - all rights reserved.