The Digital Twin in Oil & Gas, an investigation

Oil IT Journal looks behind the hype at ‘twins’ from ABB, BHGE, BP, eDrilling, Halliburton, Kongsberg, Lloyds Register, Maplesoft and Siemens.

In a recent Google groups discussion, Jon Awbrey opined (in a different context) that ‘in fields … where fundamental progress is rare if not non-existent, one way researchers can stave off a sense of stagnation is by playing musical chairs with terminology.’ This could be a characterization of the current use of the term ‘digital twin’ which, on the face of it, is a new buzzword for the ‘simulator’, one of the very earliest applications of information technology. Imagine if Exxon’s researchers back in 1955 had coined the ‘digital twin’ term: marketing would have progressed 60 years overnight! Digital models of plants and reservoirs have been around for a very long time and have been widely used to optimize operations. Hitherto these have been mostly forward-modeling, physics-based models. The digital twin notion frequently extends this paradigm with ‘data-driven’ models derived by machine learning from large sets of historical data. Combining physical models with the data-driven approach is a common theme in digital twin literature, but exactly how this squaring of the circle is achieved is usually left to the imagination.
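For the curious, one plausible reading of the physics-plus-data combination is a physics-based forward model corrected by a machine-learned residual. The following Python sketch is ours, not any vendor’s: the toy ‘physics’, the synthetic data and all names are purely illustrative.

    # Hybrid modeling sketch: physics baseline plus ML residual correction.
    import numpy as np
    from sklearn.ensemble import GradientBoostingRegressor

    def physics_model(choke, pressure):
        # Idealized forward model: flow rate from choke opening and pressure.
        return 0.8 * choke * np.sqrt(pressure)

    # 'Historical' operating data (synthetic here): inputs and measured flow.
    rng = np.random.default_rng(0)
    X = rng.uniform([0.1, 100.0], [1.0, 300.0], size=(500, 2))
    measured = 1.1 * physics_model(X[:, 0], X[:, 1]) + rng.normal(0, 0.5, 500)

    # The data-driven part learns what the physics misses (the residual).
    correction = GradientBoostingRegressor().fit(
        X, measured - physics_model(X[:, 0], X[:, 1]))

    def hybrid_predict(choke, pressure):
        # Physics baseline plus learned correction.
        x = np.array([[choke, pressure]])
        return physics_model(choke, pressure) + correction.predict(x)[0]

    print(hybrid_predict(0.5, 200.0))

In this scheme the physics provides a baseline that extrapolates, while the learned term mops up systematic error, which is at least one way of squaring the circle.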

In our investigation we find rather a lot of marketing-style music but also insights into the complexities of tying models together, flanging them up with reality, adding in some AI and making the twin do something useful. Definitions and implementations differ but there is commonality. Before the ‘digital twin’, multiple simulators were used to model various parts of a plant or process. Process parameters were changed in different control ‘loops’, from high frequency automatic local loops to longer term ‘big loop’ updates to plant parameters. The digital twin sets out to consolidate multiple models in a single environment. This environment additionally collects real-time process data which is claimed to enable real-time update of the model(s). Prediction of future performance, and feedback to optimize the process, are further, more ambitious claims.
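To make the consolidation claim concrete, here is a deliberately naive Python sketch, with invented class and tag names, of a single environment holding several sub-models and nudging them with live process data:

    class SubModel:
        # Stand-in for one simulator, reduced to a single gain parameter.
        def __init__(self, name, gain):
            self.name, self.gain = name, gain

        def predict(self, u):
            return self.gain * u

        def recalibrate(self, u, y_measured, rate=0.05):
            # Crude online update: nudge the gain toward what the data implies.
            if u:
                self.gain += rate * (y_measured / u - self.gain)

    class DigitalTwin:
        # One environment consolidating several models plus live data.
        def __init__(self):
            self.models = {}

        def register(self, model):
            self.models[model.name] = model

        def ingest(self, tag, u, y_measured):
            # The high-frequency 'local loop': compare, then recalibrate.
            model = self.models[tag]
            mismatch = y_measured - model.predict(u)
            model.recalibrate(u, y_measured)
            return mismatch

    twin = DigitalTwin()
    twin.register(SubModel('separator', gain=2.0))
    print(twin.ingest('separator', u=10.0, y_measured=22.0))

The hard part, glossed over above, is that real sub-models differ in scope, granularity and time frame, which is where the consolidation claim earns, or fails to earn, its keep.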

Whether or not this is feasible is moot. But the idea of consolidating multiple models into a single environment capable of ingesting real-time data is clearly of interest. In our examination of the digital twin, we see two approaches to achieving the ideal. One is the ‘platform’, i.e. a more-or-less proprietary solution from a single vendor. The other is a standards-based approach using, perhaps, the Functional Mock-up Interface (FMI, see below).

A twin is often presented behind a front-end display similar to that of a control room. But there is a critical difference here. In a traditional control room, the intent is to show an operator what is happening in the real world; a control room is an exercise in situational awareness. In a digital twin, however, it may be hard to distinguish between what is real and what is simulated. The digital twin is intended to stay in step with reality. What happens when reality and the model drift apart is a potential source of both insight and confusion.
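Here is a hedged sketch of how a twin might at least notice such drift; the window size and threshold are arbitrary choices of ours, not anyone’s product logic:

    from collections import deque

    class DriftMonitor:
        # Flag a systematic gap between twin predictions and measurements.
        def __init__(self, window=50, threshold=3.0):
            self.residuals = deque(maxlen=window)
            self.threshold = threshold

        def check(self, predicted, measured):
            self.residuals.append(measured - predicted)
            n = len(self.residuals)
            if n < 10:
                return False  # wait for enough evidence before alarming
            mean = sum(self.residuals) / n
            var = sum((r - mean) ** 2 for r in self.residuals) / (n - 1)
            std = max(var ** 0.5, 1e-9)
            # Alarm when the mean residual is large relative to its scatter.
            return abs(mean) > self.threshold * std / n ** 0.5

The alarm only says that model and reality have parted company. Whether the culprit is a failing sensor, a changed process or a stale model is precisely the insight-or-confusion question.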

Halliburton - The Voice of the Oilfield

The digital twin was elegantly described by Michael Grieves at the 2018 Halliburton/Landmark iLife event as a ‘system of systems’. This is not exactly new; Grieves himself introduced the term back in 2002. Grieves is executive director of the Center for Advanced Manufacturing and Innovative Design at the Florida Institute of Technology, which was spun out of NASA, itself an early proponent of the digital twin notion. Simulators date back to the 1970s. Around the year 2000, 3D digital representations were becoming available. Since then, the digital twin concept has connected the virtual and physical worlds with bidirectional data and information flows. The digital twin is used to test and build equipment in a virtual environment until it performs. ‘Front-running’ simulation, i.e. simulation at the design stage, also got a mention.

In another webcast, Dale McMullin and Ed Marotta presented Halliburton’s Voice of the Oilfield and well construction initiatives that leverage a digital twin and the ‘system of systems’ approach. Here, models are connected via an ‘open co-operative infrastructure’ that supports well construction and completions from subsurface to topsides. Present-day digital twin thinking adds AI, machine learning and analytics to the models such that the twin ‘learns and updates itself’. The system of systems potentially spans reservoir to refining, as described in a 2017 Deloitte Center for Energy Studies publication, ‘Bytes to bbls’. The Voice of the Oilfield, aka Well Construction 4.0, adds prescriptive analytics, combining physics-based and data-derived modeling to offer a ‘scientific foundation along with data-driven adaptability’.

ABB’s twin provides a ‘formidable’ digital data trail

For ABB, the digital twin is a ‘complete and operational’ digital representation combining PLM* data, design models and manufacturing data with real-time information from operations and maintenance. ABB envisages a common digital twin directory that points to data stored in different places to enable simulation, diagnostics, prediction and other use cases. ABB’s twin tracks the ‘formidable digital data-trail’ of CAD drawings, design and build information, and equipment and configuration data. In addition to actual observations, the twin offers algorithms to calculate ‘non-observable parameters’. As an example, ABB cites its electromagnetic flowmeter. Previously this might have been referred to as a ‘virtual gauge’, but the terminological musical chairs now have it as a digital twin. To summarize, ABB’s DT is a ‘directory’ containing a digital image of physical equipment. This is claimed to go beyond a ‘static description’ of the plant, but how the DT is kept in sync with reality over time, and the use to which its simulations are put, could do with some more explanation. A toy example of such a computed ‘virtual’ measurement appears after the footnote below.

* Product lifecycle management.
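ABB’s ‘non-observable parameters’ are what process engineers have long called soft sensors. Here is a textbook example sketched in Python; the orifice equation is standard physics, while the discharge coefficient and all numbers are illustrative:

    import math

    def orifice_mass_flow(dp_pa, density_kg_m3, orifice_d_m, cd=0.61):
        # Mass flow (kg/s) inferred from an observable differential pressure,
        # via the standard orifice equation m_dot = Cd * A * sqrt(2*rho*dP).
        area = math.pi * (orifice_d_m / 2.0) ** 2
        return cd * area * math.sqrt(2.0 * density_kg_m3 * dp_pa)

    # Observables: 25 kPa across a 50 mm orifice, fluid density 850 kg/m3.
    print(orifice_mass_flow(25e3, 850.0, 0.050))  # the 'virtual' measurement

Whether one calls this a virtual gauge or a digital twin is, as noted above, largely a matter of terminological fashion.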

Maplesoft - MapleSim and the Functional Mock-up Interface

A publication from Maplesoft describes the ‘virtual commissioning’ (VC) of manufacturing systems leveraging a ‘virtual plant model’. Maplesoft traces VC back to 1999, when ‘soft-commissioning’ was used to debug parts of a future physical system, aka the digital twin. Today, model-driven digital twins are widely used in design and optimization, leveraging tools such as MapleSim which allow the creation of a model-driven digital twin at design time. CAD import technology has been key in making digital twins more accessible. Different models address different aspects of manufacturing: one may model the complex physics of a plant, another the computer control systems. Since 2010, the Functional Mock-up Interface* (FMI) has been used as a standard interface for a variety of model-based processes. The FMI standard organizes model data such that it can be shared across software tools. FMI-format data is shared as a single file containing variable definitions, system equations and other parameters. As of 2017, FMI is supported by over 40 common engineering tools. MapleSim 2018 is presented as a system-level modeling tool for designing digital twins for virtual commissioning and/or system-level models for complex engineering design projects. MapleSim 2018 provides greater toolchain connectivity with the ability to import models from more software tools, and adds support for FMI 2.0 fixed-step co-simulation and model exchange. A minimal FMI usage sketch follows the footnote.

* Other similar functions are achieved with protocols such as the High Level Architecture (defense) and CoLan (petrochemicals). Simulation interoperability is a domain unto itself - see for instance SISO, the Simulation Interoperability Standards Organization.
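For a feel of FMI in practice, the following minimal Python sketch uses the open-source FMPy package; the FMU file name and output variable are placeholders for whatever the exporting tool (MapleSim or another) actually produced:

    # pip install fmpy
    from fmpy import read_model_description, simulate_fmu

    fmu = 'pump_model.fmu'     # hypothetical FMU exported from a design tool

    # An FMU is a zip archive whose modelDescription.xml declares variables.
    description = read_model_description(fmu)
    print([v.name for v in description.modelVariables])

    # Run a fixed-step co-simulation and extract one output of interest.
    result = simulate_fmu(fmu, start_time=0.0, stop_time=10.0,
                          output=['outlet_pressure'])   # placeholder name
    print(result['time'][-1], result['outlet_pressure'][-1])

The point of the standard is exactly this uniformity: the same few calls work whether the FMU came from MapleSim or any of the 40-odd other supporting tools.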

BP - APEX, a ‘welcome new member of the family’

BP’s APEX simulation and surveillance system creates a virtual copy of all of BP’s production systems throughout the world. APEX is a production optimization tool that combines asset models. It doubles as a surveillance tool, used in the field to spot issues before they affect production. In a rather unhelpful analogy, BP compares APEX with ‘a digital twin of the human body’ where, ‘instead of arteries, veins and organs, APEX is programmed with data about each of BP’s wells, their flow regimes and pressures, underpinned by physics-based hydraulic models.’

APEX is claimed to have sped up BP’s simulations from hours to the point where continuous optimization is possible. APEX is further reported as having delivered 30,000 barrels of additional oil and gas production a day in 2017, across BP’s global portfolio. More is expected in 2018. ‘The digital twin is more than just a virtual phenomenon, but a very welcome new member of the BP family.’
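Not APEX, of course, but a toy of the same general shape, per-well performance curves combined under a shared constraint and optimized, fits in a few lines of Python (all curves and limits are invented):

    import numpy as np
    from scipy.optimize import minimize

    def production(q_lift):
        # Diminishing-returns response of three wells to lift-gas rate.
        a = np.array([40.0, 55.0, 30.0])   # invented well potentials
        b = np.array([0.8, 0.5, 1.2])      # invented response rates
        return a * (1.0 - np.exp(-b * q_lift))

    total_lift_gas = 6.0                   # shared resource limit (toy units)

    result = minimize(
        lambda q: -production(q).sum(),    # maximize total production
        x0=np.full(3, total_lift_gas / 3.0),
        bounds=[(0.0, total_lift_gas)] * 3,
        constraints={'type': 'ineq',
                     'fun': lambda q: total_lift_gas - q.sum()})

    print(result.x, -result.fun)           # best lift-gas split, total rate

The scale of the real thing (thousands of wells, live pressure data, rigorous hydraulics) is what separates a toy like this from a 30,000 barrel-a-day claim.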

Kongsberg - Kognifai and the maritime Tesla

Kongsberg rolled out its ‘innovative new digital twin concept’ at the 2018 Offshore Northern Seas conference. The ‘groundbreaking technology’, which connects the digital and physical worlds, has undergone a successful feasibility study with Equinor. The twin is a virtual model of unmanned oil and gas production facilities that leverages Kongsberg’s Kognifai AI platform. The twin integrates disparate data in a single, secure and user-friendly cloud-based platform.

Another take on Kongsberg’s twin came from Lars Meloe’s presentation at the 2018 GBC IIoT in oil and gas conference. Meloe sees the twin as encompassing AI/ML, static, historical and real-time data, and coverage from technical to business usage, all delivered through a context-dependent user interface. Kongsberg’s flagship Yara Birkeland autonomous container ship, aka the ‘maritime Tesla’, embeds Kongsberg’s digital twin and Kognifai. Below decks is a digital twin comprising 3D models and K-Sim simulator models. Another example is Equinor’s Oseberg-H 30/6/H platform, where Kongsberg has deployed its digital twin along with autonomy, SAS on an Edge Gateway and a Kognifai data platform. Read a version of Meloe’s presentation online here.

Siemens

Speaking at the 2018 GBC IIoT conference, Elgonda LaGrange of Siemens’ Dresser-Rand unit described Siemens’ MindSphere platform and Comos engineering data model as comprising its digital twin, aka SimCenter. Luc Goosens added that the digital twin concept evolved out of systems engineering and product lifecycle management (PLM). Siemens has gathered multiple, previously disconnected simulators and applications (STAR-CCM+, NX CAE, NX NASTRAN, LMS and more) into its twin, which is claimed to allow complex cyber-physical systems to be designed and compliance-tested before manufacture. SimCenter ‘combines physics-based simulations with insights gained from data analytics’. Goosens concluded that ‘a digital twin without a closed-loop reality check is an illusion’.

Baker Hughes GE

Baker Hughes, currently a GE company, has deployed digital twin technology to track and optimize its supply chain. The twin considers factors such as part delays and weather disruptions to continuously update and share information across multiple organizations, ‘improving delivery times, reducing inventory costs and creating efficiencies’. The twin was developed by an interdisciplinary team from GE Global Research and BHGE. GE claims to have ‘more than a million’ digital twins already deployed across its business but believes that the application of digital twin technology to the supply chain is ‘breaking new ground’. Curiously, however, the BHGE announcement makes no mention of Predix, previously presented as GE’s solution to all software and cloud integration problems.

Lloyds Register/GE

In a curious announcement, Lloyds Register has granted ‘approval in principle’ (AiP) to GE’s Predix asset performance management (APM) system under LR’s new ‘digital compliance framework’. The AiP means that GE is recognized as ‘providing assurance that Predix APM meets the data, technology and software requirements for a predictive system’. The ‘digital twin ready’ AiP is the first level of approval in LR’s digital compliance framework. Further approval levels can be applied to the digital twin throughout its creation and deployment, with the ‘approved’, ‘commissioned’ and ‘live’ notations. ‘Digital compliance’ was validated in a joint LR/GE ‘co-creation project’. It is unclear whether LR now considers itself a certification body or a marketing organization. A bit of both, perhaps?

eDrilling

Norwegian eDrilling’s digital twin is used across well planning, drilling monitoring and real-time optimization. The tool supports real-time forward simulations to avoid drilling problems. In our quick spin through the digital twin landscape, eDrilling’s control room has to be one of the sexiest interfaces to ... either reality, or a simulation, or again, a bit of both!
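To give a flavor of what real-time forward simulation means in drilling, here is a one-equation caricature in Python: project equivalent circulating density (ECD) ahead of the bit and check that it stays inside the pore-pressure/fracture-gradient window. The simplified hydraulics and all numbers are ours, not eDrilling’s models:

    def ecd(mud_density, annular_friction_loss_pa, tvd_m, g=9.81):
        # Equivalent circulating density: static mud weight plus the dynamic
        # friction contribution, expressed as a density (kg/m3).
        return mud_density + annular_friction_loss_pa / (g * tvd_m)

    def check_window(mud_density, friction_pa, depths, pore, frac):
        for tvd, lo, hi in zip(depths, pore, frac):
            rho = ecd(mud_density, friction_pa, tvd)
            if not (lo < rho < hi):
                return 'problem predicted at %d m: ECD %.0f kg/m3' % (tvd, rho)
        return 'window OK over the look-ahead interval'

    # Look ahead three depths; pore/frac limits expressed as densities.
    print(check_window(1250.0, 3.0e5,
                       depths=[3000, 3030, 3060],
                       pore=[1180, 1185, 1300],
                       frac=[1420, 1415, 1410]))

A real system runs full transient hydraulics against live downhole data, but the principle, simulate a little ahead of reality and flag trouble before it happens, is the same.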

In conclusion ...

The digital twin buzzword has more substance to it than the ‘digital transformation’ we reviewed last month. It is an interesting notion, but is it new? Industry has got used to handling multiple simulators for many specific tasks. Computer control operates at many levels, from local loops controlling how valves open and shut, to ‘big loop’ adjustments made with reference to an overarching model of a facility. What is difficult is to bring these different models, each with its own scope, granularity and time frame, together. What is perhaps even more difficult is to match model performance to real-time data and act on the results. Calling the whole shebang a ‘digital twin’ does not progress these issues much.

© Oil IT Journal - all rights reserved.