We were intrigued by the title of an article in the Rockwell Journal that promised to explain the ‘fundamentals of digital twins’. The short posting makes a case that what distinguishes a DT from a ‘simulator’ is that the former is a ‘living digital replica’ and the latter is … well, it’s really not very clear. Strip out the marketing baloney and you are left with the concept of a simulator that is tuned to mimic a machine, plant or reservoir as accurately as possible. The article points the reader to a combination of MapleSim and Rockwell’s own Studio 5000 design environment. The referenced white paper in turn points to a use case of ‘virtual commissioning’ that dates back some 15 years. No, the digital twin is really nothing new!
Hedging its bets on the newness issue, we have the ‘Digital Twin Consortium’, which describes the DT as ‘relatively new’. The DTC now offers an open-source repository to ‘help DT communities collaborate while building the market’. To date, the most active repo appears to be Bema’s Ecolcafé, an ERP/MES system that underpins ‘Torréfacteur 4.0’, an Industrie 4.0 coffee roastery.
IT/OT data management specialist Element Analytics has rolled out tools for digital twin builders in the form of a connector portal that supports knowledge graph-based modeling and ‘advanced joins’. Element’s Unify Graph is said to support the mapping of complex data environments. Data can be exported for consumption by other graph database products such as AWS Neptune or Neo4j. ‘Advanced joins’ let users combine data from various sources using approaches including fuzzy and ‘contains’ matching. Connectors are available for Amazon S3, IoT SiteWise, Azure Blob, Ignition, KepWare and others. More from Element.
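To illustrate the idea behind fuzzy and ‘contains’ matching (not Element’s actual implementation, which is proprietary), here is a minimal Python sketch using the standard-library difflib. The tag names and values are invented for illustration: two hypothetical source systems name the same equipment slightly differently, and the two join strategies recover the links.

```python
from difflib import SequenceMatcher

# Hypothetical sample data: equipment tags from an OT system and records
# from an IT system that uses slightly different naming conventions.
ot_tags = ["PUMP-101", "COMP-205", "VALVE-33"]
it_records = {
    "Unit PUMP-101 discharge": 88.0,  # key *contains* the OT tag
    "COMP_205": 42.0,                 # near-match: underscore vs hyphen
    "HEATER-7": 15.0,                 # no counterpart in the OT list
}

def contains_join(tags, records):
    """'Contains' matching: link a tag to any record whose key embeds it."""
    return {t: v for t in tags for k, v in records.items() if t in k}

def fuzzy_join(tags, records, threshold=0.8):
    """Fuzzy matching: link a tag to its most similar record key,
    but only when the similarity ratio clears the threshold."""
    out = {}
    for t in tags:
        best = max(records, key=lambda k: SequenceMatcher(None, t, k).ratio())
        if SequenceMatcher(None, t, best).ratio() >= threshold:
            out[t] = records[best]
    return out

print(contains_join(ot_tags, it_records))  # finds PUMP-101 by substring
print(fuzzy_join(ot_tags, it_records))     # finds COMP-205 by similarity
```

Note that the two strategies are complementary: substring matching catches tags embedded in longer descriptions, while fuzzy matching tolerates small spelling differences that defeat an exact or substring join.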
In its glossary, the Digital Twin Consortium defines the DT as ‘using real-time and historical data to represent the past and present and simulate predicted futures’. If you are into reservoir simulation you will recognize the phases of model building, history match and production forecasting. How ‘new’ is that? Well, our records of this activity point back to 1955 and the work done at Humble Oil Co (later Exxon), among the first to use computers to model oil reservoirs.
© Oil IT Journal - all rights reserved.