What a wonderful picture the ‘Internet of Things’ conjures up. Imagine being able to assemble information from different vendors’ sensors at remote locations and mash it all up into your own monitoring, big data or artificial intelligence system. This is the sort of dream that gets the chattering classes going and has kicked off several attempts to occupy the standards landscape. It has also driven massive re-badging of vendors’ and integrators’ offerings to ‘align’ with the ‘emerging’ IoT (or the Industrial IoT if you like).
Before this quick run-through of some of these standards and solutions, it is worth reflecting on exactly what it would take to achieve IoT nirvana. If you want to grab some data from a remote device, you need to know a lot more than just the ‘value’. Knowing the units of measure would be good too. In fact, there are all sorts of other bits of metadata (the sensor’s position, the local time, the instrument’s dynamic range, calibration status and other idiosyncrasies) that may be essential to a proper, unambiguous use of the data value. Some aspects of the newer standards do indeed allow for more metadata than previously, but all IoT solutions come with a significant gotcha: the more metadata you need, the bigger the message. At the receiving end, managing data from vast arrays of different sensors will create a significant processing overhead. The IoT is not magic.
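To make the metadata overhead concrete, here is a minimal sketch. The field names and values are purely illustrative (they come from no standard), but the arithmetic makes the point: a reading carrying the metadata a remote consumer actually needs is several times the size of the bare value.

```python
import json

# Bare reading: just an identifier and a value (illustrative, not any standard's schema).
bare = {"id": "PT-101", "v": 87.3}

# The same reading with the metadata a remote, third-party consumer might need.
rich = {
    "id": "PT-101",
    "value": 87.3,
    "units": "bar",
    "timestamp": "2018-06-01T12:00:00Z",
    "position": {"lat": 29.76, "lon": -95.37},
    "range": {"min": 0.0, "max": 150.0},
    "calibration": {"status": "ok", "last": "2018-05-15"},
}

bare_size = len(json.dumps(bare))
rich_size = len(json.dumps(rich))
print(bare_size, rich_size)  # the rich message is several times larger
```

Multiply that ratio by tens of thousands of sensors sampling every second and the network and processing bill becomes clear.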
For the sake of this introduction to the IoT, we propose a three-tiered analysis: political, standards and protocol. At the political level, politicians, egged on by the IT consulting community, dream up a rosy picture of the merits of a super IoT ‘platform’. The standards bodies, often populated by vendors with a large installed proprietary base to which disruption would be awkward, come up with something that sounds nice but is unlikely to cause them too much trouble. Finally, the skunk works artists and interoperability zealots dig down into the protocol landscape to see if they can find anything relevant to their own needs.
One IoT birthplace is Germany’s Industrie 4.0, a politically-inspired attempt to sell ‘digitalization’ into an already highly digital market, the factory. The idea was that instead of having a lot of competing digital protocols on the factory floor, one overarching Industrie 4.0 system would allow manufacturers to be more ‘competitive’. The organization has defined Rami, the reference architecture model for Industrie 4.0. But, from the latest ‘official’ Industrie 4.0 publication, it appears that the initiative is still struggling with proprietary IoT platforms from SAP, Bosch and others, even if these folks do manage to get together for tradeshow interoperability demos. The I4.0 conclusion and outlook is that ‘(today) few platforms are designed to make full use of the advantages and opportunities that platform-based business models offer. Most initiatives do not fully harness the power of the network effect. The (proprietary) platforms shield themselves from potentially harmful competition. The prevalent mindset is one of ‘platform protectionism’ and risk aversion’. That’s telling it like it is.
In parallel with Industrie 4.0 and its Rami architecture, a large group of (mostly) US companies formed the Industrial Internet Consortium to ‘bring together the organizations and technologies necessary to accelerate the growth of the Industrial Internet by identifying, assembling and promoting best practices’. Like Industrie 4.0, the IIC eschews anything that could be considered a communications protocol. So, nothing is ever going to ‘run’ on either Industrie 4.0 or the IIC. Early in 2018, the two bodies ‘announced the publication’ of a joint whitepaper, ‘architecture alignment and interoperability’, detailing ‘alignment’ of the two reference architectures*. A late arrival at the IoT ball is the World Wide Web Consortium, W3, which has put forward its RDF/Linked Data work as ‘important to the field of graph data’ and the ‘web of things’. The W3 is working on a standardization effort to use graph databases as ‘an important enabler for the IoT, big data, data management and data governance’. The International Standards Organization has also got onto the IoT bandwagon with a 77-page ‘reference framework’ from its ISO/IEC JTC 1/SC 41 technical committee. The ISO reference architecture proposes a ‘common vocabulary, reusable designs and industry best practice’. The framework is available from the ISO Store, a snip at CHF198.
* We have been skeptical about reference architectures since we first encountered Mura, the Microsoft upstream reference architecture which seemed at the time to be a rather nebulous marketing concept.
Digging down into the IoT we eventually come across some communications protocols of interest. According to ARC Advisory, OPC UA is ‘well positioned’ as a basis for IoT solutions. OPC Unified Architecture is a platform- and vendor-independent communication technology for secure and reliable data exchange across the different levels of the automation pyramid. We discussed the extent to which OPC UA supports unambiguous exchange of metadata with Matrikon France’s Antoine Capitaine, who confirmed that it is indeed possible to send units of measure and other metadata over an OPC UA network although, when tens of thousands of measurements are being broadcast, ‘there is not much point overloading the network with this information for each sample’. Metadata will more likely be recorded in configuration files. This is easy to imagine in a factory context, but it will limit data interoperability between networks. ARC is probably right in that OPC UA has application in the factory (and perhaps in the drilling factory), but it is not the main contender for IoT-enablement in our reporting to date.
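A minimal sketch of the pattern Capitaine describes: the static metadata lives in a local configuration file, so only bare (tag, value) samples travel over the wire and the receiver re-attaches the context on arrival. Tag names and fields here are made up for illustration, not real OPC UA node identifiers, and this also shows why the approach limits interoperability: the receiver’s configuration must match the sender’s.

```python
# Locally held configuration: in practice this would be read from a file
# shared (out of band) between sender and receiver. Tags are illustrative.
TAG_CONFIG = {
    "FT-201": {"units": "m3/h", "range": (0.0, 500.0)},
    "TT-305": {"units": "degC", "range": (-40.0, 120.0)},
}

def interpret(tag: str, value: float) -> dict:
    """Re-attach locally held metadata to a bare sample off the network."""
    meta = TAG_CONFIG[tag]
    lo, hi = meta["range"]
    return {
        "tag": tag,
        "value": value,
        "units": meta["units"],
        "in_range": lo <= value <= hi,
    }

sample = interpret("TT-305", 87.5)
print(sample)
```

A consumer on another network, without an identical copy of `TAG_CONFIG`, has no idea whether 87.5 is degrees, bar or anything else.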
Curiously, the IoT protocol that seems to have attracted most attention in the oil and gas space, at least in the US, is the venerable Mqtt (originally, MQ Telemetry Transport) spec. Venerable, because it was introduced by IBM back in 1999 and later offered to the Oasis standards body. In a report by Inductive Automation from the 2018 ICC conference, Mqtt co-inventors Andy Stanford-Clark and Arlen Nipper described how end users were ‘frustrated’ with proprietary automation systems that are hampering innovation. Mqtt is presented as an open source response to these proprietary systems with development support from the Eclipse Foundation. Mqtt is a pretty low-level spec which is generally augmented for IoT use with the Eclipse Tahu (formerly Sparkplug) platform. Tahu ‘addresses the existence of legacy SCADA/DCS/ICS protocols and infrastructures and provides a much-needed definition of how best to apply Mqtt into these existing industrial operational environments’. Tahu has been productized by, inter alia, Ubuntu as an IoT gateway framework.
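What Tahu adds to plain Mqtt is, among other things, a well-defined topic namespace. The sketch below builds Sparkplug B-style topic strings of the form `spBv1.0/group/message_type/edge_node[/device]`; the group, node and device names are made up for illustration.

```python
# A minimal sketch of the Sparkplug B topic namespace layered on Mqtt.
SPARKPLUG_NAMESPACE = "spBv1.0"
MESSAGE_TYPES = {"NBIRTH", "NDEATH", "DBIRTH", "DDEATH",
                 "NDATA", "DDATA", "NCMD", "DCMD"}

def sparkplug_topic(group_id, message_type, edge_node_id, device_id=None):
    """Build a Sparkplug B topic; device_id only appears for device (D*) messages."""
    if message_type not in MESSAGE_TYPES:
        raise ValueError(f"unknown Sparkplug message type: {message_type}")
    parts = [SPARKPLUG_NAMESPACE, group_id, message_type, edge_node_id]
    if device_id is not None:
        parts.append(device_id)
    return "/".join(parts)

topic = sparkplug_topic("WellPad7", "DDATA", "rtu-01", "flowmeter-2")
print(topic)  # spBv1.0/WellPad7/DDATA/rtu-01/flowmeter-2
```

The birth/death message types are what let a SCADA host track which edge nodes and devices are currently online, the gap that raw Mqtt leaves open.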
Now this is getting interesting, and it is indeed what we have observed at earlier Wellsite Automation events as process control engineers build their own hooks into their scada systems with the technology. But if you follow the Mqtt/Sparkplug breadcrumb trail you get to even more interesting stuff, if, that is, you are of the do-it-yourself disposition. Once you can connect to your expensive proprietary control systems you will probably want to build and deploy some of your own sensors. This is now ridiculously easy with devices costing a few tens of dollars, notably the Raspberry Pi, today’s ubiquitous Internet ‘edge’ device. The ‘edge’ implies a center, i.e. the cloud, and it is indeed the promise of cloud connectivity that has brought these skunky IoT projects along so far and fast. As an example, visit Walter Coan’s Hackster page which shows how to read data from a PLC right into the Azure cloud. Another is BigClown’s kit to build your own IoT devices such as a motion detector, climate monitor, flood detector and more.
So where does all of the above leave oil country process control standards initiatives such as ExxonMobil/The Open Group’s OPAF and Saudi Aramco’s ‘me-too’ Process Automation System (PAS)? OPAF sounds more like a reference architecture: it was described to Oil IT Journal as a ‘standard of standards’ that ‘will not duplicate work where useful standards already exist’. OPAF has just provided an update on its developing spec which we will be reporting on in our next issue.
In some ways, the advent of cheap processing power à la Raspberry at the edge may make the need for standard communications protocols less pressing. If a puck-like device can just get data from the control system into the cloud, then the protocol used in the moving is maybe not so important. There is maybe one other consideration here. Industry’s love affair with open source has maybe come a bit late in the day, as Microsoft now owns GitHub and IBM owns RedHat. And while Google and Amazon may push out lots of code into the open domain, they are still, err, Google and Amazon. It is hard to find open source stuff today that is not in one way or another an invitation to be locked-in to a commercial provider. Likewise, all that ease of getting your stuff into the cloud may herald a shift from ‘proprietary’ control systems to a golden handcuff tying you to your cloud provider. Of course, if you are already using Azure or TensorFlow, well, the handcuffs are on!
© Oil IT Journal - all rights reserved.