OSIsoft PI World Europe Barcelona 2018

OSIsoft president Pat Kennedy ‘PI-addressed hardware is here to stay for another 20 years’. PI Cloud Services central to IoT. Iberdrola on EU energy market revolution. Digital Transformation Forum. Sirius JV to build semantic digital twin. Data quality and TransCanada. Shell’s predictive asset management. Toumetis, Seeq on machine learning and visualization. PI central to BP operations. DCP Midstream’s Energy Lab transforms with PI AF.


Gregg Le Blanc (OSIsoft) opined that OSIsoft knew, ‘long before it was popular’, that being data-centric is key. But sensors may have odd naming conventions, units of measure issues and data gaps, and the new data lakes can quickly become data silos. The answer? ‘Select your IoT platform and your cloud of choice and bring them together with OSIsoft Cloud Services’. Elsewhere OCS is presented as ‘built on Microsoft Azure’, although we imagine this is not an exclusive relationship.

If OCS is OSIsoft’s cloud management solution, at the IoT edge, as we reported last year from London, a parallel ‘pervasive data collection’ (PDC) offering, along with OMF, the OSIsoft message format, provides connectivity and edge processing for stand-alone kit such as vibration sensors. Chris Nelson (OSIsoft) reported that OMF is now supported by third-party libraries (Open FogLAMP from Dianomic). OMF’s small footprint enables a persistent data store for the IoT edge asset.
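For the curious, an OMF-style payload can be sketched in a few lines of Python. This is a rough illustration based on the published OMF 1.x spec (type, container and data message kinds); the tag names and values are invented and the spec itself is the authority.

```python
import json

# Sketch of the three OMF message bodies a stand-alone vibration sensor
# might send. Identifiers ('VibrationReading', 'pump01.vib') are invented.

omf_type = [{
    "id": "VibrationReading",            # hypothetical type id
    "type": "object",
    "classification": "dynamic",         # indexed, time-series data
    "properties": {
        "Timestamp": {"type": "string", "format": "date-time", "isindex": True},
        "VibrationMm": {"type": "number"},
    },
}]

omf_container = [{"id": "pump01.vib", "typeid": "VibrationReading"}]

omf_data = [{
    "containerid": "pump01.vib",
    "values": [{"Timestamp": "2018-09-25T10:00:00Z", "VibrationMm": 0.42}],
}]

def omf_message(messagetype, body):
    """Bundle OMF headers and a JSON body as (headers, payload)."""
    headers = {
        "messagetype": messagetype,      # 'type' | 'container' | 'data'
        "omfversion": "1.0",
        "messageformat": "JSON",
        "action": "create",
    }
    return headers, json.dumps(body)

headers, payload = omf_message("data", omf_data)
```

The small, fixed message vocabulary is what keeps the edge footprint modest.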

Iberdrola on the complexities of the EU energy market

Javier Valdes (Iberdrola) provided insights into the complex energy market in Spain (and in Europe). A huge transition is underway with more electricity generated from clean sources (renewables and nuclear). In electricity, the first big change happened in 1988 with EU liberalization. Now the change is around decarbonization, the ‘empowerment’ of the customer and the integration of electrical systems. In Spain, renewables have risen but have plateaued since 2013. And they are cannibalizing non-renewables as demand is not growing. Today 20% of electricity is renewable and 36% of this is wind. But wind is poorly connected and needs backup power. Nuclear is not great as a backup as it needs to run constantly. Coal is not considered an energy of the future and Iberdrola has asked for permission to close its coal plants. Combined cycle gas plants are not designed for low/intermittent use* and are loss-making. They may be closed too unless a fixed payment for combined cycle operations (as in the UK) can be negotiated. ‘Efficiency, electric vehicles, heat pumps (!) and combined cycle gas are the keys to the future’. OSIsoft is a partner with Iberdrola on data analytics at combined cycle gas plants to enable them to start quickly when the wind drops.

* Although see below for TransCanada’s ‘peaker plants’

Digital Transformation Forum

The discussion forum on digital transformation revealed what we already surmised: although vendors and IT departments are enthusiastic about the movement, management remains cautiously skeptical. Digital transformation has proved harder than was thought five years ago. For AstraZeneca, digital transformation is an interesting term that can terrify senior management. Does it translate into ‘let’s give IT loads of money’? It is a great buzzword but rather dangerous. For management, phone and email are more important and IT has a hard time just doing the basics. Some attempts to push digital whether management liked it or not have been poorly received. What are the basics? Capture data and enable its use. Which translates into avoiding data hoarding or non-capture. ‘These basics are a positive thing for us’.

OSIsoft recommends taking small steps in the transformation. Some attempts fail because of the focus on the ‘shiny objects’ of artificial intelligence and machine learning. It is better to start with the data you already have and avoid starting big data initiatives without a business question to answer!

For Cargill the transformation is ‘absolutely about changing how you do your business’, towards a situation with no operators on the shop floor and no truck drivers. Digital transformation is a ‘North Star’ for the journey. But ‘many can’t cope with the big journey’. Some have been doing this for 20 years and ask ‘what more can I do now?’ But today, people understand technology change better.

Accenture’s global clients are on the transformation journey to improve operational effectiveness and also to find new sources of revenue. Some are striving to do both, trying new ideas and failing fast. While this can be problematical in risk averse industries, ‘you have to try’.

Better Data Quality for Better Data Science with the PI System

Brandon Perry (OSIsoft) stressed the importance of quality as a prerequisite for data science. Averages in PI and Excel may be ‘quite different’, apparently a common pitfall. Delving deeper into the nature of PI data streams, ‘you may have heard the term time series’. Data may not be evenly spaced, there may be gaps that need interpolation. PI captures quality tags such as ‘no value’, ‘modified’ and ‘annotated’. Other issues come from using PI’s option to compress data. Perry cited Nina Thornhill’s 2004 paper that found that ‘compression kills data’. So what should you do to enhance data quality? Check and understand the impact of data compression, filtering and sample rate. Add sensor metadata to PI assets, cleanse raw data and tag ‘no data’ correctly.
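Perry’s point about compression is easy to demonstrate. The following Python sketch uses a naive exception-deviation filter, a deliberate simplification of PI’s actual swinging-door algorithm, to show how an aggressive deadband biases the archived average away from the raw one.

```python
# Minimal exception-deviation filter: keep a sample only when it moves
# more than `deadband` away from the last archived value. This is a
# simplification of PI's swinging-door compression, for illustration only.

def compress(samples, deadband):
    archived = [samples[0]]
    for v in samples[1:]:
        if abs(v - archived[-1]) > deadband:
            archived.append(v)
    return archived

raw = [10.0, 10.2, 10.1, 10.3, 12.0, 10.2, 10.1, 10.0]
kept = compress(raw, deadband=0.5)       # only 3 of 8 samples survive

raw_mean = sum(raw) / len(raw)           # ~10.36
kept_mean = sum(kept) / len(kept)        # ~10.73, biased toward the excursion
```

The surviving samples over-represent the excursions, which is one way ‘compression kills data’ and why averages computed downstream can diverge from the raw signal.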

When using PI DataLink to Excel, ‘interpolation may not be the best way to go’; it may be better to use time-weighted aggregates. Also, use PI Event Frames to delimit process states and derive aggregates inside frames. While all this is undoubtedly important, we were thinking ‘what about Nyquist?’ We were not alone; others were muttering ‘what about Shannon?’ There is quite a lot of ‘prior art’ here worthy of consideration. Another potential issue we spotted was how PI adds units of measure to its data points. This appears to be done by overloading an ASCII data field, viz. ‘270°F’, a common, but surely not a best, practice!
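The time-weighted aggregate idea itself is straightforward. A minimal sketch, assuming step (‘previous value holds’) interpolation between unevenly spaced samples:

```python
# Time-weighted average of unevenly spaced (timestamp, value) samples,
# assuming step interpolation: each value holds until the next sample.

def time_weighted_avg(points):
    total = 0.0
    for (t0, v0), (t1, _v1) in zip(points, points[1:]):
        total += v0 * (t1 - t0)          # v0 holds for the interval [t0, t1)
    return total / (points[-1][0] - points[0][0])

# Long dwell at 100, then a late sample at 50: the arithmetic mean of the
# sample values weights the brief reading the same as the long dwell.
pts = [(0, 100.0), (1, 100.0), (10, 50.0)]
naive = sum(v for _, v in pts) / len(pts)     # ~83.3
weighted = time_weighted_avg(pts)             # 100.0
```

On evenly spaced data the two coincide; on gappy data they can differ substantially, which is the pitfall Perry flags.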

TransCanada machine learning for data quality

Keary Rogers and Ionut Buse presented TransCanada’s massive gas transport system that carries some 25% of US natural gas. Infrastructure includes 200 compressor stations and 800 other units. Gas is now flowing south to the Gulf Coast for petrochemical plants and LNG export. One particular issue is the need for quick gas to ‘peaker plants’ that compensate for windfarm output drops in no-wind situations. Behind all this is quality real-time data, enabled by meticulous attention to bad values, stale data and flat-lining sensors. For TransCanada, data QC is (or should be?) amenable to a machine learning approach, although this is ‘work in progress’. This is not the first time we have heard of ML being used as a corrective to data quality issues. While there are undoubtedly cases where this makes sense, the temptation to allow bad data to enter the system, on the supposition that it can be fixed later on, is clearly to be avoided!
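The bad-value, stale-data and flat-lining checks that TransCanada describes can be sketched with simple rules of the kind an ML pipeline would learn to generalize. The thresholds below are invented for illustration.

```python
# Rule-based tag health check. Flag a tag as 'flat-lining' when it repeats
# the same value for `n_flat` consecutive samples, and as 'stale' when the
# gap since its last timestamp exceeds `max_gap` seconds. Thresholds are
# illustrative, not TransCanada's.

def check_tag(samples, now, n_flat=5, max_gap=300):
    issues = []
    values = [v for _, v in samples]
    if len(values) >= n_flat and len(set(values[-n_flat:])) == 1:
        issues.append("flat-lining")
    if now - samples[-1][0] > max_gap:
        issues.append("stale")
    return issues

# A tag stuck at 7.0 every minute, last seen 11 minutes ago:
readings = [(t, 7.0) for t in range(0, 600, 60)]
flags = check_tag(readings, now=1200)
```

An ML approach would, in effect, learn per-tag versions of `n_flat` and `max_gap` rather than hard-coding them.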

Sirius’ semantics-based digital twin

David Cameron, from Norway’s Sirius R&D establishment, Evgeny Kharlamov (University of Oxford) and Brandon Perry (OSIsoft) proposed a joint venture/consortium to investigate a semantic digital twin. Sirius’ goal is ‘scalable data access in the oil and gas domain’, which, in principle should be a prerequisite for a digital twin. But what is the digital twin? There are many definitions and applications. The process industry has been doing multi-physics probabilistic simulations to mirror and predict its plants for over twenty years. Now consultants and marketing departments have discovered the digital twin and ‘they are everywhere’. They exist in automotive and aerospace (and in oil and gas already) but they are in reality, ‘systems held together by tape’ and are ‘too large and unmanageable’. There is a need for a scientific basis for these systems of systems.

Evgeny Kharlamov observed that today, PI Asset Framework is used to describe a plant and could be used as the basis of a digital twin. But Kharlamov believes that the digital twin would be better supported by a semantic model which would allow for wider, open-ended use across machine learning, data science and analytics. Enter the semantic web and a graph database of process models combining ‘physical, digital and cognitive’. Now ‘there is no need for PI AF, just use a semantic model’. Tools of the semantic trade include RDF, RDFa, SKOS, SPARQL, OWL (and more). ‘Semantification is a trend’. Semantic equipment models have already been created, notably with Siemens in the EU Optique semantic project.
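For readers new to the semantic approach, the core idea is a graph of subject-predicate-object triples queried by pattern matching. The toy Python store below illustrates the model; a real deployment would use an RDF triple store and SPARQL, and the equipment names here are invented.

```python
# A toy subject-predicate-object store illustrating the semantic model.
# Real systems use RDF and SPARQL; names ('pump01', 'train_A') are invented.

triples = {
    ("pump01", "a", "CentrifugalPump"),
    ("pump01", "partOf", "train_A"),
    ("pump01", "hasSensor", "pump01.vib"),
    ("train_A", "partOf", "plant_BCN"),
    ("pump01.vib", "a", "VibrationSensor"),
}

def query(s=None, p=None, o=None):
    """Match triples against a pattern; None acts as a wildcard."""
    return sorted(t for t in triples
                  if (s is None or t[0] == s)
                  and (p is None or t[1] == p)
                  and (o is None or t[2] == o))

# 'Which sensors hang off pump01?' -- answered from the graph alone,
# without a pre-built AF hierarchy.
sensors = [o for _, _, o in query(s="pump01", p="hasSensor")]
```

The open-ended querying is the selling point: new relationships can be added without reworking a fixed hierarchy.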

Brandon Perry floated the idea of an R&D consortium to develop a ‘cognitive understanding of our equipment’ to ‘receive predictions and warnings’, drop new apps into a twin or allow self-organizing applications. Perry acknowledged that industrial ontologies are tricky and have met with a mixed reception to date. The consortium should make them practical and ‘augment the physical world by mimicking the physical asset such that it might pass the Turing test’. The Big Data Value Association was cited in this context.

Cameron summarized that ontologies can be incredibly complex, some good, some not so good. They need to integrate with corporate knowledge structures. The consortium plans pilots with EPCs and vendors in oil and gas building a semantic backbone, faceted user interfaces and standards. There are ‘lots of standards out there’ to enable ‘better data science and hybrid analytics’. Sirius’ focus is the upstream, field management, topside facilities and lifecycle modeling but with new EU funding this may extend to process control.

Comment: It is surprising that semantics and RDF are presented to this community without reference to the huge amount of somewhat unsuccessful prior art in the field – as Oil IT Journal has diligently reported in over 200 articles since 2003.

Toumetis - doing machine learning for twenty years

John Wingate said not to believe the hype around machine learning; industry has been doing this for years. Wingate’s company Toumetis is a machine learning boutique and a practitioner of applied data science that has been using neural nets for 20 years or so. Today these run quicker and there are new algorithms, but it is still ‘just ML’. Toumetis’ Cascadence is applied ML for planning, ranking and forecasting, all fueled by data services. Industrial ML workloads require comprehensive storage, a Python API and modern web technologies. OSIsoft’s Laurent Garrigues showed how OSIsoft Cloud Services connects multiple customer sites to technology partners including Toumetis without duplicating data. OCS is a new cloud platform, built (on Microsoft Azure) from the ground up, with Toumetis inside.

Wingate added that the tricky part of the ML equation is getting good labelled training data which is expensive and requires humans in the loop. The human attention span bottleneck makes it preferable to use unsupervised ML and a modern user interface to reduce the labelling workload. Enter the Cascadence Asset Modeler, a cognitive computing solution that accelerates the transition from a flat PI tag structure to a robust and dynamic PI AF with ‘a much more detailed description of equipment, its hierarchical relationships and associated metadata’.
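The flat-tag-to-hierarchy step that the Asset Modeler automates can be illustrated, minus the cognitive/ML part, by parsing delimited tag names into a tree. The tag names below are invented, and real plants rarely have conventions this tidy, which is exactly why Toumetis applies ML to the problem.

```python
# Naive illustration of turning a flat PI tag list into a nested asset
# hierarchy by splitting on a delimiter. Tag names are invented; this is
# not Cascadence's algorithm, just the shape of the transformation.

def build_tree(tags):
    root = {}
    for tag in tags:
        node = root
        for part in tag.split("."):
            node = node.setdefault(part, {})
    return root

flat_tags = ["SITE1.UNIT2.PUMP01.TEMP",
             "SITE1.UNIT2.PUMP01.VIB",
             "SITE1.UNIT2.PUMP02.TEMP"]
tree = build_tree(flat_tags)
# tree now nests SITE1 -> UNIT2 -> PUMP01/PUMP02 -> attributes
```

Where naming is inconsistent, the split rule breaks down and clustering or a human-in-the-loop UI has to take over, which is Wingate’s point.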

Real time operations at Shell

Ali Hamza and Peter van den Heuvel presented a business perspective of real-time operations at Shell, updating their 2017 presentation. Hamza is global head of Shell’s wells, reservoir and facilities management (WRFM) unit. Shell’s strategic IT themes are ‘everything to the cloud’, data and analytics, collaboration and mobile, and legal/regulatory/cyber. The business guides IT investment decisions and digitalization. Shell already has 5 petabytes (out of 8) in the cloud and the PI System is ‘at the heart of our digitalization roadmap’. PI reads are of the order of 3.5 trillion/month, a steep increase over last year following the introduction of advanced analytics. Shell has begun an analytics proof of concept on its 500,000-strong portfolio of valves, migrating its legacy data to Microsoft Azure for analytics. PI data quality is key here. Shell is tracking the PI System roadmap to the cloud closely. BG integration was also a major undertaking; the whole BG landscape is now in the cloud. Also, everything is now done on thin clients: ‘We don’t want any more desktops because of the overhead of updating multiple endpoints’. 2018 saw Shell’s PI Vision flagship deployment on the Prelude FLNG vessel.

Hamza observed that the upstream and integrated gas businesses produce huge volumes of data, far too much for humans to analyze. A lot is expected from digitalization but this is ‘neither new, nor a one-off thing, nor only in the future’. Digitalization is driven by the decreasing cost of sensors and data storage in the cloud, better AI and more compute power. Along with PI, Petex and Energy Components got a call-out. Petex Gap was used to model and understand failing wells and mitigate a 500 bopd production loss by tuning separator pressure and adjusting anti-foulant rates. The idea is simple, the technology available, it was just a matter of using it! Analytics and ML represent a new era in our industry. Shell’s work on control valve incidents (a $6 million loss in 2015) involves a ‘deep dive’ into performance data and collaboration across IT, engineering and maintenance. Shell has built a PI-based data quality system that now underpins its digital oilfield and analytics initiatives. Shell is now working with OSIsoft on a data governance solution embedding ISO 8000 standards for data quality, KPIs and PDCA remediation*. Van den Heuvel added that Shell has learned from other initiatives that addressed devops, training and device management to avoid folks asking, ‘why did you develop this? We did not ask for it’. The way forward involves a closer relationship with the business, global roll-outs and fit-for-purpose, stable software. In the Q&A, Hamza opined that having units of measure in PI tags is ‘simple but really important’.

* Deming’s plan-do-check-act.
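The KPI side of such a governance solution could be sketched as a per-tag scorecard: the fraction of samples carrying a good status flag, checked against a pass threshold. The status names and the 95% threshold below are invented; ISO 8000 itself specifies requirements, not code.

```python
# Hedged sketch of a per-tag data quality KPI of the kind a governance
# solution might track: percentage of samples flagged 'good'. Status
# names and the 95% threshold are illustrative assumptions.

def quality_kpi(samples, threshold=0.95):
    good = sum(1 for _, status in samples if status == "good")
    score = good / len(samples)
    return {"score": round(score, 3), "pass": score >= threshold}

# 98 good samples, one 'no data', one 'stale':
tag_samples = [(t, "good") for t in range(98)] + [(98, "no data"), (99, "stale")]
kpi = quality_kpi(tag_samples)
```

Tags that fail the threshold would feed the PDCA remediation loop: plan a fix, apply it, re-check the KPI, act on what remains.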

BP: Using Analytics in PI AF to improve operating performance

Samantha Ross and Rob Sutton explained how, following an enterprise agreement with OSIsoft, BP has been working on a solid foundation for PI AF. In general, technology has matured and hardware infrastructure is in place such that there are no more server throttles or scan rate limits on PI. A new generation of digital users and a social shift to tech adoption mean more use of Excel and PI DataLink. Today PI is a ‘regular conversation point’ in BP. A no-trips policy also means more use of software to reduce deferrals and abnormalities. Business analytics ‘reduce cognitive load’ while automated surveillance identifies weak signals and generates insights. The new technology needs new visualizations and a ‘break from the legacy of the last 30 years’. BP’s new data team has already built several minimum viable products (MVPs) for analytics using PI Element templates and PI Vision.

HMI development has been challenging for the last 20 years. The objective is a 90% reduction in legacy visualizations, less text and deadbanding*-configured data hiding. BP uses a ‘task-based design’ and ‘human factors’ approach (but the screens shown were not very pretty – see below!) One success was reported from the North Sea when weak signals in analytics triggered new insights into failure modes, creating a ‘clear air of excitement’. BP is ‘turning dark data into leading indicators’. AF and PI Vision are key and are breaking legacy design and challenging resistance. In the Q&A, BP acknowledged that its new-generation visualizations could be improved: ‘they still look like grey scada images’. Seemingly PI Vision offers a ‘limited toolbox’ and BP plans to deploy more sophisticated visualizations real soon now. The MVP approach has allowed BP to deploy sandbox solutions in just a few days. But BP is not going to let operators do their own PI Vision screens; development is centrally controlled. BP does encourage users to come forward with ideas. False positives from analytics can be reduced by deadbanding on time or percentage values.

* Actuator thresholds which trigger data transmission.
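Deadbanding on time or percentage values, as BP describes, amounts to suppressing a repeat alert unless the value has moved enough and enough time has elapsed. A minimal sketch with invented parameters:

```python
# Alert deadbanding: fire an alert only when the value has moved more
# than `pct` percent since the last alert AND at least `min_gap` seconds
# have elapsed. The 5% and 60 s parameters are illustrative, not BP's.

def deadband_alerts(events, pct=5.0, min_gap=60):
    alerts, last_t, last_v = [], None, None
    for t, v in events:
        if last_v is None:
            fire = True                   # always fire the first alert
        else:
            moved = abs(v - last_v) / abs(last_v) * 100 > pct
            waited = t - last_t >= min_gap
            fire = moved and waited
        if fire:
            alerts.append((t, v))
            last_t, last_v = t, v
    return alerts

# Small wobbles and a rapid follow-up excursion are suppressed:
noisy = [(0, 100.0), (10, 101.0), (70, 102.0), (130, 120.0), (140, 150.0)]
fired = deadband_alerts(noisy)
```

Tightening `pct` and `min_gap` trades missed events against the false-positive fatigue BP is trying to cut.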

DCP Midstream’s business transformation with PI and ‘Energy Lab'

Tauna Rignall presented DCP Midstream’s business transformation with PI (already a highlight of the 2018 PI World in San Francisco). DCP was confronted with familiar operator problems. Its data architecture was focused on process control and operations; analytics and reporting were afterthoughts. There was no centralized and normalized set of operational data in the company, and copies of data were shared, in spreadsheets, with multiple parties. Rignall’s presentation focused on a company-wide PI System, deployed under an enterprise agreement. The thorough deployment included standard naming conventions, a rigorous PI AF structure and a governance capability, all delivered by a team of subject matter experts and the PI product team, making heavy use of PI AF/PI Vision templates. DCP has now created an ‘Energy Lab’ unit for rapid development of digital solutions using the PI System. These are deployed in the ICC, the integrated collaboration center. One interesting facet of DCP’s infrastructure is the ongoing use of Windrock’s Spotlight, an ‘IIoT-enabled’ application for advanced machinery analytics. What we find interesting about the way Spotlight has been embedded into DCP’s PI infrastructure is that it shows how a business solution can be successfully used in such a ‘foreign’ framework. We came across a similar edge deployment last year with Setpoint (see below).

Seeq data cleansing and ‘fantastic’ visualization

Those concerned with PI’s challenged eye candy (BP?) may want to check out Seeq’s ‘fantastic’ visualization tool. Brian Parsonnet showed how Seeq can be used alongside PI AF to improve data quality and avoid the ‘garbage in, garbage out’ syndrome. Seeq claims to handle data gaps, flyers, noise, interference, sensor drift, timestamp alignment, incorrect interpolation, round-off and bad units. Seeq offers a ‘once…and done’ data stream approach as opposed to data movement with ETL*. Seeq is trained by a subject matter expert before deployment.

* Extract, transform, load.
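One classic way to knock out the ‘flyers’ on Seeq’s list is a rolling-median despike: replace any point that strays too far from the median of its neighbors. This is generic signal cleansing, not Seeq’s actual algorithm, and the window and tolerance are invented.

```python
import statistics

# Rolling-median despike: replace any point deviating from the median of
# its `window`-sized neighborhood by more than `tol`. Generic technique,
# not Seeq's implementation; parameters are illustrative.

def despike(values, window=3, tol=5.0):
    out = list(values)
    half = window // 2
    for i in range(half, len(values) - half):
        med = statistics.median(values[i - half:i + half + 1])
        if abs(values[i] - med) > tol:
            out[i] = med                 # replace the flyer with the median
    return out

signal = [10.0, 10.1, 99.0, 10.2, 10.1]   # one obvious spike
clean = despike(signal)
```

Unlike a moving average, the median leaves genuine step changes largely intact while removing isolated spikes.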

Announcements, notes, indiscretions

Dominique Florack (Dassault Systèmes) outlined a collaboration between OSIsoft and Dassault Systèmes to maximize the ‘virtual plus real advantage’ by combining 3D Experience and PI. There is a huge untapped market for the digital twin across city planning, mining and massive cruise ships. Today, ‘few understand the connection from real to virtual and the data platform’. This is an ‘unstoppable revolution’.

A chat with a Caterpillar engineer offered us another data point in our ‘how big is your data’ quest. Caterpillar records 1 Hz data from its largest marine diesels from some 200 sensors, i.e. some 17 million samples, or a few tens of megabytes, per day. Big, but not so big as GE has claimed for its jet engines. Our Caterpillar contact was also circumspect about the new big data/analytics hype. Both Cat and GE have been recording sensor data for quite a while, using somewhat straightforward noise thresholds to indicate excessive component wear. In another cryptic note, we recorded that ADNOC’s ‘Panorama’ triple 50m screen ‘AI and big data center’ is ‘now only used for training’.

Finally, the Windrock deployment made us think back to the 2017 OSIsoft London conference. Much talk of ‘big’ data is ‘forward looking’ in so far as the future deployment of a large number of low cost (?) sensors will enable as yet unproven algorithmic techniques to perform better than… well, better than what exactly? Last year it was B&K’s Setpoint monolithic application that had been doing the run-life predictive thing for quite some time and integrated with the OSIsoft ecosystem in a rather elegant way, bridging the gap between low-frequency PI data and HF vibration data at the edge. It looks like Windrock plays a similar role and likewise can integrate the big data/cloud environment with help from PI.

Read these and other presentations on the conference minisite.


© Oil IT Journal - all rights reserved.