Oil IT Journal Volume 27 Number 5


Cyber wake-up call for oil and gas

Vedere Labs report on insecure-by-design OT. Bedrock warns of TSA relaxing pipeline cyber regulations. ISA’s five tips on cybercrime protection. Software Engineering Institute’s Guide to insider threats. RFence Horus radio monitor for critical infrastructure. Cloud Security Alliance reports on ‘Sensitive Data in the Cloud’. CybeReady’s cyberattack learning kit. ISA on cyber education for automation engineers. Trellix study finds US oil and gas lacks cyber skills. BreachBits, ‘US oil and gas industry at risk of a cyber breach’. DNV forecasts serious cyber-attacks on industry. EU Cyber Resilience Act. NSF program solicitation for scientific cyberinfrastructure. The Open Group hosts zero trust architecture event. ISA on why ICS/OT infrastructure is so hard to secure.

A recent report, ‘OT:ICEFALL - A Decade of Insecure-by-Design Practices in OT’, from Forescout Technologies’ cyber security unit Vedere Labs found that ‘serious vulnerabilities still exist in the products of many of the largest control systems vendors even though many are sold as secure by design or have been certified with OT security standards’. Vedere analyzed products from ten of the largest control system vendors and found 56 cyber security vulnerabilities. Hackers exploiting these could gain network access to a target device, execute code remotely, bypass authentication and create havoc. What’s perhaps worse is that three quarters of the product families affected by such vulnerabilities have some form of security certification. Vedere accused some vendors of ‘persistent insecure-by-design practices’. These can occur in products carrying security certifications such as IEC 62443 and Achilles L1.

The report was brought to our attention in a blog post from Bedrock Automation’s Robert Bergman, who has also warned that the US Transportation Security Administration (TSA) is rolling back its 2021 Pipeline Cyber Security Directive following pressure from operators. At issue is the requirement for operators to review and fill gaps between their current cyber security practices and the TSA’s 33-page cyber security guidelines, which were deemed ‘too IT focused and not relevant to OT security’. CSO magazine reported that the original TSA recommendations were to disable Microsoft macros and programmable logic controllers, and to change all passwords. Lobbying by the American Petroleum Institute has resulted in the TSA backtracking.

Aaron Smith, blogging on the ISA website, offers five tips to protect your business from cybercrime. These are 1) create a plan, 2) back up your data, 3) secure your network, 4) schedule updates (for software and operating systems) and 5) install security software. Read Smith’s blog here.

In a similar vein, but at considerably greater length, Carnegie-Mellon University’s Software Engineering Institute (SEI) has published a ‘Common Sense Guide to Mitigating Insider Threats’, now in its seventh edition. Insider threats come from individuals with access to an organization’s critical assets who use this to ‘act in a way that could negatively affect the organization’. The Guide summarizes the SEI’s work since the publication of the 2017 US State of Cybercrime Survey that found that 20% of electronic crime events were suspected or known to be caused by insiders. Examples include theft of trade secrets and customer information, and sophisticated crimes that sabotage an organization’s data, systems or network. More from SEI’s 174-page Guide.

French startup RFence has raised €1.3 million to develop its ‘Horus’ radio frequency scanning technology for securing critical infrastructure. Horus monitors the entire radio spectrum (GSM, 2G, 3G, 4G, 5G, Bluetooth, Wi-Fi …) to detect emitting devices including walkie-talkies, IoT devices, vehicles and homemade radio transmitters. More from RFence.

A study, ‘Sensitive Data in the Cloud’ by the Cloud Security Alliance found that 67% of organizations store sensitive data in public cloud environments. The report somewhat confusingly found that although ‘89% of respondents found that cloud security controls are effective’, organizations ‘are not confident in their own ability to protect sensitive data in the cloud’. A quarter of the respondents leveraged another layer of security in the form of ‘confidential computing’ from Anjuna Security which sponsored the CSA study.

To celebrate Cybersecurity Awareness Month (October), CybeReady has released an Interactive Learning Kit to prepare employees and organizations against cyberattacks. Cybersecurity Awareness Month was established by the President of the United States and Congress some 19 years ago with backing from the Cybersecurity and Infrastructure Security Agency (CISA) and the National Cybersecurity Alliance (NCA) to raise cybersecurity awareness nationally and internationally. Download CybeReady’s complete guide to cyber awareness and get the learning kits here.

On the subject of cybersecurity training, the ISA’s Sourabh Suman recently blogged on how to better train automation engineers on IEC 62443. The Colonial Pipeline attack exposed an ongoing problem facing the nation’s critical infrastructure: a gap in the cybersecurity workforce. ‘Future wars will no longer be traditional and the country needs to be prepared on both the defensive and offensive sides, which starts by addressing this shortage.’ Read how in Suman’s blog. To train the cyber army’s commanders, ISA has also launched a microlearning module for chief information security officers, a majority of whom believe, according to yet another study, that ‘their organizations are unprepared to fend off potential cyberattacks’.

A Cyber Readiness Report from Trellix (https://trellix.com), based on research conducted by Vanson Bourne, surveyed 900 cybersecurity professionals and found that the majority of US providers in oil and gas (and other sectors) have not implemented full cybersecurity capabilities due to a lack of in-house cyber skills. Specifically, 75% of US oil and gas sector survey respondents have not yet fully deployed multifactor authentication, ‘making remote access to systems much easier for bad actors’.

And again, according to BreachBits, a cyber risk rating and monitoring company, ‘the majority of companies across the US oil and gas industry are at risk of a successful cyber breach’. The analysis of 98 representative upstream, midstream, downstream and supply chain companies across the energy sector, is now available as BreachRisk: Energy 2022.

DNV has pitched in on the cyber scaremongering scene with new research into the ‘Cyber Priority’ that found ‘energy professionals believe that cyber-attacks on the industry are likely to cause harm to life, property and the environment in the next two years’. Moreover, ‘only 47% believe that their operational technology security is as robust as their IT security’. More in a similar vein from DNV.

The EU Commission has presented a proposal for an ‘EU Cyber Resilience Act’ (https://ec.europa.eu/newsroom/ECCC/items/757902/en) to protect consumers and businesses from products with inadequate security features. The Act heralds EU-wide legislation with mandatory cybersecurity requirements for digital products throughout their lifetime.

If you think you know all this stuff already, you may qualify for the US National Science Foundation’s program solicitation NSF 22-632, Cyberinfrastructure for Sustained Scientific Innovation (CSSI). The program is seeking recipients for some $34 million per year in government funding to be shared across about 35 participating organizations.

The Open Group recently hosted a cyber event that looked into zero trust architectures and supply chain security with input from NIST, NASA, Microsoft, IBM and others. The Open Group is to explore how open standards can provide actionable insights into these important and developing topics. More in the TOG blog.

Finally, ISA blogger Sagar Yadav sets out to explain just why ICS/OT infrastructure is so hard to secure. In essence this is down to a reversal of priorities between IT and OT. In OT infrastructure, availability is the highest priority while security comes in second. In IT it’s the other way round. At least that’s what we understood from a quick spin through Yadav’s blog.


Bechtel, Cumulus team on digital bolted-joint management

Bluetooth torque wrenches and cloud-based workflow support Shell’s new-build chemicals plant

Cumulus Digital Systems has partnered with Bechtel to deploy a bolted joint management system during the construction of Shell’s new-build Pennsylvania Petrochemicals Complex. When finished, PPC will produce 1.6 million tonnes of polyethylene per year using low-cost ethane from shale gas producers in the Marcellus and Utica basins.

Cumulus’ Smart Torque System (STS) includes a cloud-hosted control center and database of activities to be executed on site. A mobile application guides workers through each step of the workflow, communicating data collected from Bluetooth-enabled torque wrenches and other tools. Target torque values are sent over the wireless connection and the actual values achieved are returned in real time to the control center.
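
To make the round trip concrete, here is a minimal sketch in Python of the kind of record such a workflow might exchange. This is not the Cumulus STS API; the field names, tolerance and values are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class TorqueRecord:
    """One bolted-joint closure as it might be logged to the control center (hypothetical schema)."""
    joint_id: str          # flanged connection identifier
    target_nm: float       # target torque pushed to the wrench, in newton-metres
    achieved_nm: float     # actual torque reported back over Bluetooth
    tolerance_pct: float = 5.0

    def within_spec(self) -> bool:
        # flag the joint for rework if the achieved torque misses the target
        return abs(self.achieved_nm - self.target_nm) <= self.target_nm * self.tolerance_pct / 100.0

# example record: 447.5 Nm achieved against a 450 Nm target passes a 5% tolerance check
record = TorqueRecord(joint_id="F-120001", target_nm=450.0, achieved_nm=447.5)
print(record.within_spec())  # True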

Poor bolted joint management is said to cause leaks that emit over 170 million metric tons of greenhouse gas annually. In addition to enforcing engineering controls and improving productivity, the deployment has reduced leak rates to 0.1%, ‘a 100 fold reduction on the industry average’.

The piping scope consisted of more than 120,000 flanged connections at the facility, with over 173,800 completed torque records. Construction began in 2016, with piping assembly ramping up in 2019 with most assembly work completed earlier this year. Cumulus developed an API to allow Bechtel to upload plant data and feedback to the EPC’s construction information management systems.

STS was originally developed by Shell at its innovation center in Boston, Massachusetts to prevent leaks and improve productivity for Shell’s capital projects and turnarounds. After early success internally, Shell spun STS out as Cumulus Digital Systems in 2018.

Cumulus digitalizes manual work that is ‘mission-critical, high volume, and difficult to automate’ improving the quality and productivity of safety-critical workflows across a range of industries. Cumulus also claims to be addressing the construction industry workforce shortage, with some 400,000 positions reported as unfilled late last year.

A Gastech white paper, ‘Deploying A Digital Bolted Joint Management System At Scale To Prevent Leaks And Enforce Engineering Controls’ can be downloaded from Cumulus.


VR, the Metaverse and Web3

Oil IT Journal editor Neil McNaughton reviews over 20 years of attempts to find a useful application for virtual reality technologies. From the Cave, fancy headsets of the last millennium, through Second Life, Google Glass and into the metaverse. But what exactly is the metaverse? Is it Web3? Is it blockchain? Will it be a closed world managed by Meta/Facebook? Or, as many in the IT universe seem to be hoping, an ‘open’ environment where all the GAFAMs can cash in?

Back in the day, Oil IT Journal reviewed the state of the art in virtual reality. When do you think this was? 2015, 2010, 2005? Wrong. We published our VR Hardware Primer back in 1998, covering a dozen or so solutions. The 3D meme was having a good time towards the end of the last millennium as companies deployed various combinations of cave-style systems, 3D glasses and so forth. It seemed to us that these were more designed to impress than for real work. Often the massive installations actually had rather low resolution. And folks reported queasiness after wearing the 3D glasses for a while. You still see magazines with pictures of geophysicists and engineers wearing the clunky headsets and pointing out some feature of interest inside a VR cave. I’m not sure that any real work is done in these environments. If I’m wrong I’d like to hear about it.

My next observation along the 3D/immersive timeline came with the remarkable madness of crowds that was Second Life. This shared immersive environment, the first ‘metaverse’, was heralded as a game-changer – but I’m not sure what the game was. Folks created their avatars, jumped in and did stuff. Companies set up shop inside the system (IBM was all-in for a while), some countries opened ‘embassies’. An SPE president suggested that SPE could set up an ‘island’ in Second Life where our avatars could meet. He was kidding I think. But IBM wasn’t.

In 2009 BP trialed a VR collaboration environment from HP that blended the physical world with models for remote troubleshooting activities. Users’ avatars could gather in ‘rooms’ with walls showing information from diverse sources. More on the BP/HP ACE here. VR Context’s WalkInside also deserves a mention, a 3D model with click-through to plant data (but no avatars). WalkInside is now owned (and still going) by Siemens. But the technology seems to have migrated from the Cave to the workstation and of course, the 3D facility model has been rebranded as a ‘Digital Twin’.

Not all that long ago all this looked as though it would be blown away with the advent of the Google Glass that, like the early VR wearables, blends reality with things digital. My own reaction to meeting someone wearing the Google Glass, filming my every move and recording my every word, was an inclination to punch them on the nose. Of course I refrained, but I wonder how many people will share my pugilistic instinct when confronted with a wearer of Facebook’s new portal to the Metaverse.

At a reported 700 grams, Facebook’s new headset is way more intimidating than the old Google Glasses but shares the annoying ability to incorporate outsiders into its virtual world without a by-your-leave. Although Facebook’s Metaverse is the object of considerable speculation, some see it as the future of the World Wide Web. The Metaverse is thus (according to Deloitte) conflated with Web3, a ‘blockchain-based’ upgrade of the world wide web. We will pass quickly on where blockchain fits into the communications stack (blockchain-based HTTP requests for anybody?).

I attended a session on the Metaverse at the Paris Big Data conference and heard speakers from SalesForce, Havas Play and WorldLine opine on where the technology is heading. Their metaverse is seen as opening a new marketing channel leveraging blockchain, non-fungible tokens (NFTs) and loyalty programs. Where Web2 uses passwords and cookies, Web3 will use blockchain ‘signed transactions’ which are ‘much more secure’. Moreover these transactions will be outside of the apps, there will be ‘no more silos’. NFTs will guarantee anonymity ‘within reason’. One example of an NFT application was the purchase and sale of high-end bottles of wine (this is France!). Here a smart contract ‘guarantees’ ownership rights and allows the original brand to take a cut on each resale. This, like other blockchain use cases, fails the McNaughton test in that there is nothing to stop a seller refilling a bottle with an inferior wine and drinking the original contents … but I digress.

The speakers assume that the Metaverse will, like the world wide web, be open to all. We came across similar sentiments expressed by Microsoft where we read, ‘Because there will be no single metaverse platform or experience, interoperability is crucial’. So Meta/Facebook has bet the house on a metaverse that will just be one of many? That will interoperate nicely with Microsoft and the other GAFAMs? I don’t think so! Zuckerberg’s intent is to upgrade Facebook with another addictive, sticky environment. Others will be able to join for sure, on a plug-and-pay basis.

Finally, I have been pestering the W3C for some time now asking what they think of the Metaverse. You might think that the W3C might have something to say about a movement calling itself ‘Web3’. You might even expect the W3C, as high priests of web tech, to have something to say about a ‘blockchain-based web’. But so far, no reply*. I imagine that the W3C is playing a waiting game, making sure that nothing it says now is going to upset its sponsors. Who are they? Who do you think … the GAFAMs!

* While the W3C still has not replied to our pestering, we did get a reply to our under-the-radar query posted to a W3C mailing list where we have it, informally, that the W3C is ‘not working on a blockchain-based Web3’, and that it has no ownership of a Web.x brand.

STOP PRESS … The W3C is at least discussing WEB3 and blockchain as evidenced in the minutes of a recent discussion on Where’s the Web in Web3?


Rock Imaging SIG on data management and software tools

Math2Market on digital rock physics. Data management with ThermoFisher Athena. EarthNET – the digital underground. OSDU’s core challenge.

Christian Hinz presented Math2Market’s Digital Rock Physics. DRP is described as the ‘digitalization of special core analysis’ (SCAL). Numerical SCAL generates ‘statistical digital twins’ of the reservoir that save time and support automated workflows, controllable with a Python interface. Users include Shell, Petrobras and OMV.

Gwenole Tallec presented ThermoFisher’s Athena software for rock imaging data management, illustrated with a rapid core analysis use case on data supplied by KAUST. Computer tomographic (CT) core data and metadata is captured automatically for remote access and visualization along with annotations. The PerGeos core profile assembly wizard creates a single core image from partial scans for facies determination. Digital SCAL on micro CT scanned plugs also ran. More on Athena from ThermoFisher.

Daniel Austin presented Earth Science Analytics EarthNET, a.k.a. ‘the digital underground’. The cloud based suite of software tools supports 3D earth model creation from a range of data sources and scales, from micro CT up. Data management is achieved through EarthBANK, a cloud-native data platform with API links to OSDU and Petrel. EarthAI adds automated and semi-automated AI workflows. ESA is project leader for the Norwegian NOROG released well project to analyze drill cuttings samples from some 1,900 wells drilled on the Norwegian Continental Shelf.

A panel discussion led by RISIG Founder Ross Davidson heard Daniel Austin argue for open standard data formats (such as those used in seismics) for core data. Austin’s ambition is for OSDU to make core data exchange easier. Davidson compared OSDU to POSC’s standardization effort of the last century which failed because of the inefficiency of an open model. Austin agreed, this is a big challenge, but modern data formats are much better. ESA’s bricking data format for seismic machine learning comes from outside of the incumbent geoscience domain, leveraging novel approaches from the likes of VTK, Software Underground, PyVista and GemPy. Davidson observed that data management is a huge topic. Regarding management of rock imaging data, does it have to be done in the cloud? Tallec said yes. Training ML models on the desktop is hard. The cloud is the future. ESA has been cloud-native for six years. But what of security? Clients seem nervous. Is this paranoia or just being careful? It does not have to be a public cloud. Cloud-agnostic software can be deployed anywhere. For Hinz, more users are now in a cloud environment, either on prem or public.

More from the Rock Imaging SIG.


NSF Digital Rocks Project update

On quantifying sub-resolution porosity. A new rock typing approach from invasion capillary curves. Are machines capable of interpreting µCT images? What formats for digital rocks?

The US National Science Foundation-backed Digital Rocks Project has released a host of papers and YouTube interviews covering current projects. One notable presentation covers quantifying and segmenting sub-resolution porosity, said to be a bottleneck in porous media research. Researchers Shan Wang and Tom Bultreys from Ghent University present a novel rock typing technique that captures sub-resolution porosity. Watch the video here, download the paper here and visit the project home page.

Another project covers work on ‘unsteady-state capillary drainage experiments on the Estaillades carbonate’, combining differential imaging with pore network models in a rock-typing approach based on invasion-capillary pressure curves. The new technique is said to have ‘outperformed classical porosity-based rock-typing’. The work leverages super high resolution micro-CT images (6.5 µm). The multi-gigabyte datasets have been downloaded over 500 times.

Finally, another video features Ankita Singh’s work on Grayscale REV* Analysis. In a separate video Singh asks, ‘Are machines intelligent enough to infer REVs from raw µCT images?’ We skimmed through the video to find that the answer is, yes they are! The DRP authors also discuss image data formats used on the DRP. There is ‘no such thing as agreed upon data format in digital rock physics!’ A survey of all the files in the DRP found that TIF and RAW files were among the most popular file types across different projects. File types such as PNGs and JPEGs are also used to upload 2D slice data. Other file types are also used (a short sketch of reading the two main formats follows the footnote below).

* Representative elementary volume.
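
As an aside on the format question, here is a minimal sketch in Python of loading the two file types the survey highlights. The file names, volume dimensions and bit depth are illustrative assumptions; the point is that a TIFF stack carries its own metadata while a RAW volume does not.

import numpy as np
import tifffile  # pip install tifffile

# TIFF stacks carry shape and bit depth in their headers, so one call recovers the volume
tif_volume = tifffile.imread("plug_scan.tif")

# RAW files are just bytes: dimensions and dtype must be known in advance,
# which is exactly why shared format conventions matter for digital rock data
nz, ny, nx = 900, 1000, 1000                # assumed dimensions
raw_volume = np.fromfile("plug_scan.raw", dtype=np.uint16).reshape(nz, ny, nx)

print(tif_volume.shape, raw_volume.dtype)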

While the DRP newsletter is ‘new’, some of the project videos are a couple of years old. Access the Digital Rocks Project here. You might also like to sign up for the Porous Media Tea Time Talks.


MathWorks Matlab Expo 2022

Dan Jeavons on Shell’s 5,000-strong AI network and citizen data scientists. Matlab Production Server for Shell’s CatCheck Connect. Matlab plugs-and-plays with PyTorch, TensorFlow.

Dan Jeavons provided an update on his 2016 MathWorks talk on Matlab and advanced analytics in Shell. Digital innovation and AI are now key components of Shell’s ‘accelerating’ energy transition with its bold ambition for net zero by 2050. There will be massive shifts in energy in the next decade. Shell is to leverage AI and data science to transform and manage the energy system. There are already some 5,000 members of Shell’s AI network and 10,000 equipment items are ‘monitored by AI’. Other digital transformations involve Shell EV charging. Shell deploys some 100 AI apps running against 1.9 trillion rows of data. All is now in a common data frame, accessed from remote digital centers with embedded AI. Citizen data scientists are using Matlab and Azure ML. Amja Chaudry drilled down into Shell’s use of MathWorks tools. With help from MathWorks execs, Shell embarked on experiments and proofs of concept to see if they could be deployed cloud native. This led Shell to acquire the Matlab Production Server to deploy its in-house developed algorithms. The first product to be rolled out (in 2016) was the Quest solution for daily CO2 monitoring and alerting from laser measurements across the Canadian CCS facility.

Matlab has been used for over 25 years across Shell and usage is now consolidated into a single license agreement and center of excellence. This now involves a DevOps approach to ‘accelerate and operationalize’ deployment. In 2019 Shell rolled out CatCheck Connect on Azure, an app to evaluate catalyst health. In 2020 the Matlab WebApps Server was deployed to support Shell’s ‘MADA’ (modern data analysis) tool, and also to develop an enterprise app for subsurface geologic feature prediction. In 2021 Shell formed the OpenAI initiative, a collaboration with C3 AI, Baker Hughes and Microsoft to commercialize its know-how. The latest MathWorks tool is the Matlab Online Server that is allowing solutions developed with Simulink and Simscape to be cloud-hosted.

Jeavons summed up observing that there is now the potential for digital to become ‘the way we do business’, rather than an optional extra. ‘There is a confluence of science and data-driven modeling, everything Shell does is physical’. Applications like CatCheck Connect combine science and data, leveraging Matlab’s strength. Simulink is used to combine data-driven chemometrics with process monitoring and high resolution mass spectrometry to optimize GTL technology. Jeavons concluded by welcoming MathWorks* into the OpenAI initiative.

Another noteworthy presentation at the MathWorks Expo looked into different ways of combining Matlab with open source machine learning environments including PyTorch and TensorFlow. The Matlab Deep Learning Model Hub offers some 50 pretrained models and a connection to TensorFlow and PyTorch repositories.

Watch this and other presentations from the show here.

* MathWorks and Kongsberg Digital joined the OpenAI initiative in 2021.


Software, hardware short takes

Aperio DataWise for PI data quality. Siemens Omnivise condition monitoring goes offshore. Schlumberger’s Enterprise Data Solution. MFE Inspection Solutions resells Hovermap. AIDA dashboard for computer science research. C3AI rolls-out ESG solution. Cudd’s cloud-based wellhead audit. AUI’s cognitive drilling advisor. EasyCore 3.0. Enverus’ Fusion Connect. GeTech Globe 2022. KBC PetroSim 7.3. L3Harris’ ENVI. Orfeo ToolBox. Ikon RokDoc 2022.4. Schlumberger’s Neuro-autonomous solutions and ProcessOps. Wireless Seismic’s DrillCAM. Sierra Digital’s EnerBridge, OhZone.

Aperio reports on a 2022 Pulse by Gartner survey of 160 ‘leaders’ in operations, engineering and IT, which found an industry-wide concern over data quality. Aperio has followed up on this with a 25-page white paper on monitoring and improving the health of your OSIsoft PI data. Poor PI data quality impacts analytics and reporting, causing data scientists to spend ‘80% of their time’ on data clean-up. Data quality is a serious problem without easy answers. One client reported that ‘It’s too expensive to manually manage and fix 1-2 million tags’. Enter Aperio’s DataWise. DataWise fixes PI data problems at scale with automatic anomaly detection, root cause analysis, quality scorecards and real time quality monitoring. DataWise offers engines for automatically detecting more than 16 types of anomalies. As we reported from the 2019 SAP in Oil and Gas Milan event, Aperio’s digital fingerprints of PI data can also spot cyber events, a broken, flatlining meter or correlations between sensors.
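
By way of illustration, here is a minimal sketch in Python of one of the anomaly types mentioned above, a flatlining meter, detected on a PI-style time series. This is our own toy example, not Aperio’s DataWise engine; the tag, window and tolerance are assumptions.

import pandas as pd

def flatline_periods(series: pd.Series, window: str = "30min", tol: float = 1e-6) -> pd.Series:
    """Mark timestamps where the signal has not moved by more than tol over the
    rolling window, i.e. a stuck or flatlining sensor."""
    spread = series.rolling(window).max() - series.rolling(window).min()
    return spread < tol

# toy tag: an hour of normal variation followed by an hour stuck at 53.2
idx = pd.date_range("2022-10-01", periods=120, freq="1min")
tag = pd.Series([50.0 + (i % 7) * 0.1 for i in range(60)] + [53.2] * 60, index=idx)
print(flatline_periods(tag).sum(), "flat-lined samples flagged")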

Siemens has rejigged its ‘Omnivise’ condition and performance monitoring suite for deployment offshore. Omnivise for Offshore (O4O) includes models of typical assets (AC distribution, DC distribution, rotating machines, battery systems and pumps), adding plant data visualization and reporting. The first offshore deployment of the O4O Digital Solution Suite addresses wind farm management but Siemens envisages other oil and gas use cases for the technology. The O4O will be the subject of a talk by Siemens’ Bruno de Oliveira e Sousa on Condition and Performance Monitoring of Battery Energy Storage Systems (BESS) in Drilling Operations to be given at an upcoming SPE Workshop on the Impact of Digitalization on Drilling Operations in Al Khobar, Saudi Arabia.

Schlumberger has announced an ‘Enterprise Data Solution’ powered by Microsoft Energy Data Services. The EDS is said to deliver a comprehensive subsurface data capability ‘aligned with the emerging requirements of the OSDU technical standard’. EDS is described as an ‘open and interoperable platform’ with embedded artificial intelligence and data management tools. Microsoft Energy Data Services is a ‘fully managed, enterprise-grade OSDU data platform’ co-built with Schlumberger. Early adopters include Petronas and Chevron.

MFE Inspection Solutions is now a reseller of Emesent’s products including the flagship Hovermap ST autonomous mapping system. Hovermap can be attached to a drone and fly autonomously beyond sight and communication range. The unit can also be used to perform handheld or vehicular inspections, or attached to ground-based robots like Boston Dynamics’ Spot.

Springer Nature, the Open University and the University of Cagliari have announced the AIDA Dashboard for exploring computer science research conferences and journals. Data in the dashboard is freely available under a CC-BY 4.0 license.

Barco and Igloo Vision have teamed on a new immersive collaboration environment that delivers a shared virtual reality experience. Scenarios are generated using the Igloo Vision Media Player and accessed from inside environments such as Barco’s high-end Canvas and Cave systems.

C3 AI has rolled-out a new AI-driven ESG application, C3 AI ESG. The system collects data from ERP and manufacturing systems along with exogenous emissions and commodity data to provide ‘comprehensive’ Scopes 1, 2 and 3 reporting that complies with SASB, GRI, TCFD, and CDP standards.

Cudd Well Control has announced a new cloud-based Wellhead Audit inspection platform that promises efficient, traceable and accurate wellhead audits. WA, part of the CuddAssured brand, provides information on well conditions including problems with corrosion, valve functionality and pressure. The customizable solution targets both producing and storage wells.

AUI Systems has announced the Cognitive Drilling Analyzer, a tool that ‘understands’ unstructured information in daily drilling reports. CDA is used in offset well analyses and to investigate incidents such as stuck pipe and mud loss. CDA has been validated on AUI’s global data set.

The EasyCopy Company has announced a beta release of EasyCore 3.0 with ‘EasyDB inside’, a new SQL database for core data integration, search and retrieval.

Enverus’ new Fusion Connect package combines proprietary and Enverus’ data in Prism to reveal ‘previously unseen’ opportunities and insights.

The GeTech Globe 2022, now rebranded as ‘the Earth’s digital twin’, is a new release of the 3D dynamic plate model of global geologic, climatic and oceanographic systems that can be used to locate, develop and operate petroleum and geothermal sources. The 2022 Globe now also targets the search for critical minerals and for suitable carbon capture and storage sites around the world.

Geovariances has released a new version of Isatis.neo, its geostatistical flagship. Isatis.neo is available in a ‘standard’ edition or in special Petroleum and Mining editions.

Yokogawa’s KBC Petro-SIM 7.3 release includes an emissions calculator for gas turbines and burners. Petro-SIM 7.3 also lays the foundation for new AI-based automated model maintenance features that are to roll out later this year.

‘Free’ new machine learning algorithms are included in the upcoming 5.6.3 release of L3Harris Geospatial’s ENVI. Support vector machines, random forest and other methods extend functionality with supervised and unsupervised classification and anomaly detection.

The Orfeo ToolBox from OSGeo, the Open Source Geospatial Foundation, provides state-of-the-art remote sensing algorithms for high resolution optical, multispectral and radar image processing ‘at the terabyte scale’. The OTB algorithms are accessible from Monteverdi, QGIS, Python, the command line or C++.

Ikon Science has announced RokDoc 2022.4, a major release that includes ‘powerful’ new machine learning features. A multi-well wavelet toolkit has been optimized for broadband seismic data and a new Rock Physics ML tool stores and propagates expert knowledge across geoscience and reservoir workflows. RPML was developed in collaboration with CSIRO, Australia’s national science agency, as an addition to RokDoc’s Deep QI module. More from Ikon Science.

‘Cognitive’ is just not enough for Schlumberger which has now announced a line of ‘Neuro-autonomous’ solutions, a.k.a. ‘connected, intelligent solutions to transform E&P workflows’. The Neuro package uses cloud-based software and connected intelligent systems to create a feedback loop between surface and downhole, ‘increasing the efficiency and consistency of E&P operations while reducing human intervention and footprint’. The first Neuro solution delivers steering autonomy for directional drilling combining artificial intelligence with surface and downhole automation workflows. In a separate announcement Schlumberger has unveiled ‘ProcessOps on DELFI’, a (deep breath) ‘collaborative, cloud-based digital facility twin that uses artificial intelligence and automation along with data and physics-based models to transform facilities workflows’. ProcessOps is deployed atop ‘Process Live’, a ‘data-enriched performance service’.

Aramco and Wireless Seismic are at work on a next-generation real-time seismic while drilling system, DrillCAM, that monitors the health of drilling equipment, the accuracy of the subsurface model and provides imaging ahead of the drill bit. More from Wireless Seismic.

Sierra Digital has unveiled new ‘low code/no code’ solutions that extend SAP’s business technology platform. EnerBridge simplifies production revenue and joint venture accounting. Sierra has also rolled out ‘OhZone’, a suite of workflow optimizers. More from Sierra Digital.


GO Digital Energy 2022, Amsterdam

More on Shell’s SSIP. Eigen’s knowledge graphs for safety critical operations. Cognite adds graph query to Data Fusion. Microsoft Energy Core - beyond the buzzwords and into the metaverse! Net Zero Tech Center’s P&A framework. Accenture on the metaverse … and OSDU. Lummus Digital deploys Mcube machine learning. Technip Energies ‘WISE’ digital twin. Earth Science Analytics EarthNET. Repsol Tech Lab and Technalia robotics. More from PipelineSentry, Prevu3D, Petrobras and start-up Cuurios.

Peter van den Heuvel provided an update on his GO Digital Energy talk of last year when he introduced Shell’s SSIP (Shell Sensor Intelligence Platform). SSIP, Shell’s OSIsoft PI-based real time data infrastructure, supports some 15,000 users capturing 12 million events per second. van den Heuvel presented the real time architecture that feeds data into the SSIP. The IIoT remote monitoring service (IRMS) monitors non-critical equipment using a LoRaWAN wireless mesh. Shell’s IoT is claimed to be a low-cost, feature-rich alternative to commercial offerings and comes close to matching wired connectivity. IoT sensors can be added onto plant items for vibration and temperature monitoring. Today some 2,500 pumps are monitored, providing early detection of potential failure. Hand valve sensors from Aloxy provide situational awareness of line-ups and unintended changes to operating conditions. Plans for the future include low orbit satellite connectivity, more use of PI Asset Framework and Azure, and full replication of PI data into SSIP. Shell is also working towards a ‘multiple cloud strategy’.

Murray Callander showed how Eigen has deployed knowledge graph technology to help Lundin’s engineers study blowdown events on the Norwegian North Sea Edvard Grieg field. Blowdowns are safety-critical operations that reduce equipment pressure to a safe level in an emergency, triggered automatically by the safety instrumentation. Analyzing the complex chain of events that caused a blowdown used to take a week, mostly spent on data collection. Eigen deployed its Eigen Analytics Platform, based on Neo4J’s knowledge graph*, added some event detection and a Python script to automate data gathering (a minimal sketch of such a query follows the footnote below), allowing Lundin’s engineers to spend more time on ‘adding value’. More on Eigen’s EAP here and on its work with Lundin here. Eigen has also worked with Wintershall Dea on a virtual flowmeter application, in collaboration with Turbulent Flux.

* Eigen uses the ‘free’ version of Neo4J’s knowledge graph.
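
A minimal sketch, assuming a Neo4j knowledge graph along the lines of the one Eigen built for Lundin, of what such an automation script might look like: pull the chain of events around a blowdown so engineers no longer assemble it by hand. The node labels, relationship types, properties and connection details are all assumptions, not Eigen’s schema.

from neo4j import GraphDatabase  # pip install neo4j

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

# walk back through the events that led up to a given blowdown (hypothetical graph model)
CYPHER = """
MATCH (b:Blowdown {event_id: $event_id})-[:TRIGGERED_BY|PRECEDED_BY*1..5]->(e:Event)
RETURN e.tag AS tag, e.timestamp AS ts, e.description AS description
ORDER BY ts
"""

with driver.session() as session:
    for record in session.run(CYPHER, event_id="BD-2021-014"):
        print(record["ts"], record["tag"], record["description"])

driver.close()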

Petteri Vainikka presented Cognite’s industrial knowledge graph, described as an ‘open, flexible, and labeled property graph data model that represents your operations’. Cognite puts the ‘average revenue lost’ due to poor business decisions at a surprisingly huge 30%, all down to bad analytics and bad data quality. While there are ‘abundant’ digital efforts in the pipeline and many successful proofs of concept, few actually make it into production. Even fewer provide a significant return on investment. Cognite is now pushing the ‘data product’ concept. A data product is an ‘owned and governed set of data that is built for a particular purpose’. The relationship between data products, the knowledge graph and Cognite’s Data Fusion flagship is unclear, but we believe that Cognite is adding Neo4J’s graph database and a GraphQL API to CDF.

Osama Hanna, promising to ‘go beyond the buzzwords’, introduced the Microsoft Energy Core, a ‘global initiative and center dedicated to digital transformation in the energy sector’. The MEC includes AI, cloud technologies, the internet of things and now (da..da!) the metaverse. Hanna cited an IDC study* that found one third of the companies studied stored their real time data in the Azure data lake (another third used an in-house PI historian). Hanna presented Microsoft’s generic ‘three horizons’ framework for digital transformation. McKinsey’s three horizons model was introduced some twenty years ago and considers short, medium and long term tactics and strategies for growth. According to Microsoft, ‘digital’ is shortening the horizons and blurring** the boundaries as described in a 2019 HBR paper by Steve Blank. Microsoft’s horizons are shrinking. In 2018 quantum computing was at the H3 ‘evolutionary’ stage. Today QC is H2. The H3 slot is now held by the metaverse. For more on the Microsoft metaverse, we read this blog to learn that ‘Because there will be no single metaverse platform or experience, interoperability is also crucial’.

Of course Meta (Facebook) is going to open up its Metaverse to all comers and will be doing everything it can to avoid becoming (another) monopoly! … err maybe not!!
* Hanna’s reference was to the 2019 IDC study. The 2021 Worldwide Industrial IoT Platforms in Manufacturing Vendor Assessment is now available. A snip at $15,000.
** blurring an already nebulous concept can’t be too hard!

Craig Nicol and Keith Hogg from the UK’s Net Zero Technology Centre presented a risk-based well P&A modelling framework developed for the UK National Decommissioning Centre, a partnership between the NZTC and the University of Aberdeen. The framework, published in the SPE Journal in 2021 is now being adapted to carbon capture and storage work.

Jan van den Bremen spoke to Accenture’s technology advocacy in regard to the ‘next waves of innovation’, i.e. quantum computing, extended reality and the metaverse. After which Paul Hodson gave a more prosaic argument in favor of the open subsurface data universe, OSDU. Hodson’s talk involved a balancing act between the perceived advantages of OSDU deployment and the requirement for handholding (presumably from Accenture) on the complex transformation journey that requires a ‘holistic approach’. More from the Accenture publication ‘And the walls came tumbling down’.

Another balancing act was on offer from Oleg Schkoda (Lummus Digital) as he presented the digital transformation of an unnamed ‘$3 billion petrochemical group’. He warned that ‘most digital transformations fail and do not deliver up to expectations’, that ‘90% of companies lack the skills and capabilities to deliver digital’ and other gotchas. The company’s problems stemmed from a ‘disjointed implementation’ of workflow software from SAP Ariba and others. Poor data management was also an issue and earlier proof-of-concept projects failed to scale. ‘Digital enablement’ was identified as a strategic imperative by leadership. Lummus’ solution included the establishment of a joint ‘AI Factory’ embedded in the client’s organization. This has now applied a DevOps/MVP approach to application development to address a large spectrum of use cases. The platform deployed is based on TCG* Digital’s Mcube advanced analytics and machine learning stack.

* Lummus Digital was formed when two companies (Lummus and TCG Digital) of The Chatterjee Group merged in 2020.

Francois Haynes introduced Technip Energies’ ‘Wise’ digital twin (as in ‘working intelligently and sustainably for better energy’). Yesterday’s ‘digitalization’ initiatives are over! Today’s facilities benefit from the ‘decarbonized digitalization’, heralded by the ‘fourth industrial revolution’. More prosaically, the digital twin allows for cost comparisons of different engineering concepts, feasibility studies that make for carbon-conscious choices and finally, a single source of truth for project continuity. Examples of the Wise DT include the automatic extraction of a fire proofing zone from the 3D model. Low-carbon and safer operations are enabled by a collaborative and immersive experience for training and reviews.

It’s curious that Haynes’ slide deck makes no reference to the CFIHOS initiative for, inter alia, ‘project continuity’, as Technip has put quite some effort into the initiative.

Tatiana Moguchaya presented Earth Science Analytics’ flagship EarthNET application, an ‘OSDU-ready’ platform for earth science data. Moguchaya’s presentation suggested considerable scope expansion for EarthNET into the fields of robotics, the digital twin and more. ESA is backed by Saudi Aramco Energy Ventures.

Alfonso Garcia from the Repsol Technology Lab has been investigating possible use cases for robotics in oil and gas. These include the use of drones to perform visual inspections of facilities such as tank farms and the building of digital damage and corrosion models. Repsol has also been working with partner Technalia on ‘indoor-outdoor logistics’ i.e. a self driving cart for shipping small items. Technalia is a member of the EU-backed Robott-Net R&D consortium that offers free advice on industrial robotics in the EU.

Amit Singh outlined three of Schlumberger’s initiatives as a ‘leading oilfield services company’. First is Delfi, Schlumberger’s ‘cognitive’ E&P environment, now billed as ‘powered by OSDU’ (although the small print is rather nuanced). Next up is Agora, a spin-out company that specializes in oil and gas edge computing. And also the Innovation Factori, a collection of world-wide R&D hubs working on AI and ML applications for energy engineering processes. Schlumberger has teamed with Microsoft to offer a ‘fully managed’ implementation of the OSDU platform. For more on Agora read the 2020 SPE paper ‘Edge Computing: A Powerful and Agile Platform for Digital Transformation in Oilfield Management’.

Hiroyuki Koito presented JGC’s Auto Plot Pathfinder, an automated design system that provides recommended plant floor plan options along with a quantitative evaluation of project feasibility. The tool is said to have proved a key milestone in JGC’s digital transformation and aligns the EPC’s front end engineering with customers’ requirements. Again, no mention of CFIHOS although JGC has been a member since 2019.

Mohammed Tomehy reprised Saudi Aramco’s IMOMS (Integrated Manufacturing Operations Management System). The solution is currently deployed at four refineries in the Kingdom with future expansion to other facilities and joint venture operations. Despite being announced as ‘commercially available’ in 2019, in 2022, Tomehy appears to be speaking of IMOMS in the future tense.

James Wardrop presented PipelineSentry’s eponymous pipeline data manager solution. PipelineSentry is work in progress. Wardrop is seeking a pipeline operator to sponsor its further development from MVP to operational product.

Nicolas Morency showed how Prevu3D converts large point cloud data sets into a precise 3D mesh that can be visualized, manipulated and shared with key stakeholders.

Jarbas Silva presented Petrobras’ strategic plan for 2022-2026 with key players SAP, Deloitte and Microsoft. The plan includes the use of Kairos for ‘IT transformation’.

Leed de Graf introduced his start-up Cuurios stating that data ‘should not be center stage if you want to improve your business’. It’s a better idea to analyze your workflows first and then look for relevant data to improve operations. The Cuurios software platform ‘optimizes workflows for asset rich businesses’.

The next GO Digital Energy Conference will be held from the 16th to 17th May 2023.


Folks, facts, orgs

3D at Depth, Asset Guardian, Atwell, BCCK, Bracewell, CSA Ocean Sciences, Canvass AI, Chevron, Cognite, ESG Global, Eaton, Enverus, Fugro, Greensea Systems, Hart Energy Conferences, Helix Energy Solutions, Honeywell, Ikon Science, Inductive Automation, Kongsberg Digital, Lufkin, Michael Baker, Opportune, PRCI, Pyxis, Quorum Software, Society of Petroleum Engineers, Siemens, Twin Brothers Marine, USA Compression Partners, Veriten, Verve Industrial, Offshore Energies UK, Object Management Group.

3D at Depth has appointed Rob Davidson and Tarry Waterson to market its Cuvier Deep subsea Lidar system.

Asset Guardian Solutions has hired Stephanie Calder as chief commercial officer.

David Richter is now EVP and chief growth officer with Atwell. He hails from Hill International.

BCCK has appointed Naomi Baker as director of engineering. She was previously with Enterprise Products.

Steven Cook is joining Bracewell’s environment, lands and resources practice. He was previously with the EPA.

Vanessa Ward has joined CSA Ocean Sciences as GIS Analyst. Cairra Martin takes up the post of GIS Developer. Ward hails from the Florida Department of Environmental Protection, Martin from Trimble.

Canvass AI has appointed Kevin Smith as chief commercial officer. He hails from Aspen Tech.

Chevron has promoted Alana Knowles to VP and Controller.

Gabriel D’Onofrio is to head-up Cognite’s expanding Latin American business. The company recently opened offices in Colombia, Argentina, and Brazil.

ESG Global has appointed Patrick Smith as MD North America and Japan and Karen Tegan Padir as chief product officer. Smith was previously with Antuit.ai, Padir with Binx Health.

In an internal promotion, Eaton has named Matt Hockman as president of its Crouse-Hinds, B-Line and Oil and Gas organization.

Jan Stoklasa heads-up Enverus’ new software development hub in Brno, Czech Republic.

Fugro has opened a new facility in the Jebel Ali Free Zone (JAFZA), expanding its footprint in the Middle East and India region. The facility hosts a state-of-the-art center for remote/autonomous operations.

Dennis Walsh has been hired as chief revenue officer at Genasys. He was previously working as Walsh Consulting Services.

Laura Krahn has been appointed director of programs with marine robotics specialist Greensea Systems. She hails from CWH Advisors.

Hadley McClellan has been named VP and GM of Hart Energy Conferences. She hails from the OTC.

Helix Energy Solutions has hired Diana Glassman and Paula Harris as directors. Glassman hails from Federated Hermes, Harris from Schlumberger.

Former Wood Group CEO Robin Watson has been appointed to the Honeywell board of directors.

Ikon Science is expanding its presence in the Americas, naming Dan Tostado (VP sales Latin America) and Vadim Khromov (VP sales North America). Tostado’s is an internal move. Khromov hails from CGG.

Inductive Automation has promoted Travis Cox to chief technology evangelist, Kevin McClusky to chief technology architect, and Jason Waits to CISO.

Shane McArdle is now CEO at Kongsberg Digital following Hege Skryseth’s departure to Equinor. McArdle’s is an internal promotion.

Brent Baumann is the new CEO at Lufkin. He hails from Weatherford.

Michael Baker has named Susan Howard to VP National Industrial Control Systems. She hails from Jacobs.

John Harris is now MD process and technology at Opportune LLP. He was previously with CubeLogic. The company has also hired Daniel Rojo as co-head and MD Opportune Partners. Rojo hails from Wells Fargo.

PRCI, the Pipeline Research Council International, has named Nick Homan (Marathon Pipeline Co.) to its executive board.

Rodney Smith, Steve Bradford and Matt Flanagan now serve the advisory group at Pyxis, a Stancil & Co. affiliate.

Gene Austin is to resign as CEO of Quorum Software and become chairman of the board. President Paul Langenbahn is the new CEO.

Sushma Bhan (Ikon Science) chairs a new Society of Petroleum Engineers data science and engineering analytics technical section, a merger of the previous digital energy, petroleum data-driven analytics and data science and engineering analytics technical sections. In a separate announcement, Mark Rubin, SPE CEO and executive VP, is to retire next year following 21 years’ service.

The Siemens supervisory board has appointed two new executive board members. Anne-Laure de Chammard will lead the ‘transformation of industry’ business area. CTO Vinod Philip assumes board responsibility for ‘global functions’. de Chammard is also CEO of ENGIE Energy Solutions.

Twin Brothers Marine has appointed Wayne Theriot as VP finance. He hails from TTMK Holdings.

Michael Pearl has joined USA Compression Partners as CFO. He hails from Western Midstream Partners.

Veriten has created a strategic advisory board with appointees Greg Armstrong (co-founder and former chairman and CEO of Plains All American Pipeline), Leslie Beyer (CEO of the Energy Workforce & Technology Council), Naomi Boness (MD of the Natural Gas Initiative at Stanford University), Bill Flores (entrepreneur and public policy leader) and Arjun Murti (director of ConocoPhillips).

Verve Industrial has hired Marcel Kisch as senior solutions consultant. He was previously with IBM.

We’re hiring

Enverus is hiring for its new software development hub in Brno, Czech Republic. Apply here.

Offshore Energies UK is looking for a new CEO following Deirdre Michie’s departure.

Death

The Object Management Group reports the death of Jon Siegel, VP of Technology Transfer (1993-2019). Visit his obituary page here.


Done deals

Altair and RapidMiner. Baker Hughes bags Quest Integrity. Aramco, Equinor back Data Gumbo (again). Element Materials Technology acquires Singapore Test Services. Faro and GeoSLAM. Hexagon, iConstruct. Katalyst/Geopost. Kofax and Ephesoft. Omni Environmental Solutions and Purity Oilfield Services. GAI Consultants bags PGH Petroleum & Environmental. Sercel and ION software. TGVest Capital backs TXOne Networks. Xpansiv to acquire Evolution Markets.

Altair is to acquire RapidMiner, a low-code platform for AI/ML applications.

Baker Hughes is to acquire Team Inc. unit Quest Integrity, boosting its asset integrity solutions offering.

Data Gumbo has secured $4 million series C funding from the Energy Ventures arms of Saudi Aramco and Equinor to finance its blockchain-based ‘smart contract’ network.

Element Materials Technology has acquired testing, inspection and certification provider, Singapore Test Services.

Faro has acquired GeoSLAM, a provider of ‘simultaneous localization and mapping’ (SLAM) software that creates 3D models for use in digital twin applications.

Hexagon is expanding its Smart Digital Reality offering with the acquisition of iConstruct, a provider of building information modelling (BIM) software.

Katalyst Data Management is acquiring Geopost Energy, a Brazilian provider of oil and gas data products and services.

Kofax has acquired Ephesoft, enhancing its intelligent document processing offering and sales.

One Equity Partners’ Omni Environmental Solutions unit is to merge with Purity Oilfield Services, creating a nationwide provider of environmental services and equipment.

GAI Consultants has acquired PGH Petroleum & Environmental Engineers, a provider of petroleum and regulatory consulting services. Generational Equity advised on the deal.

CGG’s Sercel unit has completed its acquisition of ION Geophysical’s software business.

Industrial Internet of Things security specialist TXOne Networks has raised $70 million in series B funding in a financing round led by TGVest Capital.

Xpansiv is to acquire Evolution Markets, a global carbon, renewable and energy markets brokerage.


2022 AVEVA PI World. Amsterdam

ENI’s Digital Oilfield: PI + AI. PI at core of TotalEnergies Digital Twin. ExxonMobil consolidates on PI Vision. OMV’s ‘Helius’ data ecosystem. Schlumberger as PI SI. Inpex PI for Ichthys LNG. Aveva Video Wall for Aramco 4IRC. Namur, PlantXML for Evonik OneCAE. Energy is ‘eternal delight’ (ENI and William Blake)!

Matteo Boscato and Giuseppina Tomei presented on the optimization of the water treatment system on ENI’s ‘complex’ Armada Olombendo FPSO offshore Angola. The solution leverages machine learning on top of PI System data to provide predictions and anomaly detection at an ‘Integrated Operations Center’. The ‘digital plant’ displays information on onboard processes including four AI/ML algorithms for water treatment. EDOF (Eni’s Digital Oilfield, a PI System development) acts as a single source of real time data. The ML toolset includes components from Deepmind/Impala and Apache Hive. Data display uses Appian BPM linked to the PI System.

Virginie Segard and Gaël Cottet presented TotalEnergies’ approach to PI System data management. TE has been using PI for over 20 years and has developed many local use cases and governance best practices. The aim now is to scale these up, with strong data management and central governance, to create a ‘robust data contextualization platform’ supporting PI AF/Vision for data sharing, deployment and maintenance. TE is developing digital twins of its plants from generic site templates, localized with site specificities such as UoM, language and historic tag naming conventions. Scalability is achieved with a distributed AF architecture running on central and local servers. On the niceties of data management vs. data governance, the authors pointed to a Tableau reference. Templates and naming conventions have been mapped and standardized and now, all generic use cases are developed by a team of AF developers and process engineers working on an Azure DevOps platform. To date some 400 AF master templates cover various activities. These have been instantiated to local sites for processes such as boilers, furnaces and reformers, all with PI Vision-based displays. An AF-to-AF manager is used to replicate between sites, with particular attention to units of measure: ‘we guarantee that everyone works with the same UoM conversions’. Other custom tools have been developed for AF data validation and to detect differences between AF source and reference data. Work is carried out in a new, consolidated ‘OneTech’ organization with some 3,400 engineers, researchers, technicians and support teams.

Azim Ezani bin Muzamli explained how ExxonMobil is moving from decentralized implementations of PI Process Book to a consolidated deployment on PI Vision. The 2021 release of PI Vision is an operational visualization tool that integrates with PI Asset Framework. ExxonMobil is leveraging PI Vision to manage load balance, failovers and offer a centralized visualization platform.

Amir Sadeghi and Yassine Messaoud presented OMV’s ‘Helius’ data ecosystem, underpinned by a PI System-based ‘single source of truth’ for operations. Helius is the central point of access for all E&P data for application developers, citizen developers and end users. Other Helius components include OSDU, Cognite Data Fusion and Esri ArcGIS. The system provides interfaces to other applications from Seeq (analytics), Halliburton (Decision Space), Siemens and Black & Veatch. One use case involved compressor membrane failure identification using Seeq to spot anomalous pressure and instability before failure happens. The deployment is a component of OMV’s DigitUP digital journey. More from OMV and Oil IT Journal.

Vishal Mahna and KK Chong presented Schlumberger’s enterprise visualization platform for asset performance management. The Schlumberger EVP corrals distributed data to provide contextual insights and optimal decisions. Reading between the lines of the presentation, it would appear that Schlumberger here is playing the role of a systems integrator, building its EVP on top of multiple Aveva tools running under the control of the Aveva Unified Operations Center.

Inpex Corp.’s first use of the PI System was as an enterprise historian. Since then, usage has evolved, as Naoto Yamabe and Kohei Kawamura described. PI played an essential role during the commissioning of the Ichthys LNG asset. Inpex has migrated its Japanese PI System infrastructure to the Microsoft Azure cloud, streamlining maintenance workloads and optimizing its corporate IT infrastructure. With help from systems integrator MKI Japan, the whole on-site PI infrastructure was migrated to Azure in about a year, including an upgrade from Process Book to PI Vision. Various options for data migration were investigated. Negotiations with stakeholders were necessary to accommodate the eight hours of downtime during the switchover. A detailed cost breakdown showed that over a five year period, total cost of ownership is about the same for on-prem as in the cloud. Interestingly, the hardware cost for on-prem is much less than the cost of the Azure virtual machines. But the cloud has offsetting advantages: on-prem implies extra costs in server setup and IT labor. IT can now ‘focus on supporting process engineers working on advanced analysis’. Use cases include analytics/ML with Databricks and visualization with PowerBI. Note that this migration was limited to Inpex’ Japanese data. Migrating the Australian Ichthys LNG systems is a more complex task that is still under study. The plan is for a scaled-up ‘hot’ cutover for Ichthys that adheres to the joint venture’s ‘zero data loss’ philosophy.

Abdulaziz Alzahrany and Reham Faqehi presented on the role that the PI System plays in Saudi Aramco’s Fourth Industrial Revolution Center (4IRC), first unveiled in 2019. PI System is deployed to automate rotating equipment switchover. Tools used include PI System, Aveva InTouch and Unified Operations Center. The Aramco 4IRC presentation included a glimpse of a spectacular Aveva Video Wall project; predictive analytics using Aveva Avantis PRiSM also ran.

Tom Jacobs and Stephan Leufke from Evonik (a German-headquartered specialty chemicals company) presented the company’s OneCAE project to transform its existing plants with a ‘harmonized’ CAE* landscape. Evonik currently has a heterogeneous and complex system landscape with over 164 engineering IT systems, some paper-based work processes and poor data governance. The OneCAE project sets out to fix this with a digital twin (DT), new data-driven processes, data quality assurance and an alignment between the physical plant and associated digital information. Evonik’s earlier digitization work has leveraged standards including PlantXML, DEXPI and, most recently, the Asset Lifecycle Data Model from the German Namur standards body. The latter is the basis (inspiration?) of OneCAE, whose target landscape is an assembly of software tools from Aveva, SAP and others. The user interface is to be Aveva Engage. Currently the project is at the MVP** development stage. A substantial migration and digitization of existing plant documents and data, including laser scans, is to create a virtual as-built representation of the plant. The target DT platform includes multiple Aveva tools along with SAP.

* Computer-aided engineering.
** Minimum viable product.

Finally, we have to mention Gianmarco Rossi and Luca Cadei’s presentation on the use of ‘data democracy’ as an enabler for Eni’s upstream digital transformation. The paper reprises Eni’s work on the e-DOF (above) to show how this is being extended with data science and PI Vision graphics, accessed from an Integrated Operations Center (IOC). But what most caught our fancy was the literary endnote to the presentation, a snippet from poet William Blake’s ‘Marriage of Heaven and Hell’ viz …

Man has no Body distinct from his Soul for that call’d Body is a portion of Soul discern’d by the five Senses, the chief inlets of Soul in this age.
Energy is the only life and is from the Body and Reason is the bound or outward circumference of Energy.
Energy is Eternal Delight.

All this may or may not be relevant to the e-DOF, but the Blake snippet comes from the section titled ‘The voice of the Devil’, where the three lines are offered as contraries to the errors of conventional ‘Bibles or sacred codes’.

More from the PI World home page.


PIDX 2022 European Conference: Journey to Net-Zero

PIDX Emissions Transparency Data exchange work group update. Schlumberger’s move to net zero. Future Energy Partners on flaring and emissions monitoring and mitigation … and greenwashing! Open Footprint Forum/PIDX collaboration – work in progress.

Chris Welsh, Chair of the PIDX Emissions Transparency Data Exchange Work Group, cited the Carbon Trust’s definitions* of the different scopes in carbon reporting. There are two approaches to reporting: top-down ‘science-based’ targets and bottom-up aggregation of line item emissions. The latter dovetails neatly with PIDX XML-based invoicing, which now includes a <pidx:EmissionsData> collection of GHG volumes, units of measure and scope. Data exchange between operator and suppliers using PIDX order creation and invoicing can embed full-cycle emissions data reporting, although this can get quite complicated! Welsh’s straw man describes the ‘art of the possible’. Currently, reporting leverages top-down, macro-level industry averages. Welsh sees these being gradually replaced by detailed, line item, bottom-up reporting by 2030. There are of course challenges. As The Carbon Call puts it, ‘Today, carbon accounting suffers from data quality issues, measurement and reporting inconsistencies, siloed platforms, and infrastructure challenges. This makes it difficult to compare, combine and share reliable data, particularly for companies.’ PIDX orchestration of supply chain messages is seen as the way forward. Welsh concluded with a reference to ongoing collaboration between PIDX and the Open Footprint Forum.

* A guide to carbon reporting and the scopes is available from the Trust here.
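By way of illustration, here is a minimal sketch in Python of what an emissions block embedded in a PIDX invoice line item might look like. Only the <pidx:EmissionsData> collection is named above; the namespace URI, parent element and child element names (GHGVolume, UnitOfMeasure, Scope) are illustrative assumptions, not taken from the PIDX schema.

# Minimal sketch only: builds a hypothetical <pidx:EmissionsData> block for a
# PIDX invoice line item. Child element names (GHGVolume, UnitOfMeasure, Scope)
# and the namespace URI are illustrative assumptions, not the PIDX schema.
import xml.etree.ElementTree as ET

PIDX_NS = "http://www.pidx.org/schemas"  # placeholder namespace URI
ET.register_namespace("pidx", PIDX_NS)

def emissions_element(volume_tonnes_co2e: float, scope: int) -> ET.Element:
    """Return a pidx:EmissionsData element carrying volume, unit and scope."""
    emissions = ET.Element(f"{{{PIDX_NS}}}EmissionsData")
    ET.SubElement(emissions, f"{{{PIDX_NS}}}GHGVolume").text = str(volume_tonnes_co2e)
    ET.SubElement(emissions, f"{{{PIDX_NS}}}UnitOfMeasure").text = "tCO2e"
    ET.SubElement(emissions, f"{{{PIDX_NS}}}Scope").text = f"Scope{scope}"
    return emissions

if __name__ == "__main__":
    line_item = ET.Element(f"{{{PIDX_NS}}}InvoiceLineItem")  # hypothetical parent element
    line_item.append(emissions_element(12.4, scope=1))
    print(ET.tostring(line_item, encoding="unicode"))

In a real exchange, such a block would travel inside a standard PIDX order or invoice message rather than being built in isolation as here.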

Reem Radwan revealed that Schlumberger is aiming for net zero by 2050, with ‘minimal reliance on offsets’. Clearly for any company in the oil and gas supply chain, scope 3 is the elephant in the room. Schlumberger has analyzed 15 categories of scope 3 emissions and is working with suppliers to improve disclosure data and help them reduce emissions. Emissions are studied at a very granular level – down to the inputs and emissions from individual offshore equipment items. Schlumberger is proposing, along the lines of the GHG Protocol, to include ‘avoided emissions’ by its clients in its reporting. Thus technology substitution, such as the sale of a subsea boosting unit that avoids gas lift, is included in the calculation. Hardware for ‘digitally-enabled emissions quantification’ also ran.

Greg Coleman explained how Future Energy Partners (FEP) provides a suite of top-down and bottom-up technologies to facilitate flare and methane emissions mitigation and reporting. Coleman observed en passant that emissions from the energy vertical are significantly less than those released by biomass and agriculture, citing a Manchester University study of global biogeochemical cycles. He moved on to the subject of greenwashing. Companies’ reported environmental, social, and governance (ESG) data is often unaudited. Some ‘greenwashers’ may reveal large quantities of ESG data but in reality perform poorly. FEP has analyzed some 1,925 large-cap firms to create a greenwashing scorecard for the EU majors. He observed that although the companies have allocated significant resources to their GHG reporting, none has a clear, comprehensive and quantitative system in place. FEP’s analysis found differences between the companies studied that might translate into a competitive advantage, ‘which should help investors make informed choices’.

Sumouli Bhattacharjee presented on collaboration between The Open Group’s Open Footprint (OFP*) forum and PIDX. Bhattacharjee made a strong case for collaboration, but how this is going to be achieved is work in progress. The work is being carried out by the ‘A team’, a.k.a. OFP Workstream 3, which is ‘defining the data model and schema for a robust Scope 3 capability’. Chris Welsh from PIDX is on the team.

* The Open Footprint Forum is variously acronymized as OFF or OFP. The latter is canonical.

Comment: PIDX would appear to be further advanced than OFP in the field – although OFP is more secretive than PIDX. An earlier PwC presentation stressed the ‘criticality of the data model’. Critical it may be but there is no sign of where or what it is!


Sales, partnerships, deployments

Arbo/East Daley. BP/Microsoft. Peloton/Texas A&M. Bridger Photonics/Repsol/MiQ. CapturePoint/Energy Transfer/CENLA Hub. Project Canary/Sensirion. ADNOC/Dataiku. Shell/Aker Solutions. Cognite/Schlumberger. By-Lo Oil/ClearDox. CGI/UiPath. Down Under Geophysical/Vast Data. Datagration/OneNexus Environmental. Dover Fueling Solutions/Bottomline. East Daley/E&P Cash Flow Modeling. Equinor/Vissim/Aker BP. World Fuel Services/Fivetran/Snowflake. Fortress Information Security/ONGISAC. Fugro/Ocean Industries Concept Lab. GE Digital/AWS. Ghost Robotics/HUVR. Halliburton/SDAIA. Maillance/INT. Intelligent Wellhead Systems/Corva. Indian Oil Corp./Fiserv. Aegion/Ivalua. Kevton Technologies/Velo3D. NOV/Bardasz. Parsons/AVEVA. RINA/Asprofos/Gastrade. Wintershall Dea/Schlumberger. Resoptima/RoQC/Schlumberger. Aker Solutions/Subsea 7/Schlumberger. UK North Sea Transition Authority tenders. Childers Oil/iRely.

Upstream/Data

Arbo is partnering with East Daley on the provision of Permian Basin pipeline intelligence. East Daley’s crude oil throughput models are now integrated with Arbo’s Liquids Platform in a new intelligence report, the Permian Edge.

BP is using Microsoft’s Intelligent Data Platform and Azure Synapse to consolidate data from multiple cloud and on-premise sources, accelerating the production of data products, and training AI/ML models.

Peloton is collaborating with Texas A&M, providing content for two courses within the university’s Petroleum Engineering curriculum. Content covers the role of data management and analytics in the petroleum industry in Dr. Catherine Sliva’s Introduction to Petroleum Engineering course. Peloton is also a Capstone Project Sponsor for the Interdisciplinary Data Analytics Practicum, led by Prof. Sutanoy Dasgupta, where a Peloton representative is to train senior-level students.

Emissions

Bridger Photonics is to deploy its airborne Gas Mapping LiDAR methane detecting technology across Repsol’s Marcellus Shale assets. Quarterly overflights will feed emissions data for independent certification by MiQ, a non-profit third-party certification body.

CapturePoint Solutions has signed a letter of intent with a wholly-owned subsidiary of Energy Transfer to participate in a feasibility study to capture CO2 emissions from Haynesville Shale natural gas production facilities for sequestration in the CPS Central Louisiana Regional Carbon Storage Hub (CENLA Hub). The CENLA Hub has the potential to be one of the largest onshore deep underground carbon storage centers in the United States. If commercially viable, the project will launch as a joint venture between Energy Transfer and CapturePoint.

Project Canary has teamed with Sensirion Connected Solutions to provide methane detection and measurement services in the US. Canary’s climate analytics platform will now take data from SCS’ Nubo Sphere metal-oxide technology-based sensors.

Operations/Production

Abu Dhabi National Oil Company’s audit analytics team has deployed Dataiku’s data wrangling and machine learning solutions to review the adequacy and effectiveness of its risk mitigation controls covering the measurement and back allocation of crude oil production. More from Dataiku.

Shell has awarded the engineering, procurement, construction, and installation contract for its North Sea Jackdaw Platform to Aker Solutions. More in the release.

Aker’s 50%-owned Cognite unit is teaming with Schlumberger to deliver data-driven solutions to the energy industry. The partnership combines Schlumberger’s software and market reach with Cognite’s Data Fusion industrial data platform.

Miscellaneous software/hardware

By-Lo Oil Co. has selected ClearDox’ intelligent document processing solution to automate pricing reconciliation. ClearDox’ Spectrum IDP platform will digitize By-Lo’s manual processes and emailed pricing data.

CGI and UiPath have announced a new managed services partnership to accelerate oil and gas digital transformation through automation. CGI’s Accel360 automation-as-a-service offering will be deployed alongside UiPath’s business automation platform. More from UiPath.

Down Under Geophysical (DUG) is migrating its big data environment from on-site hard disk-based storage to a flash-based, parallel file system from Vast Data.

Datagration is working with OneNexus Environmental on the ‘trillion-dollar problem’ of aging oil and gas well infrastructure. OneNexus will use Datagration’s PetroVisor to assist with engineering, operations and financial analysis.

Dover Fueling Solutions is to add Bottomline’s fuel logistics software to its end-to-end fuel management solution.

East Daley Analytics has joined forces with E&P Cash Flow Modeling to integrate its gathering and processing data with E&P’s cash flow forecasting software.

Equinor has teamed with Norway’s Vissim on a ‘new and expanded’ surveillance system for North Sea operators. The ocean space surveillance and vessel traffic management system targets improved safety and cost efficiency of marine operations. Vissim is also developing a spill monitoring and detection platform for Aker BP.

World Fuel Services has deployed a real-time data warehousing solution from Fivetran and Snowflake to identify new business opportunities and unify its customer list. The cloud-based data warehouse is displacing a legacy on-premise Oracle database. More from Fivetran.

Fortress Information Security and the Oil & Natural Gas Information Sharing Analysis Center have announced an industry-wide initiative to secure hardware and software components and supply chains. The solution will leverage Fortress’ repository of supply chain data developed for utilities and the US Department of Defense.

Research teams from Fugro and the Ocean Industries Concept Lab at the Oslo School of Architecture and Design are working to unify offshore control systems with a ‘next-generation workplace’ for safe remote operations.

GE Digital has achieved AWS Industrial Software Competency Status and is adding its offerings in operational intelligence and manufacturing execution system to the AWS Marketplace. The solutions include GE’s Proficy Historian for Cloud and Proficy Smart Factory.

Comment – there is something of the ‘how are the mighty fallen’ in this announcement. Only a few years back, GE itself was going to be a ‘top 10 software company by 2020’ with its hosted ‘Predix’ big data solution.

Ghost Robotics and HUVR have teamed to control a quadruped robot inspector from Ghost with HUVR’s inspection data management software (IDMS) platform. The system was demonstrated recently at Quasset’s test facility in Houston.

Halliburton has signed a memorandum of understanding with SDAIA, the Saudi Data and Artificial Intelligence Authority for the provision of its DS365.ai data science and artificial intelligence offering. The solutions will be deployed at AICE, the Saudi AI Center for Energy (AICE), a joint venture between the Ministry of Energy and SDAIA.

Maillance, a software-as-a-service boutique providing augmented intelligence business applications to the oil and gas vertical, has selected INT’s IVAAP geoscience and engineering visualization toolkit for its custom workflows.

Intelligent Wellhead Systems has joined the Corva partnership program to optimize completions operational performance and maximize well productivity. IWS’ SIMOPS app is now available on the Corva App Store.

Indian Oil Corp is to deploy Fiserv’s smart point of sale terminals at some 15,000 retail fuel outlets. Fiserv’s Carat operating system provides cloud-based processing of debit cards, credit cards, FASTag-enabled payments, QR payment and digital wallets.

Aegion has selected Ivalua to digitize its procure-to-order process across its MRO services.

Houston-based Kevton Technologies has signed a sales agreement with Velo3D for the supply of seven Sapphire 3D printers.

NOV is to integrate Bardasz Octopus WITSML technology into its cloud-based drilling data delivery platform NOV Max.

Parsons is now a registered system integrator with AVEVA. Its Parsons X vendor-agnostic toolbox for energy, oil and gas facilitates real-time, data-driven project and asset management decision making.

RINA, an inspection, certification and consulting engineering service provider in partnership with Asprofos, a Greek engineering consultancy, is to provide project management services to Gastrade’s Alexandroupolis Independent Natural Gas System (INGS).

Wintershall Dea has selected Schlumberger as ‘preferred partner’ on its ‘Terra Nova’ subsurface transformation program. Schlumberger’s Enterprise Data Management solution ‘built specifically for OSDU’ will underpin Wintershall Dea’s ‘OSDU-enabled future’.

Schlumberger has announced a digital platform partner program whereby independent software vendors can leverage the ‘openness and extensibility’ of Schlumberger’s DELFI digital E&P platform. Partnership poster children include Resoptima’s ResX package and RoQC’s LogQA.

Schlumberger, Aker Solutions and Subsea 7 have formed a joint venture to provide subsea production and engineering services. The deal builds on an existing subsea integration alliance between Schlumberger and Subsea 7. For more on the financials of the deal visit the Schlumberger investor center.

The UK North Sea Transition Authority is to publish upcoming ‘multimillion pound’ maintenance and operations contracts as North Sea operators add tenders for operations and maintenance work. The tenders will be available on the Energy Pathfinder website.

Childers Oil is to deploy a cloud-based wholesale and ERP solution from iRely.


Standards stuff

OPC Foundation and FieldComm to interoperate. ClassNK rolls-out marine standards Rule Viewer. EU Commission reports on INSPIRE geographic information standard. SPDM webinar introduces EU PEPPOL procurement standard for process and power. OSGeo PyGeoapi now ‘fully-fledged’. PyTorch moves to Linux Foundation. SPE updates SRMS for CCS. Eclipse Foundation rolls-out Sparkplug V3.0. TOGAF V10 released. W3C’s geolocation API now a Recommendation. MathML V4.0 announced.

The OPC Foundation and FieldComm Group are to provide an interoperable interface between PLC/DCS and instrumentation devices such as transmitters, instruments and actuators. The solution targets oil and gas and other verticals. A new OPC UA Instrumentation Working Group is being hosted by the OPC Foundation under its Field Level Communications initiative. More from OPC and FieldComm.
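For a flavor of what reading instrumentation data over OPC UA looks like in practice, here is a minimal sketch using the open source Python asyncua library (our choice of tooling, not something named by either organization). The endpoint URL and node id are placeholders.

# Minimal sketch: read a process value from an OPC UA server (e.g. a PLC/DCS
# or instrument gateway) using the open source asyncua library. The endpoint
# URL and node id below are placeholders for illustration only.
import asyncio
from asyncua import Client

ENDPOINT = "opc.tcp://192.168.0.10:4840"   # placeholder device/server endpoint
NODE_ID = "ns=2;s=TT101.PV"                # placeholder transmitter node id

async def read_value() -> None:
    async with Client(url=ENDPOINT) as client:
        node = client.get_node(NODE_ID)
        value = await node.read_value()
        print(f"{NODE_ID} = {value}")

if __name__ == "__main__":
    asyncio.run(read_value())

The interoperability work described above is about standardizing what sits behind such a node id, so that transmitters and actuators from different vendors expose their data the same way.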

Japan’s ClassNK marine standards body has a new web application ‘ClassNK RuleViewer’ for its Rules and Guidance. ClassNK issues standards and certifications for ships including tankers and LNG carriers. More from ClassNK.

The EU Commission has just published an evaluation of its Inspire directive, a standard for EU-wide exchange of geographic information. The evaluation comes in a pair of documents totaling over 200 pages. These describe, in a somewhat self-congratulatory manner, how Inspire has been deployed in support of the ‘dual digital and green transitions’. The evaluation does note that the ‘current interoperability and technical provisions should make better use of state of the art digital technology and an improved governance structure’. Curiously, while there are now some 83,000 spatial data sets available in the EU, this is actually fewer than were available in 2016. The full evaluation documents can be downloaded here.

In a Society of Petroleum Data Managers webinar, Tormod Tønnesen (Norsk Olje & Gass) and Andre Hoddevik (Peppol) presented Peppol, a standard for process and power procurement across the EU and worldwide. The Peppol Network provides a set of technical specifications that can be implemented in existing eProcurement and eBusiness exchange services to make disparate systems interoperable across Europe. Peppol enables trading partners to exchange standards-based electronic documents including orders, shipping notes, invoices, catalogues and more.

OSGeo’s pygeoapi project is now a ‘fully-fledged’ OSGeo project managed by Tom Kralidis. pygeoapi is a Python server implementation of the OGC API suite of standards. The project provides an off-the-shelf capability for data providers to deliver geospatial data, metadata and services using the OGC APIs. Python developers can extend, customize and integrate solutions with custom plugins and templates. The API supports microservices architectures and can be deployed using a number of different deployment patterns (cloud, on premises, etc.). pygeoapi is open source and released under an MIT license. More from OSGeo.
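As a quick illustration of the OGC API style that pygeoapi serves, the following sketch queries a Features endpoint over plain HTTP. The base URL and collection id are placeholders; the /collections and /collections/{id}/items paths are the standard OGC API Features routes.

# Minimal sketch: query an OGC API Features server such as a pygeoapi instance.
# BASE and COLLECTION are placeholders; adjust to a real deployment.
import requests

BASE = "https://example.org/geoapi"   # placeholder pygeoapi base URL
COLLECTION = "wells"                  # placeholder collection id

# Discover available collections (standard OGC API route)
resp = requests.get(f"{BASE}/collections", params={"f": "json"}, timeout=30)
resp.raise_for_status()
print([c["id"] for c in resp.json().get("collections", [])])

# Pull the first ten features of one collection as GeoJSON
items = requests.get(
    f"{BASE}/collections/{COLLECTION}/items",
    params={"f": "json", "limit": 10},
    timeout=30,
)
items.raise_for_status()
print(f"features returned: {len(items.json().get('features', []))}")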

PyTorch is moving to a new home at the vendor-neutral Linux Foundation. PyTorch was originally developed at Facebook/Meta and is now ‘one of the fastest growing open source software communities in the world’. LF recently announced a free course, ‘PyTorch and Deep Learning for Decision Makers’ that explains how the deep learning framework can automate and optimize state-of-the-art AI applications. More from LF.

The Society of Petroleum Engineers has updated its Storage Resources Management System (SRMS) for Carbon Dioxide Capture, Utilization and Storage. The new edition is available from the SPE here, a snip at $25 for members!

The Eclipse Foundation has announced a Version 3.0 release candidate for its Sparkplug IoT/Edge Computing and process control standard. More from the GitHub Sparkplug channel.

The Open Group Architecture Forum has announced the tenth edition of its venerable TOGAF standard for enterprise architecture. A new modular structure simplifies navigating and applying the framework, and business strategy guidance has been added, including for Agile methodologies.

The World Wide Web Consortium’s Geolocation API is now a W3C Recommendation. The Geolocation API provides access to geographical location information associated with a hosting device and was developed by the W3C Devices and Sensors Working Group.

The W3C has also updated MathML to Version 4.0. MathML is a markup language for describing mathematical notation and capturing both its structure and content in a web page. More from the MathML home page.


Devon Energy at Seeq Conneqt 2022

‘Just say no to Excel’. Seeq plugs into Devon’s PI System for integrated analytics. Working with data scientists is hard.

Don Morrison (Devon Energy) presented on Seeq at Scale for Data and Analytics Integration. Devon has a culture of innovation and a large inventory of future projects across its US assets. One challenge for data science is the widespread use of Excel. The answer for Morrison is ‘just say no to Excel!’ Another issue is the fact that commercial analytics platforms often require a copy of Devon data in their cloud instance. Moreover many such solutions are ‘black boxes’. ‘Our engineers want control of the math!’ ‘They are experts in their data and have many ideas for solving operational problems’. Transferring these ideas into action is hard and working with data scientists is time consuming.

Seeq provides a better environment for data science. A direct connection to Devon’s PI System allows engineers to experiment with their data, iterating use cases and edge cases into a broad solution that can then be scaled up, with some additional engineering tinkering with the math. The engineering experience has improved over Devon’s legacy processes. Time to solution is faster and development resources are only required at the implementation phase. Seeq’s ‘export directives’ feature is used to scale a solution to hundreds of assets and write the results back to the PI data archive; the directive controls what, where, and how often. Devon has successfully scaled a complicated calculation from a single asset in Seeq to 250+ similar assets, with results written back to the PI archive. Seeq manages the schedule and formulas. PI Vision is used for data visualization. ‘We want the results of solutions from Seeq to be available in our PI System’. Seeq is leveraged as an advanced calculation and data science tool for PI users. Watch the Devon video here and download the PDF.

Chevron is another oil and gas Seeq user. The 2022 Seeq Conneqt keynote was given by Brent Railey, data science manager with Chevron Phillips Chemical.

More from Seeq.


2022 ARC Industry Forum

Peter Reynolds on ARC’s sustainable digital twin work group. Today’s engineering digital twins ‘die after handover’. ExxonMobil on engineering/IT approaches that are ‘solutions looking for a problem’, on ‘lost technical debt’ and the need for an agnostic digital twin.

Speaking at the 2022 ARC Industry Forum Peter Reynolds (ARC) introduced a session devoted to the notion of a ‘sustainable and open’ asset digital twin to support the plant lifecycle. ARC is hosting a Sustainable Asset DT Working Group, along the lines of Exxon’s work on open automation (which led to the ongoing OPA Forum). The focus of the WG is openness in the context of plant operations and maintenance. Today’s solutions depend on siloed data created in point solutions with different tools. This leads to successive lost revenue opportunities as data is exchanged between stakeholders who leverage yet more tools! By eliminating the information handover dips, ‘profitability would be off the chart’. Fields of application include brown fields, green fields, new reactors, furnaces, piping and upstream hardware. ‘This is just never ending’. The WG includes Chevron, Dow, Exxon and Shell, and is further presented in an ARC white paper, Moving Toward Sustainable Asset Digital Twins. ARC defines the DT thus, ‘A digital twin is a digital replica of a physical asset using a 3D model and/or math algorithms’. Current plant digital twins enable collaboration among those using engineering data and documents during design, build and commissioning. The DT then becomes a vehicle for continuous handover of engineering data and documentation to operations. The DT is said to address sustainability in the ‘currently underserved’ asset management business where ‘nearly all digital twins created during design and build decay and die after handover’.

ExxonMobil’s Michael Hotaling bemoaned the way that many IT/engineering approaches are ‘solutions looking for a problem’. These are ‘sexy shiny keys’ that the vendors hand over and, a couple of years down the road, ‘our technical debt has been lost’. Hotaling sees the DT as essentially a 3D model. ExxonMobil already has some 80-90 use cases for the DT, but these focus on benefits, less on costs and sustainability, which is where we are today. He suggested being humble: ‘oil and gas is not a leader in the DT space, the automotive and airline markets are much further along’. This is an industry-wide opportunity to transform how we operate. Why are we stuck with 2D drawings? One reason is regulatory reporting. Industry needs to talk with a single voice to the regulator and persuade them of the merits of a 3D approach. Hotaling warned that the DT term can be misleading. DTs may cover physics, engineering and thermodynamics, which are out of scope for the current work group. While a 3D model is just a dumb picture until you interlace it with data, data cleansing and management are likewise out of scope. Hotaling’s focus is 3D visualization of the asset. This is where an open architecture is required, with (as for OPA) a separation of software from data. To support everyone’s needs it must be open, otherwise ‘it will die a slow death’. 3D model maturity is defined by complexity, level of integration (how many datasets) and the number of sustainable business use cases (Hotaling admitted that, to date, Exxon has only one!). Things will get interesting when you can add time data. Currently it is hard to reconcile documents covering engineering design, as-built, approved and change-managed states. Model fidelity and update frequency is a constant battle. The current state of 3D and data is ‘all over the map’, but we now know what not to do. The future DT will be agnostic in terms of imagery. There is huge potential to capture 3D data from Lidar or 3D images on phones. Scanning data into an open architecture will automate model update, moving on from today’s custom-built, costly models. So is this down to a lack of standards or too many? No current reality-capture-to-3D workflow is standards-based, and this is where the group is trying to work. It will take the whole industry, both operators and suppliers, to get there.

Watch Hotaling’s Journey to a Sustainable Digital Twin on YouTube and the ensuing ARC Panel Discussion.

For more on the SADT, watch the video here and download the ARC white paper.


Are you on Edge?

What is the Edge? Is it AI? Is it visualization? Is it low latency? Or all of the above? Answers from Frost & Sullivan, the EU CAPRI project and NVIDIA.

A 20-page publication from Frost & Sullivan, ‘Edge Computing Drives Industrial IT/OT Convergence in Oil & Gas’, posits that ‘broad, organizational information technology (IT) and operational technology (OT) convergence, in which data-centric IT systems integrate with operations equipment and technology, can provide a huge competitive advantage when executed successfully’. In a recent survey*, IT-OT convergence was cited as one of the most important factors for organizations to achieve their strategic goals. F&S highlights Red Hat as ‘a leading organization focused on helping industrials align IT and OT’. Apart from in the title, the F&S study barely mentions edge computing.

* A 2021 Frost & Sullivan survey of 376 industrial organizations working in the utility, automotive, manufacturing, and oil & gas markets.

A group of Italian researchers led by Antonio Salis (EII) has published an ‘edge-cloud based reference architecture’ for the process industry, a.k.a. the ‘CAPRI’ reference architecture. CAPRI is an EU Horizon 2020 project to develop CAP, a ‘cognitive automation platform’. The CAP reference architecture leverages open source frameworks such as Fiware along with multiple other industrial frameworks (RAMI 4.0, IDSA and BDVA). A ‘Data Space Enabler’ extracts data from multiple sources including MQTT, CoAP and OPC UA.
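To give a concrete feel for the kind of edge data ingestion a component like the ‘Data Space Enabler’ performs, here is a minimal sketch of an MQTT subscriber in Python using the paho-mqtt library (v1 callback API). The broker address and topic filter are placeholders and are not taken from the CAPRI project.

# Minimal sketch of edge-side data ingestion over MQTT, one of the source
# protocols listed above (alongside CoAP and OPC UA). Broker address and topic
# filter are placeholders, not CAPRI specifics. Uses the paho-mqtt 1.x API.
import json
import paho.mqtt.client as mqtt

BROKER = "broker.example.com"      # placeholder edge broker
TOPIC = "plant/+/sensors/#"        # placeholder topic filter

def on_connect(client, userdata, flags, rc):
    print(f"connected rc={rc}")
    client.subscribe(TOPIC)

def on_message(client, userdata, msg):
    # Hand each reading on to the cloud/data-space layer (here, just print it)
    try:
        payload = json.loads(msg.payload)
    except ValueError:
        payload = msg.payload.decode(errors="replace")
    print(msg.topic, payload)

client = mqtt.Client()
client.on_connect = on_connect
client.on_message = on_message
client.connect(BROKER, 1883, keepalive=60)
client.loop_forever()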

Nvidia recently published a solution brief and edge computing showcase. For Nvidia, the edge ‘extends compute capabilities from data centers out to the edge of networks, allowing organizations to act quickly on data where it’s captured’. Reducing the distance between where data is captured and where it is processed improves latency, bandwidth utilization and infrastructure costs. However, edge systems ‘lack the centrality that a data center presents’. Software updates can be hard to deploy, manage and scale across vast fleets of devices. Edge locations lack the physical security that data centers have, so an end-to-end security model that protects both the application intellectual property and the sensor data is ‘paramount for a successful deployment’. Enter the Nvidia EGX platform, a combination of high-performance GPU computing and high-speed, secure networking. EGX provides a suite of applications for edge AI including automation and quality control in manufacturing facilities, 5G multi-access edge computing, and freight tracking and route optimization for efficient logistics.


Transform yourself!

With declining demand for drillers, Eightfold AI analyzes the skills required to transition oil and gas workers to new energies.

A new study from Eightfold AI, ‘The Great Energy Transition: What’s Next for Talent in Oil and Gas’ investigates the skill set required for new energies and compares these with what current oil and gas employees have to offer. Eightfold used its ‘deep-learning powered’ Talent Intelligence Platform, a global talent dataset, to pinpoint opportunities for energy company employees.

Demand for 68% of today’s most common roles in oil and gas is either stable or declining, particularly for drilling engineers. On the other hand, demand for chemical and mechanical engineers is rising. Eightfold advocates ‘strategic upskilling’ whereby individuals with adjacent skills like hydraulics and preventative maintenance can pursue future roles such as wind turbine technician. There is a significantly expanded potential talent pool for new energy jobs such as electrical or mechanical engineering. According to the World Economic Forum, the clean energy transition could generate 10.3 million new jobs worldwide by 2030.

Eightfold AI president Kamal Ahluwalia observed, ‘Today, new energy roles do not necessarily exist. Limiting our talent searches to those with particular knowledge sets no longer makes sense. To build a future-ready workforce, recruitment should include individuals with skills from adjacent industries. By shifting our mindsets we can include more qualified people with the potential to upskill themselves.’ The report outlines how energy companies can develop talent strategies, calibrate future roles, and hire for potential to expand the available talent pool. More from Eightfold.



© 1996-2024 The Data Room SARL All rights reserved. Web user only - no LAN/WAN Intranet use allowed. Contact.

