Speaking at the (virtual) Digital Transformation in Deepwater Production Conference organized by the International Society of Automation (ISA), Shell’s Johan Krebbers asked, ‘What is the Role of Standards and Technology in Getting Data Ready for Analytics?’ Proper modeling requires accurate data in an open and adaptable format. Good models start with good data.
Hitherto, OSDU, as it says on the can, covers a data environment for the upstream. But Krebbers’ presence at an automation conference reflects considerable future scope creep for the protocol (as indeed did Shell’s putative OSDU-based IT for its hydrogen unit – see our last issue).
Developments in the field of sensors, internet of things, AI and digital twins are the next targets for OSDU-style data support. In common with others from the IT side of the IT/OT divide, Krebbers recommends constraining process control technology to where it is appropriate, leaving all the smart stuff (predictive/analytics) to the IT side. Even inside the process domain, the sensor market is developing fast with lower cost, ATEX compliant devices coming from startups. Krebbers recommends ‘looking beyond’ the incumbent process control providers.
On connectivity, LoRaWAN-based IoT can replace locked-down vendor solutions and should become ‘part of the standard infrastructure of every asset’, enabling data from any location on an asset to be gathered into a cloud-based data platform. All data collected on an asset, whether by your own staff or by a third party, should go into your cloud. It is your data. Once all the time-series, event, video and corrosion data is in a single data store, artificial intelligence can be applied to predict future equipment failure, leverage machine vision to spot leaks and so on.
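As a sketch of the pattern Krebbers describes, the fragment below lands already-decoded sensor payloads in a single time-series store and runs a crude rolling-mean check over them. The tag names, payload fields and thresholds are illustrative assumptions of ours, not any vendor’s API.

```python
# Hypothetical sketch: land IoT sensor payloads in one time-series store,
# the precondition for the cross-asset analytics described above.
from collections import defaultdict
from statistics import mean

class TimeSeriesStore:
    """Minimal in-memory stand-in for a cloud time-series database."""
    def __init__(self):
        self._series = defaultdict(list)  # tag -> [(timestamp, value)]

    def ingest(self, payload):
        # One record per sensor reading, whoever collected it
        self._series[payload["tag"]].append((payload["ts"], payload["value"]))

    def anomalies(self, tag, window=5, factor=2.0):
        """Flag readings more than `factor` x the rolling mean: a crude
        stand-in for the predictive-failure models applied downstream."""
        values = [v for _, v in self._series[tag]]
        flagged = []
        for i in range(window, len(values)):
            baseline = mean(values[i - window:i])
            if values[i] > factor * baseline:
                flagged.append(self._series[tag][i])
        return flagged

store = TimeSeriesStore()
for ts, v in enumerate([1.0, 1.1, 0.9, 1.0, 1.05, 1.0, 5.0]):
    store.ingest({"tag": "PUMP-01/vibration", "ts": ts, "value": v})
print(store.anomalies("PUMP-01/vibration"))  # → [(6, 5.0)]
```

The point of the single store is that the same ingest path serves any later model, from threshold checks like this to machine vision.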
So where does OSDU fit into the bigger picture of IT spanning everything from the subsurface to process? Enter the Open Energy Data Platform, a huge expansion of OSDU concepts and methods in support of all of the above, plus digital twins of oil production facilities, windfarms, solar and more. The endgame is to connect all data sources (operational, engineering, finance … ) into the integration and visualization layer of the digital twin.
When is this likely to happen? The OSDU timeline foresees completion of R3, the first ‘commercial release’, in Q1 2021. R3+, planned for mid 2021, adds drilling and production to the mix. By late 2021, early 2022, a hypothetical R3++ release will extend OSDU into the new energies domain and add connectivity to an ‘engineering data platform’ housing engineering/construction data, piping and instrumentation data and more. Krebbers concluded his presentation by evangelizing the approach: ‘Everything is possible. You are only limited by your imagination. Think end-to-end workflows. Think database for everything. Just start!’
More from the conference home page.
Comment: As OSDU expands into operations and process, it may start to tread on the toes of the folks in the neighboring Open Group initiative, OPA-S. Maybe Pedro Vieira (Petrobras) was onto something with his TOG-in-TOG suggestion in our last issue.
Chinese geographic information system developer SuperMap held its 2020 Global Technology Conference recently, a multi-lingual virtual event with minisites and presentations in Chinese, English, French, Japanese, Russian and Spanish. The conference theme was ‘Geo-intelligence, connecting the future’ with presentations on GIS integration, blockchain, big data, AI and more.
SuperMap head of R&D Li Meng described new features in SuperMap GIS 10i with, notably, a new Hyperledger Fabric-based ‘geo-blockchain’ solution which is claimed to secure distributed geospatial analysis and processing. Other novelties include augmented reality GIS based on spatialized video, cross-industry 3D GIS, and ‘smarter and more powerful’ AI GIS.
SuperMap’s distributed GIS operates from the edge to the cloud. The cloud versions leverage Docker containers and a microservices architecture. A poster child deployment is the geological ‘big data’ system developed for China’s Yuhan Province, a 400 terabyte data set spanning a variety of storage including HBase, HDFS, YARN, Hive, Spark, MapReduce and more. The distributed architecture supports ‘GeoAI’ machine learning and 3D point cloud data.
The geo-blockchain is said to provide ‘GIS data credibility’, albeit with a processing overhead and larger data footprint. The appropriate strategy is to upload only ‘key data’ to the blockchain. A ‘geospatial smart contract’ mechanism controls the process.
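One common way to implement the ‘key data only’ strategy is to keep the bulky geospatial data off-chain and anchor only a content hash on the ledger. The sketch below illustrates the idea; the plain list standing in for Hyperledger Fabric, and all the names, are our assumptions, not SuperMap’s implementation.

```python
# Illustrative 'key data' anchoring: only a digest goes on-chain, keeping
# the data footprint small while preserving verifiability.
import hashlib
import json

ledger = []  # stand-in for the blockchain

def anchor(dataset_id, geodata):
    """Record only a content hash of the off-chain data on the ledger."""
    digest = hashlib.sha256(json.dumps(geodata, sort_keys=True).encode()).hexdigest()
    ledger.append({"id": dataset_id, "sha256": digest})
    return digest

def verify(dataset_id, geodata):
    """Anyone can later re-hash the off-chain data and compare."""
    digest = hashlib.sha256(json.dumps(geodata, sort_keys=True).encode()).hexdigest()
    return any(e["id"] == dataset_id and e["sha256"] == digest for e in ledger)

parcel = {"type": "Feature", "geometry": {"type": "Point", "coordinates": [102.7, 25.0]}}
anchor("parcel-001", parcel)
assert verify("parcel-001", parcel)          # untampered data checks out
parcel["geometry"]["coordinates"] = [0, 0]
assert not verify("parcel-001", parcel)      # tampering is detected
```

A ‘geospatial smart contract’ would then wrap the anchor/verify logic in chaincode rather than a Python function.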
SuperMap appears to be behind China’s surveillance society and offers geo-fenced analysis of video camera feeds to perform, for instance, vehicle number plate recognition and spot traffic violations. The 3D aspect appears to offer a fully immersive gaming-like functionality integrated with the GIS. The toolset makes considerable use of open source technologies and standards (CityGML, Postgres, HBase, Elasticsearch) and the whole gamut of Keras, PyTorch and TensorFlow deep learning packages for image processing.
Natalie Kang from Korean GIS specialist SPH presented PLAS, its pipeline analysis solution. PLAS embeds DNV GL’s ‘killer app’, Synergi Solver, within SuperMap’s iObjects GIS. Pipe flow and pressure data can be visualized and modeled in the GIS environment. PLAS also interfaces with customer billing systems. The system has been deployed by Korea’s JB Corp natural gas utility.
Long time readers of Oil IT Journal may recall my 2004 editorial ‘A million miles of spaghetti eaten every day’ where I reported from a plenary meeting of the W3C Semantic Web Interest Group. I will spare you a complete reprise of the editorial; suffice it to say that I asked the assembled semantic web folks the following question…
‘[computer] data typing is important but what about units of measure? IT looks after its own with ‘strong typing’ of data pigeonholes, and then throws caution to the wind when the value of say, ‘3.75’ is stored in a precise floating point representation without recording if the length is feet, miles or meters. This causes spacecraft to crash, bridges to fall down and wells to be drilled in the wrong place’.
At the time I did not get a satisfactory answer. I had the impression that the UoM issue had dropped down a crack between the W3C and ISO. But the issue did not appear to trouble the community unduly.
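The bare ‘3.75’ problem can be made concrete with a toy strongly-typed length, a few lines of Python (an illustration of mine, not any standard’s API) in which a value cannot exist without a unit and addition converts explicitly.

```python
# Toy 'strong typing' of measurements rather than storage formats:
# a Length cannot be constructed without a unit, and mixed-unit
# arithmetic converts explicitly. Conversion factors are exact for
# ft and mi, illustrative in scope.
from dataclasses import dataclass

TO_METERS = {"m": 1.0, "ft": 0.3048, "mi": 1609.344}  # length units only

@dataclass(frozen=True)
class Length:
    value: float
    unit: str

    def to(self, unit):
        meters = self.value * TO_METERS[self.unit]
        return Length(meters / TO_METERS[unit], unit)

    def __add__(self, other):
        # Addition converts explicitly -- no silent feet-plus-meters bugs
        return Length(self.value + other.to(self.unit).value, self.unit)

depth = Length(3.75, "ft") + Length(3.75, "m")
print(round(depth.value, 3), depth.unit)  # → 16.053 ft
```

Had the Mars Climate Orbiter’s ground software carried its units this way, the pound-force/newton mix-up would have been a type error, not a crash.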
I was reminded of all this when I received an announcement from the Industrial Internet Consortium for V2.3 of the Industrial Internet Vocabulary Technical Report, ‘one of six IIC technical reports that serve as de-facto industry guidelines on vocabulary, architectures, security, analytics, connectivity, and business strategy’. The report is said to specify common definitions for communication within the industrial internet ecosystem and is ‘the foundation for the collective body of work from the IIC’.
OK I thought, if you want to be able to talk to a connected device on another network, you had better be sure that you know what units the device is using. You may be interested in a whole slug of other metadata from the device but let’s keep it simple for the moment.
I downloaded the Report and scanned for units of measure. Zilch! So, I quizzed (trolled?) the IIC with the following: ‘I am curious that there is no reference in the vocabulary to “units of measure” or “engineering units”. This begs the question of how a remote IoT device can be interrogated and return a value with unambiguous units. Perhaps this issue is handled elsewhere in the IIC specs?’
First reply was a bit of a brush-off, viz. ‘Units of measure are part of an implementation and not part of an architecture’. When I hear ‘architecture’ I think of arm waving and PowerPoints, the most egregious example being the long-forgotten Mura, Microsoft’s upstream reference architecture. I prefer code to architecture. So, I got back to the IIC asking for pointers as to how units are ‘implemented’. We have since had some good feedback from IIC members, and the exchange is ongoing. For now, I can report that there is clearly a recognition that UoM issues are important, but that they are relegated to lower-level ‘implementation’ issues. Having said that, our interlocutors at the IIC seem to think that it would be a good idea to look at how UoM implementation could be standardized.
So where might the IIC look for inspiration on the Units issue? A starting point might be the IIC’s own website and a paper on Usage of Standards in the Smart Factory Web Testbed, An Industrial Internet Consortium White Paper. Here we learn that the IIC Smart Factory (or its Industrie 4.0 cousin) is built on three ‘principal pivotal standards’ viz. OPC Unified Architecture (OPC UA), AutomationML and the Open Geospatial Consortium’s (OGC) SensorThings API. It would appear that Industrie 4.0 relies on the OGC for its standard units ontology*. So we can now trace the units of the IIC to the OGC’s SensorThings, which seems to make sense.
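The SensorThings API does address my question directly: every Datastream carries an explicit unitOfMeasurement object (name, symbol and a definition URI), so an observation’s units travel with it in machine-readable form. A minimal sketch of reading one back follows; the payload values and the UCUM definition URL are illustrative, not copied from a live service.

```python
# A SensorThings-style Datastream: the unitOfMeasurement entity means a
# remote device's readings are never unit-less. Payload is illustrative.
import json

datastream = json.loads("""
{
  "name": "Wellhead pressure",
  "unitOfMeasurement": {
    "name": "pound-force per square inch",
    "symbol": "psi",
    "definition": "http://unitsofmeasure.org/ucum.html"
  },
  "Observations": [{"phenomenonTime": "2020-10-01T12:00:00Z", "result": 1450.0}]
}
""")

uom = datastream["unitOfMeasurement"]
for obs in datastream["Observations"]:
    # The value is always paired with its unit symbol at read time
    print(f"{obs['result']} {uom['symbol']}")
```

This is exactly the ‘mechanism for broadcasting sensor metadata in a machine-readable way’ that the IoT use case needs.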
But where did the OGC units work come from? The original 2002 OGC OpenGIS Recommendation Paper, Units of Measure Use and Definition Recommendations was authored by one John Bobbit from Energistics’ predecessor POSC. This built on prior art including POSC’s Epicentre data model and the API/Schlumberger’s RP66/DLIS spec. So now we have a traceback from the IIC to Energistics/POSC which is cool. Looking forward in the direction of OSDU it appears that OSDU embeds the Energistics UoM work which is even cooler.
Of course, it is not enough to have a UoM standard someplace in your portfolio. What is important is that it is actually used. For the IoT this means a mechanism for broadcasting sensor metadata from a local part of the network in a machine-readable way. For OSDU, given that the data layer is Schlumberger’s OpenDES, it depends on whether this exposes a UoM service. For both IoT and OSDU, it also supposes that developers use the service in ‘deployment’.
* Although it could have relied on OPC UA which I understand has its own UOM spec although this would not appear to be machine readable.
A new book*, authored by Chinese researcher Zhenpei Li**, titled ‘Pipeline Spatial Data Modeling and Pipeline WebGIS’ (PWG), provides in-depth coverage of the development of a pipeline data model that extends and adapts Esri’s technology to the non-US (Chinese) market. The 164-page publication is translated from the Chinese in a readable fashion and is Volume 1 in a series titled ‘Digital Oil and Gas Pipeline: Research and Practice’.
PWG enumerates a few GIS environments as Esri, MapInfo, Intergraph MGE, and China’s own SuperMap (see our report from the SuperMap UG elsewhere in this issue). There is no mention of open source GIS systems. In fact, PWG is pretty well 100% Esri. Li has been using the technology for a couple of decades. PWG sets out to establish a Pipeline Spatial Data Model (PSDM) and its implementation in a) ArcGIS/ArcObjects and b) using ‘WebGIS’ technology.
The introduction is interesting from a historical viewpoint. Li places pipeline GIS in the context of the ‘Digital Earth’, an idea that was floated by Al Gore back in 1998. Li seems to have drunk more deeply of the Digital Earth Kool-Aid than his US counterparts, describing the US National Spatial Data Infrastructure as the ‘commanding height’ of the present technology. The Digital Earth trope is further expanded to cover digital cities, digital universities, digital enterprises, digital communities and … digital oil fields. Al Gore has a lot to answer for!
Li’s research was devoted to producing a spin-out of the digital earth, the digital pipeline, a ‘virtual, interactive expression of the pipeline, which collects natural and social information concerning the pipeline’. The digital pipeline (DP) collects a large amount of multiresolution, multi-scale, and 3D geospatial information. Core technologies span remote sensing, GPS, GIS, SCADA, communications and reality technology. DP is broken down into three systems: survey and design, construction and operations. A pipeline information database comprises the core of the system, storing, managing and sharing data across the pipeline’s life cycle.
Early examples of DP systems include the US Pipeline Safety Administration’s National Pipeline Mapping System (NPMS), Italian SNAM and Gasunie of the Netherlands. A significant system developed by General Electric and Accenture, dubbed the ‘Intelligent pipeline solution’ was deployed by Columbia Pipeline in 2015. Such systems inspired various Chinese operators to deploy DP systems extended to include construction data, steel pipe delivery, weld progress and more. China’s Huawei has rolled out a digital pipeline information and communications technology solution, adding a communications layer adapted to the long distances involved.
PWG’s payload comes in the form of three sections: the proposed data model, its implementation in ‘component GIS’ (with ArcObjects and ArcGIS Server) and as a web-based system. Li’s Pipeline Spatial Data Model (PSDM) is based on APDM, the ArcGIS Pipeline Data Model. PSDM also considers the design concepts of PODS and ISAT as ‘important references’. PSDM extends and improves APDM with support for real-time data including scada, modular pipeline elements, and coverage of fire protection, repair and maintenance. PSDM is an Esri geodatabase that has been adapted for Chinese domestic best practices. PSDM is delivered as a Unified Modeling Language (UML) file that can generate a geodatabase using the ArcGIS Case Tools extension.
When it comes to web GIS, Li has some interesting comments on the state of the art. GIS interoperability is plagued by different data formats and software systems. There is a need to ‘unify the GIS data format and software interfaces’. Here the Open Geospatial Consortium (OGC) is working on standards such as the Geography Markup Language (GML). However, according to Li, ‘GML has limited support for large spatial data sets***’. Moreover, ‘current OGC standards are not practical enough, and manufacturers will adopt their own GIS data formats and software systems for a long time’. Li’s fallback position is to plough on with Esri technology. PWG’s web services are implemented with ArcObjects. This does mean that the interoperability issue remains unsolved and that PWG is now competing with the very comprehensive information on ArcObjects development that Esri itself provides.
The summary reads rather like a project final report: the author developed the PSDM for China’s Yong-Hu-Ning long-distance pipeline. Localizing APDM meant adding modules for landslides, unfavorable geology, debris flow and ecological protection. The as yet unpublished Volume 2 in the series will cover pipeline real-time data integration and virtual reality systems.
On the downside, while there are plenty of code snippets in the book, the UML for the PSDM does not appear to be publicly available. One wonders why a Chinese localization of an Esri APDM data model merits an English translation. The inevitable delay between development, publication (in Chinese), translation and publication by Springer means that much here is out of date relative to Esri’s constantly evolving technology. It is also somewhat out of date in terms of cloud deployment and terminology. On which topic, Al Gore’s digital earth spiel from 1998 could be cut and pasted into any of today’s marketing material for the ‘digital twin’. Whether 20 years of IT evolution has really brought us any closer to the ideal is moot.
* Pipeline Spatial Data Modeling and Pipeline WebGIS, Springer, 2020.
** Department of Surveying and Mapping Engineering, Southwest Petroleum University, Chengdu, Sichuan, China.
*** It is questionable that GML is thus limited. Maybe Li’s analysis is a bit out of date.
How do you sell digital into a business that has been digital for decades, was in the forefront of big data, and has been doing analytics and digital twins forever in the form of simulators of just about every facet of its business? The answer is by changing the terminology and moving the goalposts. This is pretty much what the authors of a recent ‘UKCS data and digital maturity survey’ (UKDDMS) have done. The terminological shift is that today, ‘organizations need to have digitalization in mind, not digitization’. What exactly does the extra ‘al’ entail? Offshore teams using ‘tablets’, onshore folks using ‘digital twins’ and, most important, ‘a significant shift in mindset’. The goalposts have been moved by a brand new ‘digital operations transformation model’ from Deloitte, a roadmap to digital nirvana that ‘helps to understand the maturity of the industry compared to others’.
So first, what exactly is ‘digital’? UKDDMS defines it thus: ‘For the purposes of this report, [digital] represents the use of data and technology to gain additional insight, support better decision making, reduce risk, improve efficiency, and ultimately drive improvement in business performance’. You can’t disagree with that. You might observe that this is a rather unsurprising categorization of the use to which computing (let alone digitizing, digitalizing or what have you) has been put in the oil and gas business since (at least) Exxon’s Messrs. Douglas, Peaceman and Rachford rolled out their first ‘digital twin’ back in 1950!
UKDDMS determined that around 60% of surveyed companies have digital transformation programs, the ‘overwhelming majority’ of which are less than three years old. Moreover, ‘digital initiatives only provide their full value when rolled out at scale across the organization. Most initiatives are not at this point’. It sounds like the UK’s oil and gas sector has not so much failed the test as never even sat the exam!
The 37-page report available from OGUK* actually has some quite encouraging findings that run counter to the overall dissing. Attitudes towards data are reported as ‘consistent’. They are in fact consistently pretty good with over 2/3rds of respondents reporting satisfaction with their data set up. One curious graph appears to show a positive correlation between ‘digital investment to date’ and ‘amount of manual data re-formatting required’ that suggests that the digital transformation is working in the wrong direction. The old issue of data formats remains challenging. Converting well data ‘from LIS to DLIS’ is cited as problematic, again suggesting a digital retrenchment rather than a transformation.
UKDDMS steps away from its ‘survey’ remit with some bold, if unsubstantiated claims, notably that ‘there is at least $15/bbl of opportunity that could be addressed through digital transformation of onshore processes’. Of course, no one is saying exactly how ‘opportunity’ is valued.
Respondents were asked to rate the value of different technology investments, with some interesting results. Maintenance & Ops Readiness, Reservoir Engineering and Info Governance & Management were rated as ‘best value’. Applications & Systems Mgmt., Employee Performance Mgmt. and 3D Modelling were worst. Performance management least value? Why are we thinking of turkeys not voting for Christmas?
Technologies that are widely used or have widespread ongoing rollout include ‘reporting dashboards’, the cloud and mobile. One senses that SharePoint is doing well. Another finding is that ‘innovation’ i.e. exploring digital initiatives is, in the main, performed in house or ‘in collaboration’. Little digital technology is bought off-the-shelf, a far cry from the days of ‘buy not build’. This is interesting when you look back through the report to the ranking of different industries in terms of their digital maturity. Do hospitality, the media, retail and real estate (all of which are cited as being far more ‘mature’ than oil and gas) do their digital innovation themselves?
A related curiosity is that software vendors are reported as having much better ‘end to end’ innovation processes than operators. Collaboration is cited as being a good thing with OSDU and CFIHOS cited as having ‘widespread traction’ which may be a bit of an exaggeration. Barriers to transformation are ‘legacy structures and processes’, lack of investment and lack of leadership.
UKDDMS concludes somewhat paradoxically that ‘although the value of digital is widely recognized, and the digital journey is well underway for most, 73% of practitioners are yet to see a positive impact’.
Comment: We conclude that the main problem with the digital transformation is that the very concept is dissociated from the business. Digital transformation comes as a suite of hammers looking for suitable nails to hit. While reviewing UKDDMS we received an email from Rigzone announcing Rystad’s grim UKCS production forecast, where production ‘will never again exceed two million barrels of oil equivalent per day on the UKCS’. The big digital question is how many IT dollars are likely to be spent by a sector on its knees.
* Oil and Gas UK.
French software house Kappa has launched Kappa-Automate (K-A), a containerized version of its well test suite for the modern IT world of microservices-based workflow automation. K-A adopts a similar software development pattern to OSDU, the Open Subsurface Data Universe, but Kappa’s development preceded OSDU by a couple of years and the toolset is now deployment-ready.
As Kappa CEO Olivier Houzé told Oil IT Journal, ‘The Automation project started with a KAPPA consortium in 2016, after requests from our major clients to move beyond permanent downhole gauge preprocessing and integrate some of our software functionality with automated workflows. OSDU began in 2018 and deals with a much bigger picture, but the objectives were similar. It was natural for us to join OSDU when it started gaining credibility’.
‘Automate is exactly what you describe: a self-contained automation architecture which deals with the workflows related to our software portfolio. It can have its own life and be installed and operated on its own. But we expect that it will be more like a satellite of a larger OSDU compliant structure. In the latter case, the Automate structure will be transparent and it will be as if third parties were directly using our microservices. Under the hood, it will be a bit more complicated!’
‘The OSDU R1 and R2 releases were not directly related to our segment but things are changing with R3 and the integration of Schlumberger’s Delfi platform. R3 will cover the data that interests us, so we anticipate that we are going to integrate the OSDU process during the development of R3 and that Automate should be OSDU R3 compliant when released in the not-too-distant future’.
In an online video presentation, Houzé traced the history of well testing in the context of an industry prone to periodic downturns. Automating analyses began as early as 2000 when the deployment of permanent measurement led to information overload. Kappa’s software evolved to help engineers by identifying events of interest and sending data on for analysis. The 2014/15 downturn and the ongoing big crew change has led to a ‘perfect storm’ of human and technology disruption, compounded by a demographic gap that resulted from the no hiring policies of earlier crises. Engineering expertise has gone from many operating companies.
On the positive side, technology, especially artificial intelligence, holds promise, even if there is some controversy around AI’s potential and limits. AI that identifies a squirrel as a sea lion (with ‘99% confidence’) is not going to replace the engineer.
The technology under the hood (as in OSDU) represents a shift from the monolithic apps of the past to ‘small pieces of software’, a.k.a. ‘microservices’. These can be installed anywhere, on on-site servers or in the cloud. K-A provides microservices for pressure and rate transient analyses and more, and is said to present a great opportunity for interfacing with other vendors’ solutions via the upcoming OSDU R3 release. The first commercial release of K-A is scheduled for Q4 2021.
On the application front, Kappa is cautious about the move to the cloud/web/HTML5 paradigm. It took some five years to see the benefits from the previous migration from MFC to .NET. It remains to be seen if HTML5 software can be as rich and user friendly as Windows. The first Kappa tool to benefit from the Automate interface is Carbone (PVT analysis), part of the ‘next generation’ G6 release, now deployable from the cloud.
In the video, Houzé commented that Saphir (Kappa’s well test flagship) is ‘a bit big for microservices’ which echoed earlier considerations that we have encountered for instance in our exchange with EnergySys’ Peter Black. We pressed Kappa on what exactly constitutes microservices and asked for some examples of how they might be deployed. Kappa’s Olivier Allain came back with the following.
One example is the ‘incremental PTA*’ workflow. Traditionally, an engineer takes high frequency raw downhole pressure measurement and production data and executes a build-up analysis and outputs a PTA analysis file. An automated workflow replaces this manual analysis as follows. A filtering service preprocesses the raw pressure data which is fed into a processing service to identify shut-in periods. For each shut-in, a PTA service is called to load data and diagnostic plots and rerun and improve a pre-selected model. The results are fed into time lapse and/or map displays or perhaps on into another workflow.
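The workflow Allain describes can be sketched as three chained services: filter the raw gauge data, detect shut-ins, then call a PTA step per shut-in. Everything below is an illustration of ours: the thresholds and the stubbed PTA service are assumptions, since Kappa’s actual service interfaces are not public.

```python
# Sketch of the 'incremental PTA' chain: filter -> shut-in detection -> PTA.
def filter_pressure(raw, max_jump=50.0):
    """Despike: drop samples that jump implausibly from their neighbor."""
    clean = [raw[0]]
    for p in raw[1:]:
        if abs(p - clean[-1]) <= max_jump:
            clean.append(p)
    return clean

def detect_shutins(pressure, rise=5.0):
    """Flag the start of each sustained pressure build-up (a shut-in signature)."""
    starts = []
    for i in range(1, len(pressure)):
        building = pressure[i] - pressure[i - 1] >= rise
        was_building = i > 1 and pressure[i - 1] - pressure[i - 2] >= rise
        if building and not was_building:
            starts.append(i)
    return starts

def run_pta(pressure, shutin_index):
    """Stub PTA service: in reality this would rerun a pre-selected model."""
    return {"shutin_at": shutin_index, "p_buildup": pressure[shutin_index]}

raw = [3000.0, 2998.0, 2997.0, 9999.0, 2996.0, 3010.0, 3025.0, 3031.0]
clean = filter_pressure(raw)          # the 9999.0 spike is dropped
results = [run_pta(clean, i) for i in detect_shutins(clean)]
print(results)  # → [{'shutin_at': 4, 'p_buildup': 3010.0}]
```

Each function here is a candidate (micro)service boundary; the open question, as below, is whether such services would ever be called outside this one pipeline.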
This may just be a matter of semantics, but are Allain’s above ‘services’ microservices? Or more to the point, will they ever be used outside of Kappa’s automated workflows? There must be a lot of folks in the OSDU community struggling with similar questions. Kappa expects answers to these questions in a 2020-2021 K-A field test, where several operators will be trialling K-A in their own workflows. The Kappa Automation Consortium (KAC) currently has 13 members including BP, Chevron, ENI, Equinor, ExxonMobil, Marathon, Shell, Aramco and Total.
* Pressure transient analysis.
Beicip-Franlab has released InterWell 2020. The major release is compatible with Windows 10 and RedHat 7 and adds new functionality for depth domain well data management and seismic characterization. A new seismic fracture characterization workflow leverages Beicip-Franlab consulting’s experience. The new release offers advanced matrix characterization for lithology prediction and multi-variate analysis of 3D properties. Geostatistical inversion provides impedance estimates along with uncertainties in key reservoir properties. Beicip has also released Open Flow Suite 2020, now available in the cloud. On the desktop, the 2020 front end is the last version to run on Linux. As of OpenFlow Suite 2021, the GUI becomes Windows only, but all the simulators will be available for both Windows and Linux systems for cluster deployment.
Bluware’s InteractivAI adds deep learning to seismic interpretation. InteractivAI automates data management and data science and allows interpreters to work on seismic data directly, without cropping, decimating, or rasterizing seismic traces.
CGG GeoSoftware has launched WellPath, a new interactive 3D well path planning solution for unconventional and fractured reservoirs and offshore developments. WellPath is part of CGG’s InsightEarth 3D visualization and interpretation suite.
Emerson/Paradigm has rolled out Tempest 2020.3 with improved accuracy for well-grid flow modeling, a new patented approach to well-grid connectivity and new ‘big loop’ ensemble analysis. Paradigm has also updated its Mette production modeling package. Mette 2.2 facilitates multiphase flow calculation and modeling wells, flow lines and gathering networks.
Rock Flow Dynamics has released tNavigator V20.2 with improved use of available GPU resources in simulation. Eight new seismic attributes are available in Geology Designer and new reservoir coupling functionality has been added to Model Designer. The new release also includes Fracture Simulator, a new module for hydraulic fracture modeling. More from Rock Flow.
Stratagraph has launched RockProp, a reservoir analysis package developed by the Cuttings Alliance - a consortium of Stratagraph, CoreSpec Alliance and PetroScale Reservoir Solutions.
Aspen Technology has announced the AIoT Hub, that enables OT/IT convergence. The Hub builds on Aspen’s IP.21 real time database, adding recently acquired technology including cloud-native connectivity from RtTech Software, the AI-driven IoT applications from Mnubo and the enterprise insights and visualization features from Sabisu.
Barco’s new RGB laser rear-projection video walls target mission-critical control rooms. The next-generation technology promises a ‘smaller ecological footprint’ and is available for both new installations and for retrofits. More from Barco.
Bilfinger’s PIDGraph uses machine intelligence to ‘understand’ blocks of plant equipment in a P&ID diagram along with the relationships between interconnecting pipes and wires. PIDGraph outputs XML files in the ISO 15926 standard for data integration and hand-over. The open format allows unrecognized items to be rectified and the software learns from such user corrections. Bilfinger Digital Next (BDN) adds applications for maintenance and engineering.
Brüel & Kjær Vibro’s new VibroPort 8000 portable vibration analyzer for rotating and reciprocating equipment provides steady state and transient condition monitoring. B&K Vibro’s SetPoint condition monitoring software provides multiple analysis of shaft or bearing performance. Patented intelligent waveform capture identifies critical patterns. The system can be operated in standalone mode, linked to a laptop, or to an OSIsoft PI historian.
CEA Systems has added a new pipeline inspections module to its Plant4D flagship. The new module generates isometrics and inspection sheets automatically. Inspection reports are saved in Plant4D, linked to the corresponding pipeline and assets.
A new major version of the Modelica Standard Library, MSL version 4.0.0, is now available. Read the release notes on GitHub. The Modelica FMI group recently released Functional Mockup Interface version 3.0.
Neuron Soundware has released the nShield basic package, an entry-level edition of its equipment monitoring hardware and software combo. The package allows legacy equipment to be monitored with quick-to-install sound sensors. Neuron’s oil, gas and petrochemical industry solution provides remote machine health monitoring. More from Neuron.
Pegasus Vertex’ TMPRO provides tubing engineering analysis to mitigate tubing and packer failures. TMPRO works for vertical or directional wells and helps design optimal packer settings. Tensions, collapse, burst, and von Mises failure analysis are built atop a tubular and packer database.
Thermon has introduced the Genesis Network, a cloud-based solution for managing and optimizing heat trace systems. Users can monitor and troubleshoot large heat trace systems of over 10,000 circuits. Wireless comms connect heat trace controllers to the control room, and alarms, history and operational data are available from any network-connected device.
The 9.7 release of Assai Software’s DCMS engineering document management system adds usability enhancements, improved planning and support for CAD file generation. Assai is also working on new ‘handover to project’ functionality to send selected document revisions directly to another Assai project.
Cegal has announced new ‘as a service’ offerings for application management operation. The offering builds on Cegal’s expertise in E&P application portfolio process management. More from Cegal.
Esri has released ArcGIS Analytics for the oil and gas internet of things, a demonstrator of Esri’s digital oil field offerings that monitor real-time SCADA data feeds along with live weather data. ArcGIS Analytics for IoT is a hosted solution that enables users to ‘ingest, visualize, and analyze spatial real-time and big data to gain new insights and take action to protect employees and critical assets’. More from Esri.
Ideagen has announced a cloud edition of its Q-Pulse governance, risk and compliance software. Q-Pulse is a quality, health and safety and environmental (QHSE) solution for, inter alia, the oil and gas vertical. More from Ideagen.
OGRE Systems has released a new version of its R3 Economics and Volumetrics package with updates to the scenarios, curve fitting and more. A new UI has been ‘designed for engineers by engineers’. More from OGRE Systems.
Safe Software has announced availability of a ‘distributed Windows deployment’ of FME Server on the Azure Marketplace. The solution comprises a virtual machine, file storage, load balancers and databases. More here.
Speaking at the Reuters online Petrochemical Development USA 2020 event, Shell’s Claudia Zuluaga, principal digital product manager, outlined the impact of the fourth industrial revolution on how companies operate. The oil and gas industry is undergoing a double transition, both digital and energetic. Navigating both in the context of corona is particularly problematic. Corona is now the single biggest catalyst for digital transformation, which is ‘no longer an option’. Now more than ever, companies need to adopt digital even faster.
Zuluaga offered three critical success factors for the digital transformation. Number one is ‘connections’ with customers/teams/groups. We are all creatures of habit with our own ways of doing things. People need to be convinced that they will benefit from the transformation. Shell uses a ‘customer-centric’ approach as opposed to the traditional product-focused method. This has the advantage of being harder for others to imitate and preserves Shell’s competitive advantage. Next up is digital literacy. The lifetime of digital skills is getting shorter and shorter. Today’s ‘university of life’ can teach as well as traditional schooling. Training engineers in digital literacy creates a level of understanding that eases the transition and reduces burnout. Finally, leadership by leaders who are prepared to put themselves in other people’s shoes.
The methodology leverages what we have seen elsewhere described as ‘agile’, with minimum viable products developed in months at a cost of thousands of dollars. These are rolled out until a ‘tipping point’ of take-up signals viability, at which point major resources (in the million-dollar range) are devoted to developing a sustainable product and embarking on a multi-year journey. The whole process can be iterated, ownership handed over to business leaders, and the new tool baked into the digital transformation.
In the ensuing Q&A, Zuluaga offered some more observations on the digital transformation within Shell. There are some obstacles. Connections can be hampered by resistance, by a lack of data infrastructure, and by poor support from leadership. Budgetary support is sometimes an obstacle. Hence the importance of educating people in digital literacy, with training for all, from operators to execs. Shell has also developed a rigorous process for project selection and deployment. All transformation initiatives require an economic model demonstrating expected monetary benefits. Without demonstrating a project’s potential value, ‘perhaps it would be better not to do it’. This has created a cost consciousness within Shell.
Zuluaga was asked how Shell handles customers who may not have a high level of digital literacy. Zuluaga has customers in finance, retail and across the supply chain. Some are eager to start, others less so. An extensive conversation is needed. The ‘What’s in it for me?’ question can be hard to answer. Developments can pivot into something very different from what was envisaged, and flexibility to pivot quickly is a good thing. The old product-centric approach may make this seem easy. The customer-centric technique may be harder, but it is worth it. Automation requires new skills and new ways of working in a knowledge-rich environment. Zuluaga did not envision any job losses resulting from AI and machine learning; ‘people are more enthusiastic to embark on the journey, across all levels in the company’. Execs and operators are trying to understand the basics of coding in Python. Operators and engineers need to be a central part of any digital development. Subject matter experts need to be at the center of a project, and they can learn the new ways of working on the job. But a new mindset is required for rapid development: a minimum viable product should be developed in three months. Things are not going to run at the same speed as in the past.
More from Reuters Petrochemical Development USA 2020.
In his keynote address to the 2020 Inductive Automation Ignition Community Conference (ICC), Don Pearson stated that there are Ignition installations and integrators in over 100 countries. IA founder and CEO Steve Hechtman reported on the company’s reaction to the covid crisis, which saw a transition to remote work ‘over a weekend’. Unexpectedly, staff proved ‘much more efficient’ working remotely, which is now the ‘new normal for many of our staff, even after the pandemic ends’. Ignition has just released a Maker Edition, a free personal-use version to enable people to do ‘fun home projects to learn and innovate in new ways’. One happy maker is Enuda of Sweden, which has created an automated heating system for its greenhouse. Using temperature sensors, MQTT and a homemade controller, it can see and control greenhouse temperature from a mobile phone.
Colby Clegg presented Perspective, a cross-platform environment/GUI that works across mobile, tablet, laptop and PC. Perspective uses web-native technology. But in the control room, where running the SCADA or HMI inside a commodity web browser is not best practice, a new Perspective Workstation application can run in ‘full-screen kiosk mode’, eliminating distractions from the underlying OS (no more playing solitaire?). Ignition is in the process of replacing its Symbol Factory, an industry-standard vector graphic library, with new, smarter ‘Perspective Symbols’ whose dynamic data model enables drag-and-drop deployment.
Ramnath Mani from Indian Automation Excellence set out to ‘break the myth’ of Industry 4.0 with Ignition. Industry 4.0 is envisioned as a constellation of process data sources and distributed applications in the cloud. But currently, I4 is split down the middle, between operations technology and IT. For true I4 implementation, devices (regardless of manufacturer) need to ‘seamlessly and instantly communicate’. This necessitates a common, understandable language and an IT-compatible protocol. MQTT is the protocol of choice to decouple devices from applications and enable I4 and the new IIoT. And naturellement, Ignition is the system of choice here as it ‘addresses the goals of the Open Process Automation (OPA-S) initiative for an open, standards-based, secure, and interoperable control system’. The idea is to avoid ‘ecosystems built around intellectual property stemming from one core partner’.
Andrew Scott (Pioneer Natural Resources) along with Amita Kulkarni and Binh Vu from integration partner CSE-Icon presented on Pioneer’s event logging system. The project centralizes event records for each control room operator with an emphasis on alarming, adding functionality to Pioneer’s current Ignition system to improve operator efficiency and pave the way for future field optimization efforts. The operations event management system was built with Ignition, displacing a third-party event logger. Ignition allowed for automatic population of much of the required event information and associated metadata from the scada system, leaving the operator to provide comments and additional information. Alarm management has been improved with alarm status updated when an alarm is cleared. Daily reporting is now largely automated.
Arlen Nipper (Cirrus Link Solutions) showed how Ignition and MQTT/Sparkplug can be used for auto-discovery of data models and to push time series data into the Amazon* Web Services cloud. The AWS IoT SiteWise service allows operations data to be collected via MQTT Sparkplug (or OPC-UA) into a cloud-based model of an asset, including time series data. SiteWise provides a standard API for applications to consume for AI and big data solutions. Cirrus Link’s Sparkplug SiteWise Bridge, currently in beta test, delivers data into SiteWise ‘with minimal configuration and zero coding’. The Ignition Perspective module adds HTML5 data visualization from the AWS store.
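Sparkplug’s contribution to auto-discovery is a fixed MQTT topic namespace plus ‘birth certificate’ messages that announce a device’s data model before data flows. A minimal sketch of that topic structure, using only the standard namespace rules (the group, node and device names below are invented for illustration):

```python
# Sketch of the Sparkplug B topic namespace that lets MQTT consumers
# discover assets automatically. Group/node/device names are invented.
SPARKPLUG_NS = "spBv1.0"
MESSAGE_TYPES = {"NBIRTH", "NDEATH", "DBIRTH", "DDEATH",
                 "NDATA", "DDATA", "NCMD", "DCMD"}

def sparkplug_topic(group_id, message_type, edge_node_id, device_id=None):
    """Build a Sparkplug B topic:
    spBv1.0/<group>/<message type>/<edge node>[/<device>]."""
    if message_type not in MESSAGE_TYPES:
        raise ValueError(f"unknown Sparkplug message type: {message_type}")
    parts = [SPARKPLUG_NS, group_id, message_type, edge_node_id]
    if device_id:
        parts.append(device_id)
    return "/".join(parts)

# A DBIRTH message publishes the device's full data model (auto-discovery);
# subsequent DDATA messages carry only changed metrics.
birth = sparkplug_topic("WestField", "DBIRTH", "Pad7-Gateway", "Wellhead-01")
data = sparkplug_topic("WestField", "DDATA", "Pad7-Gateway", "Wellhead-01")
```

The birth/data split is what decouples devices from applications: a consumer subscribing to the namespace learns each asset’s model from its birth certificate and needs no per-vendor configuration.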
* Amazon itself is an Enterprise client of Ignition.
Ryan Crownover (Vertech) believes that Docker and DevOps are set to ‘dominate’ software development and deployment. DevOps is described as a ‘methodology and a mindset designed to integrate the development and operation of a system into a cohesive lifecycle’. The approach is claimed to adapt to changing priorities, providing rapid turnaround times and coordinating the work of a team of developers. DevOps embeds a ‘continuous integration and delivery’ process that is said to be particularly suited to automation system development. At the heart of DevOps are GitLab and Docker. Git supports developer, source code and project management. Docker makes development independent of a target architecture. Software developed on Linux ‘runs just as well on Windows’.
A team from Corso Systems presented on blockchain-based operator logs and Ignition auditing. The concept is to add a Hyperledger Fabric blockchain to provide a chain of trust for all events tracked in Ignition. Public and/or private blockchains allow for different modes of determining and enforcing trust. Corso did not go into the granularity of the events tracked in the blockchain, which may be problematic for high-volume streaming data.
Finally, Moxa presented its Things Pro cellular/VPN connectivity solution for the IoT.
More on ICC 2020 from Ignition.
Following a reorganization, Chris Villegas is now CEO of Reset Energy, an oil country plant design and fabrication specialist. Villegas comes over from the recently acquired Heroes Energy Solutions.
Dan Kluk heads-up Stress Engineering Services’ new Digital Solutions Group.
Jean-Marc Sohier (Concawe) is now an A.SPIRE board member.
Yan Nikitin is to head up AqualisBraemar’s newly opened Moscow office.
Babcock & Wilcox has named Gary Cochran as MD European region. Wassim Moussaoui is the MD of the newly formed B&W Middle East Holdings, based in the new Dubai, UAE HQ.
Thai Pham is now BCCK’s senior process engineer. He hails from UOP-Ortloff.
Bechtel has named Justin Siberell as President for Europe and the Middle East region. Paul Marsden is now President of the company’s Oil, Gas & Chemicals business, succeeding Alasdair Cathcart who is stepping down after 31 years at Bechtel.
Jon Huntsman has re-joined Chevron’s Board of Directors. He will serve on the Management Compensation Committee and the Public Policy Committee.
Matthew DeNezza joined Crusoe Energy Systems as CFO. Prior to joining Crusoe, he was with Meritage Midstream.
Felipe Saldanha is the new GM of the CSA Ocean Sciences’ Brazilian affiliate, CSA Ciências Oceânicas Ltda.
Peter Bernard, Executive Chair; Ike Epley, Vice-Chair; Jorge Machnizh, President and CEO; Michael Stundner, EVP, Technology and Founder of myr:conn; Dale Sperrazza, CCO; Kenton Gray, CTO; David Freer, CFO; Braxton Huggins, CMO; Carol Piovesan, SVP; Tom Jordan, VP Corporate Development; and Lars Olrik, VP Sales make up the new executive team at Datagration.
eDrilling COO, Sven Inge Ødegaard, is to head up the newly established eDrilling Research unit.
Harsha Agadi has joined Flotek’s Board of Directors and will also serve as the chair of the Compensation Committee.
Carlyn Taylor is now an independent director at Flowserve and a member of the Audit Committee and the Corporate Governance & Nominating Committee of the board.
Fugro has nominated Sjoerd Vollebregt as a member of its supervisory board following Douglas Wall’s intention to retire at the end of the AGM in 2021.
Michael Sheen (SVP, CTO and executive director), William Moody, and Charles Still are to retire from the Geospace board. Tom Davis replaces Charles Still as lead independent director.
Halliburton Labs has appointed Reginald DesRoches (Rice University), John Grotzinger (Caltech), and Walter Isaacson (Tulane University) as its first advisory board members.
HARC has hired Lu Liu to guide the organization’s research on energy, air and water issues. Prior to joining HARC, Liu worked as a postdoctoral research associate at Rice University.
Hexagon has appointed Maria Luthström as Head of Sustainability and Investor Relations.
Iman Hill has joined the IOGP as Executive Director succeeding Gordon Ballard, who steps down at the end of 2020. Before joining IOGP, Hill was with Energean.
Schlumberger is to cut 21,000 jobs, roughly a fifth of its workforce.
Andrew Gould, Schlumberger’s former Chairman and CEO, has been appointed to the McDermott Board of Directors.
Kimberly Green is now EVP of Human Resources at Motiva. She succeeds retiree Ed Haloulos.
OGUK has appointed Alexandra Thomas (Neptune Energy), Emeka Emembolu (BP), and Mark Abbey (CHC Helicopter) to its board of directors.
Offshore Technical Compliance has opened a division in Mexico City.
Lance Abney is now the Operations Manager at the PRCI Technology Development Center’s new facility.
Quality Companies has named Iain Gault as its business development manager - Houston and International. He was previously with Stork Technical Services USA.
Pamela Pierce is now Scientific Drilling International CEO and President.
SeekOps has named Iain Cooper as its new CEO. He has previously led the technology development, strategy, and investment at Schlumberger.
The Society of Exploration Geophysicists has announced its 2020–2021 board election results. Anna Shaughnessy is President-elect, Bruce Shang is Second VP, Pete Cramer is Treasurer.
Sercel-GRC has appointed William Milne as VP, Major Accounts and Operations, and to the Sercel-GRC Strategic Leadership Team.
Cedrik Neike is to succeed retiree Klaus Helmrich as Siemens Managing Board member responsible for Digital Industries. Current COO Matthias Rebellius replaces Neike as board member responsible for smart infrastructure.
Mike White has joined Starwood Energy as SVP.
Christi Craddick is now Chairman at the Railroad Commission of Texas.
Chris McLaren has been named technical program manager at the newly opened Transportation Technology Center funded by PHMSA in Pueblo, CO. Dave Mauger is VP Operations.
Brian Cothran has joined Venture Global as COO.
Weatherford has appointed Girish Saligram as President, CEO, and member of its board. Scott Weatherholt is EVP, General Counsel, and CCO. Keith Jennings is EVP and CFO.
Abrado appoints Jason Broussard as President and CEO. He hails from US-based Wellbore Fishing & Rental Tools.
Anders Opedal is Equinor’s new president and CEO following Eldar Sætre’s retirement after six years as CEO and more than 40 years in the company.
Aspen Technology has appointed Amar Hanspal (Bright Machines) to its Board of Directors.
Dylan Webb is now Datamine’s CEO, taking over from Damian McKay who has been appointed CEO at Vela, Datamine’s parent company.
Russell Crockett (Aztlán Chemical) is now a member of SEI/CMU Board of Visitors, advisors to the CMU president, VP research and the SEI director.
Kelly Tomyn is to assume the role of Interim VP, Finance and CFO at Computer Modelling Group while Sandra Balic takes a parental leave of absence.
Eric Mullins (Lime Rock Resources) is now a member of ConocoPhillips board of directors.
DXC Technology has added David Barnes and Raul Fernandez to its board of directors.
Gay Huey Evans OBE has joined IHS Markit as an independent director to its board and will join the company’s audit committee.
David Sebag has joined IFPen’s Geosciences department to work on the energy transition and environmental soil characterization and verification project.
The undergraduate scholarship foundation of the Houston Geological Society has announced the creation of the Paul M. Basinski Memorial Scholarship, established by his wife, Rene Basinski, to honor his memory.
Reinhard Florey has been reappointed as CFO of OMV, extending his term of office for another three years.
Rockwell Automation has named Isaac Woods VP, treasurer, and board-elected officer of the company. He succeeds retiree Steve Etzel after more than 31 years with the company.
Erieta Dimitriou is now RPS Principal Energy Consultant at its Energy & Sustainability team.
François Poirier is to succeed retiree Russ Girling as President and CEO of TC Energy Corp and will join the board in 2021.
Former Anadarko EVP Mitch Ingram is now Non-executive Director of Tullow Oil.
FESAus society reports the death of Hugh Crocker, one of its founding members, and ‘a great and tireless contributor to the Petrophysical community worldwide for 60-70 years’.
Longtime Bureau of Economic Geology researcher Bill White has passed away.
PIDX International has announced ETDX, a new standard for emissions transparency data exchange. ETDX is to cover the exchange of data on carbon emissions and other energy-transition needs that must be harmonized across industry participants. The initiative aims for clarity on energy standards by region and regulatory body, as well as alignment between operators, suppliers and network providers. APIs are planned for technical integrations, for reuse and sustainability, and for potential savings in resources. PIDX is also researching other organizations working on carbon tracking standards in specific areas of the energy industry. The initiative is headed up by Chevron’s Franz Helin.
The Open Geospatial Consortium is considering the Zarr V2 storage specification for adoption as an official OGC Community Standard. Zarr is an open-source specification for the storage of multi-dimensional arrays of data a.k.a. tensors. Zarr stores metadata in JSON text files and array data as (optionally) compressed binary chunks. Zarr is particularly suited to cloud data storage. Zarr was originally developed for genomics research by Alistair Miles at Oxford University.
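The spec’s layout is simple enough to mimic with the standard library alone. A minimal, illustrative sketch of a Zarr-V2-style store, one JSON metadata file plus one zlib-compressed binary chunk, not a substitute for the real zarr library (which adds fill values, filters and multi-dimensional chunk grids):

```python
import json
import os
import tempfile
import zlib
from array import array

def write_store(store_dir, data, shape, chunks, dtype="<f8"):
    """Write a minimal Zarr-V2-style store: a '.zarray' JSON metadata
    file plus one compressed binary chunk (illustrative sketch only)."""
    os.makedirs(store_dir, exist_ok=True)
    meta = {
        "zarr_format": 2,
        "shape": shape,        # overall array shape
        "chunks": chunks,      # shape of each chunk
        "dtype": dtype,        # NumPy-style dtype string
        "compressor": {"id": "zlib", "level": 1},
        "order": "C",
        "fill_value": 0,
    }
    with open(os.path.join(store_dir, ".zarray"), "w") as f:
        json.dump(meta, f)
    raw = array("d", data).tobytes()
    with open(os.path.join(store_dir, "0"), "wb") as f:  # chunk key "0"
        f.write(zlib.compress(raw, 1))

def read_store(store_dir):
    """Read back the metadata and decompress the single chunk."""
    with open(os.path.join(store_dir, ".zarray")) as f:
        meta = json.load(f)
    with open(os.path.join(store_dir, "0"), "rb") as f:
        raw = zlib.decompress(f.read())
    return meta, list(array("d", raw))

store = os.path.join(tempfile.mkdtemp(), "demo.zarr")
write_store(store, [1.0, 2.0, 3.0, 4.0], shape=[4], chunks=[4])
meta, values = read_store(store)
```

Because each chunk is an independent object and the metadata is plain JSON, a reader can fetch only the chunks it needs, which is what makes the format so well suited to object stores in the cloud.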
The PPDM Association has published a short explainer covering reference lists. Reference lists, a.k.a. controlled vocabularies can (and should) be used in data creation. Where possible, industry standard lists should be ‘owned and managed by a professional society or standards organization who develops and maintains the list on behalf of its members and industry’. PPDM’s 14-page explainer elaborates, inter alia, on the need for a reference list for units of measure, curiously without mentioning the Energistics work in this area. As we reported in 2013 the Energistics UOM work group included representation from PPDM. More from PPDM.
The US Securities and Exchange Commission (SEC) has published its 2021 draft SEC Taxonomies for public review and comment. Vendors and other stakeholders are ‘strongly advised’ to review the organization of the taxonomy files, which have been updated from the 2020 taxonomies. Read more and access the drafts here. Transition guidance for the 2021 update will be posted on the standard taxonomies page.
The OPC Foundation and DeChema have produced a release candidate of the companion specification UA for DEXPI, V1.00 (OPC 30250). DEXPI is a general data exchange standard for the process industry, covering all phases of the lifecycle of a petrochemical plant, ranging from specification of functional requirements to assets in operation. Currently, the focus of the initiative is the exchange of piping and instrumentation diagrams (P&IDs). Members can download the release candidate spec.
The Industrial Internet Consortium has released V2.3 of its Vocabulary Technical Report including new or updated definitions for Internet of Things terminology. More from the IIC and in this issue’s editorial.
The Open Geospatial Consortium (OGC) is teaming with the Open Design Alliance (ODA) to ‘promote and strengthen’ the use of open standards for the location and geospatial industries. ODA builds software development kits for CAD and BIM standards including IFC, .dwg, and Autodesk Revit files.
An open letter co-signed by the Engineering and Construction Contracting Association (ECC), the Construction Industry Institute (CII) and the Construction Users Roundtable (CURT) heralds the establishment of an industry alliance to ‘capture benefits of increased content collaboration, better sharing of research and benchmarking learnings to improve application, diversifying industry sector participation and maximizing benefits from integrating conference production activities’. The announcement was made by 2020 ECC Chair Tony Bazzini (ExxonMobil). More in the Letter.
ExxonMobil has expanded its agreement with direct air CO2 capture specialist Global Thermostat following a year-long technology evaluation. Global Thermostat uses proprietary amine-based adsorbents to remove CO2 from the air which is then ‘stored underground, used to make chemicals, consumer products or construction materials’.
The US Department of Energy has awarded $1.5 million to Lafarge-Holcim’s CO2ment project in Florence, CO. CO2ment captures CO2 (a byproduct of cement making) from flue gases using Svante’s technology. What happens to the CO2 is not clear, although another company, Solidia Technologies, claims to turn cement into a carbon sink. Total and Oxy are also involved in CO2ment.
Houston Chron FuelFix editor Brian Rausch reports that NRG Energy’s Petra Nova CO2 sequestration project has shut down. Its economics ‘no longer make sense with oil prices hovering around $40 a barrel’.
Petrofac’s Engineering and Production Services business has been awarded an Engineering and Project Management Office support contract for the Acorn project. The works cover FEED for the Acorn CCS project and concept selection for Acorn Hydrogen. Both projects are planned for the St Fergus gas terminal near Peterhead, Scotland.
The International Energy Agency’s new ‘Special Report on Carbon Capture Utilisation and Storage: CCUS in clean energy transitions’ is a 174-page analysis of the state of the art of CO2 capture and sequestration. CCUS is said to be ‘the only group of technologies that contributes both to reducing emissions in key sectors directly and to removing CO2 to balance emissions that cannot be avoided’. Although there are some 20 commercial CCUS operations worldwide, this is ‘nowhere near the amount required to put global emissions on a sustainable path’. Pre-covid there were plans for 30 more commercial facilities, but their fate is moot today. In all events, ‘reaching net zero will be virtually impossible without CCUS’. IEA Executive Director Fatih Birol stated, ‘There is a stark disconnect today between the climate goals that governments and companies have set for themselves and the current state of affordable and reliable energy technologies that can realize these goals’. The CCUS report is a part of the IEA’s Energy Technology Perspectives 2020 series.
On a slightly more positive note, the Global CCS Institute has added ten carbon capture and storage (CCS) facilities to its global database, bringing the total number of CCS facilities in various stages of development to 59 with a capture capacity of more than 127 million tonnes per annum (mtpa). There are now 21 facilities in operation, three under construction, and 35 in various stages of development. For more, visit the CO2RE Database and read the GCCSI’s ‘Value of CCS’ report for 2020.
A refinery and fertilizer plant are now connected to the Alberta Carbon Trunk Line. The ACTL will initially move 1.6 million tonnes of CO2/year with a 14.6 MTPA expansion capacity.
The Colorado-based Rocky Mountain Institute, in partnership with Spherical Analytics has developed the ‘Climate Action Engine’, a data and analytics platform to monitor Permian Basin greenhouse gas emissions. The CAE is built on Immutably, Spherical Analytics' blockchain-based ‘enterprise data fabric platform’. Operators including Shell, Origin Energy, ExxonMobil and Chevron are to ‘shape’ the CAE’s outputs and use cases to help identify opportunities to reduce methane emissions.
An analysis of flaring is available in a study from Gaffney-Cline’s carbon management practice. Many Permian producers are consistently ‘best-in-class’, with flaring intensity of around 2% (the basin average is 4%). However, the impact of flaring is aggravated by malfunctioning and unlit flares. Increased scrutiny of incomplete flare combustion and venting is warranted, as the warming potential of methane is approximately 84 times that of carbon dioxide over a 20-year period. The Permian Methane Analysis Project survey by the Environmental Defense Fund found 11% of flares either unlit or malfunctioning. In response, the Texas Railroad Commission has formed the Texas Methane and Flaring Commission and is recommending changes to the current regulatory regime.
DCP Midstream is now monitoring emissions from its portfolio of natural gas assets with Kairos Aerospace’s airborne methane monitoring techniques and advanced analytics.
Shell has awarded Bureau Veritas a global framework agreement for the use of its Leak Detection and Repair solution to measure and control emissions. The Veritas Ldar solution includes optical gas imaging, laser and ultrasonic technologies and inspection drones.
The Alberta Energy Regulator has produced a video explainer covering its Directive 060: Upstream Petroleum Industry Flaring, Incinerating, and Venting and Directive 017: Measurement Requirements for Oil and Gas Operations. The AER also does a polite RTFM, with a pointer to Manual 015: Estimating Methane Emissions, and Manual 011: How to Submit Volumetric Data to the AER.
The latest Energy Transition Outlook from DNV GL holds that ‘deep decarbonization’ of the world’s energy system is ‘still 15 years away’. Decarbonization is ‘rising rapidly up the agenda’ of industry and governments, but ‘not at the pace or depth required to meet the Paris Agreement’. The oil and gas industry is set to reduce its carbon emissions by 32% by 2050, but world emissions will remain stubbornly high until the mid-2030s. Hydrogen and CCS are seen as key to decarbonization after 2035, if ‘incentivized by policy’.
DNV GL Energy CEO Ditlev Engel unveiled the ‘Transition Faster Hub’ warning that, ‘Global warming will have catastrophic consequences for humanity. Five years on from the Paris Agreement we have not made the progress required to deliver this. Technology, policy and societal change have the power to create a clean energy future, but we need to go faster, much faster’. DNV GL’s industry information source showcases best-in class projects, technology, innovation and solutions, as well as thought leadership reports, podcasts and news on ‘accelerating the transition’.
The EU Commission has published a ‘Communication’, Powering a climate-neutral economy: An EU Strategy for Energy System Integration, a 22-page explanation of how the EU is to achieve ‘climate neutrality’ by 2050, through the ‘deep decarbonization’ of all sectors of the economy.
A paper* titled The Global Methane Budget 2000–2017 by researchers from Stanford University’s Global Carbon Project found that global methane emissions have ‘soared to a record high’. While the pandemic has reduced CO2 emissions, methane emissions continue to climb, ‘dragging the world further away from a path that skirts the worst effects of global warming’. Increases are being driven primarily by the growth of emissions from coal mining, oil and natural gas production, cattle and sheep ranching, and landfills. More from Stanford.
An open access paper from the Geological Society of London titled ‘Geoscience and decarbonization: current status and future directions*’ reviews the role that geoscience and the subsurface could play in decarbonizing electricity production, industry, transport and heating to meet climate change targets. The paper is based on presentations made at the Geolsoc’s 2019 Bryan Lovell meeting.
Hydrogen Europe has produced ‘Clean Hydrogen for Europe’, a 157-page ‘strategic research and innovation agenda (SRIA) of the clean hydrogen for Europe partnership’. Hydrogen is a solution ‘without which Europe cannot achieve its 2050 goals on GHG emissions reduction’. The report provides background on EU research, notably the fuel cell and hydrogen joint undertaking, or ‘FCH-JU’ as the EU acronym-speak has it.
Gaffney Cline’s Focus on Blue Hydrogen promotes the merits of ‘blue hydrogen’, i.e. hydrogen obtained from reforming methane, in the energy transition. Hydrogen’s promise stems from its potential use as an energy vector rather than as a primary energy source. Hydrogen can be added to natural gas to reduce carbon content, used in fuel cells, used as a feedstock for industry, or used as a ‘battery’ to store excess renewable energy.
Fleetcor has introduced a US Fleet Card offering ‘100% tailpipe emissions offset’. The company has teamed with enviro-tech specialist GreenPrint to launch the Fuelman Clean Advantage fleet card. When a user fills-up, 100% of future emissions are offset through investments in ‘independently certified carbon projects’.
Intertek has launched ‘CarbonClear’, a certification program that independently verifies the per-barrel carbon intensity of produced oil.
CleanDesign Power Systems has developed a hybrid power management system for oil and gas drilling rigs that uses Lavle’s Proteus lithium-ion battery. The system is claimed to reduce fuel consumption, lower emissions and cut power-related downtime.
Hamish Wilson, Bill Senior, Tony Smith, Martin Dru and Sarah Milne have collectively launched BluEnergy. The consultancy sets out to help oil and gas companies leverage their existing asset base to create low carbon energy streams, generating value and reducing carbon intensity.
Finnish Puro.earth has announced a ‘proven’ carbon transformation business model for CO2 removal with the potential to remove 10 Gigatons of CO2 a year by 2050, equivalent to half of global emissions cuts required to hit net zero. The DNV GL-verified process involves the trade of CO2 removal certificates, ‘matching companies that lock away CO2 in environmentally sound processes with companies that have pledged to get to neutral emissions’. TietoEvry is a client.
DNV GL has approved Shell’s Cansolv CO2 removal technology for use in a ‘full-scale demonstration’ project at Fortum’s waste-to-energy plant in Oslo, Norway. Qualification verified application of DNV GL’s recommended practices; DNVGL-RP-A203 and DNVGL-RP-J201.
An update on the NPD’s 2008 Power from the shore report has it that almost half of Norway’s oil production will soon be run on power from shore. Currently, eight NCS fields receive power from the Norwegian grid. Decisions have been taken on a further eight, with a further six approaching FID. Cost of the mitigated emissions is estimated at under NOK 1500 ($163) per tonne of CO2.
PIDX International has formalized its Emissions Transparency Data Exchange (ETDX) initiative, with the aim of developing energy transition standards for the exchange of data on carbon emissions and other energy-transition needs, harmonized across industry participants. ETDX will cover data formats and APIs for upstream, midstream and downstream.
BP and others have formed the Coalition for a Better Business Environment to encourage US East Coast states to enact the Transportation and Climate Initiative, a regional carbon pricing policy designed to reduce emissions from the transportation sector.
Shell* and Microsoft are ‘embarking on a strategic alliance to progress towards a world with net-zero emissions’. The alliance will see Shell supply Microsoft with renewable energy (Microsoft plans to be 100% renewable by 2025), while the companies continue to work on artificial intelligence that has ‘already driven transformation across Shell’s operations’, ‘delivering efficiencies that have helped reduce Shell’s carbon emissions’. Shell is also to use Microsoft’s Azure cloud to strengthen operational safety, improving risk analysis, prediction and prevention.
* BP has made a similar announcement with Microsoft.
The Oil and Gas Climate Initiative (OGCI) has announced a new target for the reduction of its members’ carbon intensity. The target for upstream oil and gas operations is now for ‘between 20 kg and 21 kg CO2e/boe by 2025’, down from 23 kg CO2e/boe in 2017. The target covers both CO2 and methane emissions from exploration and production. OGCI members include BP, Chevron, CNPC, Eni, Equinor, ExxonMobil, Occidental, Petrobras, Repsol, Saudi Aramco, Shell and Total.
The Environmental Partnership has just published its Annual Report 2020 subtitled ‘Improving the oil and natural gas industry’s environmental performance’. The 38-page document summarizes TEP’s programs, notably an expansion into the midstream segment, and provides data highlights for 2019 along with a Focus on Flaring. This chapter is more of a justification for flaring than a program for environmental enhancement. TEP is backed by the American Petroleum Institute. Greenpeace it is not!
Total has joined with ten other diversified French companies to form the ‘Coalition for the energy of the future’, engaging in ‘nine concrete projects for developing energy solutions to accelerate the energy transition in transport and logistics’. Projects include the use of hydrogen and biofuels in the transport sector and the development of a ‘digital door-to-door route planning system*’ to minimize environmental impact. The Coalition’s initial findings will be officially presented in January 2021 at the IUCN World Conservation Congress.
* The Coalition’s digital folks may not have noticed, but the ‘travelling salesman problem’ is one of the oldest and most-studied problems in computing, for which good heuristic solutions have long existed.
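The footnote’s point can be made concrete: practical route planners typically settle for fast heuristics rather than exact optima. A minimal, illustrative sketch (coordinates invented, not from any Coalition system):

```python
import math

def nearest_neighbour_route(points):
    """Greedy nearest-neighbour heuristic for the travelling salesman
    problem: start at the first point, always visit the closest
    unvisited point next. Fast, but not optimal in general."""
    unvisited = list(range(1, len(points)))
    route = [0]
    while unvisited:
        last = points[route[-1]]
        nxt = min(unvisited, key=lambda i: math.dist(last, points[i]))
        route.append(nxt)
        unvisited.remove(nxt)
    return route

# Four stops on a unit square: the greedy tour walks the perimeter
# rather than crossing the diagonals.
stops = [(0, 0), (1, 1), (0, 1), (1, 0)]
print(nearest_neighbour_route(stops))  # [0, 2, 1, 3]
```

Real route planners layer time windows, vehicle capacities and traffic onto this core, but the underlying combinatorial problem is the same one studied since the 1950s.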
Speaking at the Energy Conferences Network/Energistics co-hosted online Summer Digitalization Summit, self-confessed ‘digital rebel’ Harald Wesenberg from Equinor’s Research Center in Trondheim presented on the role of blockchain in the Industrial Internet of Things and how to ‘bridge blockchain to the physical world’. Wesenberg’s presentation was billed as an exposition of Equinor’s trials of the use of ‘sensors and IIoT to bridge between the virtual world of blockchains and the physical world of the oil and gas industry’.
Blockchains can be public (à la bitcoin), permissioned or private. Transaction visibility differs as does performance. Public chains are slower than permissioned and private. On the issue of tying blockchain to objects in the real world, Wesenberg offered that a bicycle’s ownership can be determined by a blockchain record, but this does not identify the rider.
Since the difficulty of relating a blockchain token to a real world object was core to our 2018 ‘blockchain is bullshit’ editorial, we pinged Wesenberg on LinkedIn to ask for more details on relating tokens to things. It turns out that Wesenberg’s reading of the situation is quite close to ours. He explained, ‘I’m not sure what you mean. One of my key points in the presentation was intended to be the opposite, i.e. that for some trust problems (and especially trust problems related to real world events captured with IoT) is that you don’t need a token model, just an immutable distributed ledger. There are other trust mechanisms at play also’.
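Wesenberg’s ‘immutable distributed ledger, no token model’ idea can be sketched without any blockchain machinery at all: a hash-chained, append-only log in which each entry commits to its predecessor. The sketch below is our illustration, not Equinor’s implementation; the sensor tag is invented:

```python
import hashlib
import json

class Ledger:
    """Minimal hash-chained append-only log. Each entry embeds the
    hash of its predecessor, so any retroactive edit invalidates
    every later entry on verification."""
    def __init__(self):
        self.entries = []

    def append(self, record):
        prev = self.entries[-1]["hash"] if self.entries else "0" * 64
        body = json.dumps({"record": record, "prev": prev}, sort_keys=True)
        self.entries.append({"record": record, "prev": prev,
                             "hash": hashlib.sha256(body.encode()).hexdigest()})

    def verify(self):
        prev = "0" * 64
        for e in self.entries:
            body = json.dumps({"record": e["record"], "prev": prev},
                              sort_keys=True)
            if e["prev"] != prev or \
               hashlib.sha256(body.encode()).hexdigest() != e["hash"]:
                return False
            prev = e["hash"]
        return True

log = Ledger()
log.append({"sensor": "PT-101", "psi": 1450})   # tag invented
log.append({"sensor": "PT-101", "psi": 1462})
print(log.verify())                       # True
log.entries[0]["record"]["psi"] = 9999    # tamper with history...
print(log.verify())                       # False — chain no longer validates
```

Distribution (replicating the log across mutually distrustful parties) adds the trust properties; note that nothing here requires a token, which was Wesenberg’s point.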
Wesenberg’s presentation included a link to the Blockchain for Energy home page. Here a ‘successful’ pilot on water haulage is reported using ‘Data Gumbo’s GumboNet’ blockchain. The question now arises as to the functional difference between a ‘private’ commercial ‘blockchain’ network and a conventional B2B portal, of which there are many.
More from Energy Conferences Network.
Jonathan Carpenter reported that Petrofac is cutting spending to conserve cash and pulling back on digital in a ‘mixed picture for the digital transformation’. Fayez Kharbat concurred: Saudi Aramco is likewise ‘more selective’ on digital spend, with a focus on making operations more efficient and profitable. McKinsey’s Anosh Thakkar described shrinking digital activity, albeit with a remaining focus on high-impact scalable projects. Covid has called a halt to ‘endless proofs of concept’.
On decarbonizing the industry, Schneider Electric’s Eric Koenig outlined the societal and financial pressures that have led some oil and gas players to diversify their portfolios into more resilient industries and to ‘avoid black swan events’.
In a more down-to-earth session on digital solutions for the future, Sebastien Marquardt presented Bilfinger’s work using artificial intelligence to decipher paper piping and instrumentation diagrams (P&IDs). Bilfinger’s PIDGraph produces XML or JSON output. ‘DeepGraph’ analysis identifies components and deciphers interconnected data sources to provide a 3D, holistic view of an asset. An ‘open source’ graph structure, said to be based on the ISO 15926 standard, allows data to be exchanged with computerized maintenance management systems.
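For illustration, the kind of component graph such a digitizer might emit could look like the JSON below. The schema, tag names and helper function are our invention, not Bilfinger’s actual PIDGraph output:

```python
import json

# Hypothetical P&ID-as-graph: components become nodes, pipe and
# instrument-signal connections become edges, and the whole structure
# serializes to JSON for exchange with a CMMS.
pid_graph = {
    "nodes": [
        {"tag": "P-101", "type": "centrifugal_pump"},
        {"tag": "V-201", "type": "gate_valve"},
        {"tag": "FT-301", "type": "flow_transmitter"},
    ],
    "edges": [
        {"from": "P-101", "to": "V-201", "kind": "process_line"},
        {"from": "FT-301", "to": "V-201", "kind": "instrument_signal"},
    ],
}

def downstream_of(graph, tag):
    """Follow process lines to list components fed by a given tag."""
    return [e["to"] for e in graph["edges"]
            if e["from"] == tag and e["kind"] == "process_line"]

print(downstream_of(pid_graph, "P-101"))          # ['V-201']
print(json.dumps(pid_graph, indent=2)[:60])       # exchange-ready JSON
```

Once diagrams are in such a machine-readable form, tracing connectivity or reconciling tags against a maintenance system becomes a query rather than a manual markup exercise.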
Kongsberg Digital’s Shane McArdle forecast that the industry is ‘in for 3-4 years of covid disruption’. In the interim, ‘we need to do more and fix pre-covid risk aversion to digital’. A huge shift is about to happen involving a 50-80% reduction in field personnel while maintaining and maybe even increasing production. The covid pandemic has brought us remote support using Microsoft Teams and Zoom. These ways of working are here to stay. McArdle sees the digital twin as key here. Today we have humans in the loop. In the next couple of years we will have ‘fully autonomous’ operations. There will still be people in the loop, but fewer, working from one central facility. Kongsberg’s Kognitwin Energy offering is a ‘virtual representation of all assets and their behavior’. A ‘Unify’ data management component adds graph-based data drill down into equipment tags.
Khalid Alharbi and Mohammed Tomehy presented Saudi Aramco’s IMOMS (integrated manufacturing operations management) system. IMOMS is a group of apps (GE SmartSignal, Microsoft Project, PI system) sitting between data sources and SAP ERP. Aramco’s IMOMS has been in use for some time as a 2017 article in Digital Refining shows.
More from GO Digital.
A new white paper ‘Digital twin and asset administration shell concepts and application in the industrial internet and Industrie 4.0’, co-authored by representatives of the US Industrial Internet Consortium (IIC) and the German Industrie 4.0* body, explores the relationship between the IIC’s vision for a digital twin and I4.0’s work on the asset administration shell, an overarching platform for plant and process industry automation.
In February 2020 the IIC produced a white paper, ‘Digital Twins for Industrial Applications’, with definitions, technical aspects, standards and use cases. The I4.0 ‘Asset administration shell’ (AAS) is an implementation of the digital twin for industrial applications, developed to support cross-company interoperability across the value chain.
The 30-page publication traces the history of the DT since the term was coined by Michael Grieves in 2003 in a course on product lifecycle management. The term was picked up by NASA in 2012 and defined as ‘a multi-physics, multiscale, probabilistic, ultra-fidelity simulation that reflects, in a timely manner, the state of a corresponding twin based on the historical data, real-time sensor data, and physical model’. Today, the DT is ‘about more than just simulation’.
There are other definitions; the DT as a concept has grown to encompass just about anything, from data storage and analysis to control and AI. ‘Some implementations of digital twin may contain many attributes and data, computational capabilities and perhaps even a formal interface for communication to satisfy the application requirements, some others may only need a small set of attributes and data to be sufficient to support their application’. Your mileage may vary.
In some senses, the DT represents push-back from the operations technology community against the encroachment of IT. The white paper has it that ‘The DT organizes and enables access to the data in association with its corresponding real-world objects from an Operational Technology (OT) perspective, rather than the usual data tables in databases from an IT perspective, making it better suited for running computational models and developing applications’.
The bi-directional nature of the DT is demonstrated by the fact that ‘sensor data and operational states of an asset are sent continuously to the twin; any operational instructions resulting from decisions based on analytics in the specific application context can be sent back to the real-world entity to be executed’.
Under the hood, the DT’s computational and presentational layer spans physical, statistical, control and machine-learning models, along with CAD, engineering and 3D visualization applications including 3D simulation and VR/AR/MR. In a plant, twins can be combined into a hierarchy that mimics the whole system. In other use cases, twins can be combined in a peer-to-peer configuration, for instance in a windfarm.
Such combination requires a ‘common construct’ of data, models and a service API. Enter AAS, the asset administration shell, a standards-based ‘uniform regulatory framework’ for decentralized systems and artificial intelligence. AAS combines elements from the IIC’s Industrial Internet Reference Architecture and Industrie 4.0’s own ‘RAMI’. A ‘technology-neutral’ information model embeds technologies enumerated as ‘XML, JSON, RDF, OPC UA and AutomationML’. A file-based information exchange package format for offline use (AASX) is also available. Communications appear to favor MQTT and/or OPC-UA. The AAS claims alignment with the ISO/IEC 21823-1 standard governing IoT system interoperability. Other AAS standardization activities are ongoing in IEC TC65 WG24 and ISO CD 23247. Examples of open-source AAS activity include the Admin Shell, BaSyx 21 and SAP’s i40-aas.
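As a rough illustration, an AAS-style shell with its submodels might be serialized along these lines. The structure below is a loose paraphrase, not the normative AAS metamodel (see the Plattform Industrie 4.0 specifications); names and values are invented:

```python
import json

# Illustrative only: one shell per asset, typed submodels holding
# properties. 'Nameplate' and 'OperationalData' echo common AAS
# submodel templates, but the schema here is simplified.
aas = {
    "idShort": "Pump_P101",
    "assetInformation": {"globalAssetId": "urn:example:asset:p101"},
    "submodels": [
        {"idShort": "Nameplate",
         "elements": [
             {"idShort": "Manufacturer", "value": "ExamplePumps GmbH"},
             {"idShort": "SerialNumber", "value": "SN-2020-0042"},
         ]},
        {"idShort": "OperationalData",
         "elements": [
             {"idShort": "DischargePressure", "value": 14.2, "unit": "bar"},
         ]},
    ],
}

def get_property(shell, submodel, prop):
    """Look up a property value by submodel and element idShort."""
    for sm in shell["submodels"]:
        if sm["idShort"] == submodel:
            for el in sm["elements"]:
                if el["idShort"] == prop:
                    return el["value"]
    return None

print(get_property(aas, "Nameplate", "SerialNumber"))  # SN-2020-0042
print(json.dumps(aas)[:50])   # the JSON rendering the standard envisages
```

The real metamodel adds semantic identifiers (often ECLASS/IEC 61360 references) to each element so that two companies’ twins can agree on what ‘DischargePressure’ means, which is the cross-company interoperability the white paper is after.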
Comment. Industrie 4.0 and the AAS probably have more to say about the discrete process industries (manufacturing), but the DT trope has so much currency in oil and gas that an awareness of how this large operations technology community sees the world may prove useful. What is also interesting is the parallel between an assemblage of digital twins and the kind of interoperability promised by the IT community’s push for ‘microservices’. As OSDU extends towards operations, these two worlds are getting closer!
* More strictly Germany’s Plattform Industrie 4.0.
Accenture has used its proprietary myWizard automation platform to migrate Equinor’s SAP software environment to the Microsoft Azure public cloud. The migration aims to optimize Equinor’s IT costs and support oil and gas and renewable energy operations.
ADNOC and Group 42, an Abu Dhabi-based AI and cloud computing company, have jointly launched AIQ, to develop and commercialize AI products and applications for the oil and gas industry in the UAE.
ConocoPhillips has awarded Aker Solutions a three-year contract extension to an existing framework agreement for work at North Sea fields.
Akselos has deployed a structural Digital Twin, based on its patented RB-FEA technology, for Shell’s Bonga Main FPSO, located 120km Southwest of the Niger Delta in Nigeria.
Alta Resources is to deploy Ambyint’s InfinityPL and SmartStream to improve operational efficiency and ‘drive additional free cash flow’.
China Zhenhua Oil, one of the largest crude oil importers in China, has selected Amphora’s Symphony Commodity Trading and Risk Management (CTRM) solution.
Applied Petroleum Technology and UK-based Fluid Inclusion Consulting are partnering to provide a spectrum of post-well analyses. The JV combines APT’s wellsite gas processing and interpretation software package (Girasol) with FIC’s exclusive access to Fluid Inclusion Technologies’ (FIT) datasets and Rockwash Geodata’s patented sample washing and digitization process.
Archer, a global oil services company, has secured a four-year contract extension with Apache for the provision of platform drilling operations and maintenance services on the Beryl and Forties North Sea fields.
Heide Refinery is implementing Aspen Generic Dynamic Optimization Technology (GDOT) software to help improve refinery margins with real-time, closed-loop, dynamic optimization increasing the flexibility of operations.
National Oilwell Varco has selected Axway to provide an API platform, integrating the Amplify API manager with Microsoft Azure.
BKO Services and Seeq have partnered to deliver advanced analytics solutions to customers in the oil and gas and other industries.
Blockchain specialist Data Gumbo, has secured the first close in its Series B funding round of $4 million from new investor L37, a Bay Area and Houston-based venture capital company, Equinor Ventures, and Saudi Aramco Energy Ventures.
Shell is to deploy Bluware Pickasso, a custom version of the InteractivAI seismic interpretation solution.
BP and Microsoft have joined forces to further digital transformation in energy systems and advance the net-zero carbon goals of both companies.
Wintershall DEA is to deploy Cognite Data Fusion (CDF) to Brage, expanding the Wintershall DEA data hub from Germany to Norway. The partnership will allow the ingestion and contextualization of data from the Brage field to support Wintershall DEA maintenance and production optimization specialists.
Siemens and Atos have signed a five-year extension of their strategic partnership. Atos is to deliver a solution from Cohesity to handle Siemens’ backup and recovery, data storage, and long-term archival.
Snøhvit Unit partners have awarded Aibel a FEED contract with an estimated value of NOK 140 million for modification of the Hammerfest LNG plant in connection with the Snøhvit Future project. The FEED will also cover two sub-projects: onshore compression and Hammerfest LNG electrification.
Ctrl2GO, a provider of predictive analytics and maintenance services, has ‘partnered with Saudi Arabia’ to maintain the country’s strategic assets.
Aberdeen-headquartered Korea National Oil Co. unit, Dana Petroleum, is to use Progressive TSL’s Infor SunSystems oil and gas accounting solution.
Ecopetrol and Doris have signed a five-year agreement for the provision of studies, engineering, and support across offshore oil and gas developments.
eDrilling has been awarded funding by the Norwegian Research Council through the DEMO 2000 Program. The DEMO 2000 program seeks to ensure long-term competitiveness in Norway’s oil and gas industry and ‘continued profitable development of the petroleum resources on the Norwegian continental shelf’.
Enbridge has joined ONE Future natural gas coalition, bringing the total number of member companies to 27.
DUG Technology and Equinor have signed a multi-year deal for DUG McCloud, a combo of DUG’s HPCaaS seismic processing, PandI imaging and Insight geoscience software.
Earth Science Analytics has raised a total of 75 m NOK series B funding from Equinor Ventures, Wintershall Dea Technology Ventures, and Sumitomo Corporation. The funds will be used to develop cloud-based geoscience software and expand ESA’s presence across the world.
Neptune Energy and Eserv, a 3D technology specialist, have partnered for the digitization of the Cygnus platform.
Explor, Halliburton and AWS have partnered on seismic data processing in the cloud. Explor provided the 3D seismic dataset, data science and geophysical expertise. Landmark provided the seismic processing and AWS provided the cloud computing resources and solutions architects.
Four new SMEs: Billington Process Technology, Ontopic, Oxford Semantic Technologies and Prediktor have joined Norway’s SIRIUS research establishment.
Halliburton and Honeywell are to ‘co-innovate’ on digital solutions for oil and gas operators.
eDrilling has partnered with ICT to provide well construction services to customers in Qatar.
Total SE has selected Ikon Science’s iPoint data management solution to provide corporate-wide, centralized access to all interpreted laboratory results associated with drilled wells. Total described the software as ‘a key part of Total’s digitalization strategy’.
Implico and Aquarius Software are collaborating on automation and digitalization projects in Latin America. The partners will implement innovative technology, such as the terminal management system OpenTAS, at their customers’ sites.
Juniper Networks, in collaboration with Schweitzer Engineering and Dragos, has launched Converged Industrial Edge; a communications architecture designed to reduce cybersecurity risk, lower operational expense and enhance situational awareness for utilities, oil and gas and other industrial markets.
Kongsberg Digital and ABB Turbocharging have signed a digitalization collaboration on edge data collection and ship engine performance analysis.
Kongsberg Digital is to deploy Kognitwin Energy, its digital twin software to Shell assets globally.
M-Files Corporation and Iron Mountain are extending their partnership agreement to new geographical areas in the Asia region.
Marlink is to provide a special purpose highly resilient satellite network solution for the ROSS project developed by offshore services operator SeaOwl.
P97 Networks has been selected by Mobil Oil New Zealand, an affiliate of ExxonMobil Asia, to enable mobile payments across its Mobil-branded retail network in New Zealand.
Refuelling Solutions has chosen PDI Logistics’ cloud to power its fuel distribution network.
Peloton has teamed up with Sfile, a ‘cognitive computing’ company and provider of AI and ML solutions to synthesize raw unstructured, unnormalized data into continuous flows of ‘qualitative normalized understandings and features’.
The Oil and Gas Technology Council (OGTC) has appointed Petrofac to join its Project Data Analytics Task Force.
PGS has partnered with SurplusHub, the global online portal for selling and buying surplus oil, gas and shipping equipment, and materials, to promote greater environmental responsibility.
As part of PTTEP’s Advanced Production Excellence initiative, PTTEP is to implement Landmark/Halliburton’s DecisionSpace Production Suite in the cloud to improve operations from the subsurface to processing facilities.
Quorum has announced recent software deals with Greylock Energy, Black Bear Transmission, Secure Energy, Brazos Midstream and XcL Midstream.
Santos has selected P2 Energy Solutions to provide a single, comprehensive platform for its production needs across Australia and Asia.
Schlumberger, IBM and Red Hat are teaming up to provide global access to Schlumberger’s Delfi cloud-based E&P software from IBM’s hybrid cloud, built on Red Hat’s OpenShift platform. Schlumberger has ‘committed to the exclusive use of Red Hat OpenShift’. The container platform will enable applications to be deployed across traditional data centers and private and public clouds.
Turkish Petroleum International Company has selected WiNG, Sercel’s wireless land nodal acquisition system, for upcoming seismic acquisition projects in difficult-to-access mountainous areas of Turkey.
Sixgill, an AI IoT platform provider, and Colorado-based CleanConnect.ai have partnered to deliver a suite of AI solutions to the oil and gas industry.
Kuwait Oil Company has awarded Schlumberger a five-year contract, valued at $109 million to deploy ‘best-in-class’ software solutions.
BP is to deploy SparkCognition’s AI predictive analytics to improve reliability and reduce the carbon footprint of its operations. SparkPredict is currently deployed on Atlantis and ETAP to predict equipment failures and process vulnerabilities.
Alibaba and Total (China) Investment have signed an MoU to pursue a strategic collaboration that will leverage their respective resources to drive the digital transformation of Total’s operations in China.
Weatherford has selected INT’s IVAAP framework for its Centro digital well delivery software, advancing its data visualization capabilities.
Husky is to deploy White Whale’s DeepSea AI analytics platform to monitor and optimize its remaining thermal bitumen projects in Saskatchewan.
Uniper has selected Wipro to implement the blockchain-based small-scale LNG trading/fulfillment platform to address the complexity of the European LNG market.
Wood and Aspen Technology have partnered for predictive and prescriptive maintenance.
SRO Solutions, an IBM Business Partner, has completed a major digitization project for MODEC’s offshore operations in Ghana.
Aspen Technology received a staff determination from the Nasdaq for its failure to file its 10-K annual report for the year ended June 2020. The delay was due to accounting errors identified by AspenTech and COVID-19 remote working complicating its reporting. The company has 60 days to submit a plan to regain compliance.
CGG is selling its Multi-Physics Business to Xcalibur Group of South Africa. The deal does not include CGG’s multi-client library.
Covenant Testing Technologies is to merge with Stuart Pressure Control in a debt-free, paper transaction. The combined company will be jointly owned by Catapult Energy Services (an NGP Energy Technology Partners portfolio company) and affiliates of White Deer Energy.
EQT has exited its investment in Altus, a provider of well intervention services, selling its stake to members of the Altus management team in a transaction supported by a consortium of three Nordic banks. Financial terms were not disclosed.
Fracker FTS Intl. has entered into a restructuring support agreement with its lenders in the form of a ‘prepackaged’ chapter 11 plan. The deal will deleverage FTSI’s balance sheet by $437.3 million. Vendors, suppliers and customers ‘remain unaffected by the transaction’.
Wolters Kluwer is acquiring CGE Risk Management Solutions. CGE Risk will join WK’s EHS/Risk software group, alongside Enablon and eVision.
Aveva is acquiring OSIsoft, developer of the ubiquitous PI System in a $5bn deal. OSIsoft founder and majority shareholder Pat Kennedy becomes chairman emeritus with a 4% stake in Aveva.
Swiss ‘code as data’ software boutique Olympe SA has raised 2.5 million Swiss Francs from Inter Invest Capital and others. Olympe has developed a ‘supply chain and industry 4.0’ solution targeting the oil and gas vertical, leveraging ‘IoT, digital twins, and blockchain technologies’.
Palantir Technologies long-anticipated float was something of a flop. Shares opened at $10 and drifted down to $9.50 over the next few days.
Pelican Energy Partners has acquired measurement while drilling specialist Noralis. Noralis will be integrated into Pelican’s Gordon Technologies unit.
Reset Energy has acquired Heroes Energy Solutions. The combination will ‘take Reset Energy from a plant design and fabrication company to a diversified business capable of providing turnkey solutions from conceptual design to commissioning’. Heroes’ specialty is the design of modular gas treating skids and process equipment.
Seeq Corp has closed a Series B expansion with an investment from Cisco Investments, along with renewed participation from Saudi Aramco Energy Ventures, Altira Group, Chevron Technology Ventures, Siemens’ Next47 venture group and others. The new cash will ‘further the growth’ of Seeq’s business.
Hexagon AB has acquired 3D Lidar security surveillance specialist TacticaWare. TacticaWare’s Accur8vision flagship delivers 3D situational awareness of an intruder’s location, size and movement in a 3D digital reality view.
Malaysian Iraya Energies has received Series A funding in a round led by a ‘major energy partner’ (Petronas is a client). Iraya’s ElasticDocs flagship applies machine learning to large bodies of, inter alia, geoscience documentation.
Siemens has floated its Siemens Energy unit on the Frankfurt Stock Exchange. Siemens AG retained a 35% stake. Shares opened at €22.01 and drifted slightly down over the next few weeks to around €21.20.
Schlumberger has signed over its US and Canadian onshore hydraulic fracturing business to Liberty Oilfield Services in exchange for a 37% share in the combined company.
Schlumberger New Energy and Thermal Energy Partners are to form GeoFrame (sic!) Energy, combining Schlumberger’s subsurface and drilling expertise with Thermal Energy Partners’ experience in geothermal power project development. The new company’s first project is a 10-MW geothermal power project on the Caribbean island of Nevis.
Total, formerly Total Société Anonyme, has converted to European company status and is now Total Société Européenne, a.k.a. Total SE. Comment on the LaTribune.fr website has it that the SE tag heralds the adoption of an EU legal status which will allow Total to operate more easily across the EU. CEO Patrick Pouyanné was quoted as saying, ‘legally this doesn’t change much, but I think it is a good thing that large EU groups demonstrate their faith in Europe’.
The Calgary-based Professional Petroleum Data Management Association, PPDM, has published a 12-page white paper to explain its relationship to OSDU, The Open Group’s subsurface data universe. ‘Collaboration with OSDU’ (CwO) sets out to explain PPDM’s activity and how its members could support the development of the OSDU platform. The paper was not ‘formally endorsed’ by the OSDU Forum.
CwO recasts PPDM’s various upstream data initiatives in modern terminology. PPDM’s data object definitions are described as structured according to the technical requirements of a technology-specific container. The PPDM flagship database is now ‘embedded in a relational database container’. PPDM’s efforts to standardize upstream terminology, the ‘What is a …’ series of publications, are now grouped in IPDS, the ‘International petroleum data standards and best practices’.
What has been agreed so far is that OSDU is to leverage the PPDM rules library to increase data fidelity and OSDU is to submit changes to the library back to PPDM. OSDU has also agreed to leverage the knowledge contained in the PPDM Data Model and in its reference lists, ‘incorporating PPDM content by reference’. PPDM is also hopeful that OSDU will formally include its best practice guidelines in its data platform, although as of OSDU Release 3 these principles are not explicitly implemented, since well identification specs are deemed to be ‘implementation-based’.
PPDM is also in discussion with OSDU on a training program. PPDM proposes to develop educational material that will be harmonized with existing PPDM activity to support emerging OSDU-centric roles. The program will leverage PPDM’s Chartered petroleum data analyst certification. PPDM proposes that such material will be maintained on behalf of The Open Group and used in the OSDU accreditation process.
Schlumberger has published a Concise guide to the Open Group Open Subsurface Data Universe (OSDU) Forum with, notably, details of the upcoming R3 release, said to be the first release available for operational use by the wider industry. In August 2019, Schlumberger contributed the data ecosystem developed for its Delfi cognitive E&P environment to the OSDU Forum as open source code.
The 12-page Guide likens the Delfi ‘openness’ to that of Ocean and Petrel, conflating ‘open’ with the existence of an API, rather as Landmark did with OpenWorks. This raises a couple of questions for would-be OSDU users. First, can you run the OSDU data infrastructure as contributed by SLB without deploying and paying for Delfi? Second, if you can, how practical is it?
The Guide is somewhat unclear on this*. ‘The OSDU data platform will be a reference architecture and a reference implementation for cloud-native subsurface data platforms. It will not in itself be a truly production-ready subsurface data platform. To use OSDU, either your organization or a third-party vendor can develop the proprietary applications that turn it into a working E&P data platform. If your organization has a deployment of the Delfi environment, the OSDU data platform is already integrated as part of your Delfi solutions’. Clearly it will be easier to use OSDU if you are already a Schlumberger/Delfi shop.
* We did ask Schlumberger for clarification, but none was forthcoming.
Shell Oil Co. has announced Studio X, an ‘open innovation studio’ providing ‘energy innovators’ with software tools, on-demand work, prize-winning challenges and mentorship opportunities. Shell has seeded Studio X with three of its own software products that are set to ‘define the future of exploration’.
Xeek is a portal for crowdsourcing geoscience ideas that sets out to bring data scientists and geoscientists together to work on complex problems. XCover is a global talent network for virtual exploration projects that provides state-of-the-art virtual workstations to geoscience specialists for remote, collaborative project execution. A SixLab incubator provides exploration entrepreneurs with ‘world-class mentorship and resources to help shape the future of energy’.
Despite its virtual credentials, SixLab actually has a (rather unprepossessing) physical studio in Austin, TX, although the pandemic means that current tenants are working from home. These include David Thul, whose GeoLumina startup is leveraging Shell’s Studio X facilities to ‘find oil and gas cheaply and produce it with high cash margins’ using its own open source, AI/image processing algorithms. Boston Consulting Group’s Mauhan Zonoozy is acting MD for Studio X.
Lloyd’s Register Foundation has published a 68-page, free report on the Industrial Internet of Things (IIoT) cyber-risk landscape. LR’s Report Series: No. 2020.1 covers current and future approaches to IIoT operational security and risk management. The report does more enumerating of potential problems than offering the ‘practical next steps’ promised in the introduction, omitting IIoT protocol considerations such as OPC-UA or MQTT and their security. This approach may be useful for managers of a brigade of ‘hands-on’ security engineers (perhaps provided by LR). When they are through locking down today’s IIoT, a manager can then raise the issue of quantum computing, presented as having ‘the most important potential for disruption’.
A new Special Publication SP 800-207 from the US NIST covers zero trust architectures. Zero trust is a cyber security paradigm that ‘moves defenses from static, network-based perimeters to focus on users, assets, and resources’. The 60-page report has a US Federal government focus but covers issues such as multi-cloud security and joint venture data security. NIST Special Publication SP 1800-11 is a 450 page (!) treatise on Recovering from ransomware and other destructive events. Probably worth reading before any bad stuff happens.
The UK Oil and Gas Technology Council (OGTC) has commissioned a report from Baringa on Cyber security in the UK oil and gas industry. The 37-page study describes significant historical cyber incidents that have befallen oil companies and NOCs, which remain ‘likely targets for similar attacks in the future’. Following regulatory pressure, operators and suppliers are making ‘significant investment in cyber security initiatives’. The study investigates cyber security in the supply chain and the thorny issue of IT/OT ‘convergence’. Cyber risk management stood out as a priority. Here, ‘security is struggling to keep pace with business initiatives aimed at delivering new digital technologies’. Oil and gas has been disrupted by significant digital transformation, with many businesses planning and executing large-scale and ambitious change agendas. These bring new risks, which are challenging how cyber security is currently managed across the industry. A ‘multifaceted, collaborative approach to breaking down and overcoming these challenges is required’. Baringa found that security specialists are often considered scaremongers and that their language is often too technical and unclear. This leaves business leaders uninterested in cyber security and uninformed on its relevance to their operations. Despite the potential risks to health and safety, such events are seen as unlikely and may be dismissed with a ‘so what’ from senior leaders. Cyber security regulation in the UK oil and gas industry has been ‘uplifted’ with the introduction of the EU-derived NIS Directive and the Cyber Assessment Framework (CAF). Non-compliance with the Directive may result in a fine of up to £17 million.
Acronis Cyber Backup SCS Hardened Edition is a disk image backup solution for safeguarding sensitive data in air-gapped, ‘no internet’ environments. Acronis Cyber Backup features FIPS-validated encryption and RSA key generation, as well as an Intel-pioneered, hardware-based random number generation method to ensure complete protection.
Asigra has announced cloud-based backup with deep multi-factor authentication. Deep MFA policy settings and controls prevent backup data deletions or malicious encryption caused by malware (including ransomware), criminal organizations, or human error. Deep MFA immutable retention prevents malware or unauthorized actors from deleting, modifying, or encrypting data in storage. More from Asigra.
Commodities trading group Noble Group has deployed Alsid’s Active Directory (AD) security solution to protect and harden its AD and entire IT infrastructure. Cyber ‘hygiene’ has been improved with the removal of thousands of ‘forgotten’ organizational units and accounts inside the domain. Hidden AD admin accounts are a major security concern because once compromised, they allow cybercriminals full access to an organization’s systems.
Bayshore Networks has rolled out SCADAwall, a new hardware device that provides safe, non-routable, one-way data transfer from trusted in-plant sources to untrusted destinations, such as corporate IT and other outside business systems. A ‘data diode’ physically separates the plant from the risk of internet exposure or malicious activity while allowing critical plant data to flow into corporate business systems. SCADAwall is a low-cost, rack-mounted 1 gigabit/sec unit providing content inspection and policy enforcement for data in transit.
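The one-way flow that a data diode enforces in hardware can be illustrated in software. The following is a minimal sketch, not Bayshore’s implementation (the port number and field names are invented): the plant side sends readings over fire-and-forget UDP, with no acknowledgment and no return channel for the IT side to exploit.

```python
import json
import socket

DIODE_PORT = 9999  # hypothetical UDP port standing in for the one-way link

def send_reading(reading, host="127.0.0.1", port=DIODE_PORT):
    """Plant side: fire-and-forget UDP send. There is no ACK and no
    return path, mimicking the non-routable transfer of a data diode."""
    payload = json.dumps(reading).encode("utf-8")
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as tx:
        tx.sendto(payload, (host, port))

# IT side: bind before the send so the datagram is buffered by the OS,
# then receive and decode it.
rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
rx.bind(("127.0.0.1", DIODE_PORT))
rx.settimeout(5.0)

send_reading({"tag": "PT-101", "value": 42.7, "unit": "bar"})

payload, _addr = rx.recvfrom(65535)
received = json.loads(payload.decode("utf-8"))
rx.close()
print(received)
```

A real diode guarantees unidirectionality optically or electrically; the sketch only mimics it at the protocol level by never opening a channel back to the sender.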
Chevron has selected SecurityGate.io for its operational technology cyber security. SecurityGate.io is to replace Chevron’s manual, spreadsheet-based cybersecurity practices with ‘scalable, digitized processes’.
New XDR and Response Automation capabilities are components of the Cynet 360 V4.0 autonomous breach prevention platform. Version 4.0 of Cynet 360 also includes an Incident View feature to help security administrators reduce response times ‘to minutes instead of hours or days’.
RigNet’s patented Cyphre encryption technology now operates at 5x the speed and is optimized for harsh edge environments. Cyphre delivers ‘military-grade’ cybersecurity to protect against memory-disclosure and cache side-channel attacks such as Heartbleed, Spectre, and Meltdown. The new capabilities are based on Cyphre’s recently awarded US Patent (No. 10,623,382) for an innovative transport layer security protection that increases end-user security by keeping session keys out of memory and preventing them from being stolen in a cache attack.
Carnegie Mellon’s Software Engineering Institute has released the source code for Kalki, a ‘software-defined’ IoT security platform. Kalki allows IoT devices that are not fully trusted to be integrated into networked systems, providing new capabilities for keeping networks and physical assets safe. IoT device vulnerabilities have enabled many recent attacks, such as the Mirai botnet and the Ripple20 vulnerabilities. Many IoT devices now added to SCADA systems have little or no onboard security. Kalki fixes this with network-level security and fine-grained monitoring using ‘µmboxes’ (micro-m-boxes) that provide virtualized security tuned to a device’s specific vulnerabilities, traffic and sensors. Download the Kalki source code here. And watch the video.
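The µmbox idea, a per-device security function tuned to what each device should be doing, can be sketched in miniature. This toy filter is not Kalki’s actual code; the device names, protocols and rules are invented for illustration. Each device gets its own whitelist of expected protocols and destinations, and anything else is dropped.

```python
# Toy per-device traffic filter in the spirit of Kalki's 'umboxes':
# each IoT device gets a policy tuned to its expected behavior.
# Device names, protocols and destinations are invented examples.

POLICIES = {
    "temp-sensor-01": {"protocols": {"mqtt"}, "dests": {"historian.local"}},
    "ip-camera-07":   {"protocols": {"rtsp"}, "dests": {"nvr.local"}},
}

def allow(device: str, protocol: str, dest: str) -> bool:
    """Return True only if the flow matches the device's own policy.
    Unknown devices are denied by default (a zero-trust posture)."""
    policy = POLICIES.get(device)
    if policy is None:
        return False
    return protocol in policy["protocols"] and dest in policy["dests"]

# A camera streaming RTSP to its recorder is expected behavior...
camera_ok = allow("ip-camera-07", "rtsp", "nvr.local")
# ...the same camera opening telnet to the internet (Mirai-style) is not.
camera_bad = allow("ip-camera-07", "telnet", "8.8.8.8")
print(camera_ok, camera_bad)
```

The point of the per-device granularity is that a compromised camera cannot suddenly start behaving like something else: its µmbox only knows about camera traffic.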
The SEI’s CERT/CC unit has also announced ‘Vince’, the Vulnerability Information and Coordination Environment, a web platform for collaborative software vulnerability reporting. Vince replaces the SEI’s 20-year-old legacy email reporting system. Vince is now live.
The Industrial Internet Consortium has published a white paper on software trustworthiness best practices. The 45-page publication covers safety, security, privacy and reliability of IIoT software and provides ‘practical and actionable’ best practices for recognizing, addressing, managing and mitigating risks and their sources.
Surge Engineering has joined the ISA Global Cyber Security Alliance, reflecting a ‘stronger focus’ on cybersecurity for its SCADA systems engineering capabilities. The ISA unit builds on the UN-endorsed ISA/IEC 62443 cyber security standard. Other oil country GCSA members include Honeywell, Rockwell Automation and Petronas. More from Surge.
A series of upcoming webinars co-hosted by ISA and Saudi Aramco will provide an overview of the ISA/IEC 62443 series of standards, ISASecure certifications, and end-user and supplier perspectives on OT cyber security. Presentations from Aramco, Chevron, ExxonMobil, and others. More from ISA.
A study by EY, on behalf of PetroLMI*, finds that the adoption of automation and AI in the oil and gas industry will change activities and displace jobs. Technologies such as robotic process automation, artificial intelligence, natural language processing and machine learning could gradually eliminate 30% of oil and gas jobs by 2040. Unsurprisingly, technical jobs are ‘twice as likely to be automated over leadership skills’.
EY broke down the sector into some 124 job descriptions which were then analyzed in terms of the probability of their automation. Drilling and operations came out as having the highest potential for automation. Land, ESH and IT came last – albeit all three with significant potential for economies.
The 20-page report forecasts that jobs in drilling will be down 65% by 2040, geology by 50% and IT by 45%. One driver for the cuts was a 2017 EY study of US oil and gas which found that ‘62% of Generation Z and 44% of Millennials are not attracted to careers in the industry’ and concluded that the use of advanced technologies could help increase the industry’s appeal to the upcoming tech-savvy workforce and fill the associated talent gaps.
EY Canada’s Lance Mortlock said, ‘The prospect of a jobless recovery means that companies will be looking to fill roles and add capabilities through technology and automation to increase optimization and reduce costs even further. While many companies had already begun this digital transformation, the pandemic created a sense of urgency to accelerate technology adoption.’
Writing in the Financial Times, Myles McCormick quoted Rystad Energy’s Matthew Fitzsimmons as saying, ‘The crash has kickstarted a lot of these digitization initiatives. We’re going to see a greater or faster adoption of some of these digital technologies, which will lead to some of these [traditional] jobs not coming back.’ About 107,000 jobs were slashed in the US oil, gas and petrochemicals sector between March and August, according to Deloitte. Shell is planning some 9,000 job cuts in ‘Project Reshape’ and BP 10,000, with office-based workers expected to bear the brunt of the redundancies. Schlumberger announced a monster cut of 21,000 jobs earlier this year.
Deloitte’s 32-page analysis, ‘The future of work in oil, gas and chemicals’ sees the industry as in a ‘great compression’ where companies’ room to maneuver is restricted. In the US, oil and gas employs close to 1.5 million people. The downturn means that retaining employees may be difficult as oil and gas employees now have ‘fungible digital skills’ and may migrate to other industries (technology and consulting firms, digital solution providers) where the prospects of career growth may seem brighter. A conclusion that marks a refreshing change from the frequent pretense that oil and gas folks are digital dunces!
Deloitte draws one fairly obvious conclusion: the return or otherwise of jobs to oil and gas will be mostly determined by the future oil price. A pessimistic $35/bbl price for 2021 would see a measly 3% of jobs returning. An optimistic $55/bbl scenario sees 76% of jobs returning.
All three reports imply that the pandemic is forcing more digital transformation. We learn elsewhere in this issue (2020 GO Digital) that the crisis has caused a degree of digital retrenchment. In any event, if the economy gets back into shape (which is possible) and if the energy transition takes longer than folks expect (probable) then oil and gas consumption will rise. There will be another boom and folks, including geos, engineers and data scientists (maybe), will all be back in business till the next upset. But for now, one interesting question regarding the job cuts is, who is actually getting fired? If you were the leader of a large oil company and had to arbitrate between say, a reservoir engineer and a data scientist, which would you choose to ‘let go’?
* Petroleum Labour Market Information is a division of Energy Safety Canada.
In a rather uninformative release, the Government of Alberta reports that its public inquiry into ‘foreign funding of anti-Alberta energy campaigns’, announced in 2019, has now ‘established a framework for conducting the Inquiry’s engagement with specific parties about the information gathered to date’. A more detailed explanation of the sensitive situation is available in a CBC online item which casts the $3.5-million inquiry as a ‘massive waste of money’.
The Norwegian Petroleum Directorate (NPD) is re-tendering the contract for its Diskos online data store. Diskos 2.0 involves separate contracts, one for the management of seismic, well and production data and another for the Trade module. Implementation involves the transfer of some 13 petabytes of data and testing that will likely take all of 2021. Go-live of Diskos 2.0 is set for 2022. The current Diskos operation for seismic, well and production data (run by CGG) will continue until the switch, with 24/7 availability. This will also be the case for the Trade module, which is managed by Kadme. More from the NPD.
NPD is now requiring annual status reports (ASR) for fields in production to be submitted in Excel (they were previously in Word) in what is described as ‘a first step towards full digitalization of ASR reporting’. Reporting data in Excel ‘will make it easier to compare data for purposes such as field and portfolio analyses by the authorities and the licensees’. The ASR is an important basis for the authorities’ assessment of applications for production permits. Check out the new ASR guidelines.
IOGP Report 373-03 - Contract Area Descriptions provides a comprehensive guide to the delineation of license block boundaries. In the past, boundaries have often been inadequately described, leading to overlapping licenses or unlicensed slivers between adjacent blocks. The Note targets regulators, company negotiators with ‘limited geodetic awareness’ and geomatics professionals.
The North Dakota Department of Mineral Resources has reacted to rising oil and gas unemployment in the state with a cunning plan. To keep furloughed and fired workers in the state, the Oil and Gas Division is working to confiscate abandoned wells and bid out their plugging. With 200-400 older, uneconomical wells to be plugged, the state expects the work will keep close to 600 skilled workers employed over the next six months. A $66 million budget has been allocated for the program from CARES Act funds. More from the DMR Director’s Cut video.
The DMR is in the process of updating its NorthSTAR database, with Release 3 scheduled to arrive early 2021. The new version will include a ‘Data Tier’ query tool to extract data from NorthSTAR for viewing and custom reporting. The update will also allow Oil and Gas Division inspectors to create, process and monitor well inspections.
The Railroad Commission of Texas has released its Fiscal Year 2021 priorities in the Oil and Gas Monitoring and Enforcement Plan. The agency plans to ensure that all wells in the state are inspected at least once every five years. Inspections will benefit from technological advances including a drone program for emergency response and surveillance. The RRC’s transparency initiative progresses with an online portal for hearings.
The RRC has also released another round of data on its website with visualizations from its site remediation and abandoned mine land programs. The RRC’s CASES (case administration service electronic system) portal has been enhanced with dockets from the agency’s Oil and Gas Division, Oversight and Safety Division, and the Surface Mining and Reclamation Division. The system now allows administrative penalty payments to be made online. Visit the RRC CASES.
In an epitome of a ‘transformation’, digital or not, Equinor has announced a ‘world’s first’ logistics operation involving flying a drone from its Mongstad base to the Troll A platform in the North Sea. But the cherry on the transformative cake was the drone’s cargo, a 3D-printed ‘diesel nozzle holder’ for the platform’s lifeboat system! The part was out of stock, so it was re-designed and modelled in 3D before being printed in Inconel 718 on a metal 3D printer.
A Camcopter S-100 from Schiebel made the 80 km flight in around an hour at an altitude of approximately 5,000 feet. The 4-meter-long drone can carry a load of up to 50 kg. Flight operator was Nordic Unmanned (sic).
Equinor’s Arne Sigve Nylund sees a huge potential in drone technology ‘that could transform the way we operate, both under and above the sea surface. Drones could improve safety, boost production efficiency and contribute to lower CO2 emissions’. Our congratulations to Equinor’s PR folks for bringing us this spectacular digital transformation demonstration. All that is missing is a reference to blockchain!