Shell, Baker Hughes and Microsoft have launched the Open AI Energy Initiative (OAI), described as a ‘first-of-its-kind’ open ecosystem of artificial intelligence (AI)-based solutions for the energy and process industries. OAI will provide a framework for operators, service providers, OEMs and ISVs to offer interoperable AI and physics-based solutions for plant monitoring and optimization. OAI builds on the BHC3 AI Suite running in the Microsoft Azure cloud.
BHC3 is a joint venture between Baker Hughes and C3.ai, whose CEO Tom Siebel commented, ‘This initiative is about combining the efforts of global leaders to accelerate the digital transformation of the energy industry to new, safe, and secure energy and to ensure climate security’.
OAI has been seeded with reliability solutions developed in-house by Shell, covering predictive maintenance for control valves, rotating equipment and electrical submersible pumps. The Shell-developed modules will integrate with BHC3 applications including BHC3 Reliability, Production Optimization and Inventory Optimization.
Back in 2018, Shell selected C3 IoT with Microsoft Azure as its AI platform of choice for operations. In 2019, they were joined by Baker Hughes in the formation of an alliance to ‘accelerate the digital transformation of the energy industry’.
The latest addition to the BHC3 software lineup is PSO, a Production Schedule Optimization app, described as ‘an enterprise AI application for industrial demand planning and manufacturing production scheduling’. The downstream app has been trialed with a large hydrocarbon processor that, in a 16-week trial, ‘achieved a 20% improvement in demand forecasting accuracy by automatically generating optimal production schedules’.
We were curious to know if OAI had any relationship with the Shell-backed OSDU open subsurface data universe, especially since an OSDU scope-shift into the plant/process arena was mooted last year as the ‘Open Energy Data Platform’. We pinged Shell whose spokesperson replied, ‘There is no connection between OSDU Data Platform and the Open AI initiative; OSDU is the Open Energy Data Platform supporting all Energy Sources and Open AI is targeted at the developments of AI (such as predictive) applications for assets, etc.’
More on the Open AI Energy Initiative from BakerHughes/C3.
The enthusiasm surrounding the launch of an ‘operational’ Release 3 of The Open Group’s Open Subsurface Data Universe has clearly piqued the Society of Petroleum Engineers, which has hastily organized a virtual workshop spread over four half-day sessions. The workshop is to ‘serve as a platform for SPE to reflect on the role it should play to help and support open subsurface projects in the future’. OSDU itself is conspicuously absent from the workshop announcement and provisional agenda.
In a Damascene conversion, the SPE now sees open source data and ecosystems as ‘enabling a faster pace of innovation’. Some open source proselytizing appears to have been cut and pasted from the OSDU hymnal as in ‘today, subsurface modeling relies predominantly on proprietary solutions’, ‘data types and file formats greatly vary and change over time … infrastructure to manage data [is needed to benefit from] data and software exchange [and to] ensure and foster an environment that enables users to easily contribute’.
Notwithstanding the ‘me-too’ nature of the SPE’s initiative and the obvious FOMO*, the workshop is well structured and appears to address some of the more delicate facets of open source software in industry, notably project sustainability, legal aspects of open source development and licensing, and business models, even envisaging how a ‘transition to an open environment could be achieved, leading to a successful business environment for all stakeholders’.
The SPE workshop program is billed as a platform for SPE to reflect on the role it should play to help and support open subsurface projects in the future. If you want to take part in the ‘reflection’, registration for the workshop costs $495. We offered (for free!) the services of Oil IT Journal as scribe to the event but were told, ‘SPE workshops do not allow press reports. In order to stimulate frank discussion, no proceedings are published and members of the press are not invited to attend’. Well, we can swallow our pride regarding the ‘no press’ snub. But should an event run by a self-selected coterie with no published proceedings determine what role the SPE will play for its broader membership?
Curiously, despite the 199 corporate members of OSDU, few or none are cited as presenting at the SPE workshop. It looks like there is a fork in the oil and gas open source movement even before it has begun!
* Fear of missing out.
I’m not usually short of ideas for editorials; this is my 258th or thereabouts. I had a few of my usual rants about this and that in the pipeline before I realized that the best editorial fodder I could dream up comes from this action-packed edition of Oil IT Journal. There is just so much going on in oil and gas IT. Where to begin?
OSDU, the open subsurface data universe, has just released ‘Mercury’, its first ‘operational’ edition. That is great but what is spectacular is the amount of attention OSDU has gotten with plethoric announcements from software houses teaming on various OSDU-based platforms from the cloud providers. We have 78 mentions of OSDU in this issue.
A bird’s-eye view of what has happened is rather perplexing. Not long ago, a suggestion that the whole industry would ‘collaborate’ around Schlumberger’s Delfi data infrastructure would have met with some skepticism. One might have imagined that Halliburton, for instance, would push back. But no, Halliburton is offering DecisionSpace 365 ‘on OSDU’ and a ‘publicly available’ OSDU reference implementation in its Open Earth Community. Almost all the vendors we have come across have backed OSDU with unbridled enthusiasm. A feeding frenzy and a software supercycle? With regards to Schlumberger’s generous donation, it is more ‘don’t look a gift horse in the mouth’ than ‘beware the Greeks … ’!
One facet of OSDU which may or may not have come to the attention of the holders of the purse strings is the requirement for a massive data migration effort into the new cloud environment. Data under consideration includes … ‘ProSource, R5000/OpenWorks, OSIsoftPI, IHS and more’. This is conceivable for software in the geoscience area. But a mass migration of data out of OSIsoft (now Aveva) and/or IHS Markit is harder to imagine, as is the sheer size of the migration task in the context of a cash-strapped industry.
OSDU, like NASA’s Martian helicopter, has kicked up some dust on planet subsurface. Notably chez the Society of Petroleum Engineers, whose planned workshop on open source software is, at the time of writing, an OSDU-free zone. Shell’s own recently-announced ‘OpenAI’ collaboration with Baker Hughes/C3 is likewise OSDU-free. Another significant open source movement in geosciences, the Software Underground, is currently holding its annual Transform event. We understand that there has been a lot of discussion of OSDU but the outcome is, as far as we can tell, that the SU will proceed sans OSDU. The two bodies have very different approaches to ‘open’, with SU’s freewheeling hackathons and OSDU’s members-first approach.
We also report from the other The Open Group/Shell-backed open initiative, the Open Footprint Forum (OFF). The OFF has now teamed with the venerable PIDX oil country e-business standards body. At the TOG-hosted event, we learned that the OFF APIs will open up emissions data, but that companies can deploy their own implementations and can ‘hide their own data from competitors’. This may be problematic in the context of transparent open data.
It’s interesting to reflect on the two memes of ‘open data’ and ‘open standards’. Objectively, IT standards are orthogonal to actual reporting, although the two are often conflated. What’s key in reporting is a genuine need (due to regulations) or desire (due to self-motivated transparency) to report! It does not matter a jot whether this is XBRL, XML, JSON or what. A corporate logic that proceeds from a perceived obligation (as opposed to a wish) to report, through a few committees and on to an ‘IT standard’, is really just kicking the can down the road.
But the Great Conflation, as admirably exposed in last year’s EAGE, is that of just about any IT development with ‘green’. At a recent OSDU event, Reuters’ John Nixon doggedly questioned presenters with ‘and how is OSDU going to help with the energy transition?’ Answer was there none. Elsewhere, quantum computing is conflated with GHG mitigation. Heck, if the world is to wait for QC to mature enough to ‘solve’ global warming we are really in trouble. But the conflation is enough to allow QC aficionados to qualify for taxpayers’ ‘green’ money.
No doubt financial considerations are behind Total’s decision to dispose of its Alternative Subsurface Data facility, a test center in Pau, France for rock physics and logging tool calibration. Meanwhile, Total continues with its considerable investment in data science. Perhaps the connection is just a figment of my imagination. But it reminds me of the pride (?) that the seismic contractors took in going ‘asset light’ a couple of years ago. Good for the balance sheet. But ‘data science’ needs real data and, as it is now emerging, a helping hand from ‘real science’.
As folks in the OSDU constellation puzzle over what Schlumberger has to gain from opening up its data infrastructure, I draw your attention to a statement made by Schlumberger’s Steve Freeman at the 2020 EAGE who said, ‘If you need a head of IT then the service companies have failed you’. This does not need too much unpicking. Schlumberger wants to cut out the corporate IT middlemen. Software-as-a-service will be served to end geoscience users, armed with their Python notebooks for some added data futzing. SaaS all sounds very 1999/dot-com boomish, which got me rereading some back editions of Oil IT Journal, always a worthwhile exercise, though I say it myself. I found this evidence of ‘prior art’ from Schlumberger in our February 1997 issue …
As they say over here, ‘plus ça change, plus c’est la même chose*’.
* The more things change, the more they stay the same.
Speaking at The Open Group’s launch of the OSDU ‘Mercury’ edition (see also our last issue), Phillip Jong (OSDU & Shell) traced the evolution of OSDU from 2018, when Shell seeded the initiative with its in-house developed subsurface data universe (SDU). In 2019, Schlumberger threw OpenDES, an open source edition of its Delfi data infrastructure, into the mix. In the same year, cloud-based editions of the platform were announced for Azure, AWS, GCP and IBM. Since OSDU kicked off, some 200 ‘active’ software developers have spent a million plus minutes on Webex and technical subcommittees. The Open Group has contributed branding, support, recruiting and marketing, ‘all during COVID’.
James Moran from BP’s ‘Dataworx’ unit placed OSDU in the context of BP’s 40-year journey across different IT systems and ‘deep but somewhat siloed’ technologies. The last five years have seen a shift away from BP data centers with a move to ‘cloud first’ deployment. With OSDU, BP hopes to reduce deployment costs. BP has ‘10 to 30’ people working on its Azure system of record effort and the intent is to leverage OSDU here alongside a ‘twin cloud’ AWS component. OSDU is also germane to the ‘new BP’ and its vaunted transformation to net zero ‘by 2050 or sooner’. A mooted OSDU ‘R3++’ will embrace new energy, digital twin and emissions.
Shell’s Johan Krebbers presented the Mercury release, now supported in the AWS, Azure, Google and IBM clouds. OSDU R3 is secure and extensible, providing microservices for authentication and authorization – ‘who gets access to what’. Web domain APIs expose well data and seismics; Bluware’s OpenVDS is a core component. OSDU supports both platform-as-a-service and software-as-a-service deployment to suit large and small operators. The code is ‘all open source on GitLab’.
Mercury includes orchestration and workflow services that ingest data, extract metadata and tag data according to company policy and data residency requirements. Some 100 data types are ready to roll. Data ingestion implies a major data migration project for operators, moving data from the deep legacy silos through Apache Airflow directed acyclic graphs (DAGs). Also new in R3 are domain data management services (DDMS), a means of adding new data sources via a schema service API. Support for Schlumberger’s OpenZGY compressed seismic data format was also announced.
One of Krebbers’ slides showed a range of data sources (ProSource, R5000/OpenWorks, OSIsoftPI, IHS …) and an ingestion pipeline. We asked if the intent was for a big, one-off data migration from these major industry data sources, which would then be retired. Krebbers replied ‘These are now legacy. All OSDU data sits on top of cloud services. Migration from ProSource is envisaged, OpenWorks is a very different story*. Ideally, this would be a onetime migration to an OSDU SOR in the cloud. But this will take time, and will likely involve a hybrid environment’. The ‘daunting’ migration problem ‘will be solved by the power of the Forum, which will provide these services’.
Krebbers then fleshed out the OSDU roadmap for the next couple of years. An R3+ release sees the universe expanding into drilling and production. R3++ (planned for 2022) covers further expansion to real time data across solar, windfarm, hydrogen, CCUS and geothermal domains in what will become the ‘Open Energy Platform’ (you read it first in Oil IT Journal!). OSDU is also to collaborate, notably with the IOGP on engineering digital twins. OSDU is also to be an ‘optimized data platform for AI work**’.
The R3 ‘operational’ release of OSDU has sparked off a host of offerings centered on the open platform. Many application vendors presented briefly during the launch event. Others participated in the Amazon and IBM cloud vendor OSDU platforms. There are enough vendor announcements to fill a whole issue of Oil IT Journal so we will be brief. Twenty plus partners and customers participated in the launch of Amazon’s OSDU platform in a Harts Energy webinar. IBM has also launched an OSDU platform leveraging its Cloud Pak for Data Industries and Red Hat’s OpenShift containers. Microsoft’s OSDU offerings transit through partners Schlumberger and, as we announced in our last issue, Cognite. At a Reuters event, Schlumberger explained however that alongside its Microsoft Azure preference ‘for public cloud work’, it was collaborating with IBM/Red Hat ‘for private clouds’.
* But see the recent LinkedIn post announcing DecisionSpace365 as running on OSDU.
** However, not all of Shell’s AI enthusiasts have opted for the OSDU route. See the article on the Shell/BakerHughes/C3 OpenAI initiative elsewhere in this issue.
The first International Rock Imaging Summit (IRIS) was held as a virtual event in November 2020. Digital rock imaging involves 3D computer tomography scans of cores, plugs or cuttings that are used in sedimentological analysis and to support petroleum engineering studies. Image segmentation allows for the identification of grains and pores. Other techniques such as scanning electron microscopy are used to identify individual mineral grains. These measurements are also used as input for machine learning-based techniques to automate interpretation.
Carlos Alberto Santos Molina of Repsol Technology Lab showed how reservoir properties are obtained from different types of rock sample. Digital rock physics can be performed on ‘cheap and ubiquitous’ cuttings, providing less spatially-biased sampling across the whole reservoir than a single core. Multi-physics imaging across various measurement types is analyzed with PCA* and clustering. The ‘cheap and powerful’ approach has some drawbacks, notably cuttings quality. ‘Rock flour won’t work!’
* Principal component analysis.
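For the curious, the kind of multi-physics analysis Santos Molina described can be sketched in a few lines of scikit-learn. Everything below is our own illustration, not Repsol’s workflow: the measurement columns, component and cluster counts are all invented, and random numbers stand in for real lab data.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

# Hypothetical per-cuttings measurement table: one row per sample,
# columns are multi-physics attributes (density, porosity, XRF, NMR ...)
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 8))  # stand-in for real measurements

# Standardize, reduce dimensionality with PCA, cluster into rock types
Xs = StandardScaler().fit_transform(X)
pca = PCA(n_components=3)
scores = pca.fit_transform(Xs)
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(scores)

print("explained variance:", pca.explained_variance_ratio_.round(2))
print("samples per cluster:", np.bincount(labels))
```

The clusters would then be inspected against depth and lithology to decide whether they correspond to meaningful rock types.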
Leonardo Salazar presented Thermo Fisher Scientific’s (TFS) new ‘MAPS’ mineralogy software that provides ‘multi-scale and multi-modal’ data insights. New ‘Mixel’ smart X-EDS* aids mineral identification. MAPS produces detailed petrology statistics, elemental assay and particle size. Combined with the Apero scanning electron microscope, the solution is claimed to be the ‘platform of the future’ for automated mineralogy.
* X-ray energy dispersive spectrum analysis.
Andrew Fogden (Wintershall-DEA) provided a comprehensive review of various enhanced oil recovery (EOR) applications. These combine the results of digital core analysis with lab analysis of wettability and other characteristics required for two-phase flow studies. Fogden showed how EOR assessments have been made for different reservoirs: low salinity tertiary waterfloods, carbonates and gas condensate in sandstone. In the latter, digital analyses of fragmentary cuttings and sidewall cores were upscaled for use in the reservoir fluid flow model to evaluate the risk of condensate banking. The outcome was a go-ahead decision on the field’s development.
Thermo Fisher Scientific’s Gwenolé Tallec showed how dual energy computed tomography (DECT) can provide interactive visualization of an entire well. DECT analysis is provided in TFS’ PerGeos digital rock analysis toolset. Voxel by voxel density and atomic number measures provide a rock-typed image of a complete core. This can be used for plug-site selection for SCAL*. A whole core CT-derived ‘heterogeneity log’ can be correlated with LWD measurements. Other functionality includes automatic heterogeneity, dip and strike logs. A supervised machine learning approach is used for facies analysis. The approach has been presented in SPE Paper SPE-197628-MS, ‘Improved reservoir characterization through rapid visualization and analysis of multiscale image data using a digital core analysis ecosystem’, co-authored with Saudi Arabia’s KAUST R&D organization.
* Special core analysis.
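As an illustration only (TFS does not document its algorithm here), a dual-energy decomposition can be sketched as a per-voxel linear inversion: attenuation measured at two tube energies is solved for density and effective atomic number, with coefficients calibrated on reference standards. The linear model and all the numbers below are our own assumptions.

```python
import numpy as np

# Assumed linear forward model per voxel:
#   mu_low  = a1*rho + b1*Zeff
#   mu_high = a2*rho + b2*Zeff
# Coefficients would in practice be fitted from scans of calibration standards.
A = np.array([[0.20, 0.015],   # a1, b1 (low energy)
              [0.17, 0.004]])  # a2, b2 (high energy)

# Stand-in dual-energy scans (two flattened voxel arrays)
rng = np.random.default_rng(1)
mu = rng.uniform(0.3, 0.9, size=(2, 1_000_000))

# Invert the 2x2 system for every voxel at once
rho, zeff = np.linalg.solve(A, mu)
print(rho.mean(), zeff.mean())
```

Cross-plotting the resulting density and Zeff volumes is what yields the ‘rock-typed image’ of the whole core.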
Christian Hinz of Math2Market showed the impact of different segmentation methods on digital rock analysis results, including the trainable, deep learning-based segmentation in M2M’s GeoDict. Digital rock physics (DRP) is used to predict electrical, flow and mechanical properties, and good image segmentation is the key to accuracy. GeoDict 2021 offers a variety of segmentation methods including deep learning, along with modules for saturation and flow.
Federico Gamba (also with Thermo Fisher Scientific) provided more examples of PerGeos’ supervised ML, applied to facies detection over the whole well. PerGeos works across cores, cuttings, thin sections and slabs. These ‘ground truth’ data sources are merged into a database and used to train a rock model and automate facies analysis. Predicted facies compare well with lab measurements. PerGeos is a complete Python deep learning environment built on standard Python packages such as NumPy, Scikit-Learn and TensorFlow. More on artificial intelligence tools for Amira-Avizo Software and PerGeos Software.
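To give a flavor of the supervised facies workflow, here is a minimal scikit-learn fragment. The feature set and labels are hypothetical stand-ins for the PerGeos ‘ground truth’ database, not TFS code.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

# Hypothetical training table: image-derived features per depth interval
# (porosity, grain size, texture statistics ...) and core-described facies
rng = np.random.default_rng(2)
X = rng.normal(size=(2000, 6))
y = rng.integers(0, 4, size=2000)  # four facies classes (dummy labels)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te)))
```

The trained classifier is then run over the whole well to predict facies where no core description exists.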
Jun Luo (iRock Technologies) presented on image segmentation with a U-Net* deep neural network. U-Net image segmentation separates constituent parts such as matrix, pores and pyrite across a range of CT scans of different lithologies. With sufficient training datasets, U-Net compares well with human-interpreted results. iRock Technologies uses industry-standard analytical tools and its own ‘RockDNA’ pore network modeling technology. More from IRT.
* A convolutional neural network originally developed for medical images.
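For readers unfamiliar with the architecture, a deliberately small Keras U-Net for three-class CT segmentation (say matrix, pore, pyrite) looks like the following. This is a generic textbook sketch, not iRock’s network; depth, filter counts and input size are arbitrary.

```python
import tensorflow as tf
from tensorflow.keras import layers, Model

def conv_block(x, filters):
    # Two 3x3 convolutions, the basic U-Net building block
    x = layers.Conv2D(filters, 3, padding="same", activation="relu")(x)
    x = layers.Conv2D(filters, 3, padding="same", activation="relu")(x)
    return x

def build_unet(input_shape=(256, 256, 1), n_classes=3):
    inputs = layers.Input(shape=input_shape)
    # Encoder: downsample while doubling filters
    c1 = conv_block(inputs, 16); p1 = layers.MaxPooling2D()(c1)
    c2 = conv_block(p1, 32);     p2 = layers.MaxPooling2D()(c2)
    # Bridge
    b = conv_block(p2, 64)
    # Decoder: upsample and concatenate the skip connections
    u2 = layers.UpSampling2D()(b)
    c3 = conv_block(layers.Concatenate()([u2, c2]), 32)
    u1 = layers.UpSampling2D()(c3)
    c4 = conv_block(layers.Concatenate()([u1, c1]), 16)
    # Per-pixel class probabilities
    outputs = layers.Conv2D(n_classes, 1, activation="softmax")(c4)
    return Model(inputs, outputs)

model = build_unet()
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.summary()
```

The skip connections are what let the network combine coarse context with fine pore-scale detail, which is why U-Net has become the default for this kind of work.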
Using data from the University of Texas at Austin’s Digital Rocks Portal, Matthew Andrew (Zeiss) showed how multivariate statistical regression and reinforcement learning can be used to predict permeability from core images. The approach has been published in the open access E3S Web of Conferences.
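A much simpler stand-in for the regression half of the Zeiss approach is an ordinary log-linear fit from image-derived descriptors such as porosity and specific surface, in the spirit of Kozeny-Carman. The features and data below are invented for illustration.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Synthetic image-derived descriptors: porosity (phi), specific surface (s)
rng = np.random.default_rng(3)
phi = rng.uniform(0.05, 0.30, 300)
s = rng.uniform(1e4, 1e5, 300)  # 1/m

# Kozeny-Carman-style synthetic 'truth' with lognormal noise:
#   k ~ phi^3 / (c * s^2 * (1 - phi)^2)
k = phi**3 / (5.0 * s**2 * (1 - phi)**2) * rng.lognormal(0, 0.2, 300)

# Regress log-permeability on log-features
X = np.column_stack([np.log(phi), np.log(s), np.log(1 - phi)])
reg = LinearRegression().fit(X, np.log(k))
print("R^2:", round(reg.score(X, np.log(k)), 3), "coefs:", reg.coef_.round(2))
```

Working in log space keeps the fit linear even though permeability spans orders of magnitude.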
Mohamed Soufiane Jouini of Khalifa University has used a 3D printer to ‘validate results’ obtained from a machine learning-based analysis of CT core scans. A 3D-Systems’ ProJet MJP 3600 with a 30 micron resolution printed virtual copies of the cores from the scanned/segmented data. Porosity measurements on the printed cores compared well with the real thing. Permeability less so.
We asked TFS for more on the MAPS/PerGeos distinction and learned that ‘Maps software is embedded with many TFS microscopes and is used mainly for 2D data acquisition. For more sophisticated 3D and image processing, PerGeos adds 2D/3D multi-scale multi-modality image analysis’. More from Thermo Fisher Scientific.
The IRIS virtual event was hosted by EXPROBIZ FZE of Ajman, UAE. Rock Imaging 2021 is scheduled to be held in November.
Speaking at the 2021 EU Upstream Digital Transformation virtual event, Philip Neri and Dave Wallis presented Energistics’ standards portfolio for Industry 4.0. Today’s multidisciplinary geoscience specialists like to use best-in-class technology from different vendors, possibly augmented with in-house R&D. The current complex, ever-evolving data ecosystem can be an obstacle. Energistics focuses on the unambiguous transfer of grids, well and seismic data, along with reliably conserved units of space and time. EPC, the Energistics packaging convention, allows for quality-assured transfer of files, data schema and metadata along with a UID for each object. The application-agnostic format lies under Energistics’ flagship protocols WITSML, RESQML and PRODML.
EPC is built on the open packaging conventions (ex Microsoft) with a zipped XML file for small data objects plus an added HDF5 binary file for vectors and arrays. EPC can also be used as an application-neutral archival format. The addition of Energistics transfer protocol (ETP), a web socket TCP/IP stack, allows for the discovery and connection of data objects without intermediate files. This is said to be the preferred option for transfer between apps on the same machine.
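To make the packaging concrete: an .epc file is an OPC-style zip of XML parts, with the big arrays parked in a companion HDF5 file. A few lines of Python suffice to peek inside; the file names here are hypothetical.

```python
import zipfile
import h5py

EPC_FILE = "model.epc"  # hypothetical EPC package
H5_FILE = "model.h5"    # companion HDF5 array store

# List the XML 'parts' inside the zipped EPC package
with zipfile.ZipFile(EPC_FILE) as epc:
    for name in epc.namelist():
        if name.endswith(".xml"):
            print(name)  # RESQML data-object parts plus OPC metadata

# Large arrays (seismic traces, grid geometry) live in the HDF5 file,
# referenced from the XML parts by dataset path
with h5py.File(H5_FILE, "r") as h5:
    h5.visit(print)  # walk the dataset names
```

A real reader would of course parse the XML to resolve which HDF5 datasets belong to which data object, but the two-file anatomy is the essence of EPC.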
In the Q&A, Wallis described OSDU as ‘a fast moving, growing association’. Energistics was the first non-oil company member. Energistics will be a component of the infrastructure for managing data transfer to and from OSDU. Following the event, we had a short email exchange with Energistics resulting in the following Q&A.
Oil ITJ – We assume that ETP is still mainstream for Energistics. Have you any thoughts on how ETP may be leveraged in or alongside OSDU? Or will OSDU make ETP redundant?
Neri - ETP is and will remain mainstream for Energistics as the transfer mechanism that supports the transfer of all Energistics data standards. ETP is also being considered by the OSDU community to support various use cases. We regretfully cannot share details as The Open Group rule is that non-published standards are privileged to members only. Whatever the outcome with respect to OSDU, it would not make ETP redundant since not all data transfers involve cloud systems, notably in the real-time drilling and production domains. ETP can also be used for data streaming between applications rather than file-based processes.
Oil ITJ - Do you have any more on a favorite topic of ours, data validation? It says on the ETP home page that ‘Our use of XML allows us to leverage the schemas for self-validation’. Can you elaborate on this? Are there test suites for ETP developers?
Neri - My use of the word ‘validation’ may or may not align exactly with yours, so to be clear: XML offers the possibility of a data schema that allows developers and data recipients to check formats that they are receiving for compliance. This capability is not as yet widely available for, e.g., JSON. Our use of the word validation is not related to the actual validation of incoming data in terms of quality assurance of the data itself.
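Neri’s distinction is easy to demonstrate. The snippet below (file names hypothetical) checks an incoming XML document against an XSD using lxml: a pass means schema compliance only, and says nothing about whether the data values themselves are any good.

```python
from lxml import etree

# Hypothetical files: the governing XSD and an incoming document
schema = etree.XMLSchema(etree.parse("witsml_schema.xsd"))
doc = etree.parse("incoming_log.xml")

if schema.validate(doc):
    print("structurally valid against the schema")
else:
    # Report each structural violation with its line number
    for err in schema.error_log:
        print(err.line, err.message)
```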
Oil ITJ - The developer resources page refers to ETP 1.1 as the latest. Elsewhere I see that there is a 1.2 RC. Any ideas when this will be finalized? Any summary of what’s key/new in 1.2?
Neri - ETP v1.2 has passed its final Release Candidate cycle as of February 15th, and it will be formally published as soon as comments raised during the review period have been addressed. We are targeting publication for May 2021. Our website clearly states the current active version for each standard, versus information describing on-going developments of upcoming versions. For clarity, the User & Developer page lists the latest published version of our standards, which for ETP is v1.1. When ETP v1.2 is published, the page will be updated. For developers, the ETP DevKit makes it relatively easy to build software based on the standard. The ETP DevKit was developed by Petrotechnical Data Systems (PDS Group) and contributed to Energistics. The DevKit has been updated for v1.2 and is available to those interested in developing and testing v1.2 implementations.
Watch Neri’s talk on YouTube and visit the Offshore Networks conference home page.
Cognite is to integrate the Cognite Data Fusion platform with The Open Group’s OSDU data platform and align with the OSDU technical standard ‘when it is published’. More from Cognite.
The 12.1 release of Emerson’s Roxar RMS allows for the use of an external orchestrator to control ‘big loop’ workflows. The Python API adds access to RMS project data and jobs. An updated RESQML data transfer tool supports app-independent file transfer and data exchange with an Energistics ETP server such as Emerson Epos. Emerson Tempest 2021.1 has also been released with improved Python support for external packages running in a virtual environment. A new Tempest Analytics license option supports import of third-party optimization and history matching projects. More from Emerson/Paradigm.
The 21.1 release of Rock Flow Dynamics’ tNavigator includes updates and improvements to the simulator kernel, assisted history match and uncertainty module, geology and model designer, PVT designer and the well and network designer.
Ikon Science has released RokDoc and iPoint V2021.2. The RokDoc structurally-oriented filtering function has been improved with the use of multi-processes, decreased project opening times and a better-performing full waveform synthetic function. Arithmetic operations can now be applied to pre-stack seismic data and the 3D Bayesian classifier now integrates per-zone prior probabilities. iPoint/RokDoc compatibility has been enhanced through front-end schema information visibility. iPointWeb’s Google Query Language support has been improved.
PHDwin V3 Build 3.1.7 (registration required) has been released with a new nested graphs feature. This feature allows users to run economics reports with selected graphs automatically collated by case. The release also improves CSV reporting.
Open Subsurface Data Universe (OSDU) workflows established by Quorum acquisitions Aucerna and EnergyIQ will continue to be supported by Quorum. Quorum has worked with Amazon to create a staging platform where OSDU members can test implementations against a pre-populated, OSDU-compliant sample North Sea data set. More from the EnergyIQ datasheet.
Kongsberg Maritime, COSL Drilling Europe and drilling equipment provider NOV have cooperated on an ‘Energy Control’ project to reduce greenhouse gas emissions and lower fuel and maintenance overheads on COSL’s rigs. The solution combines Kongsberg’s energy management systems with NOV’s research into energy optimization, based on an analysis of historical data on power consumption across COSL’s semi-submersible fleet.
Pegasus Vertex Inc.’s CemPro+ V5.4.0 offers an improved model of in-pipe displacement efficiency that caters for different flow regimes including flow segregation and instabilities. The improved version also updates the foam model to account for changes during shut-in or on closing the annulus.
Amazon has announced the general availability of the AWS Production Monitoring & Surveillance solution, developed in collaboration with BP and Embassy of Things. PM&S combines edge software with a cloud-based data historian. The solution is said to ‘liberate’ data from legacy decentralized and local historian instances. The cloud native historian offers flexibility in the choice of visualization and analytical workloads compared to ‘incumbent single vendor lock-in’. The solution is supported by deployment partners, Accenture, Infosys, TEKsystems, Umbrage, TensorIoT, Wipro Limited and AWS Professional Services. More from Amazon.
Seeq has rolled out Seeq Enterprise and Team editions running on either AWS or Microsoft Azure. The execution engine Seeq Cortex (previously Seeq Server) provides multi-source and datatype connectivity, security, scalability and more. The Enterprise edition includes Jupyter Notebook interfaces to a library of open source modules and algorithms, and audit trail support for users in regulated industries. Seeq integrates with business intelligence and process applications such as Tableau, PowerBI, Spotfire and OSIsoft PI Vision. A REST API and software development kits for Java, C# and Python are available under license. More from Seeq.
ABB’s Flow-X gas and liquid flow computer has obtained C1D2/Zone 2 Certification for accurate measurement in mid and downstream oil and gas. The unit meets Class 1 Division 2 and ATEX/IECEx Zone 2 certification, with an operating temperature range from -40°C to +75°C. A 0.008% accuracy is claimed for the programmable unit’s analog inputs.
CEA Systems has released version 11.2 of its Plant4D. The Component Builder and CAD content creation functions have been redeveloped, improving stability and performance. A more consistent look and feel improves usability. The release adds support for AutoCAD 2021.
Emerson has launched Rosemount TankMaster Mobile, a cross-platform inventory management application for tank gauging systems. Also new is the integration of corrosion and erosion monitoring with the new Rosemount 4390 series of corrosion and erosion wireless transmitters connected to the Plantweb Insight Non-Intrusive Corrosion application.
Emerson has launched a Rapid Adaptive Measurement solution for process automation leveraging the Roxar 2600 Multiphase Flow Meter (MPFM) for the oil and gas industry.
Lloyd’s Register has added a new module to its AllAssets performance and risk management platform. The new ‘events management’ module simplifies the resolution of non-conformance reports from offshore, LNG, chemical and other industrial applications.
Phase One has announced the ‘P3 Payload’ for fast, efficient, and safe inspection of critical infrastructure. The combo consists of an M300 drone equipped with a Phase One iXM (100 or 50 megapixel) camera, one of the RSM lens options, and a new gimbal mount with integrated rangefinder. More on the P3 Payload turnkey solution from Phase One.
New MobileTrack technology from Total Valve Systems displays a valve’s service history by scanning a QR code tag. The Total Valve Live product adds real-time monitoring and reliability reports on every valve that Total Valve sells or services.
FDT Group has announced Mobile FieldMate for field device management, a PC/tablet-based configuration tool that performs tasks including initial setup, daily maintenance, troubleshooting, and configuration backup for device replacement. FieldMate incorporates the open FDT technology standard and is compliant with FDT/DTMs that use either the FDT 1.2 or 2.0 standards.
Emergency pipeline repair specialist IRM Systems (IRMS) has announced the Pipeline Integrity & Budget Optimization Tool (PIBOT), a new software tool to optimize pipeline integrity engineering and management. PIBOT provides data management, integration, and smart data analytics. North Sea operator Petrogas uses the tool in its risk-based inspection program. The system provides a risk matrix across an asset portfolio along with an Esri ArcGIS map of data and activity. PIBOT’s report generation feature has been used to report to Staatstoezicht op De Mijnen, the Netherlands regulator.
Maloney Technical Products, in conjunction with Control Devices, has rolled-out (!) the patent-pending Maloney Smart Sphere, a real-time trackable pigging system. MSS is used to locate damaged pipe, to map old pipelines and to measure flow velocity. More from Maloney.
Dover Fueling Solutions has announced the DFS DX connected platform for the global fueling and retail industry. The cloud-based DFS DX addresses wetstock (fuel) management, remote asset monitoring, targeted advertising and media at the dispenser, fleet fueling site management and point of sale management. DFS DX runs in the Microsoft Azure IoT cloud. More from Dover.
A new app from Fuel Me, billed as the ‘Door Dash of diesel’ and available from the Android and iOS app stores, allows customers in the commercial transportation and construction industries to purchase fuel and receive emergency roadside assistance services with the click of a button. Fuel Me simplifies fuel procurement by allowing users to manage purchases on a single platform, optimizing operations and administration.
OneStopSystems’ 4U Pro is a high-performance AI Datacenter module as deployed in the GAS-R ruggedized ‘Datacenter in the Sky’ for ‘AI at the edge’. The appliance supports up to 8 NVIDIA A100 PCIe cards with four PCIe Gen 4 x16 HBA/NIC slots for up to 256GB/s of sustained data throughput. Alternatively, the 4U Pro can be configured to provide 16 single-width PCIe Gen 4 x8 slots for FPGA data ingestion, or the latest storage add-in cards.
NEC’s SX-Aurora TSUBASA, the latest in the SX vector computer series, targets high-performance computing usage such as seismic imaging. The system comes with a Linux operating system and high-performance Fortran, C and C++ compilers that ‘remove the need for special programming languages like CUDA’. The machine includes an automatic vectorization compiler and InfiniBand MPI interconnect. Aurora’s technology powers Japan’s 17 petaflop Jamstec Earth Simulator.
DNV’s new additive manufacturing (3D printing) ‘service specification’, DNVGL-SE-0568 will be the foundation of DNV certification of facilities, manufacturers, processes and 3D printed parts. The new standard extends DNVGL-ST-B203, a standard for 3D printing in the oil and gas, energy and heavy industry sectors.
Xergy has launched Proteus, a digital platform for remote project team collaboration in the oil and gas gig economy. The solution combines a ‘world class’ cloud-based ERP system with on-demand access to rated talent and ‘pay as you go’ access to major engineering software. A marketplace of rated, skilled professionals helps companies find the right people for projects as and when they are needed. Companies post projects, locate freelancers and pay invoices through a single interface and wallet system.
The EAGE forum ‘Brains, Machines & Rocks: Assessing the Digital Revolution’ heard from Gabriel Guerra (Shell), Maitri Erwin (Microsoft Azure for Energy) and Steve Freeman (Schlumberger) with EAGE First Break editor Andrew McBarnet in the chair. To summarize the debate, all were agreed that the cloud and artificial intelligence were ‘big’, and that the future will be ‘open’, with reference to Shell’s OSDU push and Schlumberger’s contribution thereto. McBarnet suggested that geoscientists were skeptical, considering machine learning as ‘just another tool’. All immediately agreed… ‘ML is good for specific simple tasks, you still need a person in the loop’ … ‘explorationists think in broader and different ways’ … ‘in highly faulted deep thrust [belts we] will still need a human’. The quality of the data underpinning AI is critical, this is a ‘huge issue’ … ‘a fundamental problem’ … ‘the rationale for OSDU’ … ‘ML has a huge role, but we need to get beyond the mediaeval stage of data management’. ‘This is a huge task where companies are struggling – especially in the current cost cutting environment’.
The discussion turned to the required skill set for the new workforce. For Erwin, universities still offer ‘siloed, over-specialized’ approaches to education. Industry does not necessarily need ‘fully-fledged’ data scientists, but some coding is needed*, along with a ‘passion for data science’. For Guerra, there is still a lot of resistance to the new data-driven technology even though this is changing. ‘Using a pen and pencil to interpret seismic is not going to last very long’. For Schlumberger’s Freeman, the real resource is people. ‘You can’t just buy data science expertise. It’s much better to provide petrotechs with new skills. If you need a head of IT then the service companies have failed you’.
The Zoom format meant that some of the interesting questions and comments were asked but not answered. Inter alia …
‘We are pressurized by the IT industry to buy, buy, buy. The IT people do not see how much science needs to be migrated into the new systems’.
‘What are the benefits of AI for seismic processing? Project turnover time? Noise reduction? Better imaging?’
‘How does having a big data mess in the cloud solve anything?’
‘How will digitization help with the energy transition?’
‘How should a medium-size company get into the digital transformation space? What are the fundamental questions one should ask before thinking about or deciding on a digital transformation?’
The EAGE Forum Session titled ‘Energy Transition: How Fast Really?’ purported to explore the reality of the energy transition and what it means for the geoscience community. Host Andrew McBarnet introduced panelists Bob Fryklund (IHS Markit), Philip Ringrose (Equinor) and Iain Stewart (Plymouth University).
Ringrose observed that many oil and gas companies are reporting carbon reduction targets ... but are they real? Businesses have to reposture (sic). There is now a perception that there is money to be made in low carbon energy systems.
Fryklund added that in the past, the important thing was to please shareholders financially. Now the license to operate is critical and has changed the dialog.
Stewart thought the repositioning was unconvincing and could not see an ‘oil and gas’ company surviving as such.
Ringrose acknowledged that 10 years ago, ‘oils could be accused of greenwashing’. Things are now changing with the move into renewables.
Fryklund sees industry fragmentation with a splinter group of large Europeans that are transitioning to become ‘energy’ companies. Other smaller companies think that they won’t be able to compete with the utilities. ‘What we are good at is producing oil and gas’.
McBarnet asked if carbon capture would get the job done.
Ringrose hopes that we will do CCS in reservoirs and rock formations, building a ‘new, exciting hybrid industry’.
Stewart agreed, the Paris agreement requires negative emissions/CCS. Biotech, AI and quantum computing will all help.
Fryklund was skeptical that folks looking to do CCS would hire an ex-oil geoscientist!
Given the interest in AI and ML in seismics, we dipped into the papers presented in the special session on machine learning. Topics included ‘Reconstructing missing seismic data through deep learning with recurrent inference machines’ (Ivan Vasconcelos, Utrecht University), ‘Source de-ghosting of coarsely-sampled data using a machine-learning approach’ (Jan-Willem Vrolijk, Delft University of Technology), ‘Automatic parameter selection using k-fold test’ (Nihed El Allouche, Schlumberger) and others in a similarly specialist vein. We have yet to read a truly killer application of AI in oil and gas. Perhaps the operators are keeping these to themselves.
Read the EAGE 2020 abstracts on EarthDoc.
* Just a little anecdote here. To our personal knowledge, ‘coding’ has been a part of the geophysical curriculum in the UK for at least 50 years.
Karen Golz (an EY retiree) is now an AspenTech board member.
SEG and AAPG have announced the merger of AAPG 2021 ACE and SEG 2021 Events, the first of which will be held from 26 September to 1 October 2021, at the Colorado Convention Center in Denver, Colorado. Visit the event here (AAPG) or here (SEG).
Brian Petko is now BCCK’s senior VP of engineering, replacing retiree Tony Canfield.
Alexander Buehler has joined the Brock Group as President and CEO. He hails from Intertek.
Jurgen Delfos is the new CEO at CEA Systems.
Andrew Taylor, NERA’s GM decommissioning, will lead the newly launched Centre of Decommissioning Australia (CODA).
Thomas Maurisse (Total) is now a Clean Energy board member, replacing Philippe Montantême.
Matt Fox is to retire as ConocoPhillips EVP and COO after a 35-year career with the company.
Chris Roberts is now Cynet Chief Security Strategist and is to launch a CISO Community and CISO Challenge for 2021.
Amy Chronis is now vice chairman of Deloitte LLP and leader of its oil, gas and chemicals sector within the U.S. energy, resources and industrials industry. She succeeds retiree Duane Dickson.
Digital Intelligence Systems (DISYS) has received the Capability Maturity Model Integration (CMMI) Level 5 SVC 2.0 certification.
Mark Hess is to succeed William Coskey as CEO at ENGlobal. Coskey remains as chairman.
Dean Cubley is interim CEO and Chairman at ERF Wireless following the death of its former CEO and Chairman John Barnett. Cubley, the founder of ERF Wireless, served as CEO and Chairman for over 10 years before retiring in 2017.
Neil Duffin is to retire as ExxonMobil President. Jon Gibbs (currently SVP) has been appointed as his successor. Michael Angelakis (Atairos) and Jeffrey Ubben (Inclusive Capital Partners) have joined the ExxonMobil board of directors.
Barbara Geelen is now a Fugro board member and CFO succeeding Paul Verhagen, who is stepping down to become CFO at ASM International. Geelen hails from HES International.
Suba Rohrman has been promoted to VP Asia-Pacific following the FutureOn and Tracy Energy partnership agreement.
Florence Lambert is the newly appointed CEO of Genvia, a clean hydrogen production technology venture founded by Schlumberger, France’s CEA, VINCI Construction, Vicat and AREC (l’Agence Régionale de l’Energie et du Climat).
Brad Page is to step down as CEO of the Global CCS Institute. Russell Reynolds is to lead the search for a replacement.
Bob Patel (LyondellBasell) is now a Halliburton board member.
Halliburton Labs has opened a second application round for early-stage clean energy companies interested in joining its accelerator program.
Erik Josefsson is CEO of Hexagon’s new R-evolution unit. He was previously with Ericsson.
Former Chief Data Officer at Shell, Sushma Bhan is now an Ikon Science board member.
Tomomi Kaneko has been promoted to General Manager, Drilling Unit Domestic Exploration & Production Division at INPEX.
Kimberly McHugh is now a member of the IOGP Management Committee, representing Chevron. She succeeds Craig May, who retired at the end of January 2021.
Kosmos Energy has promoted Tim Nicholson to SVP and Head of Exploration, and John Shinol to SVP and Chief Geoscientist.
Bob Sternfels is the new global managing partner of McKinsey & Company.
Jay Collins is now Chairman at Oceaneering, succeeding retiree John Huff. Huff will continue as Chairman Emeritus for a transitional period.
Mike Kirkwood (T.D. Williamson) is now president of the Pigging Products and Services Association (PPSA) board of directors.
Kelly Gibson heads up the newly created SEC Climate and ESG Task Force in its Division of Enforcement.
Shell has presented an Energy Transition Strategy to its shareholders for an advisory vote at the company’s AGM on May 18 2021.
Andrew Mackenzie will succeed Chad Holliday as chair of Shell.
Ashby Pettigrew has been promoted to president of Stratagraph.
Jon Porter is to head-up VELO3D’s commercial operation in Europe.
Peter Coleman is to retire as Woodside CEO after ten years in the role. Meg O’Neill (current EVP of Development and Marketing) is acting CEO.
Pablo González is the new president at YPF succeeding Guillermo Nielsen.
Deaths:
Bechtel has reported the death of Stephen Bechtel Jr. at 95, retired Chairman and CEO of the company.
The American Petroleum Institute (API) and the African Energy Chamber (AEC) have signed a Memorandum of Understanding (MOU) to collaborate on capacity building initiatives and standardization to enhance safety, environmental protection and sustainability in African countries producing natural gas and oil. The MOU will facilitate the development of training programs and seminars, sharing of HSE best practices and conference organization.
Brady Corporation now offers pipe markers compliant with ISO 20560, a new standard for the identification of hidden and often hazardous pipe contents in factories and facilities. More from Brady.
CEN, the EU Committee for Standardization has announced the CEN-CENELEC Sector Forum on Energy Management/Transition (SFEM), an advisory and coordination body for policy and strategic matters in relation to the standardization of energy management and efficiency. The SFEM is to ‘anticipate future standardization developments and map the need for legislative improvements’, and support innovation by financing and de-risking the tools that contribute to energy efficiency. To kick things off, the SFEM is launching a work group ‘dedicated to blockchain’ to explore a ‘framework for the internet’.
We have pinged CEN with a pointer to Neil McNaughton’s 2018 analysis of blockchain and his recent letter to the Financial Times on the subject.
Under the leadership of NEN, the Dutch national standardization body, CEN/TC 19 and the EU Oil Spill Identification Network have announced a new EU standard for oil pollution characterization. The standard can be used to prosecute violators of environmental legislation for the discharge of oil. More on the upcoming EN 15522, ‘Oil spill identification – Petroleum and petroleum related products’, from NEN.
Energistics has issued an update to its Practical Well Log Standard. PWLS V3.0 results from numerous contributions and a review process involving experts from nine companies. PWLS is a free, central repository of well log curve names. There are now some 50,000 acquisition curve types ‘of which about 1,000 deliver the majority of the value’. Energistics CEO Ross Philo said, ‘As the upstream industry accelerates its digital transformation initiatives, vendor-neutral reference material such as PWLS 3.0 helps remove ambiguities from data, which greatly facilitates the automation of data management tasks ahead of analytics and AI-driven activities’. Download the Energistics standards and documentation here.
The Sustainability Accounting Standards Board (SASB) is seeking comment on a new Environmental, Social, and Governance (ESG) taxonomy. The XBRL taxonomy covers all 77 of the SASB’s standards. It was developed by SASB in conjunction with PwC, and tested using Workiva’s WDesk platform.
The XBRL standards body is pushing for the UK Government to leverage its ESG standards in its Streamlined Energy and Carbon Reporting (SECR) legislation that now requires all large UK companies to report on their annual energy use, greenhouse gas emissions, and energy efficiency measures.
The IOGP’s Digitalization and Information Standards Subcommittee (DISC), now chaired by Emile Coetzer (Chevron), has published DSDA, a data standards domain analysis. Some 100 information standards have been reviewed to identify ‘opportunities’ in process safety, asset integrity and information management and IT. Other work covers product lifecycle management and a global equipment hub taskforce to explore options for a cloud-based repository of standard vendor documents and data. An IOGP industry digitalization roadmap taskforce concluded that ‘data foundation standards are needed and that data applications should be separated via an API layer’. The mooted data foundation standards may be coming from a proposed ‘trilateral alliance’ between IOGP, OSDU (Open Subsurface Data Universe), and the blockchain-boosters of the World Economic Forum. More from the IOGP DISC webpage.
IOGP Report 604 provides guidance on developing requirements for large, complex oil and gas projects. This guidance draws from good practices across the industry and has been applied by some IOGP JIPs to improve the quality of technical writing.
IOGP Report 373-19 ‘Guidelines for GNSS positioning in the oil and gas industry’, a joint IOGP-IMCA publication, provides guidelines for the use of global navigation satellite systems to position vessels, vehicles and other fixed and mobile installations during oil exploration and production. The overview of recommended principles for reliable positioning includes recommended minimum statistical testing and quality measures essential for rigorous QC and performance assessment.
A new white paper, ‘Characteristics of IIoT Models’ from the Industrial Internet Consortium surveys IIoT information models and proposes a meta-model for interoperability. IoT system interoperability requires agreement on the context and meaning of the data being exchanged, ‘a.k.a. semantic interoperability’, as captured in an information model. The white paper addresses the challenge of integrating IIoT subsystems that use different information models, proposing a ‘descriptive or semantic approach’ to enable interoperability and ‘ultimately digital transformation’. IoT models under consideration include the W3C’s Web of Things, the OGC’s SensorThings API, the OPC Foundation’s OPC UA, Industrie 4.0’s Asset Administration Shell, IPSO Smart Objects and the IETF One Data Model.
Nikola Corporation’s Antonio Ruiz has been appointed to lead a three-year hydrogen fueling global standardization project for the International Standardization Organization’s Technical Committee 197 (ISO/TC 197). The TC 197 is to standardize systems and devices for the production, storage, transport, measurement and use of hydrogen. Nikola is developing an electric truck that got much media attention when the company admitted that a promotional video showed the vehicle rolling downhill.
The Open Geospatial Consortium (OGC)’s Executable Test Suite (ETS) for version 2.0 of the Observations and Measurements (O&M) XML Encoding Standard has been approved by the OGC Membership. Products that implement the O&M XML 2.0 Standard and pass the tests in the ETS can now be certified as OGC Compliant. The encoding underpins the OGC Sensor Observation Service (SOS), used in air quality monitoring, hydrology, agriculture, environmental protection, land management, geology, defense, security, and public safety. Implementers of OGC standards can validate their products using the OGC validator.
The World Wide Web Consortium (W3C) and the Open Geospatial Consortium (OGC) report from a Joint Workshop on Maps for the Web. Peter Rushforth (Natural Resources Canada) chaired the event, which set out to ‘improve browser-based maps on the web through a standards-approach’. Read the comprehensive report and download the presentations here.
AspenTech’s Mike Brooks opined recently that ‘the current [digital twin] technology focus is all over the place and leads to many questions’ like ‘what is a digital twin?’, the title of his blog. A digital twin is a virtual model of a process, product or service represented in computer code, logic, equations and algorithms. AspenTech has been in the business of creating digital twins ‘since well before they were ascribed the DT moniker’. They originated as the computer models used to design and simulate chemical processes. No single digital twin can address all scenarios. Brooks advises plant personnel to ‘think about the problem they are trying to solve, and get to the heart of the matter by asking the right question of their twin(s)’. More on AspenTech’s twin philosophy in the white paper, The Digital Twin and the Smart Enterprise.
If we can just push Brooks’ reasoning a tad further in the interests of de-mystification, the ‘digital twin’ is simply a rebranding of the model/simulator.
The March 2021 issue of the Industrial Internet Consortium’s Journal of Innovation is devoted to the digital twin. The 85-page publication has three sections, on the DT and the web, open source/standards for the DT, and on an oil industry-specific DT from Osprey Data. The latter, titled ‘Design and implementation of a DT for live petroleum production’, shows how an ensemble of DTs can represent a system of assets and can be used to determine optimal operational set-points. The use case presented is artificial lift, where production control is traditionally dependent on ‘manual’ simulation design and expert recommendation. Current oil and gas DTs allow for broader, overall process optimization, but do not support fully automated set-point recommendation. This requires a data processing component integrated with a simulation engine that can manage, process and generate large volumes of data. This is a ‘deficiency’ that Osprey Data sets out to address. The process leverages ‘live data, in-cloud processing power alongside simulation and data science tools’. This is one of the best presentations we have seen covering the nitty-gritty of how a commercial simulator is used to generate training data for an ML model. Tools of the trade include Apache Airflow (scheduling) and Spotify’s Luigi (workflow authoring in directed acyclic graphs). Streaming sensor data is captured in systems such as OpenTSDB or TimescaleDB, purpose-built stores for the storage and query of time series sensor data. This kind of infrastructure paves the way towards the next step in the DT, closed loop dynamic set-point optimization. A highly recommended read!
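To give a flavor of the plumbing involved, an Airflow DAG chaining simulator runs, surrogate training and set-point publication might look like the sketch below. This is a generic illustration under our own assumptions, not Osprey Data’s published pipeline; all task names and bodies are placeholders.

```python
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

# Placeholder task bodies; names are illustrative only
def run_simulator(**_):
    ...  # drive the production simulator to generate labeled training cases

def train_model(**_):
    ...  # fit the ML surrogate on the simulator output

def publish_setpoints(**_):
    ...  # write recommended set-points back to the operational store

with DAG(
    dag_id="dt_setpoint_pipeline",
    start_date=datetime(2021, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    simulate = PythonOperator(task_id="simulate", python_callable=run_simulator)
    train = PythonOperator(task_id="train", python_callable=train_model)
    publish = PythonOperator(task_id="publish", python_callable=publish_setpoints)
    # Linear dependency: simulate, then train, then publish
    simulate >> train >> publish
```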
Another section proselytizes for DTs based on ‘open standards and open source software’, authored by ABB, Bosch, Microsoft and SAP, all now apparently converted to ‘open’. The authors note the lack of a common definition of the DT that has led to ‘many flavors’ of the concept, leaving some to wonder if DT is just a buzzword. The IIC and the German Plattform Industrie 4.0 have both provided DT standards. A third set comes from the Industrie 4.0 spin-off, the Digital Twin Consortium (DTC). These cover information models, APIs, connectivity to physical twins, data ingestion mechanisms, security and interoperability. While DT vendors offer proprietary solutions, complex systems of systems are likely to involve components from different vendors, making DT interoperability desirable. Here, ‘multiple new organizations’ have been set up to address the issue, including the Industrial Digital Twin Association* (IDTA) and the Linux Foundation’s Open Manufacturing Platform (OMP).
The situation in the DT interoperability space reminds us of the title of Evelyn Waugh’s novel ‘Put out More Flags’. It looks like DT interoperability is to be achieved by ‘Putting-out more standards’!
* For more on the IDTA’s Asset Administration Shell (AAS) see Oil ITJ 2020/5.
OneGeology, the BGS-backed initiative of the geological surveys of the world, reports in its latest newsletter the outcome of the first OneGeology digital twin workshop. Representatives of 21 geological surveys explored a long-term vision for global geoscience DTs. Presentations covered DTs for river dikes in the Netherlands, nuclear waste storage in France and others. OneGeology has a role to play in ensuring that geology and the subsurface are included in future DTs such as the Deep-time Digital Earth (DDE), EPOS, AuScope, LOOP and Earth Cube. DTs are seen as ‘a positive and challenging framework for our community to engage between our organizations and potentially in a stronger way with our stakeholders’. More from the OneGeology portal and the code base on GitHub.
In a recent webinar, Esri presented its ‘digital twin for petroleum’ a.k.a. ArcGIS Velocity (AGV). The cloud-based solution takes real time data from the field and adds big data and analytics capabilities to ArcGIS Online. The solution is claimed to be ‘scalable, resilient and managed by Esri’. Esri’s pitch for digital twin status stems from its not unreasonable claim that ‘everything has location’ and the observation that many of today’s operations control centers fail to leverage the ‘power of location’. AGV is an ‘end to end’ system for the oilfield leveraging Kubernetes-based scalability and resilience. The system is controlled and configured through a simple web interface. AGV talks to multiple IoT platforms (MQTT, Azure/AWS/Cisco IoT and more) and the big data captured can be pushed on for analytics, all driven from a web interface. The video demo (see below) showed a connection to the Microsoft Azure natural resources event hub.
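For the curious, the IoT side of such a feed is unexotic. A hypothetical subscriber to a field telemetry topic, using the paho-mqtt client (broker address, topic and payload fields are all our invention, not Esri’s API), looks like this:

```python
import json
import paho.mqtt.client as mqtt  # paho-mqtt 1.x-style API

BROKER = "broker.example.com"          # hypothetical field MQTT broker
TOPIC = "field/wellpad7/+/telemetry"   # hypothetical topic pattern

def on_message(client, userdata, msg):
    # A Velocity-style feed would georeference each reading before analytics
    reading = json.loads(msg.payload)
    print(reading.get("asset_id"), reading.get("lat"),
          reading.get("lon"), reading.get("pressure"))

client = mqtt.Client()
client.on_message = on_message
client.connect(BROKER, 1883)
client.subscribe(TOPIC)
client.loop_forever()
```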
One thing is for sure, the new ArcGIS functionality only received a brief mention in the ‘software shorts’ section of our last issue. Raising the stakes to a ‘digital twin for petroleum’ certainly gets more attention. More from the Esri Petroleum Team.
An 88-page (small pages) whitepaper, co-authored by Accenture and Dassault Systèmes, found that ‘virtual’ (a.k.a. digital) twins are an ‘untapped opportunity to help companies unlock combined benefits of $1.2 trillion of economic value and 7.5 Gt CO2e emissions reductions by 2030’. The white paper reads like a pitch to access some of the EU taxpayer’s largesse via the ‘Destination Earth’ initiative (see below) that is to use computer modeling to ‘slow or stop global warming’.
Given that what passes as a twin is likely to be a hotch-potch of product-based simulators, interoperability would seem a desirable characteristic. To this end, the Digital Twin Consortium, a ‘program’ of the Object Management Group, has partnered with LF Edge on DT/edge platform interoperability.
Esri too has joined the Digital Twin Consortium, albeit from a slightly different direction than the DT for petroleum. Esri’s utilities and GIS for architecture, engineering and construction (AEC) unit, along with partners Autodesk and Microsoft, is to demonstrate how users can benefit from GIS when developing digital twins.
The EU-backed Destination Earth initiative is to develop a precision digital model of the Earth to monitor and simulate natural and human activity, and to develop and test scenarios that would enable more sustainable development in support of EU environmental policy. The project is to contribute to the EU Commission’s Green Deal and Digital Strategy and ‘speed up the green transition and help plan for major environmental degradation and disasters’. The juggernaut is to be built on a ‘federated cloud-based modelling and simulation platform’, providing access to data, advanced computing infrastructure (including high performance computing), software, AI applications and analytics. The ‘7-10 year’ project starts in 2021. More on the EU Destination Earth boondoggle.
The Digital Twin Consortium and the Fiware Foundation are joining forces to accelerate digital twin technology adoption. Fiware is a not-for-profit organization with 370+ international members including Atos, NEC and Red Hat. Fiware has defined standard APIs and models for digital twins in multiple sectors including cities, utilities and others. More in the press release.
Now for a really serious digital twin, co-developed by Altair and Gruppo Cimbali, that puts data, simulation and the IoT at the core of a platform blending physics and data-driven twins of Gruppo Cimbali coffee machines. The DTs, powered by Altair Activate, capture information on each espresso/latte served to a database, along with insights about drink quality. More from Altair.
Our last look at the DT was in 2018, when we concluded that the digital twin buzzword did not represent anything really new and that building an overarching model of a facility was hard, since each component model comes with its ‘own scope, granularity and time frame’. Interoperating models, DTs or whatever they are called, has been a holy grail of IT for many years, notably with the IEEE’s High Level Architecture. It appears that not only are there very many DTs, there are also quite a few different interoperability ‘frameworks’. Amid this plethora of ‘standards’, interoperability pipe dreams and rebranding, everyone seems to be rolling their own. Esri’s ‘digital twin for petroleum’ gets a shout out for astute marketing of a product update. A digital twin? Not really, more of an entreaty to ‘cut the crap, use a map!’
The University of Houston has opened an Artificial Intelligence Industry Incubator and Digital Oilfield Lab to allow students, faculty and industry professionals to create ‘technologies and solutions of the future’. The lab is a project of the UH College of Technology and the AI Innovation Consortium (AIIC). Projects will focus on safety and efficiency using machine learning and other forms of AI ‘to reduce the human footprint in the field’. More from U. Houston. The University has also announced a new Upstream Energy Data Analytics Program, a structured series of micro-credentials or ‘badges’ to provide data science skills and solve current and emerging challenges using ‘advanced data-based decision making’. Each badge is a 15-hour module, delivered over a 3-week period, and the badges are ‘stackable’. The first three badges, which together form the ‘bronze belt in upstream energy data analytics’, are available online from UH Energy.
Spirit Energy, Neptune Energy, and Norwell EDGE are to collaborate on a digital well integrity training program based around Oil and Gas UK’s well lifecycle guidelines and Norwell EDGE’s e-learning. The course is similar in concept to the UK Opito Minimum Industry Safety Training (MIST) qualification. The organizers hope that more operators will join the initiative over the next 12 months. More from Norwell EDGE.
Esri is offering online MOOC*-based training for mapmakers. ‘Once, only cartographers made maps. Today, anyone can’. The course will ‘go beyond the defaults’ to enable ArcGIS Pro users to produce ‘engaging maps that communicate with impact’.
* Massive Open Online Course.
GSE Systems is to provide its EnVision training software to an unnamed major Canadian energy company. EnVision combines computer-based tutorials with high-fidelity simulation models. The deal involves the conversion of an earlier ‘perpetual’ license to a hosted, subscription-based solution.
Landmark’s SmartDigital co-innovation service is offering training in the use of AI and ML. This includes a ‘comprehensive workflow’ for data-driven fluid prediction.
Honeywell has announced an industrial training solution that combines 3D immersive technology with an operator training simulator. The Immersive Field Simulator is a virtual/mixed reality-based digital twin of an asset that provides targeted, on-demand, skill-based training for workers. Download the free app from the Microsoft app store.
The Society of Exploration Geophysicists (SEG)’s EVOLVE 2021 training program will leverage data and petrotechnical solutions from Schlumberger’s Delfi. These include GAIA (data discovery) and the Delfi Petrotechnical Suite. EVOLVE, now in its fourth year, is a five-month virtual internship during which geoscience and engineering students work on real-world data sets to identify the best investment opportunities in their assigned geographic areas. More from SEG EVOLVE.
The Carnegie Mellon Software Engineering Institute has put all of the training material from its Software Product Lines online under a Creative Commons Attribution 4.0 International license (i.e. for free). Check out the SEI course presentations and videos.
Trendsetter Engineering has released a source control (i.e. blowout) e-learning course. The program builds on Trendsetter’s expertise in emergency well response. One key client has already enrolled its well response team in the program, which includes an introductory module followed by capping, containment, and relief well management. More from Trendsetter.
Those who believe that the writing is on the wall for oil and gas may be interested in Opito’s new ‘transition standard’, a training program for oil country workers to develop the skills and competencies required in the offshore wind sector. The new program recognizes prior training undertaken in the oil and gas industry. More from Opito.
Jean Francois Bobier from Boston Consulting Group believes that some of the first and most important applications for quantum computing (QC) will be in the energy transition and emissions reduction. Bobier noted that, several years on from the Paris accord, the world is far behind its objectives and the planet is reaching a tipping point in atmospheric CO2 that will dramatically transform society. Recent research suggests that QC can contribute to energy efficiency through new chemical and material design, making things stronger and lighter. ‘NP-hard’ problems such as the travelling salesman, of importance to supply chain logistics, that currently would take millions of years to compute, will be solvable with a QC. Simulating new fertilizers or catalysts, which would take hundreds of thousands of years on a classical computer, could run in under 24 hours on a QC. Bobier cited work by BASF and HQS Quantum Simulations* and by Google. When is this going to happen? Bobier has it that we will be living with noisy, intermediate-scale quantum (NISQ) computing for ten years. Then, with more qubits, the field will open up to more apps for cement manufacture, energy production and storage and logistics/transformation. More in a similar vein from BCG.
* Accuracy and Resource Estimations for Quantum Chemistry on a Near-term Quantum Computer.
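To put the ‘millions of years’ claim in perspective, a back-of-the-envelope calculation (ours, not Bobier’s): the number of distinct tours in a symmetric travelling salesman problem is (n−1)!/2, so even a machine checking a billion tours per second needs around ten million years for a 25-city brute-force search.

```python
import math

def brute_force_years(n_cities: int, tours_per_second: float = 1e9) -> float:
    """Years to enumerate all (n-1)!/2 distinct tours of n cities."""
    tours = math.factorial(n_cities - 1) / 2
    return tours / tours_per_second / (3600 * 24 * 365)

for n in (15, 20, 25):
    print(f"{n} cities: {brute_force_years(n):.2e} years")
# 25 cities: ~9.8e+06 years, i.e. 'millions of years'
```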
Doug Millington-Smith showed how QLMtec’s Quantum Gas Camera is used to detect fugitive methane emissions. QLM’s high-sensitivity gas detection and imaging system targets natural gas producers, distributors and service providers with ‘fast, accurate and low-cost gas leak identification’. The camera uses a LIDAR scanner to bounce a laser off solid objects. A SPAD (single photon avalanche detector), a.k.a. a ‘Geiger counter for light’, takes millions of measurements per second. The laser can be tuned to detect different gases. A GUI shows methane release as a heat map atop plant imagery. QLMtec has received a grant from The Splice Project and has support from BP and the UK National Grid inter alia. QLMtec has worked for Total at its TADI accident-prevention testbed in Lacq, France, on various blind gas release tests. Detailed results are under NDA but QLM was ‘the only system that made the cut for return trials’. Read QLM’s March 2021 paper on the SPIE Digital Library.
Alexia Auffeves (CNRS/Institut NEEL) presented on the possibility of reducing the world’s digital footprint with quantum computers. In addition to holding the potential to solve some of the world’s most computationally challenging problems, quantum computers use significantly less energy, which could lead to lower costs and decreased digital carbon footprint as adoption grows.
Philippe Bouyer (LP2N/CNRS) gave an extremely bullish presentation of Muquans’ quantum gravity meters for, inter alia, earth exploration. This is claimed to be the first commercial quantum gravimeter promising inertial positioning ‘so accurate that the world could do away with GPS!’ Muquans’ Jean Lautier-Gaud explained how the ‘cold atom’ gravity meter produces absolute measurement of gravity. The ‘free fall’ gravity meter exploits superposition of quantum states to provide continuous measurements over years, without any moving parts. The system has been deployed on Mount Etna to monitor aquifer charge/discharge for natural hazard mitigation.
Olivier Ezratty (BCG) has made available a free, 684-page e-book on quantum technology, ‘Comprendre l'informatique quantique’ (Understanding quantum computing), in French.
More from Quantum Business Europe.
AqualisBraemar has acquired Norwich, UK-headquartered East Point Geo, a geoscience consultancy providing support for major oil and gas and renewables engineering projects offshore and onshore.
Bentley Systems is to acquire Seequent, ‘deepening the potential of subsurface digital twins’. Bentley’s ESG-conscious release has it that ‘Seequent’s products aren’t appreciably used in oil and gas exploration or production—which is served by its own dedicated industry of specialized geophysical software’. However, the deal does include Seequent’s oil and gas solutions including Oasis Montaj. The $1.05 billion deal was backed by investor Accel-KKR.
Caterpillar has acquired the oil and gas division of Weir Group. The unit becomes a new company, SPM Oil & Gas, named after the Weir flagship pump brand and headquartered near Fort Worth. SPM combines Weir’s pressure pumping and pressure control portfolio with Cat engines and transmissions.
DNV GL is to combine its Oil & Gas and Power & Renewables businesses into a new business area, ‘Energy Systems’. The change reflects the ‘emerging energy future’, where renewables take a greater share of the energy mix, and decarbonization becomes a major focus. More from DNV.
In a $1.6 billion cash transaction, Emerson has acquired Open Systems International (OSI) with its Monarch operational technology software platform and Chronus ‘big data’ industrial historian. More from Emerson/OSII.
Fugro has reached a binding agreement with PXGEO Seismic Services to sell parts of its Seabed Geosolutions unit for $16 million. The deal includes Seabed’s ocean bottom node (OBN) inventory, handling equipment, related technology and order backlog. Proceeds will be used to cover the restructuring cost of winding down the remaining parts of Seabed Geosolutions.
geoLOGIC systems has acquired JWN Energy, an energy insights and intelligence provider best known for its Daily Oil Bulletin, Evaluate Energy and CanOils brands.
Hexagon has launched ‘R-evolution’ to accelerate the transition to a sustainable economy, targeting ‘profit-driven investments’ in green-tech projects where Hexagon’s technology can be applied.
Ingersoll Rand is selling a majority interest in its high pressure solutions segment to American Industrial Partners. The $300 million deal sees Ingersoll retain a 45% interest. The sale reduces Ingersoll’s revenue exposure to the upstream oil and gas market to under 2%, ‘accelerating Ingersoll’s ESG commitments’.
Total is to close its TEP-ASD* unit and run down the activities of its test and measurement center in Artigueloutan (SW France). Total is ‘aware of the technical interest and importance that this activity can bring in the field of logging tool calibration’ and is prepared to ‘hand over’ the facility, which has ISO 17025 accreditation. Interested parties can contact Total via the TEP-ASD website. See also our 2017 announcement.
* Total E&P Alternative Subsurface Data.
Peloton is to acquire Cevian Technologies, developer of FracNet, a completions-focused software platform. FracNet is a cloud-based tool that standardizes and visualizes frack data. Cevian was spun out in 2019 from Canadian pressure pumper Trican which developed FracNet as a remote-monitoring service.
PMC Capital has acquired engineer Universal Pegasus International from Huntington Ingalls Industries. Los Angeles-headquartered PMC ‘empowers management teams to execute their business plans’.
Recon Technology has acquired a further 8% of Future Gas Station (Beijing) Technology’s share capital, bringing its stake in FGS to 51%.
Concomitant with the publication of its 2020 accounts, Shell has lodged a copy of the 2020 Annual Report with the UK Financial Conduct Authority’s National Storage Mechanism. The report and accounts can also be downloaded from Shell. For more on the FCA NSM read the FAQ.
Titan Cloud Software has acquired Environmental Monitoring Solutions (EMS), expanding its portfolio of wetstock (fuel) management solutions into markets outside the US. The deal was backed by TCS investor M33 Growth.
CGG has secured three major seismic imaging projects from BP, two for the deepwater Gulf of Mexico, and one for offshore Trinidad & Tobago. CGG is to use its CGG Cloud supercomputing technology for seismic imaging with its Geovation software.
Elsevier has announced the integration of MapStand data layers and news into Geofacets, its information solution for geoscience professionals in oil and gas, mining and renewable energy companies.
FutureOn has appointed Tracy Energy as its representative in China. The companies are also to collaborate on software development including a reservoir twin for production.
Getech and GeoMark are to share geoscience expertise, products and services. Through the alliance, reservoir engineers, geoscientists and petrophysicists will be able to navigate comprehensive E&P lifecycle data.
Ikon Science has partnered with Amazon Web Services (AWS) for its next-generation knowledge management solution, Curate, which is built on the AWS implementation of the OSDU Data Platform.
The Schlumberger Enterprise Data Management Solution for the OSDU Data Platform, the new industry standard for energy data, is now available to global customers, running on Microsoft Azure.
UP42 and CATALYST have partnered to deliver CATALYST InSAR processing block on the UP42 marketplace and developer platform.
Aker Solutions has secured a ‘substantial’ contract including EPCI of new equipment for Equinor’s Åsgard B gas and condensate platform offshore Norway.
DyFlex Solutions and LTI have formed a strategic alliance to provide EC&O (engineering, construction and operations) companies with a ‘tailored and enhanced’ ERP solution.
Wison Engineering has secured an EPC lump-sum contract from Saudi Aramco for a gas processing project in the Shaybah oil field in Saudi Arabia.
Ecopetrol is to implement Aspen Generic Dynamic Optimization Technology (GDOT) software as part of its digitalization initiative to improve refining margins at its two refineries in Cartagena and Barrancabermeja. It is also upgrading to Aspen DMC3 advanced process control software at the Barrancabermeja refinery.
Bilfinger is to use its PIDGraph to digitize piping and instrumentation diagrams for LANXESS, a German chemical company.
Asystom and Archer have partnered to commercialize Asystom’s predictive maintenance technology. The first deployment was made for Total ABK on a platform offshore Abu Dhabi (UAE). Asystom has also entered into a technological alliance with Apollo, a UK-based engineering and technology consultancy, for lifetime facility monitoring.
Chromalloy has selected VELO3D Sapphire as its additive manufacturing solution for future maintenance, repair and operations projects in aviation and energy.
Phillips 66 has chosen Crux OCM’s pipeBOT for a pipeline control center operations pilot.
The Equinor & Techstars Energy Accelerator and Fieldmade have developed a digital inventory ecosystem for the energy sector supply chain.
Halliburton is to help Kuwait Oil Company (KOC) on its digital transformation journey through the maintenance and expansion of digital solutions for the North Kuwait Integrated Digital Field.
mCloud Technologies announced that in Q4 2020 it added 4,692 connected assets to its AssetCare portfolio, a 45% year-on-year increase. Total asset count now stands at around 60,000.
mPrest has joined Microsoft’s global independent software vendor (ISV) partner ecosystem. The collaboration will leverage the flexibility of cloud capabilities and the power of SaaS to deliver mPrest’s orchestration and optimization platform on Azure to leading energy companies.
Petrofac, Repsol Sinopec Resources UK and TechnipFMC have formed an alliance to maximize the recovery of oil and gas from the UKCS, leveraging Technip’s iFEED front-end engineering and design solution and its iEPCI integrated subsea business model.
Accenture and Ripjar are collaborating with Shell on AI-based risk screening across the global supply chain.
OMV has successfully completed its first SAP S/4HANA go-live, for OMV Corporate, as part of the group-wide S/4Future Program, the largest in its history. OMV leveraged a dual-vendor strategy with Accenture and IBM as implementation partners.
Aker Solutions, aiZe and Cognite have initiated a large-scale digital initiative in the NOAKA development area. Cognite’s Data Fusion will be deployed in a new ‘EurekaX’ digital organization.
Emerson has joined the EU Clean Hydrogen Alliance (ECHA) to support the goal of reaching net-zero carbon emissions by 2050, as outlined in the European Union’s European Green Deal.
Equinor and SSE Thermal have partnered to jointly develop two first-of-a-kind, low-carbon power stations in the UK’s Humber region, comprising one of the UK’s first power stations with carbon capture and storage (CCS) technology, and the world’s first 100% hydrogen-fueled power station.
ABB and Siemens Energy have secured framework agreements from Equinor, with an estimated total value of around NOK 4.5 billion, to provide service of electrical equipment on all of Equinor’s installations on the Norwegian continental shelf (NCS) and onshore plants in Norway. The scope is expected to require about 100 man-years in Norway.
The Government of Canada has launched the Canadian Centre for Energy Information, a website providing ‘comprehensive, accessible, reliable energy information’. The website, a joint effort from Natural Resources Canada, Statistics Canada, Environment and Climate Change Canada, and the Canada Energy Regulator, is said to deliver on the Government of Canada’s Budget 2019 ‘commitment to develop the CCEI website’.
The Houston Chronicle reports that the SEC now requires American oil and mining companies to report on payments to foreign governments, after a decade-long lobbying campaign by oil companies to weaken transparency efforts. The watered-down rules only cover overall payments to governments. Ropes & Gray LLP have published a ‘deep dive’ into the new rules.
The February 2021 issue of the North Star newsletter from the North Dakota Department of Mineral Resources includes details of a new XML reporting format and reporting workflow. XML files can be generated using a supplied Excel template. On upload, data is validated against an XSD schema and a ‘pass’, ‘warning’ or ‘fail’ status is returned. All errors are visible to the submitter. Files that pass the validation process are then tested against business rules such as ‘Is the submitter the operator of the facility?’ Submitters then receive feedback on what was fully accepted into the system and what was not, including reasons for non-acceptance.
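As an aside for implementers, XSD validation of the sort North Dakota describes is straightforward with standard tooling. A minimal sketch in Python using lxml follows; the file names are hypothetical, and the ‘warning’ tier and business rules (such as the operator-of-record check) would sit in a second, post-validation stage.

```python
from lxml import etree

# Hypothetical file names; the regulator supplies its own XSD schema.
schema = etree.XMLSchema(etree.parse("production_report.xsd"))
doc = etree.parse("production_report.xml")

if schema.validate(doc):
    print("pass")
else:
    print("fail")
    for error in schema.error_log:
        # Every error is reported back to the submitter.
        print(f"line {error.line}: {error.message}")
```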
The Railroad Commission of Texas has launched a new website to make online resources for the regulation of the energy industry more user-friendly. Visit the new RRC website and watch the Youtube demo.
Earlier this year, the Pipeline and Hazardous Materials Safety Administration issued a final rule ‘that eases regulatory burdens for gas transmission, distribution, and gathering pipeline system operators without compromising safety’. The then PHMSA administrator Skip Elliott stated ‘Safety is always the Department’s top priority… [but?] ... It is our responsibility to ensure that we do not overburden the industry with unnecessary operating costs.’ The rule change was to provide cost savings of $132 million from more ‘flexible’ inspection requirements and a ‘revised’ inspection interval for corrosion monitoring of gas pipelines. Following the US presidential election, Elliott was replaced by Tristan Brown. More from PHMSA.
The Norwegian Petroleum Directorate (NPD) has confirmed the award of the Diskos 2.0 contracts to Landmark (subsurface and production data) and Kadme (trade module). The five-year contracts have a total estimated value of NOK 157. A new IT architecture is said to ‘facilitate the use of open industry standards such as the OSDU platform, and the use of … machine learning and artificial intelligence’. Diskos 2.0 is scheduled to launch mid-2022.
A position paper from the NPD explains the rationale behind the measurement and metering of oil and gas produced from the Norwegian shelf. Hydrocarbons from the NCS were valued at NOK 424 billion in 2019. Oil companies also paid some NOK 5.5 billion in CO2 taxes.
The NPD has also announced a new system for reporting geophysical surveys. This is to replace an older ‘cumbersome’ system which did not accommodate newer digital technology. The new system is a ‘digital milestone for the NPD’ and has been developed on the NPD’s new IT platform. The NPD has also released a new edition of its FactPages and FactMaps resources, now customized for PC, tablet and mobile phone.
The Oil and Gas Authority (OGA) has opened an investigation into a possible breach of reporting requirements under a license. The investigation follows the 2020 publication of a ‘thematic review into industry compliance with regulatory obligations’ which found that ‘too many issues [were] taking too long to resolve’ and warned that ‘we will be progressively more proactive in using the OGA’s powers’.
OGA has also set up an Environmental, Social and Governance (ESG) taskforce to plug a gap between investor expectations and what is reported. The taskforce will study flaring and venting, scope 1 & 2 emissions, HSE statistics and more. OGA is to develop an action plan to support a low-carbon economy. More from OGA.
dGB Earth Sciences and PanTerra Geoconsultants have written to the Netherlands Ministry of Economic Affairs and Climate to request that public data released under the Dutch mining law be made more accessible. Data is currently released via TNO’s NLOG, a platform that is ‘seriously outdated’. NLOG lacks an underlying database built on modern open data standards; valuable information is hidden and unsuitable for data mining. The authors of the petition advocate that Dutch released data be aligned with The Open Group OSDU Forum data platform, ‘a widely supported open-source initiative within the energy sector’.
From 1 January 2021, all 55 countries implementing the EITI Standard will be required to publish new and amended contracts, licenses and agreements concluded with extractive companies. To date, many contracts are unpublished, ‘increasing the opacity of the extractive sector and making it more vulnerable to corruption’. More from the EITI’s #opendeals202 campaign.
An EITI study of 25 countries found that only 25% of the data required by the EITI Standard is systematically disclosed. Norway tops the charts with 92% of data disclosed at source. A new tool from the EITI shows what is being disclosed systematically, and where.
Andrew Mercer (independent consultant) presented the business case for carbon data exchange standards. There are today some 46 national carbon-taxing schemes in place, addressing the Paris agreement targets. According to the World Bank*, an increase of carbon prices from $2 (per ton, global average 2019) to the $75–100 range is needed to stay on a Paris-compatible trajectory. This will ‘fundamentally transform’ commodities supply chains. Mercer sees this as driving demand for low carbon LNG.
* State and Trends of Carbon Pricing, World Bank, 2020
Chris Welsh (PIDX COO and CEO OFS Portal) recapped PIDX activity and standards in the supply chain. The plan is now to extend PIDX standards to allow operators to specify emissions ratings requirements while sourcing, and to identify contracted items’ emissions ratings. This data could be rolled up into a total, calculated over the lifetime of an item’s deployment. This will extend the PIDD* definitions for materials and services to include greenhouse gas (GHG) ratings for oil country consumables. PIDX has kicked off a workgroup to extend the PIDX XSD and JSON GHG definitions using The Open Group’s Open Footprint work. These will ultimately be embedded in the PIDX invoice/purchase order standards.
* Petroleum Industry Data Dictionary.
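To see what such a roll-up might look like in practice, here is a minimal sketch. The field names and per-unit factors are entirely hypothetical, since the PIDX GHG schema extension is still work in progress.

```python
# Hypothetical line items carrying a GHG rating per unit; the actual
# PIDX schema extension is still being defined by the ETDX workgroup.
line_items = [
    {"item": "drilling fluid", "qty": 1200, "unit": "bbl", "kg_co2e_per_unit": 14.0},
    {"item": "casing",         "qty": 900,  "unit": "m",   "kg_co2e_per_unit": 55.0},
]

# Roll-up over the lifetime of the items' deployment.
total_kg = sum(i["qty"] * i["kg_co2e_per_unit"] for i in line_items)
print(f"Rolled-up footprint: {total_kg / 1000:.1f} t CO2e")
```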
David Shackleton (IDS DataNet) warned of the risk of not reporting scope 3 emissions* from regions where the company, its suppliers, or its customers operate. Extended supply chains may pass higher energy or emissions-related costs to customers, risking business interruption. There may be decreased demand for products with higher emissions, favoring competitors’ products. Other risks include GHG-related lawsuits directed at the company or an entity in the value chain, consumer backlash and negative media coverage. But there are also opportunities, since reducing GHG emissions may make for decreased costs and improved operating efficiency. The key here is to be able to demonstrate improvements through disclosure and proactive environmental stewardship. A scope 3 inventory is a best practice that can differentiate companies in an increasingly environmentally-conscious marketplace. In this context, IDS has developed a tool to support the record-report-reduce cycle across the well operations lifecycle, presenting drilling performance in the light of GHG emissions. More from IDS.
* Scope 3 emissions come from third party end-use of e.g. oil and gas when it is finally burned.
Kadri Umay (Microsoft) presented a ‘conceptual architecture’ for managing emissions data across the supply chain. Today, emissions data exported from industrial historians goes into Excel and the connection with operational data is lost. Emission reporting is done manually using various models and tools. The result is that emissions are consolidated at different levels and copied to many databases in proprietary formats and schemas. Reporting is manual and performed yearly. Adding scope 3 emissions makes it even more complex. Umay cited the World Resources Institute’s GHG reporting standard which, along with the PIDX Petroleum Industry Data Dictionary, could be used to calculate near real-time emissions. Microsoft’s conceptual architecture for automating the emissions lifecycle is a pipeline taking field emissions data through Azure IoT and Azure emissions models into an OFF repository running in an OSDU/Azure instance. Output is rolled up into the Microsoft Azure Sustainability Calculator.
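The calculation at the heart of such a pipeline is the GHG Protocol’s ‘activity data times emission factor’. A minimal sketch follows, with illustrative factors only; authoritative values come from published GHG Protocol or national inventory tables.

```python
# Illustrative emission factors in kg CO2e per unit of activity;
# real factors come from GHG Protocol / national inventories.
EMISSION_FACTORS = {
    "diesel_litre": 2.68,
    "natural_gas_m3": 1.88,
    "grid_kwh": 0.40,
}

def co2e_kg(activity: dict) -> float:
    """Roll up activity data x emission factor across sources."""
    return sum(qty * EMISSION_FACTORS[src] for src, qty in activity.items())

print(co2e_kg({"diesel_litre": 5_000, "grid_kwh": 120_000}))  # ~61,400 kg
```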
Mary Sailors (PIDX) also referred to the Greenhouse Gas Protocol, arguing that current PIDX standards and schemas can be used and extended to support scope 2 and 3 efforts, extending the PIDX Catalog Schema with emissions data. There are challenges: many industry efforts are ongoing and a methodology has not yet been locked down. Many such efforts depend on volunteer resources that are ‘stretched thin’ due to the current industry environment. Participants in the PIDX Emissions Transparency Data Exchange (ETDX) workgroup include Baker Hughes, BP, Chevron, Independent Data Services, Microsoft, OFS Portal and Sullexis. The workgroup has figured out how the current PIDX schema could be extended to support bi-directional emissions data transfer between suppliers and operators. Sailors also referred to the memorandum of understanding signed with the Shell-sponsored, The Open Group-hosted Open Footprint Forum, which will allow PIDX to understand the OFF emissions data architecture and use it to extend the PIDX schemas. Here again, PIDX reference data and the 4,100 entries in the PIDD are seen as germane to a joint development. PIDX is also investigating relationships with other forums with a view to collaborating on a minimum viable product for emissions.
More from PIDX.
Shaji John’s (Pioneer) presentation on emerging digital technologies and their impact on operations focused on the use of ServiceNow’s ‘low code’ platform in oil and gas operational workflows. The presentation covered a ‘fit-for-purpose’ digital solution developed for Parsley Energy before its acquisition by Pioneer. John advised optimizing operations, not technology or IT, and making best use of what was already in-house before adding more stuff. The ServiceNow platform-as-a-service was already in use for IT service management. John expanded its use across the Parsley software portfolio to include SAP, OpenWells, WellView, Ignition Scada, WorkDay (HR) and Quorum (finance). An ‘out of the box’ AI/ML solution was configured for operations use cases. The project took 2-3 months to deliver solutions and benefitted from user familiarity with ServiceNow. After the event, John told Oil IT Journal that other low code solutions could have been deployed, but that Parsley found ServiceNow to be ‘ahead of the game’ in terms of PaaS maturity. Post-acquisition, Pioneer continues to use these workflows for ex-Parsley assets and is in the process of reconciling some overlapping solutions. John believes the low code approach will become mainstream and accelerate digital transformation for oil and gas operations.
Steve Aitken (Intelligent Plant) and co-author Stuart Harvey (Proserv) advocated a shift from today’s business model where a service company tends to lock-in clients with a more or less proprietary offering. A typical AI project might involve paying to have company data ingested into a third party’s data store then paying more for custom analytics. At this point, sunk costs mean that cancelling the subscription is hard. IP/Proserv’s approach is to supply software tools that have low configuration cost and ‘low lock-in’. The authors further advocate data sharing between suppliers and customers via an open app marketplace.
Brent Kedzierski’s talk, ‘The Future of Human’, covered Royal Dutch Shell’s ‘Industry 5.0’ program. This spans maintenance execution, remote assist mixed reality and virtual over-the-shoulder coaching. Shell’s Connected Badge provides the exact location of employees and contractors along with an emergency message function. A key enabler of Industry 5.0 is citizen development and ‘do IT yourself’, allowing end users to solve business problems with their own digital solutions: ‘from business problem to resolution in days not months’. Shell empowers the individual to improve his or her work environment and ‘collectively create a better business’. ‘Conventional wisdom is to innovation what rain is to a barbecue!’
Carlos Calad (Tachyus) stated that, on its own, data-driven analytics can lead to spurious correlations, and data-driven models can mislead outside the range of their input data. On the other hand, physics-based numerical simulation is poorly suited to optimization: full field models may take months to run, making scenario-based optimization impossible. Optimizing injection in a field of 100 wells at 1 hour/simulation would take 2.7 × 10¹⁴ years of compute time. Tachyus uses a combined data and physics approach to the problem, blending machine learning, physics-based reservoir modeling and advanced optimization techniques. Models are built in weeks and simulations run in minutes, ‘fundamentally altering the way decisions are made’. All of the above is bundled as an optimization and reservoir engineer’s toolbox delivered as a cloud-hosted solution. More from Tachyus.
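The combinatorics behind such figures are easy to reproduce in spirit, if not to the exact exponent (Tachyus’ enumeration assumptions were not given). A sketch of the scenario-space blow-up, with entirely illustrative parameters:

```python
# Illustrative only: exhaustive scenario search over per-well settings.
# Tachyus' own 2.7e14-year figure reflects its own enumeration assumptions.
HOURS_PER_YEAR = 8_760

def years_to_enumerate(wells: int, levels: int, hours_per_sim: float = 1.0) -> float:
    """Years to simulate every combination of injection levels."""
    return levels ** wells * hours_per_sim / HOURS_PER_YEAR

# Even a binary choice per well across 100 wells is hopeless:
print(f"{years_to_enumerate(100, 2):.1e} years")  # ~1.4e26 years
```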
More from Petroleum Trade Network.
A recent online event organized by The Open Group, ‘Digital Standards 2021’, included a presentation on ‘Digital Standards and Saving the Environment’. Geert-Willem Haasjes (Shell account at IBM) described open source software as the ‘new default choice’, citing a 2020 Red Hat (now an IBM unit) report on the State of Enterprise Architecture. Open source APIs allow for connecting applications that ‘propel innovation’. Haasjes also cited AI/open source guru Gregorio Robles as advocating users as developers ‘and vice-versa’. The Shell/Open Group Open Footprint Forum* (Haasjes is its architect) builds on these open source values.
In a discussion between TOG CEO Steve Nunn, OFF director Heidi Karlsson and Shell’s Johan Krebbers, it emerged that despite the best efforts of WEF/GRI/CDP on GHG reporting, there are ‘no clear standards on what data to store’. It is impossible to aggregate emissions data across multiple companies: data on Ikea furniture involves many suppliers, all of which figure their values differently. OFF is to go beyond GHG reporting and cover cross-industry supply chains in mining, aviation and marine, and will report on water, landfill and more. Krebbers presented a straw man architecture showing microservices-based ‘near real time’ data capture leveraging OpenID Connect. The architecture will reduce the work of collecting supply chain data for GHG scope 3 reporting. OFF leverages the work already done for OSDU and will be available on GitHub later this year.
Regarding adoption and TOG certification, Krebbers stated that certification was not yet on the agenda, but it will be OK for third parties to put OFF inside their products. OFF is not for a single company or even a single industry; it is a ‘society-wide initiative’. But how do you demonstrate that reported emissions are true? ‘Proof of origin is important, we need something like blockchain to prove origin’. Today, OFF members mostly work with Excel. In the future, a truck moving Ikea furniture could ‘stream data across the IoT’.
Krebbers was questioned on the relationship with other reporting initiatives, notably the OGCI of which Shell is a member. ‘We looked at these initiatives but they only cover a single industry. OGCI does not go across industry and into the supply chain. We speak to them and learn what they are trying to do. What we are doing is different’.
How will it work? Will there be a central OFF database? Haasjes responded that with APIs, ‘anyone in the ecosystem can tap into the data to decrease their GHG footprint’. Krebbers stated that ‘all can have their own implementations and can hide their own data from competitors’**.
Will OFF leverage the TOG ODEF open data element framework? ‘No, we already have a data platform (from OSDU), a flexible metadata store with pointers to real data. But this is all at an early stage’.
* See our earlier report on the OFF.
** i.e. open source software but not open data! This is rather curious in terms of GHG reporting transparency.
The virtual event ‘Securing Your Automation and Controls Using ISA/IEC 62443’ was held over a four-week period in late 2020, hosted by Saudi Aramco and ISASecure, the cyber security arm of the International Society of Automation.
Dan DesRuisseaux (Schneider Electric) and Kevin Staggs (Honeywell) offered a supplier’s perspective on the complex, evolving cyber security regulatory environment, which differs from country to country and even state to state. Regulations are likely to be a major influence on cybersecurity* in the near term, and regulatory demand leads to customer demand. There are 11 different state regimes in the US, with varied definitions and requirements. Multiple standards are being created in China, and the relationship between all of the above is unclear. Today’s regulatory disharmony is not going to change in the foreseeable future as cyber nationalism predominates over international standardization.
Customer requirements are likely to specify a plethora of standards and protocols. Certification to a particular IEC 62443 security level has been a commercial differentiator in the past. This is now changing with increased customer demand and looming regulatory requirements. The number of certified offerings has steadily increased over time and the authors expect ‘continued acceleration’.
The reason for the change is the increased risk from the advanced persistent threat (APT), where an actor gains unauthorized access to the network and remains undetected, gathering intelligence or preparing sabotage. Equipment vendors typically subscribe to threat intelligence platforms that provide APT details which can be used to harden their kit. One recent APT example, Dragonfly 2.0 (with Berserk Bear and Dymalloy), has targeted the US energy sector via spear-phishing attacks on the supply chain. Such APT risk can be mitigated by firmware signing, code signing, device genuineness checks and secure boot. Of course, internal IT policies and training also play a role in educating users against phishing.
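To illustrate one of these mitigations, here is a minimal sketch of firmware signature verification using the Python cryptography library. The key distribution scheme, function names and the Ed25519 choice are our assumptions for illustration, not any particular vendor’s actual workflow.

```python
# Minimal firmware-signing check: an Ed25519 signature over the image.
# Key provenance and the signing workflow are assumed, for illustration.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PublicKey

def firmware_is_genuine(image: bytes, signature: bytes, pubkey: bytes) -> bool:
    """Accept the image only if the vendor's signature verifies."""
    key = Ed25519PublicKey.from_public_bytes(pubkey)
    try:
        key.verify(signature, image)
        return True
    except InvalidSignature:
        return False
```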
Of increasing interest is cyber risk insurance. Coverage assessments help companies discover cyber risks and conditional coverage forces clients to address gaps, in order to get reasonable coverage terms. The more operators undergo the underwriting process, the sooner cybersecurity baselines will emerge. Access to meaningful cybersecurity insurance at affordable rates becomes a motivator to continuously improve cybersecurity performance.
* Not so much for the hackers!
William Goble (Exida) gave an overview of IEC 62443, the ‘global’ automation cybersecurity standard. The standard has four levels – general terminology, policies and procedures, system-level security and component-level requirements. The focus is operational technology as opposed to IT. Different parts of the IEC stack will be addressed by different communities, from product/device supplier through system integrator to operator. IEC 62443 addresses network robustness testing to demonstrate safe and correct operation. Products and systems must have cybersecurity protection mechanisms. Engineering processes must be defined and documented to minimize design errors. A certification methodology is available for the various aspects of the standard. This can be self-certified by the manufacturer or (better) by an accredited third party, such as Exida, the ‘first ISASecure-accredited cybersecurity certification body’.
Camilo Gómez (Yokogawa) explained how OPAS (a.k.a. OPAF*), The Open Group/ExxonMobil process control standard, is approaching cyber security. OPAS is to leverage existing industry standards ‘whenever possible and practical’. OPAS components are expected to meet or exceed the security requirements of the system owner. OPAS ‘shall allow’ for the development of OPAS components using secure programming practices and restrictions. OPAS has defined standards for components at the IEC 62443 SL2 security level. OPAS leverages a combination of OPC UA, Redfish and IEC 62443 security mechanisms (OPC UA and Redfish have been mapped to IEC 62443-4-2). Gómez stated that OPAS is ‘moving away from the device mentality’ and asked ‘are external certifications ready for OPAS component types/products?’ OPAS is currently evaluating external certification against certification by The Open Group. ISASecure involvement in OPAS certification appears likely, as OPAF has signed a memorandum of understanding with ISASecure in a ‘commitment to cooperate on a component cybersecurity assessment/testing program’. This will dovetail with the relevant OPAS specifications and associated certification program. ISASecure is to assess the security conformance of OPAS products using IEC 62443-derived certification specifications.
* OPAS is the Open Process Automation Standard that is managed by OPAF, The Open Group’s Open Process Automation Forum.
ISASecure, a.k.a. the Security Compliance Institute of the International Society of Automation is an operating unit of the ISA’s Automation Standards Compliance Institute. Members include Aramco, Exxon, Chevron and YPF.
The ISA Global Cybersecurity Alliance is a collaborative forum to advance cybersecurity awareness, education, readiness, and knowledge sharing. Its objectives include the acceleration and expansion of standards, certification, education programs, advocacy efforts, and thought leadership.
ISA Security Compliance Institute (ISCI), a not-for-profit automation controls industry consortium, manages the ISASecure conformance certification program. ISASecure independently certifies industrial automation and control (IAC) products and systems to ensure that they are robust against network attacks and free from known vulnerabilities.
Securing Your Automation and Controls Using ISA/IEC 62443 event home page.
The European Inspire directive was designed to ‘remove borders to the exchange of geospatial data’. In 2013, the EU published an Inspire Data Specification for Energy Resources. Despite this, in the EU oil and gas industry, particularly in the (pre-Brexit) UK and Norway, oil country ‘standards’ appear to have conspicuously avoided Inspire. Sanae Mendoza, blogging for Safe Software, reports that a ‘full realization’ of the Inspire standard is slated for 2021, 14 years after its enactment into EU law.
Mendoza places Inspire in an environmental/ecological context. Data needs to be harmonized to track geographical features and pollutants as they cross national boundaries: ‘everything is connected to everything else’. Accessible and usable data (particularly spatial) is critical for developing policy that can address the complexities of an interconnected environment.
Inspire data models are built atop the OGC Geography Markup Language (GML), an XML-based encoding*. There are some 34 different Inspire models covering different geographic themes, each with its own features, properties and geometries. Atop the models sit implementing rules for the use of metadata, interoperability, services, sharing, monitoring and reporting.
Mendoza reports that ‘few were equipped for the momentous task that is involved in harmonizing existing datasets with the Inspire standards’. The breadth of resources and expertise required to consume, integrate and create Inspire data was the biggest challenge faced by the directive. Projects typically required ‘data domain specialists, XML developers and a mastery of regulation compliance’. Safe has risen to the Inspire challenge with its FME Inspire GML reader/writer, which can integrate or create compliant GML easily. Read Mendoza’s blog here.
* Other encodings, HTML, JSON, GeoJSON and REST APIs are also under consideration.
See also the Esri Inspire minisite and Energistics EIP which alludes to Inspire.
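For the curious, Inspire GML is plain XML and can be inspected with standard tooling. Here is a minimal sketch in Python (lxml) that pulls coordinates from a hypothetical Inspire energy resources file; the real themes layer considerably richer feature models atop this.

```python
from lxml import etree

# GML 3.2 namespace as used by Inspire data specifications.
NS = {"gml": "http://www.opengis.net/gml/3.2"}

doc = etree.parse("energy_resources.gml")  # hypothetical file name
for pos in doc.iterfind(".//gml:pos", NS):
    x, y = map(float, pos.text.split()[:2])
    print(x, y)
```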
dGB Earth Sciences has launched a consortium to investigate machine learning, combining its open source seismic interpretation suite OpendTect with data stored in an OSDU database. The aim is to demonstrate, in a series of proprietary case studies, ‘what the future of E&P data management looks like and how machine learning models can add value to a standardized, cleaned-up datastore containing well and seismic data’. More from dGB Earth Sciences.
The University of Houston, under the leadership of former SEG president Fred Aminzadeh, has launched the AIM-DEEP* program. AIM-DEEP is developing a platform to promote AI/ML in E&P, bringing operators, service companies, data scientists and academia together. AIM-DEEP is billed as not ‘yet another’ university consortium. R&D topics will be prioritized by ‘base’ member/sponsors who will receive all deliverables. Individually sponsored project (ISP) membership is also available for targeted R&D into a chosen focus area. AIM-DEEP vendor partners are Petrolern, Sotaog, NEC, Enfinitech and … dGB Earth Sciences. More from AIM-DEEP.
* Artificial Intelligence (AI), Machine Learning and Data Analytics (DA).
Equinor, with partners Shell and Total, is to release data from the Northern Lights confirmation well 31/5-7, a.k.a. ‘Eos’, which was completed in 2020. Eos was drilled to test a reservoir for the storage of CO2 in preparation for the Northern Lights carbon capture and storage project. Northern Lights is to pump liquid CO2 from a 100-km pipeline into the Cook and Johansen Formations (Jurassic) at a depth of 2,500m. The data includes well logs, core data and tests. The dataset, comprising some 850 files and 83 GB, can be downloaded from the Equinor data portal (login required).
Offshore and onshore reliability data (OREDA), gathered by several oil and gas operators for nearly four decades, is now available online through DNV GL’s data platform, Veracity. OREDA was established in 1981 in cooperation with the Norwegian Petroleum Directorate and now covers data from some 300 installations. Data from over 18,000 equipment units, with 43,000 failure and 80,000 maintenance records, represents a total of over 2,000 years of operating experience. French IT service provider Satodev helped migrate the data from the traditional handbook to a digital tool, ‘OREDA@Cloud’, available through DNV GL’s Veracity data platform. A single user license to the OREDA Topside + Subsea data costs €500 per year. More from DNV.
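As a flavor of what such reliability data supports, here is a minimal sketch of a fleet MTBF (mean time between failures) calculation over OREDA-style records. The records and field names below are invented for illustration; the real OREDA taxonomy of failure modes and severity classes is far richer.

```python
# Invented, OREDA-style aggregate records for two equipment units.
records = [
    {"unit": "pump-A", "operating_hours": 61_000, "failures": 4},
    {"unit": "pump-B", "operating_hours": 48_500, "failures": 2},
]

total_hours = sum(r["operating_hours"] for r in records)
total_failures = sum(r["failures"] for r in records)
print(f"Fleet MTBF: {total_hours / total_failures:,.0f} h")  # 18,250 h
```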
The US Energy Information Administration’s International Energy Portal includes several new features. Data can be viewed in physical units, BTUs of energy content, Terajoules and million tons of oil equivalent. The world energy overviews, rankings, and country analysis include annual datasets through 2018, along with many for 2019. The Portal now includes monthly updates for petroleum and other liquids production for the world, and energy consumption figures for OECD countries.
The EIA has also updated its US Energy Atlas with a new interface* for web map applications and a comprehensive open data catalog. The Atlas shows detailed energy infrastructure in redesigned maps with enhanced navigation and data accessibility features. Users can now combine the EIA data with information from other sources for custom geospatial analysis. The Atlas features 84 map layers showing the locations of power plants, coal mines, oil and natural gas wells, pipelines, storage facilities, natural gas processing plants, refineries, and other types of energy facilities.
* The Energy Atlas was built with the Esri ArcGIS hub, an ‘easy-to-configure community engagement platform’.
Fugro and project leader EOMAP are to ‘remotely map and monitor’ seafloor habitats, morphology and shallow water bathymetry. The 3-year, EU Horizon 2020 ‘4S’ (Satellite Seafloor Survey Suite) project will leverage artificial intelligence, physics models, and satellite and airborne data to de-risk marine site survey using satellite data.