What has Spotify got to do with a bottom hole assembly? A lot, according to Scott Sims* (Baker Hughes), who has been trialing Word2Vec, the technology behind Spotify’s recommendation engine, to build a recommendation engine for bottom hole assemblies. Speaking at the 2021 Nvidia Global Technology Conference, Sims presented a means of clustering and identifying similar BHAs ‘regardless of operator or basin’. Drilling bottom hole assemblies comprise everything between the end of the drill pipe and the rock face. They are made up of various elements: motors, sensors and other components, along with crossovers that tie bits of kit together and to the drill bit itself. Optimizing horizontal ‘factory’ drilling requires a knowledge of how different BHAs perform in different circumstances. Different diameters can cause the BHA to sag, compromising the logging-while-drilling response. A smart recommendation from the model might advise adding stabilizers.
Drilling parameters including rate of penetration, weight on bit and brake speed are easily recorded and available to train a neural net model. Capturing the makeup of the BHA is more problematic. BHA descriptions are recorded, but in rather summary form, in a small text file that tabulates components from different vendors. The massive number of possible component combinations makes it impossible to derive a simple rules-based approach. Data is recorded in a more or less free-form text description with ‘words’ such as “6 ¾ x-o 4 ½ to 4 ½”, “9 ½ x-treme motor” and so on. There is no canonical format for (e.g.) motors, crossovers etc., so regular pattern matching fails.
The problem is analogous to automatic translation. It turns out that a neural net-based approach such as Word2Vec works well, even in the presence of oddball abbreviations and misspelt words. Word2Vec puts similar words close together, and these can be tied to BHA component types – motor, stabilizer, x-over etc. Similar words should (and do) get close embeddings in the word vector space**, which shows hot spots for ‘bit’ and ‘motor’. More sense can be obtained from the data by adding ‘context’, i.e. incorporating surrounding words. This led Sims to use Doc2Vec, treating the whole BHA string of words as a ‘paragraph’.
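The fuzzy-matching idea can be illustrated with a toy sketch. This is not the trained Word2Vec/Doc2Vec model Sims used: instead it compares free-form BHA strings by cosine similarity over character trigrams, which, like learned embeddings, tolerates misspellings and odd abbreviations. All component strings below are hypothetical.

```python
from collections import Counter
import math

def ngrams(text, n=3):
    # Character trigrams tolerate misspellings and shorthand
    # better than exact word matching.
    t = " ".join(text.lower().split())
    return Counter(t[i:i + n] for i in range(len(t) - n + 1))

def cosine(a, b):
    # Cosine similarity between two sparse count vectors.
    dot = sum(a[k] * b[k] for k in a if k in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Hypothetical free-form BHA component descriptions.
bha_a = "9 1/2 x-treme motor"
bha_b = "9.5 xtreme mtr"           # same kit, different shorthand
bha_c = "6 3/4 x-o 4 1/2 to 4 1/2" # a different component

va, vb, vc = (ngrams(s) for s in (bha_a, bha_b, bha_c))
print(cosine(va, vb))  # higher: similar components, messy spelling
print(cosine(va, vc))  # lower: different component
```

A trained embedding model goes further, learning that (say) ‘mtr’ and ‘motor’ co-occur in similar contexts even when they share few characters; the trigram trick only captures surface similarity.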
Baker Hughes has trained its neural net on some 10,000 BHAs from different North American basins. Sims graciously acknowledged the performance of Nvidia’s high-end DGX-1 GPU computer and containerized machine learning software stack. ‘We are definitely getting our money’s worth from this box’. The system now provides feedback into the field, to inform drillers of any issues with similar BHAs that were run in the past.
Word2Vec and similar tools are used in machine translation and by companies like Spotify, Airbnb and Google, where their ability to perform ‘fuzzy’ matches compensates for poor-quality or ambiguous input data. Sims concluded, ‘The oil industry is used to working with [hard] sensor data. Working with text is something new for us. The study has produced very encouraging early results and we are building a great foundation for real-time work and field deployment.’
Comment: Sims’ comment on sensor data vs. text is interesting in that it contrasts with the sales pitch that came from advocates of AI/ML. The early hope was that ML would find hidden truths in large volumes of good data. The BHA study, like other applications of ML such as well event detection and screening scanned well logs, is more concerned with making something useful from rather poor data. For another example of an NLP-style approach, see Rice University’s SESDI in our report from the 2020 HPC in Oil & Gas conference in this issue.
* With co-author Mohammad Khan.
** You can play with a word vector space and read the Word2Vec tutorial on tensorflow.org.
CGI has announced CGI Pivot, a platform-as-a-service offering based on the OSDU data platform. CGI Pivot provides the subsurface community, from oil and gas operators and regulators to renewables organizations and academic institutions, with an ‘easy and low risk’ starting point to explore and migrate to the OSDU Data Platform. Pivot offers data ingestion, a map-based display and ‘embedded intelligence’ to extract metadata and recognize geospatial data. A topology service links trajectories and wells, automating technical data managers’ work. CGI’s OSDU credentials appear good. The company has been under contract to Shell since 2013 for the provision of key application services in support of Shell’s TaCIT technical and competitive IT unit, which provides IT solutions in support of Shell’s upstream and downstream business. CGI is also working to extend CGI Pivot to include renewable energy, with a pilot of wind and photovoltaic data on OSDU. A new renewables data management system will soon be available on CGI Pivot. Read CGI Netherlands’ Michaël van der Haven’s blog on CGI Pivot on the OSDU Forum website.
A 12-page publication from Schlumberger, ‘Understanding the OSDU Opportunity, a Schlumberger Guide to the OSDU Data Platform’, confirms the consubstantial nature of the code for OSDU and Schlumberger’s Delfi. ‘Following the contribution of the open sourced Delfi data ecosystem into the OSDU data platform code, the OSDU data platform sits at the heart of the Delfi environment’. Release 4, due out later this year, will be the ‘official V1.0’ of the OSDU data platform. R3 already includes the full open-sourced Delfi data ecosystem contribution by Schlumberger. Schlumberger’s explainer also includes the structure of the central OSDU management committee, which comprises representatives from BP, Chevron, Equinor, ExxonMobil, Shell, Schlumberger, EPAM, Teradata, Dell, Google, and Microsoft. On the question of OSDU-certified apps, Schlumberger is clear: while certification by OSDU may be offered at a future date, right now there are no legitimate OSDU-certified solutions. ‘Currently, the plans for certification are at a platform level, meaning there will be no ‘OSDU-certified’ apps for the time being. Meanwhile ‘solutions that claim to be OSDU-compliant or OSDU-certified have not been verified or endorsed by The Open Group’.
Gazprom Neft has joined the OSDU Forum and expects that OSDU ‘will become a platform for digital technology collaboration that erases borders and gives an opportunity to contribute and market products for the Russian technology ecosystem’. Gazprom Neft plans to contribute its own software to the OSDU platform, as both open-source code and as commercial products. Gazprom partner CROC has also joined OSDU.
Landmark is planning to ‘liberate’ data from applications in a transition to OSDU. Data needs to be quality controlled before input to OSDU in a ‘robust ingestion pipeline’. With this in mind, Landmark has committed to making ‘all Landmark products and services OSDU-compliant’*. DecisionSpace 365 software as a service on OSDU provides a means of ingesting large quantities of data from a variety of sources. These are now available publicly with the R3 commercial release of OSDU. Landmark has also contributed new code to OSDU to support binary log data storage and allow performant access to metadata, files and high precision log curves. Data can be accessed from the OSDU Search APIs or a Jupyter notebook. More from Landmark.
* Notwithstanding Schlumberger’s warning on OSDU-compliant claims!
Petrosys’ R&D roadmap includes a commitment to OSDU. The company has already developed a connector to extract and view well data from an OSDU store. The commercial version is available on OSDU Mercury R3 and enables mapping, modelling and data exchange through connected workflows. Petrosys is seeking feedback from its users on the future direction of its OSDU program and is ‘open to broader collaboration’. Initial thoughts are for enhanced ingestion of OSDU data into Petrosys PRO and on data exchange and integration, with a focus on the migration of legacy data into OSDU. Provision of a coordinate reference system service is also under consideration. More from Petrosys.
Schlumberger is working with Equinor to deploy a Delfi subsurface data environment ‘fully integrated with the OSDU data platform’ in what is described as the ‘first major deployment of OSDU’. The OSDU-enabled solution will be embedded as a key part of Equinor’s Microsoft Azure enterprise-wide data platform, OMNIA. This will establish consistent data standards across the subsurface to enhance overall decision making. At the application level, Equinor’s project will include the Delfi Petrotechnical Suite, ExplorePlan (an exploration planning solution co-developed between Schlumberger and Equinor), and data science solutions from Schlumberger.
The last couple of months have been amazing. The environmental movement has trounced the oil and gas industry in shareholder meetings and in the hearts and minds of the western world. Regulators over here have announced the end of life of my car for a couple of years out and my oil fired central heating is likewise under threat! I said ‘western world’ advisedly because I am pretty sure that middle eastern producers do not have the same concerns. No more do the Russians if we judge from Rosneft’s plans, as reported in the Financial Times, to build ‘15 new towns to house 400,000 employees’ in support of the giant Vostok oil development.
Objectively, the efforts of the environmental movement to ‘defund’ oil and push the majors into the energy transition will likely drive up the price of green projects, as everyone piles in at the same time. It may also drive the price of oil up, as the cost of capital rises and the social acceptability of oil and gas exploration makes for a reduced exploration playground and a scarcer resource. An oil price rise does assume that demand remains high, which depends on how quickly the world comes out of covid and how fast (or slow) the energy transition goes.
That makes for a lot of imponderables but if I was an investor, I would prefer the upside available in a pure play oil company to the opportunities that may arise from an overcrowded ‘energy transition’ play. I may have been influenced in my thinking by a report from Deloitte that found inter alia that oil ‘still delivers significant value for many. Two-thirds of oil-heavy portfolios deliver above-average performance.’ Deloitte also finds that ‘green’ energy transformation may be neither profitable nor scalable. ‘Of portfolios that have become greener, [only] 9% delivered top quartile financial performance’.
As a lifelong worker in oil and gas, I have to admit some schadenfreude with regard to the difficulty that the energy transition is experiencing. I am not saying that this is a good thing. I try to refrain from the atavistic defense of the oil industry that is evidenced in many LinkedIn comments. In fact, we have been doing our bit for the energy transition in Oil IT Journal for some time. Our first ‘going green’ rubric appeared in 2010. In this issue ‘going green’ weighs in at over 3,000 words. It is well worth a read to see what is happening, who is doing what, who is posturing and how. There are many carbon capture projects ‘under study’. The US leads the world in the number of CCS projects. Up till now, the driver has been the commercial use of CO2 in enhanced oil recovery. But, as the demise of the Petra Nova project shows, commercial CCS (like shale) needs a high oil price to work. Maybe the updated 45Q tax régime will change this.
Oil and gas had a wakeup call with the Colonial Pipeline ransomware attack. We have covered cybersecurity in oil and gas for at least 15 years. At the 2005 DigitalSecurity event, we reported on the then newish notion of deperimeterization as IT muscled in on the plant, dissing clunky old scada systems. The deperimeterization movement was launched by The Open Group back in 2002 with the formation of the Jericho Forum, with a mission to ‘evangelize the issues, problems, solutions and provide thought-leadership around the emerging business and security issues of deperimeterization’. Ten years later (in 2013), deperimeterization was declared a success, a ‘landmark victory in the evolution of information security’. The Jericho Forum was duly ‘sunset’. Maybe the sun set too soon on Jericho!
I recently read Joseph Menn’s account of the Cult of the Dead Cow, on ‘the original hacking supergroup’ (well I tried, not exactly easy going). In between the rather breathy account of curious personalities, Menn’s account constantly refers to the abysmal security of early versions of Microsoft’s software. Although Cult of the Dead Cow does not cover the process field, one does have to wonder what the contribution of Microsoft’s OLE for process control was in deperimeterization. I’m sure that Microsoft’s security has improved since the early days, but then, so has the acumen of the hacker. For more on the Colonial Pipeline attack and on the latest developments in cyber security read the cyber security round-up in this issue. Another bumper (2,000+ words) edition!
I noted that the Society of Exploration Geophysicists (SEG) and the American Association of Petroleum Geologists (AAPG) are to hold joint annual meetings ‘for at least the next five years’. The first combined event, now billed as the International Meeting for Applied Geoscience & Energy, IMAGE ’21 will be held as a ‘hybrid’ online and in-person event in Denver next September. Meanwhile, the AAPG is engaged in discussions with the Society of Petroleum Engineers (SPE) to ‘formally explore’ a merger of the two organizations. To my mind, the geophysics and geology disciplines make more natural partners than engineering. So joining the trade shows is understandable in the context of a diminished industry. But engineering is already such a broad church that adding in geology would make the SPE’s focus even more diffuse. I’m speculating, but the dual collaborations (AAPG & SEG, AAPG & SPE) would seem to herald a hook-up across all three bodies, at least for the annual trade show. If that means more exchange of ideas between the disciplines that’s great. If it means even bigger booths and yet more razzmatazz, maybe not such a good idea.
Allan Huber, current Chairman of the PPDM Association, opened the proceedings recalling an ‘amazingly successful’ 30 years of PPDM*, founded as a community around a data model, a ‘consistent way of capture and sharing petroleum data’. Over time, activity has expanded under the umbrella of IPDS, the International Petroleum Data Standards. Huber acknowledged Trudy Curtis and her staff ‘for making all this possible’.
Melvin Huszti looked back over the early days of PPDM. At the time he was working for Gulf Canada and was fascinated by computers in the oil industry. In the late 1980s, an oil price collapse meant ‘getting rid of the costly mainframe’ and moving the company’s Oracle database to a SPARC workstation. Gulf was looking for other willing companies to share development and provide users and testers for the migration effort. The legal department OK’d the move to the public domain and the creation of an association to look after the model. Thus was PPDM born.
Yogi Schulz took up the story. Before PPDM, companies operated mainframes with separate databases for seismic, land and mapping. Each company developed its own software in-house and ran its own unique database. Applications were few and far between and datasets were sparse. The cost-cutting of the late 80s coincided with the emerging technology, notably the Oracle RDBMS. Gulf Canada was ‘thinking outside of the box’, along with other companies like Finder Graphics, Applied Terravision and Digitec Info Services. The original PPDM data model was unveiled at the May 1990 AAPG convention and published in Geobyte in October 1990. At the time the model was distributed for $100. Shortly afterward PPDM was incorporated in Alberta as a not-for-profit. The first Houston meet was held in 1991. In 1993, the model (PPDM 3.0) was delivered in an Oracle Case tool.
Later in the 1990s, PPDM faced some competition. From the IBM Mercury petroleum data model and from Epicentre, a new data model from the Petroleum Open Software Consortium (POSC, now Energistics). PPDM tried to collaborate with POSC but the effort failed because of different views on model technology and because of culture. PPDM focused on current technology whereas POSC was more interested in future technology wherever that might be going**. PPDM then refocused on developing a ‘business-driven’ data model and saw industry adoption increase.
2008 saw the name change to the Professional Petroleum Data Management Association, a shift to a wider vision of data management, and an expansion into oil and gas semantics, education and professional development, with ongoing commitment from workgroups and community. The move to semantics saw the start of the ‘What is a Well?’, ‘What is a Completion?’ and ‘What is a Facility?’ projects. Education and training led to the creation of the Certified Petroleum Data Analyst qualification. PPDM has been and is still a ‘great, continuing story’.
PPDM CEO Trudy Curtis then introduced a round-the-world tour of the videos produced by the volunteer committees, many of which will soon be available on the PPDM YouTube Channel.
Allan Huber returned to present PPDM’s new strategic plan. This is to track oil and gas companies’ transformation into energy companies and their need to reduce or eliminate emissions and align with the Paris agreement. This will mean increased sensor data and cloud computing, and represents a tremendous opportunity for PPDM in the form of the nascent data management needs of new energy. The ‘What is a Facility?’ semantics work needs to expand into environmental standards to embrace emissions, geothermal wells and CCS alongside ‘legacy’ oil and gas. For all these domains there is a need for ‘technology-agnostic’ data objects derived from the model. PPDM intends to stay at the forefront of digital and data.
Today, PPDM is facing another ‘competitor’ in the form of OSDU, the Open Subsurface Data Universe. Currently, PPDM is working hand in hand with OSDU, with participation in various OSDU programs. A division of labor has been mooted where OSDU certifies technology and PPDM certifies people. Many PPDM definitions have fed into the OSDU work. More of these will be required as new data definitions are needed for greenhouse gases and mining, ‘all areas for PPDM members and standards’.
Bill Whitelaw, from sponsor company geoLOGIC systems, wound up the proceedings recognizing PPDM’s data quality work which has ‘fed into credible corporate ESG’. Whitelaw also acknowledged PPDM’s contribution to geoLOGIC’s success, ‘We are Western Canada’s subsurface data leader thanks to PPDM’.
* PPDM was founded as the ‘Public Petroleum Data Model’ association. In 2008, it changed its name to the ‘Professional Petroleum Data Management Association.’ At the time, PPDM CEO Trudy Curtis explained, ‘Industry now looks to the PPDM for leadership in data management and governance, business knowledge and master data management. PPDM’s activity now includes best practices, certification and training’.
** For more on the failed PPDM/POSC collaboration, read the article ‘Discovery, what did they find?’ that appeared in Oil IT Journal’s very first issue, back in 1996.
Oil IT Journal - dGB recently announced a Consortium* to demonstrate the ‘future of E&P data management’, using machine learning and OSDU. How did this start?
Paul de Groot – I began as something of a skeptic with regard to OSDU, having seen similar earlier initiatives from POSC and OpenSpirit fail. Things changed when Schlumberger put its weight behind the project and threw its Delfi data infrastructure into the pot as open source software. OSDU now looks to be a game changer. I believe it will be the standard in the near future. So the next step for dGB is to make OpendTect adhere to OSDU.
The Consortium is billed as a ‘digital transformation demonstrator’ what does this mean?
OK, nobody knows what the ‘digital transformation’ really means, but there is a consensus that machine learning/artificial intelligence is key. So we are setting out to demonstrate ML capabilities with OSDU, ‘the database of the future’.
Isn’t this a takeover of the industry by Schlumberger?
I don’t know. Maybe Schlumberger sees a business model here and is expecting more work via OSDU. But the Delfi formats are now open and will be the platform of the future.
What do you see as a killer app for AI?
I’ve been working for 30+ years in AI for geophysics, in the last 3-4 years on deep learning. DL, with accelerators, can do stuff that was not possible previously with shallow nets. Now some models can be re-used on other data sets. You can train a model to produce shear logs from sonic and apply it across the whole of the North Sea. You can build a library of direct hydrocarbon indicator artefacts and apply it everywhere. There is more to this than hype.
Which cloud provider is leading OSDU?
Amazon is probably in the lead. But there is also stuff running on Azure for some clients. OSDU is also on Google Compute Cloud and IBM/RedHat.
Are you backing Open VDS as the new default seismic format?
Yes, we are!
So does that mean that the first step in moving to the OSDU world is migrating SEGY (and other formats) to OpenVDS?
No, OpendTect supports SEG-Y as a native format, which we will continue to support for file-based access. But if you want to store data in the cloud via OpendTect’s implementation of OSDU, the seismic data will be stored in OpenVDS and the well data in Delfi. We expect most vendors to follow this implementation, which is recommended by OSDU. If you want to store SEG-Y data in the cloud, you can for instance use Osokey’s implementation of OSDU.
*The OpendTect/OSDU Machine Learning Consortium.
Oil IT Journal - What is the rationale behind Ikon Science’s latest offering Curate?
Dennis Saussus - We spent some 20 years developing RokDoc, our flagship package for real-time geomechanics, pore pressure prediction and more, and all this will continue. At the same time, we have learned a lot about the challenges that clients face, knowledge sharing, working with data trapped in specialized desktop apps, transferring data, finding out what has already been done. All this in the context of project management and decision support. These problems have been compounded in the last five years with downturns, people leaving or being let go.
How does Curate relate to your existing portfolio? Is it a rejuvenated iPoint?
In some ways, yes. iPoint helps people find and share data. It started with core data and expanded to a large range of data types. In Curate, we set out to combine iPoint functionality with RokDoc, adding a modern user interface.
What is Curate’s scope, what applications are you seeking to displace?
We are not seeking to displace other applications. Rather to build apps around RokDoc that automatically reuse data. For instance, a web app that supports and democratizes AVO modeling with sensitivity analysis.
The focus is still reservoir geophysics?
Yes. Our aim is to expand with apps for pore pressure, real-time ..
So this is a platform?
Who do you sell to? End users, geophysicists, IT?
A good question! The answer is ‘all of the above’. The Curate workspace integrates application databases including Kingdom, OSDU, Petrel … so it is key to talk to the IT department and leverage Curate’s open API.
Is the API provided to clients or just for Ikon internal use?
It is very much our intent to offer an open API to clients and third parties including other software vendors.
An OSDU purist might argue that vanilla OSDU does much of this already? Is there overlap here?
OSDU’s focus is more on the data management/access end. Curate’s main focus is at the end-user level.
How open is ‘open’? Will this be on Git Hub as open-source?
No, this will not be open-source – but the APIs will be ‘open’ at some point down the road.
The IEA says (today) that oil exploration must end. What is your backup/alternative plan?
Yes, this is one reason for broadening our offering into the data/workflow space. We expect our users to be working in the geothermal/CCUS/renewables areas. Data/Workflow is key to all of these. We do anticipate working outside of traditional oil and gas. The intensity of carbon-neutral pressure will only grow. But the world will not switch off hydrocarbons overnight. Curate opens the door to new products beyond oil and gas.
Vincenzo Lipari presented Milano Poly’s* entry to the Norwegian Force Predicted Lithology using Machine Learning competition as ‘lots of brilliant ideas, too bad they don't work’. Lipari observed that ‘almost all the top scores in the Force predicted lithology competition derive from the use of ‘classic’ machine learning methodologies and a standard approach to the problem. It is therefore puzzling that more innovative approaches and more sophisticated reasoning do not improve the result in any way’.
Lipari referred to the 2016 SEG Machine Learning contest which resulted in a ‘very influential’ standard approach to lithological ML. However, applying the usual techniques of ‘trees, xgboost and random forest’ produced poor results. Lipari suggests that the Force dataset may contain a lot of data, but that ‘maybe from a deep learning perspective this is still too small a data set’.
Handling geographical information also proved problematic. Lipari (who confessed to being better versed in signal processing than in geology) found no evidence that new geological meaning was being derived from geographical information, although this is possibly due to the model already embedding XY coordinates. In the end, the most successful model used an XGBoost method. More sophisticated ‘modern’ deep-learning-based sequence models all failed, again probably because ‘big’ well data is ‘small’ for deep learning. Even the standard approach appears to have an upper limit to its precision score of around 80%. Lipari concluded that to improve performance, it is probably useful to be guided by a geologist and to embed geological knowledge into the model. Watch Lipari’s and other Force presentations on YouTube.
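The ‘classic’ tree-ensemble approach can be illustrated with a toy sketch. This is not the contestants’ XGBoost pipeline: a single decision stump (the building block of random forests and gradient-boosted trees) picks a cut-off on one well-log feature, using synthetic gamma-ray values and made-up lithology labels.

```python
import random

def best_stump(values, labels):
    """Find the threshold on one log feature that best separates
    two lithology classes (predict 'shale' above the cut-off)."""
    best_t, best_acc = None, 0.0
    for t in sorted(set(values)):
        preds = ['shale' if v > t else 'sand' for v in values]
        acc = sum(p == y for p, y in zip(preds, labels)) / len(labels)
        if acc > best_acc:
            best_t, best_acc = t, acc
    return best_t, best_acc

random.seed(0)
# Synthetic gamma-ray readings: shales 'hot', sands 'cool'.
gr = [random.gauss(30, 8) for _ in range(50)] + \
     [random.gauss(90, 10) for _ in range(50)]
litho = ['sand'] * 50 + ['shale'] * 50

t, acc = best_stump(gr, litho)
print(f"cut-off at GR={t:.1f} API, training accuracy {acc:.2f}")
```

Real ensembles combine hundreds of such splits over many features (GR, density, sonic, resistivity…), which is precisely why they need more labelled wells than the Force dataset provides before ‘modern’ deep nets can beat them.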
*Politecnico di Milano Image and Sound Processing Lab.
Comment: The ‘big data is too small’ issue may affect other applications of ML in oil and gas such as Total’s work on ESP monitoring.
Speaking on the “Next Five Years of HPC” panel, Katy Antypas (Berkeley Lab) called for more connectable HPC systems to allow complex workflows to span different hardware with specific capabilities. Optimizing end-to-end workflows should be the aim rather than building a massive binary application. TACC’s Dan Stanzione was philosophical, ‘In five years we will still be fighting heterogeneity, and there will not be a single programming language’. There will be more object stores, but ‘Posix isn’t going anywhere’. One ‘very problematic’ trend will continue, the ‘fork between the exascale crowd (versed in heterogeneous architectures, C/C++, HDF5 and storage hierarchies) and everyone else who just use Python!’ Andrew Jones (Microsoft) saw the cloud as the ‘future supercomputer’. In five years we will be evaluating a significant new technology (quantum computing?). Shell’s David Baldwin placed HPC in the context of the energy transition. Seismics will continue but will be a smaller part of the HPC pie as we see new uses and applications with disparate and even conflicting requirements. Data sizes will continue to grow, and data management and movement will be a challenge, requiring careful choices.
Mary Wheeler (UT Austin Center for Subsurface Modeling) presented on Bayesian optimization for field-scale geological carbon sequestration. With help from IBM, a framework has been developed for blending high fidelity geological models with machine learning techniques. Wheeler traced current CO2 sequestration initiatives, from ExxonMobil’s $3 billion test and Oxy’s Midwest CO2 ‘Superhighway’ to the Illinois CCS project. CCS modeling presents a range of challenges. To model fluid migration and interactions, cap rock integrity and more, scale-variant, non-linearly connected, space-time dependent grids are required, incorporating geochemistry and geomechanics. Enter IPARS, the integrated parallel, accurate reservoir simulator, the CCS simulation workhorse. IPARS embeds components from CMG and others, notably IBM’s Bayesian Optimization Accelerator (BOA). Here, the true analytical functions are replaced with surrogate Bayesian models whose predictions inform subsequent workflow steps, making for a more computationally efficient system. Parallelizing the approach is both necessary and hard, which is where BOA fits in, running in the IBM cloud. The approach was demonstrated using data from the Cranfield CCS site in Mississippi, with a 60% reduction in compute time.
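The surrogate-model idea can be sketched in a few lines. This is a deliberately simplified stand-in for what BOA does: a nearest-neighbour surrogate with distance-based uncertainty replaces a full Gaussian process, and a toy one-parameter function stands in for an expensive reservoir simulation run. Function names and the optimum location are invented for illustration.

```python
def simulator(x):
    # Toy stand-in for an hours-long reservoir run:
    # a made-up 'storage efficiency' peaking at x = 0.62.
    return -(x - 0.62) ** 2

def ucb(x, samples, kappa=0.3):
    # Surrogate mean = value at nearest evaluated point;
    # 'uncertainty' grows with distance from evaluated points.
    nearest = min(samples, key=lambda s: abs(s[0] - x))
    return nearest[1] + kappa * abs(nearest[0] - x)

grid = [i / 100 for i in range(101)]
samples = [(0.0, simulator(0.0)), (1.0, simulator(1.0))]
for _ in range(12):
    # Pick the point where the acquisition function is highest,
    # run the expensive 'simulation' there, and refit.
    x_next = max(grid, key=lambda x: ucb(x, samples))
    samples.append((x_next, simulator(x_next)))

best_x, best_y = max(samples, key=lambda s: s[1])
print(best_x, best_y)
```

The point is the economics: twelve ‘simulator’ calls locate the optimum, where a grid search would need a hundred. BOA’s real machinery (Gaussian-process surrogates, parallel acquisition) serves the same purpose at field scale.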
IBM’s Chris Porter and Martina Naughton showed how the Bayesian Optimization Accelerator (BOA) is used to find the ‘ideal solution at lowest cost’. BOA is delivered as a stand-alone appliance along with its own API and GUI, located alongside the HPC environment. BOA is delivered on either x86, Power or ‘anything else’ architectures. According to IBM, BOA is used for ‘hyperparameter optimization’ and automatic machine learning, ‘two applications among many, BOA is a machine learning algorithm in itself’. More on BOA from IBM.
Ricard Durall (Fraunhofer Institute) presented a generative model for transfer learning on seismic data. Progress in machine learning means it is now possible to automate tasks that were previously only doable by expert geophysicists*. Applications include fault picking, horizon detection and salt body segmentation. Machine learning methods traditionally fall into two camps: those trained on human-labelled real data and those trained on synthetic models. Fraunhofer’s approach uses generative adversarial networks (GANs) to enhance and automate labelling of the training data. The system generates artificial faults and diffractions. The approach is said to outperform standard methods that use pure synthetic data.
* This is something of a bold claim as automated interpretation has quite a long history, dating back much earlier than the current AI/ML boom.
Sergio Botelho (RocketML), with researchers from Shell and Rice University, presented on 3D seismic facies classification using distributed deep learning. His starting point was the observation that seismic facies classification is a 3D problem and hence is not amenable to 2D approaches. Using a 3D CNN is however extremely compute-intensive. Enter distributed deep learning, a.k.a. DeepFusion*, demonstrated on the SEG’s Parihaka 3D dataset. DeepFusion supports massive networks on large-scale CPU/GPU clusters. It has shown ‘excellent strong scaling of 3D seismic classification problems on massive networks’.
* We asked RocketML’s Vinay Rao for more on DeepFusion. He replied ‘DeepFusion is a RocketML-developed distributed deep learning framework that abstracts out the complexities of HPC, purpose-built for solving large scale machine learning problems. It is particularly useful when the data resolution sizes, model sizes, batch sizes are too big to fit into the compute memory footprint. DeepFusion works on a "cluster of computers", be it GPU or CPU, out of the box without any customization, special code or knowledge of HPC and Cloud services. Users can focus on solving a machine learning problem vs. struggling with HPC hardware+software infrastructure. Many applications in the Oil and Gas industry are compute-intensive, including seismic-related, and can benefit from RocketML DeepFusion. This technology is available to customers who subscribe to RocketML SaaS product.’
Qie Zhang (and others from Microsoft and Imperial College London) demonstrated a cloud-native approach for 3D full waveform inversion on Microsoft Azure. The ‘hyperscale’ seismic imaging toolset is built on Docker, Kubernetes and Dask, and programmed in Python using the open source Devito framework. Dask is a scheduler for parallel programming in Python. The Devito FWI code runs on multiple CPU/GPU architectures from Intel, AMD and Nvidia and was demonstrated on 256 Azure virtual machines. HDF5-based lossy compression allowed for a 15x reduction in the data footprint. Tests were performed on the SEG/EAGE Salt Overthrust open dataset.
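FWI parallelizes naturally over shots: each shot’s forward model and gradient are independent, and the per-shot gradients are then summed into one model update. That map-reduce pattern is what Dask schedules across cloud nodes; the sketch below shows the same shape with the standard library only, where threads stand in for VMs and a toy function stands in for a Devito wave-propagation kernel (all names here are illustrative, not the Azure toolset’s API).

```python
from concurrent.futures import ThreadPoolExecutor
import math

N = 4  # toy model size (a real gradient is a 3D velocity update)

def shot_gradient(shot_id):
    # Placeholder for an expensive per-shot PDE solve.
    return [math.sin(shot_id + i) for i in range(N)]

def fwi_gradient(n_shots):
    # Map: compute each shot's gradient in parallel.
    with ThreadPoolExecutor(max_workers=8) as pool:
        grads = list(pool.map(shot_gradient, range(n_shots)))
    # Reduce: sum per-shot gradients into a single model update.
    return [sum(g[i] for g in grads) for i in range(N)]

print(fwi_gradient(8))
```

With Dask the same structure becomes `dask.bag` or delayed tasks over hundreds of VMs, and the scheduler handles placement, retries and the gather step.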
Zhaozhuo Xu and Aditya Desai (Rice, working with Shell) presented on ‘Beyond convolutions - a novel deep learning approach for raw seismic data ingestion’. The idea is to shorten the traditional workflow of seismic processing by going straight from raw data to the subsurface model, reducing processing times from months to minutes. Current data-driven research treats seismics as image data. This is a ‘sub-optimal’ approach as raw seismic data is ‘at least’ five dimensional. Unsurprisingly, such methods have not been successful. ‘Raw seismic data is not an image and should not be processed as one’. The authors instead propose an approach that is more akin to natural language processing. Enter SESDI (set embedding-based seismic data ingestion). SESDI breaks down large-scale prediction into a small auxiliary task that ‘gracefully’ incorporates data irregularities. SESDI is claimed to be the ‘first successful demonstration of end-to-end machine learning on real seismic data’. Read the SESDI paper here.
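The ‘set’ framing can be illustrated with a DeepSets-style embedding: encode each trace independently, then pool with a permutation-invariant sum, so the result does not depend on trace order or set size. This toy NumPy sketch shows the generic idea only; SESDI’s actual architecture is in the paper.

```python
import numpy as np

# Toy permutation-invariant set embedding (DeepSets-style), illustrating
# the generic idea behind set-based ingestion of irregular seismic data.
# Not SESDI's architecture.

rng = np.random.default_rng(1)
W = rng.normal(size=(4, 16))  # per-element encoder weights ("phi")

def embed_set(traces):
    # traces: a list of feature vectors; sets may be of any size
    h = np.tanh(np.stack(traces) @ W)  # encode each element independently
    return h.sum(axis=0)               # permutation-invariant pooling

traces = [rng.normal(size=4) for _ in range(5)]
e1 = embed_set(traces)
e2 = embed_set(traces[::-1])           # same set, reversed order
print(np.allclose(e1, e2))             # True: ordering does not matter
```

Because the pooling is a sum, adding or dropping traces changes the embedding gracefully rather than breaking a fixed-size input assumption, which is the property that matters for irregular acquisition geometries.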
Comment - One question that arises from an approach that removes geophysical ‘smarts’ from the processing workflow is survey design. Geophysical survey is the first step in a workflow that is tuned to both the geological problem and to subsequent processing steps. If these are replaced with AI, who designs the survey and with what in mind?
Hervé Gross presented GEOSX, the Total-backed open source CO2 sequestration simulator. CCUS is said to be ‘at the limit’ of existing simulators’ capabilities. GEOSX blends fluid flow simulation with geomechanics and HPC R&D, and was developed on use cases supplied by Total. HPC innovations include the RAJA/LvArray performance portability framework and Chai, a ‘copy hiding application’.
We asked Gross what, if any, was the relationship between GEOSX and the NETL’s ‘CCSI’. He replied, ‘There is no relationship between GEOSX and the CCSI-toolset released by NETL. The NETL toolset aggregates a number of utilities for carbon capture, whereas we focus on geological carbon storage. Their most dynamic repository (FOQUS) is a Python-based platform for optimization and uncertainty quantification. FOQUS seems to be a very flexible “pegboard” where you can easily create workflows by tying various pieces of software together, but all are related to engineering for carbon capture. Flexibility seems to be their selling point for solving surface engineering optimization problems. We have a different objective: we simulate geological CO2 injection in large formations. Simulating such subsurface phenomena requires multiphysics modeling that is not numerically tractable without a scalable design’.
Comment: In any event, CCS modeling is difficult, as we concluded in 2015: ‘Numerical evaluations of CCS projects in saline reservoirs showed that it is very hard to find a target that matches all of the desired parameters. In general, sequestrable volumes shrink as long-term migration risk to aquifers and caprock integrity concerns are considered.’
AquaNRG’s Spatika Iyengar showed how cloud-based HPC is used in digital rock chemistry/physics. AquaNRG’s reactive transport models solve multi-phase flow, solute geochemistry and biogeochemistry. A web application runs multiple pore scale models to obtain relative permeability and capillary pressure saturation relationships in what is described as a ‘digital twin for special core analysis (SCAL)’. The cloud architecture includes Lambda functions, DynamoDB, S3 buckets and Step Functions in a Dockerized implementation of a ‘continuous integration/continuous deployment’ strategy.
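For readers unfamiliar with SCAL outputs, the kind of relationship these pore-scale models deliver can be illustrated with the textbook Brooks-Corey relative permeability curves. This is a generic stand-in, not AquaNRG’s method, and all parameter values below are illustrative.

```python
def corey_relperm(sw, swc=0.2, sor=0.2, krw_max=0.4, kro_max=0.9, nw=2.0, no=2.0):
    """Illustrative Brooks-Corey relative permeability curves, a textbook
    stand-in for the kr(Sw) relationships derived from pore-scale models.
    All parameters (endpoints, exponents) are made-up example values."""
    # Normalized (mobile) water saturation, clamped to [0, 1]
    s = max(0.0, min(1.0, (sw - swc) / (1.0 - swc - sor)))
    krw = krw_max * s ** nw          # water relative permeability
    kro = kro_max * (1.0 - s) ** no  # oil relative permeability
    return krw, kro

krw, kro = corey_relperm(0.6)
print(round(krw, 3), round(kro, 3))
```

A digital SCAL workflow would fit curves of this shape to the output of many pore-scale simulations, rather than assuming them a priori.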
Next year, for the 15th edition, Rice is to change the name from the Oil and Gas HPC Conference to the Energy HPC Conference. A sign of the times…
Watch the Rice HPC in Oil and Gas presentations on YouTube.
An aside … In an interview that appeared in EAGE’s First Break, seismic luminary Oz Yilmaz was asked what he thought of ‘digitalization’ in geoscientific work. He replied, ‘There has been a strong push to apply AI and its variants - ML, DL, CNN - to solve difficult problems in exploration seismology, thanks to highly influential propaganda from the high tech companies of the Silicon Valley. With regards to AI’s applicability to problems in seismic data processing, inversion, interpretation and integration of diverse geoscience data, it is in the latter case, and to some extent in seismic interpretation, that AI methods have been rather successful. Whereas problems in processing and inversion really require natural, not artificial intelligence. I base this on my experience in testing the AI algorithms for these two categories.’
CGG’s Smart Data Solutions Business has released an enhanced version of PleXus, its cloud-based data management portal. An enhanced GUI improves search and visualization of corporate data assets, easy data up/download, GIS mapping and a ‘frequently used’ area for current projects. PleXus security features include authentication, time-based revision and non-use expiration, brute force attack lock-out, encrypted password storage along with entitlements management.
Version 6.0 of IHS Markit’s Analytics Explorer sports a new direct link to EDIN, IHS Markit’s international E&P upstream and midstream data. Machine learning workflows are expanded and a new ‘multicollinearity analysis’ tool automates the selection of uncorrelated and relevant variables before any regression, clustering, or classification workflow. An advanced gridding tool creates predictive maps within Explorer, without the need for a GIS.
IHS Markit has also announced Kingdom 2021 with a new modern GUI, personalized quick access toolbar and multiple improvements to its geoscience modules including deep learning-based automated fault interpretation, 3D seismic volume blending, and combined geological and geophysical workflows. Kingdom Direct now connects to international/EDIN upstream and midstream data.
Ikon Science’s new Curate software is a ‘scalable, cloud-enabled knowledge management solution designed to provide cost efficiencies along with faster and more accurate decision making’. Curate enables energy companies to collaborate within a single workspace to access all subsurface data with streamlined workflows. Curate integrates with legacy databases and open industry standards such as the OSDU data platform. More from Ikon and in our interview with Denis Saussus in this issue.
Ikon has also announced RokDoc and iPoint version 2021.3 with improvements in performance and user experience. The release includes an updated version of Python for use with the RokDoc External Interface and an ‘MTM simulator’ for RokDoc Ji-Fi.
Liberty Oilfield Services has introduced the FracSense diagnostic service that leverages Optasense’s fiber optic technology to monitor and optimize hydraulic fracture completions and well spacing. Real time fiber optic measurements monitor per-stage fracture placement and frac hits in offset wells.
Lloyd’s Register has added new modules and updates to its subsurface software packages IC (Interactive Correlations) and IP (Interactive Petrophysics). The updates include ‘refreshed interfaces’ for IP and more detailed correlations of the subsurface characteristics between wells for IC. The increased insights proffered by the tools are ‘earmarked for other uses such as carbon capture, storage, geothermal and waste disposal’.
Emerson has released Paradigm 19p3 which includes seismic technology developed under the Kaleidoscope alliance with Repsol. 3D refraction tomography for shallow velocity model building is also new as are various improvements including GPU/CPU performance of EarthStudy 360 Imager, ‘Marker-Map’ velocity model building and new velocity and time/depth conversion tools in SKUA-GOCAD. Paradigm 19 releases are available as both cloud-hosted and on-premise.
Users of Thermo Scientific’s PerGeos software can now generate 3D data from 2D images and synthetic parameters. The solution uses thin section images, petrographic data and ‘process-based modeling’ to generate 3D images and simulate the natural processes of sedimentary rock formation, allowing petrophysical properties and multiphase flow results to be computed from the 3D synthetic ‘cores’. Watch the webcast.
Stone Ridge Technology’s Echelon 2.0 reservoir simulator, jointly developed with Eni, offers a full GPU solution of compositional formulation, both fully implicit and adaptive implicit options. Echelon now also supports facility network modeling, multi-reservoir coupling and advanced well management. More from Stone Ridge.
V 6.0 of ZoneVu, Ubiterra’s Azure cloud-native browser app for drilling visualization and geosteering, now includes a completions module and live notifications.
Schlumberger is offering its domain and AI expertise via a new Innovation Factori where a global network of digital experts and data scientists are ready to address digital workflow challenges with innovative new solutions. The Factori promises ‘open AI, data, and digital solutions for all domains’ deployed ‘on-premise, at the edge or in the cloud’ and embracing open systems, such as the OSDU data platform.
ABB’s Hoverguard detects and maps natural gas leaks while flying ‘with unprecedented speed, accuracy and reliability’. Hoverguard combines patented LGR-ICOS laser technology, wind velocity and GNSS sensors and advanced data analytics, detecting leaks far from hard-to-reach sources in minutes.
In its quarterly software roundup, GreaseBook reports that its eponymous app is now integrated with PHDWin for economics and decline curve analysis. A new well history file repository offers remote access to key production asset documentation. GreaseBook now automates Texas Railroad Commission production report filing. More on these and other novelties from GreaseBook.
Originally developed for the nuclear industry, Indeavor’s worker fatigue management solution has been adapted to the oil and gas environment and aligned with the latest API guidelines (API RP 755). Plants and refineries can now operate efficiently while automatically adhering to the American Petroleum Institute’s rules on limiting hours and days of work. More from Indeavor.
Samcom’s ExCam ultra-compact (127x48mm) cameras are certified for operating in explosion risk and hazardous areas (2014/34/EU ATEX). CCTV applications range from simple monitoring applications to fully controllable digital video monitoring systems with voice transmission. More on Samcom’s ‘triple launch’.
The 12.1 release of AspenTech’s AspenONE embeds first principles-driven hybrid plant models within Aspen HYSYS and Aspen Plus, ‘bringing AI directly into our simulators’. Existing first-principles models are enhanced with added AI to calculate unknown variables and relationships not captured by the original model, continuously recalibrating the model as conditions change.
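The hybrid idea, keeping the first-principles model and fitting a data-driven correction to what it misses, can be sketched as follows. This is a generic illustration (a polynomial residual fit in NumPy), not AspenTech’s implementation.

```python
import numpy as np

# Generic hybrid-modeling sketch (not AspenTech code): a first-principles
# model is kept, and a data-driven surrogate is fitted to its residual so
# the combined model captures behavior the original equations miss.

rng = np.random.default_rng(2)
x = rng.uniform(0, 2, size=200)

def physics_model(x):
    return 3.0 * x  # simplified first-principles prediction

# "Plant data" with an effect the physics model does not capture
y_measured = 3.0 * x + 0.5 * x**2

# Fit the residual with a simple polynomial surrogate, standing in for ML
residual = y_measured - physics_model(x)
coeffs = np.polyfit(x, residual, deg=2)

def hybrid_model(x):
    return physics_model(x) + np.polyval(coeffs, x)

err_physics = np.max(np.abs(y_measured - physics_model(x)))
err_hybrid = np.max(np.abs(y_measured - hybrid_model(x)))
print(err_hybrid < err_physics)  # hybrid corrects the unmodeled effect
```

Continuous recalibration, as described above, amounts to refitting the residual model as fresh plant data arrives, while the physics core stays fixed.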
Bilfinger has relaunched its connected asset performance platform, BCAP, a cloud-based AI/IoT solution for the process industry. BCAP offers data conversion and management, cross-silo process optimization and plant-wide predictive analytics.
Embassy of Things has rolled out an AI Edge Controller to perform real-time predictions and anomaly detection, providing ‘closed-loop, event-response operational action’ to maximize production. The solution includes EOT’s Twin Talk’s Insight Engine and was developed in collaboration with Xecta, TensorIoT and CTG. The product is also a component of the AWS production operations solution. More from EOT.
T.D. Williamson has improved its gouge vs metal loss (GvML) classifier resulting in the ‘industry’s first published performance specification for gouge identification and depth sizing within a dent’. The classifier is a component of TDW’s MDS (multiple datasets) platform that provides comprehensive mechanical damage assessment, leveraging multiple technologies. The new classifier identifies gouges in natural gas and hazardous liquids pipelines, providing depth sizing of gouge and corrosion features using field-validated tolerances.
Dover Fueling Solutions has launched DX Retail, part of its DFS DX connected solutions platform. DX Retail is a flexible and intuitive system, which allows retailers to manage and update their Tokheim Fuel point-of-sale (POS) systems remotely. DX Retail leverages Microsoft Azure and ‘intelligent edge’ technology to drive multiple point-of-sale updates simultaneously from any web-enabled device. Stores are kept updated with the latest retail items and pricing across the network.
As part of a $100 million, five-year investment program, PDI Software has enhanced its security platform and improved fuel pricing data integration. The delivery platform API integrates with delivery partners like Vroom, allowing automatic updates from the PDI Pricebook.
BP has announced Price Match to inform BPme Rewards members where to buy gas at the best price by comparing BP and Amoco prices to competitor stations within easy reach. If a lower price is found, the saving (max 5 cents) will be applied to the member’s next purchase. Price Match costs 99 cents/month. Average monthly fuel use in the US is around 50 gallons. Assiduous Price Match futzing could be worth up to $1/month!
Kadri Umay (Microsoft) and Josephina Schembre-McCabe (Chevron) demonstrated the use of Nvidia’s ‘Index’ 3D volumetric visualization framework on large-scale ‘super-resolution’ digital cores. Today, the bottleneck in oil and gas data visualization is software, as supplied by multiple vendors, in different formats and access paradigms. This makes it hard to develop custom, cross-vendor views of the data*. There is a need for central (cloud) data accessible and viewable by all. Umay described the shift from early steps in the journey to the cloud that involved a ‘lift and shift’ of storage, architecture and apps. The future will bring ‘cloud native’ visualization of data served from object storage in the cloud and in-cloud GPU computing ‘leveraging industry standard data and APIs’. Today, the cloud architecture is somewhat complex, involving Nvidia Index, Azure Kubernetes Service, blob storage, the Helm package manager, Azure pipelines, Docker and RBAC. An ‘Azure blob viewer’ launches Nvidia Index, allowing visualization of digital rock images in the cloud.
Comment: Whether this is up to tools from Thermo Fisher Scientific or really achieves cross vendor visualization would appear moot. The approach might even sacrifice app functionality and add considerable IT complexity.
* This problem was identified some 20 years ago and resulted in the development of Dynamic Graphics’ CoViz!
Zarif Aziz from Abyss Solutions showed how offshore asset inspections are performed using drones, with data post-processed with convolutional neural networks. Abyss has been using drones to perform photographic studies of offshore flare stacks. Drones are said to offer more complete coverage of complex structures than is possible with human inspection. The image processing model (semantic segmentation with CNNs) was trained on manually collected corrosion data, classified as low, medium and high risk. Abyss claims a ‘best in industry’ corrosion model, trained on thousands of high quality labels from different offshore platforms. Machine learning was performed on Nvidia T4 Tensor Core GPUs running in the AWS cloud. Software included CuDNN and TensorFlow. The flare stack corrosion model represented a fine-tuning of Abyss’ topside corrosion model, incorporating some 1,000 high quality flare stack images. Corrosion identified by the model is displayed over the original imagery, resulting in ‘faster and more targeted’ remediation work. Abyss is now working to add drone-based Lidar scanner data into the model and, by repeat surveys, perform change detection of corrosion.
See also Abyss’ work for Anadarko on offshore corrosion identification.
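The final overlay step, mapping a segmentation model’s per-pixel corrosion scores to low/medium/high risk classes, reduces to simple thresholding. A minimal sketch, with made-up thresholds (not Abyss’ actual post-processing):

```python
import numpy as np

# Sketch of the post-processing step: per-pixel corrosion scores from a
# segmentation model are bucketed into risk classes for overlay on the
# original drone imagery. Thresholds here are illustrative only.

def risk_classes(scores, low=0.3, high=0.7):
    # 0 = low risk, 1 = medium, 2 = high
    return np.digitize(scores, [low, high])

scores = np.array([[0.10, 0.50],
                   [0.80, 0.65]])
print(risk_classes(scores))
# [[0 1]
#  [2 1]]
```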
Shouvik Mani and Michael Haddad (C3.ai) showed how deep learning and graph search are used to parse and digitize engineering diagrams. Piping and instrumentation diagrams (P&IDs) are paper-based representations of plant components and their relationships. These are scanned and parsed with image recognition software to identify components and connections. Symbol detection leverages a CNN/rectified linear unit (ReLU) in a parsing pipeline. An EAST text detection network grabs textual information from the diagram. A LeNet-architected CNN was trained to classify components and tag numbers. Graph search is used to trace connections. The authors concluded that the diagram digitalization pipeline enables applications such as diagram search and equipment-to-sensor mapping, supporting the creation of a facility-wide digital twin. The methodology is a component of the Baker Hughes/C3.ai Reliability application.
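Once symbols and line segments have been detected, tracing which instrument connects to which equipment is a plain breadth-first search over the connectivity graph. A minimal sketch, with invented component names:

```python
from collections import deque

# Toy P&ID connectivity graph: detected symbols are nodes, detected line
# segments are edges. Component names are invented for illustration.
edges = {
    "pump-101": ["valve-12"],
    "valve-12": ["pump-101", "vessel-7"],
    "vessel-7": ["valve-12", "PT-205"],  # PT = pressure transmitter
    "PT-205": ["vessel-7"],
}

def trace(graph, start, goal):
    """Breadth-first search: return a path of connected components
    between two symbols, or None if they are not connected."""
    queue, seen = deque([[start]]), {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in graph.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

print(trace(edges, "pump-101", "PT-205"))
# ['pump-101', 'valve-12', 'vessel-7', 'PT-205']
```

Equipment-to-sensor mapping, as described above, is exactly this kind of query run for every detected instrument tag.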
Mike Walsh showed how PCPC Direct managed to serve high-end 3D graphics apps to remote workers from a private cloud during the pandemic. Walsh believes that most virtual desktop infrastructure (VDI) offerings fail to give a satisfactory user experience. All oil and gas 3D remote VDI solutions that PCPC has tested ‘do not solve common end user requirements when tested in a true remote manner, on a laptop, over an internet connection’. PCPC has developed a solution certification process to test remoting of Schlumberger’s Petrel, including metrics (FPS, latency, image quality …) and different user profiles. PCPC Direct’s solution is built on a Lenovo RVIZ VDI server. When suitably configured (Petrel requires 8GB GPU memory per user), remote VDI Performance was ‘indistinguishable from a workstation’. Leostream’s OpenStack VDI also ran.
Shashank Panchangam* presented Halliburton’s graphics-intensive geoscience applications, now deployed either on-premises or in the cloud via the Microsoft Azure Stack Hub (ASH). Halliburton/Landmark’s Decision Space 365 geoscience suite has been tested running on an Nvidia T4 Tensor Core GPU on the ASH, with applications running remotely on a virtual desktop interface. A speedup in seismic attribute computation (from 2.5 to 1.5 hours) was observed, along with a 70% decrease in deployment time on the Azure cloud infrastructure (Ansible, Terraform). Of note was the Azure Arc offering, a ‘single control plane’ spanning Azure, AWS and the Google Cloud, ‘bringing Azure services to any infrastructure’. More from Landmark.
* With Gaurang Chandrakant (Microsoft)
Andre Sih demonstrated the AI-powered ‘Pangea’ document management system that his company, Fu2re Smart Solutions, developed for Petrobras. ‘Pangea’ is an ‘intelligent’ document classification system that extracts parameters and images from documents and classifies unstructured data. The company also markets SmartVision.AI, a general purpose GUI and API for ML model development.
Vishal Vaddina* presented Quantiphi’s work for Chevron on ‘generative models for seismic image enhancement’. The work was performed with Quantiphi’s TensorFlow estimator API on the Google AI platform and a ‘super resolution’ generative adversarial network (SRGAN). The approach has been published.
* With Aravind Subramanian (Quantiphi) and Chanchal Chatterjee (Google).
Ananthan Vidyasagar (Beyond Limits) presented on the application of a ‘soft actor critic’ method using GPU-accelerated deep reinforcement learning for field planning. The approach is described on the BL website as ‘Cognitive AI for Field Management’.
Anna Dubovik from Russia’s Data Analysis Center gave a wide-ranging presentation on geological interpretation with Open AI and CUDA tools. These are now claimed to ‘outperform human experts’. The DAC methodology is described in a paper by Dubovik’s co-author Roman Khudorozhkov on seismic horizon detection with neural networks.
Steven Goolsby (Coyote Oil and Gas) is the president-elect of AAPG for the 2021–22 term and will serve as president in 2022–23. VP–Regions (2021–23) Elvira Pureza Gomez Hernandez (CNOOC) and Jonathan Allen (Chevron) will serve two-year terms on the Executive Committee.
Jill Smith is now an AspenTech board member.
Baris Ertan is now Baringa Partners’ North America Energy & Resources Lead. He hails from Accenture.
Black & Veatch Management Consulting has promoted Deepa Poduval to VP, and Chris Klausner and Joe Zhou to Associate VP, in its management consulting business.
Alasdair Cathcart is now a Senior Advisor to Blackstone Energy Partners. He hails from Bechtel.
CAM Integrated Solutions has promoted Jason Newton to COO.
Ian Graham is now CFO at Cathedral Energy Services. He was previously with Trican Well Services. Cathedral has also appointed Fawzi Irani as SVP, US Operations. Irani hails from Precision Drilling Corporation.
The Global CCS Institute has opened an office in Abu Dhabi to support the development of the CCS community in the region. The Institute is now actively recruiting for a regional manager and local staff. Apply here.
Robin Macmillan is now Chief Corporate Development Officer at Data Gumbo. He hails from NOV.
Diamond Offshore has named Bernie Wolford President and CEO. Neal Goldman is Chairman of the Board. The appointments followed Marc Edwards’ retirement earlier this year.
Nick Wayth is the new CEO of the Energy Institute. He was previously with BP.
Tina Roberts has joined E&P Consulting to lead the Data Services team within its Oil and Gas business.
Equinor has re-elected Jon Erik Reinhardsen as chair and Jeroen van der Veer as deputy chair of the board. Bjørn Tore Godal, Rebekka Glasser Herlofsen, Anne Drinkwater, Jonathan Lewis, Finn Bjørn Ruyter and Tove Andersen were re-elected by shareholders.
Irene Heemskerk is now head of the European Central Bank (ECB)’s climate change center. She hails from the IFRS Foundation.
Bernard Clément is to lead French energy industry body EVOLEN. He hails from Total.
Fugro has appointed Barbara Geelen as CFO and member of the board. Marc de Jong is now a member of the Supervisory Board.
John Browne (formerly BP CEO) has joined General Atlantic as Senior Advisor for climate and net-zero.
Houlihan Lokey has launched an EU oil and gas unit and hired a team from BMO Capital Markets: Jeremy Low (MD), Tom Hughes (Director), Thomas Wheeler (VP) and Raffaello Avakov (Associate).
IOGP’s Brussels unit has hired new Policy Officers: Alexander van Hulle (sustainable finance topics), Julie MacNamara (Marine and Environmental issues) and Emils Lagzdins (CCUS, EU ETS, and methane emissions).
Dominic Berry is now Project Engineer at the IOGP JIP33 replacing Tom Byrne, who stepped back to return to BP.
Greg Desnoyers has joined Korn Ferry as Senior Client Partner in the firm’s global Industrial practice, focused on Energy and Energy Transition.
Tim Nicholson has been promoted to SVP and Head of Exploration, and John Shinol to SVP and Chief Geoscientist at Kosmos.
DeLome Fair is now the principal process engineer at KP Engineering.
Dale Posein is to lead Ledcor’s Pipeline, Industrial & Fort McMurray Operations groups. President Bill Partington is to retire.
Daniel Brown, Oasis Midstream Partners CEO, has been elected to the company’s Board of Directors.
Jonathan Harms has been promoted to MD at Opportune LLP.
Patty Mims (Esri), Javier de la Torre (CARTO), and Prashant Shukle (KorrAI) are now members of the Open Geospatial Consortium (OGC)’s board.
Marcel Kessler, James Howe, Jon Faber, Jay Collins, Judi Hess and Laura Schwinn are now directors at Pason Systems.
Sam Sledge has been promoted to President at ProPetro following the recent appointment of Adam Munoz as COO and David Schorlemer as CFO.
Grant Creed is now CFO at Seadrill. Leif Nelson, current CTO, will also serve as COO following Reid Warriner’s decision to step down.
Xiao Song is the new Chairman, President and CEO of Siemens China, succeeding Lothar Herrmann, whose tenure is ending after seven years at the company’s helm.
Dirk Didascalou is now CTO of Siemens Digital Industries. He was previously with Amazon Web Services.
Katharina Beumelburg is Chief Strategy and Sustainability Officer at Schlumberger. She hails from Siemens.
Dr. Ravi Gopinath (AVEVA) is now a Non-executive Director at Spectris.
Peter Batty is now SSP Innovations’ Chief Research Officer.
Katerina Yared is the new President of the Society of Professional Well Log Analysts (SPWLA).
Murray Hinz is now a member of Stampede Drilling’s Board of Directors.
Marcus Deal has been named CEO at Specialty Welding & Turnarounds.
Rafael Ponce is now President of the Hydrographic Society of America (THSOA) following the death of Admiral Richard Brennan.
Christian Ness is now CEO at TMC Compressors. He succeeds Per Kjellin, who will continue as an advisor in the business development team and has joined the company’s board.
James Block has been named account representative at Twin Brothers Marine.
OGC CEO, Nadine Alameh has been appointed to the Board of the United Nations Committee of Experts on Global Geospatial Information Management (UN-GGIM) Private Sector Network (PSN).
Stefan Krause is now a member of VELO3D’s board.
A recent meeting of the Energistics WITSML special interest group has been validating the final version of ETP v1.2, due for release soon, and discussing the next version of WITSML. The discussions covered the authentication of applications connecting to an ETP server, JSON style guides for Energistics standards, and the migration from WITSML v1.x to v2.x. The latter includes proposals to promote compatibility with the OSDU Data Platform, capturing Practical Well Log Standards (PWLS) information and incorporating new IADC codes in WITSML reporting objects. The PWLS mnemonics reference library, now available to OSDU, covers over 45,000 entries for data channels associated with well-logging services across the industry. Energistics’ RESQML Spring ILAB focused on the delivery of RESQML v2.2, with the expectation of a release candidate ready for public review and comment in a few months’ time. Again, support for the OSDU data platform’s Reservoir Domain Data Management Service (rDDMS) is planned. A ‘significant portion’ of the ILAB week was spent discussing issues emerging from the OSDU reservoir initiative, which ‘exemplifies the ability of and need for the Energistics and OSDU communities to collaboratively deliver standards and related services that support the industry’. Energistics is supporting OSDU with technology that allows key WITSML data types and metadata to be ingested into the OSDU data store. These have been tested with all four OSDU cloud providers. Energistics has also proposed a post-Mercury (OSDU V3) project to add reservoir data to OSDU. The project uses RESQML, a vendor-neutral format for such complex data types, as its fundamental building block. Members are ‘actively developing’ code to add this key functionality to OSDU. PRODML is likewise a critical part of the Production project within OSDU, with proven capabilities to convey production volumes, fluid types, well-test data, downhole and surface facilities, production hierarchies and more.
More from Energistics.
The Open Group’s Open Process Automation Forum, the ExxonMobil-backed process control standards initiative, has published its O-PAS Version 2.1 preliminary standard. The release is said to be a milestone towards testing and field trials of the O-PAS Standard. The ‘standard of standards’ sets out to facilitate ‘open, interoperable, and secure process automation systems’. Version 2.1 adds new configuration portability capabilities to the information model of Version 2.0. An O-PAS certification program is due for launch in the first half of 2022 along with the final V 2.1 release. OPAF will then work towards V 3.0, which will address system orchestration and application portability, and further detail the physical distributed control platform. More from the OPAF.
Emerson has integrated the Namur Module Type Package (MTP) process automation standard into its control systems, as part of a continued effort to help manufacturers increase speed-to-market and reduce project and operations costs. More about the integration at Emerson.
AmsterCHEM has released its Python CAPE-OPEN Unit Operation, a new solution for prototyping a unit operation model while ensuring CAPE-OPEN interoperability with a process simulator. The Python CAPE-OPEN UO builds on COBIA middleware in a notable ‘first-ever’ use. The new UO system was extensively and successfully tested by Martin Gainville from IFP Energies Nouvelles. The solution complements other wrappers from AmsterCHEM, such as those for MATLAB, Scilab and Microsoft Excel. More from COLAN.
The Digital Twin Consortium, an Object Management Group unit, has announced an open-source collaboration to encourage digital twin innovation. Consortium members and non-members can collaborate on open-source projects, code, and collateral and become part of the DTC ecosystem. Candidates complete a project application for review by the DTC Technical Advisory Committee. On approval, contributors upload their code to the DTC Open-Source Collaboration GitHub site. More from the DTC.
A new white paper in the ECCMA ISO 8000 series, authored by Peter Benson, addresses the issue of import substitution industrialization (ISI). ISI aims to enable open markets and globalization by ‘efficiently distributing scarce resources using price to balance supply and demand’. When governments regulate the mobility of capital, labor, goods or services in support of local policies, market efficiency suffers, typically showing up as local price anomalies that make it difficult to calculate true cost. The white paper looks at how international data quality standards are used to aggregate production and supply chain data to characterize a product, including how, where and when it was made and by whom. The poster child for ISI principles is the Kingdom of Saudi Arabia through its 2030 Vision program. This includes the Saudi National Industrial Development and Logistics Program, which delineates opportunities for import substitution across key industrial sectors. Benson advocates that requests for exemption from import duty should be accompanied by a part number in ISO 8000-115 format and a specification in ISO 22745-40 format, ‘simple requirements that can be met by any manufacturer or supplier’.
The European Commission has adopted a new EU Strategy on Adaptation to Climate Change which calls for ‘increased cooperation with standardization organizations to make sure that existing standards are climate-proof and to develop new ones for climate adaptation solutions’. CEN and CENELEC support the EU effort to mitigate climate change, and the preparation for its ‘unavoidable consequences’. CEN and CENELEC have been working with the EU Commission on ‘climate proofing’ key infrastructures by revising the relevant priority standards. More from the CEN/TC 467. An overview of CEN and CENELEC’s activities to support the efforts to mitigate climate change can be found in the policy paper ‘Standards in support of the European Green Deal’.
A recent joint SPE/OGCI webinar addressed the use of the SRMS to categorize existing CO2 storage assessments and to make new ones. A new version of the OGCI’s CO2 Storage Catalogue has just been released with new countries and resources. These include an explainer of the SRMS classification system, how storage potential and maturity are evaluated, and how public assessments are translated into data for the Catalogue. The methodology developed by Pale Blue Dot Energy closely follows the definitions of the Storage Resource Management Scheme (SRMS). Under this system, the evaluated geologic formations are defined as ‘stored’, ‘capacity’ and ‘resource’. OGCI comments that ‘the current assessments notes that the majority of resource assessments are non-commercial, highlighting the critical need for the nascent CCUS industry to incorporate project-specific requirements (infrastructure, number of penetrations) to a specific resource’. The intent is to capture the maturity of commercial resources available to CCUS projects globally. More from OGCI.
XBRL International reports that the European Commission has adopted a proposal for a Corporate Sustainability Reporting Directive (CSRD). This is to leverage Inline XBRL (iXBRL) to report ‘detailed and consistent’ structured data, marking ‘a new chapter in environmental, social and governance (ESG) disclosure’. CSRD aims to create a set of rules that will, over time, ‘bring sustainability reporting on a par with financial reporting’. CSRD extends the EU Single Electronic Format, currently rolling out for financial reporting, to ESG, requiring companies to tag sustainability information, making it machine-readable and comparable. The Commission also envisages that disclosures will be made available through the developing European Single Access Point. CSRD will introduce mandatory EU-wide sustainability reporting standards, developed by the European Financial Reporting Advisory Group (EFRAG). More from the EU Commission: https://ec.europa.eu/info/publications/210421-sustainable-finance-communication_en#csrd
The US Financial Accounting Standards Board (FASB) has produced a new fact sheet in its ‘FASB in Focus’ series, titled ‘XBRL: What is it? Why the FASB? Who uses it?’. The fact sheet explains what XBRL is and how it is used in the US GAAP Taxonomy (made up of the GAAP Financial Reporting Taxonomy and the SEC Reporting Taxonomy) to tag financial reports, making them machine-readable and comparable. The FASB reports that machines represent over 95% of visitors to the SEC’s online EDGAR portal.
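The ‘machine-readable’ claim above is easy to demonstrate: an iXBRL filing is just XHTML with taxonomy-tagged facts embedded in it. The sketch below, using only the Python standard library, pulls tagged facts from a hand-made fragment. The fragment, concept name and context/unit identifiers are illustrative inventions, not taken from any actual filing.

```python
# Minimal sketch: extracting tagged facts from an inline XBRL (iXBRL) fragment.
# The fragment below is invented for illustration; real filings embed the same
# ix:nonFraction markup inside full XHTML reports.
import xml.etree.ElementTree as ET

IX = "http://www.xbrl.org/2013/inlineXBRL"  # inline XBRL 1.1 namespace

fragment = """
<div xmlns:ix="http://www.xbrl.org/2013/inlineXBRL">
  Revenue for the period was
  <ix:nonFraction name="us-gaap:Revenues" contextRef="FY2020"
                  unitRef="usd" decimals="0">1234000</ix:nonFraction> USD.
</div>
"""

root = ET.fromstring(fragment)
facts = {}
for el in root.iter(f"{{{IX}}}nonFraction"):
    # Each ix:nonFraction element ties a taxonomy concept to a numeric value,
    # which is what makes the report comparable across companies.
    facts[el.get("name")] = float(el.text)

print(facts)  # {'us-gaap:Revenues': 1234000.0}
```

A real consumer would also resolve `contextRef` and `unitRef` against the filing’s context definitions, but even this fragment shows how a machine, rather than a human reader, recovers the reported figure.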
Alexey Kleymenov (Nozomi Networks Labs), writing on the ISA Automation website, shows the internal mechanics of DarkSide’s attack on the Colonial Pipeline. Nozomi Networks Labs has studied the internals of the DarkSide executable and is sharing its findings, revealing the techniques used by its machine code in three areas: the selection of victims and files, ensuring anonymity and anti-detection, and preventing data restoration. Read his fascinating analysis here.
A report from MIT outlines how ‘self-promoting cybersecurity firms end up helping ransomware criminals’. In January this year, Bitdefender ‘happily announced’ a ‘startling breakthrough’, a flaw in DarkSide’s ransomware. The following day, DarkSide declared that it had repaired the problem, and that ‘new victims have nothing to hope for, special thanks to BitDefender for helping fix our issues, this will make us even better’. DarkSide wasn’t bluffing: it unleashed a new string of attacks, including the one that paralyzed Colonial Pipeline. MIT believes that without Bitdefender’s announcement, the crisis might have been contained, and Colonial might have ‘quietly restored its system with a secret decryption tool’. Read the full story in the MIT Technology Review.
The US Cybersecurity and Infrastructure Security Agency (CISA) has named pipeline infrastructure as one of 55 National Critical Functions (NCF) whose disruption can have a ‘debilitating impact on security, national economic security and public safety’. CISA has published Joint Cybersecurity Advisory AA21-131A covering the DarkSide heist and outlining best practices to mitigate disruption from ransomware attacks. The advisory includes ‘indicators of compromise’, signatures of malicious code that need to be scanned for in company networks.
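Scanning for file-based indicators of compromise of the kind published in such advisories usually boils down to hashing files and comparing against a known-bad list. The following is a minimal sketch of that idea; the hash in the set is a made-up placeholder, not a real DarkSide indicator, and real deployments would use dedicated tooling (YARA rules, EDR agents) rather than a script like this.

```python
# Hedged sketch: flag files whose SHA-256 matches a known-bad indicator of
# compromise (IoC). The entry below is a placeholder, NOT a real IoC; real
# hash lists come from advisories such as CISA AA21-131A.
import hashlib
from pathlib import Path

KNOWN_BAD_SHA256 = {
    "deadbeef" * 8,  # placeholder 64-hex-character digest
}

def scan(root: str) -> list:
    """Return paths under `root` whose SHA-256 digest matches a known IoC."""
    hits = []
    for p in Path(root).rglob("*"):
        if p.is_file():
            digest = hashlib.sha256(p.read_bytes()).hexdigest()
            if digest in KNOWN_BAD_SHA256:
                hits.append(str(p))
    return hits
```

Hash matching only catches exact copies of known samples, which is why advisories pair IoCs with behavioral indicators as well.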
Michela Menting, ABI Research’s digital security research director described the Colonial Pipeline incident as ‘exposing a willful ignorance to take cybersecurity seriously!’ ‘Any company (especially one with upwards of $500 million in annual revenues) that is not prepared for such attacks has clearly been purposefully skimping on basic cybersecurity tools, training, and strategy.’ Menting surmised that as the attack shut down both IT and OT, ‘their security posture must have been poor at best’. In the face of continuously increasing threats, even the best cybersecurity solutions will not guarantee protection. Preparation for an attack means ‘architecting infrastructure so that it can continue to operate despite an ongoing attack while simultaneously recognizing and dealing with the threat’. More in ABI Research’s Critical Infrastructure Security market data report.
Reflecting on the Colonial Pipeline hack, Index Engines observes that hackers now routinely include backup infrastructure in their attacks, thereby making recovery impossible. Index Engines supports backup products from vendors such as Dell to ensure backup environments remain available to provide clean recoveries. Index Engines’ Jim McGann commented, ‘cyber criminals can now sabotage companies’ recovery processes. Both the REvil and Conti ransomware can now corrupt or shut off backups.’ More from Index Engines.
Bedrock’s Sam Galpin, commenting on the Colonial attack, said, ‘In an ideal world with state-of-the-art defenses, the attack would be detected and defeated before it could inflict any damage. In the real world, the first indication of compromise is likely to be the ransom note. Surviving ransomware is about what happens next’. This means immediate activation of the cyber incident team and response plan. The starting point may well be that an attack has disabled all the Windows workstations on the control network. Under these conditions it is probable, though by no means certain, that PLCs and other controllers are still running. The HMI screens are displaying ransom notes. The operators are blind. To find out what might happen next, read Bedrock’s white paper: Chapter 4 – Securing Industrial Control Systems – Best Practices.
CGG reports a cybersecurity incident on a server hosting Accellion software*. Attackers exploited a vulnerability in Accellion’s secure file transfer application (FTA) before a corrective patch was available. At CGG, Accellion’s FTA was used on a separate server, isolated from production IT infrastructure. This standalone server had limited use within CGG and was not used to transfer or store personal or commercially sensitive information. There has been no operational or financial impact. CGG is investigating the breach in collaboration with Accellion and external security partners. More from CGG.
* Ironically, Accellion’s Kiteworks flagship is claimed to ‘prevent breaches and compliance violations from risky third party communications’.
Gyrodata reports a data security incident that may have involved personal information of some current and former Gyrodata employees. On February 21, 2021, Gyrodata discovered that it was the target of a ransomware attack. In response, the company immediately took steps to secure its systems, launched an investigation, and engaged a cybersecurity firm to assist. Gyrodata also notified federal law enforcement of the incident and continues to support their investigation. Individuals whose personal information may have been involved should remain vigilant for incidents of fraud or identity theft by reviewing account statements and free credit reports for any unauthorized activity. As a precaution, Gyrodata is also offering individuals whose Social Security number or driver’s license number may have been involved complimentary credit monitoring and identity protection services. More from PR Newswire.
A white paper from Abacode, a managed cybersecurity and compliance provider (MCCP), warns that cyber-insurance is not quite the panacea some companies hope for. Recent legal precedent has seen insurers voiding key coverage by invoking the ‘act of war’ clause. The argument gained legal credibility with the indictment of six Russian military members for cybercrimes in connection with the 2017 NotPetya wiper attack. Insurers claimed the NotPetya attack represented a hostile act by a sovereign power and did not pay out. Abacode states that cyber-insurance is not a ‘get-out-of-jail-free’ card; businesses and organizations need to start looking at cyber-related insurance policies as a supplement to their own risk calculations, not as a substitute. Denial of coverage reveals a fatal flaw in many companies’ risk management policies, notably as ‘silent coverage’, where insurance is not bought specifically for the risk, is now being eliminated from property and business-interruption policies. Companies need a third-party security assessment by experts (like Abacode?) to establish a cybersecurity capability baseline, and then to focus their own security controls around critical assets and on mitigating low-probability, high-impact cyber-threats like a ransomware attack. More from Abacode.
Special Publication (SP) 1800-25 from the NIST National Cybersecurity Center of Excellence (NCCoE) covers ‘Data Integrity: Identifying and Protecting Assets Against Ransomware and Other Destructive Events’. SP 1800-26 addresses ‘Data Integrity: Detecting and Responding to Ransomware and Other Destructive Events’. These new publications complement SP 1800-11, which addresses recovering from ransomware and other destructive events. See also the NCCoE Data Security program page.
Cynet’s 2021 Survey of CISOs with Small Cyber Security Teams interviewed some 200 chief information security officers in ‘medium-large’ organizations with between 500 and 10,000 employees and security budgets mostly in the $500k to $1 million range. Findings are that such small teams (up to 5 FTEs) are forced to cut corners: 16% of teams ignore alerts that have been automatically mitigated, 14% only look at alerts flagged as ‘critical’, and 79% of companies take more than 4 months to deploy and become proficient in top security tools. The realities of small security teams, ‘drowning in duplicate processes and complex controls’, are ‘opening companies up to serious risk’. The top two breach prevention technologies* in use among respondents were EDR/EPP (52%) and NTA/NDR (45%), followed by CASB (29%), NGAV (18%) and XDR (15%). Looking forward, companies are planning to acquire NGAV (64%), Deception (56%) and CASB (42%). Deception and UEBA were the top two breach prevention technologies that companies want but cannot afford, due to high costs or lack of people to operate them. Outsourcing (to Cynet?) is one way to handle the risk. Read the Cynet analysis here. Cynet has also announced the creation of a CISO Consortium and the 2021 CISO Challenge, where cybersecurity team leaders can test their knowledge of compliance and regulation, risk assessment, management KPIs, and threat and vulnerability management. Sign up here.
* Terminology explained here.
A flyer from ABS Group, ‘Cybersecurity services for the oil, gas and chemical industries’ announces that ‘Cyber criminals worldwide are expanding their attacks from the IT systems that control your business data to the OT and industrial control systems that run your operations’. OT in oil and gas is ‘highly specialized, with network connections that are vulnerable to attack’. ABS cites a 2020 analysis from Lawrence Livermore National Laboratory* that concluded ‘The oil and gas industry is unaware of potentially useful technologies that have been developed for ensuring cyber-security of other infrastructure systems, such as the electric grid. Leveraging these technologies—and the science and engineering behind them—can provide some low hanging fruit that can greatly improve cyber-security in the ONG industry without significant investments in terms of time and money.’ ABS partners with Obrela Security Industries on a managed services package that improves visibility and control of industrial cyber risks. More from ABS Group.
* Dragonstone Strategy – State of Cybersecurity in the Oil & Natural Gas Sector.
aeCyberSolutions, the Industrial Cybersecurity division of aeSolutions, has announced ICS Cybersecurity Risk Screening, a new service to assist industrial organizations in understanding the worst-case risk to operations should their industrial control systems (ICSs) be compromised. Cybersecurity risk screening exposes potential cyber risk to operations and shows how assets can be grouped into zones and conduits, allowing budgets and resources to be applied appropriately. More from aeCyberSolutions.
Bayshore Networks, a specialist in ‘active protection’ for OT/ICS Networks, has added OTfuse Lite to its Modular Industrial Control Cyber Security Platform of products. OTfuse Lite sells for $999 MSRP and provides threat detection, policy learning and enforcement in a solid-state device with a 5-year warranty.
Syed Belal of OT Cybersecurity Consulting Services, blogging on the ISA website, provides advice on the ‘Top 7 OT Patch Management Best Practices’. Unlike in the IT environment, patches cannot all be installed on OT assets, because of incompatible hardware, lack of vendor approval and the possibility of a patch crashing the asset. Patches may need to wait until the next shutdown, leaving the asset vulnerable. Risk can still be mitigated with alternative controls, a.k.a. ‘patching smart’. Read the full best practices here.
The EU Commission has unveiled a new EU Cybersecurity Strategy plan, a ‘key component of shaping Europe’s digital future’. The Strategy will ‘bolster Europe’s collective resilience against cyber threats’. Download your copy here (the Darktrace folks are reading it now).
Giesecke+Devrient has launched StarSign Key Fob, a biometric access device that controls employee access to company assets, ‘at the highest security level, in a convenient way’. The key fob supports two-factor authentication, adding fingerprint identification to the coin-sized device. More from Giesecke+Devrient.
Mission Secure has upgraded its OT Cybersecurity Platform with a new ‘OT security score’ that helps users focus resources on activities delivering the biggest impact. Other new features enable faster, more efficient incident investigations and response, and improved bandwidth utilization for low bandwidth cellular and satellite communications systems. More from Mission Secure.
Siemens Energy has rolled out ‘MDR’, an AI-driven cybersecurity monitoring and detection service for the energy industry. MDR is powered by Eos.ii, Siemens’ security incident and event management (SIEM) system. Eos.ii is an ‘interoperable and manufacturer-agnostic’ platform that aggregates IT and OT data from Siemens’ process security analytics (PSA) threat stream and contextualizes it to pinpoint anomalous behavior. In summary, MDR uses the Eos.ii SIEM, which embeds PSA (any more acronyms, Siemens?). Read the Eos.ii whitepaper here.
Andium has secured $15 Million in a Series A funding round led by OGCI Climate Investments. The funds will be used to accelerate the deployment of Andium’s proprietary video solutions product lines for flare monitoring, tank telemetry and object detection. More from Andium.
Arria seems to have got in a bit of a tangle with its upcoming TSX/ASX IPO. A key auditors’ report of Arria’s global operations by KPMG New Zealand and KPMG Canada was held up due to extra requirements covering Arria’s acquisition of NuSutus and compliance with an IFRS rule requiring an independent technical memo on the 2019 conversion of a US $18 million loan note debt to equity. While Arria’s teams are ‘working diligently’ to complete the prerequisites impacting the Arria IPO timetable, other options include a direct listing on either NASDAQ or the NYSE. Advisor Solebury Capital has even suggested that Arria take the special purpose acquisition company (SPAC) pathway to list.
Quorum Software has completed its merger with Aucerna and also acquired TietoEvry’s oil and gas business. The three units will now combine into Quorum Software which now boasts ‘over 1,350 employees and 1,800 customers across the globe’.
Bpifrance, the French national investment bank, has hiked its stake in Technip Energies with a $US 100 million investment, raising its holding to around 7%. Bpifrance is to become a ‘long-term reference shareholder’, supporting TE’s ‘energy transition-focused strategy’. TE is roughly speaking, the old ‘Technip’ company as it was before its short-lived marriage with FMC.
Cegal is to acquire Envision AS, bringing the company’s geoscience and petroleum engineering consultancy services, data management and software into its portfolio. Cegal is owned by Chip Bidco AS, itself majority-owned by Norvestor SPV I Holding AS.
In a $12.4 billion deal, an EIG-led consortium is to acquire a 49% stake (Aramco holds the other 51%) in a new entity, Aramco Oil Pipelines Co. The new company has rights to 25-years of tariff payments for oil transported through Aramco’s crude oil pipeline network. The network connects oilfields to downstream facilities, transporting 100% of Aramco’s crude oil produced in the Kingdom. Washington-headquartered EIG is an institutional investor in the energy sector with $22.0 billion under management.
Global private equity firm Hellman & Friedman is to purchase a majority stake in Enverus from Genstar Capital. Enverus provides energy data analytics and cloud technology, machine learning/AI and industry-leading intellectual capital to clients including ‘21 of the top 25 global energy companies’.
American Industrial Partners, another private equity firm, has bought a majority interest in Ingersoll Rand’s high pressure solutions segment. Ingersoll Rand’s ‘Execution Excellence’ engine is said to have accelerated the timeline to completion. IR is to retain a 45% interest in the business; sale of the majority stake will bring in some $300 million. The transaction reduces IR’s upstream oil and gas exposure to a ‘non-material’ level of under 2% of revenue.
Onpoint Industrial Services is to merge with CertifiedSafety and its subsidiary, Calculated Controls. CertifiedSafety provides ‘cost-effective, scalable and proven’ solutions to solve safety-related challenges for the petroleum and chemical industries. The combined companies will provide industrial safety, logistics and turnaround management services in North America. Private equity firms The CapStreet Group and HCI Equity Partners will continue as shareholders in the combined entity. More from Onpoint.
Tank monitoring specialist Otodata Technologies has acquired Wise Telemetry, a provider of remote monitoring devices and services.
PDI Software has acquired fuel savings app developer GasBuddy.
Avisto Capital Partners is merging with Eddye Dreyer and The Resource Group to form PetroLedger, a new provider of oil and gas accounting and transaction services. Avisto partner Chad Smith cited the ‘uncertain environment’ of 2020 as a catalyst for the change that led the companies to create PetroLedger to meet their clients’ existing and emerging needs. ‘More than ever, oil and gas companies are seeking to reduce overhead costs, maintain quality and free-up management time to focus on profitability and value-added growth opportunities’. More from PetroLedger.
Following feedback from regulators regarding the proposed merger of S&P Global and IHS Markit, the companies have decided to explore a divestiture of IHS Markit’s Oil Price Information Services (OPIS) business, as well as IHS Markit’s Coal, Metals and Mining business. This decision was taken to ensure the pending merger of both companies closes on a timely basis. Both the divestiture and merger are subject to further review and approval by regulators and antitrust authorities.
Seeq has closed a $50 million Series C funding round, led by global venture capital and private equity firm Insight Partners, bringing the total invested in Seeq to $115 million. This funding will accelerate Seeq’s expansion of development, sales, and marketing resources, and help increase the company’s presence in international markets.
CGG announced today that Sercel and Low Impact Seismic Sources have entered into negotiations for the acquisition of LISS by Sercel. In collaboration with Shell, LISS developed the tuned pulse source, an innovative marine seismic source with enhanced low-frequency output.
As a result of recent SEC guidance on SPAC-related accounting for warrants, fracker US Well Services is to restate its financial statements for 2018, 2019 and 2020. US Well Services reports ‘no adverse change’ to its operations, liquidity or business prospects as a result of the correction. ‘Demand remains strong for the company’s services’. USW went public in 2018 through a business combination with a SPAC.
Whiteley Oliver, a data-focused service provider to the infrastructure and civil construction market, has acquired Onshore Pipeline Services, an engineering, construction management and inspections company. OPS’ expertise will allow WO to offer a ‘true turnkey project management service’ along with its surveying, integrity, compliance and maintenance work.
Ambyint has partnered with Microsoft to provide oil & gas E&P companies with solutions to optimize production at scale. Ambyint artificial lift optimization solutions are now available in the Azure Marketplace.
The Group on Earth Observations (GEO) and the Open Source Geospatial Foundation (OSGeo) are collaborating on a ‘shared vision of openness’. For GEO, this means open data, open standards, and open science. For OSGeo, this means free and open-source software. More from GEO.
Schlumberger and Equinor, in collaboration with Microsoft, are to deploy the Delfi cognitive E&P environment, integrated with the OSDU Data Platform, to ‘accelerate Equinor’s ability to integrate data at scale and improve decision making’. More from Schlumberger.
Flare Solutions has delivered a new digital strategy to support OGUK’s future digital and data capabilities. More from Flare.
Halliburton and TGS are to collaborate to bring advanced seismic imaging to fiber optic sensing, helping operators determine reservoir potential for oil and gas production or carbon storage. More from Halliburton.
Schlumberger has partnered with AWS to deploy domain-centric digital solutions, enabled by the Delfi ‘cognitive’ E&P environment on the Amazon cloud.
Schlumberger and NOV are teaming up to accelerate automated drilling solutions adoption to unlock higher performance for operators and drilling contractors. The combined solution leverages advanced AI from the Schlumberger DrillOps well delivery solution, while NOV’s Novos process-automation platform controls NOV rig equipment.
Metron has selected Dive Technologies for the commercialization of its Autonomy, Navigation, Command & Control (ANCC) software suite on Dive Technologies’ large displacement autonomous underwater vehicle (AUV), the DIVE-LD. More from Metron.
Halliburton and Optime Subsea have formed a global strategic alliance to leverage Optime’s technologies within Halliburton’s existing subsea completions and intervention solutions. More from Halliburton.
HWCG has selected Integrity Management & Response as its core contractor. IMR is to provide HWCG’s operating members with deepwater source control (blowout) responders to staff the Incident Management Team (IMT) with well integrity and subsea expertise.
Kongsberg Digital, BW LNG and Alpha Ori Technologies have signed a partnership to enhance efficiency and reduce the environmental footprint of LNG carriers (LNGCs) and floating storage and regasification units (FSRUs). Learn more about the alliance from Kongsberg.
Snam, one of the world’s largest gas networks, has selected Red Hat OpenShift and other cloud-native technologies to help drive the organization’s digital transformation. More from Red Hat.
The Seeq AWS Glue integration for Enterprise Historians is now available on AWS Marketplace. More from Seeq.
FTSI has deployed its machine health automation technology, KCF MachineIQ, on its first fleet, working for Devon Energy, and is now rolling out the technology across all its fleets.
The partnership between GSE Systems and ABB Bailey Japan is to provide process simulation for the NIHONKAI LNG Niigata terminal in Niigata, Japan. More from ABB.
Lloyd’s Register and Falkonry are teaming up to combine Falkonry’s predictive digital twins with LR’s asset performance and risk management solutions for heavy industry, including chemicals and oil and gas.
Tecnimont has signed a MoU with AVEVA to create new digital predictive and prescriptive maintenance services. More from Maire Tecnimont.
Quantum has secured orders worth approximately $10 million for additional ‘virtual pipeline’ trailers from Certarus, bringing Certarus’ total investment in trailers to be delivered in 2021 to nearly $32 million. More from Quantum.
LTI and AspenTech are teaming to deliver AspenTech solutions through LTI managed cloud services to the oil & gas and chemicals industries. More from L&T Infotech.
Infosys and BP have signed a MoU to develop an integrated Energy-as-a-Service (EaaS) offering, providing end-to-end management of customers’ energy assets and services. More from BP.
Globalstar do Brasil and Cisa Trading are to bring asset management solutions to Brazil. More from Globalstar.
Shell Catalysts & Technologies has implemented Aspen Schedule Explorer software in multiple plants throughout North America and Europe to improve coordination, communication and visibility for operations and supply chain personnel.
NIS (Naftna Industrija Srbije) has deployed Nutanix to simplify IT management and provide a more scalable and agile platform for future developments. More on the NIS deployment here.
Oilfields Supply Center (OSC) has invested $570M for the construction of a center in King Salman Energy Park (SPARK), an energy hub managed by Saudi Aramco.
Barchart and Evolution Markets have signed a distribution partnership to offer Evolution Markets’ energy and environmental data products over Barchart’s ‘cmdty’ pricing network.
Quorum Software and Merit Energy have successfully transitioned Merit’s gathering assets to myQuorum TIPS in the cloud. More from Quorum.
Future Gas Station (a Recon Technology subsidiary), Henan CNPC and Alipay have developed ‘Hao Ke Le Jia’, an operations and maintenance service for their joint membership program.
Diebold Nixdorf is to provide BP with an integrated solution comprising hardware, software and managed services.
Implico has implemented SAP RFNO at Oest’s network of AVIA service stations.
P97 is to provide Petro-Canada a mobile commerce platform including Pay at the Pump and EV charging functionality. The P97 mobile commerce platform is PCI DSS and SOC 2 Type II compliant, utilizing Microsoft Azure Cloud Services with multifactor authentication to protect sensitive cardholder data.
Pri Mar Petroleum has selected iRely as its partner of choice for its ERP software solutions.
CGI has secured a five-year single-award Indefinite Delivery/Indefinite Quantity (IDIQ) $60 million ceiling-contract from the US Department of Interior’s Bureau of Indian Affairs for continued enhancement of its Trust Asset Accounting Management System (TAAMS). More from CGI.
Bechtel is collaborating with Greentown Labs in Houston to reduce carbon emissions and achieve a future of net-zero emissions through the deployment of existing and new climatetech solutions.
INPEX and Japan Oil, Gas and Metals National Corporation (JOGMEC) have jointly initiated a study for a CO2 enhanced oil recovery (EOR) pilot test at the Minami-aga Oil Field in Agano City, Niigata Prefecture, Japan. More from JOGMEC.
ENGIE Lab CRIGEN uses Ansys' physics-based digital twin technology to develop an ultra-fast and high-fidelity platform to help companies transition to carbon-free energy.
IFPEN and Unesco have partnered in the field of geosciences for the sustainable management of resources for the energy transition. More from IFPEN.
Total and IFPEN have signed a collaboration agreement to evaluate the contribution of new digital technologies to carbon capture use and storage (CCUS). More from Total.
Worley is now a member of the Blockchain for Energy consortium.
EZ Blockchain has partnered with Silver Energy to help its oil and gas clients define a new business model, monetizing waste natural gas while reducing emissions with the help of Bitcoin mining.
Servomex has opened a new laser test bench at its Houston facility to expand its service capability and strengthen its commitment to supporting customers locally. More from Servomex.
INPEX and Plug and Play have partnered to promote a joint business with startups in the energy sector in Japan and the United States. More in the press release.
ZEDED has signed a strategic OEM supplier agreement with Agora to provide customers in the oil and gas and renewable energy industries with full lifecycle management capabilities for their edge deployments as part of Agora’s overall IoT solution portfolio.
Technip Energies has been awarded a contract for Project Engineering and Management Services (PEMS) by KIPIC, a subsidiary of Kuwait Petroleum Corporation for various projects in southern Kuwait. More from Technip Energies PMC.
ZEISS, SENAI (Brazil) and Petrobras have signed a research alliance to develop and validate methodologies for the manufacturing and qualification of static as well as dynamic critical components for the oil and gas industry.
The Society of Exploration Geophysicists Advanced Modeling Corporation (SEAM) is asking for input on a future SEAM modeling exercise that will address numerical simulation for CO2 sequestration. The JIP ‘Numerical Simulation of CO2 Subsurface Management’ project will run for three years, engage 10 or more companies, and is anticipated to launch later this year. SEAM CO2 will produce benchmark data sets of simulated CO2 injection and monitoring. The workplan includes numerical models, earth response, and monitoring simulations. More from SEAM. Those interested should also read our report on GEOX in our coverage of the Rice HPC in oil and gas event in this issue.
Calgary-based Advantage and Allardyce Bower Consulting have rolled-out their ‘Modular Carbon Capture and Storage’ (MCCS) technology which will be implemented at the Glacier Gas Plant near Grande Prairie. A joint venture, ‘Entropy’ has been formed to market the new solution. The $27 million plant is said to operate at a cost of $15/tonne of the CO2 which is captured and stored in a deep saline aquifer. The deployment will allow Advantage to market a portion of its production as ‘blue natural gas’. More from Advantage.
Baker Hughes has teamed with Bloom Energy on the commercialization of low carbon power generation and hydrogen solutions. Bloom Energy’s solid oxide fuel cell technology (SOFC) will power Baker Hughes’ light-weight gas turbines, providing cost-effective cleaner energy generation, waste heat recovery, and grid independent power. Baker Hughes’ NovaLT gas turbines will run on hydrogen produced in Bloom’s electrolyzer cells. More from Bloom.
Baker Hughes is also involved in a joint venture with Horisont Energi to develop technologies at the Norwegian Polaris carbon capture, storage and transport project. Polaris is a component of HE’s ‘Barents Blue’ project, said to be the first carbon neutral ‘blue’ ammonia production plant, with a total storage capacity in excess of 100 million tons. More from Horisont.
Pale Blue Dot, on behalf of the Oil and Gas Climate Initiative, has published the CO2 Storage Resource Catalogue – Cycle 2, an independent evaluation of geologic CO2 storage assessments worldwide. To date the catalog covers projects in 18 countries, evaluated against the SPE Storage Resources Management System (SRMS).
Helsinki-based startup Carbo Culture plans to remove 1 billion tons of carbon from the atmosphere by 2030. CC received $6.2 million seed capital in a round led by True Ventures. CC plans to tap into the same greenhouse gas capturing process as the biosphere, using energy from the sun for photosynthesis and turning CO2 into a ‘stable and useful form’. More from CC.
The Global CCS Institute has announced that the US 45Q tax régime, ‘the most progressive CCS-specific incentive in the world’, is now ‘open for business’. The IRS has now released its Final Rule and Regulations governing the administration of the credit meaning that ‘the US is ready for a huge new market to open up’. GCCSI has produced an explainer of the three-step process required to ‘add significant financial value to the bottom line while ridding the globe of carbon dioxide’. More from GCCSI.
GCCSI has also published, with help from Columbia University’s SIPA Center on Global Energy Policy, an analysis of the technology readiness and costs of CCS. The report covers economies of scale, CO2 source gas partial pressure, energy costs and technological innovation.
The University of Houston, in collaboration with the Gutierrez Energy Management Institute and the Center for Houston’s Future, has just published a white paper, ‘Carbon Capture, Utilization and Storage, the Lynchpin for the Energy Transition’. The 25-page report outlines a pathway for the Greater Houston area to reach net-zero carbon emissions by 2050, a reduction of about 52 million tons/year from the various industry point-sources of energy production and carbon emissions. The transformation ‘will be expensive’. Capital required for carbon capture technologies, pipelines and geologic storage capacity development will be as much as $10 billion over the 30-year period. However, ‘the cost of not developing CCUS in Houston is an existential threat to these industries and to global energy leadership’. Also, Houston is favorably positioned to jumpstart a regional CCUS hub and ecosystem to service Texas, the Gulf Coast and the extended US energy system. Download the UH CCUS report here.
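As a rough, illustrative check on these headline figures (capital only, ignoring operating costs and the report’s phasing, which the white paper treats in more detail), the implied capital cost per ton of CO2 abated works out as follows:

```python
# Back-of-envelope check on the UH CCUS white paper headline figures.
# Illustrative only: capital cost only, no opex, flat abatement assumed.
capex_total = 10e9           # up to $10 billion capital over the period
annual_reduction_t = 52e6    # ~52 million tons CO2/year abated
years = 30                   # the 30-year horizon to 2050

total_abated_t = annual_reduction_t * years        # ~1.56 Gt CO2
capex_per_ton = capex_total / total_abated_t       # ~$6.4/t CO2
print(f"Total CO2 abated: {total_abated_t/1e9:.2f} Gt")
print(f"Implied capital cost: ~${capex_per_ton:.2f} per ton of CO2")
```

Even at these optimistic assumptions, a few dollars per ton is well below typical quoted capture costs, underlining that the $10 billion covers infrastructure, not the full cost of capture.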
CGG has signed an R&D collaboration agreement with Durham, UK-based Geoptic to trial a borehole solution for monitoring the spread of CO2 in subsurface carbon capture and storage. A new version of Geoptic’s ‘Diablo’ muon tracking tool is to be adapted to CCS applications. Continuous long-term subsurface monitoring is expected to reduce the risks associated with CO2 leakage and enhance the safety of CO2 storage projects.
Schlumberger New Energy, Chevron, Microsoft and Clean Energy Systems are to develop a biomass energy generation system combined with carbon capture and sequestration (BECCS) that will produce ‘carbon negative’ power in Mendota, California. The BECCS plant will convert agricultural waste biomass, such as almond trees*, into a renewable synthesis gas that will be mixed with oxygen in a combustor to generate electricity. More than 99% of the carbon from the BECCS process is expected to be captured for permanent storage underground, not ‘in the cloud’ as Microsoft’s involvement in the project might lead one to believe. BECCS advocacy was one of the few concrete outcomes of the 2015 Paris agreement.
* Almond trees as waste? How ‘renewable’ is that?
NextDecade is to make its Rio Grande terminal ‘the greenest LNG project in the world’ with the creation of NEXT Carbon Solutions, which is to develop one of the largest carbon capture and storage (CCS) projects in North America. CCS is expected to reduce CO2 emissions at Rio Grande LNG by ‘more than 90%’. Oxy Low Carbon Ventures has signed a ‘term sheet’ with NextDecade for CO2 transportation and storage in South Texas. OLCV will offtake and transport CO2 from the Rio Grande LNG project and sequester it in an underground geologic formation in the Rio Grande Valley. Comment – one would think that CO2 emissions at an LNG plant would be relatively small. Also, NextDecade does not appear to be addressing scope 3 (end-use) emissions, although we are assured that the terminal is to export ‘responsibly sourced natural gas’ from the Permian Basin and Eagle Ford.
Esri blogger Greg Milner, in a post titled ‘Tackling the Troubling Increase in Methane Emissions with Maps’, describes the methane mapping ambiguity that stems from the multiple potential sources of fugitive and other methane emissions, including oil and gas operations, agribusiness and thawing permafrost. GIS systems provide visualizations of where the problems are located, including the complications of methane seeps and leaks. Of note is the US Department of Transportation’s Pipeline and Hazardous Materials Safety Administration mapping service that reveals methane ‘hotspots’ where pipeline incidents are most likely to occur. Elsewhere, researchers have combined data on sediment, soil, and vegetation to create a ‘GIS-oriented landscape map’. More from Milner.
Scientific Aviation has announced the launch of Project Falcon, a 6-month joint industry partner study that aims to determine the best way to deploy continuous methane monitoring technology to allow energy companies to find, detect, and repair methane leaks faster. Chevron, ConocoPhillips, Devon Energy, ExxonMobil, Pioneer Natural Resources, Shell and TRP Energy are to test Scientific Aviation’s Soofie (Systematic Observations of Facility Intermittent Emissions) system, a ground-based technology that measures methane emissions 24 hours a day. Soofie is a self-contained leak detection system. Each sensor contains its own solar panel, battery, cellular or WiFi connectivity, and the ability to take five methane measurements per second. Soofie also measures other gases, such as H2S, NO and NO2. More from Scientific Aviation.
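To give a flavor of what continuous, five-samples-per-second monitoring enables, the sketch below reduces a stream of point readings to a simple rolling-average exceedance alert. The window length, threshold and data are hypothetical illustrations, not part of the Soofie product:

```python
# Illustrative sketch: turning 5 Hz methane point readings into a
# rolling-average exceedance alert. Thresholds and data are made up.
from collections import deque

def rolling_alert(readings_ppm, window=300, threshold_ppm=10.0):
    """Yield (sample_index, mean) whenever the rolling mean over
    `window` samples (60 s at 5 Hz) exceeds the threshold."""
    buf = deque(maxlen=window)
    total = 0.0
    for i, r in enumerate(readings_ppm):
        if len(buf) == buf.maxlen:
            total -= buf[0]      # value about to be evicted by append()
        buf.append(r)
        total += r
        if len(buf) == buf.maxlen and total / window > threshold_ppm:
            yield i, total / window

# Example: ~2 ppm background followed by a sustained 15 ppm spike
data = [2.0] * 600 + [15.0] * 600
alerts = list(rolling_alert(data))
print(f"first alert at sample {alerts[0][0]}, mean {alerts[0][1]:.2f} ppm")
```

Averaging over a window rather than alerting on single samples trades a short detection delay for robustness against the intermittent spikes that continuous sensors inevitably record.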
The Alberta Energy Regulator has approved Qube Technologies’ leak monitor for use in its Alternative Leak Detection and Repair pilot, to detect and repair climate-warming methane leaks from the oil and gas industry. The pilot is also backed by Enhance Energy and Highwood Emissions Management. More from Qube.
Neptune Energy and the Environmental Defense Fund are to pilot a novel method of measuring offshore methane emissions. EDF will coordinate a team from Scientific Aviation (emissions sensing – see Soofie above) and Texo DSI (drone platform) to evaluate advanced methods for quantifying facility-level offshore methane emissions, identify key sources and prioritize mitigation actions at Neptune’s North Sea Cygnus platform.
Chevron Technology Ventures has taken a stake in Baseload Capital, a Sweden-based private investment company focused on the development and operation of low-temperature geothermal and heat power assets. The Baseload investment follows last month’s announcement of funding for Eavor and expands Chevron’s capacity to gain insight into geothermal innovations such as low-temperature power generation and closed-loop geothermal technologies.
CGG has signed a strategic agreement to support dCarbonX in the subsurface assessment of its operated clean energy projects offshore Ireland and the UK (which include geothermal energy and storage sites for CO2, hydrogen and ammonia). dCarbonX applies integrated business, geoscientific and well engineering solutions to deliver sustainable ‘subsurface baseload assets’, such as green hydrogen storage, carbon sequestration capacity and geothermal energy. More from dCarbonX.
Equinor and SSE Thermal are planning ‘two first-of-a-kind’ (sic) hydrogen and carbon capture projects on the Humber estuary, UK. The project envisages one of the UK’s first power stations with carbon capture and storage (CCS) technology, and the world’s first 100% hydrogen-fueled power station. The Keadby 3 and Keadby Hydrogen plants near Scunthorpe, North Lincolnshire, would replace older, carbon-intensive electricity generation. Both projects are in the development stage. Final investment decisions ‘will depend on the progress of policy frameworks that are commensurate with the delivery of this critical net zero enabling infrastructure’. More from Equinor.
Technip Energies has announced ‘BlueH2 by T.EN’, a suite of ‘deeply-decarbonized and affordable’ solutions for hydrogen production. BlueH2 promises a 99% reduction in the carbon footprint compared to the traditional hydrogen process, maximum yield and minimum energy. Carbon avoidance and carbon capture utilization and storage (CCUS) techniques promise the ‘lowest cost’ (blue) hydrogen. More from Technip Energies.
DNV has published new procedures designed to provide the required safety level in transporting CO2 by pipelines and strengthen the development of carbon capture and storage (CCS) projects. This follows the outcome of the CO2SafeArrest joint industry project (JIP) between Energy Pipelines CRC (Australia) and DNV. The work has been supported by the Norwegian funding body CLIMIT and the Australian Commonwealth Government under the Carbon Capture & Storage Research Development and Demonstration Fund. Download the updated recommended practice DNVGL-RP-F104.
An opinion piece from BlackRock, ‘ESG X Big Data: Solving for the Double Bottom Line’ has it that ESG ‘is an indicator of future performance potential and should be incorporated with data used to predict performance’. Environmental, social, and governance (ESG) outcomes and future profitability are inextricably linked, and analytical innovation, including the use of big data and artificial intelligence, ‘can offer investors sophisticated insights into which companies are likely to outperform others in the long term’. Read the BlackRock blog here.
Microsoft claims a ‘sustainability’ element in its work with clients including BP. Microsoft is helping BP’s decarbonisation effort by helping it to become more efficient and reducing its methane emissions from existing oil and gas operations. Microsoft is also working to help BP ‘reimagine’ its offshore oil rigs and turn these into wind sites. Naturally, IoT, deep data, analytics and digital twins are ‘all part of this’. The work for BP is said to count towards Microsoft’s ‘moonshot’ (not moonshine) goals for sustainability. More from the Microsoft Technology Record.
MIT has just published ‘Fast Forward: MIT’s Climate Action Plan for the Decade’. The 17-page plan outlines MIT’s commitment to leadership in solving the climate crisis. This involves, ‘moving fast with science and technology, policy, markets, infrastructure, and levers for behavioral and cultural change, investment in new tools and their ‘wise and equitable’ deployment, and the education and empowerment of the next generation, who are to inherit the problem and who must ultimately solve it’.
Gaffney Cline’s Special Report ‘2021 and Energy Transition, The Outlook for Energy Portfolios and M&A’ is a 20-page analysis of corporate strategies that balance the potential risks and specific opportunities. Download the report here.
Natural Resources Canada, in partnership with Environment and Climate Change Canada, has launched the Open Science and Data Platform (OSDP) for cumulative effects, i.e. environmental changes that have long-lasting impacts on the environment and health. The OSDP leverages the Federal Geospatial Platform and its public-facing site, Open Maps. OSDP provides access to provincial and territorial data, historical time series data and maps, surveys, satellite observations and scientific models.
Hexagon has launched R-evolution, a new business venture ‘focused on a sustainable future’. R-evolution is to accelerate the transition to a sustainable economy, running profit-driven investments in green-tech projects where Hexagon’s technology can be applied. Hexagon President and CEO Ola Rollén said, ‘saving the planet is the biggest business opportunity of the 21st century’.
The International Energy Agency’s new report ‘Net Zero by 2050’ has it that ‘the pathway to the critical and formidable goal of net-zero emissions by 2050 is narrow’. The report proposes that there be ‘no investment in new fossil fuel supply projects, and no further final investment decisions for new unabated coal plants. By 2035, there are no sales of new internal combustion engine passenger cars, and by 2040, the global electricity sector has already reached net-zero emissions.’
The Government of Canada has called upon Microsoft AI for Earth to support science and research with digital solutions for Natural Resources Canada’s sustainable development and climate action research. NRCan will collaborate with Microsoft Canada to share expertise and use cloud, data and artificial intelligence (AI) services to develop a platform for national and global scientific cooperation.
The UK Oil and Gas Authority (OGA) has revised its Environmental, Social and Governance (ESG) strategy, which now requires operators and licensees to ‘support the drive to achieve net zero while also maximizing economic recovery from the UK continental shelf’. From 2023, reporting on these factors will be mandatory. KPIs will include flaring and venting emissions, scope 1 & 2 emissions, HSSE statistics, fugitive methane emissions, air and water pollution risks, waste management and disposal and carbon intensity. An action plan on ‘how to support a low-carbon economy’ is expected to be included in the first reports.
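One of the listed KPIs, carbon intensity, is commonly defined as operated scope 1 and 2 emissions divided by hydrocarbon production. A minimal sketch of that calculation, using made-up figures rather than OGA data:

```python
# Illustrative only: a common definition of upstream carbon intensity is
# (scope 1 + scope 2 emissions) / production. The figures are hypothetical.
scope1_t = 180_000            # tCO2e, direct (operated) emissions
scope2_t = 20_000             # tCO2e, purchased energy
production_boe = 10_000_000   # barrels of oil equivalent produced

intensity = (scope1_t + scope2_t) * 1000 / production_boe  # kg CO2e per boe
print(f"Carbon intensity: {intensity:.1f} kg CO2e/boe")
```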
CGG has launched SeaScope, a satellite-based pollution monitoring solution. SeaScope combines earth observation data, machine learning and high-performance computing to provide sea surface slick intelligence across offshore assets, coastal facilities and vessels. The solution, developed with help from the European Space Agency and a group of energy companies and emergency response organizations, has undergone successful 12-month trials in the North Sea, the Gulf of Mexico and South-East Asia.
Munich-based Isar Digital Ventures has announced Renewables Digital, an open-access renewable energy portal for investors, developers, startups and others in the renewable energy industry. The portal is to facilitate investment in green energy. For instance, developers that intend to sell a solar park will find more than a hundred relevant investors on the platform who recently purchased renewable assets. Startups and others can market themselves to third parties. Renewables Digital screens renewable energy deals and funding rounds on a weekly basis using ‘sophisticated crawler technologies’. To date, over 200 investors and developers are listed on the platform.
The Society of Exploration Geophysicists has produced an updated position statement on climate change, emphasizing ‘the essential roles geophysicists play in carbon sequestration, monitoring of large ice masses, exploration for minerals used in wind and solar energy systems, geothermal energy exploration, and water-resources management’. SEG affirms the position of the Intergovernmental Panel on Climate Change that has concluded that anthropogenic greenhouse gas emissions are extremely likely to be the dominant cause of observed climate warming since 1950.
Cheniere and Shell have delivered a cargo of ‘carbon-neutral’ US LNG to Europe from the Sabine Pass export terminal. Neutrality was achieved by offsetting the ‘full lifecycle’ (i.e. through to scope 3) greenhouse gas emissions associated with the cargo by ‘retiring nature-based offsets’. These were purchased from Shell’s global portfolio of nature-based projects. Nature-based projects ‘protect, transform or restore land and enable nature to add oxygen and absorb more CO2 emissions from the atmosphere’. Sounds like sinners buying indulgences from the pope.
Venture Global LNG has likewise announced plans to capture and sequester carbon at its Calcasieu Pass and Plaquemines LNG facilities. The company is launching, subject only to regulatory approvals, a ‘shovel-ready’ carbon capture and sequestration (CCS) project, compressing CO2 at its sites and then transporting the CO2 and injecting it into subsurface saline aquifers where it will be permanently stored. Venture plans to sequester 1 million tons of carbon per year.
Total and Siemens Energy have signed a technical collaboration agreement to study sustainable solutions for CO2 emissions reduction. The collaboration will focus on natural gas liquefaction facilities and associated power generation, and will deliver industrial-stage solutions such as combustion of clean hydrogen in gas turbines, competitive all-electric liquefaction, optimized power generation, and the integration of renewable energy into liquefaction plants’ power systems to enhance their efficiency.
The HOT Energy Group, RED Drilling & Services and Chemieanlagenbau Chemnitz (CAC) have launched the Underground Energy Storage Technologies (UEST) centre of excellence. UEST delivers comprehensive storage solutions for natural gas, carbon dioxide and hydrogen, prospect assessment and operational planning, drilling, workover and well engineering solutions, through to surface facilities. More from UEST.
A short position piece from Boston-based IDTechEx regrets the closure of the Petra Nova facility in Texas, the world’s largest CCUS facility for a coal-fired power station. IDTechEx reports that Petra Nova needed oil prices of around $75 a barrel to break even. During its three-and-a-half-year lifespan, Petra Nova faced commercial viability issues and unexpected shutdowns. Global carbon capture capacity currently stands at about 40 megatonnes per year, roughly 0.1% of the 30.6 gigatonnes of CO2 that are believed to have been emitted in 2020. Despite the US 45Q tax credit, ‘the reality is that CCUS remains an expensive technology’. ‘The next few years could be turbulent for the CCUS industry. However, it may be too important for the world to let it fail.’ More on the report here.
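A quick check of the quoted figures puts global capture capacity at about 0.13% of 2020 emissions, i.e. on the order of one part in a thousand:

```python
# Sanity check on the IDTechEx figures (as quoted in the piece).
capture_capacity_t = 40e6    # ~40 Mt/yr global CCUS capture capacity
emissions_2020_t = 30.6e9    # ~30.6 Gt CO2 emitted in 2020

share = capture_capacity_t / emissions_2020_t
print(f"Capture capacity is {share:.2%} of 2020 emissions")
```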
Speaking at the 2021 Data Amplified conference, Robert Hudson (FERC) and Campbell Pryde (XBRL US) presented the FERC project for modernizing large reporting environments. The US Federal Energy Regulatory Commission regulates the transmission and sale of electricity and gas in interstate commerce and also the transportation of oil by pipeline, with Forms 6 and 6Q for annual and quarterly reports of pipeline companies. FERC’s scope covers the continental US (except Texas).
FERC requires regulated entities to file financial and performance information electronically. Until now, this has been done using a Visual Fox Pro*-based filing system. In 2021-2022 this is being transitioned to XBRL. The move is intended to improve data accuracy and transparency and enable an evolving, publicly available data model.
Reporting will now be through the FERC’s e-Forms submission portal. The new reporting taxonomy is also available. Behind the scenes, FERC uses SpiderMonkey to create and edit its taxonomy, which is published through CoreFiling’s TMS taxonomy management system.
Filer interaction via the submission portal is through a graphical application that is driven by the taxonomy. The app supports test data filing and validation before the final submission. Filers submit a standard XBRL (XML) file that is validated against the FERC’s rules. Filings can be rendered using a publicly available rendering template. Both rules and rendering code are available on the FERC website and on Github.
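To give a flavor of what an XBRL (XML) instance looks like and the kind of basic pre-submission checks a filer might script, here is a minimal sketch. The ‘ferc’ namespace URI, element name and identifiers below are hypothetical placeholders, not the real FERC taxonomy, and real validation uses the published rules, not ad hoc scripts:

```python
# Minimal sketch of inspecting an XBRL (XML) instance before submission.
# The ferc: namespace and element names are hypothetical placeholders.
import xml.etree.ElementTree as ET

XBRLI = "http://www.xbrl.org/2003/instance"
FERC = "http://example.com/ferc-placeholder"

sample = """<?xml version="1.0"?>
<xbrli:xbrl xmlns:xbrli="http://www.xbrl.org/2003/instance"
            xmlns:ferc="http://example.com/ferc-placeholder">
  <xbrli:context id="D2021">
    <xbrli:entity>
      <xbrli:identifier scheme="http://example.com">C000001</xbrli:identifier>
    </xbrli:entity>
    <xbrli:period>
      <xbrli:startDate>2021-01-01</xbrli:startDate>
      <xbrli:endDate>2021-12-31</xbrli:endDate>
    </xbrli:period>
  </xbrli:context>
  <xbrli:unit id="USD"><xbrli:measure>iso4217:USD</xbrli:measure></xbrli:unit>
  <ferc:OperatingRevenues contextRef="D2021" unitRef="USD"
                          decimals="0">123456</ferc:OperatingRevenues>
</xbrli:xbrl>"""

root = ET.fromstring(sample)

# Collect declared context ids, then check each fact's contextRef resolves.
contexts = {c.get("id") for c in root.findall(f"{{{XBRLI}}}context")}
facts = [el for el in root if el.tag.startswith(f"{{{FERC}}}")]
for fact in facts:
    assert fact.get("contextRef") in contexts, f"dangling contextRef on {fact.tag}"
    print(fact.tag.split("}")[1], "=", fact.text)
```

Each fact carries `contextRef` and `unitRef` attributes tying it to a reporting period, entity and unit, which is what makes the filings machine-comparable across companies and years.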
*The final version of Visual Fox Pro was released in 2009.
A new, 30-page white paper* from the International Association of Oil & Gas Producers (IOGP) addresses the specificities of ‘normally unattended facilities’ (NUF) i.e. offshore platforms where operations are ‘either completely automated or operated remotely, with no personnel typically onsite’. The publication examines the technological, logistical, financial and regulatory challenges that a ‘broader application’ of NUF would bring and how they could be approached.
The report provides a top-level overview of enablers including remote monitoring and remote site support functions. One key enabler is data. The facility must have sufficient data capture points from fixed and mobile sensing to enable remote analysis and to replace local human situational awareness. Required data should be determined by failure mode analysis for all elements of the equipment and facility so that people are not required to attend the site to capture data.
The NUF model proposed is decidedly high-tech: one of sensors, imagery, videos and other real-time data streaming over ‘robust high capacity communication networks’ for onshore analysis. The report recommends data formats designed with redundancy and security in mind, with data stored in the cloud to enable analytics and decision-making.
Telecommunications should be high-bandwidth, stable and redundant. Links may include fiber optic cables connecting the facility to the support center, and high-penetration/high-capacity wireless coverage (Wi-Fi, LWAN, 4G/5G, UWB) across the facility, allowing field devices to communicate with the network.
Analytics will allow for ‘high level problem solving’ without human intervention and lead to the reduction of on-site personnel. Analytics will enable predictive and corrective maintenance, while artificial intelligence will enhance alarm management. ‘Enhanced diagnostics’ for complex packages will shorten maintenance campaign duration.
Without onsite personnel, reliable, remotely activated intervention is required, through automation, teleoperations and ‘task-specific or multipurpose systems’. The IOGP envisages a NUF kitted out with a plethora of goodies: multispectral cameras and sound detectors (to identify gas leaks), self-calibrating cameras and remote monitoring systems, self-validating pressure, temperature, level and flow instruments, remotely operated cranes, unstaffed pigging systems and more. Robotic systems for mobile sensing or physical interaction on-site, including drones (aerial and underwater), complete the picture. Onshore, a dedicated command center can operate multiple robotic systems remotely.
IOGP observes that the NUF ‘would be greatly facilitated by some level of standardization’ and gives its own JIP33 a plug (CFIHOS is conspicuously absent). But the NUF philosophy ‘may make many standards unsuitable, several regulations and regulatory documents may need to be amended or updated’. To which end, the IOGP is investigating ‘possible encumbrances within current codes, standards and regulations’. Ultimately, in terms of standardization, equipment providers should produce equipment with a ‘NUF compatible’ label to align procurement with NUF objectives. The report concludes that ‘although technical challenges remain, there are no showstoppers that prevent the pursuit of normally unattended facilities in the very near future’.
* NUF V 1.0 May 2021.
Comment: the IOGP paper would appear to be drafted from a North Sea viewpoint where operational de-manning is ongoing. It would have been interesting to add some Gulf of Mexico context to the study where the NUF concept is perhaps more mature. NUF ideas are something of an operator’s Arlésienne, see for instance this paper from Shell, ‘Remote Operations—A Remote Possibility, or the Way We Do Things Round Here’ which demonstrates that, ‘for Shell, remote operations is not a remote possibility, but is in the process of becoming a reality across our global upstream operations’. And that was in 2011!