Oil IT Journal: Volume 27 Number 3


Norway’s $3.6 million READi Project ‘to shape the future of oil and gas information management’

DNV ‘a union of the physical asset and digital twin’. The ontology ‘Dummies’ Guide’. Operationalizing READi at Equinor SPINE and the technical information requirements catalog. U Oslo on the ‘demanding’ world of ontology. AkerBP: READi, NOAKA and the Fixed Facilities Alliance JV. NOROG on the technical information requirements catalog. READi and ISO 15926 Part 14. READi and OSDU collaboration.

Speaking at the recent closing-out presentation of the Norwegian READi* project, which set out to ‘shape the future of digital requirements and information management in oil and gas’, DNV’s Magne Berg expressed the hope that the project’s deliverables could form a ‘pillar to sustain digitalization going forward’ with ‘a union of the physical asset and digital twin’. DNV project manager Erik Østby outlined the four-year project journey from its somewhat confused beginnings in 2017. One early participant commented, ‘I’m not too sure what this is about but I think it’s important’. Ontology was a new word for many, so a ‘dummies guide’ was developed for the steering committee. The project was endowed with a 36.2 million NOK budget and some 21,000 hours of in-kind support from partners. Early in the project, a decision was made to settle on the ISO/IEC 81346 standard for the ‘structuring of systems including structuring of information about systems’ for use in contracts with suppliers. Also in scope was the revision of ISO 15926 Part 14. Berg acknowledged that there is a ‘huge job to follow up on, the work will continue’.

* Requirements Asset Digital Information

Paal Frode Larsen presented Equinor’s SPINE Project that seeks to ‘take READi into operations’. The Spine project is a ‘digital improvement project to replace document flow with machine readable data to increase quality, HSE, productivity and reduce risk for projects and assets’. Larsen described the undertaking as ambitious and of ‘huge scope’. ‘While management expects us to deliver, working with in-kind resources is hard, everyone has a day job’. But the market really needs what has been done in READi. ‘We have an enormous amount of documents in a project and we can’t read them all’. End-user alignment is another issue, ‘do you really want to take a new way of working on board?’ Equinor used to be afraid of sharing data with contractors. Today, while ‘every door is not open, more are opening!’

Milenija Stojkovic Helgesen (Equinor) presented the first READi deliverable, TIRC, a standard technical information requirements catalogue. This work was in response to a 2015 report from PTIL, the Norwegian safety authority, that concluded that ‘Norwegian safety standards were not fit for purpose’. READi also set out to respond to the 2018 KonKraft ‘competitiveness’ report that called for increased digitalization, standards and industry collaboration.

Developing the TIRC proved difficult with different forces pulling in different directions. The group had to balance ambition with what is achievable with a common standard for data-centricity. There is still a way to go. The work has been integrated with the JIP33 and CFIHOS programs with the expectation of future collaboration. The TIRC will potentially become a part of the ISO 81355 standard for the ‘Classification and designation of documents for plants, systems and equipment’. The READi work has also been submitted to ISO as 15926 Part 14, which adapts the standard’s data model for ‘OWL 2 direct semantics’.

Arild Waaler (University of Oslo) presented the information modeling framework and guidelines for developing models for digital representation of asset information, another READi ‘deliverable’. Waaler’s work combines the systems engineering concepts of ISO 81346-1 (Aspects) with the semantic web approach of ISO 15926 Part 14. He acknowledged that the ‘ontology world is more demanding for engineers to really get into’. Ontologies can be grouped as ‘aspects’, a ‘useful’ approach that ‘needs more work’. Many challenges remain in this complicated process. There is a huge gap between standards and detailed discipline knowledge. Using a team of IT consultants ‘does not scale’. It is better to provide experts with tools to let them generate the information model. The central idea is to ‘replace formats with information models’. Waaler invited interested parties to consult the Reference Designation System for Oil and Gas.

Johan Klüwer (DNV) presented another deliverable, a common vocabulary for building the asset information model (a.k.a. the POSC Caesar PLM RDL/ISO 15926 Part 14). Again, the difficulty of the task was emphasized, this is a ‘challenging and exciting place to be!’ We need an asset information model in a common format. But should this be based on practical usage or be ‘normative’? The PLM RDL is ‘online, modular and in OWL’. This was developed in a collaboration between READi and AkerBP’s Krafla and NOA development projects. Klüwer stated that OWL is used worldwide and is supported by many vendors. Here the aim is for a ‘contradiction-free’ RDL to support duplicate-free, constrained relationships conformant to the upper ontology of Part 14. Klüwer concluded by showing how the CFIHOS equipment type class can be ‘ontologized’ with Part 14.

For more on this work see for example the joint POSC Caesar READi workshop on a MEG (monoethylene glycol) injection use case and the corresponding OWL code illustrated with ‘Gruff’ graph technology from Franz.
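As a toy illustration of what a ‘contradiction-free’, duplicate-free subclass structure means in practice, here is a minimal Python sketch over equipment-type subclass assertions. The class names and the two hygiene checks are invented for this illustration; they are not taken from the PLM RDL or CFIHOS.

```python
# Toy RDL hygiene check: equipment-type classes as (subclass, superclass)
# pairs. We look for (a) duplicate assertions and (b) subclass cycles, where
# two distinct classes would subsume each other - a contradiction.

def check_rdl(subclass_axioms):
    """Return (duplicates, cycles) found in a list of (sub, super) pairs."""
    duplicates = [ax for ax in set(subclass_axioms)
                  if subclass_axioms.count(ax) > 1]
    # Map each class to its asserted superclasses.
    supers = {}
    for sub, sup in subclass_axioms:
        supers.setdefault(sub, []).append(sup)

    def reaches(start, target, seen=frozenset()):
        # True if 'start' is (transitively) a subclass of 'target'.
        if start == target:
            return True
        return any(reaches(s, target, seen | {start})
                   for s in supers.get(start, []) if s not in seen)

    # An axiom (a, b) is part of a cycle if b is also a subclass of a.
    cycles = sorted({tuple(sorted((a, b))) for a, b in subclass_axioms
                     if a != b and reaches(b, a)})
    return duplicates, cycles

axioms = [
    ("CentrifugalPump", "Pump"),
    ("Pump", "RotatingEquipment"),
    ("CentrifugalPump", "Pump"),    # duplicate assertion
    ("RotatingEquipment", "Pump"),  # creates a cycle with Pump
]
dups, cycs = check_rdl(axioms)
print(dups)  # → [('CentrifugalPump', 'Pump')]
print(cycs)  # → [('Pump', 'RotatingEquipment')]
```

A real OWL 2 reasoner does far more (property restrictions, disjointness, direct semantics), but the sketch shows the flavor of the consistency guarantees the RDL work is after.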

Helge Schjøtt showed how Aker BP has operationalized the outcome of READi in its major NOAKA area development. NOAKA is being developed via a Fixed Facilities Alliance, a joint venture between Aker BP, Aker Solutions and Siemens Energy. A new digital platform combines the Aize Workspace and Cognite’s Data Fusion platform. The intent for the group is to have a common information model and digital language that can be shared between contractors and operators. READi is, we are assured, ‘on the agenda’. The group is starting to scope use cases and expects to have a proof of concept before year-end 2022. Schjøtt wound up by presenting the AkerBP CEO’s ‘digital transformation checklist’, observing that without this, ‘the risk is that folks will still use PDFs and Excel in two years’ time’.

Yngve Nilsen of Norway’s oil and gas association NOROG compared the Technical Information Requirements Catalogues (TIRCs) from READi, EPIM and the EqHub RDL. These need to be combined into a single portal for operators and suppliers. As READi deliverables are handed over to NORSK, the TIRC functionality will merge into EqHub. At the same time, the CFIHOS equipment class structure will inform a ‘future TIRC’, plugging many currently missing equipment types.

Jann Slettebakk (Aker Solutions and PCA chair) focused on the future of READi and ISO 15926 Part 14 and the development of a semantic RDL endpoint. ISO 15926 was ‘revolutionary’ when it started out in the 1990s. Today the Parts are already ‘widely used’ and the new Part 14 is set to meet tomorrow’s expectations for digitalization and digital twins and to ‘support Equinor’s new way of working’. More from PCA.

In the Q&A the team was quizzed on a possible READi collaboration with OSDU to further the development of the OSDU Energy Platform which ‘includes engineering data’. The answer is that OSDU has until now been working with subsurface data and applications. Recently, a new OSDU working group related to engineering data has started. This group is looking at extending the OSDU concept to provide a ‘data warehouse of engineering data’ along the same lines as for subsurface data. The group is collaborating with CFIHOS and DEXPI to identify and propose demonstrations of interoperability in engineering work processes. The READi group ‘sees OSDU as complementary to READi’, with an IT platform on which READi’s deliverables can be implemented.

Oil IT Journal asked the following: ‘Norway has a rather long history of trying to apply the W3C semantic web technology in various initiatives, from PCA and the IOHN ‘semantic oil and gas platform’. What went wrong with the previous attempts and why will things be different this time around?’ The response was that ‘Since the IOHN project, two of the major EPCs in Norway have successfully implemented large scale solutions based on semantics and parts of ISO 15926. Learnings from these implementations have led to ISO/TR 15926 Part 14, a version of ISO 15926 which is fully compliant with W3C’s OWL 2 and provides full semantic reasoning capabilities. Even though the technology and the standards have evolved and matured the last decade, a sustainable financial model for PCA needs to come in place.’

More from the READi home page. View the presentations here.

Comment: One of the most stubborn errors in IT projects is to put the IT cart before the business horse. This means deciding on a technology at the project outset. READi, like much earlier Norwegian work, is predicated on the use of semantic web technology, notably OWL. We have been tracking this now for over 20 years and have seen how a) OWL has proved good at spinning-out academic projects in Norway and the EU, b) its use in oil and gas, notably with the introduction of ISO 15926 ‘Facades’ (and maybe READi’s ‘Aspects’), did not meet with the approval of the W3C RDF purists, and c) the technology is often perceived as too abstract for the engineering community, hence perhaps, the stickiness of PDF and Excel!


NVIDIA 2022 Global Technology Conference

CEO Jensen Huang on the ‘three Nvidias’. NETL’s Modulus-based digital twins for energy. BP applies CGAN to 3D seismic attenuation. Senai Cimatec, deep learning for predictive maintenance. Fotech and embedded real time DAS processing. Visco’s hi-fi oil and gas digital twin.

Founder and CEO Jensen Huang, wearing his signature leather jacket, gave the keynote to the 2022 NVIDIA Global Technology Conference stating that with the advances in AI, ‘data centers are becoming AI factories’. There are now ‘three Nvidias*’: AI, HPC and the ‘Omniverse’, a platform of algorithms, compute power and ‘science’ that is to power the next wave of development. The Omniverse poster child is ‘Earth 2’, ‘the world’s first digital twin’. Earth 2 is to perform ‘physics-informed deep learning’ for weather forecasting, building on 40-plus years of EU weather data. Thousands of simulations on the huge training sets ‘obviated the need for supervision’ and the conditions are primed for other scientific breakthroughs. Huang proceeded to describe a bewildering technology offering with some staggering numbers. A new H100 chip sports some 80 billion transistors with a 5 terabyte/s IO bandwidth: ‘20 of these could sustain all the world’s internet traffic’. Chips are combined in DGX Pods to provide exaflops of AI compute bandwidth. Multiple pods connect with InfiniBand switches. EOS is being built with 18 DGX Pods and will provide 18 exaflops, ‘a 4x leap over the current world N°1 supercomputer’.

* Huang omitted to note Nvidia’s fourth revenue stream, crypto mining. The SEC recently fined the company $5.5 million for its failure to disclose that crypto mining ‘was a significant contributor to its 2018 revenue’.

Physics-informed neural networks were evidenced in a presentation, ‘Developing digital twins for energy applications using Modulus’ from Tarak Nandi and Oliver Hennigh (NETL). NNs are trained to ‘minimize the residual form of the same physical equations as in CFD*’ and are claimed to be orders of magnitude faster than CFD for uncertainty quantification/design optimization problems. Nvidia Modulus is a neural network framework that blends physics, in the form of partial differential equations, with data to build parameterized surrogate models. NETL applications address various industrial/chemical process models. NETL twins are being developed for CCUS, direct air CO2 capture and flue gas capture.

* Computational fluid dynamics
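The physics-informed idea can be sketched without any framework: a candidate solution is scored by the residual of the governing PDE at random collocation points, which is the quantity a physics-informed network minimizes during training. The minimal sketch below uses the 1D heat equation and finite differences; Modulus itself works differently (automatic differentiation, parameterized networks), and the equation and constants here are illustrative assumptions.

```python
import math
import random

ALPHA = 0.1  # diffusivity, an illustrative value

def residual(u, x, t, h=1e-4):
    """PDE residual u_t - ALPHA*u_xx at (x, t), by central differences."""
    u_t = (u(x, t + h) - u(x, t - h)) / (2 * h)
    u_xx = (u(x + h, t) - 2 * u(x, t) + u(x - h, t)) / h ** 2
    return u_t - ALPHA * u_xx

def physics_loss(u, n=200, seed=0):
    """Mean squared PDE residual over random collocation points."""
    rng = random.Random(seed)
    pts = [(rng.uniform(0, math.pi), rng.uniform(0, 1)) for _ in range(n)]
    return sum(residual(u, x, t) ** 2 for x, t in pts) / n

# The exact solution of u_t = ALPHA*u_xx for u(x,0)=sin(x) ...
exact = lambda x, t: math.exp(-ALPHA * t) * math.sin(x)
# ... and a candidate with the wrong decay rate.
wrong = lambda x, t: math.exp(-2 * ALPHA * t) * math.sin(x)

print(physics_loss(exact))  # ~0 (finite-difference noise only)
print(physics_loss(wrong))  # clearly nonzero
```

In a PINN, `u` would be a neural network whose weights are adjusted to drive this loss (plus boundary and data terms) toward zero, which is what allows the trained surrogate to answer design-optimization queries orders of magnitude faster than a full CFD run.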

In the oil and gas track, Muhong Zhou showed how BP has extended a conditional generative adversarial network (CGAN) from 2D to 3D, targeting execution on a multi-GPU node. The CGAN was developed in-house for seismic attenuation compensation. Data augmentation was developed with CuPy, the open-source array library for GPU-accelerated computing with Python. Tests show the CGAN code can generate images ‘with similar quality as those generated by the existing in-house physics-based attenuation compensation tools’.
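As a dependency-free illustration of the kind of geometric augmentation applied to 3D patches, here is a random-flip augmenter. BP’s pipeline uses CuPy arrays resident on the GPU; the nested-list representation and the choice of flips below are assumptions for the sketch, not BP’s actual code.

```python
import random

def flip3d(vol, axis):
    """Reverse a 3D nested-list volume along axis 0, 1 or 2."""
    if axis == 0:
        return vol[::-1]
    if axis == 1:
        return [plane[::-1] for plane in vol]
    return [[row[::-1] for row in plane] for plane in vol]

def augment(vol, rng):
    """Apply an independent 50% chance of a flip on each spatial axis."""
    for axis in range(3):
        if rng.random() < 0.5:
            vol = flip3d(vol, axis)
    return vol

patch = [[[1, 2], [3, 4]], [[5, 6], [7, 8]]]  # a tiny 2x2x2 'seismic' patch
print(augment(patch, random.Random(0)))
```

With CuPy the same idea is a one-liner (`cupy.flip(vol, axis)`) executed on the GPU, which is what makes augmentation cheap enough to run inside the training loop.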

Erick Nascimento, Ilan Figueirêdo and Lílian Guarieiro from Brazil’s Senai Cimatec research establishment showed how deep learning can support predictive maintenance activities. Labeled data with annotated failure modes and time-to-failure train deep learning models. But labeled data may be scarce or even nonexistent. The challenge is to combine unsupervised and supervised ML to identify faults in time series data. The team used a public dataset* of an oil and gas offshore platform, including different failure and pre-failure behaviors captured by sensors. An Nvidia Tesla V100-SXM3 GPU was used for the deep learning semi-supervised model. A combination of C-AMDATS** and a deep neural network was able to detect anomalous patterns. The authors concluded that ‘predictive maintenance solutions can be further developed, even in scenarios with few, or even no, labeled data’.

*The 3W Dataset is said to be a realistic oil well dataset with rare undesirable real events that can be used as a benchmark dataset for development of machine learning techniques. The theory behind the approach is given in the Journal of Petroleum Science and Engineering paper, ‘A realistic and public dataset with rare undesirable real events in oil wells’.

** A Cluster-based Algorithm for Anomaly Detection in Time Series.
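A minimal unsupervised baseline for this kind of fault detection, flagging anomalous patterns in a sensor time series without labels, is a trailing-window z-score. The sketch below is an illustration of the principle only, not C-AMDATS; the window length and threshold are assumptions.

```python
import math
import statistics

def rolling_anomalies(series, window=20, k=4.0):
    """Flag indices whose value deviates from the trailing window mean
    by more than k standard deviations (a label-free baseline)."""
    flags = []
    for i in range(window, len(series)):
        hist = series[i - window:i]
        mu, sd = statistics.fmean(hist), statistics.pstdev(hist)
        if sd > 0 and abs(series[i] - mu) > k * sd:
            flags.append(i)
    return flags

# A smooth sensor trace with one injected fault-like spike at index 50.
trace = [math.sin(i / 5) for i in range(100)]
trace[50] += 10
print(rolling_anomalies(trace))  # → [50]
```

In the semi-supervised setting described above, unsupervised flags like these seed pseudo-labels that a deep network can then refine, which is how useful models emerge from sparsely labeled data.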

BP Launchpad company Fotech reported that the arrival of Nvidia’s Jetson AGX Xavier was a game changer for its proprietary Dataflow processing that allowed for ‘fully embedded’ real time processing. Fotech is a DAS fiber optic specialist whose technology is used to monitor intrusion and damage along pipelines, power lines or fiber cables themselves. DAS generates a ‘fire hose’ of data that needs to be processed in real time. Fotech’s architecture prepares and manages processing and disturbance detection in near real time using Nvidia GPUs. Edge AI can ‘turn a fiber optic cable into 60,000 vibration sensors’.
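The ‘60,000 vibration sensors’ figure follows directly from fibre length and virtual-channel spacing. The values below are assumptions chosen for illustration, not Fotech specifications.

```python
# Back-of-envelope: a DAS interrogator resolves strain along the fibre in
# short gauge-length bins, so sensor count ≈ fibre length / channel spacing.
fibre_km = 60            # assumed monitored fibre length
channel_spacing_m = 1.0  # assumed spacing between virtual sensors
sensors = int(fibre_km * 1000 / channel_spacing_m)
print(sensors)  # → 60000
```

The same arithmetic explains the ‘fire hose’: tens of thousands of channels sampled kHz-fast is what pushes the processing to an embedded GPU at the edge.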

Tore Kvam and Serhiy Todchuk from Norway’s Visco described their high-fidelity renderings as ‘a new digital twin concept’ of oil and gas installations spanning surface, subsea and subsurface. Animating these huge (700 million triangle) models requires many software smarts. Sharing them between multiple users is enabled with an Nvidia Tesla-based vGPU server that allocates virtual PCs to any device via RDP. Visco is now looking into Nvidia DLSS (deep learning super sampling) for efficient resolution scaling, the use of mesh shaders for geometry-pipeline and memory efficiency, and to ‘move everything to the GPU!’


‘Oil Company’. Which word don’t you understand?

Oil IT Journal Editor Neil McNaughton, proud recipient of the 2022 SPE Regional Data Science and Engineering Analytics Award, discusses the dangers of ESG obfuscation and suggests that oils should be just that, oils. The big decisions on the ‘energy transition’ are for society to take. Not for oils trying to ‘sing one song to the tune of another!’

Imagine: you are the CEO of a major oil company, sitting in the boardroom discussing your ESG strategy with your colleagues. You are congratulating the technologists who have just come up with a ‘green’ fuel for Formula 1 racing or some such improbable contribution to the energy transition. All of a sudden, ba-doom! The cops, deploying what the DHS calls ‘tactical entry equipment’ (a battering ram to you and me), swarm in. ‘What’s the charge?’ asks the CLO. ‘Greenwashing!’

Sounds ludicrous? Maybe not so much. Earlier this year, as the Financial Times reported, fifty police officers raided the Frankfurt offices of Deutsche Bank and its fund manager DWS as part of an investigation into greenwashing. Elsewhere, the SEC has spoken out against ‘exaggerated claims about ESG strategies’. The SEC has also warned against ‘misleading or deceptive fund names’, suggesting that funds whose names imply a certain investment focus (like ESG) should direct at least 80% of their assets accordingly. ENI was fined for greenwashing a couple of years ago. Similar attacks for misrepresentation in advertising have been directed against oil companies for over a decade by the greens, particularly in Europe as the EU majors re-position themselves as ‘energy’ companies and downplay their fossil activity.

In my opinion, the EU majors are boxing themselves into a corner by adopting an ‘energy transition’ stance. Once you claim to be ‘transitioning’, you open yourself up to criticism and an evaluation of the speed with which you are transitioning. This will of course never be fast enough for the greens. Claims for ‘net zero’ at such and such a date may well fall under the scrutiny of the law as above. The confusion induced by oil companies that claim not to be oil companies works both ways, as the kerfuffle over the UK’s windfall tax showed recently: ‘tax me and there will be less cash for our windfarms’ now cry the EU majors!

The best thing to do would be to stop oil companies from playing the energy transition game altogether. This involves two related considerations, the environmental case and the financial case. On the environmental case, oil companies, their employees and your humble servant are in a singularly bad position to pontificate on the ‘reality’ or otherwise of human-driven global warming. Whatever our opinions may be, any deviation from the consensus is going to be considered by society at large as ‘unreliable evidence’ from us folks with skin in the game. Society and governments will decide the speed at which the energy transformation will take place. This will then determine the financial case for oils.

How fast is the energy transition taking place? Our second* of three book reviews on the energy transition suggests that a complete shift over to electric power for just about everything is doable and could be painless (at least for the USA). But this would cost about as much as FDR’s New Deal and would see 1% of the US land surface occupied by wind and solar. This contrasts with the current powergen situation with 50% of France’s nukes out of action, ERCOT’s spectacular failure to keep Texans warm last winter and California’s failing grid in the 2020 heat wave. It is easy to be skeptical about the speed of the transition.

So what of the economics of the oil industry for the next few years? Buying shares in an oil means betting that the transformation will be slow, that there will be a continuing need for fossil fuels and that governments will have a hard time taxing fossil out of existence as society continues its collective ‘green-lite’ posturing while driving SUVs. Not a pretty picture perhaps. How long will the rump/twilight oil business continue? Total CEO Patrick Pouyanné observed recently that even a 60 million barrel per day industry (i.e. a 40% decline) will require continued upstream investment. So there are a few years (decades?) left in the old dog.

How might this imagined world of pure-play oils come about? Well, there is always the possibility that the regulators that have been working on the ESG funds will wake up to the similar obfuscation that the major oils are currently enacting, but I am not holding my breath on that. BlackRock’s Larry Fink proposed a mechanism for a return to the pure-play with his idea of a ‘bad bank’ for fossil assets, although he was more interested in redirecting monies from such a spin-off into green investment. We, as oil folks, are perhaps more interested in the fate of the ‘bad’ stuff.

The other route to the pure play is that advocated by Harold Hamm, who recently offered to buy out the other shareholders in Continental Resources and take the company private. His motive appears to be to plough Continental’s profits back into E&P rather than to pay dividends to shareholders. Hamm may not view E&P as ‘bad bank’ material, but the result is the same.

How does this play out for the climate? It could actually be positive. While investment in ‘offsets’, wind power and solar will be off limits, that leaves plenty of activity that should meet with environmental approval. Reducing flaring, venting, methane and other emissions should be part of doing business right. CCS possibly, although that is really another one for ‘society’ to decide on. Energy efficiencies throughout the value chain? Sure.

With oils back in a pure-play posture it would actually be easier for investors, governments and society at large to ‘do something’ to reduce fossil use. Whether these stakeholders will move to action is moot. In any case pure-play would be better than the status quo of greenwashing and energy obfuscation.

* See our review of Saul Griffith’s ‘Electrify’ in this issue.


Book Review: Electrify

Saul Griffith’s ‘optimist’s playbook for our clean energy future’ argues that a transition to an all-electric energy economy is achievable. To achieve this, the free market needs a ‘swift kick in the ass!’

In ‘Electrify*’, Saul Griffith sets out to ‘show you a clear path to a better world in enough detail to bridge the imagination gap’, not to write a ‘doomsday book’. Having said that, he argues that the energy transition ‘has to be now, not 10 years from now, or even a month from now’. ‘We have arrived at the last moment when we can shift global energy infrastructure without passing a 1.5°–2°C temperature rise’. How will this be fixed? The hint is in the title, ‘Electrify’. ‘We need EVs and other emissions-free vehicles to be 100% of vehicle sales as soon as is physically and industrially possible’. Similarly, domestic heating needs to move massively over to electricity. The free market is unlikely to solve the challenge of 100% rapid adoption and needs an invisible foot to give it a swift kick in the ass now and then. Every player, individuals, governments, businesses and the market, needs to work together. Such major projects and achievements have been realized in the past: John Muir’s advocacy that saved the US wilderness, FDR’s New Deal, the mobilization for WW II and the space race. Griffith is an energy geek. His company, Otherlab, was contracted by the US Department of Energy to consolidate energy use data; some of the results used in Electrify can be seen on the cheekily named departmentof.energy.

Griffith is dismissive of what he describes as ‘1970s thinking’ that involves ‘trying extremely hard, and making sacrifices so that the future will be a little less fucked than it might be otherwise’. The 2020s mindset says, ‘If we build the right infrastructure, right away, the future will be awesome!’ So out go energy efficiency (politically problematic) and carbon capture (too expensive), along with lots of other stuff like ‘sustainable’ fish, public transportation and stainless steel straws. ‘Let’s release ourselves from purchasing paralysis and constant guilt at every small decision we make so that we can make the big decisions well’. Massive electrification is the best way to address climate change. And by the same token, out go dumb-ass energy vectors including hydrogen and ammonia that use up more energy than they convey. ‘Hydrogen vehicles are the canonical case of this silliness’.

Massive electrification (from solar and wind) will reduce energy needs by more than half due to the efficiency of electric vehicles and heating systems and in eliminating the significant amounts of energy used in finding, mining, refining, and transporting fossil fuels. Griffith has no time for the naysayers, who cynically wish to keep profiting from fossil fuels, ‘burning your children’s future’. ‘Don’t let them divide us by confusing us. We don’t just need to change our fuels, we need to change our machines’.

Electrifying transportation is a big win, saving around 15%. For domestic heating ‘we have an astounding and well-developed technology called heat pumps that significantly outperform the old ways of doing things**’. Adding up all the savings means that ‘we only need around 42% of the primary energy we use today’. ‘America can reduce its energy use by more than half by introducing no efficiency measures other than electrification’. No thermostats need be turned down and no vehicles or homes downsized. Electrification is a ‘no-regrets’ strategy for decarbonization.

Griffith shows how, with a lot of solar and wind power, but not an unrealizable amount, a 1,500–1,800 GW capacity can be achieved, over three times the amount of electricity currently produced. The landscape will look different with pervasive solar panels and windmills. To power all of America on solar would require 15 million acres (about 1% of land area), roughly what is currently dedicated to roads or rooftops. The calculation includes summer and winter variations in solar input and reasonable assumptions as to panel efficiency. Griffith is dismissive of the nimbies. ‘We have learned to live with a lot of changes in our landscape, from electricity lines and highways to condos and strip malls. We will also have to live with a lot more solar panels and wind turbines’. The trade-off is that we’ll have cleaner air, cheaper energy and, most importantly, we will be saving that land and landscape for future generations. We will have to balance land use with energy needs. The final piece of the puzzle is storage and ‘batteries’, both chemical, storage heaters and pumped hydro.
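The land-area claim is easy to sanity-check. The US land-area figure below is an approximation introduced for the sketch, not a number from the book.

```python
# Rough check of 'about 1% of land area' for 15 million acres of solar.
us_land_acres = 2.3e9  # approximate US land area (assumption)
solar_acres = 15e6
share = solar_acres / us_land_acres
print(round(share * 100, 2))  # → 0.65 (percent), i.e. of order 1%
```

So the order of magnitude holds: a nationwide solar build-out on Griffith’s numbers claims well under a hundredth of the land surface.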

Griffith calls for a Jedi mind trick of finance to pay for all the new infrastructure. Investment in electrification, heat pumps and batteries should benefit from infrastructure financing as opposed to consumer credit. Lowest-cost infrastructure-grade financing is crucial. Policymakers could offer low-cost climate loans rather like the long-term mortgages that enabled home ownership after the Great Depression. On the plus side, electricity is cheap and getting cheaper. Utility-scale solar currently costs ~3.7¢/kWh, wind power ~4.1¢, with natural gas at ~5.6¢/kWh. Behind-the-meter energy as generated on your rooftop is even cheaper.

Griffith has put his money into a couple of ambitious electric ventures: Makani Power, a kite-powered wind-energy venture he started in 2006, and Sunfolding, which builds tracking devices that steer solar panels. Both technologies have been superseded by the advancing economics of conventional solar and wind power. As Griffith says, ‘electricity will finally (well, almost) be too cheap to meter, as they used to say about nuclear power’. All of which represents a rare opportunity for industry, small and large.

But what to do about the ‘8,000-pound carbon gorilla in the room’, the proven reserves on oil company balance sheets? Portfolio divestment from fossil-fuel companies is no good; we need a mechanism for buying out their stranded assets, preferably on the cheap. A final chapter reviews all the other stuff that is harder to decarbonize than transport and heating. Rather succinctly, Griffith enumerates the problems of steel and concrete manufacture and the issue of sourcing all the raw materials that the electric world will require.

An appendix offers some briefing for use in dinner party conversations. CCUS? A waste of time. Natural gas? An unsafe, collapsing bridge to nowhere. Fracking? A huge distraction. A carbon tax? Not a solution. Griffith also offers some suggestions as to where workers in different professions could fit into the new environment. His suggestion is not too helpful for oil and gas workers: ‘If you are an oil industry worker, thank you for your service. Now you’ll have a job helping America build the massive infrastructure that is required for a zero-carbon future’. That’s all folks!

* Electrify: An Optimist’s Playbook for Our Clean Energy Future, Saul Griffith, MIT Press, 2021. ISBN 9780262046237.

** We have submitted our heat pump debunkery to Mr. Griffith and are waiting to hear back.


OSDU Update

Shell’s two step AWS data migration. Krebbers advocacy for expanded scope.

In a recent Shell release, digital subsurface VP Paul Zeppenfeldt reports on the migration of the company’s subsurface data to the Amazon cloud, ‘overcoming the limitations of traditional, discipline-specific software’. The migration of data to the Shell Subsurface Data Universe (SDU) in the cloud is ‘already well under way’. Shell handed over the technology and concepts of its SDU to seed The Open Group’s OSDU Forum for further development as ‘open source’ software. By moving its SDU data to the AWS cloud, Shell plans to ‘accelerate its journey in artificial intelligence and increase cross-disciplinary collaboration’ and in the process is retiring some 25 legacy databases. Shell has moved wells data and applications to the AWS cloud, including its WellsOffice application, originally developed at Norske Shell with help from Flinke Folk. Shell is now ‘exploring the migration of key seismic processes to the OSDU data platform’.

Blogging on the OSDU Forum website, OSDU luminary Johan Krebbers* acknowledged that OSDU’s work on an energy data platform is not yet complete and OSDU must ‘speed up developments in this area’. At the same time, Krebbers argues for an OSDU expansion into new energy fields. OSDU has started working on geothermal and hydrogen, which are to be supported by the data platform. These will leverage existing OSDU functionality as deployed in the upstream production services offering, notably OPC UA, Delta Lake and Kafka. Krebbers opines that it is the lack of data standards that is ‘slowing us down’, particularly for hydrogen**, but ‘geothermal is only slightly better’. Krebbers’ dream is to support a mixture of energy sources across offshore wind, hydrogen, solar and batteries, all rolled up into a single OSDU data platform.

* Krebbers is currently working part time for Cognite on the integration of the OSDU data platform with Cognite’s Data Fusion.

** Curiously Krebbers did not mention Shell’s 2020 hydrogen digital platform, reported as being based on a ‘repurposed’ OSDU platform.


2022 Energy Conference Network Digitalization in Oil & Gas Canada

CNRL’s digital journey. TC Energy on the decarbonization pathway. Veerum’s brownfield digital twin and the internet of behaviors. Canadian Centre for Cyber Security’s ransomware playbook. Suncor on failed digital transformations and the digital skills gap. Vendor presentations from Drishya AI Labs, IronSight, BrainToy and Matidor.

The 300 attendees at the 2022 Energy Conference Network Digitalization in Oil & Gas Canada heard from John-Paul Portelli who traced Canadian Natural’s (aka CNRL) digital journey. Portelli is head of technology scouting and works on the company’s developing digitalization strategy, ‘finding good things that people are already doing and bringing them to other parts of the company’. CNRL is an unusual company in that there is no CEO. The company is operated by a management committee where working groups ‘come together to initiate and challenge new ideas, and to disseminate knowledge and know-how throughout the business’. Employees are encouraged to ‘innovate at all levels’. The idea is that information sharing and collaboration will deliver better results. Portelli reassured suppliers saying, ‘We don’t want your IP. We want a win-win’. On the collaboration front, Portelli cited the Clean Resource Innovation Network (CRIN), ‘a pan-Canadian network focused on ensuring that Canada’s oil and gas resources can be sustainably developed and integrated into the global energy supply’.

Current CNRL work covers advanced analytics, robotics, RPA, data provisioning and 3D visualization. On the emissions front, the company is working with Bridger Photonics’ aerial lidar monitor and with the University of Calgary to deploy its PoMELO methane detector technology. Other programs include AR/VR (notably with the HoloLens) and an Intergraph Smart Plant Review portal that provides as-built information on all plant engineering data. Corrosion studies are rolled up into a Pipeline Criticality Ranking (PCR) rating for maintenance work. ML algorithms have been used for equipment surveillance, predictive maintenance and materials optimization (IronSight got a shout-out, see below). CNRL is offering machine learning training to its employees: ‘We have stripped down the hype and put training in place to demonstrate what machine learning actually is’.

Internal crowdsourcing allows people from different business units to weigh in on each other’s projects. Portelli wound up his keynote asking for input from the vendor community: ‘we are always looking for new ideas, we want to innovate together’.

CNRL’s technology achievements are presented on the company’s technology portal.

Phil Demers described how TC Energy is establishing a decarbonization pathway via an air emissions digital strategy. TCE is aiming for a 30% reduction in GHG emissions intensity by 2030, and for net zero by 2050, with a combination of offsets and carbon credits. A digital data fabric underpins these efforts with a GHG Digital Concept that will drive outcome-based and cross-functional digital solutions to address the emissions reduction goals. Enablon software from Wolters Kluwer leverages information from an ‘emissions data lake’ for reporting. Demers concluded that ‘the world of emissions has changed, an IS/IT strategic partnership is now critical’. A report on TCE’s GHG reductions plan is available here.

Trevor MacMaster and Rob Southon showed how Veerum has built an automated brownfield digital twin for the Clean Resource Innovation Network. Veerum was the beneficiary of a $776k CRIN grant to further its Digital Twinning of Legacy Facilities project. This sets out to provide unified data for environmental, safety and productivity improvements with a ‘cost-effective, easy-to-maintain method of leveraging AI to connect enterprise information management systems to VR asset models’. The solution promises ‘cognitive augmentation’ with an ‘internet of behaviors’!

Michael Russell hails from the Canadian Centre for Cyber Security, the federal government’s unified source of expert guidance, services and support on cyber security. The CCCS provides a service to critical infrastructure operators spanning incident handling and cyber defense. While the government currently assesses the serious risk (damage or loss of life) to critical infrastructure as low, ‘cyber threat actors may target critical Canadian organizations to collect information and pre-position for future activities, or as a form of intimidation’. The CCCS has issued a ‘ransomware playbook’ with advice on defense against ransomware and how to recover should you become a victim. ‘It’s all about effective cyber hygiene’ and the management of cyber risk ‘at a corporate level’.

Sheldon Wall (Suncor) cited a World Economic Forum study on industrial digital transformation which came up with a potential $100 trillion ‘value opportunity’ by 2025 for industry and society. However, a subsequent McKinsey report* has it that ‘in industries such as oil and gas, digital transformations are challenging and success rates fall between 4 and 11%’. Wall puts this down to a digital skills gap that needs to be addressed with an integrated, multi-year digital talent strategy. Companies also need to establish a ‘value pipeline’ to identify, categorize and execute on a portfolio of opportunities. Wall warned against the risk of ‘Pilotosis’, an affliction that results in digital products that never go beyond the experimental stage.

* Unlocking success in digital transformations (McKinsey).

Vendor presentations came from Drishya AI Labs (AI-based engineering data extraction and smart P&ID generation), IronSight (field operations), BrainToy (mlOS for low code/no code MLOps) and Matidor (Esri GIS-based project management).

The next Digitalization in Oil & Gas Canada is scheduled for April 2023 in Calgary.


Society of Exploration Geophysicists in transition

SEG/SPE talks quashed! Digital transformation task force to uberize SEG! SEG signs OSDU MoU. Energy in Data conference surplus to requirements? SEAM’s open source data platform.

The SEG has now ‘quashed’ talk of possible mergers (notably with the SPE) and is to remain an independent organization while accelerating the ‘transformation’ of the Society. A ‘strategic options’ task force and an independent third-party contractor held focus groups to get feedback from stakeholders. Themes for increased focus included broadening the community to all members, meeting the membership’s needs throughout their careers, and a drive to ‘modernize SEG’s approaches’ and ‘improve diversity’. Encouraged by this rather nebulous feedback, the SEG is to form a transformation task force and hire a project manager. For more (but not much more) on the decision not to merge and on the transformation task force, listen to SEG President-elect Ken Tubman’s podcast where he expounds, inter alia, on what the gaming industry might bring to SEG members’ collaboration. TikTok for geophysicists?

The SEG Technical Standards Committee has been engaged in discussions with The Open Group for several months and signed an MoU in August 2021 with the Open Subsurface Data Universe (OSDU). The TSC has published a data-delivery best-practices document for post-stack seismic data and is to recommend adoption of SEG-Y Revision 2.0 by year-end 2022, although this does not appear to address the SEG-Y vs. OpenVDS ‘storage wars’ we have reported on previously.

A ‘digital transformation task force’ has been set up to ‘identify business opportunities for SEG to expand in digital and virtual space’. The task force is inspired by companies ‘like Google, LinkedIn, Amazon, and Uber’ who apparently ‘provide lessons for building a business model based in digital space’, many of which are ‘relevant to a viable future for SEG’. The task force is to apply ‘best practices in virtual market analysis’ to identify the ‘personas’ that comprise the SEG’s ‘customer base’ and is using business intelligence software to test and refine personas from the Society’s database of customers. Energy in Data is presented as a task force ‘pilot’, although the conference has been running since 2020. What is new is the transition from a conference to an online community, with conferences as a ‘secondary or possibly unnecessary delivery channel’. The second trial is a community-based online business model for content derived from SEAM projects. Both trials are centered on new business lines on which SEG may be able to capitalize by providing a ‘virtual business model’.

SEAM has hitherto focused on building realistic subsurface models and simulating corresponding numerical benchmark data sets. The organization is now to undertake projects that focus on industry challenges (technology generation) and on the ‘further democratization’ of data and industry benchmarks, digital twins, and industry standards. The SEG is also to establish a ‘SEAM Data Platform’ which will be ‘the number one global platform for synthetic data sets and models’ leveraging open-source components, but not, as far as we know, OSDU! More from SEG/SEAM.


Software, hardware short takes

AgileScientific and the ML algo zoo. Altair Exchange. Asset Guardian certified. CGRisk BowTie Server 11. Chongqing Sevnce’s robot ATEX certified. Enverus MineraliQ portfolio manager. GE Digital’s Proficy 2022. Honeywell Forge connected warehouse. Interica OneView 2022.1. Petrosys PRO OSDU connector. OriginLab’s Origin 2022b. Kongsberg Digital SiteCom Go. mCloud’s AssetCare MobileNow. Moziware CIMO. Rock Flow Dynamics’ tNavigator 22.1. Brüel & Kjær Vibro’s VCM-3/Setpoint. CGG builds ‘EU’ HPC hub in UK. Emerson AMS Machine Works embeds ‘PeakVue’ alerting.

Matt Hall (AgileScientific) recently blogged on the ‘The machine learning algo zoo’ illustrating the many ‘wonderful but baffling’ ways to do machine learning. The post includes a ‘Big Giant Spreadsheet’ that compares popular learning algorithms in terms of their most important characteristics and predictive abilities.

Engineering simulation specialist Altair has announced an ‘open source’ collaborative engineering space, Altair Exchange for one-on-one support, documentation and forums. The most prolific users are rewarded with points that will appear on global leaderboards, ‘demonstrating their prowess to other members’.

Asset Guardian Solutions reports that it has been awarded ISO/IEC 27001 certification for information security management of its own and clients’ assets. The ISO framework includes policies, procedures, processes, and systems that manage information risks such as cyber security attacks, data leaks, or theft of data.

Wolters-Kluwer has announced CGRisk BowTie Server 11 with new web-based bowtie editing and new APIs for integration with third party systems and data. The new release also allows individual users to be assigned as owners of specific bowties and barriers, ‘bridging the gap between bowties as abstract models and bowties as actual representations of physical risks’.

Chinese robotics specialist Chongqing Sevnce Technology has obtained explosion-proof certification for its patrolling robotic inspector. The unit sports sensing devices to collect data and a meter-reading functionality. The robot is said to reduce costs and avoid risk to personnel.

Enverus has announced MineraliQ, a free, easy-to-use portfolio management tool for individual or family-owned mineral rights owners. MineraliQ links owner statement details to wells from the Enverus database and includes a map display of well locations and nearby oilfield activity.

The 2022 edition of GE Digital’s Proficy Historian includes enhanced system management and connectivity, a new asset model associated with historian data, and improvement in collection throughput and encryption. An administrator function spans the Proficy portfolio providing management of multiple systems from a ‘single pane of glass’.

The new Honeywell Forge connected warehouse is an SaaS offering that aggregates siloed OT data to provide a 360° view of warehouse operations with operations dashboards, critical KPIs, real-time asset health status and more.

The 2022.1 edition of Interica OneView sees fixes to the graph database that enable scheduled deletion of old metadata from the graph without impacting performance. OneView has been upgraded to use Java 13, Wildfly 22 and the Log4J vulnerability has been addressed. Long-term support for Java (17) and Wildfly version 26 will be added later this year. Interica is also working with Amazon to integrate OneView with the AWS QuickSight business intelligence service. New geoscience-based QuickSight dashboards display data patterns such as well and seismic data coverage and application utilization. OneView now supports integration with the January 2021 R3 Mercury edition of the OSDU Data Platform.

Interica parent company Petrosys has announced an OSDU Wells Connector for its Petrosys PRO mapping and data package. The 2021.2 release supports reading well headers, directional surveys, formation tops and well logs from OSDU R3.

The 2022b release of OriginLab’s Origin data analysis and graphing software adds 75+ new features and improvements and a suite of all-new apps. New capabilities include SVG export, support for GeoTIFF file import and edit and more. New apps include matrix analysis, NetCDF data analysis, Zoom FFT and radiometric geochronology.

Kongsberg Digital’s SiteCom Go adds a mobile app for access to real time drilling and well information in the SiteCom ecosystem. SiteCom Go leverages SiteCom global mnemonics to display data from multiple stakeholders as an overview of the current situation. The solution is built with the cross platform HTML5 and uses Energistics’ ETP data transfer protocol.

mCloud Technologies’ AssetCare MobileNow is said to be an ‘all-in-one connected worker solution’ for the digital oilfield. MobileNow enhances the productivity and safety of field workers across the energy industry and is ready for deployment ‘out of the box’. The solution provides access to asset information in real time and users can communicate and collaborate with experts and other co-workers remotely, sharing photos and videos of equipment requiring repair. MobileNow is delivered along with a Moziware CIMO, a pocket-sized, voice-activated, hands-free head-worn display. An AssetCare Mobile subscription starts at $99 per month.

The 22.1 release of Rock Flow Dynamics’ tNavigator reservoir simulator includes multiple additions to the CCUS simulator and interface to model CO2 migration in the subsurface. tNavigator now claims a ‘comprehensive solution for CCUS that includes geology, geomechanics and flow simulation’. The new release also offers better handling of large seismic volumes. tNavigator has also seen multiple improvements for subsurface/surface coupling studies, including the ability to run on a cluster.

Brüel & Kjær Vibro has enhanced its VCM-3/Setpoint condition monitoring system. VCM-3 data is now integrated with VC-8000 data and fed directly into the CMS to create a plant-wide solution, encompassing machine protection and condition monitoring.

CGG is to build a new ‘European’ HPC hub in Southeast England that will become operational in H1 2023, increasing its cloud HPC capacity by up to 100 petaflops. The expansion targets subsurface imaging technology and services and also specialized HPC offerings to new and existing clients in the energy, environmental and other industry sectors. As part of CGG’s commitment to green energy and its pledge to become carbon neutral by 2050, the hub will be powered with ‘100% renewable energy’, as are CGG’s other UK operating sites.

A new version of Emerson’s AMS Machine Works interfaces with edge analytics devices to monitor the health of production assets. AMS Asset Monitor provides embedded, automatic analytics at the edge using patented ‘PeakVue’ technology to alert personnel to the most common faults associated with a wide range of assets. AMS Machine Works also provides OPC UA connections to external systems such as historians, computerized maintenance management systems and more.


Energy Conference Network’s Machine Learning in Oil & Gas

NVIDIA’s partner ecosystem. FuelTrust’s AI ‘digital chemist’. Cybereum on ML for oil and gas megaproject planning. Drishya.AI’s Artisan brownfield digital twin.

Reynaldo Gomez, who manages Nvidia’s energy partner ecosystem, presented some poster child deployments of ‘GPU-accelerated computing in oil and gas’. This leverages the Nvidia GPU Cloud, a ‘free’ online code repository with containers and pre-trained models (BERT, ResNet-50, etc.) for deep learning and other HPC applications. Gomez cited Beyond Limits work on well placement optimization with reinforcement learning, Bluware’s InteractivAI (GPU-powered deep learning-based seismic interpretation), Helin’s Red Zone computer vision-based safety systems for offshore rigs and Abyss Solutions’ semantic segmentation models for corrosion monitoring. Gomez also presented work performed by InstaDeep for Total on deep learning-based fossil identification in 3D micro CT scans which leveraged a ‘distributed 3D Mask R-CNN’ developed by DKFZ, the German Cancer Research Center.

FuelTrust’s Darren Shelton opined that ‘almost all carbon/GHG reporting is using rough estimates, inaccurate formulas, and has significant duplication across the supply and distribution chains’. GHG traceability is ‘complex’. Enter FuelTrust’s ‘from source to smoke’ solution for end-to-end visibility, compliance and anti-fraud needs in the global fuel lifecycle. An AI-based ‘digital chemist’ tracks, predicts, and validates outcomes at every step in the emissions lifecycle. Results are ‘securely recorded on the blockchain’. ‘DNA tracing’ of marine bunker fuels is said to address regulatory compliance and mitigate frauds in quantity and quality.

Ananth Natarajan (Cybereum) showed how ML can be used to improve oil and gas megaproject planning. There is indeed room for improvement. ‘For decades, capital project leaders have relied on practices that attempt to optimize individual investments, such as a nuclear power plant, an oil refinery, or a pipeline. Cost overruns approach $1.2 billion on the average project (some 79% over budget) and delays run from six months to two years.’ Oil and gas megaprojects are not the worst, but still on average show a 23% cost overrun. The problem is down to three causes: unpredictable project complexity; bias and over-optimistic estimates of cost and schedules; and ‘principal agent issues’ including overstatement of benefits and ‘data hiding’. Megaprojects may suffer from the ‘impossibility of deterministic planning’. A variety of cognitive biases may come into play, from optimism to Parkinson’s Law, whereby ‘work expands to fill the allocated capacity’, and ‘Student Syndrome’, where work is ‘procrastinated to the last moment’. All of which and more are elegantly summarized in the DesignHacks cognitive bias codex. Natarajan traced the evolution of project management from the Gantt approach of the last century through PERT, Primavera and MS Project. The last few years have seen the emergence of the cloud, AI, and ‘reference class forecasting’, a Cybereum specialty. Another favored approach is AI-based network analysis, said to be a ‘step forward from traditional PERT and critical path methods’. These are illustrated on the dynamic graph-based web page from the Cybereum-backed Blockchain for Projects site. BfP is described as a ‘Web3’ development that encodes projects into a ‘responsive, committed, symbiotic organism comprised of all the stakeholders, by the combination of incentive engineering and cryptographic methods’!
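The idea behind reference class forecasting is simple enough to sketch: rather than trusting an ‘inside view’ estimate, adjust it using the observed distribution of overruns in a class of comparable past projects. The figures and the 80% confidence level below are hypothetical illustrations, not Cybereum’s method.

```python
def reference_class_forecast(inside_estimate, past_overruns, percentile=0.8):
    """Return the budget needed to be `percentile` confident of not overrunning,
    given fractional overruns (e.g. 0.23 = 23%) from comparable past projects."""
    # Convert overruns to cost 'uplift' factors and sort into an empirical distribution
    uplifts = sorted(1.0 + o for o in past_overruns)
    # Pick the uplift at the requested percentile of that distribution
    idx = min(len(uplifts) - 1, int(percentile * len(uplifts)))
    return inside_estimate * uplifts[idx]

# Hypothetical reference class: overruns on eight comparable megaprojects
overruns = [0.05, 0.10, 0.15, 0.23, 0.30, 0.45, 0.79, 1.20]
budget = reference_class_forecast(1_000, overruns, percentile=0.8)
print(f"Inside-view estimate: $1,000m; 80%-confident budget: ${budget:,.0f}m")
```

With these invented numbers the 80th-percentile uplift is 1.79, i.e. the inside-view estimate would be raised by 79% before committing to a budget.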

Amardeep Sibia from Drishya.AI presented his company’s ‘Artisan’ engineering digitalization application. Artisan can be used to create a digital twin of a brownfield asset. The toolset reads scanned engineering drawings (notably P&ID diagrams), understanding the engineering logic to create the digital model. Artisan detects pipe naming conventions, symbols and lines and prepares symbol-to-line associations for manual QC. The system recreates CAD plant diagrams and builds 3D models of legacy plant along with a master tag list and asset inventory. The poster child is Shell Canada’s Peace River SAGD brownfield. Drishya recently joined Scovan’s PadX partnership that sets out to ‘accelerate innovation of SAGD well pad design and execution through Western Canada’.

More from Energy Conference Networks.


Folks, facts, orgs…

AAPG, Alberta Energy Regulator, Applied Petroleum Technology, Allegheny Science & Technology, Aveva, CAM Integrated Solutions, CGG, CGI, Cognite, Drishya AI Labs, Energy Information Administration, EQT, Energy Systems Catapult, EnLink Midstream, Gate Energy, UK Geological Society, Halliburton, Hart Energy, Lloyd’s Register Foundation, Lease Analytics, Namur, NIST, Navigator CO2 Ventures, OneFuture, Opportune LLP, Petrosys, Project Canary, Resoptima, Ryder Scott, SEG Foundation, Schlumberger, SeekOps, US Well Services, Valor, Weatherford, Ovation Data, Technical Toolboxes.

Claudia Hackbarth (ex-Shell) is president-elect of AAPG for 2022-23. Matthew Pranter (U Oklahoma) is editor.

Laurie Pushor is the new president and CEO of the Alberta Energy Regulator. He was previously with the Saskatchewan Ministry of Energy and Resources.

Applied Petroleum Technology has appointed Barry Bennett to head-up its UK operation in Conwy, Wales. He succeeds Julian Moore who is now APT global CTO. Bennett was latterly with Schlumberger.

Leah Guzowski is now VP of DOE Programs at Allegheny Science & Technology. She hails from the US NIST National Laboratory.

Ravi Gopinath has stepped down from his executive role as chief strategy officer and chief cloud officer at Aveva. He remains a strategic advisor.

CAM Integrated Solutions has promoted Drew Stehling to VP facilities.

CGG has reorganized: A Data, Digital & Energy Transition (DDE) reporting segment includes Geoscience (headed by Peter Whiting) and Earth Data (Dechun Lin). The Sercel/Equipment unit becomes Sensing & Monitoring (Emmanuelle Dubu).

François Boulanger is now president and CEO of CGI. Steve Perron takes over his previous EVP and CFO role.

Cognite has appointed Girish Rishi as CEO replacing co-founder John Markus Lervik who remains as chief strategy officer. Rishi hails from Blue Yonder.

Donovan Nielsen has joined the board of Drishya AI Labs. He is a co-founder of Scovan Engineering.

Joseph DeCarolis is the new Administrator of the US Energy Information Administration (EIA).

EQT has hired Tinna Nielsen (Copenhagen) and Angela Jhanji (New York City) to its sustainability team.

Guy Newey is now CEO of the UK Energy Systems Catapult, succeeding Philip New.

Bob Purgason has joined EnLink Midstream to head its carbon solutions group.

Lee Jordan is the new CEO of Gate Energy.

Jonathan Redfern (U Manchester) is the new editor of the UK Geological Society’s Petroleum Geoscience publication.

In what it describes as a ‘robust succession management process’, Lance Loeffler is named SVP of Halliburton’s MENA region. Eric Carre is CFO.

Deon Daugherty is now Editor-in-Chief of Hart Energy’s Oil and Gas Investor. She hails from Energy Intelligence Group.

Ruth Boumphrey is now CEO of Lloyd’s Register Foundation following Richard Clegg’s retirement.

Hayes Carter is now SVP at Lease Analytics. Travis Beavers is VP Land.

Christine Oro Saavedra has been appointed as general manager of the EU Namur standards body, taking over from Nils Weber. Both hail from Bayer.

Laurie Locascio is the new director of the US National Institute of Standards and Technology.

Chris Brown, Eric Leigh, Jim Mullin and Jenny Speck have joined Navigator CO2 Ventures.

Jim Kibler is now executive director of OneFuture, replacing retiree Richard Hyde.

Daniel Kohl is now co-head and MD of Opportune LLP’s investment banking practice. Nichole Jaggers is chief people officer. She hails from BP.

Karolina Harvie has rejoined Petrosys as Interica OneView support.

Tanya Hendricks is chief commercial officer at Project Canary.

Philippe Mieussens is MD of Resoptima’s new center of excellence in Bucharest.

Andrew Thompson has rejoined Ryder Scott as SVP and manager of the Calgary office. He hails from Macquarie Group. William Turner joins as senior project engineer (midstream and upstream) from Rystad Energy.

Katie Burk has been promoted to MD of the Society of Exploration Geophysicists Foundation.

Schlumberger has opened a new ‘Innovation Factori’ in Oslo, Norway.

Jennifer Stewart has joined the advisory board of SeekOps.

Joel Broussard is now chairman of the US Well Services board of directors. Kyle O'Neill is president and CEO. Josh Shapiro has been promoted to CFO.

Cathy Ramirez has been promoted to director of oil and gas accounting at Valor.

Keith Jennings, EVP and CFO, is to leave Weatherford. The search for a replacement is on.

Ovation Data has named Jasmine Tran VP of Marketing. She hails from CGG GeoSoftware.

Technical Toolboxes has appointed Jim Schuchart as president and CEO.

Deaths

Ryder Scott announces the death of Charles Milner (91) who joined the company in 1967 as a petroleum engineer. He retired as president of Ryder Scott in 1990.


In memoriam Claude Royer

Elf geologist who debunked the Great Oil Sniffer Hoax dies. Oil IT Journal recalls the 1970s debacle and some other exploration technologies that failed to bear fruit.

A death notice in Le Monde of a certain Claude Royer, geologist, caught our attention and led to a literature search that turned up quite an interesting tale. Royer worked for what was then Elf, now TotalEnergies, and was a key player in the debunking of the ‘avions renifleurs’ (sniffer planes) that caused quite a kerfuffle in France in the late 1970s. The story is told at length on Wikipedia as the Great Oil Sniffer Hoax. Briefly, a couple of con men had convinced Elf’s management that their black box could detect hydrocarbons from the air. The invention had great strategic implications and secrecy was of the utmost importance. Only the top brass of Elf and an inner circle around the French Presidency were involved in evaluating the technology. That was until Royer was called in and spotted the trickery. The ‘images’ of oil and gas fields that appeared on the display were actually drawn on pieces of paper. The machine cleverly paged these in and out of view as the survey proceeded. Commenting on the event years later, Royer observed of the inventor, Aldo Bonassoli, ‘He was the magician who made the images appear to people who had such a strong will to see them. It gave him great pleasure to please so many people, especially such important ones!’

The Avions Renifleurs story is not really an outlier in the history of oil and gas exploration. Many oil and gas explorers are approached by folks with weird black boxes that are supposed to detect hydrocarbons. In fact, the folks touting such machines may well believe that they work. Sometimes it is hard to tell whether a system is a scam or whether it just does not work very well. Thus, subsequent to the Elf debacle, we have seen technologies heralded as breakthroughs that did not quite pan out.

In the 1990s, BP invented its own ‘sniffer plane’, a system that used an airborne laser to detect hydrocarbon seeps from oilfields and was believed to be ‘an excellent tool for frontier exploration’. As far as we know this has not quite turned up trumps.

Again, without wishing to spoil anybody’s business model, Schlumberger’s 2004 acquisition of ExxonMobil’s ‘R3M’ deep electromagnetic prospecting technology was heralded with the claim that controlled-source EM ‘could replace seismic’. Well, that hasn’t happened yet!

The field of geoscience is generally well supplied with Royers who have their feet on scientific ground and are capable of detecting the more egregious examples of improbable technology. If only the same could be said for IT!


Done deals …

Emerson is now AspenTech SS&E. Dragoneer, Bessemer back ComboCurve. Cyient acquires Grit Consulting. Dawson takes charge on failed Wilks Bros merger. EZ Ops bags Payload and Drift. Green Park & Golf backs Earthview. Enbridge funds Smartpipe. Kongsberg Digital acquires Interconsult Bulgaria. Aker, Cognite and Telenor team on Omny cybersecurity boutique. Validere acquires Clairifi. Vela Software buys Geovariances. Sercel acquires Geocomp.

Emerson E&P Software (previously Paradigm) is now AspenTech subsurface science & engineering. The combined companies now offer ‘end-to-end geoscience and engineering software’ and a ‘comprehensive asset optimization portfolio’. Capabilities span the petroleum supply chain, from the reservoir to fuel distribution at the gas station. The team of 3,700 professionals is also to pursue ‘comprehensive solutions’ for carbon capture utilization and sequestration, geothermal and hydro energy and mineral resource extraction under president and CEO Antonio Pietri. The deal involved a $6 billion cash payment from Emerson to AspenTech shareholders in exchange for a 55% stake in the combined company.

Cloud-based energy analytics specialist ComboCurve has raised $50 million through a Series B funding round led by Dragoneer Investment Group and Bessemer Venture Partners. The monies will be used to accelerate core product enhancements and expand into greenhouse gas emission forecasting, scheduling, and modeling of renewable energy sources.

Indian Cyient is to acquire Singapore-based Grit Consulting, strengthening its global technology consulting practice.

Dawson Geophysical reported a $2,872,000 charge relating to its proposed merger with a subsidiary of Wilks Brothers, which failed to receive the requisite 80% shareholder approval. CEO Stephen Jumper also reported difficult market conditions in the North American seismic data acquisition sector with ‘historically low levels of crew activity’, although a ‘high-priced oil environment’ was cause for encouragement.

Canadian EZ Ops has acquired two transportation and logistics companies, Payload Technologies (cloud-based field tickets) and Drift Technological Solutions (fluid logistics optimization). More in the release.

Earthview Corporation, developer of the BluBird continuous methane monitoring platform, has secured funding in a round led by Dallas-based Green Park & Golf Ventures, a specialist technology and scientific investor.

Smartpipe Technologies has received a $6.6 million investment from Enbridge to develop its pipeline technology. The monies will be used to improve pipeline safety and allow for the transportation of hydrogen and carbon dioxide. Smartpipe’s patented technology involves a high-strength, composite internal pipeline liner that is pulled through an existing pipeline to increase structural integrity and allow for improved monitoring with an embedded optical fiber. The deal follows a long-term association between Smartpipe and Enbridge that has seen the development of a 16” high-pressure composite line pipe. The companies are now working on a 24” diameter Smartpipe.

Kongsberg Digital has acquired Interconsult Bulgaria, a software developer specializing in process modeling, software architecture design, quality assurance and 3D modeling. ICB will serve as KDI’s European software development hub and support its vision to ‘accelerate the green shift by digitalizing the world’s industries’. The companies have been working together since 2004.

Aker, via its Aker Capital unit, Cognite and Telenor are establishing a software security company. The company, called Omny, will fill a gap in the cybersecurity market where there is an ‘unmet need for software that prevents cyber attacks and secures operations’. The aim is to build a Norwegian software company that will be a global player in operational industrial security.

Validere has acquired the emissions management and regulatory reporting platform Clairifi, strengthening its ESG offerings and allowing customers to ‘turn compliance into capital’. Clairifi offers end-to-end reporting capabilities on environmental and regulatory requirements, ‘minimizing the burden around emissions management while maximizing tax savings and improving operational efficiency’.

Vela Software has acquired French geostatistical boutique Geovariances. Vela stated that the company will be run as a ‘standalone and autonomous business’ while benefiting from proximity to other Vela-owned companies operating in the mining and oil and gas industries.

CGG unit Sercel has acquired Geocomp, a provider of geotechnical infrastructure monitoring, consulting and testing expertise and technology, delivering solutions that address safety issues on aging, at-risk infrastructure as well as on modern renewable energy construction projects.


Petroleum Industry Data Exchange (PIDX) Spring 2022 conference

Microsoft cloud for sustainability. PIDX ETDX embeds emissions data in PIDX XML. PwC on the carbon ledger and the Open Footprint Forum.

Microsoft’s Kadri Umay cited the GHG Protocol as a reference for reporting Scope 3 emissions, those that a ‘company is responsible for outside of its own walls’. Scope 3 makes up the overwhelming majority (98%) of Microsoft’s emissions and the company is committed to halving these by 2030. Likewise, Schlumberger’s Scope 3 emissions amount to 96% of its total and a 30% cut is announced for 2030. Umay observed that Scope 3 emissions are very hard to calculate, ‘your suppliers’ scope 1 - 2 are your scope 3, and they are already reporting it’. There are also challenges with the calculation and reporting of GHG emissions. Umay cited The Carbon Call organization, which reported that today, ‘carbon accounting suffers from data quality issues, measurement and reporting inconsistencies, siloed platforms, and infrastructure challenges [which] make it difficult to compare, combine and share reliable data, particularly for companies’. Enter the Microsoft Cloud for Sustainability which promises ‘automated calculation of Scope 3’. MCS can connect to some 50 data sources, from Word through JSON to Salesforce, and offers dashboards, scorecards and reports to drill down, in incredible detail, to emissions factors for different fuels and vehicles.
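Umay’s observation that ‘your suppliers’ scope 1 - 2 are your scope 3’ boils down to an allocation sum. A minimal sketch, with hypothetical suppliers and a simple spend-based (revenue-share) allocation, one of the methods the GHG Protocol permits:

```python
# Hypothetical figures and field names: each supplier's reported scope 1+2
# emissions are allocated to the buyer in proportion to the buyer's share
# of that supplier's revenue.
suppliers = [
    # supplier scope 1+2 emissions (t CO2e), buyer spend, supplier total revenue
    {"name": "DrillCo",  "scope12": 50_000, "spend": 10e6, "revenue": 100e6},
    {"name": "ShipCorp", "scope12": 80_000, "spend":  5e6, "revenue": 200e6},
]

def scope3_upstream(suppliers):
    """Spend-based allocation of supplier scope 1+2 into buyer scope 3."""
    return sum(s["scope12"] * s["spend"] / s["revenue"] for s in suppliers)

print(f"Upstream scope 3: {scope3_upstream(suppliers):,.0f} t CO2e")
```

The hard part, as the article notes, is not the arithmetic but obtaining consistent, comparable supplier figures in the first place.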

The PIDX ETDX emissions transparency data exchange group has spent a couple of years looking into this problem with industry partners (Baker Hughes, BP, Chevron, ConocoPhillips, Halliburton, Shell and Schlumberger) with the intent of developing standards for the exchange of carbon emissions and other energy transition requirements that need to be harmonized across industry participants. ETDX is working with industry and other bodies including the UN, the CDP (Carbon Disclosure Project) and The Open Group’s Open Footprint Forum, which is operating in a similar space. ETDX is exploring how the existing PIDX schemas could be extended to support the transfer of emissions data from supplier to operator, and vice versa, leveraging PIDX downstream code tables and the definitions and UNSPSC mappings in the Petroleum Industry Data Dictionary.

Umay presented a use case involving the supply and use of drill bits to show the interaction between different contracts for supply, shipping and drilling and how the related emissions are divvied up between the parties, passing emissions data by part serial number to the operator as an invoice line item. Umay wound up returning to the MCS to show how the intricate calculations of emissions associated with the use of Microsoft’s Azure cloud computing resources take account of the manufacture, packaging, transportation, use and end-of-life phases of hardware in Microsoft-owned and leased data centers. The methodology is presented in a Microsoft White Paper on Azure Scope 3 Emissions.

Comment: Alongside the difficulty of accurately defining which emissions fall into whose scope, the exercise appears to involve easily as much data collection and reporting as is involved in financial accounting. While there is a certain logic to this, it is hard to see companies throwing the requisite resources into this magnificent box-ticking exercise, or to imagine a regulator ever being in a position to mount a comparable audit effort.

Chris Welch from OFS Portal reprised some of the Scope 3 definitions and showed how they can fit with PIDX supply chain message orchestration. The ETDX is extending the PIDD with greenhouse gas attributes for product cradle-to-grave emissions, operations and measurement uncertainty. Emissions numbers are represented in an extended PIDX XML format as <PIDX:EmissionsData>, embedding information on kilograms of CO2 equivalent in an electronic invoice. The idea is to move from sustainability reporting that uses industry averages to detailed line-by-line emissions reporting. As reporting shifts from calculation to measurement this will bring more granularity and, presumably, a correspondingly huge increase in data to report and manage. Welch proposed a division of labor between the Open Footprint Forum and PIDX, with the former responsible for defining reporting data standards and the latter for a data exchange standard.
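For the curious, here is how such an invoice line item might look when built programmatically. The EmissionsData element name comes from the presentation; the child element names, serial number and namespace URI are our own placeholders, not the published PIDX schema:

```python
# Illustrative only: embed a per-line-item emissions figure in an invoice
# fragment using an EmissionsData element as described at the conference.
# The namespace URI and child element names are assumptions.
import xml.etree.ElementTree as ET

PIDX = "http://www.pidx.org/schemas/example"  # placeholder namespace
ET.register_namespace("pidx", PIDX)

line = ET.Element(f"{{{PIDX}}}InvoiceLineItem")
ET.SubElement(line, f"{{{PIDX}}}PartSerialNumber").text = "BIT-00417"
emis = ET.SubElement(line, f"{{{PIDX}}}EmissionsData")
ET.SubElement(emis, f"{{{PIDX}}}KgCO2e").text = "412.7"

print(ET.tostring(line, encoding="unicode"))
```

Embedding the figure per serial-numbered line item, rather than as a document-level total, is what enables the shift from industry-average to line-by-line reporting that Welch described.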

John Service (PwC) stated that a ‘carbon ledger’ is required to ensure high quality reporting and proposed a data model for a single source of truth. This would enable informed decisions with a transparent view of carbon-related data such that companies can ‘navigate evolving carbon regulations with the same level of accuracy and transparency as seen in financial disclosures’. This will enable companies to achieve decarbonization targets with auditable data and to defend against claims of greenwashing. Accurately quantified carbon-associated costs can be attributed to the correct cost center, and carbon data can be shared with key stakeholders to support their net zero agendas.

Today, carbon data is siloed in disparate systems that make it difficult to get an overview. Data is ‘divorced’ from the core infrastructure and architecture that companies use to manage their business*. Moreover, it lacks the ‘transparency and actionability needed to remove carbon emissions from products, operations and value chains’. A robust data model is key to a working carbon ledger, but who is to build the carbon data model? For Service it has to be the Open Footprint Forum (OFP), an Open Group-backed cross-industry collaboration to create a common data standard for carbon-related information and a set of standards for data collection and management across all industries.

* A questionable claim, as a Google search for ‘SAP ESG’ might suggest.
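PwC’s data model was not disclosed, but as a minimal sketch, a carbon ledger entry might mirror a financial posting. The field names below are entirely our own invention, not PwC’s or the OFP’s model:

```python
# Hypothetical carbon ledger entry: emissions attributed to a cost center
# with an audit trail, then rolled up by scope as a disclosure would be.
from dataclasses import dataclass
from datetime import date

@dataclass
class CarbonLedgerEntry:
    posting_date: date
    cost_center: str
    scope: int              # 1, 2 or 3
    kg_co2e: float
    source_system: str      # where the figure came from, for auditability
    measured: bool = False  # measured vs calculated from an emission factor

ledger = [
    CarbonLedgerEntry(date(2022, 5, 1), "drilling-ops", 1, 15000.0, "SCADA", True),
    CarbonLedgerEntry(date(2022, 5, 1), "logistics", 3, 4200.0, "supplier-invoice"),
]

# Roll up by scope, as a disclosure report would.
by_scope = {}
for e in ledger:
    by_scope[e.scope] = by_scope.get(e.scope, 0.0) + e.kg_co2e
print(by_scope)  # → {1: 15000.0, 3: 4200.0}
```

The `source_system` and `measured` flags gesture at Service’s point: the ledger is only ‘auditable’ if every posting carries its provenance.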

More from the PIDX Conference home page.


2021 IQPC Cyber Security for Energy and Utilities

Plains All-American on US NPC report of oil and gas cyber security and on OT/IT convergence risks. KPMG on supply chain related cyber risk and SOC Type 2 compliance. ONG-ISAC on recovering from a breach.

Al Lindseth (Plains All American Pipeline) gave a recap of the 2019 US National Petroleum Council’s Dynamic Delivery study of cybersecurity across America’s oil and natural gas transportation infrastructure. The report traced the convergence of information technology networks with operations technology and the consequent heightened cyber risk to operations. The push for more data from the operating environment for analysis led to the breakdown of traditional defenses and the need for a ‘defense in depth’ approach to IT networks. Twenty years ago this meant a ‘moat’, i.e. a completely isolated network*. But over time a breach became an inevitability and the focus moved to detection and threat intelligence. This is now happening with operations technology, which also relied on isolation and network segmentation. A year ago the feeling was that there was no need to overreact to the ‘hype factory’; there were relatively few cyber events in the US and the thinking was that OT had more natural defenses. The NPC report recommended that IT and OT groups work together, although ‘this was not happening’. Another recommendation was for a cyber process hazard analysis to evaluate the risks of potential attacks and establish an appropriate level of protection. The NPC report came forward with a prescriptive approach to cyber security, but a risk-based approach may be better. With finite resources, if one risk is brought down to zero, other risks will inevitably rise. How sustainable are more regulations atop other regulations? The risk-based approach is preferable although ‘when there is an incident, we need regulators with teeth’!

* The moat, a.k.a. the perimeter of a plant, has been a popular topic in cyber security since The Open Group’s Jericho Forum introduced the ‘deperimeterization’ concept back in 2002.

Jason Howard-Grau (KPMG) presented on the risk of supply chain threats, an area where the risk is great and increasing as the OT landscape gets more complex. ‘Attackers will wait patiently to exploit the weakest link in the supply chain’. Some have posed as visiting engineers to lure site personnel. It is also time to modernize your old kit. KPMG recommends that operators mandate SOC Type 2 compliance, which ‘was not happening’. Covid means that there is less reliance on on-site personnel and now folk dial in from home, and there is a movement to keep this new way of working. Ransomware is on the rise and the extended supply chain means that ‘your risk is my risk’. Howard-Grau recommends that those oil and gas companies that the DHS considers critical infrastructure should adopt industry-specific cyber security standards and that these should be audited by DHS and other ‘government-sanctioned’ entities. The results of such audits should inform new regulations, in particular where these may demonstrate the limitations of today’s voluntary framework. Howard-Grau also recommended that existing cyber security standards, notably API 1164, better integrate cyber security that spans IT and OT. The skill sets are different and the two ‘will not meet’. Standards need to be updated, as do the ‘rules of engagement’. Industry needs guidance. ‘All struggle with the organizational structure. Engineers think differently from cyber geeks and this will never change without better governance. Every organization says that the asset inventory is crucial, but in reality even this can be hard to achieve. You can’t secure what you can’t see!’. There is too much inside engineers’ heads, folks don’t write it down. Configuration files may be hard to locate. Monitoring OT can be hard. KPMG found that one supplier had installed its own router inside a client’s site, unbeknownst to the client! These are process and people issues.
The Colonial Pipeline attack was not a full-frontal system attack. Companies need to revisit their plans and understand how to act post event, building-out industry standards appropriately.

Angela Haun presented the work done at the Oil and Natural Gas information sharing and analysis center (ONG-ISAC), which ‘serves as a central point of coordination and communication to aid in the protection of exploration and production, transportation, refining, and delivery systems of the ONG industry, through the analysis and sharing of trusted and timely cyber threat information, including vulnerability and threat activity specific to ICS and SCADA systems’. The OT/IT merge is still a work in progress. Learning how to recover from a breach is key. ISAC helps you filter and prioritize available information. Join one of the 27 ISACs that meet regularly.

More from IQPC.


Sales, partnerships, deployments

Repsol rolls-out Celonis. Digital Intelligence Systems partners with ComplianceQuest on AI EHS. DNV to support IOGP members’ decarbonization. DeepOcean partners with Aker BP. Equinor signs with Aibel and APT. Fugro Starfix for Jumbo Maritime. GE Digital AEMS for PDO. Halliburton/Aker BP digital twin for development. Petrobel deploys iEnergy Stack. BW Energy selects IFS Applications ERP. Kongsberg EPC for Yinson FPSO. Kuva Systems for Marathon. NetSpring teams with DCP Midstream. Williams moves ops to Oracle Fusion cloud. Trafigura teams with Palantir. Phillips 66 joins Ipsos program. Teledyne and MFE Inspection team. TotalEnergies launches Ausea emissions program. US Well Services integrates KCF MachineIQ. Upstream Development joins CO-LaN. Nasuni joins OSDU Forum. Petrobras extends CGG contract. Liberty partners with Seismos.

Repsol is to deploy Celonis’ execution management system to ‘reveal and fix’ process inefficiencies and ‘reduce the time spent turning knowledge into action’. Celonis EMS connects data across systems, apps and desktops. In 2021, Repsol, in collaboration with Accenture, began marketing ARiA, ‘advanced Repsol intelligence and analytics’, its own, cloud-based data and analytics platform. Celonis will now be deployed alongside the ARiA platform.

Digital Intelligence Systems is partnering with ComplianceQuest to provide ‘AI-powered’ environmental health and safety solutions for customers in the oil and gas industry. DiSys’ D2M managed services division will serve as the lead consultant and implementor of the combined training, reporting and auditing services.

IOGP, the International Association of Oil & Gas Producers, has selected DNV to support its members’ decarbonization initiatives. DNV will develop metrics, recommended practices, guidelines, and methodologies, focusing on carbon capture and storage (CCS), electrification of oil and gas assets, reducing flaring and venting, increasing energy efficiency, and developing business cases for hydrogen. More from DNV.

DeepOcean has signed a long-term strategic partnership with Aker BP for the provision of subsea inspection, intervention, repair, survey and emergent operations together with associated onshore engineering and project management services. More from DeepOcean.

Equinor has entered a ten year collaboration agreement with Aibel for the provision of future maintenance and modification services onshore and offshore. More from Equinor.

Equinor has joined forces with Applied Petroleum Technology on an R&D project that aims to replace downhole sampling and logging with ‘more cost-efficient solutions’. The project will use geochemical analysis to extract more information from reduced data acquisition programs, replacing activities such as downhole fluid sampling, production logging and wireline logging. More from APT.

Jumbo Maritime has awarded Fugro a positioning and metocean services contract for the transport and installation of a new, 24,000-ton floating production system at the US Gulf of Mexico Vito deepwater development. A remotely-enabled Fugro Starfix solution will provide real-time knowledge of all eight vessel locations. More from Fugro.

Petroleum Development Oman has purchased an advanced energy management system (AEMS) from GE Digital to plan, control, and optimize power generation for its oil production operations. PDO runs its own electric network, a microgrid that powers its business and runs autonomously from the main grid. The GE Digital software will enhance the management of the microgrid and prepare for the ‘challenges and opportunities of renewables in the future’.

Halliburton and Aker BP are working on a ‘first-of-its-kind’ digital twin for field development. This is to result in a new field development planning (FDP) package from Halliburton, expanding the company’s digital well program, a DecisionSpace 365 cloud application. FDP, it is claimed, is ‘built on the OSDU data platform’.

Petrobel, a joint venture between ENI and the Egyptian General Petroleum Corporation, is to deploy iEnergy Stack, Halliburton’s ‘cloud solution that runs on-premise’, to manage petrotechnical software applications.

Global E&P operator BW Energy has selected IFS Applications enterprise resource planning and asset management (EAM) software to support its global oilfield production and development strategy. More in the release.

Kongsberg has signed an engineering, procurement and construction (EPC) agreement with global offshore production contractor Yinson for the supply of an integrated suite of electrical and control equipment for an FPSO currently under conversion. Scope includes Kongsberg’s E-house, electrical, control, safeguarding and telecommunication equipment solutions, and will include service support on the Yinson-owned Maria Quitéria FPSO. More from Kongsberg.

Following a successful pilot with Marathon Oil of its methane monitoring technology, Kuva Systems has deployed additional infrared cameras for real time leak detection. Kuva’s technology is also used at the Oilfield Technology Center (OTC) at Texas Tech to establish detection limits in preparation for new EPA rules expected later this year. The OTC is a 10-acre facility at Lubbock, TX, in the Permian Basin.

NetSpring, a provider of ‘metrics-first’ operational intelligence solutions has announced a collaboration agreement with DCP Midstream to develop advanced streaming analytics solutions from IoT operational data. These solutions will help energy companies drive operational excellence across field assets by uncovering new and untapped insights. The first use case is an operational assurance solution that detects critical operating issues and delivers real-time contextual mobile notifications to stakeholders, while initiating restoration workflows.

Williams Companies, a major US natural gas company, has moved its finance and operations to the cloud with a deployment of Oracle Fusion cloud enterprise resource planning. The new solution ‘eliminated thirteen different applications that had been bolted on, improving data governance and visibility into financial metrics’.

Oil trader Trafigura is collaborating with Palantir Technologies on a supply chain carbon emissions platform. The solution involves a ‘consortium approach’ whereby participants across global energy and commodities supply chains will model lifecycle carbon intensities, enabling enhanced visibility and reporting. Palantir’s Foundry operating system will be configured to provide consortium partners with a calculation of carbon intensity across supply chains, beginning with crude oil and refined products, and concentrates and refined metals. The deal leverages prior work by Palantir and Trafigura, where a pilot built scenarios ‘across ten million carbon pathways’ using commodity shipment data from Trafigura and other metrics.

A recent report in the Financial Times reported Palantir’s ‘philosophical problems’, ‘Asked to set a date for profits, not only did executives at [Palantir] fail to answer the question during a first-quarter earnings call, but one began to talk about nuclear war’.

Phillips 66 is transitioning its site assessment program to Ipsos Channel Performance for its Phillips and Conoco branded sites. ICP will enhance Phillips 66’s ‘excellence in action’ site assessment program with quarterly site assessments across the US, Puerto Rico and Mexico.

Teledyne FLIR Defense has teamed with MFE Inspection Solutions to integrate its FLIR MUVE C360 multi-gas detector on Boston Dynamics’ Spot robot and commercial unmanned aerial systems. The integrated solutions will enable remote monitoring of chemical threats in industrial and public safety applications. The MUVE C360 detects and classifies airborne gas or chemical hazards for industrial safety and inspection applications.

TotalEnergies has launched a worldwide drone-based emissions measurement campaign across all its upstream operated sites. The campaign uses AUSEA technology developed by TotalEnergies, the French National Research Center for Scientific Research (CNRS) and University of Reims Champagne Ardenne. AUSEA consists of a miniature dual sensor mounted on a drone, capable of detecting methane and carbon dioxide emissions and identifying their source. More from TotalEnergies.

US Well Services has integrated KCF’s MachineIQ fault detection automation engine and vibration sensors with its ‘next generation’ Clean Fleet electric frac systems. The solution autonomously balances the load across multiple frac pumps on site.

Upstream Development and Engineering has joined CO-LaN, the process engineering standards association, as a corporate associate member. UDE provides design and engineering services for oil and gas processing facilities in the US and South Korea. Clients include Chevron, DSME, Mustang and Technip.

File management specialist Nasuni has joined the OSDU Forum. Nasuni specializes in cloud-based file storage and cyber/ransomware protection.

Petrobras has awarded CGG a five year extension for the provision of its Geovation seismic imaging software. The agreement gives Petrobras geoscientists access to advanced technology including full-waveform inversion and user training from CGG’s GeoTraining center.

Liberty Energy has partnered with Seismos on the provision of real-time stimulation QC. Liberty will deploy Seismos’ acoustic sensing and non-invasive monitoring systems to perform quality control of stimulation performance. Seismos’ ‘measurements-while-fracturing’ uses active controlled acoustics to ‘probe the near-wellbore fracture network’ and compute a near field connectivity index for each stage.


Standards stuff

ISO/IEC standard for AI governance. OGC publishes results from ‘Testbed-17’. OGC seeking sponsors for disaster management pilot. PIDX releases units of measure guidelines. UK Treasury’s Transition Plan Taskforce to establish standard disclosure framework.

A new standard, ISO/IEC 38507:2022 covers the governance implications of the use of artificial intelligence by organizations. The document provides guidance on the governance and use of AI to ensure its ‘effective, efficient and acceptable use within the organization’.

The Open Geospatial Consortium (OGC) has published the outcomes of Testbed-17. TB17 research covers model-driven standards, geo data cubes, sensor integration and more.

The OGC is also seeking sponsors for a pilot to improve disaster management and response. The OGC Disaster Pilot 2022 will use spatial data standards with web and cloud technologies so that stakeholders can collaborate across any distance, using disparate data, to manage every phase and scale of disasters. The 2021 edition of the DP was backed by NASA, USGS and Natural Resources Canada with support from AWS.

PIDX International has released a set of units of measure (UOM) scheme guidelines to allow trading partners to share the units of measure used in price sheets, invoices, and transaction documents. The UOM scheme is backward compatible with PIDX price sheets. The UOMs fit into the price sheet header, and support ‘ANSI, ISO, or UNECE validation’. More from the PIDX 1.62 schema bundle.

The UK Treasury has announced a Transition Plan Taskforce to support the move to a low-carbon economy. The TPT is to make recommendations for a disclosure framework that enables ‘science-based, standardized and meaningful transition plans’ including disclosure guidance, templates and metrics. Beyond its UK regulatory mandate, the TPT hopes to be ‘an example for the development of other national and international standards and norms’.


Digital Twin digest

Rockwell explains the fundamentals of digital twins (sort of!). Digital Twin Consortium goes open source with French roastery poster child. Element Analytics K-Graph and the digital twin. Oil IT Journal probes the digital twin’s ancestry.

We were intrigued by the title of an article in the Rockwell Journal that promised to explain the ‘fundamentals of digital twins’. The short posting makes the case that what distinguishes a DT from a ‘simulator’ is that the former is a ‘living digital replica’ and the latter is … well, it’s really not very clear. Strip out the marketing baloney and you are left with the concept of a simulator that is tuned to mimic a machine, plant or reservoir as accurately as possible. The article points the reader to a combination of MapleSim and Rockwell’s own Studio 5000 design environment. The referenced white paper in turn points to a use case of ‘virtual commissioning’ that dates back some 15 years. No, the digital twin is really nothing new!

Hedging its bets on the newness issue we have the ‘Digital Twin Consortium’ that describes the DT as ‘relatively new’. The DTC now offers an open-source repository to ‘help DT communities collaborate while building the market’. To date the most active repo appears to be Bema’s Ecolcafé, an ERP/MES system that underpins ‘Torréfacteur 4.0’, an Industrie 4.0 coffee roastery.

IT/OT data management specialist Element Analytics has rolled out tools for digital twin builders in the form of a connector portal that supports knowledge graph-based modeling and ‘advanced joins’. Element’s Unify Graph is said to support the mapping of complex data environments. Data can be exported for consumption by other graph database products such as AWS Neptune or Neo4j. ‘Advanced joins’ let users combine data from various sources using approaches including fuzzy and ‘contains’ matching. Connectors are available for Amazon S3, IoT SiteWise, Azure Blob, Ignition, KepWare and others. More from Element.
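As an illustration of ‘fuzzy’ and ‘contains’ matching for joining asset records whose tag names differ between source systems, the Python standard library’s difflib can approximate the idea. This is our own sketch, not Element Unify’s actual algorithm:

```python
# Sketch of fuzzy and 'contains' matching between asset tag names from
# two source systems. Thresholds and logic are illustrative only.
from difflib import SequenceMatcher

def match(tag_a, tag_b, threshold=0.8):
    a, b = tag_a.lower(), tag_b.lower()
    if a in b or b in a:  # 'contains' match, case-insensitive
        return True
    # fuzzy match on string similarity ratio (0.0 to 1.0)
    return SequenceMatcher(None, a, b).ratio() >= threshold

print(match("PUMP-101", "pump-101A"))  # → True ('contains')
print(match("PUMP-101", "PMP-101"))    # → True (fuzzy)
print(match("PUMP-101", "COMP-205"))   # → False
```

Real-world joins of this kind trade precision for recall; the threshold is the knob, and borderline matches usually go to a human for review.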

In its glossary, the Digital Twin Consortium defines the DT as ‘using real-time and historical data to represent the past and present and simulate predicted futures’. If you are into reservoir simulation you will recognize the phases of model building, history match and production forecasting. How ‘new’ is that? Well, our records of this activity point back to 1955 and the work done at Humble Oil Co. (later Exxon), among the first to use computers to model oil reservoirs.


Railroad Commission rolls-out AI-based seismicity reviews

Regulator targets ‘no more mag 3.5 quakes’. TexNet injection volume reporting tool deployed. AI tool speeds monitoring.

In Texas, earthquakes of magnitude 3.0 or over have increased eightfold since 2017. The Railroad Commission of Texas, the regulator, has increased its monitoring and oversight of injection with the goal of ‘no more magnitude 3.5 plus earthquakes after 18 months from the date of an RRC response action’ (est. date 2024). For the affected areas, operator-led response plans have been developed with expanded data collection efforts, contingency responses for future seismicity, and scheduled checkpoint updates with RRC staff.

Because the lag time between reductions in injection volume and reductions in seismicity is uncertain, the RRC has been studying changes to injection operations and subsurface pressure changes within the affected areas. New RRC rules call for more frequent reporting of injection volume and pressure data in areas of seismicity. RRC has teamed with the Texas Comptroller on the TexNet Injection Volume Reporting Tool to facilitate injection report filing and data accessibility.

The RRC has also turned to ‘artificial intelligence’ to speed the process of conducting its reviews of seismicity prior to the award of a permit for an injection/disposal well. The reviews are conducted by the RRC’s underground injection control department which has deployed a machine learning algorithm to process the large amount of information that needs to be digested. The UIC has developed a Python/scikit-learn model to assess multiple factors related to the number, severity and proximity of earthquakes and uses a decision tree to assign a grade to the review. The assessed grade determines the allowable amount of fluid that can be injected.
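The RRC’s actual model is built with scikit-learn; the rules, thresholds, grades and injection limits below are invented to illustrate how a decision tree might map seismicity factors around a permit’s area of interest to a review grade that caps injection volume:

```python
# Hypothetical decision-tree-style grading of a seismicity review.
# Factors, thresholds and volume limits are invented for illustration;
# they are not the RRC's rules.

def seismicity_grade(quake_count, max_magnitude, nearest_km):
    """Return (grade, max barrels/day) from hypothetical decision rules
    applied to historic quakes in the permit's area of interest."""
    if quake_count == 0:
        return "A", 25000
    if max_magnitude >= 3.5 or nearest_km < 5:
        return "D", 5000
    if quake_count > 10 or max_magnitude >= 3.0:
        return "C", 10000
    return "B", 15000

print(seismicity_grade(0, 0.0, 50.0))   # → ('A', 25000)
print(seismicity_grade(12, 2.8, 20.0))  # → ('C', 10000)
print(seismicity_grade(3, 3.6, 8.0))    # → ('D', 5000)
```

A tree of this shape is easy to audit, which presumably matters as much to a regulator as raw predictive power: each grade can be traced back to the exact rule that produced it.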

Python scripts automatically collect GIS mapping of historic seismic events within a permit application’s area of interest, data which is used by the machine learning algorithm. The approach has reduced permitting time from over 100 days in November 2018 to about 20 days presently. At the same time decisions are more consistent in how seismic and other factors are considered. More from the RRC.


Blockchain, ESG and the energy transition

API joins Blockchain for Energy. SPE podcast pumps blockchain. Shell, ‘blockchain supports the energy transition’. Chainlink Labs and the Lemonade Crypto Climate Coalition. GAM and Cathedra abandon flaring-and-mining cogeneration. eMission Software aligns Hashgraph with Canadian GHG reporting. Ogallala Life sells AcreNFTs to ‘save the aquifer’! Texas Blockchain Council switches off to save ERCOT. BfE, Shell, eMission respond to Oil IT blockchain challenge.

Oil IT Journal continues its dutiful if reluctant reporting on what seem like increasingly improbable ‘applications’ for blockchain in energy. For the blockchain skeptics that we are, the announcement that the venerable American Petroleum Institute (API) has joined the Blockchain for Energy consortium came as a shock. Even more so when we learned from the release that the API plans to leverage its BfE membership to ‘focus on environmental, social and governance issues, as the consortium continues to build and expand its industry grade solutions’. The API/BfE association is spearheaded by BfE chairperson, Chevron’s Raquel Clement. For API, Aaron Padilla stated that ‘blockchain can be used for transparent and efficient tracking of GHG emissions’.

Blockchain for Energy got an endorsement from the Society of Petroleum Engineers, which has obligingly hosted a podcast by BfE president Rebecca Hofmann, where you can learn how ‘blockchain is fundamental for low carbon energy, including geothermal’ and how the technology ‘improves the supply chain, carbon credits, regulatory reporting, and compliance’.

We provided both the API and BfE with our Financial Times letter for comment. BfE spokesperson Martin Juniper replied thus…

“Your article makes some interesting points. I would counter that it’s not what one uses but how one uses it that has the most impact. Blockchain for Energy is focused on using Blockchain as part of a mix of secure decentralized technologies rather than the solution to every problem. Our team is very careful to engage across the industry and only apply those technologies that are suitable. I do think that healthy discourse is a big part of what we do at B4E and we welcome these types of challenges as a way to prove our worth.”

David Hone, chief climate change advisor for Shell, blogged recently on ‘Blockchain, carbon emissions and NFTs’, stating that ‘blockchain is becoming a tool to support the energy transition’, in particular the non-fungible token (NFT) that ‘can be associated with a digital or physical asset’ and sold or traded. Hone suggests that in a voluntary carbon market, the NFT represents a carbon removal or carbon reduction such as carbon storage in a tree, or in a geological formation. NFTs would then provide a public certificate of authenticity or proof of ownership and a license to use the asset for a specified purpose. That use could be as an offset against emissions generated by the holder of the NFT. Having said that, Hone adds that while blockchain and NFTs could do the job of tracking carbon units and ensuring integrity, ‘current systems also do this and have been doing it satisfactorily for some time’. It feels like the new kids on the block are a solution looking for a problem; at least as far as unit tracking goes, there isn’t a particular problem. He does hold out some hope that blockchain might solve the complex problem at the heart of the climate issue, the carbon budget. This might involve a global decentralized autonomous organization granting NFTs to worthy stakeholders.

We shared our FT letter with David Hone who kindly provided the following in response to our blockchain challenge.

“As I said in the article, blockchain is not my area of expertise. However, I did make a similar point to yours in that simply using this technology to replace carbon unit accounting systems that already work seems like a rather pointless activity. So, without a deep knowledge of blockchain, I ventured the idea that the technology might be better suited to the much more difficult task of managing the global carbon budget, for which there is no current solution. I take your point that blockchain may have flaws within it, making it completely unsuitable for a number of different uses, but I am not in a position to comment on that.”

A new report Managing Climate Change in the Energy Industry With Blockchains and Oracles claims to ‘reveal the role blockchains can play in managing climate change’. Chainlink Labs, aided and abetted by Tecnalia show how smart contracts address ‘key interoperability and economic complexities in the transition to renewable energy’. The report recommends tokenizing carbon credits and consumer rewards that can then be used as collateral in decentralized finance applications. One (non-energy) poster child cited in the report is the ‘Lemonade Crypto Climate Coalition’ which provides crop insurance to smallholder farmers, ‘triggering automatic payouts when rainfall or temperature data from sources like AccuWeather meets a certain threshold’.

Great American Mining, a ‘bitcoin mining company that provides a solution for oil and gas companies to reduce flaring and increase oil production’, and Cathedra Bitcoin have parted ways. The idea was to use otherwise-flared gas as a power source for a bitcoin mining operation. Due to ‘severe winter weather conditions’ in North Dakota (isn’t that where Fargo is?), the operation saw its performance fall to only 45% of the anticipated hash rate. The companies are now working to conclude the business relationship. Notwithstanding the difficulties, GAM reports that over the past year it has scaled from approximately 1 megawatt to over 20 megawatts of deployed hash rate on the oil fields of North Dakota. More from GAM.

eMission Software is partnering with the HBAR Foundation to incorporate the latter’s Hedera Hashgraph technology and open source Guardian ESG marketplace software into emissions reporting that is ‘auditable to the metric tonne’. The partnership will align the Guardian with current Canadian reporting requirements including the GHG Reporting Program and the National Pollutant Release Inventory.

Our challenge to eMission Software brought the following response from CEO Richard Hepp

“I don’t think a ‘rebuttal’ is required. We view blockchain and hashgraph technology as a tool that can be utilized to achieve a specific goal with specific benefits and detriments, like any new piece of technology. Our unique usage of this tool sits nicely in our business plan, and allows us to offer a solution to less-carbon intensive industrialized nations/industries, mitigate traditional financial issues like inflation of foreign currency, and resource constrained organizations that would like to improve their emission calculations and reporting simultaneously providing benefits to global emission accounting, (transitivity of data over M&A transactions in the energy space over specific pieces of infrastructure, for one example). As for your examples, they are too narrow in scope and ambiguous for me to understand your question or point.”

Ogallala Life, with help from AcreNFT, has raised some $200,000 by selling Non-Fungible Tokens to ‘save the Ogallala aquifer*’. The Ogallala aquifer has lost 10 trillion gallons of water over the past 40 years, putting US grain production in peril. Ogallala Life proposes a ‘marriage of geology, art and tech’ to save the aquifer. The plan is to build check dams along stream channels to capture rainwater and prevent its mass evaporation, and to install direct borehole groundwater recharge zones to replenish the aquifer. All of which will be funded by a perpetual endowment backed by the Angel Protocol and decentralized finance on the Terra blockchain. Finance is raised by selling artwork NFTs via AcreNFT. More from Ogallala Life.

* As featured in Annie Proulx’s book That Old Ace in the Hole.

But not all blockchain news merits our skepticism. We learn that the grandiosely titled Texas Blockchain Council acted in a public-spirited manner when ERCOT, the Texas grid operator, asked folks to ‘turn off’ and conserve power during a recent heatwave. The TBC neatly spun this into a big plus for the bitcoin mining industry, which can ‘turn off in a trice’, making it a ‘perfect resource for the grid regarding frequency balancing and demand response’.


© 1996-2021 The Data Room SARL. All rights reserved.