Oil IT Journal: Volume 27 Number 2


Is the singularity nigh?

Rice University/Ken Kennedy Institute HPC Energy Conference hears from BP’s John Etgen on the state-of-the-art in high performance computing. Etgen fondly remembers the physics-oriented technology of the Connection Machine and the ‘crazy guys’ who revolutionized HPC in the last century. But where are they now?

Speaking at the 2022 Energy High Performance Computing Conference (previously the Oil and Gas High Performance Computing Conference), BP distinguished geophysics advisor John Etgen set out to ‘tickle your complacency a bit’ by asking, ‘are we approaching the singularity* in scientific computing?’ His answer was ‘no’, ‘We are actually going down the drain’.

Etgen is a geophysicist who has earned a ‘non-trivial salary’ doing subsurface imaging for exploration, using math and physics to ‘see things that we have not seen before’. He traced the history of high performance computing through his own experience, starting with running Fortran algorithms at school on a DEC PDP-10 and graduating to Amoco’s Cray-2 and the super-fast TMC Connection Machine. The latter was easy to code and ran 20x faster than the Cray. Since the CM (mid 1980s), there have been no ‘kick-ass’ moments in HPC! Etgen’s current 25 petaflop cluster fails to provide the same ‘wow factor’. Today’s machines are cheaper and there are many more processors, but the hardware forces us into an awkward compute model.

An earlier ‘singularity’ for humanity was the industrial revolution. This was characterized by the replacement of multi-tasking individuals by specialists. For Etgen, the next singularity will require even more specialization. While BP and others state that ‘we need cross-skilled, multi-talent hires’, this is ‘the fastest way to kill an industrial revolution type event’. We hear, ‘we need generalists’. Wrong! ‘If you want a singularity, let people hyperspecialize’.

Etgen was recently an editor of the journal Geophysics and realized that U-Net has ‘taken over the world’. The AI guys have got it. But if you are a hard-core computational physics person you don’t get it. Unless, that is, you are using U-Net to replace conventional physics (something that Etgen personally hates!). Etgen regrets that there are no more ‘cowboys, crazy lunatics’ like CM founder Danny Hillis. His crazy idea? Building a computer to match a problem in physics. (Hillis is currently building the Clock of the Long Now.)

Going back to the industrial revolution, Etgen cited the legendary competition between steel driver John Henry and a steam-powered steel-driving engine. Henry won against the machine … but died in the act! Today HPC faces exactly the same danger in the race between hand-coded finite difference stencils and AI-powered automatically-generated code, ‘code that writes code that writes code…’. Today the humans have the edge ‘but ultimately machines will mow over humans’.

Etgen wound up with a dig at the GPU HPC brigade. Learning CUDA is not the kind of specialization he is advocating. Again he returned to the CM and its concise 1991 Fortran Programming Guide, with ‘one page for every scientific algorithm’. The machine itself had one computational element for every grid point and could be treated like a big differential equation solver, thanks to Hillis, who made the machine look like your problem. Look at what HPC is doing today: is it adding value? Hardware needs to be more abstractable, but ‘that is not the current trend’. Software too needs to be abstract, high level and re-usable. On the plus side, ‘the AI guys are doing this already’.
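Etgen’s ‘machine that looks like your problem’ survives, at least in spirit, in whole-array notation, where a single expression updates every grid point at once. As a rough illustration (ours, in Python/NumPy rather than CM Fortran), here is a finite-difference wave-equation step written over the whole grid:

```python
import numpy as np

def laplacian(u, dx):
    """Five-point finite-difference Laplacian over the whole interior
    grid in one array expression -- the 'one compute element per grid
    point' style the Connection Machine encouraged."""
    lap = np.zeros_like(u)
    lap[1:-1, 1:-1] = (u[2:, 1:-1] + u[:-2, 1:-1] +
                       u[1:-1, 2:] + u[1:-1, :-2] -
                       4.0 * u[1:-1, 1:-1]) / dx**2
    return lap

def wave_step(u_prev, u_curr, c, dt, dx):
    """One explicit time step of the 2D wave equation u_tt = c^2 lap(u)."""
    return 2.0 * u_curr - u_prev + (c * dt)**2 * laplacian(u_curr, dx)
```

On the CM, the equivalent array expression mapped directly onto one computational element per grid point; NumPy merely vectorizes it, but the ‘code looks like the math’ property that Etgen misses is the same.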

In the Q&A Etgen was asked, ‘If you don’t do the GPU programming who will?’ He explained that he does some C++ coding, ‘but this is not close to the metal’. His code may be calling through three levels of device drivers, with even more layers below. ‘I just want to stand on the shoulders, just use the libraries – don’t write your own’.

* ‘Singularity’ refers to Ray Kurzweil’s 2005 oeuvre on artificial intelligence that envisaged a point in time when machine intelligence will be infinitely more powerful than human intelligence.

Comment – we take Etgen’s last comment to reflect the degree to which HPC has over the last couple of decades turned into a very specialized activity, that of tuning scientific code to ever more complex hardware. This is clearly not the kind of specialization that Etgen wants. But when he invokes AI-driven code generation, it is hard not to think of the complex hardware geometries that drive AI today (more of which in our report from the Nvidia GTC in our next issue). On the other hand, the AI brigade may be doing a better job of abstracting their logic into convenient Python libraries. Also, after the event we came across something that may be just what Etgen is looking for. In a report in First Break, quantum computers are presented as ‘replacing math by an actual physical optimization experiment’. We are not sure what it means but maybe some hyper-specialized crazy guy out there can pick this up and run with it.

More from the Rice HPC in Energy home page and also in our report elsewhere in this issue.


Learnings from other industries

UK Energy Institute ‘Toolbox Series’ webcast hears from safety experts on how unpicking the organizational failings and cognitive biases that cause major incidents (like the UK’s Grenfell Tower fire) can make the energy industry safer. The EI Toolbox web app targets high incidence events and hazards.

The EI Learning from Incidents webcast asked ‘How can we do this better?’ Lorraine Braben (LBC) reported from the EI’s Hearts & Minds program, which found that incidents are often not shared outside of an organization. Matthew Laurie (CRA Ltd.) cited some major UK catastrophes (the Alton Towers rollercoaster crash and the Grenfell Tower fire) to ask, ‘Why are we not using these incidents? Is it our cognitive biases, the not-invented-here syndrome?’ He cited Daniel Kahneman on the psychology of judgement and decision making and the fallibility of the human brain. Humans suffer from three biases, viz. attribution error, false uniqueness bias and confirmation bias. Attribution error is what leads us to consider ourselves ‘safe’ drivers while considering other road users as ‘bad actors’. Likewise other companies may be considered as ‘bad, not like us’. False uniqueness is the tendency to see oneself (or one’s company) as ‘special’: ‘we all believe we are above average’, which is clearly impossible. Likewise, ‘that would never happen in oil and gas, we are better skilled, and don’t have much to learn from others’. Confirmation bias makes us ignore evidence that contradicts our existing beliefs.

Braben came back to enumerate the learning opportunities that incidents outside of one’s own industry can bring. These include safety KPIs, emergency preparedness and more. Incident accountability is key here. At Alton Towers, multiple stakeholders were involved in managing the facility, with poor coordination, poor change management and a confused regulatory environment. In fact, the Alton Towers system detected another train on the same track, but the trip was reset even though a train with passengers was moving forward. Multiple people were involved in resetting the trip, with no defined leader, no accountability and no authorization given for the train to move forward.

At Grenfell, a bewilderingly complex regulatory structure led to multiple failures both before and during the fire. Residents’ safety concerns were ignored, safety equipment was left unrepaired and, during the fire, residents were repeatedly advised to ‘stay put’.

Laurie then addressed the confirmation bias phenomenon, qualifying the Internet as a ‘confirmation bias machine!’ Safety professionals need to ‘slow down’ their reasoning process, think of multiple explanations, challenge their own story and bring in views from outside their own industry. ‘Challenge your team to think of other explanations’.

The webinar concluded with the EI’s Stuart King, who explained how the EI’s Toolbox helps with learning from other industries. Toolbox is a free-to-use web app that focuses on high incidence events and hazards to frontline workers. Supervisors visit the Toolbox to locate an incident relevant to a particular situation. Content from the Toolbox can then be leveraged in planning meetings to raise awareness of hazards and to effect additional safety measures. The Toolbox is designed to make it easy to find relevant content, ‘cross-fertilized’ from different industries. The EI is also currently funding research into ‘reflective learning’. This derives from the fact that ‘reading is not learning’ and that true learning involves asking questions of oneself and one’s colleagues.

Other safety-related resources cited in the Q&A included Lloyd’s List and the Federal Aviation Administration’s lessons-learned website. There are also free incident learning resources for the chemical/process industries from the Hazards Forum, EPSC, AIChE, CSB, eMARS and, of course, IOGP.


SPE AAPG merger bites the dust

Editor Neil McNaughton reflects on the failed merger of the Society of Petroleum Engineers and the American Association of Petroleum Geologists, which raised some interesting questions about the SPE’s governance. On which topic he has a bone to pick with the SPE.

Last year we published a short and cheeky article titled ‘SPE’s me-too Open Subsurface investigation’ on the Society of Petroleum Engineers’ 2021 Open Subsurface Workshop that was to serve as a ‘platform for SPE to reflect on the role it should play to help and support open subsurface projects in the future’. As we pointed out at the time, the SPE’s pitch, that ‘open source, data, and ecosystems tend to enable a faster pace of innovation [as opposed to] proprietary solutions’, could have been lifted from the OSDU playbook. We offered to act as scribe to the event but were told, ‘SPE workshops do not allow press reports. In order to stimulate frank discussion, no proceedings are published and members of the press are not invited to attend’. Hmm… that still smarts.

A lot of water has passed under the bridge since mid-2021, notably with the (now aborted) merger of SPE with AAPG. We have been a bit remiss in not reporting on what might have been an upheaval in the upstream organizations landscape. We sat things out and it is now all over. But just for a short after-action review, here goes.

The merger was very much a top-down affair initiated by the management of SPE and AAPG. The rationale for the merger was partly financial (presumably costs could be saved by amalgamating the offices) but also the perception that a combined organization would somehow help the membership ‘transition’ to a new low carbon world. Curiously, this did not prove very popular with many petroleum engineers and petroleum geologists. Along with a grass-roots movement against environmental obfuscation, some SPE members were concerned about the society’s governance and, incidentally, by the surprisingly large salary of the president.

Governance was an issue we touched on in our 2021 open source piece when we asked, ‘should an event run by a self-selected coterie with no published proceedings determine what role the SPE will play for its broader membership?’ Well, there are still no published proceedings but we were curious to read a JPT article, ‘Workshop cracks open subsurface data’ by a card-carrying member of the coterie, Patrick Bangert.

Bangert’s article is disappointing and reads more like an agenda than a report. We learn that the workshop included ‘presentations about existing open projects as well as panel discussions on topics including legal matters and how to start, scale, and maintain an open project’. But we do not hear what the projects were about, or what the outcome of the discussions was. Bangert then turns to a free-form rap on open source this and that, name-checking Linux, PyTorch, TensorFlow and GitHub. He makes some very good points, especially his recommendation that professional societies should ‘create a publication platform for open-access papers, open source, and open data’. But was this an outcome of the Workshop? Will it be taken on board by the SPE? Remember, this was to be a ‘platform for SPE to reflect on the role it should play…’ etc.

A paid-attendance Workshop formula with no report is no way to reflect on a society’s ‘role’. As we pointed out last year, the elephant in the upstream open source room was, and remains, OSDU. We saw the SPE event as a hastily-organized riposte to OSDU. It is inconceivable that OSDU was not discussed at the Workshop, but there is no official trace of this or of any other matters arising*. So much for ‘open’!

* Naturally, we are open to any information (on or off the record) on this and other topics raised at the Workshop. As a small reward, Neil McNaughton will present a signed copy of his 1999 oeuvre, ‘A Survey of Upstream Data Management’, a $95 value (well it was in 1999).


Book review: The green utopia and social thermodynamics

Oil IT Journal reviews ‘The Utopia of green growth’ by French energy expert Philippe Charlez. The wide-ranging, authoritative and opinionated work considers economic growth to be a product of ‘social thermodynamics’, which will make the energy transition a difficult goal. Charlez’s work concatenates a broad-ranging, well-informed energy textbook with some rather outspoken politics.

Philippe Charlez’s* 500-page ‘essay’, ‘L’Utopie de la Croissance Verte’ (‘Utopia’**) is subtitled The Laws of Social Thermodynamics. It covers a lot of ground, seeking to link a wide-ranging physical and chemical analysis of the energy landscape and the climate situation with a social/thermodynamic theory of growth. His thesis is that growth is practically baked into modern society, and that finding an alternative is going to be extremely difficult. Charlez writes from the standpoint of a self-confessed right-winger. The frontispiece to Utopia has a curt one-liner: ‘protect the rich and you will enrich the poor’. Margaret Thatcher would approve.

The first half of Utopia is devoted to a history of the world and the evolution of society and politics. Charlez displays much erudition, with references to the great philosophers and societal trends. It is a great read with food for thought on every page, which makes it quite hard to summarize.

Utopia is packed full of facts and figures relating to energy consumption and development. One telling graph shows the relationship between the Russian economy and the oil price. Energy is presented as the main feedstock of growth and of mankind’s development. Until the last decades of the 20th century, growth and energy consumption were in lock step, albeit with huge differences in developed and undeveloped countries. Environmental consciousness was raised by events such as the Exxon Valdez and Torrey Canyon disasters and by the rise of the green movement. At this juncture in Utopia, Charlez observes that the environment is a rich person’s ‘sport’ and that, although technology can minimize society’s impact on the environment, decoupling growth from its environmental impact is impossible, ‘in the long term, nature will show that Malthus was right’!

Charlez reviews the laws of thermodynamics, extending them to ‘social’ thermodynamics. This leads him to dismiss both ‘no-growth’ Marxist societies and the contracting society advocated by some greens. ‘Green growth’, presented by some*** as the way forward, is considered a utopian notion. Again, Charlez displays his erudition, comparing Voltaire’s and Rousseau’s models of society, France’s 20th century dalliance with Marxism, and later attempts by French legislators to break with the ‘inevitability’ of the liberal model. Utopia is extremely granular, with facts, opinions and historical allusions on every page. Before we leave the thermodynamic chapter, one parting shot: society has two ‘enemies’, growth and morals! Morals were ‘invented by mankind to counter the natural laws of thermodynamics’. Strong stuff!

You may be interested to hear what Charlez, energy advisor to France’s erstwhile main right-wing party Les Républicains, has to say about the energy transition. The answer is, a lot! Again with great erudition, and in a more straightforward manner than that shown in his political analysis. There is still, in oil and gas circles, some debate as to the human origins of atmospheric CO2. Charlez gives this very short shrift, citing carbon isotope studies of ice cores that show ‘without ambiguity’ the post-industrial revolution, human origin of atmospheric CO2. But is this the main cause of global warming? Charlez looks at other possibilities (Milankovitch cycles, the sun, clouds) to conclude … well, that it’s complicated, and that various feedback mechanisms contribute more to global warming than the radiative forcing of CO2.

Charlez takes a few swipes at the politics of the GIEC (the IPCC’s French incarnation): the worst-case 5° scenarios are unrealistic and should be removed. He divides the commentariat into three groups: the climate skeptics, the ‘collapsologists’ and the rest. The collapsologists (the Nostradamuses of modern times) advocate a radical change in society, which clearly does not meet with his favor. But what does he have to say about the climato-skeptics? He analyzes the skeptics’ arguments in detail to conclude that they do not stand up to scrutiny. What drives the skeptics is a polemical approach far removed from scientific reasoning, with some even questioning the evidence of the ice cores. Today, such opinions are ‘marginal’ and disappearing.

The last chapters of Utopia discuss the four levers that can be actioned to limit global warming: limit growth, limit population, reduce energy intensity and ‘transition’ from fossil fuels to renewables. On growth, or the absence thereof, Charlez reprises his earlier historico-social reasoning, adding in some comparative religion for good measure! He hits his stride with the introduction of ‘climato-gauchisme’, concatenating the green movement with left-wing ideology, with Greta Thunberg as its high priestess. Clearly this is not at all to Charlez’s taste. Instead he advocates, rather redundantly, ‘sustainable and durable’ energy, citing with approval his old boss Patrick Pouyanné (TotalEnergies CEO), who said ‘energy must be clean, available and affordable’. One cannot disagree with that!

Regarding energy intensity, things look more positive. Insulating homes and heat pumps (despite our own misgivings) are ways forward. He highlights the extraordinary worldwide energy consumption of the private motor vehicle. Energy use in a motor vehicle can be reduced by decreasing its weight, improving tire technology and reducing wind resistance. He cites anecdotally the efforts of the ‘Extinction Rebellion’ movement, who protest by letting down SUVs’ tires! This is ethically ‘intolerable’ but ‘sound science’, in so far as the vast increase in the number of SUVs, with their high air resistance, is anathema to fuel efficiency. How fast should we be driving? Charlez recommends that the top speed on European highways should be 100 kph (62 mph).

The chapter on carbon neutrality, looking forward to 2050, begins with a quote from French bishop and philosopher Jacques-Bénigne Bossuet, who observed that ‘God laughs at those who deplore the effects of that whose causes they adore’, which sums up rather nicely the predicament we are all in today. The last couple of hundred pages are devoted to an analysis of carbon reduction and the energy transition, including a shift to nuclear, biomass, energy storage, hydrogen and other ‘fixes’. All of these are evaluated objectively and none would appear to be a magic bullet. Utopia is so packed full of information and ideas that it is not just hard to summarize, it is hard to see what conclusion Charlez wants us to draw. The lack of an index (quite common in French publishing) does not make it any easier to see the big picture.

In sum, to our minds, Charlez is a great thinker who has put a lot into Utopia; almost every page calls for further debate. On the plus side, the science and commentary in Utopia would make an excellent textbook for the energy transition. On the minus side, Charlez’s politics are often shrill and intrusive. Does ‘climato’ really have to be coupled with ‘gauchiste****’?

* Charlez’s LinkedIn page has him as Energy Expert at the Paris-based Sapiens Institute. His 35-year career with Total is somewhat hidden from view, both on LinkedIn and in the blurb for ‘Utopia’.

** L’Utopie de la croissance verte ISBN 2492545032.

*** Green growth is one of the OECD’s tenets for sustainable development.

**** left wing.


Society of Petroleum Data Managers Online 2021

Equinor’s long data mesh journey. CGI on the mesh, data migration and OSDU’s promise vs. reality. Cognite’s Data Fusion, a unified semantic data model. North Sea Transition Authority’s cessation of production initiative. Troika proposes XML metadata for SEG-Y. Schlumberger on AI-enhanced OSDU data ingestion. Equinor’s SLIMM spatial locator. AgileDD mines mining data. UK Technology Leadership Board and the Robotarium.

SPDM* Online 2021 heard from Sun Maria Lehmann and Jørn Ølmheim on Equinor’s data mesh journey that started in 2016. The data mesh philosophy (see also our report on BP’s data mesh) revolves around the concepts of design thinking and providing the best possible data products for different users that focus on use cases. Lehmann compared this to making pancakes to order. A federated domain-driven model targets different user personae with brainstorming and scrum/agile rapid iterations to establish data product principles. Data products involve more than data, including also code for ingestion, transformation and update. The mesh consists of several layers. The approach allows for different technologies for different parts of the business. A minimal governance layer assures interoperability with connected data products constituting the mesh, ‘there is a lot of value in existing models’. In the Q&A the mesh approach was contrasted with OSDU’s ‘all data in one place’ approach. For Equinor, OSDU’s data footprint is currently limited and in any case, ‘you will never have everything in OSDU, Industry 4 (process), financials and so on’. We want to see how data in OSDU fits with data products not in OSDU. Asked on the state of deployment, Lehmann admitted that the mesh today is not widely deployed, Equinor is working on a data platform for production data and is ‘very much at the beginning’.
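To make the ‘data products involve more than data’ point concrete, here is a toy sketch (our names and structure, not Equinor’s implementation) of a data product that carries its own ingestion/transformation code and a minimal published schema for interoperability:

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class DataProduct:
    """Toy 'data product' in the data-mesh sense: data travels with the
    code that ingests and transforms it, plus minimal governance
    metadata (an owning domain and a published schema)."""
    name: str
    owner_domain: str                       # federated, domain-owned
    records: List[dict] = field(default_factory=list)
    schema: Dict[str, type] = field(default_factory=dict)

    def ingest(self, raw: dict, transform: Callable[[dict], dict]):
        """Run one raw record through the product's own transform and
        validate it against the published schema before accepting it."""
        rec = transform(raw)
        for key, typ in self.schema.items():
            if not isinstance(rec.get(key), typ):
                raise ValueError(f"bad field: {key}")
        self.records.append(rec)
```

The minimal schema check stands in for the ‘minimal governance layer’ Lehmann describes: each product is free to use its own technology internally, as long as what it publishes conforms.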

Michael van der Haven (CGI) sees the data mesh as a way of easing pain points in data migration. Van der Haven compared the OSDU promise with reality: OSDU opponents cite vendor support and ‘expensive’ cloud deployment. He envisages OSDU as a spider in the web of the data mesh. This differs from the usual perception of OSDU: not all data is necessarily in the data lake, it may be on prem or in the cloud, and not necessarily in an open format. The market is slowly but surely moving to provide the connectors that enable a data mesh. Pulling metadata into OSDU is a good idea for data browsing.

Gunnar Staff described how Cognite is ‘rethinking data and digital in oil and gas’, asking ‘why are we still struggling?’ Industry is drowning in data but ‘consumer tech is leading the way forward’. Staff cited Google Maps as a typical ‘disrupter’. In the field, oils are ‘still struggling with paper-based solutions and manual input!’ Enter the ‘unified semantic data model’ that combines IT and OT data with a focus on the digital twin. However, few digital twin proofs of concept have scaled successfully. The answer here is (commercial plug) a Cognite Data Fusion-enabled pipeline into data science. In the Q&A Staff was asked if CDF was a competitor to OSDU. He claimed not, ‘OSDU is not an implementation, you don’t buy OSDU. CDF has promised to be OSDU compliant – it is not a competitor.’

Robert Swiergon from the UK Oil & Gas Authority (now rebranded as the North Sea Transition Authority) presented the new Cessation of Production (COP) initiative. COP provides operators with a standard reporting template and user guide that allows them to check for data inconsistencies and reporting lacunae. Even today, the data captured is inadequate. The OGA is now playing catch-up, requesting data from operators to fix multiple issues in reported production. The result is (will be?) a cleaned-up and accurate dashboard showing daily production of UK fields and wellbores. The OGA is also working to embrace carbon capture data and on a Power BI-based front end.

Jill Lewis (Troika) retraced the evolution of the SEG’s seismic data recording formats from the ‘too successful’ SEG-Y that is still with us. But SEG-Y does not do 3D, microseismic or OBC, and is (strictly speaking) a spec for the now defunct 9-track tape. Rev 1 addressed many of the shortcomings but had ‘zero take-up’. The current Rev 2 is more promising but would benefit from a more flexible metadata capability. Troika has proposed* adding an XML file to capture extra information and allow for auto-read and safe data transfer. XML allows for a more robust data description than legacy formats that require knowledge of bit/byte locations, ‘things of the past’. Lewis expressed the hope that the OSDU folks would be listening in and would opt for Rev 2 rather than the older versions. A JSON representation of Troika’s XML add-on is under discussion.

* Back in 1999 Oil IT Journal opined, ‘The venerable SEG-Y format for seismic data has suffered over the years from inconsistent use, and a desire to stuff more and more information into the format [ … ] SEG-Y is therefore a prima facie candidate for what the French would call a ‘re-looking’ à la XML’.
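By way of illustration only (this is our sketch, not Troika’s actual proposal), an XML sidecar could describe trace-header fields by name, byte location and format, so that a reader no longer needs hard-wired bit/byte knowledge. Byte positions below follow the classic SEG-Y trace header layout but should be treated as illustrative:

```python
import xml.etree.ElementTree as ET

# Hypothetical trace-header fields: (name, start byte, length, format).
FIELDS = [
    ("trace_sequence_number", 1, 4, "int32"),
    ("source_x", 73, 4, "int32"),
    ("source_y", 77, 4, "int32"),
    ("sample_interval_us", 117, 2, "int16"),
]

def build_sidecar(fields):
    """Serialize header-field descriptions into an XML sidecar string."""
    root = ET.Element("segy_metadata", revision="2.0")
    hdr = ET.SubElement(root, "trace_header")
    for name, start, length, fmt in fields:
        ET.SubElement(hdr, "field", name=name, start_byte=str(start),
                      length=str(length), format=fmt)
    return ET.tostring(root, encoding="unicode")

def read_sidecar(xml_text):
    """Parse the sidecar back into a {name: (start, length, format)} map."""
    root = ET.fromstring(xml_text)
    return {f.get("name"): (int(f.get("start_byte")),
                            int(f.get("length")), f.get("format"))
            for f in root.iter("field")}
```

With such a sidecar, a reader discovers field locations at run time instead of baking them into code, which is the ‘auto-read and safe data transfer’ argument in a nutshell.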

Jamie Cruise (Schlumberger) asked ‘Can AI help OSDU enterprise adoption?’ For Cruise, we are now ‘a couple of years into the OSDU era’. This means that we are now starting to think about how to solve the data management challenges and achieve the nirvana of a single repository. We need to move toward AI supported by physics (or physics supported by AI). This implies managing regular data along with information produced by automation and AI to provide ‘democratized access through de-siloization’. Also, we are leaving the database era. The database did a good job but now we need big data and the cloud. Operators are working with help from Schlumberger on OSDU. It is not always simple; we have been around the block a few times but have produced ‘glorious open source code’. Cruise sees OSDU in the context of a data mesh, contributing content to a single, extended version of the truth. While there is a place for conventional, interactive QC, Cruise is most interested in AI-enhanced rapid ingestion. This involves updating master data records, mining reports with feature extraction and creating a virtual source for master/gold records. AI is cheaper than a person and can be applied to very large numbers of documents. NLP is also useful for data ingestion. The aim is for a corporate knowledge base in OSDU that ‘puts every piece of data in context’. In the Q&A Cruise was asked if anyone has already retired their legacy data stores (OpenWorks, Delfi…) in favor of OSDU. Cruise was not sure that Delfi is ‘legacy’. There is a lot of Delfi in OSDU! But these are early days for OSDU adoption. It is currently running as a supplement to other systems. ‘There will be customers turning off their legacy systems real soon now for cost reasons, although there is still work to do’. Another question covered data clean-up strategies.
Is it better to perform a mass clean-up before migration to OSDU, or can this be more easily done from inside OSDU once all data has been liberated to it? Cruise opined that you don’t need to do all remediation up front. Just migrate raw source data using OSDU’s flexible data migration services and then build or use clean-up and dedupe services inside the platform. On the issue of data residency legislation, OSDU partner IBM provides in-country deployment. The solution can be transparent across different data regimes.

Håvard Gustad presented Slimm, Equinor’s spatial location information model and media files. Slimm is designed to spatially-enable various data types. Slimm is built on Omnia, Equinor’s cloud-based data platform, and allows for interaction with remote devices such as handhelds and tablets. One use case is corrosion monitoring using a photograph taken by a plant operator. Slimm locates the photo on a 3D plant model for e.g. corrosion evaluation. One third of Equinor’s data has a spatial component. The 3D location problem is a meaningful one to address as there are ‘no standard off-the-shelf solutions available’. Equinor is adopting an open source strategy to develop minimum viable data products for operations and maintenance. Equinor’s Technology, Digital & Innovation (TDI) unit is working on autonomous inspection robots. These are tested at the Kaarstoe K-Lab in a 3D virtual plant model. Slimm data feeds into Equinor’s Echo asset digital twin.
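Slimm’s internals are not public, but the photo-location use case can be caricatured as a nearest-neighbor lookup against tagged components in the 3D plant model. Component names and coordinates below are invented for illustration:

```python
import math

# Hypothetical tagged components with 3D plant-model coordinates (meters).
COMPONENTS = {
    "valve-V101":  (12.0, 4.5, 2.0),
    "pump-P203":   (30.2, 8.1, 0.5),
    "vessel-T310": (45.0, 2.2, 6.0),
}

def nearest_component(photo_xyz, components):
    """Return the tagged component closest to the photo's estimated 3D
    position -- a toy stand-in for locating field media on a plant
    model (Slimm's actual method is not described in the talk)."""
    return min(components,
               key=lambda tag: math.dist(components[tag], photo_xyz))
```

A real system would also use camera orientation and occlusion in the 3D model, but the idea of resolving a field photo to an asset tag is the core of the corrosion-monitoring use case.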

Henri Blondelle described how his company AgileDD has pivoted from oil and gas (Blondelle was previously a geophysicist with CGG) into the mining arena. Agile is now capturing assay tables in mining documents, or ‘(data) mining for mining’. He thanked clients, including BRGM, Orano and Barrick Gold, for their welcome into these new fields. Mining leverages real-time data from Lidar drones to monitor activity. One Nevada asset has some 40 million documents for capture. These are similar to oil and gas reports: tables, composite logs. Blondelle contrasted manual data entry from documents into a database with smarter approaches. Documents with a similar layout can be collated and rules defined to capture key elements. This is ‘quite a good approach but limited’. Agile’s iQC/IDP (intelligent document processing) approach leverages AI to capture and populate a structured data set. Domain-specific training is the key; generic AI models are no good. The model must handle all aspects of a document, layout and content. Training by users is also important. OCR can be useful, even on handwritten documents. For automatic capture of tables in documents, Agile has developed TABIO*. Why not just use existing tools like PDFTables, Tabula or MIT’s Camelot? Because these are quite deterministic and inflexible. They are not trainable so don’t improve with use. Blurry text and other defects will affect the results, but even a partial success on a large data set can be very useful in providing extra data points. In the Q&A Blondelle was asked if, after document capture, operators could throw their paper records away. He replied, ‘No, you will never extract everything. Only a human eye/brain can do this’. If you do have to lose the paper, at least keep the scans.

* Tabio development was co-funded by Saipem, Schlumberger, Subsea7 and Total. The open source Git repository can be found here.
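The ‘rules defined per document layout’ approach that Blondelle contrasts with trainable AI can be sketched as a set of per-layout regular expressions. Field names and patterns below are invented for illustration, not TABIO’s:

```python
import re

# Illustrative layout rules for one family of assay reports.
RULES = {
    "hole_id": re.compile(r"Hole ID:\s*([A-Z0-9-]+)"),
    "depth_m": re.compile(r"Depth:\s*([\d.]+)\s*m"),
    "au_gpt":  re.compile(r"Au\s*\(g/t\):\s*([\d.]+)"),
}

def capture(text, rules):
    """Apply per-layout regex rules to document text. Unmatched fields
    are simply omitted, so a partial capture on a noisy scan still
    yields usable data points."""
    out = {}
    for name, pattern in rules.items():
        m = pattern.search(text)
        if m:
            out[name] = m.group(1)
    return out
```

This is exactly the ‘quite good but limited’ technique: it works on documents sharing one layout but does not learn from corrections, which is the gap a trainable model is meant to fill.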

David MacKinnon (seconded from Total) presented the UK Technology Leadership Board, which describes itself as ‘one of seven industry task forces [that] works with government and other stakeholders to [adapt] oil and gas technologies that support the energy transition’. For MacKinnon, this means robots and autonomous systems, his particular passion being corrosion inspection robots roaming offshore platforms, controlled remotely from Aberdeen. He also cited the Energy Systems Catapult, OLTER (the offshore low touch energy center where industry, supply chain, academia, developers and other sectors collaborate) and the ‘National Robotarium’ currently under construction at Heriot-Watt University.

The Society for Professional Data Managers (SPDM) was established in 2017 to support the professional development of the worldwide community of data and information managers working in the energy sector. SPDM was founded by CDA* and Norway’s ECIM. The next ECIM Annual International E&P DM meeting, the 25th, will take place in Haugesund, Norway.

* Common Data Access has now been rolled-into Oil & Gas UK, which recently rebranded as Offshore Energies UK.


2022 Rice University Ken Kennedy Institute HPC in Energy Conference

Birds-of-a-feather session hears from BP, Shell and TotalEnergies. ‘We’re all green now!’ Aramco’s ML R&D. FResNet++ speeds CCS simulation 1000x.

The ‘birds of a feather session’ was billed as ‘HPC and ML aspects of the energy transition’. It turned out to be much more of a commercial for the participants’ companies. All are going green and it could be that the CO2 generated by the HPC devoted to modeling carbon capture exceeds that which is actually sequestered.

BP’s head of HPC, Elizabeth L’Heureux, described BP’s net zero ambition and mutation from oil to energy. BP is working to transition its subsurface expertise to new energy and to develop new skills to differentiate its new energy activities. Use cases are (will be?) simulation for CCS, reservoir modeling, simulating wind farms, solar panels, batteries and AI for distribution. Today HPC in BP is 85% geophysics and this is increasing, with an anticipated 35 petaflops in 2024. As new businesses come on board, people are looking for help with HPC expertise. BP’s HPC unit has a tentative agreement to grow its on-prem systems but is also exploring cloud and third parties to offload excess demand.

Weichang Li presented machine learning work underway at Aramco’s Research Center in Houston. Here, some 70 researchers work primarily in the upstream on flow measurement, the ‘Sensor Ball’, downhole robots and ML with distributed sensing for CCS. Other activity includes deep learning-based completion monitoring with DAS/DTS and core image analysis.

Shell’s Detlef Hohl came clean: ‘I am hardly going to talk about HPC at all’, but rather about ‘what Shell is doing in the energy transition’. This involves very ambitious climate goals to ‘avoid, reduce, and mitigate’ CO2, inter alia by ‘digital twin asset optimization’. CCS is important to Shell, for example at Quest and Gorgon*. Shell already produces more energy from gas than from oil. Shell has some 350 math/data scientists and another 800 ‘citizen data scientists’. Already, 10,000 equipment items are ‘monitored by AI’.

* Although the Chevron-operated Gorgon is struggling as reported in UpstreamOnline.

Following a masked Mauricio Araya (TotalEnergies) was tricky, but we did glean that Total plans for less oil (liquids down to 30% by 2030) and more gas (up to 50%). A new OneTech organization has been in place since September 2021. Total does less lab work and more machine learning. Going green includes CCS at Northern Lights, Aramis and NEP, and afforestation of the Batéké plateau. Pymgrid, a Python library, is used to generate and simulate a large number of electrical microgrids.

In the Q&A the panel was asked how they (as the environmental ‘bad guys’) were doing recruiting all the new talent needed for data science. BP has indeed struggled to find HPC/CFD specialists especially those with domain skills and is trying to develop people internally. Shell observed that AWS (the questioner’s affiliation) was not having an easy time hiring talent either! Shell is teaching data science. It is really hard to attract talent to oils in the EU although it may be easier in the US. ‘We bring people in, show them what we do. We also show them Amazon Science. It’s a good portal, we don’t have it!’

Shell’s Janaki Vamaraju presented FResNet++. Conventional modeling of complex phenomena for hydrocarbons, carbon capture and other fluid flow applications is computationally expensive and can take days to run. AI is solving the hardest challenges in scientific simulation and speeds up modeling 1000x. Physics-informed neural networks and surrogate models are used to accelerate 50 or 100 years of CCS simulation. Neural nets working in Fourier space are not new, but the need for high resolution (10 million to a billion cell models) will stretch HPC resources. Enter the FResNet++ model*, a ‘deep residual learning framework’ for multiphase, multi-component flow simulations in heterogeneous media.

*This reference comes from the ‘papers with code’ website although curiously, for FResNet++, there is no code!
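Since no FResNet++ code is published, here is a hedged, generic illustration of the ‘neural nets working in Fourier space’ idea mentioned above: transform a signal to the frequency domain, scale (or truncate) a few low modes with learnable weights, and transform back. Everything below is a toy stand-in using a naive stdlib DFT, not Shell’s architecture:

```python
import cmath

def dft(x):
    """Naive discrete Fourier transform (O(N^2), fine for a demo)."""
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
            for k in range(N)]

def idft(X):
    """Inverse DFT, returning the real part of each sample."""
    N = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * n / N) for k in range(N)).real / N
            for n in range(N)]

def spectral_layer(x, weights):
    """Scale the lowest Fourier modes by (learned, here real) weights and
    drop the rest -- the mode truncation at the heart of Fourier-space nets.
    len(weights) should be at most len(x) // 2."""
    X = dft(x)
    out = [0j] * len(X)
    for k, w in enumerate(weights):
        out[k] = w * X[k]
        if k:                      # mirror so the output stays real-valued
            out[-k] = w * X[-k]
    return idft(out)

# Keeping only the zeroth mode filters out the oscillation:
print(spectral_layer([0.0, 1.0, 0.0, 1.0], weights=[1.0]))
# -> [0.5, 0.5, 0.5, 0.5]
```

In a real operator network the weights are complex, learned per mode, and applied between pointwise nonlinear layers; the resolution question raised above comes from how many modes and grid cells such transforms must handle.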

Visit the conference home page and watch the conference videos.


EAGE Digital 2022

OSDU and Total’s digital transformation. IFPen’s open source ELK-based geo data framework. INT’s OSDU front-end. Naftagas’ EPP (Esri not OSDU!). Schlumberger proposes ‘Shapley additive explanations’ for AI black boxes. OMV – APC cornerstone of digitalization. Svalbox digital geology. EU Geological Data Infrastructure.

Mathieu Terrisse presented TotalEnergies’s digital transformation program that leverages the OSDU data infrastructure to underpin its in-house developed Sismage-CIG geosciences and reservoir platform. Terrisse’s presentation is a great sales pitch for OSDU, presented as ‘liberating and standardizing data’. While digital transformation is a long road, early results show promise. The OSDU wellbore domain data management services, DDMS, enabled scenario sharing between geoscientists and drillers, with bidirectional transfer of well information between the Sismage geosteering module and ‘DrillX’, another of Total’s in-house developments and a component of Adept, its ‘advanced drilling engineering platform’. At present OSDU’s live data exchange capability is limited as is version management but overall, OSDU has shown that different applications can collaborate on the same data ‘as per the theoretical model’.

Interestingly, Total presented Sismage-CIG before OSDU as ‘a platform with access to every piece of data’.

Tatiana Akimova presented another of Total’s OSDU trials, an ‘ambitious’ use case where an OSDU data platform running in the Microsoft Azure cloud was used in an eight month study of Total’s Suriname Golden Block 58. Here, OSDU improved collaboration across drilling, geosciences and development, and accelerated exploration and appraisal drilling. The Suriname project saw Total extending the OSDU API with a ‘robust’ entitlement model to align with its security policies. Microsoft, Wipro, INT, Emerson and Thales were involved in the project which is described as ‘a major breakthrough for geosciences data management in Total’.

But not all things Total are OSDU. Antoine Bouziat (IFPen) presented a joint IFPen/TotalEnergies investigation into the use of open-source frameworks to manage geoscience knowledge. The ‘ELK’ stack* of open-source data management tools includes Elasticsearch (search), Logstash (data ingestion) and Kibana (visualization). ELK is used to manage and explore large collections of structured and unstructured data. The software is used in financial services and webstores but not, so far, in geosciences. The proposed architecture includes an Elasticsearch cluster and an FSCrawler application that scans and indexes files in the network. Other functions include language detection, de-dupe and OCR. The results are exposed in data-driven interactive visualization dashboards developed with Kibana. The authors are very positive as to the potential of ELK in the subsurface and encourage geoscientists to ‘consider open-source projects and hands-on programming experiments’. The work was carried out as an R&D collaboration between TotalEnergies and IFPen under the auspices of the Tellus consortium for geoscience digitalization.

* According to Wikipedia, ELK is now to be termed the Elastic Stack. While the components are open source, enterprise users may be interested in commercial bundles from Elasticsearch BV, Amazon and others.
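To make the proposed architecture concrete, here is a minimal sketch of the kind of Elasticsearch query a Kibana dashboard or custom front-end would issue against an FSCrawler-built index. The field names (`content`, `meta.language`) assume FSCrawler’s default mapping and would need adjusting for a real deployment:

```python
import json

def geoscience_query(term, language=None):
    """Build an Elasticsearch query body for documents indexed by FSCrawler.
    Field names ('content', 'meta.language') assume FSCrawler's default
    mapping, where extracted text lands in 'content' -- an assumption, not
    a guarantee for any given setup."""
    must = [{"match": {"content": term}}]
    if language:
        must.append({"term": {"meta.language": language}})
    return {
        "query": {"bool": {"must": must}},
        "highlight": {"fields": {"content": {}}},  # snippets for the UI layer
        "size": 20,
    }

# The body would be POSTed to /<index>/_search on the cluster:
body = json.dumps(geoscience_query("turbidite reservoir", language="en"), indent=2)
```

The same bool/match/term building blocks underpin the de-dupe and language-filter functions the paragraph mentions.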

Frederic Desloges (INT) gave an enthusiastic plug for OSDU that might have been taken for a commercial presentation were it not for the fact that OSDU is ‘open source’. Desloges has it that moving data to the cloud is ‘key to the future of the digital workspace’. Companies ‘must be able to access their data in the cloud to generate insights quickly and make critical business decisions in real-time’. Enter the OSDU platform that makes it ‘easy to set up a secure data lake in just a few days’. For visualizing data in the lake, Desloges unsurprisingly advocates INT’s IVAAP, a ‘modular platform’ (another one!) that supports ‘multiple sources of data such as OSDU, custom data catalogues, and data lakes’.

Kseniya Kultysheva reported on similar ‘fundamental’ data management difficulties which led Naftagas to kick off an ‘Exploration Platform Project’. This identified challenges that went beyond the ‘simple implementation or creation of a single software’. A novel approach to managing data across domains was proposed that leveraged, not OSDU, but a ‘geo-information web system’ that was already deployed in the legal department for land management. The product was rebaptized the ‘Exploration Platform’ and is under active development by Naftagas’ geoscientists.

OK so if it is not OSDU what is it? A little googling suggests that the ‘product’ used to manage Naftagas’ land is Esri ArcGIS Enterprise.

Schlumberger’s Amir Shamsa observed that while ‘ML solutions are now used everywhere’, they are frequently rejected by domain experts due to the ‘black box syndrome’. Shamsa proposes the ‘Shapley additive explanations’ (SHAP) visualization tool that computes the contribution of each feature to the model. The approach was used on an 800-well dataset of the Duvernay shale gas field in Canada. It was tested on an XGBoost ML model of the field and used to identify the contribution of each modeling parameter to predicted production. The results were claimed to assist in selecting completion parameters and picking infill drilling locations. For more on the Shapley plot in oil and gas see Keith Holdaway’s LinkedIn paper Explainable Artificial Intelligence: The Shapley Values.
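For readers unfamiliar with the technique, Shapley values can be computed exactly for a small model by enumerating feature coalitions; this is the quantity that SHAP approximates efficiently for large models such as the XGBoost one above. A minimal sketch with a toy linear ‘production model’ (the model and numbers are invented for illustration):

```python
from itertools import combinations
from math import factorial

def shapley_values(f, x, baseline):
    """Exact Shapley attribution of f(x) - f(baseline) over n features.
    Features absent from a coalition take their baseline value -- the
    same idea SHAP approximates for large models."""
    n = len(x)
    idx = list(range(n))
    phi = [0.0] * n

    def eval_coalition(S):
        z = [x[i] if i in S else baseline[i] for i in idx]
        return f(z)

    for i in idx:
        others = [j for j in idx if j != i]
        for r in range(n):
            for S in combinations(others, r):
                # classic Shapley weight |S|! (n-|S|-1)! / n!
                w = factorial(r) * factorial(n - r - 1) / factorial(n)
                phi[i] += w * (eval_coalition(set(S) | {i}) - eval_coalition(set(S)))
    return phi

# Toy model: for a linear model the Shapley value of feature i
# is exactly w_i * (x_i - baseline_i).
f = lambda z: 3.0 * z[0] + 2.0 * z[1]
print(shapley_values(f, x=[1.0, 1.0], baseline=[0.0, 0.0]))  # -> [3.0, 2.0]
```

The per-feature attributions sum to f(x) − f(baseline), which is what makes the SHAP bar and beeswarm plots interpretable as a decomposition of a prediction.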

Georgia Kotsiopoulou related OMV’s deployment of advanced process control (APC), a.k.a. MPC, model predictive control, as a ‘cornerstone of digitalization and operational excellence’. APC has been deployed at two of OMV’s gas production facilities in an ‘almost completely remote manner’. The controllers are based on dynamic process models that autonomously optimize the system using real-time data. OMV carried out ‘thorough market research to find the supplier of the software’. Unfortunately, the authors did not share which software was selected, but the references point to earlier APC work by AspenTech and KBC/Yokogawa.

Kim Senger (University of Svalbard) presented ‘Svalbox’ a research tool for the study of digital geology outcrop models, samples and drill cores. Svalbox includes a subsurface project database built in Petrel along with virtual field trips to illustrate aspects of Svalbard geology. Apart from Petrel and ArcGIS, Svalbox uses ‘mostly open-source’ software available on GitHub. Svalbox is said to ‘bridge the gap between outcrops and subsurface data’ and exposes the Svalbard archipelago as a geoscientific playground.

Karen Lyng Anthonsen (Denmark & Greenland Geological Survey) presented the EGDI, the EU Geological Data Infrastructure that is to make European geological data accessible. EGDI was launched in 2016 and has since been extended with data and results from the GeoERA research projects. A ‘web GIS system*’ provides access to pan-European and national geological datasets connecting to platforms such as Destination Earth, the European Open Science Cloud (EOSC), the European Plate Observing System (EPOS) and the EU Raw Material Information System (RMIS). EGDI also provides access to other geodata resources such as OneGeology-Europe (geological mapping), EuroGeoSource (energy and minerals) and others.

* A combination of OpenLayers, MapServer and MapScript.

More from the EAGE Digital home page. Proceedings (for those with a subscription) are on EarthDoc.


New carbon capture and sequestration software

Kongsberg Digital teams with CMG on GELECO2 software. DNV compares and contrasts KFX CO2 simulator with prior art from TotalEnergies and NETL.

Kongsberg Digital, with partner Computer Modelling Group (CMG), has received funding under Norway’s Gassnova-administered CLIMIT-Demo program to develop simulation software for CO2 injection into depleted oil and gas reservoirs and saline aquifers. The CLIMIT funding is in addition to that already secured from other operators.

The GELECO2 program is to leverage Kongsberg Digital’s LedaFlow transient multiphase flow simulator alongside CMG’s GEM reservoir simulator, and will develop a controller program that integrates and manages the interaction between the well and reservoir systems. Both LedaFlow and GEM will be further developed for CO2 simulation to provide an ‘accurate integrated simulation tool for CO2 injection simulations’.
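The well/reservoir interaction that such a controller must manage can be illustrated with a toy fixed-point coupling loop: iterate until the rate the wellbore can deliver matches the rate the reservoir will accept. All models and numbers below are hypothetical stand-ins for LedaFlow and GEM, not the GELECO2 design:

```python
def coupled_injection_rate(p_wellhead, p_reservoir, depth,
                           rho=700.0, g=9.81, fric=5.0e3, J=2.0e-6,
                           relax=0.5, tol=1e-9, max_iter=200):
    """Fixed-point coupling of a (very simplified) wellbore model and a
    reservoir injectivity model for CO2 injection. Pressures in Pa,
    depth in m, rate in m3/s. All coefficients are invented demo values."""
    q = 0.0  # injection rate guess
    for _ in range(max_iter):
        # wellbore model: bottomhole pressure = wellhead + hydrostatic - friction
        p_bh = p_wellhead + rho * g * depth - fric * q * q
        # reservoir model: linear injectivity around reservoir pressure
        q_new = J * (p_bh - p_reservoir)
        if abs(q_new - q) < tol:
            return q_new
        q += relax * (q_new - q)  # under-relaxation for stable convergence
    raise RuntimeError("coupling did not converge")
```

In the integrated tool described above, each of these two one-line models is replaced by a full transient multiphase simulation, which is exactly why a dedicated controller between the codes is needed.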

Kongsberg Digital SVP Shane McArdle, said, ‘This project will be key for the success of future CCS projects as it is critical for companies to model CO2 injection in an integrated manner, not with isolated well and reservoir models. Net Zero is a focus area for Kongsberg Digital and developing this software will be a gamechanger within CCS.’

The Kongsberg/CMG announcement came hot on the heels of another Norwegian CCS development which saw DNV and Equinor partner on the development of ‘KFX CO2’, computational fluid dynamics simulation software for CCS as we described in our last issue.

As we have already reported on two prior CCS storage software projects, CCSI from the US NETL and Total’s GEOSX, we asked DNV what was new in KFX and whether such ‘prior art’ was leveraged in its development.

DNV’s Kjell Erik Rian came back with the following. ‘As far as I understand, the NETL/CCSI and Total/GEOSX CCS software projects were looking at quite different technical aspects of CCS. The current KFX-CO2 project (with partners Equinor and TotalEnergies) builds on previous KFX CO2 safety R&D supported by Equinor and the Research Council of Norway and on our CO2 dispersion modelling experience obtained through full-scale CCS industry projects. The main goal of the KFX-CO2 project is to improve the KFX simulation technology with respect to industrial CO2 dispersion analyses for realistic conditions. KFX is DNV’s industrial computational fluid dynamics (CFD) tool for 3D consequence analyses of gas dispersion, fires and explosions. KFX is used daily for safety studies in the energy and process industry worldwide.’


Software, hardware short takes

Lloyd’s Register AllAssets, DGB’s OpendTect, Emerson DeltaV Simulation Cloud, EnQuest Thor, Safe Software FME, FutureOn FieldTwin Collaborate, Halliburton and Honeywell, Harvest Technology, Modelica Association FMI 3.0, P2 Field Operator, Pegasus Vertex CWPRO, Eliis PaleoScan, Peloton SOC compliance, Petromehras ResX, Schlumberger GeoSphere 360, Terradepth Absolute Ocean.

Lloyd’s Register’s AllAssets 3.0 cloud-based asset performance and risk management solution is deployed on over 90,000 assets across 125 sites. The latest release includes updates to inspection data capabilities and tracking, improved data transfer to the risk-based inspection modules and finer control of security.

The recent V6.6.6 release of DGB’s OpendTect seismic machine learning platform accelerates the path from R&D to operational deployment with import of models developed in Keras (TensorFlow), Scikit Learn and PyTorch.

Emerson’s DeltaV Simulation Cloud enables users to connect to a simulated version of their control system from anywhere in the world. DeltaV Simulate and Mimic simulators allow operators to train and react in real time to abnormal situations on the same interface as deployed in the plant. The cloud edition allows small and medium sized companies to leverage simulation technology without the overhead of on-prem equipment and maintenance.

EnQuest Energy Solution’s new Thor is a 5,000-horsepower electric hydraulic fracturing pump solution that improves frac efficiency, reduces operational costs, and achieves ESG targets.

Safe Software’s FME 2022.0 offers new visual workspace comparison for data comparison and merging, ‘at-scale’ automation and job analytics and better user management. The Esri reprojector now supports vertical coordinate systems and time-dependent (epoch) coordinate systems. The new release also includes new data formats and data management functionality. More from Safe Software.

FutureOn has announced FieldTwin Collaborate, a digital project sharing forum for collaboration across the energy sector and supply chain. FTC brings multiple contractors together in one space, benefiting operators and project teams by ensuring accessibility, integrity and real-time updates on integrated projects.

Halliburton and Honeywell are teaming on the provision of Industry 4.0 technologies to ‘transform hydrocarbon recovery and asset performance’ with a surface/subsurface digital twin that integrates the entire value chain.

Harvest Technology Group has launched RiS, a remote inspection system. RiS delivers high-fidelity video, synchronized data, and high-quality audio transmission from any site to another via a secure network using ‘military-grade’ encryption. RiS provides a telepresence to remote technical experts to advise on-site operators.

The latest version of the Modelica Association’s Functional Mock-up Interface, FMI 3.0, defines a ZIP archive and an application programming interface to exchange dynamic models using a combination of XML files, binaries and C code: the Functional Mock-up Unit (FMU). The API is used by a simulation environment, the importer, to create one or more instances of an FMU and to simulate them, typically together with other models.
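A toy illustration of the FMU packaging described above: a ZIP archive with a modelDescription.xml at its root, which an importer reads before instantiating the model. The real FMI 3.0 schema is far richer than this sketch, and the model name, token and binary path below are invented:

```python
import io
import zipfile
import xml.etree.ElementTree as ET

# A minimal stand-in for an FMU: per FMI, a ZIP whose root holds a
# modelDescription.xml describing the packaged model.
MODEL_XML = """<?xml version="1.0" encoding="UTF-8"?>
<fmiModelDescription fmiVersion="3.0" modelName="Bouncer"
                     instantiationToken="{demo-token}"/>
"""

def make_demo_fmu():
    """Build an in-memory, FMU-shaped ZIP archive."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w") as z:
        z.writestr("modelDescription.xml", MODEL_XML)
        z.writestr("binaries/x86_64-linux/bouncer.so", b"")  # placeholder binary
    buf.seek(0)
    return buf

def read_model_description(fmu_file):
    """An importer's first step: read the model description from the archive."""
    with zipfile.ZipFile(fmu_file) as z:
        root = ET.fromstring(z.read("modelDescription.xml"))
    return root.get("modelName"), root.get("fmiVersion")

print(read_model_description(make_demo_fmu()))  # -> ('Bouncer', '3.0')
```

A real importer would go on to load the binary for its platform and drive the model through the FMI C API; the ZIP-plus-XML packaging is what makes FMUs portable between tools.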

P2 has launched an extension of its business process outsourcing offerings to include production accounting based on the P2 Merrick platform. Users can now deploy P2 Field Operator to their field users and integrate it with their scada systems, allowing them to ‘focus on their operations and health of their wells while P2 handles the daily activities associated with managing and reporting production volumes’. The new services are delivered via P2 Carte, the web-based self-serve production portal. More from P2 BPO.

Pegasus Vertex has upgraded its casing wear prediction model CWPRO with multi-well sections offering not only multi-operations but also cumulative casing wear results and tool joint wear models to help build safe and successful wells. A newly-added feature, wear factor sensitivity analysis, allows users to input three groups of wear factors to quickly analyze their effect on casing integrity.

Eliis PaleoScan 2021.2 includes workflow enrichment with new seismic attributes (AVO and 2D smoothing), an enhanced OpenWorks data link and an enhanced user experience.

Peloton has achieved SOC 1 and SOC 2 Type 2 compliance for its SaaS-based platform. The SOC auditing standard was developed by the American Institute of Certified Public Accountants. Compliance demonstrates how Peloton safeguards customer data and maintains effective controls. The examination was performed by Armanino LLP.

Petromehras has released a new version of ResX, its ensemble-based history matching software. The 2022.2.0 edition offers improved memory use and performance, embedded structural uncertainty and easier study setup. ResX now runs on Petrel 2021.2 and/or in Schlumberger Delfi.

Schlumberger’s new GeoSphere 360 service delivers real-time 3D reservoir mapping while drilling to enhance reservoir understanding and optimize production potential. GeoSphere combines 3D electromagnetic data with cloud computing to map resistive reservoir bodies.

Terradepth has launched Absolute Ocean, said to be the world’s first ocean data as a service platform. The cloud-based, browser-accessible ocean data management platform provides an ‘intuitive, immersive interface’ for the visualization of large high-resolution geospatial datasets. Users can blend their own data with the subscription-based Terradepth data. AO supports multiple geospatial data types, including side-scan sonar, synthetic-aperture sonar, multi-beam bathymetry, satellite-derived bathymetry, lidar, magnetometer, 2D- and 3D-point cloud and satellite imagery. A machine-learning pipeline optimizes data over areas of interest.


2022 ABC Onshore Wellsite Automation

Chevron’s IIoT hub in the cloud. Autosol/PureWest ‘scada a barrier to digital transformation’. Koda Resources’ MQTT retrofit. Red Bluff Resources wireless to cloud edge devices. Mallorn Energy repurposes Quickbase as ‘well relationship manager’. Schlumberger Agora AI for Oasis Petroleum. New ISA 112 standard for scada. Other solutions of note…

The 2022 edition of the American Business Conferences Onshore Wellsite Automation was held earlier this year live in Houston and online. As in previous years, the event was characterized by the predominance of MQTT-based solutions displacing traditional scada. Todd Anslinger presented Chevron’s ‘industrial internet of things’ (IIoT) approach that envisages an IIoT hub in the cloud. Anslinger is on the steering committee of the Eclipse Foundation’s MQTT/Sparkplug initiative. For a large company like Chevron, the hub may mean an in-house cloud, possibly using the Microsoft Azure Stack, either on premises chez Chevron or possibly at Rockwell, where Anslinger is a customer advisor. Smaller operators may be happier with a public cloud. Chevron uses the Azure IoT hub for ingestion and the Azure Data Explorer for analysis. Use cases include flaring reduction and methane management using IIoT-connected infrared cameras. Carbon capture at Gorgon and Quest is monitored with IIoT sensors. Anslinger proselytizes for the MQTT/Sparkplug pub-sub data exchange standard that ‘drives efficiency at-scale’.
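Sparkplug gains its ‘at-scale’ efficiency partly by fixing the MQTT topic namespace, so that any consumer can discover equipment without per-site configuration. A sketch of topic construction following the Sparkplug B convention (the group, node and device names below are invented):

```python
# Sparkplug B fixes the MQTT topic namespace as
#   spBv1.0/<group_id>/<message_type>/<edge_node_id>[/<device_id>]
# which is what lets broker-side consumers auto-discover equipment.
SPARKPLUG_NS = "spBv1.0"
MESSAGE_TYPES = {"NBIRTH", "NDEATH", "DBIRTH", "DDEATH",
                 "NDATA", "DDATA", "NCMD", "DCMD", "STATE"}

def sparkplug_topic(group_id, message_type, edge_node_id, device_id=None):
    """Build a Sparkplug B topic, validating the message type."""
    if message_type not in MESSAGE_TYPES:
        raise ValueError(f"unknown Sparkplug message type: {message_type}")
    parts = [SPARKPLUG_NS, group_id, message_type, edge_node_id]
    if device_id:
        parts.append(device_id)
    return "/".join(parts)

# e.g. a data message from a hypothetical wellsite flow computer:
print(sparkplug_topic("Permian", "DDATA", "Pad-07", "FlowComputer-1"))
# -> spBv1.0/Permian/DDATA/Pad-07/FlowComputer-1
```

The BIRTH/DEATH message types carry the full tag set and last-will state that give Sparkplug its self-describing, stateful character on top of plain MQTT.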

David Blanco (Autosol) and Bryan Hendrix (PureWest) reflected on the operational challenges of traditional scada. Slow polling, inconsistent data and no remote device management meant that scada had become a barrier to the digital transformation. The solution was ACM (the Autosol Communications Manager) and Inductive Automation’s Ignition scada. Again, MQTT is the preferred protocol for data transfer (although ACM uses a proprietary system). MQTT is said to deliver a ‘30-70%’ network speedup. This is just as well since ‘managing MQTT and Edge Devices over low bandwidth networks is not feasible’. Returning to the digital transformation theme, the authors concluded that ‘an entire institution may benefit from digital transformation, but that doesn’t mean it should go all-in right away. It’s often best to start in one area, identify progress and best practices, and then expand the reach of digital transformation across your campus.’

Evan Rynearson described Koda Resources’ MQTT retrofit project that set out to keep costs low when updating its legacy scada systems. There are ‘thousands of devices out in the field that don’t know the MQTT protocol’, and switching from a poll/response system to MQTT without replacing existing devices is not without risk. The rationale for such a move is the goal of ‘big loop’ optimization from business down to operations. Again, Inductive Automation’s Ignition portfolio enabled the transition. The Sparkplug B protocol was used to create a meaningful tag namespace to identify equipment. An OPC/MQTT converter also ran. Different devices required different approaches but, as Rynearson warned, ‘MQTT is a new communication protocol, and very different to what we’ve grown used to and standardized our whole scada lives on’. The effort has proved worthwhile, with improved alarm response, flowback monitoring and new high-resolution data. Greenfield sites are now scada-equipped ‘in minutes’, with a template-driven namespace, and the MQTT engine sends out a ‘tags on birth’ message for new equipment. The ‘send on change’ paradigm has produced huge bandwidth savings. Initial targets for deployment could include alarms, MQTT-capable devices and fixing that ‘low frequency data your data scientist is complaining about!’
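The ‘send on change’ (report-by-exception) paradigm behind the bandwidth savings is easy to sketch: publish a tag only when it moves outside a deadband. A minimal illustration, with invented tag names and values:

```python
class ReportByException:
    """Publish a tag only when it changes by more than a deadband -- the
    'send on change' behaviour credited with the bandwidth savings. (A
    sketch: a real MQTT engine would also publish BIRTH messages carrying
    the full tag set, as in the 'tags on birth' scheme described above.)"""

    def __init__(self, deadband=0.0):
        self.deadband = deadband
        self.last = {}  # last published value per tag

    def update(self, tag, value):
        """Return True (i.e. publish) only if the value moved enough."""
        prev = self.last.get(tag)
        if prev is not None and abs(value - prev) <= self.deadband:
            return False  # inside the deadband: suppress, save bandwidth
        self.last[tag] = value
        return True

rbe = ReportByException(deadband=0.5)
sent = [v for v in [100.0, 100.2, 100.4, 101.1, 101.2]
        if rbe.update("tubing_psi", v)]
print(sent)  # -> [100.0, 101.1]
```

Five polled samples collapse to two published messages; at thousands of tags per site this is where the ‘huge bandwidth savings’ come from, at the cost of choosing deadbands carefully per tag.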

Brandon Davis showed how Red Bluff Resources (RBR) has implemented IIoT edge devices to leverage real-time data. RBR uses a Red Lion Graphite/Crimson HMI front end and wireless comms from SignalFire, whose Ranger node got a shout-out as providing configurable IO*. AFTI’s Watchdog pump monitor is also deployed, as is Sensorfield’s monitoring technology and Andium’s video cameras. All this comes together in RBR’s Azure instance (thanks to MQTT, naturellement) for processing with Stream Analytics and analysis with Power BI. Esri ArcGIS is used to display scada data in both map and schematic form. ArcGIS tools monitor tank batteries and line pressures.

* In a separate presentation, Sandro Esposito drilled down into SignalFire’s sensor-to-cloud MQTT capabilities and LTE telemetry. MQTT/SparkplugB and LTE-M Cat 1 are said to be ‘the IoT industry standard’.

Marshal Hall (Mallorn Energy), reflecting on the success of customer relationship management systems, began playing with a CRM from Quickbase and soon realized that this relational database could be repurposed for use as a … well relationship management system, the WRM. By using this low-code software, Hall eliminated the need for over 20 spreadsheets, replacing a reporting tool and ‘saving thousands of dollars’. After a quick RDBMS tutorial, Hall showed some of the (rather compelling) results. Quickbase seems particularly suited to generating data-driven diagrams and reports for well cards, status updates, dashboards and even wellbore schematics. Hall concluded that ‘there is a massive amount of inexpensive software out there that can be used to streamline wellsite and back-office operations’. And also, ‘stop using Excel to store operations data’.

Will Whitley described trials of AI/ML at Oasis Petroleum. The project evaluated thousands of sucker rod pump cards to obtain an ML algorithm that could distinguish between fluid pound and gas interference and to spot conditions such as tagging. Agora (a Schlumberger unit) helped build the AI model which was successfully deployed on a 6 well trial before scaling up to a 74 well pilot.

Alan Bryant (Oxy, and the ISA 112 committee) provided a progress report on the new ISA 112 standard being developed for scada systems. There is currently no good guidance available for building and operating a scada system for a diverse equipment set; a common terminology and reference architecture is needed. Work began in 2016 and in 2018 an ISA 112 lifecycle and architecture diagram was released. Currently, a first draft of the standard is receiving comments and publication is planned, well, sometime this decade! The process appears somewhat laborious, with input from 100 members across multiple verticals including oil and gas. An initial 5-layer architecture was proposed; after consultation and debate this now stands at 11 layers. It could be that the slow pace of development means that technology is overtaking the initiative, although ‘a subcommittee is working on an IIoT version of the diagram’. More from ISA 112.

We have not covered all the presentations made at Wellsite Automation. Here are some other vendor solutions that merit a mention…

More from the ABC conference website.


Folks, facts, orgs …

ABBYY, Applied Petroleum Technology, Add Energy Academy, Chemical Safety Board, CO-LaN, Colonial Pipeline, Coresystems, Cummins, DeepOcean, Dragos, Energistics, Engage Mobilize, Equinor, Foster Marketing, Graves & Co., H2scan, Halliburton Labs, Halliburton, Hexagon AB, Lummus Technology, National Geospatial Advisory Committee, NSF TIP, Namur, Offshore Energies UK, North Sea Transition Authority, Pickering Energy Partners, ProFrac, Siemens Energy, US National Academy of Engineering, Schlumberger, Energy Transition Institute, BGN Technologies, Velo3D, Veriten.

Scott Opitz has been appointed CTO at ABBYY, where a 40% hike in R&D spend targets its intelligent document processing platform Vantage and process intelligence solution Timeline. Opitz was co-founder and CEO of TimelinePI, acquired by ABBYY in 2019.

Oslo, Norway headquartered Applied Petroleum Technology (APT) has appointed Eric Michael as chief geochemical advisor at its Houston location. Michael was previously with ConocoPhillips.

The Add Energy Academy has launched with a range of drilling and well engineering, operations and maintenance, safety and risk, and leadership training courses.

The US Chemical Safety Board has hired Stephen Owens and Sylvia Johnson to the board.

CO-LaN has contracted Florence Kuijl as technical administrator. Kuijl is a graduate of the IFCIM school at Mazamet, France.

Colonial Pipeline has named Adam Tice as chief information security officer. Tice hails from Silicon Valley Bank. He developed his expertise in navigating post-breach environments when he joined Equifax following its 2017 incident.

Jean-Thomas Célette has been named Coresystems’ CEO. His most recent position was with Google’s shopping division. Coresystems is a field service management software specialist.

Cummins has elected president and COO Jennifer Rumsey to its board.

DeepOcean has hired Bill Smart to lead business development efforts in emerging markets. He joins from Delmar Systems.

Industrial cybersecurity specialist Dragos has hired Christophe Culine as president of global sales and chief revenue officer. Prior to joining Dragos, he served as President and CRO of RiskIQ, now a Microsoft unit.

Energistics (now an affiliate of The Open Group) has named Pablo Pérez Bardasz to its board. Bardasz, a PDVSA veteran and Witsml luminary also runs Bardasz, his own D&C boutique. The Energistics board now comprises Laurent Deny (Emerson), David Smith (Baker Hughes), Wilfred Berlang (Shell) and The Open Group CEO Steve Nunn.

Steve Foster is to join Engage Mobilize as CEO, replacing founder Rob Ratchinsky, who joins the board of advisors. Foster founded Convercent, recently acquired by OneTrust. Other Engage joiners are COO Alison Kane and CRO Jason Jose.

Carri Lockhart, EVP Technology, Digital & Innovation is resigning from Equinor to return to the US. Senior VP Elisabeth Birkeland Kvalheim has been appointed acting EVP for TDI. Other Equinor appointments include Geir Tungesvik (EVP Projects, Drilling and Procurement) and Aksel Stenerud (EVP People and Organization). Ana Fonseca Nordang takes on a new role in Equinor’s renewable business area.

Foster Marketing has appointed Elaine Benoit as design and digital associate with responsibility for ‘creative ideation’ and design campaigns on behalf of clients.

Following Allen Barron’s retirement, Bill Vail is to serve as EVP at Graves & Co. Consulting, a reservoir engineering advisory.

David Meyers is now COO at hydrogen sensing solutions provider H2scan. Meyers joined the H2scan board in 2017 after leading an investment from Altran. H2scan customers include ABB, Siemens, GE Energy, DOD, ExxonMobil, Shell and Chevron.

Jennifer Holmgren, CEO, LanzaTech and Maynard Holt, CEO, Veriten have joined Halliburton Labs, a ‘collaborative environment where entrepreneurs, academics, and investors join to advance cleaner, affordable energy’.

Tobi Young and Earl Cummings have been appointed to the board of Halliburton. Young is an attorney with Cognizant. Cummings is on the board of CenterPoint Energy.

Paolo Guglielmini has been appointed COO for Hexagon AB. Before joining Hexagon in 2010, Guglielmini was with CERN. Current COO Norbert Hanke is now EVP. Hexagon veteran Juergen Dold is now EVP with responsibility for the HxDR digital reality platform. Thomas Harring and Steven Cost join the executive management team.

Ujjal Mukherjee is now CTO at Lummus Technology, succeeding retiree Jo Portela. He was recently VP and MD of the Chevron Lummus Global clean fuels joint venture.

The US Department of the Interior has appointed Nadine Alameh of the Open Geospatial Consortium to serve on the National Geospatial Advisory Committee, representing the interests of the non-profit sector.

US National Science Foundation director Sethuraman Panchanathan has announced a new ‘TIP’ directorate focused on technology, innovation and partnerships. TIP is to ‘accelerate the development of new technologies and products that improve Americans’ way of life, grow the economy and create new jobs, and strengthen and sustain U.S. competitiveness for decades to come’. Erwin Gianchandani is the inaugural TIP director.

The EU Namur process control standards body has elected Frank van den Boomen (Covestro) and Rene Neijts (Dow) to its board.

Industry body Oil & Gas UK (OGUK) has changed name to Offshore Energies UK, reflecting its embrace of low-carbon offshore energy technologies including offshore wind, hydrogen production, carbon capture and storage systems, and other emerging low-carbon technologies. The name change follows a ‘year-long strategic review’.

In a mirror move, the UK government’s Oil and Gas Authority is transmuting to become the North Sea Transition Authority, reflecting its evolving role in the energy transition. OGA CEO Andy Samuel is to step down and Russell Reynolds has been engaged to search for a replacement.

Jason Martinez heads-up Pickering Energy Partners’ new energy transition advisory practice. The ETA builds on PEP’s ‘Insights’ business intelligence, consulting and investing practices.

ProFrac has appointed Lance Turner as CFO. He hails from FTSI.

Siemens Energy has appointed Karim Amin to its board.

Geophysicists Leon Thomsen and Oz Yilmaz have been elected to the US National Academy of Engineering.

Schlumberger has opened a Houston branch of its ‘Innovation Factori’ AI innovation network.

A $10 million gift from Shell USA and Shell Global Solutions has enabled the University of Houston to establish the Energy Transition Institute focused on the production and use of reliable, affordable and cleaner energy for all through a just and equity-driven pathway. Total funding for the institute, including other recent gifts and matching funds, will ‘likely exceed $52 million’.

BGN Technologies, the technology transfer company of Ben-Gurion University of the Negev (BGU), has announced the launch of the joint Israel-US cyber security consortium, ICRDE. The consortium targets the protection of energy facilities with some $12 million in R&D funding. The consortium will be run under the auspices of BIRD, the Israel-US binational industrial R&D foundation.

James Shih has been named VP supply chain management for 3D printer manufacturer Velo3D. He hails from Bloom Energy.

Veriten, led by Founder and CEO Maynard Holt, has launched a new media platform, ‘a new voice in the global debate on the future of the world’s energy mix’. Holt was previously CEO of Tudor, Pickering, Holt.


Done deals …

Aker BP sells Cognite stake. Datagration fundraising. EQT/TA sell stakes in IFS/WorkWave. Expro bags SolaSense. GEP to acquire Costdrivers. H2scan gets LetterOne funding. Iron-IQ oversubscribed. News Corp buys OPIS and related assets. ProFrac completes FTS acquisition. Insight Partners lead Prisma Photonics Series B.

Aker BP has sold its 7.4% stake in Cognite to Aramco Overseas Co., a unit of Saudi Aramco. Cognite also reports raising some $225 million from global technology investors TCV and Accel. Aramco uses Cognite’s Data Fusion technology to ‘seamlessly deliver complex real-time insights and optimize energy supply.’

Datagration Solutions has closed an equity financing round to support the growth of its business and advance its PetroVisor platform. Existing investors, including Quantum Energy Partners’ Innovation Fund, participated in the fundraising alongside new investors, led by Houston-based private equity firm EIV Capital.

EQT Private Equity and TA have sold a large part of their stake in cloud software vendors IFS and WorkWave to Hg Capital, giving IFS and WorkWave a notional $10 billion valuation. WorkWave was separated from IFS in 2021 and has since executed three ‘transformational’ add-ons to its Field Service offering.

Expro has acquired distributed fiber optic sensing specialist SolaSense, adding well surveillance technology to its well intervention and integrity services.

GEP is to acquire Costdrivers, a provider of costing and pricing trends forecasting services, and procurement intelligence and data science firm Datamark. GEP is a provider of supply chain strategy and software.

Hydrogen sensor manufacturer H2scan has received some $70 million in funding from UK-based investor LetterOne and Korea’s GS Energy. H2scan’s Gen 5 hydrogen sensor is used in leak detection and process gas monitoring markets as well as hydrogen distribution pipelines. Customers include ABB, Siemens, GE Energy, ExxonMobil, Shell and Chevron. More from H2scan.

Cloud-native scada innovator Iron-IQ reports an oversubscribed $3.5 million Series A fundraising led by Ascent Energy Ventures with contributions from Greater Colorado Venture Fund and Scape. Iron-IQ also recently announced Patch-IQ, adding alarming, control, IP-camera integration, custom logic layers, and advanced data integration to its base offering.

News Corp has completed its acquisition of OPIS and related assets including Axxis Software, PetroChem Wire, OpisNavx and other businesses. OPIS joins News Corp’s Dow Jones professional information business.

ProFrac Holdings has completed its acquisition of FTS International, ‘reuniting FTSI with the ProFrac management team’.

Prisma Photonics has raised $20 million in a Series B funding round led by Insight Partners alongside SE Ventures (Schneider Electric’s VC arm) and Future Energy Ventures (the VC and collaboration platform of E.ON). Prisma’s markets include oil and gas transmission pipelines, highways, subsea cables and more. Prisma’s ‘AI-driven’ Hyper-Scan fiber sensing technology monitors infrastructure over thousands of kilometers, providing real time events and alerts.


ESRI 2021 EU PUG Pipeline Session

Digital twin ‘not just a buzzword’. BP’s SPiRiT subsea, pipeline, riser twin. CHA-IS/Exprodat’s pipeline integrity management system. Rosen’s inline inspection methodology.

Jeff Allen kicked-off the 2021 ESRI EU PUG Pipeline Session stating that the digital twin is ‘not just a buzzword’. The three key components of the twin are a historical baseline for the asset, support for real time operations data, and a forecasting and testing capability to predict outcomes. Esri’s ArcGIS for Pipeline offers a GIS, augmented with a central enterprise data repository along with tools and technology for asset management and the digital twin.

Graham Savage presented BP’s Subsea, Pipeline, Riser Twin (a.k.a. SPiRiT). BP has ‘terabytes of pipeline data’. SPiRiT was developed in BP using out of the box Esri components and a ‘low code’ approach. Data is stored in BP’s global PODS SDE database. The global GIS allows for click through from a pipeline map to bring up a schematic display of the line. Various tabs allow for drill down to other data tables to view anomalies associated with pipe segment types. The solution combines ArcGIS dashboards and PowerBI tables. In the subsea view, a selector is used to collate information on a particular flexible type. Video seabed survey data can be viewed to see where a pipe is not touching the seabed. Another function is the dynamic alignment sheet, generated on the fly from the PODS database. A documents tab brings up documents tables stored in PODS along with tags that link across to BP’s document management system.

Neal O’Driscoll presented a pipeline integrity management system (PIMS) that his company CHA Integrated Solutions* is developing (with help from Exprodat) for Gas Networks Ireland. GNI was looking for a standards-based pipeline data model/database and settled on PODS. Data is now being migrated from GNI’s Smallworld/Maximo systems along with inline inspection data. The solution combines PODS and Esri technology along with CHA’s Intrepid asset management software, also built on Esri technology. A click on a pipeline brings up digital field book data along with CIPS corrosion data stored in PODS. The PODS data loader is used to import additional data. Data can be viewed in both Intrepid and in ArcGIS Pro.

* Previously Novara GeoSolutions.

Simon Daniel presented Rosen UK’s work on inline inspections. A survey by the UKOPA found that 21% of all production loss is caused by external interference. This in turn is influenced by pipeline burial depth. Rosen’s methodology for estimating coverage combines a highly accurate digital terrain model from LIDAR surveys with inline inspection measurements from an inertial navigation pig. Combining the pig navigation data with the terrain model provides depth of pipe, color coded along a strip map display. Regarding accuracy, Lidar provides around 5cm vertical accuracy and 1m resolution. The ILI tool gives around +/- 0.2 m after 2000 m of pig travel. Rosen has surveyed some 1500 km of line to date with 800k measurements. The average depth of cover is around 1.5m. More from Rosen.
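At its core, the depth-of-cover estimate is a subtraction of two elevation profiles: the ground surface from the LIDAR terrain model and the pipe centerline from the inertial navigation pig, both sampled along the pipeline chainage. A minimal illustrative sketch (hypothetical data and function names, not Rosen’s actual software) might look like this:

```python
import numpy as np

def depth_of_cover(chainage_m, pipe_elev_m, lidar_chainage_m, lidar_elev_m):
    """Estimate depth of cover (m) at each pig sample station.

    chainage_m / pipe_elev_m: pipe centerline elevation from the
        inertial navigation (ILI) pig, sampled along the line.
    lidar_chainage_m / lidar_elev_m: ground surface elevation from
        the LIDAR digital terrain model.
    """
    # Resample the terrain model onto the pig's chainage stations
    # so the two profiles can be differenced point by point.
    surface = np.interp(chainage_m, lidar_chainage_m, lidar_elev_m)
    return surface - pipe_elev_m

# Synthetic example: flat terrain at 100 m, pipe buried at 98.5 m,
# i.e. an expected cover of 1.5 m along the whole 2 km stretch.
ch = np.linspace(0.0, 2000.0, 5)
cover = depth_of_cover(ch, np.full(5, 98.5), ch, np.full(5, 100.0))
print(cover)  # each value 1.5
```

In practice the hard part is the accuracy budget quoted above: the ~5cm vertical accuracy of the LIDAR and the growing positional drift of the inertial pig (around +/- 0.2m after 2000m of travel) both feed into the uncertainty of the color-coded strip map.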

Finally a couple of corrections regarding our report from the Esri EU PUG in our last issue that Shell’s Berik Davies has provided. Shell’s name for its corporate AGOL deployment is ‘My Maps’ (not as we had it myMap). Also the focus of the Discovery system for Shell Maritime (Shipping & Trading) is more on vessel assurance rather than insurance, involving the spatialization of Shell’s GMAS, the Group Maritime Assurance System. Our apologies for the misrepresentations.


OPAF scales-up

ExxonMobil’s cloud-based process control architecture ‘running independently of any classic control system’.

ExxonMobil recently provided an update on its Open Process Automation ecosystem developments where Ryan Smeltzer traced the background and objectives of the OPA, originally conceived as the OPA Forum, under the auspices of The Open Group. The aim is to deliver an open, interoperable process architecture to ‘promote innovation’ and provide ‘optionality’, i.e. opening up systems to ‘best in class’ product deployment, as opposed to single vendor solutions and proprietary lock-in. Work began on a prototype in 2016 with Lockheed Martin. A testbed/demonstrator was delivered in 2019 to test third party products with Yokogawa as systems integrator. Exxon is now funding field trials of the system with a go-live in 2023. The system will include 2,000 I/O points and 100 control loops running on OPA 2.1. The aim is to demonstrate the OPA ROI and encourage others to invest in OPA.

A memorandum of understanding has been signed with OPA partners Intel, Schneider Electric and Yokogawa to embed the OPA developments into Exxon’s ‘advanced compute platform’. The ACP will leverage software from Intel, Schneider’s IEC 61499 standards-based IoT ‘EcoStruxure’, Dell’s hyperconverged infrastructure and VMware’s Cloud Foundation virtualization. Yokogawa is to provide system integration and support. For Exxon, this will be a ‘best of breed architecture, running independently of a classic control system’. All will run in a low latency cloud environment and will enable more sophisticated control apps ‘at the edge’, without legacy ties to particular hardware. Non-confidential outcomes of the trial will be shared with industry through OPA.

In the Q&A, Oil IT Journal asked, ‘What sort of market share does oil and gas have in the process control business? Will OPA have application in other process verticals? How far and wide do you expect this to reach? Will it touch discrete manufacturing?’

Exxon’s Mohan Kalyanaraman replied, ‘We do expect this to go beyond oil and gas into the power segment, food and beverage. OPA is applicable in a hybrid batch environment but maybe not in discrete manufacturing’. Smeltzer added, ‘Our approach with our partners is not to develop an oil and gas solution. We need to appeal to more than oil and gas to be successful. We are also thinking of the scale of implementation, from a skid-based deployment to the scale of a plant’.

The Dell rep concurred, ‘Yes, this is why we are on board. OPA has standardized a fragmented marketplace’. Intel agreed too, ‘We are active in the discrete space with automotive. Many of these problems are shared with discrete manufacturing like containerization and so forth. It is a shame that discrete does not have the OPAF umbrella’. And Yokogawa, ‘We are developing a software defined infrastructure. We want to bring this as a packaged solution for any kind of client’.

More from the OPAF home page.


‘Process4Planet’

A.Spire organization unveils ambitious Process4Planet program to ‘transform Europe’s process industries and assure climate neutrality’.

A.SPIRE is the EU association that manages the Processes4Planet program that is to ‘transform the EU process industries to achieve circularity and overall climate neutrality by 2050’ while ‘enhancing their global competitiveness’. P4Planet is a public-private partnership between A.SPIRE and the EU Commission’s Horizon Europe R&D program. P4Planet sets out to promote a ‘holistic systemic socio-economic approach!’ A.Spire membership includes EPRA, the EU petroleum refiners association, and CEFIC, the EU chemical industry council. A.Spire is firmly anchored in climate neutrality and foresees a ‘massive disruption’ in the energy sector itself and in process industries in general as they seek to make optimal use of alternative energy resources and feedstocks and contribute to the transition.

While the fossil fuel business is excluded from A.Spire’s deliberations, the ‘green deal’ movement is billed as a game changer for Europe’s process industries and its conclusions may ricochet around oil and gas processes at some juncture in the future.

Such deliberations are set out in A.Spire’s 260 page ‘Strategic Research and Innovation Agenda’. This ranges widely across multiple industries and objectives. In the field of digitalization, A.Spire is setting out to ‘solve the industrial big data problem as a basis to develop the cognitive digital plant by 2050’. Leveraging the ‘huge amount of largely incomprehensible data’ amassed to date will require the development of ‘hard and soft solutions based on new computing techniques’. ‘Process control, production scheduling, material allocation and maintenance actions can be improved by new computing techniques’. This comes along with entreaties to leverage ‘modelling technologies and artificial intelligence methods like machine learning techniques … especially reinforcement learning’. Cognitive tools will be driven by reliable process analytical technologies and will make use of process data, providing high level supervisory control while supporting process operators and plant engineers.

The Agenda name checks another EU favorite, blockchain technologies, used to ‘track and trace component parts’, and to ‘secure the chain-of-custody’.

A.Spire ‘successes’, a.k.a. first-of-a-kind large scale applications of one or more new technologies, are curiously dubbed ‘marbles’. An initial set of marbles already covers the overall set of innovation areas and programs defined in the 2050 Agenda, ‘showing the coherence of the P4Planet approach with industry priorities’. Whether or not the P4P will succeed is moot. In the meanwhile, however, the vast scope of the Agenda opens the door for another round/boondoggle of taxpayer largesse (more than €35 billion) directed at the startups, universities and consulting folks (who, by the way, write the Agenda and will later on mark their own homework) that make up the EU Horizon R&D ecosystem. Apply here.


Sales, partnerships, deployments

Accenture/AWS to Ecopetrol. Aker BP/Accenture/Cognite ‘data factory in the cloud’. AspenTech for Numaligarh Refinery. Baker Hughes, C3AI, Accenture and Microsoft team on energy IAM. Shell reports C3AI deployment. PBF Energy deploys ClearDox. Enveil teams with Terradepth. HUVR and Crochet Midstream deliver tank asset integrity solution. Petrobel deploys Landmark’s iEnergy Stack. Honeywell Forge for Petroperú. ‘ICE Connect’ for UK BEIS. Schlumberger Delfi for ConocoPhillips. Inatech and cQuant.io partner on risk management. Implico digitizes TanQuid terminal.

Ecopetrol, Accenture and AWS have teamed on a new water intelligence and management solution (WIMS) for the energy industry. The ‘open platform’ covers the water lifecycle from access to treatment, recycle/reuse and disposal and is said to enable water neutrality and ‘net-zero carbon emissions’. WIMS embeds Accenture’s industry insights and cloud capabilities from AWS, including high-performance computing, storage, machine learning and AI. The solution targets Ecopetrol’s water usage and disposal across E&P and refining.

Aker BP and Accenture are teaming on a ‘data factory in the cloud’ to improve oil and gas operations. The project is a component of Aker BP’s digital transformation based on Cognite Data Fusion. Accenture was selected by Aker BP to develop the solution. Other project goals include ‘exploring’ OSDU data standards and formats for wells and seismic data.

India’s Numaligarh Refinery is to deploy AspenTech’s software portfolio in pursuit of ‘operational excellence’. The deal includes Aspen HYSYS, InfoPlus.21, Tank Operations and more. The announcement follows ‘more than a decade of collaboration between both companies’.

Baker Hughes, C3AI, Accenture and Microsoft have teamed on industrial asset management solutions for the energy and industrial sectors. The collaboration focuses on Baker Hughes IAM solutions to optimize plant equipment, processes and operations. Net zero/decarbonizing is also an objective. The teaming follows an earlier announcement of the BakerHughesC3.ai alliance for oil and gas and industrial applications.

In a separate announcement, C3 AI reports that Shell now leverages its AI-based predictive maintenance toolkit across some 10,000 equipment items in global upstream, manufacturing and integrated gas assets. The system ingests some 20 billion rows of data weekly from more than 3 million sensors to train and run 11,000 machine learning models making ‘over 15 million predictions per day’. Shell is also commercializing its applications which are now available through the Open Energy AI initiative.

PBF Holding Company, a subsidiary of refiner PBF Energy, is to deploy the ClearDox Spectrum intelligent document processing platform to automate data reconciliation for broker confirmation statements and counterparty contracts.

Data protection specialist Enveil has teamed with ocean data-as-a-service specialist Terradepth to ensure secure and private data usage and access. The capability is said to transform ocean data usage for sensitive business and mission applications including secure maritime domain awareness and mission planning. The deal adds Enveil’s ZeroReveal technology to Terradepth’s Absolute Ocean platform, targeting users in numerous industries, including oil and gas.

Reacting to new Texas state legislation for tank farm storage safety, SB900, HUVR and Crochet Midstream Consulting have joined forces to deliver an industry focused, fit-for-purpose solution to meet aboveground storage tank asset integrity inspection challenges. CMC president Earl Crochet described the current situation as a ‘perfect storm’ as ‘industry veterans are exiting and taking their hard-earned tank inspection knowledge with them [while] new inspection technologies and regulations enter the scene’. HUVR’s cloud-based IDMP (inspection data management platform) aggregates inspection data from sensors, robots and field technicians. More from HUVR and Earl Crochet a.k.a. ‘The Tank Whisperer’.

Halliburton Landmark is to deliver its iEnergy Stack of ‘on-premise cloud-based’ E&P interpretation solutions to Petrobel, an ENI/Egyptian General Petroleum Corp joint venture. The iEnergy Stack includes DecisionSpace software along with third party applications. The ‘private cloud infrastructure’ is said to be a first step in Petrobel’s digital transformation and addresses its data residency requirements.

Peruvian integrated oil company Petroperú has implemented Honeywell Forge workforce competency solutions to train its industrial workforce. The deployment is a component of Petroperú’s digital transformation, part of the refinery modernization megaproject at its century old Talara Refinery. The solution includes dynamic training simulators for four new units at Talara along with a training and certification program for operators and supervisors.

BEIS, the UK government’s Department of Business, Energy and Industrial Strategy is to deploy Intercontinental Exchange’s ‘ICE Connect’ desktop platform for analysis of the UK’s Emissions Trading Scheme (ETS) markets. ICE Connect aggregates cross-asset real-time data, news and analytics from global markets, helping users manage price and currency risks and streamline workflows. BEIS will also have access to data for UK and European utility markets, including ICE’s benchmark natural gas, power and environmental products. Last year ICE was appointed by BEIS to host emissions auctions for the UK ETS. Some 18 billion tons of carbon allowances were traded on ICE in 2021, equivalent to ‘over half the world’s estimated total annual energy-related emissions footprint’. More from ICE.

ConocoPhillips is to deploy Schlumberger’s Delfi cloud-based environment for reservoir engineering modeling, data and workflows. ConocoPhillips reservoir engineers will have access to cloud-based, high-performance computing resources along with Petrel, Intersect and Eclipse.

Inatech has teamed with cQuant.io to add risk analytics to its Techoil energy trading and distribution system. The integration adds risk analysis functionality such as risk reduction value and stress testing. Both companies’ solutions are cloud-native, multi-tenant SaaS offerings. More from Inatech.

Downstream software specialist Implico has completed a ‘visionary’ digitalization project at TanQuid’s Duisburg, Germany tank terminal. The facility is now running Implico’s process-oriented terminal management system OpenTAS 6.0 with additional cloud services from the ‘Supply Chain United’. The Duisburg tank farm comprises 118 tanks holding a wide variety of chemical, petrochemical, and other products. Some 3,500 shipments per month take place in Duisburg under control of OpenTAS. More from TanQuid and Implico.


Standards stuff …

IOGP publishes Environmental data collection guide. New EU standardization strategy for twin ‘green and digital’ transitions. The US NIEM transitions to OASIS. NIST publishes AI Risk Management Framework. OGC updates ‘Role of standards in geospatial information management’. Digital twin capabilities periodic table. XBRL rolls-out units for ‘consistent climate and energy reporting’. PPDM’s strategy review.

IOGP Report 2021eu, ‘Environmental data collection user guide (2021 data) – Definitions and exclusions’ covers the collection, collation and reporting of upstream environmental information, a ‘central part of the IOGP work program since 1998’. Member companies submit data relating to their exploration and production activities on an annual basis which is rolled-up into the annual environmental performance indicators report. IOGP performance data can be accessed via the dedicated data site.

A new EU standardization strategy has been announced addressing the ambitions of the twin ‘green and digital’ transitions. The document also describes the implementation of policies including the digital single market, internal markets for renewables, natural gas and hydrogen, energy efficiency and climate.

The US NIEM (national information exchange model) is transitioning to become an Open Project Standard under the OASIS standards body. NIEM is a common vocabulary for information exchange between diverse public and private organizations. More from NIEM.

NIST, the US National Institute of Standards and Technology has published a first draft of its AI Risk Management Framework. The AI RMF addresses risks in the design, development, use, and evaluation of AI products, services, and systems. AI RMF 1.0 is planned for release January 2023.

OGC, the Open Geospatial Consortium has just published the 3rd edition of its Guide to the role of standards in geospatial information management. The Guide emanated from a meeting of geospatial standards bodies (ISO/TC 211, IHO and OGC) and has been endorsed by the UN Committee of Experts on Global Geospatial Information Management (UN-GGIM). The Guide provides recommendations on the open international standards and good practices necessary to ensure that geospatial data and technologies can be shared and used. The latest edition aligns with the UN Integrated Geospatial Information Framework (IGIF). Download the Guide.

The Digital Twin Consortium, the self-styled ‘authority in digital twin’ has rolled out the digital twin capabilities periodic table (CPT), a framework that can be used to ‘design, develop, deploy, and operate composable digital twins’. The CPT ‘clusters capabilities around common characteristics using a periodic-table approach’. Organizations can use the CPT to assess digital twin capabilities and to analyze vendor solutions. Pieter van Schalkwyk, who co-chairs the DTC’s Natural Resources Work Group said, ‘Organizations can use the Digital Twin CPT framework in the boardroom to explain the business case for a digital twin project’. The DTC is a unit of the Object Management Group. More on the CPT.

The XBRL standards board has announced availability of new units for consistent climate and energy reporting, an update to the XBRL Unit Registry. These include new units for measuring greenhouse gas emissions, and a range of new physical units for reporting in the energy sector. XBRL reporting now extends beyond the financial field and the new units ‘reflect the increasing variety of ways in which XBRL is being used today’. More from XBRL.

PPDM is undertaking a strategy review in the light of the evolving energy landscape. At issue is a possible ‘pivot’ from the current oil and gas focus to a broader ‘energy’ standards body. A name change is also under consideration. A Strategy Position 2022-2027 document is under preparation for discussion at the upcoming PPDM Houston Data Expo.


Back to school …

AESP to reskill energy professionals. Index AR supports Center for Energy Workforce Development. Bentley Education for future infrastructure professionals. CGE Risk e-Learning for IncidentXP. Future Skills Centre EDGE UP trains displaced oil and gas professionals. The IOGP Guidance for subsea source control competency. ISN/RelyOn Nutec train oil and gas workers. LOGIIC selects ISA for cyber certification. NPD DG ‘the petroleum industry represents jobs for the future’. NSF announces Data Science Corps program. Fugro and U Houston collaborate on ‘Data Science for the Energy Transition’.

The US DOE Office of Energy Efficiency and Renewable Energy has selected AESP, the Association of Energy Services Professionals, to provide a three-year eLearning program to reskill some 50,000 energy professionals. The free and open program involves ‘virtual reality and AI-driven instruction’ in new grid-interactive energy technologies, a.k.a. demand flexible loads. More from AESP.

Index AR Solutions is to supply the Center for Energy Workforce Development with its interactive eBooks and mobile apps. These will help deliver interactive digital training programs to build a ‘skilled and diverse workforce pipeline’. CEWD is a non-profit consortium of electric, natural gas, nuclear, and renewable energy companies committed to the development of a skilled, diverse energy workforce. Index AR’s Apprentice Program solutions are used to create multimodal digital curriculums that streamline and replace legacy training materials. More from Index AR.

Bentley Systems has announced the Bentley Education program to encourage the development of future infrastructure professionals for careers in engineering, design, and architecture. The program includes no-cost learning licenses for Bentley infrastructure engineering applications and teaching from the Bentley Education portal. The program is open to students and educators at community colleges, technical institutes, polytechnics, universities, secondary schools, and homeschooled students.

CGE Risk is now providing an e-Learning course for users of its IncidentXP software. This includes two modules, an introduction to learning from incidents and introduction to IncidentXP. More from CGE.

Future Skills Centre is to invest over $5.4 million in an upskilling program with Calgary Economic Development. The EDGE UP* 2.0 program provides training for 320 displaced oil and gas professionals for careers in tech. The program targets professionals displaced from the structural change in the oil and gas sector. Students are trained for in-demand information technology jobs including data analysts, full-stack software developers, information technology project managers, cybersecurity analysts, UI/UX designers, digital marketing and more.

*Energy to Digital Growth Education and Upskilling Project

The IOGP has published a report providing Guidance for Subsea Source Control Competency and Skills, a.k.a. IOGP-IPIECA Report 591. The report targets response organization leaders and builds on an earlier IOGP Report N° 594, ‘Source control emergency response planning guide for subsea wells’.

ISN, a provider of contractor and supplier information management, and RelyOn Nutec have partnered to provide digital training solutions to the oil and gas industry. ISN can now provide its members with RelyOn Nutec’s safety, training and competency eLearning courses worldwide. RelyOn Nutec provides safety, drilling, fire and technical training for the oil and gas sector globally. ISNetworld, ISN’s online platform of data-driven products and services, helps its customers manage risk and strengthen relationships. More from RelyOn and ISN.

In 2019 the LOGIIC* organization conducted a study to identify training and certification programs that could be delivered to staff located in remote locations throughout the world. Eight ICS cybersecurity training providers were involved in the exercise but ISA was ‘the only company that had training available in an online format’. More from ISA.

* Linking the Oil and Gas Industry to Improve Cybersecurity

In an interview with Norway’s Stavanger Aftenblad newspaper, Ingrid Sølvberg, director general of NPD, the Norwegian Petroleum Directorate, bemoaned the low number of applicants for petroleum-oriented studies. Sølvberg opined that the petroleum industry ‘represents jobs for the future’. ‘Oil and gas will be needed for decades, although their share of the energy mix will decline. And these revenues, expertise and technology will ease our transition into the low-emission society. Many of the new industries are based on expertise and technology developed in the petroleum industry’. Examples of the new industries include offshore wind, CCS, offshore mineral extraction and hydrogen production from natural gas. ‘We will continue to develop the Norwegian shelf and create value from petroleum and new, emerging profitable industries. We need motivated and skilled professionals to do this. And we need diverse expertise.’ More from NPD.

A new program from the US National Science Foundation is to expand data science education pathways. The Data Science Corps program, or DSC (a component of NSF’s Harnessing the Data Revolution Big Idea) works with the academic community to bring students and local organizations together to use available data to solve problems. This will address challenges in the community and ‘transform data science education’. One example of the new approach is Purdue University’s ‘Data Mine’, a university-wide community that teaches data science to participating undergraduates from all majors. The HDR DSC National Data Mine Network (awarded to the American Statistical Association) expands this effort to train a cohort of 300 students at dozens of partner institutions across the nation. More from NSF.

Fugro has joined the University of Houston on a collaborative project to advance geo-data science skills in the energy sector. The ‘Data Science for the Energy Transition’ project is funded through a 3-year grant from the National Science Foundation and will offer undergraduate and master’s students training in statistical and machine learning techniques for subsurface geo-data. Fugro’s role as an industry partner on the project is to provide UH with real-world data and guidance on its use in hands-on training. More from Fugro.


Pipeline tech? Communications, communications, communications

ABB/TANAP present on southern gas corridor, the silk road of energy. Ovarro RTUs monitor CNPC’s China-Russia crude pipeline.

If you had to say what technology was key to pipeline operations you could choose from scada/process control, AI, GIS, digital twins, and more. But recent announcements and presentations have led us to believe that the key to pipeline operations is … communications.

Speaking at an ABB webinar last year, Ömer Korkmaz and Yakup Yilmaz (TANAP) presented the building of the Trans-Anatolian Natural Gas Pipeline, a 56” line that takes Azeri gas to Europe along the Southern Gas Corridor, a.k.a. the ‘silk road of energy’. The 1,850km line provides some 5% of Europe’s energy needs, is said to be ‘one of the longest gas pipelines in the world’ and includes ‘one of the largest ever integrated telecoms, security and scada systems’, illustrating the benefits that modern communications bring to pipeline security and control. The EU considers TANAP a ‘Project of Common Interest’, benefitting from public funds because of its ‘contribution to the EU climate goals’. Natural gas is considered a bridging fuel.

Building the pipeline involved traversing ‘exciting but challenging’ terrain with elevation from -65m subsea to 2,760m asl. There were 115,000 landowners involved and some 230,000 documents held at the control center. Three scada systems cover pipeline operations, leak detection and intrusion monitoring, all linked with dual fiber optic cables along with VSAT backup.

Ian Holden (ABB) explained how the fiber comms cable doubles as an intrusion sensor, monitoring the whole pipeline with distributed acoustic sensing. This, along with CCTV and a ‘huge amount’ of pipeline data, feeds into the control room. High bandwidth communications have avoided the need for a control system at each station: in TANAP, the control system and scada are one and the same. This has ‘saved countless hours of integration and testing’. Holden recommends, ‘If fiber is available, consider a single control system’.

~

In a separate announcement, Ovarro reported the supply of 22 of its TBox-MS modular remote telemetry units to PetroChina Pipeline Co.’s China-Russia crude oil pipeline. A second, 800km pipeline has brought crude oil flows from Russia to China to 30 million tonnes per year. The units, which have functioned in temperatures of -43°C, provide monitoring and control of data and events on the pipeline while also reducing maintenance and repair costs. RTUs address the issue of operating a pipeline over a large network of remote fixed assets. In the event of a communications failure, the RTUs’ data loggers ensure that ‘critical data from the field is not missed’.
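The store-and-forward behavior that lets an RTU ride out a comms outage can be sketched in a few lines. This is a minimal, illustrative sketch only, not Ovarro’s TBox-MS implementation: the `RtuDataLogger` class and the `send` callback are hypothetical names, and a real unit would also persist the buffer to non-volatile storage.

```python
import time
from collections import deque

class RtuDataLogger:
    """Buffers field readings locally; flushes them once the uplink recovers."""

    def __init__(self, capacity=10_000):
        # Bounded buffer: if comms stay down long enough, oldest records drop first.
        self.buffer = deque(maxlen=capacity)

    def log(self, tag, value):
        """Record a timestamped reading regardless of link state."""
        self.buffer.append((time.time(), tag, value))

    def flush(self, send):
        """Transmit buffered records oldest-first via the send() callback.

        A record is discarded only after send() confirms delivery, so a
        mid-flush comms failure leaves the remaining data intact.
        """
        while self.buffer:
            if not send(self.buffer[0]):  # link still down, retry later
                break
            self.buffer.popleft()         # delivered, safe to discard
```

During an outage `log()` simply keeps accumulating readings; a periodic `flush()` drains the backlog to the scada master as soon as the link returns.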


© 1996-2021 The Data Room SARL All rights reserved. Web user only - no LAN/WAN Intranet use allowed. Contact.