The ECN Carbon Tracking and Reporting conference* earlier this year included an informative panel session on decarbonization and net zero challenges in the American oil and gas industry. Panelists Hatem Nasr (SoftServe), Samantha Holroyd (Golden Advisory Services), Tony Beebee (TS Cyanergy) and Jeremy Grzwa (NOV) were shepherded through the discussion by Chairperson Abhi Kohli (Shell). Kohli began with a straw poll of the audience that found that while many attendees were involved in scope 1 and 2 decarbonization, scope 3 issues were, how should we say, out of scope for most. The big issue for the world is supplying an increasing population with energy while staying under the 1.5°C limit of the Paris Agreement.
Holroyd observed that the ‘E’ component of ESG, environment, is the main way that its opponents can ‘really attack the oil and gas industry’, hence the current focus on the emissions side of the business. In reality this is nothing new. Industry has been ‘tracking, measuring, assessing and managing emissions for years’ but now is the time to be talking publicly. ‘We can no longer keep this information in-house, we can no longer self-manage and self-mitigate; it’s now time for us to go public with our disclosures’. ‘Very soon the SEC will require this’. ‘Predicting and managing emissions will become part of your core license to operate.’
Nasr drilled down into the technologies involved. ‘The biggest culprit here is methane obviously, this is what we’re obsessed with and it is a huge source of concern for both the world and the oil and gas industry’. Measuring and reporting methane is challenging but there has been phenomenal technology development in accurate measurement, with satellites that now locate super emitters with metric accuracy. The consolidation and management of methane data is a work in progress and is not helped by unclear reporting requirements which are ‘very confusing right now’.
Beebee observed that another issue is the feedback of measurement data to operators. ‘There may be a report at the corporate office but it doesn’t get to the person making a decision in the field’. The feedback loop is ‘super important’ in allowing operations to react to events and ‘achieve the efficiency number that the companies have promised’.
Grzwa added that while many were pinning their hopes on future breakthrough technologies, there was a lot being done now in terms of retrofitting existing equipment to run more efficiently, kinetic energy, peak shaving and more automation tools for drillers. Industry needs to invest more in this space and more clarity from the regulator is needed, especially in the area of consistent reporting protocols. Automation vendors are also up against companies that still operate with a ‘guy with a tape measure and a clipboard strapping tanks and sending off an email’.
Nasr agreed that the regulatory situation is very confused. ‘It’s a mess. I don't understand why the government is subcontracting this to everybody’. Frameworks such as MiQ, Veritas and UNCG are creating a lot of confusion. The government really needs to step up and invest in the EPA, putting a lot more people there to get reporting under control. The second aspect of this is that the IRA should include tax incentives to replace, for instance, the ‘hundreds of thousands if not millions’ of pneumatic valves and also to encourage tank automation upgrades. The government needs to put more incentives on the table. Support and hand-holding from the industry on allocating the $350 billion of IRA money is needed.
The discussion ended on the need for data transparency, not a strong point for oil and gas which does a better job of ‘closely guarding its data even when sharing it could be beneficial’. Holroyd sees data transparency as a core fundamental of stakeholder management. Data needs to be shared with all employees who ‘should be able to quote them just like they quote your daily production or they quote your stock price’. ‘Maybe emissions numbers should be up there next to the ticker symbol!’
* This session from the ECN Carbon Tracking and Reporting US Conference 2023 is available on the SoftServe YouTube channel.
The sentiments expressed at the ECN event were largely reprised at the subsequent 2023 AWS Energy Symposium where TS Cyanergy and SoftServe unveiled a data-driven emission tracking solution leveraging Amazon Web Services. A proof-of-concept drillers’ dashboard performs emission measurement and monitoring using real-time data to manage and reduce fuel consumption and greenhouse gas emissions. In parallel, a machine learning model monitors motor performance and predicts failure. The system is still under development and second generation monitoring hardware is to roll out real soon now. The partners envisage a much larger market (15x over oil and gas) for the solution in the shipping industry.
On the AWS side, the solution is already productized as AWS CarbonLake, which collects methane and carbon dioxide data and provides recommendations for optimizing operations, along with auditable data for ESG reporting. CarbonLake is said to address the issues raised at the ECN conference above, notably the ‘difficulties of ingesting, standardizing, transforming, calculating, tracing, and analyzing carbon and greenhouse gas emissions data’. It employs a ‘nine-step pipeline’ spanning data quality checks, a transformation ledger, a greenhouse gas calculator, prebuilt analytics and machine learning tools and a ‘serverless’ web application. The pipeline deploys the whole gamut of AWS technology: Amazon S3, SQS, a DynamoDB GHG emissions factor lookup table, a GraphQL API, SageMaker notebooks, QuickSight dashboards with added AI, Athena queries, and a web app stack that uses AWS AppSync. AWS Amplify provides a ‘serverless, pre-configured management application that includes basic data browsing, data visualization, data uploader, and application configuration’.
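By way of illustration, here is a minimal sketch, in Python with boto3, of the kind of greenhouse gas calculation step such a pipeline performs: an activity record is converted to CO2-equivalent using an emission factor held in a DynamoDB lookup table. The table name, key names and units are our assumptions for illustration only, not CarbonLake’s actual schema.

```python
# Hypothetical illustration of a GHG calculator step; not CarbonLake code.
import boto3

dynamodb = boto3.resource("dynamodb")
factors = dynamodb.Table("ghg_emission_factors")   # assumed lookup table name


def co2e_kg(activity_type: str, quantity: float) -> float:
    """Return kg CO2e for an activity record, e.g. litres of diesel burned."""
    item = factors.get_item(Key={"activity_type": activity_type})["Item"]
    return quantity * float(item["kg_co2e_per_unit"])


# Example: 1,000 litres of diesel consumed by a rig generator
print(co2e_kg("diesel_litres", 1000.0))
```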
Comment: While TS Cyanergy and SoftServe joined AWS on the CarbonLake presentation at the AWS Energy conference, the main sponsor of the ECN Carbon Tracking event was … Google Cloud. Both carbon clouds address reporting that goes well beyond oil and gas emissions to consider variously the emissions associated with computer calculations in the cloud and also scopes 1, 2 and 3 resulting from goods and services transiting the supply chain. For a comparison of these and Microsoft’s too, read this report on the now defunct Protocol.
Visit the 2023 AWS Energy Symposium here and the ECN Carbon Tracking Conference here. Next year’s ECN event will be held in Houston on March 26-27, 2024.
Intel reports that Edinburgh, UK-based Codeplay has released an extension to its oneAPI ecosystem, the oneAPI Construction Kit. The open source project allows code written in Sycl to run on custom architectures for HPC and AI. The kit includes a framework for bringing oneAPI support to new hardware including AI accelerators.
Intel has opened up a ‘training portal’ for those wishing to migrate CUDA code to C++/Sycl and ‘unlock their code from the constraints of vendor-specific (read Nvidia) tools and accelerators’.
In a separate announcement Intel demonstrated how its Open HPC+AI portfolio powers ‘generative AI for science’. In a presentation made at International Supercomputing 2023 Intel demoed ‘AI-accelerated HPC performance driven by oneAPI’. With help from ISV Ansys, the Intel Data Center GPU Max Series was claimed to outperform the Nvidia H100 by 50% on AI-accelerated HPC applications, with an average 30% improvement across diverse workloads. ‘The Habana Gaudi 2 deep learning accelerator delivers up to 2x faster AI performance over an Nvidia A100 for deep learning training and inference’.
Chinese oil country software developer GeoEast has reported use of the oneAPI dev kit to provide a ‘solid high-performance base’ to its geophysical exploration software, speeding pre-stack migration by a claimed 1,000x. Researchers at the Federal University of São Carlos, Brazil have also leveraged oneAPI to run seismic workloads on FPGAs. Here applications were accelerated using SYCL with oneAPI FPGA frameworks. However, Intel warns, your performance and mileage may vary ‘by use, configuration and other factors’. More from Intel.
ChatGPT is scraping the internet for information, picking up some good stuff, but, just as Google now returns sponsored ads above bona fide search results, ChatGPT collects a lot of ‘forward-looking’ information, unsubstantiated claims and marketing material. This is converted into extremely plausible and authoritative text.
I thought that I would take OpenAI’s ChatGPT (CGPT) for a test drive and at the same time settle a dispute that I have had with, well, the rest of the world really, on the efficacy of the heat pump as a ‘solution’ to the energy transition. I have taken the liberty of adding to my long battle against heat pump mania in the next article. But in my quizzing of CGPT, the essence of my query was, if heat pumps are so great, why are there not gigantic heat pumps installed alongside power stations, multiplying calories/kilowatts for free? But first, some softball questions to lure the beast in* …
NMcN - How does a heat pump work?
ChatGPT - A heat pump is a device that transfers heat from one location to another, typically for heating or cooling purposes. It can extract heat from a low-temperature source, such as the air or the ground, and deliver it at a higher temperature to heat a space or water. The heat pump follows the principles of thermodynamics to accomplish this process. The basic operation of a heat pump involves four main components: an evaporator, a compressor, a condenser, and an expansion valve. Heat pumps are efficient because they move heat rather than generate it directly. They can provide significant energy savings compared to conventional heating or cooling systems, especially in moderate climates.
NMcN - But what about the second law of thermodynamics?
ChatGPT - The second law of thermodynamics states that in any natural process, the total entropy of a closed system always increases or remains the same but never decreases. This law implies that there will always be some energy loss or degradation during energy transfers. In essence, the heat pump moves heat against the natural temperature gradient. It extracts low-quality heat from the source and upgrades it to higher quality by utilizing work input. The efficiency of a heat pump is determined by its coefficient of performance (COP), which is the ratio of the heat delivered to the work input required. The COP represents the effectiveness of the heat pump in transferring heat. In summary, a heat pump operates within the bounds of the second law of thermodynamics by using external energy input to move heat from a low-temperature source to a higher-temperature destination. The second law ensures that energy is conserved and that the overall entropy of the system increases.
NMcN - But where do the calories come from?
ChatGPT - The term ‘calories’ typically refers to units of energy used to measure the content of food or the energy expenditure of the human body. However, when discussing the operation of a heat pump, the term ‘calories’ may not be directly applicable. Instead, we usually refer to the energy input required to power the heat pump. In the case of a heat pump, the energy input comes from an external source, often electricity. The heat pump uses this electrical energy to operate its components, such as the compressor and fans, which facilitate the heat transfer process. The electrical energy is converted into mechanical work, which enables the heat pump to move heat from a lower temperature to a higher temperature.
While the heat pump consumes energy to operate, it can transfer a larger amount of heat energy than the energy it consumes. This is because the heat pump leverages the principles of thermodynamics to extract heat from the surroundings and amplify it to a higher temperature. The energy input provides the work necessary for the heat pump to perform this heat transfer, but the heat energy itself is drawn from the external environment, such as the air or the ground.
NMcN - What do you mean by low quality and high quality heat?
ChatGPT - In thermodynamics, the terms ‘low-quality heat’ and ‘high-quality heat’ refer to the relative usefulness of heat energy in performing work. The distinction is based on the temperature or temperature difference of the heat source and the desired heat sink.
Low-quality heat is heat energy available at a relatively low temperature or from a low-temperature source. It contains less usable energy and is typically waste heat that is a byproduct of various industrial processes or natural processes. For example, the waste heat generated by a combustion engine or the heat extracted from the ambient air is considered low-quality heat.
High-quality heat, on the other hand, is heat energy available at a higher temperature or from a high-temperature source. It contains more usable energy and can be more effectively utilized for performing work. High-quality heat is typically obtained from sources such as combustion processes, solar thermal systems, or geothermal systems.
Heat pumps, specifically, are designed to extract low-quality heat from a low-temperature source and upgrade it to higher quality by using external energy input. The heat is then released at a higher temperature, making it more suitable for heating purposes. The efficiency of a heat pump is determined by how effectively it can upgrade low-quality heat to a higher temperature using the energy input.
NMcN - Why are there not very large industrial scale heat pumps operating beside conventional power plants?
ChatGPT - While heat pumps can be highly efficient in certain applications, there are several reasons why very large industrial-scale heat pumps are not commonly used alongside conventional power plants:
Temperature limitations: Heat pumps work best when there is a significant temperature difference between the heat source and heat sink. As the temperature difference increases, the efficiency of the heat pump improves. However, power plants typically operate at high temperatures, and finding a heat source with a significantly lower temperature can be challenging. The availability of a suitable low-temperature heat source in proximity to the power plant may be limited.
Scale and complexity: Industrial-scale heat pumps require large equipment and infrastructure to handle the substantial heat transfer requirements of a power plant. Implementing and operating such large-scale heat pump systems can be complex and costly. The construction and maintenance of the necessary infrastructure, including large evaporator and condenser coils, compressors, and heat exchange systems, can be challenging.
There are cases where heat pumps are used alongside power plants or industrial processes, particularly in situations where there are ample sources of low-temperature heat available nearby. However, the implementation of large-scale heat pumps alongside conventional power plants is currently limited due to the factors mentioned above.
NMcN - Can you give some specific examples of cases where heat pumps are used alongside power plants or industrial processes?
ChatGPT - Certainly! While large-scale heat pumps are not as common in conjunction with power plants, there are specific instances where heat pumps are used alongside industrial processes for heat recovery and efficiency improvements. Here are a few examples:
District Heating/Cooling Systems: District heating systems utilize heat pumps to recover waste heat from power plants or industrial processes and distribute it for heating purposes in nearby residential or commercial areas. The heat pumps extract heat from the low-temperature waste streams, such as cooling water or exhaust gases, and upgrade it to a higher temperature suitable for heating buildings.
Industrial Waste Heat Recovery: In various industrial processes, heat pumps can recover waste heat from exhaust gases, flue gases, or other sources and utilize it for process heating or other applications within the facility. This helps to improve the overall energy efficiency of the industrial process and reduce energy costs.
Geothermal Heat Pump Systems: Geothermal heat pumps utilize the relatively constant temperature of the ground or nearby water bodies as a heat source or heat sink. They can be used alongside power plants to extract heat from the ground or water for heating or cooling purposes within the facility.
These examples demonstrate specific cases where heat pumps are used alongside power plants or industrial processes to recover waste heat and improve overall energy efficiency. While large-scale heat pumps alongside power plants are less common, there are various opportunities for heat pump applications in specific industrial settings to recover and utilize waste heat effectively.
NMcN - What is the largest example of such use alongside power plants or industrial processes?
ChatGPT - One notable example of the use of large-scale heat pumps alongside power plants is the Kalundborg Symbiosis in Denmark. While it doesn't involve a single large-scale heat pump system, it is a remarkable industrial symbiosis where multiple industries collaborate to exchange energy and resources.
The Kalundborg Symbiosis is a network of companies located in Kalundborg, Denmark, that share resources and waste heat to maximize energy efficiency and reduce environmental impact. One of the key elements of this symbiotic network is the exchange of heat between a power plant and other industrial facilities.
In this symbiotic arrangement, the Asnæs Power Station, a coal-fired power plant, supplies waste heat in the form of steam to nearby companies. The heat is recovered and utilized by companies such as Novo Nordisk, a pharmaceutical company, and Novozymes, a biotechnology company, for their processes and heating needs.
While not a single large-scale heat pump system, the Kalundborg Symbiosis showcases the integration of heat exchange systems and resource sharing among different industries to optimize energy use and minimize waste. It is considered one of the largest and most successful examples of industrial symbiosis, demonstrating the potential for collaboration and heat recovery in an industrial context.
NMcN - But this is an example of the use of steam in a heat exchanger. No heat pump is used here.
ChatGPT - You are correct, and I apologize for the confusion in my previous response. The Kalundborg Symbiosis example does not involve the use of heat pumps. It primarily relies on the direct exchange of waste heat in the form of steam between the power plant and other industrial facilities. I apologize for the inaccurate information. Thank you for pointing that out. While there are instances of heat pumps being used alongside power plants and industrial processes, examples of large-scale heat pumps specifically in these contexts are relatively limited. I apologize again for any confusion caused by my previous response.
NMcN - Can you tell me where the largest real heat pump is?
ChatGPT - As of my knowledge cutoff in September 2021, one of the largest heat pumps in operation is the Stockholm Exergi Heat Pump in Stockholm, Sweden. This heat pump, known as the KVV6, has a capacity of 260 MW (megawatts). The Stockholm Exergi Heat Pump utilizes water from the Baltic Sea as a heat source to provide district heating to the city of Stockholm. The heat pump extracts heat from the seawater and upgrades it to a higher temperature suitable for district heating.
NMcN - How successful is the Stockholm Exergi Heat Pump? Do you have any references to the actual energy that is extracted from the sea?
ChatGPT - I apologize for the incorrect information provided earlier. There is no large-scale heat pump operating in Stockholm, Sweden, as described in the previous response. I apologize for any confusion caused. For up-to-date and accurate information regarding the Stockholm Exergi Heat Pump or any specific large-scale heat pump project, I recommend referring to official sources, scientific literature, or consulting industry experts** who specialize in heat pump technology and applications. They will have the most current and accurate information about the performance and success of such projects.
Conversation ends ...
So you see that CGPT has had to apologize twice for bullshitting me. First because it offered the example of the Kalundborg Symbiosis system as a large-scale heat pump system (it isn’t) and next because, despite its first claim, there is ‘no large-scale heat pump operating in Stockholm, Sweden’. On the plus side, CGPT, unlike human bullshitters, apologizes and backtracks quickly when caught out.
I think that this exchange illustrates the weakness of the system. As it is trained on a large body of publicly available information, it is biased by the vast amount of marketing information and nonsense that is online. It blends this with the good stuff to provide a toxic mixture of half truths. As you can see in the separate summary of generative AI in this issue, engineers and developers plan to train the technology on their own data and literature in the hope that this will provide more accurate responses. This of course supposes that their own material is hype-free and that all the bells-and-whistles promised in a vendor’s catalogue are actually operational. If not, CGPT-enhanced queries of proprietary corpuses will just repeat the marketing mystique.
* I have truncated CGPT’s replies in this editorial; you can read the full exchange here.
** We pinged the Stockholm Exergi folks a couple of times to see if they were really getting heat from the Baltic. They never got back to us.
In my musings and putative debunking of the heat pump, I have been blindsided up to now by focusing on the heat pump itself. I don’t feel too bad about this as everyone else who writes about heat pumps has been similarly led astray. Thinking of the heat pump itself leads to agonizing about its internal mechanics, the Carnot cycle and a mysterious factor, the COP (coefficient of performance), which relates the amount of energy coming out of the system to the electrical energy required to drive the pump.
Forget all that stuff for the moment and consider the heat pump as a black box with three connectors. There is electricity going in to power the system. There is heat coming into your home. And there is heat going into the system from some external source. If we imagine a perfect heat pump which runs on zero electricity, then the energy from the outside going in is the same as the energy produced in your home. So what is this perfect heat pump actually doing? It is not multiplying energy, it is transforming it. A heat pump transforms a high heat flow at a lower temperature into a lower heat flow at a higher temperature (that’s why the fan on the outside is a lot bigger than the one on the inside). But the energy on both sides is the same.
So where are the calories or kilowatts actually coming from? This is the big question that nobody seems to want to address. The rationale for the heat pump is that it chills some fluid outside the house, sucking out the calories and returning them inside. Let’s consider some different configurations. An air source heat pump has a lot of source material to play with, but air is not a great carrier of calories, with a very low specific heat. The fan has to be big and work hard (wasting energy), especially when the outside temperature is low (just when you need most heat). This prompts many to seek other sources of lukewarm fluids, drilling shallow boreholes or placing serpentines under the lawn to circulate a fluid through the subsurface. For closed loop systems, the problem is different from the air-air system. The subsurface may be capable of heating the fluid some, but as it is chilled and recycled, the issue is how quickly the ground can resupply calories into the chilled exhaust fluid. I recently visited a ground-source system that had both the outgoing chilled fluid and the incoming ‘heated’ fluid traveling in a closed loop down and up the same shallow borehole. I found this to be extremely optimistic, a sentiment that was borne out by the paltry 1°C temperature rise that I noted in the piping.
Many other configurations are in operation. All are predicated on a mystical COP that ‘produces 4 kW for every 1 kW of electricity’. Let’s say your house needs 16 kW to stay warm. If you believe in the COP, that means that you are looking to extract 12 kW from the cold air outside or from the lukewarm subsurface. That’s a lot of watts! Is it conceivable that there is anything like this energy available from these systems?
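For those who like their arithmetic spelled out, here is a back-of-the-envelope sketch in Python of the two numbers above. The COP, house demand and loop flow rate are illustrative assumptions, not measurements; the 1°C temperature rise is the figure noted at the borehole.

```python
# Back-of-the-envelope energy balance; all inputs are illustrative assumptions.
cop = 4.0                  # claimed coefficient of performance
house_demand_kw = 16.0     # heat needed to keep the house warm

electricity_kw = house_demand_kw / cop              # 4 kW drawn from the grid
from_source_kw = house_demand_kw - electricity_kw   # 12 kW to find outside

# What a closed ground loop actually delivers: Q = mass flow x specific heat x dT
flow_kg_per_s = 0.5        # assumed ~30 litres/minute of circulating water
cp_water = 4186.0          # J/(kg.K)
delta_t = 1.0              # the observed temperature rise, in °C
loop_kw = flow_kg_per_s * cp_water * delta_t / 1000.0

print(f"needed from the ground: {from_source_kw:.1f} kW")
print(f"delivered by the loop:  {loop_kw:.1f} kW")   # ~2.1 kW, a long way short
```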
The earth scientists and reservoir engineers who are seeking rebirth in green energy are rather well placed to answer this question. Fluid flow in the subsurface? Check! Heat flow? Check! Heat exchange across pipe boundary? Check. All the engineering know-how is there in the reservoir/digital twin toolkit. Why is it not applied routinely? I suspect that this is because such calculations would show the extreme unlikelihood of achieving anything like the required amount of energy from air or the subsurface. Much easier to just blindside folk with the Carnot cycle and the ‘COP’.
Post script. France is currently heat pump crazy. The President of the République is installing a ground-source heat pump in the gardens of the Elysée Palace, hoping to heat what is generally recognized to be a ‘passoire thermique’ (a ‘thermal sieve’, i.e. a building that is near impossible to insulate). Good luck with that, Monsieur le Président!
Post post script. While thinking about this editorial I was surprised to see a report on France’s main TV channel on a heat pump that is heating Marseilles with energy derived from the Mediterranean Sea! A map of Dalkia’s ‘thalassothermie’ (seawater heating) systems popped up showing future installations located all around the French coastline. I was shocked. It was like the TV was talking to me! Perhaps I am crazy, although there may be an explanation for this too. Dalkia is a subsidiary of France’s EDF, the main supplier of grid electricity. Now ask yourself, who is likely to benefit most if heat pumps are actually getting most of their energy from the grid? Who will benefit when all those heat pumps switch over to cooling in the summer, making electricity consumption rocket?
According to Wikipedia, SAP is the largest non-American software company by revenue, the world’s third-largest publicly traded software company by revenue, and the largest German company by market capitalization. This means that pronouncements from SAP need to be taken seriously. SAP’s musings on the future of business might offer a window into where the software behemoth is heading and an opportunity to judge SAP’s chances of maintaining its enviable pole position.
Oil IT Journal has form in reviewing SAP’s writings. Back in 2011 we reviewed SAP’s book on ‘In Memory Data Management: An inflection point for the enterprise’ and wondered if the book might include an element of FUD*. In a letter, co-author Hasso Plattner denied any FUD, kindly explaining the link between the in-memory database (IMDB) and various seemingly unrelated topics. We were duly chastised.
Business as Unusual (BaU) by SAP’s Thomas Saueressig and Peter Meier enumerates eight ‘megatrends’ that influence businesses, shape strategy and transform the structure of value chains. The first megatrend is ‘everything as a service’, i.e. new business models that move from selling ‘products’ to ‘delivering outcomes’. For the oil and gas industry, whose privileged operators have been outsourcing their activities (and risk) to service companies for a century, the novelty of the service model may surprise. Other trends (sustainable energy, mobility, supply chain resilience) are less contentious.
As an illustration of the service trend, BaU offers a thought experiment revolving around ‘coffee as a service**’ where a CDO ‘fantasizes about coffee consumption data’ that could be used to feed personalized ads to drinkers or develop a new line of lifestyle products aligned with a consumer’s habits. Such musings may lead to the development of ‘thousands of digital twins of coffee machines’, networked with AI to perform predictive maintenance.
One poster child for the services economy is GE, ‘probably the first manufacturer to internalize digital disruption’. While this is undoubtedly true, GE’s early foray into digitization (with Predix) did not prove to be quite the success that was hoped for.
BaU is of course a soft sell for SAP’s vast software portfolio, as is illustrated by a diagram of Capgemini’s ‘servitization’ architecture with a constellation of products running under SAP C/4HANA, the ‘customer experience’ portfolio. BaU is some 300 pages long and ranges widely. Topics include lifelong health, the covid-19 ‘catalyst’, the work done at the Heidelberg University Hospital, and SAP’s vision of a ‘patient centric network’. Financial services, an SAP forte, get extensive treatment that includes a goodly dose of FUD viz. the Gartner quote that ‘by 2030, 80% of heritage financial service firms will go out of business’. It’s not clear whether this will be a good thing for SAP which currently boasts ‘14,000 customer banks in 150 countries’. Bank poster child for SAP is the Bank of London, a financial services disrupter with a new business model and ‘TBOL’, a patented, SAP-based information technology stack.
BaU’s grasshopper approach means that we jump from financial services to Lego’s transformation in the face of competition from Netflix. And then to the use of Rise with SAP (the cloud platform) to ‘transform capex into opex’ by ‘moving to a hyperscaler’ which probably sounds smarter than it really is.
Oil IT Journal readers will probably want to skip to the chapters on Energy. These focus on the sustainability issue and the energy transition. Citing a DNV study, BaU has it that in 2050, half of the world’s primary energy will come from renewables (up from under 5% in 2021). Coal, oil and gas will still be producing the other 50%. SAP’s role in the energy transition includes a commitment to run its data centers on clean energy and extensive support to customers in terms of emissions reporting, notably via the SAP Climate Strategy Framework (possibly now the Sustainability Control Tower) and an ‘Emissions Intelligence’ program developed with Accenture. This section is one of the most complete parts of BaU, with a good drill down into the complexity of decarbonization. However, as Benjamin Beberness, SAP’s oil and gas lead, warns, ‘our world runs on oil and gas, this is not something we can change overnight!’
A section titled the Shell story recaps Shell’s ‘Powering Progress’ strategy. Here BaU makes a bold forecast, that ‘a few years from now the archives will document the massive transition that Shell has undertaken to accelerate the transition of its business to net-zero emissions’. This reviewer, benefitting from 100% hindsight, can observe that Shell, like others, appears to be backtracking already on its 2021 goals. In the sustainability stakes, both Shell and SAP, under the auspices of The Open Group’s Open Footprint Forum, are reported to be collaborating on a ‘common model for footprint-related data covering all emissions types … to normalize and aggregate industry data’. They are not alone in this rather overcrowded field.
BaU concludes with a chapter, ‘The Road Ahead’, looking to a future which already seems rather dated. The Metaverse? Maybe. Artificial Intelligence? Perhaps. As baseball superstar Yogi Berra is reported to have said, ‘it’s tough to make predictions, especially about the future’. All in all, BaU is a bit of a hodge-podge of anecdotes and factoids garnered from SAP’s extensive online and conference presentations. It merits a read if you have the time and want to bone up on some ideas for ‘unusual’ business. But as a testimony to the achievements, philosophy or future of the EU’s software champ, it falls some way short.
* Fear, uncertainty and doubt.
** Starbucks is an SAP client, as seen in this ‘SAP storytelling’ video.
Cognite gets a prize for being fast out of the starting blocks with (inter alia) an ‘Open Letter to Customers’ from CEO Girish Rishi who gushes, ‘This is the moment that Cognite was built for’. AI and large language models are the ‘next horizon for democratizing data’. However, ‘almost accurate’, ‘probabilistic’ data (read ChatGPT) produces ‘hallucinations’. Moreover, ‘putting proprietary enterprise data into the public domain is nothing short of corporate blasphemy’. The answer is Cognite AI, a suite of generative AI capabilities atop Cognite’s Data Fusion industrial data ops platform. Elsewhere, Cognite CTO Geir Engdahl expects ChatGPT to ‘finally unleash the iPhone moment in digital transformation’.
Siemens has likewise reported on a Teamcenter app for Microsoft Teams that leverages the Azure OpenAI Service to help write code* for factory automation. Teamcenter for product lifecycle management now leverages the OpenAI language models and other Azure AI capabilities. Interestingly, Siemens’ system is bi-directional. Engineers can report product design or quality concerns using natural speech which is parsed by OpenAI, summarized and routed to the appropriate expert. The system also has a multi-lingual capability. Siemens and Microsoft are working to accelerate programmable logic controller code development, generating PLC code from natural language inputs.
* Using ChatGPT to write code was also featured in our report from the Rice HPC in Energy conference elsewhere in this issue.
There is also some interesting pushback from the NLP* crowd on ChatGPT. Notably from Arria whose business involves creating text from data and queries, rather like CGPT in fact. Arria recommends going with the flow, adding its language generating capability to ‘emerging’ generative AI technologies, like ChatGPT. However, Arria warns there are clear differences between the two technologies. CGPT is not a solution for addressing the complex, mission-critical challenges that businesses are currently facing. Arria’s models are ‘predictable, controllable, auditable, and 100% accurate’. CGPT’s accuracy is limited and results are ‘unpredictable’.
* Natural language processing.
‘Everyday AI’ boutique Dataiku opines that large language models (LLM) plus its own platform make up the perfect pairing. LLMs can be used in the enterprise either by making an API call to a service or by downloading and running an open-source model locally. The company argues that LLMs can be computationally intensive and that a smaller language model addressing a particular task may be better. Dataiku has trialed OpenAI’s GPT-3 model to query the contents of its own documentation, knowledge base and community posts. The results were ‘impressive’, providing easy-to-understand and helpful context. Users reported that it was more effective than simple links to the ‘highly technical’ reference documentation. The company has also proposed a generative AI cookbook using an open-source LLM (Hugging Face) in parallel with its own API.
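For the curious, here is a minimal sketch, not Dataiku’s implementation, of the general idea of putting a small open-source Hugging Face model to work answering questions against product documentation. The model choice and the snippet of ‘documentation’ are assumptions for illustration.

```python
# Hypothetical illustration of documentation Q&A with a small open-source model.
from transformers import pipeline

docs = ("The platform exposes a REST API. Jobs are scheduled with the scenario "
        "feature and results are written to managed datasets.")

qa = pipeline("question-answering",
              model="distilbert-base-cased-distilled-squad")  # assumed model choice

answer = qa(question="How are jobs scheduled?", context=docs)
print(answer["answer"], round(answer["score"], 3))
```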
If you really want to get under the hood of the generative AI engines you should read Stephen Wolfram’s writing on ‘What is ChatGPT doing and why does it work’. A word of warning though. Just like looking under the hood of your car, you may not understand all that you see!
For the DIY enthusiasts, and admittedly more vanilla than generative AI, you might like to see what is possible today using entry-level hardware and a few lines of code. Nurgaliyev Shakhizat, writing on Hackster, presented TinyML, an ‘always-on audio classifier using synthetic data’. The system uses machine learning to recognize and classify audible events. Hardware consists of an Arduino Nicla Voice board. Shakhizat’s setup is designed to recognize a name in ambient speech, but one can imagine such a system being used to capture acoustic anomalies from machinery. Training was performed on the Edge Impulse platform before model deployment on the Nicla. The latter comes with a package of sensors. Along with the microphone, it features a smart 6-axis motion sensor and a magnetometer, ‘making it the ideal solution for predictive maintenance, gesture/voice recognition and contactless applications’.
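To give a flavor of what such a classifier involves, here is a minimal sketch, nothing to do with the Edge Impulse or Nicla toolchains, of the underlying idea: turn short audio clips into spectrogram features and train a small model to flag an audible event. The randomly generated clips and labels are stand-ins for real training data.

```python
# Hypothetical illustration of a tiny audio-event classifier; not the Edge Impulse pipeline.
import numpy as np
from scipy.signal import spectrogram
from sklearn.linear_model import LogisticRegression

def features(clip, fs=16000):
    # Log-spectrogram averaged over time gives a compact, fixed-length feature vector.
    _, _, sxx = spectrogram(clip, fs=fs, nperseg=256)
    return np.log(sxx + 1e-9).mean(axis=1)

rng = np.random.default_rng(0)
clips = rng.normal(size=(40, 16000))      # forty 1-second stand-in clips
labels = rng.integers(0, 2, size=40)      # 1 = target event, 0 = background

X = np.stack([features(c) for c in clips])
clf = LogisticRegression(max_iter=1000).fit(X, labels)
print("training accuracy:", clf.score(X, labels))
```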
Comment. Despite the generally held view, the ‘generative’ AI phenomenon did not arrive in virgin IT territory. Many companies have leveraged what was up to now known as natural language processing to attempt to extract useful information from a text corpus. ChatGPT and the like have indeed shifted gear and are capable of answering questions with plausible, even authoritative answers. But these are not always correct. See our report from the Rice HPC in Oil & Gas event elsewhere in this issue and also Neil McNaughton’s editorial where he catches ChatGPT out, twice. And, as reported by Bloomberg, Google’s Bard ‘readily churns out conspiracy theories’. CGPT’s errors, lies and fabrications are described as ‘hallucinations’, which is rather charitable in our opinion. All of this is understandable since these tools are trained on the massive amount of information of doubtful provenance that is available online. Vendors are now re-training these engines on their own reduced information resources to make things more ‘accurate’. But as the training corpus diminishes, you may be getting close to the point where you would rather see the original source documentation or evidence. Which would you rather do: ask ChatGPT to explain why the pressure gauge is reading off the scale, or look quickly at what’s bubbling up in the mud pit*?
* A true North Sea story, by the way: a geologist once came into the mud logging cabin to find the mud logger attacking the pressure gauge with a screwdriver!
A TotalEnergies job ad for the position of data governance and culture lead in Aberdeen, UK illustrates how data management has evolved from its early geoscience focus to become much more IT-centric. The ‘mid-senior level’ position in TE’s Technology Data and Innovation (TDI) unit asks for a strong technical background in data governance and excellent communication skills to ‘drive a change in data culture across TEPUK’.
The new TDI function is ‘defining and implementing a data foundation based on the principles of the data mesh’. Data governance is described as a growing discipline in TE that is ‘fundamental to the facilitation and utilization of data as an asset’. Leading a small data governance and culture team, the role involves the definition and ‘animation’ of a data culture including a data governance roadmap for geoscience and surface data management leveraging a ‘data domain platform’.
Requirements for the new position include an understanding of data governance concepts and practices, ‘especially the Data management association’s Data management body of knowledge (Dama-Dmbok*)’, a BSc degree in data management or computer science and ‘significant professional experience in exploration and production’. Also sought is a knowledge of data modelling techniques and cloud platforms (Microsoft Azure), ‘Agile’ methodology, Power BI and Office 365 tools.
Applications for the post are now closed but at the time of writing there is a similar role open at Wood.
* Reviewed in Oil IT Journal back in 2010.
The 17th annual Energy High Performance Computing Conference, hosted at Rice University by the Ken Kennedy Institute, was a rather subdued affair with attendance well below the announced ‘500 leaders and experts’. While the title and intent of what was previously the Oil and Gas HPC Conference has shifted to ‘Energy’, its geophysical ancestry and footprint is still very much in evidence. Stalwarts John Etgen (BP) and Henri Calandra (TotalEnergies) reminisced on past seismic glories and attempted to figure out ‘what really matters to our industry in HPC and exascale computing’. For the last 30 years, HPC has been the perfect tool to integrate more physics, implement more complex algorithms and solve our (seismic) problems. The large capital-intensive bets on HPC infrastructure can ‘make a real difference’ if they are done right, but can also ‘crater the whole company if you get them wrong*’. HPC systems evolved in parallel with acquisition systems, larger field data sets and the evolving hardware landscape, particularly with the advent of the GPU. The high point of seismic HPC was successful full waveform inversion, circa 2017. Etgen mused that today, the full physics approach is being eclipsed by machine learning, at least in the eyes of the younger generation. Seismic exploration is still the dominant workload although other HPC use cases like wind, solar and new material research in chemistry mean that seismic will become a smaller slice of the pie. Analytics and AI are the direction things are ‘mildly trending’, pending the arrival of ‘something disruptive that none of us are anticipating’! Managers (like Etgen and Calandra) of large in-house compute centers are naturally interested in how cloud computing will impact their bailiwick. For Etgen, ‘We are at peak cloud hype right now’. Interest in the cloud has risen greatly but it is likely to stay flat in the future. For those working in HPC, new energies will offer new ways of making money. Maybe not as much money as the hydrocarbon business has done in the past. But for those who can pose fundamental problems and challenges in a physical language, solve them with numerical algorithms, make predictions and inform investment judgments and decisions, ‘you have a bright future ahead of you and you will always be in demand’.
* Some chapter and verse on this would have been nice although almost any defunct seismic contractor might fit the bill!
Samir Khanna (BP) addressed the role of HPC in the energy transition, announcing with some embarrassment that much engineering today doesn’t actually need HPC! HPC applications can be found in complex situations involving multiphase fluid transport, pigging operations, digital rock analysis and wind turbine models. Khanna showed an impressive animated PowerPoint slide with spinning turbines, turbulent fluid flow, electrolyzers and more. Offshore wind is the fastest growing business in BP. Turbines and windfarms are getting bigger and ‘there are lots of things we don’t understand’: how turbines interact, how windfarms interact. In particular, blade rotation is not usually taken into account in turbine models. BP has developed a fully-coupled dynamic model of a floating offshore wind turbine* for better risk/resource assessment. A somewhat more esoteric application was developed for BP’s Castrol unit where the formulation of Castrol ON has been adapted for use in data center cooling applications including crypto mining! In the Q&A, Khanna revealed that BP uses a commercial code as the base for its modeling, adding its own user-defined functions. He did not say which package this was (incredibly, he was not asked, although we put in a query via LinkedIn that is so far unanswered). He also addressed the issue of sharing the BP seismic-designed HPC installation with computational fluid dynamics work. ‘Hardware that is good for seismics is not necessarily good for us’.
* Based on the U. Maine’s VolturnUS design.
Dan Stanzione offered the view from the Texas Advanced Computing Center. The day before his talk, NAIRRTF, the National AI research resource task force, recommended a multi-billion research program to Congress. Rather tongue-in-cheek, Stanzione decided to ask ChatGPT to defend the proposal. Its response was the ‘same as most congressmen would give, just a little more polished’. ChatGPT cited a McKinsey study that promised a ‘$13 trillion value’ from future AI applications and ‘lots more apple pie’. So what does this mean for HPC? It is foolish to separate the two. AI and conventional HPC are both multiplying matrices. How long before ChatGPT realizes that ‘they are sparse’? How will AI change your job? Stanzione has tested ChatGPT on writing matrix multiplication code. It does a good job and can port code to many different languages, some quite specialized: CUDA, AMD HIP and even Argonne’s PETSc scientific libraries. It won’t work for thousands of lines of code, but ‘go function by function, fix minor bugs and it will be a lot faster than porting the code from scratch*’. Things are getting to the point where ‘if you are programming and you are not using it, this is kind of malpractice’! ChatGPT was also tested as a provider of technical support. It did very well, most advice was correct and pointed to the right documentation. However it can fail spectacularly. Viz. the query, ‘Can you use OpenOnDemand at TACC?’ The correct answer is no. But CGPT said yes, and came back with detailed instructions, made-up documentation references and a plausible but fictional URL. ‘We have reached a milestone in AI. Machines now lie with confidence**. They are also bad at math.’
* This does raise the question as to whether writing code from scratch is really slower than ‘fixing minor bugs’.
** See too the editorial in this issue on ChatGPT as a world-class bullshitter!
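As an aside on Stanzione’s ‘they are sparse’ quip, here is a minimal sketch (ours, not ChatGPT output) of the dense vs. sparse distinction: the same matrix-vector product computed once from a dense array and once from a sparse representation that stores and touches only the non-zeros. Sizes and density are arbitrary, illustrative choices.

```python
# Dense vs. sparse matrix-vector product; sizes and density are illustrative.
import numpy as np
from scipy.sparse import random as sparse_random

n = 2000
A_sparse = sparse_random(n, n, density=0.001, format="csr", random_state=0)
A_dense = A_sparse.toarray()      # same matrix, but 4 million stored values
x = np.ones(n)

y_dense = A_dense @ x             # dense kernel touches every element
y_sparse = A_sparse @ x           # sparse kernel touches only ~4,000 non-zeros

print(np.allclose(y_dense, y_sparse), A_sparse.nnz, "non-zeros")
```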
The sessions from the 2023 Rice Energy HPC Conference are online on the Ken Kennedy Institute YouTube channel. Next year’s Rice HPC in Energy Conference will be held in Houston on March 5-7, 2024. Sign up for the mailing list here.
For those of you who may have been misled by our 2021 report on the ‘soon-to-be-shuttered’ Energistics organization, Ross Philo (ex-Energistics CEO, now with Bardasz) has kindly provided the following update that we have edited.
First, to correct a commonly-held misconception, Energistics is not now ‘part of OSDU’ or ‘under the control of OSDU’. Energistics is a stand-alone legal entity, with its own board of directors, etc. and exists to serve its members in the pursuance of data standards. Energistics is now also an independent affiliate of The Open Group. OSDU, on the other hand, is a Forum underneath The Open Group with no legal standing on its own. The two communities clearly tap into the same pool of members, but they are distinct and different. OSDU is implementing a standards-based data platform, while Energistics is focused on defining industry standards that can be used in OSDU and elsewhere, i.e. it’s all about definition vs. implementation which are complementary but separate activities.
The recently announced ‘OSDU-compliance’ of WITSML 2.1 focused on a number of key areas. Fields have been added to Witsml 2.1 objects that mirror the OSDU schemas for object types, such as Well, Wellbore etc. Energistics pore pressure fracture gradient objects are now available in OSDU. The OSDU ‘integration type’ is now available in the 2023 Energistics Common Technical Architecture that is shared by the latest Witsml, Resqml and Prodml standards. Energistics standards can now carry OSDU object metadata in any object.
The operators participating in the Oil & Gas Data Quality group have stated that they want to move to WITSML 2.1 as soon as possible, and have asked service companies to confirm when they will be able to offer that capability.
Personally I would say that the Energistics standards are alive and well, and that TOG is proving to be an excellent custodian. However, the community as a whole is still coming to grips with not having the Energistics admin team to take care of many of the tasks required – such as promulgating the standards, developing JSON schemas, updating websites, handling certifications, developing and publishing documentation, developing use-cases, handling donated code. These are expected to be done as volunteer activities.
There is a strong push to increase participation from the OSDU community in the Energistics effort, since that benefits both sides. The standards are a key part of what OSDU is implementing, so it makes sense that the OSDU members take a more active role in helping to define how those standards may need to evolve. The Open Group has made it easy for companies to participate in Energistics – essentially, those joining OSDU can take part in the Energistics community at no additional cost, thus giving them a 2-for-the-price-of-1 membership benefit. The idea was that the people most interested in contributing to the development of the Energistics standards would come from the OSDU membership, so we wanted to make it easy, and cheap, to participate.
~
On the topic of Witsml 2.1, Philo also reported that his company, Bardasz, is ‘first-to-market’ with a Witsml 2.1 solution that can convert on the fly between different versions of Witsml. The Bardasz Witsml 2.1 SuperStore (also said to be ‘OSDU compliant’) brings the increased speed of 2.1 to legacy applications running on earlier Witsml versions. Philo observes, ‘Despite such clear advantages, there has been something of a chicken-and-egg situation. Operators and service companies want the benefits of the new standard, but they have lots of existing applications and data stores that were created around earlier versions. Companies were concerned that adopting Witsml 2.1 would mean having to update and rework all those legacy applications and data stores. This is where the Bardasz Witsml SuperStore can help move the industry forward. The SuperStore is the first commercially-available solution that offers the benefits of Witsml 2.1 and ETP 1.2’. More from Bardasz.
While the Industry section of the latest TOP500 list of worldwide supercomputers is dominated by oil and gas companies (6 out of the top 10), there are no new O&G machines in the list. Top performer is ENI’s 2020 HPC5, a Dell PowerEdge-based system at a reported 35 petaflops, closely followed by ExxonMobil’s 2022 Discovery from Cray at 26 petaflops. Saudi Aramco has two machines in the top 10 (Dammam-7 and Ghawar-1), both Cray-based systems. Petrobras’ Pégaso from SuperMicro and TotalEnergies’ IBM-based Pangea III complete the Industry top 10. Systems are mostly based on AMD CPUs with Nvidia GPU co-processors. All run Linux of course. A word of caution: the Top500 is largely self-reported, and many HPC operators may not wish to join in the competition. Moreover, evaluation is based on a somewhat artificial Linpack benchmark so your mileage may vary. More on the latest developments in HPC at Top500.org where you can download the list as an Excel file.
Altair has announced SimSolid Cloud, a cloud-native edition of its structural simulation software. SSC ‘eliminates’ geometry simplification and meshing, said to be the two most time-consuming and expertise-intensive aspects of finite element analysis. More from Altair. In separate announcements, Altair has also unveiled the 2022.3 edition of its Simulation flagship with new ‘digital thread’ workflows and a new DesignAI tool for improved design-of-experiment outcomes.
Esri’s new ArcGIS AllSource desktop software simplifies access to disparate sources of data. Built-in links provide access to 2D and 3D maps, timelines, imagery, graphs and video to ‘uncover patterns, trends and relationships in data to inform decision-making’.
The EasyCopy Company has released EasyCore 3.1 with EasyDB, an embedded SQL database for integration, data search and access.
Flare Solutions has announced FLARE MiNDR, a search, tagging, and compliance tool designed to streamline the way operators find and tag information prior to submitting it to the UK National Data Repository. Search spans multiple file shares, SharePoint sites and physical data listings. Found information can be classified with ‘C-Tags’, derived from metadata such as file names and paths, along with scraped file content. Users have the final say in what is submitted to the NDR. More from Flare.
Inductive Automation has rolled out a Cloud Edition of its industrial data connectivity toolset to ‘complement and extend’ a standard on-premise deployment. ICS is currently available through the AWS Marketplace, an Azure edition will be available ‘real soon now’.
Leostream has announced secure remote computing for AWS with ‘zero-trust network access’, an addition to its eponymous remote desktop platform. The solution targets, inter alia, remote workers who ‘increasingly need privileged access to manage sensitive company resources’. More from Leostream.
Nvidia has announced PyTriton, a ‘Flask-like’ interface that lets Python developers use its Triton inference server to deliver ‘anything, be it an AI model, a simple processing function or an entire inference pipeline’. Native support for Triton in Python enables rapid prototyping and testing of machine learning models with efficient hardware utilization.
Petrolink International has released PetroVault, a ‘next-level’ data management solution with Energistics Witsml 2.1 and ETP 1.2 support. PetroVault aggregates real-time drilling and subsurface logging data from remote locations. The platform is compatible with third-party applications and ‘encourages the development of new solutions tailored to industry needs’. More from Petrolink.
Stark Reality has rolled out ‘Surface Segments’, a commercial plug-in for users of dGB’s Open dTect seismic software. The new tool generates attribute volumes of specific event types such as peaks, troughs, and zero crossings. Such volumes can reveal thin-bed spatial continuity and help distinguish horizons that are amenable to auto-tracking from those which resemble ‘moth-eaten sweaters’.
Teledyne FLIR’s new G-Series optical gas imaging cameras offer ‘superior’ gas quantification and wireless data transfer. The devices empower everyday users as well as leak detection and repair specialists in oil, gas and other verticals. Images and videos can be uploaded to the ‘FLIR Ignite*’ cloud from the field.
* Not perhaps the best choice of words.
Trimble has announced its new X9 laser scanner. The ‘reality capture’ solution for survey, construction and engineering professionals includes X-Drive technology for automatic instrument calibration, survey-grade self-leveling and laser georeferencing. Bandwidth is rated at a million points per second with a 150 meter range and 2mm resolution.
i.safe Mobile and Senseven have teamed on Valve Sense, a mobile inspection system for maintenance in hazardous areas. The ‘software and AI-based’ system uses the acoustic emission (sound?) method and can be used by maintenance personnel without special training.
Infoscience Technologies has released GeoClassifier V4, a natural language processing algorithm written in Python which automatically classifies subsurface and well-related documents. The tool automates metadata capture for file share cleanup activities, scanning projects, records retention and compliance in SharePoint and other document management systems. V 4.0 includes a geotagging function and personal information detection for compliance with data privacy legislation. Infoscience recently received a King’s Award for Enterprise Innovation.
Presenters from Nvidia, Stanford University and Caltech have published a post (which we recommend reading for its comprehensive illustrations) on the use of digital twins in carbon capture and storage. Currently, some 30 large-scale CCS operations sequester some 40Mt of CO2* per year. Development of this technology is set to ‘grow rapidly’ in the coming decade, ‘but this promising solution has yet to prove that it can be industrialized at an acceptable cost’. Seemingly, a key challenge for keeping CCS solutions economical is the cost of proving the duration and reliability of storage using numerical modeling**. Traditional simulators for carbon sequestration are time-consuming and computationally expensive. Enter machine learning models that promise ‘similar accuracy and significantly reduced time and costs’.
The post, ‘Accelerating climate change mitigation with machine learning: the case of carbon storage’ describes the use of Nvidia Modulus and Omniverse tools to investigate various geological scenarios and injection patterns. The post describes a ‘physics-informed machine learning’ approach that leverages a Fourier neural operator (FNO) to predict the 3D reservoir behavior. The nested FNO approach is claimed to offer an ‘inference speed that is 700,000x faster than a state-of-the-art numerical solver’. One assessment of uncertainties in capacity estimation and injection designs that would have taken nearly two years with numerical simulators ran in ‘only 2.8 seconds’.
The trained nested FNO is hosted on a public GPU-based web application such that users can construct any random combination of reservoir condition, injection scheme, and permeability field characteristics and obtain instantaneous predictions of gas saturation, pressure buildup and sweep efficiency estimates. The web app promotes equity in CO2 storage project development and knowledge adoption. This especially benefits small to mid-sized developers as well as communities that want an independent evaluation of projects being proposed. High-quality forecasts were previously unattainable for these important players. The ML framework leverages Nvidia’s Modulus and Omniverse platforms. Modulus is a physics-ML framework for developing physics-based, machine-learning models. Modulus takes both data and the governing physics into account to train a neural network that creates an AI surrogate model for digital twins. Omniverse adds a virtual reality based GUI for interactive exploration of digital twins using the surrogate model output from Modulus.
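For readers curious about what a Fourier neural operator actually computes, here is a minimal sketch, in Python with PyTorch, of its core building block: a spectral convolution that learns weights on a truncated set of Fourier modes. This is our illustration of the general technique, not the nested FNO or Modulus code; shapes, channel counts and mode counts are arbitrary.

```python
# Minimal spectral convolution layer in the FNO style; illustrative only.
import torch
import torch.nn as nn

class SpectralConv1d(nn.Module):
    def __init__(self, in_channels, out_channels, modes):
        super().__init__()
        self.modes = modes  # number of low-frequency Fourier modes retained
        scale = 1.0 / (in_channels * out_channels)
        self.weights = nn.Parameter(
            scale * torch.randn(in_channels, out_channels, modes, dtype=torch.cfloat))

    def forward(self, x):                      # x: (batch, channels, grid points)
        x_ft = torch.fft.rfft(x)               # to the frequency domain
        out_ft = torch.zeros(x.shape[0], self.weights.shape[1], x_ft.shape[-1],
                             dtype=torch.cfloat, device=x.device)
        # multiply the retained modes by the learned complex weights
        out_ft[..., :self.modes] = torch.einsum(
            "bim,iom->bom", x_ft[..., :self.modes], self.weights)
        return torch.fft.irfft(out_ft, n=x.shape[-1])  # back to the physical grid

# One such layer, mapping e.g. a 1D permeability profile to 16 feature channels
layer = SpectralConv1d(in_channels=1, out_channels=16, modes=12)
out = layer(torch.randn(4, 1, 128))            # 4 samples on a 128-point grid
print(out.shape)                               # torch.Size([4, 16, 128])
```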
The nested FNO model is available on Git. Read the original paper here on ‘Accelerating carbon capture and storage modeling using Fourier neural operators’.
* World carbon emissions in 2022 were estimated at some 37 gigatons.
** Computer modeling may not be a major cost compared with the astronomical sums involved in engineering CCS!
Deborah Sacrey has been voted president-elect of AAPG for the 2023-24 term. Only 20% of AAPG’s 10,021 eligible voters cast their ballots.
Asset Guardian Solutions has appointed Claudine Beaver as senior account manager. She was previously with Codeweaver.
Merrill ‘Pete’ Miller has been named CEO and chairman of Levare, a rebrand of Borets International. He was previously with NOV.
Brüel & Kjær Vibro has named Ingo Anders as CEO. He hails from Atlas Copco.
Sophie Brochu has been appointed to the CGI board of directors. She was previously with Énergir.
CybeReady has appointed Dirk Rausse to regional sales director for the DACH region (Germany, Austria, Switzerland).
Johan Knijp is the manager of DNV’s new Technology Centre located on its Groningen, NL Zernike campus. The ‘sustainable’ facility is equipped with the latest testing equipment to support DNV’s research into the decarbonization of energy systems.
Following Sebastien Lacaze’s precipitous departure from Eliis, co-founder Fabien Pauget has been named President. Benoît Matha is now CEO and François Lafferriere has been promoted to chief revenue officer.
The Enbridge board has appointed Patrick Murray, currently SVP and chief accounting officer, to succeed Vern Yu as EVP and CFO.
The IOGP standards committee has named Cecilie Haarseth of ExxonMobil as its new chair.
Fleet intelligence software house IntelliShift has appointed Jeff Cohen as chief revenue officer. He hails from Trackunit.
Stig-Morten Knutsen, a geologist with the Norwegian Petroleum Directorate has been re-elected to the board of the UN expert group for resources (UNECE/EGRM).
In a separate announcement, the NPD has confirmed Torgeir Stordal’s appointment as Director General. He has been ‘acting’ DG since June 2022.
Following a four month global search, Simon Seaton has been taken on as CEO of the Society of Petroleum Engineers. He was previously with Sodexo’s Energy & Resources unit.
PIC Group, an international energy services provider, has appointed Felix Alejandro to VP global human resources. He joins from InVeris Training Solutions.
Tareq Kawash has been appointed group chief executive and executive director of Petrofac. He hails from McDermott.
Victor Ponsford tells us that he is leaving Rystad Energy and relocating to London, where he will be fundraising for Afghans who previously worked for the British government and are now fleeing the country, and working on his own communications consultancy in the energy/climate space.
SCS Technologies has appointed Cody Johnson as president and CEO.
Sarah Hewitt is joining the Society of Exploration Geophysicists as MD of the SEG Foundation. She hails from the IPAA.
Valaris has announced that ARO Drilling, its 50/50 JV with Saudi Aramco, has appointed Mohamed Hegazi as CEO. He was previously CEO of TGT Diagnostics.
Speaking of whom, TGT Diagnostics has just signed an agreement to build a new technology hub in Abu Dhabi, ‘providing essential diagnostics to keep oil and gas wells safe, clean and productive’.
Tellurian has hired investment banker Simon Oxley as its new CFO.
Meredith Talley has joined Valor as director of oil and gas services. She was previously with Quinn Resources.
Alexander Varlahanov has been promoted to CTO at Velo3D.
Workrise (previously RigUp) has hired Ed Neumann as CFO and promoted Joshua Trott to chief revenue officer.
Deaths
The Rock imaging special interest group announced that RISIG co-founder Dr Nick Coles died unexpectedly on Saturday 13 May at his home in Abu Dhabi. Read RISIG co-founder Ross Davidson’s memorial statement on the Rock imaging website.
Williams former chairman and CEO Joseph Williams died April 27, 2023 aged 89. He was the son of David Williams, one of the original co-founders of Williams. Read his obituary on the Williams corporate website.
In its 2022 Year in Review, cyber security specialist Dragos highlighted the risk to the oil and gas industry from Bentonite, a highly opportunistic malware used for data exfiltration, espionage and IT compromise. Bentonite can deploy wiper malware and be used in ransomware attacks. To combat this and other control system risks, Dragos recommends ‘consequence-based vulnerability management’ that addresses the 2% of vulnerabilities which represent immediate risks.
Anomali and ONG-ISAC* have announced a joint initiative to combat cybercrime in the oil and natural gas sector. The Anomali cyber intelligence platform will be used by ONG-ISAC security analysts to identify attackers that target critical infrastructure. The platform provides high-fidelity signals of attack, detects threats and prioritizes response. More from Anomali.
* The Oil and Natural Gas Information Sharing and Analysis Center.
SentinelOne’s new ‘Singularity’ security data lake (SDL) has received a strong endorsement from National Oilwell Varco CISO John McLeod who opined, ‘We spent years looking for a replacement for our SIEM solution and we found one in the SentinelOne SDL, which provides more storage and faster searches in a highly cost-effective manner.’ The cloud-native solution provides a comprehensive view across security ecosystems, enabling organizations to quickly uncover threats and respond to them in a real-time, intelligent and cost-efficient manner. More from SentinelOne.
In its year-end 2022 review, ISAGCA* reported ‘continued efforts’ by member Petronas that have resulted in Standards Malaysia’s adoption of ISA/IEC 62443 as a national Malaysian Standard. Also in 2022, ISAGCA ‘stood up’ ICS4ICS, the incident command system for industrial control systems. More from ISAGCA.
* The ISA Global Cybersecurity Alliance.
Inductive Automation reports that its company-wide secure software development lifecycle (SDLC) was recently assessed by Exida to meet the requirements of the ISASecure security development lifecycle assurance (SDLA) 3.0.0 and IEC/ANSI/ISA-62443. The latter is a set of standards maintained by the ISA99 committee on security for industrial automation and control systems that make up a ‘comprehensive’ cybersecurity framework which shares responsibility between asset owners and systems integrators.
DNV has just published a whitepaper covering the new NIS2 European cyber security laws, their implications for industrial companies and how to achieve compliance. NIS2 results from a January 2023 EU directive that member states must transpose into national laws by late 2024. NIS2 is described as ‘NIS on steroids’, suitable for an era in which organizations operating essential services need more than ever to manage the cyber risk of both their IT and the control systems that manage, monitor, automate and control industrial operations. Increased risk arises from greater connections between OT/IT and the internet. Oil and natural gas falls under the NIS Directive Annex II covering ‘essential services and digital service providers’.
The UK Petras center of excellence for IoT system cybersecurity has just published its Deployment Guidelines for machine learning-based intrusion detection systems for industrial control systems. The guidelines have been developed under the Petras ‘Elliot’ project and are claimed to guide tool selection from the ‘plethora of commercial and open-source options’. The report covers the selection and deployment of ML-based anomaly detection tools and their limitations. The Guideline can be downloaded here and a short summary of the key recommendations is available here.
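By way of illustration only (this is not taken from the Petras guideline), the unsupervised anomaly-detection pattern that many such tools implement can be sketched with scikit-learn’s IsolationForest: train on ‘normal’ historian data, then flag out-of-envelope sensor readings for an analyst to triage. The tag values below are synthetic.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

# Synthetic 'normal' historian data: rows are time samples, columns are three sensor tags
rng = np.random.default_rng(0)
normal = rng.normal(loc=[50.0, 1.2, 300.0], scale=[2.0, 0.05, 5.0], size=(5000, 3))

model = IsolationForest(contamination=0.01, random_state=0).fit(normal)

# Score new samples: -1 flags an anomaly, 1 is considered normal
new = np.array([[49.8, 1.19, 301.0],    # plausible operating point
                [75.0, 0.40, 410.0]])   # out-of-envelope reading
print(model.predict(new))               # -> [ 1 -1 ]
```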
SEI, the Carnegie Mellon Software Engineering Institute, has released a new version of its energy sector cybersecurity capability maturity model (C2M2). The new release has updated two-thirds of the 2012 model and merged the previously separate models for the electricity and oil and natural gas subsectors. A Cybersecurity Architecture domain was added, and the third-party risk management domain has been refreshed in the light of increasing supply chain cybersecurity risks. The model is now accessible via the SEI’s HTML-based C2M2 tools.
The German Namur standards body has just published WG Position 1.4, an analysis of the use of ‘asset administration shells’ (AAS) in the process industry. The AAS underpins the EU Industrie 4.0 concept which rolls up IoT and digital twins. WG 1.4 defines roles, assets, device types and instances along with the information exchange and interactions between assets. File-based engineering document exchange according to VDI 2770 from the ‘digital data chain consortium’ also ran.
A demonstration at the 2023 Digital Industries Nuremberg tradeshow had Siemens and other industrial partners leveraging the AAS to exchange data between digital twins in manufacturing. More from Siemens.
IOGP JIP36, a.k.a. Cfihos, the capital facilities information handover spec, has released Cfihos version 1.5.1. This is described as a minor release of content upgrades. More in the release notes here. All Cfihos documents are available to download and use free of charge.
The IEA Greenhouse Gas unit has published Technical Report 2022-11 on ‘Applying ISO Standards to Geologic Storage and EOR Projects’. The study, performed by DNV on behalf of IEAGHG, aimed to synthesize two ISO standards for geological storage of CO2, ISO 27914:2017 (Carbon dioxide capture, transportation and geological storage) and ISO 27916:2019 (Carbon dioxide capture, transportation and geological storage with EOR), to provide a ‘high-level understanding of the content in an easily digestible format’. The report concluded that ‘the standards are complementary with minimal overlap, as was intended by stakeholders’. Moreover, similarity between regulatory regimes for oil and gas projects and CO2 storage projects ‘may mean that existing petroleum regimes, complemented by the ISO standards, could form a specific regulatory regime for the geological storage of CO2’.
A cryptic post by TotalEnergies’ Jean-Charles Leclerc reported on the rebranding of ISO 15926 Part 14, which is now to be known as IDO, the Industrial Data Ontology. There is as yet nothing about the change on the web page of ISO TC 184, the parent of ISO 15926. Work on IDO is to be hosted in a new information exchange framework for Ontology Based Interoperability, a.k.a. OBI.
The new ISO 19157-1:2023 standard for geographic information quality is said to establish the principles for describing the quality of geographic data. It is applicable both to data producers, providing quality information to describe and assess how well a dataset conforms to its product specification, and to data users attempting to determine whether or not specific geographic data are of sufficient quality for a particular application.
Report 796 from the IOGP JIP33 is an implementation guide to its standardized procurement specifications. The document describes best practices for implementing the IOGP specs across operators, EPC contractors and suppliers in the petroleum and natural gas industries. IOGP has also released the third edition of its Report 456, Process safety - recommended practice on key performance indicators. Report 456 enables companies to establish effective leading and lagging indicators that assess the health of barriers that manage the risk of process safety events, particularly those that could result in a major incident. The third edition incorporates changes from the latest version of API RP 754 to facilitate a unified approach to process safety across different sectors of the industry. The update includes the IOGP’s Well Control Incidents classification system to help members report such events more consistently. Both reports are a free download from the IOGP library.
A new framework from NIST addresses risk management in artificial intelligence applications. The Framework, AI RMF 1.0 provides guidance for voluntary use by organizations designing, developing, deploying or using AI systems to help manage the many risks of AI technologies. Development followed a direction from Congress for NIST to develop the framework in collaboration with the private and public sectors. In parallel NIST has launched the AIRC, a ‘Trustworthy and Responsible AI Resource Center’, a ‘one-stop-shop’ for foundational content, technical documents, and toolkits to enable responsible use of Artificial Intelligence.
Addressing the advent of ChatGPT and generative AI, The Open Group has spoken on the topic of natural language processing (NLP). The pronouncement comes in the form of a blog post by TOG member Chris Harding from Lacibus. The NLP that underlies ChatGPT has huge potential for disruptive change in many areas, including standards development. Harding argues that such work should be added to the charter of the TOG’s Data Integration Work Group: to research use cases and current trends in data integration, and to review the corpus of TOG standards for relevant clauses, working with a prototype ‘Ideas Browser’ that scans web pages and generates summaries with the language model used by ChatGPT.
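As a toy illustration of the summarization step Harding describes (ChatGPT itself is accessed through OpenAI’s API, so the sketch below substitutes a freely available Hugging Face summarizer and an invented standards clause):

```python
from transformers import pipeline

# Off-the-shelf summarizer standing in for the ChatGPT language model
summarizer = pipeline("summarization", model="sshleifer/distilbart-cnn-12-6")

clause = ("The data integration platform shall provide a mechanism for mapping source "
          "schemas to a common information model, shall record the provenance of each "
          "mapping, and shall expose the resulting integrated views through a documented "
          "query interface.")
print(summarizer(clause, max_length=40, min_length=10)[0]["summary_text"])
```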
At The Open Group’s recent London Summit, Steve Nunn, President and CEO, announced the award of the TOG ‘most valuable contribution as an organization’ to Shell. Shell has been a TOG participant since 1990 and has since been ‘one of the most contributory energy companies’, participating in the TOG’s Customer Council and in the Security, Architecture and ArchiMate Fora.
The OPC Foundation has added a RESTful interface to its OPC UA flagship standard. The addition was requested by IT companies to allow OPC UA operational data to be used in ‘concepts’ like Data Spaces, Digital Twins and the Metaverse. Such solutions often share information using proprietary HTTP REST interfaces. OPC UA with REST is ‘less likely to be adopted at the field level, but in gateways and cloud services’. Cloud companies can now access standardized information and use it efficiently to optimize the business processes of their customers and ‘unlock new use cases’.
The US Government has just released its National Standards Strategy for 2023, covering critical and emerging technology (CET). The 14 page document has it that ‘Standards development underpins economic prosperity across the country and fortifies US leadership in the industries of the future’. The US has engaged in standards for many years but now faces challenges to its standards leadership and to the core principles of international standardization. Competitors are ‘actively seeking’ to influence international standards development, particularly for CET, to ‘advance their military-industrial policies and autocratic objectives’. These include ‘blocking the free flow of information and slowing innovation in other countries, by tilting what should be a neutral playing field to their own advantage’. The US Government has China in its sights. The EU gets a more friendly treatment, through the US-EU Trade and Technology Council Strategic Standardization Information mechanism.
Chris Barton (Wood) announced that global upstream capex is estimated at $572 billion for 2023, a 13% hike year-on-year. Offshore capex is up 17% to $178 billion, heralding ‘the return of exploration’. Investment levels are now roughly the same as pre-covid, although well down on 2014. South America and the Middle East currently receive the lion’s share of offshore investment. Barton concluded that deepwater activity is ramping up again and ‘some good years are to be expected’. As ever, sharing the spoils between operators and contractors is a source of ‘natural conflict’ and the severe economic cycles have made the operator/contractor equilibrium ‘fragile’. Operators should think of their contractors as a ‘source of strategic advantage’. Rather than playing the field, operators need to deepen their relationships with contractors to ‘create a mindset where they jointly plan their futures together*’.
* We have heard similar sentiments expressed many times by geophysical contractors. They were never very persuasive.
Robert Cousins presented Baker Hughes’ Engage Subsea platform, a digital single source of truth for subsea operations. ES connects to multiple primary data sources (SAP, Oracle, OPC UA and more) feeding apps through a service layer and digital twin. The solution covers asset health, integrity monitoring, remote workflows and robotics. The system benefits day-to-day operations and supports decarbonization and emissions reduction efforts.
Thierry Boscals de Real presented TotalEnergies’ target for net zero by 2050. Oil production is to peak in 2025, while gas (the transition fuel) rises steeply through 2030. Green energy is set for a significant, if more modest, hike by 2030. One contribution to TotalEnergies’ greening offshore is its ZerEm (zero carbon emissions assets) project. This addresses the scope 1 and 2 greenhouse gas emissions from exploration and production. ZerEm in the offshore includes exhaust gas sequestration from compressors, wave and well heat energy recovery and the integration of GHG calculations in production forecasts.
Burkhard Sommer from SLB’s OneSubsea unit presented on all-electric connectivity for super long offset tie-backs. In the North Sea, ‘super long’ can mean as much as 250km as offshore fields are effectively operated from the shore. In the Eastern Mediterranean this may mean 400km offsets in the (up to) 2500m water depths of Europe’s future gas hub. Long offset electricals mean less infrastructure, less CO2 and a lower environmental impact. Optimizing the ‘power cord’ is key – leveraging high voltage DC systems. Comms can be achieved with fiber or satellite, possibly using low power wave or solar generation. The latter are described as a ‘perfect match’ for decentralized systems such as CCS.
On the topic of electricity, Jerome Anfray (TotalEnergies) and Nicolas Congar (Sofresid) presented an Organic Rankine Cycle (ORC) power system for subsea electricity generation. ORCs are typically used in industry to generate electricity from waste heat in flue gasses and other sources. In a deepwater context, the temperature difference between hot well fluids and cold seawater could be used to power an ORC. Systems are limited to higher temperature (>100°C) wells and are capable of powering applications (well head command, chemical injection, subsea processing) in the 100 kW to 1 MW range. The systems are currently at ‘low maturity’ and the speakers called for contractors and operators to work together on this and other ‘net zero’ solutions.
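A back-of-the-envelope calculation shows why the concept is limited to hotter wells: the Carnot limit between a 100°C source and near-freezing seawater is only around 25%, and a real ORC recovers a fraction of that. The figures below are illustrative assumptions, not numbers from the presentation.

```python
# Rough ORC sizing with illustrative assumptions
T_hot, T_cold = 100.0 + 273.15, 5.0 + 273.15   # well fluid and seawater temperatures, kelvin
carnot = 1.0 - T_cold / T_hot                   # ideal limit, about 0.25
eta = 0.4 * carnot                              # assume the cycle achieves 40% of Carnot
heat_duty_kw = 2500.0                           # assumed heat recovered from well fluids
print(f"Carnot limit {carnot:.0%}, net electrical output ~{eta * heat_duty_kw:.0f} kW")
# -> roughly 250 kW, i.e. within the 100 kW to 1 MW window quoted by the speakers
```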
Kevin Kappes and Kaitlin Haymaker (both with OneSubsea) presented on the merits of the IOGP’s JIP33 standardized approach to subsea production. The idea is to strike a balance between idiosyncratic operator specs and ‘over-restrictive’ standards. The JIP33’s configurable standards approach is said to square the circle and deliver ‘subsea performance agility’. Curiously, the IOGP’s companion spec Cfihos from the JIP 36 did not get a mention.
David Saul (ExxonMobil) and Ryan Gola (BP) introduced the embryonic IOGP CCS committee. This has set up a CCS Expert Group and established terms of reference. These include work on CCS regulations and guidance for operators. More from IOGP.
Eirik Walle presented ClampOn’s eponymous ‘non-intrusive’ flow temperature meter and showed how the ingenious device has been calibrated using Dassault Systèmes’ SolidWorks simulator. ClampOn uses sensitive measurements of temperature variations on the pipe wall to infer the fluid temperature inside the pipe. The system has been installed as a mod to ClampOn’s pig detector.
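ClampOn has not published the details of its method, but the inverse problem involved can be illustrated with a much-simplified steady-state thermal-resistance model that backs out the internal fluid temperature from an external wall reading. All values below are illustrative assumptions, not ClampOn calibration data.

```python
import math

# Heat flows from the fluid, through the pipe wall, to the surroundings.
# Given the measured outer-wall temperature, estimate the fluid temperature inside.
T_wall_out = 62.0                  # measured outer wall temperature, degC
T_ambient = 10.0                   # surrounding seawater temperature, degC
r_in, r_out = 0.10, 0.11           # pipe inner/outer radius, m
k_steel = 45.0                     # wall thermal conductivity, W/m.K
h_in, h_out = 500.0, 50.0          # inner/outer film coefficients, W/m2.K (assumed)
L = 1.0                            # per metre of pipe

R_wall = math.log(r_out / r_in) / (2 * math.pi * k_steel * L)
R_in = 1.0 / (h_in * 2 * math.pi * r_in * L)
R_out = 1.0 / (h_out * 2 * math.pi * r_out * L)

q = (T_wall_out - T_ambient) / R_out        # heat flow inferred from the outer-wall reading
T_fluid = T_wall_out + q * (R_wall + R_in)  # add back the drops across wall and inner film
print(f"Estimated fluid temperature: {T_fluid:.1f} degC")   # ~68 degC with these numbers
```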
InteliWell, a joint venture between Nekkar’s Intellilift AS subsidiary, Transocean and Viasat’s RigNet unit, has secured its first contract for its ‘game-changing’ eponymous drilling automation solution. Transocean, on behalf of clients Wintershall Dea and OMV, has contracted the JV to deploy InteliWell on the Transocean Norge rig. InteliWell is a suite of applications that provide an end-to-end, closed-loop workflow for well-construction planning, execution and monitoring. More from InteliWell.
L&T Technology Services has used Altair’s simulation platform in its work on the ‘phygital’ world, blending physics and digital technology in engineering and design. HPC and digital twins accelerate innovation and improve customer experiences while cloud computing is ‘transforming the capabilities of data-driven systems’. Listen in to LTTS CTO Ashish Khushu’s talk at the Altair 2023 Future Industry conference and visit LTTS’ oil country landing page.
The Norwegian Petroleum Directorate has contributed its library of oil and gas reports to underpin an AI demonstrator developed by FabriqAI and Kadme. Users can ask questions about production licenses, areas, prospects, wells and discoveries from some 600 reports documenting work performed by companies on the NCS. More from NPD.
UK-based 3D technology specialist, Eserv, is to digitize the Neptune-operated D15-A and K12-C offshore platforms, ‘boosting’ Neptune’s global portfolio of digital twins. Neptune’s 14 digital twins are claimed to ‘enable engineers to carry out traditional offshore work from an onshore location, accelerating work schedules and reducing costs’. Watch the video here. Stop-press: Neptune Energy has just been acquired by ENI in a $3.9 billion deal.
Aker BP has deployed Ivalua’s source-to-contract and supplier management toolset. KPMG carried out the systems integration.
Alteryx has achieved ‘cloud-ready’ status for its integration with Google’s AlloyDB, a managed PostgreSQL-compatible database. More from Alteryx.
Bently Nevada has teamed with Augury on a machine health as a service offering. The solution covers alarm maintenance and optimization along with alarm response and root cause determination. Augury’s AI ‘detects 95% of anomalies, correctly diagnoses 99% of malfunctions and offers an ROI of 3X-10X within months.’ More from Baker Hughes.
Phillips 66 has contracted with Bridger Photonics to deploy its laser-based methane detection technology across Colorado, Oklahoma, Kansas and Texas.
OMV has renewed its multi-year contract with CGG for the operation of a dedicated processing center at its head office in Vienna for an extra three years.
DNV has awarded metal 3D printer manufacturer Aidro an ‘industry-first’ certification for its additive manufacturing technique. Aidro’s Desktop Metal Shop System achieved the highest level of DNV’s AM manufacturer certification for laser powder bed fusion, which included ‘part qualification for critical level AMC 3’ of a 316L valve body. The solution targets additive manufacturing ‘2.0’ applications in energy, oil & gas, maritime, and other industries.
Eliis and the BRGM, the French geological survey, are working on a 111 km2 3D onshore seismic data set acquired in spring 2022, located 60 km SE of Paris. The study is to investigate CO2 storage sites under the EU Horizon PilotStrategy project. More from BRGM.
In a separate announcement Eliis revealed that Bluware’s VDS data format and FAST data streaming technology are now embedded into PaleoScan and, moreover, ‘integrated with the OSDU Data Platform running on AWS’.
Equinor’s One Subsurface community (that resulted from the 2022 merger of the exploration and petech organizations) has selected Halliburton Landmark DecisionSpace as its standard subsurface data interpretation toolbox and ‘OpenWorks with OSDU’ as its corporate database for interpretations.
Repsol has implemented the Halliburton Landmark DecisionSpace 365 Well Construction Suite as its preferred digital solution to automate and streamline the well design process. The deployment leverages Landmark’s ‘SmartDigital’ co-innovation service to deliver tailored well construction workflows and software components.
One Stop Systems has received certification from Nvidia for its OSS Rigel Edge Supercomputer. The Rigel houses four Nvidia boards from the HGX A100 platform.
Velo3D has teamed with UK-based PhysicsX, adding ‘AI-enhanced’ simulation to its 3D printing offering. PhysicsX’s simulations for additive manufacturing workflows are said to accelerate simulation loops, improve fidelity and ‘algorithmically explore’ complex design spaces.
SAP is migrating part of its internal IT landscape and the SAP Enterprise Cloud Services portfolio onto Red Hat Enterprise Linux. SAP is also ‘boosting support’ for RISE with SAP, for which RHEL is now ‘the preferred operating system’.
IBM’s Watson technology is to be embedded into SAP solutions to provide ‘new AI-driven insights and automation’. The Watson capabilities will power a digital assistant in SAP Start, a unified entry point for cloud solutions from SAP.
Petrobras has awarded SLB a contract to develop and implement a control system and dynamic process simulation. The solution is being developed in a partnership between Inprocess Technology and SLB’s Sensia unit.
ALNAFT, the Algerian regulator has executed a multiyear agreement with SLB for the launch of EXALT (Explore Algeria today), a gateway to Algerian subsurface data and ‘evergreening’ products.
Scepter and ExxonMobil are working with Amazon Web Services (AWS) to develop a data analytics platform to characterize and quantify methane emissions from monitoring platforms that operate from the ground, in the air and from space. A constellation of low-earth orbit satellites is planned for 2026 to enable real-time, continuous monitoring of methane emissions from oil and gas operations on a global scale. The system is currently being tested using stratospheric balloons.
Sulzer and Siemens LDA are to provide operators of large centrifugal pumps with an ‘enhanced digital value proposition’, combining their respective Blue Box and Sidrive IQ IoT-platforms and services. Sulzer’s pump-specific, AI-based analytics platform will now be complemented by predictive maintenance data from Siemens’ IoT platform. More from Sulzer.
Swiss robotics pioneer ANYbotics has secured $50M in Series B funding to further develop its four-legged robotic workforce. Walden Catalyst and NGP Capital led the round. ‘Pre-launch’ clients to date include Petronas, Shell, SLB, Siemens and Vale.
Arria’s board has ‘relieved’ its CFO and retained TechCXO, an independent consulting group, to reorganize its financial accounting, reporting and compliance group. The company ‘intends to be prepared’ for an exchange listing ‘as early as is feasible’.
Energy Aspects has acquired INAS, a provider of price information drivers for energy markets.
Halliburton has acquired Resoptima, adding reservoir modeling and optimization to its DecisionSpace suite.
In a status update, mCloud announced that it continues to work ‘diligently and expeditiously’ to complete its annual filings as soon as possible. In the interim, the company has been granted a management cease trade order by the British Columbia securities commission.
Attensi has developed a game-based ‘safety simulator’ for Equinor. The solution underpins a training program with realistic 3D scenarios. The program is a component of Equinor’s ‘Always Safe’ professional development initiative for its workforce of over 21,000 employees in 30 countries. More from Attensi.
Blackline Safety has announced the ‘largest ever’ order from the non-US/EU region from one of the world’s largest energy and petrochemical companies located in the Middle East. The $1 million lifetime value order is for Blackline’s cloud-connected, location-enabled G7 safety wearables and G7 EXO portable area gas monitors. More from Blackline.
The US Chemical Safety Board (CSB) has released its final report and video of the 2018 Cenovus/Husky Superior refinery fire. The report found deficiencies in Husky’s turnaround procedures and a lack of process knowledge that led to the explosion and fire. CSB Chairperson Steve Owens said, ‘Refineries with FCC units, including especially those with hydrofluoric acid alkylation units, should review our report thoroughly and ensure that they have the necessary safeguards in place to prevent a similar disaster from occurring at their facilities during shutdowns and startups. This accident could have been avoided.’
Another recent CSB report contains a safety alert on the potential hazards of emergency discharges from pressure release valves. The alert advises facilities that while a discharge from emergency pressure-relief systems can protect equipment from unexpected high-pressure events, it can also seriously harm or fatally injure workers and cause extensive damage to a facility if the discharge is not made to a safe location. The report enumerates four incidents which variously resulted in injuries, explosion and/or the release of toxic substances.
IOGP Report 2022su, ‘Safety data reporting user guide – Scope and definitions’ covers global safety incident data from members since 1985. Data is recorded in the IOGP safety database, the ‘largest database of safety incident statistics in the industry’. Scope covers worldwide E&P activities, onshore and offshore, for both operators and contractors. Data is consolidated and analyzed in terms of the frequency and severity of events by region, country, function and company. Other recent IOGP safety-related publications of note include a revised Recommended Practice on key (safety) performance indicators (IOGP Report 456 V3) and Process Safety Fundamentals. The latter are designed to support front-line staff in preventing situations that have historically resulted in process safety events.
An Insight Report from Intelex covers seven leading indicators that can drive safety improvement. The report explains how to establish leading (predictive) metrics for process safety and how they can be used to drive improvement. A benchmark study of universal metrics conducted by Predictive Solutions is also available.
A new white paper from Moore Industries provides a brief introduction to intrinsic safety (IS), the different components in an intrinsically-safe system and the different types of barriers. Additionally, the paper outlines why selecting an ‘associated apparatus’ as the safety barrier provides the most economic and effective use of IS technology. The techniques outlined in the paper are applicable to the industrial process control sector in oil and gas production, refining and others.
The US Department of Transportation’s Pipeline and Hazardous Materials Safety Administration (PHMSA) has just announced the availability of some $25 million of grant funding through its pipeline and hazardous materials safety programs. These grants are for projects that will train first responders, strengthen safety programs, improve safety, reduce environmental impacts and educate the public on local safety initiatives. Notice of the funding opportunities can be found here. More from PHMSA.
Teledyne Gas and Flame Detection has unveiled the Spyglass SG50-F series of flame detectors with updated optics, algorithms and HD video in either color or near-infrared. Color video is said to be effective in detecting fuel fires like gasoline and jet fuel, while the near-infrared video option can detect fires caused by other fuels such as hydrogen and methanol. More from TeledyneGFD.
OEUK and Oil Spill Response Ltd. (OSRL) have initiated a framework to help companies respond quickly to an offshore oil spill. The ‘UK mutual aid framework agreement’ (UK MAFA) will facilitate mutual aid of personnel between UK operators in the event of a sustained incident response. More from OEUK.
A blog from Ken Scott and Bushnell of the US National Institute for Occupational Safety and Health (NIOSH) reports from a recent meeting of the oil and gas extraction sector council of the National Occupational Research Agenda, where health and safety professionals shared how industry-wide labor force trends were impacting their work. The insightful report shows inter alia how downturns in oil and gas affect contracted employees far more than operators. In 2020, when oil and gas employment plummeted, the operator headcount was down by 8% while employment among contractors dropped by 33%. Contractors have higher rates of fatal motor vehicle crashes, fatal falls at work and total workplace fatalities. Contractor employees have reported longer shifts, longer average commutes and higher levels of exposure to various chemicals than operator employees. Since the ‘depths’ of 2020, hiring has begun again in the oil and gas industry in a competitive job market. New hires are likely to be young and relatively inexperienced, with, again, a significant difference between operators and contractors. Read the full blog post here.
In a recent webinar, Morgan Bowling (Seeq) and Anthony Teodorczuk (SBM Offshore) presented the latter’s approach to emissions monitoring. Netherlands-headquartered SBM builds FPSOs, essentially offshore refineries combined with an oil tanker. The company is on a journey to sustainable operations as ‘regulations are evolving, sometimes faster than can be implemented’. ‘Companies need to imagine what compliance will look like in the future, build a flexible emissions measurement framework and develop an understanding of their data’.
Following a three month trial deployment of Seeq’s hosted emissions monitoring technology, SBM signed a strategic agreement with the aim of ‘driving down’ the CO2 emissions from its fleet of leased FPSOs with a target of a 50% reduction by 2030. Case in point for the webinar is an FPSO operating offshore Brazil in 2000m of water.
Seeq’s Workbench provides ‘consistent’ Carbon KPI reporting of emissions tuned to daily operations as well as specific activities such as flaring events and cargo venting. The Workbench feeds into Power BI and PI Vision for analysis. Time series analytics leveraged technology from Norwegian Amitec. In the Q&A it emerged that this reporting is separate from reporting to the operator which is still done the ‘old fashioned way’ on paper. More from Seeq.
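For the curious, the kind of flaring KPI involved can be sketched in a few lines of pandas, using synthetic daily flare volumes and an assumed emission factor. This is an illustration only, not SBM’s or Seeq’s actual calculation.

```python
import pandas as pd

# Synthetic daily flare-gas volumes for one FPSO (thousand standard m3 per day);
# the emission factor is a placeholder, not an SBM or Seeq figure.
flare_kscm = pd.Series([120.0, 95.0, 310.0, 88.0],
                       index=pd.date_range("2023-05-01", periods=4, freq="D"))
EF_TONNE_CO2_PER_KSCM = 2.6

daily_co2 = flare_kscm * EF_TONNE_CO2_PER_KSCM
kpi = pd.DataFrame({"flare_kscm": flare_kscm,
                    "co2_tonnes": daily_co2,
                    "flaring_event": flare_kscm > 200})   # flag abnormal flaring for follow-up
print(kpi)
print("Month-to-date CO2 (tonnes):", round(daily_co2.sum(), 1))
```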
A blog post by Maria Simsky from Leuven, Belgium-headquartered Septentrio demystifies GPS/GNSS corrections and explains why they are important for users of high accuracy positioning. The demand for high accuracy positioning is growing and spreading from automation and robotization to lawn mowers and other mass market devices. This is driving new developments in the GNSS ecosystem, as new correction services and models emerge to satisfy the demands of various industries. The GNSS corrections ecosystem is currently like the wild west, with little standardization, making it challenging for users to select and connect to the most suitable service. Enter Septentrio’s Agnostic Corrections Partner Program, a bridge between the customer, the receiver and the correction services, which allows integrators to take advantage of reliable high-accuracy receivers in combination with the best suitable correction service.
GNSS corrections are necessary to compensate for variations in satellite signals and atmospheric delays to the radio signal. To achieve decimeter-level accuracy a receiver needs to get external information from a base station or a corrections service. Today there are a range of affordable PPP-RTK (aka SSR) correction services that provide continental coverage, sub-decimeter accuracy and a fast convergence rate. These are delivered over the internet or by satellite. Inspired by the blockchain mining economy, some companies allow users to set up corrections base stations at their home or office. This has resulted in a community-powered, shared network, which has the potential for high density coverage.
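Stripped of carrier-phase ambiguities and the SSR machinery, the basic differential idea is simple enough to show in a toy script: a base station at a surveyed position measures the error on each satellite range and a nearby rover subtracts it. All positions and error magnitudes below are invented for illustration.

```python
import math

# Toy differential-GNSS illustration: pseudorange only, one satellite, no noise model.
# Real PPP-RTK/SSR services model orbits, clocks, ionosphere and troposphere separately.
def dist(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

sat = (15_600_000.0, 7_500_000.0, 19_100_000.0)     # satellite position, m (illustrative)
base_true = (3_900_000.0, 300_000.0, 5_000_000.0)   # surveyed base-station position
rover_true = (3_900_050.0, 300_020.0, 5_000_010.0)  # rover roughly 50 m away

common_error = 8.4   # metres of shared atmospheric/clock error, assumed identical at both

base_meas = dist(sat, base_true) + common_error
rover_meas = dist(sat, rover_true) + common_error

correction = base_meas - dist(sat, base_true)   # base knows its true range to the satellite
rover_corrected = rover_meas - correction       # the shared error cancels for nearby rovers
print(f"Residual after correction: {rover_corrected - dist(sat, rover_true):.3f} m")
```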
The Agnostic Corrections Partner Program was set up to help users navigate the growing maze of correction offerings and facilitate the integration of the right service into the system. GNSS users can select the service which best suits their system requirement and connect it to a receiver for a high degree of positioning reliability and availability. More from Septentrio.
Researchers at the University of Houston have developed three software tools ‘to make energy industry processes more efficient’. The online calculators, developed by Mohamed Soliman, Birol Dindoruk and Utkarsh Sinha, are available to industry professionals free of charge.
The UH Viscosity Calculator calculates oil viscosity at different temperatures using a ‘full-range’ machine learning-augmented method that covers viscosities from a fraction of a centipoise (cp) to a million cp.
The UH Carbon Dioxide MMP Calculator computes the minimum miscibility pressure of carbon dioxide gas streams as used in EOR/gas injection processes. Again, an ML-augmented method was used to ‘overcome the limitations of traditional approaches’. CO2 MMP is said to be particularly important for carbon dioxide sequestration projects.
The UH Hydrocarbon MMP Calculator is likewise used in natural gas re-injection, using a ‘light gradient boost’ ML model to ‘achieve the desired pressure without needing expensive compressors or risking reservoir damage’. The UH research team is now working on a tool for measuring the carbon dioxide solubility in realistic brines with mixed salts, ‘an important topic for the evaluation of carbon dioxide sequestration in deep saline aquifers’.
The apps were built variously with Shinyapps or the open source Streamlit framework. The online calculators themselves are free to use but the code is not open source. More from U Houston.
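UH has not open-sourced the calculators, but as an illustration of how a gradient-boosted correlation can be wrapped in a Streamlit front end, something along the following lines would work. The data, feature names and coefficients below are entirely synthetic and not related to the UH models.

```python
# Illustrative only: synthetic data and a generic gradient-boosted model.
# Run with: streamlit run app.py
import numpy as np
import streamlit as st
from lightgbm import LGBMRegressor

rng = np.random.default_rng(42)
X = rng.uniform([20.0, 0.70], [150.0, 0.95], size=(500, 2))   # temperature degC, C1 fraction
y = 500.0 + 20.0 * X[:, 0] + 2500.0 * X[:, 1] + rng.normal(0, 50, 500)  # fake 'MMP', psi

model = LGBMRegressor(n_estimators=200).fit(X, y)

st.title("Toy MMP estimator (illustrative)")
temp = st.number_input("Reservoir temperature (degC)", 20.0, 150.0, 80.0)
c1 = st.number_input("Methane fraction in injection gas", 0.70, 0.95, 0.85)
st.write(f"Estimated MMP: {model.predict([[temp, c1]])[0]:.0f} psi")
```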