Oil IT Journal: Volume 26 Number 1


Putting Canada’s methane solutions on the world stage

2020 LBCG Canadian Onshore Well Site Facilities Optimization & Methane Emissions Reduction Technology Conference hears from regulators, operators and service companies on the need for better monitoring, reporting and mitigation of methane and other greenhouse gases.

A long time ago, when Oil IT Journal was Petroleum Data Manager, we categorized Calgary as the ‘E&P Data Capital of the World’. This was due to the serendipitous bequest, from a defunct big data provider, of a high-performance fiber data network that circled downtown. Today, Calgary is teed up to become a world center for methane monitoring as regulators, technology providers and academics collaborate. The Canadian federal government plans to ‘put Canada’s methane solutions on the world stage’.

Regulations

Ian Kuwahara (Alberta Energy Regulator) and co-author John Jurgenliemk (Methane Emissions Leadership Alliance*) set the scene with an exposé of current Canadian regulations. A 2018 survey by Clearstone determined that Alberta’s three main sources of methane emissions are legacy pneumatic devices**, fugitive emissions and venting from tanks and well casing. AER has addressed the issue with Directive 017 and how-to Manuals 015 and 016 for reporting and management. Directive 060 (2020) added fuel and flare gas rules, extending regulations to smaller plants. Today, operators must have a methane reduction retrofit compliance plan (MRRCP) in place. Rules specify limits for each methane source according to asset age; new plants must comply with stricter limits. Fugitive (more-or-less accidental) emissions are monitored with risk-based surveys and optical gas imaging cameras (see below). Operators have a time limit before these must be fixed. A fugitive emissions management program is mandatory. AER has so far approved aerial and truck-mounted sensors. Drones are under evaluation. More from AER.

* MELA, the Methane Emissions Leadership Alliance, is Canada’s source of data, technologies and solution providers that monitor, measure and reduce methane emissions.

** Pneumatic devices use flowline gas pressure to power equipment or inject chemicals.

Mohamed Abdul (Natural Resources Canada) introduced the Canadian Government’s Technology Innovation and Emissions Reduction (TIER) program and its Emissions Reduction Fund that helps operators fund their compliance initiatives. The C$750 million carbon offset credit fund provides up to 75% of project costs and is said to ‘help onshore and offshore oil and gas companies invest in green solutions to reduce GHGs* and retain jobs in the sector’. There are many opportunities for further emissions reductions in operations: minimizing and optimizing the equipment count and using standard modular facilities. Better leak detection and repair (LDAR) practices and a move to zero venting are also recommended. Opportunities exist in the development and deployment of new technologies, especially for tackling emissions from intermittent and unpredictable sources. Abdul issued a challenge to the many engineers who understand their facility-specific requirements and who could offer lower-cost abatement solutions. Regulations have halved methane emissions since 2005. Future efforts are needed to address incremental improvements with the aim of a further halving of emissions by 2030.

* Greenhouse gases.

Monitoring

Rob Milner presented FLIR’s emissions-detecting cameras with some impressive imagery of various egregious emission locations. Optical gas imaging is said to be the best technology for pinpointing emission locations. The latest technology includes emissions quantification from optical gas imaging and the GIS-320 drone-mounted instrument. The handheld GF77 has been updated with specific lenses for various greenhouse and other gases. For large installations requiring permanent, site-wide measurement, Milner gave a shout-out to IntelliView’s DCAM sensors and software. FLIR’s cameras have also been deployed by Colorado-based CleanConnect.ai to ‘automate compliance and achieve autonomous operations by harnessing the power of AI’.

Milner was joined by Jean-Francois Gauthier who presented GHGSat, the ‘first and only high-resolution satellite-based system to monitor methane emissions directly from oil and gas sites’. GHGSat’s system now also operates from aircraft and the two systems can be combined to provide an ‘efficient, cost-effective tiered system for methane monitoring’. GHGSat’s first demonstrator satellite has been in operation since 2016. A second was launched in 2020 and the third, ‘Hugo’, separated from a SpaceX rocket in January 2021. A constellation of 10 satellites is scheduled by year-end 2022. The satellites are calibrated in blind tests of controlled emissions from OGCI partner companies. Gauthier presented the results of a tiered satellite/airborne survey over the Montney shale basin, British Columbia. Data collected from such tiered surveys is amenable to AI-style analysis for detecting flaring and predicting future emissions.

Cindy Verhoeven presented The Sniffers’ (an Intero unit) ALT-FEMP* compliance process. ALT-FEMP technology is said to reduce survey costs by 40% and has resulted in a 60%-plus reduction in emissions with only one survey per year. The approach combines aerial and vehicular surveying. Verhoeven advocates mixing LDAR and OGI measurements to maximize the likelihood of detecting emissions while minimizing survey expense. SFEMP, the web-based Sniffers Full Emission Management Platform software, is used to manage VOC and methane emission data. SFEMP runs on phones and tablets and integrates with enterprise software from SAP and IBM Maximo.

The Sniffers also participates in the UN’s OGMP 2.0 reporting framework, an initiative of CCAC, UNEP and EDF. Some 62 companies representing 30% of the world’s oil and gas production have joined the partnership and commit to reporting against OGMP 2.0. After the event, Verhoeven told Oil IT Journal, ‘We help to establish emission monitoring protocols and multi-year programs for robust emission reduction results and year-to-year reporting according to the OGMP 2.0 standard. At minimum, our programs are at OGMP 2.0 Level 3 or 4. We can also offer a Level 5 program with both bottom-up and top-down measurements. The software we have developed is particularly strong in inventorizing sources, taking in quantification data (either from direct measurements, quantification factors or activity factors) and reporting emissions in an auditable way, in accordance with OGMP 2.0 reporting templates.’

* The alternative fugitive emission management program from the AER Directive 060: Upstream Petroleum Industry Flaring, Incinerating, and Venting.

Reduction

Dallas Rosevear presented on Clear Rush Co’s (CRC) enclosed vapor combustors, an environmentally friendly alternative to the flare stack. CRC’s clean combustion technology can be applied to almost all of the above sources of fugitive methane. The smoke-free engineered combustors are said to provide 99.99% total hydrocarbon destruction and can be safely deployed within 10m of the wellhead.

Reporting

Sharif Nawyaz from the University of Alberta presented on Canada’s national greenhouse gas reporting program. This mandatory reporting program covers all sites that emit more than 10 ktonnes of CO2-equivalent per year. A national GHG inventory, monitoring and reporting program is in preparation for six GHGs. However, as Nawyaz observed, ‘GHG emissions and climate change are global problems. The ideal solution [would be] a global policy that puts a price on all pollution across all countries and pollution-generating activities’.

James Diamond (Government of Canada) is working to ‘put Canada’s methane solutions on the world stage’ by seeking multinational engagement on methane and slowing climate change. Environment and Climate Change Canada (ECCC) manages Canada’s participation in various international bodies and holds the purse strings to the climate finance envelope. The two main international bodies are the UN’s Climate and Clean Air Coalition (CCAC) and the Global Methane Initiative (GMI). The GMI is particularly active in the oil and gas sector. A survey amongst GMI members determined that priority number one was the ‘establishment of methane emissions reductions and use’.

More from the LBCG conference home page.


OSDU update

Open subsurface data universe signs MoUs with IOGP, Energistics. OSDU to collaborate with PPDM on semantic standards. OSDU production ‘Mercury’ release (R3) imminent. OSDU scope to extend to process and facilities, ‘based on Delfi’. Streaming ‘DataOps’ pipelines. Support for LNG. Cegal jumps the gun with OSDUaaS.

The Open Group, OSDU’s parent, has signed a three-year memorandum of understanding with the International Association of Oil & Gas Producers (IOGP) to collaborate on common information standards. The Open Group is bringing OSDU, the Open Process Automation Forum and the recently announced Open Footprint Forum to the table. IOGP currently manages CFIHOS, the capital facilities information handover standard (see elsewhere in this issue) and is working to develop IDR, an Industry Digitalization Roadmap*, which ‘has touchpoints with the strategic direction of the OSDU Forum’. The partnership is destined to ‘enrich OSDU with IOGP-curated standards and technologies’ and also ‘to facilitate IOGP in delivering the IDR in an Agile manner’.

* No sign of this on the IOGP digitization page but probably a reference to its work with CFIHOS, see elsewhere in this issue.

Energistics has signed a memorandum of understanding with The Open Group covering collaboration between their respective memberships and ensuring that the Energistics standards evolve to meet the requirements of The Open Group’s OSDU data platform. Energistics CEO Ross Philo explained, ‘While a number of companies are members of both The Open Group and Energistics, there is not a complete overlap. To enable seamless collaboration between stakeholders in both groups, we will establish processes to allow the harmonization of priorities and cross-member participation. We are committed to supporting the success of The Open Group OSDU Data Platform’. OSDU Forum Chair Phillip Jong added, ‘We are pleased to join forces with Energistics and leverage the existing Energistics standards. The OSDU data platform will provide implementations of these standards and expose the data to applications via OSDU APIs’.

OSDU has also re-affirmed its collaboration with PPDM, leveraging ‘semantic standards’ such as ‘What Is A Well’ and ‘Well Status and Classification’. Design teams are working to convert the PPDM data model to JSON schema and to leverage the PPDM Data Rules library and Well Identification best practices.
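By way of illustration, here is a minimal sketch of the JSON schema idea: a well record validated against a schema expressed as a Python dict, using the jsonschema package. The field names are hypothetical illustrations, not the actual PPDM-derived OSDU schema.

    from jsonschema import validate

    # Hypothetical well-record schema, illustrating the PPDM-to-JSON-schema idea
    well_schema = {
        "type": "object",
        "properties": {
            "uwi": {"type": "string"},          # unique well identifier
            "well_status": {"type": "string"},  # e.g. 'producing', 'abandoned'
            "spud_date": {"type": "string"},
            "total_depth": {"type": "number", "minimum": 0},
        },
        "required": ["uwi", "well_status"],
    }

    well = {"uwi": "100/01-02-003-04W5/0", "well_status": "producing",
            "spud_date": "2020-06-01", "total_depth": 2450.0}

    validate(instance=well, schema=well_schema)  # raises ValidationError on failure
    print("well record conforms to schema")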

The community is waiting with bated breath for the OSDU Data Platform Mercury Release (R3), due out on March 24, 2021. Mercury*, described as the first production release and a milestone for the community, adds end user functionality and enterprise-level trust and security. Data access is enhanced with ‘domain data management systems’ and data ingestion with automated enrichment capabilities to support customer-defined policies.

Other OSDU plans for 2021 include support of production and facilities data based on Schlumberger’s Delfi. Also planned is support for high-volume real-time data workflows across drilling, seismic and streaming production data with end-to-end data pipelines (a.k.a. ‘DataOps’) with reference to the ‘integration and facilitation of D-WIS products’. OSDU support for ‘Edge computing’ is also planned for low latency requirements. OSDU is to extend to become a ‘data platform for LNG’, supporting LNG business workflows and services. Finally, OSDU is planning a hosted ‘OSDU as-a-service’ platform, enabling adoption by mid-size and small operators.

* Mercury is an interesting moniker for OSDU’s latest offering, harking back to the Exxon/IBM Mercury data model of the 1990s, the grandfather of upstream data models that begat, inter alia, POSC’s Epicentre.

Cegal has jumped the gun on OSDUaaS with the announcement of a packaged version of the Open Subsurface Data Universe running in the Microsoft Azure cloud. OSDUaaS is claimed to be a ‘genuinely vendor-neutral solution for OSDU-related services’ and can integrate with other data platforms, cloud solutions and applications. Users can pick and choose the applications they need as opposed to being locked-in to a single vendor. Cegal is looking for early adopters of its solution to test the new platform and move, over the next few years, from current software landscapes, where data is tightly coupled with applications, to a ‘full cloud-native platform’ for subsurface workflows. More from Cegal.


On our raison d’être

Oil IT Journal editor Neil McNaughton walks through the current issue explaining the whys and wherefores of our reporting. Our scope has expanded since we started out in 1996. This issue provides a snapshot of evolving oil and gas information technology in the face of environmental pressures and the onslaught of data science.

In this editorial, I’d like to walk through the current issue and explain to subscribers and non-subscribers something of our philosophy, where we are now, how we got here and where we should be heading. Oil IT Journal started life as Petroleum Data Manager, back in 1996 when data management in the oil industry became an ‘issue’. Around the year 2000 we rebranded as Oil IT Journal, expanding coverage to include all things digital that impact oil and gas exploration and production. That includes geoscience, GIS, reservoir and petroleum engineering, plant and process, finance and more.

I am a rather literal sort of person. When I first heard talk of interoperability and ‘breaking down the silo boundaries’, I thought, ‘way to go!’. So, our broadened scope set out to offer specialists in one domain, a glimpse at what is happening in the other silos. This is still our guiding principle, for yes, the silos are still there and there are probably more of them today than when we started out.

So to start at the top. In this issue’s lead we report from the LBCG* Canadian Onshore Well Site/Emissions Reduction conference which provides a situation report from the intersection of regulations, emissions mitigation technology and software. Emissions control is fast becoming a key facet of oil’s license to operate: a silo well worth watching and participating in.

We report on recent developments in OSDU, the open subsurface data universe, where you can read about the jostling for standards supremacy (call it ‘collaboration’) that is ongoing between The Open Group, Energistics, IOGP, PPDM and D-WIS. OSDU is expanding to silos well beyond its initial subsurface scope. Expanding scope is a high risk, high reward activity! One wonders if the ‘fail fast’ approach will be followed at the macro-level (unlikely).

Our review of Enders A. Robinson’s Basic Wave Analysis is quite an eye-opener for those stuck in the latest silo of ‘data science’. Want to write code for full waveform inversion? ‘Forget about AI and machine learning’. What you need is ‘geophysical theory as conveyed by books and journals’. ‘Since the 1950s, geophysics has seen major advances from the use of computers. None of these major advances have been the result of machine learning’.

A bold statement! But Robinson’s sentiments are echoed in our report on the SEG’s ‘Energy in Data’ webinar on physics-based models (à la Robinson) vs. data-driven models. This included some interesting discussions on the applicability of machine learning in different situations. Where all you have is data and not much idea of the physics, well, you don’t have much choice. Elsewhere, hybrid data and physics models are desirable, although they may be hard to realize. In circumstances where the data is ‘small’, a ‘big’ neural net may not be such a great idea.

Our report from the ECN** ML in Oil and Gas event offers plenty more on the data-driven side of the equation. Oil patch data is not just ‘big’, it is also ‘convoluted’. Presentations look into where best to apply ML and how to explain the results to a skeptical scientist. We hear from Walmart on an issue that has been central to corporate IT since our early days: how much work do you do in-house and how much do you outsource? Walmart established its NexTech unit to ‘minimize dependence on vendors for thought leadership and innovation’. But not all companies have such resources. Some may be more interested in Riverford Exploration’s exposé on big data for small companies. The ECN event also covered one of the few uses of graph databases that we have come across (although we have reported on the technology before). A Lawrence Livermore presentation returned to the physics/data conundrum to advocate ‘physics-informed’ neural nets and ‘fat’ neurons that include physical processes within the model.

Conferences and publications organized by the learned societies have various rules regarding ‘commercial’ papers and presentations. The result is that the software used in a particular study is downplayed or even unmentioned. The orgs (and oils) also have difficulty with data release and publication. What do we do? Well, we always try to tell you what software is being used, whether commercial or open source. We add links to the vendor’s website (at no charge) and/or to a Git repository for the code. Our aim is not to promote any particular software but to provide a useful service to the reader. Scientific publishing is not immune to commerciality and hyperbole, as we show in the article on The Turing Institute’s finite element modeling ‘breakthrough’, an illustration of how we try to separate the facts from the marketing/science/computing confusion. Speaking of confusion, our report on the engineering construction standards space shows more jostling for position between USPI NL, IOGP/CFIHOS and even the venerable ISO 15926!

So there you have it. Since we started out as Petroleum Data Manager in 1996, we have undergone plenty of our own ‘good’ scope creep, into fields well beyond data management. We tell things as we see them, hopefully as they are, and we always appreciate feedback and correctives if we get things wrong.

* LBCG, the London Business Conferences Group

** ECN, the Energy Conference Network


Book Review: Basic Wave Analysis

Illustrious geophysicist Enders A Robinson*’s new book Basic Wave Analysis** (BWA) sets out to explain the fundamentals of computer processing in exploration geophysics. After reading the introduction we imagined an exchange between the authors and their editor who asks why there is no mention of big data, artificial intelligence and machine learning. BWA’s introduction is an impassioned answer to our imagined query, and its content an erudite exposé of centuries of ‘prior art’.

In the introduction, BWA imagines the task of writing a full waveform inversion (FWI) code. Forget about AI and machine learning. The person must depend upon their own intellect, not upon AI. As a prerequisite, the person would have to be well versed in geophysical theory conveyed by books and journals. Since the 1950s, geophysics has seen major advances from the use of computers. None of these major advances have been the result of machine learning. All of the existing codes have been written by geophysicists. BWA sees a decidedly bleak future for a world dominated by AI where ‘inventive science would disappear’. There may be a time when ‘vast computational algorithms do whatever you like’, but sooner or later ‘something would break down and no one will be able to fix it’. Geophysicists need to learn the basics of computer programs so that geophysical tools, talents and geological models continue to be refined, leveraging increasingly powerful computing.

BWA distinguishes itself from other texts by ‘reminding the reader of our pioneering ancestors of scientific research’. The text is peppered with historical backgrounders to theory with reference to Gauss, Huygens, Leibniz, Newton and many others. Vignettes of the scientific greats introduce elements of wave theory. While not exactly a history of science textbook, BWA provides plenty of pointers to where an inquisitive reader might look for more. BWA closes the circle on the history of science and computing with the observation that today, ‘the bounty of data resources and computing facilities is beyond anything that could have been imagined a few years ago’. As Gauss observed, ‘it is not knowledge but the act of learning, and not possession but the act of getting there, which grants the greatest enjoyment’.

So is BWA enjoyable? Its three parts (of increasing difficulty) address velocity, raypath and wavefront analysis. We have only dipped our toes into the 400-page work but found it rewarding for its broad historical narrative. For instance, a discussion of velocity analysis starts with an account of the impact of the Lisbon earthquake of 1755 that ‘heightened the debate between intellectuals of the age on their views of reason and religion’, and goes on to compare Leibniz’ and Voltaire’s views of the world and of religion that introduced the age of enlightenment. All in the context of an intriguing story of 18th Century science and experimental investigations of isostasy. There is some serious erudition here combined with an entertaining writing style and some fascinating commentary. For instance, Alexander Pope in his Essay on Criticism (1711) ‘effectively describes the objective function’ in a short poem.

So yes, BWA is immensely enjoyable, but how does it perform as a textbook? Here, our opinion was colored by a recent blog posting from Matt Hall on ‘illuminated equations’ which makes BWA (and perhaps all textbooks) look a bit dowdy. There are some inconsistent levels of explanation. The ‘phasor’ is dumped in as a formula without explanation. This contrasts with Wikipedia’s more complete coverage and elegant graphics. Sometimes the commentary veers off into the far side, as when a discussion of dimensionality extends to Einstein’s four-dimensional spacetime and the possibility of ‘more than four dimensions of spacetime, string, bosonic and superstring theories’. The discussion comes back down to earth with the observation that geophysics does not generally extend beyond classical physics, which itself ‘has enough complications to satisfy the enquiring mind’. But for the key topics, such as full waveform inversion, the explanation of both the science and the math is accessible. BWA takes its time with its expositions.

BWA is replete with historical allusions and sometimes reads like a life of the scientific and geophysical saints. Fermat, Huygens, Morse, Fessenden, von Mintrop … all names to conjure with. BWA provides historical chapter and verse of the early days of geophysics including some background on the geophysical contracting companies (DeGolyer, Karcher …). Dirac gets a section but curiously, poor old Joseph Fourier escapes treatment, an oversight that is compensated for by the copious (67) references to Fourier math. BWA is a great addition to the geophysical literature, complementing Geophysics in the Affairs of Mankind that we reviewed back in 2001.

* More on Robinson on the excellent Engineering and Technology History Wiki.

** With co-author Tijmen Jan Moser. SEG Geophysical Monograph Series N° 24, an SEG/EAGE Co-Production. ISBN 978-1-56080-372-0. Available from the Society of Exploration Geophysicists: $106.00 (list); $59.00 (members).


FORCE AI Challenge results

And the winner is … Olawale Ibrahim

The results from the artificial intelligence challenge issued last year by Norway’s FORCE industry body are now in. The challenge involved predicting lithology from a labeled set of North Sea logs from Norway’s Diskos repository. Labeling was performed by Stavanger-based Explocrowd and I2G.cloud. The open data set now comprises ‘the first truly open lithology interpretation dataset in the world’. Some 400 teams took part in the contest. The winner was Olawale Ibrahim, an applied geophysics student from the Federal University of Technology Akure, Nigeria. Runners-up were the GIR research team from Brazil’s Universidade Estadual do Norte Fluminense and the Lab ICA Team at the Pontifical Catholic University of Rio de Janeiro.
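For readers curious about the mechanics, here is a minimal sketch of the contest task under stated assumptions: a random forest classifier trained on synthetic stand-ins for wireline log curves. The real FORCE dataset has many more curves and lithology classes, and its own scoring rules.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    X = rng.normal(size=(5000, 4))     # stand-in for GR, RHOB, NPHI, DTC curves
    y = rng.integers(0, 3, size=5000)  # stand-in lithology codes

    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2)
    clf = RandomForestClassifier(n_estimators=100).fit(X_train, y_train)
    print("hold-out accuracy:", clf.score(X_test, y_test))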

Commenting on the results, the challenge organizer, Equinor’s John Borman observed, ‘Comparing the machine predictions to inhouse data and vendor purchased data I came to conclude that the machine predictions offer an extremely valid second opinion and often highlight the shortcoming in these large databases that have been curated, somewhat inconsistently, over the past 50 years’. All data, submitted machine learning codes and final scores are here and you can read the full report of the challenge here.


Oil and Gas Construction and Process Industry standards developments

IOGP provides update on CFIHOS, the capital facilities information handover standard. USPI-NL releases draft FL3DMS (Facilities Lifecycle 3D Model Specification), working on practical implementation of ISO 15926. Mapping the CFIHOS reference data library to ISO 15926. UK’s National Digital Twin (!).

IOGP/CFIHOS

Since control of CFIHOS, the Capital facilities information handover standard, passed from the Netherlands-based USPI-NL to the IOGP in 2019, IOGP reports 2020 as a ‘year of renewal’ for the standard. Version 1.4 of the standard (first announced for 2017) was published in 2019 and is now available from the IOGP. CFIHOS is now an IOGP Joint Industry Program (JIP36) with its own website and LinkedIn page. IOGP has also produced a video outlining the CFIHOS story and purported benefits. CFIHOS’ ‘ultimate goal’ is now described as becoming ‘the go-to standard for the information supply chain’. A ‘point release’ 1.4.1 was issued early in 2021 with some improvements to the data model and revised documentation, notably an ‘Implementation Guide for Principals’. The Reference Data Library (RDL) is unchanged. More from the CFIHOS standards home page.

The CFIHOS RDL has its roots in the venerable ISO 15926 suite of process industry standards. The CFIHOS intent, as we understand it, was to simplify the ISO standard with a shift from the earlier ‘semantic web’ technology advocated in ISO 15926 to a simpler Excel/CSV format. A quick look at the 1.4 published standard shows that this is not just a ‘simple spreadsheet’. The key ‘Using the data model’ document is a 57-page explainer replete with entity-relationship models covering the vast scope of the engineering domain.
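By way of illustration, a minimal sketch of what consuming such a CSV-based reference data library might look like; the file and column names are hypothetical, not the published CFIHOS RDL layout.

    import pandas as pd

    # Load a CFIHOS-style reference data list exported to CSV (hypothetical file)
    rdl = pd.read_csv("cfihos_equipment_classes.csv")

    # Look up an equipment class by name
    match = rdl[rdl["class_name"].str.contains("centrifugal pump", case=False, na=False)]
    print(match[["class_code", "class_name", "definition"]])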

USPI-NL

USPI-NL reports progress on its Facilities Lifecycle 3D Model Specification (FL3DMS), a standard for 3D model design integration, for creating digital twins and for the replication and reuse of 3D designs. The draft standard has been created using existing 3D model company standards and the best practices of all the participants. It focuses on the technical requirements for the creation and handover of 3D models such that the 3D model can be used for different use cases in projects and operations. Current membership includes Equinor, Shell, BP, ExxonMobil, Total, IOGP and service providers. The first draft of the new spec was issued to membership in January 2021 with help from Equinor.

USPI, through its activities in ISO TC 184/SC 4, is working to make the ISO deliverables easier to use, more accessible and aligned with the latest technologies. The ISO 15926 reference data class library (Part 4) is to be extended to support industry projects such as CFIHOS. The practical implementation of ISO 15926 will be the subject of a workshop that USPI will organize in the first half of 2021. More from USPI and on LinkedIn.

ISO 15926

Hans Teijgeler, retired senior manager of information management at Fluor and ISO 15926 evangelist, has been working with the CFIHOS published standard to show how the CFIHOS RDL relates to the ISO 15926-4 RDL (Teijgeler has also mapped and harmonized the DEXPI* RDL with ISO 15926). Teijgeler has kindly contributed a short note ‘ISO 15926 - a status report’ which we publish as an Oil IT Journal contributed paper.

* Data exchange in the process industry.

UK National Digital Twin

Another ISO 15926 luminary, Matthew West (formerly with Shell) reports his involvement in the UK’s National Digital Twin program a partnership between the UK Government and Cambridge University that is to develop ‘an ecosystem of connected digital twins to foster better outcomes from the built environment’. That should be easy!


DISKOS 2.0 awarded

Norway’s subsurface shared data platform returns to Halliburton/Landmark. Kadme gets trade module. New Guide to the Resource Regulations issued.

NPD, the Norwegian Petroleum Directorate, has announced the award of the two-part Diskos 2.0 data contract to Halliburton/Landmark (subsurface) and Kadme (trade). The subsurface component covers seismic, well and production data. The trade solution covers data exchange and trade managed by Norwegian Oil and Gas, with user rights to the data sets awarded to Kadme. The contracts have a total duration of eight years: five years plus three option years. The estimated total value is NOK 157 million.

Kadme was a partner in the current Diskos manifestation. The return of Landmark displaces CGG’s ‘Akon’ data management solution. DISKOS was originally run by an IBM-led consortium, which developed the PetroBank data management system. This was later taken over by Halliburton unit Landmark Graphics. At its 2004 renewal, PetroBank was retained, but operations passed to Schlumberger. In 2009, Landmark was back in the driving seat, to be replaced again by CGG in 2015 operating its own Trango-derived ‘Akon’ data management platform.

NPD has also released an updated ‘Veileder til Ressursforskriften’ (Guide to the Resource Regulations), only available in Norwegian at the time of writing. The guide covers, inter alia, the unambiguous identification of production metrology, and the mandated use of the NPD’s unique identification codes (NPID), discovery and well designations. Visit the DISKOS home page.


Physics versus machine learning models

AAPG/SEG/SPE Energy in Data webinar hears from Hess on data-driven models in shale exploration. Corva on ROP drilling prediction. Schlumberger – use both ML and physics! Xecta – don’t use ML on small data! Data-driven reserves reporting for anyone?

Energy in Data, a joint venture between the SEG, AAPG and SPE, aims to ‘lead the digital journey of the energy sector’ and holds free monthly webinars and other events. We attended a recent webinar on ‘Physics-based vs data-driven models’, subtitled ‘Engineering in a virtual sub-surface’.

Invited by compere Siddharth Misra (Texas A&M) to ‘give us your best shot’ at outlining the data-driven/physical model conundrum, Sebastian Matringe (Hess) cited the ‘huge reliance’ on data-driven methods in shale exploration. Classical reservoir models are ill-adapted or too slow. So, ‘just take well data and use machine learning to decipher the relationships’. Most shale operators use data-driven methods today. Kriti Singh (Corva) cited successes in optimizing and predicting drilling rate of penetration (ROP). ROP has been modeled since the 1930s with physics-based models of weight on bit and other parameters. Later approaches added statistical models and, most recently, ML. Now there are tools that use real-time data. But it should be noted that these are not data-driven alone. Most are hybrid models (see the sketch below); you still need to understand the physics. There is a balance between how hard it is to acquire all the data needed for a physical model and how expensive a mistake would be. For space exploration, mistakes are expensive, so physics is used*. Ravinath Kausik (Schlumberger) observed that physics should be the ‘default that we trust’; data science comes in after. ‘They can aid one another’.

* Wells are expensive too!
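One way to realize the hybrid models Singh describes is to correct a physics-inspired baseline with a data-driven model trained on its residuals. Below is a minimal sketch under stated assumptions: the power-law ROP baseline and the synthetic data are illustrative only, not Corva’s model.

    import numpy as np
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(1)
    wob = rng.uniform(5, 25, 500)    # weight on bit (synthetic)
    rpm = rng.uniform(60, 180, 500)  # rotary speed (synthetic)
    rop_measured = 0.08 * wob * rpm ** 0.8 + rng.normal(0, 5, 500)  # 'field' data

    rop_physics = 0.07 * wob * rpm ** 0.8            # imperfect physics baseline
    residual = rop_measured - rop_physics            # what physics fails to explain
    features = np.column_stack([wob, rpm])
    ml = LinearRegression().fit(features, residual)  # data-driven correction

    rop_hybrid = rop_physics + ml.predict(features)
    print("mean abs error, physics only:", np.abs(rop_physics - rop_measured).mean())
    print("mean abs error, hybrid:", np.abs(rop_hybrid - rop_measured).mean())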

The discussion turned to the reliability of data-driven models. While conventional models are studied and validated by experts, this is not usually done in oil and gas for data-driven models. These need to be made more robust and explainable. ‘You don’t want black boxes’. In unconventional exploration, companies are finding it hard to take the results from core acreage to outlying areas. ‘There was some hype earlier on…’. ‘Neural nets are not for all circumstances. It might be better to use multi-variate regression that people can understand’. ‘Data-driven is influencing oil and gas hiring’. ‘Data science is the sexiest job of the 21st Century’.

On the subject of convergence of DD and physical models, Kausik suggested using both on the same problem: data science to segment data, then apply physics. Data science has a long way to go for widespread application. We still don’t appreciate the uncertainty in our models: when will they fail? If you don’t have big data, don’t address the problem with a big neural net, use a simple model! ‘We don’t have as much data as Google’. Satish Sankaran (Xecta Digital Labs) observed that the industry has changed a lot over the years. Oils more or less abandoned R&D some 20 years ago. Today it is coming back. There are also many startups working in this space.

Are data-driven results appropriate/accepted for reserves booking? Matringe sees data-driven as an extension of the accepted SPEE practice of using type curves. These are ‘derived from data in order to justify proved undeveloped results’. However, ‘it’s not a simple answer, it depends on the application and there are a lot of details that have to be thought about when using these practices for something as important as reserves’.

There was pushback on the risk of ‘turning engineers into data scientists’: ‘we still need to honor physics and our engineering degrees’. Also on the real novelty of ‘data-driven’ methods, ‘the same tools we've used in the past, just with new names and better hardware’. Not according to Matringe: ‘neural nets and regression look similar but there are some significant differences, today’s NNs are a different class of methods’.

The Q&A threw up some interesting ideas, for more on constraining ML with physics, read Raissi et al. and their Github.
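For the curious, here is a minimal sketch in the spirit of Raissi et al.’s physics-informed neural networks: a small PyTorch network u(t) is trained so that its automatic derivative satisfies the ODE du/dt = -u with u(0) = 1, whose solution is exp(-t). The architecture and training settings are illustrative assumptions, not the authors’ code.

    import torch

    # Small network u(t); architecture and hyperparameters are illustrative
    net = torch.nn.Sequential(torch.nn.Linear(1, 32), torch.nn.Tanh(),
                              torch.nn.Linear(32, 1))
    opt = torch.optim.Adam(net.parameters(), lr=1e-3)

    for step in range(2000):
        t = torch.rand(64, 1, requires_grad=True)  # collocation points in [0, 1]
        u = net(t)
        du_dt, = torch.autograd.grad(u.sum(), t, create_graph=True)
        physics_loss = ((du_dt + u) ** 2).mean()   # residual of du/dt = -u
        ic_loss = (net(torch.zeros(1, 1)) - 1.0).pow(2).mean()  # enforce u(0) = 1
        loss = physics_loss + ic_loss
        opt.zero_grad()
        loss.backward()
        opt.step()

    # The trained net should approximate exp(-t)
    print(net(torch.tensor([[1.0]])).item(), "vs", torch.exp(torch.tensor(-1.0)).item())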

The Energy in Data event is well run with a slick interface that allows for considerable audience interaction. There was only one regret, the posted question ‘we talk about success stories of data-driven methods, where are the outstanding failure stories that we can learn from?’ was left unanswered. Maybe a good topic for a future webinar?
Watch the webinar recording.


Software, hardware short takes

New releases and announcements from Beicip-Franlab, Blue Marble Geographics, Bluware, dGB, Emerson/Paradigm, IHS Markit, Katalyst Data Management, Petrosys, Rock Flow Dynamics, Enverus, Assai, Brüel & Kjær Vibro, Esri, Flowserve, Stratus, RigER, Validere Technologies, GLJ, Solid Project.

Upstream

Beicip-Franlab has released OpenFlow Suite 2020/ R 2020.1, with bug fixes for OpenFlow, DionisosFlow and TemisFlow, now all available in the cloud.

The 2020 SP1 release of Blue Marble Geographics’ Geographic Calculator includes the ISO 19111:2019 upgrade to the IOGP/EPSG data model and an online point-to-point geodetic calculator.

Bluware has rolled-out OpenVDS+, adding wavelet-based data compression to its OpenVDS seismic data format for the cloud.

dGB is offering a combo of seismic machine learning and pre- and post-processing plugins at a discount. The new package, which includes the ML plugin and OpendTect Pro, Dip-Steering and Faults & Fractures, is available for purchase or rental from dGB’s Prostore.

Geolog 20 from Emerson/Paradigm comes with usability enhancements, improved nuclear magnetic resonance log visualization and a redesigned core analysis workflow. The petrophysical toolkit has been expanded to cover unconventional assets and the specific requirements of the Russian/CIS market. Geolog now also supports data import/export in OSDU formats.

Emerson/Paradigm has also released ‘SpeedWise Reservoir Opportunity’ (SRO), an automated, cloud-native analytics solution developed in collaboration with Quantum Reservoir Impact (QRI). SpeedWise applies algorithms and data mining to multidisciplinary data to increase production, reserves and capital efficiency. Workflow automation cuts workloads from months to weeks, resulting in shorter decision-making cycles and better risk management. SRO features automated geo-engineering workflows for identifying and ranking recompletion, vertical sweet spot and horizontal well opportunities.

The 2020 edition of IHS Markit’s Petrel Gateway enables Kingdom geoscience interpreters to leverage Schlumberger’s Petrel for geomodeling. 2D/3D seismics, horizons, faults and more data types can be shared between the two platforms. More from the release notes.

Katalyst Data Management has added a subsurface data search engine, powered by Elasticsearch, to the latest release of its iGlass data management solution. iGlass Portal ES supports freestyle text searches across seismic and well data.

Another Elasticsearch deployment comes from Petrosys, which has added the open source technology to its E&P data crawler. Petrosys Intelligent Search was showcased in a presentation, ‘Can Elasticsearch help us access large Oil & Gas datasets more efficiently?’, to the recent virtual conference of the Society of Petroleum Data Management.
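As a minimal sketch of the freestyle search pattern described in these two items, assuming the official elasticsearch Python client (7.x-style syntax) and hypothetical index and field names, not Katalyst’s or Petrosys’ actual schemas:

    from elasticsearch import Elasticsearch

    es = Elasticsearch("http://localhost:9200")
    # Freestyle text query across indexed seismic and well metadata
    results = es.search(index="subsurface-data", body={
        "query": {"query_string": {"query": "migrated AND \"North Sea\" AND 3D"}}
    })
    for hit in results["hits"]["hits"]:
        print(hit["_score"], hit["_source"].get("item_name"))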

Petrosys has also announced Petrosys PRO 2020.2, with a new display of well perforations from Petrel and raster tracking of contours, faults and other linear features from scanned images. Also new is Interica OneView archiving integration, a link between OneView and Petrosys PRO for direct archival of PRO projects.

Rock Flow Dynamics has released tNavigator 20.4 with updates and improvements to the simulator kernel, AHM and Uncertainty module, Geology and Model Designer, PVT Designer, Well and Network Designer.

Environment

Enverus has released ESG Analytics, a new evaluation standard for environmental, social and governance benchmarking across the energy industry. ESG Analytics tracks emissions intensity, flaring rates, land and water use via satellite, production and economic data. The "S" and "G" elements track pay disparity and diversity, allowing operators to benchmark themselves against peers and provide investors with verifiable data. More from Enverus.

Plant & process

Engineering document specialist Assai reports that 60% of its clients now use the AssaiCloud, the SaaS version of AssaiDCMS. The cloud edition offers managed software upgrades, patches and back-ups along with management of third-party software, including Oracle and Brava. Watch the AssaiCloud/REST API webinar.

Brüel & Kjær Vibro has announced Vibrostore 100, a palm-sized device that provides vibration level and bearing wear monitoring at the push of a button. The device houses ISO 10816 machine data and alarm limits. A traffic-light display indicates vibration severity at different frequencies that indicate common machine faults, such as imbalance, misalignment or looseness.

Esri has released ArcGIS Velocity (previously ArcGIS Analytics for IoT), with new cloud-native capability for ingestion, processing, visualization, and analysis of real-time and high-volume geospatial data on the fly. AGV spatially enables Internet of Things data and simplifies real-time analysis.

Flowserve has released the RedRaven internet of things services platform. RedRaven lets operators monitor assets remotely, anticipate equipment failures and take preventive measures. RedRaven supports any flow control equipment regardless of manufacturer. RedRaven sensors and gateways are available with wireless or wired options. Data is collected from assets, encrypted and transmitted to the cloud. Technicians at Flowserve’s dedicated remote facility monitor client operations and inform clients of issues and suggested fixes.

A new white paper from Stratus vaunts the merits of its ‘zero-touch’ edge computing platforms for midstream operators. Modern, distributed computing architectures bring asset performance, monitoring and control and logistics visibility outside the fence.

e-Business

RigER 8.0 Odessa, an oil country equipment tracking, billing and rental reporting package, now includes improved data management and information security along with mobile ERP and CRM. A built-in email client tracks communications and eliminates cut and paste, double data entry and lost emails. A barcode reader adds inventory control. ISO 27001 information security comes with all Odessa packages.

Validere Technologies has launched Edge Connect to provide oil and gas buyers and sellers with ‘greater optionality, discoverability and increased profit opportunities’. Edge Connect uses predictive machine learning and a ‘vast’ data set to provide critical intelligence and ‘guide win-win connections that benefit all participants’.

Miscellaneous

Calgary-based GLJ has announced IntelliCast, a subscription service that offers rapid oil and gas asset valuations. IntelliCast combines machine learning, decline analytics and five decades of play knowledge to provide insights for producers, debt and equity stakeholders and mid-streamers.

Speaking at the January 2021 Solid World conference, world wide web inventor Tim Berners-Lee presented the Solid Project, a suite of open specifications, built on existing open standards, that describes how to build applications so that users can ‘conveniently switch between data storage providers and application providers and take the data generated along’. The objective is for ‘all applications to be truly interoperable with multiple backend software’.


EAGE DIGITAL 2020

Equinor’s Seismic-ZFP compression. Aker BP’s ‘SWAP’, an OSDU-like ‘vendor-independent workflow architecture’. Cognite demos SWAP on Diskos dataset in the cloud. ENI ‘Geo-Apps’ keep tabs on proliferating RESQML datasets. Schlumberger/Equinor ExplorePlan, ‘get better at estimating’. OMV’s machine learning replaces well tests. TNO on data-driven well event detection. Schlumberger reports on ISAPP Olympus well placement challenge. Panel debates graph technology in E&P, mistrust in OSDU. Equinor showboats digitalization.

Equinor has open sourced Seismic-ZFP, its seismic compression algorithm and Python library. David Wade was enthusiastic about the benefits of machine learning and big data but warned of the cost of ‘seeing all the benefits ending up in the pockets of cloud vendors’. While cloud data upload may be free, storage is costly, as is data egress. The trick is to compress data and cut storage costs. Enter LLNL’s ZFP algorithm for lossy compression of arrays of up to 4 dimensions. Equinor has adapted seismic cube volumes to suit disk access and enable extraction of arbitrary lines. The Seismic-ZFP header holds the SEGY header plus SGZ metadata. Compressed data quality shows ‘no meaningful difference’ at 8:1 compression. ‘Even 16:1 is OK’. Compressed data quality is ‘fine’ for machine learning applications; it may even be a prerequisite for the efficient implementation of some. Download with ‘pip install seismic-zfp’ or from GitHub.
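A minimal usage sketch, following the examples in the project README (bits_per_voxel=4 corresponds to the 8:1 ratio cited above for 32-bit samples):

    from seismic_zfp.conversion import SegyConverter
    from seismic_zfp.read import SgzReader

    # Compress a SEG-Y file to the SGZ format at 8:1
    with SegyConverter("survey.segy") as converter:
        converter.run("survey.sgz", bits_per_voxel=4)

    # Read an arbitrary inline back from the compressed volume
    with SgzReader("survey.sgz") as reader:
        inline = reader.read_inline(0)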

Vidar Furuholt (Aker BP) advocates data liberation and the free flow of information between upstream applications. Aker BP’s SWAP is a proof of concept of such ‘industry 4.0’ principles, with data organized in a data layer and decoupled access via an API. ‘Real data liberation in E&P can only be achieved with standards’ (like OSDU – see below). SWAP promises a vendor-agnostic workflow framework linking apps to data with a GUI/Python scripting endpoint. The system behaves as a workflow pipeline, but with added vendor neutrality. SWAP does not touch either services or data, issuing requests on a message queue, triggering one service after another and presenting a composite output to the user. The system can be built with open source libraries or by assembling cloud services such as Kubernetes/Terraform/Docker. SWAP runs under a Python/Jupyter Notebook and leverages a Google cloud pub/sub mechanism (sketched below) with support for Google cloud storage buckets. Use cases include seismic volume transforms and filtering with Aker BP’s own frequency match algorithm. Currently, seismic volumes reside in Cognite Data Fusion as a CDF data set. Furuholt acknowledged Baringa’s help developing the solution.
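To illustrate the decoupled request pattern described above, here is a minimal sketch using the google-cloud-pubsub client. The project, topic and payload names are hypothetical, not Aker BP’s actual SWAP API.

    import json
    from google.cloud import pubsub_v1

    publisher = pubsub_v1.PublisherClient()
    # Project and topic names are hypothetical
    topic_path = publisher.topic_path("my-project", "swap-requests")

    # A processing request: a named service applied to a seismic volume
    request = {"service": "frequency-match", "volume": "gs://bucket/cube.segy"}
    future = publisher.publish(topic_path, json.dumps(request).encode("utf-8"))
    print("published request, message id:", future.result())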

Carlo Caso (Cognite AS) described a test of the SWAP concepts performed with Aker BP. Using a test data set of 2,000 SEGY datasets and 3D surveys, Cognite reproduced the DISKOS dataset in the cloud. Performant data access is provided with protocol buffers and gRPC, a high-performance remote procedure call framework. End users can quickly discover and run multiple vendor services. Cloud cost models are different and impact data flow: data egress can exceed storage cost. The cloud is optimized for web services, not HPC. Not all legacy apps are compatible with remote storage. Legacy apps often have their own data store and their developers may have a vested interest in the status quo. But industry is asking for open APIs, open standards for data formats and models, and data adaptors for legacy apps. Aker BP and Cognite are making their open API public. Cognite is also an ‘active member’ of OSDU and has contributed its API to the initiative, where it is presumably seen as an alternative/competing data component to Schlumberger’s OpenVDS.

ENI’s Marco Piantanida warned that use of Energistics’ RESQML reservoir model data format can lead to an unmanageable mess of RESQML files. A better solution is to use the Energistics transfer protocol (ETP) to allow apps to listen to each other without file data exchange. ENI has productized the approach as ‘Geo-Apps’ offering data exchange and tracking for RESQML. Binary and other log and fault data go into the official ENI data repository via a RESQML disaggregation layer. An ‘e-RESQML’ GUI controls the workflow with a neat exploded graph representation of RESQML contents. The system adds a model tracking database to RESQML. TechEdge, Kwantis and Oracle were involved in the project.

Caroline Le Turdu gave an unashamedly commercial presentation of Schlumberger’s ExplorePlan. ExplorePlan was developed to combat over-optimistic pre-drill reserves estimates as a joint Equinor/Schlumberger AI program. The solution combines Schlumberger’s Delfi ‘cognitive’ E&P environment with apps including Petrel and GeoX. The solution shares prospect data with team members for peer review along the exploration ‘funnel’. Results are aggregated as dashboards in Spotfire. The system helps operators build on existing knowledge, remove bias and get better at estimating.

Gael Joffre (OMV) presented a straightforward application of machine learning which sets out to use analytics to replace well tests*. Gas fields in New Zealand are automated and flow is commingled. Wellhead gas meters experience time delays and water encroachment, and the situation is complicated by the pipeline retention time between platform and shore. OMV uses PI historian data and a ‘hybrid’ interpretation platform built with scripts and solvers from Cognite CDF, Power BI, Grafana, Azure and Google to distinguish between flow disturbances and stable periods.

* A similar approach is currently proselytized by current SPE Distinguished Lecturer Roland Horne in his talk ‘Big Data and Machine Learning in Reservoir Analysis’.

Jonah Poort (TNO), speaking for a consortium of Wintershall, Shell, Total and others, presented on data-driven detection of well events in mature gas fields. The work is part of a larger design and maintenance program for ‘geo-energy assets’. Gas production is plagued by undesirable ‘off-normal’ events - salt, asphalt, gas/water coning. These are usually identified by human inspection. Salt precipitation is observable at the surface as decreasing flow and wellhead pressure. Historical data was scanned to locate target patterns. The algorithm picks likely matches and an operator gives a yea/nay for the match. A sliding window compares snapshots of the data with the target pattern, leveraging ‘dynamic time warping’ (sketched below). After tuning, the system found all known salt events in the data, even with 10% added noise. The test also identified 10 hitherto undetected events, 8 of which were confirmed by the operator.
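To illustrate the sliding-window pattern match described above, here is a minimal sketch with a plain dynamic time warping distance. The window length, threshold and synthetic data are illustrative assumptions, not TNO’s implementation.

    import numpy as np

    def dtw_distance(a, b):
        # Classic O(len(a) * len(b)) dynamic time warping distance
        n, m = len(a), len(b)
        D = np.full((n + 1, m + 1), np.inf)
        D[0, 0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                cost = abs(a[i - 1] - b[j - 1])
                D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
        return D[n, m]

    rng = np.random.default_rng(2)
    signal = rng.normal(size=1000)      # stand-in for wellhead pressure data
    target = np.linspace(1.0, 0.0, 50)  # stand-in 'salt event' decline pattern
    window = len(target)

    # Slide a window along the signal, flagging close DTW matches to the target
    hits = [i for i in range(0, len(signal) - window, 10)
            if dtw_distance(signal[i:i + window], target) < 15.0]
    print("candidate event windows start at samples:", hits)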

Ralf Schulze-Riegert (Schlumberger) presented on work performed for Norwegian Petoro* on expert-guided machine-learning for well location under uncertainty. The work was performed under the ISAPP** Consortium’s Olympus Challenge. Production and injection scenarios for the North Sea Olympus development were modeled as 50 equiprobable subsurface realizations giving production forecasts from 2016 to 2036. An expert interpreter identified one promising location for a drill site, and ML was used to find lookalikes. The system includes information on economic demand, well costs, costs of shared facilities and more. An interactive workflow was used to answer questions such as ‘find 90% of wells that meet economic criteria at over 80% probability’ in what was described as an ‘expert-guided ML approach’.

* Petoro manages the State’s financial interest in Norwegian North Sea joint ventures.

** Integrated systems approach to petroleum production.

A panel discussion titled ‘Digital Ecosystems – Quo Vadis?’ homed in on a discussion around graph technology. Andreas Blumauer from the Semantic Web Company was asked to provide ‘concrete examples of automated metadata extraction from unstructured subsurface data’. Blumauer intimated that this was ‘a little bit under NDA’ but provided a pointer to SWC’s large lithology knowledge graph developed for the Austrian Geological Survey. There was also a lot of interest in OSDU. Schlumberger’s Jamie Cruise opined that these new platforms are very flexible as there are no schemas ‘baked into code’. OSDU started with seismics and wells and is now extending to production and ‘thinking about new energy, methane emissions’. ‘There is some mistrust in OSDU’ with regard to Schlumberger’s dominant position. But ‘there is room for competition’. Target’s Ali Al Mujaini stated that ‘OSDU needs to be much more open and not just for the supermajors. People still struggling to deploy on the cloud and benefit from AI’.

Finally James Elgenes (Equinor) gave a showman’s view of digitalization in exploration. Digital technology at the Johan Sverdrup field has ‘boosted earnings’ by over $200 million since startup. Equinor’s formula for such success is quality data, data science and analytics, discipline experts, new ways of working (fail fast) and external collaboration, notably with Microsoft, provider of Equinor’s Omnia cloud database. Equinor’s enthusiasm for AI and digital transformation has led to new investments in companies such as KoBold Metals and Earth Science Analytics. OSDU is a ‘key component’. Equinor believes in data sharing*, viz. Northern Lights, Sleipner CCS and Volve. Reworking old data and reports has led to new discoveries. Data needs to be churned systematically. Agile teams involve 7 personae (computer geoscientists, analytical geoscientists, data scientists and, last and hopefully not least, the subject matter expert). The latter is ‘an engaged specialist open to new ways of working’. Examples of Equinor’s work are ‘WELP**’, well extraction log processing, i.e. the automatic creation of composite logs; well analytics with (and without) ML for poroperm determination; and seismic data analytics with deep learning on post-stack data. Another theme is the value of information: to ‘spend money wisely’ and to keep tabs on oil product quality in different geographies. ‘You need to get people competencies right such that everyone has a place’. Data literacy is now a key competence.

* For an analysis of Equinor’s data openness read Matt Hall’s blog.

** See also this work by Erlend Viggen.


Folks, facts, orgs ...

Appointments and nominations at Borr Drilling, CAM Integrated Solutions, Chevron, ExxonMobil, Enverus, Fugro, Genasys, Halliburton, Independence Contract Drilling, ISA, Lufkin, NarrativeWave, NPD, Oceaneering, OPC Foundation, ProPetro, Quality Companies, Raymond James, Texas Railroad Commission, Seadrill, SEG, SPEE, Tellurian, TETRA, The Open Group, Williams, ConocoPhillips, IFP School, Inpex, Ryder Scott.

Magnus Vaaler is now Borr Drilling’s CFO, replacing Christoph Bausch.

Horacio Tinoco has been promoted to Director of Automation at CAM Integrated Solutions.

Marillyn Hewson (Lockheed Martin) is now a Chevron board member.

Len Fox is VP and controller at ExxonMobil replacing David Rosenthal who is to retire. Tan Sri Wan Zulkiflee Wan Ariffin (former CEO, Petronas) is now a board member.

Manuj Nikhanj is now President of Enverus. He was previously president and co-CEO at RS Energy Group, acquired by Enverus last year.

Fugro has opened a new state-of-the-art remote operations center in Abu Dhabi. Paul Verhagen is stepping down as Fugro’s CFO and member of the board to become CFO and member of the Management Board at ASM International NV.

Genasys has opened new offices in Singapore and Dubai. Sean McKane is VP of Business Development for Southeast Asia. Peter Ayre is VP of Business Development for MEA and Central Asia. And Mauricio Tellez is VP of Business Development for Latin America.

Van Beckwith is now Halliburton’s EVP, Secretary and Chief Legal Officer. He succeeds Robb Voyles who is stepping down after seven years with the company.

Independence Contract Drilling has promoted Philip Dalrymple to SVP Operations, Scott Keller to SVP Business Development, Katherine Kokenes to VP & Chief Accounting Officer and Marc Noel to VP Sales & Marketing. Stacy Durbin Nieuwoudt is now a member of the ICD Board.

Megan Samford (Schneider Electric) and Sharul Rashid (Petronas) are now members of the ISA Global Cybersecurity Alliance advisory board.

Saeid Rahimian is CEO at Lufkin. He hails from Aereon.

Ed Feo (Coronal Energy), Dennis Meany (Oatfield), Michael Grenier (BluOx Ventures), Mel Badheka (Greensphere-ESG) are now advisory members at NarrativeWave.

The Norwegian Petroleum Directorate appointed Hilde Nordbø as head of data management, Tommy Rafos as head of IT development and operations and Louis Vos as head of finance and joint services.

Karen Beachy and Kavitha Velusamy are now members of Oceaneering’s board of directors. Beachy has been appointed to the compensation committee and Velusamy to the audit committee.

Peter Zornio (Emerson) has joined the OPC Foundation board of directors.

Larry Lawrence is now a member of ProPetro’s board of directors.

Following its recent acquisition by Quality Companies, ex-EPS personnel Randy Lepretre, Jonathan Mayer, Scotty Lepretre, Becky McManus and Jonny Broussard are joining the company.

Stephen Woima is director, head of acquisitions and divestitures, at Raymond James.

Jim Wright is the new Texas Railroad Commissioner. He has appointed Kate Zaykowski as Director of Public Affairs, Christopher Hotchkiss as General Counsel and Megan Moore as Executive Assistant.

Seadrill has appointed Reid Warriner as COO and Leif Neilson as CTO.

Jim White is the new SEG executive director.

Steve Gardner (Ryder Scott) is now a member of the SPEE board of directors.

Octávio Simões is President and CEO at Tellurian. Jonathan Gross (Jexco) and Jean Abiteboul (GIIGNL) are now independent board members.

Paul Coombs is to retire from TETRA’s board of directors.

The Open Group has appointed Andras Szakal as VP and CTO.

Rose Robeson is now a Williams board member.

Tim Leach has been named ConocoPhillips’ EVP, Lower 48 and a member of the board.

Pascal Longuemare is now director of the Centre des Motorisations et Mobilité Durable at IFP School, succeeding retiree Pierre Duret.

Inpex has appointed Isao Takahashi VP Abu Dhabi Projects, Takeshi Yoshida GM Technical Planning & Coordination Unit, Technical division, Yoshihiko Kurokawa GM Operated Projects Support Unit, Technical Division and Makoto Suda Project GM, Technical Division.

Deaths

Ryder Scott has reported the deaths of Douglas McBride (PE) and Ronald Arthur Lenser, an early leader of the company.


Done deals

C3.ai IPO popular. Datagration acquires Mosaic Petroleum Analytics. DNV GL acquires ERS. geoLOGIC systems bags SubsurfaceIO. Hexagon buys PAS Global. ION Geophysical goes ‘asset light’. Gordon Technologies bags Lodestar. Quality Companies acquires EPS unit. Aucerna and Quorum Software merge, bagging Energy Components en route. Schlumberger New Energy, CEA form Genvia clean hydrogen venture. Spectris sells Brüel & Kjær Vibro. TechnipFMC resumes split. Xait acquires BlueprintCPQ.

C3.ai’s IPO was popular with investors. Priced at $42, the stock opened at $100 the following day. C3.ai provides ‘enterprise AI’ to companies such as Caterpillar, Baker Hughes and Shell.

Datagration has acquired Mosaic Petroleum Analytics, a data analytics, reservoir simulation and economics platform for unconventional reservoirs. MPA is to be integrated with Datagration’s PetroVisor platform, a ‘proven unconventional asset workflow’. More from Datagration.

DNV GL has acquired US-based engineering consultancy Energy and Resource Solutions, adding domain expertise and a ‘digital first’ approach to its energy management capabilities. DNV GL is also to combine its current Oil & Gas and Power & Renewables businesses into a new business area called Energy Systems.

geoLOGIC systems has acquired Houston-based SubsurfaceIO (SSIO), a provider of oil and gas cloud-based mapping and analytics solutions that connect and integrate public and proprietary data. The SSIO platform speeds data handling, cleaning and loading with a ‘unique data-agnostic micro-services architecture’. SSIO is said to be an early adopter of the OSDU standards.

Hexagon (Intergraph) has acquired PAS Global, a provider of operational technology cyber security solutions. PAS’ OT Integrity solution is deployed at ‘13 of the top 15 refining and 13 of the top 15 chemical companies’.

Following a restructuring agreement, ION Geophysical is to implement ‘certain restructuring transactions’ that will result in a refined ‘asset-light’ strategy and extend its December 2021 bond maturity by four years to 2025. More from ION.

Scott, LA-based Gordon Technologies has acquired Lodestar International and its MaxWell Downhole Technology subsidiary. Lodestar provides measurement while drilling tools.

Quality Companies is acquiring the Production Management division of Expeditors & Production Services (EPS), a supplier of contract production management personnel and expertise.

Aucerna and Quorum Software are to merge and operate under the Quorum Software name. Quorum has also announced the pending acquisition of TietoEVRY’s oil and gas software business, including Energy Components (production accounting) and DaWinci (personnel and materials management). The TietoEVRY deal is said to have an enterprise value of €155 million. More from Quorum.

Schlumberger New Energy, CEA* and other partners have received European Commission approval for the formation of Genvia, a clean hydrogen production technology venture. Genvia is to develop a ‘game-changing’ electrolyzer for clean hydrogen production to be deployed in a hydrogen ‘gigafactory’ at Béziers, France.

* CEA (originally the ‘Commissariat à l’énergie atomique’) is now the French alternative energies and atomic energy commission, an R&D body with 20,000 employees.

Spectris has sold its Brüel & Kjær Vibro business to Tokyo-based NSK for €180 million.

TechnipFMC is resuming its plan to separate into two publicly traded companies. At the same time, Bpifrance* is to invest $200 million in Technip Energies, acquiring shares from TechnipFMC and becoming a ‘long-term reference shareholder’ of Technip Energies.

* BPI France, a public-private partnership, is a ‘one stop shop for entrepreneurs’.

Stavanger-headquartered tender software specialist Xait is acquiring BlueprintCPQ, a provider of configuration, pricing and quotation software. Xait reports ‘strong growth’ from its position in the global oil and gas supply industry, with major corporations such as Honeywell, Siemens and ABB on the client list. More from Xait.


2020 Energy Conference Networks Machine Learning in Oil and Gas

Quantum Reservoir, ‘oilfield data is convoluted’. Shell Tech Ventures’ cash for innovators. Walmart’s NexTech unit minimizes vendor dependence. Riverford on Bureau of Economic Geology’s TORA, ‘big data for small companies’. Warwick, Neo4J graph technology for leasehold analysis. LANL’s ‘fat neurons’, physics-informed neural nets. Texas A&M, drones, AI and oil spills.

David Castiñeira (Quantum Reservoir Impact*) gave the keynote on ‘practical and value-additive ML and AI for oil and gas’. AI/ML terminology is confusing and definitions are elusive. Following a long (1980-2010) ‘winter’, AI has come back to life thanks to more data and better computers, to the extent that there is now talk of an ‘AI bubble’. Castiñeira walked through the different approaches to machine learning and potential applications in oil and gas, suggesting that a virtual assistant for reservoir management might be possible. One issue is the fact that oil patch data is not just ‘big’, it is also ‘convoluted’, i.e. complex. It can be hard to settle on exactly which oilfield KPIs to model and optimize. QRI’s ‘augmented AI’ embeds intelligent workflows, data-driven models and automation. Where components of the workflow are amenable to first-principles analysis, conventional reservoir engineering is recommended. Elsewhere, in more poorly conditioned areas, statistics and full-blown ML can usefully be added. The latter raises the issue of model ‘explainability’, addressed by chaining smaller component models that are easier to comprehend. Data processing involves a lot of moving parts. Extracting data from a wellbore diagram involved PyWin, the Google Cloud Vision API, Poppler for PDF to HTML conversion and JSON. Castiñeira presented the results of a well spacing optimization study for a Permian basin operator. Here a sparse, convoluted data set meant that classical reservoir modeling was impossible. Unsupervised ML was used to perform decline curve analysis on public and operator data. Ensemble model aggregation produced an optimized well spacing. The QRI method is the subject of a US patent application. Other examples of QRI’s work included an OCR/NLP analysis of PDF drilling reports to drive drilling efficiency.

* Last year, QRI teamed with Emerson to provide ‘AI-based analytics and decision-making tools for exploration and production’.
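
For the curious, here is a minimal Python sketch (ours, not QRI’s patent-pending code) of the decline-curve fitting that underpins such analyses, fitting an Arps hyperbolic decline to synthetic monthly rates with scipy. The well data and parameters are invented for illustration.

    import numpy as np
    from scipy.optimize import curve_fit

    def arps(t, qi, di, b):
        """Arps hyperbolic decline: rate as a function of time on production."""
        return qi / np.power(1.0 + b * di * t, 1.0 / b)

    t = np.arange(1, 37)                          # months on production
    noise = np.random.default_rng(1).normal(1.0, 0.05, t.size)
    rates = arps(t, 850.0, 0.12, 0.9) * noise     # synthetic well rates

    (qi, di, b), _ = curve_fit(arps, t, rates, p0=[1000.0, 0.1, 1.0],
                               bounds=([0, 0, 0.01], [1e5, 5, 2]))
    print(f'qi={qi:.0f} bbl/d, Di={di:.3f}/month, b={b:.2f}')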

Andrea Course explained why Shell is investing in machine learning and how interested parties can apply for some of Shell’s largesse. Shell sees start-ups as where the action is. They can move fast, take more risks and (maybe) disrupt whole industries. Investment from Shell’s Technology Ventures unit can provide venture capital for innovative companies across the energy sector. The Tech Works program applies technologies from outside oil and gas to ‘solve today’s energy challenges’. STV currently holds participations in some fifty companies around the world working in oil and gas, new energies and other sectors. In the AI/ML space, Course cited Bluware (geoscience), Akselos (digital twins), Innowatts (electric load forecasting), Cumulus (leak mitigation) and Veros (rotating equipment monitoring).

Anna Jarman (Walmart Global Business Services) provided an outsider’s view of innovations in emerging energy technology. Walmart established its NexTech unit to ‘ensure that Walmart associates were actively engaged at the front end of the technology wave to minimize our dependence on vendors for thought leadership and innovation’. WalmartTechATX is a satellite of NexTech that is responsible for the enterprise technology functions that keep Walmart running. The focus is a mix of data science and emerging technology, with solutions that leverage natural language processing, machine learning, cloud computing and AR/VR. Solutions rolled-out to date include conversational chatbots for user interaction. These are built with Microsoft LUIS to determine user intent and offer a friendly user experience. XR Tech is to provide ‘immersive augmented analytics’ that will allow users to view huge datasets and ‘explore big data across many dimensions at once’. Anticipated usage includes spotting zero-day cyber threats using graph datasets. Jarman warned that ML ‘is not a magic bullet and cannot solve every business problem’. ML excels in areas where rules are difficult to apply or where data sets are large, mixed (convoluted as Castiñeira might say) and where outcomes cannot be obtained ‘by applying a set of explicit rules’. SVM*s are the workhorse of Walmart’s AI, used to classify and intelligently explore its vast inventory. SVMs can be trained via linked data relationships, reducing cost and increasing the accuracy of predictions.

A talk from Walmart is always a coup for a conference organizer. Last time we reported on Walmart, at the 2006 PNEC, Nancy Stewart reported on Walmart’s extensive use of a humongous Teradata warehouse to analyze what was not yet known as ‘big data’.

* Support vector machine
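
By way of illustration, a toy scikit-learn sketch (our own, not Walmart’s code) of SVM-based inventory classification; the categories and text-feature pipeline are assumptions.

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.pipeline import make_pipeline
    from sklearn.svm import LinearSVC

    descriptions = ['16oz claw hammer', 'decaf ground coffee 12oz',
                    'cordless drill 18v', 'green tea bags 100ct']
    categories = ['hardware', 'grocery', 'hardware', 'grocery']

    clf = make_pipeline(TfidfVectorizer(), LinearSVC())
    clf.fit(descriptions, categories)
    print(clf.predict(['ground espresso coffee 500g']))  # likely ['grocery']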

Bill Fairhurst (Riverford Exploration) observed that some big data approaches may be beyond the capabilities of smaller oil and gas companies. There are however some that are well suited to their needs and that can be applied to enhance technical interpretations and economic outcomes. Most domain experts are conversant with statistics and other analytical tools, and have been using them for decades. Today, ‘analytics’ is heralded as a needed ‘transformational event’, even though early adopters have seen a 70-90% failure rate! Fairhurst presented the Texas Bureau of Economic Geology’s TORA (tight oil and gas resource assessment) consortium, a major oil-sponsored initiative to investigate past, present and future recovery from unconventionals. This extensive study uses the previous year’s drilling outcomes, along with price and cost forecasts, to derive a resource portfolio. A profitability map suggests optimum drilling locations. Unsurprisingly, high probability locations are found in ‘proximity to recently drilled areas with past experience [ and at ] locations most attractive from economies of scale point of view’. A similar approach is applicable to small company portfolios using relatively straightforward multilinear regression models, as sketched below. A study of the Rodessa sandstone investigated poroperm variability as a function of date of first production, depth and other variables, and found geologic and reservoir engineering relationships that were not expected from the geological interpretation, and that explained the differences in production. Fairhurst concluded that while domain expertise is key, statistics, machine learning and analytical models can assist in understanding and communicating variable relationships. ‘Smaller, independent oil and gas firms can perform similar analyses [ as the majors ] for successful long-term outcomes’.
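
A minimal sketch of the kind of multilinear regression Fairhurst describes, here with statsmodels on synthetic data; the variables are illustrative and not those of the Rodessa study.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    depth = rng.uniform(8000, 11000, 200)                    # ft
    first_prod_year = rng.integers(1985, 2020, 200).astype(float)
    porosity = 0.25 - 1.2e-5 * depth + rng.normal(0, 0.01, 200)

    X = sm.add_constant(np.column_stack([depth, first_prod_year]))
    model = sm.OLS(porosity, X).fit()
    print(model.summary())   # coefficients, t-statistics and R-squared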

Conrad Hess teamed with Chris Buie (both from the private equity Warwick Group) to present an analysis of leasing behavior using graph theory and network analysis. Warwick’s activity involves the consolidation of operated and non-operated working interests in premier low-cost oil basins in the continental US. Data science is central to Warwick’s ‘micro-aggregation machine’. Key to the analysis is an understanding of who the competition is and where they are leasing. This involves large (convoluted?) datasets that are not easily processed by conventional landman software. Enter the graph database, in particular, Neo4J* graph analysis. Neo4J’s networked nodes and relationships are amenable to modeling lease holdings and corporate connections. The techniques available to data in a graph structure ‘can reveal insights not visible to tabular data’. Jaccard similarity is used to search and disambiguate thousands of entities and return similar names. In practical terms, when Warwick is offered a deal it can quickly find out other potential offers and similar available leases. Natural language processing and Spotfire also ran.

* For a backgrounder on Neo4J read our report from the 2016 Graph Connect conference. Also of interest in the context of oil and gas lease management is lease management software developer Michael Porter’s blog on Grandstack.
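
Neo4J tooling offers Jaccard similarity natively, but the idea is simple enough to show in a few lines of plain Python (the entity names below are invented).

    import re

    def tokens(name):
        """Lowercased alphanumeric tokens of an entity name."""
        return set(re.findall(r'[a-z0-9]+', name.lower()))

    def jaccard(a, b):
        sa, sb = tokens(a), tokens(b)
        return len(sa & sb) / len(sa | sb)

    candidates = ['Smith Energy Partners LLC', 'Smith Energy Partners',
                  'Jones Royalty Co', 'SMITH ENERGY PTNRS LLC']
    query = 'Smith Energy Partners, L.L.C.'
    best = max(candidates, key=lambda n: jaccard(query, n))
    print(best)   # the closest match for entity disambiguation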

Monty Vesselinov presented the results of Los Alamos National Laboratory’s work comparing unsupervised and physics-informed machine learning analysis of unconventional oil and gas production. LANL has developed ‘patented, open-source’ unsupervised ML methods for tensor factorization, coupled with custom k-means clustering. The approach is said to be computationally efficient and adapted to terabyte datasets utilizing GPUs, TPUs and FPGAs. Vesselinov’s preference is for ‘physics-informed’ neural nets that include prior knowledge of a problem. Physics-informed layers (aka ‘fat’ neurons) capture important processes such as flow, stress and displacement. The technique mandates ‘differentiable programming’ in Julia. Since unconventional production forecasting is challenging and physical processes such as fracking are poorly understood, the approach uses large public datasets and data science to predict system behavior from observed oil and gas production. More from the project’s Git repository.
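
To give a flavor of the ‘physics-informed’ idea (in Python rather than LANL’s Julia stack), here is a minimal PyTorch sketch that penalizes a network for violating a simple exponential decline ODE, dq/dt = -Dq, alongside the usual data misfit. The ODE and parameters are our illustrative stand-ins, not LANL’s models.

    import torch

    net = torch.nn.Sequential(torch.nn.Linear(1, 32), torch.nn.Tanh(),
                              torch.nn.Linear(32, 1))
    D = 0.5                                         # assumed decline constant
    t_obs = torch.linspace(0.0, 2.0, 20).reshape(-1, 1)
    q_obs = torch.exp(-D * t_obs) + 0.01 * torch.randn_like(t_obs)  # synthetic

    opt = torch.optim.Adam(net.parameters(), lr=1e-3)
    for _ in range(2000):
        opt.zero_grad()
        t = t_obs.clone().requires_grad_(True)
        q = net(t)
        dq_dt, = torch.autograd.grad(q.sum(), t, create_graph=True)
        physics = ((dq_dt + D * q) ** 2).mean()     # residual of dq/dt = -D q
        data = ((net(t_obs) - q_obs) ** 2).mean()   # misfit to observations
        (data + physics).backward()
        opt.step()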

Zahra Ghorbani and Amir Behzadan (Texas A&M University) showed how AI can be used on drone-collected RGB imagery to automate oil spill detection. VGG16 convolutional neural networks were trained with a dataset of some 1,300 images. The spill classification model gave an accuracy of 92%. The approach was presented at the 5th World Congress on Civil, Structural, and Environmental Engineering (CSEE'20).
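
A hedged Keras sketch of the transfer-learning setup the authors describe, putting a binary spill/no-spill head on a frozen VGG16 backbone; the head architecture and training settings are our assumptions.

    import tensorflow as tf
    from tensorflow.keras.applications import VGG16

    base = VGG16(weights='imagenet', include_top=False,
                 input_shape=(224, 224, 3))
    base.trainable = False                    # keep ImageNet features frozen

    model = tf.keras.Sequential([
        base,
        tf.keras.layers.GlobalAveragePooling2D(),
        tf.keras.layers.Dense(64, activation='relu'),
        tf.keras.layers.Dense(1, activation='sigmoid'),  # spill / no spill
    ])
    model.compile(optimizer='adam', loss='binary_crossentropy',
                  metrics=['accuracy'])
    # model.fit(train_images, train_labels, epochs=10)  # ~1,300 images in the study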

More from Energy Conferences Network.


Going, going, green

CodeCarbon tracks AI carbon footprint. Total, LLNL and Stanford release GEOSX CCS simulator. Environmental Partnership launches flare management program. Carbon Disclosure Project’s ‘record’ disclosures. New sustainability standards board to consolidate environmental reporting frameworks. Halliburton’s e-frack. IEA: oil and gas methane emissions need urgent action. Linux Foundation’s ‘software-defined infrastructure for decarbonization’. University of Cambridge, ‘integrate net zero into innovation’. EY/OGCI reports on 2019 emissions. Wolters Kluwer analyzes US carbon tax credits, clarifies CO2 sequestration.

Mila, BCG GAMMA, Haverford College and Comet.ml have developed a ‘groundbreaking’ open source tool, CodeCarbon, to help organizations track their artificial intelligence carbon footprint. CodeCarbon estimates the amount of CO2 produced by executing code, to incentivize developers to write more efficient code. The tool also suggests deploying cloud infrastructure in regions that use lower-carbon energy sources.
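
Usage is straightforward. The sketch below follows the EmissionsTracker pattern from CodeCarbon’s documentation; the workload itself is a placeholder.

    from codecarbon import EmissionsTracker

    tracker = EmissionsTracker()   # samples power draw and grid carbon intensity
    tracker.start()
    _ = sum(i * i for i in range(10_000_000))   # stand-in for a training job
    emissions = tracker.stop()     # estimated kg CO2eq, also logged to emissions.csv
    print(f'{emissions:.6f} kg CO2eq')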

Total, Lawrence Livermore National Laboratory (LLNL) and Stanford University have released GEOSX, an open source simulator for large-scale geological carbon dioxide (CO2) storage. The open-source nature of GEOSX will enable transparency, sharing and community support, ‘paving the way’ for large-scale development of carbon capture, utilization and storage (CCUS). GEOSX is the first major outcome of the five-year FC-MAELSTROM research project launched in 2018. More from LLNL.

Note: earlier work on CCS simulation was carried out at the DoE’s Lawrence Berkeley Lab and by the NETL which released ‘CCSI’, an open source CCS simulation toolset, available on Github.

The American Petroleum Institute-backed The Environmental Partnership has launched a flare management program. Parties to the program will leverage best practices to reduce flare volumes, promote the beneficial use of associated gas, and improve flare reliability and efficiency when flaring does occur. Data on flare intensity, a measurement of flare volumes relative to production, will be analyzed and aggregated into TEP’s annual report. More from TEP.

In its annual report for 2020, the Carbon Disclosure Project notes ‘record’ disclosures via its platform, with over 10,000 entities now disclosing data on climate change, water security and deforestation issues. Some 9,600 companies, 50% of global market capitalization, now disclose to CDP, up 70% in the five years since the Paris Agreement.

In response to the IFRS Foundation’s consultation on establishing a new sustainability standards board (SSB), the European Securities and Markets Authority (ESMA) acknowledged that an SSB could succeed in consolidating existing environmental reporting frameworks and would be well placed to maintain sustainability standards. These should include digital representations of sustainability standards using a common XBRL taxonomy. Read ESMA’s response to the consultation.

Halliburton has delivered a successful electric frack powered from the electric grid. Working for Cimarex Energy in the Permian basin, Halliburton has completed some 340 stages across multiple wells using utility-powered electric frac pumps. Grid-powered fracking is said to offer the lowest possible emissions profile compared with both turbines and Tier 4 dual-fuel engines. The system is also said to be reliable and to require a lower capital outlay than turbines.

A new analysis from the International Energy Agency (IEA) calls on companies, governments and regulators to take urgent action to cut methane emissions from the oil and gas sector. According to the 2021 update of the IEA’s Methane Tracker, oil and gas operations worldwide emitted more than 70 million tonnes of methane into the atmosphere last year, roughly equivalent to the total energy-related CO2 emissions from the entire European Union. Unlike CO2, reducing methane emissions is ‘very cost-effective for oil and gas companies’. The report, Driving Down Methane Leaks from the Oil and Gas Industry: A Regulatory Roadmap and Toolkit can be downloaded from the IEA.

The 2020 Annual Report from the Linux Foundation includes a chapter on ‘Addressing carbon emissions and climate challenges’ that sees software-defined infrastructure as a ‘great leverage point’ for decarbonization. LF sees a convergence between energy, 5G, cloud and automotive and puts itself at ‘ground-zero’ of ‘commodity’, non-competing software for the future grid. Among LF’s projects is ‘GXF’, a joint venture with Alliander to provide a ‘scalable, technology-agnostic IIoT platform to collect data and monitor, control, and manage smart devices on the grid’.

Amongst the key processes highlighted in the recently-published University of Cambridge Institute for Sustainability Leadership’s Targeting Net Zero business briefing is the application of technology and digital transformation to ‘integrate the net-zero ambition into existing innovation processes’. This is to be achieved by ‘creating innovation forums and systems to test and scale zero carbon alternatives to current ways of working’.

The Oil and Gas Climate Initiative recently published its 2020 performance report, covering 2019 emissions. Reporting data has been audited by EY & Associés, which collects and checks data consistency and guarantees the confidentiality of member companies’ data. An innovative process, applicable to both listed and state-owned national oil companies, aggregates information from reporting companies, most of whom already ensure their data is independently verified. OGCI has worked with EY to develop a verification process for some of its aggregate data. EY’s statement this year covers eight of OGCI’s 12 members.

Wolters Kluwer has published an analysis of US tax credits implemented to reduce operators’ carbon footprint, following the recently-finalized IRS guidance on CO2 sequestration. The late arrival of the regulations has forced the IRS to push back the application date by two years. There are now over 100 carbon capture projects planned, under construction or in operation in the United States. For more on adjustments that favor the taxpayer, and on hurdles they may encounter, visit Wolters Kluwer.


2020 LBCG Oil & Gas Pipeline Integrity & Data Utilization Solutions

Virtual event hears from API on latest PHMSA pipeline safety rules. Analysis of the new ‘mega rule’ from Paramount Energy and Crestwood. Enterprise Product Partners presents PipelineML. Williams’ plans for inspections out to 2035, big data and risk analysis.

Speaking at the online 2020 LBCG Oil & Gas Pipeline Integrity & Data Utilization Solutions congress, the American Petroleum Institute’s David Murk explained the implications of new PHMSA hazardous liquid pipeline rules. The rules were updated to reflect current integrity management practices in the light of high profile incidents such as San Bruno CA, Marshall MI and Yellowstone River. The update removes older ‘one-size-fits-all’ proposals and allows for the use of advanced technology and engineering assessment of pipeline integrity. The Final Rule, ‘Pipeline Safety: Safety of Hazardous Liquid Pipelines’, was published on October 1, 2019. The rule has support from the API, which has however argued for an ‘appropriate timeline’ for implementation and the exemption of offshore and rural gathering lines. The rule mandates data analysis and review of pipeline safety status, integrated with geospatial information systems. Many of the PHMSA’s requirements are already covered in the API’s recommended practices (API RP 1173: Pipeline Safety Management Systems, API 1160: Managing System Integrity for Hazardous Liquid Pipelines and others). The API is now revising its safety management and integrity standards and developing an RP for Pipeline Public Engagement. More from the API and its pipeline standards home page.

Brandi Wolfe (Paramount Energy Consulting) offered an analysis of the PHMSA ‘Mega Rule’ for gas transmission pipelines. First published as a Notice of Proposed Rulemaking (NPRM) in 2016, the Mega Rule has since been updated and now covers MAOP*, repair, corrosion control, integrity management and management of change. The final version of the Mega Rule was published in 2019. 2020 saw feedback in the form of draft FAQs, along with a COVID-19 ‘stay of enforcement’. The ‘stay’ ended on 31st December 2020. Wolfe enumerated the considerable number of requirements that now need to be addressed, including records management, inline inspection and the extension of coverage to ‘moderate consequence areas’ (MCAs). Interested parties should contact Paramount for more.

* Maximum allowable operating pressure.

Clem Chuck (Crestwood) warned that the Mega Rule has brought a 20% increase in the number of regulated pipelines in the United States. Operators will face significant operating challenges and increased costs. ILI preparation and inspections easily run into the hundreds of thousands of dollars per line. High consequence areas (HCAs) are 200-meter buffer zones where a pipeline passes through developed areas and places where people frequently gather, such as schools. Urban development has created many new HCAs, meaning that many previously non-regulated pipelines now require frequent inline inspections. Even areas previously classified as non-HCA or medium-consequence areas have experienced recent incidents, and PHMSA is extending its requirements to some of these. Inspections can be robotic or tethered. Although expensive, robotic inspections are suited to inline (ILI) inspections of newly-identified HCA areas. The devices assess pipe wall integrity with ultrasonic testing. Electromagnetic acoustic transducers measure pipeline wall thickness, and laser profilometers and high definition cameras can detect internal surface irregularities such as pitting. For ‘unpiggable’ pipelines, i.e. those without launching or receiving facilities, or with complex geometries, ice or gel pigging can be used prior to decommissioning.

John Tisdale (Enterprise Product Partners) presented the Open Geospatial Consortium’s PipelineML data standard. PipelineML builds on the OGC’s GML standard. Any software that can read GML can visualize the spatial information in PipelineML. The OGC PipelineML working group was established in 2014 and the standard was approved by the OGC in 2019. PipelineML enables pipeline data to be recorded in an industry-standard format as it is acquired. Data can be captured and verified ‘while the ditch is still open’ and added to throughout the life of the asset. PipelineML can ingest design-time data from CAD software like AutoCAD, Bentley or Intergraph. Data can be added as material tests and other records are first attached to components. Construction management systems can output PipelineML files to show up-to-the-minute progress. ‘PipelineML makes it fast and easy to capture information whenever data is discovered, such as when the ditch is uncovered during a rehab project or non-destructive examinations’.

PipelineML files support validation. When a file passes a validation test, it receives a unique validation certificate which proves the quality of the data. Asset data can be exchanged between different parties without custom translation or reformatting. Data must still be reviewed by a subject matter expert prior to ingestion, but the costly bottleneck of data manipulation is avoided. ‘PipelineML does the heavy lifting so staff can focus on work requiring subject matter expertise’. PipelineML solves the most difficult aspect of information sharing: resolving differences in vocabularies. PipelineML 1.0 embeds some 178 code lists, each containing a standardized set of codes and values. Code lists from the API, ASTM and others have been consolidated and standardized. PipelineML natively archives information. Every time a PipelineML file is generated, it captures a snapshot of asset information flows, either within the company or outside its firewall with service providers. Snapshots can be archived to the project or as part of an asset data management system that supports PHMSA-style TVC* compliance. PipelineML can be used to create situational awareness across operator departments with TVC-complete record management across the enterprise. PipelineML is a ‘free open standard that is available today’.

* Traceable, verifiable, and complete.
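
Since PipelineML is GML-based XML, the validation step can be scripted with standard tooling. A hedged Python sketch with lxml follows; the file names are hypothetical and the official schemas are published by the OGC.

    from lxml import etree

    schema = etree.XMLSchema(etree.parse('PipelineML.xsd'))  # local copy of the OGC schema
    doc = etree.parse('new_lateral.xml')                     # hypothetical as-built data
    if schema.validate(doc):
        print('document is schema-valid')
    else:
        for err in schema.error_log:
            print(err.line, err.message)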

See also our 2019 coverage for more on PipelineML and its relationship with other pipeline data standards.

Amy Shank (Williams) discussed the impact of new regulations on integrity management with reference to the expanding scope beyond HCAs. For Williams, this entails the assessment of around 2,200 miles of previously unassessed MCAs in a 14-year time frame, out to 2034. The new hazardous liquid rules mandate integrity assessments at least every 10 years. Within 20 years, all liquid pipelines in HCAs must accommodate ILI tools. The rules also call for more leak detection surveys and expanded reporting requirements. Williams is currently re-evaluating its impact analysis and will be developing an implementation plan.

Kelly Thompson presented Williams’ strategy for analyzing large data volumes to meet the new regulations. Williams leverages a set of data standards and uniform practices to support operations. Labor-intensive, repetitive tasks have been automated using tools such as Safe Software’s FME, SQL, R and Python. The idea is to ‘get risk data into the customer’s hands’ and help protect against losses. This is achieved by ‘making a multitude of databases work together’. SharePoint lists are leveraged as repositories for baseline assessment, prevention and mitigation activity. FME extracts data from native sources and pipes it into a SQL risk engine for calculation. Data is analyzed in PowerBI and visualized in ArcMap. The risk system informs Williams’ preventative maintenance system. Data is the foundation of the compliance, integrity and risk pyramid.
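
A generic sketch of the pattern Thompson describes, making ‘a multitude of databases work together’ with pandas; the tables, columns and weightings below are illustrative, not Williams’ own.

    import sqlite3
    import pandas as pd

    conn = sqlite3.connect('pipeline.db')   # stand-in for the native source systems
    ili = pd.read_sql('SELECT line_id, wall_loss_pct FROM ili_results', conn)
    leaks = pd.read_sql('SELECT line_id, leak_count FROM leak_surveys', conn)

    # Join sources and compute a simple weighted risk score per line segment
    risk = ili.merge(leaks, on='line_id', how='left').fillna({'leak_count': 0})
    risk['score'] = 0.7 * risk['wall_loss_pct'] / 100 + 0.3 * (risk['leak_count'] > 0)
    risk.sort_values('score', ascending=False).to_sql(
        'risk_ranking', conn, if_exists='replace')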

More from the LBCG Conference home page.


Standards stuff

PIDX converts to JSON, collaborates with Open Footprint Forum. Energistics adds catalogue of compliant products. Energistics University opens its doors. Industrial Internet Consortium’s RFP Toolkit. Modelica Association’s functional mockup checker. OPC UA Cloud Library. OSGeo liaises with ISO/TC 211. PwC converts sustainability standards to XBRL. SEC publishes ESG disclosure recommendations. XBRL publishes Open Information Model.

PIDX has launched a project to convert its existing XML-based e-commerce messaging to a new JSON-based architecture and RESTful APIs. More from PIDX. The PIDX Emissions Transparency Data Exchange (ETDX) project team is to collaborate with The Open Group’s Open Footprint initiative for emissions data collection, storage and exchange. The ETDX project already has a data model and schema design concept for emissions data capture. The project team is currently exploring the viability of leveraging Open Footprint’s API for transmitting emission data.

Energistics has added a catalogue of commercial software products that leverage one or more of its standards, a resource for companies looking for standards-based solutions. The Product Catalog includes search capabilities and filters to locate products. Energistics is also working on the Energistics University, an online training environment for users of the upstream standards.

The Industrial Internet Consortium has rolled out an IIC RFP Toolkit, a collection of best practices and online tools to help IIoT users with procurement of components and resources needed for a ‘complete end-to-end IIoT solution’.

The Modelica Association has released a prototype of a new online functional mockup checker that performs a static analysis of an FMU to check the validity of the model against the FMU XML schema. The validator also checks for non-unique or invalid variable names, completeness and integrity of the model structure and more. FMU Check is based on FMPy, a Python library and GUI from Dassault Systèmes.
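
The same kind of static check can be run locally with FMPy. A short sketch, assuming a hypothetical FMU file name:

    from fmpy import read_model_description

    # validate=True checks the FMU's modelDescription.xml against the FMI schema
    md = read_model_description('MyModel.fmu', validate=True)
    print(md.fmiVersion, md.modelName, len(md.modelVariables))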

OPC, in collaboration with CESMII, has launched the ‘OPC UA Cloud Library’ joint working group to specify how OPC UA information models of machines, SCADA and manufacturing execution systems will be stored in and accessed from a cloud-based database. Equinor is involved. More from the OPC.

OSGeo has established a ‘Category A’ liaison with ISO/TC 211 which will allow OSGeo to contribute to ISO/TC 211 and gain early access to emerging ISO/TC 211 standards. The liaison is said to ‘strengthen the voice of open source geospatial’. More from OSGeo.

The Sustainability Accounting Standards Board is working with PwC to convert all of its 77 sustainability standards into an XBRL taxonomy, to facilitate comprehensive analysis of sustainability information. More from the SASB.

The SEC has just published its recommendations on environmental, social and governance (ESG) disclosure.

The XBRL organization has now published its ‘long-awaited’ Open Information Model (OIM), to simplify and modernize aspects of the XBRL standard. The OIM removes XBRL’s dependency on the XML syntax, with the possibility of leveraging the protocol in formats such as JSON or CSV. More from XBRL. XBRL also gave a shout-out to EasyX’ XBRL Wizard for EU-compliant financial reporting. Financial reports and graphics are authored in the Xpdf format. Once complete, the Wizard adds the ESMA-mandated XBRL tags.


Sales, partnerships

IDS deal with Stena Drilling. Accenture SynOps for Halliburton. Saudi Aramco teams with Google Cloud. AVEVA now a Microsoft Energy partner. Kebotix helps BP explore AI. BP powers AWS data centers. Aramco and Cognite form joint venture. DNV GL name change. Endress+Hauser invests in US. AWS and Emerson team. Equinor awards Altus, Archer contracts. OGA selects Flare Solutions. Yokogawa/KBC designs for Gazpromneft. Henderson implements GumboNet. Implico supports DCC. Lukoil extends WEX deal. McDermott gets pre-FEED from BHP for Trion FPU. EDF MethaneSAT selects SpaceX. CNPC Richfit trains with HoloLens. GeoGraphix and MJ Logs team. Neptune Energy awards Cygnus contracts. OFS Portal ‘21 years of data stewardship’. P97 Networks and Visa Cybersource team. Peloton powered by Azure. Petrosoft now a Verifone EPS Loyalty Partner. CGG, PGS and TGS sign a strategic partnership. Implico teams with Apollocom. PTT E&P selects GEP SMART. ProPetro extends RigNet contract. SAP and DNV GL team. Aramco signs with SAP. Schneider Electric and Aramco sign MoU. Seeq supports AWS. Shell Inventory Optimiser. Schlumberger to digitize OMV. Solaris/AWS team on ML. Taulia, OFS Portal implement PIDX. ScanWell adopts XaitPorter. Xilinx and Quantico. INPEX, Terra Drone launch intelligent drone. Sword Venture to support OGA. AWS, Interica integrate OSDU.

IDS has secured a six-figure fleet-wide deal with Stena Drilling to carry out rig reporting and performance analytics leveraging its TourNet Pro automated reporting solution.

Accenture has deployed its SynOps platform to accelerate Halliburton’s digital transformation across its supply chain and manufacturing functions.

Saudi Aramco and Google Cloud are teaming to offer Google Cloud’s technology and solutions to global customers and enterprises in Saudi Arabia.

AVEVA is now a Microsoft Energy Core partner, a global initiative and center dedicated to digital transformation in the energy sector.

Kebotix is helping BP explore how AI and other advanced digital technologies can save time and resources.

BP is to supply more renewable energy for AWS data centers. The partnership aims to accelerate BP’s digital transformation and support both companies’ net-zero carbon ambitions.

Aramco and Cognite have formed a joint venture focusing on digitalization in Saudi Arabia and the MENA region.

DNV GL is changing its name to DNV effective 1st March 2021.

Endress+Hauser and its US representative partner Eastern Controls are to invest $4.5 million to grow sales and service support across the US.

Emerson has demonstrated ‘Big Loop’ reservoir modeling on the AWS Cloud.

Equinor has awarded Altus Intervention and Archer Integrated Services framework contracts for integrated wireline services, with an estimated total value of one billion NOK per year.

OGA has selected Flare Solutions to provide project management and data improvement initiatives and to help achieve its goals of maximizing economic recovery and net-zero greenhouse gas emissions by 2050.

Yokogawa and KBC have been selected to study the conceptual design of a new Production Management Centre covering Gazpromneft’s Omsk and Moscow refineries.

Henderson has implemented Data Gumbo’s GumboNet in its global contract drilling, equipment sales and MRO services.

Following the successful integration of 34 new service stations into DCC’s existing software landscape, Implico will provide IT support for DCC’s Irish service stations.

WEX has secured a multi-year agreement extension of services with Lukoil North America for its branded fuel cards.

McDermott has secured a pre-FEED contract from BHP for the Trion Project’s semi-submersible FPU.

MethaneSAT, a subsidiary of Environmental Defense Fund, has selected SpaceX to deliver its new satellite into orbit aboard a Falcon 9 rocket. Bezos Earth Fund ‘granted’ EDF $100 million for the completion and launch of MethaneSAT.

CNPC Richfit is using Microsoft HoloLens to improve its training programs for technicians and frontline oil and gas employees.

GVERSE GeoGraphix and MJ Logs are teaming to offer a ‘superior product with world-class customer service at a competitive price’.

Neptune Energy has awarded Oceaneering and Stork integrity management and fabric maintenance contracts, valued at approximately $6.5 million, for its operated gas production platform, Cygnus.

OFS Portal approaches 21 years of data stewardship and has surpassed 500 e-commerce agreements with trading partners.

P97 Networks and Visa’s Cybersource have signed a new multi-year global partnership to deliver best-in-class mobile payment acceptance tools with integrated risk management for convenience and fuel retailers.

Peloton’s well-focused data management solutions, powered by Microsoft Azure, deliver a ‘robust’ end-user experience to oil and gas companies around the world.

Petrosoft is now a Verifone EPS Loyalty Partner.

CGG, PGS and TGS have signed a pioneering strategic partnership to offer a shared ecosystem providing direct access to their subsurface multi-client data libraries.

Implico and Apollocom are teaming up to bring the OpenTAS TMS terminal management system to Central America.

PTT E&P has selected GEP SMART, GEP’s unified procurement software, to support its digital transformation.

RigNet and ProPetro have extended their contract throughout 2021 to leverage Intelie Live, RigNet’s secure, intelligent networking solutions for upstream oil and gas operations.

SAP and DNV GL have teamed up to deliver Corrosion Under Insulation Manager, a new oil and gas industry cloud solution designed to combat corrosion.

Aramco has signed a strategic alliance with SAP Saudi Arabia to expand the digitalization of its Enterprise Resource Planning (ERP) systems. SAP’s Data Center in Saudi Arabia will offer new cloud solutions to Aramco and other companies.

Schneider Electric and Aramco have signed a MoU to collaborate on assessing emerging technologies based on the Open-Process Automation Standard (O-PAS). Testing will take place at a new built-for-purpose test bed in the Saudi Schneider Electric Innovation and Research Center in Dhahran Techno Valley, Saudi Arabia.

Seeq has expanded support to include Amazon Timestream, a managed time-series database service for IoT and operational applications.

Shell, Equinor and Microsoft are to develop a smart inventory management system, Shell Inventory Optimiser, which will run on Microsoft Azure.

Schlumberger and OMV have partnered to deploy AI and digital solutions, enabled by the cloud-based DELFI cognitive E&P environment, across OMV’s global operations.

Solaris Oilfield Infrastructure and AWS are to collaborate on the development of a data analytics and machine learning platform.

In collaboration with Taulia, OFS Portal has implemented PIDX Oil & Gas standards using AS2 between an OFS Portal supplier member and an operator customer of Taulia.

ScanWell is a new XaitPorter client.

Xilinx and Quantico Energy Solutions undertook cooperative research and development efforts to accelerate the inferencing capabilities of the QEarth product using Xilinx Alveo U250 cards and Vitis libraries.

INPEX and Terra Drone have launched INPEX-Terra Drone Intelligent Drone Plan to promote digital transformation and support the sustainable growth of the oil and natural gas industry in Japan and around the world.

Sword Venture has been awarded a collaborative partnership to deliver responsive Data Services to the Oil & Gas Authority (OGA).

AWS and Interica have partnered to deliver the first phase of integration with the OSDU. Supporting the AWS MAP program, Interica has deployed its technology to analyze geoscience environments enabling ‘intelligent migration’ to AWS cloud technologies.


Safety first

IOGP Report on safety recommended practices. LR Safetytech Accelerator II opens. NIOSH on selecting fatigue monitoring technology. TekSolv, hazard communication a ‘most cited’ violation. Neptune Energy teams with HALO Trust on UXO removal.

The International Association of Oil and Gas Producers (IOGP) has published Report 456 (V2), covering recommended practices and key performance indicators for process safety in the upstream. The report was produced in response to major incidents and sets out to improve safety by learning from past events with less serious outcomes. The report is a free download from the IOGP bookstore.

The Safetytech Accelerator* has announced round two of its Safetytech PoC Fund. The Fund is to disburse £1 million in support of safety proofs of concept, in particular, digital products and services ‘that can significantly enhance safety and reduce risk in safety-critical industries and infrastructure’. Projects funded in round one included M Squared (early detection of gas leaks on an LNG tanker), Numberboost (AI-driven real-time error detection), Multisensor Scientific (leak detection) and Smartvid.io (rigsite falling object detection). More from the earlier round.

* The Safetytech Accelerator is a not-for-profit initiative created by Lloyd’s Register Group and Lloyd’s Register Foundation. More from LR.

A recent blog post on the US NIOSH website offers advice on choosing fatigue monitoring and detection technology (FMDT). FMDT comes in two forms: predictive (based on sleep patterns and work hours) and monitoring (of eyelid movement and blink rates). Detection technologies that are common in high-end motor vehicles are now also available as wearable devices or mobile apps that can be used in almost any situation. NIOSH recommends considering key factors in FMDT selection including primary purpose, validity and reliability, and user acceptance. There is no one size fits all. NIOSH warns that ‘such devices can obscure the underlying causes of fatigue and should not be used as a primary safety measure’. More from NIOSH.

TekSolv warns that hazard communication is one of the top ten OSHA ‘most cited’ violations. In 2019 alone, there were 3,671 violations. Organizations need a written Hazard Communication Program to ensure that information concerning classified hazards is transmitted to the employer, contractors, sub-contractors and employees. TekSolv’s Hazard Communication Training program combines health hazard training and hazard communication training and, if required, OSHA 10/30 certification. More from TekSolv.

Producer Neptune Energy has teamed with the worldwide humanitarian landmine clearance organization, The HALO Trust. Neptune teams will demonstrate their systematic approach to incident management, root cause databases and learning review techniques to their counterparts at HALO. HALO will share its experience of clearing landmines and other unexploded ordnance* from areas affected by war. More from Neptune and the HALO Trust.

* Unexploded ordnance detection is a subset of geophysical expertise, with, for instance, this presentation on machine learning for the classification of unexploded ordnance (UXO) from electromagnetic data.


Reinforcement learning at the Distillation Gym

AI proof of concept from Cambridge University showcases Cape Open simulation of hydrocarbon processing.

Speaking at the 2020 CAPE-OPEN Annual Meeting, Laurence Midgley (University of Cambridge) presented a paper on his ‘Distillation Gym’, an application of reinforcement learning (a branch of artificial intelligence) in chemical engineering. The Distillation Gym is an AI agent that designs processes using the COCO simulator*. A Python wrapper controls the computing.

Reinforcement learning is a novel approach to chemical engineering process synthesis with the potential to be applied to more open-ended design problems than conventional computer-aided techniques. In RL for process synthesis, the environment is the simulator (e.g. COFE, Aspen Plus), the RL agent is the process designer and the reward is the objective function (e.g. profit).

Midgley’s simple proof of concept covered a reinforcement learning agent that optimizes the design of a hydrocarbon distillation column train simulated with the CAPE-OPEN Flowsheet Environment (COFE) and ChemSep. He also presented the application of RL to simulation in general and ‘how CAPE-OPEN may facilitate such applications’.

* Cape Open to Cape Open simulation environment.
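
To make the agent/environment split concrete, here is a toy Python skeleton in the gym style; the DistillationEnv below is our invented stand-in for the COFE/ChemSep flowsheet that the real Distillation Gym drives through its Python wrapper.

    import random

    class DistillationEnv:
        """Toy stand-in: state is product purity, the action a normalized
        reflux ratio, reward approximates profit."""
        def reset(self):
            self.purity = 0.5
            return self.purity

        def step(self, action):
            # The real Distillation Gym would run a COCO flowsheet simulation here.
            self.purity = max(0.0, min(1.0, self.purity + 0.1 * action - 0.02))
            reward = self.purity - 0.1 * action   # product value minus energy cost
            return self.purity, reward, self.purity > 0.95

    env = DistillationEnv()
    state, done = env.reset(), False
    for _ in range(1000):                         # random policy; an RL agent
        state, reward, done = env.step(random.random())  # would learn one instead
        if done:
            break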


Stochastic forcing for better digital twins

A novel approach to finite element modeling is said to ‘lay the foundation of the digital twin revolution’. But there may be some ‘prior art’.

FEM, the finite element method, is a widely-used technique for solving differential equations in engineering and mathematical modeling. Applications span geology, reservoir modeling, engineering construction and fracking. LR (formerly Lloyd’s Register), through its programme at the Alan Turing Institute, reports a ‘radically redesigned’ approach to FEM, developed by researchers at the Universities of Cambridge and Western Australia.

The researchers observe that FEM results often do not match empirical evidence, revealing mismatches in a model that are amenable to a statistical approach. Such model ‘mis-specification’ is addressed by introducing ‘stochastic forcing’ to the partial differential equations before updating the FEM. Stochastic forcing* is introduced through a random function within the governing equations.
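
In symbols, and with our notation rather than the paper’s, the idea for a generic elliptic problem is to add a Gaussian-process forcing term and then condition the FEM solution on sensor data y:

    \begin{align*}
    -\nabla\cdot\big(\kappa\,\nabla u\big) &= f + \xi, & \xi &\sim \mathcal{GP}\big(0,\,k(x,x')\big),\\
    y &= \mathcal{H}u + \varepsilon, & \varepsilon &\sim \mathcal{N}(0,\,\sigma^2 I),
    \end{align*}

with the posterior over the discretized solution obtained by standard Gaussian conditioning.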

You may wonder why an apparently abstruse mathematical technique is being brought to our attention. The answer is that, for LR, the approach ‘lays the theoretical foundations and methodologies by which digital twins can be realized’. Researcher Mark Girolami said, ‘Digital Twins, the pairing of the physical and virtual world, are of significant current interest to the broader engineering community. By integrating data with FEMs, this new work provides the mathematical foundations of the Digital Twin revolution’.

The study is reported in the Proceedings of the National Academy of Sciences with a use case of modeling ‘solitons’ (ocean waves) which are said to be a threat to critical offshore infrastructure such as wind turbines.

* It would appear that (at least) some facets of stochastic forcing are available in software from Cossan. We quizzed Cossan project lead Edoardo Patelli as to whether there was a degree of hyperbole in the LR announcement. He told us ‘Yes, you see a rebranding [of FEM] with cool terms like digital twins instead of stochastic finite elements and perhaps with a pinch of machine learning instead of data analysis. Maybe this is a better way to generate commercial interest! But are people going to use your code without understanding what is going on underneath?’ We challenged the paper’s lead author Connor Duffin, suggesting there may be some ‘prior art’ here. He responded, ‘Stochastic finite elements are similar but to my knowledge they are made for uncertainty quantification of parameters/quantities of interest. Whereas in our work we are more interested in updating our FEM solution with data and seeing the resultant posterior measure’.


Quantum computing in oil and gas. Yes really!

BP joins IBM Quantum Network. Quantum computing to propel BP’s Net-Zero initiative. Total appoints ‘Head of Quantum Computing’. GENCI and QCWare attack the ‘generalized pooling problem’. La Maison du Quantique opens its doors.

BP has joined the IBM Quantum Network as an Industry Partner, gaining access to IBM’s quantum computers via the cloud, including what is claimed to be the ‘largest universal quantum system available to industry today’, a 65-qubit machine. A 1,000-plus qubit system is targeted for the end of 2023. BP is to work with IBM to explore the use of quantum computing to solve business and engineering challenges and explore the potential applications for driving efficiencies and reducing carbon emissions.

Morag Watson, BP’s senior VP digital science and engineering said, ‘Our ambition is to become a net-zero company by 2050. Next-generation computing capabilities such as quantum computing will assist in solving the science and engineering challenges we will face, enabling us to reimagine energy and design new lower carbon products’. Potential applications include modelling the chemistry and build-up of various types of clay in hydrocarbon wells, managing the fluid dynamics of wind farms and optimizing autonomous robotic facility inspection. More from IBM.

Total has gone so far as to appoint Marko Rancic as its first ‘Head of Quantum Computing’. Rancic was previously a postdoc at the University of Basel, Switzerland. A Total-sponsored quantum computing project to research the ‘generalized pooling problem’* (GPP) recently received funding from the Paris Region. Project ‘AQMuSE’ is to address the GPP’s application to logistics, along with partner Franco-Californian quantum computing boutique QCWare.

The Paris Region funds are disbursed through GENCI, a public company that works to increase the use of HPC to ‘boost competitiveness’ in the French economy. GENCI’s ambitious projects include a downtown Paris location, ‘La Maison du Quantique’, ‘to create synergies, benefit from shared space and infrastructures and accelerate the emergence of a quantum industry’.

Last year Total signed with Cambridge Quantum Computing (CQC) to develop quantum algorithms for carbon capture, utilization and storage (CCUS).

* A generic way of describing logistics flow across networks of sources, storage and sinks.
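
For reference, one common textbook statement of the pooling problem (our illustration, not necessarily the AQMuSE formulation) makes the difficulty plain: the quality-blending constraints are bilinear, hence non-convex. Here x_il is flow from source i to pool l, y_lj from pool l to sink j and p_lw the (unknown) quality of pool l in attribute w:

    \begin{align*}
    \min_{x,\,y,\,p}\quad & \sum_{i,l} c_i\,x_{il} \;-\; \sum_{l,j} d_j\,y_{lj}\\
    \text{s.t.}\quad & \textstyle\sum_i x_{il} = \sum_j y_{lj} && \text{(pool balance)},\\
    & \textstyle p_{lw}\sum_j y_{lj} = \sum_i q_{iw}\,x_{il} && \text{(blending, bilinear)},\\
    & \textstyle\sum_l p_{lw}\,y_{lj} \le \bar{q}_{jw}\sum_l y_{lj} && \text{(sink quality limits)}.
    \end{align*}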


Chatham House debate ‘What’s Next for the Oil Industry’

OPEC - oil to dominate out to 2045. The Economist - back to pre-covid by 2023. Chatham House - in all scenarios, boom times over for oil and gas. Wilson Center - industry ‘permanently damaged’ by covid. Hydrogen - the new CCS?

OPEC’s Ayed Al Qahtani sees global GDP bouncing back in 2021, led by China and India. Oil demand, which contracted by 9% in 2020, is to grow by 6% in 2021, again led by China. All energy sources are needed to match demand out to 2045. Renewables are the fastest growing, but crude will maintain its dominance. Oil and gas will supply over half of the world’s energy needs out to 2045. To keep up, upstream spend needs to average $380 billion/year for a cumulative capex of $3 trillion by 2045. More on this in the OPEC World Oil Outlook 2045.

Cailin Birch of The Economist Intelligence Unit sees a return to a pre-covid situation by 2023. Despite the recent price recovery, supply remains ample. OPEC+ unity needs to hold if prices are to hold up. While the Saudis need cash to fund their megaprojects, Russia can live with $45-ish oil.

For Valerie Marcel (Chatham House), in all scenarios, the boom times are over for oil. Investors are looking for stability and less risk. After the pandemic, many oil sectors will not take off. Oils are under pressure to find lower-carbon, easier projects. Emerging producers do not have the infrastructure to handle this. They need to diversify and should avoid taking on debt in the expectation of future revenues that may not materialize. Marcel also stressed the importance of international management of covid in the energy transition. If this is unsuccessful and covid comes back again and again, investment will be difficult, and countries will retrench for survival. With better covid management there will be more opportunity for investment in renewables.

Montserrat Ramiro from the Wilson Center looked at the oil business through the different lenses of the short-term social imperatives of covid and the longer-term issue of climate change. Oil is not going to go away any time soon, but the industry has already been permanently changed by covid. Not all lost demand will return and the climate imperative is driving change and diversification – witness Saudi Arabia’s actions in renewables. Long-distance mobility (i.e. air travel) is an unsolved problem.

Chatham House’s Paul Stevens lightheartedly compared today’s enthusiasm for hydrogen as the new energy savior with an earlier dalliance with carbon capture and sequestration. There followed a ripple of amusement from the panelists. Al Qahtani appeared to agree, comparing the IEA estimate of the cost of CCS ($ zillions) with what we have today (not much!). Ramiro was more optimistic. Incentives drive change. Solar costs are down 90% and wind by 50%. Hydrogen will go the same way. Storage, renewables and other costs are falling. Hydrogen will be significant in helping the oil industry to ‘not die down completely’.

More from Chatham House.


Blockchain news

Dassault Systèmes on the challenges of blockchain for AI in manufacturing. Data Gumbo figures real-time OPEX. World Economic Forum’s Kryha blockchain traces carbon emissions. Woz’ new blockchain wheeze: ‘Efforce’ energy-saving with WOZX crypto. Ziyen Energy trades coinage for stripper well production.

Our 2018 ‘blockchain is BS’ editorial does not appear to have been universally taken on board. The technology is still touted as a ‘solution’ for many problems that may or may not exist. What follows is a short compendium of what has come under our radar of late. But remember two things: 1) tying a digital ‘token’ to anything in the real world requires something outside the blockchain and 2) vendors’ ‘blockchain-based’ services are under no obligation to use blockchain in the way it was originally deployed, as in the peer-to-peer exchange mechanism that underlies bitcoin*.

A curious blog posting on the Dassault Systèmes North American website ‘uncovers the challenges of using blockchain for AI in manufacturing’. Citing the competition between industry and ‘the giants of IT’ for AI talent, the authors, in a mind-bending leap of faith, suggest that ‘blockchain technology is a way to help industry gain AI expertise’. A further leap is required to see the relationship with Dassault Systèmes’ offer of ‘more AI technology for its clients’ thanks to the acquisition of Proxem, a specialist in ‘AI-powered semantic processing software and services’. Proxem ‘harvests information, and then a blockchain-driven tool can verify its origin and use’ [allowing for] ‘innovative approaches to collaboration between departments and among development partners’. Dassault expects its blockchain-based software to roll-out in 2021. More on these improbable developments from Dassault.

Data Gumbo now promises real-time operational expenses (OPEX) visibility for energy operators and service providers across Europe. DG’s blockchain technology allows operating data from the field to be verified against commercial contracts, triggering automated payments and delivering real-time visibility into contract spend.

Blockchain booster the World Economic Forum proposes the use of blockchain to trace carbon emissions across extractive industries and has released a proof of concept application. WEF’s Mining and Metals Blockchain Initiative is a collaboration with seven global companies that launched in 2019. Today, COT, the Carbon Tracing Platform provides end-to-end traceability of CO2 emissions using distributed ledger technology. The COT PoC was developed by Dutch blockchain ‘champion’ Kryha and advisor Susan Joseph.

You may be wondering what Steve Wozniak has been up to since he stepped down from an active role with Apple Computer. Quite a lot apparently, including his latest ‘billion dollar’ venture to democratize and finance energy efficiency investments worldwide. Woz’ Efforce startup has launched WOZX, a cryptocurrency that debuted in December at $1.20. It popped to a $3.14 high in a couple of days before stabilizing at around the issue price. Efforce’s business model involves striking a contract with a service company that proposes an energy-saving action. The contracts are then tendered to Efforce shareholders whose WOZX coinage finances the projects. At least that is how we understand the rather convoluted structure.

Another cryptocurrency deal financier is ‘Scottish-American’ Ziyen Energy which recently announced its latest oil and gas property acquisition in exchange for ZiyenCoin, its very own blockchain-based token. In return for its coinage, ZE acquired an unspecified amount of non-operated working interests in 42 oil and gas wells in Panola and Cooke Counties, Texas. In another tough one for our simple minds, the newly acquired interests ‘will provide the company with US dollar distributions from the ownership of oil production purchased in ZiyenCoins’. For more on Ziyen’s groundbreaking financial innovation read the article co-authored by Ziyen CEO Alastair Caithness in the Frontiers of Engineering Management Financial Journal. Why are we thinking of those (Scottish?) colonialists handing out colored beads to the natives?

* For a similar skeptic’s take on the blockchain phenomenon, read ‘Why blockchain is a belief system’ by the FT’s Izabella Kaminska.


© 1996-2021 The Data Room SARL All rights reserved. Web user only - no LAN/WAN Intranet use allowed. Contact.