Recent announcements on artificial intelligence in oil and gas and elsewhere

BP’s ‘Sandy’ AI platform leverages Belmont’s knowledge graph. eLynx and TUALP. Safe Software leverages OpenCV. IBM teams with Penguin Computing on Bayesian HPC. Neo4j blends graphs with AI. Quantzig’s ‘top four’ benefits of AI in oil and gas. PNAS on the contrarians and on a ‘back to the graph’ future!

An article in Technology Record reports that BP is using Microsoft Azure-based machine learning to predict oil and gas recovery factors. Tedious work that previously took weeks can now be done in days or hours. BP’s ML-based recovery factor model is in daily use by hundreds of subject matter experts at BP.

BP’s enthusiasm for artificial intelligence has led to its BP Ventures unit chipping in some $5 million in Belmont Technology’s Series A financing round. The deal follows BP’s $20 million investment in Beyond Limits, another AI boutique. Belmont’s knowledge graph technology will ingest geoscience, reservoir and historic production data into a ‘robust’ knowledge graph of subsurface assets. The graph can be interrogated with natural language queries while ‘AI neural networks interpret the results and perform rapid simulations’. The combined technologies underpin BP’s ‘Sandy’ AI platform with Belmont’s ‘scalable knowledge-graphs’ feeding into Beyond Limits’ platform.
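Neither BP nor Belmont has published Sandy’s internals, but the knowledge-graph idea itself is simple enough to sketch in a few lines of Python. The entities, relations and query below are invented for illustration and bear no relation to Belmont’s actual schema.

```python
# Illustrative only: a knowledge graph of subsurface assets reduced to
# (subject, relation, object) triples, with a toy pattern query.
triples = [
    ("WellA", "produces_from", "ReservoirX"),
    ("ReservoirX", "located_in", "BasinNorth"),
    ("WellB", "produces_from", "ReservoirX"),
]

def query(subject=None, relation=None, obj=None):
    """Return triples matching the given pattern (None = wildcard)."""
    return [t for t in triples
            if (subject is None or t[0] == subject)
            and (relation is None or t[1] == relation)
            and (obj is None or t[2] == obj)]

# 'Which wells produce from ReservoirX?' expressed as a pattern query.
print([s for s, _, _ in query(relation="produces_from", obj="ReservoirX")])
# → ['WellA', 'WellB']
```

A natural-language front end of the kind Belmont describes would translate the English question into such a pattern before hitting the graph.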

eLynx Technologies is collaborating with the University of Tulsa on the development and validation of ‘digital twins’ to predict the behavior of wells produced using artificial lift. The research is said to accelerate the shift from predictive maintenance to predictive optimization. The work is performed under the auspices of Tualp, the Tulsa University Artificial Lift Project. eLynx will contribute training data amassed during its monitoring of some 40,000 wells across major US drilling basins. eLynx data scientists and subject matter experts are to contribute a data-driven perspective to physics-based modeling of artificial lift processes. In return, eLynx expects accelerated development of new products including models for the latest breed of plunger-lift and ESP products.

Safe Software has added custom tools for embedding computer vision in applications via its FME 2019 development environment. A blog from Safe explains how FME uses the OpenCV computer vision and machine learning library in its family of RasterObjectDetector transformers.

IBM has teamed with Penguin Computing (both members of the OpenPower Foundation) to develop a hardware appliance dedicated to ‘intelligent simulation’. The appliance adds a ‘Bayesian' optimization capability to an existing HPC cluster ‘of any architecture’ to improve processing capability. Researchers tell the systems to exchange data and the Bayesian appliance automatically designs smarter simulation instructions for the primary cluster. Cray is also working with IBM on the new approach. IBM also reports work on new knowledge graph technology capable of reading 500,000 documents per hour, ‘bringing order to chaotic data’ and establishing a corporate memory of HPC work. This web-based tool is currently available at no cost from IBM Zurich. The technology is being added to IBM’s AI/deep learning platform PowerAI.
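IBM has not published the appliance’s internals, but the underlying idea, Bayesian optimization of expensive simulations, can be sketched as follows. The surrogate below is a toy Gaussian process and the ‘simulation’ a cheap stand-in function; none of the names reflect IBM’s or Penguin’s actual software.

```python
# Toy Bayesian-optimization loop: a Gaussian-process surrogate proposes the
# next simulation input so fewer expensive HPC runs are needed. Illustrative.
import numpy as np

def expensive_simulation(x):
    # Stand-in for an HPC run; true optimum is at x = 0.3
    return -(x - 0.3) ** 2

def gp_posterior(X, y, Xs, length=0.2, noise=1e-4):
    """GP regression with an RBF kernel; returns mean and std at points Xs."""
    k = lambda a, b: np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / length ** 2)
    K = k(X, X) + noise * np.eye(len(X))
    Ks = k(X, Xs)
    mu = Ks.T @ np.linalg.solve(K, y)
    var = 1.0 - np.sum(Ks * np.linalg.solve(K, Ks), axis=0)
    return mu, np.sqrt(np.maximum(var, 1e-12))

# Start with two runs, then let the acquisition function pick the rest.
X = np.array([0.0, 1.0])
y = np.array([expensive_simulation(v) for v in X])
grid = np.linspace(0.0, 1.0, 201)
for _ in range(8):
    mu, sd = gp_posterior(X, y, grid)
    nxt = grid[np.argmax(mu + 2.0 * sd)]   # upper-confidence-bound acquisition
    X = np.append(X, nxt)
    y = np.append(y, expensive_simulation(nxt))

best = X[np.argmax(y)]
print(round(best, 2))  # typically lands close to the true optimum 0.3
```

Ten simulation calls locate the optimum that a naive parameter sweep would need far more runs to find, which is the ‘intelligent simulation’ pitch in miniature.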

Neo4j sees its eponymous graph technology engine as having a ‘symbiotic relationship’ with artificial intelligence, citing work by futurologist James Fowler as chronicled in his book Connected. For Neo4j, ‘we are on the cusp of a new Cambrian explosion of graph-powered artificial intelligence’.

Quantzig has blogged on the ‘top four’ benefits of advanced analytics in the oil and gas industry. These are ‘predicting the success rate of downhole operations’, ‘information delivery in real time’, ‘analysis of log runs’ and ‘predictive maintenance’.

At the 2019 ARC Advisory Group Industry Forum, Pioneer Natural Resources and Devon Energy reported use of Seeq’s analytics toolset for machine learning and industrial control system connectivity. Devon presented on the use of ‘advanced analytics’ to optimize shale completions while Pioneer’s work targets condition-based compressor maintenance.

The contrarians in us were excited when we came across the Proceedings of the National Academy of Sciences (PNAS) release on the ‘limits of deep learning’. PNAS’ Mitchell Waldrop describes the ‘much-ballyhooed’ artificial intelligence approach as ‘boasting impressive feats but still falling short of human brainpower’. Minor changes to imagery (aka adversarial attacks) can easily fool AI systems, a fact that has suggested to some researchers that ‘we’re doing something wrong’. This is a ‘widely-shared sentiment among AI practitioners, any of whom can easily rattle off a long list of deep learning’s drawbacks’. More puzzlement stems from the gross inefficiency of the training data paradigm, which comes off poorly when compared with human learning. ‘For a child to learn to recognize a cow, it’s not like their mother needs to say "cow" 10,000 times’. The opacity of the learnings from AI systems is also problematic and may be unacceptable even if the answer is right. AI’s salvation may come from the adjunction of graph technology to deep neural networks. Graph networks have generated a lot of excitement over the last couple of years. Such deep-learning systems have an innate bias toward representing things as objects and relations*. A graph network, then, is a neural network that takes a graph as input and ‘learns to reason about and predict how objects and their relationships evolve over time’.

* Is that back to the semantic web future or what?
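As a concrete, if much-simplified, illustration of the graph network idea, the Python sketch below performs one round of neighbor-to-neighbor message passing over a four-node graph. The graph, dimensions and (random, untrained) weights are invented for illustration; a real system would learn the weights by gradient descent.

```python
# One message-passing round of a toy graph network: each node's state is
# updated from its own state plus aggregated messages from its neighbors,
# building relations directly into the model. Weights here are untrained.
import numpy as np

rng = np.random.default_rng(0)
edges = [(0, 1), (1, 2), (2, 0), (2, 3)]   # directed (sender, receiver) pairs
h = rng.normal(size=(4, 8))                 # 4 nodes, 8-dimensional state each
W_msg = rng.normal(size=(8, 8))             # message transform
W_upd = rng.normal(size=(16, 8))            # node-update transform

def message_passing_step(h):
    msgs = np.zeros_like(h)
    for s, r in edges:                      # each edge sends one message
        msgs[r] += np.tanh(h[s] @ W_msg)
    # update each node from [old state, aggregated incoming messages]
    return np.tanh(np.concatenate([h, msgs], axis=1) @ W_upd)

h = message_passing_step(h)
print(h.shape)  # → (4, 8): same graph, updated node states
```

Stacking several such rounds lets information propagate along multi-hop paths, which is what lets these networks ‘reason about’ how objects and relationships evolve.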



© Oil IT Journal - all rights reserved.