Rocky Roden (Rocky Ridge Resources) believes that deep learning models based on labeled synthetic datasets are best, citing successes in fault delineation with convolutional neural nets. Roden suggested extending the digital twin paradigm to build an ‘earth system digital twin*’ (ESDT). An ESDT would allow users to add information to a volume and run numerical simulations to understand the impact, for instance to compare 4D seismic with real-time reservoir data. Working around an ESDT would make workflows more flexible: the different procedures involved in seismic analysis could run simultaneously, as opposed to today’s step-wise approach. Work still needs to be done on acceptance of the ML paradigm in the geosciences. ML’s data-driven approach contrasts with the scientific method and sits uneasily with the complex, multi-scale and varying resolution of geo data. ML applies relatively simple math across thousands or millions of computations in what is perceived as a black-box approach. It is also affected by poor or absent labels and, just like other methods, by poor data. This has led to a new field, ‘explainable AI’ (XAI), involving novel methods that explain and interpret ML results. In any event, statistics on paper downloads from the AGU and SEG show exponential growth in AI-related subjects. Roden advocates combining ML-based tools into pre-built interpretation workflows that require little human input. Combining ML algorithms can greatly improve the overall result as models tune each other and ‘adapt to unknown tasks’. Roden concluded that ML is disruptive but will not replace the interpreter. However, geoscientists who do not use ML will be replaced by those who do!
* see elsewhere in this issue for Amazon and Nvidia’s steps in this direction.
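The convolutional filtering at the heart of CNN fault delineation can be illustrated with a toy NumPy sketch (our illustration, not Roden’s workflow): a synthetic section of horizontal reflectors is offset across a vertical fault, and a hand-picked horizontal-gradient kernel, standing in for a filter a CNN would learn from labeled synthetic data, lights up the fault location.

```python
import numpy as np

def synthetic_section(n=64, fault_col=32, throw=4, seed=0):
    """Toy 'seismic' section: periodic reflectors offset by a vertical fault."""
    rng = np.random.default_rng(seed)
    depth = np.arange(n)[:, None] + np.zeros((1, n))
    depth[:, fault_col:] += throw            # vertical throw across the fault
    section = np.sin(2 * np.pi * depth / 8)  # reflectors with 8-sample period
    return section + 0.05 * rng.standard_normal((n, n))

def conv2d(img, kernel):
    """Naive valid-mode 2D convolution, the core operation of a CNN layer."""
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

# A horizontal-gradient kernel stands in for a learned fault-sensitive filter.
kernel = np.array([[-1.0, 0.0, 1.0]] * 3)

section = synthetic_section()
response = np.abs(conv2d(section, kernel))
fault_score = response.mean(axis=0)   # column-wise fault likelihood
print(int(np.argmax(fault_score)))    # peaks adjacent to the fault column
```

A trained network replaces the hand-picked kernel with many learned filters stacked in layers, but the mechanics are the same: local convolutions whose responses flag discontinuities.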
YPF’s Teresa Santana addressed the impact of innovation and diversity in applied geosciences. Machine learning has proven its worth over several decades within the energy industry, providing a better understanding of the reservoir. More recently, innovative learning algorithms allow computers to re-learn from their own predictions. AI leverages a diverse set of data (logs, seismic, cores) in support of different subsurface activities, from frontier exploration to development. For Santana, diversity includes learning models that provide alternative results, perhaps testing multiple different models, evaluated by a diverse user community. ML is today applied in self-organizing maps (Geophysical Insights’ specialty), Bayesian geobody classification and semi-automated geobody extraction using frequency decomposition. But more can be done with AI in geophysics, helping the interpreter with more automation and, again, leveraging diverse data types to train neural nets for future innovative use cases. Making AI easier to use, with ‘no code’ applications, will allow subject matter experts to inform models, minimize bias and exploit machine learning to the full.
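For readers unfamiliar with the self-organizing maps Santana mentions, here is a minimal NumPy sketch of the idea (a toy 1D SOM on made-up ‘attribute’ data, not Geophysical Insights’ implementation): map nodes compete to match each sample, and the winner and its neighbors are pulled toward the data, so similar attribute vectors end up classified by nearby nodes.

```python
import numpy as np

def train_som(data, n_nodes=4, epochs=200, lr0=0.5, sigma0=2.0, seed=0):
    """Minimal 1-D self-organizing map: the best-matching node and its
    neighbors on the 1-D map are pulled toward each training sample."""
    rng = np.random.default_rng(seed)
    nodes = rng.standard_normal((n_nodes, data.shape[1]))
    for t in range(epochs):
        lr = lr0 * (1 - t / epochs)              # decaying learning rate
        sigma = sigma0 * (1 - t / epochs) + 0.5  # shrinking neighborhood
        for x in rng.permutation(data):
            bmu = np.argmin(np.linalg.norm(nodes - x, axis=1))  # best match
            dist = np.abs(np.arange(n_nodes) - bmu)
            h = np.exp(-(dist / sigma) ** 2)     # neighborhood weighting
            nodes += lr * h[:, None] * (x - nodes)
    return nodes

# Two synthetic 'seismic attribute' clusters (hypothetical 2-D vectors).
rng = np.random.default_rng(1)
a = rng.normal([0, 0], 0.1, (50, 2))
b = rng.normal([3, 3], 0.1, (50, 2))
data = np.vstack([a, b])
nodes = train_som(data)
# Classify each sample by its nearest map node.
labels = np.array([np.argmin(np.linalg.norm(nodes - x, axis=1)) for x in data])
```

In production use the input vectors would be multiple seismic attributes per sample, and the trained map colors the volume by winning node, revealing facies-like patterns.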
Battelle’s Srikanta Mishra has been using ML models to evaluate carbon capture and sequestration projects. CCS is a mostly proven technology, with ‘no major surprises’ from projects to date, but uncertainties remain regarding leakage and induced seismicity. With regard to data-driven modeling, there are two camps: some geoscientists and engineers may already be applying these methods in an ‘ad hoc’ manner, while others may be holding off for lack of training. In general, ML is applicable when conventional physics-based models are too computationally expensive or when the physics is poorly understood. Enter the ML ‘black box’ model. These are recommended when the cost of a wrong answer is low compared to the benefit of getting it right (e.g. proxy models in history matching), when they give better results (such as pattern recognition on well tests) or as tools to ‘inspire and guide’ (preventative maintenance). The big issue here is combining ML with physics. Enter ‘science-informed machine learning to accelerate real-time decisions in subsurface applications’, a.k.a. the US Department of Energy’s SMART initiative. Here, Battelle and other US R&D institutions are building a 3D proxy model for a full-field CCS project. A three orders of magnitude speedup is reported over conventional modeling with CMG GEMS. Acceptance challenges remain for ML: geoscience models are not as successful as those used in consumer marketing or the social sciences. The field is immature, perhaps at the stage of geostatistics in the 1990s. Mishra recommends a mindset that sees AI less as ‘curve fitting’ and more as ‘the extraction of insights consistent with mechanistic understanding’.
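The proxy-model idea behind the reported speedup can be sketched in a few lines of NumPy (a deliberately simplified illustration, not the SMART initiative’s 3D model): run the expensive simulator a handful of times, fit a cheap surrogate to those runs, then screen thousands of scenarios at negligible cost. The ‘simulator’ below is a made-up smooth response; in reality each call would be a full-physics run taking hours.

```python
import numpy as np

def expensive_simulator(perm):
    """Stand-in for a full-physics CCS run (hours per call in reality):
    a toy smooth response of plume extent to permeability."""
    return 30.0 - perm + 0.1 * perm ** 2

# 1. Run the expensive model at a handful of design points.
train_x = np.linspace(1, 100, 15)
train_y = expensive_simulator(train_x)

# 2. Fit a cheap polynomial proxy to those runs.
proxy = np.polynomial.Polynomial.fit(train_x, train_y, deg=3)

# 3. Screen thousands of scenarios at negligible cost.
scenarios = np.linspace(1, 100, 10_000)
pred = proxy(scenarios)
err = np.max(np.abs(pred - expensive_simulator(scenarios)))
print(f"max proxy error: {err:.2e}")
```

Real proxies use neural nets or Gaussian processes over many uncertain parameters, but the economics are the same: a few expensive runs buy unlimited cheap evaluations, which is where orders-of-magnitude speedups come from.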
Federico Giannangeli enumerated some of the AI projects in the Repsol TechLab product catalog. Optimized seismic acquisition is set to slim down today’s ‘conservative’ seismic acquisition design. Sparse acquisition and simultaneous shooting could reduce costs by up to 30%. The sparse data is then interpolated with an ML-based ‘blind trace network’ which also de-blends the simultaneous-source data.
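The trace-interpolation problem the blind trace network addresses can be shown with a toy NumPy example (ours, not Repsol’s network): decimate a synthetic section to mimic sparse acquisition, then reconstruct the missing traces, here with simple neighbor averaging where the ML model would do the filling-in.

```python
import numpy as np

n = 64
t = np.arange(n)
# Toy section of smooth dipping events: sample [i, j] = sin at time i, trace j.
section = np.sin(2 * np.pi * np.add.outer(t, 0.2 * t) / 16)

sparse = section.copy()
sparse[:, 1::2] = np.nan           # drop every other trace (sparse acquisition)

recon = sparse.copy()
for j in range(1, n - 1, 2):       # fill each gap from its two live neighbors
    recon[:, j] = 0.5 * (sparse[:, j - 1] + sparse[:, j + 1])

# Reconstruction error on the interior traces (edge trace left unfilled).
err = np.max(np.abs(recon[:, 1:n - 1] - section[:, 1:n - 1]))
print(f"max interpolation error: {err:.4f}")
```

Linear averaging only works because this toy data varies slowly from trace to trace; the point of a learned network is to handle steep dips, aliasing and blended shots where such simple schemes fail.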
MC Michael Dunn (Geophysical Insights) challenged the panel to provide examples of their most-used ML applications and, especially, whether any had produced results from data that were ‘invisible’ before ML. For Repsol, ML is key in many areas, such as rock characterization without cores, using cuttings and image processing. Field development and production can be optimized with deep learning. In other words, it is applicable everywhere! Randall Gentry (Petrolern) stated that while ML has been around for a while, it remains emerging technology in terms of adoption. There has been no ‘eureka moment’, but Petrolern (a geomechanics company) has recently started using AI to separate out features from drilling records. By monitoring the drill bit it may be possible to run fewer downhole logs and reconstruct core information. This was based on early data from DoE grant-funded work.
Lennart Johnsson (U. Houston) observed that while ML has changed the game in terms of research activity, and has taken high performance computing everywhere, it has not so far brought new insights into computer science itself. ML is currently trying to do as well as humans or classical methods. ML is also sensitive to its data sets: small changes can lead to wrong classifications. And there is one big negative, the compute resources and energy required for model training. Johnsson cited an MIT Review article reporting that 2/3 of research money was spent on HPC heating and cooling, leaving only 1/3 for research! Dunn continued his quest for the ML killer app. How do you quantify the impact of ML on the business? Has anyone observed changes in the probability of success of finding oil? Repsol cited the case of a dry hole avoided thanks to an ML-based analysis. But in general, it would not appear that the real killer AI app has arrived. Unless, that is, companies are, as Dunn suggested, ‘keeping schtum!’
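Johnsson’s point about sensitivity to small input changes is easy to demonstrate with a deliberately simple classifier (a made-up nearest-centroid example, not one of the models discussed): a sample sitting near a decision boundary flips class under a tiny perturbation.

```python
import numpy as np

# Minimal nearest-centroid classifier: two class centroids in feature space.
centroids = np.array([[0.0, 0.0], [1.0, 0.0]])

def classify(x):
    """Assign the class of the nearest centroid."""
    return int(np.argmin(np.linalg.norm(centroids - x, axis=1)))

x = np.array([0.49, 0.0])                  # just on the class-0 side
x_perturbed = x + np.array([0.02, 0.0])    # tiny nudge crosses the boundary

print(classify(x), classify(x_perturbed))  # prints: 0 1
```

Deep nets have far more convoluted decision boundaries, which is why imperceptible perturbations can flip their outputs too, a real concern when the inputs are noisy field data.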
Geophysical Insights is hosting sponsor of the Upstream ML event.
© Oil IT Journal - all rights reserved.