2019 Houston Oil and Gas (seismic) Machine Learning Symposium

Advertas*-hosted event hears from: IGI on AI in geophysics and the SEG/SEAM AI project. Geophysical Insights – Will ML replace the interpreter? Chevron – ML in reservoir engineering. ConocoPhillips – ML in 4D seismic and ‘direct quantitative prediction’. AASPI Consortium on ML in attribute classification. Geophysical Insights - making deep learning accessible to the interpreter. Dell on digital natives vs. the ‘see the beach’ generation.

IGI on AI in geophysics and the upcoming SEG/SEAM AI project

Former SEG president Nancy House (now with IGI-LLC) provided a history of seismic interpretation leading up to the revolution in exploration geophysics that unconventional exploration has produced. Today the watchword is 3D quantitative interpretation, allowing fine-detail rock properties, including key indicators such as total organic carbon, to be mapped from seismics. The industry is currently at a crossroads, with conventional physics-based methods being supplemented by AI techniques. In one example, a 3D seismic attribute volume was generated in a pre-stack inversion tied to geochemical well data, and a hydrocarbon indicator was obtained using an artificial neural network. The interpreter may be confronted with a very large number (90+) of seismic attributes. Manual interpretation with multivariate regression analysis can usefully be replaced with a neural net/machine learning approach. House announced an upcoming AI co-operative project, ‘testing AI applications in petroleum geophysics’, that will run under the auspices of the SEG SEAM. Objectives are the evaluation of AI-derived results for accuracy and efficiency. The SEAM website does not yet reference this work; interested parties should email seam@seg.org. There will also be an SEG-sponsored workshop, ‘Artificially intelligent earth exploration: teaching the machine how to characterize the subsurface’, in Muscat, Oman, 19-21 April 2020.

Geophysical Insights – Will ML replace the interpreter?

Rocky Roden (Geophysical Insights) asked rhetorically, ‘Will machine learning profoundly change geoscience interpretation?’ AI/ML is a disruptive technology, but it will not replace geoscience interpreters. However, interpreters who do not use machine learning will be replaced by those who do. ML can improve interpretation workflows by providing the desired answers faster and more accurately. Already, ML applications provide higher resolution than conventional methods for inversion, fault delineation and facies classification. Semi-supervised and reinforcement learning methods ‘hold great promise to reveal previously unseen phenomena’.

Chevron on ML in reservoir engineering

Sarath Ketineni (Chevron) presented on machine learning applications in reservoir engineering, with reference to AI-based reservoir characterization (deriving synthetic well logs from seismics using neural networks) and pay zone identification. This used a big data approach, feeding well, seismic, completions and other data into a neural net to forecast future production. Both techniques can be combined to steer a well path through major sweet spots. For more on this see Ketineni’s paper SPE-174871-MS, ‘Structuring an Integrative Approach for Field Development Planning Using Artificial Intelligence and its Application to an Offshore Oilfield’. Another use has been in unconventional reserves forecasting in the Eagle Ford shale, where conventional methods of predicting oil and gas recovery fail. Random Forest regression helps identify the most important parameters (25 variables from 4,000 wells were used in the analysis). Again, the results are available in SPE-196158-MS, ‘A Machine Learning Analysis Based on Big Data for Eagle Ford Shale Formation’. A third example is SPE-196089-MS, ‘The Importance of Integrating Subsurface Disciplines with Machine Learning’, a case study from the Spirit River formation (John Hirschmiller et al., GLJ Petroleum Consultants). Ketineni concluded that ML applications in oil and gas are growing rapidly and that petroleum engineers need a better understanding of data science fundamentals, their applicability and limitations. Useful resources include the SPE’s Data Science and Digital Engineering Journal, Coursera’s ML courses and the ubiquitous TensorFlow.
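The Random Forest parameter-ranking idea can be sketched in a few lines of scikit-learn. This is a minimal illustration on synthetic data, not the 4,000-well Eagle Ford dataset of SPE-196158-MS; the variable names are invented for the example.

```python
# Rank well/completion variables by Random Forest importance (synthetic sketch).
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
names = ["lateral_length", "proppant_per_ft", "stage_count", "porosity"]
X = rng.normal(size=(500, len(names)))
# Make 'production' depend mainly on the first two variables, plus a little noise.
y = 3.0 * X[:, 0] + 1.5 * X[:, 1] + 0.1 * rng.normal(size=500)

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
ranked = sorted(zip(names, model.feature_importances_), key=lambda t: -t[1])
for name, imp in ranked:
    print(f"{name:16s} {imp:.3f}")
```

The `feature_importances_` vector sums to one, so the ranking reads directly as each variable’s relative contribution to the forest’s splits.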

ConocoPhillips – ML in 4D seismic and ‘direct quantitative prediction’

Mike Brhlik (ConocoPhillips) reported on trials of ML in 4D time-lapse seismic interpretation for reservoir monitoring. Here the aim is to follow reservoir property changes directly using multiple attributes from successive seismic surveys. This is currently done with rock physics modeling and simulation in ‘semi-quantitative’ linear workflows. Brhlik proposes a new ‘direct quantitative prediction’ approach using a data-driven workflow that also embraces the physics. The model was trained on forward-modeled synthetic seismic data to assess various predictors (elastic/seismic attributes) and targets (pressures, saturations) at wells. The 4D synthetic project established that the time-lapse seismic inverse problem is solvable by ML regression algorithms. The Random Forest approach gave the best results, along with gradient boosted trees; neural nets were less successful. The approach has reduced interpretation cycle time and provides a ‘common ground’ for revisiting 4D model-updating workflows that facilitates interdisciplinary integration.
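The regression benchmark Brhlik describes can be sketched as follows, assuming (hypothetically) three elastic attributes predicting a saturation change. The data here is synthetic and illustrative only, not ConocoPhillips’ forward-modeled volumes.

```python
# Compare Random Forest and gradient boosted trees on a synthetic
# attribute-to-saturation regression, echoing the 4D benchmark idea.
import numpy as np
from sklearn.ensemble import RandomForestRegressor, GradientBoostingRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n = 2000
attrs = rng.normal(size=(n, 3))  # stand-ins for e.g. dVp/Vp, dVs/Vs, d(rho)
sat_change = np.tanh(attrs[:, 0]) + 0.5 * attrs[:, 1] ** 2 + 0.05 * rng.normal(size=n)

X_tr, X_te, y_tr, y_te = train_test_split(attrs, sat_change, random_state=0)
scores = {}
for model in (RandomForestRegressor(random_state=0),
              GradientBoostingRegressor(random_state=0)):
    model.fit(X_tr, y_tr)
    scores[type(model).__name__] = r2_score(y_te, model.predict(X_te))
print(scores)
```

On a held-out split, both tree ensembles recover the nonlinear attribute-to-property mapping well; which one wins on real 4D data will depend on the noise and the physics baked into the synthetics.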

AASPI/University of Oklahoma – ML and seismic facies classification

Kurt Marfurt (AASPI Consortium/University of Oklahoma) gave a wide-ranging presentation on finding the best attribute combinations for seismic facies classification. Marfurt has worked with a large Gulf of Mexico seismic dataset, using multi-attribute/ML classification to distinguish salt, mass transport flow and sediment accumulation. He concludes that human interpreters working with seismic attributes are good at identifying 2D spatial patterns, but that human interactive analysis is limited to about 3 volumes using different color models, transparency, and/or animation. In contrast, machine learning can analyze dozens of attributes at the same time. Moreover, ‘for normal amounts of training data using modern computers, an exhaustive search for the optimum number and combination of attributes is both desirable and feasible’. Having said that, Marfurt cautioned ‘We’ve presented three workflows for attribute selection; we do not yet know which is best for a given mapping task’.
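Marfurt’s point that an exhaustive attribute search is feasible for normal amounts of training data can be sketched with `itertools.combinations` and cross-validation. Attribute names and data below are made up for illustration; only the search pattern reflects the talk.

```python
# Exhaustive search over attribute subsets, scored by cross-validated
# facies classification accuracy (synthetic sketch).
from itertools import combinations

import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
names = ["coherence", "curvature", "envelope", "GLCM_entropy"]
X = rng.normal(size=(400, len(names)))
# Synthetic facies label driven by two of the four candidate attributes.
y = (X[:, 0] + X[:, 3] > 0).astype(int)

best = (0.0, ())
for k in range(1, len(names) + 1):
    for combo in combinations(range(len(names)), k):
        clf = RandomForestClassifier(n_estimators=50, random_state=0)
        score = cross_val_score(clf, X[:, combo], y, cv=3).mean()
        if score > best[0]:
            best = (score, tuple(names[i] for i in combo))
print(best)
```

With a handful of candidate attributes the subset count (2^n - 1) stays small; for the 90+ attributes mentioned earlier one would prune first, which is exactly where the choice among Marfurt’s three selection workflows matters.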

Geophysical Insights - making deep learning accessible to the interpreter

Dustin Dewett (Geophysical Insights) wants to make deep learning accessible to the seismic interpreter and to identify facies from complex seismic waveforms. However, the most commonly used deep learning technique, the convolutional neural network (CNN), requires large amounts of training data that may not be available. Dewett’s approach is to tune the model so that it can be trained with less data and to automate the generation of training data. Using data from the Taranaki Basin, offshore New Zealand, 31 training lines were manually labeled as a training set. This represented a 1GB stacked volume of some 500 lines with 100 CDPs/line. Dewett showed that a CNN model with very few training lines still provides a useful result. CNN fault detection can be further accelerated by training on large-scale synthetic data. Directly applying a pretrained fault detection network is extremely efficient.
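The synthetic-training-data idea can be illustrated with a toy generator: random flat reflectivity, a dipping-free vertical fault with random throw, and a matching label mask. Everything here (section size, throw range) is an invented simplification, not Dewett’s actual workflow.

```python
# Toy generator for synthetic fault-training pairs (section, label mask)
# of the kind used to pretrain fault-detection CNNs.
import numpy as np

def synthetic_fault_section(nt=64, nx=64, seed=0):
    rng = np.random.default_rng(seed)
    trace = rng.normal(size=nt)                  # random reflectivity series
    section = np.tile(trace[:, None], (1, nx))   # flat layering
    fault_x = int(rng.integers(nx // 4, 3 * nx // 4))
    throw = int(rng.integers(2, 6))
    # Shift everything right of the fault down by 'throw' samples.
    section[:, fault_x:] = np.roll(section[:, fault_x:], throw, axis=0)
    label = np.zeros((nt, nx), dtype=np.uint8)
    label[:, fault_x] = 1                        # binary fault-plane mask
    return section, label

section, label = synthetic_fault_section()
print(section.shape, int(label.sum()))
```

Generating thousands of such pairs is cheap, which is the appeal: the CNN is pretrained on unlimited labeled synthetics and then applied, or lightly fine-tuned, on the sparse real labels.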

Dell and the digital natives

David Holmes (Dell) gave a reprise of the Agile hackathons along with another example of a hackathon event. Here an Agile/Enthought/Fugro team working on 200GB of high-resolution, multi-polarization thin section imagery trained a model to segregate and classify grain mineralogy. Tools of the trade included scikit-learn, Jupyter, NumPy and AWS hosting. Holmes presented the brave new world of AI/ML with what could be considered an ageist contrast between Gen Z ‘digital natives’ and Gen ‘See The Beach’, the latter characterized by a rather tired-looking group of EAGE luminaries! Agile’s Open subsurface stack got a mention, as did the more recent OSDU, a ‘unique and unprecedented’ collaboration. Holmes concluded with his favorite theme of the ‘citizen data scientist’, part hacker, part statistician and part geoscientist. Whichever camp you are in, you need provenance tools and governance for the data that is consumed, the models that are produced and the algorithms that are generated. Watch the Dell EMC oil and gas video here.

Registration for the 2020 Oil and Gas Machine Learning Symposium is available at UpstreamML.com.

* Houston-based Advertas is a marketing and PR firm serving clients in energy and technology. In 2009 the company was retained by Geophysical Insights as its outsourced marketing and business development partner.


© Oil IT Journal - all rights reserved.