AAPG 2005 Calgary—post-boom record falls

With 7,748 in attendance, this was the largest AAPG since the 1980s boom (the all-time record is 13,000). In a caricature of industry demographics, the average age at the Awards Ceremony might have registered on the Phanerozoic time scale! A more youthful (and feminine) audience was visible at the environmental/Kyoto session. On the exhibition floor, we noted a plethora of applications for modeling anything—from plate tectonics to sand juxtaposition across fault planes (see this month’s editorial). The same models cropped up in several of the talks we attended, where increasingly the ‘demonstration’ paradigm of the learned journal is usurped by a presentation of the results of a computer model. Some modeling techniques, such as deriving a stereonet of fault distributions, really turn on the geologists, as do seismically derived palinspastic models—especially when spiced up with some migration pathway modeling. A significant driver of the current software push comes from BP’s geological toolkit run-off and ‘co-visualization’ research effort. We were impressed by ExxonMobil’s ‘RETR’, the interpretational equivalent of ‘extreme programming’. Schlumberger’s new booth was a model of roominess and was well frequented. Roxar was conspicuous by its absence. And Landmark was somewhat obscured by the smoke coming from a ‘re-ignited’ Geographix.

Sidney Powers awardee Ken Glennie (ex-Shell), author of the definitive Petroleum Geology of the North Sea, gave an entertaining account of early work with Shell, investigating an apparent association between oil in Oman and nearby outcrops of oceanic basalts – which turned out to be tectonic serendipity. Declining a posting back to the UK in 1972, Glennie was banned from promotion for life, reflecting the ‘autocratic times’ of the day [today, you’d just get fired!]. Glennie went on to become an educator, within Shell and outside, with publications on the petroleum geology of NW Europe, the desert of SW Arabia and, most recently, the southern North Sea’s Permian basin.

Reserves Session

Mary Van Der Loop (Ammonite Resources) quoted Shell’s Walter van de Vijver as being ‘sick and tired of lying about the extent of our reserves issues’ during the write-down debacle. Because reserve growth is one of the best indicators of market returns, it is an important KPI for the ‘mom and pop’ investor. Van Der Loop therefore tracks companies’ proven undeveloped (PUD) reserves as obtained from SEC 10-K filings. These ‘cut through the glowing discourse of the Annual Report’. Shell’s PUD to proven developed ratio climbed to 56% in 2002 before dropping to 48% on revision, bringing it into line with Exxon. Elsewhere, Van Der Loop notes ‘gyrations’ in PUD around a general upward trend. So companies are either finding great stuff ‘or stretching the truth.’ Oil price rises move the ‘dogs’ into reserves, and a large fraction of the PUD increase is due to pressure from Wall Street. But the really big question is, ‘Is there someone in Saudi Arabia saying “I’m sick and tired of lying…”?’

Dan Tearpock (SCA) heads up the joint AAPG/SPEE investigation into the ‘reserves question’ and a possible certification program. A decision is to be made later this year. Whatever it is, it will be the fruit of a stupendous number of committees. Five top-level committees will investigate reserve definitions, ethics, qualifications etc. Below them, nested sub-committees will study recommended practices, geoscience and modeling issues etc. With worldwide reserves valued at around $600 trillion, the key question for Tearpock is, ‘How much of this is real?’
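As an aside from the report itself, here is a minimal Python sketch of the kind of ratio tracking Van Der Loop describes. The company figures and the 5% ‘gyration’ threshold are invented for illustration and are not taken from her talk.

```python
# Sketch (not Van Der Loop's actual method): track the ratio of proved
# undeveloped (PUD) to proved developed reserves, transcribed by hand from
# SEC 10-K filings, and flag large year-on-year swings ('gyrations').
# All numbers below are illustrative placeholders.

reserves_10k = {
    # year: (proved_undeveloped_mmboe, proved_developed_mmboe)
    2000: (4200.0, 9100.0),
    2001: (4800.0, 9000.0),
    2002: (5100.0, 9050.0),
    2003: (4400.0, 9150.0),
}

def pud_ratio(pud: float, developed: float) -> float:
    """Proved undeveloped reserves as a fraction of proved developed reserves."""
    return pud / developed

previous = None
for year in sorted(reserves_10k):
    pud, dev = reserves_10k[year]
    ratio = pud_ratio(pud, dev)
    flag = ""
    if previous is not None and abs(ratio - previous) > 0.05:
        flag = "  <- gyration: check revisions vs. genuine additions"
    print(f"{year}: PUD/developed = {ratio:.0%}{flag}")
    previous = ratio
```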

Schlumberger

Schlumberger’s talk on the future of E&P IT made some interesting claims. Like the imminent ‘irrelevance’ of Moore’s Law. Seemingly we will soon have all the CPU bandwidth we want. In the 1990s, seismic processors were all waiting on the CPU. Today only 20% of the wait is CPU-dependent, with more time spent on QC, visualization, testing new processing paradigms and on interpretation. A ‘GUI rebellion’ is underway—against the way the computer makes us work. Soon we will use pens and voice control of ‘learnable’ software. SIS is working with the MIT Media Lab on ‘tangible computing,’ using the whole body as input—a touted remedy for carpal tunnel syndrome. Other futuristic pronouncements seemed little more than optimistic extrapolations from the situation today – thus integration will be a given, as will ‘pervasive’ optimization, ‘ubiquitous’ geomechanics etc. All enabled by SIS’ data and consulting unit – its fastest-growing business sector. Other innovations include digital structural analogs (from the University of Alberta), artificial intelligence applied to real-time data, decision support, ‘social network analysis’ and a move ‘beyond databases’ into ‘workflow repositories,’ part of Shell’s brave new smart fields world.

Anadarko

Henry Posamentier (Anadarko) presented a fireworks display of seismic geomorphology leveraging VoxelGeo, StratiMagic and image processing with ER Mapper. Seismic geomorphology has undergone a revolution, moving from the 2D internal reflection mapping of the past to 3D seismic volume interpretation. By changing cube transparency, fluviatile patterns or carbonate patch reefs are revealed. A video of the Alaskan shelf showed stupendous imagery of sedimentary structures such as crevasse splays, revealed by changing vertical relief and illumination angle. Such techniques can also identify ‘FLTs’ (funny looking things—a ‘technical term’). Quantitative seismic geomorphology leverages geoprocessing with ESRI tools to study channel sinuosity, and centerline mapping for automated channel counts.
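By way of illustration (not from the talk itself), a minimal Python sketch of the sinuosity measure behind such channel metrics: along-channel path length divided by the straight-line distance between the channel end points, computed here on a made-up digitized centerline.

```python
# Sketch of a channel sinuosity calculation from a digitized centerline.
# Sinuosity = (path length along the centerline) / (end-to-end distance).
# The map coordinates below are invented for illustration.
import math

centerline = [(0.0, 0.0), (120.0, 80.0), (200.0, 40.0), (330.0, 150.0), (450.0, 160.0)]

def sinuosity(points):
    """Path length along the centerline divided by end-to-end distance."""
    path = sum(math.dist(a, b) for a, b in zip(points, points[1:]))
    straight = math.dist(points[0], points[-1])
    return path / straight

print(f"Sinuosity: {sinuosity(centerline):.2f}")  # 1.0 = straight; higher values indicate meandering
```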

Geographix

As a part of its ‘re-ignition,’ Geographix demonstrated its new native OpenWorks connectivity with data from BP’s Wytch Farm field. Geographix accessed interpretation data from SeisWorks and PetroWorks, leveraging its map-centric approach. A right click on the map highlights wells with tops and production data. Production data and decline curves are easily obtained, as is rather more tabular data than most folks would want to see in a demo! Thematic mapping can be tailored to users’ preferences and the cross-section display (recently acquired from A2D) is indeed pretty nifty. On the data management front, the association of Geographix and OpenWorks could be a compelling solution for some clients.

Apache

Alan Clare (Apache), speaking on the Schlumberger booth, described a 140 million cell model of the North Sea Forties field, created in Petrel. For simulation, this was ‘dumbed down’ to a 3 million cell model for use in the 3D StreamLine simulator. Layer-conformable scaling and grid refinement preserved as much heterogeneity as possible in the unswept periphery of the field, where seismic data quality is poor. Production data was used to refine the correlation. Modeling required the introduction of a permeability barrier at the base of the channel system which had not been detected in the logs. Results were encouraging, with STOIIP up from 4.2 to 5 billion bbl in place, but with all the remaining oil in a very poor quality reservoir. A history match run takes 28 hours and total CPU time for the study was around 1,700 hours, producing a well-by-well match to production within 1%. A streamline time-of-flight video showed unswept areas behind shale barriers. The big question for Apache now is whether to shoot another 4D survey.

ExxonMobil

Lester Landis presented the E&P equivalent of ‘Extreme Programming,’ Reservoir Evaluation Time Reduction (RETR). This has produced a ‘step change’ in cycle time reduction for reservoir evaluation ‘without compromising interpretation quality’. Key enablers are Schlumberger’s Petrel and Exxon’s EPSIM and Power simulators. The idea is simple: build a simple model quickly and then refine it. EPSIM builds a single common-scale property model, while Power does the fluid flow simulation. The initial ‘low frequency’ model, built in the ‘discovery’ pass, is expected to be modified, so the model is designed to accommodate subsequent change. The later ‘flow unit’ pass rolls in sub-seismic features like shale barriers. Landis advocates using computer power to move faster through the workflow, rather than to build bigger models. He also advocates using a common-scale model, rather than upscaling from a fine-scale model for flow simulation.

Renewable Energy Seminar

Talking about Kyoto in the oil and gas business is a risky proposition. It was therefore with some circumspection that Steve Ball (Conestoga-Rovers) presented a ‘Kyoto 101’. Since Kyoto, and despite US reticence, many oil companies have adopted renewable carbon credits and are investing in wind farms and other diversifications. In Alberta, energy deregulation is making wind farms an economic proposition. For Ball, the consensus is that ‘global warming is happening.’ Greenhouse gases are rated in CO2 equivalents, so that methane rates 21 CO2e and SF6 (used in transformers) a whopping 23,900 CO2e! The main body administering carbon trading is the UN Framework Convention on Climate Change (UNFCCC). Kyoto’s goals are to reduce emissions to below 1990 levels, with the onus on developed countries. Results are mixed: Canada is up 20% on 1990 levels so far, while the UK is down 15%. Mechanisms are being introduced to trade carbon credits on a global carbon credit stock market. Other projects aim at better carbon management—landfill gas collection and use, CO2 injection and sequestration, reforestation and, of less interest to geologists, manure management.
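For readers unfamiliar with the arithmetic, a minimal Python sketch of the CO2-equivalence weighting quoted above. The global warming potentials (21 for methane, 23,900 for SF6) are from the talk; the emission tonnages are invented for illustration.

```python
# Sketch of CO2-equivalence arithmetic: each gas is weighted by its global
# warming potential (GWP), so 1 tonne of methane counts as 21 tonnes CO2e
# and 1 tonne of SF6 as 23,900 tonnes CO2e. Tonnages below are made up.

GWP = {"CO2": 1, "CH4": 21, "SF6": 23_900}

emissions_tonnes = {"CO2": 50_000, "CH4": 300, "SF6": 0.2}

total_co2e = sum(emissions_tonnes[gas] * GWP[gas] for gas in emissions_tonnes)
print(f"Total emissions: {total_co2e:,.0f} tonnes CO2e")
# 50,000 + 300*21 + 0.2*23,900 = 61,080 tonnes CO2e
```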


This article has been taken from a longer, illustrated report produced as part of The Data Room’s Technology Watch service. Subscription info from tw@oilit.com or visit www.oilit.com and follow the Technology Watch link.

© Oil IT Journal - all rights reserved.