The 2011 Society of Exploration Geophysicists’ annual convention drew just over 8,000 attendees. It kicked off with a rather lackluster Forum on ‘Exploration Frontiers, geography, technology and business models.’ Shell’s David Lawrence reported that shale/tight gas reserves would provide 100 years of supply in the US and 250 years internationally. Shell and Hewlett-Packard will field a million-sensor wireless seismic system by 2015. Tim Dodson (Statoil) shared some spectacular seismics on the Skrugard discovery showing a massive double flat spot (gas on oil and oil on water), visualized thanks to ‘partially proprietary’ algorithms bringing ‘competitive advantage’ to Statoil. WesternGeco president Carl Trowell observed that seismic imaging is evolving faster than drilling. Depth imaging now constitutes over 50% of WesternGeco’s business. Every imaging project is bespoke—contractor and operator work ‘hand in hand.’ This is forcing organizational and technological change as software platforms need to share velocity and earth model data. A modern 1,000 sq. km survey with 120,000 full-azimuth point receivers can generate 700 terabytes. The petaflop computing needed for this has been ‘piggybacking’ on gaming technology but today, WesternGeco is designing its own ‘chips of the future.’
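The quoted survey volumes are easily sanity-checked. A back-of-envelope sketch in Python (the per-receiver figure is our own arithmetic, not WesternGeco’s):

```python
# Back-of-envelope check on the survey figures quoted above:
# 700 terabytes spread across 120,000 point receivers.
receivers = 120_000
total_bytes = 700e12                      # 700 TB (decimal terabytes)

per_receiver_gb = total_bytes / receivers / 1e9
print(f"~{per_receiver_gb:.1f} GB per receiver")   # prints ~5.8 GB per receiver
```

At roughly 6 GB per channel per survey, the petaflop-scale processing requirement is unsurprising.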
Leila Gonzales reported on an American Geological Institute study that broadly confirmed the imminent ‘big crew change’: a large cohort of geoscience professionals falls in the 50-60 age group and around half of today’s 260,000 geoscientists will retire in the next decade. Even if all new graduates are hired, this will mean a 150,000-strong shortfall by 2021. Geoscience degree production is moving east, to the EU, Russia, China and Indonesia, making for a global, mobile workforce—www.oilit.com/links/1110_10.
Parveneh Karimi of the University of Texas at Austin presented a novel approach to the ‘UVT’ system of stratigraphic coordinates as deployed in Paradigm’s SKUA. The method allows interpreters to work in a pre-deformation frame of reference and is said to improve inversion. The new method leverages Sergey Fomel’s ‘predictive painting’ autopicking algorithm—www.oilit.com/links/1110_11.
Alex Martinez provided insight into ExxonMobil’s approach to shale gas exploration. Shale gas success involves matching geoscience challenges with drilling decisions. Rock physics is the ‘Rosetta stone’ at the interface of these disciplines and the key technique is forward modeling from hypothesis to seismic response. Rock physics intervenes across the acquisition, processing and interpretation cycle. Regulatory and environmental challenges impact survey design. It can be hard to untangle the influence of fractures from other effects. Decision-making speed is of the essence as shale gas drilling may move on regardless! GPS and GIS are heavily used in acquisition. In a Piceance basin survey Exxon issued GPS transponders ‘to show the regulator where everyone was.’ The survey mobilized seven helicopters simultaneously. High performance computing is a big help in quantitative data analysis. While seismic costs are high relative to drilling, Martinez hopes that ‘we can move from pattern drilling to seismically guided drilling.’
Landmark’s latest release of its DecisionSpace Desktop includes Permedia’s Mpath for petroleum system modeling (acquired by Halliburton last year). A new subfusc GUI is said to ‘draw the eye to the data.’ A ‘VelGen’ velocity tool works across the processing and interpretation domains—from ProMax/JavaSeis to complex multi-z structural frameworks in GeoProbe. The LithoTect add-in allows for detailed structural geological interpretation. All results are stored in the OpenWorks database and an API is available for third-party plug-in development—underlining DSD’s role as challenger to Schlumberger’s Petrel. Intriguingly, DSD was also running on a Mac, showing gestural interaction with the data via the touch pad. This is currently a research platform but is considered a ‘vindication of the flexibility provided by Landmark’s Java/Qt codebase.’
Rich Hermann and Jitesh Chanchani elaborated on IHS’ acquisition of Seismic Micro-Technology (SMT). The deal was struck on the back of SMT’s ‘market leading’ position in unconventionals. Short-term plans include integration between SMT’s Kingdom and IHS’ Petra, GeoSyn and LogArc packages. Later this will evolve into a new ‘unified G&G workstation,’ extending the Kingdom SDK with more specialized apps and close coupling with IHS’ dataset of 5.5 million wells worldwide.
OpenSpirit, now part of Tibco, was showing ‘Tibbr,’ social networking for the enterprise. Tibbr lets users ‘follow’ people, subjects and events by subscribing to, for instance, a ‘US Land’ or a ‘New Ventures’ stream. Streams can be expense reports or invoices and each rolls up different apps and data sources. Tibbr ‘removes the chatter’ by moving email ‘conversations’ to a database and reducing notification frequency. Apache is trialing the system on a drilling community of practice—www.oilit.com/links/1110_12.
OpenSpirit has been busy integrating Tibco’s BusinessWorks information hub to extend its coverage to SAP and any Oracle database. Business process automation can now span geotechnical and financial data sources. A visual programming paradigm provides connectors to PeopleSoft, Siebel and Tibco’s ActiveMatrix orchestration engine. A use case might connect well information in SAP to OpenWorks, managing units of measure and coordinates en route.
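As an illustration of the unit-of-measure handling such a process must perform, here is a minimal Python sketch. The SAP field names, the OpenWorks-style target schema and the sample record are all hypothetical, invented for illustration—they are not vendor APIs or real data:

```python
# Hypothetical sketch of the kind of field mapping a BusinessWorks process
# might perform between an SAP well record (metric) and an OpenWorks-style
# target (feet). All field names below are made up for illustration.

FT_PER_M = 3.28084  # international feet per meter

def sap_to_openworks(sap_well: dict) -> dict:
    """Map a metric SAP-style well record to an imperial OpenWorks-style one."""
    return {
        "well_name": sap_well["EQUI_DESC"],               # hypothetical SAP field
        "total_depth_ft": sap_well["DEPTH_M"] * FT_PER_M,  # unit conversion en route
        # Coordinates would normally be re-projected via a CRS library;
        # here we simply pass them through with their datum tag intact.
        "x": sap_well["X"], "y": sap_well["Y"], "crs": sap_well["CRS"],
    }

rec = {"EQUI_DESC": "WELL-001", "DEPTH_M": 3000.0,
       "X": 500000.0, "Y": 4100000.0, "CRS": "EPSG:32613"}
print(round(sap_to_openworks(rec)["total_depth_ft"], 2))  # prints 9842.52
```

The real value of hub-style orchestration is that such conversions happen once, in the middleware, rather than in every consuming application.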
Kris Pister (UC Berkeley) reviewed the state of the art in wireless networks. Modern spread-spectrum, channel-hopping nodes run for years on batteries or on energy ‘harvesters.’ The ‘nasty’ standards battle has been won by the WirelessHART protocol. Chevron has deployed a 4,000-sensor mesh at its Richmond refinery—www.oilit.com/links/1110_13.
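The channel hopping Pister described can be sketched in a few lines. This is a simplified illustration of slot-based hopping across the fifteen 2.4 GHz IEEE 802.15.4 channels that WirelessHART uses, not the standard’s actual schedule or API:

```python
import random

# Simplified illustration of WirelessHART-style channel hopping: each
# timeslot, a link hops to a new channel by indexing a pseudo-random
# permutation of the fifteen 2.4 GHz IEEE 802.15.4 channels (11-25).
CHANNELS = list(range(11, 26))
hop_pattern = random.Random(42).sample(CHANNELS, len(CHANNELS))

def channel_for_slot(slot, offset=0):
    """Radio channel used in a given absolute timeslot for a given link offset."""
    return hop_pattern[(slot + offset) % len(hop_pattern)]

schedule = [channel_for_slot(s) for s in range(5)]
```

Because successive slots land on different channels, a narrowband interferer (or a fade at one frequency) costs at most an occasional retry rather than a dead link—one reason such meshes can run reliably for years in a refinery’s RF environment.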
Rebecca Saltzer described how ExxonMobil has been using broadband seismometers to provide low frequency velocity information to stabilize migration. The trial was performed on the LaBarge field, Wyoming, with control from thousands of wells. Fifty-five broadband Guralp seismometers (loaned by the National Science Foundation) were deployed at an unusually close 250 meter spacing and recorded background noise for six months. Data was recorded to 2 GB memory cards which were retrieved every two months. Nearly 600 teleseismic events were recorded, including the magnitude 5.7 Costa Rica event. Data was processed with the same code as used in global seismology. The results were very similar to the well-derived velocities—www.oilit.com/links/1110_14.
Pruneet Saraswat (Indian School of Mines) advocated the use of immune theory and self-organizing maps to classify seismic facies. Seismic patterns are ‘learned’ and, like vaccines, ‘stay in the system.’ The demo showed some compelling if rather glib signal-to-noise improvements—www.oilit.com/links/1110_15.
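For context, the self-organizing-map half of the approach can be sketched in a few lines of NumPy. This toy example is our own, not Saraswat’s algorithm, and uses made-up 2-D clusters rather than seismic attributes: it trains a small 1-D map and assigns each sample to its best-matching node, which is the basic mechanism behind SOM facies classification:

```python
import numpy as np

# Minimal 1-D self-organizing map on toy 2-D data (illustration only;
# the immune-theory component of the talk is not reproduced here).
rng = np.random.default_rng(0)

def train_som(data, n_nodes=4, epochs=50, lr0=0.5, sigma0=2.0):
    dim = data.shape[1]
    nodes = rng.uniform(data.min(), data.max(), size=(n_nodes, dim))
    for t in range(epochs):
        lr = lr0 * (1 - t / epochs)                   # decaying learning rate
        sigma = max(sigma0 * (1 - t / epochs), 0.5)   # shrinking neighborhood
        for x in data:
            bmu = np.argmin(((nodes - x) ** 2).sum(axis=1))  # best-matching unit
            dist = np.abs(np.arange(n_nodes) - bmu)          # distance on the 1-D grid
            h = np.exp(-(dist ** 2) / (2 * sigma ** 2))      # neighborhood function
            nodes += lr * h[:, None] * (x - nodes)           # pull nodes toward sample
    return nodes

# Two toy 'facies': clusters around (0, 0) and (5, 5).
data = np.vstack([rng.normal(0, 0.3, (20, 2)), rng.normal(5, 0.3, (20, 2))])
nodes = train_som(data)
labels = [int(np.argmin(((nodes - x) ** 2).sum(axis=1))) for x in data]
```

After training, samples from the two clusters map to different nodes—the unsupervised ‘facies’ labels an interpreter would then calibrate against wells.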
Sharp Reflections’ PreStack Pro (www.oilit.com/links/1110_53) is a lightweight system for interactive pre-stack seismic interpretation and processing. PreStack Pro embeds Fraunhofer’s global programming interface (GPI) and optionally the Fraunhofer parallel file system—www.oilit.com/links/1110_16.
Enthought’s business model centers on support services for scientific programming in Python and real-time data visualization. CTO Travis Oliphant wrote NumPy, the numerical array package underpinning SciPy. Three levels of support are available, from freeware to premium. Researchers at Shell and Conoco are users. Clients have developed Python code for pore pressure, AVO and microseismics. Enthought is also trying to repurpose its ‘big data/NoSQL’ approach, originally developed for tick-level trade data analysis in financial services, to production data streams—www.oilit.com/links/1110_17.
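For a flavor of the kind of geophysics code such clients write in Python, here is a minimal two-term Shuey AVO approximation. The interface properties below are made-up illustration values; this is our own sketch, not any client’s actual code:

```python
import numpy as np

# Two-term Shuey (1985) approximation to P-wave reflectivity versus angle:
# R(theta) ~ R0 + G * sin^2(theta). Illustrative only; input rock
# properties below are invented, not from any real well.

def shuey_two_term(vp1, vs1, rho1, vp2, vs2, rho2, theta_deg):
    """Angle-dependent reflection coefficient at an interface."""
    vp, vs, rho = (vp1 + vp2) / 2, (vs1 + vs2) / 2, (rho1 + rho2) / 2
    dvp, dvs, drho = vp2 - vp1, vs2 - vs1, rho2 - rho1
    r0 = 0.5 * (dvp / vp + drho / rho)                        # intercept
    g = 0.5 * dvp / vp - 2 * (vs / vp) ** 2 * (drho / rho + 2 * dvs / vs)  # gradient
    theta = np.radians(theta_deg)
    return r0 + g * np.sin(theta) ** 2

# Hypothetical shale over gas sand: reflectivity from 0 to 30 degrees.
angles = np.arange(0, 31, 10)
r = shuey_two_term(2700, 1200, 2.3, 2900, 1600, 2.1, angles)
```

For these (invented) properties the reflectivity grows more negative with offset—the classic gas-sand AVO signature that such scripts screen for.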
Reservoir Labs’ ‘RStream’ CUDA code generator for seismic processing produces optimized GPU code that ‘compares well’ with optimization performed by an expert hand coder—www.oilit.com/links/1110_18.
© Oil IT Journal - all rights reserved.