Geophysics has an apparently unlimited appetite for science and scientific computing. Digital recording began in the mid 1960s, so you might have thought that everything that could be done in seismic processing had been tried already. Quite the opposite, in fact. Super-deepwater, subsalt targets like ‘Jack’ in the Gulf of Mexico have brought seismic processing and high performance computing into the limelight.
At the New Orleans SEG (page 6) such issues were debated in Ebb Pye’s Visualization Theatre (page 7). The processing community has already figured out what it would like to do in the way of pre-stack depth migration, inversion and so on, but has lacked the compute horsepower to run many of these algorithms. Today with pressure from discoveries like Jack, and the power of the supercomputer, the high-end techniques are back.
One procedure, something of a holy grail of seismics, is the idea that you could go straight from the recorded data to the 3D volume. Without passing Go, sorry, I mean without all the laborious ‘velocity analysis,’ model building etc. Art Weglein of the University of Houston’s Mission-Oriented Seismic Research Program (M-OSRP) described a technique that may soon invert seismic field data into a multi-dimensional cube with all the data about the rocks that you could wish for.
This is a seductive notion which I came across in Paradigm’s marketing literature which asks, one hopes not rhetorically, ‘Why not routinely convert seismic cubes into meaningful reservoir property volumes?’ This is such a good question that it makes you wonder why we have been futzing around so long with stacking ‘velocities,’ seismic ‘attributes’ and a sometimes bewilderingly large number of seismic data cubes.
At the SEG I heard tell of one company which had accumulated 1,200 versions of the same seismic data cube. No doubt each one had some geo-logic behind it. The particular combination of ‘instantaneous phase,’ ‘spectrally decomposed,’ ‘bump mapped’ data meant something to the interpreter when it was originally derived. But the next day, month, year? How do you categorize and manage 1,200 sets of the same data?
Bye bye AVO?
This got me thinking: if we can go from recorded data to image without producing velocities, all the popular ‘pre-stack’ techniques go out of the window. This is actually a good thing. Instead of all these pseudo-physical attributes, seismic imaging would just be giving us its best shot of what it knows about the rocks and the fluids therein. Which brings me to something of a poser: ‘What is the minimum number of seismic data cubes that need to be stored to capture all the information that is actually there?’ Your answers are welcome. My own intuition is that the correct number is much closer to 12 than to 1,200.
Looking back over 2006, I think one of the key events of the year was the SPE Gulf Coast Section’s Digital Energy conference in Houston (OITJ Vol. 11 N° 4). For me this meet was a kind of problem-setting exercise for the ‘digital oilfield of the future’ (DOFF). Two opposing views were expressed. On the one hand, the ‘upstream upstream’ engineering community tends to be rather dismissive of the technology deployed on the oilfield today. SCADA is referred to as either ‘legacy’ or ‘commodity.’ The other view, expressed by some vendors and consultants, is that the grass is in fact greener on the process control side of the fence. These folks opine that all we have to do is operate the oilfield ‘like a factory’ to achieve huge benefits.
We got the opportunity this month to check this out with an invitation to the Invensys Process Systems User Group meeting in Dallas. The IPS-UG will be the subject of one of The Data Room’s Technology Watch reports and will also feature in the January 2007 issue of Oil IT Journal. But I have to say already that this conference marks what I think will be a turning point for Oil IT Journal, because the picture that is emerging from the process control world is a lot more complicated than either of the views expressed above.
To explain how things are more complicated than they seem I will first hazard an analogy. If you came from Mars and learned that humankind was into communications, you might imagine that it would be easy to tap into them and perhaps ‘optimize’ either dialog or destruction (depending on the Martian psyche). But then our Martian is confronted both by wired and cordless phones, Ethernet, internet, WiFi, and soon WiMax. It’s an ugly picture but one we are all familiar with, and it is unlikely to change.
The picture is similar in the ‘factory.’ Process control has to contend with SCADA, but also DCS and PID and optimizations at various stages in the plant’s lifecycle. The process control community is also getting exercised about wireless—with digital ‘canopies’ or ‘umbrellas’ over refineries and plants. So our Martian’s communications problems will get layered on top of all this. Oh and as you probably guessed, there are those in the process control industry who argue that the grass is greener on the E&P side of the fence—particularly with ProdML!
Finally a word of advice about an anti-spam service called ‘Sorbs’ which is causing us some grief. A small amount of our mail is bounced back to us from the recipient with a message that Sorbs considers it to be spam. Sorbs’ database contains many bona fide companies whose ISPs have been used at some time in the past by spammers. If you are using Sorbs, or if your anti-spam service does, you might like to Google ‘sorbs sucks’ for more. Which by the way is a rather good formulation for getting a quick contrarian viewpoint on just about anything from hardware to software and possible Christmas presents.
Repsol YPF has just announced a geophysical supercomputing project to leverage technology from Houston-based seismic boutique 3DGeo. The Kaleidoscope project is a long-term joint development program of novel seismic imaging technologies including compute-intensive reverse time migration (RTM). The high-end processing techniques will be used, inter alia, to image complex tectonics in Repsol’s Gulf of Mexico exploration portfolio. The new models and algorithms will run on what is described as one of the world’s most powerful supercomputers, the MareNostrum, operated by the Barcelona Supercomputing Center (BSC).
Francisco Ortigosa, Repsol’s head of geophysical operations, said ‘This project will accelerate the roll-out of next generation imaging technologies and will leverage our position as sponsors of both the Stanford Exploration Project (SEP) and the Barcelona Supercomputing Center (BSC). We selected 3DGeo because of its track record, being the first processing house to demonstrate commercially feasible 3-D wave-equation migration.’
Biondo Biondi, 3DGeo co-founder and CTO who is also an associate professor at Stanford, added, ‘The program will build on our existing high-end imaging applications to realize so far unimplemented cutting-edge, full wavefield imaging techniques. Testing the new algorithms will benefit from BSC’s experience of computer architecture and parallelization.’
MareNostrum resulted from a partnership between IBM and the Spanish Government. The original design targeted 40 TFlops peak performance and the No. 4 spot on the TOP500 list in 2005. MareNostrum was originally designed as a Linux cluster of 2,282 IBM BladeCenter blades with dual 64-bit IBM Power processors and 140 TB of storage.
The architecture is to be extended to test the new IBM Cell BE, as used in the Sony PlayStation 3. The new solution will speed processing time by ‘several orders of magnitude.’
One of the big issues in HPC is the difficulty of scientific programming across the complex memory models of multi-core, clustered machines (see our HPC report from the SEG on page 7 of this issue).
3DGeo president Dimitri Bevc told Oil IT Journal, ‘We are working with our partners to make the programming model more straightforward. CBE is still a distributed model, but it has potential to perhaps better handle some of the issues with which competing technologies (GPUs & FPGAs) have had trouble. Eventually, we may see commodity priced large memory machines.’ 3DGeo is to open a new office in Barcelona.
First announced in 2002 (Oil ITJ Vol. 7 N° 10), the drillstring Ethernet that originated as a US Department of Energy-funded project with partners Novatek and Grant Prideco, has now been interfaced with Halliburton’s Sperry unit’s measurement while drilling services.
The IntelliServ Network drill string telemetry is capable of streaming data from downhole drilling and formation evaluation tools to the surface in real time, at rates up to 10,000 times those offered by today’s acoustic mud pulse techniques. Halliburton and Grant Prideco unit IntelliServ are to jointly market and deploy the technologies.
Sperry VP Brady Murphy said, ‘Today, drilling rates can be limited by the transmission speed of high volume data collected during the drilling and evaluation process. The Sperry IntelliServ solution eliminates the bottleneck, and enables a new range of logging-while-drilling services.’
IntelliPipe is a custom pipe string with embedded electrical connections offering around a megabit of bandwidth. The signal passes from one string to another via an induction loop embedded in the joint.
The first ‘GeoMashUp’ event, held at the UK Ordnance Survey’s offices in Southampton, showed how lightweight ‘Web 2’ browser programming can be used to merge data from a variety of sources to create new applications. It can be argued that this technology is the first widespread demonstration of the success of the open, web services paradigm.
Google Maps API
Invest and hope ...
In answer to a question on Google’s business model, Ricket stated that ‘Google’s philosophy is to invest and hope that a technology will become profitable.’ For corporations, Google offers an Enterprise edition of the Google Maps API which includes support and will ‘always be ad free.’
Sean Phelan (www.multimap.com) traced the history of blending GIS with enterprise data. From 1980 to 1999, you had to buy and deploy a GIS from ESRI. From 2000 to 2005 it became possible to have browser-based access to intranet/extranet GIS service providers. From 2003 to the present, companies use web services and server-to-server XML. Looking forward from 2006, the picture is one of AJAX, ‘Web 2,’ mashups, client-side integration and ‘B2B2C’ (business services for consumers). Mashups promise rich and extensible functionality that is cheap and easy to deploy thanks to a client-side API – in other words, the holy grail.
In the context of mapping, data ownership is a big issue. According to Steve Coast (www.OpenStreetMap.org), geographic data is, in general, neither free nor current. To counter this, Coast founded OpenStreetMap (OSM). OSM members equipped with a GPS build maps as they travel around the country. Tim Berners-Lee calls this ‘grass routes remapping.’ The idea is to produce cartographic-quality results from members’ trips. Coast reckons that the whole of the UK will be mapped by 2008.
Mikel Maron (www.brainoff.com) believes that data is the ‘missing link’ of the mashup. Enter RSS, which Maron describes as the Unix pipe of the Internet. GeoRSS adds geographical information to RSS and has been adopted by the OGC as a lightweight standard. GeoRSS examples include the USGS earthquake alerts feed and the EU’s online tsunami simulator, which calculates a tsunami’s impact in under 15 seconds, mashing the USGS earthquake feed to produce targeted warnings. The Global Disaster Alert and Coordination System (GDACS) also ran. Other projects monitor geographic wikis, mobile devices and sensors, providing time-based GIS. The technology is ‘tearing down the walls.’ Mapufacture.com is Maron’s idea of ‘what it’s all about,’ letting users answer mission critical questions like ‘How’s the weather for house hunting in Islington?’
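GeoRSS’s lightweight nature is easy to illustrate. In its ‘Simple’ encoding, a single element from the georss namespace geotags an ordinary RSS item. The feed below is an invented example, not the actual USGS feed:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:georss="http://www.georss.org/georss">
  <channel>
    <title>Earthquake alerts (illustrative)</title>
    <item>
      <title>M 5.1, offshore Northern California</title>
      <!-- GeoRSS-Simple: one lat/lon pair geotags the item -->
      <georss:point>40.32 -124.68</georss:point>
    </item>
  </channel>
</rss>
```

A mashup client needs nothing more than an XML parser to extract the coordinates and drop a marker on a map.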
Raj Singh spoke on behalf of the Open GIS Consortium—which he described as ‘the old guard’ of GIS. Singh opined that people have been mashing up for years, with Mylar overlays. Today the OGC’s web map service and web feature service offer standards-based ways of ‘getting maps into mashups.’ But somehow one gets the impression that the OGC has been a bit overtaken by events here.
Norman Barker’s presentation highlighted the commercial benefits of mashups and open standards. Barker’s company, ITT, writes image software for defense and intelligence. ITT’s specialized services are distributed as mashups along with served geodata, typically from Google/Yahoo Maps or Microsoft Virtual Earth. This represents a new paradigm for both ITT as developers and for its end users. Before mashups, ITT would have built desktop apps supported with standalone server solutions. ITT’s mashups leverage the above popular clients, hiding complexity from users.
An ITT VIS mashup of the GeoINT tile server was used for a KSAT/Kongsberg iceberg awareness solution for ships in polar regions. The technology was also used in the ESA flood alert for London.
Digital rights management
Some technology issues remain, along with digital rights management (DRM), service level agreements for server downtime, and business models. Barker asked enigmatically, ‘What happens when another company goes bust à la A9?’ Barker concluded his talk with a ‘fascinating fact.’ Apparently, some 10,000 containers are pushed into the sea every year to stabilize ships in storms. As yet there isn’t a mashup showing them bobbing around in the ocean—but this is just a matter of time!
There was a sluggish debate on what would ultimately be the business model for the mashup of data from different owners. In other words, ‘What happens when Google gets bored with this game?’ Conspiracy theorists saw this as an ‘arms race’ between Microsoft and Google, with Google way in the lead following its acquisition of Keyhole (now Google Earth). The geolocation community is concerned about the availability and cost of street-level data, TeleAtlas, satellites etc. Who is paying for all this? The open source brigade sees OpenStreetMap (above) as playing the role of the open source Apache web server in the geolocation space. But someone pointed out that extending the ‘GPS on a bike’ paradigm to satellite observation was an improbable notion. Free availability of data would also upset the business model of the hosts, the UK Ordnance Survey. But a representative from the British Geological Survey stated that the venerable organization was indeed thinking seriously about giving its data away as a public service.
Some 200 attended the first meeting of the European ESRI petroleum user group (PUG). Brian Boulmay’s (ESRI) keynote discussed ArcGIS 9.2. ESRI’s key message is the advent of ‘distributed collaboration,’ enabling several authors and publishers to deliver content through an ‘interconnected, interoperable, and dynamic framework.’ GIS technology is enabling collaboration from the server, across desktop, PDAs to mobile phones.
ESRI increasingly supports industry standards from the Open Geospatial Consortium (OGC), AutoCAD’s DXF and Google Earth KML. Data models and content metadata standards like ISO 19139 are now also on ESRI’s ‘map.’ An openness demo involved connectivity to CAD, Google Earth, SAP, MapInfo and Geomedia.
ArcGIS 9.2
ArcGIS 9.2 supports grids of faulted surfaces and a new flat file geodatabase enhances performance. Animation is used to display time variant data such as production. A new ArcGIS Explorer brings a Google Earth-like user front end to enterprise GIS and geoprocessing.
Charles Fried discussed data analysis and automation at BP. The goal is a single source of Gulf of Mexico information covering resource base, financials, inventory, prospect ranking and competitive intelligence, all displayed on a backdrop of play fairways. SharePoint, Access, Excel and ArcGIS support semi-automated updates that take hours (rather than days as before) to refresh a supporting dataset.
Karina Schmidt presented Wintershall’s Enterprise GIS Strategy, integrating GIS with tools from Schlumberger, IHS, EDMS, OpenSpirit and other applications. Wintershall is moving towards a ‘Galaxychart’ technology vision embracing EDMS, production and geodata, with more spatially enabled processes bringing GIS and web services into the framework.
Tore Hoff introduced HydroGIS which leverages GIS in regional studies, play fairway mapping, data management, geodetic QC and data inventory. HydroGIS was launched in 2006, adding an enterprise framework to previous GIS initiatives. HydroGIS leverages the Petroleum Spatial Data Model (PSDM) from Nexen, itself based on PPDM. Safe Software’s Feature Manipulation Engine is used to load data from the Norwegian Petroleum Directorate, Petrobank, Finder and in-house vector and raster data sources.
Achim Kamelger presented OMV’s cross-discipline spatial data management and IM services project offering enhanced connectivity between business systems and GIS. The project has encountered many technical and organizational challenges particularly as ‘users have no idea about data management’ and legacy applications that are ‘black boxes.’ Kamelger concluded that the oil industry is ripe for ‘useful’ new technology enabling better integration between GIS and other E&P applications.
Amelia Pickering, British Geological Survey (BGS), presented the new Strategic Environmental Assessment (SEA) data on the DEAL website. This includes a range of sea bed geophysical data, video tracks, seabird and marine mammal distributions and more.
Thanks to Brian Boulmay for this report.
Geosoft has just released Dapple, an open source globe viewer, derived from NASA’s World Wind (NWW). Dapple extends NWW, adding the ability to find and view network-hosted geoscience spatial data.
Dapple provides an intuitive interface that lets geoscientists find and visualize the massive quantities of geoscience data available on the internet. Data includes satellite imagery, remote sensing data, geology maps and geophysical data. Dapple leverages open standards and server technologies including the OpenGIS Consortium’s web map service (WMS) and Geosoft’s own DAP format. The earth science community is encouraged to build on and extend the open source software.
Spatial Data Rights
Since its acquisition of EnergyScitech last month, Roxar has lost no time integrating the EnAble history matching and uncertainty management package which is now an optional module within Roxar’s flagship geomodeler, Irap RMS.
Roxar CEO, Sandy Esslemont, said, ‘Roxar’s uncertainty management solutions cover the complete E&P lifecycle from prospect ranking to reservoir management. Now, multi-million dollar decisions on bid valuations, field development and operational plans can all be based on all available information, with a realistic understanding of the uncertainties.’
Roxar’s uncertainty management workflow performs history matches on a number of geological scenarios, creating models that are consistent with the geological interpretation. By incorporating multiple realizations into decision-making, companies can better quantify the effects of uncertainties on reserves.
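The value of multiple realizations is easy to see in a toy volumetric sketch, written here in Python with invented numbers; it has nothing to do with Roxar’s actual algorithms, but shows why an ensemble of scenarios yields a percentile spread rather than a single deterministic figure.

```python
# Toy illustration (not Roxar's method) of reserves uncertainty:
# compute reserves over an ensemble of scenarios and report percentiles.
import random
random.seed(42)

def reserves(porosity, area_km2=10.0, thickness_m=20.0, recovery=0.3):
    # Volumetric estimate in barrels (1 m^3 ~ 6.29 bbl); toy constants
    return area_km2 * 1e6 * thickness_m * porosity * recovery * 6.29

# Each realization draws porosity from a scenario-dependent range
ensemble = sorted(reserves(random.uniform(0.15, 0.25)) for _ in range(1000))
p10, p50, p90 = (ensemble[i] for i in (100, 500, 900))
print(f"P10 {p10:.3g}  P50 {p50:.3g}  P90 {p90:.3g} bbl")
```

A decision based on the P10–P90 spread, rather than on the single ‘best’ model, is the point of the multi-realization workflow.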
The latest release of Geomodeling’s SBED 2006 includes the ability to incorporate bioturbation in its flow models and new multi-phase upscaling functionality. SBED characterizes reservoir potential through small-scale heterogeneity modeling and flow-based upscaling.
General Robotics Ltd. has just released DeepSim 1.0, a desktop subsea planning and training simulator. DeepSim is a scenario planning application offering dynamic visualization of complex subsea scenarios. DeepSim models the hydrodynamic response of objects such as ROVs, tethers and moorings when acted on by currents and surface vessel motion. DeepSim targets offshore contractors involved in survey, inspection, maintenance and repair, drilling and pipe laying.
Altair Engineering has just released a special version of its workload management solution for high performance computing (HPC), PBS Professional, for IBM’s massively parallel Blue Gene supercomputers. PBS Professional maximizes the utilization of computing resources by intelligently scheduling and managing computational workloads. Blue Gene currently occupies the number one spot on top500.org. In the oil and gas sector, ConocoPhillips uses PBS Pro for its HPC workload management.
The latest version of DataFlux promises data managers a ‘single version of the truth’ for data governance, compliance and master data management (MDM). Key features of Version 8 include ‘accelerators,’ pre-built workflows, a new data quality integration platform and new metadata discovery and data monitoring capabilities. The package includes customizable reporting on data that does not meet corporate quality and integrity standards.
Microsoft announced that the international standards body ECMA has approved its ‘Office Open XML’ formats and is to submit them to the International Organization for Standardization (ISO). The Microsoft initiative came partly in response to a threat from OASIS’ OpenDocument format (ISO/IEC DIS 26300).
A new release of CD-Adapco’s STAR-CCM promises a faster route from complex geometry to accurate flow simulation. STAR-CCM includes surface-wrapping, advanced automated meshing (creating either polyhedral or predominantly hexahedral control volumes) and cut and paste of meshes between calculations. STAR-CCM uses an automatic surface wrapper that ‘shrinks’ a triangulated surface mesh onto any geometrical model, closing holes in the geometry and joining disconnected and overlapping surfaces. The resulting single manifold surface is used to generate a computational mesh without user intervention.
Foster Findlay Associates’ SVI Pro v2 is due for release early 2007. SVI Pro enables rapid data screening and delineation of significant geological features within 3D seismic data. The results generated by SVI Pro can be directly integrated into seismic interpretation, 3D modeling and well planning workflows.
The package includes frequency decomposition, RGB blending and segmentation developed and tested in collaboration with Hydro. Measurement tools compute volume, surface area and principal lengths of geobodies or area of interest for connectivity studies and reservoir evaluation. A DHI tool detects changes in seismic response with time or depth that may be associated with a mobile fluid contact.
The US Minerals Management Service (MMS) has launched a ‘hurricane web site’ to explain the preparation that goes into protecting human life and environmental safety when hurricanes threaten facilities in the Gulf of Mexico.
MMS Director Johnnie Burton said, ‘The general public tends to focus on hurricanes only during hurricane season and then only when there is a threat to land. The MMS prepares every year, throughout the year, in order to produce the outstanding safety record demonstrated during 2005 Hurricanes Katrina, Rita, and Wilma.’
Katrina and Rita affected some 3,000 oil and gas facilities in the Gulf of Mexico without loss of life or serious injury. All subsurface safety valves operated at 100% efficiency, sealing oil and gas wells below the ocean floor, and protecting the Gulf waters from contamination.
The MMS Hurricane website contains safety and evacuation information, environmental studies and technology backgrounders. The site also houses historical hurricane data, pictures and statistics on production that was shut in and returned to production during the 2005 storms. Visit the site at www.mms.gov/2006Hurricanes/2006HurricaneSeason.htm.
Presentations from Mercury and HueSpace focused on different routes to high performance computing (HPC). Mercury is partnering with IBM on a Cell BE-based supercomputer which promises 16 TFLOPS in a 6 foot rack. HueSpace’s next generation technology eschews the Cell, and uses NVIDIA graphics cards for number crunching. Hue delivers an SDK with a prototyping interface.
Advanced modeling consortium
Bee Bednar (Panoramtech) reported little progress on the SEG’s attempt to build a 3D model in the style of the French Petroleum Institute’s earlier Marmousi model. The project appears to be finding it hard to raise adequate funding. The planned model is to allow for stochastic seismic facies that can be warped into structural sections, and for mild near-surface velocity variations. Model design is underway and model execution gurus are investigating the hardware required. Current thinking is for a 40km x 40km x 10km (60 GOM blocks) area with a memory requirement of around 256GB. Costing an 8-cable survey over the model at 20 cents/flop/hour amounts to $2 million, or 18 months of computing. The conclusion is that acoustic modeling is feasible, but not elastic modeling. The consortium is now looking for cheaper flops. The IBM Cell BE promises 460 GFlops, so models could run in four days. PGS, BHP Billiton, Total, CGG, Shell, Halliburton, Exxon and WesternGeco have all put $50k in the pot.
EarthVision’s CoViz
EarthVision’s ‘CoViz’ was originally created for BP which needed a vendor-independent browser/viewer for seismic, well and reservoir simulation data. Queries from within CoViz can retrieve core data, well test reports etc. EarthVision wants CoViz to become everyone’s ‘second viewer’ for checking to see if the ‘other world’ confirms their interpretation. The viewer and relevant data can be decoupled from the original data sources and taken into the field or to a partner meeting.
Mark Chidwick outlined some best practices for E&P information management (IM) that his company, eDecisions, has developed in collaboration with a large Canadian independent drilling some 1,200 wells per year. The operator was having problems keeping data in applications up to date and was experiencing issues with vendor data quality. The solution was to build trust through data governance, appointing a full-time data custodian. Schlumberger’s ProSource was configured to apply business rules, for instance to check that bottom hole coordinates agree with the last point in the deviation survey. Data coming in from vendors was moved into SIS’ Finder, which has a ‘rich set of business rules for maintaining data quality.’ Data was made visible to applications via OpenSpirit. The IM pilot resulted in cost reductions, as the 40,000 well database is now managed and QC’d by one person (down from four). There was also more confidence in data and business decisions. Now geoscientists can build their own projects. OpenSpirit connectivity was ‘the easiest part of the project.’
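The bottom-hole rule is typical of such checks. A minimal sketch in Python, a hypothetical illustration rather than ProSource’s actual implementation, might look like:

```python
# Sketch of the deviation-survey business rule described above -- an
# invented check, not Schlumberger ProSource's implementation.
from math import hypot

def bottom_hole_consistent(bh_xy, survey_points, tolerance_m=10.0):
    """Flag wells whose stored bottom-hole location disagrees with the
    last station of the deviation survey by more than `tolerance_m`."""
    last_x, last_y = survey_points[-1]
    return hypot(bh_xy[0] - last_x, bh_xy[1] - last_y) <= tolerance_m

# A well header placing the bottom hole 500 m from the last survey
# station fails the check and is routed back to the vendor.
survey = [(0.0, 0.0), (120.0, 30.0), (250.0, 55.0)]
print(bottom_hole_consistent((252.0, 56.0), survey))  # passes
print(bottom_hole_consistent((750.0, 55.0), survey))  # fails
```

Running hundreds of such rules on incoming vendor data is what lets one custodian manage a 40,000 well database.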
Both Ikon Science and Headwave (previously Finetooth) have leveraged Schlumberger Information Solutions’ Ocean development kit to produce plug-ins for Petrel. Ikon is offering its rock physics-based seismic modeling technology which can now be launched from inside Petrel. Headwave is offering pre-stack data visualization from within Petrel. Data is compressed and decompressed with HueSpace’s technology before blending with Petrel’s OpenInventor canvas. The tool lets users compute an attribute on the whole pre-stack dataset. Headwave has found the Ocean development environment to be ‘much more open than GeoFrame.’
Landmark lifted the veil a little on its new ‘EZ Model’ replacement for Power Model in a field development planning demo. Here, Geoprobe’s new framework construction tools were used to clean up faults and horizons for export to the ‘EZ Model’ earth modeler. EZ Model is due for release early 2007.
Rock & Fluid Canvas
Paradigm’s marketing literature asks the challenging question, ‘why not routinely convert seismic cubes into meaningful reservoir property volumes?’ Such is the intent of Paradigm’s Rock & Fluid Canvas 2007, part of Paradigm’s attempt to ‘get serious about integrating the macro realm of geophysics with the micro realm of petrophysics.’ In other words – a move from G&G to G&P. All tools in the suite are linked via Paradigm’s Epos infrastructure. Integration with OpenSpirit is not yet a ‘done deal.’
CTO Jeff Pferd introduced Petris’ Semantic Designer, which leverages Petris’ patented ‘dynamic, common to all’ data model for integration. XSL mappings provide access to vendor data sources. The Semantic Designer, aka the semantic manager toolkit, hides the complexity of the XSL code from data managers. Pferd showed an OpenWorks 2003 database load that leveraged the Semantic Designer for mappings, parsing and business rules for null values etc. According to Pferd, this represents a ‘step forward in standardizing taxonomies across the enterprise’ and for accessing geotechnical data in different vendor data stores.
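To give a flavor of the approach, here is a trivial, invented XSL mapping that renames a hypothetical vendor tag to a canonical one and applies a null-value business rule; Petris’ actual mappings and data model are, of course, proprietary:

```xml
<!-- Illustrative only: map a hypothetical vendor field to a canonical
     taxonomy term, with a business rule for the vendor's null sentinel. -->
<xsl:stylesheet version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:template match="WELL">
    <Well>
      <BottomHoleTemperature>
        <xsl:choose>
          <!-- business rule: suppress the -999.25 null sentinel -->
          <xsl:when test="BHT = '-999.25'"/>
          <xsl:otherwise><xsl:value-of select="BHT"/></xsl:otherwise>
        </xsl:choose>
      </BottomHoleTemperature>
    </Well>
  </xsl:template>
</xsl:stylesheet>
```

Hiding sheets like this behind a graphical toolkit is what keeps the XSL complexity away from data managers.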
SMT was demonstrating the scalability of its Kingdom Suite seismic interpretation package with a 10 million trace survey (PGS’ new Southern North Sea ‘megasurvey’). SMT, in conjunction with UK-based Equipoise Software, has released VelPak (aka Scott Pickford’s VelIT), an add-on for time-depth conversion. VelPak uses velocity techniques developed by velocity guru Mahboub Al Chalabi.
Spectraseis’ HyMas is a passive recording system for capturing low-frequency, multi-component data over a field. Spectraseis’ RIO software is used to process and interpret survey results. Results of seven land surveys currently underway in Brazil, Mexico, Libya and Austria will be used to tune the company’s algorithms before a potential market release of a processing suite next year. Shareholder Norsk Hydro has committed to a marine survey on the Norwegian Shelf in 2007. Passive seismic, aka seismic interferometry, is the subject of an Aramco-sponsored EAGE workshop in Dubai this month.
Rich and Wide
Our (virtual) star of the show award this year goes to WesternGeco’s ‘rich azimuth’ towed streamer seismics. Mark Egan likened the problem of seismic imaging to that of viewing a spoon in a liquid-filled chunky English beer glass. For proper target illumination, many azimuths are acquired—some gained from shooting vessels on the flanks of the immense recording array. The shoot over BHP Billiton’s Shenzi field was performed in a spectacular rotary star pattern. Rich azimuth and multi-vessel operations have brought a ‘huge improvement’ in subsalt imagery. WesternGeco is now planning to combine these techniques with ‘over/under’ acquisition using two superimposed streamers to separate up- and down-going wave fields.
United Devices’ GridMP product virtualizes compute resources across workstations and clusters. GridMP optimizes under-used seismic clusters. A ‘meta-scheduler’ operates across heterogeneous grid schedulers including SGE (Sun), PBS (Altair/open source), LSF and HP. The system can also integrate transparently with utility ‘on demand’ computing outside the firewall. GridMP is used by both Landmark and Schlumberger. UD is an active member of the Open Grid Forum.
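The meta-scheduler concept can be sketched in a few lines of Python. This is an invented illustration of the load-balancing idea, not United Devices’ GridMP API:

```python
# Invented sketch of a meta-scheduler: one front end dispatching jobs
# to whichever underlying scheduler (SGE, PBS, LSF...) has spare slots.
class Scheduler:
    def __init__(self, name, slots):
        self.name, self.free_slots = name, slots

    def submit(self, job):
        self.free_slots -= 1
        return f"{job} -> {self.name}"

class MetaScheduler:
    def __init__(self, backends):
        self.backends = backends

    def submit(self, job):
        # Grid-wide policy: pick the backend with the most free slots
        best = max(self.backends, key=lambda b: b.free_slots)
        return best.submit(job)

grid = MetaScheduler([Scheduler("SGE", 4), Scheduler("PBS", 10),
                      Scheduler("LSF", 2)])
print(grid.submit("migration_job_001"))  # routed to PBS, the least loaded
```

The same dispatch point is where an ‘on demand’ utility outside the firewall could be slotted in as just another backend.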
Enigma’s Project Archival System (PARS) has had a major upgrade with a re-write as a Java-based, cross-platform tool. PARS 3 archives Kingdom and Petrel projects at project milestones. Project snapshots are captured to the PARS database along with project metadata. PARS also produces corporate thumbnails of OpenWorks projects.
This article has been taken from a 15 page illustrated report produced by The Data Room’s Technology Watch program. More information and sample reports from email@example.com.
According to Paul Sava (Colorado School of Mines), the high performance computing industry is moving to petaflop machines, putting huge compute power in the hands of industry and academia and making possible elastic and anisotropic reverse time migration and inversion. When you cross-correlate two 4D wavefields you get a very big multi-dimensional field. In conventional imaging, most of this information is dumped. Using more of this information is the way forward.
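Sava’s point can be made concrete with a toy sketch, invented here in Python/NumPy: the conventional image keeps only the zero time-lag of the source/receiver wavefield cross-correlation, while an extended image retains a whole gather of lags per image point.

```python
# Toy sketch (not Sava's code): conventional vs extended imaging condition
import numpy as np

rng = np.random.default_rng(0)
nt, nx = 64, 8
src = rng.standard_normal((nt, nx))  # source wavefield S(t, x)
rec = rng.standard_normal((nt, nx))  # receiver wavefield R(t, x)

# Conventional imaging condition: I(x) = sum_t S(t, x) * R(t, x)
image = (src * rec).sum(axis=0)

def lag_corr(tau):
    """Cross-correlate S(t) with R(t + tau), without wraparound."""
    if tau >= 0:
        return (src[:nt - tau] * rec[tau:]).sum(axis=0)
    return (src[-tau:] * rec[:nt + tau]).sum(axis=0)

# Extended imaging condition: keep a gather of time lags per image point
ext_image = np.array([lag_corr(tau) for tau in range(-4, 5)])

print(image.shape)      # one value per image point
print(ext_image.shape)  # nine lags per image point
```

The zero-lag slice of the extended image reproduces the conventional one; the other lags are exactly the information that conventional imaging dumps.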
Art Weglein of the University of Houston’s Mission-Oriented Seismic Research Program (M-OSRP) is likewise leveraging HPC to ‘respond to pressing seismic E&P challenges.’ In the context of wider azimuth and finer sampling, Weglein cautioned that ‘no current migration algorithm will correctly model a flat bed beneath a sphere.’ Weglein was most enthusiastic about the possibility of seismic inversion without velocities, suggesting a method involving seismic events ‘talking’ to each other, a form of ‘seismic group therapy.’ Here, a ‘closed form’ processing technique using Fang Liu’s multi-agent genetic algorithm goes straight from recorded data to the depth model.
Earl Dodd described IBM’s move to ‘petascale computing.’ Commodity Linux clusters have ‘repealed’ Moore’s Law and now dominate the HPC landscape. Oil and gas is the second-largest consumer of HPC after government (spy satellites etc.). Data growth is currently 100% per year, making for huge storage requirements. The need for speed is shown by the 45 petaflops required for the M-OSRP seismic experiment above; the ‘intelligent oilfield’ will require 1.7 × 10²¹ flops (i.e. beyond petascale). Matching algorithms to fast-evolving hardware like multi-core CPUs and GPUs is where it’s at.
John Etgen, BP’s HPC guru, described the design and operations of BP’s innovative Wide Azimuth Towed Streamer (WATS) seismic survey in the Gulf of Mexico. The WATS technique was conceived by BP’s scientists working on BP’s in-house HPC cluster (OITJ Vol. 8 N° 4). WATS meant that BP had spent $100 million ‘on a scientist’s hunch’ and needed rapid feedback from the survey to check that everything was working. Etgen likes big memory machines and regrets the demise of the ‘traditional’ supercomputer. WATS processing involves ‘buying whatever it takes.’ Etgen sees ‘tension’ between the large memory requirements and the conventional COTS cluster community. The issue is that the tools aren’t there (for HPC on multi core etc.). Etgen also expressed concern over the quality of Intel’s Fortran compiler for HPC.
ATI partner PeakStream was founded in mid-2006 to develop ‘stream computing’ solutions that allow ATI graphics processors (GPUs) to work alongside CPUs to solve compute intensive calculations. A twenty-fold speedup is claimed for seismic imaging. PeakStream is also working on IBM Cell BE-based computing. The platform uses existing development tools along with the Brook language from Stanford researcher Pat Hanrahan.
IBM’s Lotus unit made an enigmatic release this month announcing the ‘most important announcement of its history’ for 2007. IBM is to ‘wrestle the workstation from Microsoft’s hegemony, leveraging Web 2 and open standards.’
Schlumberger has acquired Norwegian i-well specialists, Reslink.
UK-based Exprodat Consulting is inviting sponsors to a multi-client study of GIS in E&P.
CGG has appointed Thierry Le Roux president and COO, reporting to the chairman and CEO Robert Brunck.
Computer Modeling Group’s R&D project with Shell and Petrobras to develop a ‘next generation’ reservoir simulator will mobilize 25 CMG employees and represent a $2 million commitment by CMG over the next five years.
CompuPrint is changing its name to Terra Energy & Resource Technologies (TERT). TERT unit Terra Insight’s satellite-based Sub-Terrain Prospecting (STeP) ‘predicts and locates commercially viable deposits of hydrocarbons, gold, diamonds, and other natural resources’!
Decision Dynamics (formerly Malibu) has promoted its COO Justin Zinke to president and CEO, succeeding Cecil Shewchuk.
ER Mapper has appointed Andrew Compson as Manager Africa and Middle East and Gus Dominguez Manager, East Coast, Australia.
GeoFields has appointed Jeff Amason as VP professional services.
Halliburton has named Milton Carroll to its board of directors. Carroll is founder and chairman of Instrument Products, a Houston-based oil field equipment manufacturer.
Tore Torvund (Hydro) has been re-elected chairman of the board at the Norwegian Oil Industry Association (OLF).
Nobuo Tanaka will succeed Claude Mandil as Executive Director of the International Energy Agency in September 2007.
The Norwegian Institute for Energy Technology has appointed Dag Thomassen director of petroleum R&D.
Private equity groups Texas Pacific and Hellman & Friedman have acquired Intergraph Corp. in a $1.3 billion transaction.
Emerson and OSI Software report that to date, 7,000 plants leverage their joint PI System/PlantWeb solution.
Knowledge Systems has named Morris Covington MD of its new London office and Russell Smith MD of its Asia/Pacific center in Perth, Australia.
Panasas has appointed Mark Mertens Regional Sales Manager, Benelux & Southern Europe. Mertens was latterly with Hitachi Data Systems.
The Society of Petroleum Engineers has posted revised reserves and resources guidelines on its website (www.spe.org/reserves/) and is inviting comments through February 1st 2007.
Tim Doel is to head up Venture Information Management’s new Solutions Group which is to market Venture’s solutions to the energy industry.
Energy software house SolArc has acquired Trinity Apex Solutions of Dallas, a developer of natural gas production and trading solutions.
We wrongly stated that Kjell Trommestad works for PGS in last month’s Journal. Trommestad works for TGS-NOPEC. Our apologies to all concerned.
The Calgary-based Public Petroleum Data Model Association’s membership has grown to 107 companies. PPDM’s main revenue contribution is now from US (not Canadian) oil and gas companies. Worldwide, PPDM has 11 members in Europe, 45 in Canada, 38 in the US, 7 in Australia and 4 in Latin America. Total revenue for fiscal 2006 was almost $CDN800,000. Model take-up continues but falls short of PPDM’s ambitious goal of ‘universal industry adoption.’ PPDM is deployed in part or in whole in many vendor applications and a few companies leverage the tool as a corporate data store.
PPDM CEO Trudy Curtis reported another kind of growth, in the depth of the PPDM model whose 3.8 flavor sports some 1,600 tables. These include new or extended domains including drilling additives, taxonomy, equipment catalogs, facilities, HSE, land, project and records management, and a new ‘metamodel.’
Wes Baird presented Nexen’s technical computing strategy initiative, built around a PPDM 3.7 data store. This was designed as a vendor-independent, standards-based, business-driven framework for data management. Nexen has made a few extensions leveraging PPDM-style conventions. The model was spatialized leveraging the Spatial 2 work; the vision is now for a more standards-based spatialization. Nexen’s spatial model has also been adopted by Hydro (see our report from the ESRI European PUG, page 4).
Lonnie Chin outlined Talisman’s well identity master (WIM) database, co-developed and managed by IHS. WIM, originally developed for Talisman by IPL, is based on a PPDM 3.7 database on Oracle with a Microsoft .NET GUI. WIM provides secure access to current, worldwide data from a blend of public and internal Talisman sources. Effort has been made to identify ‘best of’ data from multiple sources, leveraging PPDM’s Well Alias concepts while keeping the original data. Data load and roll-up is shared between Talisman and IHS. Performance tuning has proved key to accessing the 7 million row well alias table. PPDM has proved ‘a versatile, robust and scaleable data model for Talisman’s foundational data layer.’ Co-management with IHS has allowed great flexibility and reductions in redundancy.
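The Well Alias mechanism Chin describes, where identifiers from several sources are retained and a preference order picks the ‘best of’ record, can be illustrated with a toy example. The table and column names below are simplified stand-ins, not the actual PPDM 3.7 schema, and the data is invented.

```python
import sqlite3

# Minimal stand-in for a PPDM-style well alias table: each well (UWI) carries
# identifiers from several sources; originals are kept and a preference order
# selects the 'best of' record. Schema is illustrative, not PPDM 3.7.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE well_alias (
    uwi TEXT, alias TEXT, source TEXT, preference INTEGER)""")
con.executemany("INSERT INTO well_alias VALUES (?,?,?,?)", [
    ("100010101010W500", "A-1 HORIZON", "IHS", 2),
    ("100010101010W500", "HORIZON A1",  "TALISMAN", 1),
    ("100010101010W500", "HZN A-1",     "PUBLIC", 3),
])
# An index on uwi is the sort of tuning that makes lookups workable at the
# multi-million-row scale Talisman reports.
con.execute("CREATE INDEX idx_alias_uwi ON well_alias (uwi)")

# 'Best of' lookup: lowest preference number wins; originals stay in place.
row = con.execute("""SELECT alias, source FROM well_alias
                     WHERE uwi = ? ORDER BY preference LIMIT 1""",
                  ("100010101010W500",)).fetchone()
```

The query returns the Talisman-preferred name while the IHS and public aliases remain available for audit, which is the point of the alias approach.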
Sean Udell offered a vendor’s perspective on PPDM deployment. Calgary-based data vendor GeoLogic leverages PPDM in the GeoLogic Data Center (GDC) opened in 2005 (OITJ Vol. 10 N° 6) and now has some 6,000 users in Canada, the US and overseas. The GDC supplies customers and software partners with data from a PPDM master database. Building this began in 2004 when it was decided to migrate to PPDM from GeoLogic’s existing proprietary data model. Version 3.7 was selected because of its extensive domain support. The hardware includes dual redundant, high performance UNIX servers with live failover capability. Udell concluded by saying that PPDM ‘3.7 is huge, but not all extensions were required,’ ‘Oracle 10g is cool’ and ‘the GDC is a PPDM success story.’
Calgary-based startup LicenseTracker has just published an analysis of the evolution of software licensing business models. Over recent years, software vendors have adapted to changes in technology and are now offering a variety of licensing options. Traditional software licensing offers single user-single license, multiple user-shared license, or temporary, fixed-period licenses. While such models have evolved with technology innovations, they may not fully satisfy business requirements for balancing productivity and efficiency, estimating software needs, adjusting to changing needs and dealing with new requirements late in a fiscal year.
Detailed analysis of actual software usage, using license tracking technology, enables many new licensing models which do address these issues. Such models range from user classification through pay-per-use and product family ‘remixing’ to technology partnerships. The latter provide increasing value to technology consumers while at the same time increasing end user commitment to the vendors.
The LicenseTracker paper offers an evolutionary taxonomy of software license models, from the straightforward ‘one license per user’ model through more sophisticated floating license models. But even today’s sophisticated license models may not completely balance the conflicting goals of productivity and efficiency. Denial of access to core software can have a negative impact on productivity. On the other hand, purchasing an extra license that may only be used 1 or 2% of the year is inefficient.
Today, most concurrent use license managers provide usage logs with information on who used what software, for how long, as well as details of any denials of access. New tools like LicenseTracker can be used to analyze such usage data and optimize software license counts, departmental charge backs and budgeting. Such information can also help vendors design new licensing models.
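The core of such usage analysis can be sketched in a few lines: given checkout intervals from a license manager log, a sweep over start and end events yields the peak concurrent usage, i.e. the smallest license count that would have avoided every denial of access. The log data below is invented for illustration.

```python
# Given (start_hour, end_hour) checkout intervals from a license manager log,
# a sweep over start/end events yields peak concurrent usage -- the minimum
# license count that would have avoided every denial.
def peak_concurrency(sessions):
    events = []
    for start, end in sessions:
        events.append((start, 1))   # checkout
        events.append((end, -1))    # check-in
    events.sort()                   # at equal times, check-ins sort first
    peak = current = 0
    for _, delta in events:
        current += delta
        peak = max(peak, current)
    return peak

log = [(9, 12), (10, 11), (10, 14), (13, 17)]
needed = peak_concurrency(log)      # three sessions overlap at hour 10-11
```

With three licenses sufficing for four users here, the same sweep run over a year of logs is what lets LicenseTracker-style tools right-size license counts and drive charge-backs.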
The paper concludes by advocating a new ‘remix’ license model that entails a tighter business relationship between vendor and users. Here the client’s investment goes into the vendor’s technology rather than into a specific license. Access to and use of a range of products is then monitored and the results used to adjust billing to complex usage patterns.
Read the full LicenseTracker paper on www.oilit.com/papers/licenstracker.pdf or visit www.licensetracker.ca/documents.htm.
Neuralog has added support for A2D’s ‘SilverWire’ web-services-based log data delivery services to its NeuraSection and NeuraWellTool interpretation packages. A2D’s SilverWire service (OITJ Vol. 9 No 3) exposes A2D’s LogLine Plus hosted well log data library for remote query and access.
4.3 million logs
NeuraSection and NeuraWellTool users can now browse and select data from A2D’s collection of over 4.3 million well logs from the US and worldwide hydrocarbon provinces.
Neuralog VP Javan Meinwald explained, ‘The real value of this upgrade is fewer interruptions for the geologist working with logs. We are planning to integrate a similar technology for data access in NeuraLog, for automated log digitizing.’
A2D’s Robert Gibson added, ‘Matching Neuralog’s software with our well log data collection will change the way users interact with well log data. Getting data into a project used to be a challenge and a drain on the time actually spent on exploration.’
Seismic specialty processing software house Headwave (formerly FineTooth) is an early adopter of new technology from NVIDIA Corp. NVIDIA’s CUDA technology is a new architecture for computing with graphics processing units (GPUs) along with what is claimed as the industry’s first C-compiler for the GPU.
CUDA allows hundreds of on-chip processor cores to communicate, synchronize, and share data to solve computational problems up to 100 times faster than traditional approaches. Headwave president Alex Krueger said, ‘CUDA brings new ways to analyze and interpret seismic data, allowing for interaction with multi-terabyte prestack surveys. With NVIDIA’s new architecture, we can accelerate some of the most computationally intensive algorithms in oil and gas exploration, far beyond the performance of traditional CPU-based hardware.’
SCADA specialists Verano’s Real-Time Application Platform (RTAP) now includes a Windows client. RTAP is used by companies such as British Energy, Enbridge Pipelines, RWE and Shell. The new client, ‘Visualizer’ now lets operators on Windows-based workstations access RTAP running on Linux servers.
Van de Velde
Verano partner and SCADA systems integrator, Belgium-based Ferranti Computer Systems has developed a portfolio of solutions around RTAP. Ferranti’s Guido Van de Velde said, ‘We have been integrating our applications on RTAP for 15 years. The Visualizer brings mission-critical SCADA security to environments where Windows-based servers are not ideal, but where the flexibility and familiarity of Windows is desired on the client side, together with the integration possibilities with other desktop tools.’
Upstream e-commerce solution provider Wellogix has teamed with Cleo Communications to extend the widely used Applicability Statement 2 (AS2) into the Oil and Gas Industry. Since its adoption by Wal-Mart in 2002, AS2 has become a dominant protocol for secure e-commerce transactions. AS2 has been adopted by the API’s Petroleum Industry Data Exchange (PIDX) committee on electronic business in oil and gas (OITJ Vol. 10 N° 6).
Cleo’s interfaces add AS2 support to Wellogix’ PIDX Small Business Adaptor (SBA), enabling smaller suppliers to deliver invoices and supporting materials directly to the Wellogix Complex Services Management (CSM) suite. Wellogix director Tim Morgan said, ‘This agreement establishes a game-changing benchmark for PIDX integrations.’
Cleo’s software has been certified for AS2, AS3, and ebXML Messaging Service (ebMS) by the Drummond Group. Cleo’s VersaLex communications package offers pre-configured connections to many major trading networks.
Process control optimization boutique software house Cutler Technology Corp. (CTC) has signed reseller agreements with Invensys Process Control, Mustang Engineering and UK-based Aptitude Ltd. The deals concern deployment of CTC’s latest Adaptive Multivariable Controller (ADMC) technology. ADMC was developed by CTC president Charles Cutler.
Speaking at the Invensys Process Control user group meeting in Dallas this month (full report in next month’s Oil IT Journal) Cutler reported that the technology has been proven in a large-scale test at a ConocoPhillips refinery in the US. ADMC utilizes a unique open loop (all valve) model of the process, eliminating the PID (Proportional Integral Derivative) process controllers from the control hierarchy.
A similar deal with Wood Group unit Mustang Engineering involves application of the ADMC technology to the digital oilfield. In the 1970s, process control automation pioneer Cutler invented Dynamic Matrix Control, founding a company of the same name in 1984. The new technology has added 2-3% to process plant capacity over traditional multivariable controllers. The ADMC controller model also provides the core of CTC’s dynamic operator training simulator.
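For readers unfamiliar with what ADMC eliminates, a discrete PID update can be sketched as follows. Each control cycle turns setpoint error into a valve adjustment via proportional, integral and derivative terms. The gains and setpoint values are illustrative only; this is a textbook sketch, not CTC’s algorithm.

```python
# A minimal discrete PID controller of the kind ADMC removes from the
# control hierarchy. Gains, timestep and setpoints below are made up.
def make_pid(kp, ki, kd, dt):
    state = {"integral": 0.0, "prev_err": 0.0}
    def step(setpoint, measurement):
        err = setpoint - measurement
        state["integral"] += err * dt          # accumulated error
        deriv = (err - state["prev_err"]) / dt # rate of change of error
        state["prev_err"] = err
        return kp * err + ki * state["integral"] + kd * deriv
    return step

pid = make_pid(kp=2.0, ki=1.0, kd=0.0, dt=1.0)
u1 = pid(setpoint=5.0, measurement=0.0)  # large error -> large correction
u2 = pid(setpoint=5.0, measurement=4.0)  # error shrinks, integral persists
```

ADMC’s ‘all valve’ approach instead drives the valves directly from an open-loop process model, removing this per-loop layer from the hierarchy.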
Statoil is to deploy an operator training and process optimizing simulator from Baltimore-based GSE Systems at its Mongstad refinery, Norway, in addition to other GSE models used at the site.
GSE CEO John Moran said, ‘Our refining and petrochemical customers initially purchase simulators for their most complex and high value processes. As they discover the power and benefits of these simulators, we are seeing increased interest to expand simulation into other unit operations of their refineries. We are also seeing interest from potential first time simulation users in the refining and petrochemical industries focused on improving safety within their plants in addition to the economic benefits simulation can provide. The oil production, petrochemical and refining industries offer substantial growth opportunities for GSE both domestically and abroad.’
GSE systems are deployed at 265 installations in more than 25 countries across energy, process, manufacturing and government.
Pemex E&P is to deploy drilling and workover scheduling and management software from Houston-based The Information Store (iStore). The package is a customized version of iStore’s PetroTrek Asset Management Solution for Drilling (AMS-D) known inside Pemex as ‘SISPAP.’
Jaime Gonzalez Alanis, planning general manager for Pemex’s RSUR unit said, ‘Users now can design effective operational programs based on factual and timely information. Management can track the status of the activities almost to the hour. We can rapidly adjust our planning scenarios according to availability of funds, drilling permits and the status of engineering works.’ SISPAP collects data from daily operations and also retrieves online data from other systems allowing management to rank scenarios based on their incremental impact on reserves, production or profitability.
iStore president Barry Irani added, ‘SISPAP interfaces with Pemex’s ERP systems and geospatial applications like Google Earth. A management dashboard is also being designed to provide key indicators.’
MetaCarta has teamed with knowledge management specialist Thetus Partners to blend MetaCarta’s GeoTagger and Thetus Publisher. The technology will leverage MetaCarta’s geolocation taxonomy to derive geolocation metadata for incorporation in Thetus’ knowledge management and workflow package.
Corporate information sources will be parsed by MetaCarta to produce metadata for consumption by Thetus Publisher. Geographically-relevant data is then analyzed and indexed in Publisher and made available for users of Thetus tools. These include a semantic search engine, automated workflow, and comprehensive ‘lineage’ tracking, providing ubiquitous access to evolving knowledge regardless of data location or format. Thetus also has partnerships with ESRI and Spotfire.
Shell E&P has signed a ‘sole source’ agreement with Tokyo-headquartered Yokogawa Electric Corp. for the supply of monitoring and control systems for Shell’s greenfield and revamp projects in the Gulf of Mexico. Yokogawa’s North American unit signed the five-year contract. The deal covers Yokogawa’s Centum CS 3000 monitoring and control systems and ProSafe-RS safety instrumentation for upgrades to Shell’s GOM production facilities.
David Johnson, president of Yokogawa America said, ‘This contract involves Yokogawa in all aspects of the projects from application engineering to systems integration and strengthens our long-term relationship with Shell. These are among the largest and most complex projects we have done with Shell, and will bring ground-breaking technologies into play.’
Projects include the Ursa water-flood and the Perdido hub, the deepest spar production facility in the world. Last year, Yokogawa signed a strategic supplier agreement with the Shell Group for provision of products, engineering, and maintenance services worldwide.
Norwegian Kadme has just released a new information management tool for the upstream. K-view is an image server that renders LAS, SEGY, JPEG, PDF and TIFF images without data movement. This approach reduces network traffic and enhances security, as data is not transferred to the user. Semitransparent configurable watermarks and variable image quality can be applied, offering a form of digital rights management to protect the viewed information.
For large files such as a 100 MB TIFF or 1 GB SEGY file, server-side rendering is the only practical approach to previewing data. Another benefit is that data duplication is minimized, easing the burden on the data manager. A web browser is all that is required to preview K-view files. No plugins or specialized viewers are required on the client. K-view is a pure Java, server-side tool that works on Tomcat or JBoss running on Windows and Linux operating systems.
Other Kadme tools include K-crawl, a set of automated and semi-automated tools that extract metadata from sources like shapefiles, relational and proprietary G&G datasets. K-crawl performs incremental indexing and incremental spatial updates. Another tool, K-dex, allows for search across all parsed information, both spatial and non-spatial.
A study by IDC unit, Energy Insights finds hydrocarbon accounting is both ‘critical’ for E&P companies and a ‘tough business for vendors.’ The Energy Insights (EI) report sets out to help executives select a hydrocarbon accounting package. The study compares packages from Enertia Software, Halliburton, Merrick Systems, P2ES (Enterprise Upstream and Excalibur), Quorum Business Solutions, SAP and TietoEnator.
Study author Sekhar Venkat said, ‘Complex joint ventures, production sharing agreements and a stringent regulatory environment are leading to an increased demand for hydrocarbon accounting solutions. A shortage of accounting professionals is bringing standardization to business practices, creating opportunities for firms with talented resource pools.’
Industry Short List
EI’s ‘Industry Short List’ methodology was used to prepare the study comparing features, interoperability, architecture, quality of service, support and cost. An ‘ownership confidence’ assessment addresses the soundness of vendors’ strategy, financials, commitment and customer satisfaction. More from www.energy-insights.com.
Norwegian independent Hydro (formerly Norsk Hydro), like just about everyone else in the oil patch, is ‘looking for a considerable number of trainees’ and is trying a novel way of getting the attention of potential hires. Hydro has set up a blog on www.hydro.com to encourage a dialog between new hires and prospective recruits.
Hydro.com editor Sissel Rinde said, ‘The blog lets newly qualified students talk directly to Hydro’s trainees and ask them about their work and their onboarding.’ The blog follows Hydro’s recent press and internet jobseeker ads, which have helped Hydro recruit nearly 700 new hires. Some 50 openings remain for trainees, who are permanently appointed from day one.
Kirsten Margrethe Hovi, head of training added, ‘The blog is a natural medium for our target audience.’ The aggressive hiring program is in support of Hydro’s ambitious growth targets. Oil and gas production is forecast to grow significantly with 200,000 bopd from new fields by 2010, 25% from the Brazilian Peregrino field.
Records management specialist Iron Mountain has just launched Iron Mountain Connect, a web portal that helps customers set up and manage a formal records management program for records held in Iron Mountain’s centers. The portal helps companies comply with legislation such as the US Sarbanes-Oxley Act and the UK’s Markets in Financial Instruments Directive (MiFID).
Iron Mountain Connect (IMC) is a secure portal that supports the complete records management lifecycle, from document archival to destruction. Reporting tools include transaction history, inventory and financial overviews. Training modules and access to industry resources provide users with current information about regulation and best practices for compliance.
Search capabilities allow users to locate files stored at Iron Mountain’s off-site records centers, and request the retrieval or destruction of files or boxes. The portal already has over 35,000 unique users in the US, where the service was launched.
Emerson Process Management has teamed with Intergraph to ‘streamline data management of automation projects and plants’ by delivering interoperability between Emerson’s DeltaV digital automation system, a core component of the company’s PlantWeb digital plant architecture, and Intergraph’s SmartPlant Instrumentation software. The deal promises enhanced bi-directional integration between PlantWeb’s digital architecture and the SmartPlant Enterprise suite. Emerson’s DeltaV automation system will be integrated with SmartPlant Instrumentation. A bi-directional data exchange will speed configuration and documentation of PlantWeb projects and provide lifecycle engineering support for operation and maintenance of process plants.
Duncan Schleiss, EPS VP, said, ‘SmartPlant is used by all the engineering procurement companies and owner operators with which we do business. The integrated SmartPlant Enterprise strategy is one of the primary components in the industry trend toward interoperability.’