What happens when a software vendor (or a hardware manufacturer) that is selling into a rather narrow vertical like oil and gas stumbles onto something that has wider application? I remember writing about this before, in my October 2000 editorial when I discussed the case of Landmark’s ‘Smart Windows’ for synchronizing events like mouse clicks and graphics across multiple windows. The technology was so generic that it raised questions such as, ‘hasn’t somebody else solved this problem?’ or, ‘if it’s that good, why aren’t you pitching this outside of oil and gas?’
PrismTech, another erstwhile developer of oil and gas technology, managed a more successful move out of the vertical. This Shell-backed developer was successful in leveraging CORBA—in particular with the POSC business objects project that later begat OpenSpirit Corp. Following a company-making deal with Inprise, PrismTech went on to serve the broader and presumably more lucrative verticals of telecommunications and defense.
I recall these bits of history because in this month’s issue, we have more examples of technology that is sold to a couple of verticals—but that ought to have wider application. In the hardware category we have the Landmark and Storewize compression technology. Rising storage costs make data compression an attractive proposition if it is as seamless as claimed. But lossless data compression is of such broad application that one wonders why it isn’t sold to disk manufacturers for their controllers.
Speaking at the well-attended Panasas breakfast, Paradigm’s Phil Neri described the increasing complexity of data access pathways in a modern high performance computing environment. You would think that, for a programmer seeking to optimize data access, knowing where your data is at any given moment would be of considerable importance. But as Neri pointed out, data may be in the L1/L2 cache of the CPU, in memory, on a local disk or out on some SAN/NAS storage—maybe even in another cache on a disk controller. So perhaps the issue of where to compress does merit a more in-depth analysis of a vertical’s requirements.
Another example of an apparently horizontal technology is ThinAnywhere—the 3D, Citrix-like technology that is widely used in the upstream. This, according to the press release, ‘allows 3D graphics programs running on Linux/Unix and Windows servers to be accessed with full functionality from remote PC-based workstations.’ Again, this is such a generic issue that one wonders why it needs a specific development effort for the upstream. Could it be that the deal has something to do with the fact that ThinAnywhere is pretty widely used by Schlumberger’s competitors? Or is that just a conspiracy theory?
Which brings me to what is probably another conspiracy theory—but the ding-dong between Landmark, Paradigm and Schlumberger concerning 3D topology modeling really has driven a lot of industry activity recently. From Paradigm’s acquisition of Earth Decision/GoCad, through Landmark’s abandonment of PowerModel, to its most recent purchase of GeoSmith’s Shapes—the engine inside Schlumberger’s GeoFrame! A lot of fuss for what could be a generic, horizontal issue. Is upstream 3D topology really that different from say, CAD?
I was chatting to an IBMer at the SEG who was trying to convince me that virtual reality, à la Second Life, was something that was going to be really important. I scanned his face for a hint of a joke—there was none. Until then I had been surprised by the amount of hype the press has given Second Life in general. But IBM’s allegiance to the cause seemed, well, a bit potty really. I made a mental note to give it a whirl when I got back to the office.
But when I went online and got asked for a credit card number just to have a peek I thought ‘yeah sure’ and got back to the mountain of work that had meantime piled up in my first life. And then I read the newspaper. 2,000 demonstrators from 20 countries were involved in a (second) world first, a virtual strike—right in front of IBM’s Second Life ‘Business Center.’ The virtual demo was organized by IBM’s Italian trades union to protest about a salary cut. So maybe Second Life is important to IBM—but perhaps not in the way they had planned.
The October issue of the IEEE’s Computer magazine has an article about the virtual world by Mike Macedonia of Forterra Systems. The article’s strap line is ‘within decades, people could spend at least as much time in virtual worlds as in the real one.’ Macedonia’s thesis is that bandwidth and 3D graphics, and the younger generation’s predilection for Halo 3, will lead us to this scary state of affairs. The hypothesis is backed up by a quote from Gartner: ‘By 2010, 80% of global Fortune 500 companies will have some form of massive multiplayer online or virtual-world presence.’
I don’t really have time or space to unpick such poppycock—but if ‘some form of’ is to include half-assed skunk works projects, flops and bits of a corporate ‘presence’ ‘we thought we’d taken down,’ then maybe Gartner has a point. For my part I’d like to predict that sooner or later, people will actually spend less time in any form of computer interaction. This will come about because of a) class actions for RSI* and the newly classified malady of couch potato-itis, b) automation—computers will be doing more stuff on their own and c) people will get fed up of pointing and clicking and will be more interested in getting a life. Spending half your life in a virtual world will be about as exciting as spending it in your car. Unfortunately for you all, I have to confess that my predictions are usually wrong.
* Repetitive strain injury—see our report from the SEG on page 6.
At the 2007 conference and exhibition of the Society of Exploration Geophysicists (SEG) in San Antonio last month, Schlumberger Information Solutions (SIS) described how both it and third party software developers were ‘adding value’ to the Petrel interpretation flagship by leveraging the ‘Ocean’ development framework that was officially released at the 2006 Schlumberger Forum.
Tightest integration with Petrel is achieved by ‘re-factoring’ an application (rewriting the code) to take full advantage of Petrel’s .PET database file. Companies with niche applications can embed their intellectual property (IP) into Petrel and have the resulting composite certified by Infosys.
Connectivity with external data sources including Landmark, Paradigm and GeoFrame is assured by OpenSpirit. Some of Petrel’s ‘novel’ objects, like uncertainty, can be ‘persisted’ in the Seabed database and managed by ProSource.
Today only master data and Petrel interpretation results are stored in Seabed. In a future release it will be possible to store all Petrel data in Seabed—a potential solution to the reported data management issues that have plagued Petrel in the past.
Probe in Petrel
An example of a ground-up re-write was the appearance of spectacular voxel-based ‘probe’ interpretation technology inside Petrel. This leverages graphics processing units (GPUs) for rendering and geobody interpretation. An offshore West African crevasse splay play was shown, with ‘tuning’ of probe amplitudes to optimize discrimination of reservoir sands from shales. A ‘WYSIWYG’ geobody picking tool gives a rough outline to the sand body in a semi-automated process which did a reasonable job of following the channel. This was instantly ‘cellularized’ and color coded with seismic-derived attributes.
An SIS-developed microseismic monitoring application has been integrated by embedding new data objects into Petrel. This showed real-time monitoring of a Barnett Shale frac job, with a dual display of fracs ‘popping’ in one window and a second window showing engineering data on pumping rate and proppant concentration as a function of time.
Third party apps
Third parties like Ikon Science, Petrosys and Geovariances have joined the Petrel developers’ club. Ikon’s ‘RokDoc’ seismic modeling can be kicked off from within Petrel, a wavelet extracted from the data and then used to build rock physics models. Geovariances’ Isatis geostatistical package can now share data with Petrel and Petrosys has added presentation quality mapping for Petrel—leveraging the Ocean API’s ‘rich connections’ between Petrel and third party or corporate spatial data stores.
Hess Corp. has signed a global IT outsourcing deal with IBM worth $73 million over five years. The deal covers IT infrastructure services, including server, workstation and network management, asset tracking and end user support. IBM will update Hess’ servers to the latest IBM System x and System p boxes.
Hess CIO Pete Walton said, ‘This agreement is consistent with our larger effort to create a competitive advantage through technology. We look forward to leveraging IBM’s best-in-class tools, processes, and expertise.’ As an integrated oil company, Hess’ activity spans E&P, transportation and sale of crude oil, natural gas and refined petroleum products. By selecting IBM, Hess will leverage IBM’s global delivery network and its ‘extensive research capabilities.’
IBM’s Scott Hopkins added, ‘When IBM takes on a client’s IT infrastructure it lets them focus on more strategic parts of their business, such as oilfield operations. It also provides an opportunity to examine the new generation of technologies which can help them visualize, virtualize and integrate business processes more efficiently.’
At the SEG, Paradigm personnel were sporting very green shirts in a visual rebranding exercise, emphasizing its status as a third upstream software pole, alongside Halliburton’s red and Schlumberger’s blue. The positioning was backed up by the roll-out of a brand new seismic interpretation tool, ‘Skua,’ for ‘subsurface knowledge unified approach.’
Skua has its origins in work done by Jean-Laurent Mallet’s team at the École Nationale Supérieure de Géologie in Nancy, France, the team that created Paradigm’s GoCad flagship geomodeler. Skua takes seismic data in the conventional time or depth domain and transforms it into a ‘paleo-chronologic’ coordinate system. An unstructured tetrahedral mesh that can be deformed and un-deformed places each cell at its depositional location. The process, which was developed under the GeoChron* consortium as a GoCad plug-in, is now commercially available from Paradigm.
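The published GeoChron literature describes this as the ‘uvt-transform.’ In our notation (a sketch of the general idea, not Paradigm’s exact formulation), the tetrahedral mesh supports a mapping

\[
(x, y, z) \;\longmapsto\; \big(u(x,y,z),\; v(x,y,z),\; t(x,y,z)\big)
\]

where \(t\) is depositional (geological) time and \((u, v)\) are paleo-geographic coordinates. Each iso-surface \(t = \mathrm{const}\) is a horizon flattened back to its depositional position, so geobodies and reservoir properties can be modeled in the \((u, v, t)\) space and then mapped back to present-day \((x, y, z)\) coordinates.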
Working with a palaeo-geographically ‘correct’ mesh, geobodies, reservoir properties and other attributes can be studied in their depositional state. Skua is claimed to facilitate model building and to circumvent the limitations of pillar grids and 2 ½D ‘extrusion’ representations. The full-3D balanced ‘paleo-chronologic’ restoration has been successfully applied to fracture studies of a deforming sedimentary layer**. GeoChron has also been shown to help in geostatistical reservoir studies by providing more realistic in-situ realizations of syn-depositional effects. More from email@example.com.
Schlumberger Information Solutions (SIS) has acquired exclusive development and distribution rights to Mercury International Technology’s ThinAnywhere product for the oil and gas vertical. ThinAnywhere allows 3D graphics programs running on Linux/Unix and Windows servers to be accessed with full functionality from remote PC-based workstations.
ThinAnywhere underpins Schlumberger’s LiveQuest application hosting offering, providing 3D thin client access and enabling centralized application support. LiveQuest now offers single sign-on, collaboration and remote management for E&P applications, simplifying IT support through infrastructure consolidation.
SIS president Olivier Le Peuch said, ‘With the shortage of petrotechnical experts, support for remote collaboration is critical. The acquisition of ThinAnywhere strengthens our position in the remote access and collaboration markets.’
Schlumberger will acquire all existing contracts for ThinAnywhere in the oil and gas sector, resulting in a single source for support and development of this technology in the industry. Other ThinAnywhere oil and gas clients have included Landmark, Paradigm and CoreLab. ThinAnywhere also underpins CGG Veritas’ Remote QC (VRQC) seismic processing service (Oil ITJ July 2006).
As announced at the San Antonio meet of the Society of Exploration Geophysicists (SEG) last month, Landmark is to offer compressed seismic data storage through a ‘preferred provider’ deal with San Jose, CA-based Storewize Inc. (Storewize changed its name from Storwiz earlier this year.) The Storewize STN 6000 device sits between the user and disk storage and offers claimed lossless compression of around fourfold for a database and twofold for seismic data.
At an earlier SEG (Oil ITJ December 1996), a half-day event deliberated on the possibilities that data compression could offer the seismic business. Western Geophysical’s Zeljko Jericevic showed that lossless compression of 30-50% could be achieved on data stored in internal numerical formats such as IEEE Float or 24 bit internal. Although lossless compression of this order of magnitude is not new, what the Storewize technology brings to the table is ‘bit level’ access to the compressed data.
In fact, Storewize’s marketing now places less emphasis on high compression ratios—the company’s real intellectual property lies in the bit level access. This means that the compressed representation of a database, for instance, can actually be queried. Storewize also provides a software tool, ‘PrediSave,’ for computing the compression ROI for your data.
A background compressor is also supplied for large volume legacy data. A ‘revert box’ software package decompresses data on the disk in the event of a failure of the STN device, for ‘worst case disaster recovery.’ Storewize states that SEG Y data shows ‘55%’ compression: in other words, 100TB of data is compressed to 45TB with ‘no impact’ on performance. The box is a 30 minute ‘plug and play’ install with no change to existing infrastructure.
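As a back-of-the-envelope check on the figures above, here is a short Python sketch (our own illustration, not Storewize’s PrediSave tool) showing how the article’s ‘55% compression’ convention relates to the conventional ratio notation:

```python
def compressed_size(raw_tb: float, compression_pct: float) -> float:
    """Size on disk after compression. Following the article's usage,
    compression_pct is the percentage of the original volume removed:
    55% compression turns 100 TB of SEG Y into 45 TB."""
    return raw_tb * (1.0 - compression_pct / 100.0)

def compression_ratio(raw_tb: float, stored_tb: float) -> float:
    """The same saving expressed in conventional ratio notation (raw:stored)."""
    return raw_tb / stored_tb

stored = compressed_size(100.0, 55.0)
print(round(stored, 1))                            # 45.0 (TB on disk)
print(round(compression_ratio(100.0, stored), 2))  # 2.22, i.e. about 2.2:1
```

So the ‘twofold compression for seismic data’ and the ‘55%’ SEG Y figure are consistent with one another.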
Landmark, as ‘preferred provider’ of the STN 6000 to the E&P industry, offers global delivery and installation of the product, as well as deployment and integration services. Storewize works with most network attached storage including EMC and NetApp. Storewize claims 120 appliances have shipped worldwide.
Both Schlumberger and Halliburton have recently opened prestigious centers leveraging state of the art information and communications technology (ICT). Halliburton’s ‘Edgar Ortiz Real Time Center’ is a 13,000 square foot Houston-based facility for drilling and well optimization activity around the world. The system has the bandwidth to support over 14,000 real-time jobs a year, with double that expected within four years.
The engine inside the Center is Halliburton’s ‘Digital Asset’ real-time environment along with applications for well engineering, cementation, drillstring optimization, geomechanics and hydraulic modeling. The Center is named for Edgar Ortiz, who retired as CEO of Halliburton Energy Services Group in 2002 having established Halliburton’s vision for real time operations and collaboration.
Schlumberger Information Solutions’ (SIS) ‘Innovation Lab,’ (IL) also in Houston, is to ‘create, test and showcase’ digital innovations. SIS president Olivier Le Peuch said, ‘We will collaborate with clients and industry experts to solve tough challenges of exploring and producing complex reservoirs.’ IL partners are Barco, Dell, HP, IBM, Intel, Microsoft, NetApp, NVIDIA, Panoram and Whitlock.
As anticipated in the June issue of Oil IT Journal, de Groot-Bril (dGB) is to integrate its OpendTect interpretation environment with the open source ‘Madagascar’ seismic processing package. The integrated system will give the seismic community a complete environment for seismic R&D from processing, through interpretation, analysis and visualization.
Madagascar (formerly known as RSF) is an open-source software package for geophysical data processing and reproducible numerical experiments. The technology developed using the Madagascar project management system is transferred in the form of recorded processing histories, which become ‘computational recipes’ to be verified, exchanged, and modified by users of the system. Over 300 programs have been developed to date by a quickly growing community.
dGB will develop a workflow builder to construct and execute Madagascar processing jobs. Processing flows can start from 2D and 3D pre- and post-stack data in either Madagascar or OpendTect format. Processing jobs can be saved and executed directly in OpendTect. The project is sponsored by Statoil, BG, Graham Bowyer, Gerhard Diephuis and ‘an unnamed E&P company.’
Houston-based Kinesix Software is to target real-time upstream operations with the latest release of its KX EDGE for Microsoft’s .NET architectures. KX EDGE Version 2.0 is a graphical human-machine interface derived from Kinesix’ flagship Sammi control room solution.
Kinesix believes that by applying control room techniques, upstream users can monitor large volumes of real-time operations data without the requirement for ‘complex in-house developed software.’ The new release adds remote access, an API for third-party integration, a new web-based GUI and an interface to Sammi apps.
Kinesix CEO Russ Jamerson said, ‘KX EDGE is the perfect combination of .NET protocols and real-time control-room technology, bringing cutting-edge data visualization to industrial IT.’ The asynchronous, peer-to-peer API supports .NET remoting and HTTP.
Kinesix is working to extend KX EDGE to the upstream with off-the-shelf interfaces for XML-based data sources including WITSML, PRODML, and OPC.
Australia-based Energy Information Solutions (EIS) has just released ‘Standards Manager,’ a new approach to managing naming conventions in upstream interpretation systems and databases. Instead of relying on ‘long winded and rarely read’ documents on a corporate website, Standards Manager brings naming conventions into the end users’ workflow – avoiding the risk of the IM team becoming an obstacle to standards use.
EIS MD Garry Heaven told Oil IT Journal, ‘Standards Manager can be used to package data along with standard naming convention for compliance testing or version management. A user just fills in a few terms and the system generates, for instance, horizon names according to the selected standard nomenclature. A Petrel user can tie data files to an interpretation via standard metadata.’
Standards Manager uses pragmatic ‘CTRL-C/CTRL-V integration’ to cut and paste a reference between applications. Standard nomenclatures can be extracted from databases via a set of utilities and end users can add correctly formed names which can be broadcast. EIS’ flagship client is Santos. More from
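EIS has not published Standards Manager’s internals or template syntax; the Python sketch below (with an invented nomenclature and hypothetical field names) only illustrates the general idea of generating standards-compliant horizon names from a few user-supplied terms:

```python
from string import Template

# A hypothetical corporate nomenclature -- not EIS's actual template format.
HORIZON_TEMPLATE = Template("${field}_${surface}_${interpreter}_${version}")

def horizon_name(field: str, surface: str, interpreter: str, version: int) -> str:
    """Build a standards-compliant horizon name from a few user-supplied
    terms: uppercase everything and zero-pad the version, by convention."""
    return HORIZON_TEMPLATE.substitute(
        field=field.upper(),
        surface=surface.upper(),
        interpreter=interpreter.upper(),
        version=f"V{version:02d}",
    )

print(horizon_name("maui", "top_mangahewa", "rg", 3))
# MAUI_TOP_MANGAHEWA_RG_V03
```

The point of such a scheme is that the end user supplies only the variable terms, while the nomenclature itself lives in one controlled template rather than a ‘long winded and rarely read’ document.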
AVEVA has announced a new ‘review-based collaboration tool,’ ReviewShare, a component of the upcoming 6.4 release of AVEVA Review. ReviewShare offers 3D engineering document review, annotation, markup and collaboration capabilities for 3D models of unlimited size.
Blue Marble Geographics’ Geographic Calculator V7.0 includes enterprise collaboration tools for geodetic transformation projects. An XML data source provides a comprehensive coordinate system library and an interface to the European Petroleum Survey Group’s (EPSG) upcoming Web Feature Service.
Moblize has announced a WITSML interface to its Kanak3.0 solution for real-time integrated operations centers. Kanak provides an overview of drilling and production operations collecting data from SCADA, well loggers, PLC, RTU and EFMs for delivery to surveillance and decision support applications.
Neuralog’s NeuraMap map digitizing application now includes volumetrics and reserves based on industry accepted methods.
P2 Energy Solutions (P2ES) has released Enterprise Land (EL) 2.0. Enterprise Land deploys a services oriented, component based architecture to manage land assets. Adaptors allow the software to exchange data with data sources such as SAP, Oracle eBusiness Suite and other P2ES products. EL 2.0’s development was supported by an advisory board including representatives from 13 major oil and gas companies.
Weatherford has announced PanSystem 3.4 with new pressure transient analysis (PTA) capability, modeling tools and support for wireline formation tests and permanent downhole gauges.
Fugro-Jason has released two new packages this month, FastTracker and RockScale – both in their 7.2 release. The FastTracker reservoir modeler enables users to revise, add and update ‘anything, anywhere’ in the model with on-the-fly, automated model re-build. FastTracker integrates structural modeling, reservoir property modeling, upscaling, and direct connections to flow simulators. New functions in RockScale, the seismic-based rock property modeler, include ‘rigorous seismic-based, quantitative rock property volumes in 3D reservoir models.’
The 6.2 release of Geomodeling’s VisualVoxAt package introduces a ‘geobody paintbrush,’ that lets users ‘paint stratigraphic sequences, play concepts and geomorphology on time and strata slices.’ The results can be fed into cross-plots or neural networks for further work such as connectivity analysis and volumetrics.
Following our Short Take last month, Mike Wysatta advises that RyderScott offers some 18 freely available Excel-based programs for various reservoir engineering tasks. Check out the freeware on www.ryderscott.com.
The latest issue of Landmark’s ‘Solutions in Action’ provides an insight into the technology deployed in New Zealand’s Crown Minerals Group’s (CMG) data repository. CMG manages all government-owned petroleum, coal and mineral resources and is responsible for promoting New Zealand’s acreage to investors around the world. In 2005, following a decline in both production and exploration, CMG decided to invest in technology to ‘attract more explorers with sufficient capital and technical capabilities.’
CMG commissioned Landmark to develop a ‘robust’ national online geotechnical data bank providing ‘free, online public access to exploration data.’ The solution, which is now operational, deploys an internal PetroBank master data store managed by CMG personnel with PowerExplorer. Data is made available to the public from a TeamWorkspace portal.
CMG data manager Richard Garlick said, ‘Previously, an in-house database held an index of our technical data and was used to search on our web site. But our metadata had been cobbled together and some information was incomplete or erroneous. Compliance was also an issue, with submitted surveys just cataloged and stored on the shelf for five years.’
‘It was only when the data came into the public domain that the tapes were copied and quality checked. By which time it was often too late to fix data problems, as often the operator had left the country. Also, users weren’t sure what data was available—the lack of a mapping package meant we couldn’t display the location of our seismic surveys. Finally, CMG was restructured to move from a ‘rubber-stamp’ role to proactive data management and improved public access.’
Garlick commented, ‘return on investment on this initiative is hard to evaluate – but it would only take one large exploration company deciding to invest in New Zealand based on information found on our web site to justify the investment.’
In July 2007, New Zealand announced the award of an exploration permit in its Great South Basin to a group of investors led by ExxonMobil. Companies are expected to spend over a billion dollars exploring the basin over the next few years. Users anywhere in the world can search, preview and order a range of data including 2D and 3D seismic surveys on www.crownminerals.govt.nz.
The topic of the SEG’s Forum was ‘geophysics and unconventional reserves.’ Sverre Strandenes (PGS) noted applications of multi-transient EM in reserve mapping of tar sands and in monitoring steam-assisted gravity drainage. Permanent monitoring with seabed fiber seismics has applications in all reservoir monitoring and CO2 sequestration. PGS is working to integrate EM acquisition with seabed seismics. Jean-Marie Masset (Total) offered a wide range of definitions for ‘unconventional’ including the usual heavy oil and tight gas, but also ‘sour gas, high pressure/temperature reservoirs and deep water.’
In the latter category, Total’s Block 17 in Angola shows geophysics ‘at its best,’ allowing for fluid content changes to be tracked with 4D time lapse seismics. Again, seismics is used to monitor steam chamber evolution with live seismic ‘movies.’ Total’s experience with the 5200m, 1100 bar, Elgin Franklin North Sea development includes wide azimuth seismic acquisition, innovative processing and high performance computing to investigate what happens at burial depths of around 10,000m.
Chesapeake, according to Larry Lunardi, is a player in all the unconventional exploration plays in the US. These include granite wash, fractured reservoirs and shale gas plays. Chesapeake uses 3D seismic for all of these plays and is now the prime onshore 3D seismic data acquirer in the US. One spectacular application has been an extensive 3D seismic survey targeting the Barnett Shale in the Dallas Fort Worth area. Seismics was acquired on the DFW airport taxiway system (by Dawson). Barnett Shale exploration is complex, with karsts and faults obstructing the search for the ideal ‘fracturable’ sweet spot. Drilling is equally high tech, with five rigs currently drilling horizontal wells under the airport. In all, some 250-300 wells will be drilled. Chesapeake has invested some $160 million in 3D seismic.
Ray Boswell (US Department of Energy) traced the interest in gas hydrates back to 1984 when the Glomar Challenger recovered a 1 meter core offshore Guatemala. Gas hydrates may contain half of all the world’s captured organic carbon. Arctic sandstones are estimated to hold 100s of kTCF of reserves but the numbers for low permeability marine reservoirs are much higher.
Here, geophysics has given some paradoxical results. Initially it was believed that the ‘bottom simulating reflection’ (BSR) observable on many deepwater seismic lines was a direct gas hydrate indicator. It turns out that this is not the case. In fact the science today is confusing and some doubt has been cast on the existence of significant reserves. A federal program on rock physics modeling, seismic reprocessing and 4C OBS acquisition is being conducted by the DoE’s National Energy Technology Lab in the Gulf of Mexico.
A presentation by ExxonMobil’s Doug Bishea investigated the causes and possible remedies for repetitive strain injury (RSI) experienced by users of interpretation systems. RSI is a growing, serious problem: ‘we have people in pain.’ There is no real correlation with age; even young people get it. To address the problem, Exxon is using HSE and engineering controls to reduce ‘safety risk.’ These include removing or enclosing the hazard.
Here, automation, scripting and autotracking minimize the amount of input required. A collaboration with Schlumberger has resulted in an ‘ergonomic fitness forum’ to identify potentially harmful software. Reducing time spent at the workstation has proved successful but this does imply a degree of cooperation. There is not much point if an interpreter is going home to a six hour stint of ‘World of Warcraft’!
Whether this represents reality or just some very successful marketing is unclear, but NVIDIA’s Quadro and Tesla boxes were everywhere at the show. The Quadro Plex boxes offer hardware accelerated graphics. The Tesla D870 is NVIDIA’s GPU-based computing engine in a box – a.k.a. the deskside supercomputer, powered by two 128-processor core GPUs.
According to TerraSpark GeoSciences’ CEO Geoff Dorn, ‘We don’t compete with VoxelGeo or GeoProbe, we just make them better.’ TerraSpark’s Stratal Slice technology is very similar to Paradigm’s newly announced ‘Skua’ (see page 3). Stratal Slice transforms acquired seismic volumes into a ‘depositional environment’ volume in which palinspastic reconstruction is used to visualize channels and sedimentology that are invisible on the original time slices. Automated fault extraction, ‘surface wrapping’ and geobody extraction combine in what TerraSpark calls Computer Aided Seismic Interpretation—CASI. TerraSpark has teamed with Transform Software and has also signed with JOA to develop a Jewel Suite plug-in.
IBM has decided that the SEG is its big show—and is not afraid of straying from the geophysical path, as it were. Thus, alongside the Blue Genes and cluster racks for seismic processors, was a wide range of solutions for more downstream problems. IBM’s Monitor Runtime is a browser-based interface that ‘sucks’ a data model from ISO 15926-based engineering data sources and maps it across to real time data from the Historian. The technology is used in an event early warning system that uses control room alarm techniques to provide sand alerts. A ‘semantic engine’ from IBM Research captures knowledge and best practices.
Speaking at a Panasas-sponsored breakfast event, Keith Gray, BP’s High Performance Computing (HPC) team leader, described how BP is using R&D to create new ideas and solve problems like deepwater imaging. Innovative techniques like wide azimuth towed streamer acquisition and seismic nodes on the seabed were ‘proved’ in the computer before field tests. These have now been industrialized with, for example, PGS’ Crystal wide azimuth towed streamer (WATS) survey covering 400 OCS blocks and resulting in 200 terabytes of field data.
BP’s own HPC setup now boasts Intel quad core Xeons totaling 15,000 CPUs and 125 teraflops. Storage is currently 500, going to 750, terabytes of Panasas disk and 2 petabytes of SGI CXFS ‘and we still can’t keep up.’ BP has been testing Cluster File Systems’ Lustre file system—which ‘looks promising.’ Looking to the future there is a pressing need to build bigger file systems that assure data integrity. ‘We can’t afford to lose data that took weeks to compute.’
Phil Neri (Paradigm) noted that computer processor speed greatly exceeds memory access speed, and seek times are increasing in the new huge-memory systems. Multiple levels of cache (on die, in memory, on disk, on the network and in storage) complicate optimization. A related issue is the fact that commercial clusters are designed for transactions, not HPC; HPC is a second class citizen! Seismic processing applications like wide azimuth and tomography need to ‘see’ all the data, and the distributed computing paradigm is not so good in this context. Neri sees a breakthrough in Panasas’ parallel storage and multiple Gigabit Ethernet connectivity. A 20-fold improvement was observed on one migration algorithm. This was achieved by de-tweaking prior ‘optimizations’ and running the application on a virtual machine with parallel storage.
Dynamic Graphics (DG) has livened up its CoViz product, adding a 4D time lapse display function. This was developed for Occidental’s reservoir surveillance and can show, for instance, producers and injectors as a movie with changing well status—alongside simulator results. DG reports 160 CoViz users within BP. The tool offers an impressive ‘big picture’ view for a ‘high level’ understanding.
Landmark was showing an equally impressive, huge 8 megapixel screen. For under $100,000, and with no wiring, companies can deploy an ‘instant team room.’ The screen is driven by the NVIDIA Quadro Plex, as above.
Petris has developed Kelman’s new seismic data management system. The package leverages Petris WindsEnterprise technology to support Kelman’s clients’ ordering and data delivery. The interface offers digital terrain and geology backdrops and a classic ESRI interface for map control and search. Data can be ordered and loaded into a workstation project.
The SEG has created the SEG Advanced Modeling project (SEAM) Corporation to develop synthetic data sets for algorithm testing. Members have currently chipped in over $1 million to the SEAM fund. Members include PGS, BHP Billiton, Total, CGG, Shell, Halliburton, Exxon and WesternGeco. After an initial period, results of the modeling will be made available to the SEG membership at a nominal cost.
Finally, our ‘cutest presentation’ award goes to the student from Brigham Young University on the Landmark booth who presented a truly unusual application of GeoProbe’s 3D visualization capability. GeoProbe was used to investigate a ground penetrating radar data set showing gopher burrows. These, intriguingly, corkscrew their way down below ground. Burrows display the odd mouse-made side tracks!
Oh and the big news from I/O? It’s now called ‘ION’!
This report is a short version of an illustrated Technology Watch report from the SEG produced by The Data Room. You can read the full text of our report from the 2006 SEG on www.oilit.com/tech.
Speaking on the Landmark booth to a good audience, Chuck Mosher (ConocoPhillips) traced the history of the JavaSeis project*. ConocoPhillips, Arco and Chevron all contributed intellectual property to the project which is now in the public domain. The main value proposition of JavaSeis is its support for parallel input/output (I/O) and the ‘parallel distributed array’ concept.
Typical seismic processing is done one trace, or one ‘ensemble’ of traces, at a time. 3D ‘process-lets’ produce small but cascading quality improvements. Unfortunately these are hard to implement and manage on clusters, which is where JavaSeis comes in. JavaSeis’ ‘filing cabinet’ metaphor allows for massively parallel I/O and introduces ‘true’ multi-dimensional arrays à la Fortran into Java. Code is written as for a serial process or a single processor, and JavaSeis manages parallelization behind the scenes. Unlike many conventional approaches, data is processed ‘in situ’ rather than moved around.
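The ‘filing cabinet’ idea can be illustrated with a toy sketch (hypothetical names, not the real JavaSeis API): a 3D seismic volume is split along its slowest axis into ‘frames’—the drawers of the cabinet—each owned by one node, so every node reads, processes and writes its own frames in parallel without moving data.

```python
# Toy sketch of a parallel distributed array: frames of a 3D volume
# are assigned to nodes round-robin, so each node works 'in situ'.
class DistributedArray3D:
    def __init__(self, n_frames, n_traces, n_samples, n_nodes):
        self.shape = (n_frames, n_traces, n_samples)
        self.n_nodes = n_nodes

    def owner(self, frame):
        # Round-robin distribution of frames over nodes.
        return frame % self.n_nodes

    def frames_on(self, node):
        # Frames a given node processes -- no data movement required.
        return [f for f in range(self.shape[0]) if self.owner(f) == node]

volume = DistributedArray3D(n_frames=8, n_traces=200, n_samples=1000, n_nodes=4)
print(volume.frames_on(1))  # [1, 5]
```

Serial-looking code loops over `frames_on(node)`; the framework, not the application, decides which node touches which drawer.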
JavaSeis is the default storage format for Landmark’s SeisSpace seismic processing environment. SeisSpace also leverages JavaSeis as the native I/O subsystem for large scale parallel data processing. The latest release of SeisSpace now supports parallel processing, and significant performance gains in sorting and FXY deconvolution have been reported. It is now possible to work with an ‘ensemble’ and perform true 3D operations with 3D algorithms. A new 3D visualization toolkit has been built for processing, as distinct from the visualization tools used in interpretation.
Andrea Chiappe has joined Caesar Systems as product manager. Chiappe was previously with Pioneer Natural Resources.
According to a study by IHS’ Cambridge Energy Research Associates (CERA) unit, staff shortages will delay large oil and gas production projects worldwide. A 10-15% ‘people deficit’ is expected through 2010.
INT’s Wellbore Viewer and Geologix’ Geo Suite are in the process of WITSML certification by Energistics.
RPS Energy MD Mick Cook is the inaugural chairman of the new International Energy Consultants Organisation (IECO) which is to offer geoscience, survey and HSE consultancy services.
Infoweb has published an introduction (infowebml.ws/intro/intro.htm) to the ISO 15926 oil and gas plant data standard and its representation in RDF/OWL.
Invensys is to resell and support the full suite of Innotec’s Comos engineering lifecycle management modules.
Following the retirement of Jan Erik Korssjøen, Walter Qvam has been named Kongsberg’s CEO. Qvam was previously with Bene Agere, a Norwegian strategic consultancy.
Eugene Nosal is to head up MicroSeismic’s new Middle East office in Dubai.
Ali Ferling now heads up Microsoft’s Oil and Gas practice. Ferling was previously with HP.
BP Americas Production Co and Marathon Oil are two of the latest joiners of the Public Petroleum Data Model Association (PPDM).
BJ Services has promoted Henri Niewold to country manager of its Netherlands Well Services Division.
Geoinfo SRL is to represent OpenSpirit Corp. for Argentina, Uruguay, Chile, Peru, Ecuador and Bolivia.
Ola Bøsterud is now head of communications with Norwegian oil services group Aibel. Bøsterud was previously with PGS.
Dave Wallis, EU representative for OFS Portal is now chair of the PIDX Classification Work Group. Paul Mayer of SparesFinder is vice-chair.
Jack Angel has been named VP sales and marketing for fiber optic specialist, Sabeus. Angel was previously with Baker Hughes’ Pipeline Management Group.
Resolve GeoSciences has announced representative agreements with Houston-based processing boutiques SeisLink and Texseis.
Resolute Natural Resources has joined the OFS Portal community. Resolute is a brownfield developer with a primary area of activity in the Paradox Basin of southeastern Utah.
Predictive analytics specialist SmartSignal has appointed Stacey Kacek as VP product development. Kacek hails from Motorola.
Craig Tamlin has been named CEO of SpectrumData replacing Guy Holmes who is now Director of oil and gas operations. Tamlin was previously country manager for tape specialist Quantum.
Pentti Heikkinen has stepped down as President and CEO of TietoEnator following disappointing financial results. The TietoEnator board has started a search for a replacement. Meanwhile Åke Plyhm is interim CEO.
George Mathewson is now non-executive chairman of Wood Mackenzie. Stephen Halliday has been named CEO as Paul Gregory moves to non-executive deputy chairman.
Autodesk is to donate its coordinate system and map projection technology, acquired from Mentor Software, to the geospatial open source community.
Maxeler Technologies and the Stanford Center for Earth and Environmental Sciences (CEES) report a 48-fold speed-up in a seismic algorithm using Maxeler’s MAX-1 card with a single Xilinx Virtex-4 FPGA over a single AMD Opteron node.
Helge Tveit is to head up Norwegian VC Energy Ventures’ new Houston office. He is seconded by Bob Schwartz, previously president of energy at the Houston Technology Center.
IBM unveiled its Master Data Management (MDM) Server this month, a service-oriented architecture (SOA) system for central management of master data for real-time use. Master data management promises a single, ‘multi-purpose’ view of master data for use in multiple applications and by many different users.
While the MDM offering initially targets the retail and financial services sectors, the topic is potentially of interest to users in other verticals including oil and gas and the process industries. Oil IT Journal talked to IBM MDM guru Fred Busche about potential applications in oil and gas. Busche explained, ‘In the commercial world, MDM is used to get a single view of a customer who might be involved in a multitude of transactions. Most MDM solutions therefore have a transactional bias.’
‘Mapping the transaction paradigm to the upstream requires a rethink of what an oilfield really is. In fact, optimizing performance across multiple wells or fields depends on a single view of data over the whole entity, involving multiple applications. This is where the dynamic warehouse concept, along with MDM, comes in to federate data. Data cleansing is important to create a ‘gold standard’ data set for the oilfield.’
‘Once the dynamic warehouse is established it can be leveraged with data mining techniques to ‘pre-think’ activities and optimize complex operations like sand control.’ IBM is currently trialing the technology with a couple of major producers. Busche didn’t want to name them but did admit that, ‘It is taking time to understand the business requirements of the upstream. Mapping from the transactional environment to the oilfield also involves a few issues, wells are not customers!’
IBM’s MDM offering is based on the DB2 data warehouse and the WebSphere Information Server which includes extraction transform load (ETL) technology developed by Ascential which was acquired by IBM in 2005.
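The ‘gold standard’ notion Busche describes can be sketched in a few lines (illustrative code only, not IBM’s MDM Server): duplicate records from different applications are grouped on a normalized match key, then merged, with values from more recently updated records winning.

```python
from collections import defaultdict

def match_key(rec):
    # Normalize the well name so 'Brage A-1' and 'BRAGE A 1' match.
    return rec["name"].upper().replace("-", " ").strip()

def golden_records(records):
    groups = defaultdict(list)
    for rec in records:
        groups[match_key(rec)].append(rec)
    gold = {}
    for k, group in groups.items():
        merged = {}
        # Oldest first, so fields from newer records overwrite older ones.
        for rec in sorted(group, key=lambda r: r["updated"]):
            for field, value in rec.items():
                if value not in (None, ""):
                    merged[field] = value
        gold[k] = merged
    return gold

wells = [
    {"name": "Brage A-1", "operator": None, "updated": 1},
    {"name": "BRAGE A 1", "operator": "Hydro", "updated": 2},
]
print(golden_records(wells)["BRAGE A 1"]["operator"])  # Hydro
```

Real MDM engines add probabilistic matching, survivorship rules and stewardship workflow, but the federate-cleanse-merge pattern is the same.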
Speaking at the recent IEEE Conference on ‘Human Factors and Power Plants’ in Monterey, California, Hydro’s Holst Nystad described how integrated operations on the Norwegian North Sea Brage field have extended the field’s life ‘for ten years.’ The ‘Brage 2010+’ project takes a holistic approach to the development of ‘best practice work processes’ spanning the offshore and onshore environments. These include production and process optimization, operations and maintenance and logistics. These are enabled by ‘extensive application’ of information and communication technologies (ICT) including a broadband secure network, upgraded IT infrastructure, intelligent real time information workspaces and an onshore visionarium.
Workflow retooling has moved from a linear to a hub-based approach, cutting cycle times ‘from weeks to hours.’ Multiple point-to-point application/data links have been replaced by a data hub with Aspentech’s InfoPlus.21 real-time data historian at its core.
Critical Performance Analysis
Current efforts target fiscal metering, heavy rotating machinery through ‘deep’ integrated operations, change management, critical performance analysis and the ‘Man, Technology and Organization’ (MTO) approach developed at NTNU. These have successfully addressed the challenge of Brage’s increasing operations and maintenance costs—forecast to rise exponentially toward end of field life. In a 2006 McKinsey study of 44 North Sea fields, Brage ranked best in terms of costs.
Geographical extract, transform and load (ETL) specialist Safe Software gave a presentation at the Free and Open Source Software for Geospatial (FOSS4G) conference in Victoria, BC, last month showing how Safe’s FME platform can be used to exchange data between open source and proprietary software.
Safe Software president Don Murray said, ‘Open source software was once viewed as a side note to an industry dominated by proprietary systems. But this is no longer the case. Today companies are looking to build hybrid geospatial systems combining commercial and open source solutions.’
A FOSS4G demo by Safe’s Dale Lutz showed bi-directional exchange of geospatial data between PostGIS, MySQL and SQLite back-ends and proprietary GIS systems such as ArcGIS and GeoMedia. Other demos showed GIS data load to PostGIS, MySQL, SQLite and PostgreSQL with FME’s quality assurance and data cleansing applied during the process.
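The extract-transform-load pattern the demos applied can be reduced to a toy sketch (illustrative code, not FME, using two in-memory SQLite databases standing in for the source and target stores): rows are extracted, cleansed in flight and loaded into the target.

```python
import sqlite3

# 'Source' database with slightly dirty well names.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE wells (name TEXT, lat REAL, lon REAL)")
src.executemany("INSERT INTO wells VALUES (?, ?, ?)",
                [(" brage a-1 ", 60.5, 3.0), ("oseberg b", 60.5, 2.8)])

# 'Target' database with the same schema.
dst = sqlite3.connect(":memory:")
dst.execute("CREATE TABLE wells (name TEXT, lat REAL, lon REAL)")

# Extract, cleanse (trim and upper-case names), load.
for name, lat, lon in src.execute("SELECT name, lat, lon FROM wells"):
    clean = name.strip().upper()
    dst.execute("INSERT INTO wells VALUES (?, ?, ?)", (clean, lat, lon))

print(sorted(row[0] for row in dst.execute("SELECT name FROM wells")))
```

A real geospatial ETL also reprojects coordinates and translates geometry encodings between formats; the pipeline shape is the same.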
The US Department of Energy has kicked off three large scale carbon sequestration pilots under a presidential initiative to promote clean energy technology and to ‘confront’ climate change. The tests involve the storage of over 1 million tons of CO2 in deep saline reservoirs. The DOE plans to invest $197 million over ten years. Extra monies from partners will bring the project’s value to $318 million. The formations to be tested during this third phase of the regional partnerships program are considered the most promising locations in the US, with the potential to store more than 100 years of emissions from all major North American sources.
The projects include participation from 27 states and three Canadian provinces. The whole sequestration lifecycle will be studied at scale to assess the ability of different geologic settings to permanently store CO2.
Panoram Technologies and ModViz have teamed to deliver high resolution visualization centers. HP workstations and NVIDIA graphics will equip the new ‘visual super-computing platform’ capable of driving high resolution and immersive displays.
The new system will leverage multi-core CPUs and multi-GPU technology for the workstation alongside ModViz’s graphics virtualization to deliver the 3D performance and large data capabilities of older Unix systems. ModViz’ middleware sits between the operating system, the application and graphics subsystems to provide high end performance on commodity hardware.
ModViz CEO Tom Coull said, ‘By sharing our technology and marketing efforts we will be able to deliver a world-class visual computing solution. Our software and Panoram’s large-scale visualization centers will deliver a much needed solution to the oil and gas and other industries that use large screen, high resolution visualization.’
Montreal-based CGI has just announced the ‘official’ release of its Production Revenue Accounting Solution CGI PRAS. PRAS was developed in CGI’s Calgary location along with industry partners Devon, EnCana, Husky and Talisman, building on CGI’s product line that includes Triangle for the Canadian market and CGI Horizon and Petrocomp for the US. The packages manage production volume, royalty and government reporting along with facility charges for operated and non-operated properties.
CGI has developed the PRAS help system with the Darwin Information Typing Architecture (DITA), a framework for the storage and management of textual information. CGI found DITA’s scalability and configurability superior to its previous commercial help authoring system. Tiberon Information Architects helped out on the information framework, using its ‘Q6’ architecture alongside DITA to generate the hypertext system.
In August 2007 the OASIS open standards consortium approved DITA version 1.1 as an OASIS Standard. DITA builds content re-use into the authoring process, defining an XML architecture for designing, writing, managing and publishing many kinds of information in print and on the web. CGI claims a client base of over 700 clients worldwide.
Houston-based pipeline software specialist Energy Solutions International (ESI) has acquired the PipeWorks division of CriticalControl Solutions Corp. of Calgary. PipeWorks is an integrated suite of software products supported by a team of experienced pipeline hydraulics specialists. PipeWorks targets pipeline batch scheduling and leak detection. Details of the transaction were not disclosed.
ESI CEO Jo Webber said, ‘This acquisition concentrates the world’s best liquid pipeline software engineers and the leading technology into one company. The integration is strategic to our focus on liquids pipelines and on providing end-to-end solutions that help our clients integrate with their customers’ supply chains.
PipeWorks’ location near the Alberta oil sands gives us a local presence to serve this growing market with software solutions that can improve pipeline transmission performance.’
In a separate announcement, ESI reports deployment of its PipelineTransporter liquids scheduling application at Wolverine Pipe Line Company of Michigan. Wolverine’s pipeline system consists of more than 1,000 miles of various-sized pipe and pumping stations. PipelineTransporter addresses the challenges of moving large volumes of crude oil and refined products through the world’s pipelines, providing schedulers with tools and information for planning and timely delivery of products, including batched pipelines. ESI serves more than 500 clients in over 70 countries.
Calgary-based Acceleware has received a $2 million private placement from a Camlin Asset Management fund. The placement follows a $3 million injection from NVIDIA in January. Acceleware leverages NVIDIA-based GPU computing in a variety of vertical markets including seismic processing.
HP has acquired Atos Origin’s Middle East (AOME) group, with 450 employees and offices in Saudi Arabia, Bahrain, the United Arab Emirates, Qatar and Libya. AOME is a specialist in large-scale, SAP-based enterprise resource planning (ERP) systems for, inter alia, the oil and gas market.
IHS has acquired the assets of Exploration Data Services and its subsidiary Geodigit for $6.25 million cash. The assets include EDS’ catalog of interpreted formation tops on some 25,000 offshore wells and Geodigit’s MMS offshore well data, base maps and other well header data. IHS has also acquired EnvironMax, a supplier of environmental, health and safety compliance solutions for $22.5 million cash plus 65,000 IHS shares.
London-based private equity company Actis has taken a 49% stake in Pakistan’s LMKR Holdings (LMKR) for an undisclosed sum. LMKR is a global provider of IM solutions to the oil and gas sector and is a partner with Landmark in Pakistan.
Ikon Science has bought Edinburgh-based seismic anisotropy specialist Anitec. Anitec’s founders Colin MacBeth and Phil Wild will join the Ikon team.
MicroSeismic has obtained a $2 million line of credit from Square 1 Bank. The working capital facility is to underpin an aggressive growth strategy.
PGS has acquired depth imaging specialist Applied Geophysical Services of Houston for $51 million.
SAP is to acquire Business Objects in a €4.8 billion friendly takeover.
Schlumberger has acquired a minority interest in marine EM specialist PetroMarker.
ARKeX has acquired Ark Geophysics.
Chicago-based Aleri has teamed with OILspace of Houston to add Aleri’s complex event processing technology to its energy supply chain, risk, and trade management applications. OILspace’s flagship products are the OILwatch energy portal and Aspect TradeFlo, an online trading solution. Complex event processing (CEP) involves real-time analysis and decision support leveraging real-time and historical data.
Aleri President and CEO Don DeLoach said, ‘Other trading markets use real-time analysis along with historical data. The deal with OILspace brings CEP to energy trading and will provide timely and accurate analysis and forecasting.’ CEP lets oil traders aggregate, analyze and correlate real-time and historical data from multiple sources. Traders can react quickly to market fluctuations, leveraging proprietary knowledge and processes for a competitive edge.
OILspace President and CEO Steve Hughes added, ‘We are constantly looking to improve clients’ trading, risk and financial operations. The combined solution will let our users analyze and respond instantly to high-volume, high-speed data while minimizing risk.’ OILspace customers include 9 of the world’s 10 largest oil companies.
Aspect TradeFlo is a simplified trade management solution for small trading and brokerage groups and individual traders. Aleri’s background is in CEP technology supply to banking and financial services. The Aleri Liquidity Management System is used by bank treasury departments.
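As a flavor of what event processing on a price stream looks like, here is a minimal sketch (hypothetical code, in no way Aleri’s engine): a sliding window over incoming ticks raises an alert when the latest price deviates from the recent average by more than a threshold.

```python
from collections import deque

class PriceMonitor:
    """Toy CEP rule: flag ticks that deviate >5% from the recent average."""

    def __init__(self, window=5, threshold=0.05):
        self.ticks = deque(maxlen=window)  # sliding window of recent prices
        self.threshold = threshold

    def on_tick(self, price):
        alert = False
        if self.ticks:
            avg = sum(self.ticks) / len(self.ticks)
            if abs(price - avg) / avg > self.threshold:
                alert = True
        self.ticks.append(price)
        return alert

m = PriceMonitor(window=5, threshold=0.05)
print([m.on_tick(p) for p in [100, 101, 100, 99, 110]])
# [False, False, False, False, True]
```

Production CEP engines generalize this to declarative rules correlating many streams with historical data, but the windowed, incremental evaluation is the core idea.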
European natural gas company E.ON Ruhrgas of Essen, Germany is installing GE Oil & Gas’ ThreatScan pipeline impact detection system on several of its German pipelines to evaluate the system’s effectiveness in protecting pipeline networks from third-party damage. The two-phase, $635,000 pilot is the first European application of ThreatScan.
50 km test
In phase one, five ThreatScan sensors have been installed on a 50 km section of pipeline near a construction site in southern Germany. Phase two will see the installation of further ThreatScan sensors on pipelines operated by DVGW (the German scientific and technical association for gas and water). The tests are being conducted in conjunction with planned repair activities on a particular section, where a serious impact will be ‘allowed.’ Tests will include deliberate drilling, scratching and striking of the pipeline.
GE Oil & Gas CEO Claudi Santiago said, ‘ThreatScan allows operators to monitor pipelines 24 hours a day, allowing them to immediately respond to, and investigate, potential problems as they happen.’ E.ON Ruhrgas operates more than 11,400 km of pipeline, 11 underground storage facilities and 28 compressor stations.
Earlier this year, the US Patent Office issued Patent No. 7,246,156 to Industrial Defender for a ‘method and computer program for monitoring an industrial network.’ Industrial Defender (ID) uses agent-based technology to report network data to a server, which monitors the data, determines when an alarm condition occurs and sends out a notification. A ‘threat thermostat’ sets the threat level.
The safety features embedded in ID’s agents rely on a write-only use of the TCP/IP protocol. Agents never read anything back from the server and so cannot be hacked or subject to buffer overflow attacks. The ID security event management console is ‘no more a single point of failure than is any other host in your network’.
ID’s activity dovetails with the recent Homeland Security Appropriations Act of 2007, whose Section 550 establishes risk-based performance standards for US chemical facilities. ID recently opened an office in Asia and is planning further expansion in North America and Europe.
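The write-only design can be illustrated conceptually (hypothetical code, not ID’s agent, and using UDP rather than ID’s TCP/IP for brevity): the agent only ever sends datagrams to the server and has no receive path, so there is nothing inbound for an attacker to exploit.

```python
import socket

class WriteOnlyAgent:
    """Toy monitoring agent that can send events but never reads the socket."""

    def __init__(self, server_addr):
        self.server_addr = server_addr
        self.sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

    def report(self, event):
        # sendto() only -- the agent deliberately has no recv() call,
        # so replies (or attack traffic) from the wire are never parsed.
        return self.sock.sendto(event.encode(), self.server_addr)

# Local stand-in for the security event management server:
server = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
server.bind(("127.0.0.1", 0))
server.settimeout(2.0)

agent = WriteOnlyAgent(server.getsockname())
agent.report("ALARM level=2 source=plc-7")
print(server.recvfrom(1024)[0])  # b'ALARM level=2 source=plc-7'
```

The one-way discipline is a convention enforced by the agent code rather than the transport, which is why ID pairs it with hardening such as buffer overflow protection.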
Penn Virginia Corp., an independent oil and gas company based in Radnor, PA, has completed implementation of Quorum Business Solutions’ Quorum Land Suite to manage its land and lease administration activities. Quorum Land is a component of the Quorum Upstream suite, which supports land and lease management, integrated mapping/GIS, production and revenue accounting and financials.
Penn Virginia CIO Gary Bailey said, ‘We chose Quorum Land Suite because of its ease of use, proven integration with SAP, and Quorum’s track record for project implementation and support. Quorum’s configurable design means that we will be able to use the product to manage our coal business assets as well.’ In addition to Quorum Land, Penn Virginia utilizes Quorum TIPS to manage its midstream accounting activities.
Yokogawa has announced its ‘Lifecycle Management Program,’ a software and services bundle that targets integrated maintenance of, inter alia, oil and gas production facilities. The program seeks to strike a balance between minimizing maintenance costs and avoiding unexpected equipment failures and lost production. This is achieved through Condition Based Maintenance (CBM) and an Alarm Rationalization Service (ARS).
Yokogawa’s Satoru Kurosu said, ‘The LMP provides the customer with an optimal maintenance plan from the long-term perspective of the equipment lifecycle. The LMP assures asset reliability, availability and performance – key aims of our ‘VigilantPlant’ vision.’
A 2006 survey by the Aberdeen Group found that up to 20% of production costs are maintenance. A move from corrective to strategic maintenance could cut maintenance costs by 80%, a saving of some 15% of lifting costs.
Yokogawa’s Michael Büßelmann told Oil IT Journal, ‘Our solutions span upstream and downstream operations. We monitor real-time field device information from pumps, heat exchangers and turbines and provide overall key performance indicators (KPI) to operators.’
Field of Future
Yokogawa has ‘field of future’ pilots in progress with Shell and BP. Yokogawa is a partner on a NAM gas storage consortium where the technology was used initially to monitor heavy rotating equipment. But as CBM proves its worth, smaller equipment items are coming into the system. CBM is a component of Yokogawa’s ‘Asset Excellence’ program and a component of BP’s Field of the Future effort. Data standards embedded in the toolset include S95 and Open Operations and Maintenance, ‘OpenO&M.’
Halliburton’s Landmark unit has acquired the intellectual property and ‘substantially all’ of the assets and existing business of GeoSmith Consulting, including the ‘Shapes’ 3D geometric modeling toolset. In its heyday, Shapes, originally developed by XoX Corp., claimed several oil and gas customers including Schlumberger (GeoFrame), VeritasDGC (RC2), TGS Nopec/A2D and Seismic Micro Technology—although the current status of the toolset within Landmark’s competitors is unclear.
Landmark’s Nick Purday told Oil IT Journal, ‘The GeoSmith modeling tools represent a framework for 3D interpretation from basin modeling through to simulation. They enable our sealed framework interpretation to be updated in real time. The toolset is deployed in both our Geographics Smart Section and the new DecisionSpace EZ Model.’ Shapes offers a ‘non-manifold’ topology architecture with triangular or rectangular meshes of arbitrary geometric shapes.
Previously, Landmark deployed the GoCad modeling engine in its PowerModel product, but this was discontinued when Paradigm acquired Earth Decision and GoCad. Today the geomodeling boot is on the other foot as it were, with Landmark now owning the IP inside ‘some of our competitor’s products!’
Oil and gas standards organization, Energistics (formerly POSC) and the supply-side e-commerce trade body OFS Portal have teamed to form an oil and gas chapter within the Electronic Commerce Code Management Association, ECCMA. The Oil & Gas Content Standard Council (www.OGICSC.org) ‘manages and governs a consensus process that ensures that a common industry standard is ratified through a process of collaboration with participating global stakeholders, customers and ECCMA.’ Interested parties currently include API/PIDX, SparesFinder, POSC Caesar and Suncor Energy.
One ECCMA deliverable is the NATO-backed ECCMA Open Technical Dictionary (eOTD), first unveiled in 2005. The eOTD is an XML representation of the NATO codification system for product parts. The public domain eOTD is used in NATO’s procurement and is currently being extended for use in CAD/CAM programs. Another ECCMA project involves the Management Resource Group (MRG) and is in the process of creating a ‘Maintenance, Repair and Operations Content Standard Council (MROCSC).’
Upstream involvement in ECCMA began last year with the intent of harmonizing product and services classifications. According to the member bodies, the proprietary classification systems used to date have only had marginal interoperability success. The intent is therefore to leverage the API’s Petroleum Industry Data eXchange Subcommittee’s (PIDX) set of product classification templates. These cover oil country tubulars, valves, bearings, belts and electrical equipment. The main task ahead of the OGICSC is to align these PIDX templates with the wider eOTD, thereby ensuring that the oil and gas industry uses standard, public domain terms for procurement.
A ‘mini pilot,’ led by SparesFinder, is underway to investigate the feasibility of linking the eOTD with the PIDX templates and the ISO 15926 Reference Data Library. If this is successful, a larger pilot will likely be scoped for early 2008.