July-August 2003

Buffett bets on Seitel

Warren Buffett’s Berkshire Hathaway has made its first venture capital investment in the oil industry with the rescue of troubled seismic data broker Seitel from the throes of bankruptcy.

The Seitel plot thickens! As revealed in Oil IT Journal last month, the troubled seismic data broker was rescued from bankruptcy by a ‘white knight’ in the form of Californian private equity firm Ranch Capital.


Now it emerges that Warren Buffett’s Berkshire Hathaway investment company is behind the Ranch move, and will be financing Seitel’s reorganization. If all goes as planned, Seitel will become a wholly-owned subsidiary of Berkshire Hathaway.


The Seitel dénouement began when holders of Seitel’s ‘senior’ debt (i.e. the first creditors in the pecking order in the event of the company winding up) filed a bankruptcy petition against Seitel. But before the court acted on the petition, Ranch acquired all $255 million of the senior debt at an undisclosed, but likely heavily discounted, price. This debt was then sold on to Berkshire—again for an undisclosed amount—and the forced bankruptcy proceedings were halted.


Seitel has now filed a reorganization plan with the Delaware Bankruptcy Court for itself and some 30 of its US-based subsidiaries. The reorganization will be financed by Berkshire. Ranch CEO Larry Hershfield will be the new Seitel chairman, while Larry Lenig is to continue as CEO upon the company’s emergence from bankruptcy.

Business as usual

Seitel has filed to dismiss the original bankruptcy petitions and the reorganization will be carried out under US Chapter 11 legislation. Seitel and its US subsidiaries will continue to operate their business as usual. Seitel’s Canadian subsidiaries are not affected by the present reorganization. Seitel currently has $26 million in cash and a further $20 million line of credit from Wells Fargo bank to see it through the restructuring process.

A sign?

It is tempting to speculate on Buffett’s motivation behind the Seitel acquisition. Is this just an opportunistic punt by the ‘most powerful businessperson in America’—according to the current edition of Fortune magazine? Buffett is a great believer in investing in fundamentals and in intrinsic value—a good sign for a troubled industry sector. Another sign may be the 20% hike in worldwide drilling activity just reported by Baker Hughes. But our question for Buffett is—why Seitel? Why not PGS?

SMT HOT’s up!

Seismic Micro Technology is moving into reservoir simulation with the acquisition of Veritas’ software HOT and (RC)2. Veritas retains a service offering.

Seismic Micro Technology (SMT) is to acquire Veritas’ reservoir engineering software (RC)2 and HOT for an undisclosed cash amount and a percentage of future revenues over a four-year period. SMT will acquire the software developed and marketed by Veritas units (RC)2 and Heinemann Oil Technology, along with related trademarks.


SMT will assume maintenance and support obligations for the software and will grant Veritas and its subsidiaries a royalty-free license to continue using all SMT software, including the software being sold.

Significant addition

SMT was founded in 1984 and provides PC-based interpretation and analysis tools. Veritas’ reservoir modeling software will make a significant addition to SMT’s line-up, which currently has a geology and geophysics focus.

Exploration Services

Likewise, Veritas’ continued leverage of (RC)2 software—and the addition of a service offering built around the SMT suite—make for a credible extension of the company’s exploration services offering.

On MP3, HPC and having vs. eating cake

Oil IT Journal editor Neil McNaughton reflects on Microsoft’s settlement with AOL in the browser wars—where an antitrust settlement has effectively killed off the competition. Some of the arguments Microsoft used in the case—notably that the browser is tightly coupled to the operating system—make it hard to believe that Windows’ design is modular enough for high performance computing.

As a technology writer I little expected that, in the comfort of my own office, my own PC would suddenly jump up and surprise me in a semi-magical way. I had just popped a music CD into the drive and was listening to Bjork’s dulcet tones belting out from my total-overkill sound system that usually gives me a headache after five minutes. Midway through track 3—i.e. rather near the beginning of the CD—I was amazed to see the drive pop open and eject the media—while the music carried on playing!

Black magic

Not, as I first thought, black magic, but just new software from RealAudio whose default behavior is not just to play a music CD, but to compress and record each track in real time onto your disk. What is played back is the MP3. But because the compression and writing to disk overtake the real-time stream, the disc is ejected well before you are through with the record. Neat stuff!
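The timing works out as simple arithmetic. Here is a minimal sketch—my own back-of-the-envelope model, not anything from RealAudio’s documentation—of why the drive opens so early, assuming the ripper runs at a constant multiple of playback speed (the 8x figure below is an assumption for illustration):

```python
# Toy model of rip-ahead ejection: a ripper encoding at N x real time
# has the whole CD on disk after (CD length / N) minutes of playback.

def rip_complete_minute(cd_minutes: float, rip_speed: float) -> float:
    """Playback minute at which the full CD has been captured to disk."""
    return cd_minutes / rip_speed

# A 60-minute CD ripped at an assumed 8x speed is fully captured
# 7.5 minutes in -- early in track 3 of a typical album -- so the
# disc can be ejected while the MP3s play on from disk.
print(rip_complete_minute(60, 8))  # -> 7.5
```

The faster the drive and encoder relative to real time, the earlier the eject—which is why the effect feels so magical on then-new hardware.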


Not only is it neat stuff, it is free stuff. And not only is it neat and free, RealAudio is at the heart of the protracted battle between Microsoft and the rest of the world as to what is ‘fair’ and what is not in competitive software development and marketing. OK, this may be a twisted way of introducing an editorial on Microsoft’s predatory (or otherwise) practices, but holidays loom and frivolity is in the air, what?


In an ideal world, this editorial would have walked you through the legal arguments which have led various plaintiffs in the form of US corporations, some States and the European Union to accuse Microsoft of beating up on the competition. I propose to spare you this and offer a rather succinct account which no doubt omits many of the fine and not so fine points—heck this is an editorial after all.


A crucial aspect of the case against Microsoft was the bundling of Internet Explorer with the Windows operating system. This was claimed by Microsoft’s adversaries—notably by America Online’s Netscape unit—to ‘violate two sections of the 1890 Sherman Antitrust Act’ (legal research on-the-fly from Google!). Netscape argued that to allow a modicum of competition—in short to give Netscape a chance, Microsoft should un-bundle the browser from the OS. Microsoft replied, initially at least, that the technologies of browser and OS were so imbricated that this was impossible.

Go figure!

Earlier this year, the browser part of the anti-trust suit was ‘settled’ when Microsoft agreed to pay AOL $750 million along with a ‘royalty-free’ license to use its (free) Internet Explorer technology! An anti-trust settlement that has killed off the remaining competition and left Microsoft with a complete monopoly of the browser market. As they say, ‘go figure’.


OK, that’s water under the bridge, but what’s this I see at the edge of the stream? A thread—something for an editor to pull on and unravel—viz. Microsoft’s argument that the browser is somehow inevitably coupled to the OS, that technology obliges it to sell both together—along with its competitor to RealAudio, Windows Media Player—the object of the current EU legal challenge.

Design principles

This thread is particularly worth pulling on because good IT design principles—of the sort that allow the Internet or the telephone system to function—dictate that software should be decoupled and layered, and that applications like browsers should be built and sold independently of the OS.


A good example of good software design is Unix (or Linux if you will)—where the windowing system is completely disconnected from the OS. At The Data Room, we have a Linux server running in text mode as an internet test bed and Samba file server. No windows at all. In fact the Unix design principles take this a step further, with a minimalist kernel performing the essentials of the OS, and a multiplicity of modules which can be linked—or more likely not—depending on your requirements.


Thus if you are a Cisco, you take the Unix kernel and strip out all the stuff you don’t need, call the result a router and launch a zillion dollar world-beating corporation. Or more prosaically, if you are a seismic processing house, you chuck out X-Windows, and probably a bunch of other stuff that is normally included in Linux and build a supercomputer.
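As a toy illustration of this modular principle—my own sketch, with invented module names, not any real kernel’s packaging system—think of each deployment as a minimal core plus an explicit list of linked modules; anything not linked simply isn’t there to weigh the system down:

```python
# Hypothetical sketch of 'minimal kernel + optional modules' packaging.
# Module names and footprint units are invented for illustration.

class Build:
    """A system image: a minimal core plus explicitly linked modules."""
    def __init__(self, *modules: str):
        self.modules = set(modules)

    def footprint(self) -> int:
        # The core counts as 1 unit; each linked module adds 1 more.
        return 1 + len(self.modules)

# A router build keeps only networking; a seismic cluster node keeps
# networking plus a message-passing layer -- neither carries X-Windows.
router = Build("net")
cluster_node = Build("net", "mpi")
desktop = Build("net", "x-windows", "browser", "media-player")

assert "x-windows" not in cluster_node.modules
print(router.footprint(), cluster_node.footprint(), desktop.footprint())  # -> 2 3 5
```

The point of the sketch: what you leave out never has to be installed, patched or booted on 5,000 nodes—which is exactly the property an OS with an ‘inextricable’ browser cannot offer.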


Which brings me finally to the question of the month. What does Microsoft have to offer high performance computing? To take the behemoth literally, if you really need Internet Explorer to ‘run Windows’, or if you even need Windows to run Windows, then having these frills installed on every machine in a 5,000-CPU cluster is clearly an unwelcome overhead.


The unfortunate truth may be that Microsoft has been hoist by its own IT petard. By embedding this in that—creating a ‘compelling’ solution for its marketing effort—Microsoft ensures that clients who have bought one widget are inextricably drawn to the next. Good commerce from bad design.


But the inextricability of Microsoft’s solutions makes its move into HPC (see our report on page 8) a little hard to believe. Something to do with having and eating cake, maybe?

New sponsor-contributed material from PGS Tigress and a paper on migrating upstream computing to Linux by Anadarko’s Will Morse.

Munro—“Manage your contracts”

Digital Oilfield CEO Rod Munro argues that contract management has great potential in the upstream. But the specific and evolving nature of our business means that the existing high-volume, transaction-focused solutions of the marketplace may have limited applicability. Instead, Munro suggests concentrating on contract visibility, price reconciliation and organizational ‘white space’.

Contract management is a fast growing software category, not only in oil and gas, but across industries. According to research from Goldman Sachs, the global contract management software business is expected to top $3 billion by 2005—driven by a desire to computerize buyer-supplier relationships. In another survey, PricewaterhouseCoopers forecast savings of 2% of total spend from leveraging contract automation to eliminate inaccuracies and non-compliance.


Digital Oilfield has interviewed a cross-section of operating companies, and found that professionals face several challenges relating to contract management. Poor visibility can mean that the managers of contracted work may not have access to the terms and conditions of the agreement, or even the most recent copy of the contract. Even when a company has a documented contract approval process, the chain of approval may be incomplete. This exposes the company to ‘maverick’ contracting, increased risk and expense. Historical data about the supplier relationship is often ad hoc, anecdotal, and limited to recent memory. Without such information, contract professionals may not get the best possible deal for the company.


Many oil and gas contracts have complex terms and conditions, with triggers for obligations on the part of the supplier and the operating company. This can lead to the ‘sign and forget’ syndrome, as there is no practical mechanism for ensuring delivery and sign-off. Even straightforward items such as pricing are not always communicated and verified because of complex, manpower-intensive processes.

Value recovery

Good contract management promises the oil and gas industry great opportunities for value recovery. Over 60% of an operator’s spending comprises complex services like drilling, well services, field operations and plant and maintenance. Purchase of such complex services cannot be automated through a straightforward purchase-order process. A lifecycle contract-to-invoice reconciliation process is needed. Line-item price reconciliation for complex services becomes a high value opportunity within a contracts management initiative.
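The core of such a line-item reconciliation can be sketched in a few lines. The contract numbers, item codes and rates below are invented for illustration; a real system would pull contracted rates from the contract repository and billed lines from the ERP invoices:

```python
# Hypothetical contract-to-invoice line-item price reconciliation.
# All identifiers and prices are made up for this sketch.

contract_prices = {  # contracted unit rates, keyed by (contract, item)
    ("C-100", "day-rate-drilling"): 25000.0,
    ("C-100", "mud-engineer"): 1200.0,
}

invoice_lines = [  # lines as billed by the supplier
    {"contract": "C-100", "item": "day-rate-drilling", "qty": 3, "unit_price": 25000.0},
    {"contract": "C-100", "item": "mud-engineer", "qty": 3, "unit_price": 1500.0},
]

def reconcile(invoice_lines, contract_prices):
    """Flag invoice lines whose unit price differs from the contracted rate."""
    exceptions = []
    for line in invoice_lines:
        agreed = contract_prices.get((line["contract"], line["item"]))
        if agreed is not None and line["unit_price"] != agreed:
            exceptions.append({
                "item": line["item"],
                "overcharge": (line["unit_price"] - agreed) * line["qty"],
            })
    return exceptions

print(reconcile(invoice_lines, contract_prices))
# -> [{'item': 'mud-engineer', 'overcharge': 900.0}]
```

Automating even this simple check across hundreds of contracts and thousands of invoice lines is where the claimed savings come from: mispriced lines are caught before payment rather than discovered, if ever, in a later audit.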


An operator may have 100 contracts with a single supplier—some with multiple pricing scenarios. Service companies also provide key personnel working in hazardous and difficult working conditions. Third party compliance to required safety and training certification becomes an important part of contract management.


Suppliers benefit from electronic access to the contract system. Ideally a supplier should be able to electronically signify that an obligation has been met. Messaging and warnings in the case of upcoming deadlines should include suppliers. The simple capability of accessing contracts from anywhere in the organization will allow for negotiated corporate savings. New contracts can take advantage of terms negotiated in other business units or departments. Our research shows that line-item price compliance can bring savings of 5% of total E&P spend.


We found many oil companies were evaluating RFX tools (Request for Information, Request for Proposal, Request for Quote, etc.). In industries where a high number of bids are processed annually for purchases of high volumes of standard components, electronic RFX tools generate substantial ROI. In oil and gas, however, the industry has been moving away from “bidding every job” for some time. There are several reasons for this.


Consolidation has led to a reduced number of global suppliers. The industry has adapted to this with a move away from repeat bidding cycles to longer-term contracting with negotiated discounts. Sourcing of new suppliers on a continuous basis isn’t required. Most major oil and gas companies do not consider price the main driver of vendor selection. Safety, crew experience, maintenance record and technological capability are uppermost.

Low hanging fruit

We believe that the real ‘low hanging fruit’ in computerizing the negotiating and contracting process lies in repetitive manual processes and in non-integrated processes that are difficult and time consuming to execute effectively. Labor-intensive reconciliation of contract pricing against incoming invoices would certainly lend itself to computerization. The resulting capture of incorrect pricing is a real cost-saving opportunity.

White space

Organizational gaps or non-integrated processes (organizational ‘white space’) also present opportunities for software that provides automated and standardized workflows and connectivity. Software that helps close these gaps and improve collaboration can provide not only large cost savings, but also the opportunity to mitigate risk associated with non-compliance.


Contract management is a growing technology that is being adopted in the oil and gas industry. Benefits accrue from cost reduction (process automation and price reconciliation), improved corporate controls (standardized processes and auditability), risk mitigation (ensuring contract compliance), and improved supplier performance. The unique challenges of the oil and gas industry in turn present unique opportunities for automation and process improvements that ultimately drive an impressive ROI.

Total deploys INT in Sismage

Total has chosen INT’s Java GeoToolkit to develop the user interface for its Sismage application.

Total’s seismic software specialists have chosen INT’s J/GeoToolkit for GUI development on its flagship Sismage image analysis package. Sismage, the brainchild of Total’s geophysical whiz Naamen Keskes, is the result of 10 years of in-house research (see Oil ITJ Vol. 6 N° 7). Total has settled on Java for future Sismage work and is implementing a component-based architecture to facilitate the integration of new research results. Total leverages OpenSpirit for data management and access to vendor platforms such as OpenWorks or Geoframe.


The new Sismage includes a redesigned graphical user interface (GUI) to enhance usability. Total has developed a set of viewers with INT technology for seismic cross-sections, base maps, well log editors, section and basemap multi views and 3D viewers. INT also provided on-site support for the viewer development project.

Quorum Land for Mission Resources

Mission resources will be using Quorum’s GIS-based software in its land department.

Mission Resources is to deploy Quorum Business Solutions’ Land solution in its land management department. Quorum Land, a GIS-based solution, is part of the Quorum Energy Suite (QES). QES provides an integrated suite of software solutions for energy company operations ‘from wellhead to burner tip’.


Mission land VP Marshall Munsell said, “This project was a tremendous success. We immediately began realizing benefits from the Quorum Land product through ease of use and training and flexible access to our land and lease information.” Quorum’s Energy Suite includes TIPS—the ‘de facto’ industry standard solution for gas plant accounting. The Mission project was completed in May 2003.

Open Spirit V2.5—100 times faster!

Open Spirit Corp. announces a new release and reports 45 oil companies as corporate users.

Open Spirit Release 2.5 has been rolled-out and the company claims significant performance enhancements in well data access. The new release also includes two Subsurface Data Modules for accessing seismic data in SEGY (2D and 3D) and Earth Decision Sciences Voxet format.


OpenSpirit CTO Clay Harter said, “We have seen more than a 100 fold improvement in accessing large numbers of well objects. The new release also provides new libraries for COM and .NET in addition to Java and C++.”


Bart Stafford, VP of Sales and Marketing, added, “45 oil and gas companies now use OpenSpirit as one of their primary data and application integration tools. With the performance enhancement available in this release, as well as our focus on extending data coverage into the drilling and production domains, we are delivering what the market demands for integrated and cross discipline workflows. Our approach to integration leverages our customers’ investment in applications and datastores.”

IFP technology to power Aspen solution

IFP’s multi-phase modeling tool will be embedded in Aspen’s HySys Engineering Suite.

Aspen Technology, Inc. and the French Petroleum Institute (IFP) have agreed to offer the IFP’s Tacite hydrodynamic module embedded within HySys – Aspen’s Engineering Suite (AES). AES helps engineers design and operate oil and gas facilities. The addition of Tacite, a multi-phase flow model for pipeline design will permit optimization of complete production systems – ‘from flowlines to facility’.


IFP VP Christian Pauchon said, “Tacite has been validated using real world data from the oil and gas industry. Its integration with HySys allows users to investigate the behavior of existing multi-phase pipelines or to study alternative design methodologies for new systems.” Aspen’s Manolis Kotzabasakis added, “This deal will enable our upstream customers to improve the performance of their production assets. Accurate hydrodynamic calculations are crucial to pipeline designs and ratings, ensuring optimal operations and improved safety.”


Tacite models complex multi-phase flows in pipeline systems and incorporates local steady-state prediction methods for pressure gradients, liquid hold-up, phase velocities and flow regimes. Steady-state modeling capabilities using TACITE-HDM will be made available first in the HYSYS TACITE Option planned for release in the third quarter of 2003, while dynamic capabilities will be incorporated in AES at a later date.

SAP in Oil and Gas

Shell, Marathon and ChevronTexaco presented papers at the annual SAPPHIRE conference in Orlando last month. CTC’s e-Foundation was the largest IT project in the company’s history and Shell reports significant e-business through Trade Ranger.

Bob Ford described ChevronTexaco’s (CTC) e-Foundation Project as ‘the largest, most complex IT project in our history’. E-Foundation provides sales, supply and financial transaction processing, decision support and reporting for CTC’s products business.

E-Foundation was the first major system to leverage SAP’s portal technology and supports 12,000 users. The massive project involved 450 full-time employees at its peak, beavering away for an estimated 1.5 million hours. The project set out to underpin 70-80% of CTC’s transactions in SAP, with interfaces to over 60 external systems, and to replace some ‘really old’ (25 years!) legacy systems. Project scope centered on ‘order to cash’—the heart of the products business. Enterprise application integration was achieved with Tibco. A whole team was dedicated to data clean-up, a necessary and ultimately worthwhile activity which removed the need for 30% of anticipated technical deliverables.

E-Foundation’s biggest gain was in data visibility—suddenly, “you could see exactly where an order was being held up”. An SAP benchmark program carried out by Aberdeen University determined that CTC’s extended SAP implementation has resulted in a huge drop in transaction costs.

Shell’s portals

Shell has 85,000 seats of the mySAP Enterprise Portal and some 60,000 of SAP Solutions worldwide. Ben Krutzen described how Shell is using portals in its upstream business—which has 25,000 employees in 41 countries. Shell’s ‘globalization’ encompasses drilling, capex project execution and IT—which is undergoing a major rationalization towards a global set of tools and applications. Shell was ‘in dire need’ of good collaboration tools and needed to improve information and application accessibility.

Shell’s GeoPortal (not actually an SAP portal) ‘moves work to people’. Seismic interpretation or reservoir modeling for Middle East assets can be performed by experts sitting in an office in Norway. Session ‘shadowing’ facilitates cooperation at a distance. Shell’s e-Learning portal was originally developed by an oil service company using the Top Tier web portal (later acquired by SAP). Shell uses the portal to simplify access to information and applications. Relatively straightforward content is served to a large number of users in the form of documents and discussion forums. Shell has also developed a Wells Portal—a ‘mock-up’ at the moment, but one which will likely be productized by Accenture and SAP.


John Cavallero presented Marathon’s SAP HR installation. Project ‘Edison’ set out to provide an enterprise-wide HR system across the domestic and international, downstream, refinery and retail operations. Edison replaced multiple aging systems with a single integrated platform. Edison now has 125 portal ‘eye-views’ visible to 12,000 portal users. Around 110 folks worked on Edison at its peak. Cavallero cited a strong relationship with PricewaterhouseCoopers and IBM as critical to project success.


Willem Zuidema illustrated Shell’s global e-procurement effort with a case study of a buying tool and a preview of some SAP developments—the Content Integrator and Management Information System—core components of Shell’s global solution. E-Procurement via Trade Ranger impacts Shell’s business by monitoring contractual compliance, reducing and controlling costs, and standardizing processes for efficiency. Shell uses desktop purchasing tools from ElectroBase which are linked to the SAP Market Set, itself comprising a hosted content integrator and business warehouse. The content integrator matches product names across different vendor catalogs. The business warehouse (or MIS) captures purchase orders, invoices and receipts—leveraging USU, SPCE, DUNS and PIDX standards.

PIDX European users meet

Presentations at the PIDX EU meet included RFID tags, localization and new routing protocols.

The American Petroleum Institute’s Petroleum Information Data Exchange (PIDX) held a meeting of its European branch this month. Some 40 attended from a sprinkling of major European oils and a broader selection of service companies.


Spares Finder’s Paul Mayer presented a paper on the impact of radio frequency identification devices (RFID) on the supply chain. RFID tags, along with classification standards, are set to “eliminate many of our current supply chain issues, including stock and order mismatch, rogue buying, theft, loss and wastage, duplication, physical audits, and poor supply chain visibility”.


Total’s Jean-Pierre Foehn stressed the need for language localization of standards and software tools. Localization is a thorny issue involving units of measure and other attributes. Moreover, subject matter experts, not just translators, are required for this specialist work.


Schlumberger’s Alan Perro described ongoing work on the Transport, Routing and Packaging (TRP) for XML Documents Project. The intent here is to lower the cost of implementing PIDX XML transaction standards. Currently, TRP for XML leverages the (expensive) RosettaNet infrastructure. An alternative using OASIS ebMS is under development.


Jerry Hubbard (OFS Portal) reviewed the status of the Services Classification Project which sets out to provide a standard nomenclature for upstream services. The deliverable will be a revised UNSPSC.

Landmark European User Forum

Landmark’s 2003 European Regional Forum was finally held in Aberdeen last month. It was well attended by Landmark folks—but the industry at large seems to have an increasingly hard time motivating itself for travel—even if it is just from Dyce to Deeside!

The highlight of Landmark’s current software effort is the ongoing release of components of its new application suite ‘Decision Space’. Decision Space (DS) is a new suite of applications from Landmark covering similar ground to OpenWorks—but with a re-packaging of parts of the ‘workflow’. DS is written in Java—and provides native cross-platform deployment, with a focus on Windows and Linux. The new ‘base module’—a common ‘viewing/doing’ environment—is a significant rationalization. A constellation of new applications is growing up around the base module, which replaces the old ‘point applications’—and brings a new scope spanning components of the older point apps’ workflows.

John Sherman’s keynote focused on Landmark’s vision of a ‘truly interactive’ reservoir model which matches development activity in real time. This is made possible thanks to Moore’s law—compute power growing exponentially—and also to an exponential decline in ‘form factor’, i.e. compute device size. As devices dwindle to pinhead size, computing will be moving downhole—where ‘Darcy meets Moore’.

Landmark is helping the industry ‘ride Moore’s Law’ with the NOW field (a.k.a. the ‘E-Field’) concept. The essence of ‘NOW’ is speed—whether in interpreting a 700-block GOM dataset with Magic Earth, recognizing depositional patterns with Spec Decomp, or using the emerging technology of Decision Space Power Model to pick intrinsically consistent 3D geometrical bodies as you go. ‘Real soon’, these emerging products will be joined by ProMagic—adding interpretation capability to ProMax seismic processing—and well-seismic ‘fusion’, for log- and seismic-based fluid and rock physics analysis.

The future will see integration from 4D seismics through interpretative reservoir characterization to simulation with ‘online predictive modeling’—automated earth modeling leveraging atomic mesh unstructured gridding to ‘eliminate upscaling’ and to ‘condition the reservoir model to the results of 4D seismics’. Sherman forecasts the demise of the ‘tinker toy’ environment of current interpretation workflows—soon ‘you won’t pick a horizon, you’ll pick a formation’. Results will plug straight into Landmark’s new unstructured reservoir simulator.

Landmark R&D

Murray Roth explained how Landmark arbitrates its R&D spend. The lion’s share (73%) goes on core product development, notably Release 2003, Linux, ASP enablement and ‘Project Houston’. 20% goes into new product innovation, 3% on ‘basic’ research, including university-based research programs, and 4% on applied research into areas like leveraging bump-mapping graphics technology from video gaming to display seismic and coherency data together. Landmark is also involved in client-funded R&D—customizing solutions to client workflows.

ASP in action

Mike James reported that ASP is working well for Helix’s international consultants. Prior to hosting, Helix had around 30 PC/UNIX workstations running six different operating systems and a ‘considerable’ IT overhead. Now all Unix hardware has been replaced with twin-head PCs and an internet connection. Migration was a painless ‘one day’ project and users were very positive—especially at regional offices. Some software vendors did not want their tools hosted by Landmark. Web-based plotting is ‘work in progress’. Bandwidth for 3D visualization is the main remaining issue. James is confident that “what doesn’t work today will tomorrow.”

Data Management

Maggie Montaigne explored Landmark’s comprehensive data management offering which spans hosting, commercial data sources, data e-commerce, web services and National Data Banks. Today, data management strategy focuses on providing secure access to data of known quality. A variety of in-house, co-sourced and outsourced configurations are deployable.

Digital Oilfield

Robin Wye gave an update on the Accenture-Landmark Real Time Asset Management (RTAM) Center—designed to ‘link stakeholders through web technology’. Anadarko is already using application and data hosting to share CAPEX and joint venture data from a deepwater asset with a major EU oil company. Another EU major links G&G knowledge across several locations through a portal of project decision-making workflows. BP’s Norwegian Valhall field has 12 km of fiber and 10,000 sensors. The Snøhvit gas field deploys fiber optics in subsea umbilicals. Statoil’s Snorre has downhole instrumentation and control (DIACS). Statoil has deployed three SGI Reality Centers for real time drilling.

Wye believes that oils today are at best ‘on the lower slopes’ of the digital oilfield. Companies are not ready to reap the business benefits and need to reflect on how to ‘deconstruct’ their businesses. To evaluate which digital oilfield is right for your company, a holistic view of data ownership and management is required. The technology is there—the business model is ‘emergent’.

Power Explorer

Leslie Mashburn demoed Power Explorer (PE), Landmark’s new GIS-based data front end combining elements of Open Explorer and Surf and Connect. PE is Java-based and leverages ESRI’s MapObjects. PE now supports GeoFrame, PetroBank and OpenWorks, along with local GeoFrame projects. All these repositories can be viewed simultaneously and ‘federated’ queries executed thanks to the new Power Hub ‘middle tier’.


Ben Trewin described ENI’s use of the Team Workspace portal to provide data access, application hosting and workflow management. ENI has linked Team Workspace Finder, Open RSO, OpenWorks and Recall. ENI-AGIP’s IT strategy is to access data and applications from a central location (Milan). A mix of internal and external hosting supports applications such as Access, Oracle, SQL Server, IHS Energy’s Iris 21 and C&C Reservoirs. Users can drill down to specialized navigators like WOW or EDIN for Iris. ‘Real soon now’, ENI’s portal will offer metadata management and web services, intelligent agents and the semantic web.

Decision Space

Nick Purday showed how the DecisionSpace (DS) ‘umbrella’ is used to launch other Landmark ‘point apps’ from the DecisionSpace menu bar. A demo of BP’s Wytch Farm field showed a Landsat image draped over the topography. Moving underground, the extended-reach wells become visible—including the 11 km-plus record breakers. Top and base reservoir are represented by colored disks at the wells. Seismic and horizon data and faults can be paged in. Firing up PowerModel allows object-specific actions such as fault plane smoothing. Gridding up a fairly complex 125,000-cell model took around 10 minutes using the 3D grid building system. Cubes can be displayed as ‘roaming’—trace only—or as voxel-based ‘opacity’ cubes with user-configurable object transparency. PM also does property modeling, statistics and seismic-based model ‘conditioning’.

Real Time

Helen O’Connor cited a recent Cambridge Energy Research Associates study of ‘The Digital Oilfield of the Future’ (DOFF) which determined that digital technology could ‘expand oil reserves by the equivalent of a new Saudi Arabia’. Landmark’s contribution to the DOFF is the Real Time Asset Management Center. O’Connor notes that real time is not new—data has long been captured in the field by SCADA systems. What is new is the use made of real time—‘it’s all about knowing sooner’, thanks to links to SCADA, 4D seismic ‘remote sensing’, real time drilling, visualization and modeling. O’Connor presented some case histories of RT usage—Statoil’s use of OpenWire (WITSML) for field-to-shore data links and Shell’s real time operations centers, which have reduced non-productive time.

This is a shortened version of a report produced as part of The Data Room’s Technology Watch reporting service—tw@oilit.com.

Folks, facts, orgs, et cetera

People on the move at Landmark, CERA, CTC, SPE. Reports and data package announcements from A2D, Chartwell, Newton Evans and AAPG.

Landmark has appointed Murray Roth as Executive VP of Marketing & Systems. He is replaced as VP R&D by Dean Witte.


Cambridge Energy Research Associates (CERA) has named Peter Jackson Director for Oil Industry Activity. Jackson was previously with Enterprise Oil.


Following its recent acquisition, Jason Geosystems has changed its name to Fugro-Jason.


ChevronTexaco Information Technology Company president and CIO Dave Clementz is to retire. His position will be filled by Gary Masada. CTC has created a new Energy Technology Company (ETC) to be led by Mark Puckett.

Ross Davidson has been appointed operations director of the SPE’s new Middle East Office in Dubai UAE. Davidson has moved from a similar post in London, now filled by Val Johnston-Jones.


Cambridge Energy Research Associates (CERA) has just announced the Digital Oil Field of the Future Forum (DOFFF), a successor to its eponymous multiclient study. More from cera.com.


A2D Technologies is offering well log data packages for BP’s 2003 Undeveloped Land Offerings in the Permian Basin and East Texas areas. Visit a2d.com.


Chartwell’s Guide to Bill Presentment and Payment (utilities and retail energy) 2003 is now available from chartwellinc.com.


Newton-Evans Research Company has published a report on the World Market for Supervisory Control and Data Acquisition Systems (SCADA) in Gas & Oil Pipeline Operations: 2000-2004. More from EnergyPromotion.net.


The AAPG has announced a Digital GIS Database – a ‘one-stop on-line shopping center’ for geoscience data. More from aapg.org.


During the first half of 2003, the Oil IT Journal website received a total of 73,000 visitors who spent an average of 4 minutes on the site.

Oil & gas high performance computing

Around 100 attended Microsoft’s High Performance Computing (HPC) for Oil and Gas Forum held in Houston last month. Microsoft unveiled an HPC toolkit developed by Cornell’s CTC HPC unit which vaunts Windows 2000 Advanced Server as an HPC platform. But in HPC, Microsoft is the underdog fighting the dominance of Linux. Although various heavyweights from Intel, HP and others rallied to the cause, their true affiliations don’t really bear scrutiny. The Cornell unit is Microsoft-funded. Intel is agnostic and made no real attempt to justify Windows as an HPC platform. HP again is agnostic and was bent on selling its ‘Agile’ infrastructure rather than defending Windows.

Microsoft energy manager Marise Mikulis welcomed attendees to the High Performance Computing for Oil and Gas Forum. Mikulis announced a ‘new era’ in Microsoft’s dedication to the E&P industry. A Global Business Unit will focus on E&P and Microsoft intends to ensure that its technology meets E&P needs—both upstream and downstream.


Microsoft HPC manager Greg Rankich noted that the research community ‘still develops in FORTRAN’ and that parallel programming hasn’t changed in 10 years and still requires specialized skills. The plethora of tools and approaches makes HPC a fragmented market with many applications but a lack of integrated solutions. Microsoft is working with customers to optimize deployments and is investing in companies developing cluster management solutions. Microsoft has been working with the Cornell Theory Center (see below), HP, Intel and others, and is giving away a toolkit to help users get started with cluster-based computing.


Roger Lang (Cornell Theory Center) wants to make HPC simple enough to be used by ‘the masses’ through standard off-the-shelf tools. CTC has demonstrated the world’s first Windows-based CAVE visualization system and is ‘trying to make clusters easy to build’. The advent of Windows 2003 ‘reveals a new roadmap’ for HPC with Enterprise Web Services making HPC resources transparent across the enterprise, promising ‘excellent reliability and scalability’. Lang deprecates cross-platform enterprise Web Services as requiring complex systems integration and producing a custom, hard to deploy product. On the other hand, Microsoft’s .NET framework offers ‘end-to-end integration for scalable Web Services’.


Intel’s HPC architect David Barkai recalled that in 1980, a gigaflop cost around $5 million. Today the same compute power costs $2,000. Clusters now offer around 10 teraflops (TF). By the end of the decade (2010) Intel foresees 30 GHz processors. Barkai stated that 56 of the top 500 supercomputer sites are Intel-based (up from 3 in ’99). Barkai believes HPC requires ‘ecosystems’ for industry and end users and that Intel’s HPC program is ‘more than silicon’. HPC is differentiated by computational intensity, large-scale applications and random, dynamic data access patterns. The community today is composed of ‘pre-early adopters’ and expert users. Intel wants to make technical computing adoptable by a larger market. Common off-the-shelf components make HPC clusters affordable at the department or project level.
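A quick back-of-the-envelope check of Barkai’s price points gives the implied rate of decline (a sketch only, assuming a 23-year span from the 1980 figure to the 2003 one):

```python
# Implied annual decline in the cost of a gigaflop, from Barkai's figures:
# ~$5 million in 1980 vs ~$2,000 'today' (taken here as 2003).
cost_1980 = 5_000_000.0
cost_2003 = 2_000.0
years = 2003 - 1980

# Compound annual decline r satisfies: cost_2003 = cost_1980 * (1 - r)**years
r = 1 - (cost_2003 / cost_1980) ** (1 / years)
print(f"cost per gigaflop fell roughly {r:.0%} per year")
```

The result, a little under 30% per year sustained over two decades, is what makes Barkai’s ‘affordable at the department level’ claim plausible.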


Existing challenges will get worse as Moore’s Law drives on. I/O and interconnect bandwidth will have a hard time keeping pace with processor speed and memory size. Cluster management also raises issues of performance monitoring and maintenance. Barkai enumerates reasons for using Intel’s HPC solutions without advocating a particular operating system.


HP’s Oil & Gas industry director Michele Isernia presented HP’s ‘Agile Computing’ initiative which aims to loosen the tight relationship between software and hardware. Software should be ‘owned’ by the user, who can use whatever computers or appliances are available. Users are freed from the burden of carrying electronics and computers will ‘recede into the background and become pervasively useful instead of a constant annoyance’. Components include the Agile Client and the Agile Data Center built from ‘industry standard’ servers. Infrastructure is designed for use-on-demand by focusing on services, shared resources and pay-per-use. A pool of virtual servers can expand or shrink as required. HP’s technology to achieve this is the Utility Data Center, a ‘complete solution for virtualizing data center environments’. Like Barkai, Isernia offered more reasons for using HP than Windows!


Calgary-based Computer Modelling Group (CMG) provides reservoir simulation with its flagship IMEX, GEM and STARS products. CMG’s Jim Erdle explained that a driver behind HPC in reservoir modeling is to avoid time-consuming upscaling by using geocellular models directly for simulation. This is being achieved by a combination of 64-bit computing, shared memory parallel processing, and dynamic PEBI grids. Erdle believes that 1-5 million cell models will soon be commonplace. CMG is a Windows shop and appreciates the easy-to-use GUI and the speed of Windows on the Itanium. CMG claims a world record for its 112 million cell simulation. [Actually this was achieved on an IBM cluster running IBM Unix (AIX)—see Oil ITJ Vol. 7 N° 11!]


VP Shing Pan introduced ModViz (a Siemens spin-off) as a developer of hardware and software for large-scale visualization with PC technology. ModViz’s goal is to provide a scalable software platform for high performance real time visualization. ModViz synchronizes 3D data across a cluster in real time and is partnering with HP to ‘make scalable visualization a reality’. ModViz’s Renderizer Visualization Cluster software for Open Inventor is currently shipping on Linux and will be available for Windows ‘real soon now’.

Statoil forces DIACS competition

Statoil has put Schlumberger and WellDynamics head-to-head on downhole instrumentation.

Statoil has awarded two ‘frame’ contracts for downhole instrumentation and control systems (DIACS) on its North Sea Snorre field. The contracts were awarded to Schlumberger and WellDynamics. Each contract, worth about NOK 100 million, includes development, production, testing and installation of the solutions. The contracts will run for three years with possible extensions to other Statoil fields.


Statoil contract manager Bjørn Schibevaag said, “With these frame agreements, we’ve helped to introduce new solutions and create competition over price and technology development in the market.”


WellDynamics has already installed four DIACS on Snorre. The new frame contract means that the company will continue to supply such equipment to the platform. But from now on, this activity will be performed ‘in competition with Schlumberger’. Schlumberger plans to start installing its own DIACS equipment during 2004. WellDynamics is a Halliburton/Shell joint venture formed in 2001 to leverage and combine Shell’s iWell technology with Halliburton’s SmartWell intelligent completion system.

Stratify to explore Petrobras’ information

Petrobras is using Stratify’s intelligent text search toolkit to manage its unstructured information.

Petrobras has chosen the Stratify Discovery System (SDS) to manage its unstructured corporate data. SDS claims to offer a unified solution for delivery of relevant, high-value information directly to business users, as well as enabling content providers to enhance the quality of their information products.


Petrobras enterprise information manager Paulo Cesar Coletti said, “Petrobras selected Stratify to provide the core discovery capabilities in our corporate portal. Integrating seamlessly with our portal, content management and search systems, Stratify provides us with a robust platform to organize our unstructured data and enables us to manage our information more effectively throughout the company.”

SDS 3.0

Stratify’s latest release, SDS 3.0 combines transparent search, integrated categorization and entity recognition (such as people, locations and organizations) with taxonomy management capabilities. SDS users can now leverage analytical capabilities to build customized reports summarizing business critical information embedded in unstructured data, as well as easy-to-use visual discovery tools for analysis and data mining.


SDS lets business users categorize documents, recognize critical entities such as people, locations and organizations, and ‘intuitively discover’ new relationships from unstructured information in text documents, presentations and e-mails.
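Stratify does not publish its entity-recognition internals, but the input/output shape of such a feature can be illustrated with a deliberately naive gazetteer lookup. Everything below (the dictionary, the sample sentence) is invented for the sketch; real systems like SDS use statistical models and context, not simple string matching:

```python
# Toy entity recognizer: tag people, locations and organizations in text
# by dictionary (gazetteer) lookup. Illustration only - not Stratify's method.
GAZETTEER = {
    "Paulo Cesar Coletti": "PERSON",
    "Petrobras": "ORGANIZATION",
    "Stratify": "ORGANIZATION",
    "Rio de Janeiro": "LOCATION",
}

def tag_entities(text):
    """Return (entity, type) pairs found in text, longest names first."""
    found = []
    for name in sorted(GAZETTEER, key=len, reverse=True):
        if name in text:
            found.append((name, GAZETTEER[name]))
            text = text.replace(name, " ")  # avoid matching inside a longer name
    return found

doc = "Petrobras selected Stratify, said Paulo Cesar Coletti in Rio de Janeiro."
print(tag_entities(doc))
```

Matching longest names first prevents ‘Rio’ or a surname alone from shadowing the full entity, a problem any real categorization engine also has to solve.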


SDS’ Taxonomy Manager provides an interface for managing all phases of the taxonomy lifecycle. The Taxonomy Manager’s new security architecture ‘integrates seamlessly’ with enterprise identity management environments (i.e. LDAP and Active Directory) to provide granular control over taxonomies and topics. These enhancements make managing large-scale, complex taxonomies faster and more efficient.

Flow simulation add-on for Irap RMS

Roxar’s RMS FlowSim crosses the boundary between geology and reservoir simulation.

Roxar Software Solutions is about to release a new module in its Irap RMS reservoir modeling portfolio. RMS FlowSim is a collaborative tool for geologists and reservoir engineers that crosses the silo boundaries of modeling and simulation. RMS FlowSim integrates with Irap RMS to provide fluid flow simulations embedded into the static reservoir model. This integration reduces the need for data import and export and allows users to benefit from visualization, data analysis and other modeling support tools in the Irap RMS environment.


RMS FlowSim claims a user-friendly graphical user interface that ‘minimizes the learning curve and increases efficiency’. RMS FlowSim, along with the other Irap modules, now constitutes a focus for the exchange of ideas between members of the asset team.

Finite difference

RMS FlowSim is a state-of-the-art finite difference simulator based on proven technology. It is a fully-implicit black oil simulator with extensive reporting, multiwell handling, group controls and economic limits. Pre and post-processing options include production profiles, P10/P90 comparisons etc. Geological concepts can be tested and ranked and prospects and development scenarios evaluated prior to full-field simulation. Roxar claims that testing the dynamic properties earlier in the modeling process leads to the efficient generation of a high quality simulation model.

P2ES Enterprise for Newfield Exploration

Newfield has deployed Petroleum Place Energy Solutions’ finance and accounting package.

Newfield Exploration has implemented P2 Energy Solutions’ Enterprise Upstream suite of applications for financial and operational accounting and is now live on the system. P2ES professionals are working with Newfield to migrate its recently acquired EEX assets to Enterprise Upstream.


Mark Spicer, IT Manager for Newfield, commented, “The Upstream application suite provides good integration of land, production and accounting and will provide a solid back office that will support Newfield’s future growth. When integrated with our existing systems Upstream will support our reporting architecture.”


P2ES professionals are helping Newfield establish an enterprise datamart to provide a single, centralized source of operations and financial data to all of its office and field personnel. P2ES, a Petroleum Place unit, was created in January 2003 by the merger of Paradigm Technologies, Novistar and Petroleum Financial.

MySQL—the new upstream contender?

A Calgary-based startup plans to leverage the Open Source database and public data model.

CleanDB, Brian Marshall’s Calgary-based startup has set out to port the Public Petroleum Data Model (PPDM) to the Open Source MySQL database. Marshall told Oil IT Journal, “When I joined PPDM in May, members could download scripts to create PPDM databases using Oracle or Microsoft SQL Server. Because I believe that MySQL is going to become more important, I have made scripts to create a PPDM database using MySQL available on the PPDM website.” Marshall claims that MySQL is much simpler to run than Oracle and asks less of DBAs. Marshall is currently working on porting the Oracle stored-procedures that do referential integrity checking in a PPDM database.
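The stored procedures Marshall is porting essentially hunt for orphaned foreign keys. The check itself is plain SQL, as this minimal sketch shows. It uses Python’s built-in sqlite3 as a self-contained stand-in for MySQL, and the two simplified PPDM-flavored tables and their contents are invented for the example:

```python
import sqlite3

# Simplified, PPDM-flavored schema (invented for this sketch):
# WELL references BUSINESS_ASSOCIATE via operator_id.
db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE business_associate (ba_id TEXT PRIMARY KEY, ba_name TEXT);
    CREATE TABLE well (uwi TEXT PRIMARY KEY, operator_id TEXT);
    INSERT INTO business_associate VALUES ('BA1', 'Acme Oil');
    INSERT INTO well VALUES ('100010001000', 'BA1');
    INSERT INTO well VALUES ('100010001001', 'BA9');  -- orphan: no BA9 parent
""")

# Referential-integrity check: wells whose operator_id has no parent row.
orphans = db.execute("""
    SELECT w.uwi FROM well w
    LEFT JOIN business_associate ba ON w.operator_id = ba.ba_id
    WHERE ba.ba_id IS NULL
""").fetchall()
print(orphans)
```

A database-agnostic query like this is one way such checks could move off Oracle stored procedures, though the real PPDM scripts are considerably more involved.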


Recent announcements by major IT players should dispel any doubts that MySQL is becoming a force to be reckoned with. MySQL AB has just announced a strategic alliance with SGI to provide the MySQL database on the SGI Altix 3000 family of servers and superclusters. MySQL AB and SGI are cooperating on engineering optimization, marketing and sales for the Intel Itanium 2-based Altix 3000 supercluster running MySQL on Linux to power heavy-load, high-performance database applications.

Marathon’s $63 million outsource deal

The 8 year contract confirms EDS as leading upstream outsourcer.

IT service behemoth EDS has been awarded an eight-year, $63 million outsourcing services agreement by Marathon Oil Co. to improve its computing capabilities and consolidate its server environment. Under the agreement, EDS will provide mainframe, mid-range and distributed server management services to Marathon, letting Marathon concentrate on its core business and strategic initiatives.


EDS oil industry boss Amanda Mesler said, “EDS has proven experience in developing innovative solutions to clients in the energy sector, and we look forward to putting that industry expertise to work for Marathon. Our services, industry knowledge and global capabilities will allow Marathon to focus on its core business, reduce IT costs and enable its employees to pursue strategic IT initiatives to enhance Marathon’s competitive position.” EDS claims to be a leading outsourcing provider to the energy industry. EDS’ offering spans E&P, refining, petrochemicals, transportation and marketing. Global energy customers include BP, ChevronTexaco, ExxonMobil, and ENI.

APQC content management

Megan Salch takes issue with Oil IT Journal’s review of the APQC’s content management guide.

“I recently received your review of APQC’s booklet Content Management - A Guide for Your Journey to Knowledge Management Best Practices and was surprised by the negative review. This booklet was written based on a comprehensive research project. It was designed to be a quick read to introduce customers to the concept of content management, to explain the place of knowledge management in organizations, and to focus on some facets of IT implementations which are frequently overlooked. With application choices changing daily, this booklet focuses on the much-neglected aspects of a content management system: people and processes. So, like other contributions to our Passport series, this booklet serves as an introduction to jumpstart readers. It’s not intended to be the comprehensive, ‘how-to’ book that your review sought.”

Megan F. Salch

APQC, Marketing Department


OpenWells operations reporting

Landmark’s DIMS replacement links operation reporting to the Engineering Data Model.

Landmark Graphics Corp. has just released OpenWells, the replacement for its Drilling Information Management System (DIMS). OpenWells sits on top of the Landmark Engineering Data Model (EDM), providing drilling and well services operations reporting. Landmark president Andy Lane said, “OpenWells is a user-configurable operations data management tool that integrates directly with well engineering and earth model visualization applications. OpenWells has been engineered to support today’s rapidly expanding real-time drilling environments from shallow, onshore fields to deepwater frontier areas.”


OpenWells is integrated with Landmark drilling and well services applications such as Compass, WellPlan and StressCheck. Visualization leverages other Landmark software including AssetView and 3D DrillView. OpenWells is ‘highly configurable’ and can manage entitlements for data visualization by partners or ‘tight groups’. OpenWells uses Crystal Reports to configure and generate printable or XML-based reports.

Click2learn KM for Baker Hughes

Baker Hughes is to deploy Click2learn’s learning management server for internal training.

Baker Hughes’ INTEQ division (BHI) has chosen Click2learn’s Aspen suite to power its global learning initiative. BHI has used Click2learn’s Learning Management Server (LMS) to create and deliver technical and product information, training and ‘orientation to the company’s core values’ to its employees. The Aspen Learning Content Management Server and the Virtual Classroom (VCS) are under evaluation for future expansion of BHI’s learning solution.


BHI training director Scott Gray said, “Aspen enables us to provide consistent, effective training to our global workforce. In addition to giving our employees access to Web-based courseware, Aspen also allows us to map and track skills and competencies and to assign ‘secondary jobs’ to employees based on their skills.”


BHI anticipates that the deployment of e-learning will reduce travel costs, provide just-in-time training and offer a ‘constant resource’ for revision. Aspen also provides competency tracking and certification for compliance and quality standards. By managing all learning and training information in a single repository, BHI can also accurately report on training, creating an audit trail for every employee.


Click2learn chairman Kevin Oakes added, “Aspen provides BHI with a scalable platform that will deliver a consistent learning experience across the enterprise, creating personalized, specialized learning opportunities for its employees whose skills can be aligned with key business goals and metrics”.

Midland Valley rolls out 4D Vista

Midland Valley’s generic visualization tool is its first 64-bit ‘new technology’ offering.

Midland Valley Exploration (MVE) has just rolled out a new product – 4D Vista – for generic visualization of geological objects. MVE has been rebuilding the software infrastructure underpinning its structural interpretation and modeling tools to benefit from emergent 64-bit hardware (see Oil ITJ Vol. 8 N° 3). The first ‘new technology’ product is described as a new engine and a new window for viewing—4D Vista.


MVE hopes to attract new customers by providing 4D Vista free to all. The tool reads a variety of file formats in addition to those of MVE’s other software tools. DXF, Golden Software grid and raw triangle formats open automatically and ASCII files can be opened with a wizard. Once a document has been loaded, additional data can be inserted into it.

3D Document

Data is visualized in multiple 3D Document View windows. Several views of the same document can be opened synchronously alongside views of other documents. Each view has its own display options. A 3-button or wheel mouse is required to get the full potential of 3D visualization. MVE is looking for partners who would like to attach their own applications to broaden the scope of 4D Vista. In the future, 4D Vista will be extended to provide an ‘all encompassing’ operating environment for MVE’s suite of kinematic geological modeling tools. Download 4D Vista from mve.com.

Iron Mountain buys Hays IMS

Iron Mountain has acquired Hays’ Information Management Services unit in a £200 million deal.

Storage specialists Iron Mountain have acquired Hays’ information management services unit for £200 million. The transaction was jointly funded by Iron Mountain’s US parent group and its European joint-venture with Mentmore plc. Iron Mountain reports that the Hays acquisition adds seven new markets in the UK, Germany, Norway and Belgium.


Iron Mountain chairman Richard Reese said, “The strategic acquisition of Hays IMS doubles the size of our business in Europe and our initial observations reaffirm our enthusiasm for this transaction. This transaction consolidates our position as the leader in records and information management on three continents, allowing us to better serve new and existing customers. This transaction will create opportunities for our employees and value for our shareholders.”


The acquisition comes hot on the heels of a deal between Hays and Schlumberger Information Solutions (SIS – see last month’s Oil IT Journal) which was described as ‘the industry’s first desktop-to-warehouse information management solution’.


Hays IMS reported EBITDA of £19 million and operating profit of £13 million on turnover of £88 million for the financial year ending June 2002. Turnover of £11.3 million came from Hays’ US unit.

Intel—Schlumberger center for Houston

Intel and Schlumberger have set up another competency center to showcase reservoir technology.

Reservoir engineers can now ‘test drive’ the latest versions of Schlumberger’s Eclipse reservoir simulator at the new Intel Energy Competency Center (IECC) located in the Schlumberger Information Solutions (SIS) headquarters in Houston.


Intel energy supremo Dick Bland said, “Intel and Schlumberger are committed to accelerating the availability of best-in-class IT solutions to energy customers worldwide. We offer an affordable, high-performance computing solution to the energy industry.”


The IECC provides an environment where energy industry professionals can evaluate, prior to purchase, the performance and cost benefits of various reservoir simulation scenarios using Schlumberger software, including the Eclipse Parallel reservoir simulator.


SIS president Ihab Toma added, “Validating solutions that have been optimized for Intel-based architectures in this ‘try-before-you-buy’ lets customers determine the system configuration that best meets their specific needs. With Intel we are driving performance by expanding the applications available on 32-bit and 64-bit computing architectures.”


SIS customers can run simulations on Xeon and Itanium 2 workstations and clusters from Hewlett Packard. The Center runs a combination of Microsoft Windows and Linux operating systems. A Schlumberger-Intel research establishment was set up last year in Abingdon, UK (Oil ITJ Vol. 7 N° 10).

SGI for Total’s geophysicists

Total has acquired a 256 processor SGI Altix 3000 for its seismic processing effort.

Total has bought a 256 processor SGI Altix 3000 ‘supercluster’ for its seismic processing department. The new system will complement the company’s existing SGI infrastructure and will be integrated into its SGI CXFS shared filesystem storage area network (SAN).

Itanium 2

The Altix system will be located at Total’s technical center in Pau, France where researchers support Total’s 44 E&P subsidiaries throughout the world. The system comprises 256 Intel Itanium 2 processors running at 900 MHz, with 1TB memory and 16TB of disk storage. Estimated peak performance is quoted at around 1 teraflop (a million million floating-point operations per second).
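The quoted 1 teraflop figure is consistent with a simple peak-throughput calculation (a sketch: the four floating-point operations per clock, from the Itanium 2’s two fused multiply-add units, is our assumption, not a figure from Total or SGI):

```python
# Peak floating-point throughput of the Pau Altix configuration.
processors = 256
clock_hz = 900e6        # 900 MHz Itanium 2
flops_per_cycle = 4     # two FMA units x 2 ops each (assumed)

peak_tflops = processors * clock_hz * flops_per_cycle / 1e12
print(f"{peak_tflops:.2f} TFlop/s peak")
```

The result is a little over 0.9 TFlop/s, which is indeed ‘around 1 teraflop’ as quoted.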


Total’s head of geoscience information systems Arnaud Althabegoity said, “We are continuing to work with SGI, a partner since 1998, because the new Altix 3000 supercluster is by far the most suitable machine for our applications”. Launched in January 2003, the SGI Altix 3000 supercluster deploys an optimized Linux environment capable of scaling to 64 Itanium 2 processors per node and to hundreds of processors in a cluster configuration.


SGI VP Steve Coggins added, “The Altix provides oil and gas companies with an unbeatable combination of powerful high-performance computing technologies in a Linux environment. Total’s purchase underscores SGI’s commitment to developing and bringing to market the tools necessary for the breakthrough achievements of the 21st century.”


SGI recently announced a restructuring plan to cut $40 million in costs with a reduction of approximately 400 positions, or 10% of the company’s workforce. The reduction is intended to preserve customer-facing activities, protect customer investments, and accelerate a return to profitability.

© 1996-2021 The Data Room SARL. All rights reserved.