February 2001

E - Data Room

Landmark’s OpenJournal is to offer electronic data room functionality for users of the Petroleum Place acquisition and divestment website.

Landmark is working with Petroleum Place to provide upstream acquisition and divestment users with electronic data rooms. Petroleum Place provides Internet-enabled products and services, supporting online oil and gas asset evaluation, disposal and enterprise resource planning solutions.


Landmark’s OpenJournal will be the key technology behind the new initiative. The development will enable customers to securely upload Landmark Graphics and GeoGraphix project data to create electronic data rooms on the Petroleum Place website.


John Gibson, Landmark president and CEO, said, “This joint development is one of the first major initiatives to deliver business process efficiencies for oil and gas property acquisition and divestiture that comes as a result of the worldwide strategic alliance between the two companies.” Landmark’s OpenJournal documentation product will enable customers to upload asset information to an electronic data room hosted on Petroleum Place. An OpenJournal template, developed to provide a standard format for listing assets, is said to expedite the information gathering process.


Petroleum Place president Gary Vickers added, “This development will bring significant efficiencies to sellers, and will enable prospective buyers to leverage their analytical expertise and speed up evaluation of properties listed on Petroleum Place.” The added functionality will be released with OpenJournal V 2.0 in March.


To an extent, Halliburton is playing catch-up with Schlumberger in upstream e-business. IndigoPool already offers application service provision of GeoQuest products. But from the content standpoint, Petroleum Place’s website (www.petroleumplace.com) is the daddy of them all, having been in operation since 1995.


Petroleum Place is hedging its bets with regard to e-commerce and still holds traditional and hybrid property auctions through its wholly owned Oil and Gas Asset Clearinghouse unit (see page 11). Last year, Landmark parent company Halliburton acquired a 15% interest in Petroleum Place for $55 million.

Marathon outsources

Marathon is to outsource management of its E&P physical data assets to GeoQuest’s PowerHouse.

Marathon Oil has selected GeoQuest for the management of its North American and international physical E&P data. GeoQuest’s PowerHouse data management solution will team with IHS Energy unit Data Logic Services to provide onsite and warehouse services over a seven-year period.


Marathon’s senior VP of worldwide exploration, Phil Behrman said, “Timely access to accurate, high-quality data is essential to exploration success. GeoQuest has provided a more efficient way for us to store and retrieve data. It will improve productivity and enhance our decision-making.”


GeoQuest’s Jeff Spath added, “The PowerHouse service was designed as a more efficient way for oil and gas companies to manage their E&P data. We are able to impact our customers’ operations by leveraging skilled personnel, best practices, GeoQuest’s proven software and our data management facility to deliver a high-quality, yet cost-effective data management solution to the industry.” More on the PowerHouse in PDM Vol. 4 N° 5 and on www.slb.com.

Bandwidth - will there be enough?

PDM editor Neil McNaughton wonders what will limit the growth of Internet traffic. He concludes that as long as the world has an appetite for voice telephony and commodity traffic like MP3, the future for data intensive usage is good.

This month’s editorial starts with a confession. I am a BBC radio addict. I even listen in bed, in the wee hours with an earphone plugged in. I sometimes wake up in the morning with the thing twisted round my neck. I can admit to this pathetic vice because I met someone recently who does the same thing. Only he uses an infra red headset linked to his satellite box. His wife sometimes wakes up, to find an alien-like beast with red lights flashing on its headset, snoring away beside her.

Terabits per second

This silliness does expose one to some fascinating facts. One night, I heard an Indian telecoms engineer describe a new cable link from India to the Far East, and was amazed to hear that its bandwidth will be a staggering 9 terabits per second. That works out at very nearly 100 petabytes per day! In other words, the stuff pumping in and out of this one undersea cable could fill up the largest tape robotics installation in a few hours. As a friend remarked, how on earth do you manage the stuff coming out of the cable?
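The arithmetic behind that figure is easy to check; a minimal sketch, taking the quoted 9 Tb/s at face value:

```python
# Sanity-check the cable arithmetic: 9 terabits/second expressed as
# petabytes/day (1 terabit = 1e12 bits, 1 petabyte = 1e15 bytes).
TERABIT = 1e12           # bits
PETABYTE = 1e15          # bytes
SECONDS_PER_DAY = 86_400

bits_per_day = 9 * TERABIT * SECONDS_PER_DAY
petabytes_per_day = bits_per_day / 8 / PETABYTE

print(f"{petabytes_per_day:.1f} PB/day")  # 97.2 PB/day - very nearly 100
```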

No ‘management’

After much reflection I realized that such traffic is only possible precisely because it isn’t ‘managed’ at all. To understand this you need to know something of the subtle differences between telephony and data, and just what is involved in their ‘convergence.’ Today, both telecoms and data are as near as damn it digital. OK, you may be unlucky and have some non-digital technology such as a modem on your local loop, but as soon as your call reaches the local exchange, your voice is digitized, and can and will be commingled with data and other voices traveling down wider and wider pipes (such as the one of my ‘dreams’) before being separated out again and served up to your listener.


Modern transport technologies such as ATM allow this mixing of digitized voice and data. Multiplexed phone calls share the same infrastructure as Internet and other data traffic. But while the Internet traffic goes out onto disks chez your ISP, the telephony world is different on two counts. First, telephone traffic is switched, not managed - it is the ultimate in ‘hot’ data (see Van Kuijk on page 7). You talk, someone listens, and nothing is stored anywhere.


Second, telephony has a real-time component that is absent from most data traffic. There is a need for constant end-to-end bandwidth; otherwise you’ll talk, and someone will fall asleep waiting to hear you. Data, on the other hand, is mostly not real time. Email, for instance, can tolerate relatively long delays in transmission. Each category of data transmission or interaction requires a certain amount of bandwidth to operate without unduly frustrating the user.


Infrastructure providers - like the people laying the Indian fiber link - neither know nor care what traffic goes down the tube. Others - telcos, ISPs and governments - fight over that. Bandwidth has become a commodity. You can pay a lot for guaranteed quality of service suitable for a dedicated link - in fact you pay quite a lot for a telephone conversation. Or you can pay a vanishingly small amount for non-urgent bandwidth such as email. The huge price differentials are due to the way the bandwidth is carved up. Figuratively, the cable operator will sell as much as he can to the high added-value telcos. But any gap in transmission - from the drop-off in traffic when America goes to bed to the single pause for breath in a phone call - will be plugged with lower-cost data traffic.

Here at The Data Room we have just been equipped with ADSL and a router, and immediately became involved in higher-bandwidth activities such as online radio (I said I was an addict!) and downloading MP3 files. The question many are asking is - will the Internet hold up under such increasingly intensive use? The answer depends on what your expectations are. At any point in time, the very top end of quality-of-service-assured bandwidth will cost. But in general it is the demand for commodity applications - from voice through MP3 to video - that ensures that your basic IP traffic gets a faster and faster cheap ride by piggy-backing on the bandwidth bulimics.


Claude Shannon

One can hardly talk about communications without recording the death this month of Mr. Bandwidth himself, Claude Shannon, formerly of Bell Labs and MIT. Shannon was one of the first workers in the telecoms industry to realize that all ‘signals’ - a single voice channel, multiplexed voice or color television - can be represented and analyzed as the passage of ones and zeros - ‘bits’ - along the wire. Shannon realized the overlap between telecoms and computing very early on. In fact his expectations of the future impact of technology were boundless.


Alan Turing, on a fishing trip to Bell Labs in 1943, from his base in Bletchley Park, found a kindred spirit. “Shannon wants to feed not just data to a Brain [a computer] but cultural things! He wants to play music to it!” Subsequent IT developments must have seemed rather lackluster to such a visionary. But that didn’t stop him enjoying life after early retirement in 1972. Apart from the occasional lecture and scientific paper, the great man enjoyed himself, inventing motorized pogo-sticks. He also developed a theory of juggling. More from www.juggling.org.

PDM Interview - Steve Decatur, BP

PDM learned about BP’s Virtual Prospect from its inventor, Steve Decatur, exploration manager in BP’s Gulf of Mexico Shelf business unit. Decatur told PDM what deals were on offer behind the Virtual Prospect initiative.

PDM – What was the driver behind Virtual Prospect?

Decatur - A year ago we realized that we had to speed up the exploration process, but we were faced with the problem of limited staff. With conventional means, it would have taken 3-4 years to go through all BP’s GOM Shelf acreage, so we decided to use the net to advertise and push data out to geoscientists.

PDM - How does Virtual Prospect work?

Decatur - Potential venturers apply to BP for the rights to propose or pursue a variety of deals. If they are accepted, they have exclusive rights to work up the acreage for a three-month period. After this time, they have to show what they've got to BP. If BP likes what it sees, the prospector is paid fees according to the deal signed up for (see below). If BP doesn't like it, the prospector gets another two months to find another partner to drill. BP maintains a list of companies who have expressed an interest in such participation, so we can put the prospector in touch with potential investors. If this second way goes ahead, BP keeps a 5% overriding royalty interest in the prospect - with an option to convert this to a working interest in the discovery, if significant.

PDM – If BP does like what they see, what are the terms of your regular deals?

Decatur – We talked to a lot of small companies and independents to find out what sort of deal would attract prospectors and came up with an optional payment schedule where the reward depends on the risk to the prospector. There are various options as follows.

1) - If BP decides to purchase the prospect - we pay the prospector a guaranteed $10,000 fee (assuming that at least there are some maps to justify the work) plus $20,000 for each prospect drilled. For each successful prospect drilled, we pay an additional $20,000.

2) - There is no guarantee, but $50,000 is paid for each prospect drilled with another $50,000 for each discovery.

3) - Again, no guarantee, and no money changes hands for a drillable prospect. But for each discovery $150,000 is payable.

4) – Finally, we have an open option. Prospectors can write in and offer other deals.
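The fee arithmetic in options 1-3 is simple enough to tabulate. The sketch below illustrates the published figures only; it is not BP’s actual contract language, and option 4 is negotiated case by case:

```python
# Hypothetical payout calculator for the three fixed Virtual Prospect deal
# options described above. Figures are the ones quoted in the interview.
def payout(option, prospects_drilled, discoveries):
    """Return the total fee paid to the prospector, in dollars."""
    if option == 1:
        # $10,000 guaranteed, $20,000 per prospect drilled,
        # plus a further $20,000 for each successful prospect.
        return 10_000 + 20_000 * prospects_drilled + 20_000 * discoveries
    if option == 2:
        # No guarantee: $50,000 per prospect drilled, $50,000 per discovery.
        return 50_000 * prospects_drilled + 50_000 * discoveries
    if option == 3:
        # No guarantee, nothing for a drillable prospect, $150,000 per discovery.
        return 150_000 * discoveries
    raise ValueError("options 1-3 only; option 4 is an open offer")

# Two prospects drilled, one of them a discovery, under each option:
for opt in (1, 2, 3):
    print(opt, payout(opt, prospects_drilled=2, discoveries=1))
```

As the interview notes, the reward rises with the risk the prospector takes on: the no-guarantee options pay far more per success.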

PDM – Virtual Prospect seems to have begun in a very discreet manner.

Decatur – Indeed, we went live on the 22nd November 2000, with an invitation to apply posted on the IndigoPool website. No other advertising was used. Forty companies trawled the site and had a look at what was on offer - a pilot of producing GOM Shelf properties. We received expressions of interest from a number of very well qualified prospecting companies. Five were selected and awarded different deals as above (all deal categories were filled - two took option 2). They are all currently working on the data.

PDM – So prospectors don’t actually access data on the website.

Decatur – Not at the moment, but IndigoPool is planning to offer companies the use of interpretation tools over the web. So in the future you will be able to work from your mountain retreat.

PDM – Is this acreage ‘surplus to requirements’ - are these second grade plays?

Decatur - They are not BP’s ‘core’ properties, but they are not bad - they just haven't been looked at in a while. We did not want to put poor acreage out there; we truly want to generate prospects. We would have worked on them later. One block includes the second largest field on the GOM Shelf, with 300 million cu. ft./day production. The GOM Shelf is prime exploration acreage for BP.

PDM - What are the economics of this as opposed to hiring a few more geoscientists?

Decatur - Actually the cost of doing it either way is insignificant compared to the cost of drilling a well.

PDM - So why don't you take on more people or use consultants?

Decatur - We don't like to take on more people because of the cyclical nature of the industry, and the possibility that we might have to lay them off. As for consultants, they do not have the same incentives as under this kind of deal. This is the real carrot.

PDM - What about the risk equation for the consultants?

Decatur - Consultants actually have less risk than they might have otherwise. They do not need to assemble the data - or get the land together. It is a great opportunity for retirees, some of whom may have 20+ years of proven experience.

PDM – Are other parts of BP getting involved?

Decatur - Yes in Alaska and the North Sea.

PDM – And what does the regulator – the MMS - think about this activity?

Decatur – The MMS has not commented so far. We have been careful to keep them informed as well as the seismic contractors and data brokers. We have arranged for a spec license fee to be paid in the event of a discovery. All prospectors sign confidentiality agreements.

PDM - Historically, many majors have neglected farm out opportunities and sometimes seem to prefer to hand their acreage back to the government. This proactive stance is quite a change.

Decatur - It really is an attractive business model for us. A new proximal discovery lets us defer abandonment costs and reduces ongoing, per-barrel operating costs. I am personally very impressed by the reception this has got from BP - it could go worldwide.

PetroVision for Bapco

CGG announces new PetroVision contracts in Bahrain and Russia.

Following its vigorous rebuttal of rumors of PetroVision’s demise last December (PDM Vol. 5 N° 12), CGG has claimed further successes for its data management system. In addition to its four existing West Siberian contracts with Lukoil, CGG has now added a fifth Russian-language databank contract, with Lukoil’s Lukbur unit based in Kogalym. CGG has also been successful with state-run Russian companies, with one databank contract from Yuzhmorgeologia to manage all Russian offshore seismic data. Another contract was awarded for a regional databank for Khantymansiysk Okrug, Russia’s main oil producing region.


PetroVision was also selected as core technology by Bahrain state oil company Bapco. Part of a larger reservoir services contract won last year, the PetroVision E&P data management services will integrate all well and seismic data acquired on Bahraini territory. CGG’s Jean-Francois Arzel said, “CGG continues to work steadily with its valued clients to address their problems, including those relating specifically to data management.”

New GeoGraphix release

The latest release of the GeoGraphix Discovery suite now implements a true shared earth model.

Landmark unit GeoGraphix has upgraded its PC-based interpretation suite Discovery to incorporate a shared database for the entire GeoGraphix suite of products. The 2000.3 Discovery release incorporates new tools and templates to store measured and interpreted data in a common location, accessible by any geologist, geophysicist or petroleum engineer working on a project. Discovery also integrates LogM geophysical modeling tools. Other additions include thematic mapping, a new petrophysical math engine, 3-D autopicking and enhanced volumetrics.


GeoGraphix VP Rick Slack said, “The substantial depth of this release is an example of Landmark’s commitment to our development of best of breed PC-based interpretation solutions for the mainstream geoscientist. The new Discovery release is an example of how our data integration activities, coupled with our adherence to Microsoft’s solution development, allows us to build and test quality, robust software in shorter time frames.”

Petrel’s new functions

Technoguide’s Petrel is positioned as the geotechnical equivalent of Microsoft Office.

Norwegian Technoguide has released a new version of Petrel, its Windows-based reservoir modeling suite. Described as a multi-disciplinary tool for interactive 3D modeling, Petrel claims over 100 client companies and approximately 200 users worldwide.


Technoguide’s Paal Hovdenak told PDM “Petrel is the ‘Office’ of the geotechnical world and is the everyday tool for all members of the asset team. Petrel’s ease of use allows all users to view data through the same interface.” The new 3.2 release expands functionality to include interactive production and simulation data analysis and plotting and seismic depth conversion. Satellite images can be imported and draped over structural maps.

Voxel rendering

In the next release scheduled for later this year, Petrel will include powerful voxel rendering, leveraging the price performance of off-the-shelf PC graphics cards such as the Elsa Erazor. More from www.technoguide.no.

Hybrid auction success at NAPE

The Oil & Gas Asset Clearinghouse hosted a successful ‘hybrid’ auction at the North American Prospect Expo.

Petroleum Place unit the Oil & Gas Asset Clearinghouse held a successful ‘hybrid’ auction of oil and gas properties at the North American Prospect Expo 2001 (NAPE) early this month.


The hybrid auction integrates simultaneous floor and Internet bidding during a live auction. At the NAPE show, 371 lots were on offer, attracting over 500 bidders. Internet bidders accounted for more than 16% of total registered bidders and competed on over 80% of the lots.


NAPE attendees followed the bidding through a live video broadcast, while an Internet browser tracked both Internet and floor bids, letting registered bidders and viewers monitor the progress of the sale and observe the dynamics of the auction.


Ken Olive, president and CEO of The Oil & Gas Asset Clearinghouse, commented, “This was a unique opportunity for the industry to observe the effectiveness of a hybrid live floor - Internet auction. We are extremely pleased with the results of the auction, and commend NAPE for hosting such an incredibly successful Expo with record attendance.”

How much is sector IT worth?

The engineering and construction market will be worth $5 trillion by 2004, according to a Daratech study. A mere $1 billion will be spent on IT.

A frequently asked question from analysts and writers of business plans is ‘how much is the whole sector worth?’ A study by Cambridge, MA-based Daratech suggests that for the engineering, construction and operations (ECO) market, the overall information technology spend is a remarkably small proportion of revenues. Daratech estimates the overall size of the ECO market at around $3.6 trillion in 2000. The need for investment in IT support for internal and semi-public collaboration sites created a $500 million market in 2000 for generalists such as i2 Technologies, Ariba, Inc. and Commerce One, as well as vertical technology suppliers such as Industria Solutions, Bricsnet, Buzzsaw.com, Citadon, e-Builder, Primavera and Bentley.

$1 billion

Daratech estimates that this market will grow to around $1 billion by 2004. Rapid technological and economic change has led to a new breed of technology vendor - providing the enabling technology for ‘web-hosted, standards-based, component-level design and project collaboration.’ The traditional dispersed model of asset creation is giving way to a more integrated model ‘in which a network of companies and individuals pools their economic and intellectual resources to maximize profit over an asset’s life.’ Daratech projects that the global ECO economy will grow to $5 trillion by 2004, over 10% of the world’s GDP.

I/O’s VectorSeis for Veritas

Veritas is to deploy Input Output’s new multi-component geophone on its North American land crews.

Veritas DGC Inc. and Input Output (I/O) are to cooperate on the commercialization of I/O’s VectorSeis digital sensor. The new geophone, revealed at the SEG last August (see PDM Vol. 5 N° 8) is a 3-component digital accelerometer. Veritas will deploy the new technology on its North American land seismic crews. It is claimed that the new phone has the potential to improve subsurface imaging and interpretation of key reservoir attributes.

LaFleur

Rob LaFleur, senior VP of Veritas’ land division, said, “The collaboration with I/O provides an affordable solution for the acquisition and processing of multi-component land seismic data. The benefit to our customers provided by this revolutionary new sensor technology will be of immense value in helping to understand the reservoir.”


Bud Pope, vice president of I/O’s land division, added, “Our field tests over the past 18 months have provided encouraging results for substantially enhanced seismic image quality. The alliance with Veritas will accelerate our commercialization efforts over the next year.” More from www.i-o.com.

Accenture in Upstream

Accenture and Novistar are to collaborate on e-commerce business integration services.

Accenture (formerly Andersen Consulting) has teamed with Novistar to leverage e-Commerce efficiencies to reduce E&P costs. Novistar was formed by the combination of Oracle’s vertical business in energy, Oracle Energy, and Torch Energy Advisors Incorporated, an early customer of Oracle Energy.


Accenture will provide e-Commerce business integration services to enhance Novistar’s software applications and service delivery. This relationship will facilitate complete coverage of the U.S. upstream petroleum industry from planning through consolidated financial reporting, for large or small, domestic or multinational companies.


Accenture partner Ken Theut said “Novistar’s software and client support deliver value to the U.S. upstream market. Its business process outsourcing model offers affordable access to state-of-the-art upstream software by reducing application deployment and maintenance costs.”


Novistar president and CEO, Trip Ray added, “Novistar’s legacy lies in its products and services, which reduce the costs of conducting business in the upstream. When you combine our product with Accenture’s industry reach and delivery knowledge, you create an alliance that empowers customers with the enhanced capabilities to make more informed business decisions. Together we are the solution provider of choice for the upstream market.” More from www.novistar.com.

SMi E&P Data and Information Management

SMi’s fourth annual conference was re-titled “Data and Information Management” this year to reflect expanded coverage of document, unstructured data and information management. Attendance was up significantly, from 55 last year to 80, with attendees from South Africa, Brunei, North America, Norway, France and Germany. The data management debate has advanced, both in the direction of lightweight data management and in terms of more heavyweight corporate data management - where the true cost (and value) of doing it right has been recognized, at least by Shell’s Expro unit.

Exprodat director Bruce Rodney, speaking of Enterprise Oil’s Global E&P Information Portal, set the scene with the observation that data management has proved an intractable problem. People want everything, but still ‘can’t find stuff.’ Rodney breaks data management into light and heavy components - and distributes these tasks to the front and back ‘offices.’

Back Office

The back office provides the long-term corporate memory; the front office, what is needed to make a decision. Such decoupling allows the problems to be solved separately. The front office exposes information through a web/GIS paradigm and offers a return path to the archive. The goal is to “find it fast and capture the value.” Enterprise’s Regional Access to Portal Information and Data (RAPID) is an ArcIMS-based development utilizing Exprodat’s Project Documentor (see PDM Vol. 5 N° 8).

The Data Room

PDM Editor Neil McNaughton proposed a scheme for integrating vertical (E&P) applications with horizontal tools such as office automation, GIS and document management. Today, web-driven corporate IT is making new demands on domain-specific applications. Corporate-wide document management systems and web portals require a much larger vision of what IT is trying to achieve. Current thinking is that techniques such as APIs and middleware (COM and CORBA) have their place in departmental-level and domain-specific computing.


But as IT scope expands, these tightly coupled techniques necessitate an unrealizable IT schema (the ‘uber-schema’) of the whole enterprise. Modern integration centers on three concepts - limiting application coupling, sharing metadata and XML-based messaging. A restrained amount of data sharing leads to the concept of the Corporate Metadata Store. Although it is still early days, the hyper-portability of XML promises much for E&P data replication, sharing and exchange.
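To give a flavor of what loosely coupled, XML-based messaging looks like in practice, here is a minimal sketch using Python’s standard library. The tag names are invented for illustration only; real E&P exchange formats define their own schemas:

```python
import xml.etree.ElementTree as ET

# Build a minimal, hypothetical well-header message. Because the payload is
# plain XML text, sender and receiver share only the agreed tag names -
# no common schema code, APIs or middleware.
msg = ET.Element("wellHeader")
ET.SubElement(msg, "wellName").text = "Example-1"
ET.SubElement(msg, "operator").text = "ExampleOil"
ET.SubElement(msg, "totalDepth", units="m").text = "3250"

xml_text = ET.tostring(msg, encoding="unicode")
print(xml_text)

# The receiving application simply parses the text back:
parsed = ET.fromstring(xml_text)
print(parsed.findtext("operator"), parsed.find("totalDepth").get("units"))
```

The point of the sketch is the decoupling: neither side links against the other’s code, which is exactly what makes the ‘uber-schema’ unnecessary.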


GeoQuest’s Bruce Sketchley reckons that this year some 2,000 terabytes of seismic data will have to be ‘managed.’ This includes all processed, in-house, actively managed data but excludes field data. The exponential growth of seismic data volumes is dubbed the seismic ‘tsunami.’ To avoid being swept away, Schlumberger advocates ‘virtual services’ for data management. In Aberdeen, for instance, BP has an in-house ‘window’ to off-site hosted applications. Schlumberger’s data centers in Stavanger, London, Aberdeen and Paris will allow for similar offerings across the EU, while high bandwidth will facilitate remote access from Milan, Pau and Madrid. Business models for the new ASP-enabled software and data management are still ‘work in progress.’

Norsk Hydro

Duncan McKay, now with Conoco, was mandated by Norsk Hydro last year to ‘sell himself,’ along with the rest of the recently acquired North Sea assets of Saga Petroleum. A virtual data room, with a scanned and indexed set of Saga’s North Sea library, was opened on 5th May 2000. Conoco signed on the 19th July 2000 and it was a done deal by 1st August. All paper data was bar-coded and scanned onto a set of CDs. Even electronic data was printed, scanned and output to TIF. Though counterintuitive, this was considered a pragmatic solution.

The ‘Big List’

Saga’s documents were catalogued into one big flat file (the ‘Big List’) and Alchemy software was used to access the files. Two data rooms were built, each with a Sun Enterprise Server and 10 Centra 2000 workstations. At the end of the project, 4 1/2 tonnes of paper were shredded.


The scanning and infrastructure contractor for Hydro’s project was Spectrum, whose Richard Stowe followed up with other IM projects. One involved the OCR of 2,800 scanned geotechnical reports (500,000 pages), with output to Alchemy CDs; another was an Open Text LiveLink installation, hooked into the company’s Finder (GeoQuest) database. Stowe considers that take-up of electronic document management (EDM) has been slow, but that the leverage gained from web access to managed documents is changing this. Questioned on the cost of such operations in terms of storage space savings, Stowe replied that the cost of lost opportunities far outweighed such considerations.

£6 million man!

Erik van Kuijk, head of subsurface data management with Shell’s Expro North Sea unit, is going to be a popular man. He has convinced his bosses that data management really is important, and has obtained £6 million funding for 19 projects. These, developed with help from Flare consultants, set out to provide a firm infrastructure for Expro’s Knowledge, Information and Data (KID) environment. The projects are pragmatic: one concerns rationalizing naming standards, of which Shell currently has seven!

Too hot to handle

Van Kuijk offered a new slant on data management, introducing the concept of data ‘entropy.’ Data can be more or less static - with ‘cold,’ unchanging data residing in the corporate data store and ‘hot’ data deployed in projects. Van Kuijk suggests that project data can be too hot to handle - or rather to manage - and this should be recognized. Also, continuing the thermodynamic analogy, over time even cold data will tend toward an anarchic state; it deteriorates and requires ‘energy’ input to maintain its quality.

Publish or freeze!

Data types fall naturally into more or less hot states, but the degree of short-term activity can also vary - thus exploration tends to generate a very ‘hot’ state, while production is somewhat cooler. Publishing is defined as making project-level deliverables available at corporate level as new reference data. The cost of publishing - the energy required to move from hot to cold - depends on the ‘entropy’ of the activity. This process involves cooling data down by, for instance, ‘adding metadata.’ It costs more to publish exploration data than production data.
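The analogy can be caricatured in a few lines. The ‘temperatures’ and base cost below are invented numbers, purely to illustrate the idea that the hotter the activity, the more ‘energy’ publishing its data costs:

```python
# Toy model of Van Kuijk's thermodynamic analogy: publishing means cooling
# project data down into corporate reference data. All figures are
# illustrative, not from the talk.
ACTIVITY_TEMPERATURE = {"exploration": 90, "production": 40}

def publishing_cost(activity, base_cost=100):
    """Relative cost of cooling project deliverables into the corporate store."""
    return base_cost + ACTIVITY_TEMPERATURE[activity]

print(publishing_cost("exploration"))  # 190
print(publishing_cost("production"))   # 140
```

Whatever the numbers, the ordering is the point: exploration data is hotter than production data, so it costs more to publish.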

Expert System

A second Enterprise project, the DART acreage evaluation system, was presented by Exprodat’s Gareth Smith. The expert system, for quick-look evaluation of exploration opportunities in the North Sea, leverages Enterprise’s corporate database. This is based on Open Explorer and other information management tools such as OpenJournal and Exprodat’s own Project Documentor. DART offers ArcView GIS-based data access and retrieval of data, which is then piped to Microsoft Access, where business rules can be applied. The system talks to other databases such as Arthur Andersen, Asset Geoscience (Target), DTI, LIFT, UKOOA and Enterprise’s prospect inventory. Business rules involve a scoring system - blocks with 3D seismics score high, as do those near facilities.
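A scoring rule of the kind described can be sketched in a few lines. The weights and distance cut-offs below are invented for illustration; DART’s actual business rules were not disclosed:

```python
# Hypothetical quick-look block scoring in the spirit of the DART rules
# described above: 3D seismic coverage scores high, as does proximity
# to existing facilities. Weights and cut-offs are made up.
def score_block(has_3d_seismic, km_to_nearest_facility):
    score = 0
    if has_3d_seismic:
        score += 50                  # 3D coverage is a strong positive
    if km_to_nearest_facility < 10:
        score += 30                  # easy tie-back to infrastructure
    elif km_to_nearest_facility < 25:
        score += 15
    return score

print(score_block(True, 5))     # 80: 3D coverage and close to facilities
print(score_block(False, 40))   # 0: no 3D, remote
```

Rules this simple are easily expressed in a desktop database like Access, which is presumably why DART pipes its GIS selections there.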


Common Data Access CEO Malcolm Fleming provided an update on the DEAL project’s status. Currently, seven data vendors display their products on the DEAL website, with the vast majority (26,000 products) coming from the IHS Energy stable. DEAL has 400 registered users, including around 100 oils, plus 440 ‘anonymous’ users. There have been some 2,300 logins over the last four months, with 420 in January 2001. Registered users hail from 17 countries, 90% outside the UK.


Guy Gueritz, EAME marketing manager with Silicon Graphics (SGI), offered a tally of worldwide Reality Centers. Heavy-duty users (with four centers apiece) include ExxonMobil, PDVSA and Halliburton, with around another 50 centers at locations throughout the world.

Virtual Insight

Gueritz introduced SGI’s ‘Virtual Insight’ described as a combination of Information Management, Application Service Provision, and a data warehouse. Virtual Insight has been used during reservoir modeling – to store metadata and processing parameters from pre-processing through modeling to post-processing. Virtual Insight is a ‘process-oriented data management solution’ and allows visualization users to ‘keep track and back track’. Virtual Insight runs under Magic Earth and with Landmark’s visualization software and deploys its own independent Oracle database.


Hampton Data Services consultant Robert Casalis de Pury believes that new technology on the horizon will revolutionize the way we search for data. While the web paradigm allows many data types to be visible and linked to data in the corporate data store, it has its limitations. Hampton believes that peer-to-peer storage à la Napster is about to revolutionize information access. The idea is that spare storage and compute cycles are an untapped resource in the organization, and could make up a huge, distributed storage and search engine.

Next Page

Hampton deploys software from Next Page to achieve this. Described as an e-content platform, NXT3 is modular middleware offering distributed text processing. NXT3 was originally developed for the publishing industry. Content is ‘syndicated’ to other peer servers so that users ‘see’ it locally. In reality, content may be replicated or just indexed, depending on performance requirements. NXT3 can link to existing structured and unstructured data and document repositories. The software also provides entitlement management and triggers to push reports out daily, weekly, etc. NXT3 can work with Access, Oracle, SQL, Word, PDF, Excel and more. An XML-based content network protocol and control network adaptors are deployed. De Pury noted that ‘peer-to-peer’ could mean contractor to client.
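The federated-search idea behind such syndication can be illustrated with a toy example: each peer holds a local index, a query fans out to all peers, and the hits are merged. This is a sketch of the concept only – peer names and documents are invented, and this is not NextPage’s actual API:

```python
# Minimal sketch of peer-to-peer federated search: each peer indexes its
# own content locally; a query is fanned out and results merged, so users
# 'see' remote content as if it were local.

class Peer:
    def __init__(self, name, documents):
        self.name = name
        # local word -> set-of-titles index built from {title: text}
        self.index = {}
        for title, text in documents.items():
            for word in text.lower().split():
                self.index.setdefault(word, set()).add(title)

    def search(self, term):
        return self.index.get(term.lower(), set())

def federated_search(peers, term):
    """Fan the query out to every peer and merge the hits,
    tagging each hit with the peer it came from."""
    hits = []
    for peer in peers:
        for title in peer.search(term):
            hits.append((peer.name, title))
    return sorted(hits)

# 'Peer to peer' could mean contractor to client, as de Pury noted
peers = [
    Peer("contractor", {"Survey A": "marine seismic acquisition report"}),
    Peer("client",     {"Block 9": "seismic interpretation notes"}),
]
print(federated_search(peers, "seismic"))
# → [('client', 'Block 9'), ('contractor', 'Survey A')]
```

In a real deployment the choice between replicating content and merely indexing it (as NXT3 offers) trades storage and freshness against query latency.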

Service Company financials, year end 2000

The long-awaited upturn seems to have arrived in the last months of 2000. Corelab posted record revenues and TGS-Nopec reported “extraordinary” data library sales in the fourth quarter. CGG returned to profitability and PGS 2000 revenues were up 15% on the year. In general, the prognosis for 2001 is good.


For the full year 2000, Core reported $336 million in revenues, a record for the company, and earned $0.58 per fully diluted share, a 29% increase over full-year 1999 operating totals. Net income for the year reached $19,152,000, the second highest total in company history.

Late demand

Corelab benefited from mid-to-late quarter demand increases for its reservoir optimization services and technologies related to reservoir description and production enhancement. This was especially the case from multinationals, large independents and national oil companies on crude oil developments outside of North America. Services provided on deepwater field development worldwide also remained in strong demand. Many of these reservoir optimization projects are long term and are continuing into the first quarter of 2001.


David Demshur, president and CEO said “The long-awaited increase in international oil-related projects was finally evident in our revenue streams from mid-November on. This trend has continued into the first quarter of 2001, and we currently are projecting first quarter revenues at the high end or above current analysts’ estimates. The year 2001 will be another record performance for Core Laboratories.”


Norwegian TGS-Nopec’s gross consolidated revenues were NOK 817.7 million, up 32% from NOK 617.3 million in 1999. Earnings per share were NOK 8.85 (NOK 8.45 fully diluted) for 2000, compared to NOK 4.97 in 1999. EBITDA* from operations of NOK 622.6 million was 80% of net revenues, up 43% from last year (NOK 436.5 million), and pre-tax profit was NOK 330.4 million, representing 43% of net revenues and an increase of 71% from NOK 193.1 million in 1999.
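As a quick cross-check, the quoted growth figures are mutually consistent. Net revenues are not stated directly; in the sketch below they are inferred from the statement that EBITDA of NOK 622.6 million was 80% of net revenues:

```python
# Cross-check of the TGS-Nopec figures quoted above (all NOK millions).
ebitda_2000, ebitda_1999 = 622.6, 436.5
pretax_2000, pretax_1999 = 330.4, 193.1
gross_2000, gross_1999 = 817.7, 617.3

# Net revenues inferred from the stated 80% EBITDA margin
net_revenues = ebitda_2000 / 0.80                    # ≈ 778.25

print(round((gross_2000 / gross_1999 - 1) * 100))    # gross revenue growth → 32
print(round((ebitda_2000 / ebitda_1999 - 1) * 100))  # EBITDA growth → 43
print(round((pretax_2000 / pretax_1999 - 1) * 100))  # pre-tax growth → 71
print(round(pretax_2000 / net_revenues * 100))       # pre-tax margin ≈ 42, vs. 43% quoted
```

The small discrepancy on the pre-tax margin (42% computed vs. 43% quoted) is a rounding difference, not an inconsistency in the accounts.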


Gulf of Mexico sales were particularly strong during the fourth quarter, followed by Brazil, Europe, and Africa. Quarterly net revenues were split roughly 79%-21% between 2D and 3D respectively. For the full year 2000, geographic distribution of net revenues was approximately as follows: 45% Gulf of Mexico, 30% other Western Hemisphere (Brazil and Canada), 17% Europe, and 8% Africa & Asia/Pacific.


CEO Hank Hamilton said “The extraordinary data library sales achieved in the fourth quarter result from a combination of factors including previous oil company under-spending of 2000 budget funds, a realization that prospect inventories will now start to decline, preparation for upcoming licensing rounds, the excellent timing and placement of our newest projects, and a terrific effort from our marketing staff. We are very comfortable with the stated net book value of our library. The sales momentum we have established gives us great confidence in growing our Multi-Client investments.”


The CGG Group returned to profitability in the second half of 2000. Unaudited total revenues for the year 2000 are estimated at Euros 696.5 million (US$ 635.9 million), up 37% compared to 1999 (Euros 506.7 million or US$ 542.0 million). These figures reflect an upturn in the seismic services and equipment markets, materialized for CGG in particular by a strong last quarter. CGG expects a net profit for the second half of 2000 in the range of Euros 10 million, marking the Group’s return to profitability after four consecutive half-years of losses due to the ‘harsh conditions’ of the seismic market. Excluding non-recurring items, in particular the gain resulting from the sale of Flagship to Paradigm, this net result would have been at break-even level.

Marine strong

The year was characterized by the good performance of the marine and multi-client activities in a market which remained extremely competitive. Problems were encountered in land acquisition, resulting in particular from low volumes and strong pressure on margins.

Petroleum Geo-Services

PGS annual revenue increased 15% over 1999, while fourth quarter revenue decreased 5%. Year-to-date operating profit margin (before unusual items) remained consistent with the comparable 1999 margin at 16%. Year-to-date and fourth quarter diluted earnings per share (before unusual items) were $0.38 and $0.01, respectively.


PGS recorded pre-tax charges totaling $365.8 million related to asset impairments, including the data library, and accrued contract losses. The sale of the Spinnaker Exploration Company brought net proceeds of $150.5 million and a pre-tax gain of $54.7 million. The agreement with Halliburton to sell the Petrobank data management business is expected to close before the end of February and will result in a gain of approximately $145 million.


Chairman and CEO Reidar Michaelsen said “Going forward, we remain committed to, and expect to expand our market-share in the emerging reservoir characterization and monitoring business. Now with renewed focus, we will work to balance these commitments to improve our financial and operational performance.” PGS continues to expect that higher oil and gas prices will prompt greater oil and gas company spending on the application of enhanced production techniques - such as PGS’ PetroTrac suite of advanced seismic tools - to existing reservoirs. Higher prices, as well as declining oil company reserves, should lead to increased exploration [in areas where] PGS has focused its multi-client activity.

*EBITDA - Earnings before interest, tax, depreciation and amortization.

Shell spins off Kalido

Shell Services has spun off Kalido, its downstream data mining software, into a new company.

Shell Services International (SSI) is to spin off its performance management and data warehousing software environment into a new company, Kalido Ltd. The Kalido data warehouse was initially conceived in 1996 within the IT services arm of the Royal Dutch/Shell Group of Companies (Shell) to address the data management issues faced by Shell’s diverse businesses.

100 sites

Shell continues to be a key strategic investor in the spun-off unit. Kalido, now in its sixth release, is installed at over 100 sites in over 85 different countries. Deployed as a dynamically linked federation of warehouses connecting a company’s global and regional organizations, Kalido delivers both unified and multiple views of the business. Kalido allows users to generate reports on business-critical information even if the data formats are inconsistent. It also allows analysis to continue during business change, such as expansion into new geographical locations, company re-organizations or mergers and acquisitions.
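The ‘unified view over inconsistent data formats’ idea can be illustrated with a toy example. The regional record shapes, field mappings and exchange rate below are invented for the sketch and bear no relation to Kalido’s actual (far richer) model:

```python
# Toy illustration of a unified view over inconsistently formatted
# regional feeds: each region's records arrive in a different shape and
# currency; a per-region mapping normalizes them into one global schema.

regional_feeds = {
    "europe":   [{"prod": "Lubricants", "rev_eur": 120}],
    "americas": [{"product": "Lubricants", "revenue_usd": 200}],
}

def normalize(region, record):
    """Map a regional record into the global schema (rate is illustrative)."""
    if region == "europe":
        return {"product": record["prod"], "revenue_usd": record["rev_eur"] * 0.92}
    return {"product": record["product"], "revenue_usd": record["revenue_usd"]}

def unified_view(feeds):
    """Aggregate all regions into one global revenue-by-product view."""
    totals = {}
    for region, records in feeds.items():
        for rec in records:
            row = normalize(region, rec)
            totals[row["product"]] = totals.get(row["product"], 0) + row["revenue_usd"]
    return totals

print(unified_view(regional_feeds))   # one global figure per product
```

The key design point, and the one the Kalido pitch turns on, is that the regional feeds keep their own formats: only the mapping layer changes when a region is added or reorganized, so analysis can continue through business change.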


Kalido CEO Andy Hayler said, “To successfully capitalize on strategic business opportunities in today’s competitive climate, organizations need their vital business and financial performance information on tap. Kalido allows our customers to maintain a global view of their business without affecting the way the local or regional arms of their organization conduct business and manage their information.”


Kalido was presented at a POSC member meeting in 1999, where SSI’s Linda Hickman traced the product’s evolution from Shell’s internally-developed “Genie” software into SSI’s data mining flagship. Hickman described the technology as ‘next generation’ data modeling – using star schema data cubes and OLAP. Although the product was developed well before e-business was in vogue, it has turned out to be “a great facilitator.” Most Kalido use has been in the downstream sector, but Shell has used the tool for upstream key performance indicators. Kalido was originally implemented in Platinum, a product now subsumed into Computer Associates’ ERwin. More from www.kalido.com.

Terrasciences & GeoFrame

TerraStation can now interoperate with Schlumberger’s GeoFrame through a new product, ‘TerraFrame.’

Terrasciences’ TerraStation II will soon integrate with Schlumberger’s GeoFrame environment via a new product, TerraFrame. Terrasciences developer Keith Kempton explained, “TerraFrame is a TerraStation that stores its data in a hybrid database. TerraFrame can be used in a hybrid mode, with log curves and deviation surveys resident in GeoFrame directly.


TerraFrame can also be used as a standard TerraStation, with tools for transferring data into and out of existing GeoFrame databases.” The initial version of TerraFrame will run on Solaris. The system is in the final stages of beta testing. A GeoFrame 4.0 compatible version is also under development. Kerr-McGee Corp. assisted with the development of this capability.

Granite Rock assets for Troy-Ikoda

Troy-Ikoda has acquired FaultMagic from Just2Clicks, parent group of Granite Rock.

Troy-Ikoda Group (TIG) has acquired the assets of Aberdeen-based consultancy Granite Rock from parent group Just2Clicks.com (now “J2C”). The deal includes Granite Rock’s FaultMagic software, trademarks, source code, and sundry Internet intellectual property.


Troy Ikoda managing director Martyn Millwood Hargrave said, “The acquisition of FaultMagic will bolster and enhance our software sales and service offerings in high fidelity 3D reservoir modeling.


FaultMagic’s robust methodology and successful case histories complement our FaultFinder 3D application perfectly. The time is ripe for the industry to apply new high-productivity 3D/4D search engines and predictive technologies to existing fields and exploration problems.”


Trade-Ranger picks Commerce One

Trade-Ranger is to use core e-market technology from Commerce One.

Trade-Ranger is to license Commerce One’s MarketSite e-marketplace infrastructure as core technology for its energy and petrochemical e-marketplace. Interim co-CEOs Peter Lamell and Allen May of Trade-Ranger said [simultaneously!] “Commerce One is key to helping us achieve a successful, liquid marketplace. The technology will let us develop additional marketplace functionalities and value-added services.” Trade-Ranger began processing catalog transactions with Commerce One e-marketplace solutions last year.

ONGC inaugurates geophysical facility

India’s new processing center in the Himalayan foothills is the largest computer installation in the country.

Chairman and MD Bikash Chandra Bora inaugurated ONGC’s new processing center at Dehradun, northern India, this month.


The center, built under contract by Paradigm Geophysical at a cost of $10 million, deploys Paradigm’s software running on hardware components from Silicon Graphics and IBM. Bora said “We are privileged to have witnessed a milestone in the history of ONGC.

Origin 2000

The installation of the SGI Origin 2000, with processing, interpretation, and reservoir characterization software from Paradigm Geophysical, marks a new chapter in the continuing saga of seismic data analysis.”

Problem data

ONGC scientists have already identified a new prospect in one of its existing fields using the new installation. A first target for the new software will be improving subsurface imaging of problem data in the hilly terrain of Assam and Tripura.

PetroCosm teams with NetworkOil and Petrobras

PetroCosm and NetworkOil are to partner on e-commerce, as PetroCosm sets up a new Brazilian venture with Petrobras.

PetroCosm has selected NetworkOil as exclusive provider of oilfield surplus equipment acquisition and sales services to PetroCosm in the USA and Canada. PetroCosm will feature NetworkOil on its website, allowing member companies to link directly to NetworkOil through PetroCosm’s marketplace.


PetroCosm president and CEO Norman Chambers said “This alliance will accelerate our ability to offer a full range of services beyond procurement and process integration.” NetworkOil’s CEO Stuart Page added “This agreement increases the NetworkOil asset base and gives our members wider access to quality surplus equipment.”


In a separate agreement, PetroCosm has entered a joint venture with Petrobras to create a B2B e-commerce marketplace for the Brazilian oil and gas industry.


Petrobras Materials Executive General Manager Joao Carlos Soares Nunes said “Our procurement process and supply chain management will be enhanced by PetroCosm’s Marketplace solution, while allowing us to reduce our overall operating expense.”


This month’s movers hail from POSC, Trade-Ranger and Common Data Access.

The POSC Board has elected Nico de Rooij (Shell) as Chairman, replacing John Hanten (Chevron). Jim Honefenger (GeoNet) is Vice-Chairman, replacing joint Vice-Chairs Nico de Rooij and Bertrand du Castel (Schlumberger).


Trade-Ranger has appointed Claire Farley as CEO. Farley formerly held high-level positions within Texaco and most recently was CEO of Intelligent Diagnostics, Inc., a medical software developer. In 1998, Farley was named one of the top 50 women executives in the US.


Robert Archibald is the new chair of UK data repository Common Data Access, replacing Phil Challis. Archibald is currently Manager of Subsurface Software Support and Data Management at Shell Exploration & Production in Aberdeen.

AGI teams with NeoTech

AGI and NeoTech are to jointly develop borehole stability software.

AGI and Neotechnology Consultants are collaborating in the development of new software to link borehole stability and wellbore hydraulics.

Underbalanced

Neotechnology’s Wellflo software will be linked to AGI’s StabView to supply underbalanced drilling operations with more accurate hole size data and improved annular pressure and flow rate predictions while drilling. Wellflo has been in use for over 25 years. More from www.neotec.com.

Accenture & SAP develop e-commerce

Accenture and SAP are renewing their eight-year oil and gas industry collaboration.

Andersen Consulting (now Accenture) and SAP first joined forces in the energy industry to develop IS-Oil in 1993. Now SAP and Accenture have renewed their commitment to work together, expanding their strategic relationship in the energy industry. The alliance will capitalize on Accenture’s industry knowledge and e-commerce capability, coupled with SAP’s e-business technology, to deliver mySAP.com solutions for the oil, gas and petrochemical industry. The new e-business solutions will leverage web efficiencies to condense cycle times, reduce inventories and streamline pricing procedures and capacity utilization.


Lead Accenture partner Paul Fockens said “SAP and Accenture have an exceptional track record of teamwork to the benefit of our clients in the industry. We look forward to introducing this alliance to our clients.”


SAP board member Peter Zencke added “In an increasingly competitive oil and gas industry, ‘thought leadership’ and proven e-business solutions are critical ingredients to success. SAP looks forward to expanding its alliance with Accenture in the energy industry.”

OpenSpirit V2.1 announced

A new ‘commercial’ version of E&P middleware OpenSpirit adds functionality and extends platform support.

OpenSpirit Corp. has announced a new version of its ‘vendor-neutral,’ platform-independent application framework. Version 2.1 adds support for Landmark OpenWorks 98.5 and GeoQuest GeoFrame 3.8 servers, support for basic Earth Models, horizons and horizon properties, improved event gateway functionality and support for Microsoft Windows NT/2000 C++ clients. In addition, the new release includes system performance improvements and enhanced deployment tools.


OpenSpirit CEO Neil Buckley said “With this release, our customers will be able to integrate a rich set of applications across a broader range of computer platforms. OpenSpirit 2.1 reflects our commitment and continued focus on key areas of interoperability that maximize workflow and decision efficiency for the oil and gas sector”. OpenSpirit claims to accelerate time to market for software vendors and to improve workflow integration for oil and gas companies. More from www.openspirit.com.

Nopec boosts data management

TGS-Nopec is to acquire seismic data management specialists Symtronix.

Norwegian TGS-Nopec Geophysical Company is to purchase Houston-based data management specialists Symtronix Corp. for approximately $750,000, payable partly in cash and partly in TGS-Nopec stock. Following a brief due diligence process, final closing of the transaction is expected within two weeks. TGS-Nopec will buy an amount of its own shares on the Oslo Stock Exchange necessary to complete the Symtronix purchase. Symtronix, a privately held company, provides a variety of data management services to the oil and gas industry. The company specializes in seismic data loading and format conversions on a variety of platforms. The company also provides on-site services to its clients. Symtronix was founded in 1993 and currently has eight employees.


TGS-Nopec CEO Hank Hamilton said “TGS-Nopec has been a client of Symtronix Corporation for years. We have been impressed with the quality of their work and high level of service. The acquisition of Symtronix will allow us to provide a more complete and thorough set of data delivery alternatives to clients who license TGS-Nopec seismic data. Symtronix will continue to provide services to its existing client base.” TGS-Nopec provides non-exclusive seismic data and associated products to the oil and gas industry.

Schlumberger to buy Sema

Schlumberger intends to accelerate its IT strategy and enhance its systems integration capabilities with the acquisition of Franco-British consulting house Sema Group. The deal is valued at $5.2 billion.

Schlumberger has reached an agreement with the board of directors of Sema plc on the terms of a recommended offer for the entire share capital of Sema. The transaction is valued at approximately $5.2 billion fully diluted. The Sema Board intends unanimously to recommend the offer.


Schlumberger Chairman and CEO Euan Baird said, “The acquisition of Sema will enable us to accelerate significantly our existing information technology strategy. It will enhance our capabilities and critical mass in systems integration, widen our IT skills and create revenue synergies in many of our core competencies. I am confident that the excellent personal relationships which we have developed with senior Sema management and the strong cultural fit between our organizations will facilitate the integration and subsequent growth of Sema within the Schlumberger group. We believe that Sema is the catalyst that will help us approach the five-year goals for growth and profitability that I set out two years ago.”


“For several years, we have been actively exploiting IT to improve our internal business processes and efficiencies, to grow our existing businesses and to develop new IT-based revenue generation opportunities. Such initiatives have been ongoing in all three of the Schlumberger core vertical markets: oilfield services, wireless telecom and utilities.


The winners in the Internet age will be companies with excellent products and market shares in specific verticals that are able to substantially enhance their business models with these new technologies. We need to continue to add strong IT technology, systems integration and consulting competencies on a global scale to accelerate the growth in our core vertical markets and to establish ourselves as a leading information solutions provider in those core vertical markets. Sema provides us with such competencies.” Schlumberger expects to complete this transaction in the second quarter of 2001.

Shell’s new desktop

Shell anticipates 30% savings in a massive upgrade of 90,000 desktops, IT and telecom infrastructure. Getronics is to assist with delivery and support.

The Shell Group is to standardize its desktop computing, telecommunications, and IT support environment at approximately 1,000 Shell locations in over 135 countries around the world. The project involves the upgrade of some 90,000 desktops. The new Group Infrastructure (GI) Project will provide standard PC infrastructure and support, enabling all Shell companies to “increase the range and reach of their operations.” The project sets out to ready Shell for a world of e-business and to facilitate new ways of working.


Shell MD Harry Roels said “Shell will be breaking new ground with the GI Project. We are the first major company to undertake a full migration of this technical depth and global breadth. By implementing a Group Infrastructure that is standardized throughout the world, we are taking a critical next step in preparing our organization to compete successfully in the future. The GI will improve the stability and performance of our desktop computing and telecommunications systems while lowering our total cost of ownership and will result in a platform that enables better ways of doing business globally.”


Shell Services International (SSI), the Group’s IT services company, is delivering the project and has just signed a multi-million dollar contract with Getronics to assist in the delivery and support of the new infrastructure over the next three years.

Van Luijk

Getronics president and CEO Cees van Luijk said “We are excited and proud to have been awarded this contract to deliver global services on behalf of Shell, that will enable them to execute their business development more rapidly. Shell has been one of our major clients for the past 30 years and this win demonstrates the position of Getronics in the marketplace as a major, vendor-independent, supplier of global services.”

30% savings

Standardization of the computing environment across the Group is projected to achieve an approximate reduction of 30% in the average desktop total cost of ownership. This will be realized through global procurement contracts and economies of scale, streamlined and automated web-based supply chain processes, rationalization of telecommunications facilities and networks, and rationalized information technology support services. Shell expects the project to be fully completed during 2002.
