An old theme of these columns is the notion that, for an IT project to have a cat in hell's chance of succeeding, it needs to be specified with clarity and without ambiguity. This is why IT terminology is so precise. Such that when I say 'business object,' my intent is so limpid that you could just sit down at your computer and build a few without further instruction. Well, no, of course you can't, and I apologize for the heavy irony. What I mean is that IT's very terminology often undermines attempts at clarity, as words are used, misused, translated and redeployed in different contexts. In so far as technology does progress (which it does, but much more slowly than we like to think), verbal evolution is natural, even if it is sometimes hard to keep up with the latest jargon. But often the words get ahead of the technology and move out into - well, 'Hype'erspace as it were. The terminology races ahead and covers up a technological stasis.
CRMs of consolation
I think I have stumbled on a good example of this. Like me you may have puzzled at a new acronym 'CRM' - which being expanded, means Customer Relationship Management. What trendy new technology does this term entail? Well actually nothing very much. CRM is just a fancy way of describing the activity of the sales department, boosted by on-line databases and embellished with an (infuriating) automated response system. CRM is a verbal Trojan horse for vendors to rush new software into the corporation. Words can be old wine in new bottles or harbingers of something new. So what of "e"-business, "e"-commerce and the like? Which camp are these concepts in - and just what do they mean anyhow? First, let's try to constrain "e-business" a little. I will just pluck a definition out of thin air here....
Or rather from www.askjeeves.com where there are quite a few definitions, but most seem to agree that e-business centers on "using the internet to do business better or differently." It is thus quite puzzling that although the internet has been in operation since the early 1970s, e-business as a term has only been around for a couple of years at the most! So is it a new concept - or more hype? Did it really take a decade or more for 'business' to realize the benefits of the internet? Of course not. Oil and gas companies have been using internet technology for over a decade to transfer data internally. And E&P business-to-business e-commerce can be said to have started when the switch was thrown on the Norwegian Diskos project, back in March 1995. Incidentally, Diskos planning began in 1992, so it could be considered to have an 8-year lead on some of the e-upstarts of today.
Here at The Data Room, albeit on a much more modest level, we have been doing e-business since 1996. We perform QC on large seismic tape remastering projects by analyzing the log files generated during transcription. A few gigabytes of email traffic over the past four years bear witness to the "E" nature of this business. Some much higher-tech e-business - such as Application Service Provision - is in reality e-business as usual for GeoQuest, which has been providing ASP-type services to BP Aberdeen and others for several years now.
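This kind of log-file QC can be sketched in a few lines of Python. The sketch below is illustrative only - the log-line layout, field names and status codes are invented for the example and do not describe any real transcription system's output.

```python
import re

# Hypothetical log-line format for illustration only; real transcription
# logs vary by vendor and are not described in the article.
LINE_RE = re.compile(
    r"(?P<tape>\S+)\s+block=(?P<block>\d+)\s+status=(?P<status>OK|PARITY|EOF)"
)

def summarize(log_lines):
    """Count blocks read and error blocks per source tape."""
    stats = {}
    for line in log_lines:
        m = LINE_RE.match(line)
        if not m:
            continue  # skip malformed lines rather than abort a long run
        tape = m.group("tape")
        blocks, errors = stats.get(tape, (0, 0))
        blocks += 1
        if m.group("status") != "OK":
            errors += 1
        stats[tape] = (blocks, errors)
    return stats

sample = [
    "T00123 block=1 status=OK",
    "T00123 block=2 status=PARITY",
    "T00123 block=3 status=OK",
]
print(summarize(sample))  # {'T00123': (3, 1)}
```

A per-tape error ratio derived this way is enough to flag tapes that need re-reading before the remastered data is delivered.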
So what is new in e-business? Well one thing that is new is the arrival of a lot of companies who, in true dot com fashion, have all their future before them. Companies whose business model is basically - ‘give us your business.’ Recently in the Wall Street Journal I read of one floundering dot com that complained that they had not sold out early enough. Wow, what an admission! This really is the old ‘eating sardines vs. selling sardines’ distinction! The business plan for some of these ‘wannab“E”s’ doesn’t seem to go beyond a successful IPO.
But to go back to the established, old-guard e-business, Diskos. The technology behind Diskos was originally developed by IBM and was the first commercial deployment of POSC's Epicentre data model. Of course one can argue about how much of Epicentre is in PetroBank, and whether a data model is a necessity for e-commerce, particularly as the main data transferred is SEG-Y seismic over FTP. But with this background, it is a wonder that everyone is plugging away quite so heavily at XML. As though it were the sine qua non of e-business! What really matters in business, whether it's "E" or not, is the process, not the technology. It is just much harder to do the process than the technology. There is also less whiz-bang, hype and potential for immediate gain, real or imagined.
The Norwegians were unusually absent from the POSC 10 year party. Do they know something we don't? I wonder what the Norwegian is for déjà vu?
Subject to contract, Petroleum Geo-Services (PGS) will take over the role of operator for the Common Data Access (CDA) data repository early in 2001. This contract was previously managed by QC Data, using the Axxes client software and server-side software including CGG's PetroVision and Schlumberger-GeoQuest's LogDB.
Surf & Connect
All this is to be replaced by PGS' Surf and Connect clients and a PetroBank server. The scope of services remains the same - digital and hardcopy well data and seismic navigation data. Well data will be managed through PetroBank's well data module - which deploys Recall technology from Baker Atlas. CDA members expect PGS to extend the coverage to other data types, and PDM was told by Jor Gjose, PGS' UK boss, that it plans to promote its multi-client seismic library through the repository. PGS will also be loading proprietary 3D seismic from seven CDA member companies. PGS estimates that around 70% of all UKCS 3D seismic is already on the UK PetroBank.
A new PetroBank center will be commissioned in Aberdeen early next year which will duplicate the existing Maidenhead PetroBank. PGS’ Joe King told PDM “The plan is to move from an ISDN/FTP type of access to full web access, and we will install a high bandwidth connection to link Maidenhead with the new Aberdeen center. Substantial storage resources will be deployed at both locations, with some 2TB of disk at both localities and a 30 TB tape silo at Maidenhead.”
Document management is performed with a proprietary 'soft object' data store, with E&P data-centric indexing. PGS will also offer on-demand document scanning.
PGS is working with the British Geological Survey to ensure seamless operation with DEAL. The DEAL/CDA separation is intended to separate the web component from the repository. Migration to the new system will take place in early 2001, with a transfer of operations in the second quarter of 2001. PetroBank has been in operation for five years in the Norwegian Diskos national data store. More from www.petrodata.no.
Spectrum Data Management is a finalist in Information Age Magazine’s IM Awards 2000. The nomination was made for the Electronic Data Room that Spectrum put in place for the disposal of the UK assets of Saga Petroleum, subsequent to its acquisition by Norsk Hydro.
The virtual data room housed over 140,000 documents which were scanned and indexed in three weeks. The data was made available to potential purchasers on indexed sets of 22 CD-ROMs. Additionally, two virtual data rooms were set up with an Oracle server running the Centra 2000 document management system, and housing ten networked PC client workstations. Centra developed a customized web-based front end browser for the document repository.
Spectrum’s Jon Lucas told PDM that clients preferred the assured security of the managed data room over the public portal approach. Spectrum Data Management is a wholly owned subsidiary of Spectrum Energy and Information Technology. The unit provides scanning and e-business services to the oil and other industries. More from www.spectrum-eit.co.uk.
PDM - What were the drivers behind Landmark's latest product 'Decision Space'?
Gibson - Historically, applications have been developed in a domain-specific manner by entrepreneurs with a passion for their subject; people like Rutt Bridges with Advance Geophysical and Munroe Garrett with Aries. Today, the industry has different requirements. Our software needs to be multi-domain, cutting across many disciplines and teams. We need to eliminate the barriers between these disciplines, and to focus on the work process as a whole. The key to this analysis is decision support. But Decision Space is more than just a new product for Landmark. We have literally re-engineered the whole company around this need for multi-disciplined software and decision support.
PDM - What are the tricks of the trade you have deployed in Decision Space? Does it use web, workflow technology or what?
Gibson - The 'trick' was to realize what our clients were looking for. But even once we had established what they were, we found it is still hard to sell such a revolutionary product - not least because we don't yet know how to price it! The change from discipline to team focus is a whole new work paradigm.
PDM - What does Decision Support bring, say, to a geophysicist working with SeisWorks? Does he/she see a Decision Support window pop up?
Gibson - Decision Space would expose the results of other domain-specific interpretations to such an interpreter - but will also add an economic and financial dimension.
PDM - All this extension of Landmark's activity into the financial and economic sphere is all very well, but doesn't it mean that you are neglecting your core business?
Gibson - Well we certainly know what our core business is and I believe that our R&D effort today in the field of seismic interpretation is second to none - in both quality and spend.
PDM - Actually, at the SEG and elsewhere, we often hear that R&D expenditure is declining in an alarming manner and that the industry at large is at risk from this.
Gibson - Well actually I agree with that. But while Landmark as a company is an over-achiever in R&D, that is not the case for the rest of the geophysical sector. Landmark is in the top three (along with Shell and Exxon) for software R&D spend, but elsewhere, the picture is indeed alarming.
PDM - Where is Landmark making its R&D breakthroughs today?
Gibson - We are patenting some of the results of our R&D in seismic interpretation. These focus on new processes and technologies and reflect the move from volume and trace based systems to voxel-based earth modeling. By year end we will have reinvented seismic interpretation. Our R&D effort in this field is in the tens of millions of dollars range. But these developments will take time to deploy. I estimate the move to earth model-based interpretation, coupled with pervasive, decision-supported, multi-disciplinary interpretation will take a couple of years to complete. But this effort will share the same focus as all our R&D, to increase reserves and lower costs.
PDM - You spoke last month at the POSC 10 year meeting, and are on the POSC board. But it is not clear that standards have or will play a determining role in moving the industry forward.
Gibson - Actually standards are more and more important and will play an increasingly significant role in the industry's future. If you look at the cost to industry of data reformatting - which we estimate as being several hundreds of millions of dollars per year - against total spend on standards organizations - which must be of the order of a few million - the potential cost benefits are clear. We need to better quantify these benefits and work in a focused manner to achieve these potential cost savings.
PDM - Still, as Landmark has shown recently with its new compressed data formats, multiple formats are inevitable, and can be part of innovation. Surely reformatting is a necessary evil?
Gibson - Yes, there are domains where multiple formats are necessary, but there are many others where they are not, and where their existence is very costly. Well logs are a good example - we estimate tens of millions of dollars go into post-acquisition well log curve reformatting, transcription and re-writing. We can reengineer a lot of this out of the system. To return to our position with regard to POSC: Landmark is a strong supporter of POSC, although it has been said that Epicentre is 10 miles wide and one inch deep! But in general we do indeed favor open standards as opposed to commercial proprietary integration platforms.
PDM - Where would you place Open Spirit in this context? Do you see this as a viable platform that Landmark will sign up to?
Gibson - We have monitored OS since it started and have maintained a constant, measured stance. We believe that middleware-based systems fail on two counts. From the performance standpoint - users cannot tolerate the overhead, and from the commercial standpoint - users won't pay for something they can't see! Going back to what I said about standards, we would like to have a standard way of doing lots of stuff - like reformatting well logs. But Open Spirit is now a commercial organization. We doubt that as a user we would be in a position to influence and change the middleware. A massive amount will have to be spent on the middleware to maintain and enhance performance - we think that such money would be better spent on R&D in fields like direct hydrocarbon indicators, 4C and 4D. Our position vis a vis Open Spirit has made for friction with some of our clients in the past.
PDM - A couple of years back, Landmark was touting a massive shift from UNIX to Windows NT - even with the 'virtual' participation of Bill Gates at the 1998 Landmark forum. What's your position now?
Gibson - Well we tend to de-emphasize the move to Windows NT and talk now of a 'commodity platform'. But we do see great benefits to the industry coming from the massive R&D spend of the PC graphics industry which is orders of magnitude greater than any vertical.
PDM - We hear a lot of Application Service Provision (ASP) these days. Is this on offer for Landmark's software? And how are you pricing such a facility?
Gibson - Grand Basin is Halliburton's e-business offering and we do offer ASP through Petroleum Place. Pricing is not an issue for us. We already serve applications in ASP mode at client sites and we find that our clients do not want to be billed by the minute of use. They prefer to be able to budget accurately and are happy with the regular pricing per month, whether it is by license/seat/CPU or whatever.
PDM - What about the more far flung parts of the world. Can Landmark offer ASP in Europe?
Gibson - Our goal is to be able to do this at any location. Of course bandwidth may be an issue. For example, a client working at three locations - say Perth, Jakarta, Thailand would probably find the available infrastructure is not up to supporting collaborative work. We would therefore deploy a hub at each location. As bandwidth grows, these would be consolidated and ultimately, the asset team could be working together from different locations.
PDM - What is Bob Peebler doing these days?
Gibson - Bob is the executive in charge of the whole of Halliburton's activity in the fields of procurement, data management, real time collaboration with suppliers and customers. He is taking Halliburton into the e-business world. They need an evangelist like Bob to do this.
Enterprise has renewed its upstream software contract with Landmark for a second five year period. The renewed agreement provides Enterprise with unlimited global access to Landmark’s suite of geological, geophysical, engineering and drilling applications, and is combined with an enlarged technical support and consulting package.
Enterprise has committed to worldwide deployment of on-site support consultants, as well as a significant technical training and mentoring program. The new deal is based on what is described as an “innovative business model that sets a new standard in client-provider relationships.”
Enterprise is a long time Landmark client, buying one of the first three interpretation systems produced in 1985. Enterprise was also closely involved in the development of Landmark's Open Explorer data management system. Enterprise exploration director Andrew Armour said "Enterprise has long recognized the value to be gained from leveraging technology. Through access to this range of software and permanent on-site support for our users, we anticipate being able to extract yet more relevant information from our E&P data."
Another major, Spanish-Argentinean Repsol YPF, has taken the path of a single vendor solution for its upstream software. Repsol YPF has chosen Schlumberger as its core provider and has licensed a comprehensive suite of GeoQuest's applications.
3 year deal
Repsol YPF has signed a three-year contract to license geophysical, geologic, mapping, modeling, petrophysical, reservoir engineering, drilling, data management and economic analysis applications worldwide.
6th major this year
Repsol YPF is the sixth major company this year to globally move to a GeoQuest software solution. GeoQuest VP of marketing, Roger Goodan said “The selection of GeoQuest software solutions by Repsol YPF represents a validation of our effort to continually develop and deliver top-class integrated E&P software solutions that bring value. We are looking forward to extending our already strong working relationship with Repsol YPF worldwide.”
The Public Petroleum Data Model Association is releasing Version 3.5 of its standard data model in “Pre-Production” form for final comments by the membership.
V 3.5 RSN
Version 3.5 includes significant improvements and developments of the Model in the areas of Stratigraphy, Land Management, Contracts, Spatial Enabling and Projects. Version 3.5 includes Reference Guides, Roadmaps and a Sample Data Set. PPDM has also started work on an XML data exchange and schema architecture.
A membership survey found considerable interest for an XML-based transport mechanism for well data. Work is now in progress on an XML schema, based on the PPDM data model. The mechanism is intended to provide data transfer which will be independent of computer platform, operating system and application software.
The XML schema can also provide constraints on uniqueness and referential integrity. The intent is to further encapsulate PPDM standards and business rules in the schema. Mappings to other industry standard ASCII formats are also planned.
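Uniqueness and referential-integrity rules of this kind map naturally onto XML Schema's identity constraints (xs:key and xs:keyref). The fragment below is a sketch only - the element names (WellSet, Well, LogCurve, uwi) are invented for illustration and do not reproduce the actual PPDM schema, which was still under development at the time.

```xml
<!-- Hypothetical sketch: element and attribute names are invented, not PPDM's. -->
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <xs:element name="WellSet">
    <xs:complexType>
      <xs:sequence>
        <xs:element name="Well" maxOccurs="unbounded">
          <xs:complexType>
            <xs:attribute name="uwi" type="xs:string"/>
          </xs:complexType>
        </xs:element>
        <xs:element name="LogCurve" minOccurs="0" maxOccurs="unbounded">
          <xs:complexType>
            <xs:attribute name="wellRef" type="xs:string"/>
          </xs:complexType>
        </xs:element>
      </xs:sequence>
    </xs:complexType>
    <!-- Uniqueness: no two wells in the document may share a UWI -->
    <xs:key name="wellKey">
      <xs:selector xpath="Well"/>
      <xs:field xpath="@uwi"/>
    </xs:key>
    <!-- Referential integrity: every curve must reference a declared well -->
    <xs:keyref name="curveWellRef" refer="wellKey">
      <xs:selector xpath="LogCurve"/>
      <xs:field xpath="@wellRef"/>
    </xs:keyref>
  </xs:element>
</xs:schema>
```

A validating parser will then reject a transfer file containing duplicate UWIs or a curve pointing at a well that is not in the file - exactly the sort of business rule the article says PPDM intends to encapsulate.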
Aberdeen based Production Geoscience Ltd. (PGL) is productizing some of the tools it has developed since it began its consultancy work 10 years ago. Its first product, Interactive Petrophysics, hit the market last year and will shortly be joined by a new tool, Oilfield Data Manager (ODM).
ODM is a data management and analysis package for depth-related information such as palynology, biostratigraphy, picked tops and other point and curve data. The new product is currently being beta tested at several client sites. The Windows-based package allows for the display and correlation of data between wells. The software is not intended to replace interpretation suites. Rather, its role is on a PC at the rig site, or in the office, playing a QA role prior to data loading into the interpretation suite.
A single user license costs under 4,000. The flexibility of the product (data is held in an Access database) means that it can be extended to novel data types and has been used for chronostratigraphy, magnetostratigraphy and engineering data. The vertical depth display can cater for supra-horizontal well bores. ODM will be released early in 2001. More from www.pglweb.com.
I want to thank you for your article on B4E. I would like to add the following comment. In your editorial note at the bottom of the article you said "one suspects a little reticence" on the part of the B4E membership in releasing a ratified standard to the public. That is not the case and would be counterproductive to what we are trying to accomplish. Ratified standards will be posted to the general website. Though I've been a little slow in getting them posted, it is a reflection on my efficiency, not on the desire of the foundation.
I've attached a copy of a document that outlines the standards submission process. Please note that in Step 5 the process states that the "ratified" standards will be posted on both the MSN community site (which is restricted to B4E members) and the general website (which is open to the public). You will also note that until ratified, pending standards will only be posted on the MSN community site. I hope this clarifies the situation, and again I thank you for your article.
POSC was 10 years old this month. Way back when, POSC set out to build the definitive standard data model for E&P, which all vendors and users would deploy to underpin their data management. Thus was Epicentre born. But the E&P IT world at large gave the project a lukewarm reception, and while Epicentre is still on POSC’s books, focus has shifted to the greener pastures of e-business. E-business brings new technology - in the form of the ‘X’tensible Markup Language, XML. For hardcore data modelers, moving from Express (the data modeling language used to define Epicentre) to XML probably seems like a huge step backwards, but believers in the KISS principle will no doubt be greatly relieved by this turn of events.
Having established the new technology, the debate is now focusing on what to do with it. The guiding line is that POSC's new business is e-business (hasn't it always been? - see this month's editorial). But the shift in technology focus hides a more fundamental mutation. During its first 10 years of existence, POSC was firmly rooted in upstream IT. The technological spectrum ran from geology and geophysics through to production. Now e-business, XML and the rest are causing the domain focus to shift. POSC's business now encompasses procurement, finance and knowledge management to name but a few. If you came along to the POSC show expecting to find out exactly what has been achieved over the last 10 years, or even what is going on today, you would have left disappointed. But if you were looking for an excellent introduction to the world of E-squared (E2) - i.e. Energy and E-business - then this was just the show for you.
There are a few fundamentally different types of company being set up to do E2 - this and that. Some major oils have set up independent companies to run their e-services for them, with the intent that they should capture more business on the open marketplace. This approach neatly combines new tech e-business with old tech restructuring! Another type of company is the successful bricks and mortar business which has set up a web-based service - Network Oil falls into this category. And the third is the pure e-play, the technology and venture capital funded dot com, of which Application Service Provider GeoNet is an example. Naturally, the different start-ups don't agree on who has got it right, and Network Oil's Stuart Page was quick to vaunt the merits of the free market - and to deprecate the much larger Trade Ranger and PetroCosm as ultimately suffering from being tied to a single buyer, or group of buyers.
Chevron is a firm believer in the tied supplier business model. Indeed Chevron has several irons in the e-fire. One of the few large oil companies to have continued lending significant support to POSC, Chevron has been instrumental in the organization’s refocusing on e-business. Chevron has set up PetroCosm (e-procurement) and UpstreamInfo (ASP, IT outsourcing) and Retailers Market (sales). All this is handled through a new Chevron unit – the e-Business Development Company with David Clementz as president.
Chevron does not like the term ‘new economy’ since this implicitly deprecates the old economy, but it believes that new types of companies are emerging with few assets and lots of intellectual capital. ‘Old economy’ companies can play too – by leveraging their own intellectual capital - as Chevron intends to do through its new unit. The new structure has allowed Chevron to move extremely fast – in the case of PetroCosm, it took only three months to go from a ‘piece of paper’ to having a management team in place.
In a keynote address on the rise of e-business, POSC Chairman, Chevron’s John Hanten gave an update on the health of the estimated 150 different e-commerce ventures in the petroleum industry. The signs of a commercial shakeout are already here, and portal users are now faced with a multiplicity of different standards - which was what everyone was trying to avoid in the first place. Hanten raised the specter of ‘portal wars’ between these hubs - “if everyone does it differently, there will be chaos.” Chevron is therefore pushing for XML-based standards for the business to business (B2B) hubs. Chevron is in a good position to act upon this as it occupies the chairs of POSC, PIDEX and the API.
The POSC-led attempt to bring about convergence between the standards orgs (Open Energy Exchange - see PDM Vol 4 N°10) has not met with unqualified success. The contact with the POSC CAESAR Association (which despite its name is not connected with POSC!) does not appear to have borne fruit. PPDM rebuffed POSC's overtures at the organizational level - but the collaborative reference data work resulted in a 'just in time' press release announcing positive results (see last month's PDM.) The different bodies (API, PIDEX, PCA, PPDM and POSC) have, however, agreed to 'keep communication channels open.'
Trade Ranger's Gertjan Ophoff described the pitfalls and potential of standardized electronic catalogues for enabling e-procurement. E-procurement requires identification of trading partners, RFI quotation and proposal, contract issuance, invoicing, payment, auctioning and bidding. Another dimension is the position of the e-commerce hubs with respect to Federal Trade Commission (FTC) competition legislation. Legal issues are provisionally resolved now that the FTC has approved the hub's license, but it reserves the right to review. The fear is that the hubs will act as a cartel and squeeze suppliers. Ophoff described the FTC review as a 'sword of Damocles' over the hubs' activity. Trade Ranger has forty companies, led by BP Amoco and Conoco, procuring goods and services for plant construction.
A dissenting voice was heard from a POSC traditionalist, Total Fina Elf's CIO, Philippe Chalon. POSC's standard data model, Epicentre, and the software integration platform 'should by now have solved the problems of lost data and software interoperability.' But the lack of cooperation between vendors has prevented this happening. Even within a single vendor's product line there may be multiple data models. Chalon is skeptical as to the merits of XML, considering the technology equivalent to the POSC Exchange Format (PEF) solution. In this context, XML is therefore not especially innovative and unlikely to be a panacea. Chalon considered that 'behind XML there must be a data model.' Chalon also lends support to the Open Spirit Architecture - which he considers the 'right solution,' allowing for some data to be shared without sharing the whole model.
Exxon Mobil, according to Bill Ragosa, uses many world-wide standards, allowing staff to work seamlessly irrespective of location. Exxon recognizes best of breed software tools and strives to use one tool per function where possible. Data management supports this philosophy by establishing authority and responsibility, common procedures for access and update, and common formats. Data management is handled through a world-wide organization which, for instance, ensures data ownership, common database management systems, and common services for storage space, backup and problem solving. A 'standard set' of upstream data repositories (some Exxon proprietary developments) is deployed. Data services ensure business success and increase upstream productivity. Standard processes allow staff to move from one location to another and find themselves in front of the same desktop. Exxon is not afraid of developing its own technology and is "committed to developing high impact technology for the upstream through a $250 million annual R&D budget." Over the years, the fruits of this program have included 3D seismic, seismic stratigraphy, digital reservoir simulation, seismic imaging with supercomputers and the first horizontal well.
Following the merger with Mobil, the Finder-GeoShare-OpenWorks infrastructure has been adapted to the Exxon proprietary data model. This reflects less on the merits of the databases than on the fact that 70% of Exxon Mobil's data is in the Exxon database, with the remainder in Finder. Exxon Mobil still uses GeoShare to move data into OpenWorks projects.
BP Amoco's Hunter Rowe described work done internally using the real-time collaborative tools SameTime and QuickPlace. These were deemed to be immature technologies, and BP developed its own tools to support partner collaboration on the Gulf of Mexico Crazy Horse development. Other innovative technologies which have been on trial at BP include Autonomy's Active Knowledge Agent. Outside of the organization, the enLinx.com site was developed to leverage the Forties pipeline system in the North Sea and now allows proprietors of small oil and gas pools to contact infrastructure owners. Rowe stated "e-Business has been an exciting journey for BP. The internet is a 'show stopper' for the large number of employees who can access it. BP is 'deeply digital' thanks to the Microsoft Common Operating System." Problems remain with interconnectivity between 'back office' data systems.
Fluid transactions depend on interoperability so how do different e-commerce hubs talk to each other? The answer seems to be with difficulty. Suppliers are ‘going crazy’ with proliferating hubs. For the standards bodies, a common XML-based taxonomy is what is required and PIDEX, API and POSC ‘should get together on this problem.’ But the portal community offers little support for standards. The consensus is that XML itself is as near to a standard as we are going to get. While some believe that de-facto standards may emerge, Petris’ Jim Pritchett opined “Standards will not emerge – but technology may circumvent some of these problems.” Louis Hecht from Open GIS informed the audience “You are not alone in facing these problems. The US Government shares them and has the same misgivings about XML and whether it will fulfill its promise. Taxonomy is the key issue.” While Shell’s Nico de Rooij asked “in 10 years time from now will we still face the same problems and will there be a ‘new XML’?”
ASP offers great scope to vendors to offer jam tomorrow, but initial feedback from early adopters suggests that it may be harder to deploy than first thought. Shell's Nico de Rooij believes that ASP is a disaster for data management and pleaded with vendors and ASPs to "get together and sort this out - don't leave the integration misery to the oil companies." Total's Philippe Chalon outlined his company's experience of ASP with GeoNet. This has been disappointing due to poor bandwidth in remote parts of the world, and problems of data sharing, plotting, printing and security. Chalon insists "We need a data model, an exchange format and Open Spirit." Chalon believes that an ASP must be a data management service provider and that oil companies will need and use several ASPs and SSPs. GeoNet's Bob Aydelotte believes that a service provider can do all of this. "It is a tremendous challenge but not insuperable." BP Amoco's Hunter Rowe added that ASP is 'obviously valid,' but all of these tools come unstuck when confronted by issues such as bandwidth to Azerbaijan or Angola!
GeoQuest has been showing its collaborative document authoring solution MindShare for a while - PDM first covered the product in May this year. At that time (PDM Vol 5 N° 5) we also reported on a new tool 'ReactivWeb' - knowledge management software that Badley Earth Sciences had developed following its success with Open Journal. Now, Mike Badley has re-baptized ReactivWeb as 'Unite IT' and spun off a new company, 'Collaborative Technology' (CT), to market the new software. Unite IT provides both a software environment and tools where users, connected by any type of network and using the same or different computer platforms, can create and capture content, and organize, share, personalize and publish information and data. CT's first client is - you've guessed it - Schlumberger-GeoQuest, which will be deploying Unite IT as the MindShare document editing and management engine.
PDM had a preview of MindShare from GeoQuest’s Samantha Hanley. The software is a multi-user, multi-platform digital library and document management tool. Running in client-server mode, MindShare uses a “fat” client (i.e. it does not just run in an internet browser). The client software shows a split screen with a tree view of folders and documents on the left, and a document viewing and editing pane on the right. All elements of the viewer, the document repository structure, and the documents themselves can be pre-configured through corporate templates.
Text, sound and video clips can be dragged and dropped onto tree nodes to create a data/document model. The document itself can be a composite of text (from Office documents), data or images. Security and object sharing technology allows locks down to individual data items within the document. Currently the images are not live – i.e. there is no tie-in to live data feeds or databases, but this will happen in a future version. MindShare will be available early in 2001. More from www.slb.com and www.collabortech.com.
PDM - Why the new company?
Badley - Collaborative Technologies has been set up as a separate entity from the other Badley Earth Science companies to allow for separate development of horizontal software.
PDM - How is Unite IT customized?
Badley - Unite IT sits on top of an existing intranet and is delivered with basic tools. It can ‘metamorphose’ into a domain specific application. For instance we are currently targeting the educational market for distance learning.
PDM - What’s the competition?
Badley - There are some competing products for collaborative solutions - notably Equus and E-Room - but they are different, and are all web-based. This thin-client approach has several disadvantages – it allows for unrestricted access and it requires different software for content generation at the client.
PDM - So the fat client is a good thing?
Badley - In some ways yes, but we are planning a web browser version of Unite IT which will be available in the second quarter of 2001.
PDM - Has Java proved a satisfactory development tool?
Badley - We have used Java 2 and are quite pleased with it. No significant portability issues, and there are now lots of development tools.
PDM - MindShare does not as yet seem particularly tailored to upstream – not much database connection – just fancy document management.
Badley - The verticality will come – we expect Schlumberger to provide all that soon.
PDM - Does Unite IT support full text search?
Badley - No we do not have that yet – but we will provide full text in a future release – this will probably come from a third party plug-in.
Tape transcription specialist and high-end PC integrator Seacon and Paris-based Georex AT are to join forces to offer European clients integrated data transcription services. Seacon began operations in 1990 and claims to have developed the first DOS-based tape transcription system. The software now consists of over 100 routines for conversion and diagnostics covering a large range of industry formats, including demultiplexing and consolidation to SEG-Y of seismic data recorded on old tapes and in old formats. Services now include archival to Digital Video Disk.
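Transcription to SEG-Y turns on getting the headers right. As a flavor of what is involved, here is a minimal sketch of reading the SEG-Y binary file header (the classic rev 0/1 layout: 3200 bytes of EBCDIC text, then a 400-byte big-endian binary header); a synthetic header is built for demonstration since no real tape is to hand:

```python
import struct

TEXT_LEN, BIN_LEN = 3200, 400   # EBCDIC text header, then binary header

def read_binary_header(buf: bytes) -> dict:
    """Pull three key fields from a SEG-Y binary file header."""
    bin_hdr = buf[TEXT_LEN:TEXT_LEN + BIN_LEN]
    interval_us, = struct.unpack_from(">H", bin_hdr, 16)  # sample interval, microseconds
    n_samples, = struct.unpack_from(">H", bin_hdr, 20)    # samples per trace
    fmt_code, = struct.unpack_from(">H", bin_hdr, 24)     # 1 = 4-byte IBM float
    return {"dt_us": interval_us, "ns": n_samples, "format": fmt_code}

# Synthetic header: 4 ms sampling, 1500 samples per trace, IBM float data.
hdr = bytearray(TEXT_LEN + BIN_LEN)
struct.pack_into(">H", hdr, TEXT_LEN + 16, 4000)
struct.pack_into(">H", hdr, TEXT_LEN + 20, 1500)
struct.pack_into(">H", hdr, TEXT_LEN + 24, 1)
print(read_binary_header(bytes(hdr)))
# {'dt_us': 4000, 'ns': 1500, 'format': 1}
```

Multiply this by a hundred legacy formats, non-standard headers and degraded media, and the scale of Seacon's routine library becomes clear.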
Seacon president Neil Moore told PDM - “The new 4.7 GB DVD-RAM drives have just come on the market. We have been testing them since August and have found performance quite remarkable. The 9.6 GB double-sided disks were greeted with excitement here in Houston at the GCAGS meeting last month. Everyone is tired of fooling with tape and wasting a great deal of time that could be better spent looking for oil and gas.”
Georex provides petroleum exploration services, data management, indexing and data QC. The new alliance covers both existing Seacon technology and new tools for transcription, storage and enhanced data retrieval.
The companies are investigating the possibility of setting up a Paris-based transcription service. Georex AT has 45 employees. More from www.georex-at.com and www.scscomputers.com.
ESRI has developed new data models for electric and gas utilities under the auspices of the Open Modeling Consortium (OMC). The OMC is composed of ESRI, Miner and Miner, Consulting Engineers, Inc., and ESRI geographic information system (GIS) users at 70 utilities throughout the world. ESRI product manager Steve Grisé said “ESRI is providing a data model representing common, fundamental aspects of most electric and gas networks. We are providing an invaluable tool to help utilities realize the many benefits of GIS.”
With ArcGIS data models, consultants and developers can build applications based on open standards, and users will benefit from a new level of interoperability and data exchange. Miner and Miner president Jeff Meyers said “It might seem a little ambitious to try to get an entire user community to agree on core technical concepts, but this group has worked well together and the benefits are enormous. The OMC is a forum for sharing technology, for helping new users and will provide a stable target for application developers.” The data models are now available for free download on ArcOnline, and an ArcGIS data model book and CD will be available in the near future.
Geometry engine provider XoX has appointed a new Chief Executive Officer, John Sutton, and an Executive Vice President of Business Development, Bill Fuller. The new lineup is mandated to expand XoX’s business through mergers and/or acquisitions.
Before year-end, XoX plans to release Version 3.0 of ShapesProspector, its first end-user product, as well as ShapesPlanner, a new 2D/3D planning tool for wellbore trajectory design and quality assurance.
XoX has announced net revenues for the quarter ended September 30, 2000 at $505,875 (down from $653,961 the same quarter last year) and a net income of $30,293 or $0.01 per share ($252,144 and $0.08 per share, for the same period in 1999.) The XoX geometry engine is used by Schlumberger-GeoQuest and Seismic Micro Technology. More from www.xox.com.
Schlumberger’s Bill Quinlivan is trying to rustle up support in the Geoshare community for a project aimed at enhancing data interchange and communications between drillers and geoscientists. Drilling data exchange focuses on communication between client offices and the rig site and promises multi-vendor opportunities. The ultimate objective is to achieve all this in real time.
Obstacles in the way of such inter-operability are what Quinlivan describes as the UNIX/PC ‘speed bump’ and the cultural gap between drillers and geoscientists. To overcome these, Quinlivan advocates a full 3D data model enabling cross sections, formations and faults. The goal is to ‘drill geometrically rather than geologically.’ This could be achieved in several ways - by extending Geoshare, switching to XML, going for Open Spirit type business objects or by ‘federating existing data exchange methods.’
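To illustrate the XML option Quinlivan mentions, a rig-site survey station could be expressed in a vendor-neutral vocabulary that office geoscience tools also consume. Every tag name below is invented for illustration; no actual drilling exchange schema is implied:

```python
import xml.etree.ElementTree as ET

# Hypothetical rig-to-office message: one directional survey station.
survey = ET.Element("wellSurvey", well="30/6-A-12")
station = ET.SubElement(survey, "station")
for tag, value in [("md", "2450.0"), ("incl", "31.5"), ("azim", "145.2")]:
    ET.SubElement(station, tag).text = value

print(ET.tostring(survey, encoding="unicode"))
```

Plain text like this crosses the UNIX/PC 'speed bump' trivially, which is part of XML's appeal here; the harder problem, as elsewhere in this issue, is agreeing what the tags mean.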
Quinlivan has begun investigating what is needed to enhance the Geoshare spatial model to cater for such novel usage. Quinlivan told PDM “I think the drilling control loop is a useful test case as a run-up to the larger issue of collaboration among computing products for continuous reservoir management.”
Missouri City, Texas-based Petrove is expanding its business into software development. Petrove’s flagship product, FloodAnalysis, is designed to optimize waterflooding of oil and gas reservoirs.
FloodAnalysis optimizes oil production by calculating individual well production and injection rates. It features flexible pattern definitions, handles non-standard well patterns and multiple production intervals, and can quickly forecast field-wide oil and water production. Its case management facility makes it easy to justify planned well interventions, and data can be stored in a variety of databases to facilitate sharing with others. A free, 14-day evaluation copy is available for download at www.petrove.com.
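To give a feel for the class of problem FloodAnalysis addresses, here is a deliberately crude sketch (emphatically not the product's actual method): a fixed injection budget is split across patterns in proportion to each pattern's oil response, and the resulting oil rate forecast. All figures are made up:

```python
# Toy waterflood allocation: weight injection by oil response per pattern.
def allocate_injection(responses, total_injection):
    """Split total_injection across patterns, weighted by oil per bbl injected."""
    total_response = sum(responses.values())
    return {p: total_injection * r / total_response for p, r in responses.items()}

# Hypothetical patterns: bbl of oil per bbl of water injected.
responses = {"pattern_A": 0.30, "pattern_B": 0.15, "pattern_C": 0.05}
rates = allocate_injection(responses, total_injection=10_000)  # bwpd available
oil_rate = sum(responses[p] * q for p, q in rates.items())
print(rates)                         # pattern_A gets the lion's share
print(f"{oil_rate:.0f} bopd forecast")
```

A commercial tool layers reservoir physics, multiple intervals and case management on top of this kind of allocation arithmetic; the sketch only shows why per-well rate calculation sits at the core.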
UK-based international petroleum recruitment consultancy Working Smart has launched its online recruitment web site www.working-smart.co.uk. The new site offers clients and job seekers access to new search, matching and applicant management tools, streamlining the recruitment process.
Working Smart clients now benefit from free job advertising, and direct e-mailing of jobs to potential candidates. Applications can be reviewed and tracked online, and the Working Smart skills database searched for appropriate, high-quality candidates. Another new service is ‘pre-emptive interviewing.’
Online headhunting allows employers to evaluate the interest of selected candidates in a position. Other innovations include a salary survey and a reporting service. These allow an employer to benchmark salaries against market rates.
Working Smart’s Professional Recruitment Services also include personal off-line recruitment services, job profiling, executive headhunting, interview screening, reference verification and human resources support services.
PDM - Why does Shell need a supercomputer?
Burr - We held a workshop earlier this year and realized that many new ideas for E&P applications were extremely compute intensive. Pre-stack depth migration, risk analysis and reservoir modeling were all pushing the envelope for Shell's existing infrastructure. There was a general feeling that although many quality algorithms had been developed in house over the years, they were not being used to full effect, just because they were taking too long. So SIEP's researchers decided to build a supercomputer.
PDM - What decided you to build rather than buy?
Burr - We decided to go for off-the-shelf hardware and open software to give us more long-term flexibility. Open software makes us independent of vendor operating systems, and promises a better growth path. We have found that our algorithms have a much longer life span than hardware. Some of our software was originally developed over 10 years ago for VAXes or for the Cray. Also, Linux is getting very professional and reliable.
PDM - Have you had to adapt your algorithms for parallel computing?
Burr - Not really. We have two sorts of uses for parallel processing. One just involves running the same program on lots of data. This can be fairly easily shared out between the nodes and the results collated after the fact. Another kind of parallel process involves operations on different nodes sharing data during processing. This calls for more sophisticated programming with inter-process communication. Fortunately Shell has been working on this type of process for some time and we are in pretty good shape to implement this technology on the new machine.
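The first case Burr describes - the same program applied to many independent chunks of data, fanned out and collated afterwards - is easy to sketch with a process pool. The per-trace operation here is a trivial stand-in, not Shell's code:

```python
from multiprocessing import Pool

def scale_trace(trace):
    """Stand-in for a real per-trace seismic operation (e.g. a gain)."""
    return [2.0 * sample for sample in trace]

if __name__ == "__main__":
    traces = [[float(i), float(i + 1)] for i in range(4)]  # toy shot gather
    with Pool(processes=2) as pool:
        result = pool.map(scale_trace, traces)             # split, run, collate
    print(result)  # [[0.0, 2.0], [2.0, 4.0], [4.0, 6.0], [6.0, 8.0]]
```

Burr's second case, where nodes must exchange intermediate results mid-computation, is the genuinely hard one: it needs explicit inter-process communication (message passing) rather than a simple map-and-collect.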
PDM - Just how fast will the new machine be?
Burr - There are 1024 Pentium III single-processor 1 GHz nodes. This gives us a theoretical upper limit of 2 teraflops. Of course in reality the machine will probably deliver 10-20% of this.
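Burr's figures can be checked on the back of an envelope, assuming each 1 GHz node retires two floating-point results per clock cycle (an assumption chosen to match the quoted peak):

```python
# Peak = nodes x clock rate x flops per cycle; sustained is a small fraction.
nodes = 1024
clock_hz = 1.0e9                # 1 GHz Pentium III
flops_per_cycle = 2             # assumption; yields the quoted 2 teraflop peak
peak = nodes * clock_hz * flops_per_cycle
print(f"{peak / 1e12:.1f} teraflops peak")
print(f"{0.15 * peak / 1e12:.2f} teraflops at 15% sustained efficiency")
```

The 10-20% sustained figure Burr cites is typical for clusters of commodity processors, where memory bandwidth and inter-node communication, not raw clock rate, set the real limit.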
Virtual Reality specialist Evans & Sutherland (E&S) is bringing 3D GIS data “to life” with its new RapidSite Producer software. Producer creates photo-realistic 3D site visualizations from cartographic site data held in ESRI's ArcView 3D Analyst. GIS professionals can now fly their audiences interactively through a photo-realistic visualization of their 3D site data. Producer also allows users to create digital images and video for electronic publication and distribution.

E&S VP Bob Ard said “Many people find it difficult to interpret accurately symbolic representations, whether presented in a 2D or 3D form. Producer solves this problem, transforming 3D GIS site data into a photo realistic 3D representation that appears literal and natural, rather than symbolic.” Producer reads TINs, GRIDS, and shapefiles from 3D Analyst, plus aerial or satellite ortho images. More from www.es.com.
Baker Hughes has decided to throw in the towel on exploration following a strategic decision to ‘substantially exit the oil and gas exploration business.’ Baker Hughes has already sold its Chinese oil and gas properties, has signed an agreement to dispose of its Gulf of Mexico properties and is negotiating another sale. These disposals will generate approximately $53 million, although Baker Hughes’ other exploration properties are to be written off.

Loss

Overall, the company will record an after-tax loss of approximately $75 million in the fourth quarter as a result of these sales and asset write-offs. In taking these actions, the company will avoid one-quarter of a billion dollars in future capital expenditures over the next few years.

Nigeria

The company will retain its interest in the OPL-230 property in Nigeria, which currently produces approximately 25,000 barrels of oil per day.
Western Geophysical and Magic Earth have reached a global agreement on cooperative use of Magic Earth’s GeoProbe virtual reality seismic interpretation software. The volume visualization software will be combined with Western Geophysical’s seismic data acquisition and processing resources.

Multi-client

The Magic Earth system will be deployed worldwide to showcase Western’s multi-client seismic libraries and to aid in planning seismic acquisition and data processing. The deal also allows for global deployment of GeoProbe throughout Western Geophysical operations – to give Western’s clients access to the high-end interpretation software. A joint marketing and distribution agreement will underpin further development and implementation of the software to end-user clients worldwide.

Jones

Western Geophysical President Gary Jones commented “Magic Earth’s technology can be used to unlock value contained within Western Geophysical’s 3-D seismic libraries and data processing and reservoir services. This unique relationship is intended to help our customers map and interpret more data in less time. Western has the quality data, global reach and skilled people, and now we have added a powerful visualization capability in Magic Earth, which can bring substantial value to our customers.”

Zeitlin

Mike Zeitlin, Chairman and CEO of Magic Earth, added “Through Magic Earth’s strategic relationship with Western Geophysical, we can quickly deploy our innovative solution to the marketplace, integrated with Western Geophysical’s seismic processing technology and global network. This can provide our customers with outstanding value in both their data and the means to rapidly interpret that data using GeoProbe.” Western is now also to share responsibility for the joint operation of Magic Earth’s first immersive visualization center in Houston.
IBM and Shell International Exploration & Production are collaborating on what is claimed to be the world’s largest ‘Beowulf,’ a clustered supercomputer built from Intel-based PCs running Linux.

2 teraflops

The new machine will comprise 1024 IBM x330s packaged in 32 racks, all running Linux and capable of performing two thousand billion floating point operations per second (2 teraflops). The largest commercial Linux cluster IBM has ever built, it is due for delivery in January. Shell will use the supercomputer to run seismic and other E&P applications.

Berger

IBM VP of technology and strategy Irving Berger said “The fact that Shell has decided to run these applications on an IBM Linux supercomputer demonstrates that Linux is coming of age. It shows that Linux can scale to meet the high-workload demands of even the most progressive supercomputing tasks.”

Princeton

Prior use of Beowulf clusters in seismic processing was reported by Advanced Data Solutions and by Princeton University (see PDM Vol 4 N° 10). The Beowulf project was initiated by NASA in 1994. Since then the technology has gained in popularity – a Beowulf cluster was used to simulate both the ship and the ocean and wave action in the movie Titanic.

Burr

Jack Burr, Shell’s Principal Research Physicist and designer of the Beowulf cluster, told PDM “Shell International E&P’s R&D is considered to give the company a significant competitive edge. We have a superb library of processing algorithms, but up till now we have been lacking the means to deploy them effectively. The new machine will allow us to do this, and will also let us test out some very interesting new algorithms that we are developing.” See PDM’s interview with Burr inside this issue.
VR for ESRI
Evans and Sutherland is offering an add-on to ArcView to allow for photo-realistic representation of 3D GIS site models.
Baker Hughes abandons E&P
Baker Hughes is disposing of its direct interests in producing oil and gas properties.
Virtual reality seismic processing for Western
Western Geophysical is to set up its own ‘Reality center’ in Houston in a joint venture with Magic Earth. The new center will showcase Western’s multi-client data and provide VR capability to seismic processors.
Massive Beowulf cluster for Shell
Shell International E&P has commissioned a clustered ‘Beowulf’ supercomputer from IBM. The machine is built from 1024 Linux-based IBM servers and has a theoretical 2 teraflop capacity.