As you know, PDM has a website – www.oilit.com. This was originally developed using Microsoft FrontPage, which is supposed to remove the need for server-side programming. Because HTML is “dumb”, server-side programs are required when, for instance, a website visitor fills out a form, or performs a full text search across the site. Of course there is a snag in using FrontPage (FP). Your internet provider has to play ball and run an FP-compatible server. This can be done in two different ways.
Either the internet service provider (ISP) goes the whole Microsoft hog and runs everything on Windows NT with the Microsoft Internet Information Server extensions, or they elect to run FP extensions on their regular Unix boxes. You have to remember that most websites run the Unix/Linux Apache server. My provider, Verio, runs the FP extensions on a Unix box, and they work OK, up to a point.
Unfortunately, if you want to tailor the FP search engine even a little bit, you enter an inexorable logical path which Microsoft expects you and your ISP to take. It goes like this. One – you want to use FP to do your server side stuff? Just get your ISP to run the FP extensions on his Unix box. Two – you want a bell or whistle or two more out of FP? Just get your ISP to throw away the Unix boxes and move to Windows NT.
In other words, through a series of ‘gotchas’ Microsoft expects the end user to force the ISPs to jettison Unix! Or perhaps I am seeing a conspiracy where there is none. Maybe Microsoft just forgot to include the bells and whistles in FP by accident. But it is hard to understand why Microsoft, whose success is based on supplying massive functionality to the end-user, should now turn around and start playing to the centralized IT manager’s gallery. But I digress...
Faced with the dilemma of either upgrading my account with Verio (they do run a few NT boxes..) or installing my own server-side search engine on my Unix account (a hassle), I stumbled on another solution – ASP. PDM has been investigating application service provision for the past few months, and now we had the chance of putting it to the test ourselves. Text searching is a prima facie candidate for ASP. Apart from avoiding the hassles mentioned above (you neither know nor care what system is running whose software), you can expect to have a configurable search engine, professional-looking output, and hopefully, more bells and whistles than your vanilla search components à la Microsoft.
Go to it
How does it work? First you fill out a few forms on the ASP website – telling it what directories to search and choosing a presentation format. Next you tell the ASP’s indexing motor to go to it. This took around an hour for the 1000 or so pages on the oilit.com website. Next you cut and paste a few lines of HTML supplied by the ASP into your web page. That’s it! Now when a user launches a query on the PDM website, the query is actually carried out on the ASP’s machine, which returns a professional-looking list of hits worthy of Alta Vista. We can even use the search from a ‘dumb’ page of HTML on a local machine – no need for extensions of any sort.
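The pasted HTML snippet is essentially just a form whose action points at the hosted search service, so the visitor’s browser sends the query straight to the ASP’s machine. A minimal Python sketch of the URL such a form would generate on submission – the hostname and parameter names here are invented for illustration, not the provider’s actual API:

```python
from urllib.parse import urlencode, urlsplit, parse_qs

# Hypothetical hosted-search endpoint; the real provider assigns
# its own host and site-account parameter names.
SEARCH_HOST = "https://search.example.com/search"

def build_query_url(site_id: str, terms: str, page: int = 1) -> str:
    """Build the URL a search form submission would request."""
    params = {"site": site_id, "query": terms, "page": page}
    return SEARCH_HOST + "?" + urlencode(params)

# The query never touches the publisher's own server:
url = build_query_url("oilit-demo", "data model", 2)
```

Because everything happens between the browser and the search host, the page carrying the form can be plain, static HTML – which is exactly why no server extensions are needed.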
So far we are pretty happy with the service (from AtomZ.com by the way). It does cost money, but has removed a thorn from our flesh. Seductive as the ASP logic is, before you outsource all your enterprise portal’s search functionality to an ASP there are a few mindsets which will need changing. It is exciting and quite easy to see how IT resources have become virtual, and how you no longer need to worry about what is being done how or where.
But are you going to let a third party index (and potentially read!) all your contracts, tenders, joint venture agreements, and other sensitive information? The ASP would no doubt claim that their contractual engagement and security ensures confidentiality. They probably never read any of the material they index! But I think it will be a while before all this activity is outsourceable. Which leaves the portal manager with the following problem. If you have to manage a part of your corporate information internally, then you might as well do it all!
So to conclude, how about a business opportunity – ASP-ed ASP! What is required in such circumstances is for the ASP to provide and maintain the binary, but to let the end-user run the program on his/her own machine. With the separation that the web imposes, and good design, it should be possible to upgrade and maintain the binary remotely and offer the security-conscious end-user the assurance that mission-critical information is not being perused off-site.
The recipient of the award is the UK government, in the form of the British Geological Survey (BGS), partnering with ESRI, which will receive £300,000 over the next three years. Helen Liddell, Minister for Energy and Chair of PILOT, said: "DEAL will exploit the Internet and e-commerce to provide a single virtual repository for UK data. DEAL will promote competitiveness and encourage activity on the UK Continental Shelf."
Massive cost savings are forecast for the project by reducing “the time taken to search for data and eliminate duplication in data storage.” A public website www.ukdeal.co.uk will provide the “definitive” index database for exploration and production activities and will act as a "virtual shop window."
The first phase, due in September 2000, is the Information Service, which will provide a single, comprehensive set of basic information on the UKCS, including geospatial and well data, seismic surveys, platforms, pipelines, quadrants/blocks, international boundaries, coastlines, safety zones, coastal shipping lanes, environmental exclusion zones and other public domain data.
Later this year the Data Market Place will add e-commerce for E&P data. An ambitious time frame promises a “Unified Data Network” early in 2001, with an inclusive "virtual environment" allowing companies to access data on an entitlement basis. An agreement with the Department of Trade and Industry (DTI) will be extended to let subscribers fulfill UKCS license obligations via DEAL.
DEAL will be managed by Common Data Access Limited (CDA), recently acquired for a nominal sum by the UK Offshore Operators Association (UKOOA). UKOOA director general James May is now CDA chairman. May said: "UKOOA is a natural home for CDA, given our organizations’ shared membership. The merger will allow closer collaboration between CDA and UKOOA’s technical committees giving CDA direct access to the information it needs."
PDM comment - How many protagonists does it take to run a data repository? The answer, for the UK, is quite a few - like the DTI, CDA, UKOOA, OGTIF, PILOT, LIFT, DEAL (BB King and Doris Day!)
Tobin International, a supplier of land management data and software, has released its Online Data Catalog (ODC). Offering a “complete e-business solution” to landmen, ODC is built on ESRI’s ArcIMS web mapping technology. Users can browse maps, query spatial objects and create visual representations of coverage.
The ODC is a portal into over 6TB of Tobin’s digital map products. Other recently announced Tobin products, BaseMap USA and OverVue have been released as part of the ESRI Geography Network. BaseMap USA is a GIS-ready digital land base of the lower 48, Alaska and the offshore GOM. OverVue adds GIS-referenced imagery, graphics and terrain information.
Tobin joins ESRI for the launch of the Geography Network. Tobin’s Marty Schardt said, "The magnitude of offering data and map services to the largest collection of GIS users in the world is incredible. We plan to deliver many new products to enhance the Geography Network as this rapidly changing technology continues to evolve."
Oslo, Norway-based Technoguide has released a new version of its shared earth modeling software, Petrel. Since its release in December ‘98 (PDM Vol 3 N° 11), over 100 Petrel licenses have been sold. Petrel 3.0 is described as “workflow-oriented” and spans seismic interpretation to production.
Technoguide claims that Petrel is “10 to 100 times” faster than other Windows-based tools thanks to optimized code and advanced mathematical algorithms for gridding and modeling. Petrel deploys “state of the art” object programming techniques and leverages the Microsoft Foundation Classes (MFC). Petrel’s 3D graphics take advantage of the OpenGL API for rendering, texture mapping and special visualization effects.
New modules include well correlation, data analysis with cross plots, histograms and variograms of various reservoir properties and enhancements to the 3D grid module. Plotting is carried out to Windows Metafiles or using the many plotters and printing formats supported by Windows. Technoguide has offices in Oslo, Houston, London, Aberdeen, Calgary, Perth, Beijing and Stavanger. Petrel is available for under $20K. More from www.technoguide.no.
GeoGraphix now offers raster log capability as part of the Discovery Suite. The new functionality combines the affordability of raster logs with the power of digital well logs.
Geoscientists can incorporate paper-based logs into their integrated interpretation and mapping, by scanning their own in-house paper log libraries or by importing TIF images from log data vendors. GeoGraphix Discovery is a suite of geoscience applications built on a shared data management and common visualization foundation to provide dynamic integration between geological and geophysical data.
GeoGraphix’ Richard Slack said "We have been watching this market closely for the past twelve months to see what impact raster logs would play in the interpretation workflow. Most companies want an integrated solution - one that enables them to build cross sections as part of a total geological and geophysical interpretation.
Because our cross-sectional capabilities are a fundamental part of our integrated geological and geophysical interpretation system, they provide customers with a complete, interpretive solution.” More from www.geographix.com.
Software toolkit provider INT is offering Application Service Providers (ASPs) a web seismic viewer (WSV). With the WSV, end-users can view seismic data within a web browser from a remote SEG-Y data server.
A special client-server data transfer protocol has been devised so that the viewing and the download of traces are handled in separate threads. This allows viewing to start as soon as enough traces have been downloaded for the initial display, maximizing perceived speed.
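The two-thread scheme described above – one thread streaming traces, another starting to draw as soon as the first screenful has arrived – can be sketched with Python’s standard threading and queue modules. This is a toy illustration of the idea, not INT’s protocol; the trace download is simulated:

```python
import threading
import queue

INITIAL_DISPLAY = 3  # traces needed before the first paint

def fetch_traces(n_traces, out_q):
    """Downloader thread: streams traces into a queue, then a sentinel."""
    for _ in range(n_traces):
        out_q.put([0.0] * 8)   # stand-in for one downloaded seismic trace
    out_q.put(None)            # sentinel: download complete

def display(in_q):
    """Viewer thread: paints as soon as enough traces have arrived."""
    buffered, painted = [], False
    while True:
        trace = in_q.get()
        if trace is None:
            break
        buffered.append(trace)
        if not painted and len(buffered) >= INITIAL_DISPLAY:
            painted = True     # first paint precedes end of download
    return buffered

q = queue.Queue()
downloader = threading.Thread(target=fetch_traces, args=(10, q))
downloader.start()
traces = display(q)
downloader.join()
```

Decoupling the two activities through a queue is what lets the display start before the transfer finishes, which is the source of the perceived speed-up.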
The seismic viewer Java applet is downloaded automatically to the web browser when needed and requires the Java 2 plug-in. Check out the web seismic viewer on www.int.com.
But progress is slow on the cooperative project.
The EuroFinder project (see PDM Vol 3, N° 4) was instigated after many Finder users realized that they were independently devoting significant resources to the same Finder customizations. EuroFinder began in 1997 and reached critical mass after a couple of years, with twelve participating companies. The Phase II development, which began in 1998, extended the Finder data model, following GeoQuest and using POSC’s Epicentre data model.
The plan all along was that GeoQuest would incorporate the EuroFinder extensions into future releases of Finder. Currently, the plan is to include EuroFinder Stratigraphy in Finder release 9.0 and to continue to work on the other domains beyond the 9.0 release.
Implementation of the Well Header Module is imminent. EuroFinder customization for North Sea use includes well header localization for elevation references and datums, new referential integrity constraints on all enumerated data types and the addition of attributes to model the complex linkage of sidetracks to preexisting boreholes. Elsewhere, more granularity has been introduced into lease modeling, with lease attributes replaced by foreign keys to a new lease data model.
Similarly, business associate attributes have been replaced by foreign keys to the new business associate model. Similar changes are underway in areas such as stratigraphy, logging, drilling, and seismics and a set of standard meta-data attributes are being introduced to track transaction history, data source and verification status.
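The refactoring described – flat attributes on the well record replaced by foreign keys into dedicated lease and business-associate tables – is ordinary relational normalization. A toy sqlite3 sketch of the lease case; all table and column names are invented for illustration, not EuroFinder’s actual schema:

```python
import sqlite3

# Instead of storing lease attributes directly on each well record,
# the well now carries a foreign key into a separate lease table.
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # enforce referential integrity
conn.execute("""CREATE TABLE lease (
    lease_id INTEGER PRIMARY KEY,
    name     TEXT,
    operator TEXT)""")
conn.execute("""CREATE TABLE well (
    well_id  INTEGER PRIMARY KEY,
    name     TEXT,
    lease_id INTEGER REFERENCES lease(lease_id))""")
conn.execute("INSERT INTO lease VALUES (1, 'Block 33', 'ACME Oil')")
conn.execute("INSERT INTO well VALUES (100, '33/1-A', 1)")

# Lease details are now reached through a join rather than duplicated
# on every well row:
row = conn.execute("""SELECT w.name, l.operator
    FROM well w JOIN lease l ON w.lease_id = l.lease_id""").fetchone()
```

The gain is the one the article implies: lease details live in one place, and the database itself can reject a well that points at a non-existent lease.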
PDM comment : EuroFinder progress has been heavy going and not all members are satisfied with the effort that GeoQuest has put into the project. This is surprising since the EuroFinder project is the archetype of the kind of outsourcing that the service industry at large is advocating. Why then doesn’t it work better? Is it that GeoQuest is not all that interested in the economy of scale and effort that such a collaborative venture should produce? Is it that data modeling by committee is doomed from the outset? Is it that Epicentre is an unwieldy tool? Answers on a postcard please.
Calgary-based Sterne Stackhouse Inc.’s (SSI) traditional market has been the oil and gas industry. SSI’s Petro-LAB suite is a real-time query and mapping package allowing for retrieval of E&P objects such as real-time well tickets, enhanced PL100 reports and detailed SegP1 seismic reports.
Now Petro-LAB’s real-time ad-hoc querying and self-service reporting has been extended into a new, non-E&P-specific data mining package, Labrador, which has just been evaluated by consultants from Boston’s Aberdeen Group. Their report, "Labrador: Redefining the Value of Query Software", describes the package as “a revolutionary business intelligence solution.” The report’s author, Bob Moran, said Labrador’s strength lay in "allowing business users to think and analyze in the language of their business".
The Aberdeen Group found that Labrador improves on existing online analytical processing (OLAP) and query tools, and that the Labrador framework has eliminated the need for a technical specialist to continually mediate between the source database and the end business user. The report concludes: "Labrador delivers query software that reaches well beyond reporting to become a true analytical 'tool of the masses'". Labrador was field tested by Denver cancer treatment specialist OpTx. SSI CEO Ron Sterne said "We decided to delay undertaking independent research until we had clearly proven that Labrador thrives in very complex data environments outside of oil and gas, such as OpTx Corporation’s forty oncology datasets. Needless to say, we are pleased to be viewed with such high regard by a leading research firm." More from www.sternestackhouse.com.
Paradigm demonstrated its new web-based E&P applet technology at the EAGE exposition held in Glasgow this month. Web-enabled versions of Paradigm’s software are said to “facilitate project collaboration across organizational boundaries.”
Paradigm’s Mark Walker said “We can now run high-end geoscience applications from anywhere with Internet access. Plug and play applets let asset teams access unlimited data volumes through the simplest of web tools.” The web-enabled products demonstrated are GeoLog, with the Well Data Server (see PDM Vol. 5 N° 4) interface providing remote access to third party legacy databases. Users now have access to their corporate data from remote locations such as the rig site using ordinary telecommunications facilities.
Paradigm has teamed with Ilex Technologies on web-enabled data loading to GeoLog. The technology enables remote hosting of raw data.
The new PetroPower Interpretation Workstation is tuned to run software such as SMT’s Kingdom Suite, PrimeGeoscience’s 3D Attribute Visualization and EDS’ APEX Seismic Processing.
The available power lets seismic processing run in the background while interpretation continues. Apart from the potential cost savings accrued by processing and interpreting everything on the same workstation, SCS anticipates more integrated project management with the same geoscientist performing a “total geo-evaluation.”
A high-end SCS PetroPower-NT Model 866-DX dual-CPU Interpretive Workstation running the Kingdom Interpretive Suite acts as the seismic Process Control System (PCS). Task priorities can be set between processing and other functions. The PCS is connected to one or more SCS PetroPower NT Interpretive Workstations by 100 Mbps Ethernet.
The APEX seismic processing suite uses both processors. For more power, inexpensive Auxiliary Processing Units (APUs) can be networked to the PCS. An SCS configuration has processed a 150 sq. km 3D seismic volume in about twelve days, with post-stack migration in five more. An entry-level PetroPower Workstation costs around $10,000, with the APU a further $3,675.
The Geoshare AGM was held last month in Houston during the fourth annual PNEC Petroleum Data Management and Integration conference. Last year the Geoshare board of directors established a three-year strategy focused on communications, experts lists, code examples, sample data sets, and a scorecard to determine the board’s effectiveness.
Outgoing chairman, Vastar’s Ken Bastow, presented the status of the strategy: while the communications objectives have been achieved, it is too early to judge the effectiveness of the other initiatives. Ken remains on the Geoshare board as former chairman and the following new officers have been elected: Jack Gordon - Chairman, Steve Hawtin – Vice Chairman, Dan Shearer – Secretary and Jim Theriot – Treasurer. The general board members are William F. Quinlivan and Helen O’Connor. The standing committee chairmen are Ravi Nudurumati – ANSC and Ellen Lasseter – DMEC. The new Geoshare web page was demonstrated and is now easier to navigate. New features include the list of experts, sample data sets, and code examples.
Data model update
The Geoshare data model is driven by user requests for changes to either the data model or the ancillary standards, submitted one month before the meeting. There were 14 issues discussed at the meeting. Changes are voted on a one-company, one-vote basis. Accepted changes will be incorporated in the new data model to be released this fall.
There was a lot of discussion about the implications of XML for the Geoshare data model. After the discussion, the action was tabled to the Geoshare WEBSIG. If you are interested in seeing an XML implementation of the Geoshare data model, please address your comments to the WEBSIG at firstname.lastname@example.org.
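To make the XML question concrete: an XML rendering of a data-model object is just a structured, self-describing text serialization. A sketch in Python of what a well record might look like in XML – the element and attribute names here are invented for illustration and are not any Geoshare standard:

```python
import xml.etree.ElementTree as ET

def well_to_xml(uwi: str, name: str, td_m: float) -> str:
    """Serialize a hypothetical well record to an XML string."""
    well = ET.Element("well", uwi=uwi)       # unique well identifier
    ET.SubElement(well, "name").text = name
    depth = ET.SubElement(well, "total_depth", units="m")
    depth.text = str(td_m)
    return ET.tostring(well, encoding="unicode")

doc = well_to_xml("1234567890", "33/1-A", 3150.0)
parsed = ET.fromstring(doc)  # any receiving system can parse it back
```

The appeal for a data-exchange standard is that both sides of a transfer can validate and parse such documents with off-the-shelf tools, rather than bespoke readers.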
Times have been hard for the seismic industry over the last couple of years. Having launched a fleet of super-ships just at the tail-end of the last industry boom, it has been a case of sitting-tight and waiting for post-megamerged clients to loosen-up the purse strings.
But there are signs that the hi-tech boats are coming into their own now with news from both CGG and PGS of recently completed super-surveys. CGG’s Alize flagship is popping the last pop on a 4400 sq. km. non-exclusive survey over Block 33 offshore Angola.
70 sq. km. per day
The vessel deployed a dual source, 10 streamer configuration with a 900 m distance between outer streamers. Production exceeded 70 sq. km. on several days. Block 33 is one of the three deepwater blocks offshore Angola which attracted a combined total signature bonus in excess of USD 1 billion.
Meanwhile in the mid-Norwegian deepwater sector, PGS claims a record for the streamer configuration used in its latest multi-client 3D survey. Operating in the western part of the Donna Terrace, the PGS Ramform vessel is towing 12 streamers, each over 5 kilometers long. Totaling 61.2 kilometers of in-sea streamers, PGS claims this survey is the largest ever spread towed.
Onboard data processing will be carried out using the Company’s massively parallel onboard processing supercomputers and proprietary software. The 2,300 sq. km. survey will use a single source, high resolution configuration.
Modeling has shown that high-definition surveying will improve imaging beneath the complex overburden. BP Amoco and Chevron have “substantially” pre-funded the survey. Work is expected to be completed in September. PGS sees a great future for high-definition surveying and expects to be pushing the envelope further to the maximum 20-streamer capacity of the Ramform.
Following the deal with Anadarko (see PDM Vol. 5 N° 5) Internet well log data vendor A2D Technologies reports further success with its e-commerce Log-Line site.
Business has increased by 42% since the beginning of the year and 60 oil and gas companies have opened Log-Line accounts in the last two months.
New clients include Basin Exploration, Burlington Resources, Chesapeake Energy CXY Energy, Range Resources, Samson, Southwestern Energy, St. Mary Land & Exploration, Stone Energy and Tom Brown.
A2D’s Executive Vice President Kim Boddy said "Increasing data availability and efficient e-business solutions are being adopted by the oil and gas industry at all levels." A2D believes e-business solutions cost less than traditional methods of data acquisition and management. A2D claims pioneer status in providing quality well log data over the Internet since 1997. Currently, online data is available for 200,000 wells from most major basins in the United States.
Data is available for Gulf of Mexico, Gulf Coast, Permian Basin, Rockies and Mid-continent wells in TIFF raster log images, LAS digital data, and smartRaster image files. Velocity data is also available in the Gulf of Mexico including key deepwater wells. Check out Log-Line, on www.a2d.com.
BP Amoco has just published its annual Statistical Review Of World Energy. Demand was flat in 1999, mainly due to a large fall in Chinese energy use (which intriguingly happened “without discernible impact on Chinese economic growth”). Oil prices recovered strongly from 1998, rising 39 per cent to an average $18.25 for the year.
OPEC production fell 5.4 per cent to 29.3 million barrels a day. Non-OPEC production also fell, by 3.8 per cent in the USA and 4.8 per cent in Mexico. UK production, however, rose by 3.4 per cent, with ten new fields commissioned during the year. Gas gained market share, rising to over 24 per cent of total energy consumption.
Coal continued to lose market share, while nuclear power use rose 3.8 per cent, largely due to an 8.0 per cent rise in the USA where re-licensing of existing plant supported output despite no new plant commissioning. Use of hydroelectricity continued to slowly grow, up 0.9 per cent in 1999, but is significantly influenced by the weather on a year-by-year basis. Production of energy from other renewable sources remains tiny by comparison with other sources. Wind is the fastest growing of the renewable sources. The BP Amoco Statistical Review of World Energy 2000 is published on the internet at www.bpamoco.com/worldenergy.
Advanced Geotechnology Inc (AGI) will be launching a new Joint Industry Project in September 2000 to develop new borehole stability, lost circulation, and sand production risk analysis software. StabView 2.0’s new features will include real-time capabilities, multiple zone analyses, a new database, sand production tools for weak formations, and more.
StabView software will be available in an Application Service Provision (ASP) mode. AGI and Houston-based GeoNetservices have signed a letter of intent to offer the StabView software package online to customers.
GeoNetservices offers a suite of over 100 petroleum-related software programs for users to test drive or rent on a time-metered basis.
AGI has a major new StabView client with the signing of the Taiwanese Chinese Petroleum Corporation (CPC), the state oil and gas company. CPC is involved in exploration and production activities in South-east Asia and abroad, including gas storage operations. A customized training course on rock mechanics applications to drilling and completions is being provided as part of AGI’s agreement with CPC. For more information on AGI’s Joint Industry Program contact Pat McLellan at email@example.com.
Arthur Ballard has joined Knowledge Systems as director, Latin America. Ballard comes from Enron where he was director of business development.
Sterne Stackhouse has appointed Steven Parent as VP Corporate Development. Parent was previously with software house Hummingbird.
Fakespace Systems has poached Jerry King from HP’s technical computing division to become manager of business development for commercial systems.
Javan Meinwald has moved from Paradigm to become VP of Sales at Neuralog.
Baker Hughes has named Michael Wiley as the new CEO. Wiley was president and COO of ARCO prior to its sale to BPAmoco in May.
Jake Drouiter is local manager at Petrosys’ new Calgary office.
Bill Farquhar and Ian Patrick have joined Keith Doherty and Sue Whitbread in start-up A&D consultancy Petresearch. Farquhar was previously with Shell and Patrick with Monument Oil and Gas.
PDM – what was the rationale behind outsourcing document management at the new plant?
Larkin - The Prairie Rose chemical plant at Joffre Alberta was commissioned from constructor Bechtel last year. About the same time, the BPAmoco organization was gearing-up with a new slug of reporting requirements, reduced head count and a new vision of what was core business. We saw outsourcing the IT infrastructure of the document repository as an opportunity to capitalize on the all-digital dataset and at the same time benefit from the economies of scale that ASP promised.
Grouse – The ASP model lets BPA focus on the use of engineering documents without worrying about the supporting infrastructure. The secure extranet allows BPAC’s 800 employees to access engineering, finance and contract documents. Xerox acted as systems integrator for the Documentum solution, but went a step further with a full-blown off-site IT infrastructure.
PDM – we have seen other European implementations of engineering information systems built with sophisticated data models. Was this the case at Prairie Rose?
Larkin - This was not a data modeling exercise, though when implemented, the HSE and ISO 14001 reporting for document control and management will deploy a structured DMS. We have, however, implemented knowledge management technologies like Xerox’ AskOne natural language query to analyze accident causes.
PDM - Why Documentum?
Grouse – Documentum is flexible and can be made to relate to the business process. It makes good use of key word search and allows for intelligent version control, change management and workflow.
PDM - What infrastructure is used for ASP?
Grouse - Hardware is in Xerox’ Calgary office. The BPA users are about 100 miles north and connect over the internet. Within the Xerox data center we run 100 Mbps Ethernet, but T1 (1.5 Mbps) bandwidth is sufficient for client sites. All server software runs on Compaq servers and Windows NT. Storage requirements in 2000 are in the 25-50 GB range, growing to 150-200 GB by the end of 2001.
PDM - What’s on the BPA client workstation?
Grouse – Just a browser! The ASP applications are web-based. They may require browser plug-ins, for instance to markup engineering drawings.
PDM - How much did it all cost?
Grouse - The up-front application design, development, and deployment costs vary substantially depending on the application being developed. Our applications at BPAmoco ranged from $40,000-$180,000 CDN for the upfront design, development, and deployment costs. Overall we have estimated the ongoing cost for a view-only user at $20 CDN per month.
PDM – what’s next?
Larkin - With an estimated 50% savings over what it would have cost internally, I really believe that we have a valid solution for a difficult problem. I fully intend to “sell” the merits of this solution to other BPA units. In fact I’m already planning my next trips to Anchorage and Europe. Another aspect of doing business over the wire is the obvious synergy of cable networks with our pipeline routes.
Odegaard is joining the Aberdeen-based 4D seismic solutions company Seismic-To-Simulation as an equal partner with the two UK founding companies, IKODA and TRACS International. Odegaard has been working closely with STS since its inception in 1998, contributing its ISIS inversion technology to STS’s 4D seismic methodology. STS has established a track record with oil industry clients such as Shell UK, Enterprise, Amerada Hess and Talisman for delivering significant cost savings through its seismic data processing and reservoir analysis.
STS has demonstrated that legacy seismics can be used as a base-line for 4D seismic acquisition. Time lapse field models can be tested by new acquisition at intervals determined from STS’s forward modeling. It is claimed that this approach reduces the number and scale of such surveys as compared to conventional blanket 3D seismic coverage.
Odegaard’s Kim Gunn Maver said “4D seismic has been slow to catch on despite the benefits in maximizing reservoir recovery. We believe that specialist treatment of companies’ existing data provides a cost effective alternative.” Maver joins Martyn Millwood Hargreaves of IKODA and Mark Cook from TRACS on the STS Board alongside STS managing director Dave Davies, who has recently been appointed a director of Odegaard.
Odegaard recently released its new Lithology Cube interpretation software for AVO interpretation of 3D volumes. The ’Cube is said to take geologists and engineers “straight to the rock properties of a reservoir.”
If you work with seismics you will occasionally see potential field data (gravity and magnetics) as rather uninspiring wavy lines along the top of the section. Once in a blue moon you may even use them to check if an intrusion is salt or volcanics.
Cart before horse
But if your business is grav and mag (for instance for crustal investigations) this is really putting the cart before the horse. Which is where Corvallis, Oregon-based Northwest Geophysical System’s latest software comes into its own. The new release of GM-SYS, version 4.6, puts the seismic data right inside the potential-field model. Other enhancements in the new release include improved data management, integration with Oasis Montaj and new display preferences. GM-SYS runs on Windows, Unix and Linux. More from www.nga.com.
Dave Abbott is a man with a mission. A geophysicist and software engineer with 17 years industry experience, Dave founded Mountain Man Geosciences to further his work in volume visualization and software engineering. MMSeisVis is the first major software release from Mountain Man Geosciences (MMG). MMSeisVis is a PC-based 3D seismic visualization and interpretation package.
Running on Windows ‘98 or NT, system requirements are modest. MMSeisVis provides "Quicklook" prospect generation with transparency tools as well as detailed horizon and fault interpretation. Seismics can be viewed as inline/trace/time slices, random vertical slice, solid cube, transparent cube and others.
Horizon interpretation can be performed by auto detection from a single seed or manually. Surfaces can be imported, edited and exported in ASCII to mapping packages. Other functionality of MMSeisVis includes straightforward fault picking and stratigraphic analysis with the surface-bounded viewing object.
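Single-seed horizon auto-detection typically works by following an event from the picked trace onto its neighbours. A toy Python sketch of the idea: from a seed (trace, sample) position, pick the strongest amplitude on each subsequent trace within a small search window around the previous pick. Real autotrackers (MMSeisVis included, presumably) use correlation and quality thresholds; this is only an illustration of the principle:

```python
def autotrack(section, seed_trace, seed_sample, window=2):
    """Follow a horizon rightwards from a seed pick.

    section: list of traces, each a list of amplitude samples.
    Returns a dict mapping trace index -> picked sample index.
    """
    picks = {seed_trace: seed_sample}
    current = seed_sample
    for t in range(seed_trace + 1, len(section)):
        lo = max(0, current - window)
        hi = min(len(section[t]), current + window + 1)
        # pick the sample with maximum amplitude inside the window
        current = max(range(lo, hi), key=lambda s: section[t][s])
        picks[t] = current
    return picks

# A tiny synthetic section whose peak dips one sample per trace.
section = [[0, 9, 0, 0, 0],
           [0, 0, 9, 0, 0],
           [0, 0, 0, 9, 0]]
picks = autotrack(section, seed_trace=0, seed_sample=1)
```

The window constraint is what keeps the tracker on one event rather than jumping to an unrelated strong reflector elsewhere in the trace.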
Additionally, horizontal wells can be plotted and cultural data can be visualized. Version 1.0 of MMSeisVis is available now at the knock-down price of $2,000. Take the Mountain Man software for a test climb.
Bingen, Germany-based NSM Storage GmbH is shipping the MediaManage CD/DVD-ROM server with capacity of up to 600 CD/DVD-ROM images in a single network volume. The high-end MediaManage Server is built around a dual-processor Pentium PC and boasts a twin SCSI-3 controller.
In this configuration, the MediaManager server has 64-bit PCI slots, an additional SCSI-3 RAID controller and 256 MB of RAM. An entry-level DataProvider server with the MediaManage software comes in at DM 9,500 (about $5,000) for 50 virtual CD/DVD media.
According to Hugh Tucker, Corporate Strategy Manager for Elf Oil UK, Knowledge Management (KM) is about the “people side” of IT. Raw information needs to be sorted and checked for relevance before it can be understood. Technology is the enabler for more efficient capture and sharing of knowledge, but “people issues” are still a problem. Knowledge is still power, and people are afraid of their ideas being stolen. In Elf Oil UK, KM has been successfully applied to optimizing commodity prices. KM gives constant updates of cost vs. margin and the process can be reversed to select which market to serve. Tucker advocates a K-facilitator – someone not connected to Divisions, an outsider offering an impartial balanced view. Ultimately, a Division [of the company] may need to do something which doesn’t suit its own ends, but is for the “greater good”.
Dog food (again!)
Ron Mobed (GeoQuest) believes that a service provider should use its own KM products internally – in other words should “eat its own dog food”. Schlumberger GeoQuest, with help from the American Productivity and Quality Center (APQC), has developed the K-Hub, which collects dispersed practices from assets and practitioners. These are filtered and validated before being integrated into the K-Hub. The Hub’s components are bulletin boards, document management, workflow, data management, expert directories, help desks and FAQs – the whole caboodle! Various initiatives have contributed to Schlumberger’s KM experience. “In Touch” – field personnel were quizzed as to how service could be improved. “Eureka” – to share, not “steal”, ideas in geophysics, chemistry, IT etc. Finally, the Coiled Tubing Drilling Library – a shared initiative with BP – allows, for instance, a Houston-based drilling engineer to collaborate with a Paris-based expert on fluid flow in a horizontal well.
Total-Fina-Elf’s Bertrand Mélèse mused on the advantages to be gained “if Total Fina knew what Total Fina knows.” But according to Mélèse, KM technology is immature. Multi-platform and cross-product search engine technology does not work well – particularly with Lotus Notes databases.
Walk and talk..
AMEC, the construction company, has implemented KM successfully according to Chief K-Officer Ruth Mallors. AMEC’s 22 Communities of Practice (COP) have allowed knowledge gained on an Australian offshore production facility to be applied to the UK rail sector – an offshore platform is managed very much like a big railway station. Mallors exhorts senior management to ‘walk and talk KM’, but downplays the IT component. For AMEC, simple email is the main tool for discussion, and if need be the COP can get together for a good old-fashioned meeting!
PDM’s best talk (virtual) prize goes to BG Technology (BGT). Martin Vasey and Ken Pratt describe themselves as “between fluffy consultants and intransigent IT”. Beyond the fluff, the restructured BGT now does the same amount of work with three times fewer people, in part thanks to the Knowledge and Information for Everyone (KITE) initiative. KITE flies between two intranet hazards: with too little control, you get non-validated information; with too much control, you have no content! BGT believes in the portal. Some 130,000 pages of legacy documentation were scanned and OCR-ed; “We threw everything in”. Excalibur RetrievalWare allows for a single search across everything. Another killer application is Cartia Knowledge Landscape. This intriguing product scans vast document databases and produces contoured maps of textual similarity, and is used by BG for competitor analysis.
Other key technologies used by the K-managers include AltaVista Forum and the Mezzanine document management system - both used by Shell Global Solutions International. On the technology frontier, the Imana intelligent web text agent has been used to crawl the web and seek out relevant documents and cookies. Other agent technologies under test are Autonomy and “Taboo” a new Microsoft cross platform search engine.
John Pomeroy (Documentum) reported on BHP’s Liverpool Bay document repository – a wholesale replacement of the engineering DMS with a Documentum-based operations-focused repository. The project involved 20 GB of data representing 100,000 documents (mostly CAD and scanned images). HSE reporting is integrated, with dynamic content generated through bi-directional links to SAP R/3. A straightforward end-user interface allows for CAD drawing visualization with zoom, print and redline markup for hot work permitting. The project’s strong points were the team quality, core product stability and good specifications and tender. Problems arose from underestimating the effort required from BHP. The link to SAP was a “struggle.”
Deborah Humphreville (Landmark) coined Yet Another Acronym (YAA – just coined another!) by describing Halliburton and SAIC’s latest alliance as a Knowledge Service Provider (KSP). This offering is scheduled to mature into a brand new e-commerce oil and gas portal, myLandmark.com. The switch is to be thrown towards the end of the year. More on the First Conferences Knowledge Management Strategy show from www.firstconf.com/energy.
The International Center for Gas Technology Information (ICGTI) has selected Excalibur Technologies’ RetrievalWare WebExpress (RWWE) as the search engine for its GTIonline web portal. GTIonline is described as a "virtual marketplace" where buyers and sellers of gas technology can meet to discuss technology issues online.
Under the agreement, ICGTI will use Excalibur RWWE to provide intelligent search to its users, helping them retrieve information from all sources related to the natural gas industry. RWWE indexes and searches a wide range of formats including text files, HTML, PDF, XML/SGML, relational database tables and over 200 proprietary document formats. Search capabilities include concept and keyword searching, pattern searching and query-by-example. RWWE targets distributed client/server environments with thousands of users, voluminous data and/or multiple media assets.
ICGTI Director Mary Lang said “GTI’s users require a robust search engine able to focus on highly technical information relating to all segments of the natural gas chain. Excalibur provides an effective way of indexing and transferring such information to our users.” Founded in 1995, ICGTI now has nearly 1000 members including GRI (USA), BG Technology (UK), the European Commission, Directorate General - Energy, the Danish Gas Technology Center and the U.S. Department of Energy. ICGTI is operated by GTI Inc., a subsidiary of the Gas Research Institute. Visit on www.gtionline.com.
DMS and ILEX are to offer Application Service Provision to the Oil Industry. The new ASP offering, GeoHost, is described as a “remote information management service.” Using the secure Oil Partnering Network (OPNet), GeoHost will provide groupware and other applications on a ‘pay as you use’ basis.
Such applications will utilize ILEX’s GeoVault data management system allowing users to browse, analyze and interpret data sets using a variety of applications from any physical location. GeoVault can be used to manage documents, well log and seismic data using technology developed by Cambridge (UK)-based Petrologic, now an ILEX subsidiary.
Security for remote users is provided by DMS’ private network OPNet which already connects major oil industry service companies and North Sea operators. Access is provided from remote locations using satellite communication systems. (OPNet is deployed in the Opito-Vantage personnel tracking system.) Typical system bandwidth of 2Mb/sec will increase over the next 12 - 18 months with the implementation of a new North Sea fiber network. DMS was established in 1988. Today, the company has 200 employees and a $50 million turnover. More from www.ilexgroup.com and www.dms.net.uk.
La dolce vita is to continue for two more years for Guardian Data’s Milan unit. Guardian has been transcribing Agip’s seismic data since 1997, operating jointly with GeoQuest. The latest tranche of the transcription contract has been renewed following a complete re-tendering process.
The new contract is for a further two years with a possible one-year extension. The center uses Guardian’s proprietary "Dingo" transcription system with output to 3590 high-density tape. A metadata listing from Guardian’s "Seistore" database is also supplied. During the first two and a half years of the project, over 125,000 tapes and cartridges were processed. A similar volume of activity is forecast for the next phase.
Magic Earth, purveyor of high-end visualization technology to the better-endowed oil and gas company, has signed an agreement with Marathon Oil Co. for its volume-based interpretation system, GeoProbe. Magic Earth President Mike Zeitlin said, "We are delighted with the outcome of Marathon’s review and its acquisition of our innovative technology.
We believe that GeoProbe will cut Marathon’s cycle time in 3D interpretation and improve the company’s exploration and production evaluations and drilling programs.” Ron Keisler, Senior VP Exploration for Marathon, added, "Marathon’s agreement with Magic Earth reflects our commitment to be an innovative company. It is the latest example of how we are leveraging technology to grow our business and further enhance our operational efficiency.
Magic Earth’s visual interpretation system, GeoProbe, has the potential to add value to our work processes and improve our competitiveness." GeoProbe is said to transform massive seismic datasets, well logs and other related data into “lucid” 3D images of potential prospects in hours and days instead of the usual weeks and months.
This capability enables geoscientists and business managers to complete comprehensive analyses of the earth’s subsurface more quickly. You can view some GeoProbe imagery in the Virtual Visit of last year’s SEG convention on the www.oilit.com website and of course on the Magic Earth site www.magic-earth.com.
PGS Data Management has entered into an agreement to develop and implement a National Geophysical and Geological Data Bank (NBDGI) for Russia. Partners in the venture are GlavNIVC of the Ministry of Natural Resources (MNR), and The Central Geophysical Expedition (CGE) of the Ministry of Fuel and Energy (MFE). The Russian databank is scheduled for startup this year and will be built around the PGS’ PetroBank.
The databank will be operated as a division of GlavNIVC from a central location in Moscow, with regional centers throughout Russia, from St. Petersburg to Yuzhno-Sakhalinsk. Once the databank is operational, oil companies will be able to connect online, with pre-approved access, to quality-controlled seismic, well, production and cultural data. Bjarte Bruheim, PGS president and COO, said "The National Data Bank will increase efficiency by enabling the oil industry to make swift business decisions, while ensuring the security of national interests.”
GlavNIVC director Garold Lubimov added “We have been engaged in a six-year program of gathering and preparing data for NBDGI. We evaluated several technologies and found PetroBank to be the best. We are confident such state-of-the art and proven technology will secure the future success of NBDGI.”
Bruheim added “This is an exciting step forward in the expansion of our data management business in this region, where we have been active since 1995. We have customers located in Moscow, The Urals, Western Siberia, Sakhalin and the Caspian area. These types of arrangements put PGS in the unique position of securing the majority of the seismic, well, production and cultural data for these key oil bearing regions.
PGS has long been the dominant player in Norway where we currently manage and store virtually all such oil industry information. We are pursuing similar agreements in most regions where the oil and gas industry is actively searching for or producing hydrocarbons, and I firmly believe this approach will prove to be the right strategy for PGS in the long term.”