March 2002


Tobin’s $10M InSight

Tobin has unveiled the results of a $10 million development with the release of its ‘InSight’ GIS-based enterprise data management and browsing environment.

Tobin’s new InSight environment offers users ‘full-spectrum’ management of wells, seismics, scans, leases, imagery and CAD documents. InSight can also capture results from interpretation carried out in other applications such as Landmark’s OpenWorks.

TREX

The product is built around Tobin’s TREX data model, which leverages ESRI’s SDE and offers transactional update and change tracking. A graphical user interface lets users drag and drop groups of database objects to export locations in a project workspace.

XML

Project parameters can be saved as an XML file – and the job can be run from then on as a background cron task to allow for synchronization of project data. Tobin president Charles Ivey said that the development had taken 3 years and cost over $10 million.
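
As a sketch of how such a setup might work – element names, paths and the sync command are all hypothetical, since Tobin’s actual schema is not documented here – a Python script could serialize the project parameters to XML and be scheduled from cron:

    import xml.etree.ElementTree as ET

    # Hypothetical project parameters - invented for illustration.
    params = ET.Element("InSightProject", name="gulf_coast_aoi")
    ET.SubElement(params, "AOI", xmin="-95.5", ymin="28.0",
                  xmax="-94.0", ymax="29.5")
    ET.SubElement(params, "Export", target="/projects/gulf_coast",
                  format="OpenWorks")
    ET.ElementTree(params).write("project_params.xml")

    # A crontab entry could then re-run the export nightly to keep
    # project data synchronized (command name hypothetical):
    #   0 2 * * * insight_sync --params /projects/project_params.xml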

PowerViewer

The InSight browser, ‘PowerViewer,’ is a map-based viewer with some innovative technology. From a selected area of interest in the map view, users can drill down into a document management system such as Novasoft’s Cimage. Tobin showed Oil IT Journal how a click on a well location brings up related documents such as reports, contracts and maps. Alternatively, a selected area can be plotted as a ‘thematic’ map with marginalia – a D-size plot takes 15 seconds. PowerViewer 2.0 will be out in April 2002.

WebViewer

Tobin’s new WebViewer publishes maps for viewing in a web browser. The web maps are active and can be navigated. The Tobin Area of Interest (AOI) tool navigates immediately to the current AOI. Features can be grouped and classified in a Table of Contents. Foreign databases can be explored through an intuitive display of complex data models. Data can be exported to interpretation environments such as a GeoGraphix project.

Ivey

The technology is currently being ported to ESRI’s ArcIMS 4.0. Tobin integrates with ESRI, Landmark and GeoQuest databases. Ivey sees this development as the culmination of a massive effort which he is convinced will succeed where organizations like POSC failed. The objective? To build a workable E&P data model and surround it with usable tools for data management and selection.


IDC acquires Riley

Following its 2001 acquisitions, International Datashare Corp. has struck again with the purchase of Riley Electric Log Inc. for nearly $4 million.

Calgary-based International Datashare Corporation (IDC) is to acquire Riley Electric Log, Inc. of Oklahoma City for a cash payment of $3,950,000 plus an earn-out provision. Riley has one of the most extensive collections of logs in the US, including broad offshore coverage. Riley provides log data in paper and electronic formats, cross-section viewers, custom digitizing services, an electronic online catalog of logs and an internet delivery and ordering system.

Stein

IDC president Norm Stein said, “The acquisition is a significant opportunity and a perfect fit for IDC. The two companies have the same origin and the same proprietary photographic process was used to create both our databases.”

McGinnis

Riley president Jack McGinnis will continue to oversee the U.S. operations. The cash portion of the acquisition price will be financed by the issuance of a debt instrument. Last year IDC completed the acquisition of MSI Capture, AnGIS and Nickle Map Service Ltd. The new transaction is expected to close by the end of March 2002.


GIS - the last bastion of ‘build not buy’

GIS holds a privileged position in corporate IT as one of the few IT ‘products’ that make it into the boardroom. With the increasing functionality of GIS-enabled databases, the technology could take over the whole of corporate IT. Oil IT Journal editor Neil McNaughton argues for caution - and metadata!

Geographical Information Systems (GIS) are everywhere! At the ESRI PUG we witnessed wireless GIS, GPS GIS and GIS on everything from the enterprise master data store to the PDA. GIS is pervasive and seductive. Shell found that 80% of upstream data has a location component - and has embarked on a massive GIS deployment using Oracle and ESRI technology (see the article on page 8 of this issue).

GIS in the boardroom

Most oil companies have implicitly recognized the importance of GIS for several years. Many have built their own GIS front-ends - frequently using ESRI’s technology. To date these have been tools for browsing and assembling GIS data - often to produce a wall map for the boardroom. This connection with the top brass has assured ongoing funding for such limited-scope GIS, and puts the GIS owner in an enviable position compared, say, to someone who is trying to manage log data.

GIS for everything

GIS has status and visibility. New technologies such as ESRI’s Geodatabase make it possible to bring an unlimited amount of corporate data into your GIS - making it immediately available for plotting on a map. You could use GIS to plot not only coastlines and concessions, but also seismic or reservoir attributes. With 3D, you can plot deviated wells and even geobodies - as shown by John Grace of Earth Science Associates (see page 6 of this issue). But before your GIS developers get carried away, I suggest a period of reflection.

Text? Finance?

GIS may be pervasive, but that does not mean that your GIS software has to do everything. Location is a key data type, but there are other ‘components’ of enterprise data. How about finance? How about text? If you come from finance, your view of enterprise data will be financial. If you are from office automation or document management, your worldview will be textual. Enterprise IT needs to cater for all of these at the same time.

Horses for courses

Speaking at the ESRI PUG, Ekaterina Casey from BHP Billiton described how BHP uses ER Mapper to perform image analysis on seismic and other data types. BHP has ‘glued’ ArcView and ER Mapper together to allow processed imagery to be viewed in geographical context. BHP has avoided the temptation to do everything in one environment. Rather than doing everything in GIS, Casey advocates ‘using applications for doing what they were designed to do!’

Vacuum cleaner!

But the temptation to do more and more with GIS is growing. Before ArcGIS 8, ESRI’s toolkit offered limited data modeling capability. Now, with the Geodatabase, you can do full-blown data modeling right alongside your spatial data. Alternatively, with SDE or SDO, you can incorporate a spatial component right inside the corporate database. These technologies can act like vacuum cleaners, sucking enterprise data away from its natural home.

Applications and data

But it gets better (or worse). Growing GIS functionality - especially 3D - means that the vacuum cleaner is sucking up not just data, but application functionality. I keep hearing GIS folks ask - why don’t you do ‘x’ in ArcView? Where ‘x’ can be anything from modeling a salt dome to analyzing production. The dynamic here is a natural ‘struggle’ between horizontal and vertical software.

Inextricable

As horizontal applications like GIS grow, they become candidates for doing more and more of what hitherto could only be achieved in the vertical application. But just as GIS pervades the enterprise, it also pervades the application. GIS data is inextricable from the application itself. Replicating data between applications and the spatial repository appears inevitable. But along with replication, it might be a good idea to check out some of this ‘metadata’ stuff. Just as applications have to be capable of exporting ASCII text for minimal interoperability, they should also ‘expose’ some GIS metadata, to allow those GIS folks working for the bosses to scan and catalogue mission-critical data from the fringes of the organization. Hey, there may even be some useful work here for the standards orgs!

~~~~

Numbers game

There is a full-text search engine on the www.oilit.com website. We can track what people are looking for - and it is not IT nirvana, but money. The most popular themes come from researchers investigating how much corporations spend on their IT. For those of you who like this sort of thing, I thought I would bring to your attention some hard numbers cited by Schlumberger’s Satish Pai, interviewed by our old friend Andrew McBarnet in First Break. Pai revealed that “A survey by Gartner suggests oil and gas companies spend something like $29 billion on IT in the broadest sense including hardware, software, telecommunications, outsourcing etc.” Pai noted that at present 60% of that figure, $18 billion, is spent internally. That leaves $11 billion currently available to service companies - a figure “bound to grow as companies increasingly outsource their IT operations.” Pai reckons that the growing data management services market could be worth $500-600 million and rising, and the application software market some $800-900 million.

Shell

Finally a fascinating fact that kind of ties the previous two threads together. Speaking at the official opening of Shell’s ‘MegaCenter’, Shell Malaysia deputy chairman Zainul Rahim said that world-wide, Shell spends about US$1.8 billion a year in IT development and applications in support of a workforce of 95,000. That’s nearly $20,000 a year for every desktop in the company!


Document Management Standard Review

The new ISO 15489 standard offers useful guidelines on establishing a document and records management system. Oil IT Journal reviews the ISO documentation and concludes that it is a must-read for corporations in denial about their document management strategy.

The International Organization for Standardization (ISO) has just published a new standard entitled ‘Information and Documentation - Records Management’, ISO 15489. The standard comes in two parts - an Overview, and a Technical Report and Guidelines. The new ISO standard is derived from the Australian AS 4390 Records Management standard. This ‘standard’ is not a standard in the sense of laying down technical details of data formats - it contains no information that would underpin data exchange.

Quality

Instead, the standard is drafted along the lines of the ISO 9000 ‘quality’ management standards and sets out to provide vocabulary and best practices to help the enterprise design its own strategy for information, document and records management. Our first overview of the two documents gave a false impression of a dry, verbose set of documentation whose implementation would be unachievable in a real-world situation. But on drilling down - especially into the second volume, the technical guidelines - the information gets interesting, although not exactly inspiring!

Questions

The Guidelines cover the development of records management policies and strategies before discussing the processes of capture, storage, monitoring and training. Many excellent questions are asked - who should be able to access your documents? How do you know they haven’t been altered? How long should you keep them? What metadata should be captured along with the document (important for digital storage, where the archiving process may strip a document of valuable metadata)? But to answer such questions, you will have to look long and hard at your own organization’s requirements and business processes. The ‘Standard’ is essentially a detailed checklist of the issues to consider when implementing a document management system. Technology is absent from the debate - although the curiously anachronistic WORM optical storage does get a mention!

Data ‘erosion’

The report makes episodic reference to electronic document management, but its heart is in hard copy. There is an entreaty to ensure that ‘shelving ... be strong enough to bear potential loads.’ But while brief coverage of digital storage is offered, one feels that this was not the authors’ core business. For instance, the recommendation that ‘records may have to be copied to new media to avoid data erosion’ raises the question of potential ‘data erosion’ during the copying process - a problem which has plagued many transcription projects! But all in all, these standards must be considered recommended reading for those engaged in records and document management. Today, many companies are in denial when it comes to document access, retention and preservation strategies. The ISO recommendation will help such organizations evaluate the size of the task in hand. This is likely to be quite a scary process.
ISO 15489 is available from ISO at www.iso.ch.


Book Review, Integrating GIS with GPS

Karen Steede-Terry is an ArcView GIS instructor and a Trimble-certified trainer in Global Positioning System technology. Her book provides a good introduction to marrying GIS and GPS, but the ESRI format privileges glossy illustration over technical content.

Karen Steede-Terry’s book ‘Integrating GIS and the Global Positioning System’ provides an inside track on the use of both high-end and consumer GPS systems. The book follows the usual ESRI format, with profuse illustrations and examples from a variety of industries and projects. What comes through most clearly is Steede-Terry’s know-how - such as the caveat on the use of consumer GPS systems.

Snapping

These are often sold for use inside an automobile and may perform some serious processing on the data – such as snapping the GPS location to the nearest road, or interpolating the location when there is no signal, as in a parking garage. For Steede-Terry, differential GPS is the ‘revolution within the revolution’ and the secret to the highest accuracy. Post-processing the GPS signal is required for the best results. GPS vertical accuracy is always limited because satellites, by their nature, tend to be high up – limiting triangulation accuracy in the vertical plane. Real-time differential sources are available in the US from organizations like the US Coast Guard.

Accuracy

The choice of GPS receiver depends on your business requirement. If you are after readings in the 1-5 meter range, an inexpensive handheld model will suffice. For sub-meter accuracy, a high-end backpack model or laptop-connected unit is required. Before rushing into the field, Steede-Terry advocates building a data dictionary of standard nomenclature for the features that will be captured in the field. This can be quite a sophisticated data structure, tailored to the problem in hand and loaded into the field unit. Some systems even allow field editing of the dictionary – at the risk of inconsistency and synchronization issues.
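
A minimal sketch of such a data dictionary in Python follows - feature names, attributes and allowed values are invented for illustration:

    # Each feature type lists the attributes to capture and, where
    # relevant, the values a field crew may choose from.
    DATA_DICTIONARY = {
        "wellhead": {
            "operator": str,
            "status": ["producing", "shut-in", "abandoned"],
        },
        "valve": {
            "diameter_in": float,
            "type": ["gate", "ball", "check"],
        },
    }

    def validate(feature_type, attrs):
        """Check a feature captured in the field against the dictionary."""
        spec = DATA_DICTIONARY[feature_type]
        for key, value in attrs.items():
            domain = spec[key]
            if isinstance(domain, list) and value not in domain:
                raise ValueError(f"{key}={value!r} not in {domain}")
            if isinstance(domain, type) and not isinstance(value, domain):
                raise ValueError(f"{key} should be {domain.__name__}")
        return True

    validate("valve", {"diameter_in": 8.0, "type": "gate"})   # OK
    # validate("valve", {"type": "butterfly"})  # would raise ValueError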

Duke Energy

The book includes a variety of case histories, including Duke Energy Field Services’ use of GPS/GIS to validate conflicting map sources over its 50,000-mile pipeline network, 80% of which was added through mergers and acquisitions over the last five years. Steede-Terry’s insight is unfortunately diluted in this short book by a plethora of illustrations. The bewilderingly wide range of case histories from ESRI’s marketing department takes up space that could have been better used explaining the technology.
Integrating GIS and the Global Positioning System. ESRI Press. ISBN 1-879102-81-1.


Drex embeds seismic in ER Mapper

Earth Resource Mapping is to resell 3D image processing technology from Drex Ltd. to facilitate the use of 3D seismic data in ER Mapper.

Image processing is recognized as a valuable way of enhancing seismic imagery to aid fault identification and to map facies. In fact, much current ‘seismic processing’ – including edge detection and coherency analysis – is available in standard image-processing packages. Spotting an opportunity, UK-based Drex Ltd. has developed an add-on module for the ER Mapper image processing and map visualization package.

ASAT

Earth Resource Mapping has signed with Drex to resell its Advanced Seismic Analysis Toolbox (ASAT). ER Mapper allows for 2D structural and stratigraphic analysis of features within mapped and extracted seismic horizons. The ASAT technology extends the seismic use of ER Mapper to 3D data volumes.

Stout

Drex MD Andre Stout said “Building on top of the powerful mathematical engine and data visualization techniques of ER Mapper, ASAT provides the exploration geophysicist with unique and powerful coherency and waveform classification processing techniques that can now be applied to the 3D seismic domain at the beginning rather than at the end of seismic interpretation workflow.”

Armstrong

Earth Resource Mapping petroleum industry manager Mick Armstrong added, “ASAT provides all the necessary functionality to allow the ER Mapper PC user to load, process and interpret their industry standard 3D SEG-Y data in a quick and convenient manner and at a fraction of the cost of equivalent technologies. The toolbox has already been adopted by industry leaders who report reduced cycle time with a correspondingly improved cost/benefit ratio.”


Landmark’s TuningCube rollout

Landmark has just released new technology developed by BP and Apache Corp. Spectral Decomposition is said to ‘turn seismics into geology’.

Landmark’s latest ‘Tuning Cube’, a.k.a. ‘SpecDecomp’, claims to ‘turn seismic data into geology’. The basic idea behind the Tuning Cube is to divide - or ‘decompose’ - the seismic data into closely spaced bands of different dominant frequencies. These are re-assembled onto the map using a color coding scheme to offer a new type of seismic attribute - one which Landmark claims is sensitive to geological facies.
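
The underlying computation can be illustrated with a short-window Fourier transform over a single trace. This is a sketch of the general idea only - not Landmark’s patented implementation - and the window length, taper and test signal are arbitrary:

    import numpy as np

    def spectral_bands(trace, dt, window=64, step=16):
        """Amplitude spectra over sliding windows: one row per time
        window, one column per frequency band."""
        freqs = np.fft.rfftfreq(window, d=dt)
        taper = np.hanning(window)
        rows = [np.abs(np.fft.rfft(trace[i:i + window] * taper))
                for i in range(0, len(trace) - window + 1, step)]
        return freqs, np.array(rows)

    # Synthetic trace: a 30 Hz burst around 1.0 s, sampled at 4 ms.
    dt = 0.004
    t = np.arange(512) * dt
    trace = np.sin(2 * np.pi * 30 * t) * np.exp(-((t - 1.0) ** 2) / 0.01)

    freqs, spectra = spectral_bands(trace, dt)
    # The dominant frequency per window is the kind of attribute that
    # gets color coded onto the map:
    print(freqs[spectra.argmax(axis=1)])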

Patented process

The process leverages techniques patented by BP and Apache Corp., although the way the technology is described makes it sound very similar to that developed in the early 19th century by French mathematical whizz-kid Jean-Baptiste Fourier (but they have probably never heard of him at the Patent Office!).

Roth

Murray Roth, Landmark VP of Exploration and Development Systems said, “Landmark’s spectral decomposition solutions will provide our customers unique and powerful methods to comprehend reservoir size and complexity. The success that BP and Apache are having with spectral decomposition begins a new wave in interpretation, moving well beyond amplitude analysis.”

Cooper

Oil IT Journal quizzed BP’s Craig Cooper as to why, in this day of Open Source software, BP elected to patent the technique. Cooper told us that the patent was applied for several years ago and that anyhow the spectral decomposition algorithm was indeed in the public domain - on the Free USP website at www.freeusp.org.


Enterprise GIS for Reliant

Reliant Energy’s latest GIS implementation uses ESRI’s ArcGIS to feed spatial enterprise data to some 3000 users in six states.

Houston-based Reliant Energy has just completed implementation of ArcGIS 8.1 in its Entex local delivery company. Reliant Energy’s regulated utilities serve nearly 4.6 million electric and natural gas customers in Texas, Arkansas, Louisiana, Oklahoma, Mississippi, and Minnesota.

3000 users

Throughout the company there are currently approximately 3,000 GIS users, including 200 staff editing the GIS database. Versioning is used to manage multi-user editing of the GIS database, which supports underground asset locating, pole attachment maintenance, streetlight asset management, billing and maintenance, cathodic protection, gas network analysis, electric network analysis, and meter reader routing.

Miner & Miner

Reliant’s GIS includes ArcFM software from Miner and Miner and GTIView software from GIS Technologies, Inc. ESRI has developed a gas infrastructure Web viewer to give district offices view and query access to data in the corporate database. Future applications include outage management with CES International, document imaging with GTI and FileNET, a design optimization tool, mobile data access for field crews, wireless real-time data collection and correction, and more.

Myerson

Reliant GIS services manager Jeff Myerson said, “This is a true enterprise implementation that will grow and expand. What we have in place impacts not only the states served by Reliant Energy but our entire organization as well. Using an architecture based on Citrix Systems, Inc., software, even the remote Reliant Energy contract digitizers have access to the centralized data servers.”


CGG deploys distributed NFS in Fabric

CGG is deploying cutting-edge technology from ADIC to incorporate file serving in its Fibre Channel networking infrastructure. The technology serves CGG’s 3000-CPU cluster with data from its 100 TB SAN.

CGG is convinced that it is on the cutting edge - not only in seismic processing but also in the deployment and management of high performance computing and networking. Oil IT Journal spoke to CGG’s Houston-based IT manager, Laurent Clerc, to learn more about its growing seismic processing center and especially its next-generation Distributed Network File System (DNFS) development.

3000 CPU cluster

CGG currently deploys a 3000-CPU Dell-based cluster in Houston, which connects to its multi-terabyte storage through a Brocade Fibre Channel fabric switch. But the demands of round-the-clock seismic processing have stressed conventional NFS to breaking point. The dependence on a file-serving computer creates an input-output bottleneck and a potential single point of failure.

ADIC Scalar 1000

This is where the next-generation DNFS software from Advanced Digital Information Corporation (ADIC) comes in. CGG uses ADIC’s Scalar range of storage networking libraries to distribute the process of file I/O. Data paths no longer pass through the file server, but go straight from disk to CPU thanks to the intelligence built into the 2 Gbps network.

6 month lead

Clerc believes CGG has about a six month lead on the competition - a long time in this fast moving area - and is looking to export its experience outside of seismic processing. Possible targets include major oil and gas corporations but also power users of high performance computing and visualization outside of oil and gas.


Magic Earth offers ezFault in GeoProbe

A new release (Version 2.6) of Halliburton unit Magic Earth’s GeoProbe seismic interpretation package offers enhanced fault interpretation capability and OpenWorks connectivity.

Halliburton unit Magic Earth has released GeoProbe 2.6, the latest version of its high-performance visualization and interpretation software. The new release upgrades the ezFault interpretation tool, to allow for simultaneous, multiple fault editing, well data management and connectivity to OpenWorks.

Cheung

Magic Earth Executive VP and CTO Yin Cheung said, “GeoProbe is already well-known for its capability to handle the largest volumes of 3D and 4D seismic data for exploration projects, but it is also becoming a valuable tool for understanding reservoirs and mature fields.”


SmartView to replace SmartMap

Schlumberger is to phase out SmartMap and other embedded mapping tools, replacing them with a new product ‘SmartView’ - based on ESRI technology.

Many of Schlumberger’s clients have been hankering after a generic mapping tool to replace the plethora of embedded mapping engines in GeoFrame and the Finder SmartMap tool. The answer is SmartView, based on ESRI’s Arc 8 technology.

SmartView

SmartView provides access to E&P data using a Common Mapping Canvas. It allows authorized users to select and view data originated by multiple Finder, GeoFrame and SDMS projects, and integrates this data with other spatial data in a comprehensive mapping and analysis tool powered by ArcView. SmartView includes ArcView 8.1 functionality along with extensions and controls to retrieve and display Finder data.


Veritas and I/O ally

Input/Output’s VectorSeis will be deployed in North America on Veritas’ field crews. The multi-component acquisition technology will be launched with a 10,500-channel system.

Veritas DGC Inc. is to ally with Input/Output, Inc. to promote the use of multi-component seismic acquisition using I/O’s VectorSeis digital acquisition system. The Alliance, which covers land acquisition in North America, includes processing, interpretation and marketing of data acquired with the technology on both contract and non-exclusive terms. Under the terms of the Alliance, the two companies will share revenues from data acquired through it.

Robson

Veritas CEO Dave Robson said “Our trials clearly demonstrate the value of VectorSeis technology in both P-wave acquisition, as well as in multi-component acquisition, where the additional information gathered greatly enhances the knowledge required to help our customers exploit their existing reservoirs.”

10,500 channels

Together, the companies acquired and processed 16 VectorSeis surveys in 2001. The first Alliance crew, equipped with 3,500 stations (10,500 channels), will initially work in Western Canada.


ESRI Petroleum User Group 2002

The ESRI Petroleum User Group celebrated its 12th year with its largest attendance to date - 450 registered. The PUG’s origins go back to the 1989 ‘Operation Database’ project. In this industry-wide request for technology and GIS ‘challenge’ to vendors, the then unheard-of ESRI came out ahead. Since then, ESRI has come to dominate the oil and gas industry’s use of Geographical Information Systems (GIS). GIS cuts across the spectrum of oil and gas activity, with a large proportion of the PUG coming from the pipeline community. Here, environmental and regulatory pressures are driving GIS development. The results are an impressive use of GIS combining vector, bitmap and spatial data to create accurate maps for use in emergencies. The addition of GPS and wireless GIS underscores the oil industry’s position as an early adopter of such innovative technologies.

Operation Database

Charles Fried (BP) recapped the original ‘Operation Database’ project of 1989 - a shoot-out between GIS vendors in which ESRI came out ahead. ESRI’s Operation Database specialist John Caulkin then showed how the original AML code still compiles to produce a recognizable display - an Arc 5 project running under Arc 8! Caulkin went on to show ESRI ‘as now’ – with a fireworks display of functionality. By combining trend analysis, geostatistics and Spatial Analyst’s powerful smoothing, you can prove anything with any data! Reserves in place, mass balance calculations etc. are all computed with ‘small VB routines’.

Caulkin

Caulkin, who used to be a geologist with Tenneco, demonstrated geology-related GIS information processing using data from the Monroe gas field in Louisiana. The Visual Basic script editor was used to concatenate flow rate information with water cut into a string, which was then formatted to color code gas and water. A demo of Arc Publisher followed – publish to .pmf format, with control over what (limited) end-user map functionality is offered. The .pmf files can be picked up in the free ArcReader utility, where users can query simple features and turn map layers on and off. The .pmf format is built of pointers to live data.

Brown

Clint Brown (ESRI) ran through recent and future enhancements to the ESRI product line. The Geodatabase, introduced with Arc 8, leverages the relational database with the addition of a spatial feature class – a table with columns for geometry or image data. ArcCatalog is another key development – organizing collections of spatial data, with metadata standards, into a catalogue of distributed data sets. Transactional data management lets you ‘time travel’ through your data. Loosely coupled replication is ‘very important.’ Databases ‘talk to each other’ and synchronize to facilitate ‘scalable GIS’ – from full-blown ArcGIS to embedded, lightweight GIS with XML broadcast to any client. GIS is ‘a language’ allowing for operations on spatial objects through a ‘rich set of commands and scripts and a generic GIS model.’ The big news for the upstream is that ArcGIS 8.3 – due out later this year – adds linear referencing for pipeline stations and seismic shotpoint posting.

G.Net

G.Net is a concept – ESRI president Jack Dangermond’s vision of how ArcIMS will be accessible through a metadata portal. G.Net will build a GIS framework on top of the Internet through web services, enabling a user to go through a ‘broker’ to find a data publisher. G.Net will be underpinned by standards – XML, UDDI, WSDL and SOAP – and ‘may happen’ in either Microsoft’s .Net environment or with Sun Java.

Degler

Shelly Degler described how Buckeye Pipeline uses GIS to manage data and track pipeline integrity. Pipeline management involves thousands of right-of-way documents, with updates and distributions, and a multitude of CAD drawings. Early attempts at risk management were abandoned after 3.5 miles of data had been captured – because ‘you can’t do it all at once.’ Working with GeoFields, Buckeye’s second attempt involved data captured on a ‘will it be used?’ basis. The database includes risk data to comply with DOT Integrity Management Rules. An immediate benefit was the visual QC that GIS offers – a valve showed up in the middle of the Ohio River! Buckeye has a rich history going back over a century – some right-of-way documentation was signed by Rockefeller himself! Buckeye’s project cost $2.1 million over three years and is now ‘100% complete and 25% under budget.’

3D GIS

John Grace (Earth Science Associates) showed how 3D visualization and modeling can be achieved using ESRI’s 3D Analyst. Logs and borehole tracks are stored as Shapefiles. 3D bodies such as the reservoir are modeled as ‘Multi-Patch’ irregular solids and viewed as a CAD/CAM wire frame drawing. Although the technique benefits from some neat 3D manipulation and visualization techniques, the process of turning geology into a CAD representation appears cumbersome.

Wood Mackenzie

Consultants and data providers Wood Mackenzie were one of the few upstream data users to deploy MapInfo. But their clients are all ESRI users, so WoodMac has been forced to migrate to ArcGIS 8.1 - a migration performed by Geodynamic Solutions Inc. The ArcGIS 8.1 suite was deployed with ArcEditor for mapping, ArcSDE for storage and ArcCatalog for data management. The project involved re-engineering the existing MapInfo-based tools around the relational spatial model; significant efficiencies were claimed. Customization (an advanced labeling tool, feature editor, batch exporter, data loader and bookmarking tool) was achieved using Visual Basic and ArcObjects. More from www.geodynamic.com and www.woodmac.com.

BHP Billiton

Ekaterina Casey explained how BHP Billiton uses ER Mapper for image processing. Shaded relief maps are used to study geology and seismic horizons. BHP contracted Eagle Mapping and Wintermoon Geotechnologies to integrate ER Mapper with ArcView. Eagle has ‘glued’ ArcView and ER Mapper together, while Wintermoon added expertise in satellite imagery, gravity and magnetics, and data analysis to ER Mapper. The project has brought ER Mapper functionality to a broader audience. Casey advocates ‘using applications for doing what they were designed to do.’ This involves complex optimization across GIS/SDE and IMS. GIS is also used to provide geo-located knowledge management with links to Documentum.

Emergency Response

Enron is not all smoke and mirrors. Elaine Tombaugh told how Enron’s Transportation Services unit operates over 33,000 miles of pipelines. Enron needs to respond rapidly to leaks and other problems, and needed a robust solution for coordinating field response to emergencies. The solution, which evolved out of a pre-existing ArcView IMS solution, was developed in partnership with R7 Solutions and includes MapObjects, SAP and telephony. The development is claimed to be one of the first enterprise systems built on Microsoft’s .Net platform. The system receives around 100 calls per day. These are ‘triaged’ to determine the level of the emergency and dispatched. Telephone automation can initiate a conference call of the appropriate teams on the fly and notify local authorities or 911 contacts.

IHS Wildcat

Tor Nelsen (IHS Energy) described how IHS clients are faced with the problem of accessing local GIS information along with commercial, distributed data sources – preferably all through a common interface. IHS has tried a variety of solutions to this problem and is currently enthused by the new-generation ESRI technology. The key is consistent handling of spatial layers – with ArcSDE as the de facto standard. The IHS model is based on the KIS principle – keep it simple. Rather than undertaking a full-blown data modeling exercise, the new model just adds 20 attributes to each layer object. This strategy makes it possible to perform a spatial query on a ‘truly massive’ dataset. ArcGIS 8 ‘solves a major part of the data integration problem.’ Wildcat is IHS’ toolbox for building this kind of solution – it leverages technology from CompuWare and Safe Software. Nelsen provided an impressive demo of data access (to servers in Denver) with query by form - bringing up cartographic data on a transparent DOQQ image. Wildcat allows access to all of IHS’ data (Petroconsultants and PI/Dwights), proprietary in-house data, and web data such as that from the USGS.

Migrating to ArcGIS

Reliant Energy is an international energy services and delivery company with $20bn in revenues. Cynthia Salas described Reliant’s post-merger rationalization of four GIS systems onto ArcInfo. Salas noted that GIS development is outside the competence of most IT professionals – describing GIS as a ‘niche within a niche.’ Reliant developed its data model from scratch, with normalized nomenclature and taxonomy, using Visio and Microsoft Repository. After migration, maps were printed out and overlain on the old maps for QC. All editing continued on the old system while the new one was tested. Only after thorough testing and training was the data migrated. Salas recommends onsite support from contractors (ESRI and Miner and Miner); she also recommends that consultants ‘educate your folks’.

Real-time GIS

Wade Koteras (Energy Objects) showed how GPS can combine with wireless GIS to offer engineers real-time support in the field. The new technology enables ‘one call before you dig’ solutions and emergency response collaboration. The position of vehicles, containers, ambulances and drilling rigs can be tracked in real time. One-call systems include geocoded street addresses. Spatial analysis of digging locations avoids ‘conflicts’ with underground utilities. Digging tickets can be dispatched automatically. Koteras elaborated on the complex issue of synchronization, which needs to balance simple strategies (which may miss changes) against real-time systems (which can swamp networks). Koteras advocates using enterprise messaging systems such as that from Tibco.

R7 Solutions

R7 Solutions’ Alex Bain was showing off wireless GIS capability with a Wi-Fi network set up in the exhibition. GIS, GPS and wireless can combine to support field engineers involved in maintenance, survey and environmental cleanup. The demo showed an oil spill offshore Galveston Island against a satellite image backdrop. Using a Compaq iPAQ Pocket PC with a wireless card, an oil spill response team can track the evolving situation via a real-time map on the iPAQ. Interaction (to indicate the extent of the spill) on an iPAQ in the field updates a central GIS repository, and the information can be shared with workers at other localities. The solution was developed with ArcPad 6.0 in VBScript. ArcIMS is used on the server and Safe Software’s Feature Manipulation Engine is also deployed. In the field, a ruggedized device such as that from www.symbol.com would probably be used. The solution is currently in testing – the first enterprise version should be out later this year.

More info

This article is abstracted from a 20-page report on the PUG produced as part of The Data Room’s ongoing Technology Watch Service. For more information, email info@oilit.com.


Pipeline Standards Wars

Why have one standard when you can have two? And why get along, when you can engage in trench warfare? Oil IT Journal observed the fun and fireworks at meetings of the ISAT and PODS user groups. Pipeliners are big-time consumers of GIS and whichever model is used, the results, in terms of geographic impact, are pretty convincing.

In the beginning was the Gas Research Institute – a well-endowed government body with the budget and mandate to perform R&D in support of the US gas industry. What better project for such an organization than a pipeline data model? Thus was born the Integrated Spatial Analysis Techniques (ISAT) data model. ISAT defines a relational data structure for pipeline hierarchy, centerline and facilities, and related ‘events’ such as line crossings. Note that ISAT does not per se have a GIS component.

The ‘fork’

When the halcyon days of government-funded research ended, the Gas Research Institute, downsized into the Gas Technology Institute, was no longer capable of keeping a tight rein on ISAT. Meanwhile, a participant in the original project, MJ Harden, had built a successful graphical front end to the ISAT model – ‘PipeView.’ All this led to what the open source software folks would call a ‘fork’ – with one ‘ISAT’ still closely associated with MJ Harden and PipeView, and another ‘ISAT’ – ‘ISAT 2’, a.k.a. the Pipeline Open Data Standard (PODS) – developing separately. The GTI appears to lend such weight as it has these days to the PODS flavor of ISAT. Both ISAT and PODS held informal user group meetings at the ESRI PUG.

ISAT 1

ISAT 1 was designed to be GIS, database and application independent. While it had Intergraph leanings in its early days, GIS extensions have now been developed for MicroStation/MGE, AutoCAD and ArcView. MJ Harden – who now maintains the model – claims that over 100,000 miles of pipeline are currently modeled in ISAT 1, that 19 companies use ISAT and that 24 applications are based on the model. A new website is shortly to go live at www.isatmodel.org. ISAT supports queries such as “what locations have questionable CIS readings with numerous corrosion anomalies in class 3 or 4 areas?” The new ‘expanded ISAT’ meets increased reporting requirements from the Office of Pipeline Safety (OPS). All graphical information is stored in SDE. A move to the ESRI Geodatabase environment is planned.
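
To make the flavor of such a query concrete, here is a sketch using SQL via Python’s sqlite3 – the table and column names are invented for illustration and the real ISAT schema differs:

    import sqlite3

    con = sqlite3.connect(":memory:")
    con.executescript("""
    CREATE TABLE cis_reading (station REAL, questionable INTEGER);
    CREATE TABLE anomaly (station REAL, kind TEXT);
    CREATE TABLE class_location (start_sta REAL, end_sta REAL, class INTEGER);
    INSERT INTO cis_reading VALUES (100.0, 1), (250.0, 0);
    INSERT INTO anomaly VALUES (101.0, 'corrosion'), (102.5, 'corrosion'),
                               (103.0, 'corrosion');
    INSERT INTO class_location VALUES (0.0, 200.0, 3), (200.0, 400.0, 1);
    """)

    # Locations with questionable CIS readings and numerous corrosion
    # anomalies nearby, in class 3 or 4 areas:
    rows = con.execute("""
    SELECT r.station, COUNT(*) AS anomalies
    FROM cis_reading r
    JOIN class_location c ON r.station BETWEEN c.start_sta AND c.end_sta
    JOIN anomaly a ON ABS(a.station - r.station) < 5.0
                  AND a.kind = 'corrosion'
    WHERE r.questionable = 1 AND c.class IN (3, 4)
    GROUP BY r.station HAVING COUNT(*) >= 3
    """).fetchall()
    print(rows)   # [(100.0, 3)]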

PODS

The official rationale behind PODS/ISAT 2 was to add liquids handling to the GRI gas-only data model and to better address business requirements. There were undoubtedly other reasons for the new modeling effort, linked to MJ Harden’s dominance in the ‘ISAT 1’ world. GTI project manager Keith Leewis officially handed over the intellectual property of the ISAT data model to PODS (on a non-exclusive basis), along with the www.isat.org website. More from www.pods.org.

Killer app.

Whatever standard you favor, GIS is a real ‘killer app.’ for the pipeline community. ‘Alignment sheets’ show dents, buckles and corrosion as measured by pig surveys against backdrops of photo imagery and vector plots of pipeline route and land use. Spatial queries are particularly useful in correlating parameters such as corrosion risk with environmental sensitivity.


Oracle Spatial the joker in the pack?

Corporate deployment of spatial technology does not have to be 100% ESRI. Oracle Spatial offers a viable alternative to ESRI’s SDE. Shell is in the process of moving its corporate spatial data from a legacy Genamap-based solution to Oracle Spatial.

The foundation of ESRI’s new technology is the GeoDatabase. This is designed to be deployed on databases from a variety of vendors including Oracle. But if, as is likely the case, your corporate data is already stored in an Oracle database, you have another option - to use Oracle’s own spatial solution SDO.

Confused?

Confusing? No doubt. According to both ESRI and Oracle, the choice depends on your requirements. Co-location of spatial data in Oracle is said to be good for broader SQL queries. If your usage is predominantly spatial, use SDE. You could use both! Consolidating data into Oracle also simplifies infrastructure support. Putting all your spatial eggs into one big Oracle instance also avoids replicating metadata into another ‘spatial’ database.
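
As an illustration of the ‘broader SQL queries’ argument, a single Oracle Spatial statement can mix a spatial predicate with ordinary business attributes. The sketch below assumes a live Oracle instance and the cx_Oracle driver; the table and column names and the connect string are invented, while SDO_WITHIN_DISTANCE and the SDO_GEOMETRY point constructor are standard Oracle Spatial:

    import cx_Oracle  # requires an Oracle client installation

    # Wells within 10 km of a point, joined to production figures -
    # spatial and business data co-located in one instance.
    SQL = """
    SELECT w.well_name, w.operator, p.oil_bbl
    FROM   wells w JOIN production p ON p.well_id = w.well_id
    WHERE  p.year = 2001
    AND    SDO_WITHIN_DISTANCE(
             w.geom,
             SDO_GEOMETRY(2001, 8307,
                          SDO_POINT_TYPE(-95.0, 29.0, NULL), NULL, NULL),
             'distance=10 unit=KM') = 'TRUE'
    """

    con = cx_Oracle.connect("user/password@host/service")
    for row in con.cursor().execute(SQL):
        print(row)
    con.close()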

Shell

One company that has taken the Oracle Spatial route is Shell, which is in the process of phasing out its Genamap legacy system and migrating all of its upstream spatial data to SDO. Data types include pipelines, exploration mapping and environmental data. Shell reasons that some 80% of upstream data has a spatial component – often stored in different places and formats. The Oracle deployment is intended to centralize spatial data using a single model, integrating spatial data with the other business data types stored in Oracle.

Why SDO?

Why would you want to store your spatial data in Oracle SDO rather than in ESRI SDE? Oracle argues that this offers a move away from ‘proprietary’ storage. The openness of Oracle SDO is borne out by the fact that Shell uses ESRI products to access its SDO data.

Faint-hearted?

The Shell solution is not for the faint-hearted. Shell is moving from a bespoke in-house development based on Genamap to another using an Oracle Spatial development kit. These are not shrink-wrap solutions. We are in ‘build not buy’ country here. See this month’s editorial for more on these issues.


Shell MegaCenter opens doors (again!)

Shell has officially inaugurated its first MegaCenter, located in Malaysia’s ‘Silicon Valley.’ Originally set to support 90,000 Shell desktops, the Cyberjaya center will now serve a mere 75,000.

The red tape on the first of Shell’s MegaCenter IT hubs was cut this month (the center actually opened last year). The MegaCenter is located in Cyberjaya, at the end of Kuala Lumpur’s Multimedia Super Corridor (MSC), Malaysia’s Silicon Valley. The MSC offers Shell advanced telecommunications infrastructure and services at competitive rates. Managed by Shell Information Technology International (SITI), the MegaCenter concept is a large-scale, global IT application hosting environment, consolidated into three regional hubs (others are planned for Houston and either The Hague – or Manchester!).

BMC Software

BMC Software has signed a ‘multi-million dollar’ strategic partnership with Shell to act as Shell’s single-source supplier of systems management solutions for the MegaCenters. BMC will provide systems management and automation for the Shell MegaCenter environment, including SAP and eBusiness deployments. BMC senior VP Debbie Tummins said, “The partnership with Shell shows how companies can align IT with their business goals.”

Downsizing

When the MegaCenter concept was announced last year, Harry Roels said that some 90,000 desktops across the Shell organization would be supported from the Cyberjaya center by year-end 2002. SITI executive director Aad van Strien revised this figure down to 75,000 during his inaugural address.


Well planning and real-time steering

Paradigm’s new DirectorGeo toolkit supports well planning and simultaneous high-end visualization. The software offers real-time collaboration between earth scientists and drilling engineers.

Paradigm Geophysical has released a new directional well planning and survey management system, DirectorGeo. The new product works alongside Paradigm’s flagship volume-based interpretation and visualization system, VoxelGeo. Paradigm recommends using the two products in concert for geological target selection and well planning in 3D.

Wearing

Paradigm senior VP John Wearing said, “Industry experts are predicting that new drilling solutions, combined with high-end visualization and real-time controls, will yield significant savings. Our combined product offerings present an opportunity for us to enter a large new market, working with drilling contractors and the drilling departments of oil and gas exploration and production companies.”

Geobodies

A typical workflow involves prospect development within VoxelGeo. Horizons and auto-extracted geobodies are then picked up in DirectorGeo for well planning. During the planning process, the well path can be visualized dynamically in VoxelGeo. This allows for in-context viewing of driller’s targets, wellpath positional uncertainty and casing points.

Anti-collision

The system facilitates simultaneous viewing of previously-drilled wells, anti-collision analysis and mud program selection. Wellpath data is stored in the drilling relational database for further engineering and drillability analysis, using Paradigm’s suite of integrated drilling engineering applications.

Operations

DirectorGeo’s ‘Project Ahead’ system uses real-time MWD data to ensure accurate target intersection. DirectorGeo is available on Windows, and VoxelGeo is available on Windows, Sun and SGI platforms.


Seismic petrophysics boutiques join forces

eSeis recently acquired Ulterra Geoscience and GeoScope Exploration Technologies before merging with Exploration Specialty Processing of Houston.

Four seismic petrophysics boutiques are to join forces. eSeis, Inc. has acquired Calgary-based Ulterra Geoscience and GeoScope Exploration Technologies and has also merged with Houston-based Exploration Specialty Processing.

Porche

eSeis CEO Shawn Porche said, “We are excited about the opportunity to more broadly introduce seismic petrophysics, through the combined efforts of these companies. eSeis’ proprietary LithSeis technology has been used with spectacular results in several basins, and will now be available globally to a broad range of E&P companies.”

Ulterra

Ulterra Geoscience was established in 1992 to perform geophysical R&D with a focus on rock/fluid interpretation technology. Ulterra’s products have been marketed by GeoScope Exploration Technologies since 1998. Exploration Specialty Processing has been providing advanced data processing, inversion, and AVO modeling services for the Houston oil and gas industry since 1996.

eSeisNet

eSeis will continue to deliver its products and services via the Internet, using eSeisNet thin-client technology that allows remote access for calibration and delivery of seismic data. Besides Porche, eSeis’ officers include Roger Young (CTO), formerly CEO and CTO of Exploration Specialty Processing, Dan Morris (president) and Devon Dowell (COO). Dowell was president of Coherence Technology Company before its acquisition by Core Lab.


Blue Marble for BRGM imagery

French GIS and environmental specialist Imagis has geo-referenced the BRGM’s imagery with Blue Marble’s toolkit.

The French Public Geological Survey (BRGM) has geo-referenced and tiled its entire image dataset with Blue Marble Geographics’ Geographic Transformer. The work was done by Imagis Méditerranée of Nîmes, France.

1100 images

Approximately 1100 images were georeferenced to Lambert Conformal Conic II and a local French Lambert zone, then ‘mosaicked’ into over 5000 tiles. Since the project’s completion, nearly 60 GB of data has been made available to the organization’s licensed users.

Dovillez

Imagis’ Philippe Dovillez said “The Geographic Transformer allowed us to georeference all the image formats in our dataset, quickly and accurately. This was a very large and high profile project which is why we chose to use the Geographic Transformer. It is an invaluable tool and one that we will use again.”

‘Image-to-world’

Geographic Transformer establishes an ‘image-to-world’ relationship between image and map coordinates. Image files can be geo-referenced, re-projected and mosaicked. The Transformer is compatible with most common GIS software applications – including ArcView, ArcInfo, Microstation, MapInfo and AutoCAD – and supports a wide range of standard image formats.
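
The ‘image-to-world’ relationship is typically a six-parameter affine transform – the same arithmetic as an ESRI ‘world file’. A minimal sketch, with made-up coefficients:

    def image_to_world(col, row, a, d, b, e, c, f):
        """World-file style affine transform:
           x = a*col + b*row + c
           y = d*col + e*row + f
        a and e are the pixel sizes (e negative for north-up images),
        b and d are rotation terms, c and f the map coordinates of the
        upper-left pixel."""
        return a * col + b * row + c, d * col + e * row + f

    # Hypothetical 2 m resolution, north-up image anchored at
    # (600000, 1800000) in a Lambert grid:
    print(image_to_world(100, 50, 2.0, 0.0, 0.0, -2.0, 600000.0, 1800000.0))
    # -> (600200.0, 1799900.0)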

Environmental data

Founded in 1995, Imagis provides solutions integrating digital cartography and GIS. Imagis assists its 200 clients worldwide in the fields of environmental data analysis, land management and urban planning.


Pason Hub integrates WellView

Pason is to include Peloton’s WellView software in its electronic drilling recorder and rig site data hub.

Pason Systems Inc. has signed with Peloton Computer Enterprises Ltd. to license Peloton’s WellView wellsite and drilling information system. Pason will install WellView on all of its electronic drilling recorder networks. Pason claims approximately 90% of the Canadian market and 20% of the US market.

Hill

Pason president Jim Hill said “By deploying WellView within Pason’s infrastructure, wellsite users will no longer have to learn multiple software products with different interfaces and forms. Data flow into the corporate database will now be provided by Pason.”

WITSML

The Pason Data Hub uses the new industry WITSML data exchange standards. Pason’s goal is to combine wellsite real-time sensor data, engineering data and financial data into a single website providing ‘next generation’ drilling data management. Peloton’s software is used by over 300 oil and gas companies worldwide.


Petroleum Place DataMap

The latest release of Petroleum Place’s online DataMap application extends into Production Data Analysis.

Petroleum Place has released new versions of its suite of online data access and analysis applications - DataMap 3.0, Query Manager and Production Data Analysis. The applications provide ‘quick, convenient, affordable online access to and analysis of industry data.’

DataMap

DataMap 3.0 integrates and displays US and Canadian industry data and allows for data search, project creation and mapping of well, land and production data. DataMap claims 1,700 users.

Production Data Analysis

The Query Manager search tool enables users to locate wells, rigs, land and licenses using multiple criteria. Production Data Analysis, the latest addition to the Petroleum Place suite, provides quick, online analysis of annual US production data.


GeoLogic releases LithoTect

Geo-Logic Systems is to release LithoTect - a new structural and stratigraphic interpretation and modeling tool.

Geo-Logic Systems (GLS) is to commercialize its LithoTect structural and stratigraphic interpretation software. LithoTect creates and validates geological interpretations through restoration and balancing. LithoTect offers structural restoration and forward modeling techniques, including decompaction and isostatic calculation, on seismic data.

Geiser

GLS president Jim Geiser said, “For dealing with the real complexities of geological interpretation, or ‘uncomplicating the complicated,’ LithoTect is the right budget-saving tool.” GLS began work on its structural restoration software in 1983. LithoTect’s earth modeling framework, EMAF, ‘transforms raw data into geological knowledge.’ GLS is to use INT’s J/GeoToolkit to develop the commercial release of LithoTect.


Full-field simulation

Landmark’s ‘True Full-Field’ technology performs simulation from the reservoir to the tank.

Landmark has released its ‘True Full-Field Simulation’ (TFFS) reservoir and facilities modeling package for field development and production optimization. TFFS is an add-on to VIP, Landmark’s reservoir simulation package. TFFS aims to ‘right size’ surface facilities and pipeline networks, and optimize reservoir performance.

Data Studio

TFFS concurrently simulates compositional fluid flow in both the reservoir and surface facilities. 3D visualization of surface facilities, reservoir models, drilling information and other subsurface information provides the visual context and collaboration platform for the entire team. The process is coordinated through VIP Data Studio, a new Windows-based desktop user interface.

Killough

Landmark simulation guru John Killough said “Landmark’s reservoir simulators have been leading the industry for years and we continue to innovate with the only coupled surface/subsurface solution available today.”


PFAS well planner

New software from ITCAS offers well data management including borehole stability computation.

Integrated Technology Consultants AS has released a new version of its Planning and Field Application (PFAS) well planning, data management and analysis package. PFAS allows for recalculation and update of well prognoses during drilling. PFAS takes a multi-discipline approach and captures geological, drilling and well-bore stability data.

Open system

After completion of a well, PFAS generates a final well report, along with post-analysis and lessons learned. PFAS is described as an ‘open system’, usable both by experts for in-depth analysis and by ‘geologists and others’ for everyday well applications.

PFAS Plus

PFAS clients include PDVSA, Conoco, TotalFinaElf, IFP, Chevron, Intevep and BP Amoco. Norsk Hydro used the software on its deepwater Scarabeo 5 rig to drill in 1,351 m of water. The new version – PFAS Plus 6.0 – adds new data types, log handling, databasing and project management to the standard PFAS functionality. More from www.itcas.no.


CGG signs with NNPC and sells transcription business

CGG has renewed its ongoing processing services contract with Nigerian state oil co NNPC. Meanwhile it has sold its US-based transcription business to Ovation Data Services.

The Nigerian National Petroleum Corporation (NNPC) has renewed its agreement with CGG for seismic data processing services. CGG and NNPC unit IDSL have jointly managed the Port Harcourt facility since 1985. The center uses CGG’s GeovecteurPlus seismic package to process all seismic data acquired in Nigerian territory. Actual number-crunching is shared between the center and CGG’s ‘Mega-Centers’ located in the UK and Houston.

Transcription

In a separate deal, CGG has sold its US-based tape transcription business to Ovation Data Services. The transcription unit came with the acquisition of PECC, completed back in 1997. The announcement heralds what Oil IT Journal understands will become a more extensive agreement for data management and online services between CGG and Ovation.


Folks, offices and .orgs

This month the Presidents of both Schlumberger Information Solutions and Landmark Graphics are promoted to their respective stratospheres.

Satish Pai has been promoted to join the top brass in the Schlumberger organization. He is replaced as president of the Information Solutions division by Ihab Toma.

John Gibson is to become president of Halliburton’s Energy Services business unit. Andy Lane will take Gibson’s place as president of Halliburton’s Landmark Graphics unit.

International Datashare Corporation has appointed Todd Chuckry as vice chairman and managing director. Chuckry was previously president and CEO of Request Seismic Surveys Ltd.

Ex-Texaco Tom Carroll has been named Vice President and Chief Operating Officer at the Houston Advanced Research Center.

Petris has opened a new European branch office in Toulouse, France, headed up by Eric Déliac, formerly president of GeoNet.

Enertia Software has named Walter Oldham as director of business development for its new Denver office.

Geotech has appointed Rick Floyd as VP Business Development.


Data Repository owners meet

National Data Repository owners, select suppliers and users met in Stavanger this month. NDR ‘business’ is booming.

Representatives from sixteen governmental agencies, several major oil companies and data repository software vendors from across the globe gathered in Stavanger earlier this month to discuss National Data Repositories (NDRs). Hosted by the Norwegian Petroleum Directorate (NPD), this was the fourth meeting of the ad hoc grouping.

Knudsen

Kjell Reidar Knudsen of NPD said “Knowledge of different reporting requirements may help in harmonizing future regimes, allowing for new standards and more efficient data flow between companies and government agencies regionally and across national borders. We all face similar challenges and opportunities.”

UK DTI

The seminar was co-sponsored by the UK Department of Trade and Industry and the Petrotechnical Open Software Corporation (POSC). From the software standpoint, NDR reporting falls into three camps: Halliburton’s PetroBank and Schlumberger’s Finder share most of the market, with a few ‘home brew’ solutions making up the rest. Noteworthy among the latter is the US National Geoscience Data Repository System, whose ‘GeoTrek’ uses open source technology such as MapServer, MapScript/PHP and POSC’s Epicentre.

Robinson

Summing up, Stewart Robinson of the UK DTI noted a significant increase in NDR activity over previous years. Different countries, he observed, have different NDR cultures. But most share the same motivation – to establish a national archive and to attract new participants to a province. The NDRs’ future lies in the web, which is extending NDR reach and forcing a re-think of funding models.

Standards

Commenting on the use of the term ‘standard,’ Robinson opined that while there are de facto standards such as DLIS and SEG-Y, the metadata standards required for NDR development do not exist. Robinson – a POSC board member – believes that the organization has a role to play here as a standards custodian.

Centralist

Most NDRs have decided upon a centralist strategy, with all data in the same place. The USA is looking at a dispersed model, while the UK plans to move to a meta-repository – with pointers to real data repositories. Robinson urged NDR developers to “be realistic - it will be more difficult than you think.”

Team

This, the fourth meeting of the NDR bodies, came after a three-year gap. Going forward, the intent is to form a team from Norway, the USA, the UK, Russia, Schlumberger, Halliburton and POSC to arrange the next meeting. An 18-month interval between meetings was suggested, giving the next two meetings in September 2003 and spring 2005.


POSC bids for Geoshare

POSC is offering a new home for the Geoshare standard. A decision is called for at the AGM - to be held alongside the PNEC data integration conference in Houston next month.

Geoshare’s three-year plan is coming to the end of its term – and the upcoming PNEC conference, to be held in Houston from the 15th to the 17th of April, will as usual be hosting the Geoshare AGM. To give Geoshare members some food for thought, POSC has made a pre-emptive bid for ongoing support of the Geoshare standard.

Legalistic

POSC’s offer – which is couched in near-legalistic jargon – claims that both organizations support ‘collaboration and adding value in the oil and gas industry.’ The POSC proposal recognizes the value of ‘preserving the stability and continued viability of Geoshare.’

Mature

According to the POSC analysis, Geoshare technology is mature and stable, but it is increasingly hard to resource the initiative through the current volunteer model. POSC proposes a three-point plan for Geoshare.

Change management

POSC will operate the change management and publication services related to the data model and will create a framework for managing sharable implementation resources using an open source registry model where appropriate.

New technology

POSC will include discussion about new technology and Geoshare in its overall planning procedure conducted with members and industry. This will ensure ‘clarity of plans and full involvement of Geoshare users in all decisions.’

Membership

Geoshare User Group members will initially become POSC Geoshare members, a new category of membership, operating under similar membership rules, terms, etc. to current Geoshare User Group practices. With advice from the POSC Geoshare members, POSC may revise these terms in the future.

Vision?

What may be lacking in the POSC offering is a vision of what Geoshare could become. The fate of the Schlumberger-owned Geoshare development kit also goes unmentioned in the POSC proposal. It would seem desirable that whoever owns Geoshare should maintain the dev kit. POSC’s position as custodian of the underlying ex-API RP66 standard argues in favor of the initiative. But generally, the proposal reads like an offer of good accommodation – in a retirement home!

