October 1996

Petroconsultants sold to US data specialists (October 1996)

By the time you read this, Information Handling Services (IHS) will be completing its acquisition of Petroconsultants SA. Petroconsultants, a famous name in the upstream scouting and data management business, has been up for sale since the death of Harry Wassall earlier this year.

IHS, of Englewood, Colorado, is a leading international publisher of

SSI finds suitor in Smedvig (October 1996)

Norwegian oil service supplier Smedvig is to acquire Denver-based reservoir management specialists SSI.

Troubled high-tech software developer Scientific Software-Intercomp (SSI), based in Denver with offices worldwide, may have found a safe haven. Last month the Norwegian company Smedvig signed a letter of intent to acquire the company, although its $23 million cash offer was conditional on a number of factors. There is an obvious synergy between Smedvig, an offshore oil services offshoot of the shipping company specializing in well construction services, mobile production solutions and reservoir and technology services worldwide, and SSI, best known for its reservoir management services and its WorkBench product.


According to the official statement, Smedvig is to acquire the current liabilities of SSI, with SSI satisfying its long-term debt liabilities to Lindner Funds, Renaissance Capital Partners and Bank One, totalling approximately $7 million, and the SSI preferred stock, from the proceeds of the sale. It would also be responsible for the pending settlement of its class action lawsuit, which is to be substantially funded from the proceeds of insurance. The whole deal will require shareholder approval. A statement from George Steel, chairman and CEO of SSI, said "Under all the circumstances presently applicable to SSI, the Board of Directors has concluded that its acquisition by Smedvig is in the best interests of shareholders, our customers and our employees. SSI's premier position in reservoir and surface facilities simulation and in exploration and production consultancy will be significantly strengthened and broadened with the reservoir and drilling expertise of Smedvig, providing clients with an unmatched full drilling and reservoir service."

Click here to comment on this article

If your browser does not work with the MailTo button, send mail to pdm@the-data-room.com with PDM_V_2.0_199610_2 as the subject.

Editorial - Data Management Conferences Proliferate. (October 1996)

PDM's editor, Neil McNaughton, surveys the state of play of data management conferences and offers a personal view of data management tools.

In September, in the space of a week, there were four conferences covering E&P data and information management. The PESGB held its Geoscience Data Management Seminar simultaneously with PSTI's Knowledge Working in the Oil and Gas Industry, and a couple of days later Stephenson and Associates followed, first with their two-day E&P Data Management '96 event and then, back to back, with their E&P Data Model Seminar.


I find this sudden interest in data management conferencing a bit scary, for two reasons. Firstly, last year there were no conferences; this year, if we include the US, there have been a total of seven to date, which to my mind makes the data management scene look like a bubble about to burst. Secondly, you would imagine that with this level of activity just about everything that could be said on data management would have been said, rather like the monkeys playing away on their typewriters and coming up with Shakespeare's sonnets. Well, the bubble may burst. I don't think that seven-plus conferences a year is sustainable, but we have certainly neither heard the last word on data management nor, for that matter, heard an awful lot about real world solutions to the problems that beset the upstream sector. We are, however, getting down to describing our problems in detail, an essential first step to understanding and then, hopefully, solving them.

all fall down

But as for solutions, Barrie Wells (XRG) describes the state of the software industry today as the equivalent of architecture in the 13th century. In those days people knew how to build cathedrals, but they often fell down several times before they got them just right! Today many of the "solutions" to data management are yesterday's applications dressed up with some new jargon. On the other hand, there is a groundswell of opinion that yes, it would be a good idea to devote resources to data management; that yes, we should all try and use the same names for wells and seismic lines; and that yes, we have made mistakes in the past and should learn from them.


Many speakers have attempted to classify various levels within the IT/Data hierarchy, and have usually come up with a pyramid-like structure (do pyramids abound because everyone uses the same clip art, or do they have some mystic significance?). These can be used to represent just about anything, from the data-information-knowledge-wisdom spectrum to the n-tiered architecture of a distributed database. All these representations have their place, but I would like to offer my own. Because we are still not up to speed with our graphic printing, I'll forgo the pyramid and present this as a table.

A data-centric view of data management

Management tools | Data types (formats) | O/S habitat
Executive information: office automation, personal productivity, Notes, full text, document management systems | Reports, memos, composite graphics, maps, sections (.doc, .xls, TIFF, CGM etc.) | PC-Intel-Windows
Vertical applications: workstation applications | Project data: logs, seismics, maps etc. (SEG, LIS, proprietary binaries, data models) | UNIX
Data stores: data delivery systems | Raw field data: seismic and well logs (SEG, LIS, legacy native formats) | UNIX


Please note that this is not how things should be; it is just a representation of the way things are. Many of our problems stem from the difficulty of transferring data vertically through this matrix; others stem from the fact that our brains and the information we manipulate most definitely do not fall into this type of categorization. If the exploration manager of your company is a geochemist, he will probably, like all his peers, use an Excel spreadsheet to evaluate the likely return on investment from a variety of scenarios. But unlike his peers, he will have a very special interest in the maturity of source rocks in the vicinity of the prospect. A geophysically-bent (and they can be...) manager may well check the interpreter's pick over the prospect, and query how statics were applied and what migration velocities were used. Geology and reservoir-engineering based bosses will likewise have their own foibles.


The point of this is that top level decision making in a company does not draw only on the top level of the data matrix, and attempts to separate "low value" non-competitive "raw data" from the topmost level of a company's thinking are probably doomed to failure because of this. Data, especially in E&P, does not separate easily into hierarchies, and the evaluation of data's worth does so even less. The next mega play in a basin could just come from the realization that migration paths were longer or different from those previously assumed, or, as I have seen twice in my brief career, that the predominant dip below the unconformity was counter-regional, and not down to basin as the multiple-plagued seismics would have it.


Our present technology for interpretation can have a pernicious effect on our ability to jump around the data matrix. We used to have well log and seismic section headers which allowed for verification of processing parameters on the fly and avoided many pitfalls due to processing artifacts. On the other hand, a staff member may come back from a data room or scout meeting with a piece of highly important but more or less unclassifiable information that refuses to fit into the database, such as the fact that "a nearby well produced oil at a great rate". Barrie Wells spoke of the particular difficulty encountered in populating the rich data structures of Epicentre, where it seems that wherever you start, there is always some data you needed to enter before that point to ensure referential integrity. Wells also spoke of the need, in some circumstances, to record the fact that no data was recorded, citing the case of a missing plug (why was it not taken?) as a problem that was handled in the now defunct Spooler project, an attempt to standardize core descriptions which catered for just this eventuality. Such richness is absent from many of our data stores today. In general, it is unlikely that the heavy duty data model will ever match the domain-specific detail available within a more focused commercial product. Isobel Emslie (Conoco), again at the Stephenson conference, cited PetroVision as a particularly rich tool for handling scouting information and entitlements - and thereby hangs a tale (see our article).


What an E&P shop needs is transparent access to any and all of its data at the drop of a hat. The multi-tiered data management solutions on offer today are a long way from supplying this. The philosophy behind many of them - that data can be filtered and processed so that the next person in the data food chain only sees what he needs to see - is not how things have worked in the past, and it is not how things work today, when the members of an asset team all beaver away on data from all over the place. Similarly, the separation of tasks between "low value, non-competitive" data management and high added value "wisdom based" decision making is hard to justify. Good data management will give a company a supreme competitive advantage, one which will probably increase with time. The data itself will get harder to manage, the use of data will intensify with on the fly processing, and the return on today's investment in data management will be high indeed. Let's get cracking!


Seismic scanning service in South America (October 1996)

UK-based Spectrum Energy and Information Technology has announced that its Spectrum Geoscan service for the scanning of seismic data is now available in Buenos Aires, Argentina, through a new joint venture with local company Rappallini.

Spectrum says that increasing interest in petroleum exploration locally has created a considerable opportunity for its service in Argentina and that Rappallini has a track record of success with related services in Argentina. Rappallini's services include 2D/3D seismic data processing, map and well log digitizing and field QC.


Petrovision chosen as core of Elf's E&P technical data management system (October 1996)

Elf have awarded the Archidex data management project to CGG-Petrosystems. Archidex involves the remastering of around 650,000 legacy seismic tapes, plus ancillary data, to RODE-encapsulated media on Magstar robotics.

Following an industry-wide call for tender Elf Aquitaine Production signed an agreement with CGG in September for the renovation of its seismic archives in the Exploration and Production division, based in Pau (France). This will involve the implementation of the PetroVision line of services and products developed by CGG-PECC. The aim of the resulting project, known as Archidex, is threefold: conserve the data free from all forms of physical deterioration, make it easily accessible to end-users and reduce the cost of media storage. CGG will simultaneously carry out data capture operations (both tapes and all associated documents) and implement PetroVision, a "POSC-based" data bank coupled with a mass storage system using IBM 3590 (a.k.a. Magstar/NTP) technology.

PECC subsidiary

This allows for the storage of around 10GB of data on a single cartridge. CGG's experience in this area, which is chiefly due to its 51%-owned subsidiary PECC, will be used to recover a stock of some 650 000 seismic data tapes (7 track, 4 x 9 track, 21 track, 9 track and 3480) which contain mainly field data. At the same time some 1.2 million sheets of paper documents relating to the magnetic media (observer reports and associated documentation) will be scanned to bitmap. This existing stock, which corresponds to over 30 years of acquisition in all the regions of the globe, will be updated in 1997 with a forecast of around 100 000 further magnetic media (widespread use of new high density media by acquisition contractors should reduce this figure in future) and their associated documents (paper or floppy disk), making Archidex one of the largest transcription-compaction projects in the world.


At the other end of this capture process, the seismic data will be encapsulated using PECC's RODE format and stored on 3590 cartridges. RODE (Record Oriented Data Encapsulation) is based on the API's RP66 data format, in turn derived from Schlumberger's DLIS tape format. It is expected that the RODE standard will be approved by the SEG as an archive format later this year. These operations will be carried out using software developed jointly by CGG-PECC (MediaManager, RAM, IDS) connected to specific hardware devices. PetroVision will be used to handle the data and will be integrated into the Exploration and Production technical data management system. The PetroVision database, which is "built on" the Epicentre model, will be used in the Archidex project to store all the pre- and post-stack seismic data. In the initial stage it will be implemented as a corporate data store in Pau, and subsequently at other sites around the world, inter-linked via a network. The PetroVision application modules will be located in various EAP operational sites, allowing geophysicists specializing in interpretation and processing (the "Archinauts") to surf the EAP network and directly access the information they require: selecting it on a map, visualizing the various elements to validate their choice (thus creating a shopping list), and having the data delivered in a standard format for use with their own specialized software.
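The appeal of record-oriented encapsulation is that the legacy tape records survive byte-for-byte inside a modern container, wrapped with enough metadata to reconstruct the source tape exactly. The real RODE layout is defined on top of RP66 and is far richer than anything shown here; the following Python fragment is only an illustrative sketch of the principle, with all names and the header layout invented for the example.

```python
# Illustrative sketch only - NOT the actual RODE/RP66 layout. It shows the
# principle of encapsulation: original tape records are preserved untouched,
# each prefixed with its length, behind a provenance header describing the
# source tape (line name, original format, etc.).
import json
import struct

def encapsulate(records, provenance):
    """Wrap native-format tape records with a descriptive header.
    `records` is a list of bytes objects (one per original tape record)."""
    header = json.dumps(provenance).encode("ascii")
    out = bytearray()
    out += struct.pack(">I", len(header)) + header
    for rec in records:
        # each record keeps its original length and payload byte-for-byte
        out += struct.pack(">I", len(rec)) + rec
    return bytes(out)

def unwrap(blob):
    """Recover the provenance header and the original records."""
    (hlen,) = struct.unpack(">I", blob[:4])
    provenance = json.loads(blob[4:4 + hlen])
    records, pos = [], 4 + hlen
    while pos < len(blob):
        (rlen,) = struct.unpack(">I", blob[pos:pos + 4])
        records.append(blob[pos + 4:pos + 4 + rlen])
        pos += 4 + rlen
    return provenance, records
```

The round trip is lossless by construction, which is what makes such a scheme suitable for transcribing legacy field tapes whose native formats must be preserved.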


PetroVision was originally developed jointly by CGG/Petrosystems and PECC to cater for Algerian State oil company Sonatrach's data management needs under a $16.5 million project financed by the World Bank. The Sonatrach Petroleum Data Bank Project was implemented on a Convex C3420 with digital data stored near line on a 5.6 terabyte D2 Emass Data Tower. The Sonatrach project incorporated over 100,000 seismic and well log tapes and over 200,000 paper items, which were scanned to bitmap and indexed and, where appropriate, vectorized. The Sonatrach project has been implemented in a politically hostile climate. Currently, the central server and workstations have been installed in Sonatrach's Data Control Building located in Boumerdes, a suburb of Algiers, and data is being remastered, catalogued and loaded.


PetroVision is said to be POSC compliant, but as always this can mean many things. Raw data is to be stored in native formats using RODE encapsulation, neither of which forms part of the POSC specification. Our understanding of PetroVision's functionality is that the POSC "compliant" part is the inventory database, which is developed on the relational projection of Epicentre. PetroVision "sees" bulk data as electronic documents on the file system rather than as Epicentre's frames within Oracle blobs (binary large objects). This is not so much a compromise with the Epicentre data model as a recognition of the facts that a) frame data in an Oracle blob is no guarantee of interoperability, b) queries of this data type are not generally required and c) data access is much faster for file system data.

Integral Plus

But this field of POSC compliance may be extended by Elf which, as one of the 7-sister POSC sponsor companies, is strongly committed to promoting POSC technology. Otherwise PetroVision utilizes the Integral Plus data model for back-populating interpretations. Given the French connection, PetroVision is closely coupled with Integral Plus, a joint CGG, Total and Elf development carried out during the 1980s. Integral Plus is an integrated workstation suite of applications covering the whole spectrum of E&P activity. This is a fascinating project, and it differs from other data banking projects through the incorporation of near-line access to field data. If field data is to be called up via remote terminals, as suggested in CGG/Petrosystems' press release, then this may make some quite staggering demands on networks and will at least provide a lot of work for the French cable companies! A foretaste of CDA phase III (see inside this issue).


StratWorks stays with Landmark (October 1996)

Landmark Graphics Corporation announced mid September that Halliburton Company had waived the requirement that Landmark divest its StratWorks software application as a condition to completing Halliburton's acquisition of Landmark.

As a result of this waiver, Landmark intends to retain StratWorks following Landmark's acquisition by Halliburton. Landmark was due to convene a special stockholders' meeting on 4 October 1996 at its headquarters in Houston.


GeoQuest releases new version of GeoFrame (October 1996)

Version 2.5 of GeoQuest's GeoFrame reservoir characterization system now includes StratLog and a new WellPix module.

As well as integrating StratLog into the GeoFrame environment, GeoFrame 2.5 is said to introduce new applications, including WellPix and WellSketch, and to feature many enhancements to existing products. Howard Neal, vice president of product development, said the integration would streamline the interpretation workflow process. "With this release, GeoQuest now offers the most comprehensive suite of fully integrated geologic interpretation software, allowing highly efficient wellbore to regional studies." The addition of StratLog, a comprehensive geologic interpretation and display application, to GeoFrame offers complete geologic interpretation capabilities and access to petrophysical analysis through the Oracle project database, which underlies all GeoFrame applications. According to GeoQuest, this tight integration ensures that changes to one display are reflected in others, accelerating completion of time-consuming tasks associated with the interpretation process. For example, markers picked in WellPix are immediately available in StratLog.


The new WellPix module adds to the geological interpretation features available in StratLog and focuses on geologic well log correlation and marker interpretation. WellPix offers enhanced techniques that speed up and provide new insight into the correlation process. Simple templates for well display and user preferences support ease of use. To interpret complex areas, WellPix offers a variety of features including variable area color fill, fault gapping, flattening on markers, independent scrolling and log ghost image drag, stretch and squeeze. WellSketch, another new application, is used for generating wellbore equipment diagrams, featuring powerful spreadsheet entry and editing capabilities, robust equipment libraries and graphic editors. Other WellSketch functions include zoned displays, free annotation and the generation of hardcopy reports. WellSketch output can be displayed in other GeoFrame applications.

User friendly

GeoQuest says new GeoFrame 2.5 data preparation enhancements provide a powerful set of user-friendly tools for data editing and environmental corrections. WellEdit, a well log and core data editor module, replaces the interactive log editor (ILE) module. WellEdit features general log editing, stretch and squeeze, depth shifting, core data and core image editing, unlimited undo options, interactive data functioning and an audit trail, among other features. The GeoFrame base tools package, which delivers the functionality needed to load, unload, manage and organize data and work sessions, has also been significantly enhanced with additional data loading and unloading options. Introduced in 1993, GeoFrame is the centerpiece of GeoQuest software development. Designed to comply with standards from the Petrotechnical Open Software Corporation (POSC), GeoFrame products are organized into discipline-specific product lines, each with a variety of modules.


Veritas and Digicon merger finalized (October 1996)

Veritas Energy Services and Digicon have completed their merger under the new name of Veritas DGC. Gross revenues for the two companies to end January 1996 were $263 million. At the operational level the Digicon and Veritas names will be retained.


Databanks, trade and Entitlements - or what the Norwegians did right (October 1996)

The PESGB Data Management Seminar held in London last month included case histories of National Data Repositories in Norway and the UK. A comparison of the centralised approach evidenced in the Norwegian DISKOS project with the devolved responsibilities of the UK's CDA shows a lead for the former - at least in allocating entitlements.

Mona Torsvoll (Statoil), speaking at the PESGB Data Management Seminar, gave a paper on the Trade Module, the latest addition to the Petrobank suite of tools managing Norway's Diskos project. The subject of databanks was core business in just about all of September's data management conferences. Norway has a great advantage over the UK sector of the North Sea thanks to Statoil's omnipresence. Because of Statoil's role in the Norwegian sector, they know pretty well everything that has happened in the way of trades, and what they don't know hasn't "happened" - if you get my drift. So, contrary to the situation in the UK, the question of entitlements is relatively easy to sort out. If Statoil says you are entitled to data, well then you jolly well are. Why is this such a big deal? Well, Isobel Emslie, speaking at the Stephenson Data Management '96 conference, told a different tale. The UK's Common Data Access initiative, destined to provide network access to well log data for subscribers in the UK, has faltered when faced with the near intractable problem of who owns what.


But first, back to Diskos and the Petrobank trade module. This has been developed to handle trading of well logs. The module registers well trade proposals, relates these to others, simulates the consequences, offers alternatives and helps to locate common ground between trading partners. Anyone who has been involved in trading data knows how quickly this seemingly simple process turns into a nightmare of interlocking and contradictory desiderata. While the process was described as relatively easy for well trades, even including complications such as bottom hole contributions, shared ownerships and farm-ins, the extension of the trade module to seismic data is anticipated to be problematic. This is because of the complex structure of a seismic line, which Torsvoll breaks down into "business objects" (that rings a bell) - essentially shotpoint to shotpoint ranges owned by different companies. The problem of trading pre-stack data makes these business objects richer still in complexity. So far the trade module has been used internally by Statoil, but it should be available for use by Petrobank customers by year end.
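The shotpoint-range idea is worth a small sketch. What follows is our own minimal model, not Petrobank's actual design: a seismic line becomes a list of segments, each a shotpoint range with its own owner list, and entitlement to a stretch of line is the intersection of the owners of every overlapping segment.

```python
# Hypothetical sketch of "business objects" for seismic ownership.
# All names are invented; the real Petrobank trade module is far richer
# (farm-ins, bottom hole contributions, pre-stack complications, etc.).
from dataclasses import dataclass

@dataclass
class Segment:
    first_sp: int       # first shotpoint of the segment (inclusive)
    last_sp: int        # last shotpoint of the segment (inclusive)
    owners: frozenset   # companies holding rights over this stretch

def owners_over(line, first_sp, last_sp):
    """Companies entitled to ALL of the line between first_sp and last_sp.
    Assumes the segments fully cover the queried range; a real system
    would also have to flag gaps in coverage."""
    overlapping = [s for s in line
                   if s.last_sp >= first_sp and s.first_sp <= last_sp]
    if not overlapping:
        return frozenset()
    result = overlapping[0].owners
    for seg in overlapping[1:]:
        result &= seg.owners   # intersect owner sets across segments
    return result
```

Even this toy version shows why seismic trades are harder than well trades: a single line can cross several ownership boundaries, so a proposed trade over one shotpoint range can involve a different set of counterparties than the adjacent range.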


Dave Overton (Amoco), chairman of CDA, described the progress of this project at the PESGB conference. Current status is that 60% of raw well log trace data has been loaded, with 181 wells online. This represents considerable slippage relative to the forecast schedule, due to the aforementioned entitlements problem. No vendor added-value data is available to date, probably the reason why only 16 of the 40-plus member companies are connected so far.

what's next?

Phil Williams (Hydrosearch Associates) described the objectives of the next phase of the CDA initiative - to extend the on-line data bank to include seismic data. This has been divided into three phases. Phase 1 will cover the standards, navigational data, entitlements (again), cultural and license data and the access systems. Phase 2 will involve the banking of stacked seismic data, and Phase 3 the main volume of field data. It is interesting to note the inversion in the implementation order between the current well-oriented CDA and the future seismic version. The well program has focused on making the raw data available, whereas the seismic phase will follow the Diskos initiative in treating the processed, stacked data as the priority. We are touching on an important issue for data banking. Should it really be a neutral, no-added-value datastore of raw data, or should it have convenient processed data (depth adjusted curves or stacked seismics) available to end users? The answer depends on what the end user wants and how one views the industry. Some may be happy to download some lowest common denominator of a processed data set, safe in the knowledge that everyone in the bidding round will have the same interpretation. Others may actually consider that processing seismics is an activity which can give a considerable competitive advantage to a company which, having done it right for its own use, may offer up a rather poor provisional stack for the common weal.


Schlumberger To Record Unusual Items in Third Quarter (October 1996)

Schlumberger Limited announced on September 25th that it will record 'unusual items' in the third quarter. With increasing profitability and a strong outlook in the US, Schlumberger will recognize a portion of the US income tax benefit related to its US subsidiary's tax loss carry-forwards and all temporary differences. This will result in a credit of $360 million.

It will also record a charge of $300 million after tax, related primarily to Electricity and Gas Management and to Geco-Prakla. Within the Oilfield Services segment, the much improved results of Geco-Prakla in the quarter are mostly due to the Marine activity. According to the company, the losses in the Land and Transition Zone businesses have been reduced, but more radical changes, including the write-off of Land goodwill, are required to ensure the long-term financial health of these businesses. Chief Financial Officer Arthur Lindenauer stated, "Over the near term these items will have no material impact on the results of Schlumberger."


StorageTek announces HP support (October 1996)

StorageTek announces support for HP-UX PA-RISC-based HP 9000 enterprise servers and is now shipping RedWood cartridges in 10, 25 and 50 gigabyte (GB) uncompressed capacities.

RedWood is claimed as the world's fastest and highest-capacity device using the industry-standard 3480 tape cartridge form factor. HP supports attachment to the StorageTek transports and libraries through its HP OpenView OmniBack II application. "The industries and applications gobbling up data storage space today hardly existed a few years ago, when a library of standard 200-megabyte cartridges was sufficient," said Gary Francis, StorageTek vice president, Nearline line of business.

1¢ per MB

"Our emphasis on RedWood ultra-high capacity and performance is particularly timely for applications such as medical imaging; computation-based analysis; scientific data gathering for weather forecasting; and new opportunities made possible by the popularity of the Internet. Using RedWood, customers can store their data for less than a penny per megabyte, and take advantage of the automation our library robotics offer." Francis continued, "In a November 1995 report, the Gartner Group estimated that 5.5 billion documents are created each year in offices. Much of the information in these documents is germane to the conduct of business, must be available and shared within departments and local and remote offices, then saved for possible later reference. Stuffing paper into filing cabinets is not the answer. RedWood facilitates information sharing by making formerly unwieldy paper documents available for online access."

2 TBytes

Mark Rehrauer, E&D consultant at MOBIL Exploration and Production Services, Dallas, is responsible for support of the exploration services computer environment. "We are using RedWood for Unix file system backup for our Cray processors and RS6000 servers," said Rehrauer. "RedWood allows us to store large amounts of data while using only a few of our valuable silo slots. The implementation was very simple with no additional driver or operating system changes required. In addition to solving our problem of backing up data quickly and efficiently, RedWood is also used to manage 2 terabytes of Cray data via a file migration system -- all without adding additional silos to the two 4410s we already have."

17MB per second

The RedWood subsystem is the first ultra-high capacity tape drive to exploit 17 megabytes (MB) per second native ESCON fiber-optic technology. It also fully supports the 20 MB per second SCSI-2 Fast and Wide channel and is architected to incorporate emerging faster channel technologies. The individual RedWood control units are more than twice as fast as traditional tape subsystem architecture. A single 50-gigabyte RedWood cartridge can hold up to 250 times more data than a standard 18-track cartridge and 125 times more than a standard 36-track cartridge. When combined with the 6,000-cartridge StorageTek PowderHorn 9310 tape library, the RedWood technology offers users up to 300 terabytes of data under automated control. The RedWood SD-3 Helical Cartridge Subsystem features digital D3, helical-scan recording technology developed by Matsushita Electric Industrial Co., Ltd. The RedWood cartridge subsystem is part of the StorageTek Nearline family of tape and robotic automation solutions for data storage, retrieval and management. Information on StorageTek is available on the World Wide Web at http://www.stortek.com .
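The capacity claims hang together arithmetically. A quick check, with the 200 MB 18-track figure taken from the article and the 400 MB 36-track figure inferred from the stated 125x ratio rather than from StorageTek directly:

```python
# Sanity check on the RedWood capacity figures quoted above.
redwood_mb = 50 * 1000   # 50 GB SD-3 cartridge, in decimal megabytes
track18_mb = 200         # standard 18-track cartridge (from the article)
track36_mb = 400         # standard 36-track cartridge (inferred from ratio)

print(redwood_mb // track18_mb)   # 250, matching the "250 times" claim
print(redwood_mb // track36_mb)   # 125, matching the "125 times" claim

# A full 6,000-slot PowderHorn 9310 loaded with 50 GB cartridges:
print(6000 * 50 // 1000)          # 300 terabytes under automated control
```

The "less than a penny per megabyte" claim follows the same way: at 50 GB per cartridge, media cost stays under 1¢/MB so long as a cartridge costs less than $500.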


Executive moves (October 1996)

Parallel Geoscience, Cogniseis and CGG announce personnel moves

The new sales manager of US-based Parallel Geoscience Corporation is Michael C Lochrer, who has been in the industry since 1984 and has been working with Seismic Processing Workshop (SPW), Parallel's intelligent, interactive software, since 1990.

Mr. Steve Hunt has been appointed senior vice president, sales, support and operations, based at CogniSeis' Houston offices. He was previously general manager, EAME for CogniSeis and is replaced in that role by Jim Martin, who recently joined CogniSeis UK as sales and marketing manager, EAME, after managing the CogniSeis processing center in Paris. Martin Anderson, regional sales manager in the UK office, becomes the new sales and marketing manager, EAME.

Patrice Canal, vice president of research and development at CGG for the past five years, has been appointed vice president of marketing for the CGG Group. He is succeeded by Marc Mortureux, formerly executive advisor to CGG's general management.

Data management in Calgary, a PDM Case History (October 1996)

Leanne MacKinnon describes how Canadian independent Talisman manages its seismic data.

It's no secret that many oil and gas exploration companies have a seismic data management problem; a problem that consumes resources, lengthens cycle times, endangers the productivity of geophysicists and, worst of all, is very expensive to repair. The exploration industry has suddenly taken notice. Software product companies are busy developing tools aimed at data management, conferences and articles are elevating the profile of this relatively dry subject, and service providers are packaging "solutions" for an issue that has traditionally been considered a necessary evil. Seismic data management has always been a high priority for one Canadian company, Talisman Energy. Their qualified, dedicated seismic data management staff use a successful combination of sensible procedures supported by advanced technology to achieve high-quality, efficient management of their corporate seismic asset.

ex-BP Canada

Talisman is an independent, Canadian-based company that is actively exploring in the Western Canadian Basin, the UK North Sea and Indonesia. Formerly BP Canada, Talisman has grown through strategic acquisitions followed by successful exploration and development programs in and around core areas. They have approximately 80 exploration staff in the Canadian office and incur exploration and development capital spending in the $200 million US range yearly. In 1996 they participated in the drilling of 700 wells in Western Canada. Their seismic holdings are significant by Canadian standards; they have approximately 60,000 2D seismic lines in their database, and this continues to grow each year.

Linda Lindsay has over twenty years of experience in the inventory management business and has been managing Talisman's seismic library for eleven years. She applies a meticulous yet pragmatic approach to inventory management. Lindsay recognizes that the seismic inventory function has evolved over the years from a records management focus to an information management function as the business and the tools have changed. Lindsay reflects that, "Seismic data is finally regarded as a corporate asset that must be protected".

common sense

Talisman recognised that the key to successful data management was to institute common-sense procedures and ensure they are constantly maintained. Although exploration management is sensitive to inhibiting the individual creativity of geophysicists by imposing unnecessary administrative tasks on them, geophysicists have played a key role in enforcing the standards and procedures. They believe that many of the procedures established represent a tool for them to monitor data quality, control costs and track progress on various projects.

The competitive environment in Calgary has enabled Talisman to alter its approach to engaging contractors in the seismic arena. In the past, work was sent out of house and accepted unconditionally. Now, the importance of a two-way client-contractor relationship is recognized. Talisman tends to deal with fewer service providers in a given area, but spends more time and effort educating contractors on its expectations and establishing standards that must be adhered to on an ongoing basis. This has been deemed a mutually beneficial relationship; the contractors benefit because they have "preferred vendor" status, and Talisman benefits from higher-quality services at competitive rates.


Internally, the exploration department is structured into groups or business units that represent broad geographic areas. One geophysicist in each area is completely responsible for budgeting and controlling all seismic costs incurred in the area. This responsibility automatically gives the geophysicist a vested interest in authorising vendors to do work and ensuring that the work is completed to their satisfaction.

In 1994, Talisman was the first company in the local oil and gas industry to make the bold move of remastering its entire 80,000-plus nine-track seismic tape library to an off-site facility connected to its office via a high-speed network. They chose to put all their processed data online and set up appropriate security so that all geophysicists and technicians could request their own data. To ensure the quality of post-stack data in the corporate data bank going forward, Talisman has set and is enforcing a SEG-Y standard to be used by all the seismic processors it contracts. The geophysical technicians in each business unit are responsible for loading all seismic data to the interpretation applications. Brad Peers, responsible for exploration systems support, notes that, "The technicians feel they can have clean data loaded to the application in less time than it used to take them to complete the data request form in the old system". Although the implementation of this new technology at Talisman has not been without its troubles, it has achieved almost immediate acceptance in its geophysical user community.
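Enforcing a SEG-Y standard on incoming post-stack data lends itself to automated conformance checks. The article does not describe Talisman's actual checks, so the sketch below is purely illustrative: it reads a few key fields from a SEG-Y binary file header at the offsets defined by the published SEG-Y standard (a 3200-byte EBCDIC textual header followed by a 400-byte big-endian binary header), with a synthetic header standing in for a real tape image.

```python
import struct

# Minimal sketch of the kind of SEG-Y conformance check a corporate data
# bank might run on incoming post-stack data. Field offsets follow the
# published SEG-Y standard; Talisman's actual checks are not described
# in the article, so treat this as a hypothetical example.

def check_segy_binary_header(header: bytes) -> dict:
    """Extract and sanity-check a few key binary-header fields."""
    if len(header) < 3600:
        raise ValueError("file too short to contain SEG-Y headers")
    binary = header[3200:3600]  # 400-byte binary header after EBCDIC block
    sample_interval_us, = struct.unpack(">H", binary[16:18])  # bytes 3217-3218
    samples_per_trace, = struct.unpack(">H", binary[20:22])   # bytes 3221-3222
    format_code, = struct.unpack(">H", binary[24:26])         # bytes 3225-3226
    if format_code not in range(1, 9):
        raise ValueError(f"unknown data sample format code {format_code}")
    return {
        "sample_interval_us": sample_interval_us,
        "samples_per_trace": samples_per_trace,
        "format_code": format_code,
    }

# Build a synthetic header to exercise the check: 2 ms sampling,
# 1500 samples per trace, 4-byte IBM floating point (format code 1).
hdr = bytearray(3600)
hdr[3200 + 16:3200 + 18] = struct.pack(">H", 2000)
hdr[3200 + 20:3200 + 22] = struct.pack(">H", 1500)
hdr[3200 + 24:3200 + 26] = struct.pack(">H", 1)
print(check_segy_binary_header(bytes(hdr)))
```

In practice such a check would run on each incoming tape before the data is accepted into the data bank, rejecting files whose headers do not match the agreed standard.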


Reprocessing has always been the bane of the inventory manager's existence. It is a time-consuming, labour-intensive, multi-step workflow that necessitates careful tracking of all items as they travel through each phase. In Western Canada, companies are relying more and more on maximising the use of any data already shot in an area, because it is now much more difficult to shoot new data. Talisman recognised this and designed a model where, in the near future, reprocessing will simply involve an electronic transfer of the pre-stack data, audited survey and observer reports to the processor. All pre-stack data is maintained in the same off-site facility as the post-stack, available through near-line instead of on-line access. All observer reports have been scanned and indexed, and all survey data is in the PPDM-compliant seismic database. With electronic transfer of observer reports, co-ordinate data and pre-stack data to the seismic processors, turnaround on reprocessing can be reduced significantly. Handling time on the back end is eliminated because there is no need to re-file items.

sales revenue

Data sales is another area that is handled very effectively at Talisman. In Western Canada, seismic data is a revenue generating asset because it is a lifetime proprietary asset to the companies that shot the program. In 1994, exploration management set a new policy that all revenue generated from the sale of seismic data would be credited to the budget of the appropriate area geophysicist. In return, geophysicists would have to approve all Quality Inspection requests that come into Talisman. Previously, revenue from data sales was appropriated to a general corporate revenue account. The result of this change was that revenue from data sales increased tenfold in one year. According to Greg Becker, an interpretation geophysicist, "We love it - we don't mind doing the administration associated with data sales because we get direct benefit from it".

computer literacy

Becker was originally a processing geophysicist with BP in London. He came to Canada in 1988 to set up BP Canada's in-house processing system and later made the transition to interpretation geophysicist. He has been supportive of all deployment of new technology in the geophysical domain, including the recent efforts to improve management of seismic data. Becker states that, "The nature of projects in Western Canada dictates that careful management of seismic data is a must. The projects tend to comprise 2D and 3D data of varying vintages, the plays are often subtle so that knowledge of detailed processing information is important, and the data is more difficult to handle because it is all land data". Becker notes that geophysicists as a group span the spectrum of computer literacy, from those who are interested in any new technology to those who resist it. He considers himself firmly rooted in the business of geophysics, but is able to see the value of some of the emerging technologies and is prepared to support their implementation. Becker says, "I'm not a visionary; my focus is to make better use of the tools that are available today".


Talisman does have a technology visionary in Mo Crous. He has been at the helm of the exploration services group for years and is responsible for many of the innovative changes in seismic data management that Talisman has pioneered and implemented over time. He places a great deal of faith and trust in the people he works with, and encourages them to approach him with business improvement ideas and solutions which he is often prepared to support. Crous was able to see the seismic data management problem that was taking shape in the industry many years ago and capitalised on some of the technology opportunities that he felt would address this problem. He enlisted the help of high quality staff in Lindsay and her group and helped to establish some of the new procedures in use at Talisman today.

not dead yet!

Talisman has by no means slain the seismic data management dragon; according to Lindsay, that is a job that is never finished. They will continue to improve the service they provide to the geophysical community by implementing new tools and procedures in the same thorough, diligent way they have done for the past decade.

Leanne MacKinnon was formerly head of software development and support for Home Oil. A founding member of the Public Petroleum Data Model (PPDM), Leanne is currently Principal of Panther Services, the professional consulting arm of Panther Software.

Private placement by GX Technology (October 1996)

Houston-based seismic software specialist GX Technology Corporation has completed a private placement of $1,500,000 in Senior Notes. The financing was provided by KABET and Nations Bank Capital Corporation. In a related action the company appointed Douglas C. Williamson as its vice chairman. Williamson is senior vice president of Nations Bank Capital Corporation and has served on GX Technology's board since November 1994.

© 1996-2021 The Data Room SARL All rights reserved. Web user only - no LAN/WAN Intranet use allowed. Contact.