December 1996


Amoco on the warpath with Coherency Cube patent. (December 1996)

Amoco is set to defend its recently awarded patent covering the use of the Coherence Cube technique. Coherence Technology is Amoco's exclusive licensee, and others, notably Landmark, are in Amoco's sights.

Amoco has just announced the award of a patent for its Coherence Cube technology, described as "a method for imaging discontinuities (e.g. faults or stratigraphic features)" which allows for "revealing fault surfaces within a 3D volume for which no reflections have been recorded". The process involves scanning a 3D dataset with a filter that outputs a measure of the spatial coherency of reflectors in the vicinity. The output from the process is in fact the local lack of coherency, and is thus particularly sensitive to faults and other discontinuities. Many users of the Coherency Cube report that the technology has become an essential starting point for their 3D interpretations by providing an automated way of extracting fault alignments from the data. Roberts of Amoco, speaking at the Petroleum Exploration Society of Great Britain's PETEX conference, described the Coherency Cube as "indispensable for interpretation work in a complex area", saying that they would "hate to have to do without it".
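For the technically curious, the flavor of such a scheme can be conveyed with a toy sketch. The code below is not Amoco's patented algorithm, merely a generic cross-correlation coherence measure between neighbouring traces (the function names, window length and test volume are our own illustrative choices); low values of the output flag discontinuities such as faults.

```python
import numpy as np

def normalized_correlation(u, v):
    """Normalized cross-correlation of two equal-length windows."""
    denom = np.linalg.norm(u) * np.linalg.norm(v)
    return float(u @ v) / denom if denom else 0.0

def coherence_volume(amplitudes, window=11):
    """
    Crude coherence estimate for a 3D volume shaped (inline, xline, time):
    each trace is correlated, over a short time window, with its inline and
    crossline neighbours and the two correlations are averaged.
    """
    ni, nx, nt = amplitudes.shape
    half = window // 2
    coherence = np.ones((ni, nx, nt))
    for i in range(ni - 1):
        for x in range(nx - 1):
            for t in range(half, nt - half):
                w = slice(t - half, t + half + 1)
                centre = amplitudes[i, x, w]
                inline_neighbour = amplitudes[i + 1, x, w]
                xline_neighbour = amplitudes[i, x + 1, w]
                coherence[i, x, t] = 0.5 * (
                    normalized_correlation(centre, inline_neighbour)
                    + normalized_correlation(centre, xline_neighbour)
                )
    return coherence

# toy volume: flat sinusoidal reflectors offset by a vertical "fault"
trace = np.sin(np.linspace(0, 8 * np.pi, 100))
volume = np.zeros((20, 20, 100))
volume[:, :10, :] = trace
volume[:, 10:, :] = np.roll(trace, 5)      # reflectors displaced across the fault

coh = coherence_volume(volume)
print("coherence away from fault:", round(coh[5, 5, 50], 3))   # ~1.0
print("coherence at the fault   :", round(coh[5, 9, 50], 3))   # noticeably lower
```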

Service

Amoco has awarded Coherence Technology Company (CTC) a world-wide license to market the Coherence Cube technology, with the Geoscience Marketing Group (GMG) as "preferred provider" for the Europe, Africa and Middle East region. CTC has offices in Houston and, via a joint venture with Pulsonic, in Calgary, and is thought to be planning further expansion into the UK. Marketing strategy to date has centered on the provision of a service from CTC's Houston headquarters, with no licensing of the technology to third parties. CTC plan to stick to this approach for the time being, although they do envisage setting up dedicated processing centers for larger organizations outside the US. The US headquarters is staffed by 13 specialists, and CTC's main processing power is provided by a Silicon Graphics Power Challenge with 1 GB of memory and 300 GB of disk. Networked NT PCs running ProMAX are used as front ends by the processors.

in-house competition

Techniques for extracting coherency from seismic data have been tried before, with several in-house products developed by major oil companies and some commercial offerings such as Landmark Graphics Corp.'s Continuity Cube. However, now that Amoco have succeeded in patenting the technology, they are, according to Tony Rebec, CTC's head of marketing, seeking to "vigorously defend" it in the face of other commercial providers of competing technologies. Landmark is thought to be in discussions with Amoco on this, and will no doubt be "vigorously defending" its own product. Patents on geophysical technology have been awarded and successfully defended in the past, from the CDP through Vibroseis to various processing techniques. This should make for an interesting struggle since, after about one year of commercial application outside Amoco (the Coherence Cube was first shown - and with considerable impact - at the 1995 SEG in Houston), it is beginning to be considered by some companies as a mature, essential technology for fully exploiting the 3D seismic data volume.



Merak Projects eyes up Ogle's petroleum software (December 1996)

Calgary-based Merak Projects is purchasing Ogle, the UK-developed software system for petroleum economic analysis.

This acquisition, expected to close in mid-November, is the centerpiece of a strategic alliance with UK-based Energy Resource Consultants (ERC), a division of PGS Reservoir. Ogle has more than 100 users in companies that include BHP, Fina and Marathon, and is "renowned for its rigorous treatment of the fiscal calculations for UK and North Sea oil and gas production". Kent Burkholder, engineering manager for the company's London office, says "Merak's partnership with ERC will provide Ogle users direct and immediate access to complementary value management tools, while guaranteeing their confidence in the fiscal calculations." The Merak product suite includes more than 10 economic, engineering and field applications, such as PEEP, DecisionTree and WellView.



Learning the ABC of compression and its business value (December 1996)

PDM's editor attended a half day workshop on data compression at the Denver SEG. He learned a lot, but was intrigued by the gap between the proffered 'business benefits' and real-world applications of the technology.

In our post-restructured universe, in order to get funding for anything, especially research, it is now necessary to present a "business case" describing the financial benefits to be reaped. Since our industry has only recently emerged from the dark ages of being "just a science", and business benefits are still considered "a good thing", let me remind you of the fate that befell those who heeded the "business cases" for the South Sea Bubble, the Suez Canal and, more recently, the Channel Tunnel. Business cases are generally presented with a degree of spin, and comprise three elements. One - make out an over-bleak case for the status quo. Two - overstate the ingenuity of the project, while understating the cost of implementation. Three - paint an excessively rosy picture of the future and the benefits to be accrued. I apologize for stating the obvious, but it is useful to regard the purpose of a "business case" as a means of extracting money from an investor, whether he is your boss or your shareholder.

Business case?

In E&P research, before the adoption of the "business case" paradigm, decisions were based on a case-by-case examination of the intrinsic merit of a project and its likely fields of application, and much of the judgement was based on common sense and experience. What silly ways we had then! The business case in favor of compressing seismic data - as initially presented - was a simple one. As 3D recording uses higher and higher spatial sampling, and as the areal extent of a 3D survey now often covers a whole permit, the data volume recorded during one survey is often of the order of a terabyte. This is an awful lot of data to move, especially on "conventional" media (i.e. not High Density Media (HDM) such as D3/NTP). The number of tapes involved creates logistical problems and considerable cost. Furthermore, since "time is money", the time taken in transporting and manipulating the tapes can be translated into a cost weighting in the "before" business case, preparing the way for an even more spectacular saving. The initial idea behind compression was to record as much data as you want to temporary storage on board ship, perform some sophisticated on-board processing to compress it by a factor of 10, 100 - even 300 has been suggested - and then throw away the original data.

lossy

There are two types of compression, lossless and lossy. The former is the kind used to send faxes, or binary files over the Internet. The latter is illustrated by the video clip data on CD-ROMs, which is compressed in this way at the cost of rather poor image quality. One trick used in lossless compression is to identify repeated sequences in a text document, such that a run of 20 identical bytes (characters) would be sent as "20x" - taking up 2 bytes instead of 20. This is termed run-length encoding. Another means of lossless compression, Huffman coding, involves ranking the characters in a text message in order of frequency and using short codes for the most frequent. This is rather like the way a single dot is used in Morse code to send the common letter "E", while a rare "Z" is transmitted as the longer dash-dash-dot-dot.
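To make the run-length idea concrete, here is a minimal sketch (our own toy illustration, not any commercial codec) of run-length encoding and decoding:

```python
def run_length_encode(data):
    """Collapse runs of identical bytes into (count, value) pairs."""
    encoded = []
    i = 0
    while i < len(data):
        j = i
        while j < len(data) and data[j] == data[i]:
            j += 1
        encoded.append((j - i, data[i]))
        i = j
    return encoded

def run_length_decode(pairs):
    """Rebuild the original byte string from (count, value) pairs."""
    return b"".join(bytes([value]) * count for count, value in pairs)

message = b"AAAAAAAAAAAAAAAAAAAABCC"       # a run of 20 'A's, then 'B', then 'CC'
pairs = run_length_encode(message)
print(pairs)                                # [(20, 65), (1, 66), (2, 67)]
assert run_length_decode(pairs) == message  # lossless: original fully recovered
```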

Huffman

As its name implies, lossless compression allows the original data to be completely recovered. If this worked with seismics, we would be in business. Unfortunately, it doesn't work - at least not very well. Because of the random nature of the seismic time series (which is good news for decon, but that's another story), lossless compression can only achieve a limited reduction on raw seismic data. Western's Zeljko Jericevic showed that lossless compression of 30-50% could be achieved on data stored in internal numerical formats such as IEEE float or 24-bit internal. This compression relies on "byte slicing", i.e. re-arranging the bytes in a computer word. Because the high-order bytes change relatively slowly, they do offer some possibility of compression using, for instance, the Huffman coding mentioned above. This type of compression could well be useful in certain circumstances, but will not provide the dramatic time/space savings that were initially hoped for.
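A hedged sketch of the byte-slicing idea follows. The synthetic trace and the use of zlib (a general-purpose DEFLATE/Huffman-style compressor) are our own stand-ins for the internal formats and coders actually used, but the principle - regroup the slowly varying high-order bytes so an entropy coder can exploit them - is the one described above.

```python
import zlib
import numpy as np

# synthetic stand-in for a seismic trace, stored as 32-bit IEEE floats
rng = np.random.default_rng(0)
trace = np.cumsum(rng.standard_normal(4096)).astype(np.float32)
raw = trace.tobytes()

# byte slicing: view each 4-byte float as four byte planes and regroup them so
# that all the first bytes come together, then all the second bytes, and so on;
# the high-order (sign/exponent) plane varies slowly and compresses well
planes = trace.view(np.uint8).reshape(-1, 4).T.copy()
sliced = planes.tobytes()

print("zlib on raw bytes        :", len(zlib.compress(raw)) / len(raw))
print("zlib on byte-sliced bytes:", len(zlib.compress(sliced)) / len(sliced))
```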

data loss

For a business case of any magnitude to be made, compression has to be lossy, i.e. the compression process will involve some destruction of the original data: it will not be possible to recover the original trace data by reversing the compression process. This sounds bad, and potentially it is, and it leads us on to a whole suite of arguments which have been used to justify this destruction of data. Most of these arguments compare the high volume and precision of field data with the low volume and low numerical precision of the data actually input to the workstation. They are all along the lines of "other parts of the acquisition/processing/display chain mess up the data quite a bit, so why shouldn't we too!" I cannot hide my feeling that this is not how we should be doing business. If processing, for instance, is failing to preserve the whole of the recorded dynamic range, or if our workstation applications only use 8 bits of color depth, then these should be regarded as opportunities for improvement, rather than lowest common denominators to which all our other processes should be driven down.

judicious use

The first thing to emerge from a half-day workshop held at the SEG annual conference and exhibition in Denver last month is that only the most oblique references are now made to the jettisoning of original data. The business cases today are more subtle. One category involves the transmission of data from ship to shore, in order to achieve some pre-processing ahead of the arrival of the bulk of the data. Here the limit is imposed by the bandwidth of data communications from a seismic acquisition vessel to the shore. These links are still in the range of a few megabits per second, far less than is required to transmit the whole dataset in any reasonable time frame. A judicious use of compression, combined with some thought about what data really has to be moved around, has given rise to an interesting processing methodology developed by Western. Processing power is left on the boat, but processing decisions are made by ground-based personnel using a limited set of transmitted data. This allows considerable time savings, but the original data is kept in its entirety.

intensive

Another use of compression is to allow manipulation-intensive processing to be performed within a realistic time frame on multi-terabyte 3D datasets. While establishing migration parameters for 2D data, considerable to-ing and fro-ing between offset- and cdp-sorted data is performed. This is not computationally feasible on a big 3D survey, so enter compression: do these sort-intensive tasks on compressed data, but process the full dataset. One could go on with reasons to compress. For instance, there is today - but probably not for long - a sharp division between processing and interpretation. This is unnatural, and fits poorly into the asset management paradigm which is increasingly used. What does the interpreter do when, during an ongoing development program, a well result comes in way off in depth? If this was caused by a velocity "anomaly", the migration velocities need to be changed and the whole post-stack dataset regenerated on the fly - a potential candidate for processing with a limited dataset, provided it speeds things up. So in steps compression.

Chevron

So how do you achieve these very high compression ratios - figures as high as 300 times have been cited, notably by Chevron Petroleum Technology Company (CPTC), which has announced an alliance with Landmark to "speed delivery of compression technology"? It would be beyond the call of duty for me to attempt to explain the intricacies of wavelet transforms; if you want to impress your colleagues, Discrete Wavelet Transform (DWT) is the buzz-phrase to remember. Evolving from satellite imagery, via the JPEG compression used in video, and taken up by the FBI for compressing fingerprint images, in the seismic arena this looks a lot like an f-k transform - but not quite. As in f-k filtering, though, some decisions must be made as to what will be kept and what will be thrown away. While the DWT mathematics are designed to identify useful signal and preserve it, the proof of the pudding is in the eating, and the results are quite impressive: the Chevron technology shows little difference in data compressed by up to 300 times! Of course what would be interesting would be to see some of the worst-case results. I would suggest that pre-static-corrected seismics in an area of rapidly varying surface conditions might make for a harder test of the method.
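To give a feel for the mechanics without claiming to reproduce the Chevron or Landmark implementations, here is a toy sketch of one level of the Haar wavelet transform with coefficient thresholding - the simplest possible illustration of the lossy wavelet idea (all function names and parameter choices are ours):

```python
import numpy as np

def haar_forward(signal):
    """One level of the Haar discrete wavelet transform (even-length input)."""
    evens, odds = signal[0::2], signal[1::2]
    approx = (evens + odds) / np.sqrt(2)    # smooth, low-frequency coefficients
    detail = (evens - odds) / np.sqrt(2)    # high-frequency difference coefficients
    return approx, detail

def haar_inverse(approx, detail):
    """Invert one level of the Haar transform."""
    out = np.empty(approx.size + detail.size)
    out[0::2] = (approx + detail) / np.sqrt(2)
    out[1::2] = (approx - detail) / np.sqrt(2)
    return out

def lossy_compress(trace, keep=0.1):
    """Keep only the largest `keep` fraction of detail coefficients."""
    approx, detail = haar_forward(trace)
    threshold = np.quantile(np.abs(detail), 1.0 - keep)
    return approx, np.where(np.abs(detail) >= threshold, detail, 0.0)

# toy "trace": a Ricker-style wavelet plus a little noise
t = np.linspace(-1.0, 1.0, 512)
trace = (1 - 2 * (np.pi * 5 * t) ** 2) * np.exp(-(np.pi * 5 * t) ** 2)
trace += 0.01 * np.random.default_rng(2).standard_normal(t.size)

approx, sparse_detail = lossy_compress(trace, keep=0.1)
reconstructed = haar_inverse(approx, sparse_detail)
print("max reconstruction error:", float(np.max(np.abs(trace - reconstructed))))
```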

too slow

In any event, compression must be fit for purpose. As Vermeer (Schlumberger Cambridge Research) states, the admissible compression will be much less for a shot record transmitted for QC than for processing, where at the very least it would seem reasonable to apply a refraction mute. Vermeer also cautions compressors about the effects of a single noisy shot record: compression may smear the noise over neighboring records, necessitating a high level of data clean-up before compression. While the foregoing has shown that considerable savings in data volumes are possible - and hence in disk space and RAM - this is not sufficient to make the case for compression a really strong one. The missing link is performance. Many of the above business cases are as time-critical as they are volume-critical, and this is where some compression algorithms show weakness: the time spent compressing and decompressing data may outweigh the gain from the reduced data volume. This has led some workers to suggest processing in the compressed (wavelet) domain. An interesting idea, but perhaps there is a trade-off in terms of understanding what is going on. Most geophysicists - and I include myself - had enough problems mastering the frequency domain.

Zip it?

An important spin-off from considerations of compression is the need, or otherwise, for a standard for the exchange of compressed data. This would be necessary, if compression became a widespread technique, in order to preserve the possibility of acquiring and processing data with different contractors. Diller (Encore Software) suggested an elegant way of ducking the standards issue by compacting seismics with the equivalent of a self-extracting .zip file - in other words, delivering the data with its own de-compaction algorithm. To sum up, compression is unlikely to give savings in data management. In fact, if we need compressed data for certain compute-intensive tasks, then our data management problems will be increased, with potentially multiple sets of data, compressed and uncompressed, at various stages in the project life-cycle. The original data will be kept, because who knows what the future may be able to extract from it. The original business case for compression will be lying in the dust, but we will have some powerful new tools at our disposal for imaging the reservoir. That, after all, is a better case for compression than saving a few cubic feet of warehouse space!



4D Seismics Packs Them in at Post Conference Workshop. (December 1996)

Time-lapse (or 4D) seismic surveying proved a popular topic at the Denver SEG. A very well attended post-conference workshop presented some case histories outlining the benefits and pitfalls of this novel technique.

They were climbing over each other at the SEG's post-conference workshop on 4D (time lapse) seismics, hosted by Bill Aylor of Amoco. An estimated 200 attendees stayed on an extra day, braving the cold in Denver, to hear some impressive case histories covering this hot topic. The essential idea behind 4D seismics is to perform repeat 3D seismic surveys over a field to monitor changes within the reservoir due to the production of fluids. Clearly not all reservoirs will be amenable to these techniques; the best targets are shallow, unconsolidated reservoirs which can respond in a spectacular manner.
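The kernel of the idea is simple enough to sketch in a few lines. The following toy example (our own illustration, not any contractor's workflow) differences two co-registered amplitude volumes and masks changes below a repeatability threshold:

```python
import numpy as np

def time_lapse_anomaly(baseline, monitor, threshold=0.1):
    """
    Difference two co-registered 3D amplitude volumes (inline, xline, time) and
    zero out changes smaller than `threshold` times the baseline RMS amplitude,
    a crude stand-in for a repeatability noise floor.
    """
    difference = monitor - baseline
    noise_floor = threshold * np.sqrt(np.mean(baseline ** 2))
    return np.where(np.abs(difference) > noise_floor, difference, 0.0)

# toy surveys: the monitor shows a brightened patch standing in for gas-cap
# expansion, plus weak non-repeatable noise everywhere
rng = np.random.default_rng(4)
baseline = rng.standard_normal((50, 50, 200))
monitor = baseline + 0.02 * rng.standard_normal((50, 50, 200))
monitor[20:30, 20:30, 100:120] += 1.5

anomaly = time_lapse_anomaly(baseline, monitor)
print("samples flagged as changed:", int(np.count_nonzero(anomaly)))
```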

artefacts

Jim Robinson of Shell emphasised the technical considerations necessary to achieve valid differential seismics - minimising the risk of artefacts. Problems such as tides, weather and water-table changes may impact repeatability, and in some cases bottom cables or permanent detectors may be necessary. But it was the proselytising of Roger Anderson of Lamont-Doherty that won over the audience. Anderson's thesis is that, just as 3D was a quantitative leap into a revolutionary technology, 4D will be the next step, and that the attendees were to "remember this day" as the advent of 4D as a mature technology. Some of the time-lapse imaging presented at the SEG workshop is of such immediate impact that even reservoir engineers have tipped their hats to the seismologists. For the first time the effects of water drive and gas-cap expansion can actually be followed from the surface, promising to radically change the way certain types of reservoir are managed, and giving the 3D seismic business yet another shot in the arm.

spectacular

A beautifully illustrated article on 4D time-lapse seismics appeared in the November edition of Petroleum Geoscience, the journal of the European Association of Geoscientists and Engineers. This paper, by Watts et al., on monitoring of the Magnus field, illustrates the basics of the techniques involved and shows how even older 3D seismic surveys can provide useful information. The important point in time-lapse monitoring, as in seismic characterization of the reservoir in general, is that the parameters observed should have some physical relationship with the changes in the reservoir.

data issues

Now why, you may ask, is an article about 4D time-lapse seismics appearing in The Petroleum Data Manager? Two reasons. One, we already know that data volumes are growing exponentially, with around a terabyte of data acquired in a 3D survey; repeated 3D surveys mean that we will have to manage multiple copies of these surveys in the future. The other reason relates to the current fad for compressing seismic data (see the editorial in this edition). If we are acquiring data with a view to comparing it with a future dataset recorded at the same location, then it would seem dangerous to apply lossy compression algorithms to the data before storage. We can anticipate that, a few years down the road, we will be able to extract more bandwidth from the field data, and perhaps even visualize it with more powerful hardware. A 3D reservoir study today must include an element of planning for future re-use and should, in our opinion, be stored uncompressed, to allow its value to be maximized at some future date.



Boring information on CD-ROM (December 1996)

The British Geological Survey has compiled a CD-ROM containing data from over 500,000 on-shore boreholes.

Everyone knows, don't they, that boreholes are a vital source of subsurface information, essential for civil engineering, the bulk minerals industry, water resources, hydrocarbon exploration, resource planning and geological consultancy. The British Geological Survey (BGS) certainly knows. It has been collating borehole information for over 160 years and now, in cooperation with GeoInformation International, has published details on a CD-ROM called the British borehole Catalogue (BbC). The main record of down-hole testing or examination of the core (known as the 'log') is a document of long-term value and forms one of the principal components of the geological archive held by the BGS. The reference collections in this archive, which also contain technical reports, maps, mine and quarry plans and seismic traces, are derived from both internal and external sources and, provided they are not held under confidential cover, are available for public inspection. The essential feature of these collections is the availability of efficient indexes to the data they contain, now increasingly held in digital form for rapid computer retrieval. As most geological information is spatially related, and located on the ground by reference to National Grid coordinates, the CD-ROM makes locations uniquely accessible through reference maps based on Ordnance Survey strategic datasets.

500,000 logs

The BbC is a collection of some half a million borehole logs held by the British Geological Survey, on a single CD-ROM. Areas of special interest can be located by either name or map area, and details of borehole position, reference, drilled length and status are provided. Specific logs can then be inspected at the British Geological Survey's record centers for more detailed geological or geotechnical information. The British borehole Catalogue (BbC) on CD is available to single users at £99.00 or under corporate license at £199.00; discounts are available for academic use. Paul Duller brought to our attention a scathing review of the CD-ROM in Computer Weekly entitled "The most boring CD in the World" by Ron McQuaker: "I think I've discovered the dullest CD-ROM ever published, a platter so mind-bogglingly tedious it makes BT's phonedisc look like Encarta." We beg to differ!



GeoQuest gains ISO 9001 quality recognition (December 1996)

GeoQuest has just received ISO 9001 accreditation for its marketing, support and training departments

GeoQuest recently "proudly announced" that its software support, software commercialization and training departments had earned ISO 9001 registration. The three departments follow hardware commercialization and staging, which were registered last year. 'Our ISO 9001 program has given GeoQuest an excellent system for identifying the best methods of assuring the quality of our products,' said Stan Wedel, GeoQuest vice president of software product commercialization and support. To gain registration GeoQuest had to comply with the 20 elements of ISO 9001, such as management responsibility, quality systems and document and data control. To prepare for registration, GeoQuest reviewed current business practices, documented the processes and conducted training. 'Our ISO 9001 program provides an excellent system for identifying best practices to enhance our quality service worldwide.' The International Organization for Standardization (ISO) promotes the development of quality standards for international manufacturing, trade and communications. The ISO 9001 quality system standard is the broadest of the quality standards and applies to most organizations, encompassing the design, development, production, installation and servicing of products and services, including computer hardware and software.



Reason to celebrate? (December 1996)

Scott Pickford gives itself a pat on the back for a second birthday present.

A well-publicized boardroom bust-up in the holding company may well be the reason behind petroleum geoscience consultancy Scott Pickford Group's press release about its second birthday celebrations. The purple patch of the release reads: "In describing the numerous milestones achieved by the processing team, Don Scott, the company's chairman, referred to the growth as 'phenomenal'. The Orpington-based operation, now established as a separate subsidiary company, employs almost 30 people, has an annualized turnover approaching £2 million and has worked on more than 100 projects for over 40 clients. The second birthday was celebrated with a champagne reception for the staff and coincided with the award of two major 3D projects." PDM deserves a magnum for printing this stuff!



Dynamic model option for SIEP (December 1996)

Shell International chooses EarthVision visualization and modeling package.

After an 18 month technology evaluation and review, Shell International Exploration and Production (SIEP) has agreed to further license EarthVision, the advanced mapping, 3D structure and property modeling software from Dynamic Graphics. SIEP will be recommending the software to Shell E&P operating companies after a recent successful joint technical development effort. SIEP and Dynamic Graphics will be exploring other avenues for mutual technical cooperation.



Schlumberger Oilfield Services hires new business leader (December 1996)

Schlumberger Oilfield Services has named Vidya Verma as business development manager of Integrated Reservoir Optimization.

In this new position based in Houston, Verma is responsible for co-ordinating development of the next generation of reservoir management services, which integrate and enhance the technologies Schlumberger provides to assist oil and gas companies in increasing the value of their assets.



Shell Services Company Introduces VeriStream (December 1996)

Following their implementation of IBM's PetroBank in Houston (see the August PDM), Shell Services Company (SSC) are going public with VeriStream, a data management service designed to 'help exploration and production companies enhance productivity of their technical staffs'.

VeriStream services include consulting, systems implementation, and storage and retrieval of seismic and well data via PetroBank. In a bleaker-than-usual vision of the status quo, SSC's Lorin Brass estimates that E&P personnel spend up to 70% of their time "trying to locate, verify and reformat information for interpretation and analysis"; VeriStream services are targeted at reducing this wasted time. While data can be physically located on the SSC PetroBank, VeriStream also acts as an information broker, offering a GIS front end to data stored with third parties. A point-and-click interface allows for pre-visualization of seismics and other data, while the actual transaction may be off-line. Vendor-neutral consulting, transcription, data loading and delivery services complete the VeriStream offer.

privatized

VeriStream complements the external services already offered by SSC, which range from Business Process Consulting and Oil and Gas Financial Services to Manufacturing Management Systems and General Financial Services, and extend from E&P right through to the downstream sector. SSC has over 1,700 employees and an annual turnover in excess of $300 million. SSC claim that their years of in-house service provision for Shell place them in a favorable position to help energy companies and others to "rediscover their core business", and describe VeriStream as a "radical" solution, alluding to novel types of business partnership that would not have been considered a decade ago. This "privatization" of a hitherto dedicated in-house service provider is an interesting alternative to the "restructuring" approach adopted by other major oil companies. For more information on SSC and VeriStream, contact Jim Wortham, (1) 713 241 6453.



Mapping modules for geoscientists (December 1996)

A new version of GeoQuest's CPS-3 mapping software incorporates new modules for reservoir characterization and visualization.

GeoQuest has been keeping up its flow of new releases and recently announced a new CPS-3 version with two new mapping modules to help geoscientists reduce cycle time. Framework 3D is said to be an innovative tool for the characterization and interpretation of complex geologic structural frameworks. Tightly coupled to Framework 3D, SurfViz supports the visualization and validation of these frameworks. When used together these modules are intended to give users the ability to better leverage fault and horizon interpretations and eliminate the need to perform complex manual operations each time new project data are available. 'Dozens or even hundreds of faults and in-fill horizons can be generated automatically with improved accuracy and in a fraction of the time it takes other mapping applications to provide a less complete solution' says Howard Neal, vice president of product development at GeoQuest. Framework 3D automatically establishes the major-to-minor relationship of faults and properly truncates them, resulting in a validated fault framework. Once the fault framework is built, fault-honored horizons can be generated automatically as in-fill surfaces at whatever resolution necessary to capture the proper reservoir detail. CPS-3 supports mapping, surface modeling and volumetric analysis. Contour maps, base maps, vertical profile displays and 3D views can be produced using this application.



Coherency on CD ROM (December 1996)

Coherence Technology's promotional CD-ROM demo of the Coherence Cube technique provides a spectacular demonstration of the method.

The CD-ROM demo from the Coherence Technology Company (CTC) is a pretty spectacular argument in favor of at least trying their wares. The disc is a must-have in terms of showing how CTC's processing can bring out faults where the conventionally processed seismic shows nothing at all. QuickTime videos are also supplied and are in themselves marvelous examples of the visualization of geological features in a 3D data volume.

black-box

Now, a demo is designed to sell a product, and not being experts on this technology we will refrain from passing technical judgement. The black-box element of the technology may put some off, and similar results may be obtainable with other types of non-reflector-oriented 3D processing; nonetheless, this is a great demo and a must-have for anyone wanting to persuade their boss to put a few hundred thousand dollars into a Coherence test. This is an expensive process, but as CTC point out, it will probably add "only" around 1% to the overall cost of a 3D survey. The CD-ROM demo "Precision That Pays" is available from Coherence Technology Company - more info from CTC, tel (1) 713 870 1501, http://www.coherence.com.



Digital Composite Maps of Tertiary Sediments in Offshore East and South East Asia (December 1996)

Software review - PDM tries out a new digital atlas of SE Asia. Features include various structural and facies maps together with reserve estimates.

This product is the result of a mega-regional compilation by the Committee for Coastal and Offshore Geoscience Programs (CCOP), a group of Asian governmental and state oil company organizations. Funding for the project came from Norway's NORAD program. The Digital Atlas includes regional structural, facies and isopach maps and 5 regional cross sections, together with bitmapped images of individual basin areas and a chapter on remaining undiscovered reserve estimates. The results of the compilation are supplied as an A4 atlas together with a CD-ROM version of the data, which integrates the publicly available Digital Chart of the World (DCW) and allows viewing and printing of the maps with Cambrian's GISMO MS-Windows software. The CD-ROM installs three files: a vector map viewer and two Windows help files, one a real help file, the other an information viewer presenting the non-vector content of the Atlas. This latter file offers on-line access to all the textual information in the Atlas, which can be cut and pasted using the usual Ctrl+X, C and V shortcuts. Likewise the reserve computations supplied for the different basins can be cut and pasted into an Excel spreadsheet, although a little more thought to layout would have made this exercise more useful. The bitmaps can also be copied with Alt+PrintScreen, although the scanned images of plates from various reports are not of very high quality.

Professional

This is most certainly not true of the mainstream component of the Atlas, the vector maps and sections. While we did not have access to all the other products necessary to give the software a thorough test, what we could do was impressive. Printing from the Windows Printing System to a 720x720 dpi plotter gave a very professional rendition of the DCW data, and the colour-shaded isopach maps also look very good. Of course the detail on the geological compilation maps is nothing like as fine as on the DCW. The maps were prepared at a scale of 1:2,000,000, making them adequate for display down to the individual basin level, but not of great use in analyzing a single permit. That, of course, is not what such a report is intended for, and is more than counterbalanced by the facility for exporting data to a variety of industry-standard formats. Output for the following products is supported: Corel Draw, AutoCAD, Microstation and Arc Info, as well as the generic DXF and CGM formats. Some datasets are supplied in x,y,z format and can even be used directly in a mapping package. The lack of political boundaries and permits, while undoubtedly letting the authors off the hook, is a drawback for the geographically untutored, and navigating between the basin names and bitmapped images is not easy. Equally, the absence of well data detracts from the product's overall usefulness, although there are placeholders in the menu structure for all of the above.

niggles

A few final niggles - or suggestions for improvement: an un-install routine would be nice for the busy software reviewer, a way of killing a lengthy redraw (Escape) would help, and the dialogue boxes would benefit from a true Windows look and feel. But overall this is a workmanlike product, offering a company starting from scratch on an analysis of East Asia a valuable jump-start. More information on the CCOP Atlas can be obtained from Phil Carpenter, Cambrian Group, at (44) 1291 673022, fax (44) 1291 673023, via e-mail at pcarpent@cambri.com or through the World Wide Web at http://www.u-net.com/~cambrian. Publishers: Committee for Coastal and Offshore Geoscience Programs (CCOP) and The Cambrian Group.



Data missing from Denver SEG (December 1996)

Data management has yet to penetrate the mainstream of the Society of Exploration Geophysicists, inasmuch as there are not, as yet, conference sessions devoted to our subject of predilection.

There was a post-conference workshop on Data Models, and in the rush to find the right meeting room one could hear people asking for the "Data Management" workshop. A revealing mistake, since while Data Models make for a "serious" subject, data management has a way to go before it is treated as a science. On the exhibition floor things were different, with many exhibitors sporting a variety of data access tools, data management services and impressive bits of hardware. GeoQuest demonstrated remote access to their robotic storage system in Houston, with a real-time video link showing a rather lazy robot which was coaxed into action by a guy with a cellular phone wandering around the back of the gathering muttering 'it's not moving... it's not moving... make it move... make it move!' Good to see that, despite Aries and gigabit bandwidths, the human element is still there! Landmark are still not promoting a global data management solution as such, but was the positioning of the GeoGraphix booth in front of, and as large as, the main Landmark booth a sign of a change in emphasis? Is Windows NT coming of age?

mudslinging

At the Data Management workshop, hosted by BP's Ian Jack, material now familiar to PDM readers was presented by speakers covering POSC's offerings and Project Discovery. A minor mudslinging set piece ensued between Bertrand DuCastel of GeoQuest and John Sherman of Landmark. Their respective marketing strategies have come, if not quite full circle, at least a good way around the loop, in that DuCastel has moved from slating Epicentre for performance issues last year to being a POSC zealot now that Geoshare is "fully compliant". Landmark's positional shift is less radical, but one senses that Discovery, as the route to POSC compliance via a novel merger, is not bearing fruit as quickly as hoped. POSC COO David Archer did not enter the fray, insisting instead on the business benefits that Texaco have reaped from their POSC-based Kern River project. Stuart McAdoo from GeoQuest presented the Geoshare alternative with a quote from a user who stated "we don't want data models, we want integration".

complexity

Helen Stevenson of Stevenson and Associates offered a comparison of three industry data models which concluded that - at least in terms of scope - Epicentre was the daddy of them all, PPDM the tiddler, and Petroconsultants' Iris21 the piggy in the middle. Stevenson insisted, however, that these statistics hid a lot of complexity: for example, 10% of PPDM's, and as much as 20% of Iris21's, data domains are absent from Epicentre. A statistical comparison of the number of entities, attributes and relations demonstrated that scope is achieved at the price of increased complexity. In our coverage of the Denver SEG elsewhere in this issue we do not pretend to be exhaustive; some topics and vendors will be held back for future thematic issues. Oh, and by the way, the most important thing to come out of the SEG was the general feeling of well-being and enthusiasm that seems to be returning to the business. Things do seem to be picking up - maybe someone will offer me a real job before too long!



Perceptive change of name (December 1996)

Following the disposal of its instrument subsidiary, Perceptive Scientific Imaging (PSI) has been incorporated under the name of Petris Technology.

Petris, a privately held well information management company based in Houston, remains under the same management and will continue to specialize in well log digitization, imaging, storage and retrieval. The company currently provides well data management for over 50 oil and gas exploration and production companies. Petris is a Delaware corporation and a wholly-owned subsidiary of Digital Imaging Technology (DITI) of Houston. The name change was prompted by the July 1996 sale of one of DITI's former subsidiaries, Perceptive Scientific Instruments; the name 'Perceptive Scientific' and the PSI logo were included in the sale. Petris operates in three primary areas - well log digitizing, well log delivery and storage, and the newest addition, customized Intranet data management solutions. The Intranet system is currently being tested in oil and gas companies.



That scoreline! (December 1996)

At the SEG in Denver the top brass of Landmark was well aware of PDM's headline in the October issue 'GeoQuest 1, Landmark 0'. Landmark CEO Bob Peebler and Technology VP John Martin both made clear that the ballgame wasn't over. 'You wait', they told PDM. 'Soon it'll be Landmark 2, Geoquest 1'. It's going to be a great game and PDM fans have the best seats.



Tera takes on vector and massively parallel marketplace (December 1996)

Tera Computer's new Multithreaded Architecture is set to challenge the dominance of vector and massively parallel computing in the high performance arena.

A new name that popped up at the SEG in Denver was Tera Computer, based in Seattle and describing itself as a high performance computer company in the development stage. From a modest stand Tera was testing the E&P market's interest in its Tera Multithreaded Architecture (MTA), a high performance, general-purpose computer due out next year which is said to go beyond massive parallelism in its ease of programming and scalability. The company said that its MTA systems represent a significant breakthrough, offering significant improvements over both parallel vector processors and massively parallel systems. MTA systems are claimed to be the first true shared memory systems that are architecturally scalable: the programmer is freed completely from data layout concerns irrespective of system size. Tera's multithreaded processors provide scalable latency tolerance, and an extremely high bandwidth interconnection network lets each processor access arbitrary locations in global shared memory at 2.8 gigabytes per second. Tera is scathing about massively parallel and vector processors.

cost effective

It says massively parallel systems and workstation networks depend on massive locality for good performance: applications must be partitioned to minimize communication while balancing the workload across the processors, a task which is often impractical or difficult. An MTA system, on the other hand, can accommodate intensive communication and synchronization while balancing the processor workload automatically, according to Tera. The company argues that vector processors are true shared memory systems, but rely on long vectors and massive vectorization for good performance as systems increase in size; executing scalar code, parallel or not, is seldom cost effective on these machines. In contrast, MTA systems optimize both vector and scalar code, exploiting parallelism while retaining the programming ease of true shared memory. The customer's investment in vector parallel software is thereby preserved, and new applications and approaches that are better suited to scalar computing become attractive, says Tera, noting that MTA systems represent a new paradigm for high performance computing - scalable shared memory. MTA systems are constructed from resource modules. Each resource module measures approximately 5 by 7 by 32 inches and contains up to six resources: a computational processor (CP), an I/O processor (IOP) nominally connected to an I/O device via 32- or 64-bit HIPPI, and either two or four memory units. Each resource is individually connected to a separate routing node in the system's 3D toroidal interconnection network.

bandwidth

This connection is capable of supporting data transfers to and from memory at full processor rate in both directions, as are all the connections between the network routing nodes themselves. The 3D torus topology used in MTA systems has eight or sixteen routing nodes per resource module, with the resources sparsely distributed among the nodes. In other words, there are several routing nodes per computational processor rather than the several processors per routing node that many systems employ. As a result the bisection bandwidth of the network scales linearly with the number of processors. Just as MTA system bandwidth scales with the number of processors, so too does its latency tolerance. The current implementation can tolerate average memory latency of up to 500 cycles, representing a comfortable margin; future versions of the architecture will be able to extend this limit without changing the programming model as seen by either the compilers or the users.



Drilling history on CD-ROM (December 1996)

Houston-based Petroleum Information Corporation (PI) has made PI History on Demand available as a CD-ROM

The new product provides access to the market and competitive information contained in PI Drilling Wire, itself a digital version of the company's PI Drilling Reports. History on Demand is said to facilitate instant retrieval of articles published in PI Drilling Wire. PI subscribers will be able to browse or search for specific articles in PI History on Demand using a hypertext key-word query, rather than spending hours, weeks or months downloading electronic copies or paging through paper copies. PI History on Demand CD-ROM is updated with over 300 reports every quarter. The reports can be downloaded to a computer's hard drive and used in conjunction with Windows and Macintosh applications. Both regional and national subscriptions are available. Headquartered in Houston, Texas, PI was founded in 1928 as the publisher of a regional drilling report and today its petroleum industry database covers more than 2.6 million wells and production entities.



PetroBank gets a little sister and Magstar has a baby. (December 1996)

IBM announce the Project Data Store (ex-Tigress) and a new low cost High Density Media, Magstar MP.

IBM is pleased to announce the birth of a little sister to its PetroBank product. The Project Data Store is described as "the industry's most comprehensive upstream E&P project database". For those of you who are not au fait with current trends in E&P data management, and who might think that once you have a Corporate Data Store (CDS) all your worries are over, think again. Applications such as E&P interpretation systems like to manipulate and even change data intensively, and it is undesirable to perform this type of activity on the CDS itself. Enter the Project Data Store (PDS), a replication of a subset of the data in the CDS which will be read from, and written to, by the applications themselves.

compliant?

IBM's PDS provides "data domain support for a wide range of geoscientific technologies, including geophysics, geology, petrophysics, mapping, engineering, reservoir simulation in a single, fully integrated relational data model." In an oblique reference to standards "compliance", IBM announce the conversion of the production data domain areas of the PDS to an Epicentre (V2) footprint; otherwise, the product is presumably not "POSC compliant", whatever that may mean. IBM announce links to what they describe as "best of breed" applications from companies such as CogniSeis, Geomatic, PGS and Z&S Consultants, with discussions under way with others. Absent from the list are IBM's main competitors GeoQuest and Landmark. Although this is understandable in a commercial sense, it does again illustrate the fact that POSC "compliance" does not mean interoperability.

new media

Meanwhile IBM have also announced a new high density tape subsystem dubbed Magstar MP (Multi-Purpose). This baby Magstar is a new, slimmed-down version of the High Density Media (HDM), offering a storage capacity of 5 GB per cartridge and data rates of up to 2.2 MB/s (both figures without compression). A novel tape design offers a self-enclosed tape path and a new mid-point load design to reduce random search access times. A robotic unit holding 20 tapes (i.e. 100 GB of uncompressed, near-line storage) retails for around $15,000, bringing high density storage to the masses - well, nearly!



More National Data Banks from GeoQuest, IBM. (December 1996)

Two new National Data Repositories (NDR) announced - GeoQuest will supply Finder in Thailand and IBM PetroBank to Brazil.

GeoQuest has been awarded a $1.9 million contract to set up the Petroleum and Coal Management Information System (PCMIS) for the Department of Mineral Resources of Thailand (DMR). DMR is the Thai government agency that manages mineral, coal and petroleum exploration and production. The PCMIS data management center will be located in Bangkok and will be used by DMR to "support and accelerate the country's E&P objectives". GeoQuest will provide software, hardware, training, customization, data entry and archival services. The center will handle E&P and associated data from all lease holders, operators and contractors in Thailand. Finder and SeisDB will form the core of the installation, with additional software products deployed to organize and preserve digital and hard-copy data and to ensure access and rapid retrieval of data by DMR staff.

Petrobras

Meanwhile IBM announced that Petrobras, the Brazilian state oil company, has selected the PetroBank solution for its seismic data management needs. PetroBank will allow Petrobras "to store and manage a great deal of multidisciplinary data with a high level of confidentiality", a Petrobras executive stated. Magstar 3590 tapes will be used for the data repository, and data delivery will be provided from an IBM RS/6000 SP server.



Spectrum and BDM partnership (December 1996)

UK companies Spectrum and Britannia Data Management (BDM) announce combined data services.

Spectrum is to provide specialized media services as part of BDM's added-value program for its energy customers. Services will include tape copying, tape audits, transcription and concatenation, based at BDM's Coulsdon site. According to the companies, the deal will form the 'initial platform' for the close co-operation 'required to maximize the benefits from Spectrum's information technology services and the physical data and information services provided by BDM'.

One stop shop

The intention is to deliver a 'one stop shop' range of services, including scanning and indexing, to support individual records management needs. Spectrum, based in Woking and with a number of subsidiary overseas operations, specializes in seismic data processing and seismic scanning for the oil industry. It also has ambitions for its data management division, supplying consultancy, data conversion and system integration. BDM has been around for 20 years, providing data and records management services for energy customers both upstream and downstream. It supplies off-site storage, records management consultancy, library and information services, on-site support and reprographic services to over 1,500 commercial and energy clients, and claims to be the second largest data and records management group in the UK.


