One company which would like to see this happen is GeoGraphix, Inc. (Denver, Colorado) who have just announced the release of PRIZM, described as "the most intuitive and powerful PC-based multi-well log analysis system available in the industry" (see below). Of course, since their acquisition by Landmark in 1995 for some $15.4 million (Landmark's own acquisition by Halliburton was announced in July), this could involve a kind of intra-corporate struggle for market share.
groundswell
Robert Peebler, Landmark president and CEO believes that "there is a growing need for both GeoGraphix' smaller-scale systems for basic geological and engineering problems, as well as Landmark's traditional solutions designed for the rigorous demands of complex reservoir characterization." But this is questionable in view of the increasing power of the PC, the touted robustness of Windows NT, and the groundswell of end users who have been seduced by the PC's cheapness and interoperability by design. At the time of the acquisition, Peebler announced Landmark's strategy of expanding its base of business to include broader product lines, multiple market segments, geographical dispersion, and extended service offerings.
heavy duty
But is the market really segmented in this way? During the early days of the PC a similar debate was heard covering a variety of applications, notably word processors and spreadsheets, with protagonists of "heavy duty" applications advocating software such as 20/20, a time-shared, mainframe-based spreadsheet, leaving the "lower" end of the marketplace to "toys" like Lotus 1-2-3. Now all these office automation apps run on PCs - mainly because of the cost advantage. Because of their huge market share, PC packages are very cheap. This argument must be of less importance in the low-volume vertical marketplace of E&P software, but there are other dimensions to
PRIZM release for well log data (September 1996)
Landmark subsidiary GeoGraphix has released PRIZM, a Windows-based program for the well log interpreter.
Designed to meet the needs of the interpretative geoscientist, PRIZM
provides an intuitive user interface optimized to quickly and accurately interpret well
log data. Operating in the Microsoft Windows 32-bit environment, PRIZM is well-suited for
both the casual user and the experienced petrophysicist. PRIZM allows the interpreter to
analyze the key wells of a project, and then quickly apply the same interpretation to
other wells within the project. The program supports digital data from numerous sources
and provides the user with integrated data views and analyses. Users can view, edit, and
analyze well log data in three different formats: traditional log view, crossplot view,
and report view. Each of these views can be customized. PRIZM also enables the user to
quickly analyze well log data using industry-standard or custom petrophysical algorithms.
Lithologies can be identified and displayed, based on formation, defined log data, or
custom specifications. Users can quickly calculate averages, totals, and estimated
reserves based on field or interpreted data. In PRIZM, the user can also perform
interactive data editing using single point, multiple point, or depth shifting. Designed
by geoscientists and logging engineers, PRIZM "brings comprehensive log analysis to
the PC. It makes sophisticated log analysis at the well site, home, or office a reality
for even the inexperienced computer user". GeoGraphix was founded in 1984 and
estimates that it currently has more than 8,000 licenses installed in more than 800
companies worldwide. The company's integrated, Windows-based systems for geoscience, land,
engineering and petrophysics include (1) "GeoGraphix Exploration System" (GES)
for geoscientific data management and mapping; (2) "Jaguar" for petroleum
engineering and economic forecasting; and (3) "LeaseMap" for petroleum land
management. For a limited time, the standalone PRIZM suite of products can be purchased
for a promotional price of $5,000, a savings of $3,000. Contact Kami Schmidt, Marketing Coordinator, or Bill Rodgers, Product Manager, on (303) 296-0596.
Editorial - Standards, 'Open' Systems and competition - the hidden agenda (September 1996)
The PC - UNIX debate, a personal view from PDM's editor Neil McNaughton
Recent debate on the Usenet sci.geo.petroleum newsgroup (see article in this issue) has focused on the hardware side of the Unix vs Windows battle for IT market supremacy. I'd like to focus here on the Operating System side of this debate. The hidden agenda is a discussion of Standards, Open Systems and Competition, and especially an attempt to show how these words don't always mean what they say and how frequently the first two are deformed or even abandoned in pursuit of the third. This is a personal account of how I have seen the industry evolve rather than an attempt to be exhaustive, but I'm sure that many will recognize some of the following. We will be focusing particularly on the boundaries between the stages, the paradigm shifts or evolutionary changes, real or perceived, and will try to show how today's "Standards Wars" are really a battle between personalities, particularly Bill Gates (Microsoft) and Larry Ellison (Oracle), and how they relate back to previous upheavals in the industry.
Big (bad?) blue
First we'll focus on the way IBM was perceived in the early days of Unix. There was a kind of 1960's mentality about, whereby Big Blue was very much the "baddy" - its deprecated programming techniques likened to "kicking a whale across a beach" by B. Kernighan (a founding father of Unix and the C programming language). Similarly, proprietary hardware and software were done down and presented somehow as a rip-off. Unix came along with the promise of cross-platform stability (vi was vi on a Sun, HP or whatever) and, most importantly, software interoperability. This was achieved through the use of pipes and filters acting on the byte stream - and a very powerful tool it was too. It led to the development of a powerful shell programming language and other "little languages" such as sed and awk, and later on to perl, cgi and perhaps even Java. You could do anything. I once had to edit a bunch of files all in the same rather complicated way.
Tee for two
I could have used awk, but even that would have required a lot of head scratching, so I tried editing the first file with vi while "tee"ing the output to a file. When my first edit was through, the tee'd file contained a record of all the keystrokes used during the edit operation. The next step was to write a shell script which echoed the keystroke file to the stdin of a vi on all the other files. The code (from memory I'm afraid - I'm not sitting at a Unix box at the moment, and not all Unixes offer this flexibility) is shown in the sidebar. Now if you are not a Unix freak, this will mean nothing. But this technique allowed me to record the equivalent of a WordBasic macro and to run it as a batch process, all this about 10 years before such tools existed. I have always regarded this exercise as proof of the superb interoperability of the operating system and its files, and of the power of redirection.
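For a simpler, purely mechanical change, the same batch job could of course be done with one of the "little languages" directly. The following is a minimal sketch rather than the editor's original method - the substitution pattern, the file glob and the .bak backup suffix are illustrative assumptions -
************************
# keep a backup, then rewrite each file through a sed filter on the byte stream
for file in *
do
cp "$file" "$file.bak"
sed 's/OLD/NEW/g' "$file.bak" > "$file"
done
************************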
Open?
Unfortunately, this true openness was not reflected in vendors' products. One early database bypassed the Unix file system completely. Yes, it ran on a "Unix" HP, but it ignored everything except the hardware. Personally, my first education in how vendors treated Unix's orderly arrangement of system files came when we installed our first commercial software, a database development system. The installation involved the creation of a sort of parallel universe when it came to the device drivers for the terminals. I asked why they didn't use the same drivers as were already on the system. The answer was that that was the way it was. We soon had as many /etc/tty/term files as we had software packages on the system. Often they "assumed" that our terminals were in different emulations - so our users initially had to switch emulation mode as a function of the software they were using. We later fixed this with a shell script, but what an unnecessary burden on the system administrator. Today's vendors act pretty much the same way. Some obstinately refuse to provide simple ascii output from their programs, effectively isolating them from the rest of the world. To my way of thinking, a program which does not - at least as an option - accept ascii input from, or provide ascii output to, the byte stream is not Unix.
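To see what that ascii discipline buys, consider a minimal, purely illustrative sketch: if a program can dump, say, depth and porosity as plain ascii columns, any other tool can pick them up from the byte stream. The file name, column layout and depth cutoff below are assumptions, not a reference to any particular vendor's product -
************************
# average the porosity column (column 2) below a 2000 m cutoff,
# reading plain ascii depth/porosity pairs from the byte stream
cat well_log.asc | awk '$1 > 2000 { sum += $2; n++ } END { if (n) print n " samples, mean porosity " sum/n }'
************************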
What went wrong?
The answer is, as we have already intimated, that the vendors did not play the game. Why did they not play the game? Some suggestions are:
laziness - probably the case for the /etc/tty story
Graphics and X came along and clouded the issue; the focus was now on event-driven, user-interactive programming, and the byte stream was no longer an issue for most.
"Real" non-quiche eating programmers were more interested in learning C than in using the tools that had already been developed.
But these are anecdotes compared with the real reason for Unix's abandonment of the interoperability paradigm. The real reason was that this "Berkeley freeware" aspect of Unix was of no interest to the vendors whatsoever. They had serious ulterior motives. They wanted to sell their software and hurt the competition.
Chapter 2
The next chapter in this brief history is that of the PC. Since the story has been told many times, we won't repeat it here. What I want to discuss is the openness or otherwise of the PC in the light of what we've just been discussing. First, the hardware front. It is hard to conceive of a more open system. Plug-and-play technology really is an amazing development coming from such a diverse assembly of manufacturers. I recently installed a 3Com PCMCIA card in my Toshiba. I was getting nowhere until I realized that you do not have to install it. I removed all the files I'd laboriously copied from the disk, removed the disk, plugged the card in and that was it. Amazing.
COM - on!
PC software is, for the main part, not Open. It is of course Microsoft and proprietary. It is, nonetheless, no less Open than Unix. In fact today, the COM and OLE specifications for interoperability are far ahead of the Unix equivalent (CORBA - see the August issue), which itself is conspicuous by its absence from E&P software. Unix may well be caught with its pants down by this technology, which is in the shops now. In fact Visio and Micrografx Designer are now touted as being almost part of the Microsoft Office suite, by Microsoft themselves. You can also program your own OLE containers using relatively simple languages such as Visual Basic. COM/OLE is furthermore particularly well suited to the new distributed computing paradigm offered by the Internet - which is the next stop on our tour.
wolf-pack
It really is history repeating itself. The wolf pack is howling about Open Systems again. Netscape is a "strong supporter" of Open Systems (when in fact their huge stock-market success was the result of dumping their Browser onto the market - a Browser which just happened to have a few proprietary "extensions" to HTML). Meanwhile the Oracle/Sun/Netscape/IBM network computer is likewise "dedicated" to open systems, but will include the Mastercard/Visa proprietary authentication technology in hardware (so it will only transact with a similarly equipped device). Sun's Java programming language will also undoubtedly be turned around to offer a competitive edge to its inventors somehow.
trojans
What is important in all this is that the IT world is not made up of goodies and baddies. There are occasions where "standards" are unequivocal marketing ploys. They are frequently announced before they are available, in order to spread Fear, Uncertainty and Doubt (FUD) while they are finished off. Most standards are really just protocols. True standards are born, rarely made. On the other hand, while proprietary extensions to a standard may be a Trojan horse into an established open system, they can hardly be deprecated, because they are a traditional route to enhanced systems. Remember the VAX FORTRAN extensions? Or SGI today.
which open standard?
Interoperability is in our hands. You can always decline to buy a product that does not comply with some minimal specification - ascii in and out on the byte stream would be a good place to start - while we wait for full compliance with the CORBA standard. Or should that be a POSC Business Object, or a PPDM-compliant data store, or COM/OLE?... here we go again! Finally, wars can have unpredictable results. Before Unix, most E&P software ran on Digital Equipment (DEC) hardware. Unix was itself developed on DEC and while Unix's inventors shot first at IBM, they actually missed and hit DEC! On the other side of the fence, one of the most amazing paradoxes of the IT scene today is the fact that when you buy an "IBM-PC", IBM generally speaking receives nothing from the transaction, since the OS and software is nearly all Microsoft and the hardware specifications are in the public domain!
Stone-age macro editing with vi
First, capture the keystrokes with -
************************
vi file1 |tee keystrokes.dat
************************
perform the editing as appropriate, then run a shell on the other files as follows -
************************
for file in *
do
vi "$file" < keystrokes.dat
done
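As an aside from the editor's account: many versions of vi will refuse to take their commands from a redirected stdin, so the loop above may not replay cleanly everywhere. A modern equivalent of the same record-and-replay idea, assuming vim is installed (the -w and -s options and the file names below are not part of the original sidebar), would be -
************************
# record every keystroke of the first edit session into keystrokes.dat
vim -w keystrokes.dat file1
# replay the recorded keystrokes against the remaining files
for file in file2 file3 file4
do
vim -s keystrokes.dat "$file"
done
************************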
It is far easier for a company like Scout to develop a product on Windows because of the elegant development tools available (far superior to Unix dev tools, and far cheaper too). I estimate it would have cost us an extra $50,000 in hardware and software, plus an extra 6-8 months development time, to create a probably inferior product on, say, Solaris. We are able to spend the savings cramming features into the product. End users can tell the difference. The same argument applies to SeisVision. Way more elegant than SeisWorks. Still, it's less capable because of the huge feature set SeisWorks has accumulated over the years. But it's easier to add features to SeisVision than to SeisWorks.
Seismic ActiveX
Scout Systems will soon be releasing a product having no analog in the Unix world. The Scout Controls are a set of drop-in ActiveX controls for manipulating and visualizing seismic data, velocities, well logs, etc. Users drop these controls onto forms in Microsoft Access, Excel, Visual Basic, or even Web pages, and write Visual Basic scripts to control them. The controls retrieve and store seismic data, velocities, etc. to and from any database or standard file format. Using the Scout Controls and Microsoft Access, and a 70,000 trace prestack 3-D survey, for example, I have demonstrated a custom velocity analysis application that took only 4 to 6 hours to get running. The idea behind Scout is to leverage these easy-to-use and inexpensive Microsoft applications, enabling power users to manipulate exploration data to solve special problems with minimum programming.
Bison Instruments
The performance of Scout-based tools is very good, because Scout
Controls are compiled for the native processor -- you just use Visual Basic to script the
controls. For example, the Normal Moveout control runs at Pentium speed, and in Visual
Basic you just script the control to move out a trace, or a few thousand traces. Bison
Instruments will roll out a Scout-based seismic acquisition system in November at SEG. The
system uses Scout for remote seismic processing and seismic display, and stores all
acquisition and processing parameters in an Access database. The system is set up for turnkey operation, but power users can modify the system by scripting a few lines of Basic.
Client-server Computing Part III (September 1996)
PDM concludes the three part series introducing client-server computing. This month, the downside of C/S.
What's bad about C/S computing?
Now for the down side. Client-server has a number of negative things going for it too. It's hard to put them in order, but certainly near the top of any list should be the amount of misleading information surrounding the whole idea. Client-server has been grossly oversold. And since it lacks a single, clear definition, the door is wide open to general misunderstanding. It's widely believed that client-server is a modern-day replacement for hierarchical computing solutions, for example. But as we discussed earlier, that's usually not true (nothing processes heaps of information better, and keeps data safer, than the good old mainframe).
misconception
Another popular misconception is that client-server is inexpensive (at least compared to the style of system development and implementation we've known in the past). Again, not true. Client-server may use inexpensive PCs as clients, but when everything else is factored in - education, more demanding application development, security, systems management and maintenance - such projects often turn out to be more expensive than a comparably-sized mainframe effort. But more than just weightier implementation costs, many of these same issues can threaten the very success of a client-server project. Heterogeneity also weighs heavily against client-server success. There's no guarantee that a database engine from vendor A will work with a client application built with a tool from vendor B if client and server are connected by a network supplied by vendor C -- if it's Tuesday, and the moon is full. Seriously though, when you stop and take everything into account -- operating systems, networking protocols, CPU designs -- it's amazing that it works at all. This is the price we pay for living in a non-open systems world.
not new
Client-server is still relatively new. Not only is there a lack of widespread experience, but there's an acute shortage of comprehensive tools. This is perhaps most evident in the areas of systems management and security. To date, there are few products able to simultaneously monitor and control both mainframe and LAN-based systems: bad news for client-server, which often lies somewhere between the two. The state of client-server security is even less cheery. Apart from an almost complete absence of tools, client-server's "information for the people" philosophy flies directly in the face of historical security practices. This is certain to become the topic of much debate over the next few years.
Abstracted from Dixon, R., "Client/server & Open Systems: A
Guide to the Techniques and Tools That Make Them Work", John Wiley & Sons, New
Intel and the New Workstation Industry (September 1996)
Intel's Pentium Pro processor is likely to blur the boundary between PCs and workstations.
As you will have gathered from other comments in this edition, today's
top-flight PCs can successfully compete with the low to middle range of Unix-based
workstations. At the top end, things are rather different as high performance graphics
subsystems, wide bandwidth processors and other specialised bits of technology push the
workstation's capability beyond that of the PC. Nothing in the E&P business stays
still for long and Intel is working with others in the industry to create a new generation
of technical workstations based on the high performance Pentium Pro processor. Designed to
integrate with existing scientific and engineering computing environments, Intel powered
systems are said to provide all of the functionality and performance traditionally
associated with workstations. With functionality equal to some of the most powerful
systems available, including symmetric multiprocessing, integrated networking, powerful
graphics, a broad range of compatible hardware and peripherals, and incorporation of key
standards for interoperability, there is little doubt that they can meet the most
demanding requirements. With the added benefit of a choice of two powerful 32-bit
operating systems, Solaris or Microsoft Windows NT, users can readily configure a
solution that meets their needs. This bet hedging between Solaris and Windows NT should be
of particular interest to the Sun-dominated E&P sector. With a quite respectably
Seen on the Net - PC vs Unix Round 1 (September 1996)
A debate has been raging on the sci.geo.petroleum newsgroup about whether or not the PC can replace the Unix workstation on the E&P desktop.
John Rogers (Texaco) asks: "Can anyone point me to a vendor that can give me a cheap replacement for our $60,000 workstations (25 fully loaded Indigo2 High Impact workstations from Silicon Graphics)? The PCs that replace these workstations need to have:
An OS equivalent to or better than the 64 bit UNIX OS from SGI (IRIX 6.2)
A high speed graphics card giving me the same 3D capability as the High Impact
A fast ethernet card (100BaseT for 100Mbps speeds)
Dual 20" CRTs
A "PC" kind of price... around $5,000 I suppose; anyway a whole lot cheaper than my $60,000 workstation
A file address space of at least the terabyte that I get from HFX on my Indigo2 workstations
And the whole mess has to be tightly integrated and highly reliable like our Indigo2 workstation... don't want one vendor's PC card conflicting with another's as seems to happen so frequently with our "plug and play" Win95 PCs
And this PC must be able to run PVM over the fast ethernet so we can do pre-stack depth migration on our seismic data in the evenings and on the weekends when our interpreters don't need them... that is, the OS must be multi-user as well as multi-processing.
I work at Texaco and know that Texaco would gleefully embrace any vendor that can replace our expensive workstations with such PCs with capabilities as above. There are some big bucks weighing in the balance. Dream vendor where are you??"
Now this seems a bit of a wind-up if you ask us; it sounds rather as though JR is saying that his needs will always be a Gig or so beyond those of a mere PC.
Hugh Winkler took up the challenge to point out that "The only feature .. that an
NT machine doesn't meet is the 64 bit OS", and James Huang took up the torch with
the following comment "IMHO, I do not see what magnitude of difference there would
be between 32-bit and 64-bit in, for example, workstation seismic interpretation. But then
I am a naive non-interpreter. <g>". Hugh also states that "I have
yet to see a system set up with more than 1GB swap." And that while "a
USD7/MB might be true for PC SIMM chips, proprietary RAM for *nix systems come at a higher
price. In these cases it is cheaper to get more storage and create big swap systems. The
case of loading the whole data-set into memory so that things get speeded up by a few
minutes real time is, IMHO, a waste of resources." On the subject of applications migrating from Unix to Windows, Winkler suggested that companies who ported their X Windows software would lose out to those who wrote them from the ground up in native MS-Windows code, citing the GeoGraphix SeisVision application as an example of what could be achieved on a PC.
how friendly?
On the topic of user friendliness, Martin Crick (Texaco) believes that
one of the issues that will push more companies to use NT apps is that "all the
other non-technical stuff, like spreadsheets and word processors, are MUCH better and
cheaper on the Intel platform, because the market is much larger. If you don't want to put
two systems on a desk, NT seems to offer many advantages". And James Huang adds
that "with the large offering of X-servers for the PC market running under all the
OSs it is no problem connecting the different worlds".
PDM comment
The PC/Unix debate has been raging since 1981, when IBM first introduced its (8 bit) PC. Then, one argument used in favor of Unix was that it was 32 bits. But this had more to do with the Intel architecture of the PC, which put a variety of 64k (i.e. 16 bit, if you're still with me) barriers in the way of the programmer. Since then, word length has been used as a marketing tool. It has nothing to do with precision, since a double float will take up 8 bytes on whatever system. It may be processed more efficiently if the system is 64 bit. But on the other hand, it is conceivable that integer operations would actually be slowed down by a longer word length. Boolean operations, taking one bit, may suffer even more. This supposes that all other things are equal, which of course they never are. What made the Unix boxes better than PCs was the power and intrinsic interoperability of Unix in those early days (but see the editorial in this issue on this topic), and also the flat memory model. This simply meant that it was possible to create and manipulate very large arrays in memory without having to go through the 64k hoops. Nowadays Windows 95 and Windows NT both promise 32 bit operating systems. The ease of use, robustness and security of the latter in particular are touted as being superior to many Unix systems; time will tell. The flat memory model exists in Windows NT, while Windows 95 still has a lot of the DOS limitations lurking below the surface. But it should not be up to the programmers to debate these issues. Who cares how many bits, or how flat your memory, when your application does what you want it to, doesn't cost too much and, most importantly, is intuitive and easy to learn? We believe that Unix has had a macho image associated with it. Why have a graphic shell when a real system administrator can edit the /etc/passwd file in vi to add a user? To date the PC has scored hands down over Unix in terms of ease of end-use. With Windows NT, this ease of use is being brought to the system administrator. Bad news for those who like doing it the hard way.
EarthVision addresses the connectivity issue (September 1996)
California-based Dynamic Graphics has just released version 3.1 of its EarthVision integrated family of geospatial analysis software for the earth sciences.
Highlights of the release, according to the company, are the new
mapping and export capabilities from the Geologic Structure Builder (GSB) and the
Geostatistics module which incorporates capabilities from Stanford University's GSLIB
geostatistics software library. Launching the new product, Art Paradis, company president,
said that one of the ways Dynamic Graphics was working to make its clients' lives easier
was to address the connectivity issues between the industry's leading 3D reservoir models
and other widely used software. Paradis claimed "Moving EarthVision models to the
simulation phase of a project should be significantly smoother with our new GSB export
capabilities." GSB constructs 3D models of complex layered, faulted and non-faulted
areas based on scattered data, surfaces, 3D grids and a fault hierarchy. GSB structure and
property distribution models can now be exported to formats suitable for
upscaling/upgridding software including the Grid program from GeoQuest Reservoir
Technologies and the GridGenr and GeoLink programs from Landmark. Complex horizons from
GSB models can also be exported to 2D grids providing a link for post-processing of
precise EarthVision model components using other software.
kriging
The geostatistics module incorporates the ordinary kriging modeling algorithm from Stanford's GSLIB geostatistics software library. Graphical data preview and variogram generation capabilities are available to set up and optimize model calculation. Variance models can be calculated to help quantify model accuracy, in addition to the 2D and 3D models produced.
Well planning
Dynamic Graphics has added two visual techniques to the EarthVision3D viewer for well planning activities. The Snap to Surface function provides the 3D co-ordinates of a feature of interest picked on a model surface. The Extend Well function calculates the relative azimuth, relative inclination and extension distance from the endpoint of an existing well to a graphically-picked extension point. Another analytical feature provides isochore maps based on 3D GSB models offering information for reserve calculations.
Audit
Other enhancements being promoted by the company include a significant
increase in project size to 2 million input scattered data points and 2 million 3D grid
nodes, a pre-release of a new 3D viewer which is said to provide up to 300% improvement,
additional local faults for intermediate surfaces calculated by the GSB Horizon Gridder, new formula processor functions, perspective views of 2D surface-based projects and the
EarthVision Notebook, described as an interactive program that can store a detailed record
of processing steps and workflow during a project. The release is available immediately
for Silicon Graphics, IBM RISC System/6000 and Sun workstations. The new geostatistics
module is initially available for Silicon Graphics and IBM and will be ported to other
platforms in due course. The four basic systems are priced in the US from $12,500 to $70,000. Any system can have a Network License Controller module which allows a software seat
Petroleum connection offer from IBM (September 1996)
IBM announces PetroConnect, a Lotus Domino-based application for secure electronic commerce in the petroleum industry.
IBM has a new product based on the perception that rapid changes in petroleum industry trends require new network solutions. Petroleum professionals require easy access to company information as well as external information. They need to reduce the time spent finding resources and to have the flexibility of acquiring data, services and products in a timely and efficient manner. They also need to be connected to project team members, whether these are other company employees or in partner organizations, and wherever they geographically reside.
Groupware
IBM PetroConnect is a service that brings petroleum professionals new capabilities for staying current on news and events, searching for relevant research, statistical findings and cultural information, accessing digital databases - maps, surveys, well logs, seismic data for acquisition - and collaborating with project
team members through robust messaging and workgroup functions. Domino, Lotus' interactive
web development platform, is the foundation for building IBM's PetroConnect, a web-enabled
application for secure electronic commerce in the petroleum industry. PetroConnect was
New partner for Ensign consortium (September 1996)
AGIP has joined Ensign's seismic data processing research group working on the application of fuzzy logic and other innovative techniques to aid the on-board seismic processor.
Founded by Texaco Britain Ltd. and Shell U.K. Exploration and Production, this two-year program was initiated to facilitate on-board processing of marine seismics. The research effort is intended to provide assistance via neural network computing, artificial intelligence and pattern recognition software.
Hostile environment
Such technology is intended to enable personnel working in an unfriendly environment to perform seismic processing tasks at a level comparable to their more experienced land-based counterparts. Ensign is building a software package to automate analysis, parameter selection and management of seismic data. Initial target activities include velocity analysis, noise threshold estimation, data management and organization of the processing. Modules will be linked through a comprehensive database of seismic data and attributes. The selected hardware platform for this program is, you've guessed it, Windows NT. This is seen as an attractive solution for data management and analysis. Seismic processing, which is not part of the project, will be performed on systems such as Ensign's IBM SP2. Initially, command files are to be generated on the PC pending implementation of a client-server link to allow direct control of the processing from the PC.
Swords to plowshares
In an analogue of the US swords to plowshares program, UK defense
industry establishments such as Winfrith and Harwell will be contributing expertise in the
fields of expert systems, neural nets and fuzzy logic to the program. The UK Government
through the Oil and Gas Projects and Supplies Office (OSO) is also providing funding for
this project. Ensign make the bold claim that "all components of the package will be consistent with the recommendations of the Petrotechnical Open Software Corporation (POSC)". It is clear, however, that given the seismic focus of this project, it is unlikely that Epicentre will be playing a role in this compliance, which will more likely be focusing on the user interface. Here too, POSC compliance is difficult to justify since
Basin modeling software launched (September 1996)
New basin modeling software for the PC claims enhanced oil-generation computation and calibration.
A basin modeling software package, TerraMod, has been launched by a new
UK company of the same name. The advanced program allows oil and gas explorers to define,
with much greater precision than has previously been possible, the history of sediment
burial and compaction, the timing and magnitude of source rock maturation and the
volumetrics of oil and/or gas migration in an exploration region.
rigorous
Through rigorous calculation routines utilising automated calibration
checks, TerraMod precisely simulates basin development and thus derives accurate models to
fit the specific geology of an area. Its backstripping and compaction algorithms,
lab-derived lithological parameters and fully coupled pressure and temperature
calculations ensure that each basin model can be optimized to match precisely a specific
area of interest. TerraMod is PC-based and incorporates a project and well databasing
system, a state of the art intuitive user interface, the capability for single and
multi-well simulation runs, model optimisation tools and a high quality graphical and
Seismic Data Management Operations, from Archives to Desktop (September 1996)
Bruce Sketchley of Panther Software offers his perspective on seismic data management.
The purpose of this article is to give a brief overview of the status
of the options oil companies are encountering today in the area of seismic data
management. Data management in general is the topic "du jour" right now and
seismic data management in particular is the subject of a lot of that attention. The issue
of how to better manage seismic data, both the metadata and the traces themselves, has become a point of focus for oil companies and product/service providers.
Refinement
The market has been defining itself over the last two to three years via a steadily increasing number of seminars, industry discussion groups and conferences, and through the gradual refinement of the product and service offerings that vendors bring to the table. It appears at this time that the options oil companies have to choose from fall into two main areas. The first and the most competitive arena is that of seismic archiving. By archiving, I am referring to the broad range of issues relating to the managing of the large volumes of seismic trace data that companies have accumulated over past years and that is being rapidly compounded by the vast quantities of new data being generated every year. The problem set includes the issues around building and maintaining accurate inventory-type databases referencing the trace data volumes, but possible solutions key on how to handle the conversion of huge data volumes from old media/storage techniques into newer, more effective access and retrieval systems. This has turned out to be the domain of the bigger vendors involved in providing seismic data management options. To date the primary players in the game are companies like IBM/Petrobank, Schlumberger and CGG/PECC on an international basis, with a number of other solutions beginning to surface in local or regional areas.
Big bucks!
It has become a hotly contested area of business for these firms as they all go after the large dollar amounts that will be spent by those companies and countries who have identified seismic data management as something that will receive serious budget allocations. The expenditures can be very large in some cases, so it is a very competitive market. The vendors offer a range of options covering different media, robotic tape systems, varying levels of interactive data access, associated tape transcription services and so on. Potential customers will end up making decisions on which vendor to go with based on the relative technical and performance merits of the different offerings, how well they integrate with other data management initiatives, cost etc. As a means of dealing with the inherent high costs of large-scale seismic archiving options we have also seen the evolution of consortium-type initiatives by multiple oil companies to leverage the economy of scale and shared resource advantages available. The best known current examples of this are the Diskos project in Norway and the CIDAI group in England (currently just for well logs but now focusing on seismic).
Missing-link
However, the essential piece missing to date has been the managing of seismic data at the desktop level inside the oil company itself. The problems every company wrestles with in trying to deliver quality data in a timely manner to the people who end up deciding where to drill a well are well known and consistent worldwide.
The issue internally has always been how to give end-users enough information about the seismic data the company has access to so they can make intelligent decisions about what data to use and then ideally just go get the actual files. Today this is a huge workflow problem for most companies and it is getting worse as the volumes of data increase dramatically.
Fundamental change
Archiving solutions by themselves will not solve the problem. Companies need an option they can implement that allows them to fundamentally change how they handle seismic today - they need to change the processes around managing seismic data within their organizations now, or the problems will just get worse. Solutions such as the SDMS system from Panther Software Corp., based in Calgary, Canada, are targeted directly at this end of the problem set. SDMS provides users with the capability to directly query all the metadata describing their seismic files and then literally use a "drag and drop" operation that retrieves the relevant seismic trace data and loads it directly into the target interpretation application. Equally important, users can also now easily move seismic data between different vendors' applications, and the number of applications available to work with in this way should increase quickly as Panther makes available a development kit that other developers can utilize. Oil companies can correct those practices and processes that directly impact the people who look for oil and gas, and this should positively affect the business drivers related to reducing cycle times and risk, improving success ratios and return on investment.
Real question
So, for oil companies that see seismic data management as a key problem area, the real question will come down to how to select and implement a solution that
solves both the immediate, internal "get the data to the user" problem and the
longer term archiving strategies. Fortunately, companies have the option of addressing
both issues concurrently or sequentially and utilizing the scalability inherent in these
systems to build future costs into the budgeting process. There is a strong willingness by
the different vendors to work together for more competitive solutions. Panther, for example, has established business relationships or is in discussion with companies like IBM, Schlumberger/GeoQuest and Landmark/Halliburton to better integrate the offerings
each company brings to the data management arena. Given the scope of the seismic data
management problem and the fact that it is just a component of the overall data management
environment, it will involve a combined effort by multiple vendors and customers to achieve
optimal results. The best answer will be one that focuses on both sides of the seismic
OASIS montaj - not a typo, it's Geosoft's latest offering! (September 1996)
Canadian software house Geosoft has released its OASIS montaj data processing environment for the Windows 95 and NT operating systems.
Designed for mineral exploration, petroleum exploration, environmental
investigations and other earth science applications, OASIS montaj is said to be a new
generation of software that allows geoscientists to interactively import, manage,
visualize, process, analyze, present and archive located earth science data on a
Pentium-based personal computer. Ian MacLeod, Geosoft's director of research and
development and CEO, is making a big claim for the product: "OASIS montaj represents
the most important advance in PC-based earth science data processing in 10 years. This
technology enables geoscientists to efficiently integrate, process and manipulate
unlimited volumes of data while staying directly in contact with the original data - with
live, visual links to databases, profiles, images and maps. In addition, our users will
benefit from a software architecture that lets us add new functionality more rapidly than
ever before. ". Plug-ins
The system comprises the core OASIS montaj environment and advanced plug-in application suites consisting of specialized routines called Geosoft executables (GXs) that perform specific data processing and analysis tasks. An application development tool called the GX Developer is available for earth scientists who want to reuse existing
applications in OASIS montaj or who want to create their own GXs. Users can also take
advantage of the custom application development expertise offered by Geosoft's technical
services group. In designing a standard platform for data management, processing and
interpretation, Geosoft says it was driven by customer requests for a Windows-based
Geoscan in Egypt (September 1996)
Spectrum has introduced seismic scanning services in Egypt using the company's Geoscan software package. The service is being operated in Cairo by Spectrum-Geopex, a joint venture with the Egypt-based company Geopex.
Landmark's poll on the Halliburton takeover (September 1996)
Landmark stockholders meet on 4 October in Houston to vote on the proposed sale of the company to Halliburton. Only stockholders of record as of 29 August 1996 will be eligible to vote.
GeoQuest signs agreement to sell iXL (September 1996)
Schlumberger subsidiary GeoQuest announced in early September that it had signed a joint marketing agreement with Mercury International Technology (MIT) to sell and distribute worldwide MIT's iXL, a 2D, 3D and 4D seismic processing system for both land and marine seismics.
iXL offers fully integrated survey design and processing and a platform-independent interface, and supports all major geometry and data formats. It
features a variety of processing modules such as demultiplex, final migration and seismic
inversion to support its customers' needs. Since 1976 MIT has provided seismic processing
services and software for a wide range of workstation environments. With headquarters in