John Rogers (Texaco) asks
"Can anyone point me to a vendor that can give me a cheap replacement for our $60,000 workstations (25 fully loaded Indigo2 High Impact workstations from Silicon Graphics). The PCs that replace these workstations need to have:
An OS equivalent or better than the 64 bit UNIX OS from SGI (IRIX 6.2)
A high speed graphics card giving me the same 3D capability as the High Impact
A fast Ethernet card (100BaseT for 100 Mbps speeds)
Dual 20" CRTs
A "PC" kind of price... around $5,000 I suppose; anyway a whole lot cheaper than my $60,000 workstation
A file address space of at least the terabyte that I get from HFX on my Indigo2 workstations
And the whole mess has to be tightly integrated and highly reliable like our Indigo2 workstation... don't want one vendor's PC card conflicting with another's as seems to happen so frequently with our "plug and play" Win95 PCs
And this PC must be able to run PVM over the fast Ethernet so we can do pre-stack depth migration on our seismic data in the evenings and on the weekends when our interpreters don't need them... that is, the OS must be multi-user as well as multi-processing.
I work at Texaco and know that Texaco would gleefully embrace any vendor that can replace our expensive workstations with such PCs with capabilities as above. There are some big bucks weighing in the balance. Dream vendor where are you??"
Now this seems a bit of a wind-up if you ask us; it sounds rather as though JR is saying that his needs will always be a Gig or so beyond those of a mere PC. Hugh Winkler took up the challenge to point out that "The only feature ... that an NT machine doesn't meet is the 64 bit OS", and James Huang took up the torch with the following comment: "IMHO, I do not see what magnitude of difference there would be between 32-bit and 64-bit in, for example, workstation seismic interpretation. But then I am a naive non-interpreter. <g>". Hugh also states that "I have yet to see a system set up with more than 1GB swap", and that while "a USD7/MB might be true for PC SIMM chips, proprietary RAM for *nix systems come at a higher price. In these cases it is cheaper to get more storage and create big swap systems. The case of loading the whole data-set into memory so that things get speeded up by a few minutes real time is, IMHO, a waste of resources."
On the subject of applications migrating from Unix to Windows, Winkler suggested that companies who merely ported their X Windows software would lose out to those who wrote theirs from the ground up in native MS-Windows code, citing the Geographix SeisVision application as an example of what could be achieved on a PC.
how friendly?
On the topic of user friendliness, Martin Crick (Texaco) believes that one of the issues that will push more companies to use NT apps is that "all the other non-technical stuff, like spreadsheets and word processors, are MUCH better and cheaper on the Intel platform, because the market is much larger. If you don't want to put two systems on a desk, NT seems to offer many advantages". And James Huang adds that "with the large offering of X-servers for the PC market running under all the OSs it is no problem connecting the different worlds".
PDM comment
The PC/Unix debate has been raging since 1981, when IBM first introduced its (8-bit) PC. Back then, one argument used in favor of Unix was that it was 32-bit. But this had more to do with the Intel architecture of the PC, which put a variety of 64k (i.e. 16-bit, if you're still with me) barriers in the way of the programmer. Since then, word length has been used as a marketing tool. It has nothing to do with precision, since a double float takes up 8 bytes on whatever system. It may be processed more efficiently if the system is 64-bit, but on the other hand it is conceivable that integer operations would actually be slowed down by a longer word length, and Boolean operations, needing only one bit, may suffer even more. This supposes that all other things are equal, which of course they never are. What made the Unix boxes better than PCs was the power and intrinsic inter-operability of Unix in those early days (but see the editorial in this issue on this topic), and also the flat memory model. This simply meant that it was possible to create and manipulate very large arrays in memory without having to jump through the 64k hoops. Nowadays Windows 95 and Windows NT both promise 32-bit operating systems.
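By way of illustration (our own sketch, not taken from any of the correspondents), the following minimal C program makes both points: sizeof(double) comes back as 8 bytes on a 32-bit PC and a 64-bit workstation alike, and with a flat memory model a multi-megabyte array is a single malloc() call away, with no segment arithmetic required.

/* size_check.c - two small points from the comment above:
 *  1. a double is 8 bytes whatever the word length of the machine,
 *     so "64 bit" buys address space and throughput, not precision;
 *  2. on a flat memory model a large array is just one malloc() call,
 *     with no 64k segment hoops to jump through.
 */
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    double *trace;
    size_t n = 4 * 1024 * 1024;            /* 4 million samples, ~32 MB */
    size_t i;

    printf("int    : %lu bytes\n", (unsigned long) sizeof(int));
    printf("long   : %lu bytes\n", (unsigned long) sizeof(long));
    printf("float  : %lu bytes\n", (unsigned long) sizeof(float));
    printf("double : %lu bytes\n", (unsigned long) sizeof(double));

    trace = (double *) malloc(n * sizeof(double));
    if (trace == NULL) {
        fprintf(stderr, "not enough memory\n");
        return 1;
    }
    for (i = 0; i < n; i++)                /* touch every element: one  */
        trace[i] = (double) i;             /* flat array, no 64k hoops  */
    printf("allocated and filled %lu doubles\n", (unsigned long) n);

    free(trace);
    return 0;
}

Compiled with any ANSI C compiler, the sizes reported for int and long will vary from one platform and compilation model to another, but double stays at 8 bytes throughout.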
The ease of use, robustness and security of the latter in particular are touted as being superior to many Unix systems; time will tell. The flat memory model exists in Windows NT, while Windows 95 still has a lot of the DOS limitations lurking below the surface. But it should not be up to the programmers to debate these issues. Who cares how many bits you have, or how flat your memory is, when your application does what you want it to, doesn't cost too much and, most importantly, is intuitive and easy to learn? We believe that Unix has had a macho image associated with it. Why have a graphical shell when a real system administrator can edit the /etc/passwd file in vi to add a user? To date the PC has scored hands down over Unix in terms of ease of end-use. With Windows NT, this ease of use is being brought to the system administrator. Bad news for those who like doing it the hard way.