Recent debate on the Usenet newsgroup sci.geo.petroleum (see article in this issue) has focused on the hardware side of the Unix vs Windows battle for IT market supremacy. Here I'd like to focus on the operating system side of the debate. The hidden agenda is a discussion of Standards, Open Systems and Competition, and especially an attempt to show how these words don't always mean what they say, and how frequently the first two are deformed, or even abandoned, in pursuit of the third. This is a personal account of how I have seen the industry evolve rather than an attempt to be exhaustive, but I'm sure that many will recognize some of the following. We will focus particularly on the boundaries between the stages - the paradigm shifts or evolutionary changes, real or perceived - and will try to show how today's "Standards Wars" are really a battle between personalities, particularly Bill Gates (Microsoft) and Larry Ellison (Oracle), and how they relate back to previous upheavals in the industry.
Big (bad?) Blue
First we'll look at the way IBM was perceived in the early days of Unix. There was a kind of 1960's mentality about, whereby Big Blue was very much the "baddy" - its deprecated programming techniques likened to "kicking a whale across a beach" by B. Kernighan (a founding father of Unix and the C programming language). Similarly, proprietary hardware and software were done down and presented as something of a rip-off. Unix came along with the promise of cross-platform stability (vi was vi on a Sun, an HP or whatever) and, most importantly, software interoperability. This was achieved through the use of pipes and filters acting on the byte stream - and a very powerful tool it was too. It led to the development of a powerful shell programming language and other "little languages" such as sed and awk, and later to perl, cgi and perhaps even Java. You could do anything. I once had to edit a bunch of files, all in the same rather complicated way.
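To give a flavour of what that byte-stream interoperability meant in practice, here is a minimal pipes-and-filters sketch (my illustration, with invented data - not from the original article): three small single-purpose tools chained together to extract, sort and de-duplicate one field of some colon-separated records.

```shell
# Pipes and filters on the byte stream: each tool does one job,
# plain ascii glues them together (data invented for illustration).
printf 'jones:vt100\nsmith:wyse60\njones:vt100\n' \
    | cut -d: -f1 \
    | sort -u
```

Each stage neither knows nor cares what produced its input or what will consume its output - exactly the interoperability the vendors would later walk away from.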
Tee for two
I could have used awk, but even that would have required a lot of head scratching, so I tried editing the first file with vi while "tee"ing the output to a file. When my first edit was through, the tee'd file contained a record of all the keystrokes used during the edit operation. The next step was to write a shell script which echoed the keystroke file to the stdin of a vi session on each of the other files. The code (from memory I'm afraid - I'm not sitting at a Unix box at the moment, and not all Unixes offer this flexibility) is shown in the side bar. If you are not a Unix freak, this will mean nothing. But the technique allowed me to record the equivalent of a WordBasic macro and to run it as a batch process, about 10 years before such tools existed. I have always regarded this exercise as proof of the superb interoperability of the operating system and its files, and of the power of redirection.
Unfortunately, this true openness was not reflected in vendors' products. One early database bypassed the Unix file system completely. Yes, it ran on a "Unix" HP, but it ignored everything except the hardware. My own first education in how vendors treated Unix's orderly arrangement of system files came when we installed our first commercial software, a database development system. The installation involved the creation of a sort of parallel universe when it came to the device drivers for the terminals. I asked why they didn't use the same drivers as were already on the system. The answer was that that was the way it was. We soon had as many /etc/tty/term files as we had software packages on the system. The packages often "assumed" that our terminals were in different emulations, so our users initially had to switch emulation mode as a function of the software they were using. We later fixed this with a shell script, but what an unnecessary burden on the system administrator. Today's vendors act in much the same way. Some obstinately refuse to provide simple ascii output from their programs, effectively isolating them from the rest of the world. To my way of thinking, a program which does not, at least as an option, accept ascii input from, or provide ascii output to, the byte stream is not Unix.
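The emulation-switching fix mentioned above could have looked something like the following sketch: a wrapper that exports the terminal emulation each package expected before launching it. The package names and TERM values here are invented for illustration; the original script is long lost.

```shell
# Hypothetical wrapper: set the terminal emulation each package
# "assumed" before launching it, sparing the user a manual switch.
# Package names (dbdev, mapper) and TERM values are invented.
run_app() {
    case "$1" in
        dbdev)  TERM=vt100  ;;          # the database tool assumed vt100
        mapper) TERM=wyse60 ;;          # another package assumed wyse60
        *)      TERM=${TERM:-vt100} ;;  # sensible default otherwise
    esac
    export TERM
    "$@"                                # launch the package itself
}
```

Invoked as `run_app dbdev`, the wrapper quietly does the emulation bookkeeping that the vendors left to the users.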
What went wrong?
The answer, as we have already intimated, is that the vendors did not play the game. Why not? Some suggestions:
laziness - probably the case for the /etc/tty story
Graphics and X came along and clouded the issue; the focus was now on event-driven, user-interactive programming, and the byte stream was no longer an issue for most.
"Real" non-quiche eating programmers were more interested in learning C than in using the tools that had already been developed.
But these are anecdotes compared with the real reason for Unix's abandonment of the interoperability paradigm. The real reason was that this "Berkeley freeware" aspect of Unix was of no interest to the vendors whatsoever. They had serious ulterior motives: they wanted to sell their software and hurt the competition.
The next chapter in this brief history is that of the PC. Since the story has been told many times, we won't repeat it here. What I want to discuss is the openness, or otherwise, of the PC in the light of the foregoing. First, the hardware front. It is hard to conceive of a more open system. Plug-and-play technology really is an amazing development to have come from such a diverse assembly of manufacturers. I recently installed a 3Com PCMCIA card in my Toshiba. I was getting nowhere until I realized that you do not have to install it. I removed all the files I'd laboriously copied from the disk, removed the disk, plugged the card in and that was it. Amazing.
COM - on!
PC software is, for the most part, not Open. It is of course Microsoft and proprietary. It is, nonetheless, no less open in practice than Unix turned out to be. In fact today the COM and OLE specifications for interoperability are far ahead of the Unix equivalent (CORBA - see August), which is itself conspicuous by its absence from E&P software. Unix may well be caught with its pants down by this technology, which is in the shops now. Visio and Micrografx Designer are now touted by Microsoft themselves as being almost part of the Microsoft Office suite. You can also program your own OLE containers using relatively simple languages such as Visual Basic. COM/OLE is, furthermore, particularly well suited to the new distributed computing paradigm offered by the Internet - which is the next stop on our tour.
It really is history repeating itself. The wolf pack is howling about Open Systems again. Netscape is a "strong supporter" of Open Systems - when in fact their huge stock-market success was the result of dumping their browser on the market, a browser which just happened to have a few proprietary "extensions" to HTML. Meanwhile the Oracle/Sun/Netscape/IBM network computer is likewise "dedicated" to open systems, but will include the Mastercard/Visa proprietary authentication technology in hardware (so it will only transact with a similarly equipped device). Sun's Java programming language will also, no doubt, somehow be turned around to offer a competitive edge to its inventors.
What is important in all this is that the IT world is not made up of goodies and baddies. There are occasions where "standards" are unequivocal marketing ploys. They are frequently announced before they are available, in order to spread Fear, Uncertainty and Doubt (FUD) while they are finished off. Most standards are really just protocols. True standards are born, rarely made. On the other hand, while proprietary extensions to a standard may be a Trojan horse into an established open system, they can hardly be deprecated, because they are a traditional route to enhanced systems. Remember the VAX FORTRAN extensions? Or look at SGI today.
which open standard?
Interoperability is in our hands. You can always decline to buy a product that does not comply with some minimal specification - ascii in and out on the byte stream would be a good starting point while we wait for full compliance with the CORBA standard. Or should that be a POSC Business Object, or a PPDM-compliant data store, or COM/OLE?... here we go again! Finally, wars can have unpredictable results. Before Unix, most E&P software ran on Digital Equipment (DEC) hardware. Unix was itself developed on DEC machines, and while Unix's inventors shot first at IBM, they actually missed and hit DEC! On the other side of the fence, one of the most amazing paradoxes of the IT scene today is that when you buy an "IBM PC", IBM generally speaking receives nothing from the transaction, since the OS and software are nearly all Microsoft and the hardware specifications are in the public domain!
Stone-age macro editing with vi
First, capture the keystrokes with -
vi file1 | tee keystrokes.dat
perform the editing as appropriate, then run a shell on the other files as follows -
for file in *
do
    vi "$file" < keystrokes.dat
done
© Oil IT Journal - all rights reserved.