Windows in HPC, breakthrough or bluff?

Oil IT Journal Editor Neil McNaughton ponders Microsoft’s credentials in scientific computing, competition from Linux and a full-frontal attack on Windows code quality from a UK academic.

I’m not sure what to make of Bill Gates’ talk at SC 2005 (see this month’s lead). It’s not every day that the great Gates speaks to the scientific computing community, but then again, he didn’t have to travel far. Back in 2003, Microsoft hosted an event in Houston on high performance computing (HPC) in oil and gas. We were skeptical then (OITJ Vol. 8 N° 7) and I guess I’m still skeptical. For one thing, the fact that an actual HPC offering was only rolled out in December 2005 suggests that the 2003 event had an element of fear, uncertainty and doubt (FUD) about it.

GPU

Fortunately for all of us, Microsoft’s marketing department is not going to have a free run at the HPC field, where Linux’s lead looks unassailable for the moment, not least because of its reputation for code integrity (see below). But there are other forces at work in HPC. The really interesting work is being done on esoteric hardware such as the graphics processing unit (GPU), used first by SMT for reservoir simulation and now by FineTooth for seismic processing (see page 7 of this issue). These developments have led to various damage-limitation exercises in the form of across-the-board SEG ‘sponsorship’ from Microsoft and Intel, and, from the latter, a joint presentation with NVIDIA to ‘explain’ how the GPU is not really a threat!

Hatton

Microsoft’s offerings took a beating in a recent talk by Prof. Les Hatton (Kingston University, London). Hatton’s specialty, ‘forensic software engineering’, includes evaluating software reliability by counting ‘defects’ in executables, both applications and operating systems. Code quality is measured in faults per thousand executable lines of code (KXLOC). The ‘state of the art’ is represented by the NASA Shuttle software, with a fault rate of around 0.1 faults per KXLOC. Windows 2000 is thought to have a fault rate in the 2-4 faults per KXLOC range, while Linux fares relatively well at around 0.5 faults per KXLOC.
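
To get a feel for what those densities mean at scale, here is a back-of-the-envelope sketch in Python. The 30-million-line code-base size is a hypothetical assumption chosen purely for illustration, not a figure from Hatton’s talk; only the fault densities come from the paragraph above.

# Back-of-the-envelope arithmetic: latent faults implied by a given fault
# density (faults per thousand executable lines of code, KXLOC).
# The 30-million-line code-base size is an illustrative assumption.

def expected_faults(xloc_millions, faults_per_kxloc):
    """Expected latent faults for a code base of the given size."""
    kxloc = xloc_millions * 1000  # millions of lines -> thousands of lines
    return kxloc * faults_per_kxloc

print(expected_faults(30, 2))    # 60,000 faults at 2 faults per KXLOC
print(expected_faults(30, 4))    # 120,000 faults at 4 faults per KXLOC
print(expected_faults(30, 0.5))  # 15,000 faults at 0.5 faults per KXLOC

In other words, even a factor-of-four difference in fault density translates into tens of thousands of latent faults one way or the other in an operating-system-sized code base.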

OS of choice?

Hatton has little time for Windows as an operating system, citing a mean time between failures (MTBF) for Windows 2000/XP of around 100 hours, against over 50,000 hours for Linux. His recommendation, if you want a reliable and secure operating system, is simple: ‘don’t use Windows’.
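
A rough, purely illustrative way to read that gap is to turn the quoted MTBF figures into expected failures over a year of continuous operation (hours in a year divided by MTBF):

# Rough illustration: expected failures per year of continuous operation,
# computed from the MTBF figures quoted above.

HOURS_PER_YEAR = 24 * 365  # 8,760 hours

def failures_per_year(mtbf_hours):
    return HOURS_PER_YEAR / mtbf_hours

print(failures_per_year(100))    # ~88 failures per year at a 100-hour MTBF
print(failures_per_year(50000))  # ~0.2 failures per year at a 50,000-hour MTBF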

Out of memory

Hatton has a lot more to say about code and compiler quality, and it would have been interesting to hear more from Bill Gates on such topics. Windows got a bad rap in the past, and it has improved: my own system no longer crashes as spectacularly as it used to. But I can still run ‘out of memory’ in a 20MB document, despite the 2GB of RAM I recently acquired ‘just to be on the safe side’!

Excel

When it comes to computational reliability, though, it is the folks using Microsoft Excel who are really living dangerously. Hatton cites a University of Hawaii study that found errors in over 90% of spreadsheets. Surely a warning to our financial and engineering brethren.


© Oil IT Journal - all rights reserved.