Traipsing around the tradeshows can bring some serendipitous insights to those who withstand the partying. This month’s tale is a foretaste of our comprehensive reporting from the twin conventions (Society of Petroleum Engineers (SPE) and Society of Exploration Geophysicists (SEG)) which we will bring you in next month’s Oil IT Journal. But just to lift the veil a little I’d like to relate a moment of puzzlement I experienced while listening to the very grand presentation given in San Antonio by Schlumberger and Intel on a new joint development destined to revolutionize reservoir simulation.
My incomprehension came from the fact that Schlumberger is an application software vendor and Intel is a chip manufacturer. They come from opposite ends of the IT stack—and I could not see what was in it for Intel (whose business model is to sell chips by the container-load) or for that matter, Schlumberger, whose flagship Eclipse application used to be so device independent that it would run on anything from a Cray to a toaster—at least that is how I remember the marketing material circa 1985.
The focus of all the marketing hoo-ha was of course the Linux cluster, and before you jump to the conclusion that Intel will be filling the containers with chips destined for cluster-based reservoir simulators, let me disabuse you. The particular nature of reservoir simulation means that a ‘cluster’ today comprises 16 CPUs. A 32 CPU cluster will be available ‘real soon now’.
I pressed the folks on the Eclipse booth as to what exactly would be the program at the new Schlumberger-Intel research establishment—based next door to the Schlumberger ECL unit in Abingdon, UK. I asked what language Eclipse is written in. I was bowled over to learn that Eclipse is written in that oldest of programming languages, Fortran! This was something of an epiphany for me in two ways. First I understood that the collaboration was not focused on redesigning the Intel microprocessor (that would really be the tail wagging the dog) but on tuning Intel’s Linux Fortran compiler. Second, it explained why the marketing department just had to steer clear of the real subject. What marketing person could possibly admit in today’s Java and object-oriented world that their software is developed in Fortran!
I left the SPE and headed on to the SEG armed with my little discovery but still wondering what all the Linux fuss was about. In Salt Lake City I ran into yet more Linux propaganda—from Landmark, Paradigm, Magic Earth (running on a one CPU ‘cluster’), WesternGeco—and listened to amazing tales of total cost of ownership and of performance. Seemingly, virtually any upstream software runs 10 times faster on Linux than on Unix (but I thought Linux was Unix). Indeed the one growth part of the geophysics business is the 19” rack full of Intel (or AMD) based PCs. Geophysical processing is truly amenable to clustering and some mind-boggling CPU counts—up to 10,000 at one location—were mentioned. But I digress.
At the SEG’s afternoon ‘The Leading Edge’ Forum on future computing, industry experts pontificated on where geophysical IT was heading in the next few years. Again, we’ll bring you our report next month—but for the purposes of this editorial, I must confess that after a couple of hours of listening to stuff about Moore’s ‘law,’ Linux clusters and Java I was moved to intervene. “What about the algorithms?” I blurted. This was met with more entreaties to use Java and C++. “What about Fortran?” I spluttered. This seemed to catch the experts on the hop. I concluded that the “F” word is generally considered inadmissible in good company. ‘Object-oriented’ and ‘Java’ are the things to talk about—but generally, you are on safer ground if you stick with the whiz-bang of TeraFlops and MegaPixels rather than programming paradigms. This I find puzzling since it has been reported elsewhere that progress in computing, for instance in sorting algorithms or code cracking, comes almost equally from speeding up the hardware and improving the algorithm.
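To put a rough number on the algorithmic half of that claim, here is a back-of-the-envelope sketch—my own toy arithmetic, not anything presented at the forum. It compares the comparison counts of a quadratic sort with an n log n sort for a million items; the gain comes purely from the algorithm, with no help from the chip.

```fortran
! Illustrative only: estimated comparison counts for sorting
! n items with a quadratic algorithm versus an n*log2(n) one.
program algo_gain
  implicit none
  integer, parameter :: n = 1000000
  real :: quadratic, nlogn
  quadratic = 0.5 * real(n) * real(n)          ! ~n^2/2 comparisons
  nlogn = real(n) * log(real(n)) / log(2.0)    ! ~n*log2(n) comparisons
  print *, 'speedup from the algorithm alone ~', quadratic / nlogn
end program algo_gain
```

For a million items the ratio runs to tens of thousands—far more than any single generation of hardware delivers.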
When I got back home I decided to do some more research on Fortran in action. I quizzed the Usenet newsgroup comp.lang.fortran (about as active a community as sci.geo.petroleum incidentally) and hereby express my greatest thanks for its cooperation. I first learned that, contrary to the received view, Fortran is not a ‘legacy’ language but a performant, widely used tool for writing numerical applications in fields such as reservoir simulation, seismic processing, meteorology, engineering, oceanography and nuclear reactor safety. Fortran remains popular because of its built-in support for multi-dimensional arrays and its portability (could this be why so many apps have popped up on Linux recently?). But most of all, Fortran’s success is due to its accessibility. Engineers and scientists have enough intellectual challenges in their core business without wanting to wrestle with features as elementary as arrays.
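That built-in array support is easier to show than to tell. Below is a minimal Fortran 90 sketch; the grid dimensions and variable names are my own illustration, not taken from Eclipse or any other package mentioned above.

```fortran
! A whole 3-D grid is declared and operated on in single statements,
! with no explicit loops and no hand-rolled array bookkeeping.
program array_demo
  implicit none
  real, dimension(100,100,50) :: pressure, perm, flux
  perm = 1.0
  pressure = 2.0
  ! One line of arithmetic over all 500,000 grid cells.
  flux = perm * pressure
  print *, 'max flux = ', maxval(flux)
end program array_demo
```

In C or Java the same operation would mean nested loops or a third-party matrix library; in Fortran the compiler handles the arrays, which is precisely the accessibility the newsgroup was pointing to.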
A closing thought—just how popular is Fortran? My guess/contention is that Fortran code probably represents a majority of world-wide CPU clock cycles. At least those that are, as it were, computed in anger—we might have to exclude the zillions of near-idle PCs, waiting for the odd mouse click. In terms of real number crunching, climate forecasting code that uses over a million CPU hours in one run takes some beating. So for that matter does a 10,000 CPU seismic processing cluster running round the clock.
© Oil IT Journal - all rights reserved.