A Flanders and Swann-inspired reflection on novelty

Editor Neil McNaughton remembers the 1960s singing duo Flanders and Swann’s satirical commentary on the new technology of the day, the shiny 33rpm vinyl record. In a ‘back to the future’ mood he rolls in a recent exchange in the Financial Times on the merits or otherwise of ‘antiquated’ Cobol and revisits a hobbyhorse, viz.: is there anything new in the new ‘data science’ movement?

Current talk of computing ‘in the cloud,’ ‘at the edge’ (i.e. near the sensors) or in the ‘fog’ (i.e. somewhere in between) reminded me of a 1960s song from the singing duo Flanders and Swann. Their records were popular with both parents and children, which was quite unusual at the time. One of Flanders and Swann’s hits was a satirical comment on the new technology of ‘high fidelity.’ Their ‘Song of reproduction’ was written at a time of technological upheaval, as the old, unamplified 78rpm records were being replaced by stereophonic 33rpm micro-grooved LPs.

I had a little gramophone,
I’d wind it round and round.
And with a sharpish needle,
It made a cheerful sound.

And then they amplified it,
It was much louder then.
So they sharpened finer needles,
To make it soft again.

The other day, I noticed my wife’s oldish MacBook Air playing some music while she was out of the house so I switched it off. I was surprised that this took just a few seconds. My much more powerful PC chunters on for yonks when I shut it down. I am certainly not the first to have observed that, for many straightforward tasks, increasing compute power is more than offset by lumbering software. Or, as F&S might have had it…

I had a little PC,
On which I wrote my stuff.
It only had a meg of RAM
But that was quite enough.

Then I got a workstation
And gigs more memory
But bags of blooming bloatware
Brought the blighter to its knee(s).

When the move to the cloud started (and I am talking about Office 365 as well as enterprise IT here) it was obvious that there would be a problem of bandwidth. Even a modest desktop PC can easily provide almost a hundred MB/s of read/write bandwidth, more if you are prepared to pay for it. The cloud, for most of us, is likely to be a factor of ten below this, and also brings problems of quality of service and latency. Microsoft has addressed this by providing two versions of its software: ‘lightweight’ (i.e. crap) versions that clunk along in the browser, and lumbering bloatware to run on the desktop.
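
To put rough numbers on that gap, here is a minimal back-of-the-envelope sketch in Python, taking the figures above at face value (around 100 MB/s locally, a factor of ten less to the cloud) and assuming a purely hypothetical 10 GB dataset:

    # Back-of-the-envelope transfer times (illustrative assumptions only:
    # a hypothetical 10 GB file, ~100 MB/s local disk, ~10 MB/s to the cloud).
    file_size_mb = 10 * 1024        # assumed dataset size, 10 GB
    local_mb_s = 100.0              # 'modest desktop PC' read/write bandwidth
    cloud_mb_s = local_mb_s / 10    # 'a factor of ten below'

    print(f"Local: ~{file_size_mb / local_mb_s / 60:.1f} minutes")
    print(f"Cloud: ~{file_size_mb / cloud_mb_s / 60:.1f} minutes")
    # ... and that is before quality of service and latency enter the picture.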

The internet of things is likely to be another such train wreck, even though the ‘unforeseen’ (i.e. blindingly obvious) consequences of bandwidth and latency are being addressed by adding computing at the ‘edge’ or in the ‘fog.’ This sounds rather familiar. Before the cloud we had real-time systems in the factory/plant. In the field, on-site intelligence like a pump-off controller would do what it had to do locally. Only a subset of data would make it into the network or SCADA pipeline. This situation was deprecated by the nouvelle vague of IT, as data that did not make it into the network was deemed to have disappeared down the data drain instead of being gathered, as it should have been, into the data lake.

Cloud, drain, fog, lake … so many metaphors! So many moving parts! So much confusion! What does it all mean? I think it means that just about any new or not so new technology can now be shoehorned into one or other of these new and nebulous paradigms.

~

An interesting exchange took place recently in the Financial Times, where Lisa Pollack reported on a study by the US Government Accountability Office that ‘highlighted the continued use of Cobol in public agencies.’ This was deemed by the Office (and by Pollack) to be a ‘bad thing.’ Cobol is seemingly an ‘antiquated’ language that exposes users to ‘code fragility’ and is heading into a ‘digital dustbin,’ presumably along with its coders. Cobol needs to be urgently replaced by the kind of ‘microservices-based architecture’ that is favored by the proponents of the cloud.

This reminded me of my October 2002 editorial ‘Don’t mention the ‘F’ word in marketing!’ where I opined that a programming language ought to be tuned to its end-users’ needs. This is true for both science and business. I also demonstrated that, in 2002 at least, Fortran was alive and kicking and likely providing the world with more real compute cycles than most other languages. While I was mentally drafting a letter to the FT along these lines, I was pipped to the post by someone far better qualified, one Stephen Castell of Castell Consulting. He argued cogently that Cobol-based code has stood the test of time and that it is in fact particularly robust in the face of code fragility. He encouraged developers to ‘get back to the future’ with ‘unfashionable’ Cobol.

~

I recently stumbled across a 2015 paper by David Donoho, statistics professor at Stanford University, titled ‘50 years of data science.’ Well, that sounds like back to the future already. I was even more intrigued that Donoho proposed to ‘review [...] the current data science moment and [...to investigate...] whether data science is really different from statistics.’

Donoho’s paper was based on an address he gave at the centennial celebration of the birth of John Tukey, who ‘called for a reformation of academic statistics [...and...] who pointed to the existence of an as-yet unrecognized science, whose subject of interest was learning from data, i.e. data analysis.’ And that was over 50 years ago! Tukey was a part-time geophysicist, famous for his fast Fourier transform. As everyone knows, geophysicists do (data) till it Hertz!

@neilmcn

© Oil IT Journal - all rights reserved.