A conspiracy theory

Why computers are not going to take your job.

It is now commonplace at conferences and exhibitions to see hackathons with (usually) younger folks beavering away in Python, deriving insights from big sets of data. Should we all be learning to code? Or more specifically, what relative importance should be placed on learning to code as against learning, say, mathematics or physics? If your data contains all you need to know, why bother with the bottom-up approach of learning geology, geophysics and what have you? The data-driven brigade have it that the new paradigm of machine learning will sweep away many traditional jobs, not least seismic interpreters (see elsewhere in this issue).

As an ex-seismic interpreter myself, I confess to being more than a little skeptical about this. But forget that, and assume that we are moving to a world where ML is going to replace knowledge work. Or, as a recent report from that great source of nonsense and editorial inspiration, the World Economic Forum, has it, tomorrow's workforce will comprise 'data scientists, developers, social media specialists and marketing'. Or, to rub salt in the wound, you as a geophysicist will be replaced by a data scientist massaging big data. So, your employer is going to replace you, the geophysicist (at $100,000/year), with a hot-shot data scientist (at $350,000/year)? There is something wrong here.

I have a crazy conspiratorial theory as to what is wrong with this curious development, and it starts in the early days of computing. For a few decades, computing did a straightforward job of a) speeding up calculations and b) doing more with a smaller headcount*. To pursue the geophysical example, it was rumored in the West that the Chinese did seismic migration (a compute-intensive task) with huge teams of abacus-wielding engineers. The model here is one person and a computer doing the work of thousands. The remark attributed to IBM founder Tom Watson, that in 1943 there was a market for only about five computers, is seemingly apocryphal, but it summed up the idea of multiplying the efforts of a small number of programmers and computers**.

The early promise of the computer was one of a multiplication of effort. A program ought to perform a calculation or a task repetitively. A good program is one that wakes up as a Unix cron task at regular intervals, does its stuff, spiders the file system, QCs some new data and pumps it into the database. The ideal programming workflow is write once … and then run forever, doing many people out of work, including the programmer! In the plant and process industries there are a lot of these programs around, controlling valves and motors in what is known as operations technology (OT). OT is generally looked down on by IT.
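To make the 'write once, run forever' idea concrete, here is a minimal Python sketch of such a housekeeping job, the sort of thing one might register as a cron task. The directory names, the QC rule and the database schema are all invented for the illustration, not drawn from any real system.

#!/usr/bin/env python3
# Minimal 'write once, run forever' housekeeping job.
# Intended to be scheduled from cron, e.g.:
#   0 * * * * /usr/bin/python3 /opt/jobs/ingest.py
# Paths, the QC rule and the table are hypothetical.
import csv
import sqlite3
from pathlib import Path

INCOMING = Path("/data/incoming")   # where new files land (assumed)
DATABASE = "/data/archive.db"       # target database (assumed)

def qc_ok(row):
    # Toy QC: reject rows with missing fields or negative amplitudes.
    return all(row.values()) and float(row["amplitude"]) >= 0.0

def main():
    con = sqlite3.connect(DATABASE)
    con.execute("CREATE TABLE IF NOT EXISTS traces (source TEXT, amplitude REAL)")
    for path in INCOMING.glob("*.csv"):          # 'spider' the file system
        with open(path, newline="") as f:
            rows = [r for r in csv.DictReader(f) if qc_ok(r)]
        con.executemany(
            "INSERT INTO traces (source, amplitude) VALUES (?, ?)",
            [(path.name, float(r["amplitude"])) for r in rows],
        )
        con.commit()
        path.rename(path.with_suffix(".done"))   # don't reprocess next run
    con.close()

if __name__ == "__main__":
    main()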

This is not quite how things panned out. Over time, the computer industry contrived to produce machines that require constant attention in the way of program maintenance and upgrades. Programming languages were tweaked and re-invented, and battles raged between supporters of one programming paradigm and another. This was in part due to genuine technological progress, but not entirely. As the industry grew, Madison Avenue stepped in to help grow the business with its marketing. Nothing wrong with that. Back in the day, buyers recognized that real and not-so-real technological progress would be matched with a marketing effort designed to foster 'fear, uncertainty and doubt' (FUD) in those that failed to catch the next 'big thing'.

The arrival of the personal computer threw a bigger spanner into the ‘multiplication of effort’ paradigm. Instead of a lonely programmer sitting in front of a console attached to a big machine doing a lot, everyone got to play with their own computer. Instead of a market of “maybe five” there was suddenly a market of “maybe five billion!” For the computer industry, the birth of the PC was manna from heaven. Instead of one programmer doing the work of many, the picture shifted to everyone doing stuff on their own (forcing the marketing department to convince us, despite the evidence, that we were all now ‘collaborating’ with ‘productive’ software).

To understand the computer industry, you need to 'follow the money'. Imagine what would happen to the industry if somehow there was a shift back, even slightly, from one computer per person. A massive revenue stream would be lost! So Microsoft and others do everything in their power to counter the original nature of computing, i.e. to do more stuff with less human effort. In the PC world this is achieved by trivial 'upgrades' to computer operating systems, by inefficient 'bloatware' and by eye-candy-oriented interfaces. In the larger field of enterprise computing, the objective of most development seems to be a 'dashboard', again with the implication that a human looking at a screen is the essential endpoint.

In a presentation at the 2018 ECIM we heard of a proof of concept by Arundo Analytics (report from ECIM in our next issue) which was, on the face of it, an AI success. Arundo's deep learning-based predictor of compressor failure correctly warned of failure three weeks before it happened. The only trouble was, 'nobody was looking at the screen!' The unasked question is, 'why would anyone want to look at a screen for months on end waiting for such a warning?' It seems that IT's task is done once a dashboard is up and running, with graphical widgets and cute eye candy to display a few KPIs.
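To labor the point, the natural endpoint of such a predictor is a notification pushed to the maintenance crew, not a screen that somebody has to watch. The following Python sketch is purely illustrative and assumes nothing about Arundo's actual system; the toy model, the threshold and the mail addresses are all invented.

# Illustrative only: route a failure prediction to people, not a dashboard.
# The 'model', threshold and mail settings are invented for the example.
import smtplib
from email.message import EmailMessage

FAILURE_THRESHOLD = 0.8   # assumed alert level

def failure_probability(vibration, temperature):
    # Stand-in for a trained model; a toy heuristic, nothing more.
    return min(1.0, 0.5 * vibration + 0.01 * temperature)

def send_alert(probability):
    msg = EmailMessage()
    msg["Subject"] = f"Compressor failure risk {probability:.0%}"
    msg["From"] = "monitor@example.com"
    msg["To"] = "maintenance@example.com"
    msg.set_content("Predicted failure within weeks. Please schedule an inspection.")
    with smtplib.SMTP("localhost") as smtp:   # assumes a local mail relay
        smtp.send_message(msg)

def check(vibration, temperature):
    p = failure_probability(vibration, temperature)
    if p >= FAILURE_THRESHOLD:
        send_alert(p)

if __name__ == "__main__":
    check(vibration=1.4, temperature=75.0)    # example sensor readings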

But I have digressed from my original question as to the role of data scientists and subject matter experts (SMEs) like geophysicists. Let's assume that AI will replace many SMEs, including geophysicists, and result in machines doing the bulk of knowledge work. This leads to the interesting question of how the computer industry will make its money, faced with a greatly reduced headcount and no hardware sales (now that all is done in the cloud).

A glance at the GAFAs' (and Microsoft's) revenues makes it clear that this is absolutely not how things are developing. No way is the AI revolution (or any other next big thing) going to result in a 'rational' use of compute resources that reintroduces a computerized multiplication of effort. Marketing will step in to ensure that business as usual continues, that more and more powerful computers will have to be deployed, that compute nirvana awaits those with a mastery of 'R', CUDA or what have you. I say 'will', but the marketing madness is all around us right now. Wild claims abound for AI as "controlling safety-critical infrastructure across a growing number of industries…" (no it does not!). A possible best-in-class wild claim just popped into my inbox from Accenture, where we are invited to "find the holy grail, the driverless supply chain, with quantum computing in oil and gas". Really? Back in the day, a mild amount of FUD was accepted as part of doing business. But things are getting out of hand! As I put it in a short letter that the Financial Times had the good grace to publish recently...

IBM and others have been so successful in entangling members of the Fourth Estate*** (FT Big Read Technology, September 4) that it is hard to see how near they are to making a quantum computer that actually works. Perhaps the quantum computer is, like the qubits that drive it, in a state of superposition, being both real and not real at the same time. The question is, can the marketing folks keep the promise of quantum computing alive before it, like the qubit, decays and is replaced with the next 'big thing'?

Best regards, Neil McNaughton

* Although the job picture here was actually more nuanced, see the US Bureau of Labor Statistics' Occupational Changes in the 20th Century.

** It is not even as silly as all that if you allow that, today, there are just a handful of GAFAs running 'real' computers in behemoth datacenters.

*** Fancy talk for the media!

