Computing with light!

An announcement from MIT researchers on breakthrough computing using ‘photonics’ highlights the potential for analog devices in artificial intelligence. Editor Neil McNaughton recalls earlier work using light to ‘image’ seismics. Unfortunately no longer a politically correct use case for MIT!

Early on in my career I was a young geophysicist in the head office of a major EU oil company. One day there was a commotion in a room near mine and soon the ‘next big thing’ was unveiled: an optical bench for analyzing seismic data. This remarkable tool shone a laser beam through a 35mm slide (remember them?) of a seismic line. The light then passed through a lens which focused the beam to a point. No surprises there. But what was mind-blowing (to me at least) was the fact that the pattern of light at the focal point represented a Fourier transform of the seismic image. For those of you who have not come across Joseph Fourier’s chef d’oeuvre, a Fourier transform splits information into its frequency components. A spectrum analyzer if you like. This laser optical bench split seismic into its spatial (rather than temporal) frequencies.

This seemed to me rather fanciful until the machine’s champion began sliding optical gratings and wedges into the device at the focal point, demonstrating just how powerful a filtering device this was. You could remove any directional component in the slide and bring out features ‘hidden in the data.’ It was so powerful that it may have brought out some features that were not there at all.
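For readers who want to see the principle in action, here is a minimal digital sketch of what the bench did optically. The NumPy code and the synthetic 64×64 ‘seismic’ image are my own illustrative inventions, not the original instrument: a 2D Fourier transform splits the image into spatial frequencies, and a mask in the frequency domain plays the role of a wedge at the focal point, removing the dipping event while keeping the flat reflector.

```python
import numpy as np

# Synthetic "seismic" image: a flat reflector plus a steeply dipping event
x, t = np.meshgrid(np.arange(64), np.arange(64))
image = np.sin(2 * np.pi * t / 8.0) + np.sin(2 * np.pi * (t + 3 * x) / 8.0)

# Forward 2D FFT -- the pattern of light at the lens's focal point
spectrum = np.fft.fftshift(np.fft.fft2(image))

# "Optical wedge": keep only frequencies with gentle dip (small kx relative
# to kt), suppressing the steeply dipping event
kt, kx = np.meshgrid(np.arange(64) - 32, np.arange(64) - 32, indexing="ij")
mask = np.abs(kx) <= np.abs(kt) // 2   # crude fan (dip) filter
filtered = np.fft.ifft2(np.fft.ifftshift(spectrum * mask)).real
# filtered now contains (essentially) only the flat reflector
```

The same idea, with a slit or wedge instead of a boolean mask, is what made the bench such a powerful directional filter.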

I later realized that this was not exactly the ‘next big thing’ but the last, having been developed a decade earlier by United Geophysical and sold as the LaserScan. In the 1960s (before my time!) this device was of interest to seismic processors, even though digital processing was already well established. Digital geophysics was invented a decade earlier (yes, in the 1950s) at MIT’s Geophysical Analysis Group, MIT-GAG. But the laser/analog device was capable of instant processing at a higher resolution than would have been practical with the digital technology of the time. Some examples of LaserScan output are given in Ernie Cook’s 1965 paper on geophysical operations in the North Sea and in another by John Fitton and Milton Dobrin in the October 1967 issue of Geophysics.

My next encounter with non-digital, analog devices has nothing to do with this editorial, but it was so clever, and I doubt I’ll ever have a better opportunity to talk about it, so here goes. In the mid 1970s satellite navigation did exist, but it was not very good. In fact, although it was widely adopted, the first sonar-doppler-aided marine satellite positioning systems were a step back from radio navigation, of which there were many competing systems. One of these (unfortunately I can’t remember what it was called and can’t find any references) used an analog delay line and a radar-type chirp that was broadcast over the air. The same signal was also sent, as an acoustic surface wave, across a solid-state device. The distance travelled across the device (at the speed of sound) was chosen so that the crossing took about the same time as the radio waves travelling to shore-based beacons (at the speed of light). By cross-correlating the returning radio signal with the output of the delay line, the time of travel to (and hence the distance from) the shore-based beacon could be measured very accurately. At least that was the idea. If my memory serves me well, the navigation service was not in operation for very long; the GPS brigade got their act together soon after the system was introduced, and the rest is history.
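The ranging trick itself can be sketched digitally. The signal parameters below are my own illustrative stand-ins for the analog hardware: cross-correlating the received chirp against a reference copy (the role played by the delay line) produces a sharp peak at the travel time, from which range follows.

```python
import numpy as np

fs = 1_000_000.0                      # sample rate, Hz (illustrative)
t = np.arange(0, 0.001, 1 / fs)       # 1 ms chirp
# Linear chirp sweeping 50 kHz -> 100 kHz over 1 ms
chirp = np.sin(2 * np.pi * (50_000 * t + 0.5 * 50e6 * t**2))

delay_samples = 137                   # the unknown travel time to recover
received = np.zeros(len(chirp) + 500)
received[delay_samples:delay_samples + len(chirp)] = chirp
received += 0.2 * np.random.default_rng(0).normal(size=len(received))

# Cross-correlate; the lag of the correlation peak estimates the delay
corr = np.correlate(received, chirp, mode="valid")
estimated = int(np.argmax(corr))
travel_time = estimated / fs          # seconds; range = travel_time * c
```

The chirp’s broad bandwidth is what makes the correlation peak narrow, which is why the system broadcast a radar-type sweep rather than a plain tone.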

You may be wondering what the point of all this is in today’s age of digital ‘big’ data. Well, a recent paper in Nature Photonics, ‘Deep learning with coherent nanophotonic circuits’ by Yichen Shen et al. from MIT, describes the use of an optical, analog computer to perform artificial neural network (ANN) ‘deep learning.’ Seemingly, today’s computing hardware, despite ‘significant efforts,’ is ‘inefficient at implementing neural networks,’ just as the digital computers of the 1960s weren’t up to some geophysical processing tasks. And the solution may again be computing with light.

As an aside, this kind of photonics is not to be confused with quantum computing, which is also touted as a solution for ANNs. Quantum computing is, as far as I know, not yet feasible. MIT’s ‘photonics’ optical computer just uses regular light: no quanta, not even digital pulses.

MIT’s optical ANN promises an enhancement in computational speed and power efficiency over state-of-the-art electronics. The researchers have demonstrated the concept with a programmable nanophotonic processor featuring a cascaded array of 56 programmable Mach–Zehnder interferometers. The system was trialed on a ‘typical’ ANN-style problem, speech recognition, where it performed reasonably well, scoring 77% accuracy.

Commenting on the breakthrough, Shen said that the architecture could perform ANN calculations much faster than conventional electronic chips, using less than one-thousandth as much energy per operation. Energy consumption, by the way, is a big issue in high performance computing. ‘Light has a natural advantage in doing matrix multiplication; dense matrix operations are the most power hungry and time consuming part of AI algorithms.’
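The linear algebra behind the photonic approach can be sketched numerically. The Shen et al. paper decomposes each neural-network weight matrix by singular value decomposition into two unitary (here, for real matrices, orthogonal) factors, realizable as lossless interferometer meshes, plus a diagonal scaling, realizable as attenuators or amplifiers. This toy NumPy example (my own sketch, not the MIT code) checks that such a factored ‘photonic pipeline’ reproduces an ordinary matrix multiply:

```python
import numpy as np

rng = np.random.default_rng(1)
M = rng.normal(size=(4, 4))           # a small "layer weight" matrix

U, s, Vt = np.linalg.svd(M)           # M = U @ diag(s) @ Vt

# U and Vt are orthogonal: a lossless optical mesh can realize each one
assert np.allclose(U @ U.T, np.eye(4))

x = rng.normal(size=4)                # an input "optical signal" vector
y = U @ (s * (Vt @ x))                # photonic pipeline: rotate, scale, rotate
assert np.allclose(y, M @ x)          # same result as a direct matrix multiply
```

Once the interferometers are set, the multiply happens at the speed of light with essentially no energy per operation, which is the source of the claimed advantage.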

And, one might say, of geophysical imaging. In fact the MIT team expects other applications in signal processing. If it wasn’t so politically incorrect these days, they might have added ‘and in seismic prospecting for oil.’ But MIT-GAG belongs to a long-forgotten past. MIT’s current Energy Initiative, MITEI, is an altogether greener thing, even though it is funded by oil and gas companies.

By the way, after the knock-off device across the corridor from my office was installed, I would occasionally sneak over and peek into the laser room. I don’t recall it being used much. In fact I think it was one of those things that you buy, use a couple of times to amaze your friends, and then forget about. A bit like my Panono!

@neilmcn


© Oil IT Journal - all rights reserved.