On seismic risk, outsourcing and AI

Oil IT Journal editor Neil McNaughton traces the risk/reward equation of the seismic business. Majors outsourced their seismic acquisition risk many years ago to contractors. It did not go well for the latter. The advent of AI may be about to tip the balance. The datasets amassed by the multi-client specialists now represent a ‘competitive advantage’ in training new seismic foundation models.

A common misconception about the oil and gas business is that it is high risk. Well, it is high risk if you are talking about the E&P minnows, but not for the majors, which have adopted a strategy that derisks large chunks of their business. Take for example the seismic industry. Many years ago some majors actually ran their own seismic crews. The boom-and-bust nature of exploration meant that in a downturn, they had crews on their hands doing nothing. The risk of such an eventuality was countered by outsourcing the seismic business to contractors. These enterprising folk were happy to build more and more complex acquisition devices, from high-performance computers in the field to multiple massive marine streamers towed behind purpose-designed vessels. In 2014, the last really big downturn, much of this fancy gear was surplus to requirements. The oils retrenched. Many contractors went bust, were sold to the Chinese, or went ‘asset light’, i.e. they got out of the acquisition business.

The majors meanwhile, being in the driving seat, retained the decision making and management of their seismic surveys. Company geophysicists faced with novel exploration challenges could ask the remaining contractors for pretty much anything they wanted. Sparse recording? More streamers? Circular shooting? When asked to jump, the contractors said ‘how high?’ This enabled the oils to adopt a kind of ‘fail fast’ approach, with the risk of failure devolved to their contractors. And failures there have been. Shooting in circles turned out to be a rather futile exercise. We hear less today about sparse recording/interferometry despite its intellectual appeal. In fact the whole streamer paradigm, along with those expensive vessels, is being replaced by ocean-bottom nodes, which are just ‘much better’!

On the geophysical processing side the majors took a different approach. The large processing centers were not ‘outsourced’ but kept in-house. Majors often consider that their own processing software gives them a competitive advantage. The massive number crunching that this entails means that the knowledge required to process seismics extends from physics, through scientific software, and into the more esoteric specifics of computer hardware. The other advantages of an in-house high-performance computing setup are the bragging rights that a high ranking on the Top500.org list brings, and the fact that you can walk visitors round the installation.

The situation may be changing with the arrival of artificial intelligence (see this month’s lead and our report from the Ken Kennedy Institute HPC in Energy event in this issue). First, AI runs on a different architecture (Nvidia GPUs) from conventional HPC and second, it uses a different programming language (CUDA). Adding to this is the competition between the geoscientists and the computer scientists. Stanford Professor Biondo Biondi spoke of being ‘disenfranchised’ by the data science brigade, although he saw some hope in the likelihood that AI will be doing the specialist code writing, leaving the geos to do the science.

AI is bringing an even more important development to the seismic business with the advent of seismic foundation models and deep learning. Almost everybody involved in the industry is doing this today. The foundation models that are proving so successful rely on data, lots and lots of it! While the majors each have pretty good data sets to play with, they may not be big enough to build the best models. They are also likely to be patchy, reflecting each player’s historical and geographical areas of interest. And, if I may make so bold, their data sets may not be terribly well managed, making it harder for the new data tools to access them.

So who does have really big data today? And who does manage it well? To my thinking the answer is the few remaining contractors who assumed the oils’ risk back in the day and who also, thanks to their activity in seismic processing and multi-client work, do data management rather well too. Some are also streets ahead in the smart use of the cloud. Some, notably TGS, as we report elsewhere in this issue, are going all-in on AI-based processing and interpretation. Perhaps the risk-reward pendulum is swinging the other way.

© Oil IT Journal - all rights reserved.