As we boldly announced last month, PDM was on a mission to the SPE. We planned to track down and report on new applications of 4D, time-lapse seismic to reservoir management. We were disappointed to find that at the SPE, the home of reservoir management, virtually nobody was even talking about the new technology. Well, there were some signs of activity: Schlumberger’s SimOpt, although still largely under wraps, will soon integrate 4D seismic information, and Scandpower is currently developing a similar capability. But why is this great new technology being adopted so timidly? I reckon it’s because the geoscience-to-petroleum-engineering boundary delimits the biggest silo wall in our industry.
Actually, the SPE program contains all sorts of enticing topics: ‘integration’ and ‘multi-disciplinary teams’ are the order of the day. But there is more talk than substance here, offering little more than camouflage for the old silo boundary. I attended the session on ‘seismic applications’ and listened to two papers with ‘seismic’ in the title but nowhere else, one paper on ‘4D seismics’ which wasn’t, and another pure ‘seismic’ paper with no engineering at all. We seem to be light years away from melding the disciplines.
In the past there has been a kind of natural demarcation line between the silos, between the static world of the geoscientist and the dynamics of the reservoir engineer. But with time-lapse, seismic suddenly appears on the wrong side of the fence, offering the reservoir engineer ‘in-your-face’ observations of what is really going on. Not modeling, reality! A paradigm shift of this magnitude is revolutionary stuff to the silo worker - and to the software developer.
The magnitude of the developer’s problem is evident from the embryonic 4D offerings that currently involve taking some minimal subset of information (such as a cube of water saturation values) and passing it over the silo wall in the manner of a ‘hot potato.’ The potato can then be peeled, mashed or whatever, and another subset of information (such as a map of flow boundaries) passed forward. This preserves the silos, and ensures that cooperation between engineers and geoscientists will be minimal, maintaining the natural order.
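To make the ‘hot potato’ handoff concrete, here is a minimal sketch in Python. All names and data structures are hypothetical illustrations of the pattern, not any vendor’s API: the engineering silo flattens a water-saturation cube into a plain artifact, and the geoscience silo ‘peels and mashes’ it into a map of flow boundaries before passing it on.

```python
# Hypothetical sketch of the 'hot potato' pattern: each silo reduces its
# results to a minimal, self-contained artifact and throws it over the wall.

def export_saturation_cube(simulator_state, nx, ny, nz):
    """Engineering silo: flatten a water-saturation cube to a plain list."""
    return [simulator_state[(i, j, k)]
            for i in range(nx) for j in range(ny) for k in range(nz)]

def derive_flow_boundaries(cube, nx, ny, nz, threshold=0.5):
    """Geoscience silo: reduce the cube to a 2D map, flagging columns
    where saturation crosses the threshold between adjacent cells."""
    boundary_map = {}
    for i in range(nx):
        for j in range(ny):
            column = [cube[(i * ny + j) * nz + k] for k in range(nz)]
            boundary_map[(i, j)] = any(
                (a - threshold) * (b - threshold) < 0
                for a, b in zip(column, column[1:]))
    return boundary_map
```

Note how each function sees only the previous silo’s exported subset, never the full upstream model: exactly the information loss the paragraph above describes.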
Irony aside, I know there is cross-discipline cooperation today, but the trouble is that it mostly works through visualization. If you can see two ‘hot potato’-type data sets, from different parts of the data chain, at the same time, then you are ‘integrated.’ But there’s more to data integration than just seeing.
To exploit 4D to the full, we need to compute across the silo boundary. Ultimately, to fully exploit any seismic dataset, you need access to pre-stack data for attribute analysis. Downstream you need to be able to go around the simulation and history-matching loop, conditioning the results with input from multiple 4D datasets. Ideally you need visibility and computability of data irrespective of its place in the chain. Managing the reservoir becomes managing, seeing and computing data, from pre-stack seismics to well tests, simulations and real-time data.
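The history-matching loop described above can be sketched in a few lines of Python. This is a toy illustration under loudly-stated assumptions: a single hypothetical model parameter, a linear ‘simulated’ response, and surveys reduced to (time, observed value) pairs; real simulators and 4D conditioning are vastly richer.

```python
# Toy sketch of a simulate/history-match loop conditioned on several
# time-lapse surveys at once. Purely illustrative; no simulator implied.

def misfit(model_param, surveys):
    """Sum-of-squares mismatch between a toy linear 'simulated'
    response (param * time) and the observed 4D surveys."""
    return sum((model_param * t - obs) ** 2 for t, obs in surveys)

def history_match(surveys, guess=0.0, step=0.01, iterations=1000):
    """Crude derivative-free descent: repeatedly move the parameter to
    whichever neighbouring value best fits all surveys together."""
    param = guess
    for _ in range(iterations):
        candidates = (param - step, param, param + step)
        param = min(candidates, key=lambda p: misfit(p, surveys))
    return param
```

The point of the sketch is the shape of the loop, not the optimizer: each iteration re-simulates and re-scores against every 4D vintage simultaneously, which is precisely the cross-silo computation the column is asking for.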
Beyond the pale
Using all the data in a non-trivial manner is beyond today’s silo-focused applications. This partly explains the popularity of non-silo-oriented environments like Technoguide’s Petrel and T-Surf’s GoCad. Both cut across the silo boundaries, although neither addresses 4D seismics right now, and both are rather ‘visualization’ focused.
In this context, the rising profile that Schlumberger is giving to Open Spirit may be significant. Now Schlumberger products ‘integrate’ Open Spirit rather than GeoFrame. Reading the tea leaves left by the marketing department, it is clear that Open Spirit is no longer just the approved way of integrating non-GeoQuest applications. It has become, in the space of six months, the preferred route to application integration.
But what kind of integration? If we are aiming for visualization-focused integration, then exposing application data through Open Spirit is OK. But one aspect of Open Spirit that has been somewhat overlooked is the way software can be componentized, and used to build ‘ad-hoc’ applications. Looking maybe a few years into the future, one can see how such technology could be used to build a bespoke ‘seismic processing through reservoir modeling’ application tailored to our 4D problem. Whether this will work depends on what is meant by Open Spirit ‘enabling’ an application. If it means exposing data to other applications, or being able to ‘see’ foreign data stores, then we won’t have moved very far down the road of workflow integration. But componentized applications that share resources, that can be center-stage, or waiting in the wings, ready to compute or display as and when required, that would be something.
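The ‘center-stage or waiting in the wings’ idea can be sketched with a tiny component broker. This is my own illustrative Python, not a representation of the actual Open Spirit architecture or API: components register with a broker and are only invoked, to compute or to display, when a workflow actually needs them.

```python
# Hypothetical sketch of componentized integration: registered components
# wait idle until a broker brings them 'center-stage' on demand.
# This does not depict Open Spirit itself.

class Broker:
    def __init__(self):
        self._components = {}

    def register(self, name, component):
        """A component waits 'in the wings' until called upon."""
        self._components[name] = component

    def request(self, name, *args):
        """Bring a component 'center-stage' to compute or display."""
        return self._components[name](*args)

broker = Broker()
# A compute component: stack seismic traces by averaging samples.
broker.register("stack", lambda traces: [sum(s) / len(s) for s in zip(*traces)])
# A display component: stand-in for a plotting call.
broker.register("display", lambda values: "plot(%d samples)" % len(values))
```

A bespoke ‘processing through reservoir modeling’ workflow would then be little more than a script of `broker.request` calls, chaining shared components rather than passing files between monolithic applications.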
New Orleans is steeped in both polystyrene cups and magic. My own personal gris-gris is a Motorola Timeport cell phone. Tri-band technology allows it to operate in Europe and the US. In the New Orleans central business district, where the Ernest Morial Convention Center is located, signal strength is pretty well on max. But cross over Canal into the French Quarter and there’s nothing. Now that’s what I call voodoo!
© Oil IT Journal - all rights reserved.