Google Earth, hosted data and the way of the world

Oil IT Journal Editor Neil McNaughton tries to distill a ‘boilerplate’ technology talk from the many C-level addresses he’s heard recently and then writes last month’s editorial on some recent developments in geographical data management. The deal between Talisman and Valtus announced at the ESRI PUG looks like a ground breaker—but can it be applied to other E&P data types?

It’s probably not the first time that I have observed in these columns that the talks given by the great and good of this industry have a boilerplate similarity about them. One year everyone will be banging on about the need for cost cutting and headcount reduction. A few years later folks will be warning of imminent doom as underinvestment in infrastructure and an aging workforce threaten our survival.


While the swing of the invest/cost-cut pendulum is just the way of the world, the aging workforce may reflect a tendency on the part of the oldies in charge of the show to ‘let go’ their younger colleagues. I suppose that’s the way of the world too. All of which is a way of introducing some of the issues that constitute the current upstream technology boilerplate and some maturing technologies that may actually be able to address them.


My boilerplate of the month includes: the problems of a dwindling and aging workforce; of entering data again and again; of fixing bad data again and again; of the evaporating skill sets needed to handle tough problems like SEG, ESRI, LIS, LAS and DLIS formats, issues with geodetics and exploding data volumes. I could go on. In fact I will—how hard is it to build a project of intersecting 3D seismic surveys of different vintages, a few horizontal wells, some with bad deviation surveys, and cultural data with a different projection? All of which is layered on top of the hardware and infrastructure needed to run the show—file servers, virtualization, workstations, graphics subsystems. Not to mention different operating systems, Oracle versions and so on. All in a day’s work what?


Convolve this daily reality with a hiatus in educational input following the last downturn and you have the picture painted by the ‘C-level’ folks who get to hold forth at the industry conventions. Their answer to such problems is, in general, more training (i.e. more people), more investment and more technology. More of everything in fact. As I remarked at this month’s SPE Digital Energy show, not so long ago we were flagellating ourselves as technology laggards. Now we have to sell our industry on its high tech leadership to compete for the young grads. But I digress.


I should say that this is the editorial that I was going to write last month after the ESRI PUG. But it got displaced by my rant over Microsoft’s HPC effrontery. So going back to the ESRI show, I think that geographical data is a rather interesting proxy for E&P data in general. It is complex (geodetics rivals seismics in the mathematical and domain-specific skills required). Data volumes are huge (especially for raster) leading to layered complexity and serious scalability issues. But the good thing about GIS is that, as a generic, horizontal data type, it is more likely to benefit from mass market developments than say, a cement bond log.

Google Earth

And the mass market is of course what has happened to GIS. It seems churlish to make a meal of this in the context of the ESRI PUG, but coincidentally, there were no fewer than four Google Earth-based announcements in last month’s edition of Oil IT Journal—and no, we were not trying to make a point! Commodity has come to GIS. Instead of pushing the complexities out to users, keep all that stuff on the remote server and just give them what they need. We’ve been here before, with hosted data offerings (Schlumberger et al. circa 1998) and with application service provision (various circa dot com boom time). In the GIS domain itself, we reported on Microsoft’s TerraServer back in 1998. But these earlier services never really caught on. With Google Earth, the paradigm has landed.

Thin client

You don’t really need me to explain how the hosted data and thin client approach addresses pretty well all of the issues in our C-level boilerplate above. One time data entry? You got it! Hide the complexity? Yup—all the ‘clever’ stuff is managed off site on one single authoritative data set. OK, I know, GE does have a few issues with the alignment of roads and satellite imagery—but someone is working on it (I hope). Even the demographics and the aging workforce issues are largely addressed by the hosting model. Scalability is good too—both upwards and in the face of future downturns (also a way of the world I’m afraid).


The GE effect has already galvanized ESRI into producing an ‘industrial strength’ version in the form of ArcGIS Online. But what impressed me most at the PUG was Talisman’s account of its spatial data outsourcing with Valtus. Somehow this really did sound like a breakthrough. Certainly, the idea of outsourcing your data management to a company that already specializes in that data type makes sense. The rest is just a matter of bandwidth.


Will outsourcing break through to more generic E&P data types? IHS appears to be moving this way with Enerdeq and both the service majors have significant offerings in this space. All the same, having just got back from the SPE Digital Energy show, I can assure you that this topic is not yet in the hearts and minds of the upstream. This can be explained by a certain reticence in the face of the new technology. On the part of the vendors—how many license sales will we lose? And on the part of the oils—what will ‘application support’ and ‘data management’ do? We’re back to the old ‘people problem’ again—perhaps we need a generation to retire without replacement before this one is going to run.


© Oil IT Journal - all rights reserved.