Oil Information Technology Journal

Editor Neil McNaughton tells of plans for a re-launch of Petroleum Data Manager, reviews this month’s coverage of upstream IT, and reflects on what makes an E&P standard.

This is the last edition of Petroleum Data Manager. But fear not, we are not abandoning you! Petroleum Data Manager is to re-launch as Oil Information Technology Journal. Why? The change is simply to better reflect the true scope of our coverage. Petroleum Data Manager has never just been about data management. We have always tried to give you the big picture.

Identity crisis

But in this day and age, it is not enough to supply content; packaging is equally important. We recently carried out a study of PDM readers, both casual and confirmed. This was achieved by in-depth anecdotal collection, chance observations and a couple of phone calls. We concluded that we have a problem of identity. Casual observers have been seen dismissing our publication as ‘just advertising’ - which it is most definitely not. Others have taken a far too literal interpretation of PDM’s name, thinking, for example, that data management is of no concern to a ‘knowledge worker.’


This situation has frustrated us for some time, and has only been bearable thanks to a sustained effort of... denial. But something has come up that has rendered this strategy untenable. Our very modest success over the last couple of years has led the French fiscal authorities to make increasingly ludicrous demands on us. So we have to act. I see three choices - exile (far too pro-active for our denial-prone corporate strategists), throwing in the towel (but what would your megalomaniac editor do without an audience?) and building a larger readership.


The move to Oil IT Journal is intended to more accurately reflect our expanded coverage, and to bring in lots of new readers for whom the ‘Data Management’ tag may be a turnoff. But for those of you who are data managers - and I count myself amongst you - fear not. Our coverage of data management and IT infrastructure will continue unabated. In fact my humble opinion is that our coverage of these matters is actually getting better, and to demonstrate this, I’d like to walk you through some significant developments we cover in this issue of PDM.


From the bottom up, as it were, we see how CGG is moving from tape-based systems to high capacity disks and gigabit Ethernet to support its high performance PC clusters. Similar technology is being deployed by SeiScan to offer highly granular access to seismic data. And Conoco is building a ‘format neutral’ archive for project data. We also report on significant developments on the Statoil Slegge project which are leading Schlumberger to re-tool its data management infrastructure.

Standards Review

Also new in this issue is an extensive review of upstream data standards. This is one of the most researched pieces we have ever done, and for us one of the most illuminating. One tends to think of standards bodies as august organizations that grind slow and fine. But the reality is that standards come and go, much like dot-coms in a boom and bust. They compete for pieces of the action, and are not always aware of what the others are up to.


I would like to be able to provide a pithy summary of what standards bodies are really about. But sometimes, the more closely you observe something, the more complex it seems to get. I’ll just offer up a few random observations and let you draw your own conclusions. Politics, in many guises, plays a significant role in standards. Traditionally, at least in the upstream, most standards have come from either Schlumberger, or the not-for-profits. But a couple of years ago a new breed of standards body was formed by two groups of commercial companies. One is definitely deceased, the other moribund, and neither has made any usable standards public.


My next observation is that the remaining standards organizations have very different attitudes towards life. Take the SEG standards committee for instance, and the new SEG-Y Rev. 1. This was ‘nearly final’ in August 1998 and is still not official. This delay is not an accident. A lot rides on a new SEG standard. As Alan Faichney, SEG standards committee chairman, told PDM, the SEG aims “not to be first, but to be certain.”
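For readers unfamiliar with what is actually at stake, SEG-Y is a fixed-layout binary format: a 3200-byte textual header, a 400-byte binary header, then the traces. One of Rev. 1’s headline additions is a revision-number field in the binary header. The sketch below - our own illustration, with field offsets taken from the public SEG-Y Rev. 1 draft rather than from anything reported in this article - pulls a few of those fields from a header:

```python
import struct

def parse_segy_binary_header(buf: bytes) -> dict:
    """Read a few fields from the first 3600 bytes of a SEG-Y file.

    Per the SEG-Y Rev. 1 draft: 3200-byte textual header, then a
    400-byte big-endian binary header. Offsets are zero-based from
    the start of the file.
    """
    if len(buf) < 3600:
        raise ValueError("need at least 3600 bytes (textual + binary header)")
    (sample_interval,) = struct.unpack(">H", buf[3216:3218])  # in microseconds
    (samples_per_trace,) = struct.unpack(">H", buf[3220:3222])
    (format_code,) = struct.unpack(">H", buf[3224:3226])  # 1 = IBM float, 5 = IEEE float (Rev. 1)
    (revision,) = struct.unpack(">H", buf[3500:3502])  # 0x0100 = Rev. 1; Rev. 0 writers leave it zero
    return {
        "sample_interval_us": sample_interval,
        "samples_per_trace": samples_per_trace,
        "format_code": format_code,
        "revision": (revision >> 8, revision & 0xFF),  # (major, minor)
    }

# Synthetic example: a blank header stamped as Rev. 1, 4 ms sampling, IEEE floats.
hdr = bytearray(3600)
hdr[3216:3218] = struct.pack(">H", 4000)
hdr[3220:3222] = struct.pack(">H", 1500)
hdr[3224:3226] = struct.pack(">H", 5)
hdr[3500:3502] = struct.pack(">H", 0x0100)
print(parse_segy_binary_header(bytes(hdr)))
```

The revision field is precisely the sort of detail the committee must get right once and for all: every reader written after ratification will branch on it.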


At a different point in the standards spectrum we have most of the current XML initiatives. These seem to generate instant ‘standards’ which represent, not agreement between different groups of users, but just one possible way of using a new tool. A ‘disposable’ standard if you like. These short-fused standards initiatives have the advantage that they leverage the latest technology, unlike ‘conservative’ standards like the SEG-Y Rev 1.

If it ain’t bust?

My last rambling observation on standards came from a couple of workers involved in the migration of legacy, ASCII or binary standards to XML. Both questioned the dash for XML solutions to problems which have already been solved. Far from being luddites, these observers had noted that the new XML protocols start out looking like panaceas and paradigm shifts. As they move to become industrial-strength solutions, and incorporate the hard stuff like scalability and security, they lose some of their shine. The process of moving from a ‘disposable’ standard to deployment may take so long that the technology will have moved on by take-up time. Is this the true enigma of the standards development process? That the development and approval cycle time means that you can only ever have old-tech standards? Happy New Year.


© Oil IT Journal - all rights reserved.