Who’s to integrate the portal?

Recycling a recent paper on integrating vertical geotechnical applications with horizontal corporate-wide IT, PDM Editor, Neil McNaughton decides that ‘keeping it simple’ is the key to success. But who is best placed to offer such ‘simple’ solutions? Multiple software developers and service providers? Or the new one-stop-shop behemoth?

I must apologize this month because this editorial may sound more like a sermon than usual, but a paper that I have presented twice this year* has been hanging around on my hard disk, and I just have to inflict it upon you. Sorry. This month’s lesson, then, is about the interplay between ‘horizontal,’ enterprise-wide software, like Microsoft Word, Lotus Notes and Web Portal engines, and our beloved ‘vertical’ E&P interpretation applications. Actually, there is an extra dimension that I would like to introduce to the debate - and that is, who is best placed to implement such enterprise-wide integration.


For several years now the E&P data management business has focused on data sharing and integration between geotechnical applications. Different information technology (IT) paradigms have been applied to the problem, from standard data formats, middleware and common data models, through business objects, to the vendor application programming interface (API). While all of these tools have moved the industry forward, none can be said to have acquired ‘silver bullet’ status.


One aspect of almost all applications is that they have an innate tendency to grow. An individual application’s data requirement may extend way beyond the domain-specific data ‘owned’ by the application itself. Mapping is a case in point: most E&P applications have their own built-in mapping package, vendor suites usually boast a few, and many companies have a bespoke ArcView package besides.


This leads to many data management issues: keeping copies synchronized, tracking edits and updates, and managing datasets duplicated across the department. Such issues are unlikely to go away. New applications are likely to require even more ‘exotic’ data and data types, and the exploding data volumes coming in from the seismic contractor and from permanent sensors on production facilities are compounding the problem.


Today, web-driven corporate IT is making new demands on domain-specific applications. Corporate-wide document management systems and web portals require a much larger vision of what IT is trying to achieve. Everyone wants a look at the latest maps, interpretations or production data. Suddenly, the dirty linen of the departmental data manager is visible to all!


Interestingly, the problems facing the E&P IT specialist are encountered at many other organizational levels. At the smaller scale, within a ‘suite’ of applications from a single vendor, integration is only a matter of degree. At the broader scale, other IT domains, such as Enterprise Resource Planning (ERP) and production, experience identical issues.


Because the problem has a very broad extent, it is rather well studied. Current thinking, at the broadest level, is that techniques such as APIs and middleware (COM and CORBA) have their place in departmental-level and domain-specific computing.


But as IT scope expands, these tightly coupled techniques necessitate an unrealizable IT schema (the ‘uber-schema’) of the whole enterprise. Such an approach would mandate a ‘clean-room’ IT infrastructure, with careful control over operating systems, middleware and software versions. For even a moderately sized enterprise, such control is illusory. Current thinking on integration centers around three concepts: limiting application coupling, sharing metadata and XML-based messaging.


These concepts are really applications of the KISS** principle, and a recognition of the fact that application coupling is a road to hell. Too much shared data quickly becomes unmanageable. Most importantly, restraining an application’s scope to its own ‘core business’, and recognizing that function creep and overlap are to be avoided, should bring more performant applications and an infrastructure that takes us nearer to the holy grail of interoperability.
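To make the loose-coupling idea concrete, here is a minimal sketch of XML-based messaging between two decoupled applications. The message schema (`datasetUpdate` and its fields) is entirely hypothetical, not from any real E&P standard: the producer publishes a small self-describing document, and the consumer reads only the elements it understands, ignoring vendor-specific additions rather than depending on them.

```python
# Hypothetical sketch of loosely coupled, XML-based messaging.
# The 'datasetUpdate' schema is invented for illustration only.
import xml.etree.ElementTree as ET

def publish_update(dataset, author, extras=None):
    """Producer side: build an XML notification for a dataset change."""
    root = ET.Element("datasetUpdate")
    ET.SubElement(root, "name").text = dataset
    ET.SubElement(root, "author").text = author
    # Vendor-specific additions ride along without breaking any consumer.
    for tag, value in (extras or {}).items():
        ET.SubElement(root, tag).text = value
    return ET.tostring(root, encoding="unicode")

def consume_update(message):
    """Consumer side: extract only the fields this application cares about."""
    root = ET.fromstring(message)
    return {child.tag: child.text for child in root
            if child.tag in ("name", "author")}  # unknown tags are ignored

msg = publish_update("Top_Cretaceous_depth_map", "jdoe",
                     extras={"gridFormat": "ZMAP"})
print(consume_update(msg))  # the consumer never sees 'gridFormat'
```

The point of the sketch is that neither side shares a compiled interface or an uber-schema; each depends only on the few message fields it actually uses, which is what keeps the coupling loose.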

Who Do?

Now for the thorny issue of who should implement the integration. Decoupling applications potentially opens up the field to multiple service providers. The whole Internet architecture, from TCP/IP stack, through the intranet, portal and XML-based applications, is especially amenable to such an approach. Each provider works on their particular area of expertise.


On the other hand, as Schlumberger Chairman Euan Baird claims (see page 10), some clients may want a single-source supplier. In buying horizontal specialist Sema Group, Schlumberger is now equipped to provide the whole caboodle, from communications infrastructure through smart cards to the vertical application. I guess this will be an offering some will not be able to refuse. But one can’t help wondering why Schlumberger doesn’t push the logic a bit further; if they bought a utility or two, they could supply you with electricity as well!

* SMi Data Management 2001 and PPDM 2001 Spring Member Meeting

** Keep it simple, stupid!


© Oil IT Journal - all rights reserved.