Editorial - Business Processes, Objects and Benefits (August 1996)

PDM's editor Neil McNaughton tries to get to grips with Business Objects, and concludes that what they lack is ... a business model!

A long while ago, when I was managing director of a small E&P company (or was it a dream?), we decided to computerize our accounting. During our search for the perfect accounts package I remember one piece of advice in particular. Someone with considerable experience of installing such packages said that it was important to have a viable manual system working well before attempting to computerize. It seemed to me at the time, and still does, that this was very good advice for accountants, geophysicists, data modellers, in fact for everyone who is contemplating using a computer to perform a business task. First understand your business.

Bemused

Since then I have frequently been bemused by what seem to me to be industry-wide attempts to flout this principle. Nowadays a database company, an IT department or a hardened hacker, after some time spent implementing what they thought was a good solution to what they thought was the problem, will come up against some minor obstacle such as a deadline, a spent budget or an unhappy end-user, and a debate along the following lines will ensue. User: 'This is no good, it has cost too much, it's not finished and it doesn't do what I wanted anyhow.' IT Person: the usual excuses, but then 'Of course, what you really need is a better understanding of how your business actually works (note: not how to run your business). What you really need is a good dose of Business Process Reengineering. This will make your business run more like a computer and make it easier for me to computerize. During this process we will design new Business Objects which will offer huge Business Benefits.'

f-word

The problem facing the E&P data modeling community today is that if a data model is specified very completely, to allow for all possible manifestations of a particular data type, then the chances of an application which asks the model for a given instance of the data actually finding it are inversely proportional to the completeness of the model. These problems are often referred to as impedance mismatches or different data footprints. As Bill Quinlivan of Geoquest has put it, flexibility is the f-word to the data modeler. As a possible solution, the Petrotechnical Open Software Corporation (POSC) has been contemplating Business Objects of late, at a workshop on Business Objects and Interoperability held in POSC's Houston offices in February, with a follow-up meeting in July in the POSC Europe offices. David Archer, POSC's COO, stated that while POSC has delivered a data model (Epicentre), it has lost focus on the concomitant goal of interoperability.
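To make the footprint problem concrete, here is a minimal sketch in C++, with entirely invented names (nothing below comes from Epicentre or from any vendor's software). A 'complete' model lets the same fact, a line's sample interval, be recorded in more than one legal way, and an application coded against one footprint simply misses the other:

    #include <iostream>
    #include <map>
    #include <string>

    int main() {
        // One producer's perfectly legal rendering of a line header.
        std::map<std::string, std::string> line_a = {
            {"sample_interval_ms", "4"}};
        // Another producer's equally legal rendering of the same fact.
        std::map<std::string, std::string> line_b = {
            {"sampling_rate_hz", "250"}};

        // The consuming application was written against one footprint only.
        for (const auto* line : {&line_a, &line_b}) {
            auto it = line->find("sample_interval_ms");
            if (it != line->end())
                std::cout << "found: " << it->second << " ms\n";
            else
                std::cout << "the model allows it, the app cannot find it\n";
        }
    }

The more manifestations the model permits, the more such lookups fail, which is the inverse proportionality complained of above.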

OMG

A presentation by Oliver Sims of the Object Management Group (see below) was of particular interest. OMG has developed a standard (CORBA) which is of use to programmers working at the level of the network (middleware), but it has failed to provide high-level tools which application programmers working in Cobol (or Fortran) can utilize. As a result, the holy grail of widespread client/server computing and interoperability is still just beyond our grasp. What we now need is an object model at a much higher level. To the OMG, this means Business Objects.
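To see what 'the level of the network' means in practice, here is a hedged caricature, again in C++ with invented names and deliberately using no real ORB API, of the sort of byte-level marshalling that CORBA takes care of on the programmer's behalf:

    #include <cstddef>
    #include <cstdint>
    #include <cstring>
    #include <iostream>
    #include <vector>

    // Pack an operation name and one argument into an agreed wire format;
    // the byte order must be fixed because the two machines may disagree.
    std::vector<std::uint8_t> marshal_request(const char* op, std::uint32_t arg) {
        std::size_t n = std::strlen(op) + 1;
        std::vector<std::uint8_t> buf(n + 4);
        std::memcpy(buf.data(), op, n);               // operation name
        for (int i = 0; i < 4; ++i)                   // big-endian argument
            buf[n + i] = static_cast<std::uint8_t>(arg >> (8 * (3 - i)));
        return buf;
    }

    int main() {
        auto wire = marshal_request("get_trace", 42);
        std::cout << wire.size() << " bytes on the wire\n";
    }

Useful plumbing, but no geoscientist wants to program at this level, hence the call for something higher.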

SLO - BO?

What is a Business Object? Frankly, that is a pretty tough question. First, what is an object? I know an answer to that one: it is a piece of computer code which can be run in a separate piece of memory (out-of-process), or even on a different computer from the code which is calling it. This is a basic part of client/server computing, and for interoperability, 'true' objects should be callable from different operating systems across the network. As you can imagine, the nuts and bolts of calling a subroutine on an IBM mainframe from a Unix box are pretty low-level stuff, the stuff of CORBA in fact. So what is the interest for us high-level chaps? Well, perhaps the process can be scaled up so that a Geoquest app can call a Seismic Line Object (SLO) from a Landmark app, do whatever processing may be required, and then pass it back. What is required is a specification of an SLO that both apps can understand, and that can act as an intelligent buffer between the two apps, performing services such as re-sampling the data so as to eliminate the impedance/footprint problem mentioned above.
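What might an SLO's contract look like? The sketch below is purely hypothetical, an assumption of mine rather than anything POSC or the OMG has specified, and the names (SeismicLineObject, trace and so on) are invented for the purpose. The point is that both apps code against the abstract interface, and the object does the re-sampling itself:

    #include <cstddef>
    #include <iostream>
    #include <vector>

    // The agreed contract: caller and implementer see only this interface.
    class SeismicLineObject {
    public:
        virtual ~SeismicLineObject() = default;
        virtual double sample_interval_ms() const = 0;
        // The 'intelligent buffer' at work: the object re-samples its own
        // data to whatever interval the client asks for.
        virtual std::vector<float> trace(std::size_t index,
                                         double wanted_interval_ms) const = 0;
    };

    // A stub vendor-side implementation: data held at 4 ms, served at any interval.
    class StubLine : public SeismicLineObject {
    public:
        double sample_interval_ms() const override { return 4.0; }
        std::vector<float> trace(std::size_t, double wanted) const override {
            std::vector<float> stored(100, 1.0f);     // canned 4 ms samples
            std::size_t n = static_cast<std::size_t>(stored.size() * 4.0 / wanted);
            std::vector<float> out(n);                // crude nearest-neighbour
            for (std::size_t i = 0; i < n; ++i)       // re-sampling, enough here
                out[i] = stored[static_cast<std::size_t>(i * wanted / 4.0)];
            return out;
        }
    };

    int main() {
        StubLine line;   // in a CORBA-style world this object could live in
                         // another process, machine or operating system
        std::cout << line.trace(0, 2.0).size() << " samples at 2 ms\n";
    }

The client never learns how the line is stored; it asks for data at the interval it wants and the object bridges the footprints.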

Who pays?

Needless to say, specifying such a high-level object requires more than just IT skills. It needs domain-competent practitioners (First understand your business!) to decide what methods and data to put into an object. And it needs a combination of IT, domain competence and black magic to decide on what is termed the granularity of the objects: too small and they look like regular CORBA objects, too big and they look like applications unto themselves, or even a data model! So who is going to specify these objects? Who is going to pay for them, and who will own them? Watch this space...
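As a parting illustration of the granularity trap, here is a caricature in C++, with names invented for the purpose, of the failure modes at either extreme; finding the useful middle is where the domain competence (and the black magic) comes in:

    // Too fine: one method per datum is mere plumbing, an ordinary
    // CORBA-style object, with the business knowledge left in the caller.
    class WellHeader {
    public:
        virtual ~WellHeader() = default;
        virtual double kelly_bushing_elevation() const = 0;
        virtual double total_depth() const = 0;
    };

    // Too coarse: an object that plans, drills, logs and reports is an
    // application (or a data model) unto itself, not a reusable component.
    class WholeWell {
    public:
        virtual ~WholeWell() = default;
        virtual void do_everything() = 0;
    };

    int main() {}   // nothing to run; the point is the shape of the contracts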

OMG??

The OMG is a consortium of software vendors working to promote a common object model (CORBA) which will allow for software interoperability: a program should be able to call another routine running on a different machine, perhaps under a different OS. While CORBA holds sway in distributed computing in the mainframe/Unix domain, it is a competing object model, COM from Microsoft, that rules on the desktop.


© Oil IT Journal - all rights reserved.