August 1996

Shell affiliate becomes IBM PetroBank reseller in US (August 1996)

IBM has scored a significant US hit for its PetroBank solution for seismic data management needs with the announcement of an agreement with the Shell Oil subsidiary Shell Services Company (SSC).

SSC, with current revenues of $300 million and a workforce of 800 IT specialists, was floated off last year from Shell Oil to make its own way as an independent profit center delivering a range of business and technology services to the energy industry. It has now chosen PetroBank to be the key databank component of its next generation E&P data management solutions and services. This has to be good news for IBM. SSC has a natural constituency for its services within Shell affiliate companies in North America, and indeed the first data management service SSC will deliver using PetroBank is for Shell Offshore.

2D data set

SSC will be loading Shell Offshore's 2D data set in the first instance, a task expected to take around a year, and the expectation is that it will then move on to the 3D data set. Shell's legacy seismic will be stored on NTP Magstar tape in a combination of on-shelf and robotic storage. The part of the Epicentre data model used will cover header information on post-stack seismic. It is planned to move quickly to a more comprehensive system including the management of pre-stack seismic.

No logs

For the moment SSC has not taken up the wireline log option in PetroBank. Remastering of the legacy data to NTP will be performed internally by Shell and not outsourced. Shell believes that its data management standards have been consistently high in the past and that migration should therefore be relatively painless. The current version of PetroBank being employed is V2a, with migration to V2b expected in the 4th quarter of this year.

Data management services

In some ways the best part of the deal for IBM is that SSC will become one of its PetroBank resellers in the US. In the first instance SSC will be offering its data management services to companies focussing on the Gulf of Mexico, continental US and some selected international locations. The official line from Scott Reeves, manager of Shell's subsurface IT services, commenting on the IBM deal was 'Our relationship with IBM's worldwide Petroleum Industry Solutions unit will help us provide cost effective data services and products to both our existing Shell partners and our new energy industry customers'.

Extremely selective

IBM now has two companies, SSC and Petroleum Geo-Services (PGS), both based in Houston, marketing data management services and solutions based on PetroBank technology. IBM's John Ughetta, manager responsible for E&P revenue in North America, is un-embarrassed. He says that IBM was extremely selective in its choice of strategic partners and is unlikely to add to the list of resellers in the near future.

Calgary next?

IBM itself is also active in the marketplace and is hoping for a successful outcome in Calgary, Canada soon. It is also in talks with a number of majors about their future data management needs.


SeisWorks 3.0 makes its market debut (August 1996)

The new release of its integrated geophysical interpretation suite is said to allow interpreters to perform seismic-derived reservoir characterization.

Landmark started the month of August banging the big drum for the beginning of shipping of its SeisWorks 3.0 upgrade. The new release of its integrated geophysical interpretation suite is said to enable interpreters to move beyond structural interpretation to volume characterization and definition of reservoirs. The company also took the opportunity to announce an expansion of its geophysical consulting services, reinforcing its drive towards the holy grail of 'integrated solutions.'


Landmark says the SeisWorks 3.0 family has moved forward the art of integrating data, processes and applications. In a burst of purple prose, the company likens the difference between the new release and its predecessor to that between an MRI scan, which creates a holistic view of a human body, and an X-ray, which only shows the skeletal structure. In geoscience speak, interpreters can incorporate more critical seismic and well data for much greater accuracy in determining the size, shape, nature and location of a reservoir during exploration, development and production. 'Rather than only interpreting at or around boundaries, geoscientists can now analyze data between the boundaries' is how the company explains the power of SeisWorks.


According to Landmark's John Gibson, executive vice president for the Integrated Products Group, 'most companies use only a fraction of the data they have available to them to make multi-million dollar decisions about their drilling and production programs. Landmark is fundamentally improving their ability to make faster, more accurate decisions by better defining reservoirs through access to more seismic and multidisciplinary information. This integrated approach is difficult or impossible to achieve using other methods.'

Client input

Expanded fault handling capabilities such as active fault correlation using intersection symbols, multi-z values and enhanced time slice fault plane interpretation are some of the features in the SeisWorks 3.0 release which was based on extensive consultation with Landmark clients. Additional enhancements include full merge functionality for interpreting multiple 2D and 3D data volumes in either time or depth, new seismic data management capabilities and significant usability and user interface updates to accelerate training and productivity.


Landmark points out that the SeisWorks family is dynamically integrated through OpenWorks, the company's project data management environment based on industry standards. It proudly notes that OpenWorks integrates more applications than any other commercially available E&P environment, including Landmark applications, other vendors' products and customers' proprietary systems. With regard to the expansion of its consulting services for geophysical interpretation, Landmark's vice president for professional services Keith Johnston said the company could build integrated environments comprehending all the applications and data needed to meet clients' goals, as well as train cross-functional teams.


Editorial - Business Processes, Objects and Benefits (August 1996)

PDM's editor Neil McNaughton tries to get to grips with Business Objects, and concludes that what they lack is ... a business model!

A long while ago, when I was managing director of a small E&P company (or was it a dream?) we decided to computerize our accounting. During our search for the perfect accounts package I remember one piece of advice in particular. Someone with considerable experience of installing such packages said that it was important to have a viable manual system working well before attempting to computerize. It seemed to me at the time, and still does, that this was very good advice for accountants, geophysicists, data modellers - in fact everyone who is contemplating using a computer to perform a business task. First understand your business.


Subsequently I have frequently been bemused by what seem to me to be industry-wide attempts to flout this principle. A database company, an IT department or a hardened hacker will nowadays, after some time spent implementing what they thought would be a good solution to what they thought was the problem, come up against some minor obstacle such as a deadline, a spent budget or an unhappy end-user, and a debate along the following lines will ensue. User: 'This is no good, it has cost too much, it's not finished and it doesn't do what I wanted anyhow.' IT Person: the usual excuses, but then 'of course what you really need is a better understanding of how your business actually works (note: not 'how to run your business'). What you really need is a good dose of Business Process Reengineering; this will make your business run more like a computer and make it easier for me to computerize. During this process we will design new 'Business Objects' which will offer huge Business Benefits.'


The problem facing the E&P data modeling community today is that if a data model is specified in a very complete manner, to allow for all possible manifestations of a particular data type, then the chances of an application which asks the model for a given instance of the data actually finding the required data are inversely proportional to the completeness of the model. These problems are often referred to as impedance mismatches or different data footprints. As Bill Quinlivan of Geoquest has put it, flexibility is the f-word to the data modeler. As a possible solution, the Petrotechnical Open Software Corporation (POSC) has been contemplating Business Objects of late, at a workshop on Business Objects and Interoperability in POSC's Houston offices in February, with a follow-up meeting in July in the POSC Europe offices. David Archer, POSC's COO, stated that while POSC has delivered a data model (Epicentre), it has de-focused on the concomitant goal of interoperability.


A presentation by Oliver Sims of the Object Management Group (OMG)* was of particular interest. OMG has developed a standard (CORBA) which is of use to programmers working at the level of the network (middleware), but has failed to provide high-level tools which application programmers working in Cobol (or Fortran) can utilize. As a result, the holy grail of widespread client/server computing and interoperability is still just beyond our grasp. What we now need is an object model at a much higher level. To the OMG this means Business Objects.


What is a Business Object? Frankly, that is a pretty tough question. First, what is an object? I know an answer to that one. It is a piece of computer code which can be run in either a separate piece of memory (out-of-process), or even on a different computer from the code which is calling it. This is a basic part of client/server computing, and for interoperability, 'true' objects should be able to be called from different operating systems across the network. As you can imagine, the nuts and bolts of calling a subroutine on an IBM mainframe from a Unix box are pretty low-level stuff - the stuff of CORBA in fact. So what is the interest for us high-level chaps? Well, perhaps the process can be scaled up so that a Geoquest app can call a Seismic Line Object (SLO) from a Landmark app, do whatever processing may be required and then pass it back. What is required is a specification of an SLO that both apps can understand, and that can act as an intelligent buffer between the two apps, performing services such as re-sampling of data so as to eliminate the impedance/footprint problem mentioned above.

Who pays?

Needless to say, specifying such a high-level object requires more than just IT skills. It needs domain-competent practitioners (first understand your business!) to know what methods and data to put into an object. And it needs a combination of IT, domain competence and black magic to decide on what is termed the granularity of the objects. Too small and they look like regular CORBA objects; too big and they look like an application unto themselves, or even a data model! So who is going to specify these objects? Who is going to pay for them, and who will own them? Watch this space....


* The OMG is a consortium of software vendors working to promote a common object model (CORBA) which will allow for software interoperability. A program should be able to call another routine running on a different machine and perhaps on a different OS. While CORBA holds sway in distributed computing in the mainframe/Unix domain, it is a competing object model, COM from Microsoft, that rules on the desktop.


South-East Asia geology on CD-ROM (August 1996)

New digital atlas from Cambrian for exploration effort in South-East Asia

Cambrian recently announced the availability of a digital atlas compiled on behalf of CCOP member countries (an Intergovernmental Organization for promotion and coordination of geo-scientific programs in offshore and coastal areas in East Asia). This resource for companies exploring in East or South East Asia combines cultural information with a detailed play atlas (geologic, stratigraphic and hydrocarbon occurrence information for specific basins) and 15 sections plus more than 20 maps describing three main sedimentary units broadly based on regional disconformities. The units are described with respect to thickness (isopach), lithology and environment of deposition (paleo-environment).

GIS package

Cambrian has packaged these data with a GIS viewing and plotting tool, and the maps are also supplied in a number of commonly used formats (AutoCAD, DXF, Microstation, Arc Info), structured and attributed for integration with other proprietary data and map sets. Cambrian has kindly supplied PDM with a review copy of this product.

We will be reporting on this in next month's edition. For more information please contact Phil Carpenter at tel: (44) 1291 673022, fax: (44) 1291 673023, via e-mail or through the World Wide Web.


GeoScene changes (August 1996)

Management team changes at UK software house

PDM has been catching up with UK company Oilfield Systems based near Southampton which is gradually winning acceptance for its geological workstation GeoScene. It now claims some 60 seats installed in Europe, North America and Asia. GeoScene 3 is being launched this month with new features and enhancements including a new optional 3D tool which allows visualization of interpretations prepared with GeoScene.

Davidson COO

There has been a change at the top, with American Glen Kendall stepping down from his position as managing director to be replaced by Ross Davidson, who has been with the company since 1986 and assumes the role of Chief Operating Officer. Kendall remains on the Oilfield Systems Board. In the US, Tom Robinson was recently installed as president of GeoScene Systems, the company's US affiliate, with the task of heading up an expansion. The company is hoping to increase its North American business threefold in the next 12 months. Oilfield Systems' confidence is clearly riding high because it has announced its intention of hiring staff in the UK and the US as well as a technical representative in Calgary.


Amerada hires quality Guru (August 1996)

Conscious of the problems surrounding data quality, Amerada Hess has hired Paul Duller, a quality specialist, to take on the problem. Duller tells PDM how he is going to approach the problems.

Amerada Hess (UK) are taking quality in data management seriously. They consider that their exploration databases form an important strategic resource, containing millions of records compiled from numerous sources over many years. Current attention to this asset has led to the conclusion that errors may exist in practically all of these datasets, and to a decision to consider the limitations that such defects impose rather than ignoring them. Earlier last year Amerada Hess recruited Dr Paul Duller to co-ordinate their approach to data quality and lead a series of initiatives designed to improve the quality of data held in their exploration databases. Commenting on this, Dr Duller said "A key element of our exploration success to date is access to accurate and reliable information and a clear understanding of the nature, origin and quality of data in use. Geological data by its very nature poses particular problems in terms of data quality; however, quality assurance procedures can be applied to ensure the accuracy of the data and safeguard any associated exploration activity. A number of initiatives focusing upon data management and the quality of our seismic and well data are already underway."


As an example, Duller cites a major clean-up effort underway to standardize seismic line naming conventions. The problem arises from a historical laissez-faire approach to line naming and poor data capture standards within the industry, which has resulted in serious discrepancies between physical seismic data and the corresponding navigation database records. Although the industry is moving towards increasing integration of these data types, the volume of data involved within a single company's archives (over 100,000 navigation lines and 200,000+ sections) places major logistical constraints upon how the problem can be resolved.


Amerada Hess have developed a structured methodology to reconcile this problem by matching navigation line names with their corresponding physical records, despite their apparent differences. This involves the generation of a line alias (a standardised and expanded form of the line name) and progressive, stepwise comparisons of records from both the physical and navigation datasets. Multiple match parameters (including the line alias) are used to provide additional levels of confidence in the match results, while user-defined transformations and filters serve to enhance the relative success rates achieved by this approach. Using this approach Amerada has been able to achieve in under six months, what would have taken years of painstaking manual cross-referencing using any other approach.
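By way of illustration, and purely as a sketch rather than Amerada's actual implementation, the alias-and-match idea can be expressed in a few lines of shell. The file names (nav.txt, phys.txt), the example line names and the make_alias helper are all invented; here the alias is simply the upper-cased name with common separators stripped, and the navigation and physical lists are then joined on that alias.

```shell
#!/bin/sh
# Hypothetical sketch of line-alias matching: normalise each
# line name into an alias, then join the navigation and
# physical datasets on that alias to pair up records whose
# raw names differ only in case and punctuation.

# make_alias: read one line name per record on stdin,
# emit "alias original_name"
make_alias() {
    awk '{
        alias = toupper($0)
        gsub(/[-_ \/]/, "", alias)   # strip common separators
        print alias, $0
    }'
}

# Invented example line names from the two datasets
printf "sh-86_101\nSH86-102\n" > nav.txt
printf "SH86101\nsh_86-103\n" > phys.txt

# join requires its inputs sorted on the join field (the alias)
make_alias < nav.txt  | sort > nav.alias
make_alias < phys.txt | sort > phys.alias

# Matched pairs: alias, navigation name, physical name
join nav.alias phys.alias
```

Real matching would of course need further passes with user-defined transformations and filters, as the article describes, but the stepwise compare-on-alias structure is the same.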


Landmark's record 4th quarter earnings (August 1996)

Nice timing from Landmark as earnings of $54 million are reported.

Robert P Peebler, president and chief executive officer of Landmark, says he is quite pleased with the record fourth quarter results of the company and the end result of Landmark's 1996 fiscal year. He believes that the company has now got some momentum for the future. 'We look forward to creating even more value for our customers as we expand the scope and range of Landmark's integrated information systems and professional services through our upcoming merger with Halliburton and alliance with EDS.'


What Peebler didn't say is that he's probably relieved at not being continually in the front line reporting to shareholders on earnings per share, now that Landmark has the less exposed role of wholly owned subsidiary of its new master Halliburton Company. Landmark reported record revenue of $54 million for its fourth quarter of fiscal year 1996 with net earnings of $4.6 million or $0.26 per share. In 1995 revenue was $49.2 million for the fourth quarter with net earnings of $5.5 million or $0.30 per share. Software revenue for the quarter was up from $24.2 million to $25.7 million while services revenue rose 26% from $16.8 million to $21.2 million.

1996 record

For the year end Landmark achieved record annual revenue of $187 million compared with $171.2 million in 1995. This was made up of software revenue of $83.6 million ($83.7 million in 1995) and a 34% upswing in services revenue from $55.6 million to $74.8 million. Excluding non-recurring charges, net income in fiscal year 1996 was $16.6 million or $0.93 per share compared to 1995 net income of $20.1 million or $1.15 per share. In the acquisition agreement announced in July, Halliburton Company will issue 0.574 of a share of its common stock for each outstanding share of Landmark common stock.


Keep on Geo-Trekkin (August 1996)

Following publication of the list in last month's issue, PDM reports on another important data repository - the US National Geoscience Data Repository System (NGDRS) initiated in 1993.

The goals of the NGDRS are twofold: to preserve the large amount of legacy geoscience data that may be destroyed as being surplus to requirements, and to provide a means for users to locate existing data through a Geographical Information System (GIS). Under the auspices of the American Geological Institute (AGI), the NGDRS team, which includes POSC, the Texas Bureau of Economic Geology and The Information Store, has put together a coalition of private companies to plan and implement a system of public and private data stores. This represents a network of independent, dispersed digital data and material storage facilities that are joined electronically.

real rocks!

Phase I of the NGDRS project was completed in 1994 and demonstrated that companies were willing to donate millions of miles of seismic, well logs, boxes of cores and cuttings, and other assorted geoscience data. Phase II was initiated in February 1995 with the goal of developing an operating plan for the NGDRS. A metadata repository spanning twenty-two data servers, controlled by GeoTrek, has proved the viability of using a metadata repository to find data in distributed data servers. GeoTrek provides GIS-based access to the different metadata stores, and the user can determine what data is available in a given area. It is important to note that only metadata (data about data, such as well name, well location, formation name, etc.) is accessible through the NGDRS; the user must contact the owner of the hard data for permission to view or retrieve it. Data accessible through the system includes cultural data, seismic locations, well data, cuttings, maps, images and text files.


In 1997, Phase III will establish a clearinghouse for data transfer from the private sector to the public domain and provide a metadata repository and related software that will enable users of geoscience data to determine what data exists, who owns it, where it is, and how to access it. The next step is to put the University of Texas Bureau of Economic Geology's (BEG) core and cuttings data store live on the Internet during the first quarter of 1997. Contacts: Tim Haynes and Glenn Breed.


Dutch company counting on quantitative geoscientific interpretation (August 1996)

Newly formed Dutch company de Groot-Bril Earth Sciences offers general-purpose software portfolio for quantitative geoscientific interpretations.

Launched at the European Association of Geoscientists and Engineers (EAGE) meeting in Amsterdam, the new company "attracted a great deal of interest". We have to take their word for that, but this is what they offer. The DGB-GDI (Geology-Driven Integration) system, according to its makers, comprises a set of modules for integrating, analyzing and quantifying geoscientific data and knowledge. The subject matter is completely user-defined. Geological objects with attached properties (non-numeric) and quantities (numeric) are defined at "natural scale" levels. Any hierarchical ordering system used in sedimentary geology can be projected into the system, where it is used to identify, manipulate, analyze and quantify the information. There are no restrictions with respect to the scale levels, nor to the quantities and properties to be studied.


The idea is that DGB-GDI can be used for applied research and special studies in various disciplines such as reservoir geophysics, seismic lateral prediction, seismic pattern analysis, petrophysics, rock-physics, basin analysis, etc. Unique features in the portfolio are a module for simulating pseudo-wells and dGB's concept of 'translator objects.' Via this technology dGB claims to have succeeded completely in separating the application from the data representation. Data can be streamed into or out of the application in real-time from any file, device or command that is accessible from the user's site. In other words, data duplication and unwanted data loading procedures can be avoided. dGB is selling the software in logical groups of modules. We'll be keeping an eye on further developments, which are being furthered through a newly initiated GDI consortium which so far includes Shell, Saudi Aramco and Geo-Logic, with other companies and universities participating in the decision-making.


Hot dates for data management (August 1996)

Well, we knew data management was hot, but six conferences in the same month (nearly). Sounds like panic stations.

Here's the list so far:

The E&P Data Model Seminar, Stephenson & Associates - Houston August 27th - 29th. Contact S&A on Tel: +33 7576 0511, Fax: +33 7576 0512.

Geoscience Data Management Seminar, PESGB, 19th-20th September 1996, London. Contact PESGB, 2nd Floor, Dover Street, London W1X 3PB. Fax: +44 171 495 7808.

Knowledge Working in the Oil and Gas Industry, PSTI/IBC Group, 19th-20th September 1996 in Aberdeen. Tel: +44 171 453 2712. Fax: 0171 631 3214.

Exploration & Production Data Management Conference, Gulf Publishing Company, 10-11th September, Houston. Tel: +1 713 520 4430. Fax: +1 713 520 4433.

Stephenson & Associates are also holding E&P Data Management '96 on 23rd-24th September and E&P Data Model Seminar on 25th-27th September, both in London.


Paris venue for course on POSC basics (August 1996)

A series of courses on aspects of Petrotechnical Open Software Corporation (POSC) technology is being delivered this autumn at the headquarters of the Ecole Nationale Superieure du Petrole et des Moteurs (ENSPM) in Rueil-Malmaison, on the outskirts of Paris.

Lectures, in English, will be provided by Geomath International and Paras. One of the four-day courses, POSC: Epicentre Overview - Advanced Epicentre, being held on 9-12 September and 5-8 November, covers the concepts of the Epicentre data model and presents an overview of the methodology, structure, dictionary and diagrams. It includes navigation through Epicentre and shows how it facilitates data sharing and integration of disciplines in the E&P domain. The second part of the course builds on the core model which ties all the discipline models together. It describes at length one topic - either geology, geophysics, production, reservoir information, well operations or E&P management/organization (defined according to client requests).


POSC: Data Access and Exchange (4 days), being held on 15-18 October and 16-19 December, offers the fundamentals and key principles of developing applications and POSC software integration platforms (SIPs) based on POSC specifications. Using examples and exercises, the course covers the application program interface, architecture, the Epicentre meta-model, the EXPRESS language, the data model, data access levels and exchange file access. Further information is available from: ENSPM - Formation Industrie, Centre de Rueil, 4 avenue de Bois-Preau, B.P. 311, 92506 Rueil-Malmaison Cedex, France. Tel: +33 1 47 52 67 10, Fax: +33 1 47 52 70 41.


Landmark users' world meeting (August 1996)

Landmark is holding its annual Worldwide Technology Forum in Houston on 12-14 February next year, so if you're a Landmark user this is where you should be. It'll cost you $400 for what the company promises will be three focussed days in which users will be 'exposed to in-depth presentations, future development plans and product updates.' This will certainly be a good time to see how Landmark is faring under its new owner Halliburton Company. Venue for the get-together is the Adams Mark Hotel, Houston.


The Vendor's tale (August 1996)

We get inside the head of an imaginary software developer to see how standards, interoperability and objects look from the commercial side of the fence.

Just imagine that you are a software vendor working in the E&P sector. A Schlumberger or a Landmark - or should I say Halliburton. Looking down the product line, you see a variety of tools. Some developed in house, perhaps rather a long time ago, with proven algorithms developed in Fortran on proprietary data structures. Others may be more recent, perhaps acquired in a takeover of a younger company, with brave new software incorporating all the latest politically correct objects and techniques. As you scan these products, you will know that some, probably the older ones, are the company's real breadwinners. They may be written in Fortran, but they have undergone years of improvement and adaptation as a result of feedback from clients. They may not integrate with other products too well, they may not be pc, but by golly, as a result of years of work, they work damn well!

Pain in the butt!

Now what do you think of an outside "standards" organization which comes along and starts out by saying that the data model, the heart of your program, is unacceptable and needs changing to be XYZ compliant? You would regard them as a serious pain in the butt, n'est-ce pas? Unfortunately for you, this standards organization is funded, and presumably supported, by a substantial number of your clients (you thought they were happy!). Other support for the standards comes from an even more important sector - your potential new clients, who are ready to sign that check just as soon as you make your software XYZ compliant. How do you feel now? Well, now you probably feel that a cleft stick has caught up with a sensitive part of your anatomy. Just as your clients finally decided to give up developing software in-house and focus on their core business, they come back to tell you how to write software via a standards committee.

Damage limitation

Of course you grin and bear it, join the standards committee yourself, and start the long drawn-out process of damage limitation. A little bit of agreement here, a modicum of dissent there, just enough to dampen their ardour. But you wake up in the middle of the night every now and then and think "No, they can't be serious. They want us to tear the heart out of our application, put it back 10 years, and then moan because it isn't performant anymore and there are a zillion new bugs. What are these guys trying to do to us?"


Advice on rationing software licenses (August 1996)

Garry Perrat (Phillips) offers the following advice to those who are trying to optimize software licenses and use.

'I have been asked to find out whether people are monitoring usage of licenses and/or monitoring processes for idle sessions and, if so, how they are doing it. I have a script which looks for idle sessions (basically looks every half hour and checks whether or not the process has used any cpu since the previous check) but doesn't automatically terminate them - it just logs them so people who are looking for a license can identify users hogging them. (Of course, this wouldn't be necessary if FlexLM's TIMEOUT feature was implemented by the applications, but then I guess this sort of thing isn't in a vendor's interest.) Monitoring license utilization is quite easy to program and Panther's AppTrack does quite a nice job with some pretty graphs. It can help to determine how many licenses you actually need. (It is quite wasteful to buy enough licenses so that no one is ever unable to get one - the trick is determining the frequency of 'lock out' which is an acceptable compromise between users' ability to get their jobs done and keeping maintenance costs down.)

Application Tracking Code


# seismon - monitor SeisWorks processes for
# inactivity since SeisWorks doesn't support
# FlexLM's TIMEOUT facility
# run by cron every half hour

# Put a blank line and the date into the log file
echo "" >> $HOME/lmk/seismon.out
date >> $HOME/lmk/seismon.out

# Define temporary work area - not /tmp since this is cleared
# on reboot and we need to hold on to the files.
TMPDIR=$HOME/lmk/seismon.tmpdir; export TMPDIR

# Specify the list of servers to monitor
serverlist="lnst0aa lnst1a lnst2a lnst3a lnst4a lnst5a lnlm06a lnlm08"

# Loop through all servers
for server in $serverlist; do

    # Check that the server is alive.
    # If it is not then continue with the next one
    /usr/etc/ping $server >/dev/null
    [ $? -ne 0 ] && continue

    # The server is alive so find all relevant processes running
    # on it and print out a sorted list of pid, uid, cpu_time
    # and process_name
    rsh $server ps -axuww | nawk '$0~/Seis.d$/ \
        {print substr($0,10,5), substr($0,1,9), \
        substr($0,50,6), $NF}' | \
        sort > $TMPDIR/seismon.$server

    # Join the old and new lists and print out server, uid and
    # process_name if there is no change in cpu time
    join $TMPDIR/seismon.$server.old $TMPDIR/seismon.$server \
        | nawk '$3==$6 {print "'$server'",$2,$4}' >> $HOME/lmk/seismon.out

    # Rename the new output to old
    mv $TMPDIR/seismon.$server $TMPDIR/seismon.$server.old
done


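The script is described as being run by cron every half hour; the corresponding crontab entry might look like the following (a sketch - the path to the script is an assumption):

```shell
# Hypothetical crontab entry (install with "crontab -e"):
# run seismon on the hour and at half past, every day
0,30 * * * * $HOME/lmk/seismon
```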


GeoForum group dissects Inter/Intranet use by E&P companies (August 1996)

A recent meeting of the Geoforum discussion group explored the application of Inter- and Intranet solutions in UK-based E&P companies.

There was great interest expressed by the have-nots, proselytising by the haves and some sitting on the fence from the luddites. While there was near universal recognition of the potential of the Intranet, there were misgivings as to the widespread provision of Internet access to the masses. The Intranet, especially within Phillips, was cited as a powerful tool for the dissemination of information on many topics such as

  • phone book
  • project reports
  • bimonthly geoscience newsletter (used to be hardcopy)
  • personnel contact lists.

The Intranet in Phillips is definitely being exported from an enthusiastic US, where it has widespread acceptance, to a more reserved UK. This is reflected in other UK companies who doubt the efficiency of generalized net access in terms of personnel output. The obvious concern in these days of cost hyperconsciousness is with the time-wasting element of widespread web surfing. For myself, I always preferred walking down the aisles between the books in the library rather than doing a key-word search, so I was a surfer before my time, I suppose. This sometimes resulted in lateral leaps of thought, sometimes not. It never crossed my mind that I might be wasting company time. Nowadays I suppose you just have to keep pickin' 'em horizons, and any searching for off-screen enlightenment is time mis-spent by definition.

Inter vs Intranet

The Internet, as everyone knows by now, is a sort of friendly, anarchistic information mart, accessible to most via the World Wide Web. This kind of furry freak brothers mentality does not co-habit easily with your average business-oriented oil co. of today. However, the usefulness of cheap, simple-to-implement-and-configure, point-and-click software which takes anyone about 2 minutes to master has not escaped managers and IT professionals in all industries. The compromise of the moment is therefore the Intranet: in other words, the application of the same technology of Internet servers and World Wide Web browsers, but generally limited to within a single company. This allows documents (text, spreadsheets, graphics) to be deployed and made accessible through powerful and easy-to-use search engines, and offers an instant solution to the democratisation of information within a company. While the Intranet offers a secure and easily implemented solution today, user demand for access to external sources (the Internet), coupled with more straightforward solutions to security issues, is likely to blur the inter/intranet distinction. In the near future the Internet will become more secure, with even restricted, paying access for certain services; this already exists in the E&P sector for some providers such as POSC and the Oil and Gas Journal. At the same time some Intranet services - such as information of interest to shareholders - will become publicly available. All this will be made possible through firewalls, secure transactions and encryption, all of which are currently available technologies waiting for standardisation and/or market muscle.


    Getting Started on the Internet (August 1996)

    Garry Perrat, the driving force behind Geoforum, has allowed us to print some of his tips on using the Internet. Here's Garry...


    Usenet provides discussion groups (somewhat like bulletin boards) on every conceivable subject (and quite a few inconceivable ones, it has to be said!). Unless your site has a news feed, access can be tricky. I currently peruse sci.geo.petroleum via Birmingham University's gopher (address below) which provides access to all news groups.


    These are some useful starting points for those new to the internet:

    • (Petroleum Industry Events)
    • (The Virtual Earth)
    • (Quality Earth Science Resources)
    • (Another Node on the Net)
    • (Yahoo search engine)
    • gopher:// (Birmingham University gopher for Usenet)

    The sci.geo.petroleum resource list is not available online as far as I can tell. It is posted regularly to sci.geo.petroleum itself or is available from Scott Guthery ( It includes a LOT of information!

    To visit a company's web site the best first attempt (apart from asking someone or looking at a brochure) is to try "". For a non-profit organisation replace the "com" with "org". If you still can't get through, give them a call! Note that although most vendors (hardware and software) and service companies have Internet web sites, many oil companies don't. Some companies which follow this website naming convention include IBM, SGI, Sun and Halliburton. Some which require a little lateral thinking include Landmark ( and Schlumberger ( A search on Yahoo (or similar) can usually find a company.
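The naming heuristic above can be sketched as a tiny shell function - guess_url here is a hypothetical helper for illustration, not an existing tool - which prints the candidate addresses to try:

```shell
# guess_url - print candidate web addresses for a company name,
# following the convention described above: try ".com" first,
# then ".org" for a non-profit organisation.
# (Hypothetical helper written for illustration only.)
guess_url() {
    name=$1
    echo "http://www.$name.com"
    echo "http://www.$name.org"
}

guess_url ibm
```

If neither address responds, a search engine is the fallback.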


      The Pecten Data Delivery Project (August 1996)

      Shell subsidiary Pecten describes a pragmatic approach to data management.

      At the recent Stephenson and Associates conference on Data Management, Inda Immega from Pecten gave a revealing talk on data management as implemented by this Shell subsidiary. While others pontificate on data models and play with objects of one sort and another, Pecten have adopted a more pragmatic approach. A survey of Shell users showed islands of satisfaction separated by data exchange bottlenecks. With 6 layoffs in 7 years, projects had become people-bound. A need was identified for a shared data repository and tools for moving data in and out. This was to be an efficiency-based project (there are inefficiency-based projects? - there probably are too!). The IT group was down from 12 to 2 and a decision was made to support no more than one tool to do one job (some smaller databases were abandoned). Archiving was to be built in. Note that 5 years ago the idea was to have all digital data accessible via a GIS. This has proved unrealistic. Today the main aims are

      • go for all-digital data
      • do not lose data.

      They are not (currently at least) concerned with

      • data model
      • hardware
      • Business Process ReEngineering

      So typical activities are

      • scanning and indexing of logs
      • maps geodeticised and entered into database
      • digital data cleanup (this is a major bottleneck and is mainly outsourced)

      Pecten use the Tigress data model, with replication of data to remote sites (PCs). Immega says Shell are converging towards industry solutions in the same direction as POSC & PPDM without following the same steps. Personally I found the Pecten approach refreshingly practical. While it was miles away from Data Models and BPR, such realism is more likely to impact positively on the business than most of the other processes.


        Software Review - UKOOA/BGS Lithostratigraphic Nomenclature of the North Sea on CD-ROM (August 1996)

        PDM reviews a new CD-ROM-based resource from the British Geological Survey, providing a definitive nomenclature of the North Sea's lithostratigraphy.

        This review is intended as a review of the technology rather than of the stratigraphic interest of the product. In respect of the latter, a quote from the introduction well describes the intent. "Emphasis has been placed on developing a scheme that, while satisfying the requirements of lithostratigraphic procedure, is of practical value to the diverse group of professionals needing to use it (e.g. exploration/development geologists, drillers, mud loggers, petroleum engineers, and members of the academic community). To this end, the aim has been to ensure that all lithostratigraphic units included within the scheme will be readily identifiable with the minimum of information, i.e. through the routine study of cuttings and wireline logs."

        Harry's member

        The report is supplied as a Windows help file and as such is fully integrated into the Windows environment. I had no trouble cutting and pasting between the NSL demo and other Windows applications such as Word and Visio. The quality of the graphics, intended for quick reference, is not quite up to presentation standard, but good enough for reports. I had no trouble in knocking up the following basic introduction to the Harris Member in a few seconds. "The Harris Member is composed almost entirely of reddish brown, red, brownish grey, and waxy green to greenish grey mudstones and minor siltstones, but there are occasional laminae of white and purple mudstone, and stringers and thin beds of sandstone and argillaceous limestone. The mudstones are micromicaceous, and locally contain traces of pyrite. They are typically slightly calcareous, and include sporadic units of highly calcareous mudstone, grading to marl. Most of the sandstone beds are less than 1m thick; they are white, pale grey, pale brown or greenish grey, moderately or poorly sorted, and very fine, fine or medium grained. Some sandstone beds are highly micaceous; others are reported to contain traces of glauconite. The sporadic, thin limestone beds are white, pale grey or pale brown, and have microcrystalline or cryptocrystalline textures".

        The use of a Windows help file has several advantages

        • You already know how to use it
        • Full text search (i.e. you do not need to search for a keyword, you can search for any word you like and all occurrences are automatically found in milliseconds)
        • Cut and paste between applications
        • Integrate with other windows products

        The basic material has already been published in paper form by the British Geological Survey1 and consequently should be an authoritative reference in this field. The BGS also has one of the most comprehensive geological libraries in Western Europe (and a reference library for the general public) and is custodian of the National Geoscience Information Service which includes a substantial inventory of maps, borehole cores, samples and other data.


        Some minor niggles: Windows help files do not highlight text found in a search, and the basic help file format is easy to get lost in, although bookmarks and history help a lot. The newer InfoViewer format uses a split-screen display with a hierarchical reminder of the current location on one side and the hypertext help-type file on the right. Other ideas for future editions might be hotlinks from maps to wells, full text of papers (references), or perhaps some URLs.

        Cambrian are now seeking pre-commitments for purchase of the full product on its completion later this summer. For more information please contact Phil Carpenter at (44) 1291 673022, fax (44) 1291 673023, via e-mail at or through the World Wide Web at
