You could be excused for being a little confused by the messages coming out of Houston from E&P software developer CogniSeis. At the beginning of October the company announced a restructuring - read: job losses. But later in the month it was swinging back with its latest release of VoxelGeo, announcing it as the true 3D, true volume interpretation system that allows geoscientists to interact directly with their data. The company has also been included in an advertising campaign to promote the GeoScience Corporation, the name used by US company Tech-Sym for its geoscience-related group of companies - CogniSeis, Syntron (seismic data acquisition manufacturer) and Symtronix (workstation specialist) - led by Richard Miles, previously boss of Syntron. So what are we to make of it all? The best answer seems to be that a new management team, put in place earlier this year following the company's acquisition by Tech-Sym, has made its move to reposition the company, focussing on what it considers to be its core business.
Hiring now!
"Actually we are hiring staff at the moment", Richard Cooper,
general manager of Cogniseis told PDM. "Obviously we regret having to implement staff
reductions as part of this initiative. However, Cogniseis is operating in a dynamic and
changing market. It is essential for us to maintain the status of our 'best of class'
product offerings in seismic processing, volume interpretation and geological modeling
while positioning these products to meet the future needs of the E&P industry."
The four main components of the company's product drive in the future will be based on its
well known Focus processing and imaging software, the SeisX surface-based interpretation
system, the GeoSec geological modelling package and VoxelGeo (these last two the result
of joint development with E&P industry consortia). In due course the whole product line
will come under the label of TerraCube. CogniSeis has therefore put the spotlight on VoxelGeo 2.1.5, which it claims can render data faster and more accurately than any other product on the market.
Calgary - E&P Data Capital of the World? (November 1996)
PDM visits Calgary and concludes that the rest of the world has a lot to learn from this city's IT infrastructure.
Calgary has a lot going for it in terms of E&P data management and use. It is a major center for the E&P business and, unlike Houston, nearly all of the
industry is located in the downtown area. The mining law and regulations on public domain
data mean that there is a lot of data available for trade and sale, which has led to a
considerable growth in the service sector. This growth has not yet been overtaken by take-overs and merger-mania, so there really is an active marketplace for data and added value services, with many players of different sizes.
Fiber everywhere
What makes Calgary a good candidate for the title of E&P Data
Capital of the world is all of the above plus the communications infrastructure that has
been installed to cater for all this activity. Some of you may have heard of Asynchronous
Transfer Mode (ATM), which is the latest offering from Europe's public or ex-public
service operators. This, in Europe, is to provide bandwidth of around 155 megabits per second (Mbps). Well, in Calgary, the Metronet fibre links originally installed by Fujitsu for a now defunct supercomputing centre, doubled up by a competing system from Telus, offer data transfer rates of up to 650 megabytes per second, i.e. around 5 Gbps. These numbers are important
because they influence what data can be practically exchanged over the Wide Area Net.
Electronic trading and purchasing of pre-stack 3D seismic datasets becomes a possibility.
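As a back-of-the-envelope illustration of why these rates matter, the short sketch below compares transfer times for a pre-stack survey at ATM and Calgary fibre rates. The 500 GB survey size is our own illustrative assumption, not a figure from Metronet or Telus.

```python
# Back-of-the-envelope comparison of WAN transfer times.
# The 500 GB pre-stack survey size is an illustrative assumption.

SURVEY_GB = 500
BITS_PER_GB = 8 * 1024**3  # treating 1 GB as 2**30 bytes

def transfer_hours(rate_mbps: float) -> float:
    """Hours needed to move the survey at a given line rate in Mbps."""
    bits = SURVEY_GB * BITS_PER_GB
    seconds = bits / (rate_mbps * 1_000_000)
    return seconds / 3600

for label, mbps in [("ATM (Europe), 155 Mbps", 155),
                    ("Calgary fibre, 650 MB/s ~ 5,200 Mbps", 650 * 8)]:
    print(f"{label}: {transfer_hours(mbps):.1f} hours")
```

At 155 Mbps the hypothetical survey takes most of a working day to move; over the Calgary fibre it takes minutes, which is what makes electronic trading of pre-stack data plausible.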
Editorial - Discovery - the future of E&P Data Modelling? (November 1996)
PDM reports on the state of play in the Discovery Project, the second attempt (and probably the last) to merge data models from the PPDM association and POSC.
As the first signs of winter arrived in Calgary, a rather important meeting was taking place in the granite-clad high-rise of Petro-Canada's offices. The AGM
of the Public Petroleum Data Model Association had to deliberate on the continuing
development of the PPDM data model (see insert) but also was the scene of the first public
reporting on the preliminary outcome of project Discovery. This, for those of you who
either missed previous editions of PDM, or who were not paying attention, is an attempt to
reconcile the disparate approaches of the Petrotechnical Open Software Corporation (POSC)
and PPDM. This project's core team is made up of representatives from Landmark, Chevron
and Schlumberger with representatives from POSC and PPDM helping out. In other words, this
is a business driven (in the true sense of that much misused phrase) initiative. The
Discovery sponsors either want to encourage a rapprochement between PPDM and POSC, or at
least send a clear message to the outside world that they are making a very serious
attempt to do so, and that if nothing comes of it, then it ain't their fault!
forked-tongue
What is at issue in this co-operative effort between competitors is not so much a drive towards the goal of interoperability (we saw in the October PDM that in the case of one major vendor, a POSC compliant implementation was in no way intended to facilitate this); the objective of the Discovery sponsors is simply that two data models out there is one too many. Furthermore, because their client universe is split roughly between those who use PPDM in one guise or another, and those who have stated that only compliance with POSC will do for them, this situation is having a very real impact on their business, and on their development and maintenance effort. A single data model will avoid the necessity for a "forked-tongue" marketing strategy, and will roughly halve the data model development effort for these major players.
subset?
So what is the problem here? What is project Discovery trying to do? Reporting at the PPDM AGM, Mary Kay Manson of Chevron Canada gave an update on Discovery progress to date. While the long term strategic objective is to meld the two models, the current Discovery project is a preparation for the next major release of the PPDM data model - version 4.0, which, if all goes well, will become a "subset" of POSC's Epicentre model. It should be possible to obtain the new model from Epicentre via an automated projection methodology - in other words, a set of rules for generating one model from another. Thus PPDM will benefit from the rigor of POSC's purist data model, and POSC will have a new "subset" of happy clients.
prosaic
So much for the overview; the reality of Project Discovery is more prosaic. While project Discovery set out with the initial objective of providing a common model, it was soon realized that this was a harder task than first thought. Fundamental differences between the two models led to the scope of the project being reined in to test the feasibility of mapping from an Epicentre logical data model to a PPDM-like relational implementation. The area of seismic acquisition was selected as a test-bed. The methodology is to use a projection tool to manufacture a physical model (Oracle tables and relations) from the Epicentre logical model. This has been found to be a very unstable process - with small changes in the projection method making for radically different models - a reflection of the complexity of the system.
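To give a flavour of what such a projection involves, here is a toy sketch in Python. It is emphatically not the Discovery projection tool: the entity, the attribute names and the single "rule" are invented for illustration, but flipping that one rule is enough to produce a structurally different physical model, which is the kind of instability reported above.

```python
# Toy illustration (not the Discovery projection tool) of projecting a
# logical entity description into Oracle-style DDL. The entity and the
# projection rule are invented for the example.

entity = {
    "name": "seismic_line",
    "attributes": [("line_id", "VARCHAR2(20)"), ("survey_id", "VARCHAR2(20)")],
    "multi_valued": [("shotpoint", "NUMBER")],   # one line has many shotpoints
}

def project(ent, child_tables=True):
    """Emit DDL; the child_tables flag stands in for one projection rule."""
    ddl = [f"CREATE TABLE {ent['name']} ("]
    cols = [f"  {n} {t}" for n, t in ent["attributes"]]
    if not child_tables:
        # de-normalised choice: fold the repeating group into the parent table
        cols += [f"  {n}_{i} {t}" for n, t in ent["multi_valued"] for i in (1, 2, 3)]
    ddl.append(",\n".join(cols))
    ddl.append(");")
    out = ["\n".join(ddl)]
    if child_tables:
        # normalised choice: the repeating group becomes a child table
        for n, t in ent["multi_valued"]:
            out.append(f"CREATE TABLE {ent['name']}_{n} (\n"
                       f"  {ent['name']}_id VARCHAR2(20),\n  {n} {t}\n);")
    return "\n\n".join(out)

# Flipping a single projection rule produces a structurally different model.
print(project(entity, child_tables=True))
print(project(entity, child_tables=False))
```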
top-down
Other problems arise from the different scopes of the two models. Some differences between PPDM and POSC come from the simple fact that different people, with different focus, have worked on the models at different times. While PPDM is strong in the domains which have been central to its members' activity - such as the dissemination of well data - POSC's top-down design and all-encompassing intended scope inevitably means that it is, at least locally, a bit thin on the ground. Another major difference stems from the inevitable compromise that is implicit in all data models using the relational data base. They can either be normalized (see side box) - for precise data modeling - or non-normalized for performance. Generally speaking, Epicentre is normalized and PPDM is not.
where's the data?
A further fundamental difference between the two models, and one of particular importance to project Discovery, is where to store data. You may think that a database is a good place to start with, but things are not that simple. A historical difference between the PPDM and POSC specifications is the actual location of bulk data (i.e. well logs and seismic trace data). In all implementations of both models, the actual data is generally speaking stored outside of the database, on the file system, in either native formats such as SEG-Y or LIS or in proprietary binaries. Project Discovery, however, appears to be following the pure POSCian line with data to be stored within the database. This, as we discussed last month, can have deleterious effects on query performance, and may be a stumbling block for Discovery take-up if it is too hard-wired into the model.
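The practical difference can be sketched as two retrieval paths. Everything here is hypothetical - the table names, the "db" helper object and its query methods are invented for illustration - but the point is simply that the pointer approach keeps the bulk read on the file system, while the in-database approach pushes every trace sample through the query engine.

```python
# Hedged sketch of the two storage philosophies discussed above. The 'db'
# object and its query_one/query_all methods are hypothetical stand-ins for
# whatever database access layer an implementation might use.

def read_traces_pointer_style(db, line_id):
    # Database holds only a reference; the bulk read bypasses the RDBMS.
    path = db.query_one("SELECT file_path FROM seis_line_file WHERE line_id = :1", line_id)
    with open(path, "rb") as f:          # e.g. a native SEG-Y file on disk
        return f.read()

def read_traces_in_database(db, line_id):
    # Bulk samples stored as BLOBs; every byte moves through the query engine,
    # which is where the query-performance worry comes from.
    rows = db.query_all("SELECT samples FROM seis_trace WHERE line_id = :1", line_id)
    return b"".join(rows)
```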
x-man
Project Discovery's schedule has been broken down into four phases quaintly named "Strawman", "Woodman", "Ironman" and "Steelman". This is a handy way of delimiting the progress of a project as it allows for infinite variation of content and progress. Discovery is today in the "Ironman" phase - although what this means is not yet clear. The strong pro-Discovery lobby within PPDM is pushing to extend project Discovery's scope to substantially the whole of the PPDM data model, for release as PPDM version 4.0. The zealots are even forecasting a release date towards the end of 1997 for this. This sounds like tempting fate in view of prior similar announcements, but with support for Discovery coming from the sponsors and the newly elected PPDM board, it may be more achievable than it was at the time of the last POSC/PPDM merger attempt back in 1993.
physical
Two mindsets will have to evolve before Discovery brings home the
bacon. The rank and file of the PPDM membership will have to loosen up and accept that the
next generation of the PPDM model is going to be very different from the 2.x and 3.x
versions. This will be hard, because PPDM is a "physical" model, in other words
it is implemented and used. Data and applications are plugged in and running off it, so
that any change (even between prior mainstream versions of PPDM) has always been painful
(see the PPDM article in this issue). Secondly, POSC will have to enhance the marketing
profile they ascribe to Discovery. Discovery will be still-born if POSC continues to see
it as a "subset project". POSC will have to consider the Discovery data model as
THE relational projection of Epicentre. If this happens, then even the recalcitrant
Normalization (November 1996)
PDM offers a backgrounder in database basics.
The founding fathers of the relational data base, by using a
combination of mathematics and black magic, came up with a set of rules defining the make
up of a pure relational database. Nearly all of these rules have been bent or ignored by
relational data base vendors and implementers, and for a good reason: a compromise between
purity and performance is essential in the real world. A good example of such compromise
is in the degree of normalization that is implemented in a data base. Normalization is
measured on a scale of 1 to 5 "Normal Forms" - a kind of Richter Scale of
database purity. The first normal form states that one value in a table should be just
that - i.e. it is acceptable to have a well name column, with one value per line, but not
to have a well names column, with a group of well names in a single field. The second normal form states that each row in a table must be uniquely identifiable, while the third
eliminates redundancy of non-key information. While these three requirements combine to
make for robust database design and simple maintenance, they can have a negative effect on
performance. Because of this they are frequently only partially applied.
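To make the first and third normal forms concrete, here is a small sketch in Python; the licence, well and operator values are invented for illustration.

```python
# A minimal sketch of the normal forms described above, using invented
# well data. Violating first normal form: several well names in one field.
not_1nf = {"licence": "P111", "well_names": "31/2-1, 31/2-2, 31/2-3"}

# First normal form: one value per field, one row per well.
first_nf = [
    {"licence": "P111", "well_name": "31/2-1", "operator": "Acme Oil", "operator_city": "Aberdeen"},
    {"licence": "P111", "well_name": "31/2-2", "operator": "Acme Oil", "operator_city": "Aberdeen"},
]

# Third normal form: move the operator's city (a fact about the operator,
# not about the well) into its own table to eliminate the redundancy.
wells = [{"licence": "P111", "well_name": w["well_name"], "operator": w["operator"]}
         for w in first_nf]
operators = {"Acme Oil": {"city": "Aberdeen"}}

# The denormalised form answers "which city?" without a join, which is the
# performance argument for only partially applying the rules.
print(wells, operators)
```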
Panther in Europe (November 1996)
GMG Europe is anxious to make clear that it is the management company acting for Panther Europe, subsidiary of the Calgary-based software developer.
In July 1995 GMG Europe entered into a marketing agreement with
Panther Software to act as the authorized regional distributor for its seismic data
loading and application tracking software products throughout Europe, Africa and the
Middle East. The success of Panther led to the establishment of a regional office, Panther
Europe, with GMG Europe retained to provide a variety of management services. Based at GMG
Europe's Ascot offices, Panther Europe offers full technical support for the company's
software products. GMG Europe also acts as a distributor throughout Europe, Africa and the
Middle East for Green Mountain Geophysics' survey design and prestack processing products.
Landmark's world tour (November 1996)
Landmark tells us that it is holding a series of educational seminars in November for geophysicists, geologists and E&P managers under the label of 'Geophysical Solutions for a Changing World'.
According to the publicity "there will be industry visionaries and
experts on hand to comment on future trends and developments. The events will showcase
technological breakthroughs and teach the skills needed to succeed in today's, and
tomorrow's, marketplace. You'll see hands-on integrated workflow demonstrations, hear real
success stories and witness first hand the results and business advantages that companies
have achieved by working with Landmark." Sounds too good to miss! This month's four dates
are 20 (London), 22 (Aberdeen), 25 (Stavanger) and 29 (Milan). For more information tel:
PPDM's past the key to POSC's present? (November 1996)
PDM analyses competing data models from PPDM and POSC and concludes that despite their differences, they are likely to share a lot of common usage patterns, and a few 'gotchas'.
The Public Petroleum Data Model Association was founded in Canada in
1988 (two years before POSC!) with the intent of defining a data model suited to the needs
of the Canadian E&P sector. This meant that the initial versions of the PPDM data
model had a strong focus on well data which is both widely available and very actively
traded in Alberta (a propos, Queen Victoria wanted to rename India "Alberta"
after the death of her consort, her advisors redirected this accolade to the Canadian
province). The PPDM model and membership have grown a lot since the early days. Today
there are 86 members, and the current version 3.3 of the PPDM data model contains around
300 tables in the four subject areas of wells, seismic, land/lease and lithology. The PPDM data model is very widely used in the industry; it is at the heart of Geoquest's Finder product, with more than 500 world-wide licences. In fact Finder Graphics Corporation was
one of the four founding members of PPDM. Landmark will also implement the PPDM data model
as the basis of their Open Works product. While the majority of the members of PPDM come
from the vendor community, providing data and other services utilising the model, some oil
co. members implement the PPDM model themselves as a corporate data store.
Pragmatic
PPDM's approach has always been a pragmatic one; the objective is to
supply an implementable data model using today's technology. Thus the PPDM deliverable
comes in the form of a suite of Oracle Data Description Language (DDL) scripts. These can
be run on any Oracle installation to initiate the model. Populating it is up to the user.
It is clear that this approach is less than politically correct in these days of open
systems. It might seem a shame that a "standard" is so closely coupled with a
commercial product (although there are rumoured to be some Sybase implementations of PPDM
around), but this is a reflection of Oracle's dominance in the E&P sector. In this
context, POSC is in practice as coupled to Oracle as PPDM, with their relational
implementations delivered too in Oracle DDLs.
Prime
Those of you who would like to have an overview of the PPDM data model
together with a reasonably dispassionate presentation of the business case for PPDM would
be well advised to sign up for the one-day seminar held regularly by Prime Geomatics.
This outlines the history of PPDM, runs through the subject areas and perhaps most
importantly, distinguishes the areas of the model in active use from those under
construction. Such considerations are important for implementers since implementing a part
of the model which is already in widespread use will be very different from doing so with
a frontier area.
Plug and tweak
PPDMology is an important science if only for the reason that the PPDM data model is more mature than POSC's, has been more widely implemented and can therefore tell us a thing or two about the likely future evolution of POSC itself. Firstly, PPDM implementations do not allow for plug and play. Finder "runs on" PPDM, as will Open Works. But they both have their own internal implementations which are to all intents and purposes proprietary. That is to say that the frequently presented diagram showing a central data repository, with applications running off it as spokes, is simply not how things actually work. Real-world implementations of PPDM (in the end-user oil co. community) are essentially stand-alone data repositories. Data may be pumped out from them to applications, which may themselves have a PPDM data model at their core, but it will be a separate, different implementation of the model. Implementing a PPDM model - again as an end-user - is not for the faint-hearted, as is shown by the type of company which has adopted this approach. These are generally either majors, large subs or state companies with significant IT resources.
"based-on"
The essence of the problem here is that, contrary to the marketing spiel, PPDM (and incidentally POSC) are not "buy not build" solutions. They are nearer to being "build not buy" solutions (given the rather low price for POSC's SIP V2.1 mentioned elsewhere in this edition, you will forgive me this poetic license). You acquire a data model for a very small price, then you build it into a solution, whether this be a vendor's data delivery tool or an Oil Co.'s full-blown corporate data repository. During the building process the model is customized for local data flavors, data entry tools are developed, formatters and re-formatters are plugged in to the model for export to a particular combination of applications and to satisfy a particular type of workflow. Queries, indexes, triggers and stored procedures are designed and implemented and, after a while and a not inconsiderable investment, you have a working solution. This can then be said to have been "based on" PPDM or POSC, but will also have been based on a particular version of one of these.
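A tiny sketch of the kind of plumbing this building process involves is given below. The well header fields, the column mapping and the WELL table are invented for illustration and do not come from the PPDM or POSC models; the point is only that every site ends up writing this sort of glue around whichever model version it adopted.

```python
# A minimal sketch of the kind of "plumbing" described above: a home-grown
# formatter that maps a vendor well header into a hypothetical PPDM-style
# WELL table. The table and column names are invented for illustration.

well_header = {"uwi": "100/01-02-345-06W5/0", "name": "DISCOVERY 1-2", "kb_elev_m": 712.4}

COLUMN_MAP = {          # local customisation: vendor field -> corporate column
    "uwi": "UWI",
    "name": "WELL_NAME",
    "kb_elev_m": "KB_ELEVATION",
}

def to_insert(header: dict, table: str = "WELL") -> str:
    """Build the INSERT statement a site-specific loader might generate."""
    cols = ", ".join(COLUMN_MAP[k] for k in header)
    vals = ", ".join(f"'{v}'" if isinstance(v, str) else str(v) for v in header.values())
    return f"INSERT INTO {table} ({cols}) VALUES ({vals});"

print(to_insert(well_header))
```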
antiquated
Which leads on to another complication which is clear from our elementary study of PPDMology. Without wanting to embarrass anyone, all commercial implementations of both PPDM and POSC are, by their nature, based on more or less antiquated versions of their respective standards. Porting them to the next version is a costly and painful exercise which is infrequently performed. So commercial implementations of PPDM today are variously based on versions 1.0, 2.3, 3.2 and 3.3. In the POSC camp, the first version of Epicentre was published in September 1993, version 2.0 followed in October 1995, and 2.1 has just been issued. So real-world implementations of Epicentre will, in all probability, come in different flavors, with the same issues of compatibility and cost of change.
Ad-infinitum
Another field where PPDMology is instructive is in model scope. Get a group of data modellers together and what do they do? Model! It is tempting for all IT professionals to go on developing ad-infinitum rather than deploying. This situation is actually encouraged by the structure of POSC and PPDM. Rather than making sure that a sector of the model is well specified, debugged and in service, it is much more interesting to move on to greener pastures. This is a difficult issue, but broadening the scope of a data model is a double edged sword. It may make the understanding of the business easier for the E&P data modeller, but when you are beginning to invade ground that is already well-trodden by accountants with big iron products like SAP, then you had better watch out!
Buy or build?
In conclusion I would like to open up the debate with a statement and a
question. The statement: neither POSC nor PPDM is a "buy not build" solution, nor do they promote interoperability in the sense of "plug and play". The question: what are they really for? What honest statements can be made about their real usefulness? This is a leading question, but I can think of a few answers that are not entirely negative. What do you all think? Answers please to PDM, we will
Enhanced oil recovery software licensed from Columbia University (November 1996)
Western Geophysical and Columbia University are working together to market software technology to locate untapped oil and gas reserves, allowing more production from existing wells.
The technology, consisting of patented Lamont 4D software, will be
available from Western Geophysical as part of a service package or as a licensed software
product. The agreement grants the company the exclusive worldwide license to this
technology for the analysis of time-lapse seismic data to monitor fluid changes in
reservoirs. By monitoring how and where these fluids are bypassed or blocked, oil
companies will be able to design drilling strategies to extract oil and gas that are
usually left behind. Scientists at Columbia University's Lamont-Doherty Earth Observatory
in Palisades, New York developed the Lamont 4D software in conjunction with a consortium
of seven oil companies. It has been tested in 15 oilfields in the North Sea and the Gulf
of Mexico and is currently being used in almost half the active 4D projects worldwide.
Denby Auble, Western Geophysical's senior vice president of technology, is quoted as
saying Western Geophysical expects "4D seismic surveys and associated interpretation
software to be essential ingredients for ensuring the quality and efficiency of E&P
Dowell and GeoQuest on well optimization (November 1996)
A CADE Office suite (Computer Aided Design, Execution and Evaluation) is being launched by Schlumberger Dowell, marking a first venture into commercial software in association with GeoQuest.
For use on PCs, the suite is designed for oil and gas producers to
create and optimize in-house design projects and to update or modify designs supplied by
Dowell with each job proposal. Two of the six CADE Office products, already used daily by
Dowell engineers, are now being made commercially available. These are the StimCADE matrix
acidizing software and the FracCADE hydraulic fracturing software. The four other products
to be released over the next year include the CoilCADE, CemCADE, PacCADE and MudCADE
software for coiled tubing, cementing, sand control and drilling fluids applications,
respectively. Claude Vercaemer, vice president, Dowell marketing and product development, said: "We believe that placing our technology in the hands of clients will improve our ability to work together towards the optimisation of each well. It is a natural progression of our business as a leading technology provider in well construction, well
IT's official! Report says IT plays key role in E&P! (November 1996)
A new report from Cambridge Energy Research Associates (CERA), 'Quiet Revolution: Information Technology and the Reshaping of the Oil and Gas Business', tells us quite a lot about what we knew already.
The revolution in IT has helped to keep the E&P industry in business despite persistently low oil prices. This extravagantly priced volume ($2,500 for the first copy, $100 for further copies for internal use, obtainable from Cambridge Energy Research Associates, LP, 20 University Road, Cambridge, Massachusetts 02138, USA) does have the virtue of spelling out the importance and the challenge of IT to convince doubting management. PDM welcomes its publication and reproduces extracts (some meaty, some less so) from the Executive Summary. The CERA report lists 15 findings, as follows.
The challenges for management
IMPACT
IT has made an extremely powerful impact on costs and efficiency in the E&P business. Prospects can be found and reservoirs produced that would otherwise be uneconomic or invisible. Fields can be developed more cheaply and much faster and teamwork has been reinforced. At the same time, IT also allows an E&P company to function with far fewer people - whether its activities are in-house or outsourced.
CHANGE AND ITS PRICE
Most companies continue to maintain a range of different vintages of hardware and software. Many have not yet completely abandoned their mainframes or their smaller successors, especially for non-technical applications. They are increasingly skeptical about the arguments for undertaking further investments in IT; the price of change seems more visible than the benefits from making it.
ROLE OF SENIOR MANAGEMENT
Whether a company writes its own software, adapts it or lives with what comes off the shelf, senior management involvement is seen as a key ingredient of a successful IT project. This is not just upward delegation in case things go wrong. Designing and managing an IT project requires judgements about how the business is going to be conducted. Unfortunately, IT projects are hard to relate to business objectives and senior management is often not close to the IT issues.
THE GENERATION GAP
Although its existence is much debated, we do believe, based on what we have learned, that there is a generation gap - it would be more surprising were there not. New entrants to the workforce tend to be more adept and more flexible in their use of IT. They are less prone to suffer from 'digital anxiety'. Yet, there is more to managing IT than familiarity and dexterity. Hence, we do not expect that this IT 'problem' will be solved only by the passage of time, as the next generation of managers enters the executive suite.
FAST FOLLOWER?
The dominant IT strategy in the E&P industry is fast follower. Since internal IT budgets have already been passed to operating companies as 'the IT customer', innovations are being concentrated on immediate problem solving rather than on more fundamental developments. IT spending continues to rise in the geosciences, where it tends to be regarded as providing the tools of the trade; it is under more pressure in oil field operations, where the potential may now be greater to reduce total costs by spending more on IT.
HOW LONG AND HOW MUCH?
IT projects are famous for running over time and over budget. The current remedy is to keep them short and small. If a project must be large, many companies would start with a pilot. It is particularly difficult for people whose experience lies in one computing environment to implement a project in another; many companies have had the unhappy experience of trying to do so. They now prefer not to write their own software since it is also difficult to keep innovations exclusive for long enough to get a compensating proprietary advantage. However, companies usually cannot buy suitable off-the-shelf products without some degree of troublesome custom tailoring. This is a major source of friction with IT suppliers.
THE PACE
The second major source of friction with IT suppliers concerns the pace of change. Some E&P companies believe that they are too often pressed into unnecessary upgrades. IT suppliers would rather continue to innovate than to lose traction by supporting an ever growing list of their customers' legacy systems. In addition, the customers often send their suppliers mixed signals - specialist users tend to have most contact with suppliers and are most eager for new refinements ("like children in the candy shop", as they were unkindly described by one executive).
Quote "The residual value of IT after three years is zero. We need
to think of IT as a consumable, like drilling mud. That is the paradigm shift."
CERA's Five key technologies (November 1996)
The CERA study highlights five key technologies most likely to make an impact on the industry's future.
THE WORLD WIDE WEB
Developed by particle physicists in Switzerland to make the Internet easier to use, the Web combines text, graphics, sound and video on a single screen. It is growing fast, more users and more applications every day, and stimulating big changes in the ways people communicate and organize their activities.
PHOTONICS
Also called optical electronics, it means using light (photons instead of electrons) to store, process, manipulate and transmit information. Televisions will act like computers and computers will act like televisions, while both of them act like telephones. Using superconductors and fiber-optic cables, photonics will allow more data to travel across the Web.
DATA WAREHOUSES
They will store all that digital data and make it available quickly and reliably for analysis by whoever has rights to use it. Data will become the base commodity of the 21st century, as cereals were in the 19th and oil in the 20th centuries. Competitive advantage will pass to those with the best access to data and the best tools for its analysis.
OBJECT TECHNOLOGY
Without new tools to handle it, warehouses would simply result in an avalanche of data. Artificial intelligence (AI) - which requires search engines that use fuzzy logic and neural nets that try to imitate an animate brain - is needed for quick and accurate summaries of the information patterns contained in all that data. Success with AI will be aided by object technology to allow computers to store and access data in the right way.
VIRTUAL REALITY
Through the use of robotics, virtual reality could provide subsurface
sight and sense to technicians above ground. If combined with extreme miniaturization
(called nano-technology), robots can become microscopic; AI would allow them to become
decision makers as well. Taken to the limit, and combining all three technologies, robots
could patrol the pores of an oil or gas reservoir, monitor how the hydrocarbons are
flowing, decide how to maximize recovery and dictate to other robots in the wells which
IT goes offshore - BHP's Liverpool Bay Field (November 1996)
BHP Petroleum explains how it has marshalled its IT resources to facilitate the operation of the Liverpool Bay field off the north-west coast of England.
An oil company operating offshore has to balance a number of
production, commercial and financial factors in developing an oil and gas field. They can
be summarized as producing at a cost commensurate with earnings at the required level of
return on the investment at the market price for gas and oil, but without compromising
safety in any way. In achieving this, IT has a vital role to play by making a viable
contribution to this business objective. For BHP Petroleum, developer of the Liverpool Bay Field, IT strategy has been driven by the need to maximize the use of IT across
the whole range of operations in the Field. Everything including production,
instrumentation, maintenance, finance, E-mail, condition monitoring, even flight planning,
is computerized. The result is up to 20 MIS software applications ranging from Logica's
PRODIS production reporting system and Salem's personnel tracking system, to the Brown
& Root OMNIDOC electronic document system and Microsoft's Office suite. There is
nothing especially innovative in the use of these packages by themselves. What is
different is the way the field's computing resources have been deployed. Wherever possible
they are located offshore where they are needed, with their supervision based onshore.
Reducing risks and costs
This meets one of BHP's prime aims, which is to keep the number of
people working offshore to a minimum level as a way of reducing safety risks and costs.
There are fewer than 100 people actually working in the field, none of whom are IT staff,
with the result that three of its four platforms are unmanned. The key elements in
achieving this are a 70 mile undersea FDDI Backbone Local Area Network which, in
conjunction with microwave line of sight links, connects all the platforms with BHP's
Point-of-Air gas terminal in North Wales and an oil storage installation moored in the
Irish Sea; and the use of remote management and monitoring techniques to ensure the
network, the computers attached to it, and of course the end users keep working.
Microwave links
The LAN operates at a full 100 Mbps and consists of mono-mode fiber cable bound with the electricity cables and laid underwater between the platforms. Microwave links provide the connection between the manned Douglas platform and the oil barge and the Point-of-Air terminal. Through a military-specified optical connector, an offshore support vessel can plug into the LAN at any of the platforms at which it docks and, in the event of having to pull away, break the connection without damaging the optical fibers. Access One hubs from UB Networks are installed on every platform and at the gas terminal, with SUN and Hewlett-Packard UNIX servers on the Douglas platform and at Point-of-Air. Through Ethernet switching, 10 Mbps is delivered via Category 5 cabling to the end user Compaq PCs. Up to 80 of these are installed, 30 of them offshore including one on the support vessel. On the Douglas platform and at the Point-of-Air they are connected to Novell LANs running on Compaq PC servers.
Infrastructure
The networking infrastructure ensures that each of the management information systems can be accessed by all relevant personnel, irrespective of whether they are stationed offshore or onshore. It means, for example, that the support vessel, once it is plugged into the LAN, can call up the maintenance documents held on the OMNIDOC system if it encounters a faulty piece of equipment on one of the unmanned platforms. Indeed, the need to store maintenance and engineering documentation electronically and access it instantly was a major factor behind the use of high speed communications. When the network and IT facilities were designed two years ago, there was no precedent for them. They are still state of the art and only now are similar installations being built for the North Sea.
Remote management
Given its importance in the field's operations, it is vital that the network and computing resources are continually monitored and managed to ensure they are always available to users. From the outset, BHP decided this should be outsourced. To BHP this is an overhead requiring dedicated equipment and a wide range of expansive technical skills, their expertise lying in oil and gas discovery and production. They believe that IT services can be more cost effectively provided by an outside specialist organization. In a contract worth £1.3m over three years, and believed to be the first of its kind in the UK, BHP has appointed IT Services Group, Wakebourne, to undertake the management of its IT infrastructure. "BHP demanded a total solution for managing the network, the devices attached to it and the applications running over it" explains Wakebourne's marketing manager, Colin Williams, "and given BHP's policy of limiting offshore manning levels it had to be undertaken remotely. Our ultimate aim is to keep end users working by identifying and remedying faults and solving any problems they encounter on their computers."
Impossible?
"At first it was not clear whether this was possible" he adds, "but in a series of trials on a pre-commissioned network we demonstrated we could provide such a service with offshore visits few and restricted to the replacement of faulty parts." The service is all embracing, involving network management and control; PC applications and operating system support; PC network and UNIX server administration and support; SUN/Oracle and NT/Oracle database application support, third party hardware maintenance and the provision of a dedicated Help Desk. The ultimate aim is to keep end users working by isolating and remedying faults and solving any problems they encounter on their computers. The service is run from Wakebourne's network management center in Hanworth and systems management center in Coventry, both of which are connected to the BHP network of Point-of-Air by Kilostream links. The former is responsible for the network, the PCs and their applications, while Coventry is concerned with the UNIX environment consisting of the servers, applications and databases.
heterogeneous
Given the diverse nature of the IT elements involved, it is not surprising that a diverse range of management tools is employed by Wakebourne. Providing the base SNMP functionality for the network components (hubs and routers) and their attached devices such as servers is HP OpenView, with the detailed monitoring, e.g. front views of the hubs and the state of LEDs on each module, being undertaken by UB Networks' NetDirector management platform. The Novell environment is managed by Novell's ManageWise solution augmented by Intel's LANDesk Manager. Part of Wakebourne's brief is to monitor the network performance and usage to determine whether changes should be made to increase capacity or reconfigure it. To obtain the more detailed statistics needed to meet this requirement, AXON RMON probes are deployed. In real time these look at traffic and user activity such as packet rates, the top ten users, traffic bottlenecks and error rates, correlating them to individual users rather than just the network in general. Such information reveals what is causing a problem, e.g. is it collisions, excess traffic, particular users or applications, and decisions can be taken on whether, for example, to add new network segments.
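As a schematic illustration of the kind of per-user correlation described above (plain Python, not the RMON probe or NetDirector tooling, and with invented traffic figures):

```python
# Schematic sketch of threshold-based traffic correlation. All figures are
# invented; real deployments would pull these counters from RMON probes.
samples = [
    {"user": "geo01", "packets": 42_000, "errors": 12},
    {"user": "geo02", "packets": 310_000, "errors": 5},
    {"user": "ops03", "packets": 18_000, "errors": 940},
]

top_talkers = sorted(samples, key=lambda s: s["packets"], reverse=True)[:10]
suspect = [s["user"] for s in samples if s["errors"] / s["packets"] > 0.01]

print("Top talkers:", [s["user"] for s in top_talkers])
print("High error rates:", suspect)   # candidates for a closer look
```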
User-related problems
All faults are reported to Wakebourne's Coventry Help Desk, which determines which of the two management centers has responsibility for the fault. The fault is logged into the Remedy Action Request system, which presents the relevant Help Desk personnel with details of the location of the user's system. The majority of problems are not, however, associated with the hardware or the network operation. Most are user-related, typically concerned with not being able to log on or with difficulties with an application or accessing a database. The most important aspects of the management scenario are, therefore, the Help Desk, the ability to take over users' terminals and the ability to drill down into the UNIX databases to identify problems with them. By remotely capturing users' PC screens and keyboards through Intel's LANDesk Manager from its management centers, Wakebourne can identify the problem, correct it or, through the Help Desk, advise users as to what they are doing wrong. The large databases inherent in BHP's UNIX programs require extensive administration and housekeeping, including storage issues, usage information, database tuning and capacity planning.
Continual refinement
Wakebourne's solution is to use Compuware's EcoTOOLS to monitor the
databases and diagnose database problems. Agents sent into the database monitor data for
events, changing values, patterns and trends against pre-defined criteria. From this they
can detect problems such as which database areas are filling up or which applications'
processes are heavy processor users and then alert the Help Desk. It is an operation of
continual refinement in which the personality of the database is built up and fine tuned,
allowing the administrator to become closer and closer to the nature of the particular
application and its performance. A general view is also kept to trap unexpected, i.e. not
previously experienced, performance problems. EcoTOOLS provide very specific information
about what is causing a problem: e.g. if the end user reports that the database is running slow, the system will discover whether the cause is a user or the application. If
necessary, EcoTOOLS will also undertake the appropriate corrective action. The combination
of BHP's IT strategy and Wakebourne's remote management capabilities and tools has
produced a viable IT solution which provides definable business and operational benefits.
Without it, the development and exploitation of the Liverpool Bay Field may not have proved viable.
SIP V2.1 Available (November 1996)
POSC announces the availability of the Software Integration Platform Version 2.1. This is supplied to interested parties at the bargain price of $25.00 US on a two CD-ROM set. The POSC SIP comprises the Epicentre Data Model, the Data Access and Exchange Format and the E&P User Interface Style Guide. More information from the Petrotechnical Open Software Corporation.
Geoquest announces new software and quality compliance (November 1996)
Geoquest has just announced a new product, Welltest 200, based on the Eclipse reservoir simulator acquired from ECL.
The product allows for analysis of well tests in complex geological situations and well geometries, and with multi-phase fluid flow effects such as water coning. The new application allows users to validate their raw well test data, conduct conventional analytical analysis and interactively prepare a numerical model. The complexity of real-life reservoirs is addressed by the use of perpendicular bisection (PEBI) gridding techniques (whatever they are!). Welltest 200 runs on PCs and Unix.
Quality
Another announcement from Geoquest concerns their recent qualification
for ISO 9001 Registration for their Software Support, Software Commercialization and
Training Departments. The Hardware Commercialization and Staging Departments were
qualified back in 1995. The ISO 9001 Quality System Standard encompasses the design,
development, production, installation and servicing of products and services, including
computer hardware and software. Stan Wedel, GeoQuest Vice President of software product
commercialization and support, claims that their customers will "appreciate the
changes made to advance the testing and delivery of new software and the increasing speed
and consistency of our support". Of course the trouble with "Quality" is
that once you have broken down your business into a set of procedures that can be followed
by a monkey, the next step is to go out and hire some monkeys. We'd like to hear from any
RODE format endorsed by SEG (November 1996)
A new method of storing any data type, but typically seismics, on any media has been officially given the stamp of approval by the SEG.
The September-October issue of Geophysics, the journal of the Society of Exploration Geophysicists (SEG), includes the specifications for the new Record Oriented
Data Encapsulation (RODE) format (1). This is intended to allow for the copying, or remastering,
of old (legacy) data recorded in obsolete formats onto the new High Density Media (HDM),
such as D2, D3 and NTR. RODE will provide "a standard way of migrating old data into
the new environment and allows new data to be recorded efficiently on new media using
existing formats". The essential problem is that many of the older formats were
reliant on now obsolete recording techniques, sometimes hardware dependent, such as very
long tape blocks. While it is possible to reformat all these older formats to a more
modern format such as SEG-Y, such a reformat adds a considerable overhead to the
remastering process. Furthermore, there is always the chance that there will be data loss
or corruption in a reformatting operation, so the conventional wisdom is to encapsulate the old data in its existing format.
Well logging ancestry
The encapsulation process is designed simply to allow a future reader of the HDM to recover the legacy data in its native format onto a disk file. This also means that such data can be processed with legacy software, providing of course that such software has been maintained in the meantime! RODE has its ancestry, not in seismic acquisition, but in well logging. Schlumberger's LIS and DLIS formats form the basis for the American Petroleum Institute's (API) Recommended Practice 66 (RP66) format (2) for the digital exchange of well data. RP66 was intended to be a highly generalized format offering "a major opportunity to have just one standard exchange format for all oilfield data". However, the first version of RP66 was deemed to be too well-log specific, and it is version 2 of RP66 that has been used as the basis of RODE.
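The encapsulation idea itself is simple enough to sketch. The snippet below is a toy illustration only - it is not the RODE or RP66 record layout - but it shows the principle of wrapping a legacy file's bytes in length-prefixed records behind a small descriptive header, so that the original can later be restored bit-for-bit whatever format it was written in.

```python
# Toy illustration of record-oriented encapsulation. This is NOT the RODE /
# RP66 specification, just the general idea: wrap the legacy bytes in
# length-prefixed records with a small header so the original file can be
# recovered unchanged later, on whatever medium the records end up on.
import json, struct

RECORD_PAYLOAD = 32768  # bytes per encapsulation record (arbitrary choice)

def encapsulate(legacy_path: str, out_path: str, metadata: dict) -> None:
    """Write a header record describing the legacy file, then its raw bytes."""
    with open(legacy_path, "rb") as src, open(out_path, "wb") as dst:
        header = json.dumps(metadata).encode("ascii")
        dst.write(struct.pack(">I", len(header)) + header)
        while chunk := src.read(RECORD_PAYLOAD):
            dst.write(struct.pack(">I", len(chunk)) + chunk)

def recover(in_path: str, restored_path: str) -> dict:
    """Read the records back and restore the legacy file unchanged."""
    with open(in_path, "rb") as src, open(restored_path, "wb") as dst:
        (hlen,) = struct.unpack(">I", src.read(4))
        metadata = json.loads(src.read(hlen))
        while size_bytes := src.read(4):
            (size,) = struct.unpack(">I", size_bytes)
            dst.write(src.read(size))
    return metadata

# e.g. encapsulate("line42.segy", "line42.enc", {"format": "SEG-Y", "reel": 42})
```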
Hard to read!
Eric Booth of RODE Consultants, co-author of the Geophysics paper, presented the new standard at the PESGB Data Management Seminar. Booth described RODE as having a generic grammar and data dictionary allowing for self-defining data, easy handling of arrays and simple rules for the incorporation of textual information. According to Booth, RODE is easy to write, but difficult to read because of its flexibility. RODE is important as it is already in use in major transcription projects. It is also being taken up by POSC and Geoshare as a data exchange format. As RODE use increases it may eventually be used directly in data recording. The overhead of encapsulation may be outweighed by the facility it offers of, for example, encapsulating raw data together with textual information or even scanned images pertaining to the project, bundling them all into one file which should be future-proofed with respect to media and file system changes.
(1) Geophysics September-October 1996, Vol. 61, No 5 page 1546
(2) Recommended Practices for Exploration and Production Digital Data
Landmark and Chevron join Midland Valley in modeling project (November 1996)
Midland Valley, the specialist in 3D structural modeling and validation, is working on a project with Landmark and Chevron to develop new work-flow methodologies for structural interpretation and validation of 3D seismic data, integrated onto a common workstation.
The new approach will allow the interpreter to move freely within both
the seismic data cube and the evolving 3D structure model, enabling rapid identification
and resolution of problematic areas concurrently with interpretation, according to Midland
Valley. The project includes three key components: work-flow methodologies, software
integration and a structural interpretation which will provide a test bed and technology
transfer vehicle. Data is being provided by Chevron, as operators of the Cabinda
Association, with partners Sonangol, Agip and Elf. The data covers an area of complex
extension and salt related deformation and will serve to fully test the technical and
commercial impact of the project deliverables. At a software level, Midland Valley and
Landmark are developing a very tight integration of 3DMove within Landmark's SeisWorks and
EarthCube environment. This builds on the strategic partnership announced with Landmark
last May. It is hoped that the new software will provide dynamic data links and crucial
co-cursor tracking to make interpretation, modeling, restoration and validation a seamless
process for mainstream interpreters.