It was a takeover waiting to happen, but who would have guessed that Halliburton Company would end up buying Landmark Graphics Corporation in a stock transaction valued at $557 million? The announcement made on 1 July also outlined plans for Halliburton, Landmark and EDS, the massive information services company founded by US presidential pretender Ross Perot, to join forces in an alliance to develop ‘a worldwide distributed data management capability that integrates all information associated with the oilfield lifecycle.’ The financial terms of this arrangement were not disclosed.
Obvious irony
The obvious irony of the deal is that Halliburton is actually buying back some of the business it sold two years ago. At that time Halliburton Geophysical Services, the struggling giant of the geophysical industry, was sold for a song to Western Atlas. In a recent rationalization, Western agreed the sale of its software products division to Landmark, including some products originating from the HGS stable. This is why some observers are intrigued by the logic of the Halliburton move, particularly as the company seemed to have backed off involvement in the exploration side of the business. One PDM contact suggested that in order to stay in competition with its chief rival Schlumberger, Halliburton needed the Landmark capability in the geosciences. An upbeat statement from Dick Cheney, Halliburton’s chairman, president and chief executive officer, said ‘The global petroleum industry is increasingly seeking service partners who not only deliver solutions for today but also have the insight and vision to anticipate future needs.
'The acquisition of Landmark is strategic for Halliburton and will enable our combined businesses to deliver an increasing array of solutions to address the needs of customers while providing added value to our shareholders.'
Cheney said that Halliburton will operate Landmark as a separate subsidiary ‘to ensure that it continues to provide innovative software and services to all segments of the industry as well as forming alliances with other companies.’ He saw the EDS alliance providing the potential for ‘an unprecedented linkage of information between oil field locations and the offices of our customers.’ The official line on the alliance is that it will ‘create an information environment that will automate and integrate petroleum exploration and production from energy company offices throughout their oilfields.
Transition strategy
This scalable environment will have the potential to encompass applications, workflows, processes and data from Halliburton, Landmark and EDS. It will be based on industry standards and open to any software supplier, service company or energy company for widespread adoption.'
The acquisition of Landmark comes at a time when the company has been pursuing a strategy of transition from software products seller to provider of ‘integrated solutions’, including services such as customer support, training, maintenance and consulting.
This concept had been made possible partly through the rapid expansion of its product range, thanks to key acquisitions such as Western Atlas Software, GeoGraphix, Stratamodel, Munro Garrett, Advance Geophysical and Zycor. However, Landmark still had some way to go with its integrated solutions, aimed especially at the new asset team approach being adopted by oil companies. In an analysis of Landmark in June, equity research company Morgan Keegan predicted a possibly bumpy road ahead in the short term, emphasizing management as the critical factor in a successful transition.
Panther has already proved itself in the area with its own SDMS seismic
data management system which enables the accessing, moving and exporting of various
formats of seismic data at the desktop. In the IBM application the aim is to streamline
the access of seismic data requested from a PetroBank archive to the client’s desktop
applications such as SeisWorks, IESX, Charisma and SeisX.
The field of data management has sprung from nowhere to become the most
talked about topic in the E&P computing field. Two phenomena have thrust this topic
into the limelight. On the one hand, accumulating legacy data has reached the state where
something just has to be done: old tapes need remastering, interpretations are rarely
correctly archived, and paper data, which may still contain mission-critical information, is often effectively lost to posterity. On the other hand, new data is arriving at an ever increasing rate and personnel levels are down following industry-wide restructuring.
Standards
The resulting squeeze on geoscientists and IT professionals has led to
a profusion of offerings from the standards organizations and the vendor community. This
is particularly true of the "dash for standards", which are often announced well before they have been effectively established, and may be used by vendors as a marketing tool rather than as a real attempt at collaborative data exchange. It is quite legitimate to doubt the extent to which some vendors are really trying to improve interoperability, in that cross-platform or cross-vendor data exchange may have unpredictable side effects on market share elsewhere in the product
line. An example of this is the difficulty in swapping seismic data between
the two market leading interpretation platforms, and this despite some five years of
effort from standards organizations set up to solve exactly this sort of problem. The PDM will cover the fields of geology, geophysics, reservoir
engineering and IT in general and standards organizations both within the E&P
community and outside. We take a very broad view of what "data" means; from bits
and bytes through the whole gamut of definitions to include knowledge and even, as some
would have it, "wisdom". New technologies are arriving at an alarming rate and must be analyzed
from more than just the technical aspect. Currently the Internet is touted as a major
revolution in IT and, while this is undoubtedly so, its take-up by E&P companies has
been slow, not least because of human resources implications. In our information- and security-conscious business, many are reluctant to provide enterprise-wide World Wide Web access to their staff even when the basic
anti-intrusion firewall technology has been put into place. Inevitably some new
technologies will look like old technologies rather quickly, with sometimes unfortunate
consequences.
Scope
The seismic business is currently suffering from this phenomenon as it
attempts to come to terms with decades of legacy data recorded on different physical
media, with a plethora of formats ("standard" and otherwise) and with a highly
variable degree of consistency of reporting and quality control. We will look outside of the Petroleum business to see what is happening
in other industries, to analyze the importance of such developments and to try to help our
readers "bet" on winners, or at least to hedge their bets intelligently. Currently most of the ‘high value" information in an oil company
is not in the corporate data base. The decision maker is more likely to need to retrieve a
Word for Windows document than a seismic trace. Technology for managing this type of
information is developing outside of the E&P community. Products such as Microsoft
Office, Lotus Notes and a variety of document management tools are available now, and we
will be analyzing their suitability for use in our industry.
Interpretation systems
We will therefore be following developments in workstation applications
particularly as interpretation systems evolve to cater for more and more esoteric
functionality, much of which may be at odds with mundane considerations of data management. Finally, technical choices cannot be divorced from business
considerations both of vendors and companies. In this high-tech field, commercial
stability is as important as technical excellence. We will therefore be looking behind the scenes, at vendors’ commercial
successes and failures, and at the people working in this dynamic and complex field. As
oil companies adapt to a changing world and increasingly integrate expenditure on
intangibles such as data management into their business plans, we will be discussing
current thinking in business process re-engineering and the attempts to analyze the true
costs and benefits of this activity.
The idea, which Larry Ellison, president of Oracle Corp, has dreamed up to wage battle with Bill Gates, is that instead of having "expansive" software on a "complicated" hard disk, the Network Computer becomes an Internet terminal downloading applets (such as a slimmed-down word processor or even an upgraded operating system).
Oracle databases
This machine is to be sold for something like $500, and will make
intensive use of ... Oracle databases over the net. The machine’s specs as outlined in the
NC Reference Profile - 1 are the following. "The hardware guidelines cover a minimum
screen resolution of 640 x 480 (VGA) or equivalent, a pointing device (mouse or track
ball), text input capabilities and audio output. The agreed-upon Internet protocols are Transmission Control Protocol (TCP), File Transfer Protocol (FTP), optional support of NFS to enable low-cost, media-less devices while allowing for persistent storage in the network, and SNMP, a protocol enabling the distributed management of devices. The profile further adheres to World Wide Web standards HTML, HTTP and the Java Application Environment, as well as to mainstream mail protocols (SMTP, IMAP4, POP3) and common data formats such as JPEG, GIF, WAV and AU. Optional security features are supported through emerging security APIs; security standards are ISO 7816 SmartCards and the EMV (Europay/MasterCard/Visa) specification."
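To make the applet idea concrete, here is a minimal sketch of what an NC-style session might look like; the URL and the whole scenario are hypothetical, and Python stands in for the Java environment named in the profile.

    # Hypothetical sketch only: nothing is installed locally, the terminal fetches
    # the tool it needs over HTTP each time. The URL below is a placeholder.
    import urllib.request

    APPLET_URL = "http://applets.example.com/word_processor.py"

    def fetch_applet(url: str) -> str:
        """Download an applet's source over the network; an NC keeps no local copy."""
        with urllib.request.urlopen(url) as response:
            return response.read().decode("utf-8")

    # source = fetch_applet(APPLET_URL)  # left commented out: the URL is illustrative
    # exec(source)                       # the terminal would then run the downloaded applet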
Business world
A Gartner Group forecast reported recently that Internet Terminals in 2001 could make up 50% of the home market for networked devices but would be unlikely to go
above 20% in the business world. Now there are some aspects of this idea that don't stand
up to real close examination. Today’s PC software is unbelievably cheap and using a modern PC is
becoming simpler with innovations such as Plug and Play hardware, and software Wizard
technology.
Too powerful
The NC pundits have claimed that the modern PC is "too
powerful", and while this sentiment may well be echoed by some IT managers, we have
yet to hear an end-user bemoan his excess of computing power. Also, because the NC
alliance does not actually own the Internet, despite what they might like to think, other
players will likely be able to emulate successful parts of the whole in NC windows for
PCs. The position of Sun and Apple is intriguing in this respect. Will the Internet
applets run on Sparcs and Macs? Although this is unlikely, it could make for some radical
rethinking of software pricing on these platforms.
POSC started out by spurning the relational mainstream of database
technology and was a leading force in the field of Object Oriented databases. Two data
modeling technologies were adopted by POSC in 1993, OpenSql from HP and UniSql from the
company of that name. Neither of these products has exactly set the world alight, and POSC was faced with a dilemma: whether to backtrack from the OO avant-garde or to forge ahead regardless of the cost. Well, no-one can afford to disregard the cost of anything these days, and POSC was forced to look for some concrete examples of the value it had been talking about for so long.
Savior
Temporary savior came in the shape of two major data banking projects
whose clients were sold on the idea of POSC compliance, and who wanted it now. This led to
the emergence of two "relational projections" of the Epicentre data model (one
from IBM, the other from PECC/PetroSystems). These offerings, although technically rather
limited in both their overall scope and in the extent of their actual compliance with
Epicentre, are both available and can be implemented on today’s technology. They can be
said to have saved POSC’s bacon, for now ... A spin-off of the emergence of the "relational projection" of Epicentre as a de facto standard is that most commercial applications of the POSC data
model are implemented on the Oracle database. Since this is also the main platform for
PPDM implementations it seems natural that there should be some attempt to merge the two
products. This should provide benefits to PPDM users, in that their user community should
widen and the model be enhanced, and to the POSC sponsors who to date have not seen a
great return on their investment. Now this is not the first time that such a merger has been attempted.
Late in 1993 a similar project was launched with much trumpeting and it was announced that
a merger of the two models should be achieved by mid-1994. All this turned sour rather
quickly as the PPDM user community refused the changes that were required for even minimal
POSC compliance. What has changed today is the acceptance by POSC management of what are
essentially non-compliant products as compliant. This can best be qualified as a
"wide goalpost" option. So long as a desire to be in the POSC camp is expressed
by a vendor, then the product is welcomed into the fold with terms such as "partial
compliance" or compliance with "the relational projection" of Epicentre. Core data model But this is not the real driving force between the subset project.
Landmark, which has adopted PPDM as the core data model for its product line, and is
therefore in bad odor with its POSCian clients, is pushing POSC towards PPDM and vice
versa. Meanwhile, the POSCians within the POSC sponsor community have given up telling
their management that true POSCness is just one more committee away, and are having to
explain how POSC "compliant" products exist in the marketplace, but not in their
own shops! PPDM has made its position on OO technology clear: it sees the oil
industry as a follower and not a leader. It makes a rather powerful case for this by
pointing out that with less than 5% of the worldwide market, the petroleum industry is
hardly in a position to set standards that will stick outside its own patch.
Why abandon?
The problem with the PPDM Epicentre subset project remains: why should the PPDM user community, who have a successful, pragmatic product, abandon the "if it ain't bust, don't fix it" principle in order to help out a vendor community in a bind with its POSCian clients? The sort of squaring of the circle being attempted within the subset project is a reflection of the main issue confronting the IT
business today. This is the interplay between the established relational data model, the
new technology of client server computing and the object paradigm. These influences are
sometimes complementary, and sometimes divergent. They are also the sort of IT political
footballs that are kicked around with boasts of "standard" adherence,
"openness" and so on. In reality, competing businesses are the driving force behind these
different technologies. We may be moving towards a brave new world of interoperability and
openness, but if the future were to be all Sun and all Oracle, well, that would be just fine for Larry Ellison and Scott McNealy too. In the next few issues of PDM we will be providing a step-by-step explanation of these issues, both in terms of the IT world at large and, of course, in terms of our own backyard of E&P.
First of all, contrary to what one might be led to believe by the
current talk of data modeling coming from both vendors and standards organizations, data
management is not synonymous with data modeling. Nonetheless, data modeling is an
important building block in data management, and current developments, particularly in the
relationship between the Petrotechnical Open Software Corporation (POSC) and the Public
Petroleum Data Model (PPDM) make this an extremely interesting starting point. The reason
for all the attention given to data models today is that they are a way out of the
interoperability impasse described in the previous section. Most applications today run on
pretty much the same sort of hardware. They run on the same Unix operating system and even
frequently use the same database management system. The next step is therefore to use the
same underlying data model.
Simplifying
If all applications know that, say, well deviation data is contained in a database table called WELL-DEV-DATA whose structure is known beforehand, then only one copy of this table need be available to all applications, greatly simplifying data entry, access and maintenance. Now the foregoing is a simplified account of the situation prevailing about seven or eight years ago, when a number of different organizations began to address the problem.
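As a minimal sketch of the idea (the table layout and column names below are our own invention, not PPDM or POSC definitions, and SQLite stands in for the Oracle, DB2 or Sybase systems of the day), two applications agreeing on a single well deviation table might share data like this:

    import sqlite3

    # One shared table of well deviation data, written once and read by every
    # application, instead of each application keeping its own copy and format.
    conn = sqlite3.connect("project.db")
    conn.execute("""CREATE TABLE IF NOT EXISTS WELL_DEV_DATA (
                        well_id     TEXT,   -- well identifier
                        md          REAL,   -- measured depth (m)
                        inclination REAL,   -- degrees from vertical
                        azimuth     REAL    -- degrees from north
                    )""")
    conn.execute("INSERT INTO WELL_DEV_DATA VALUES ('30/6-17', 1250.0, 12.5, 87.0)")
    conn.commit()
    conn.close()

    def deviation_survey(db_path, well_id):
        """What any application does: query the agreed table, no reformatting step."""
        with sqlite3.connect(db_path) as db:
            return db.execute("SELECT md, inclination, azimuth FROM WELL_DEV_DATA "
                              "WHERE well_id = ?", (well_id,)).fetchall()

    # A mapping package and a geological package would both issue the same query.
    print(deviation_survey("project.db", "30/6-17"))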
Other players
In Canada, PPDM started out with a very straightforward mission: to build a data model running on the mainstream technology of relational databases such as
DB2, ORACLE and SYBASE, and utilizing a standard programming language developed in the
1970s by IBM called Structured Query Language (SQL). More or less simultaneously, a group
of major oil companies in the US and Europe clubbed together to establish POSC which, although sharing the same goals as PPDM, had a more ambitious view of the problem and was prepared to entertain more forward-looking solutions. Others also entered the data model fray: Schlumberger, which put its Geoshare technology into the public domain, and Petroconsultants, which came out with IRIS21, a data model closely coupled with Petroconsultants' "raw material" of land lease and well information. Meanwhile, EXXON and IBM, with
the Mercury product, had been modeling data for some time before POSC and PPDM came on the
scene.
Enthusiasm
Now you will already have spotted the problem with this excess of
enthusiasm for data modeling. Too many data models! In fact it can be broadly stated that
the data modeling effort, rather than reducing the workload of reformatting data from different applications, has actually added to it by introducing a new dimension to the solution: that of "mapping" data from one data model to another. There, very
briefly, is where we are now. We are still pumping data from one format to another via, for instance, SEG-Y. We are also playing a new game: that of finding common ground between different data models. The fruit is still on the tree; we haven't found the ladder yet.
The Discovery project (see separate article) describes a recent attempt to find common
ground between PPDM and POSC.
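To illustrate what "mapping" means in practice, here is a toy sketch; the field names below are invented for the example and do not reproduce the real PPDM or Epicentre schemas.

    # Every additional data model adds another mapping like this to build and maintain.
    MODEL_A_TO_MODEL_B = {                 # hypothetical field-name correspondences
        "WELL_NAME": "well.common_name",
        "SPUD_DATE": "activity.spud.start_date",
        "TOTAL_DEPTH": "well.total_md",
    }

    def map_record(record, mapping):
        """Rename one model's fields into another model's vocabulary."""
        return {mapping[key]: value for key, value in record.items() if key in mapping}

    well = {"WELL_NAME": "30/6-17", "SPUD_DATE": "1995-03-14", "TOTAL_DEPTH": 3120.0}
    print(map_record(well, MODEL_A_TO_MODEL_B))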
The race for data banking continues with the recent announcement of a
new site in Australia. PECC, HP and Storage Technology Corp (STK) are in the process of implementing a data bank for the State of Queensland. The facility, called the Pacific Resource Information Center (PRINCE), is being installed at the Queensland Supercomputing
Laboratories (QSL). Some 70,000 nine-track tapes, written reports, exploration surveys and
rolls of maps will be transcribed onto STK’s high density media. Initial data volumes are estimated at 28TB. "Our goal with PRINCE
is to protect the state’s data asset and to make that asset available and easily
accessible to explorers," said Peter Harvey, business development manager at QSL.
"This is a comprehensive project in that we first have to collect all the data,
archive it and then make it accessible. We view combining HP’s high performance scalable
servers with automated tape libraries from Storage Technology and the sophisticated
map-based browser software from PECC as giving us the best opportunity to quickly bring
the system on line." HP served as the system integrator on the PRINCE project.
Since this is a first edition it is a good place to outline some of the
basic issues involved in modern data management and to set the scene for later, more
technical discussions. At the highest level, data management revolves around where and how
to store corporate data. For a large organization this may involve a multi-tiered data
store with robotic access to Terabytes of data feeding project data stores on servers
accessed by workstation clients. For a small organization this may mean organizing boxes
of Exabyte tapes and Excel spreadsheets. For organizations of all sizes, data management
is about managing heterogeneous data in a multiplicity of digital forms and including vast
amounts of information on paper and even in people’s minds.
Incompatible
As anyone who has worked with E&P data will know, a major time
wasting problem is that of incompatible data formats. Data arrives in the E&P
department on tapes from the logging or seismic contractor in a bewildering number of
"standard" formats, on a variety of physical media and often in a form which
even if it is a recognizable format, may need a considerable effort in editing and
cleaning up of the data before the real work of interpretation. Once the raw data has been
loaded into the workstation, things are by and large all right so long as the interpreter
is working alone, with a single application. Now this is emphatically not the way to do
things in the modern world of asset based interpretation teams. This new way of working
involves experts from different disciplines (and hence using different applications)
working together and sharing data. Since few applications use the same data formats this
involves a new bout of data reformatting gymnastics each time data is traded between
specialists.
Nitty-gritty
Things get even more complicated when the actual nitty-gritty issues of
networks and disks are taken into consideration. Every time data is swapped it is in
effect duplicated. When you consider that modern seismic data sets already push storage
technology and network bandwidth to the limits, the last thing the IT manager needs is multiple copies of the same data. Unfortunately, that is what we have to deal with: a gigabyte coming in quickly becomes five or ten gigabytes as data is traded between applications and
multiple backups of interpretations are made. On the subject of backups we touch on
another critical issue for data managers. No IT professional needs telling about the
necessity to protect a company’s investment in processed data and interpreters’ time. You
would think that if you have gone to the trouble of making copies of the data at regular intervals then you would be able, at some future date, to come back to a particular stage in
the interpretation process. Current technology and practices may in fact mean that much of
the effort spent in backup and archive is in reality time wasted. The modern slimmed down
organization does not encourage the interpreter to spend time on housekeeping and there is
a tendency at the end of a job to rush on to the next project. This means that even if an
archive has been made at the end of a project, if it is restored after some time has
elapsed, perhaps by a different interpreter, it may be in fact useless, containing many
trial interpretations only comprehensible to the original interpreter.
Worse still
Worse still, the backup, if made in a workstation-specific format, may
actually be unusable with the software it was made with, which has since been upgraded to a more recent version. Having stated the problem, we are sure that
we have left out a lot. We've painted a bleak picture of the way things are, but we're sure
we've not exaggerated anything. In fact things are likely to get worse as the volumes of
data involved in E&P go through the roof. A modern seismic boat acquires data at
around 100 megabytes per minute. Tapes are shipped off by helicopter to avoid sinking the
boat, and they are all coming your way ...
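To put that acquisition rate into perspective, here is a back-of-the-envelope calculation; the 100 megabytes per minute figure is from the text, and round-the-clock shooting is our simplifying assumption.

    # Rough arithmetic on the quoted rate of 100 MB per minute, assuming continuous
    # acquisition - real surveys have downtime, so treat these as upper bounds.
    mb_per_minute = 100
    gb_per_day = mb_per_minute * 60 * 24 / 1024        # about 140 GB per day
    tb_per_month = gb_per_day * 30 / 1024              # about 4 TB per month
    print(f"{gb_per_day:.0f} GB/day, {tb_per_month:.1f} TB/month")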
Well, client-server is actually a rather abstract notion; it’s a model
of computing. There are several models of computing: the old mainframe style, standalone
PC, workgroup (file-serving), and, to that list, we've most recently added client-server.
In the mainframe model, as you'll recall, there’s one, and only one, computer. PCs can be
used to interface with the mainframe, but they're reduced to "dumb" (we should
perhaps say "computationally challenged" instead - this is the
politically-correct 90s after all) terminals in the process. The mainframe does everything.
Abstract part
In the client-server model, the workload is split amongst two or more
systems (not necessarily computers). This is where the abstract part comes in. The
client-server computing model is defined by two or more systems in which one of the
systems requests a "service" that is provided by another system. Since the model
is largely implemented in software, the "systems" could quite possibly reside on
the same physical computing device. Nevertheless, usually, there is a physical split: a
desktop client requests services from a central computer (a "server") residing
down the hall, or something like that. What kind of services are we talking about? The
server might provide fax services to a department or even an entire organization. If it
were an especially capable machine, the server might be used as a compute server:
computationally-intensive tasks could be farmed out to this powerful device, leaving the
client to take on less demanding jobs.
Corporate data
In the popular usage of the phrase, however, client-server usually
means client-data-server. In this instance, corporate or departmental data would be
located, processed and "served up" to clients on demand. There’s more to it than
this, though. There’s an implicit requirement in the definition of client-server that the
client add value to the data (if we're talking about data-serving) before presenting it to
the end user. If it weren't for this proviso, file serving would qualify as client-server.
"Added value" most often means presenting the data in a manner more easily
digested by the user. Using a graphical user interface (GUI), like Microsoft’s Windows,
IBM’s OS/2 Presentation Manager or, to a lesser extent (from a market share perspective),
the OSF’s Motif, allows complex data relationships to be quickly and easily understood by
end-users. But quite apart from the technical definition, there’s also a philosophical
aspect to client-server. In the late 1980s, we found ourselves with networked PCs, armed
to the teeth with office productivity tools.
Glass house
Just one thing was missing: access to the corporate data. That was
often (usually) safely locked up on the mainframe. Without a painless way to get at it, it
might as well have been on Mars. Under a banner of "information to the people," client-server proponents now seek to tear down the barriers to the "glass house." Rather than just break through the walls of the glass house, there are some who think we should do away with it altogether. They're the down-sizers. They reason,
"now that we have client-server, why do we need mainframes?" It’s a dangerous
view, however. Client-server and the mainframe style of computing we've known in the past
aren't usually pitted squarely against each other; each has its strengths, each its weaknesses, each its place.
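As a minimal sketch of the request/serve split described above (our own illustration, not from the book, with made-up well names and depths), the server owns the data and the client adds value by presenting the reply to the user:

    import socket
    import threading

    WELLS = {"30/6-17": 3120.0, "16/7-4": 2874.5}   # made-up well depths in metres

    listener = socket.socket()
    listener.bind(("localhost", 0))                  # let the OS pick a free port
    listener.listen(1)
    port = listener.getsockname()[1]

    def serve_one_request():
        """Data server: answers a single 'DEPTH <well_id>' request."""
        conn, _ = listener.accept()
        with conn:
            well_id = conn.recv(1024).decode().split()[-1]
            conn.sendall(str(WELLS.get(well_id, -1.0)).encode())

    server = threading.Thread(target=serve_one_request)
    server.start()

    # Client side: request the raw datum, then add value by formatting it for the user.
    with socket.socket() as client:
        client.connect(("localhost", port))
        client.sendall(b"DEPTH 30/6-17")
        depth = float(client.recv(1024).decode())
    print(f"Well 30/6-17: total depth {depth:,.1f} m")
    server.join()
    listener.close()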
CogniSeis Development, part of the newly formed Tech-Sym Geoscience
group (which also includes Syntron and Photon), and Cray Research are claiming a quantum leap in 3D seismic imaging technology following the companies’ agreement to sell the CogniSeis
Focus software on the CRAY T3E system, the supercomputer capable of more than a trillion
calculations per second. Richard Cooper, new general manager of Cogniseis, said the plan
was to turn 3D pre-stack techniques into mainstream technology for worldwide oil
exploration projects.
Well the World Wide Web has yet to prove itself in terms of doing
anything really serious, but it has to be taken seriously because a) everyone else takes
it so seriously and b) so many claims are made about it that some of them are bound to
come true, even if this is just on a statistical basis. Weaving between the hot air and pious wishes of the WWW superstars (incidentally, what was Tim Berners-Lee actually being paid to do when he invented the WWW? Answers on a postcard please), here are what we consider to be some key points and forecasts:
1. A staggeringly simple and powerful demonstration of the Web's power in an application which many of you will have used already: FedEx's parcel tracking service.
2. The Intranet, which according to the WWW consortium is set to slaughter the client-server computing paradigm.
3. The need for more powerful programming tools and extensions to the HTML language to allow richer information to circulate on the Web.
4. The network will make the local disk and even the concept of an operating system dissolve, as super systems and data can be downloaded from the Internet.
Now we've included the last item because it is an important issue
today, relating obviously to the Network PC described elsewhere in this issue. However our
view is that this is where the WWW, or at least Berners-Lee, starts displaying an excess of testosterone, and that it is going to take a much better idea than this one to knock Bill Gates off his pedestal.
Whatever you do, remember the originals
"When you record on high density media (HDM), do you throw away the originals?" asked Abdulla N. Hamoodi on sci.geo.petroleum, and drew a good answer from Gary Cameron of Union Pacific, who pointed out that tapes are often moved to off-site storage after
transcription - rather than being thrown away, and that with high density media, it is
easy to make and store many versions of the backup. Anyone who has worked with computers
knows that the main reason for losing data is the operator rather than the machine. We
have an interesting analogue here, in that the remastering and compaction operation can
bring both data preservation and, to an extent, data destruction. The latter may be due to
a problem involving real data loss through poor reformatting (which should be picked up fairly quickly by good QC), or a much more serious problem of data loss when an information-rich header is migrated into another environment. It would seem prudent to hang on to the legacy data for a while, until experience has demonstrated the reliability of the information on the new HDMs.
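What "good QC" might look like in practice: a minimal sketch of our own, with synthetic traces standing in for decoded seismic data, which compares the decoded samples of the original and remastered copies rather than their raw bytes, since the container format has changed.

    import numpy as np

    def qc_remastering(original, remastered):
        """Both inputs are decoded traces as (n_traces, n_samples) arrays."""
        if original.shape != remastered.shape:
            print("Trace count or length differs - keep the originals and investigate")
            return False
        max_diff = float(np.max(np.abs(original - remastered)))
        print(f"{original.shape[0]} traces compared, max sample difference {max_diff}")
        return max_diff == 0.0

    # Synthetic example: a perfect transcription should reproduce every sample.
    # Header content needs a separate check; as noted above, header migration is
    # the riskier step.
    old = np.random.default_rng(0).normal(size=(10, 1000)).astype(np.float32)
    qc_remastering(old, old.copy())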
New use for hairspray!
Rex Knepp offered some useful advice on how to use hairspray to clean sticky tapes. We've not tried it, so try this at your own risk. Here's Rex: "If you're asking how to use hairspray, see below. If you are unfamiliar with the word, hairspray is a
compound (in aerosol form) that people use to impart extra stiffness to their hair, mainly
to force it to hold a desired shape. Until recent years, the most common users were women,
however many men now use the stuff (Texas is, BTW, the world’s largest importer of
hairspray!). I believe that the main ingredient was at one time a form of lacquer, though
I rather suspect that the juice of the lac beetle has been replaced by synthetics these
days. Whatever it is, either the substance or the aerosol base is flammable. I suspect
that the use of hairspray would coat a tape with a film of the lacquer (or lacquer-like
substance), stabilizing the tape long enough for a single read. To do that, though, one might be forced to stand there and spray the tape as it spooled out. Sounds tedious..." Yes, and we suggest you don't try smoking while you do this.
Petroconsultants & PPDM recently announced their intention to work
together to ensure that data can be easily transferred between the Iris2l However, our
understanding is that Petroconsultants have had second thoughts about this and that the
development may not actually go ahead in quite such a collaborative manner.
Petroconsultants were to work with PPDM "to extend the PPDM Model to cover new areas
such as acreage and field data in a way that it compatible with Iris2l." This would
have offered an interesting venture, benefiting PPDM users in that the model would have
been improved in what was previously a weak domain. It would also have allowed
Petroconsultants to sell their Foreign Scouting Services as a ‘plug in’ to PPDM members.
It seems likely that the fly in the ointment here is what the implications would have been
for Iris21. If the permit management capabilities of Iris2l were included, even in a
limited way, into the PPDM data model, which can be purchased for and PPDM models. $400,
what future for Iris2l?
The company says its X-windows and Motif package incorporates the
science of rock physics in a graphical application that can work on well log data sets or
at a single point with the unique and convenient "scratchpads". Users doing AVO
analysis, 4D seismic, development geophysics or seismic modeling will find these tools
useful, according to PetroSoft. Unlike standard petrophysics software, PetroTools
apparently lets the user create "what if?" scenarios. For example, what if the
porosity was lower, the clay content was higher or there was hydrocarbon in the zone? The
resulting Vp, Vs and density curves from PetroTools can be exported to most popular AVO
and synthetic seismic packages. PetroTools 2.4 also provides the best available estimates
of shear wave velocity in complex lithologies, with or without hydrocarbons, and supports metric and English (US) units. Version 2.4 has all-new graphics with simple drag-and-drop plotting, graphical data editing and industry-standard CGM or PostScript hard copy output. PetroSoft claims over 120 copies of PetroTools are now in use by over 40 international E&P companies.
CGG recently started the process of acquiring Petroleum Exploration
Computer Consultants (PECC), a joint venture partner involved in data management, and is
promoting the PetroVision Database Management product. PetroVision is currently being
installed under a $1.5 billion World Bank project in Algeria for the state oil company Sonatrach. PetroVision uses an Emass D2 Data Tower with 5.6 terabytes capacity
for storage. The project involves transcription of over 100,000 tapes, and scanning of
paper data together with vectorising of seismic and well logs. An existing well database
is to be ported to the POSC Epicentre V2.0 data model.
The corporate merry-go-round has seen two significant combinations
being formed over the past couple of months. US geophysical service company Digicon has
teamed up with Canadian company Veritas Energy Services to form Veritas DGC, a move mainly
driven by the need to add resource muscle in order to compete with the other global
geoscience industry companies. Meanwhile Sercel, the CGG subsidiary specializing in cable
telemetry seismic acquisition systems, is taking over Opseis, manufacturer of the market
leading Opseis Eagle radio telemetry acquisition system, in what is seen by the industry as a
good fit.
The group, which includes faculty and graduate students, at the Centre
for Wave Phenomena (CWP), uses off the shelf hardware and free software to develop
geophysical algorithms. The algorithms can run on local area networks or be scaled up to
run on multi-million dollar supercomputers. The group’s current parallel workstation
cluster consists of 15 network-connected Pentiums running the GNU/Linux operating system.
Project software can be downloaded at no cost from CWP’s worldwide web site. (See URL
http://www.cwp.mines.edu). The site additionally provides free access to Seismic Unix
(SU), a complete seismic data processing system for Unix workstations that is installed at
approximately 700 sites around the world. Also available is a C++ library of optimization
and numerical analysis software. In addition the site provides links to the Samizdat
press, an Internet archive of free books and lecture notes.
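The CWP codes themselves are not reproduced here; as a minimal sketch of the embarrassingly parallel pattern the article describes, independent shot gathers can be farmed out to a pool of workers (the "processing" step below is a stand-in, not a CWP algorithm):

    from multiprocessing import Pool
    import numpy as np

    def process_gather(gather):
        """Toy per-gather step: remove the mean from every trace."""
        return gather - gather.mean(axis=1, keepdims=True)

    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        gathers = [rng.normal(size=(60, 1500)).astype(np.float32) for _ in range(8)]
        with Pool(processes=4) as pool:      # one worker per CPU, or per networked node
            results = pool.map(process_gather, gathers)
        print(len(results), "gathers processed")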
Using GeoQuest software, CNOOC is focusing on optimizing production of
existing fields. The software products will be installed at four CNOOC offices. CNOOC
geoscientists will use the GeoQuest integrated software for all stages of reservoir
analysis - from multi-survey 2D/3D seismic interpretation, through geologic interpretation
and petrophysical analysis to reservoir modelling, simulation and mapping, enabling CNOOC
to efficiently interpret all data for each field. CNOOC is one of China’s largest national
oil companies. In 1995 the company produced more than 8.4 million tons of oil and
approximately 375 million cubic meters of natural gas.
Name | Location | Hardware | Software | Sponsor | Date
CANOGGIS | Calgary | Sun | QCData | N/A | 1995
CDA | London | Sun | QCData/GeoQuest | DTI | 1996
DISKOS | Stavanger | IBM | IBM/PGS | NPD | 1995
GeoBank UK | London | IBM | ISM/PGS | PGS | 1996
GeoBank USA | Houston | IBM | IBM/PGS | PGS | 1996
MMS | Houston | Sun | QCData | Dept of Interior | 1995
NPIC | Peru | Sun | GeoQuest | Petroperu | 1996
PDVSA | Venezuela | Sun | GeoQuest | PDVSA | 1996
PRINCE | Queensland | HP | PECC | Queensland | 1996
SONATRACH | Algeria | HP/STK | CGG/PECC | SONATRACH | 1996
A keynote address by Lucio Deluchi (AGIP) entitled "Information
technology and data management as factors for E&P competitiveness" set the scene
and was followed by a special lunchtime workshop dedicated to the Discovery project. A
morning’s session was also devoted to papers on E&P data management and all of these
events were extremely well attended, underlining the strong interest in this developing
field today. A meeting of the Geoshare Users’ Group was held on the first day of the
conference and we will be describing this interesting technology in a future edition of
PDM. Deluchi’s thesis is that only a proper understanding of the opportunities offered by
modern day information and communications technology (ICT) will allow managers to
"sustain a competitive advantage for their company in the new information
society." He further stressed the impact of business process reengineering.
"Communication solutions and workflow re-engineering are increasingly based on new
and more efficient information systems and extended standardization. "At the same time, lines dividing business drivers, work
processes, organization structures and information systems are blurring and managers must
be capable of utilizing the new technology to build a new model of ‘networked’ companies.
In particular, they should strive to dismantle the trap of corporate functional silos of
technology, data and information, moving instead towards tighter integration and networked
structures supporting new business processes throughout the lifecycle of E&P assets." All in all, this was a thought-provoking talk, illustrative of the serious way in which many large organizations are approaching radical change.
Panther to serve PetroBank datastore (July 1996)
Canadian company Panther Software has teamed up with IBM Petroleum Industry Solutions to build a network-based petroleum data management server for IBM’s PetroBank datastore.
Editorial - Why data management is the key to industry future (July 1996)
PDM’s editor Neil McNaughton analyses the state-of-the-art in E&P computing and data management and explains why a newsletter is a necessity.
Network Computer challenge to PC market domination (July 1996)
A Sun-Apple-IBM-Oracle-Netscape alliance is in the offing to manufacture and market the new PC-killing diskless Internet workstation, dubbed the Network Computer (NC).
Ellison’s phone give-away fallacy (July 1996)
Ellison’s idea that these things could be given away by phone companies is reminiscent of France Telecom’s strategy with the Minitel, a forerunner of the NC. The Minitel, an alphanumeric terminal with an extremely slow (1200/75 baud) modem, generates $1.5 billion in revenues per year. This is through astronomically high costs per byte transmitted (typically 100 x Internet rates) and an extreme laissez-faire policy which means that up to 90% of Minitel traffic is dating and porn.
Discovery - what did they find? (July 1996)
Project Discovery is an attempt to find common ground between the two industry standard data models from POSC and PPDM. PDM reports on progress.
The modeling malaise, in E&P data management (July 1996)
PDM offers a brief retrospective of the data models competing for market share in E&P computing.
Consortium provides PRINCE databank for Queensland (July 1996)
New databank for State of Queensland (Australia).
‘Data, data everywhere and not a number to crunch’ (July 1996)
PDM outlines some of the basic issues in data management today
Power at Cornell (July 1996)
The IBM RS/6000 scalable POWER parallel system at the Cornell Theory Center, based in Ithaca, is to be used by Schlumberger Oilfield Services company Geco-Prakla for a variety of seismic processing projects in a quest to reduce the time it takes to process seismic data, a continuing focus for the geoscience industry.
Getting to grips with client-server computing (July 1996)
Few issues in computing cause more confusion than client-server. It is literally impossible to open up a computing publication these days and not find some mention of it. But just what is it? And how is it different from the styles of computing we have known in the past? Here’s an abstract from ‘Client/Server & Open Systems’ by Rand Dixon, John Wiley & Sons, New York, Jan. 1996. This abstract is reprinted with permission.
Focus on Cray (July 1996)
Cogniseis and Cray research announce seismic processing breakthrough
World Wide Web Consortium Roadshow (July 1996)
PDM reports on the World Wide Web Consortium Roadshow
Tips from the ‘net
(July 1996)
PDM logs on and learns about handling legacy tapes, and a novel use for hair-spray on sticky tapes.
Petroconsultants backs off PPDM liaison (July 1996)
Petroconsultants to abandon merge with PPDM data model
Seismic rock properties software for AVO analyses (July 1996)
PetroSoft of San Jose, California has just released PetroTools 2.4, the latest version of its seismic rock properties software, now for Solaris and SUN OS.
CGG objective (July 1996)
CGG stated its intention ‘to become a world leader in [E&P] database management’ as part of a major refocusing of its business strategy announced prior to a new rights issue.
Companies combine (July 1996)
Digicon merges with Veritas DGC and Sercel takes over Opseis.
CSM develops distributed seismic software for workstations (July 1996)
The Colorado School of Mines (CSM) says its researchers are developing seismic data processing software that allows users to distribute large tasks across computer networks without having to become experts on parallel computing.
GeoQuest seals China software contract (July 1996)
China National Offshore Oil Corporation (CNOOC) recently awarded GeoQuest a $2.2 million contract for reservoir characterization and data management software.
Pressure data supply (July 1996)
Petroleum Information (ERICO), UK arm of the US company PI, is providing pressure data for nearly 1200 wells from its UK and Norwegian continental shelf databases, to GeoPOP, a multi-disciplinary research project on the origin of overpressure in sedimentary basins. Three British universities, Durham, Newcastle and Heriot-Watt are involved.
Data banks - Where are they now? (July 1996)
This is the first of what we hope will be a growing list of data banks which we will keep updated regularly. Incidentally, we are defining a Data Bank here as a facility which is accessible publicly. In fact some oil companies have data banks which are larger than many of these. We will be describing some of these initiatives in future issues of PDM.
EAGE - Data Management highlighted (July 1996)
Data Management featured prominently at the annual conference and exhibition of the European Association of Geoscientists and Engineers (EAGE) held in Amsterdam in June. E&P IT issues were addressed in the keynote speech, a Geoshare User Group meeting was held and a morning session was devoted to data management.