November 1997


Stirrings from the beast – Microsoft wakes up to the importance of E&P computing (November 1997)

Microsoft is moving in on the E&P IT action with a push for the hearts and minds of the energy business. The strategic objective is to knock UNIX off its perch and establish Windows NT as the system of choice for oil industry computing.

To the casual observer at the SEG or any other E&P convention, the compute engines that matter all appear to be UNIX boxes. Sun, Silicon Graphics, HP/Convex and IBM all have a substantial presence and are actively promoting their networking capabilities and, in the case of Sun, have an army of tee-shirted individuals offering candy to anyone who knows that Java is not just coffee. On the software front, much of the high-profile talk is of Java, CORBA, the OMG and other offerings from the UNIX camp. Elsewhere on the exposition floor – as in the rest of the world – the predominant operating systems hail from Microsoft, but this fact, along with the wide deployment of programming tools such as Visual Basic, does not get much of a marketing look-in. They are not "mainstream" to E&P, and until now have had little support from a Microsoft with many other fish to fry.

new commitment

All this is about to change. As part of an increased commitment to the energy and utilities industries, Microsoft is launching a new effort designed to "establish Windows NT as the superior development platform for energy applications". The initiative involves the formation of a new vertical industry practice focused on the energy and utilities industry. The company also announced that Hillman Mitchell, a 15-year veteran of the energy industry, has been appointed global energy industry marketing manager to head the new effort.

Mitchell was formerly manager of Microsoft's Houston consulting services practice and, before joining Microsoft, was senior analyst for architecture and infrastructure for advanced process control systems at Conoco Inc., where he also worked as a computer scientist for exploration research in the upstream market.

influence

Mitchell stated that the new unit has been formed "in response to growing customer demand for innovative applications using the Microsoft Windows NT operating system and the Microsoft BackOffice family", with the intent of "globally expanding its evangelism and technical support for independent software vendors and key influencers in the business applications-buying process". One such "influencer" is Dr. J. Patrick Kennedy, president of OSI Software Inc. (formerly Oil Systems, Inc.) and something of a Microsoft evangelizer himself, who described the move to Windows NT as "one of the best we have made – we ship approximately 300 new systems per year to companies such as Amoco, Chevron, Mobil, Georgia Pacific, Entergy, Commonwealth Edison and Duke Energy. Windows NT has grown as the choice of our customers from 35 percent of shipments in 1996 to more than 60 percent this year." PDM's editorial this month takes a sideways look at the recent Microsoft initiative, questions whether the UNIX/NT war should really be center-stage in E&P computing, and suggests that Microsoft has more to offer both the user and developer communities from the desktop than from the server.

Click here to comment on this article

If your browser does not work with the MailTo button, send mail to pdm@the-data-room.com with PDM_V_2.0_199711_1 as the subject.


Mobil to spend $63 million on Landmark/GeoQuest solution (November 1997)

Mobil outline a daring IT strategy involving the marriage of GeoQuest's Finder with Landmark's OpenWorks. This $63 million program includes deployment at 14 of Mobil's locations world-wide and training of some 800 staff. Vendor-side presentations from Landmark and GeoQuest indicated a degree of confusion as to how interoperability was to be achieved, with Landmark's DAE and Schlumberger's Geoshare both being suggested.

Larry Bellamy described Mobil's approach to deploying a world-wide E&P computing solution using products from Landmark and GeoQuest at the inaugural meeting of the POSC South-Western group, held during the SEG conference in Houston. Following extensive analysis of their requirements and the various vendor offerings, Mobil chose to base their solution on GeoQuest's Finder for the Master Data Store (MDS), together with the full suite of applications from Landmark including the OpenWorks Project Data Store (PDS). The contract will be worth some $63 million over three years and mentions POSC compliance, although Bellamy stated that he, like most of the rest of us, doesn't know exactly what that is. The solution is described as a three-tiered structure, with the Finder MDS at the base and the OpenWorks PDS in the middle, below the rest of the Landmark applications. The exception is the deployment of GeoQuest's Oil Field Manager (formerly Production Analyst) production reporting tool, which runs directly from Finder. Recognizing that the application side of the solution may not necessarily represent the "best in class", Mobil introduce the concept of "adequacy", with the further comment that "diversity is not necessarily good in application purchases". The project incorporates a large dose of business process re-engineering, with workflow improvement consultancy being provided by Landmark as part of their new service offering.

Interoperability

The two vendors then had their chance to describe the project from their own standpoints and, in particular, to explain how interoperability between their different product lines was to be achieved through "POSC compliance, whatever that might mean". John Sherman of Landmark spoke first and described how the project involved implementation at 14 locations world-wide and the training of some 800 staff. Sherman explained that OpenWorks is migrating to "full POSC compliance", but with the qualifier that "it would be a good idea if this was defined". Currently every POSC implementation is a relational projection, but Sherman points out that the commercial projections are not "open", i.e. published.

via the DAE….

Describing the data flow from the MDS to the PDS, Sherman stated that this would be performed using the LightSIP Data Access and Exchange (see last month's PDM for extensive coverage of this technology). Data transfer is to be achieved with a "one-button" interface.

..or Geoshare?

The view from the other – GeoQuest – side of the fence was outlined by Howard Neal. There was no talk of LightSIP or the DAEX here; data transfer between the two environments is to be via our old friend Geoshare. Questioned on the differing presentations of the linkage, Neal re-affirmed that Geoshare was the data transfer mechanism adopted for the project and would remain so for the foreseeable future. Geoshare is also the vehicle for data transfer between Finder and GeoQuest's own PDS, GeoFrame. Both vendors drummed home their undying commitment to POSC standards – without, as before, being able to clearly define what is at stake. David Archer, POSC's COO, mused publicly as to the merits of compliance profiles – whereby a vendor might publish the Epicentre footprint of their data model, publish metadata, or issue a compliance statement. This is all very well as far as it goes, but it is a telling reflection on the development of POSC that, nearly 10 years down the road, compliance issues should be discussed in such detached terms. It is also a telling reflection on POSC's contribution to interoperability that the data transfer mechanism between these two "compliant" environments is Geoshare.



Editorial – Microsoft off-target in E&P computing push? (November 1997)

PDM’s editor Neil McNaughton outlines his personal IT background as a one-time UNIX aficionado turned Visual Basic enthusiast. As a reformed zealot, McNaughton wonders whether Microsoft’s current push for all-NT is really what the industry needs.

As an ex-UNIX control freak turned low-cholesterol Visual Basic programmer, I must admit that Microsoft’s sudden interest in E&P computing is welcome news. Maybe I’ll get to be a VB control freak before too long! You have probably experienced this sort of mindset at some time or another. My control freaking began when we acquired a UNIX box and I misspent part of my youth, and my employer's time, learning arcane commands and tools for doing all sorts of things, some of them quite useful. No harm in that, you might say; the harm begins when you start to protect your investment in your newly acquired skills by evangelizing about the merits and power of your chosen operating system – especially when you begin to sell it in areas where it may not be appropriate.

evangelism

Such evangelizing can turn into religious warfare rather quickly and lead to the sort of futile operating system wars that have taken place between DEC and IBM, IBM and UNIX, and today between UNIX – or rather, mainly Sun – and Microsoft. OS wars are for the vendors; if you find yourself proselytizing about one OS at the expense of another, you are acting as an unpaid publicist for a computer company. They are particularly pernicious when they extend to selling an OS to an inappropriate market.

inappropriate marketing

Microsoft's new energy-focused group appears to be operating in line with Microsoft’s general corporate strategy, which could be summarized as "we already have the desktop, now let's kill the Sun servers". But this is inappropriate marketing, at least to the scientific computing E&P community, where the stability, memory model and IEEE conformity of UNIX hold sway. In a similar manner, the trendy image that Sun is trying to give to Java is inappropriate in that it instills fear, uncertainty and doubt into the naive members of the desktop community, while offering thinly spread jam tomorrow in return.

Microsoft's marketing pitch is thus off-target in the very field where they have the most to offer. In the high-tech world of E&P, the most under-utilized tool around is probably the PC on your desk. People are the main cost in an information system, and what really counts (to an oil company) is the end user’s tools for accessing data: performance, ease of use, consistency and sophistication. In this issue of PDM we see how Windows-based tools have served to build quick and not-so-dirty solutions to problems such as how to archive and distribute a very large unstructured dataset in the context of a bidding round. New products utilizing GIS access from Windows to large corporate datastores are appearing almost weekly. Even the flagship company quoted by Microsoft in the press release, OSI Software, is focused on the desktop: they supply an application which provides a bi-directional link between processes such as oil refining and SAP’s R/3 applications using tools such as ODBC and Visual Basic. But while such software vendors are quietly aware of the potential of these tools, the same is not generally true of the G&G end user.

data miners

Unlike the financial services industry where workflow is pre-determined, E&P-ers are by nature data miners and, hopefully, push-oriented publishers of their findings. From my wanderings through E&P shops world-wide, I consider the full exploitation of an Office/Web paradigm to be one of the least understood elements of E&P computing. Access to the various data stores in a corporation requires interoperability. But this is here now, with the kind of tools that Microsoft supplies to its developers, augmented by a vast array of third-party offerings from Borland, Oracle, DevSoft and the like. Much of the effort centered on "true" platform-independent interoperability – through Java or CORBA – stems from an understandable, but rather unrealistic, desire to free the IT world of Microsoft's hegemony. In a practical sense, if the world does take this path, then the overall cost to industry at large in terms of lost time, functionality and re-training will be huge. It would be good to see Microsoft's new energy unit offering more support in E&P for its information-work offerings, for the pervasive utility of the Office product line, and for the functionality of its development tools. This would put the claims of the Java camp into context and would be a more telling sales pitch than the price of a MIPS.



G&G personnel shortage forecast (November 1997)

Despite industry-wide downsizing and efficiency gains, booming business in the E&P sector is creating headaches for HR departments, particularly in the revitalized service sector.

The boom-and-bust cyclical nature of the oil business is yet again leading towards a mismatch between forecast industry requirements over the next few years and current levels of graduation and student intake in earth science departments world-wide. Industry observers have postulated demand outstripping supply by up to four times. Some innovative thinkers are linking this shortfall with the new asset-team-based approach to conclude that G&G asset teams may behave like mercenaries, or football stars, in the near future – perhaps even spawning a new profession, that of the G&G professional and asset team agent, negotiating the best deal for the new superstars. We will not pretend that, for individual G&Gs, this isn't some of the best news to hit the upstream for a decade. At the corporate level, however, two actions are clearly required: first, hike the payroll budget appropriately to cater for the agents' fees and new major-league salaries; second, ensure that service providers really have the resources they claim – otherwise dealing with an E&P services contractor may end up like trying to get the builder to finish off the construction work on your new house.



Peebler advocates a 'revolution in the white-space' (November 1997)

Bob Peebler, CEO of Landmark Graphics Corp., speaking at the SEG conference in Dallas this month, fleshed out his plan for a 'revolution in the white-space' in E&P computing and business processes.

The white-space here is both the white-space between functions in the E&P organizational chart and that between functionally focused E&P applications. The revolution lies in allowing E&P organizations to operate seamlessly across the white-space, and the main weapon in the revolutionary’s arsenal is – you’ve got it – data management. Peebler is probably preaching to the converted, at least for readers of PDM, and we have in the past (PDM Vol 1 N 4) been somewhat skeptical of Landmark’s deliverables in this context, so it is interesting to see just how Landmark is setting out to revolutionize the business.

under fire

The first bastions of traditionalism to come under fire from Peebler’s revolutionaries are the parallel worlds of time and depth in seismic and geological interpretation. Current practice generally performs the "warp" from time to depth once the geophysicist is through interpreting and the geologist is ready to start modeling. This abrupt shift in domain effectively precludes any iterative interpretation crossing the junction between the two disciplines. LGC’s solution is to move the time-to-depth warp further up the food-chain by incorporating it into seismic processing. Of course depth imaging is nothing new – there are many companies with highly credible offerings in this domain – but their integration, particularly from a data management standpoint, is problematical; these third-party offerings can be said to represent an increasing functional focus. LGC’s "revolutionary" contribution comes in the form of a new release of ProMAX (described elsewhere in this issue), LGC’s seismic processing suite, which now offers data exchange and visualization of processing parameters from within SeisWorks.

high impact

Peebler describes the marriage of processing and interpretation as one of several "high impact intersections". Another is the use of seismic data in reservoir characterization, where Peebler estimates that up to 90% of the information on a reservoir is "locked up" in seismic data which is usually "dumbed down" – particularly in the stacking process – because "that’s all we can handle". A further intersection lies down the chain in the realm of interpretative simulation. Here again dumbing down is the watchword, this time with the abandonment of perhaps another 90% or so of the information as the fine-grained geological model is reduced to the coarse cellular representation of the reservoir engineer. Again LGC is working on this intersection, with enhancements to its ParallelVIP reservoir simulator destined to allow more of the geological model to be incorporated in the simulator without compute times going through the roof.
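The "dumbing down" of the geological model can be illustrated with a toy block-averaging step – a sketch of the general idea only, not LGC's or anyone else's actual upscaling algorithm, with invented porosity numbers:

```python
# Toy illustration (not a vendor algorithm): upscaling a fine geological
# grid to a coarse simulation grid by block averaging. Each coarse cell
# keeps only the mean porosity of the fine cells it covers - the
# sub-cell variation is the information that gets "dumbed down".

def upscale(fine, factor):
    """Average a 2-D grid (list of rows) over factor x factor blocks."""
    coarse = []
    for i in range(0, len(fine), factor):
        row = []
        for j in range(0, len(fine[0]), factor):
            block = [fine[i + di][j + dj]
                     for di in range(factor) for dj in range(factor)]
            row.append(sum(block) / len(block))
        coarse.append(row)
    return coarse

# Invented fine-scale porosity values for a 4 x 4 geological grid:
fine_porosity = [
    [0.10, 0.30, 0.20, 0.20],
    [0.10, 0.30, 0.20, 0.20],
    [0.05, 0.05, 0.25, 0.15],
    [0.05, 0.05, 0.15, 0.25],
]

coarse = upscale(fine_porosity, 2)
print(coarse)  # 2 x 2 grid of averaged cells
```

Note how the top-left coarse cell reports a uniform 0.20 porosity although the fine grid alternated between 0.10 and 0.30 – exactly the kind of detail lost on the way to the reservoir engineer.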

sleeper

Another intersection of note, interactive well planning – which Peebler describes as "a real sleeper" – involves improving communication between drilling engineers and G&Gs, the current poverty of which is deemed very detrimental to cycle times. Yet another is interactive drilling, where Peebler sees video arcade technology applied to real-time control of the drill bit, in a concept reminiscent of the URGENT project (discussed in PDM Vol.2 N 9).

The final intersection, which may well prove to have the most impact on the way we all work, is a proposed linkage between finance and the technical side of the business. This Peebler describes as "resource allocation"; it revolves around improved information movement from the well head, through the accounts department and into the upstream, so that financial and human resources – and even data – can be allocated in a timely and cost-effective manner.

PDM comment

The software enhancements to the Landmark suite are hardly a revolution. That will come when a reservoir simulator can see right through the data chain to pre-stack seismics – allowing, say, reservoir fluid mapping to be guided by offset-dependent attributes – and when all intermediate stops on the data path have similar vision up and down the data stream. Aware of these issues, and perhaps of the time it will take before such interoperability – even within a single vendor’s product line – becomes a reality, Landmark’s other "revolutionary" activity is a booming service offering. This involves helping client sites with workflow consulting and services, showing them what can be done and how to achieve it, and incorporates the development of workflow templates designed to solve specific problems in a given area. We will be covering Landmark’s service offering in depth in a future PDM.



Expro 97 conference focuses on E&P IT (November 1997)

The Expro conference held in London last month is unusual in that its focus is E&P IT. It therefore lacks the domain specificity of your AAPG/SEG and the like, and brings together geologists, geophysicists, petroleum engineers, plant designers and, of course, data managers.

Expro 97 is designed to be a multidisciplinary event, the result being to give everyone some exposure to problems encountered elsewhere in the industry, in the hope that such a gathering will allow ideas to cross-fertilize from one domain to another. Of particular note was Stuart Robinson's description of the UK Department of Trade and Industry's initiative aimed at getting the oil industry to submit data digitally. This immediately posed the problem of data formats, and the DTI elected to base their recommendations on the standards coming from POSC. However, to ensure that their industry clients understand what is meant by these standards, the DTI has taken the initiative of re-drafting the POSC specifications that they use in ASCII, and is publishing them on the DTI website at http://www.gov.uk/og. Also available on the DTI site is a complete list of official DTI well numbers, so that all those companies who have hitherto adopted a go-it-alone policy can mend their ways and use the DTI standard. The ASCII initiative has been achieved with help from POSC gurus at Cap Gemini; the DAEX product from Oilfield Systems was also used.
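Reconciling in-house well identifiers against an official list like the DTI's reduces to a set comparison. The sketch below uses invented identifiers, not actual DTI well numbers:

```python
# Invented identifiers for illustration: find in-house well numbers
# that do not appear in the official list, i.e. the "go it alone"
# cases that need mending before digital submission.
official_wells = {"211/21-1", "211/21-2", "204/19-1"}
in_house_wells = {"211/21-1", "211/21a-2", "204/19-1"}

rogue = sorted(in_house_wells - official_wells)
print(rogue)  # identifiers that need re-mapping to the standard
```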



PPDM membership snubs major vendors (November 1997)

The Public Petroleum Data Model (PPDM) Association AGM was held in Calgary last month. Membership is rising and deployment widespread. Candidates for the PPDM board from both Landmark and GeoQuest were rejected by PPDM members.

PPDM business is booming, with some 120 members (a 33% rise over 1996), and to judge from the deployment case histories, the model is being widely deployed, at least in North and South America. The PPDM board elections demonstrated that PPDM is still a rather insular organization, in the way the membership snubbed candidates from Landmark and GeoQuest. In the case of John Sherman this was particularly maladroit in view of the work and resources that he and Landmark have put into the Discovery project.

POSC - no merger

It looks rather as though some PPDM members are glad to see the attempted rapprochement with POSC fail yet again. Participants in the Discovery project, however, relate that POSC was highly involved, and were impressed with the assistance they received from POSC and with the quality of the projection tools – a great improvement over the previous merger attempt, when projecting from Epicentre meant several days of work. Both GeoQuest and Landmark were also closely involved until quite recently, when both vendors' attention seems to have been diverted from the project – perhaps explaining the snub at the elections. PPDM still have the Discovery project on their books, but it is now unlikely to represent the hoped-for merger with Epicentre. Rather, PPDM now see the way forward as being through a "new technology" version of PPDM – in other words, Oracle 8.



Amoco's PPDM Access (November 1997)

Amoco show how a simple system based on web distribution of scanned paper documentation helped them sort through the massive quantity of documentation released for the recent Venezuelan bidding round.

The problem is relatively commonplace: a few months before a bid deadline, your company acquires a package of data from a government, and virtually none of it conforms to corporate databasing standards. Additionally, you may well have a few filing cabinets, or rooms, full of legacy data in the form of reports, paper sections and so on. To clean up and populate a conventional database with all this stuff would take months, yet your G&G people need access in a matter of weeks – in fact, yesterday would be just fine for them. Charles Fried recounted how Amoco were faced with this kind of problem in the Venezuelan bid round earlier this year. The data supplied by the Venezuelan government makes for some impressive numbers: some 200 8 mm tapes with 200,000 files totaling 25 Gigabytes, excluding seismic trace data. Amoco also possessed some mission-critical information of their own, gleaned from previous work in the country, and wanted to integrate this with the data in the public package to gain a competitive advantage over users of the standard package. To make matters more interesting, the data came in a multitude of different formats: Excel, Word, UKOOA navigation data, Tiff imagery and so on. Getting all this right was of the utmost importance to Amoco and all the other bidders in this round, which ultimately netted $2 billion in bonuses at bidding time.

Team web

Rather than adopting the traditional approach of letting everyone climb over the data for themselves, Amoco decided to set up a hybrid web-based GIS front end from which spatially indexed data could be located and viewed on screen. This was not just new technology, but a new work paradigm: populating the web-based database was to be a collaborative effort, the "team web" approach offering self-service input and viewing of data. A Microsoft Access database – described as "PPDM-like" – was built to contain header and index information, while reports and scanned images were placed on a fileserver. The timely capture and organization of this data was only possible because the Venezuelan data itself arrived scanned and organized. A Java applet from ESRI was used to provide point-and-click access to the different datasets from maps. The server-side solution comprised an NT Server, a UNIX box running an NT Server emulator, MS Access, ESRI’s ArcView Internet Server, MS Active Server and Index Server. On the client side, Windows 95 clients ran Tiff imaging software from CPCView together with Netscape and Internet Explorer. The whole system was up and running in "a few months" and the bid data loaded in two weeks. Thirty users were online, with around 500 accesses and 4,000 pages viewed per month. Intriguingly, one of the main lessons learned was the difficulty of changing the way people worked. Both the data-sharing paradigm, with its self-service data loading, and the on-screen data delivery stretched the traditionalists.
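The indexing idea at the heart of such a system can be sketched in miniature. The schema, well names and file paths below are invented for illustration – Amoco's actual Access design is not published – but the principle is the same: lightweight metadata in a relational store, bulky scanned images left on the fileserver and referenced by path.

```python
import sqlite3

# Hypothetical "PPDM-like" document index: header and index records in
# a small database, scanned Tiff images referenced by fileserver path.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE document_index (
                 doc_id INTEGER PRIMARY KEY,
                 well_name TEXT,
                 doc_type TEXT,
                 file_path TEXT)""")
docs = [
    (1, "MARACAIBO-1", "log", r"\\fileserver\scans\mar1_log.tif"),
    (2, "MARACAIBO-1", "report", r"\\fileserver\scans\mar1_rpt.tif"),
    (3, "ORINOCO-3", "section", r"\\fileserver\scans\ori3_sec.tif"),
]
con.executemany("INSERT INTO document_index VALUES (?, ?, ?, ?)", docs)

# A point-and-click on a well in the GIS front end reduces to a query:
hits = con.execute("SELECT file_path FROM document_index "
                   "WHERE well_name = ?", ("MARACAIBO-1",)).fetchall()
print(hits)  # paths of the scanned documents to display
```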



Argus to AnGIS (November 1997)

Canadian start-up AnGIS is working on a plug-in replacement for Argus, a PC-based GIS which is scheduled to be switched off as Landmark moves to Open Explorer.

Robin Winsor told how his client, Alberta Energy Co. (AEC), a $3 billion utility, had 11 legacy mapping systems, some of which produced conflicting maps. These have now been rationalized to two systems: GeoScout, a proprietary system, and Landmark's Argus, an open system which AEC use as the hub for data from Dyad, IPL, QC Data and their internal database. The Argus development allowed users to see data from all these "best of breed" sources in the same composite map. Unfortunately for AEC, Landmark are pulling the plug on Argus and offering users an upgrade path to Open Explorer. This may not fit with everyone’s IT strategy, since Argus was a compact PC-based product while Open Explorer (reviewed in PDM Vol 2 N 6) is a UNIX-based system requiring significantly more resources. Enter AnGIS, a plug-in replacement for Argus using ESRI’s map tools and a PPDM data model. Winsor claims that the use of ESRI’s mapping tools has greatly improved some of the more sluggish performance of Argus. Now the AnGIS team are canvassing other Argus clients to see if they want an alternative path to Open Explorer – it looks like a great business opportunity. More information from Robin Winsor, rwinsor@angis.com.



Shell Canada targets remedial work (November 1997)

Blair Wheadon described the objectives of this production- and accounting-focused development as reducing cycle time between data capture in the field, financial processing in the accounts department, and feedback to engineering.

The results of this effort allow just-in-time remedial action to be planned in such a way as to optimize resource allocation. The data fed to the engineers is not just SCADA-generated information as to which wells may require intervention, but also a real-time feed of the dollar value of the lost production, or "deferred oil". This allows remedial resources to target those problems where the reward will be greatest. In fact it is not hard to see how, once you have all the relevant data at your fingertips in real time, quite sophisticated optimization and timing of workovers and the like can be realized. The essence of the problem is one of data management.
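Once the data is in one place, the prioritization itself is simple arithmetic. A hypothetical sketch, with invented well names, rates and an assumed oil price (none of these figures are Shell Canada's):

```python
# Hypothetical figures: rank wells awaiting intervention by the dollar
# value of deferred oil, so remedial crews go where the reward is greatest.
OIL_PRICE = 18.0  # $/bbl - an assumed late-1990s price, for illustration

wells = [
    {"well": "A-12", "deferred_bbl_per_day": 150},
    {"well": "B-07", "deferred_bbl_per_day": 40},
    {"well": "C-03", "deferred_bbl_per_day": 90},
]
for w in wells:
    w["deferred_dollars_per_day"] = w["deferred_bbl_per_day"] * OIL_PRICE

ranked = sorted(wells, key=lambda w: w["deferred_dollars_per_day"],
                reverse=True)
for w in ranked:
    print(f"{w['well']}: ${w['deferred_dollars_per_day']:,.0f}/day deferred")
```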

Shell Canada have used PPDM V2.3 to combine public and internally generated datasets with field-generated data from SCADA, Merak’s FDSC and OSI Plant Info. Data from this system is fed to the accounts department, where the dollar values are attached. Analysis is performed using a variety of tools: Production Analyst, Excel, Business Objects and MS Access. The main successes claimed are enhanced data integrity checks – particularly through comparing public and proprietary datasets – and minimized data latencies.



GeoFrame POSC AND PPDM compliant! (November 1997)

Paul Haynes (GeoQuest) made a valiant attempt at convincing a skeptical audience at the PPDM AGM that, despite all the POSC-centric marketing that has been put out concerning GeoFrame, GeoQuest’s flagship data model and integration platform, it is really PPDM compliant too!

To understand how this circle has been squared, Haynes encouraged the PPDM membership to consider that "reality is truth, data models are abstractions". A data model, being a highly normalized structure, can be viewed in a number of different ways: the data model plus the observer becomes the View. Such views are part and parcel of database management systems and can allow a programmer to access data in a more domain-oriented and comprehensible way. They are nearly Business Objects, except that their scope is limited to the RDBMS in use. GeoFrame currently uses the Epicentre data model in the realm of production, where it is said to be "fully compliant" with Epicentre. This implementation of Epicentre has been derived from POSC’s logical model using Express, but not the projection tool.

All done by Views!

Projection has been hand-coded to ensure a one-to-one mapping from Epicentre, reconstructing the views of the data that GeoQuest’s applications are used to seeing. Even so, integrating these applications, more used to working off the PPDM-derived Finder, has posed some challenges for the programmers. These have been circumvented by defining PPDM views of the GeoFrame data model – for instance, a PPDM view of a POSC well completion header. This view is created by a join on five different GeoFrame tables and produces a PPDM-like data structure. Questions were asked as to the performance hit involved in this additional layer between application and database. John Gillespie of GeoQuest replied that using views was slower than direct table access, but that performance was still adequate. PPDMers pointed out that if GeoQuest’s applications require a simpler view of the data than Epicentre can provide, then they might as well use PPDM as the physical model. This, presumably, would not be politically correct.
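The view technique Haynes describes can be sketched in miniature. The table and column names below are invented (the real GeoFrame join spans five tables, not two), but the mechanism is the same: a relational view re-presents normalized tables as the flatter structure an application expects.

```python
import sqlite3

# Miniature illustration of the technique (invented schema, not
# GeoFrame's): two normalized tables are joined into a single view
# that re-presents the data in the flat, PPDM-like shape an
# application expects to see.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE well (well_id INTEGER PRIMARY KEY, well_name TEXT);
CREATE TABLE completion (completion_id INTEGER PRIMARY KEY,
                         well_id INTEGER, top_depth REAL, base_depth REAL);
CREATE VIEW well_completion_v AS
  SELECT w.well_name, c.completion_id, c.top_depth, c.base_depth
  FROM well w JOIN completion c ON c.well_id = w.well_id;
""")
con.execute("INSERT INTO well VALUES (1, 'DISCOVERY-1')")
con.execute("INSERT INTO completion VALUES (10, 1, 2500.0, 2600.0)")

# The application queries the view as if it were a simple table:
row = con.execute("SELECT * FROM well_completion_v").fetchone()
print(row)  # ('DISCOVERY-1', 10, 2500.0, 2600.0)
```

The performance question raised at the AGM follows directly: every query against the view pays for the join behind it, which is why direct table access is faster.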



Development Tools announced for PPDM clients (November 1997)

Benson and Associates have just announced the availability (to PPDM members only) of the production version of PPDM Version 3.4 pre-loaded into the Oracle Designer/2000 CASE tool. This is intended to allow users to 'quickly generate PPDM-Compliant applications using Oracle forms, Visual Basic, Oracle Reports and Oracle Web Pages'. More info from DAB&A, (713) 461-1620, bensonda@aol.com.



PPDM Compliance (November 1997)

Doug Benson (Oracle Corp. and Benson & Associates) was given the thankless task of trying to sell compliance to the PPDM membership.

You would think that compliance was an unequivocally good thing that everyone would support, but life is not so easy. The terms of reference of PPDM compliance are not intended to make it compulsory, nor to guarantee interoperability. Four stages are proposed on the path to compliance: a self-check, tweaking, submission to PPDM, and validation and publication on the PPDM website. Compliance is not a yes/no affair and is measured as a ratio of compliant vs. non-compliant elements. The target for compliance is pre-determined (and will be published) as the footprint of the full PPDM model to be covered by the applicant. Two levels of compliance are envisaged: Gold level requires table-level read/write compatibility with published PPDM tables, while Silver level allows for view-defined, read-only compliance.
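The ratio-based scoring described above lends itself to a trivial worked example. The sketch below is purely illustrative – the element names and the `compliance_ratio` helper are hypothetical, and the real PPDM procedure scores a published footprint of the model, not a four-element toy set – but it shows why compliance comes out as a percentage rather than a yes/no verdict.

```python
# Illustrative sketch: compliance scored as the fraction of a published
# footprint of required elements that the applicant actually implements.
# Element names below are invented for the example.
def compliance_ratio(footprint, implemented):
    """footprint: set of required model elements; implemented: set of
    elements the applicant's schema provides under the published names."""
    if not footprint:
        return 1.0
    return len(footprint & implemented) / len(footprint)

footprint = {"WELL", "WELL_COMPLETION", "WELL_LOG", "BUSINESS_ASSOCIATE"}
implemented = {"WELL", "WELL_LOG", "SEIS_LINE"}  # extra tables don't count

score = compliance_ratio(footprint, implemented)
print(f"{score:.0%}")  # 50%
```

Note that the applicant's extra, non-footprint table (`SEIS_LINE` here) neither helps nor hurts the score – which is exactly the loophole behind the "enhanced versions of the model" problem discussed below.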

Middleware

An interesting issue that arose in discussions was the situation of middleware mapping software such as SAS GEO; here the software assures interoperability, but the internal model need not be compliant with any particular model. The question of minimal compliance, as proposed here, versus full interoperability testing was also raised – the latter would involve a supplied reference data set, with compliance judged by the results of a set of SQL test queries. This in turn raises the problem of improved or enhanced versions of the model which may work better, but fail compliance testing. These issues are crucial to the whole role of standard data models. They were set up to allow for interoperability, but the focus of both PPDM and POSC has gradually drifted, such that both organizations now play a strange role supporting commercial software developers – developers who use the fruits of the standardization organizations' labor to their own ends, and who may now be dragging their feet over interoperability with any software except their own.



QC Data new projects Alaska Data Bank and Bell South (November 1997)

QC Data is working with Alaskan majors to investigate a future ‘National’ Data Repository (NDR) for Alaska, and has also diversified into data management for Bell South. The commonality between these projects is the increasing use of spatial data warehousing.

The idea of a "National" data bank for Alaska has been around for some time, but lack of progress has caused three of the major Alaskan players to go it alone. BP, Arco and Exxon are to set up their own study group to move things along and have commissioned QC Data (Calgary) to work on the base case and specifications for a joint well data solution. The data bank is intended to be scalable to other data types and perhaps even to grow into a true national data repository.

Telco outsources data management

Bell South have awarded a $50 million contract to QC Data to manage 6 centers in the SW United States. All technical and spatial data management is outsourced. Typically this covers a vast amount of plant and infrastructure records – hard copy drawings and hundreds of thousands of maps – which are now being digitized into a fully structured database. Other issues handled in the contract are change management and data sharing with contractors. What has this to do with the oil business? Just replace the infrastructure with producing oilfields and the phone lines with pipelines, and we are in the heart of POSC/CAESAR country. This activity, spatial data warehousing, is a major growth area for QC Data.



Panther Ovation and Sony team in PetaStar terabyte storage offering (November 1997)

A new offering in the field of seismic databanking has been developed by Panther Software, Sony and Ovation Data Systems.

Building on Ovation’s "Seismic Shift" hardware modification to Sony’s DTF tape subsystem (see below), Panther adds its Seismic Data Management System (SDMS) to offer companies seismic tape access and data management from the workstation. Sony’s PetaStar tape library is managed through a hierarchical storage manager (HSM) from Minneapolis-based Large Storage Configuration (LSC). This effectively removes the complex issues of block sizes and file marks involved in writing seismic data to tape, by making the whole tape library look like huge disk capacity to the workstation. Ovation supply their PC-based image management software – GEOASIS – allowing scanned seismics, TIFF files, maps, Word documents and so on to be stored in the PetaStar archive in native format. Hardware costs start at $40,000 for a 9-tape robot with 400 GB capacity and scale to a 5 TB site at around $250,000. A software bundle including the LSC software, Panther’s SDMS and Ovation’s image management ups these prices to around $120,000 for the entry-level system and $500,000 for the 5 TB offering.

SEG-Y

All data on the system is re-mastered to SEG-Y; RODE and Geoshare are considered by Panther to be too unstable at the current time. An addition to SDMS functionality is the ability to carve up a 3D dataset along an arbitrary polygon for seismic trade or sale. Scanned data from the Ovation package can be associated with a line in SDMS for visualization. Panther are also working on what will probably be called the "Production Seismic Data Loader" – a batch-oriented version of SDMS which will target transcription/remastering projects where, once the parameters required for data cleanup and load have been established, the process can be run unattended. This could for instance allow a three-shift transcription operation to be optimized, with the "clever" guys working a single day shift and the tape monkeys working round the clock.
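The polygon carve-up mentioned above boils down to a point-in-polygon test over trace surface positions. The sketch below is illustrative only – SDMS's actual algorithm and data structures are not published, and the function and variable names here are invented – but the standard even-odd ray-casting rule shown is the usual way to decide which traces of a 3D survey fall inside an arbitrary trade-area polygon.

```python
# Illustrative sketch: keep the traces of a 3D survey whose surface (x, y)
# positions fall inside an arbitrary polygon, as for a seismic trade or sale.
def point_in_polygon(x, y, poly):
    """Even-odd ray-casting test; poly is a list of (x, y) vertices."""
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        # Edge straddles the horizontal ray cast rightwards from (x, y)?
        if (y1 > y) != (y2 > y):
            # x-coordinate where this edge crosses the ray
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

polygon = [(0, 0), (10, 0), (10, 10), (0, 10)]  # trade-area outline
traces = [(5, 5), (12, 3), (1, 9)]              # trace surface positions
kept = [t for t in traces if point_in_polygon(*t, polygon)]
print(kept)  # [(5, 5), (1, 9)]
```

In practice the kept trace numbers would then drive extraction of the corresponding traces from the SEG-Y volume; the geometric test itself is the same.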



Sony and Ovation seismic shift (November 1997)

The thorny problem of writing oddly blocked seismic data to modern High Density Media (HDM) has been addressed by the Sony/Ovation initiative dubbed the Seismic Shift.

Hitherto the small block sizes of seismic field tapes have sometimes created havoc when read by UNIX workstations, with a reported 9-hour read time for one HDM (brand "X") containing SEG-Y data. Seismic Shift is a hardware modification to the standard Sony DTF drive which overcomes these problems and gives transfer rates of up to 11.25 MB/sec on SEG-Y and SEG-D data, without modification to the tape data format. Having helped Sony with the hardware modifications, Ovation are now ready to provide systems and services for migrating data from any other media and format to DTF for processing, near-line storage and archiving. This can include their integrated "PetaStar" solution based on Sony’s PetaSite mass storage library, holding from 5.4 TB to 2.3 petabytes (1 petabyte = 1 billion megabytes).

More info from gservos@ovationdata.com and laura_whitaker@mail.sel.sony.com.



IESX and Charisma now part of GeoFrame (November 1997)

The latest release of GeoQuest’s software integration platform GeoFrame, version 3.0, now integrates GeoQuest’s twin seismic interpretation packages, IESX and Charisma.

Applications in the GeoFrame environment share a common Oracle database that is described by GeoQuest as "fully compliant" with standards from POSC – though which of POSC’s labyrinthine standards GeoFrame is "compliant" with is not specified. In addition to the two seismic interpretation packages, GeoFrame now integrates visualization with GeoViz and Voxels, InDepth velocity modeling and depth conversion, CPS-3 mapping, Framework 3D for characterizing and interpreting complex structures, StratLog for geological interpretation, WellPix for correlation, and ElanPlus and PetroViewPlus for petrophysical analysis. Other tools for reservoir analysis include Impact, Zodiac and Polaris.

Data management has been enhanced with new tools for software installation and data loading, spreadsheet-style data management tools, and editing and saving of ASCII data. WellEdit works with well log and core data. The migration of IESX and Charisma to GeoFrame is said to have taken "three years of dedicated development and testing"; what is perhaps surprising is why GeoQuest did not take this opportunity to merge the two products. After all, you can’t have two "best in class".



Veristream to remarket PECC/PetroSystems’ PetroVision (November 1997)

Shell Services Company (a.k.a. Veristream) and CGG/PECC have signed a letter of intent for the delivery of world-wide E&P data management solutions using PECC’s PetroVision.

PetroVision (and not PetroView as we wrongly called it in last month’s PDM – apologies to all) is CGG’s answer to PetroBank, so Veristream’s announced adoption of PetroVision may surprise PDM readers, given the announcement last year that it was to become a PetroBank "reseller" (see PDM Vol. 1 No. 2). Veristream state that their support for two competing data banking solutions bears witness to their vendor neutrality, citing their service-level support for both GeoQuest’s and Landmark’s product lines. Veristream was set up as a service outsource/spin-off from Shell and still works almost exclusively with the Shell group. The US-based Veristream organization – whose headcount has already grown from 1,800 to 2,200 – is set to expand further in 1998, when a link-up with the European Royal Dutch Shell service arm is set to make Shell Services a world-wide operation.



Landmark to export Legends (November 1997)

Landmark Graphics Corporation is doing something practical about the personnel shortage problem through its Legends training programme.

Through this, approved G&G professionals attend an intensive four-month training program given by Landmark to obtain skills and certification in at least one specialty track application. At the same time, a monthly stipend and help with accommodation may be available. This has to be a win-win situation, and may allow some traditionally trained earth scientists to overcome their digital anxiety and re-enter the marketplace as either Landmark employees or users, and perhaps proselytizers. Hitherto limited to the US, the Legends program is to be exported next year to Moscow, London and probably South America. Some 85-90% of graduates of the US program find employment with Landmark. For more information, contact your local Landmark rep.



Databank competition hots up in Houston (November 1997)

A $1.6 million data storage facility has been inaugurated by CGG’s data management subsidiary PECC in Houston, while competitor GeoQuest announces its PowerHouse data repository.

The 12 terabyte capacity center uses PetroVision and an Ampex 812 tape robot, which houses up to 256 50 GB DD-2 19 mm helical-scan tapes. Four drives allow for a theoretical aggregate maximum transfer rate of 60 MB/sec. The facility is intended as a showcase for PECC’s software and services, which include turnkey data banking, on-site remastering and archiving, and off-site data storage with client-site access via a PetroVision client.

Competition

Competitor GeoQuest have also announced the rollout of their PowerHouse data bank facility in Houston, which will offer a High Density Media seismic library service for oil companies in the Houston area. The robotic system, developed in collaboration with Geco as part of BP Aberdeen’s outsourcing effort, is a StorageTek Powderhorn with 6 units of 6000 D2 and/or 3590 tapes. In this configuration, SeisDB runs atop a Hierarchical Storage Manager (HSM), SAM-FS from Large Storage Configuration. The service is being marketed along similar lines to PGS's GeoBank, whereby companies outsource the management of their tape library and the service company rationalizes multiple copies of the same survey and manages access entitlements for participating companies.



Ampex unveils new tape system. (November 1997)

Using the same DD-2 technology as in PECC’s Houston facility, Ampex’s latest robot, the DST 712, will initially hold up to 5.8 terabytes (TB).

Up to 116 50 GB DST cartridges can be stored in the unit. Future upgrades will allow capacity to be increased in a modular fashion, with each module adding from 3.2 to 6 TB; ultimate capacity using the modules is said to be unlimited. The new DST has a list price of $150,000 for the single-drive configuration and $240,000 for the dual-drive configuration. More info from sales@ampex.com.



Hays acquires TTN unit and teams with PGS. (November 1997)

UK-based Hays Information Management (HIM) has teamed up with PGS Data Management to form what they claim will be the world’s leading E&P data management organisation.

Hays’ expansion into E&P data management began with its acquisition of Rockall, a major E&P inventory storage outfit which had developed RSO for online management of inventory. Close association with tape management and storage has clearly opened Hays’ eyes to the growth business of seismic tape remastering and has led to the global alliance with PGS. The alliance partners will provide a suite of services ranging from hard copy storage with integrated data remastering to on-line, automated data banking. Part of the deal involved HIM acquiring the PGS unit Tape Technology Norge – GeoData Services, which is involved in major remastering work on behalf of the Diskos consortium. PGS Data Management is part of the PGS ASA group, with 1996 revenues of $451 million.



Landmark Releases ProMAX 7.0 (November 1997)

Landmark’s new seismic processing software fills a missing link in current interpretation workflows by beginning to bridge the gap between the seismic processor and the seismic interpreter.

Aimed at the high-volume end of the processing business, ProMAX 7.0 offers improved coupling with interpretation software and introduces a database link. Integration between ProMAX and SeisWorks enables seismic data, including horizons, to be shared between the two applications, so that interpreters can create SeisWorks projects directly from ProMAX. Data navigation and access are enhanced, and a new component, DBTools, improves access to and visualization of a wide range of attributes. This is said to allow rapid visual detection of problems such as bad geometry and statics. Landmark’s CEO Robert Peebler has described conventional seismic processing as "dumbing down" the data and advocates a "revolution" in the white space between processing and interpretation. While ProMAX 7.0 may not be the revolution, there should at least be some arm wrestling between interpreters and processors to decide who is going to sit in front of the ProMAX workstation.



A2D Technologies Launches LOG-LINE Version 4.0 (November 1997)

Version 4.0 of A2D’s digital well log management software, LOG-LINE, now has a GIS front end. LOG-LINE was originally derived from Loglib, a Texaco in-house development, as part of the outsourcing of Texaco’s well log data management in 1996.

The outsourcing project involved the management of Texaco’s 1.5 million digital well log curves (20 billion curve-feet of digital data), with data feeds to Petcom, Workbench and other application tools. Described as "the only instant-access Internet delivery system for digital well log data", the new release incorporates a GIS search engine for Gulf of Mexico wells, based on ESRI’s ubiquitous MapObjects. LOG-LINE debuted in January 1997 and already has more than 80 subscribers, from majors to small independents. Users can either make spatial queries with the new map search engine or browse LOG-LINE's text index of wells, place their orders, and instantly download industry-standard LAS files to their workstations. "The GIS search capability in LOG-LINE has made it much easier for explorationists to find and use critical digital data," says Dave Kotowych, president of A2D Technologies. "Our goal is to continue to reduce the time required to find critical data, thereby increasing users’ productivity." More information about A2D LOG-LINE can be had from (888) LOG-LINE, or contact Joe Carroll at (281) 319-4944, email joe@a2d.com.
