April 2006

Landmark gets Diskos back

Operations of Norway’s national data bank will return to Halliburton in 2009. The consortium is pushing for more open access to Diskos data and a PetroBank API is likely, along with new data types.

The Diskos group, comprising the Norwegian Petroleum Directorate (NPD) and Norwegian operators, has awarded the third phase of Diskos operations to Landmark Graphics AS, a Halliburton unit. Phase three will run from 2009 to 2014. Diskos data is stored in PetroBank—the data management system originally developed by IBM for Diskos, acquired by Landmark in 2000 for $179 million.


Incumbent Diskos operator Schlumberger, via its Norwegian SINAS unit, and startup Lerya were also in the running. Diskos project manager, the NPD’s Eric Toogood, told Oil IT Journal, ‘The three bids were comparable from a technology standpoint, but Landmark offered the best financial terms.’ The NPD release referred to ‘the excellent work being carried out by the current operator, SINAS.’

80 Terabytes

Diskos houses some 80 terabytes of seismic, well and production data from the Norwegian continental shelf, made available to members through secure broadband access. The new operatorship will involve data migration from SINAS to Landmark. But, as Toogood remarked, this has been done before and, ‘Technology is always improving, it should be easier next time.’ SINAS stores the 80 terabyte dataset on NetApp disks. An additional 1.8 petabytes of nearline tape is available for prestack data.

PetroBank API

The Diskos re-tender placed particular emphasis on ‘openness’ and data access. This has resulted in what might be a significant development for National Data Bank users as Landmark is likely to offer API access to PetroBank data. As Toogood explained, ‘Large corporate users want to be able to integrate data in Diskos better with their own workflows.’


Diskos plans to expand membership and data scope. The thirty new companies that have started working the Norwegian continental shelf will be offered attractive membership terms. Toogood also plans to expand data coverage, ‘We are looking into the cost-benefits of storing pre-stack seismics in near-line storage and, later on, field data.’


PetroBank well log master data is stored in an embedded version of Petris’ Recall package—leading to the interesting possibility of the PetroBank API extending into Recall. According to Toogood, ‘Companies are asking for API access to well, seismic and production data, although the API is outside the scope of the main Diskos program and will be negotiated between Landmark and individual Diskos users.’

Déjà vu?

It’s interesting to recall Schlumberger’s prior undertaking to ‘provide open access to the Diskos repository [and] improved workflows by seamlessly integrating Diskos data into company internal solutions.’ That was back in 2004 (OITJ Vol. 9 N° 12).

Paradigm takes OpenSpirit stake

Paradigm and other OpenSpirit investors are to underwrite a share capital increase. A ‘stamp of approval’ for the upstream middleware.

Paradigm has signed a Memorandum of Understanding with the shareholders of OpenSpirit Corp. to subscribe to a primary equity offering in the company. Established in 2000 by Chevron, Shell and Schlumberger, OpenSpirit provides a CORBA-based interoperability framework for upstream software.


Paradigm CEO John Gibson, who will be joining OpenSpirit’s board of directors said, ‘Explorationists and producers need cooperation of the leading geoscience application and data providers. Paradigm is making a very definite choice to ‘play well with the others’ by sharing OpenSpirit’s vision and objectives.’


OpenSpirit president Dan Piette told Oil IT Journal, ‘This deal is a stamp of approval for what we do. Paradigm will be using OpenSpirit in its own products. In addition, some of our existing investors will be putting new money into the company.’


Subsequent to the deal, the position of Paradigm’s own CORBA-based integration framework, Epos, is unclear. Epos currently provides data sharing and application interoperability of Paradigm’s software with third-party applications.

Cavalry coming to digital oilfield’s rescue?

Oil IT Journal editor Neil McNaughton opines on this month’s news from Landmark, Schlumberger and Paradigm before enumerating some shortcomings of current digital oilfield ‘real-time’ offerings. For these he believes he may have stumbled across a possible remedy, from the battlefield.

Landmark’s marketing machine has come back to life this month with the Diskos deal, a new ‘real time decision center’ and a deal with Pavilion that brings upstream software a tad closer to the process control ‘coal face’ of the digital oilfield of the future (DOFF).


Also of note is Paradigm’s investment in OpenSpirit. In our interview last April, Paradigm CEO John Gibson spoke of a possible evolution of Paradigm’s own Epos middleware to ‘open standards’. The buy-in suggests that Paradigm’s route to openness is OpenSpirit. This contrasts with ‘openness’ via Ocean, the theme of Schlumberger’s Open Technology Symposium this month. But skeptics were no doubt reassured as SIS president Olivier Le Peuch and John Gibson sealed their alliance with an on-stage hug!


One of the main jobs of the editor is to peruse vast quantities of text in search of enlightenment and new ideas. Since taking this job I have developed a kind of info-bulimia. So I try to skim through publications aimed at the explorationist, engineer and technologist: from AAPG Explorer to JPT, passing by various trade mags, company newsletters, Business Week and the Financial Times. To try to keep up with the IT side of things, I also subscribe to Dr. Dobb’s Journal and, most recently, Computer, from the IEEE.

Info glut

All of which means that when I get home from a longish trip I am confronted by a pile of journals. An info-glut of my own making, which I deal with by a rather peremptory scanning process, perhaps watching the TV at the same time. As all this happens while recovering from my jet lag, the scanning process may be punctuated by deep sleeping and a kind of info-induced reverie which I suppose somehow ‘informs’ my editorials. The result this month is a reverie along the lines of real time, simulation and optimization, the underpinning of the DOFF.

Me too?

Reflecting on the Halliburton/Pavilion deal above, which is a bit ‘me-too-ish’ to Schlumberger’s AspenTech alliance last year, made me wonder if the real time optimization of the digital oilfield is going the same way as upstream interpretation software in the geoscience space. That is to say, with vendor-based environments à la OpenWorks or GeoFrame that offer a degree of interoperability so long as you stay within the framework, but which become a source of hassle and inconvenience when you try to deploy ‘best of breed’ solutions that operate across the framework boundaries.


The twin upstream/downstream deals (SIS/AspenTech and Landmark/Pavilion) illustrate two facets of the DOFF. SIS and AspenTech’s first product, Avocet, combines various simulators to optimize facility design (OITJ Vol. 10 N° 4). This kind of optimization falls into the ‘front end engineering design’ (FEED) category. Such optimization may take hours or days of computer time, which could be considered ‘real time’ if you are planning next year’s project, but not if you are closing the valve on a BOP.


Landmark/Pavilion illustrates another component of the DOFF—model based controllers (MBC). These use computer models of the process to identify critical situations. Certain combinations of values have been shown in offline modeling to be associated with events that require remedial action. While MBCs may run in real time, the simulations that compute each case may require hours of machine time, data mining and computation over historical datasets.
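The runtime side of such a controller can be caricatured in a few lines: patterns identified in offline modeling are checked against live readings. This is a minimal sketch, not Pavilion’s product; the variable names, limits and actions are all invented for illustration.

```python
# Hypothetical sketch of a model based controller's runtime check.
# Patterns below stand in for combinations flagged by offline modeling;
# all names and thresholds are invented.

# learned offline: (condition predicate, recommended remedial action)
CRITICAL_PATTERNS = [
    (lambda s: s['choke'] > 0.8 and s['whp'] < 50.0,
     'sand breakout risk: reduce choke'),
    (lambda s: s['gas_lift'] > 2.0 and s['rate'] < 100.0,
     'gas lift inefficient: retune'),
]

def check(state):
    """Return remedial actions for every offline-identified critical pattern."""
    return [action for match, action in CRITICAL_PATTERNS if match(state)]

# one live reading from the field
reading = {'choke': 0.9, 'whp': 42.0, 'gas_lift': 1.2, 'rate': 450.0}
print(check(reading))  # ['sand breakout risk: reduce choke']
```

The expensive part, per the article, is not this check but the offline simulation and data mining that produce the patterns in the first place.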


While both FEED and MBCs are no doubt great tools, they are a far cry from an overarching ‘field to facility’ real time model. The problem is that to simulate across the silos of reservoir, well bore, pipe network, facility and beyond, you need to fire up a multiplicity of simulators. Each of these may have its own notion of what ‘real time’ means. Some simulators go faster than real time. Some can just keep up. Others may go much slower than real time. Bringing them all together requires a well thought out strategy and, ideally, some standardized way of synchronizing the outputs and inputs.


The January 2006 issue of the IEEE’s Computer magazine had an article* on how the defense community addresses similar issues. When the army, navy and air force show up to play war games, each brings along its own toys, in the form of battle simulators. A first pass at making these ‘plug and fight’ was the 1993 Distributed Interactive Simulation (DIS) protocol, which created ‘synthetic, virtual environments by connecting separate subcomponents of simulations residing at multiple locations.’ This has subsequently evolved into the High Level Architecture (HLA) for distributed simulations, which allows slow and fast ‘subscribers’ to plug into the real time infrastructure and receive timely, appropriate information.
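The HLA’s time management idea can be sketched in miniature. This is a hypothetical illustration of conservative synchronization, not the HLA API itself: a coordinator only grants a time advance up to the smallest request, so a fast simulator is held back until the slowest one catches up.

```python
# Hypothetical sketch (not the HLA API): conservative time management
# for coupled simulators that run at very different speeds.
class Federate:
    def __init__(self, name, step):
        self.name = name
        self.step = step      # internal time step, seconds
        self.time = 0.0

    def advance_to(self, t):
        # run internal steps until the local clock reaches the granted time
        while self.time < t:
            self.time = min(self.time + self.step, t)

class Coordinator:
    """Grants time advances so no federate outruns the others."""
    def __init__(self, federates):
        self.federates = federates

    def advance(self, requested):
        # grant only the minimum requested time: fast simulators wait
        # for slow ones (conservative synchronization)
        grant = min(requested.values())
        for f in self.federates:
            f.advance_to(grant)
        return grant

reservoir = Federate('reservoir', step=3600.0)   # slow, hour-long steps
pipeline = Federate('pipeline', step=1.0)        # fast, one-second steps
coord = Coordinator([reservoir, pipeline])
granted = coord.advance({'reservoir': 7200.0, 'pipeline': 60.0})
print(granted)  # 60.0 -- both clocks are held at the smallest request
```

Real HLA federations add lookahead and optimistic schemes on top of this, but the core contract is the same: nobody’s clock passes the grant.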


The only oil and gas related application of the HLA I’ve Googled is the Norsk Hydro-sponsored Real-Time Simulator for Complex Offshore Marine Operations, developed by the Norwegian SINTEF research organization. This integrates multiple simulators to train operators about to, for instance, install a heavy module on the deck of a floating production unit. This is not a million miles removed from the DOFF. Could the HLA, or something like it, be a way around the putative, vendor-specific DOFF frameworks? If the DOFF ends up driven by a future POSC/SPE HLA—remember, you read it here first!

* Enabling Simulator Interoperability. Katherine Mores (SAIC) et al., Computer, January 2006

OITJ Interview—Gilmore, de Vries, Invensys

Following our report on Invensys’ terminal management system for the Cheniere Sabine Pass LNG terminal in last month’s Oil IT Journal, John Gilmore and Stan de Vries offer an insight into this complex system and how the technology will impact the digital oilfield of the future.

Gilmore—Cheniere has been working on LNG in the US since the 1970s, though the gas market was in the doldrums for decades. Today, increasing energy prices have revitalized the business. Cheniere saw this coming and developed a new terminal design and a new business from scratch—a top down design involving a business process team, engineering and finally IT design and implementation.

It seems like this kind of technology is fairly à propos regarding the digital oilfield of the future?

de Vries—Customers in all industries recognize the value of transferring information from the shop floor to the business and also in the other direction. This is being recognized in power generation and other large plants. The secret is to make offline simulation tools that are reliable enough to be used online—a completely different situation from today. The idea is to provide engineers with shadow costs and KPIs in real time. Moving information out of the plant is getting to be a best practice. But there is a lack of business info flowing TO the floor. This is the problem we have addressed at Invensys by building an IT infrastructure layer adding in multiple simulators. This gets around the problem of instrument tag-based systems.

Can you explain?

de Vries—The problem with tag-based systems is that within a plant, there may be a lot of equipment with identical naming systems for different machines—also, the structure for naming can change across a plant. Our solution normalizes equipment nomenclature and centralizes operations. Our largest example of such a deployment is in Oman, where we have deployed a system across 17 oil and gas fields—along with a whole phone book of equipment suppliers.
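The normalization de Vries describes amounts to mapping each site’s tag format onto one canonical equipment path. A minimal sketch follows; the naming patterns and sites are invented for illustration, since Invensys’ actual scheme is not public.

```python
import re

# Hypothetical sketch: normalizing inconsistent instrument tag names
# from different sites into one canonical 'unit/equipment/measurement'
# scheme. Both site formats below are invented.
SITE_PATTERNS = {
    'siteA': re.compile(r'(?P<unit>[A-Za-z0-9]+)-(?P<equip>[A-Za-z0-9]+)\.(?P<meas>[A-Za-z0-9]+)'),
    'siteB': re.compile(r'(?P<meas>[A-Za-z0-9]+)_(?P<equip>[A-Za-z0-9]+)_(?P<unit>[A-Za-z0-9]+)'),
}

def normalize(site, raw_tag):
    """Map a site-specific tag to a canonical equipment path."""
    m = SITE_PATTERNS[site].match(raw_tag)
    if m is None:
        raise ValueError(f'unrecognized tag {raw_tag!r} for {site}')
    return '{unit}/{equip}/{meas}'.format(**m.groupdict())

# two sites naming the same pressure transmitter differently
print(normalize('siteA', 'U100-P101.PT'))   # U100/P101/PT
print(normalize('siteB', 'PT_P101_U100'))   # U100/P101/PT -- same canonical path
```

Once every site resolves to the same path, data from 17 fields can be queried through one equipment model instead of 17 tag dictionaries.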

But that’s quite a high level of granularity.

de Vries—Yes, we’ve not yet got all the information to where it can be used to best effect. You need simulators to study what’s going on in wells and gathering systems, and for real time production allocation. During our various deployments we have accumulated a lot of information from customers’ sites. The next step is to ask, ‘what do we do about the business process?’ For Shell we have developed techniques to look for emerging patterns in data indicative of particular events like sand breakout or pressure drop. These can be addressed with appropriate remedial actions. Today it can be hard for upstream customers to take knowledge and associate it with a particular event because IT still lives in a ‘tag time value’ world.

So this is a data mining issue?

de Vries—Yes, but traditional data mining tools are not very good at time series analysis. We try to capture discrete events: when they start, when they end. The system is constantly looking for patterns in the data: when things go from good to bad, when production drops off, when gas lift parameters change, when flow from the LNG terminal changes. We organize data so that business software can understand it.
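The ‘discrete events, when they start, when they end’ idea can be sketched as a threshold detector over a tag/time/value stream. The threshold and the data below are invented for illustration; real systems match much richer patterns than a single level crossing.

```python
# Hypothetical sketch: turning a time series into discrete events,
# each with a start and an end, as de Vries describes.
def extract_events(samples, threshold):
    """Return (start, end) pairs for runs where the value stays below threshold."""
    events, start = [], None
    for t, value in samples:
        if value < threshold and start is None:
            start = t                       # event begins: value dropped
        elif value >= threshold and start is not None:
            events.append((start, t))       # event ends: value recovered
            start = None
    if start is not None:
        events.append((start, samples[-1][0]))  # still in an event at end of data
    return events

# oil rate (bbl/d) sampled hourly; a drop below 900 flags a production event
rates = [(0, 1000), (1, 980), (2, 850), (3, 820), (4, 950), (5, 700), (6, 990)]
print(extract_events(rates, 900))  # [(2, 4), (5, 6)]
```

The point of the transformation is that the output (two bounded events) is something business software can reason about, where the raw samples are not.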

Is this productized?

de Vries—It is the ArchestrA Collaborative Environment.

Does Cheniere use SAP on Sabine Pass?

Gilmore—Not SAP. But the ERP system is always key. Upstream businesses want to make enterprise finance visible to plant people and help operations make financially correct decisions. We offer real time accounting with our Dynamic Performance Measurement offering—this has been a big transformation.

de Vries—We are learning from customers. The old way of producing a report every 30 days just produces ‘ancient history,’ not actionable results. On the other hand, ‘targets’ can be counterproductive. The answer is to provide real time accounting and make people aware of how they are doing. Our tag line for the shop floor is ‘what did I make today?’.

In process control I get the impression that the data historian is both a huge focus and a barrier to information flow?

de Vries—Yes—the historian is both those things. It stores tag, time, value data. We are in the process of transforming information—not just capturing event history but providing trends and forecasts. Invensys is one of the three main providers of process historians. I say, let the historian do what it does. But use another data store to run your plant. Here you can associate measurement with derived values. But we have to recognize the strong culture within our clients that depends on the historian and Excel spreadsheets.

Gilmore—People used to complain that they didn’t have enough data about what was happening. Then came the historian, and folks suddenly had vast amounts of data and no time to look at it! So we have to help operations and business people filter and pre-process data—solving potential problems locally. It’s a kind of ‘triage.’ Like a doctor asking, ‘Is this a cold or pneumonia?’

de Vries—In the past, technology workers knew individual tag names. People got familiar with their environment. Now we have fewer, more mobile people so you can no longer assume such familiarity—there is no longer the ability to noodle around with data in the historian to understand what’s happening. Today, ‘events’ need to be turned into actions.

What data store do you use?

de Vries—Any RDBMS will do—we often use Microsoft SQL Server unless customers want something else.

Gilmore—And we have our own Industrial SQL product based on SQL Server extended into real time data.

Is the RDBMS the right tool for time variant data?

de Vries—The RDBMS is horrible for time variant stuff! But other object datastores are not ready for prime time. Customers do ask, but at the end of the day, we need to put our data into a commercial database.
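The ‘tag, time, value’ model the interviewees keep returning to maps readily onto a relational schema, which also shows why trend analysis gets awkward: every question becomes another ad hoc query over raw rows. A minimal sketch using SQLite; the schema, tag name and derived-value table are invented, not Invensys’.

```python
import sqlite3

# Hypothetical sketch: a tag/time/value historian table in a relational
# store, plus a separate table for derived values, as discussed above.
db = sqlite3.connect(':memory:')
db.execute('CREATE TABLE history (tag TEXT, t REAL, value REAL)')
db.execute('CREATE TABLE derived (tag TEXT, t REAL, kpi TEXT, value REAL)')

# ten hourly readings for one (invented) pressure tag
samples = [('PT101', float(t), 50.0 + t) for t in range(10)]
db.executemany('INSERT INTO history VALUES (?, ?, ?)', samples)

# trend query: average over a window -- workable, but slopes, events and
# resampling each need yet another query over the raw rows
(avg,) = db.execute(
    'SELECT AVG(value) FROM history WHERE tag = ? AND t BETWEEN ? AND ?',
    ('PT101', 0.0, 9.0)).fetchone()

# store the result as a derived value, associated with its measurements
db.execute('INSERT INTO derived VALUES (?, ?, ?, ?)', ('PT101', 9.0, 'avg', avg))
print(avg)  # 54.5
```

This is the sense in which the RDBMS is ‘horrible for time variant stuff’ yet remains the pragmatic choice: it works, it is commercial and supported, and the derived-value layer can live alongside the raw history.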

Gilmore—This is an evolving thing—a changing world.

de Vries—Indeed, and speaking of change, these projects present huge change management issues involving IT, technologists and the business. This can create rivalries. So it is easier to start with a white sheet of paper like Cheniere than to retrofit.

Speaking of the barrier of the Historian, do you encounter the ‘silo mentality’ when dealing with different parts of your clients’ businesses?

Gilmore—Sure. There are even completely different personalities. It takes a cowboy’s personality to bring an asset on stream and a dairyman’s to operate it. Some have tried to optimize handover by giving a project team operational responsibility for the first year of production. But this was not really effective. As Stan said, it’s a big change management issue.

Software, hardware, sales, new releases ...

Short takes this month from Intervera, Systems Evolution, ESRI, Fugro-Jason, Geomodeling, Geosoft, INT, Knowledge Systems, MetaCarta, Energy Solutions, TGS NOPEC, PGS, Schlumberger, BT, M2M Data, BP, Telenor, RECOPE, Techint, Geovariances and Tobin International.

Intervera has just announced ‘DataVera Clean’ (DVC), a data cleansing package. DVC automates the data cleansing process, monitors compliance with corporate standards and quality-assures E&P workflows in real time.

Houston-based Systems Evolution (SE) has been subcontracted by Microsoft Corp. to assist an unnamed ‘large oil and gas services company’ with an enterprise wide Microsoft Project deployment. SE founded its Enterprise Project Management Practice earlier this year to provide consultancy services on large scale projects. SE also operates an executive search division, ‘Next Hire’ targeting the oil and gas sector.

National Fuel Gas Company has selected ESRI to provide a core enterprise geographic information system (GIS) technology platform to support the company’s transmission and utility operations.

Fugro-Jason has announced RPM, a new module adding rock physics elastic modeling to its PowerLog log analysis package.

Geomodeling Technology has released SBED 2006 with new functionality for modeling small-scale geological heterogeneities that impact reservoir performance.

Geosoft’s Target for ArcGIS introduces a 3D subsurface viewer for borehole data in ESRI’s ArcView.

INT has announced GeoToolkit .NET 2.1, a class library for E&P software developers working in C#. New features include localization support, true-scale hardcopy and better performance with SEG-Y data.

Knowledge Systems has released Drillworks 11.0 and Pressworks 2.0 with enhanced capabilities for pore pressure and geomechanical analysis. The new Pressworks release introduces ‘Scout’, a browser and interface for non-specialists.

MetaCarta’s new geOdrive release offers automated search and notification, region search, 2-D analysis and visualization, and better integration with ESRI.

Energy Solutions’ PipelineStudio 2.8 release offers Excel-based reporting of simulation results and customizable report and pipeline templates. PipelineStudio now integrates with Energy Solutions’ real-time pipeline monitoring system, PipelineManager.

TGS NOPEC has announced PRIMA 8.0, a new release of its seismic data prestack interpretation and analysis package. Release highlights include a module for 2D interpretation, enhanced memory management and 64-bit support.

PGS has released its reverse time migration (RTM) solution for wave equation prestack depth migration. Hitherto, RTM has proved ‘too computationally demanding for commercial use’.

Schlumberger has awarded BT a five-year, $47 million contract for global network services. BT is to take over the running of Schlumberger’s SINet MPLS backbone which spans 18 countries. SINet will transition to BT’s global MPLS infrastructure.

Internet SCADA specialist M2M Data Corp. has released iPM, a component of its iSCADA service. iPM enables computer-based maintenance of multiple assets.

BP has signed a three year contract with Telenor Satellite Services for global broadband services over satellite. The deal will upgrade communications systems on BP’s vessels and production facilities with Telenor’s Sealink VSAT technology. BP’s fleet includes 80 deep sea vessels, offshore platforms and land-based production facilities.

RECOPE, the Costa Rican Petroleum Company, is to deploy Energy Solutions’ PipelineManager package to assure pipeline safety and environmental protection throughout its 226 km network. The project is managed by Techint S.A.

Geovariances has just released version 6.0 of its ISATIS geostatistical package which now offers a 3D Viewer, new import and export modules for CMG, Eclipse and ZMap. The Volumetrics application has been improved to give more flexibility in matching simulation outcomes and the Plurigaussian simulations have been enhanced to take petrophysical properties into account.

Tobin is offering onshore directional survey data for Texas RRC Districts 2, 3 and 4, South Louisiana Parishes and Texas Barnett shale counties. Data is available in ESRI, Oracle PPDM or ASCII formats.

Well log data management as a service

Petris has ported Recall to its PetrisWinds Now infrastructure, with storage at a secure data center.

Houston-based Petris Technology has just announced the port of the Recall well log data management package it acquired from Baker Hughes last year to its PetrisWinds Now (PWN) Software as a Service (SaaS) offering. Recall is now available on a monthly fee basis, offering experienced petrophysicists web-browser access to Recall’s functionality without upfront licensing fees.


A Recall database is also available for online storage of well data such as curves, logs, directional surveys, zones, images, cores and tops. Data may be stored on the user’s desktop or in a username- and password-protected private storage space on the Petris-hosted server, along with daily backups.

SAS 70

The server and the Recall applications are hosted at a Class A data center in Houston. The center recently received SAS 70 certification in recognition of its ‘best practice’ data center environment. SAS 70 Type II audits are considered proof of compliance with Sarbanes-Oxley Section 404 requirements for outsourced services.

OpenSpirit gateway to geoLOGIC dataset

Data vendor leverages middleware to serve PPDM-based data to end users.

Calgary-based geoLOGIC Systems and OpenSpirit Corp. are to develop an OpenSpirit data server for geoLOGIC’s data sets. Users of geoLOGIC’s Data Center (gDC) will be able to access data from the gDC using a variety of OpenSpirit and geoLOGIC tools.


geoLOGIC president David Hood said, ‘We created the gDC as an open system that uses the latest technology, including the newest version of PPDM, to give customers instant access to the most current petroleum data using the software of their choice. This collaboration with OpenSpirit means our clients will be able to access our regular and spatially enabled data sets from the gDC through the OpenSpirit Scan Utility, and access and use data through any OpenSpirit client applications’.


OpenSpirit president Dan Piette added, ‘OpenSpirit provides end users with comprehensive and flexible access to their upstream data. This project integrates geoLOGIC’s data solution with OpenSpirit-enabled data stores including the recently released OpenSpirit PPDM data store connector.’

Consortium rolls out Wheeler volume

dGB announces sequence stratigraphic interpretation system—a commercial plug-in to Open dTect.

de Groot-Bril’s (dGB) seismic sequence stratigraphic interpretation system (SSIS) was unveiled at the AAPG this month. The methodology, developed under a joint industry project (JIP) involving Statoil, BG Group, TNO and Shell, involves representing seismic data as a ‘Wheeler’ diagram—with a vertical scale of geologic time. This is used to delineate sedimentological bodies in situ and to reveal depositional hiatuses.


SSIS workflows begin with a stratigraphic interpretation. Stratigraphic time of deposition is then assigned to picked seismic horizons. The whole seismic volume is then transformed into a geologic time-flattened cube—the Wheeler display. OpendTect’s tracking algorithms are used to delineate seismic horizons, unconformities, and other sequence boundaries.
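The core of the Wheeler transform is horizon flattening: each trace is shifted so that a picked horizon, a surface of constant geologic time, becomes a flat datum. A toy sketch follows, assuming integer sample picks on a tiny section; dGB’s actual algorithm interpolates a full chronostratigraphic framework between many horizons.

```python
import numpy as np

# Hypothetical sketch of the horizon-flattening step behind a Wheeler
# display: shift each trace so its picked horizon lands on a flat datum.
def flatten(section, horizon):
    """section: (n_samples, n_traces) amplitudes; horizon: pick index per trace."""
    n_samples, n_traces = section.shape
    datum = int(horizon.max())
    flat = np.zeros((n_samples + datum, n_traces))
    for j in range(n_traces):
        shift = datum - int(horizon[j])     # move this trace's pick to the datum
        flat[shift:shift + n_samples, j] = section[:, j]
    return flat

section = np.arange(12.0).reshape(4, 3)     # tiny 4-sample, 3-trace section
horizon = np.array([1, 2, 0])               # pick sits at a different sample per trace
flat = flatten(section, horizon)
# after flattening, every trace's picked sample lies on the same row (the datum)
print(flat[2, :])  # [3. 7. 2.] -- the amplitude at each trace's horizon pick
```

Extracting a row from the flattened cube then yields a time-synchronous slice, which is what makes attribute analysis on the Wheeler volume meaningful.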

Sequence stratigraphy

Seismically derived depositional units are further categorized with model driven assignment of internal bed geometries. SSIS allows for switching from time to Wheeler domain for further insight. Time synchronous seismic horizons can be extracted from the flattened Wheeler cube for attribute analysis.

Stark Research

The SSIS plug-in is a commercial add-on to the free OpendTect infrastructure. A similar OpendTect plug-in from Stark Research was announced last month (OITJ Vol. 11 N° 3). Stark Research holds a US patent covering the Wheeler volume.

IFP’s sandbox videos now in C&C DAKS

CT-scan movies of sand and putty tectonic experiments now available in digital analog database.

While you can model anything on a computer, some situations, such as the complex transition from extension to compression tectonics in the deepwater Gulf of Mexico, are easier to re-create with analog models. These ‘sandbox’ experiments involve considerable artistry and are the subject of the ongoing FACET research program at the French Petroleum Institute (IFP). Sand and putty models are time-lapse analyzed during deformation in a medical X-ray CT scanner to create a film of the tectonic history.


Although it may be some time before the IFP’s tectonic movies appear at your local cinema, they are available now from C&C Reservoir’s Structural Analogs Module, part of C&C’s Digital Analogs Knowledge System (DAKS). The IFP/FACET consortium is funded by Shell, Petrobras and Conoco.

Divestco bags Geo-X processing house

Company acquires seismic processing boutique for C$ 11.5 million.

Calgary-based Divestco is to acquire Geo-X Systems’ seismic processing division for C$11.5 million—mostly in cash. Geo-X was founded in 1971 by Don Chamberlain, who is to retain Geo-X’s ARAM seismic recording systems division. Geo-X specializes in high resolution onshore processing, particularly R&D-oriented seismic projects. Divestco CEO Stephen Popadynetz said, ‘This is a perfect fit with our seismic acquisition business—expanding our existing products and services.’


Divestco is offering employment to all 101 employees of the processing division and has already retained top management. The unit will be led by Oliver Kuhn, who has been with Geo-X since 1984. Divestco expects the new unit to contribute around $15 million in revenue and $3 million in EBITDA. Divestco is in the process of integrating its software, services and datasets.

SPE Digital Energy 2006

Around 600 turned out for the 2006 Society of Petroleum Engineers Gulf Coast Section’s ‘Digital Energy’ conference. The show marks an ‘epiphany’ for the Petroleum Engineering community with the recognition of data management as a serious problem. A related topic is the degree to which technologies from outside oil and gas can be brought in to ‘solve’ such issues. This question is often associated with the allegation that oil and gas is a technology laggard. Here, the CIOs are skeptical. But they are having a hard time fending off advances from the likes of Microsoft and Google. In this report we summarize talks from Marathon, Chevron, Shell, Anadarko, BHP Billiton and others.

According to Marathon’s Steve Hinchman, current demand forecasts and declining production will mean a 125 million bbl/day shortfall in oil supply by 2020. The problem is not insurmountable; the reserves are there, but most are in the hands of National Oil Companies (NOCs). Historically, NOCs required investment from the International Oil Companies (IOCs), but now they have both capital and technology. IOCs need to adapt, to ‘differentiate and partner with NOCs’. Value-add relationships are critical and need to be ‘more open and transparent’.


Today’s work practices mandate a ‘single source of the truth’ and there is new focus on IT-enabled knowledge sharing. Despite this, petrotechnical and IT professionals are ‘like ships passing in the night.’ To date IT has ‘over promised and under delivered’. But potentially, the competitive advantage goes beyond the digital oilfield of the future (DOFF) to leverage combined talents, workflows and industry standards. Connectivity is the tissue that pulls it all together. In the Q&A, Hinchman admitted that for Marathon, ‘standards’ are more likely to mean ‘internal standardization on vendor applications’ than POSC or PPDM.

IT investment

Hinchman doubted that the data standards problem can be solved across the industry. Another questioner asked if the new base oil price was affecting IT investment. Hinchman opined that, ‘Investment levels target sustainable growth and are independent of the oil price. Our focus is on our workflows. IT investment will come later.’ Another questioner asked what Marathon had learned from talking to other industries about managing the ‘data deluge’. ‘So far we have talked to an army of consultants and heard a lot of consultantese. We’ve not yet seen the benchmarks we need.’


Michael Lock (Google) believes that adding metadata tags to unstructured information is a waste of time. Google’s experience with one global energy customer was that content was distributed across geoscience, HR, seismic etc. Archival involved ‘cumbersome’ manual tagging. There was no provision for cross silo search and on retrieval, relevance was ‘spotty’. Following implementation of Google’s Enterprise Search, ‘searches are up and complaints down.’

Baker Hughes

Marc Sofia described Baker Hughes’ use of commercial off the shelf software in a prototype enterprise data integration solution that federates well data across multiple legacy data stores. A ‘search broker’ captures metadata and allows for ‘requirements-driven data synchronization’.


Jim Crompton, Chevron, suggests that we have created ‘yet another silo boundary’ between the model builders and the operations people who ‘deal with reality’. This is reflected in the tools the different communities use, which range from the sophisticated (for the modeler) to the spreadsheets of the operator. Engineering is dominated by Microsoft Office technology, with project documentation in PowerPoint, collaboration via Outlook and ‘faxes still work, as a proxy for lack of connectivity’. All of which leads to unstructured data issues as email grows and data hides on ‘O:\ drives’, leading to multiple versions of documents. It’s not that ‘IT didn’t do it right,’ rather, ‘IT didn’t do it at all!’ Many digital technologies, such as SCADA, don’t even belong to IT. Another bleak truth is that ‘data management is worse than you think, it’s amazing we do business at all.’

Data data data

The data issue cropped up again in Don Paul’s (Chevron) presentation: ‘People are maxed out on data and we’ve only just started!’ For Devon Energy CIO Jerome Baudoin, data management is ‘more and more of a problem in our environment, a complex, ill defined activity.’ Despite the best efforts of PPDM, PIDX and POSC, ‘we go over and over again spinning our wheels!’ As an independent oil company, ‘we want to spend energy on what is critical to our organization’. A balance needs to be struck between data overload and data ‘righteous’ (sic). Don Paul noted that the increasing sophistication of the NOCs is impacting the majors’ traditional role of bringing technology to the game. For Alan Huffman (Fusion), the 21st century ‘will be the century of the NOC. These companies are aggressively hiring US engineers and IT people today.’ Katya Casey (BHP Billiton) offered an impassioned plea for data management: ‘Companies don’t put enough value on data. Nobody adds metadata. The discipline grew out of secretarial and drafting departments; there has been no education, training or support for data management.’

BHP Billiton

Casey presented BHP Billiton’s Technical Information Architecture, a ‘common information platform to solve business problems’. Core application selection is to be ‘workflow-driven’ and delivered on a single technical hardware platform. Components include OpenWorks, Foster Findlay & Associates, Paradigm and Petrel. Seabed and OpenSpirit also ran. BHPB’s portal, the Petroleum Professional Web Workspace, is under construction. BHPB is looking at ProSource with Seabed as a taxonomy-driven metadata repository. A major workflow analysis is underway prior to consolidation into a ‘true’ 3D environment circa 2008. BHPB is ‘coming to terms with a multi-dimensional integration process.’


Marathon

Tom Evans detailed Marathon’s in-house seismic processing effort on the Cactus prospect. Kirchhoff and wave equation migration were both used, running on Marathon’s own 1,000 CPU cluster. Re-running processing tests even one year later showed significant image enhancements due to code improvements. ‘As geophysicists, we shouldn’t be worrying about the hardware but the reality is that this is a highly compute-intensive, interactive process.’ Stochastic simulation is also used, with around 100 runs per evaluation, performed on a 256 node SGI box. The engineers liked it so much that ‘they kicked the geoscientists off it and made them buy another one’.


Shared earth modeling

John Nieto (Anadarko) contrasted the ‘linear’ approach to problem solving with ‘shared earth’ modeling (SEM). This integrates static, in-place reserves with dynamic, fluid flow modeling. Log-derived facies populate the geological model. If the history match doesn’t work, ‘there are many things you can alter’. The SEM is driven by software like GoCad, Roxar and Petrel. These tools let interpreters see seismic, logs and facies maps all together in one environment.


Shell

Linda Dodge (Shell) stated that poor data management was at the root of the Piper Alpha disaster and also caused the loss of one of Shell’s oilfields, which was killed by water injection. Dodge traced the move from the well-maintained, central datastores of the 1980s, with their ‘costly bureaucracy’. In 1995, Shell decentralized and allowed for local customization. But this caused ‘data loss and confusion’. Shell now has a data management community with virtual teams, data quality metrics, global standards and a ‘higher profile for data management’. Shell is involved with POSC and sees standards like the Global UWI project and PRODML as non-competitive.


Microsoft

Oil and gas is a poor performer compared with other verticals according to Microsoft’s Mike Brulé. Disparate systems lead to ‘cumbersome’ navigation in the data environment. Brulé suggests that the DAMA International organization offers ideas as to how enterprise business intelligence (BI) could help. Our complex, ‘highly engineered’ industry lags others in data management and enterprise BI analytics. Some propose services-oriented architecture (SOA) as the new silver bullet. But for Brulé, SOA ‘is orthogonal, it says nothing about data management, quality and system performance.’ Some argue that our data is different from other industries. Brulé disagrees, ‘Data use is axiomatic across all industries’. But Brulé did acknowledge that oil and gas has to cope with an application ‘tug of war,’ between Maximo, DIMS, SAP etc. ‘It’s real problematic’.

This article has been taken from a 10 page illustrated report produced as part of The Data Room’s Technology Watch Research Service. For more information on this service please email tw@oilit.com.

Volant, Intervera team on data quality

EnerConnect Transfer promises web services data integration and data QC on-the-fly.

Houston-based Volant Solutions has just announced EnerConnect Transfer (ECT), a data transfer solution for geoscience data managers and application users. ECT provides access to geoscience data held in internal proprietary systems and external commercial data stores. ECT allows filtered data to be imported into project databases such as OpenWorks or Petra. Transfers can be managed with the native ECT GUI or from supported third-party commercial GIS systems.


Volant CEO Scott Schneider said, ‘ECT finally delivers on what has been promised by integration vendors for a long time: an open, scalable and reliable software platform that seamlessly integrates your geoscience data environment. Our Amazon-style interface provides a familiar user experience: online shopping. Instead of searching for books and DVDs, users search for Well and Log data, place this data in their shopping cart, and verify pricing for purchased vendor data.’

Web services

EnerConnect employs a Service-Oriented Architecture (SOA) to deliver a modular software platform composed of a business workflow engine, application adapters and data transformation services. The component architecture lets users address specific business requirements without modifying existing workflows.


Volant’s ECT now also embeds real time data quality capabilities from Intervera Data Solutions of Calgary. Volant will leverage the DataVera data quality management suite to provide on-the-fly quality control of data transiting through EnerConnect. DataVera’s repository contains thousands of industry data quality checks and solutions which identify and clean up suspect data.
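The rule-repository idea can be sketched in a few lines: records flow through the transfer, each is checked against a library of rules, and failures are quarantined for cleanup. This is a hypothetical illustration, not DataVera’s API; the rule names, record fields and quarantine logic are all assumptions.

```python
# Hypothetical sketch of 'on-the-fly' data quality checking during a
# well data transfer. Rule names and record fields are illustrative.

from dataclasses import dataclass
from typing import Callable

@dataclass
class QualityRule:
    name: str
    check: Callable[[dict], bool]  # returns True when the record passes

# A couple of illustrative rules; real suites contain thousands.
RULES = [
    QualityRule("uwi_present", lambda r: bool(r.get("uwi"))),
    QualityRule("td_positive", lambda r: r.get("total_depth", 0) > 0),
    QualityRule("lat_in_range", lambda r: -90 <= r.get("latitude", 999) <= 90),
]

def transfer(records, sink, rules=RULES):
    """Move records to the sink, quarantining any that fail a rule."""
    quarantined = []
    for rec in records:
        failures = [rule.name for rule in rules if not rule.check(rec)]
        if failures:
            quarantined.append((rec, failures))  # suspect data held for cleanup
        else:
            sink.append(rec)
    return quarantined

wells = [
    {"uwi": "100/01-01", "total_depth": 2100.0, "latitude": 56.4},
    {"uwi": "", "total_depth": -1.0, "latitude": 56.5},  # suspect record
]
project_db = []
bad = transfer(wells, project_db)
print(len(project_db), len(bad))  # 1 good record transferred, 1 quarantined
```

The point of checking in-flight rather than after loading is that suspect data never reaches the project database in the first place.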

Folks, facts and orgs…

This month’s news from Oil IT Journal, PNEC, CygNet, Energy Solutions, ExxonMobil, Schlumberger, ESRI, Fugro-Jason, Geotrace, Sun Microsystems, Techsia, Roxar and Petrotech.

Oil IT Journal’s editor, Neil McNaughton, received one of the first awards from the 10th PNEC Data and Information Integration Conference in Houston this month. The other went to Ellen Hoveland, Anadarko.

CygNet Software has appointed Steve Slezak as Senior Account Executive in its Houston office. Slezak was previously with Schlumberger and CASE Services.

Former U.S. Secretary of Energy, Spencer Abraham, is to advise Energy Solutions International, to ‘increase awareness of the environmental impact of pipeline leak detection and prevention.’

ExxonMobil plans to start up some twenty new projects over the next three years. In 2005 the company produced a blistering 31% return on capital employed.

Schlumberger is to acquire Baker Hughes’ 30% minority interest in WesternGeco for $2.4 billion cash.

ESRI has appointed Brian Boulmay as Petroleum Industry Solutions manager. Boulmay was previously Geo-information and GIS team leader with Shell’s Houston unit and has served on the ESRI Petroleum User Group (PUG) steering committee.

Brad Woods is to head Fugro-Jason’s new Reservoir Products Division. The new unit will develop Fugro-Jason’s PC-based products into ‘an integrated suite of applications spanning petrophysics through to reservoir characterization and simulation.’

Gary Yu has been promoted to General Manager of Reservoir Integration with Geotrace.

Sun Microsystems has appointed Greg Hess as Director, Communications and Sales Operations of its Global Energy Division. Hess was previously with Kelman and Quorum Business Solutions.

Karine Schepers is to head-up French software house Techsia’s new Houston office.

Paul Khuri has been appointed as Roxar’s new Americas Regional Manager for its flow measurement division.

Marian Nymark Melle is the new MD of the Norwegian Prototech R&D organization.

Invensys equips Habshan gas plant

Invensys is to supply SCADA control systems on the 180 well Abu Dhabi GASCO gas expansion.

Eastern Bechtel, prime contractor on Abu Dhabi Gas Industries’ (GASCO) Habshan gas plant expansion project, OGD-III, has awarded a major SCADA contract to Invensys Process Systems. The system is built around a Wonderware kernel configured for oil and gas applications. Foxboro Remote Terminal Units will be located at some 180 existing and new gas production wells, remote manifold stations and injectors. Approximately 10,000 I/O points throughout Habshan will be controlled or monitored by the new system, most on wells and remote manifolds.


The control room employs redundant master SCADA servers, a real time database and an InSQL historian. The servers are connected by redundant, high-speed Ethernet. The control room is linked to monitoring and control sites, including the nearby Bab plant.


The changeover to the new SCADA system will be implemented in parallel with the existing SCADA system during ongoing production operations at the Habshan processing facilities. Residue gas from processing is re-injected into the reservoir for pressure maintenance.

Seismic interpretation—most get it wrong!

ODIN Project investigates ‘concept uncertainty,’ finds interpreters are major source of risk.

Researcher Clare Bond (University of Glasgow), in association with Midland Valley and GX Technology, has been inviting tradeshow attendees to take part in a test of their seismic interpretation acumen.


ODIN picks up on the current vogue for uncertainty and risk management by trying to quantify the risk of an interpreter getting it wrong! As Bond notes, ‘If you put geoscientists in front of the same data, you get as many interpretations as interpreters! These derive from different assumptions, bias and experience. The question is, does this ‘concept uncertainty’ have a greater impact on prediction than models built with a Monte Carlo-type approach?’

Guinea pigs

The ODIN methodology started with the creation of a 2D geological model built with Midland Valley’s 2D Move package. A synthetic seismic section was then created by GX Technology. The ODIN guinea pigs—seismic interpreters bushwhacked at various recent tradeshows—were invited to interpret the ODIN ‘data’ and fill out a questionnaire.

200 geoscientists

The results from some 200 interpretations showed that interpreters were frequently biased by their ‘prior knowledge’. Interpreters tended to shoehorn their interpretations into the tectonic regime with which they were most familiar. Those working in extensional settings saw extension. Thrust belt workers saw thrusting. In fact the section showed tectonic inversion—a fact only spotted by 43% of the sample.


One encouraging result from the survey was that experienced interpreters fared better than newbies. Especially if their experience came from an extensional setting! To remove such bias, the ODIN project is to expand into other tectonic settings. Future results will be evaluated for statistical bias and will target specific groups of interpreters.

Landmark, EMC—E&P lifecycle data management

Document management from EMC/Documentum is now available from DecisionSpace IM Solution.

Landmark is expanding its DecisionSpace information management solution with the addition of document management and high-end networked storage from EMC Corp. The companies have signed a global agreement to jointly develop information lifecycle management (ILM) solutions tuned to upstream oil and gas companies. The new solutions address information asset creation, storage, indexing, cataloging, data quality measurement and workflow/audit capture.


Landmark will integrate EMC’s Documentum content management package with its DecisionSpace information management and infrastructure solutions. Landmark will also resell EMC’s portfolio of networked storage systems, the Centera content addressed storage system and the ControlCenter family of storage management software.


Halliburton VP Jonathan Lewis said, ‘This alliance will provide knowledge workers with an IM solution that spans structured and unstructured data management across the enterprise.’


EMC senior VP Mitch Breen said, ‘Oil and gas is a very information-intensive industry with Petabyte data volumes. We expect these volumes to increase as companies use more sophisticated digital technologies. By leveraging Landmark’s industry expertise we are able to increase the value of EMC solutions to E&P companies.’

EOS opens B2B exchange in Qatar

New business to business exchange set to enable Qatari companies to expand internationally.

Following the announcement of ‘Energy City’ (see last month’s Oil IT Journal), the state of Qatar is now the site of an extension of EOS’ Middle East business to business (B2B) exchange. EOS Qatar is a joint venture between UK-based EOS Technologies and a group of Qatari interests. EOS Qatar aims to foster the growth of business in Qatar and help local organizations evolve internationally.

Al Thani

EOS Qatar investor Sheikh Saoud Bin Abdulla Jabor Al Thani said, ‘Qatar is one of the region’s fastest growing economies, and is forecast to become one of the world’s largest natural gas producers. This growth needs to be managed carefully and if we are to move into the international business arena, local corporations must identify ways to manage their internal and external processes in a timely and cost effective manner.’


EOS already supports corporate transitions to e-procurement in the Middle East. The EOS Exchange offers a transparent pricing structure with no sign-up fees. Companies pay for what they use and can transition to full e-commerce at their own pace.


EOS CEO Alan Livingston added, ‘With 8,000 members in the Middle East alone, and projecting over 20,000 by the end of 2006, Qatar is a key country for us. EOS Qatar can now offer Qatari companies an easy to use e-procurement system that will open doors to international trade. Our exchange will help Qatar’s public and private sectors make Qatar one of the most e-enabled countries in the region.’


In 2004, EOS announced Enerdox, a European e-business document exchange. EOS also runs the Document Delivery Exchange (DDE), an oil and gas e-commerce facility connecting some 2,500 organizations. The EOS solution that powers Enerdox is used by suppliers including Halliburton and Schlumberger. EOS’ oil and gas anchor client is Petroleum Development Oman (34% owned by Shell).

UK call for production optimization

R&D organization seeks projects that enhance brown field recovery and improve data use.

The UK-based Industry Technology Facilitator (ITF), owned by 13 major oil companies, acts as a facilitator for R&D related to new technologies to maximize recovery from mature basins.


The current ITF call for proposals is seeking innovative technologies with ‘a demonstrable path to commercialization’. One RFI of interest to the upstream IT community covers real-time field management (RTFM) and involves integrated RTFM systems with predictive capabilities for events such as loss-of-well and topside failures. The RFI also covers numerical tools for well bore and process facility fluid flow modeling. The RFI notes the use of geoscience software to make predictions from massive data sets and asks whether this approach can be transferred to real-time field management. Another concern is how better use can be made of the large amount of data currently collected. Proposals should offer low cost, flexible solutions that can be retrofitted to existing (brown) fields.

£23.5 million

Since 1999, ITF has raised over £23.5 million in direct joint industry project support, with a further £20 million of equity investment capital being directly linked to ITF projects and over £20 million in trials funding. According to the RFI, ‘this is an excellent opportunity to gain a wide audience in seeking support for your technology.’ There are no restrictions on proposers’ country of origin. The RFI closes on the 24th of May.

Badleys’ successful Windows SFU port

Straightforward port of multi-million line code base leverages Microsoft Services for Unix.

Badley Geoscience (Badleys) reports a painless port of its structural modeling package from Unix to Windows using Microsoft’s Services For Unix (SFU). Badleys’ code, written in C/C++ with a Motif/OpenGL-based GUI, comprises millions of lines and around 40 shell scripts.


Several porting options were considered, but most involved costly run-time licensing or reduced functionality. In contrast, SFU offered a cost-effective approach and a compliant Windows GUI—including third-party OpenGL libraries, GCC, an NFS client and shell scripting.


Badleys’ Andy Foster said, ‘Our investment on the SFU port paid off in under six months. Our engineering team migrated millions of lines of source code in just one week using this built-in Windows package.’

Shell’s FieldWare—now you see it, now you don’t!

A ‘people shortage’ is compromising commercialization of Shell’s well surveillance package.

Shell’s FieldWare Production Universe (FW) real time well surveillance was a key enabler of Shell’s flagship e-field development, Champion West, in Brunei. Oil IT Journal spoke to Shell Global Solutions’ Ron Cramer and Jennifer Duhon about FieldWare which was promoted by Shell last year as a commercial offering.


Ron Cramer believes that the oil industry is a conservative business that has been slow to adopt digital oilfield and ‘smart’ field techniques, although Shell has been interested in the concept for 30 years or so. FW is a suite of applications, much of which is commissioned or bought in and configured from commercial ‘commodity’ packages. ‘Shell is an oil company, not a software developer.’ FW’s scope is huge—the product’s strap line is, ‘from well bore to point of sale.’

Well test

One critical FW application is in emulating costly multiphase flow meters. At a cost of around $200k installed, these are not really an option for a 1,000 well field. Moreover such equipment requires considerable tender loving care (TLC), something the oil industry is not good at! The normal industry workaround is the monthly well test. But this assumes that things stay the same in between tests. As Cramer points out, ‘We know this is not true. Things happen when we aren’t looking at the well. But instead we plug rubbish into the hydrocarbon accounting system because we can’t monitor the system continuously’. For Shell, this was identified as a serious barrier to e-field enablement—as well as a serious reporting problem. ‘You can’t manage what you can’t measure.’ FW works by monitoring available data such as well head pressure and performing real time mathematical modeling to figure out what the well is doing. FW’s ‘data driven’ models are calibrated with a special type of well test. Well behavior is then obtained by continuous measurement from SCADA systems. Cramer concluded that 2006 would be ‘prime time’ for FW. But there is a twist to the story.
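The ‘data driven’ approach Cramer describes can be illustrated with a toy example: calibrate a simple model from a handful of well tests, then estimate rates continuously from routine SCADA pressure readings. The linear relationship, variable names and numbers below are illustrative assumptions; Shell’s actual FieldWare models are proprietary.

```python
# Toy 'virtual meter': fit rate = a * drawdown + b from a few well tests,
# then use the fitted model on continuous SCADA drawdown readings.

def calibrate(tests):
    """Least-squares fit of rate = a * delta_p + b from well-test pairs."""
    n = len(tests)
    sx = sum(dp for dp, _ in tests)
    sy = sum(q for _, q in tests)
    sxx = sum(dp * dp for dp, _ in tests)
    sxy = sum(dp * q for dp, q in tests)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

# Calibration well tests: (drawdown in bar, measured rate in m3/d)
tests = [(10.0, 480.0), (15.0, 730.0), (20.0, 980.0)]
a, b = calibrate(tests)

# Continuous rate estimates from SCADA drawdown readings between tests
for dp in (12.0, 18.0):
    print(round(a * dp + b, 1))
```

The real gain over the monthly well test is that the estimate tracks the well between tests, instead of assuming nothing has changed.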


Duhon explained that Shell Global Solutions (SGS), the technology/R&D engineering branch of Shell, saw FW as a commercial opportunity that could be sold outside of Shell. Hence the putative commercialization of FieldWare last year. Subsequently, in a booming oil services market, SGS has suffered the same ‘people shortage’ as other service companies, leading to a re-evaluation of Shell’s position on FieldWare’s commercialization.

MySQL powers data logger

Datalog’s ANAX 500 logging system runs QNX real time OS.

Calgary-based Datalog has just released the ANAX 500, a new data collection system for mud logging units. The ANAX 500 supports offshore operations with high speed acquisition and data management for drilling optimization and hydraulic analysis. The unit handles TVD, multiple wells and geoscience logging. Security is provided by dual recording to twin hard drives and a redundant CPU.


The secure unit leverages open source MySQL databases to store logging, BHA, drill string, pipe tally and directional survey data. These databases allow for data mining of unusual events and comparison with other wells for optimization.
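The cross-well ‘data mining’ mentioned here can be sketched with a small query. SQLite stands in for the logger’s MySQL store, and the schema, values and anomaly threshold are illustrative assumptions, not Datalog’s actual design.

```python
# Sketch of flagging 'unusual events' across wells with a SQL query.
# SQLite is used as a stand-in for a MySQL logging database.

import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE log (well TEXT, depth REAL, rop REAL)")
con.executemany(
    "INSERT INTO log VALUES (?, ?, ?)",
    [("A-1", 1000, 25.0), ("A-1", 1010, 26.0), ("A-1", 1020, 80.0),  # drilling break
     ("B-2", 1000, 24.0), ("B-2", 1010, 23.5)],
)

# Flag samples more than 1.5x the well's average ROP - a crude anomaly test
rows = con.execute("""
    SELECT l.well, l.depth, l.rop
    FROM log l
    JOIN (SELECT well, AVG(rop) AS avg_rop FROM log GROUP BY well) a
      ON a.well = l.well
    WHERE l.rop > 1.5 * a.avg_rop
""").fetchall()
print(rows)
```

Comparing each sample against its own well’s statistics is what makes the same query reusable across offset wells.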


The ANAX 500 adds enhanced data compilation functionality to Datalog’s WellWizard remote collaboration package, which provides offsite personnel with secure access to the same data as seen at the wellsite. Datalog develops its wellsite server software on QNX Software’s Neutrino real time operating system.

Enterprise data visibility and control

Invensys’ InFusion promises ‘unified, real-time control, information, and application environment.’

Invensys Process Systems has just announced ‘InFusion,’ a new ‘unified, real-time control, information, and application environment for the enterprise’. InFusion bundles Invensys process control solutions with enterprise information and integration technologies from Microsoft and SAP.


Invensys president Mike Caliel said, ‘By combining capabilities from across Invensys into a unified architecture, we have realized a step change in the use of open technologies and standards. InFusion breaks down stubborn technical and organizational barriers to automation, while preserving clients’ existing investment.’


InFusion marks the unification of Invensys process control solutions and its Wonderware HMI solutions into the Microsoft .NET-based ArchestrA solution. This delivers a standardized plant object model unifying plant-wide automation systems with business-centric information systems—regardless of who made them or when they were made.

Real time

InFusion promises unified, real-time visibility of both plant and business to help align overall plant performance with business objectives. For more on Invensys’ vision for plant to business integration and its implications for the digital oilfield see our interview with Stan de Vries and John Golmore on page 3 of this issue.


The InFusion Collaboration Wall, a new concept in human interface, can also be used to provide plant operators, maintenance technicians, engineers, and managers with a shared view of process control, maintenance, performance, and business application displays to encourage and facilitate creative collaboration.


InFusion uses key technologies and standards such as Microsoft .NET and BizTalk Server 2004, SAP NetWeaver, ISA S95 (for manufacturing-to-enterprise integration), MIMOSA (for maintenance-to-enterprise integration), and OPC (for real-time connectivity).

Open O&M

InFusion also represents the first major implementation of Open O&M (Operations & Maintenance), the industry-standard convergence of OPC, ISA S95 and MIMOSA. This approach eliminates the need for conventional point-to-point solutions that are costly to implement and maintain and inherently inflexible, and helps ensure that the right information is delivered to the right people, at the right time, in a meaningful context.

AssetBank evaluates undeveloped assets

New package from IHS Energy bundles field database, fiscal terms and economics.

AssetBank, a new component of IHS’ Forecaster, provides valuations of undeveloped upstream assets. AssetBank is an interactive database of development plans and valuations for some 750 undeveloped assets worldwide. AssetBank embeds IHS’ QUE$TOR cost estimation engine and includes location, reserves, reservoir, joint venture partner and contract information. The package also offers ‘conceptual’ field development plans and economic evaluations with editable assumptions and optional query tools.


Part of the IHS Forecaster family, AssetBank delivers a bottom-up assessment based on an asset’s production, fiscal and cost profiles. The package includes 270 fiscal regimes, optimized portfolio scheduling to meet user defined targets such as production, costs or income and roll-up at field, country or company level.
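The kind of bottom-up, editable-assumption economics described above can be sketched in a few lines: discount a field’s annual cash flow under simple fiscal terms. The profiles, royalty, tax and discount rates below are toy assumptions for illustration, not IHS data or the QUE$TOR methodology.

```python
# Toy bottom-up asset valuation: NPV of annual net cash flow under
# simple (hypothetical) fiscal terms.

def npv(production, price, opex, capex, royalty=0.10, tax=0.30, rate=0.10):
    """NPV of annual cash flows: revenue less royalty, costs and tax."""
    value = 0.0
    for year, (q, cap) in enumerate(zip(production, capex)):
        revenue = q * price * (1 - royalty)
        taxable = revenue - opex - cap
        cash = taxable - max(taxable, 0.0) * tax  # tax only on positive income
        value += cash / (1 + rate) ** year
    return value

# Toy 4-year profile: production (kbbl/yr), capex ($k/yr);
# flat $60 price and $500k/yr opex
production = [0, 100, 90, 70]
capex = [3000, 500, 0, 0]
print(round(npv(production, 60.0, 500.0, capex), 1))
```

Because every assumption is a plain parameter, re-running the valuation under a different fiscal regime or price deck is a one-line change, which is the point of an editable-assumption package.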

Absoft takes SAP savvy downstream

Aberdeen software house to leverage upstream experience with refiners and terminal operators.

Absoft, the Aberdeen-based SAP consultancy, is taking its upstream SAP implementation know-how downstream, targeting oil and gas logistics with integrated business processes spanning refineries, terminals, wholesalers and distributors. Absoft supports SAP Oil and Gas Secondary Distribution (OGSD) and Open Terminal Administration System (OpenTAS), where Absoft has teamed with German software house Implico.

Data capture

OGSD allows companies to integrate back office, supply chain and customer service operations and to optimize inventory management and customer service while reducing transportation expenses and operating costs. OpenTAS incorporates stock movement, reconciliation and accounting processes. OpenTAS integrates with both SAP and other ERP systems as well as data capture systems.


Absoft MD Ian Mechie said, ‘Absoft has seen significant growth in demand for its combined SAP skills and oil and gas business understanding. The downstream market is a continually evolving business operating with tight margins and therefore demands optimised logistical processes and effective cost control. Extending our portfolio of SAP solutions and partnering with Implico to address these demands in the UK is a natural fit for Absoft.’


Both SAP packages are already used in continental Europe by a growing number of organisations. The move into the downstream market will require substantial investment from Absoft. Mechie expects the rewards to be very worthwhile over a 4-5 year period during which time he sees potential for the downstream business to double Absoft’s turnover and staff numbers.

Landmark, Pavilion team on DOFF

The digital oilfield of the future sees integration of DecisionSpace with model-based controllers.

Halliburton has announced an agreement with Austin-based Pavilion Technologies, embedding Pavilion8 with Landmark’s DecisionSpace Production (DSP) framework. Pavilion’s iField Perfecter technology was unveiled at last year’s IQPC’s Oilfield Automation Conference in Houston (Oil ITJ Vol 10 N° 3).


DSP offers a real time, integrated model of the reservoir, wells, gathering network and production facilities. Pavilion’s iField Perfecter uses an ‘objective function’ to maximize field value without violating operating constraints.


Halliburton VP Jonathan Lewis said, ‘This technology resolves one of the fundamental problems faced by the industry in defining the digital oilfield of the future and overcomes the limitations of traditional modeling with sub-surface to surface interfaces. By overcoming the constraints and interdependencies of traditional full physics simulators, we enable dynamic interpretation of multiple scenarios and actual conditions in real time making the vision of a real time integrated asset model a reality.’


Pavilion uses a hybrid modeling approach using neural networks, process data and physics in model development. Pavilion8 leverages Service-Oriented Architecture (SOA), enabling seamless integration with ERP applications, data warehouses, historians, and distributed control systems from a variety of suppliers. A J2EE-compliant web service interface provides portability across Microsoft and UNIX operating systems. Pavilion’s downstream clients include BP and Total.
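The hybrid idea, combining first-principles physics with data-driven models, can be illustrated with a minimal sketch: a physics model supplies a baseline prediction and a tiny learned correction absorbs the mismatch with process data. The model form, gains and training loop here are illustrative assumptions; Pavilion8’s actual formulation is proprietary.

```python
# Minimal hybrid model sketch: physics baseline plus a one-parameter
# data-driven correction trained on process data by gradient descent.

def physics_model(u):
    """Simplified first-principles estimate (e.g. valve opening -> flow)."""
    return 2.0 * u

def fit_residual(data, lr=0.01, epochs=500):
    """Learn a correction gain w so that physics + w*u matches the data."""
    w = 0.0
    for _ in range(epochs):
        for u, y in data:
            err = (physics_model(u) + w * u) - y
            w -= lr * err * u  # gradient step on squared error
    return w

# Process data where the true plant gain is 2.5, not the assumed 2.0
data = [(1.0, 2.5), (2.0, 5.0), (3.0, 7.5)]
w = fit_residual(data)
print(round(physics_model(2.0) + w * 2.0, 2))
```

The appeal of the hybrid split is that the physics term keeps predictions sensible where data is sparse, while the learned term corrects systematic model error where data is plentiful.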

Model-based controller gets cash injection

KBC Private Equity takes 60% of Shell’s FieldWare solution partner, IPCOS.

Following the award of a $5.6 million contract for the development of Shell Global Solutions’ (SGS) FieldWare product suite, Integrated Production Control Systems (IPCOS) has a new investor, KBC Private Equity (KBC). KBC has taken a 60% stake in the Leuven, Belgium-based developer of automated process control solutions. IPCOS’ specialty is model predictive control technology as described in our FieldWare article in this issue.

Leuven University

The other 40% of IPCOS equity is held by management, staff and the Catholic University of Leuven from which the company was spun-out in 1995. The partnership with KBC Private Equity reinforces IPCOS’ financial position and will enable the company to expand into new market segments.


IPCOS MD Peter Van Overschee, said, ‘We recognize KBC as a pro-active financial partner—a perfect fit with our corporate culture. The inclusion of KBC in our management structures will provide IPCOS with the drive to continue growing our high-tech solutions around the world.’ IPCOS partners with various R&D organizations and commercial partners including TNO and Protomation (Netherlands), Uhde (Germany), Naizak (Saudi Arabia), Siemens (Germany) and PSE (England). Speaking of the SGS deal, Overschee added, ‘SGS’ vision of data driven monitoring and optimization solutions for the oil and gas business, combined with IPCOS’ expertise in algorithm and software development, has led to extremely powerful solutions.’

Halliburton’s Real Time Decision Center

New demonstrator consolidates real time centers and visualization rooms across the enterprise.

Halliburton has announced a ‘Real Time Decision Center’ (RTDC) located at its Houston Digital and Consulting Solutions (DCS) corporate offices. The RTDC brings together the functionality of previous real time centers and visualization rooms located at operator’s units around the world. The RTDC is a collaborative environment for exploration, drilling, production, asset management and finance—and soon reservoir engineering.


DCS president Peter Bernard said, ‘We’ve designed the RTDC to be the most fully-used, multi-functional center at a customer’s site—a work space that will increase productivity and allow them to operate more profitably.’ The RTDC incorporates real-time data feeds, volumetric applications, video streams from rig-based cameras and global videoconferencing. The Houston unit is designed as a technology demonstrator where clients can ‘test drive’ the RTDC.


The demonstrator is built around an SGI Prism capable of handling 3TB datasets and a Sony SXRD digital projector with 8.3 megapixel resolution. A ‘collaboration room’ features a CyViz Visualization System and HP workstations. Infrastructure is a NetApp unified storage system. The whole enchilada of Halliburton DCS applications can be accessed from the RTDC including geotechnical, reservoir management, drilling and completions and production optimization. Information management and interpretation packages include GeoProbe, Well Seismic Fusion, and DecisionSpace Nexus. If that ain’t enough, the GeoGraphix suite of applications and Sperry’s drilling and completions packages are also available.


Halliburton is offering the RTDC as a customizable, turnkey solution that can be tuned to client workflows. All users can access the same data sources offering the possibility of ‘connecting expert resources with each other and overseeing an entire asset portfolio from a central location.’

© 1996-2024 The Data Room SARL. All rights reserved.