May 2004


SIS bags Decision Team

Schlumberger Information Solutions has acquired production optimization specialists Decision Team. Decision Team’s Decide! product applies artificial intelligence to oilfield data mining.

Schlumberger Information Solutions (SIS) has acquired the assets of Decision Team, an oil and gas software and consulting services firm based in Baden, Austria.

Decide!

Decision Team’s flagship ‘Decide!’ software provides ‘intelligent’ reservoir surveillance and production optimization. Decide captures, analyzes, conditions and transforms historical and real-time production data into ‘actionable operational decisions’.

Stundner

Decision Team MD Michael Stundner said, ‘Production engineers can leverage this immense volume of data while focusing on well and field-level problems. We look forward to integrating Decide with Schlumberger’s suite of production software to enable production optimization workflows such as simulation history matching.’

Goode

SIS president Peter Goode said, ‘The combination of SIS and Decision Team will provide a comprehensive set of petroleum engineering workflows for real-time production optimization and proactive reservoir management. Decide will be a catalyst for enhancing production and augments our real-time capabilities.’ Schlumberger told Oil IT Journal that Decision Team personnel will continue work on Decide within SIS.

Data volumes

Huge volumes of operational data present a challenge for today’s reservoir and production engineers. Decide transforms raw data into pertinent information and offers notification systems and ranking lists of underperforming wells. Automated event detection replaces routine field surveillance, resulting in ‘significant time saving’.

AI

Decide applies artificial intelligence (AI) decision analytics to reservoir and production engineering, generating usable information from such large volumes of data. Data mining techniques available include self-organizing maps, multiple linear regression and neural nets. This analytical data mining supports diagnostics and predictive modeling for activities such as optimizing field injection-production ratio, artificial lift performance and smart well control.
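
By way of illustration only, the regression flavor of this kind of surveillance can be sketched in a few lines of Python. The well count, operating parameters and coefficients below are synthetic and have nothing to do with Decide's actual models.

```python
# Illustrative sketch only - not Decide's code. Fits a multiple linear
# regression of oil rate against hypothetical operating parameters and
# flags wells producing well below their predicted rate.
import numpy as np

rng = np.random.default_rng(0)
n_wells = 200

# Hypothetical surveillance data: choke size (in/64), tubing head pressure
# (psi) and water cut (fraction) versus measured oil rate (bbl/d).
choke = rng.uniform(16, 64, n_wells)
thp = rng.uniform(200, 1200, n_wells)
wcut = rng.uniform(0.0, 0.8, n_wells)
rate = 40 * choke - 0.5 * thp - 900 * wcut + rng.normal(0, 150, n_wells)

# Least-squares fit: rate ~ b0 + b1*choke + b2*thp + b3*wcut
X = np.column_stack([np.ones(n_wells), choke, thp, wcut])
beta, *_ = np.linalg.lstsq(X, rate, rcond=None)
predicted = X @ beta

# Rank wells by shortfall against the model - a crude 'underperformer' list
shortfall = predicted - rate
worst = np.argsort(shortfall)[-5:][::-1]
print("Largest shortfalls (well index, bbl/d):",
      [(int(i), round(float(shortfall[i]), 1)) for i in worst])
```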

Read the book!

For more on the Decision Team approach see our review of the book ‘Oil and Gas Data Mining’ in Oil ITJ Vol. 9 N° 2.


Weatherford buys EPS

Weatherford, N° 4 in the service sector, has bought into the software business by acquiring asset modeling specialists Edinburgh Petroleum Services.

Weatherford International has acquired Edinburgh Petroleum Services (EPS), a move that further enhances Weatherford’s leading position in Production Automation and Optimization. EPS’ asset management software optimizes design of networks of wells and production facilities. EPS also offers well test analysis, material balance and well performance modeling tools, and has a strong reputation for consultancy and training.

Mehta

Dharmesh Mehta, head of optimization with Weatherford said, ‘We have over 40,000 wells around the world using automatic control. EPS allows us to extend this dominant position in well optimization into complete asset optimization.’

Ormerod

EPS MD Laurence Ormerod added, ‘Weatherford’s strength in all aspects of artificial lift systems, including real-time data acquisition and control, is an excellent match for our optimization skills.’

5 point plan

The EPS acquisition completes Weatherford’s ‘five point plan’ for production optimization—from completion hardware and sensing, artificial lift sensing and software, to field optimization.


Report from the data management frontline

Oil IT Journal Editor Neil McNaughton reports from the 8th PNEC Data Integration Conference. A quick glance at the program might lead one to think that little has changed since the PNEC started in 1996. But things are changing as the supermajors put serious money into big data cleanup projects.

There are two kinds of data managers. Those that are ‘just doing it’ and those that are still waiting for a silver bullet to ‘do it’ for them. This really is my take-home from the excellent PNEC Petroleum Data Integration* conference held in Houston this month. Our complete report on the 8th PNEC will appear in next month’s Oil IT Journal—and of course as part of The Data Room’s extended Technology Watch Report service. But I thought that you might like a preview in the form of some thoughts on where data management is today.

De Gaulle

I believe it was the good old Général De Gaulle who said ‘plus ça change, plus c’est la même chose’. Indeed it is easy for regular attendees at the PNEC, observing a certain sameness in the debates, to conclude that nothing has changed, that we are confronted with the same old problems of expanding data volumes, poorly applied rules and procedures for naming and capturing data and lack of funding. A couple of years back, a variety of ‘solutions’ were suggested—usually combining outsourcing with re-engineering the workflow. Such solutions tended towards a ‘production line’ approach: ‘Taylorism’ applied to managing the upstream workflow.

Taylorism

Frederick Taylor—the original management guru—wrote his ‘Principles of Scientific Management’ in 1911. Taylor advocated** developing a ‘science’ for every job, including ‘rules, motion, standardized work implements, and proper working conditions’. With great prescience, Taylor also advised ‘selecting workers with the right abilities for the job, training them and offering proper incentives and support’.

Re-engineering

Such notions were central to industry for the best part of the last century—from Henry Ford’s production lines to W. Edwards Deming’s quality management and maybe even to our upstream workflow re-engineers. But the ‘production line’ approach implies a considerable degree of stability in work processes. There is no point in retooling and training everyone unless you are going to be manufacturing some product for a considerable time. Likewise, there is no point establishing a set of data management procedures if your data sources are going to change—or if new technology is going to come along and change the way you work.

Evolving workflow

This is the problem of applying Taylorism to a moving target. And upstream ‘targets’ have shifted considerably in the last few years—with much more post stack data online, horizontal wells with ghastly data management issues, multi-z and image logs—and with time lapse and four component data on the horizon. All of which is set in a context of exponential growth in data volumes.

Red herring

An illuminating discussion followed Yogi Schultz’s talk at the PNEC—when Ian Morison of the Information Store questioned the notion that data volumes are the problem. Morison argues that if it were just a matter of increasing volumes, then our IT solutions would be more than capable of keeping up (thanks to Moore’s Law and growing disk capacity). Morison put his finger on what is undoubtedly the real issue in data management: the increasing complexity of upstream data and workflows.

Domain knowledge

Data complexity defies the Taylorism approach. If you are trying to collate GIS data from multiple coordinate reference systems then you really need a good understanding of geodesy. You are also unlikely to apply exactly the same skill sets two days running. Modern logging tools defy the simple depth-value pair paradigm and require serious domain knowledge for their management. The management of multiple pre- and post-stack seismics likewise requires a goodly degree of geophysical knowledge.

Mergers

But it can be done! What makes for good data management is a well-funded project and here, demonstrable progress is being made. Both ExxonMobil and ChevronTexaco presented major data clean-up projects at the PNEC. These centered on the merger of well data from ‘heritage’ companies and are great examples of what can be achieved when adequate resources are applied to such problems. The mergers have had the effect of a shot in the arm for data management. They appear to be succeeding where years of pontificating and theorizing have failed.

The point?

The cleanup of the majors’ heritage data sets is arguably the big driver in data management today. It is spinning off a new breed of software tools and contractor know-how as a new micro-industry is born. Above all, I think the majors’ approach shows that spending fairly substantial amounts of money on data clean-up is really part of the cost of doing business.

Fabric of management

As we map the processes developed for well header data across to the more complex parts of the workflow, the move away from Taylorism will be even more pronounced. We are no longer looking at a ‘sausage machine’ approach to data management—but to the incorporation of domain knowledge into the fabric of data management.

Just do it

It is the combined requirement of domain knowledge and grunt work that makes it hard to get traction for data management—but the majors are showing the way. So my advice to you all is—just do it!

* Petroleum Network Education Conferences—Philip C. Crouse & Associates.

** Source www.cornell.edu. Google ‘Taylorism’.


Oil ITJ Interview—Bill Befeld, Baker Atlas

Oil IT Journal was invited to attend the launch of Recall V5.0, including a new port to Windows, more of which in next month’s edition. But while at the Recall user group meeting, we caught up with Baker Atlas director Bill Befeld who explained Baker’s strategy in the field of software engineering.

Oil ITJ—Where does software fit into your organization?

Befeld—Baker Hughes Incorporated has six divisions. Baker Atlas handles wireline and formation evaluation. Atlas itself has three further subdivisions including Technology, with Shraga Wolf as VP and myself as Director. We are located in the Houston Technology Center and the UK Recall unit.

Oil ITJ—Halliburton has Landmark, Schlumberger has GeoQuest—where is Baker’s software brand?

Befeld—Software development is dispersed throughout Baker. All divisions have products that are licensed to oil company client users of Baker tools. Baker does not see a separate software business outside of its core divisions. Software is a creative process and shouldn’t get too big. Witness the size of our Recall unit with 25 people.

Oil ITJ—What software is developed at the Houston Technology Center?

Befeld—Mostly software for logging tools, surface instruments and Inteq-related products. Such software is generally all tied-in with our tools—and not ‘packaged’.

Oil ITJ—But they may relate to Recall...

Befeld—Sure. Part of my job is to help clients customize their own environments. If Agip likes our acoustic log processing software we’ll fix a license for their in-house use. Likewise for the vertical seismic processing toolkit, SeisLink, which was developed through our joint venture with CGG, VSFusion.

Oil ITJ—So what’s the product line-up?

Befeld—The heart of our system is the database. This was designed from the ground up (by Chris Hanley—who also heads our Recall unit in London) and includes industry-specific data structures. This represents some sophisticated programming as much well data is recorded against both depth and time. Recall is really best in class for ‘multi-z’ and image data. The software started out as ‘Incline’ for measuring dipping beds—the imaging tools grew out of this. Petros, the petrophysical product, is now being pushed into LogScape and offers linked views—points, histograms, image data and flags showing z locations of selected analysis points.

Oil ITJ—Did you use any third party tools for the Windows port of Recall 5.0? Where are you in the OpenGL vs. Direct-X debate?

Befeld—We didn’t use any third party tools for the Windows port. All our development is graphics neutral—but we are following the technology in this space.

Oil ITJ—WellLink communications are a likely game changer in this space.

Befeld—Indeed, Atlas Online built a satellite link to BH Direct. Now a customer with a laptop can see what’s happening at the rig in near real time—maybe a 3 second delay—via the Recall system. The possibilities are amazing—a user could receive a quick-look analysis on a personal digital assistant (PDA). This has major implications for decision makers! These folks used to be in the logging unit making big money decisions on the hoof—and sometimes on their own!

Oil ITJ—Does your software interface with other petrophysical analysis tools?

Befeld—We write our software for our logging tools—so users tend to use these. But of course Recall is the exception—and will work with all service companies’ logging tools. Our core business is logging—so surface systems are the key development targets.

Oil ITJ—Well site processing seems a bit anachronistic. Why don’t you stream all your data to Recall and process it there?

Befeld—Such ideas are always being kicked around. Maybe smarter logging tools will be able to push data up to the surface and on to a database. But downhole is a very harsh environment and these tools are very complex and must work at 400°F and 20,000 psi. I guess the answer is part history—part the organization of our core business of surface acquisition.

Oil ITJ—Recall 5.0 on Windows is a big change for your market.

Befeld—Windows is very important to us. Recall, which is in every major oil company, has always run on big systems. Windows opens up a larger end user market—which we will be targeting aggressively.


Schlumberger rebuilds consulting business

Antoine Rostand is to head up Schlumberger Information Solutions’ new Business Consulting unit.

Following its aborted sortie into IT consultancy via the acquisition and subsequent sale of Sema Group, Schlumberger’s upstream consulting business has lacked visibility. This is about to change with the creation of a new Schlumberger Information Solutions (SIS) unit – the Schlumberger Business Consulting group (SBC). SBC is focused on what is described as the ‘growing demand’ for optimization of upstream operations. The new unit joins SIS’ existing software technology, information management, and network and infrastructure services offerings.

Rostand

Heading up the new unit is Antoine Rostand, who was president of EDS France before joining Schlumberger and was most recently VP consulting and systems integration, Europe, for Schlumberger-Sema.

Goode

SIS president Peter Goode said, ‘Our business is evolving, with changing operational issues and accelerating workforce dynamics. The next step change in value creation will see IT, new technologies and process redesign integrated with core E&P processes. Our domain experience and the capability to provide an end-to-end solution uniquely positions us to assist our customers to meet these challenges.’

Strategy

SBC will work with E&P companies on strategy and organization design and implementation to enhance core operational processes and to realize productivity gains through IT-enabled workflow design.


US MMS kicks-off eWell reporting

The Minerals Management Service is going live with its online permitting and reporting system.

As revealed in OITJ earlier this year, (Vol. 9 N° 2) the United States Minerals Management Service (MMS) is to implement a new, electronic ‘eWell’ reporting system. eWell lets operators exchange well data with the MMS Gulf of Mexico Region’s district offices.

Paper forms

Current MMS regulations require companies to submit specific paper forms for completed and planned well activities. As part of a larger electronic government reengineering effort, MMS has restructured six well permit and report forms and made them accessible electronically. The new system replaces paper versions of permits to drill, permits to modify wells, well activity reports, end of operations reports, and rig move notifications.

Internet

eWell now puts these forms on-line where information can be submitted via a secure internet site in lieu of paper submission. The system pre-populates forms with previously submitted data stored in the MMS database. Automated help screens will speed form completion and improve accuracy.

Burton

According to MMS director Johnnie Burton, ‘MMS analyses show our new eWell permitting and reporting system will reduce processing time for the 20,000 applications each year by 50%, thus reducing costly rig waiting time.’ eWell is scheduled for rollout in June. Earlier this year, Burton received the prestigious Women in Energy leadership award.


Statoil splashes out on upstream software

Statoil signs five-year upstream software deals with Paradigm and Landmark Graphics.

Statoil has awarded Paradigm Geo and Landmark Graphics five-year deals for the provision of upstream software. Paradigm is to supply its Explorer and Geolog applications – running on its Epos 3.0 integration platform. Geolog is recognized as Statoil’s ‘mainstream’ well petrophysical tool while Explorer fills a similar role in time-depth conversion workflows.

Gundersen

Statoil’s Erik Gundersen commented ‘Explorer is a powerful time-depth conversion solution. The software integrates well and seismic information with mapping tools, geostatistics and uncertainty—providing geoscientists with the means to generate velocity models and convert the original time interpretation.’

OpenWorks

In what was described as a ‘multi-million dollar’ deal, Statoil is extending its software contract with Landmark in the prospect generation, field development planning, drilling and completion area. The current agreement represents a significant extension of Statoil’s existing OpenWorks implementation described as the foundation of Statoil’s integrated information management strategy. The original deal with Landmark was signed five years ago.


INT’s log viewer for WINDS Enterprise

INT’s web-based log viewer is now a component of Petris’ ASP-based data management offering.

Houston-based INT and Petris Technology will be offering web-based well log viewers as part of the PetrisWINDS Enterprise (PWE) system. PWE, the vendor-neutral, web-based data and application environment, now offers users multiple well displays, cross plots etc.

Schatz

INT VP Paul Schatz said, ‘By making interactive data display available via the internet, PWE users will benefit from state-of-the-art geoscience delivered efficiently and economically to the desktop.’

Pferd

Petris VP Jeff Pferd added, ‘INT offers high quality visualization and state-of-the-art technical foundations. We are pleased that INT’s technology is an integral part of our data management application.’


Blue Marble offers image/GIS integration

The latest edition of Blue Marble’s geographic transform library offers raster image integration.

Blue Marble Geographics has released a new version of its GeoTransform library for GIS developers. GeoTransform 5.0 supports Visual Basic, C++, Delphi, PowerBuilder, C++ Builder and lets developers embed sophisticated image re-projection and tiling in their applications.

Cunningham

Blue Marble president Pat Cunningham said, ‘GIS developers need to work directly with raster imagery in their applications. But they don’t need to reinvent the wheel! GeoTransform provides affordable technology and reduces time-to-market.’

120,000 users

Blue Marble claims over 120,000 customers in 100 countries. More from www.bluemarblegeo.com.


Majors clean-up with QCLogix

Data clean-up tools from Innerlogix stole the show at the PNEC data integration conference.

At the PNEC Data Integration conference—of which more in next month’s Oil IT Journal—Innerlogix pretty well stole the show, or rather neatly arranged for ExxonMobil and ChevronTexaco to steal the show on their behalf. Both supermajors are enthusiastic users of Innerlogix’ DataLogix data cleanup tool and have used this extensively in the merger of their ‘heritage’ data sets.

ExxonMobil

Work done for ExxonMobil in particular has led to the development of a new, batch-oriented data cleanup tool which is just about ready for commercial release. Innerlogix’ president Dan Heggelund told Oil IT Journal, ‘QCLogix represents a new direction in data management: data confidence. QCLogix monitors upstream data and displays the results on a ‘quality dashboard’. Understanding quality is the cornerstone for building confidence in the data. QCLogix is based on a proven methodology for defining, measuring, analyzing, improving, and controlling the quality of E&P data.’
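
The ‘define, measure, analyze’ idea can be caricatured with a trivial rule-based check on well header records. The records, rules and field names below are hypothetical and are not taken from QCLogix.

```python
# Minimal sketch of rule-based well header QC - not QCLogix itself.
# Each rule returns True when a record passes; the 'dashboard' is just
# a pass rate per rule.
records = [
    {"uwi": "100/01-01-001-01W1/0", "lat": 51.2, "lon": -114.1, "kb_elev": 1050.0},
    {"uwi": "", "lat": 0.0, "lon": 0.0, "kb_elev": None},            # a bad record
    {"uwi": "100/02-02-002-02W1/0", "lat": 52.8, "lon": -113.7, "kb_elev": 820.5},
]

rules = {
    "uwi_present": lambda r: bool(r["uwi"]),
    "lat_lon_nonzero": lambda r: (r["lat"], r["lon"]) != (0.0, 0.0),
    "kb_elev_in_range": lambda r: r["kb_elev"] is not None and 0 < r["kb_elev"] < 5000,
}

dashboard = {
    name: sum(rule(r) for r in records) / len(records)
    for name, rule in rules.items()
}
print(dashboard)  # pass rates per rule, e.g. roughly 0.67 for each rule above
```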

ChevronTexaco

At the PNEC, ChevronTexaco’s Mike Underwood said, ‘QCLogix will enable us to better organize our data cleanup—monitoring, validating, and checking our databases for us. Batch mode processes should enable us to cut the percentage of time it takes to perform these activities in half and will enable us to increase the frequency of our runs to over one a month.’

Heggelund

For more on how Innerlogix uses ‘statistics, geo-statistics, business logic, and fuzzy logic’ for data QC read Heggelund’s May 2002 article in Oil IT Journal (Vol. 7 N° 5).


MetaCarta’s geOdrive upstream GIS/DMS

MetaCarta has re-branded its oil industry-specific text/GIS search engine.

MetaCarta has productized its oil-industry specific text-and-GIS search tool as geOdrive. geOdrive helps geoscientists locate text documents stored in a shared drive, the company Intranet or corporate portal from geographic locations, keywords and time parameters. Documents are then organized on a map according to their geographic references.

Odell

Mike Odell, head of MetaCarta’s energy unit said, ‘Oil and gas companies depend on knowing as much as possible about geographic locations. Workers need to know where to drill an exploratory well, where to place an offshore platform and where to locate a retail outlet. Geography is central to the operations.’

Gazetteer

geOdrive uses a geoparsing engine to determine the spatial location of documents on maps. MetaCarta claims a significant investment in its oil and gas sector gazetteer which holds millions of place names and industry-specific locations such as blocks, quadrants and leases within the Gulf of Mexico and the North Sea.
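
A geoparsing engine of this kind can be caricatured as a gazetteer lookup over document text. The place names and coordinates below are invented for illustration and say nothing about MetaCarta’s own gazetteer, disambiguation or ranking logic.

```python
# Toy geoparser - scans document text for known place names and returns
# their coordinates so documents can be pinned on a map. Purely
# illustrative; a real engine also disambiguates and ranks matches.
GAZETTEER = {
    "ekofisk": (56.55, 3.21),          # hypothetical entries
    "green canyon": (27.30, -90.50),
    "forties": (57.75, 0.97),
}

def geoparse(text):
    found = []
    lowered = text.lower()
    for name, (lat, lon) in GAZETTEER.items():
        if name in lowered:
            found.append({"place": name, "lat": lat, "lon": lon})
    return found

doc = "Well test report, Green Canyon block, follow-up to the Forties study."
print(geoparse(doc))
```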


OpenSpirit direct connect to SilverWire

Data from A2D’s SilverWire commercial log delivery service is now available through OpenSpirit.

OpenSpirit Corp. is releasing a well log curve loader with a direct connection into A2D’s ‘SilverWire’ web-based log data delivery service. The system will allow users to ‘seamlessly compare and transfer’ log data between industry leading exploration and production systems.

Middleware

OpenSpirit’s middleware integrates industry-standard datastores including GeoFrame, Finder, and OpenWorks. SilverWire lets users query and download data from A2D’s Log-Line Plus well log database. Digital curve, raster image, and ‘SmartRaster’ depth-calibrated images can be accessed from the workstation.

Harter

OpenSpirit CTO Clay Harter said, ‘SilverWire connectivity from A2D greatly enhances access to log data. End users within other OpenSpirit-enabled applications and datastores will be able to connect and query against A2D’s catalog of digital log curves, compare to existing logs, and download live without any interfile transfer or other format changes.’


Fugro-Jason rolls-out PowerLogSE

A new release of Petcom’s PowerLog now includes depth registered raster log capability.

Fugro-Jason has released the second edition of its PowerLog petrophysical software. PowerLog SE updates Petcom’s Windows-based well log analysis tool originally released in the early 90s.

PowerLog SE introduces a new user interface and adds a cross-section montage capability to the Collage Tool.

Depth registered

Users can now incorporate and correlate depth-registered graphical images such as scanned logs, maps and core images. PowerLog SE will be introduced to the industry at the upcoming SPWLA annual conference in Noordwijk (Netherlands) and the EAGE annual conference in Paris next month.


AAPG 2004 Convention, Dallas

Consultancies including Wood Mackenzie, Robertsons and IHS Energy believe we may be seeing a move to durably higher oil prices. Industry has failed to replace consumption for 20 years and once-profitable basins like the North Sea are now ‘value destroyers.’ The deep offshore remains a viable target. Paradigm and CGG were conspicuous by their absence from this year’s AAPG but many smaller houses were showing new software. Earth Decision Sciences is turning GoCad into a fully-featured interpretation and modeling environment and SMT is likewise expanding its Windows-based Kingdom Suite into a ‘seismic to simulation’ solution. Landmark made heavy weather of selling its consultancy and outsourcing services while Schlumberger’s cluster-based visualization lacked pizzazz. North American universities continue to develop laser mapping of outcrop geology while new, well-site focused tools capture ‘awkward’ geo-data types like cores and cuttings. Some companies are working on new graphics software based on Direct-X rather than the ubiquitous OpenGL. Interesting research from the Kansas Geological Survey shows use of Semantic Web-based taxonomies of geological terms to stitch together geological maps from different areas.

Wood Mackenzie director David Black gave an update on the 2003 study on Upstream Value Creation. WoodMac categorizes Repsol and ConocoPhillips as ‘black holes’—companies which fail to replace reserves and whose exploration ‘erodes value’. Over the 1997-2003 period, the 25 majors studied provided an average 11% return on investment (ROI) from exploration. Some geographical areas are out of favor with WoodMac—notably the UK North Sea, with $2 billion of value destruction (on an $11 billion investment). Worldwide, onshore and shelf environments ‘destroy value’. Only the deepwater and Agip’s ‘Kashagan’ discovery have created value. Acquisitions have provided a 12% ROI—mostly because these deals were done when oil prices were low. While there is ‘plenty of life’ left in deepwater, the quality of new reserves is an issue. The value of the discovered barrel is going down with a move to higher tax regimes, stranded gas, longer lead times and fewer giant fields. Another problem is that investment is constrained by dwindling opportunities, which will lead to increased competition.

ChevronTexaco

Rob Ryan described ChevronTexaco’s (CT) portfolio management, conducted by centralized ‘exploration review teams’ (ERT). Since the mid 90s, CT’s wildcat success rates have been constant at around 30%. The average discovery is 50 million bbl. Ryan stated that for CT, ‘The problem is not a lack of investment dollars, we have the money’. Access to opportunities presents ‘some challenges’ but there are significant opportunities. The real issue is efficiency.

Process efficiencies

Ryan believes that the industry should ‘focus on selection and prediction process efficiencies, from technical assessment through risk evaluation review and planning.’ The ERTs were created to ensure consistency in exploration review through multi-discipline risk analysis. A 2002 study compared ERT and asset teams’ evaluations. The asset teams were ‘wildly optimistic’ compared with the ERT evaluations. Ryan observed that, ‘There is no better way to destroy value than the high risk prospect’. Exploration workflows should focus on the basics—amplitude risking standards, seal standards, reservoir quantification standards and hydrocarbon charge standards. ‘It’s Nintendo exploration—plus!’ In 2002 ChevronTexaco was ‘best in class’ for exploration success according to Deutsche Bank.

Apache

Mike Bahorich explained how Apache Corp. gives local units the decision making power—but ‘measures’ centrally and rewards success through its ‘rule 43’ incentive system. This means that when Apache stock goes through $43 for over 10 days, staff get a 100% salary bonus! Apache is a ‘smart shopper’—buying common off the shelf (COTS) technology at the right price for horizontal wells, 3D seismic etc. Technology Watch is important for Apache—looking out for emerging and especially ‘disruptive’ technologies. Apache is benefiting from the falling cost of storage and now keeps pre-stack 3D seismic on disk.

IHS Energy

‘Get back to exploration’ was the entreaty from IHS Energy’s Pete Stark. There has been a ‘precipitous’ drop in gas discoveries in the last three years. A dramatic change in operator mix has also occurred with a move from the western majors to the NOCs and former NOCs. The exploration slump is a cause for concern. The world has failed to replace production for the last 20 years.

Rig site

Pason Systems will shortly be releasing AutoDriller—control software which maintains constant parameters, especially for horizontal wells. The feedback system is tied to the electronic drilling recorder and adapts to its environment. AutoDriller runs atop Pason’s Electronic Drilling Recorder (EDR), a hardware, software and database combo. Epoch Well Services now offers real time data feeds to users of its myWells.com well data gathering and distribution system. Users can follow drilling activity from anywhere over a secure internet connection. Geologix’ WellExplorer lets companies set up a departmental-level intranet for dissemination of well summaries, logs, and reports. WellExplorer uses Microsoft’s Internet Information Server (IIS) with a SQL server back-end. The software uses Geologix’ ‘GEO’ dynamic document structure, and leverages the emerging WITSML standard.
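
The constant-parameter control that AutoDriller describes is, in outline, a feedback loop. The proportional controller below is a generic sketch of the principle; the gains, units and ‘drilling response’ are invented for illustration and are not Pason’s algorithm.

```python
# Generic feedback-control sketch - keep weight-on-bit (WOB) near a setpoint
# by nudging the brake. All gains, units and the plant response are made up;
# this is not Pason's AutoDriller logic.
def simulate(setpoint=25.0, steps=15, gain=0.8):
    wob = 10.0      # current weight on bit (klbs)
    brake = 20.0    # brake release position (%)
    for step in range(steps):
        error = setpoint - wob
        brake = max(0.0, min(100.0, brake + gain * error))  # incremental correction
        wob += 0.3 * (0.5 * brake - wob)  # crude first-order response to the brake
        print(f"step {step:2d}  brake {brake:5.1f}%  WOB {wob:5.1f}")

simulate()
```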

Data Management

Core Laboratories has expanded its digital data management offering with the Rapid core database and RIB, an HTML-based data archiver. Rapid stores well information, core imagery, thin sections, SEM data, poro-perm, cross plots and photomicrographs. ‘The focus is on rocks.’ Data can be exchanged through LAS, CSV files and Oracle. RIB provides web-based archiving and reporting of data in Rapid. Production Geoscience’s Oilfield Data Manager now sports an ‘integration canvas’ for display of cultural and regional data alongside the geology. ODM offers data management, correlation displays and wells and surfaces in 3D. ODM is used by Shell to QC data before entry into the corporate database and by Saudi Aramco for stratigraphic correlation. The 2004 release of Fugro-Robertson’s Tellus database adds 8 million data points of geochemical source rock and seep data. On-the-fly mapping from the database includes petroleum systems and chromatography plots. The OilTracers web site offers free searching of an oil sample library—a database of over 33,000 oil, gas and rock samples from all over the world. A second product, OilRef, holds over 11,000 citations from 300,000 pages of geochemical literature. Zebra Geosciences’ EZDataRoom is an electronic data room. Well log and SEG-Y data can be viewed from a web browser. Data remains in the data room and does not need to be passed to a user—enhancing security. If required, configurable user-based security allows for printing and download of information. Both Seismic Micro Technology and TerraSciences are developing OpenSpirit data servers for their interpretation software. These will simplify integration with other vendor interpretation packages.

Interpretation

Geoff Dorn, from the BP Center for Visualization at the University of Colorado, has developed patented technology (originally from Arco) for automated fault extraction (AFE) from 3D seismic data volumes. The software will likely be commercialized as a plug-in to Paradigm’s VoxelGeo. LithoTect Interpreter from Geo-Logic Systems is a low cost version of Geo-Logic’s geological map, well, seismic, cross section, and 3D interpretation tool. Interpreter includes depth conversion and well picking, monitoring, and projection capabilities. Interpreter is a ‘pure’ Java application that runs on laptops and workstations. GMI Imager from GeoMechanics International now has ‘write back’ capabilities with Landmark’s OpenWorks. Imager analyses can be written back to the database for correlations and other applications.

New technology

The Kansas Geological Survey was showing elements of the new North American Cyberinfrastructure—an electronic grouping of various geological resources across the USA. One component—the GEON Grid seismic infrastructure—is a federation of ArcIMS servers supporting the National Carbon Sequestration database. CHRONOS is a national stratigraphic portal providing access to distributed databases across the country. Parts of the Cyberinfrastructure leverage emerging ontologies built on semantic web standards like OWL. Isatis V5.0 from Geovariances adds multi-Gaussian simulation—a ‘pixel based’ technique. Isatis can be used in stand-alone mode, or coupled to Petrel, Gocad, RMS, RML, and PowerModel. According to Geovariances, ‘vendors are now offering Isatis links within their own packages’. Schlumberger was showing its ‘Chaos’ Cube—a Coherence Cube look-alike that highlights high density faulting and fractures. The Chaos attribute is thought to react to the presence of fluid and ‘may provide a 3D picture of gas migration’.

Visualization

Schlumberger was showing GigaViz on a 16-node Linux cluster—which was down. GigaViz offers a lot of sophistication for image processing and ad-hoc, rule based voxbody extraction. But the demos fail to capture attention in the way that Magic Earth does. 4D Vista from Midland Valley is described as an ‘Adobe Acrobat for 3D’ and now links to 2DMove and 3DMove toolsets to provide ‘an integrated structure analysis workbench’.

Hardware

A new adaptor for Neuralog’s Neura-Scanner lets you scan transparent (film) logs with transmitted light. SeeReal Technologies was showing novel ‘glasses-free’ true 3D visualization. An eye-tracking device on the display adjusts the 3D display to the user’s head movements.

Papers

Dave Abbot, consultant, warns all you deal makers out there that deal promotions are subject to anti-fraud provisions of state and federal law. ‘Transparency’ is the key to dealing with investors. ‘Violation could result in you losing your home’! Evendi et al. (ChevronTexaco) described an XML schema for kinetic data in GoCad. Gary et al. (Tramontane Inc.) described Unocal’s web portal for biostratigraphy and log-derived sand count. Hodge et al. (Midland Valley) combine digital elevation model (DEM) data with satellite imagery and structural modeling. Larue (ChevronTexaco) claims field maturity does not necessarily equate to reduced volumetric uncertainty. Larue showed examples of uncertainties of over 50% in mature fields. Databases and reservoir modeling studies ‘downplay the importance of depositional environment on recovery’. Loseth et al. (Norsk Hydro) have collected 3D digital outcrop data with GPS and laser scanners. A fluid flow model integrated surface core data and other ‘behind the outcrop’ data including shallow seismic and ground penetrating radar. Comparisons with less constrained models showed the importance of capturing true reservoir heterogeneity. Preston et al. (HRH Ltd.) use Bezier curves to categorize lithological units. These are stored along with geological data such as grain size—and can be scaled and integrated with other digital data such as wireline and MWD.

This report is abstracted from a 25 page illustrated report produced as part of The Data Room’s Technology Watch Reporting Service. For more information on this subscription-based service please email tw@oilit.com.


Folks, facts, orgs, et cetera

News from LSI, CDA, CMG, Fugro, Geotrace, KSI, GMI, MetaCarta, Quorum, Roxar, Baker and DGI.

LSI Logic has renamed its Storage Systems unit ‘Engenio Information Technologies’ in preparation for an IPO.

~

Common Data Access is seeking an experienced E&P data management professional to manage its key projects.

~

Calgary-based CleanDB is no more; Brian Marshall, who developed a mySQL version of the PPDM database, is now operating as an independent contractor.

~

Simulation specialist Computer Modeling Group (CMG) has announced a multi-year $300,000 annual software license agreement with ‘a major multinational company’.

~

Paul van Riel is to replace Kobi Ruegg as CEO of Fugro’s development and production business. Van Riel was a founder of Jason Geosystems, acquired by Fugro in 2001. Ruegg now heads up Fugro’s offshore survey unit.

~

John Ebbern is MD of Geotrace’s new Aberdeen office. Ebbern was previously with WesternGeco.

~

James Webster has been named COO of geopressure specialists Knowledge Systems Inc. (KSI).

~

Potential field specialist Fugro-LCT has ‘consolidated’ its London facility, re-locating it to Houston.

~

Patrick Keenan is now president of GeoMechanics International (GMI). Keenan was previously VP business development with Core Lab.

~

Ron Matros has joined MetaCarta, Inc., as CEO. Matros was previously CEO of iConverse.

~

Donn Wilson has joined Quorum Business Solutions as land and ownership advisor. Wilson was previously with ChevronTexaco.

~

Roxar CEO Sandy Esslemont has relocated to Houston to ‘spearhead’ Roxar’s targeting of the North American market.

~

Mike Wiley is to retire as chairman and CEO of Baker Hughes when his term expires in 2005.

~

Divestco is to sell Dynamic Graphics’ 3D visualization and modeling technology in Canada. The company is also developing the Divestco DataStore—which will allow customers to manage and access their seismic data.


First quarter service sector results mixed

Underwhelming service sector results contrast with oils’ financial health, mirroring a long-term trend.

Halliburton’s Energy Services Group posted first quarter 2004 revenues of $1.8 billion, a $205 million increase over the first quarter 2003. Operating income of $214 million was also up $34 million. Landmark and Other Energy Services first quarter 2004 operating income was $29 million on revenues of $129 million (up 5%). The first quarter record was due primarily to increased software sales.

Core

Core Laboratories had an ‘excellent’ quarter with all-time revenue highs. Notwithstanding this, Core reports a $6.5 million loss for the period due to a write-down on the sale of its Reservoir Technology division to Paradigm. In its 2003 annual report, Core reveals that it paid just over $10 million cash for its 2002 acquisition of Advanced Data Solutions.

Baker Hughes

BHI reported first quarter 2004 income of $95.3 million, up 90% over the same period last year. Mike Wiley, Baker Hughes’ chairman and CEO, expects ‘strong activity ... to continue’ and ‘margins to improve throughout the year.’

Divestco

Divestco announced earnings per share of 3.9 cents for the quarter, compared to a loss of 0.7 cents for the same period in 2003—‘an increase of 657%’ according to Divestco’s financiers! Software generated revenue of $5.1 million for 2003, up 60% over 2002.

I/O

Input/Output’s first quarter net income was $591 thousand on revenues of $36.3 million. These compare with a net loss of $5.3 million, on revenues of $41.2 million for the same period a year ago. Bob Peebler, I/O’s President and CEO, said, ‘The first quarter contained many positive events for I/O.’

Kelman

Kelman Technologies Inc announced a net loss of $136 thousand for the first quarter—an improvement of $800 thousand over the same period last year. Kelman reports ‘modest improvement’ from its data management division—particularly in the US. While the Houston market remained low, the disappearance of two major competitors has ‘thinned-out’ competition.

TGS-Nopec

TGS-NOPEC consolidated net revenues were $28.7 million, down 6% on Q1 2003. Operating profit of $6.6 million was down 39%.

Seitel

Seitel reports first quarter revenues up 36% to $41 million with income from operations up 168% to $8.5 million. Seitel chairman Fred Zeidman said, ‘We’ve made significant progress in our turnaround and we continue to work hard to emerge from bankruptcy as soon as possible.’

AspenTech

Aspen Technology’s revenues for its third fiscal quarter totaled $80.7 million, of which $35.9 million came from software. Dave McQuillin, President and CEO, says ‘our customer base is actively looking to invest in IT solutions.’

Peters & Co.

Underwhelming results characterize the service sector and contrast with excellent performance from virtually all oil and gas producers. From May 2001 to May 2004 the Peters & Co. PE Integrated Oils index rose from around 3,000 to 5,000. The oilfield service sector index over the same three-year period has had a switchback ride—down from 10,800 to a low of 6,600 at year-end 2001 and back up to a current level of 9,000. More from www.petersco.com.


Qatar Petroleum selects P2 ES ERP

Petroleum Place Energy Solutions now claims to be number 2 in oil and gas financial software.

Qatar Petroleum (QP) is implementing an upstream enterprise resource planning (ERP) tool from P2 Energy Solutions (P2ES). QP will be using P2ES’ Enterprise Upstream (EU) suite of applications to support its oil and gas production management and reporting in Qatar. P2ES was awarded the contract following a public tender. QP’s EU implementation replaces four internal legacy systems.

Anani

P2ES president of international operations Tarig Anani said, ‘Our software integrates with ERP systems including SAP, Oracle, and PeopleSoft. QP’s licensing of four EU modules is strong testimony to the international appeal of the software.’ P2ES has other clients in the Middle East and is represented in the region by local e-business solution provider iHorizons.

Vickers

P2ES president Gary Vickers told the Denver Post earlier this year that his company was now the second largest player in the oil and gas ERP market behind SAP. Other EU customers include ConocoPhillips, Unocal, BHP Billiton, KOC and ADCO.


Trade Ranger’s ‘Universal Environment’

Trade Ranger is rebranding its e-commerce portal, adding business intelligence functionality.

As revealed in Oil IT Journal (Vol. 9 N° 2), Trade-Ranger, the buy-side e-marketplace for the petroleum industry is to launch the Trade Ranger Universal Environment (TRUE)—a ‘unified collaboration framework’ for all its applications. TRUE offers single-source access to existing Trade-Ranger applications and extends its current offering with Business Intelligence and Event Management modules.

Wilson

CEO John Wilson said, ‘TRUE creates an electronic marketplace where our customers can enter easily, navigate quickly, do business, and thrive. TRUE is built on a solid foundation that allows collaboration between trading partners, and enhances smart data and process level integration.’

Business intelligence

TRUE Business Intelligence offers buyers and suppliers summary reports of their organization’s activity with trading partners while Event Management tracks document status and transaction history. Members can access the new functionality on go2true.com. Trade-Ranger shareholders include ConocoPhillips, Shell, Statoil, Total, Unocal, Occidental and BP.


Impress integrates ESRI GIS and SAP ERP

New software adds a geographical component to financial reporting.

Impress Software (Hannover, Germany) and ESRI have teamed on a ‘pre-packaged’ integration application (Geo I.App) which links ESRI’s ArcGIS product line to SAP’s mySAP product lifecycle management (PLM) solution. The development was driven by the demands of Impress customers, which include BP, ChevronTexaco, Total, ConocoPhillips and Halliburton Energy Services.

Benner

ESRI director Steve Benner said, ‘Using spatial information stored in ArcGIS in daily operations supported by mySAP PLM requires both synchronous and asynchronous integration of the two applications. In IMPRESS, we found a partner with proven experience in developing standard integration solutions between SAP solutions and other third-party applications. IMPRESS Geo I.App will enable clients to deploy integrated applications quicker, cheaper, and in a supported environment.’

Dotan

Impress president Omri Dotan added ‘Geo I.App can be implemented in a few weeks, generating immediate payback and substantial ROI for customers’.


PPDM ‘Lite’ draft spec released

PPDM’s spatial database specification takes shape—along with a scaled-down data model.

PPDM has released a draft specification of the ‘Lite’ version of its data model. PPDM Lite is also known as PPDM Spatial IV, or again as ‘PSDM’, the petroleum spatial data model.

Daunting

Full implementation of the PPDM can be a daunting task. Some companies require a smaller, spatially-enabled subset of the model that supports typical spatial queries.

Geodatabase

Spatial IV uses PPDM and ‘other existing Geodatabases’ to create a simplified, denormalized data model focusing on spatial data and on the presentation of PPDM summary data to end users.

Formats

Support for various spatial formats is provided including Oracle, ESRI SDE and PostGIS. DDLs for Oracle 9.x and PostgreSQL 7.4 are provided. The draft spec for the ‘Lite’ data model is available for comment on www.ppdm.org/development/projects/spatial/documents.html.


Smart wells for Aramco and BHP Billiton

Production engineers are increasingly deploying WellDynamics’ ‘intelligent’ completions.

WellDynamics has installed its first SmartWell completion in Saudi Arabia on a trilateral well in Saudi Aramco’s Shaybah field. Shaybah field development has evolved from single horizontal wells to long reach multilateral completions that maximize contact with the reservoir. Intelligent completion technology provides inflow control from each lateral branch, resulting in more efficient clean up, stimulation and production control. The Shaybah 119 well is a three-lateral completion, employing hydraulically actuated control valves to allow selective variable choking of each of the laterals.

e-Field

Intelligent completions are a key component of the much touted ‘e-Field’ concept – a mix of hardware, modeling and optimization software which promises to revolutionize production systems.

Saleri

Nansen Saleri, head of Reservoir Management at Saudi Aramco, said, ‘Advanced well completion is a technology focus area for us. SmartWell technology will contribute to our goals of improved reservoir management, maximum recovery and reduced well count.’

Koot

WellDynamics regional manager, Leo Koot said, ‘This installation demonstrates that our products and services are mature and ready to be installed on a routine basis.’ WellDynamics is a joint venture between Halliburton Energy Services and Shell Technology Ventures. Formed in 2001, the company has installed 115 SmartWell systems for 24 customers in 13 countries throughout the world.


Kappa addresses massive field data influx

Kappa has solved the data historian problem by applying wavelet filters to production data.

Later this year, Kappa Engineering is to release a software tool that addresses the problem of the massive data volumes generated by the ‘e-field’. Diamant is described as a ‘software cross-over’—a data management and reservoir surveillance hybrid. Diamant accesses and processes data from production data historians and permanent gauges.

Distortion

Current downhole and surface acquisition creates vast amounts of raw data. Straightforward filtering can distort significant data ‘signatures’ while recording all data will saturate the CPU and the memory of any application.

Wavelet

Diamant filtration reduces data volumes by two orders of magnitude without missing significant events such as choke changes and shut-ins. Diamant uses wavelet filtering to process billions of permanent gauge data points, extracting only useful information. Low frequency producing pressures are de-noised and filtered for production analysis and history matching. High frequency events, such as build-ups, are detected and loaded. New data is loaded with individual gauge filter settings and appended to existing data. Users can return to any part of the data and locally re-populate sequences of interest. Diamant also loads and updates rate data from a production database via ODBC.
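
The wavelet idea can be illustrated in a few lines with the PyWavelets library. The synthetic gauge series, wavelet choice and threshold below are a generic denoise-and-decimate sketch, not Diamant’s algorithm or its filter settings.

```python
# Generic wavelet de-noising sketch on synthetic permanent-gauge pressure
# data - illustrative only, not Kappa Diamant's implementation.
import numpy as np
import pywt

rng = np.random.default_rng(1)
t = np.arange(16384)                       # one sample per time step
pressure = 3000 - 0.01 * t                 # slow depletion trend (psi)
pressure[8000:] += 150                     # a step event, e.g. a choke change
pressure += rng.normal(0, 5, t.size)       # gauge noise

# Multi-level discrete wavelet transform, then soft-threshold the detail
# coefficients: noise is suppressed while sharp events survive.
coeffs = pywt.wavedec(pressure, "db4", level=6)
threshold = 3 * 5                          # ~3 sigma of the assumed noise
denoised_coeffs = [coeffs[0]] + [
    pywt.threshold(c, threshold, mode="soft") for c in coeffs[1:]
]
denoised = pywt.waverec(denoised_coeffs, "db4")[: pressure.size]

# Keeping only the coarse approximation gives the 'orders of magnitude'
# style reduction: 16384 raw points -> a few hundred coefficients.
print(len(pressure), "raw points ->", len(coeffs[0]), "coarse coefficients")
print("RMS change after de-noising:",
      round(float(np.sqrt(np.mean((denoised - pressure) ** 2))), 2), "psi")
```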

Analysis

Data can be dragged and dropped (or passed via the clipboard) from Diamant to Kappa’s Saphir for pressure transient analysis or to Topaze, for production analysis. Permanent gauge data contains useful information allowing accidental or planned shut-ins to be used as transient tests. The Diamant browser positions files in a logical hierarchy that includes fields, tanks, well groups and wells, irrespective of the actual location of the files. Data management functionality further enables files to be ‘gathered’ into a rational data structure.


Total and Schlumberger opt for SGI

Altix proves popular for seismics and reservoir engineering with sales in the US, France and the UK.

Total is expanding its processing capability with the acquisition of several new systems from Silicon Graphics (SGI). Total’s E&P unit located in Pau, France, has bought a second 256-processor Altix 3700 supercluster and a 40-processor Altix system. The new machines will be used by Total’s geophysicists for seismic analysis and by reservoir engineers for fluid flow modeling. The Altix supercluster includes 256 Intel Itanium 2 processors clocked at 1.3 GHz, with 2 terabytes of memory. The new systems are said to add some 1.5 teraflops of processing power to Total’s installation. The system integrates with Total’s existing SGI infrastructure, in particular an InfiniteStorage CXFS shared filesystem.
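
A back-of-the-envelope check squares with the 1.5 teraflop figure, assuming the Itanium 2’s nominal four floating point operations per clock (an assumption on our part; SGI’s quoted number may be computed differently).

```python
# Peak-performance sanity check for the two new Pau machines.
# Assumes Itanium 2 at 4 flops/clock - an assumption, not SGI's stated basis.
ghz = 1.3
flops_per_clock = 4
peak_tflops = (256 + 40) * ghz * flops_per_clock / 1000.0
print(f"{peak_tflops:.2f} peak teraflops")   # ~1.54, consistent with 'some 1.5'
```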

Total USA

The 64-processor Altix 350 is for delivery to Total E&P USA where Total’s geophysical research group will develop and industrialize new seismic processing algorithms. A four-processor SGI Altix 350 system will be used in addition to the 40-processor Altix to run reservoir simulation applications.

Schlumberger

Schlumberger Cambridge Research is also an enthusiastic SGI customer and has just purchased a 32-processor Altix 3000 system with 64GB of memory. The system will be used to study geomechanics, physical chemistry, fluid mechanics, and seismics.


eLynx announces Enterprise web SCADA

The Houston Exploration Company is using eLynx Enterprise on its Arkoma basin properties.

Tulsa-based eLynx Technologies has just announced its new ‘Enterprise’ web-based SCADA solution. Enterprise gives customers remote management of production assets and associated data from an Enterprise level. New features include administration of individual user logins and security profiles, as well as tree-based groupings of assets.

Powers

eLynx VP John Powers said, ‘Enterprise lets clients manage hundreds of users and thousands of wells and to share real-time data across the organization. User management means that current data can be delivered in-context to different users, such as pumpers, field supervisors, engineers and accountants.’

Framework

The Enterprise solution lets companies create hierarchical groups of device layouts with unlimited custom attributes, an interactive trending interface and tiered, custom alarm schemes.

Simpson

Lead developer Geoff Simpson added, ‘Enterprise provides maximum flexibility for any size organization. In addition to the robust application design, eLynx Enterprise is designed to scale from a single device to thousands of devices without the exponential cost increase required by traditional SCADA systems.’ eLynx now offers custom data integrations with third party production, accounting and engineering systems.

Houston Exploration

The Houston Exploration Company (HEC) is one of eLynx’s 100 producer clients and is an early adopter of Enterprise. HEC has just extended its contract with eLynx to include its Arkoma Basin assets. eLynx now monitors 850 producing wells for HEC, some 95% of the company’s onshore production.

Hresko

HEC ops manager Joanne Hresko said, ‘eLynx provides an economic, efficient system allowing us to operate more wells with fewer people.’ eLynx Technologies was formed in August 2000 by its parent company American Central Gas Technologies.


Aspen, SIS roll-out Sim-Opt solution

.NET-based technology from AspenTech and Schlumberger offers subsurface-to-facility optimization.

AspenTech has announced availability of its integrated software solution for the oil and gas industry, the Aspen Oil and Gas Solution (AOGS). AOGS builds on Aspen’s two software components, the AssetBuilder asset modeling environment and HYSYS Upstream simulation technology. The combined solution enables companies to model, plan and optimize the production of an entire field, including multiple wells, pipeline networks and process facilities, using an integrated simulation model.

Schlumberger

The solution also integrates ‘industry standard’ third-party applications from Schlumberger Information Solutions (SIS) to support accurate and timely operational decisions. An integrated asset model is generated using Aspen AssetBuilder. Companies can improve field performance by optimizing the whole system from subsurface to producing facility.

Le Peuch

SIS VP product development Olivier Le Peuch said, ‘AspenTech’s simulation and optimization technologies complement our upstream workflow and modeling applications. The integrated asset model provides oil and gas companies with additional capabilities that support production optimization and field planning initiatives, and help identify opportunities to improve facilities utilization.’

.NET

The multiple applications and data sources that support the integrated asset model within Aspen AssetBuilder are connected in a common environment based on Microsoft .NET. .NET’s support of Web services is said to help customers ‘harness an expanded array of existing tools’, while ‘protecting their options to incorporate new features in the future’. SIS also announced a major .NET development last year.

Mikulis

Marise Mikulis, Microsoft’s oil industry manager, said, ‘By developing its solutions based on .NET, AspenTech expands the scope of integrated modeling to improve operations and increase value for oil and gas companies.’ AspenTech claims over 1,500 clients for its software including BP, ChevronTexaco, ExxonMobil, Shell, and Total.


Aveva uses Leica laser scan

3D laser scan technology aids ‘as built’ facility modeling.

At the SPAR conference in Houston this month, Aveva Solutions presented new laser scanning technology that allows retroactive capture of digital plant design data. 3D laser scanning hardware from Leica Geosystems is used to capture plant geometry ‘as built’. The ‘point cloud’ data is then fed into the plant data management system. Laser scanning hardware can resolve point accuracies of ± 6mm over 50 meters—important for accurate 3D tie-in and clash detection. Aveva claims significant savings on rework costs with the new technology.


Remote cementing ops for BP’s Valhall

BP is leveraging its DrillView software and high bandwidth fiber to move workers onshore.

Halliburton and BP Norway recently completed what is claimed to be the first remote offshore cementing job. Operations were controlled from BP’s operations center in Stavanger, 340 km from the Valhall field in the Norwegian North Sea.

DrillView

Onboard equipment and software for remotely-controlled cementing operations was installed on the field late last year. The operator’s workstation uses BP’s DrillView software and a twin screen display allowing for control of cement mixing and pumping. Data and video from offshore cameras are fed over a twin optical fiber link to the shore.

Bjordal

Audun Bjordal, Halliburton Fluids Division country manager in Scandinavia said, ‘We are now able to control and monitor many operations and processes from the beach.’ Prior cement jobs were trialed from a control room located on the platform, one deck above the cement equipment.

Paradox

Paradoxically, moving operations from the harsh offshore environment is not always appreciated by the personnel involved. In the past, generous pay and conditions meant that offshore workers could spend their leave in Spain, or remote rural locations in Norway. Technology is turning their jobs into Stavanger-based, 9-5 routines.


Input/Output acquires GX Technology

Bob Peebler’s plans for ‘full wave’ digital seismic services reach fruition with I/O’s latest acquisition.

Input/Output (I/O) has bolstered its ‘full wave’ seismic offering with the acquisition of seismic processing house GX Technology Corp. (GXT) in a $150 million cash and paper deal.

Peebler

I/O president Bob Peebler said, ‘GXT is a ‘crown jewel’ in the world of seismic technology and is recognized as a leading imaging solutions provider. GXT will play a critical role in our strategy to lead the industry into the digital ‘full-wave’ era. GXT will re-orient our company from equipment manufacturing to offering a full range of seismic imaging solutions. GXT complements our VectorSeis sensor technology, but full-wave imaging is about more than just the sensor – it’s also about planning, field execution and advanced processing.’

PSDM

GXT’s processing offerings include time processing, velocity modeling and pre-stack time and depth migration. GXT also provides value-added services including survey design, project management and quality control. GXT pioneered pre-stack depth migration techniques, originally applied in the Gulf of Mexico.

Lambert

Mick Lambert, CEO of GXT, added, ‘Our companies share a vision of a high tech seismic company focused on solving our clients’ imaging problems. The combined company will be able to tackle some of the most challenging seismic opportunities in geophysics: full-wave, multi-component processing and 4-D imaging.’

Debt

The $150 million purchase price includes the assumption of $4.5 million in debt and the delivery of I/O stock options with a value of approximately $15.5 million. I/O plans to issue $100 million of common stock to finance the deal along with a new loan. In calendar 2003, GXT reported revenues of $49 million. For more on I/O’s ‘full wave’ plans read our interview with Peebler in OITJ Vol. 8 N° 12.


Major SAP roll-out for India’s ONGC

SAP is now live in ONGC—serving 5,200 users while Statoil reports JV accounting success.

SAP reports ‘continued success’ for its upstream industry-specific application, SAP for Oil and Gas. In one of the largest ever SAP go-lives in Asia, Oil and Natural Gas Corporation Limited (ONGC) has rolled out the software at over 100 locations across India.

5,200 users

SAP now serves over 5,200 ONGC users and provides ‘a single companywide platform to integrate and optimize all business processes’. Key components deployed by ONGC include Remote Logistics Management and Production Sharing Accounting.

Kaviraj

ONGC executive director Amitava Kaviraj said, ‘We chose SAP to cover our end-to-end business process needs. Now, all information on company activities is available online, 24/7 to support tactical and strategic decision-making. Operations are tightly coupled, increasing productivity and reducing costs. For customers, this means better service; for ONGC, a fundamental competitive advantage.’

Statoil

Kjell Petter Gilje described Norwegian Statoil’s experience as an early adopter of SAP’s Joint Venture Accounting solution saying, ‘We have integrated and streamlined cash call and joint interest billing processes which contribute directly to easier handling, faster processing and collection of funds from our venture partners. Standard month-end joint venture processing enables Statoil to achieve faster month-end closing, leading to faster financial reporting at less cost.’

mySAP

SAP for Oil & Gas is a customization of the mySAP Business Suite and leverages SAP’s NetWeaver web services infrastructure. SAP claims over 500 customers in the oil and gas industry worldwide, comprising more than 750,000 users.

