Schlumberger Information Solutions (SIS) has acquired the assets of Decision Team, an oil and gas software and consulting services firm based in Baden, Austria.
Decision Team's flagship Decide! software provides intelligent reservoir surveillance and production optimization. Decide captures, analyzes, conditions and transforms historical and real-time production data into actionable operational decisions.
Decision Team MD Michael Stundner said, "Production engineers can leverage this immense volume of data while focusing on well and field-level problems. We look forward to integrating Decide with Schlumberger's suite of production software to enable production optimization workflows such as simulation history matching."
SIS president Peter Goode said, "The combination of SIS and Decision Team will provide a comprehensive set of petroleum engineering workflows for real-time production optimization and proactive reservoir management. Decide will be a catalyst for enhancing production and augments our real-time capabilities." Schlumberger told Oil IT Journal that Decision Team personnel will continue work on Decide within SIS.
Huge volumes of operational data present a challenge for today's reservoir and production engineers. Decide transforms raw data into pertinent information and offers notification systems and ranked lists of underperforming wells. Automated event detection replaces routine field surveillance, resulting in significant time savings.
Decide applies artificial intelligence (AI) decision analytics to reservoir and production engineering, generating usable information from these large data volumes. Data mining techniques available include self-organizing maps, multiple linear regression and neural nets. This analytical data mining supports diagnostics and predictive modeling for activities such as optimizing field injection-production ratios, artificial lift performance and smart well control.
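Of the techniques listed, multiple linear regression is the simplest to sketch. The Python fragment below is a hypothetical illustration, not Decide's actual code: it fits a regression of production rate against well properties and ranks wells by how far actual production falls below the model's prediction. All well names and numbers are invented.

```python
import numpy as np

def rank_underperformers(features, rates, well_names):
    """Fit rate ~ features by least squares; rank wells by how far
    actual production falls below the model's prediction."""
    X = np.column_stack([np.ones(len(rates)), features])  # add intercept column
    coeffs, *_ = np.linalg.lstsq(X, rates, rcond=None)
    residuals = rates - X @ coeffs  # negative = producing below expectation
    order = np.argsort(residuals)   # most underperforming first
    return [(well_names[i], residuals[i]) for i in order]

# Synthetic field: rate depends on permeability and pressure, plus noise.
rng = np.random.default_rng(0)
perm = rng.uniform(50, 500, 20)
pressure = rng.uniform(2000, 4000, 20)
rates = 2.0 * perm + 0.5 * pressure + rng.normal(0, 50, 20)
rates[7] -= 800  # well 7 is artificially made to underperform
names = [f"WELL-{i:02d}" for i in range(20)]

ranking = rank_underperformers(np.column_stack([perm, pressure]), rates, names)
print(ranking[0])  # the flagged well should top the list
```

A real surveillance system would of course run such a ranking continuously against live production data rather than a static snapshot.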
Read the book!
For more on the Decision Team approach, see our review of the book Oil and Gas Data Mining in Oil ITJ Vol. 9 N° 2.
Weatherford International has acquired Edinburgh Petroleum Services (EPS), a move that further enhances Weatherford's leading position in Production Automation and Optimization. EPS' asset management software optimizes the design of networks of wells and production facilities. EPS also offers well test analysis, material balance and well performance modeling tools, and has a strong reputation for consultancy and training.
Dharmesh Mehta, head of optimization with Weatherford, said, "We have over 40,000 wells around the world using automatic control. EPS allows us to extend this dominant position in well optimization into complete asset optimization."
EPS MD Laurence Ormerod added, "Weatherford's strength in all aspects of artificial lift systems, including real-time data acquisition and control, is an excellent match for our optimization skills."
5 point plan
The EPS acquisition completes Weatherford's five point plan for production optimization: from completion hardware and sensing, through artificial lift sensing and software, to field optimization.
There are two kinds of data managers: those that are just doing it, and those that are still waiting for a silver bullet to do it for them. This really is my take-home from the excellent PNEC Petroleum Data Integration* conference held in Houston this month. Our complete report on the 8th PNEC will appear in next month's Oil IT Journal, and of course as part of The Data Room's extended Technology Watch Report service. But I thought that you might like a preview in the form of some thoughts on where data management is today.
I believe it was the good old Général De Gaulle who said "plus ça change, plus c'est la même chose." Indeed it is easy for regular attendees at the PNEC, observing a certain sameness in the debates, to conclude that nothing has changed, that we are confronted with the same old problems of expanding data volumes, poorly applied rules and procedures for naming and capturing data, and lack of funding. A couple of years back, a variety of solutions were suggested, usually combining outsourcing with re-engineering the workflow. Such solutions tended towards a production line approach: Taylorism applied to managing the upstream workflow.
Frederick Taylor, the original management guru, wrote his Principles of Scientific Management in 1911. Taylor advocated** developing a science for every job, including rules, motion, standardized work implements and proper working conditions. With great prescience, Taylor also advised selecting workers with the right abilities for the job, training them and offering proper incentives and support.
Such notions were central to industry for the best part of the last century, from Henry Ford's production lines to W. Edwards Deming's quality management, and maybe even to our upstream workflow re-engineers. But the production line approach implies a considerable degree of stability in work processes. There is no point in retooling and training everyone unless you are going to be manufacturing some product for a considerable time. Likewise, there is no point establishing a set of data management procedures if your data sources are going to change, or if new technology is going to come along and change the way you work.
This is the problem of applying Taylorism to a moving target. And upstream targets have shifted considerably in the last few years, with much more post-stack data online, horizontal wells with ghastly data management issues, multi-z and image logs, and with time-lapse and four-component data on the horizon. All of this is set in a context of exponential growth in data volumes.
An illuminating discussion followed Yogi Schultz's talk at the PNEC, when Ian Morison of the Information Store questioned the notion that data volumes are the problem. Morison argues that if it were just a matter of increasing volumes, then our IT solutions would be more than capable of keeping up (thanks to Moore's Law and growing disk capacity). Morison put his finger on what is undoubtedly the real issue in data management: the increasing complexity of upstream data and workflows.
Data complexity defies the Taylorist approach. If you are trying to collate GIS data from multiple coordinate reference systems then you really need a good understanding of geodesy. You are also unlikely to apply exactly the same skill sets two days running. Modern logging tools defy the simple depth-value pair paradigm and require serious domain knowledge for their management. The management of multiple pre- and post-stack seismics likewise requires a goodly degree of geophysical knowledge.
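To see why multi-CRS collation needs geodetic care, consider this minimal Python sketch: data delivered as geographic longitude/latitude cannot simply be overlaid on projected data until both sit in a common coordinate system. Spherical Mercator is used here purely for illustration; real work needs a geodesy library with proper datum handling, and the well location is invented.

```python
import math

EARTH_RADIUS_M = 6378137.0  # WGS84 semi-major axis, used as a sphere here

def to_spherical_mercator(lon_deg, lat_deg):
    """Project geographic lon/lat (degrees) to spherical Mercator metres."""
    x = math.radians(lon_deg) * EARTH_RADIUS_M
    y = math.log(math.tan(math.pi / 4 + math.radians(lat_deg) / 2)) * EARTH_RADIUS_M
    return x, y

# Dataset A is already projected in metres; dataset B arrives as lon/lat
# and must be re-projected before the two can be collated on one map.
well_b_lonlat = (-90.0, 28.5)  # hypothetical Gulf of Mexico location
well_b_xy = to_spherical_mercator(*well_b_lonlat)
print(well_b_xy)
```

The hard part in practice is not the projection formula but knowing which datum and CRS each data set was recorded in, which is exactly where the domain knowledge comes in.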
But it can be done! What makes for good data management is a well-funded project and here, demonstrable progress is being made. Both ExxonMobil and ChevronTexaco presented major data clean-up projects at the PNEC. These centered on the merger of well data from heritage companies and are great examples of what can be achieved when adequate resources are applied to such problems. The mergers have had the effect of a shot in the arm for data management. They appear to be succeeding where years of pontificating and theorizing have failed.
The cleanup of the majors' heritage data sets is arguably the big driver in data management today. It is spinning off a new breed of software tools and contractor know-how as a new micro-industry is born. Above all, I think the majors' approach shows that spending fairly substantial amounts of money on data clean-up is really part of the cost of doing business.
Fabric of management
As we map the processes developed for well header data across to the more complex parts of the workflow, the move away from Taylorism will be even more pronounced. We are no longer looking at a sausage machine approach to data management, but at the incorporation of domain knowledge into the fabric of data management.
Just do it
It is the combined requirement of domain knowledge and grunt work that makes it hard to get traction for data management, but the majors are showing the way. So my advice to you all is: just do it!
* Petroleum Network Education Conferences, Philip C. Crouse & Associates.
** Source www.cornell.edu. Google "Taylorism."
Oil ITJ - Where does software fit into your organization?
Befeld - Baker Hughes International has six divisions. Baker Atlas handles wireline and formation evaluation. Atlas itself has three further subdivisions, including Technology with Shraga Wolf as VP and myself as Director. We are located in the Houston Technology Center and the UK Recall unit.
Oil ITJ - Halliburton has Landmark, Schlumberger has GeoQuest; where is Baker's software brand?
Befeld - Software development is dispersed throughout Baker. All divisions have products that are licensed to oil company clients who use Baker tools. Baker does not see a separate software business outside of its core divisions. Software is a creative process and shouldn't get too big. Witness the size of our Recall unit, with 25 people.
Oil ITJ - What software is developed at the Houston Technology Center?
Befeld - Mostly software for logging tools, surface instruments and Inteq-related products. Such software is generally tied in with our tools, and not packaged.
Oil ITJ - But they may relate to Recall...
Befeld - Sure. Part of my job is to help clients customize their own environments. If Agip likes our acoustic log processing software, we'll fix a license for their in-house use. Likewise for the vertical seismic processing toolkit SeisLink, which was developed through our joint venture with CGG, VSFusion.
Oil ITJ - So what's the product line-up?
Befeld - The heart of our system is the database. This was designed from the ground up (by Chris Hanley, who also heads our Recall unit in London) and includes industry-specific data structures. This represents some sophisticated programming, as much well data is recorded against both depth and time. Recall is really best in class for multi-z and image data. The software started out as Incline, for measuring dipping beds; the imaging tools grew out of this. Petros, the petrophysical product, is now being pushed into LogScape and offers linked views: points, histograms, image data and flags showing z locations of selected analysis points.
Oil ITJ - Did you use any third-party tools for the Windows port of Recall 5.0? Where are you in the OpenGL vs. DirectX debate?
Befeld - We didn't use any third-party tools for the Windows port. All our development is graphics-neutral, but we are following the technology in this space.
Oil ITJ - WellLink communications are a likely game changer in this space.
Befeld - Indeed, Atlas Online built a satellite link to BH Direct. Now a customer with a laptop can see what's happening at the rig in near real time, maybe with a three-second delay, via the Recall system. The possibilities are amazing: a user could receive a quick-look analysis on a personal digital assistant (PDA). This has major implications for decision makers! These folks used to be in the logging unit making big money decisions on the hoof, and sometimes on their own!
Oil ITJ - Does your software interface with other petrophysical analysis tools?
Befeld - We write our software for our logging tools, so users tend to use these. But of course Recall is the exception, and will work with all service companies' logging tools. Our core business is logging, so surface systems are the key development targets.
Oil ITJ - Well site processing seems a bit anachronistic. Why don't you stream all your data to Recall and process it there?
Befeld - Such ideas are always being kicked around. Maybe smarter logging tools will be able to push data up to the surface and on to a database. But downhole is a very harsh environment and these tools are very complex and must work at 400°F and 20,000 psi. I guess the answer is part history, part the organization of our core business of surface acquisition.
Oil ITJ - Recall 5.0 on Windows is a big change for your market.
Befeld - Windows is very important to us. Recall, which is in every major oil company, has always run on big systems. Windows opens up a larger end user market, which we will be targeting aggressively.
Following its aborted sortie into IT consultancy via the acquisition and subsequent sale of Sema Group, Schlumberger's upstream consulting business has lacked visibility. This is about to change with the creation of a new Schlumberger Information Solutions (SIS) unit, the Schlumberger Business Consulting group (SBC). SBC is focused on what is described as the growing demand for optimization of upstream operations. The new unit has been formed from the SIS software technology, information management, and network and infrastructure services activities.
Heading up the new unit is Antoine Rostand, who was president of EDS France before joining Schlumberger, where he was VP consulting and systems integration, Europe, for Schlumberger-Sema.
SIS president Peter Goode said, "Our business is evolving, with changing operational issues and accelerating workforce dynamics. The next step change in value creation will see IT, new technologies and process redesign integrated with core E&P processes. Our domain experience and the capability to provide an end-to-end solution uniquely position us to assist our customers in meeting these challenges."
SBC will work with E&P companies on strategy and organization design and implementation to enhance core operational processes and to realize productivity gains through IT-enabled workflow design.
As revealed in OITJ earlier this year (Vol. 9 N° 2), the United States Minerals Management Service (MMS) is to implement a new electronic eWell reporting system. eWell lets operators exchange well data with the MMS Gulf of Mexico Region's district offices.
Current MMS regulations require companies to submit specific paper forms for completed and planned well activities. As part of a larger electronic government reengineering effort, MMS has restructured six well permit and report forms and made them accessible electronically. The new system replaces paper versions of permits to drill, permits to modify wells, well activity reports, end of operations reports, and rig move notifications.
eWell now puts these forms online, where information can be submitted via a secure internet site in lieu of paper submission. The system pre-populates forms with previously submitted data stored in the MMS database. Automated help screens will speed form completion and improve accuracy.
According to MMS director Johnnie Burton, "MMS analyses show our new eWell permitting and reporting system will reduce processing time for the 20,000 applications each year by 50%, thus reducing costly rig waiting time." eWell is scheduled for rollout in June. Earlier this year, Burton received the prestigious Women in Energy leadership award.
Statoil has awarded Paradigm Geo and Landmark Graphics five-year deals for the provision of upstream software. Paradigm is to supply its Explorer and Geolog applications running on its epos 3.0 integration platform. Geolog is recognized as Statoil's mainstream petrophysical tool, while Explorer fills a similar role in time-depth conversion workflows.
Statoil's Erik Gundersen commented, "Explorer is a powerful time-depth conversion solution. The software integrates well and seismic information with mapping tools, geostatistics and uncertainty, providing geoscientists with the means to generate velocity models and convert the original time interpretation."
In what was described as a multi-million dollar deal, Statoil is extending its software contract with Landmark in the prospect generation, field development planning, drilling and completion areas. The agreement represents a significant extension of Statoil's existing OpenWorks implementation, described as the foundation of Statoil's integrated information management strategy. The original deal with Landmark was signed five years ago.
Houston-based INT and Petris Technology will be offering web-based well log viewers as part of the PetrisWINDS Enterprise (PWE) system. PWE, the vendor-neutral, web-based data and application environment, now offers users multiple well displays, cross plots and more.
INT VP Paul Schatz said, "By making interactive data display available via the internet, PWE users will benefit from state-of-the-art geoscience delivered efficiently and economically to the desktop."
Petris VP Jeff Pferd added, "INT offers high quality visualization and state-of-the-art technical foundations. We are pleased that INT's technology is an integral part of our data management application."
Blue Marble Geographics has released a new version of its GeoTransform library for GIS developers. GeoTransform 5.0 supports Visual Basic, C++, Delphi, PowerBuilder and C++ Builder, and lets developers embed sophisticated image re-projection and tiling in their applications.
Blue Marble president Pat Cunningham said, "GIS developers need to work directly with raster imagery in their applications. But they don't need to reinvent the wheel! GeoTransform provides affordable technology and reduces time-to-market."
Blue Marble claims over 120,000 customers in 100 countries. More from www.bluemarblegeo.com.
At the PNEC Data Integration conference (of which more in next month's Oil IT Journal) Innerlogix pretty well stole the show, or rather neatly arranged for ExxonMobil and ChevronTexaco to steal the show on its behalf. Both supermajors are enthusiastic users of Innerlogix' DataLogix data cleanup tool and have used it extensively in the merger of their heritage data sets.
Work done for ExxonMobil in particular has led to the development of a new, batch-oriented data cleanup tool which is just about ready for commercial release. Innerlogix president Dan Heggelund told Oil IT Journal, "QCLogix represents a new direction in data management: data confidence. QCLogix monitors upstream data and displays the results on a quality dashboard. Understanding quality is the cornerstone for building confidence in the data. QCLogix is based on a proven methodology for defining, measuring, analyzing, improving and controlling the quality of E&P data."
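QCLogix internals are proprietary, but the underlying idea of rule-based quality measurement feeding a dashboard can be sketched in a few lines of Python. In this hypothetical illustration (all records, field names and rules are invented), each well-header record is scored against declarative rules and the pass rate per rule is reported:

```python
def quality_dashboard(records, rules):
    """Return {rule_name: fraction of records passing} for a list of dicts."""
    report = {}
    for name, check in rules.items():
        passed = sum(1 for record in records if check(record))
        report[name] = passed / len(records)
    return report

wells = [  # invented well-header records with typical defects
    {"name": "A-1", "lat": 29.7, "lon": -95.4, "td_ft": 9500},
    {"name": "A-2", "lat": 0.0, "lon": 0.0, "td_ft": 12000},   # "null island" coords
    {"name": None, "lat": 28.9, "lon": -90.1, "td_ft": -100},  # missing name, bad TD
]

rules = {
    "has_name": lambda r: bool(r["name"]),
    "coords_not_null_island": lambda r: (r["lat"], r["lon"]) != (0.0, 0.0),
    "td_positive": lambda r: r["td_ft"] > 0,
}

report = quality_dashboard(wells, rules)
print(report)
```

Tracked over time, such pass rates are what turn a one-off cleanup into continuous quality monitoring.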
At the PNEC, ChevronTexaco's Mike Underwood said, "QCLogix will enable us to better organize our data cleanup, monitoring, validating and checking our databases for us. Batch mode processing should enable us to cut the time it takes to perform these activities in half and will enable us to increase the frequency of our runs to more than one a month."
For more on how Innerlogix uses statistics, geostatistics, business logic and fuzzy logic for data QC, read Heggelund's May 2002 article in Oil IT Journal (Vol. 7 N° 5).
MetaCarta has productized its oil industry-specific text-and-GIS search tool as geOdrive. geOdrive helps geoscientists locate text documents stored on a shared drive, the company intranet or a corporate portal using geographic locations, keywords and time parameters. Documents are then organized on a map according to their geographic references.
Mike Odell, head of MetaCarta's energy unit, said, "Oil and gas companies depend on knowing as much as possible about geographic locations. Workers need to know where to drill an exploratory well, where to place an offshore platform and where to locate a retail outlet. Geography is central to the operations."
geOdrive uses a geoparsing engine to determine the spatial location of documents on maps. MetaCarta claims a significant investment in its oil and gas sector gazetteer, which holds millions of place names and industry-specific locations such as blocks, quadrants and leases within the Gulf of Mexico and the North Sea.
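MetaCarta's engine is proprietary, but the core idea of gazetteer-driven geoparsing can be sketched in Python: match document text against a dictionary of named locations and attach coordinates. The gazetteer entries, coordinates and sample document below are purely illustrative.

```python
# A toy gazetteer: place name -> (lat, lon). Real gazetteers hold millions
# of entries, including industry locations like blocks and leases.
GAZETTEER = {
    "green canyon": (27.5, -90.5),     # Gulf of Mexico protraction area
    "forties": (57.7, 0.9),            # North Sea field
    "quadrant 204": (60.3, -4.0),      # UK continental shelf
}

def geoparse(text):
    """Return [(place, (lat, lon))] for every gazetteer entry found in text."""
    lowered = text.lower()
    return [(place, coords) for place, coords in GAZETTEER.items()
            if place in lowered]

doc = "Well report: drilling continues in Green Canyon; Forties tie-in planned."
hits = geoparse(doc)
print(hits)
```

A production geoparser must also disambiguate (is "Forties" a field or a latitude band?), which is where the claimed investment in an industry-specific gazetteer pays off.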
OpenSpirit Corp. is releasing a well log curve loader with a direct connection to A2D's SilverWire web-based log data delivery service. The system will allow users to seamlessly compare and transfer log data between industry-leading exploration and production systems.
OpenSpirit's middleware integrates industry-standard datastores including GeoFrame, Finder and OpenWorks. SilverWire lets users query and download data from A2D's Log-Line Plus well log database. Digital curves, raster images and SmartRaster depth-calibrated images can be accessed from the workstation.
OpenSpirit CTO Clay Harter said, "SilverWire connectivity from A2D greatly enhances access to log data. End users within other OpenSpirit-enabled applications and datastores will be able to connect and query against A2D's catalog of digital log curves, compare to existing logs, and download live without any intermediate file transfer or other format changes."
Fugro-Jason has released the second edition of its PowerLog petrophysical software. PowerLog SE updates Petcom's Windows-based well log analysis tool, originally released in the early 1990s.
PowerLog SE introduces a new user interface and adds a cross-section montage capability to the Collage Tool.
Users can now incorporate and correlate depth-registered graphical images such as scanned logs, maps and core images. PowerLog SE will be introduced to the industry at the upcoming SPWLA annual conference in Noordwijk (Netherlands) and the EAGE annual conference in Paris next month.
Wood Mackenzie director David Black gave an update on the 2003 study on Upstream Value Creation. WoodMac categorizes Repsol and ConocoPhillips as black holes: companies which fail to replace reserves and whose exploration erodes value. Over the 1997-2003 period, the 25 majors studied provided an average 11% return on investment (ROI) from exploration. Some geographical areas are out of favor with WoodMac, notably the UK North Sea, with $2 billion of value destruction (on an $11 billion investment). Worldwide, onshore and shelf environments destroy value. Only the deepwater and Agip's Kashagan discovery have created value. Acquisitions have provided a 12% ROI, mostly because these deals were done during a period of low oil prices. While there is plenty of life left in deepwater, new reserve quality is an issue. The value of the discovered barrel is going down with a move to higher tax regimes, stranded gas, longer lead times and fewer giant fields. Another problem is that investment is constrained by dwindling opportunities, which will lead to increased competition.
Rob Ryan described ChevronTexaco's (CT) portfolio management, conducted by centralized exploration review teams (ERT). Since the mid-1990s, CT's wildcat success rates have been constant at around 30%. The average discovery is 50 million bbl. Ryan stated that for CT, "The problem is not a lack of investment dollars, we have the money. Access to opportunities presents some challenges but there are significant opportunities. The real issue is efficiency."
Ryan believes that the industry should focus on selection and prediction process efficiencies, from technical assessment through risk evaluation, review and planning. The ERTs were created to ensure consistency in exploration review through multi-discipline risk analysis. A 2002 study compared ERT and asset team evaluations. The asset teams were wildly optimistic compared with the ERT evaluations. Ryan observed that "There is no better way to destroy value than the high risk prospect." Exploration workflows should focus on the basics: amplitude risking standards, seal standards, reservoir quantification standards and hydrocarbon charge standards. "It's Nintendo exploration, plus!" In 2002, ChevronTexaco was best in class for exploration success according to Deutsche Bank.
Mike Bahorich explained how Apache Corp. gives local units decision-making power, but measures centrally and rewards success through its "rule 43" incentive system. This means that when Apache stock goes through $43 for over 10 days, staff get a 100% salary bonus! Apache is a smart shopper, buying common off-the-shelf (COTS) technology at the right price for horizontal wells, 3D seismic etc. Technology watch is important for Apache, looking out for emerging and especially disruptive technologies. Apache is benefiting from the falling cost of storage and now keeps pre-stack 3D seismic on disk.
"Get back to exploration" was the entreaty from IHS Energy's Pete Stark. There has been a precipitous drop in gas discoveries in the last three years. A dramatic change in operator mix has also occurred, with a move from the western majors to the NOCs and former NOCs. The exploration slump is a cause for concern. The world has failed to replace production for the last 20 years.
Pason Systems will shortly be releasing AutoDriller, control software which maintains constant drilling parameters, especially for horizontal wells. The feedback system is tied to the electronic drilling recorder and adapts to its environment. AutoDriller runs atop Pason's Electronic Drilling Recorder (EDR), a hardware, software and database combo.

Epoch Well Services now offers real-time data feeds to users of its myWells.com well data gathering and distribution system. Users can follow drilling activity from anywhere over a secure internet connection.

Geologix' WellExplorer lets companies set up a departmental-level intranet for dissemination of well summaries, logs and reports. WellExplorer uses Microsoft's Internet Information Server (IIS) with a SQL Server back-end. The software uses the Geologix GEO dynamic document structure and leverages the emerging WITSML standard.
Core Laboratories has expanded its digital data management offering with the Rapid core database and RIB, an HTML-based data archiver. Rapid stores well information, core imagery, thin sections, SEM data, poro-perm data, cross plots and photomicrographs. The focus is on rocks. Data can be exchanged through LAS and CSV files and Oracle. RIB provides web-based archiving and reporting of data in Rapid.

Production Geosciences' Oilfield Data Manager (ODM) now sports an integration canvas for display of cultural and regional data alongside the geology. ODM offers data management, correlation displays and wells and surfaces in 3D. ODM is used by Shell to QC data before entry into the corporate database and by Saudi Aramco for stratigraphic correlation.

The 2004 release of Fugro-Robertson's Tellus database adds 8 million data points of geochemical source rock and seep data. On-the-fly mapping from the database includes petroleum systems and chromatography plots.

The OilTracers web site offers free searching of an oil sample library, a database of over 33,000 oil, gas and rock samples from all over the world. A second product, OilRef, holds over 11,000 citations from 300,000 pages of geochemical literature.

Zebra Geosciences' EZDataRoom is an electronic data room. Well log and SEG-Y data can be viewed from a web browser. Data remains in the data room and does not need to be passed to a user, enhancing security. If required, configurable user-based security allows for printing and download of information.

Both Seismic Micro Technology and TerraSciences are developing OpenSpirit data servers for their interpretation software. These will simplify integration with other vendors' interpretation packages.
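Of the exchange formats mentioned above, LAS is the workhorse for well log data. As a sketch only, here is a minimal Python reader for the ASCII data (~A) section of an LAS 2.0 file; a production reader must also honour the ~Well and ~Curve header sections, wrap mode and null values. The sample content is invented.

```python
def read_las_data(lines):
    """Return data rows (lists of floats) from the ~A section of an LAS file."""
    rows, in_data = [], False
    for line in lines:
        if line.startswith("~A"):   # marks the start of the ASCII data section
            in_data = True
            continue
        if in_data and line.strip():
            rows.append([float(value) for value in line.split()])
    return rows

sample = [                          # a tiny invented LAS fragment
    "~Version ---",
    " VERS. 2.0 :",
    "~A  DEPT  GR   RHOB",
    " 1000.0  45.2  2.31",
    " 1000.5  47.8  2.29",
]
rows = read_las_data(sample)
print(rows)
```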
Geoff Dorn, of the BP Center for Visualization at the University of Colorado, has developed patented technology (originally from Arco) for automated fault extraction (AFE) from 3D seismic data volumes. The software will likely be commercialized as a plug-in to Paradigm's VoxelGeo.

LithoTect Interpreter from Geo-Logic Systems is a low-cost version of Geo-Logic's geological map, well, seismic, cross-section and 3D interpretation tool. Interpreter includes depth conversion and well picking, monitoring and projection capabilities. Interpreter is a pure Java application that runs on laptops and workstations.

GMI Imager from GeoMechanics International now has write-back capabilities with Landmark's OpenWorks. Imager analyses can be written back to the database for correlations and other applications.

The Kansas Geological Survey was showing elements of the new North American Cyberinfrastructure, an electronic grouping of various geological resources across the USA. One component, the GEON Grid seismic infrastructure, is a federation of ArcIMS servers supporting the National Carbon Sequestration database. CHRONOS is a national stratigraphic portal providing access to distributed databases across the country. Parts of the Cyberinfrastructure leverage emerging semantic web standards for ontologies, such as OWL.

Isatis V5.0 from Geovariances adds multi-Gaussian simulation, a pixel-based technique. Isatis can be used in stand-alone mode, or coupled to Petrel, Gocad, RMS, RML and PowerModel. According to Geovariances, vendors are now offering Isatis links within their own packages.

Schlumberger was showing its Chaos Cube, a Coherence Cube look-alike that highlights high-density faulting and fractures. The Chaos attribute is thought to react to the presence of fluid and may provide a 3D picture of gas migration.

Schlumberger was also showing GigaViz on a 16-node Linux cluster, which was down. GigaViz offers a lot of sophistication for image processing and ad hoc, rule-based voxbody extraction. But the demos fail to capture attention in the way that Magic Earth does.

4D Vista from Midland Valley is described as "an Adobe Acrobat for 3D" and now links to the 2DMove and 3DMove toolsets to provide an integrated structural analysis workbench.

A new adaptor for Neuralog's Neura-Scanner lets you scan transparent (film) logs with transmitted light. SeeReal Technologies was showing novel glasses-free true 3D visualization. An eye-tracking device on the display adjusts the 3D display to the user's head movements.
Dave Abbot, consultant, warns all you deal makers out there that deal promotions are subject to the anti-fraud provisions of state and federal law. Transparency is the key to dealing with investors. Violation could result in you losing your home!

Evendi et al. (ChevronTexaco) described an XML schema for kinetic data in GoCad. Gary et al. (Tramontane Inc.) described Unocal's web portal for biostratigraphy and log-derived sand count. Hodge et al. (Midland Valley) combine digital elevation models (DEM) with satellite imagery and structural modeling.

Larue (ChevronTexaco) claims field maturity does not necessarily equate to reduced volumetric uncertainty. Larue showed examples of uncertainties of over 50% in mature fields. Databases and reservoir modeling studies downplay the importance of depositional environment on recovery.

Loseth et al. (Norsk Hydro) have collected 3D digital outcrop data with GPS and laser scanners. A fluid flow model integrated surface, core and other "behind the outcrop" data, including shallow seismic and ground penetrating radar. Comparisons with less constrained models showed the importance of capturing true reservoir heterogeneity.

Preston et al. (HRH Ltd.) use Bezier curves to categorize lithological units. These are stored along with geological data such as grain size, and can be scaled and integrated with other digital data such as wireline and MWD logs.
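Preston et al.'s method is only named above; as a hypothetical Python sketch, this is how the primitive they describe, a cubic Bezier curve, is evaluated (de Casteljau's algorithm), here used to represent a smooth grain-size profile through a unit. The control points are invented.

```python
def bezier_point(p0, p1, p2, p3, t):
    """Evaluate a cubic Bezier at parameter t in [0, 1] via de Casteljau."""
    def lerp(a, b, t):
        # linear interpolation between two points, component by component
        return tuple(a_i + (b_i - a_i) * t for a_i, b_i in zip(a, b))
    a, b, c = lerp(p0, p1, t), lerp(p1, p2, t), lerp(p2, p3, t)
    d, e = lerp(a, b, t), lerp(b, c, t)
    return lerp(d, e, t)

# (normalized depth, grain size) control points for one fining-upward
# unit -- invented values for illustration only
ctrl = [(0.0, 2.0), (0.3, 1.8), (0.6, 0.6), (1.0, 0.1)]
mid = bezier_point(*ctrl, 0.5)
print(mid)
```

Because the whole unit is captured by four control points, such curves scale naturally, which is presumably what lets them be integrated with wireline and MWD data at any sampling rate.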
This report is abstracted from a 25-page illustrated report produced as part of The Data Room's Technology Watch Reporting Service. For more information on this subscription-based service, please email email@example.com.
LSI Logic has renamed its Storage Systems unit Engenio Information Technologies in preparation for an IPO.
Common Data Access is seeking an experienced E&P data management professional to manage its key projects.
Calgary-based CleanDB is no more; Brian Marshall, who developed a mySQL version of the PPDM database, is now operating as an independent contractor.
Simulation specialist Computer Modeling Group (CMG) has announced a multi-year $300,000 annual software license agreement with a major multinational company.
Paul van Riel is to replace Kobi Ruegg as CEO of Fugro's development and production business. Van Riel was a founder of Jason Geosystems, acquired by Fugro in 2001. Ruegg now heads up Fugro's offshore survey unit.
John Ebbern is MD of Geotrace's new Aberdeen office. Ebbern was previously with Western Geco.
James Webster has been named COO of geopressure specialists Knowledge Systems Inc. (KSI).
Potential field specialist Fugro-LCT has consolidated its London facility, re-locating it to Houston.
Patrick Keenan is now president of GeoMechanics International (GMI). Keenan was previously VP business development with Core Lab.
Ron Matros has joined MetaCarta, Inc., as CEO. Matros was previously CEO of iConverse.
Donn Wilson has joined Quorum Business Solutions as land and ownership advisor. Wilson was previously with ChevronTexaco.
Roxar CEO Sandy Esslemont has relocated to Houston to spearhead Roxar's targeting of the North American market.
Mike Wiley is to retire as chairman and CEO of Baker Hughes when his term expires in 2005.
Divestco is to sell Dynamic Graphics' 3D visualization and modeling technology in Canada. The company is also developing the Divestco DataStore, which will allow customers to manage and access their seismic data.
Halliburton's Energy Services Group posted first quarter 2004 revenues of $1.8 billion, a $205 million increase over the first quarter of 2003. Operating income of $214 million was also up, by $34 million. Landmark and Other Energy Services first quarter 2004 operating income was $29 million on revenues of $129 million (up 5%). The record first quarter was due primarily to increased software sales.
Core Laboratories had an excellent quarter with all-time revenue highs. Notwithstanding this, Core reports a $6.5 million loss for the period due to a write-down on the sale of its Reservoir Technology division to Paradigm. In its 2003 annual report, Core reveals that it paid just over $10 million cash for its 2002 acquisition of Advanced Data Solutions.
BHI reported Q1 2004 income of $95.3 million, up 90% over the same period last year. Mike Wiley, Baker Hughes chairman and CEO, expects "strong activity … to continue and margins to improve throughout the year."
Divestco announced earnings per share of 3.9 cents for the quarter, compared to a loss of 0.7 cents for the same period in 2003 - an increase of 657% according to Divestco's financiers! Software generated revenue of $5.1 million for 2003, up 60% over 2002.
Input/Output's first quarter net income was $591 thousand on revenues of $36.3 million. These compare with a net loss of $5.3 million on revenues of $41.2 million for the same period a year ago. Bob Peebler, I/O's president and CEO, said, "The first quarter contained many positive events for I/O."
Kelman Technologies Inc. announced a net loss of $136 thousand for the first quarter, an improvement of $800 thousand over the same period last year. Kelman reports modest improvement from its data management division, particularly in the US. While the Houston market remained low, the disappearance of two major competitors has thinned out the competition.
TGS-NOPEC's consolidated net revenues were $28.7 million, down 6% on Q1 2003. Operating profit of $6.6 million was down 39%.
Seitel reports first quarter revenues up 36% to $41 million, with income from operations up 168% to $8.5 million. Seitel chairman Fred Zeidman said, "We've made significant progress in our turnaround and we continue to work hard to emerge from bankruptcy as soon as possible."
Aspen Technology's revenues for its third fiscal quarter totaled $80.7 million, of which $35.9 million came from software. Dave McQuillin, president and CEO, said, "Our customer base is actively looking to invest in IT solutions."
Peters & Co.
Underwhelming results characterize the service sector and contrast with excellent performance from virtually all oil and gas producers. From May 2001 to May 2004 the Peters & Co. PE Integrated Oils index rose from around 3,000 to 5,000. The oilfield service sector index over the same three-year period has had a switchback ride, down from 10,800 to a low of 6,600 at year-end 2001, and back up to a current level of 9,000. More from www.petersco.com.
Qatar Petroleum (QP) is implementing an upstream enterprise resource planning (ERP) tool from P2 Energy Solutions (P2ES). QP will be using the P2ES Enterprise Upstream (EU) suite of applications to support its oil and gas production management and reporting in Qatar. P2ES was awarded the contract following a public tender. QP's EU implementation replaces four internal legacy systems.
P2ES president of international operations Tarig Anani said, "Our software integrates with ERP systems including SAP, Oracle, and PeopleSoft. QP's licensing of four EU modules is a strong testimony to the international appeal of the software." Other P2ES clients in the Middle East include… P2ES is represented in the region by local e-business solution provider iHorizons.
P2ES president Gary Vickers told the Denver Post earlier this year that his company was now the second largest player in the oil and gas ERP market, behind SAP. Other EU customers include ConocoPhillips, Unocal, BHP Billiton, KOC and ADCO.
As revealed in Oil IT Journal (Vol. 9 N° 2), Trade-Ranger, the buy-side e-marketplace for the petroleum industry, is to launch the Trade-Ranger Universal Environment (TRUE), a unified collaboration framework for all its applications. TRUE offers single-source access to existing Trade-Ranger applications and extends the current offering with Business Intelligence and Event Management modules.
CEO John Wilson said, "TRUE creates an electronic marketplace where our customers can enter easily, navigate quickly, do business, and thrive. TRUE is built on a solid foundation that allows collaboration between trading partners, and enhances smart data and process-level integration."
TRUE Business Intelligence offers buyers and suppliers summary reports of their organization's activity with trading partners, while Event Management tracks document status and transaction history. Members can access the new functionality on go2true.com. Trade-Ranger shareholders include ConocoPhillips, Shell, Statoil, Total, Unocal, Occidental and BP.
Impress Software (Hannover, Germany) and ESRI have teamed on a pre-packaged integration application (Geo I.App) which links ESRI's ArcGIS product line to SAP's mySAP product lifecycle management (PLM) solution. The development was driven by the demands of Impress customers, who include BP, ChevronTexaco, Total, ConocoPhillips and Halliburton Energy Services.
ESRI director Steve Benner said, "Using spatial information stored in ArcGIS in daily operations supported by mySAP PLM requires both synchronous and asynchronous integration of the two applications. In Impress, we found a partner with proven experience in developing standard integration solutions between SAP solutions and other third-party applications. Impress Geo I.App will enable clients to deploy integrated applications quicker, cheaper, and in a supported environment."
Impress president Omri Dotan added, "Geo I.App can be implemented in a few weeks, generating immediate payback and substantial ROI for customers."
PPDM has released a draft specification of the Lite version of its data model. PPDM Lite is also known as PPDM Spatial IV, or as PSDM, the petroleum spatial data model.
Full implementation of the PPDM can be a daunting task. Some companies require a smaller, spatially-enabled subset of the model that supports typical spatial queries.
Spatial IV uses PPDM and other existing geodatabases to create a simplified, denormalized data model focusing on spatial data and on the presentation of PPDM summary data to end users.
Support for various spatial formats is provided including Oracle, ESRI SDE and PostGIS. DDLs for Oracle 9.x and PostgreSQL 7.4 are provided. The draft spec for the Lite data model is available for comment on www.ppdm.org/development/projects/spatial/documents.html.
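The kind of "typical spatial query" a denormalized model serves can be sketched with Python's built-in sqlite3. This is an illustration only: the table and column names below are invented for the example and are not taken from the PPDM Lite draft specification.

```python
import sqlite3

# Hypothetical denormalized well-summary table of the kind PPDM Lite targets:
# one flat row per well, with surface coordinates ready for spatial queries.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE well_summary (
        uwi TEXT PRIMARY KEY,   -- unique well identifier
        well_name TEXT,
        longitude REAL,         -- surface location
        latitude REAL,
        total_depth REAL
    )
""")
conn.executemany(
    "INSERT INTO well_summary VALUES (?, ?, ?, ?, ?)",
    [
        ("100010101", "Discovery 1", -114.05, 51.05, 3200.0),
        ("100010102", "Discovery 2", -114.10, 51.07, 2950.0),
        ("100020201", "Outpost 1", -110.00, 54.00, 1800.0),
    ],
)

def wells_in_bbox(conn, min_lon, min_lat, max_lon, max_lat):
    """A typical spatial query against the flat model: wells in a bounding box."""
    cur = conn.execute(
        "SELECT uwi, well_name FROM well_summary "
        "WHERE longitude BETWEEN ? AND ? AND latitude BETWEEN ? AND ? "
        "ORDER BY uwi",
        (min_lon, max_lon, min_lat, max_lat),
    )
    return cur.fetchall()

print(wells_in_bbox(conn, -115.0, 50.0, -113.0, 52.0))
```

Because the model is denormalized, the query needs no joins, which is the point of a "Lite" subset for end-user mapping tools.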
WellDynamics has installed its first SmartWell completion in Saudi Arabia, on a trilateral well in Saudi Aramco's Shaybah field. Shaybah field development has evolved from single horizontal wells to long-reach multilateral completions that maximize contact with the reservoir. Intelligent completion technology provides inflow control from each lateral branch, resulting in more efficient clean-up, stimulation and production control. The Shaybah 119 completion employs hydraulically actuated control valves to allow selective, variable choking of each of the three laterals.
Intelligent completions are a key component of the much-touted e-field concept, a mix of hardware, modeling and optimization software which promises to revolutionize production systems.
Nansen Saleri, head of Reservoir Management at Saudi Aramco, said, "Advanced well completion is a technology focus area for us. SmartWell technology will contribute to our goals of improved reservoir management, maximum recovery and reduced well count."
WellDynamics regional manager Leo Koot said, "This installation demonstrates that our products and services are mature and ready to be installed on a routine basis." WellDynamics is a joint venture between Halliburton Energy Services and Shell Technology Ventures. Formed in 2001, the company has installed 115 SmartWell systems for 24 customers in 13 countries.
Later this year, Kappa Engineering is to release a software tool that addresses the massive data volumes generated by the e-field. Diamant is described as a software cross-over, a data management and reservoir surveillance hybrid. Diamant accesses and processes data from production data historians and permanent gauges.
Current downhole and surface acquisition creates vast amounts of raw data. Straightforward filtering can distort significant data signatures while recording all data will saturate the CPU and the memory of any application.
Diamant's filtering reduces data volumes by two orders of magnitude without missing significant events such as choke changes and shut-ins. Diamant uses wavelet filtering to process billions of permanent gauge data points, extracting only the useful information. Low frequency producing pressures are de-noised and filtered for production analysis and history matching, while high frequency events, such as build-ups, are detected and loaded. New data is loaded with individual gauge filter settings and appended to existing data. Users can return to any part of the data and locally re-populate sequences of interest. Diamant also loads and updates rate data from a production database via ODBC.
Data can be dragged and dropped (or passed via the clipboard) from Diamant to Kappa's Saphir for pressure transient analysis, or to Topaze for production analysis. Permanent gauge data contains useful information, allowing accidental or planned shut-ins to be used as transient tests. The Diamant browser positions files in a logical hierarchy of fields, tanks, well groups and wells, irrespective of the actual location of the files. Data management functionality further enables files to be gathered into a rational data structure.
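Kappa has not published Diamant's algorithms, but the principle of wavelet filtering a gauge record, shrinking sample-to-sample noise while preserving sharp events such as shut-ins, can be sketched with a single level of the Haar transform. This is a minimal illustration of the technique, not Diamant's actual method; the synthetic data and threshold are invented.

```python
import numpy as np

def haar_denoise(signal, threshold):
    """One level of Haar wavelet de-noising: split the signal into
    approximation and detail coefficients, zero out small details (noise),
    then reconstruct. Large details and level shifts survive intact."""
    x = np.asarray(signal, dtype=float)
    if len(x) % 2:  # pad to even length by repeating the last sample
        x = np.append(x, x[-1])
    a = (x[0::2] + x[1::2]) / np.sqrt(2.0)   # approximation coefficients
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)   # detail coefficients
    d[np.abs(d) < threshold] = 0.0           # suppress small (noisy) details
    even = (a + d) / np.sqrt(2.0)            # inverse transform
    odd = (a - d) / np.sqrt(2.0)
    out = np.empty(2 * len(a))
    out[0::2], out[1::2] = even, odd
    return out[: len(signal)]

# Synthetic permanent-gauge record: steady drawdown, noise, abrupt shut-in.
rng = np.random.default_rng(0)
t = np.arange(1000)
pressure = 5000.0 - 0.5 * t + rng.normal(0.0, 2.0, 1000)
pressure[600:] += 400.0  # shut-in: pressure builds back up

clean = haar_denoise(pressure, threshold=10.0)
# Noise is suppressed while the step change at sample 600 is preserved,
# so the shut-in remains detectable for use as a transient test.
```

A production filter would recurse over several wavelet levels and decimate the smooth coefficients, which is where the two-orders-of-magnitude volume reduction comes from.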
Total is expanding its processing capability with the acquisition of several new systems from Silicon Graphics (SGI). Total's E&P unit in Pau, France, has bought a second 256-processor Altix 3700 supercluster and a 40-processor Altix system. The new machines will be used by Total's geophysicists for seismic analysis and by reservoir engineers for fluid flow modeling. The Altix supercluster comprises 256 Intel Itanium 2 processors clocked at 1.3 GHz, with 2 terabytes of memory. The new systems are said to add some 1.5 teraflops of processing power to Total's installation and integrate with Total's existing SGI infrastructure, in particular an InfiniteStorage CXFS shared filesystem.
A 64-processor Altix 350 is for delivery to Total E&P USA, where Total's geophysical research group will develop and industrialize new seismic processing algorithms. A four-processor SGI Altix 350 system will be used in addition to the 40-processor Altix to run reservoir simulation applications.
Schlumberger Cambridge Research is also an enthusiastic SGI customer and has just purchased a 32-processor Altix 3000 system with 64 GB of memory. The system will be used to study geomechanics, physical chemistry, fluid mechanics and seismics.
Tulsa-based eLynx Technologies has announced its new Enterprise web-based SCADA solution. Enterprise gives customers remote management of production assets and associated data at the enterprise level. New features include administration of individual user logins and security profiles, as well as tree-based groupings of assets.
eLynx VP John Powers said, "Enterprise lets clients manage hundreds of users and thousands of wells, and share real-time data across the organization." User management means that current data can be delivered in context to different users, such as pumpers, field supervisors, engineers and accountants.
The Enterprise solution lets companies create hierarchical groups of device layouts with unlimited custom attributes, an interactive trending interface and tiered, custom alarm schemes.
Lead developer Geoff Simpson added, "Enterprise provides maximum flexibility for any size of organization. In addition to the robust application design, eLynx Enterprise is designed to scale from a single device to thousands of devices without the exponential cost increase required by traditional SCADA systems." eLynx now offers custom data integration with third-party production, accounting and engineering systems.
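The tree-based asset grouping and tiered alarm schemes described above can be sketched as follows. The classes, field names and pressure thresholds are hypothetical illustrations, not part of any eLynx API.

```python
from dataclasses import dataclass, field

@dataclass
class Well:
    name: str
    tubing_psi: float  # latest reading from the field device

@dataclass
class AssetGroup:
    """A node in the asset tree: a basin, field or well group."""
    name: str
    wells: list = field(default_factory=list)
    subgroups: list = field(default_factory=list)

    def alarms(self, low_psi, critical_psi):
        """Walk the tree and return (well, severity) for out-of-range wells,
        a tiered alarm scheme with warning and critical levels."""
        hits = []
        for w in self.wells:
            if w.tubing_psi < critical_psi:
                hits.append((w.name, "critical"))
            elif w.tubing_psi < low_psi:
                hits.append((w.name, "warning"))
        for g in self.subgroups:
            hits.extend(g.alarms(low_psi, critical_psi))
        return hits

# Hypothetical basin with two fields, grouped hierarchically.
basin = AssetGroup("Arkoma", subgroups=[
    AssetGroup("Field A", wells=[Well("A-1", 950.0), Well("A-2", 420.0)]),
    AssetGroup("Field B", wells=[Well("B-1", 180.0)]),
])
print(basin.alarms(low_psi=500.0, critical_psi=200.0))
# → [('A-2', 'warning'), ('B-1', 'critical')]
```

Grouping assets as a tree lets the same alarm query roll up from a single well to an entire basin, which is how an enterprise-level view scales past a flat well list.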
The Houston Exploration Company (HEC) is one of eLynx's 100 producer clients and an early adopter of Enterprise. Houston has just extended its contract with eLynx to include its Arkoma Basin assets. eLynx now monitors 850 producing wells for HEC, some 95% of the company's onshore production.
HEC ops manager Joanne Hresko said, "eLynx provides an economic, efficient system allowing us to operate more wells with fewer people." eLynx Technologies was formed in August 2000 by its parent company, American Central Gas Technologies.
AspenTech has announced availability of its integrated software solution for the oil and gas industry, the Aspen Oil and Gas Solution (AOGS). AOGS builds on two Aspen software components, the AssetBuilder asset modeling environment and HYSYS Upstream simulation technology. The combined solution enables companies to model, plan and optimize the production of an entire field, including multiple wells, pipeline networks and process facilities, using an integrated simulation model.
The solution also integrates industry standard third-party applications from Schlumberger Information Solutions (SIS) to support accurate and timely operational decisions. An integrated asset model is generated using Aspen AssetBuilder. Companies can improve field performance by optimizing the whole system from subsurface to producing facility.
SIS VP product development Olivier Le Peuch said, "AspenTech's simulation and optimization technologies complement our upstream workflow and modeling applications. The integrated asset model provides oil and gas companies with additional capabilities that support production optimization and field planning initiatives, and help identify opportunities to improve facilities utilization."
The multiple applications and data sources that support the integrated asset model within Aspen AssetBuilder are connected in a common environment based on Microsoft .NET. .NET's support of web services is said to help customers harness an expanded array of existing tools, while protecting their options to incorporate new features in the future. SIS also announced a major .NET development last year.
Marise Mikulis, Microsoft's oil industry manager, said, "By developing its solutions based on .NET, AspenTech expands the scope of integrated modeling to improve operations and increase value for oil and gas companies." AspenTech claims over 1,500 clients for its software including BP, ChevronTexaco, ExxonMobil, Shell, and Total.
At the SPAR conference in Houston this month, Aveva Solutions presented new laser scanning technology that allows retroactive capture of digital plant design data. 3D laser scanning hardware from Leica Geosystems is used to capture plant geometry as built. The point cloud data is then fed into the plant data management system. The laser scanning hardware can resolve point accuracies of ±6 mm over 50 meters, important for accurate 3D tie-in and clash detection. Aveva claims significant savings on rework costs with the new technology.
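The clash-detection idea, checking a planned tie-in route against the as-built point cloud, reduces to a nearest-neighbour distance test. The sketch below is illustrative only: the coordinates and tolerance margin are invented, and a real point cloud would use a spatial index rather than a brute-force distance matrix.

```python
import numpy as np

def min_clearance(route_pts, scan_pts):
    """Smallest distance from any planned route point to the as-built
    point cloud. Brute-force nearest neighbour over small arrays;
    millions of scan points would call for a k-d tree or octree."""
    d = np.linalg.norm(route_pts[:, None, :] - scan_pts[None, :, :], axis=2)
    return d.min()

# As-built scan points and a planned pipe route, in metres (made up).
scan = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [1.0, 1.0, 0.0]])
route = np.array([[0.5, 0.5, 0.2], [0.5, 0.5, 1.0]])

clearance = min_clearance(route, scan)
# A ±6 mm scan accuracy means the design clearance must exceed the scan
# error plus a safety margin (the 50 mm margin here is an assumption).
tolerance = 0.006 + 0.050
print(f"clearance {clearance:.3f} m, clash: {clearance < tolerance}")
```

This is why the quoted ±6 mm figure matters: the tighter the scan accuracy, the smaller the tolerance band, and the fewer false clashes flagged during 3D tie-in design.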
Halliburton and BP Norway recently completed what is claimed to be the first remote offshore cementing job. Operations were controlled from BPs operations center in Stavanger, 340 km from the Valhall field in the Norwegian North Sea.
Onboard equipment and software for remotely-controlled cementing operations was installed on the field late last year. The operator's workstation uses BP's DrillView software and a twin-screen display allowing control of cement mixing and pumping. Data and video from offshore cameras are fed to the shore over a twin optical fiber link.
Audun Bjordal, Halliburton Fluids Division country manager in Scandinavia, said, "We are now able to control and monitor many operations and processes from the beach." Prior cement jobs were trialed from a control room located on the platform, one deck above the cement equipment.
Paradoxically, moving operations from the harsh offshore environment is not always appreciated by the personnel involved. In the past, generous pay and conditions meant that offshore workers could spend their leave in Spain, or remote rural locations in Norway. Technology is turning their jobs into Stavanger-based, 9-5 routines.
Input/Output (I/O) has bolstered its full wave seismic offering with the acquisition of seismic processing house GX Technology Corp. (GXT) in a $150 million cash and paper deal.
I/O president Bob Peebler said, "GXT is a crown jewel in the world of seismic technology and is recognized as a leading imaging solutions provider. GXT will play a critical role in our strategy to lead the industry into the digital full-wave era. GXT will re-orient our company from equipment manufacturing to offering a full range of seismic imaging solutions. GXT complements our VectorSeis sensor technology, but full-wave imaging is about more than just the sensor; it's also about planning, field execution and advanced processing."
GXTs processing offerings include time processing, velocity modeling and pre-stack time and depth migration. GXT also provides value-added services including survey design, project management and quality control. GXT pioneered pre-stack depth migration techniques, originally applied in the Gulf of Mexico.
Mick Lambert, CEO of GXT, added, "Our companies share a vision of a high-tech seismic company focused on solving our clients' imaging problems. The combined company will be able to tackle some of the most challenging seismic opportunities in geophysics: full-wave, multi-component processing and 4D imaging."
The $150 million purchase price includes the assumption of $4.5 million in debt and the delivery of I/O stock options with a value of approximately $15.5 million. I/O plans to issue $100 million of common stock to finance the deal along with a new loan. In calendar 2003, GXT reported revenues of $49 million. For more on I/Os full wave plans read our interview with Peebler in OITJ Vol. 8 N° 12.
SAP reports continued success for its upstream industry-specific application, SAP for Oil and Gas. In one of the largest ever SAP go-lives in Asia, Oil and Natural Gas Corporation Limited (ONGC) has rolled out the software at over 100 locations across India.
SAP now serves over 5,200 ONGC users and provides a single companywide platform to integrate and optimize all business processes. Key components deployed by ONGC include Remote Logistics Management and Production Sharing Accounting.
ONGC executive director Amitava Kaviraj said, "We chose SAP to cover our end-to-end business process needs. Now, all information on company activities is available online, 24/7, to support tactical and strategic decision-making. Operations are tightly coupled, increasing productivity and reducing costs. For customers, this means better service; for ONGC, a fundamental competitive advantage."
Kjell Petter Gilje described Norwegian Statoil's experience as an early adopter of SAP's Joint Venture Accounting solution, saying, "We have integrated and streamlined cash call and joint interest billing processes, which contribute directly to easier handling, faster processing and collection of funds from our venture partners. Standard month-end joint venture processing enables Statoil to achieve faster month-end closing, leading to faster financial reporting at less cost."
SAP for Oil & Gas is a customization of the mySAP Business Suite and leverages SAP's NetWeaver web services infrastructure. SAP claims over 500 customers in the oil and gas industry worldwide, comprising more than 750,000 users.