The Norwegian Petroleum Directorate (NPD) has awarded the next five-year contract for the operation of Norway’s national data repository to Schlumberger Information Solutions (SIS). Halliburton unit Landmark Graphics will remain as provider and developer of the Petrobank data repository. Petrobank is currently operated by Landmark/IBM unit PetroData.
In January 2004, Petrobank operations will be transferred to Schlumberger which plans to leverage its own infrastructure solutions to ‘provide open access to the DISKOS repository, improved workflow by seamlessly integrating DISKOS data into company internal solutions.’
The DISKOS Group is composed of the NPD and 16 Norwegian operators. DISKOS, the world’s first national, multi-company data repository, houses 60 terabytes of seismic and well data.
SIS president Ihab Toma said, “Data is the nucleus of real-time exploration and production workflows. Due to the forward thinking of DISKOS, the common repository has already proven to be tremendously valuable by saving the industry millions of dollars in direct costs and enabling geophysicists and geologists to have instant access to high-quality data. The continued vitality of the DISKOS repository depends on leveraging new technology and a sustained commitment to research and development. SIS is pleased to have the opportunity to work with the NPD and the DISKOS Group to support the long-term success of DISKOS.”
Schlumberger plans to augment the existing Petrobank solution with its own tools to expand capabilities and types of data available. Web access and multi-repository data access are planned and the repository will be extended to include data in the field management domain where real-time data access enables real-time monitoring and control.
Schlumberger won the DISKOS contract in the face of stiff competition from the incumbent PetroData. The endgame—with Schlumberger operating Landmark software—means that the NPD has kept the competition alive. But it would be nice to be a fly on the wall when the new DISKOS kicks off!
Core Lab unit Scott-Pickford (SP) has been testing its compute-intensive 3D depth conversion software ‘Velit’ on VoxelVision’s GigaViz cluster engine. Nick Crabtree, manager of geophysical projects, said, “We compared GigaViz’ autotracker against GeoQuest’s Autopix over a huge Middle Eastern oilfield. Where the GeoQuest tool took 4 hours, GigaViz tracked the same horizon in under 30 seconds, with a better result and without having to interpret a grid of seed lines first.” SP’s Velit and Cubit packages are to be bundled with GigaViz along with a depth conversion ‘wizard’. SP’s Mark Beasley added, “GigaViz is the ideal platform for a true 3D depth conversion module. We use the cluster engine to parallelize cube generation—a process which scales with the number of CPUs in the cluster.”
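Cube generation of the kind Beasley describes parallelizes naturally, since each output column of the depth cube depends only on its own velocity function. A minimal sketch of the idea (hypothetical function names and a simple linear velocity law, not SP’s actual code):

```python
import math
from concurrent.futures import ThreadPoolExecutor

def time_to_depth(column, v0=1800.0, k=0.6):
    # A linear velocity law v(z) = v0 + k*z has the closed-form
    # conversion z = (v0/k) * (exp(k*t/2) - 1), where t is the
    # two-way time in seconds.
    return [(v0 / k) * (math.exp(k * t / 2.0) - 1.0) for t in column]

def depth_convert_cube(cube, workers=4):
    # Each column converts independently of its neighbors, so
    # throughput scales with the number of workers - CPUs here,
    # or nodes on a cluster engine.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(time_to_depth, cube))
```

With no inter-column dependencies the speed-up is near linear in worker count, which is the property Beasley’s “scales with the number of CPUs” claim rests on.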
In a separate deal, VoxelVision is to cooperate with Ødegaard’s Norwegian unit to offer seismic-derived rock property volumes over the internet. Ødegaard’s clients can perform visualization and interpretation at any location and any time using Voxel-Vision’s client/server technology. Ødegaard’s ISIS is said to ‘make the QC of inversion volumes a faster and more accurate task’.
Recent events in the seismic acquisition business reminded me of a trip I made to the US a few years ago. This was in the good old days (actually it was exactly at the end of the good old days, as you will see later) when oil company wheeler-dealers like myself traveled business class or better. One of the good things about business class is that it is actually possible to hear the soundtrack of the in-flight movie—making watching a film an option.
Unusually for Hollywood, this film dealt with some real business issues. I remember Danny DeVito introducing the notion of the ‘buggy whip’ company. This is a reference to a company which carries on doing ‘what it does best’, regardless of the fact that its fundamental business has been killed off by a changing market. Just as the makers of ‘the best buggy whips in the world’ carried bravely on while the Model T’s began rolling off Henry Ford’s production line.
I must say I remember feeling very uncomfortable hearing this. The purpose of my trip was to try to farm out some acreage our company had acquired. Could Danny be trying to tell me something? He got to the point in a later scene—“The worst mistake a company can make is to expand into a declining market.” This really sent a shiver down my spine. In the early 1990’s our company had expanded into the rapidly shrinking onshore E&P market and here was Danny telling me we were doomed. He was right of course which is how I came to be writing editorials instead of wheeling and dealing.
Having had a good chance to observe the upstream oil and gas business over the last few weeks at the SPE and SEG exhibitions, I have been wondering if parts of the industry aren’t following the path of the buggy whip. In other words, how much of the industry’s current woes is cyclical, and how much represents irreversible, structural change?
The signs that something is amiss are everywhere. WesternGeco has thrown in the towel on its North American seismic acquisition operations—with the loss of 1200 jobs. WesternGeco president Gary Jones cited the “high-risk, no-return state of affairs in the seismic industry.” Petroleum Geo-Services is said to be fighting off bankruptcy and posted a $1 billion loss for Q3 2002. These companies’ difficulties are partly self-inflicted. In the offshore environment at least, seismic acquisition is so productive these days that a boat can acquire in under a day what would have taken over a week a couple of decades ago. There has also been an order of magnitude reduction, in real, like-for-like terms, in processing costs over the same period.
There is no doubt too that some of the skills required to turn the old paper seismic sections into ‘black gold’ have gone the way of the buggy whip. Workstation technology means that today you just map the faults; ‘interpretation’ is no longer required. Similarly, powerful autotrackers, while not obviating the need to calibrate and correct from time to time, have boosted productivity enormously. Some have even argued that the whole interpretation process should be ‘commoditized’, in so far as the ‘correct’ interpretation is no longer a matter of, err, interpretation!
When you compound the order-of-magnitude improvements in accuracy, cost and productivity, you have to conclude that the industry has come a long way. One of the striking things about the tradeshows is the amount of science and high technology that is continually injected into the business despite the sub-par returns. Rock physics, amplitude versus offset computation, wellbore geomechanics, 4D seismics and real-time monitoring abound. But the increased application of high-tech and science to interpretation is a different trend from the commoditization of interpretation.
While the old ‘seat of the pants’ exploration is definitely a thing of the past, the widespread application of science and technology has yet to bear all of its fruits. This may be due to a kind of high-tech blight that has been cast over the industry. In the 1990’s, 3D seismics was so successful in structural reservoir delineation that it became unthinkable to drill a prospect without a 3D survey. Competition for 3D resources meant that a lot of second-grade acreage did not get shot and drilled. Likewise today, the promise of using technology to see into the reservoir before drilling—or to solve some of the trickier exploration problems—means that exploration management is getting even more picky.
I wonder if we aren’t at a kind of crossroads today between commoditization and customization. Heavy duty seismic acquisition and processing is predicated on economies of scale but at the same time, the increasing sophistication of the interpretation process mandates great flexibility and rigor as data moves through the interpretation process. We are not in decline, but ‘buggy whips’ abound in our industry’s processes. The chasm between the ‘commoditized’ processed seismic volume and the interpretation workstation is one. Another is a very old chestnut for Oil IT Journal readers—the inability of operators and vendors to properly manage data through the asset lifecycle. This last is likely to prove crucial in the years to come, as the new, analytical exploration successes are predicated on access to and integration of massive, cross-discipline data sets.
By the way, I didn’t tell you how my trip ended up. I sold the company’s assets for a song and wound up what was left. Which reminds me of the name of that movie—“Larry the Liquidator!”
Aberdeen-based Oilcats and OFS Portal have signed an agreement to develop product and services classification standards. OFS Portal, a group of upstream oil and gas suppliers, is working with the Petroleum Industry Data Exchange (PIDX), the e-commerce committee of the American Petroleum Institute (API), to develop a products and services classification system for use in the US upstream oil and gas industry. Oilcats is doing similar work in the UK in conjunction with the UK DTI’s Logic initiative.
Oilcats has already developed over 8,500 product templates used by the UK oil and gas industry. These will eventually cover ‘every type of component, product and services for oilfield work’—a market of £300 million. The new joint development assures ‘a global approach to standards development’.
Oilcats is to map its data templates to the PIDX standards. Oilcats MD Dave Andrews said, “As the only company offering this type of service in the UK, the potential opportunities for us here and in the US are huge, particularly in light of recent drives within the sector for standardization and cost efficiency.” Oilcats specializes in materials data control and inventory analysis and boasts an annual turnover of £1.25 million. Oilcats’ clients include AGIP, BP, Mobil, Shell and Talisman.
DTI e-standards chair Dave Rodger added, “This initiative will make electronic transactions between our industries much easier. By jointly working towards common standards, we can reduce costs and improve business operating efficiencies.”
CIS, the oilfield procurement division of UK-based Craig Group, has teamed with other suppliers on an on-line stock catalogue. The CIS Alliance will allow companies to check out stock, obtain prices and create an order on the web.
CIS MD David Allan said, “Companies have many consumables suppliers. By using one online procurement service, paperwork and administration can be reduced and cycle times shortened.”
Shell adviser on strategic sourcing, Chris Miller added, “Today within Shell, e-procurement accounts for 3-5% of an annual $22 billion spend on goods and services. Shell’s Businesses are already planning to raise this to 60% within the next two years.” The CIS Alliance catalogue was built by Oilcats. The system was developed for internet access by First eBusiness.
Trade-Ranger and OFS Portal have signed an agreement to provide a framework for members of both organizations to transact electronically with each other. The agreement provides an avenue for Trade-Ranger buyer members to access OFS Portal suppliers’ electronic catalogue content through the Trade-Ranger marketplace. In addition, Trade-Ranger, OFS Portal and their respective buyers and suppliers will continue to cooperate on developing industry-wide electronic data standards through Petroleum Industry Data Exchange (PIDX), the electronic commerce committee of the American Petroleum Institute (API).
Bertrand Deroubaix, e-procurement VP with TotalFinaElf and Trade Ranger president said, “This agreement brings together key buyer and supplier organizations in the upstream oil and gas sector and is an important step in advancing global e-procurement initiatives in our industry. OFS Portal members make up an important group of suppliers for Trade-Ranger’s buyer members. Many are global leaders and have demonstrated a strong commitment to e-procurement.”
OFS Portal CEO Bill Le Sage said, “Since our initial contact with Trade Ranger last year, we have had one mission, to make e-procurement a cost-effective and trusted process. Our agreement with Trade Ranger is another step toward realizing that goal. We have worked closely with some of the largest oil and gas members of Trade-Ranger and their support, along with the support of our members, has been critical to achieving this milestone.”
Jeff Helms, Trade Ranger CEO added, “The establishment of industry-wide content and transaction standards is critical to achieving the benefits of e-procurement. This agreement allows Trade-Ranger to progress in its standardization efforts, and will advance the e-procurement initiatives of our buyer members and members of OFS Portal. It’s an important step in making e-procurement in our industry a reality.” Trade-Ranger’s founding members are ConocoPhillips, The Dow Chemical Company, ENI, Mitsubishi Corporation, Motiva Enterprises, Repsol YPF, Royal Dutch/Shell, Statoil, TotalFinaElf, Unocal, Occidental Petroleum and BP. OFS Portal’s members are ABB, Ambar Lone Star Fluid Services, Baker Hughes, BJ Services, Cooper Cameron, ENSCO, FMC Technologies, Greene Tweed, Halliburton, Hydril, Kvaerner, M-I, Schlumberger, Smith, Grant Prideco, TETRA and Weatherford.
Spotfire has just announced a new extension of its DecisionSite analytical tool. DecisionSite for Reservoir Analysis (DSRA), developed jointly with ChevronTexaco, allows managers, engineers and geoscientists to pool and analyze large quantities of disparate technical and business data.
DSRA lets an asset team member ‘use data from outside his or her domain’. DSRA builds an ‘information library’ of production and other data from different application sources and provides pre-screening, analysis and post-processing of ‘disparate technical and economic reservoir data’. Tabular data is imported and stored in a multi-dimensional hypercube for display and analysis.
DSRA offers data visualization and reservoir analysis with ‘guided analytics’—canned views of data such as maps and decline curves. Text wizards suggest what you might want to do with a dataset. Production data can be merged with field ‘notes’—tubing leaks, rod/pump behavior etc. This can reveal relationships between in-field failures and changes in equipment or operating conditions. Hypotheses can then be tested on larger data sets.
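A decline curve of the sort a ‘guided analytic’ would can up is classically an Arps curve. A minimal sketch of the underlying formula (illustrative only, not Spotfire’s implementation):

```python
import math

def arps_rate(qi, di, b, t):
    # Arps decline: production rate at time t (years) from
    # initial rate qi, initial decline di (1/yr) and b-factor.
    # b = 0 gives exponential decline; 0 < b <= 1 is hyperbolic,
    # which declines more slowly at late time.
    if b == 0:
        return qi * math.exp(-di * t)
    return qi / (1.0 + b * di * t) ** (1.0 / b)
```

Fitting qi, di and b to a well’s monthly production, then flagging wells whose actual rates fall below the fitted curve, is one way the failure-versus-operating-conditions correlations mentioned above can be surfaced.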
Spotfire VP marketing David Butler said, “With hundreds of thousands of dollars riding on their decisions, the oil and gas industry is strongly motivated to speed cross-domain decision-making. But interpretation tools are not well suited for this objective. DSRA leverages existing investments in data, tools and best practices from individual experts.”
Spotfire and ChevronTexaco have co-staffed a project team to interview internal experts within ChevronTexaco to specify opportunities for analytic applications. The Spotfire team collected requirements and developed prototypes working directly with asset teams to fine-tune DSRA.
Spotfire believes that asset team decisions involve many business and technical factors. While discipline-specific tools exist to analyze each of these, they lack the ability to enable knowledge sharing and collaboration in the asset team. Because the application is based on the DecisionSite guided analytics platform, teams can perform constant ‘what-if’ portfolio analysis, capturing and deploying best practices for problem solving while making team-level decisions.
Øystein Byberg, writing in the Norwegian Hegnar Online (HO), offers this inside track on the Petrel acquisition by Schlumberger. The overall sale price according to HO was a whopping $140 million – a figure which was subsequently denied by Petrel, although no other numbers were forthcoming. Interestingly, Petrel revenues for 2001 according to HO were 142 million Kroner (about $20 million).
In a joint statement to clients, Petrel and Schlumberger revealed a roadmap for Petrel within SIS. Petrel is considered as ‘new core technology’ in the SIS product portfolio and will be supported as a separate product line. Petrel will be the ‘cornerstone’ of Schlumberger’s PC technology offering. All available resources will be directed at maintaining Petrel’s unique position as ‘a responsive supplier of exciting and intuitive technology and enhanced workflows.’
Petrel Workflow Tools will be an independent portfolio in SIS, headed by Jan Grimnes. Grimnes will have full autonomy on product direction, product development and pricing. Petrel is said to have reached a stage where further development will benefit from access to Schlumberger’s in-depth expertise in ‘bordering technologies’.
The Living Model
These will be ‘critical’ as Petrel responds to the future requirement for real time reservoir management, in the shape of SIS’ ‘Living Model’. The bordering technologies will include faster data handling and model updates to underpin decision making. Petrel will assist in SIS’ push to ‘eliminate the traditional barriers between geophysics, geology and reservoir engineering’ and to create the ‘ultimate asset team solution’.
Petroleum Place is to partner with Rigzone to consolidate the organizations’ web presence. Petroleum Place will be henceforth focusing on its ‘core business’ of asset disposal while its business directories and other listings will be consolidated into the Rigzone portal.
210,000 visits per month
Rigzone claims over 210,000 visits per month for its online oil and gas directory which provides links and information on thousands of oil and gas industry companies and support organizations from drillers and production companies to equipment manufacturers and engineering services.
Noble Corporation unit Maurer Technology Inc. (MTI) is offering a free electronic tally sheet for Pocket PCs. The Pipe Tally Sheet (PTS) lets engineers build an inventory of tubular products and downhole tools.
The PTS was funded by the U.S. Department of Energy’s National Energy Technology Laboratory. Maurer says that Pocket PC usage in the oil patch is increasing. Large and small service and operating companies are using Pocket PCs, equipped with easy-to-use software, to augment field worker productivity and efficiency.
The PTS complements Maurer’s Driller’s Toolkit Pocket PC package for drilling and completion engineers. Download the PTS and user manual from www.drillerstoolkit.com or www.npto.doe.gov.
As revealed in Oil IT Journal (Vol. 7 No 9) Paradigm Geophysical has ported several of its geoscience applications to Linux. Paradigm has now entered an agreement with Hewlett Packard to tune its Linux-based applications to the HP workstations and servers.
Paradigm intends to offer clients pre-installed, ready-to-use visualization and interpretation systems based on the emerging HP architecture. Paradigm claims its applications and data management software are well suited to Linux. Paradigm’s ‘SeisRoam’ process was designed to manipulate massive data sets from disk, making it an ideal match for the HP zx1 chipset and the Intel Itanium 2 microprocessor – according to Paradigm.
Ali Ferling, HP’s global general manager for the energy industry, said, “We are excited to see Paradigm’s innovative solutions available on HP’s leading-edge 64-bit Itanium 2-based systems. HP is committed to developing a complete solution portfolio for the oil and gas industry. ”
According to Neuralog, there are still significant volumes of paper and raster data which need to be converted to digital data if they are to be used in geological software. Neuralog is now offering users of Schlumberger’s GeoFrame reservoir characterization system a solution for analog data capture, conversion and loading.
NeuraMap provides on-screen digitizing for SIS clients, superseding the manual approach of GeoDig. Neuralog technology provides raster and digital preparation, interpretation, automated digitizing and scanning. Companies can capture, access and analyze paper and digital inventories for increased productivity. More from www.neuralog.com.
WesternGeco has just announced the closure of its land seismic operations in the US lower 48 states and Canada. The closure is due to ‘sustained unprofitable market conditions’ according to the company. Downsizing is also expected in other areas of the company, with about 1200 employees to be affected worldwide by the end of the year. Land operations will continue in Alaska and Mexico, as well as other ‘economically viable’ areas worldwide.
The move is part of the company’s restructuring to ‘decrease its emphasis on conventional seismic operations’. According to WesternGeco, such operations have been severely impacted by commodity pricing, excess risk, and difficult terms over the last 10 years. WesternGeco is simultaneously ‘accelerating its move toward the production side of the E&P business’.
WesternGeco President Gary Jones said, “This action is an inevitable result of the high-risk, no-return state of affairs in the seismic industry. We will focus on customers and geographical areas where the value we provide is recognized, and will concentrate our growth in providing advanced reservoir information technology. This move is another step in our long-term commitment to sound business practices, including shutting down losing operations and exiting markets where reasonable terms and conditions do not prevail. However, given fair compensation and acceptable terms, our services are available everywhere to everyone.”
The Landmark Real Time Asset Management Center (RTAMC) has to qualify as the ‘star of the show’. No expense was spared in putting together this multi-room theater showing real-time collaboration between an offshore rig, regional and head offices – and even a mobile worker on an international flight. Drillers access Landmark’s drilling software through an oversized electronic whiteboard/touch screen and communicate with the shore by videoconferencing. The finale of the RTAMC is a globetrotting manager accessing real time information from an intercontinental flight—overseeing the whole process through Accenture’s E&P Online.
SIS president Ihab Toma announced that Schlumberger and Intel are teaming to produce a high performance reservoir simulator running on the latest Intel chips – including the 64 bit Itanium 2. A ‘massive reduction of total cost of ownership’ is promised. The simulator is ‘at the heart of real time reservoir management’ and will ‘be the equivalent in the current decade of 3D seismics in the 1980s’. Today, only 45% of the world’s reservoirs are simulated. Amongst these, simulations are only re-run every 3 months on average. Today’s simulators are limited to 3-4 million cell models. In the future this will rise to 30-40 million.
Eugene Ennis (ex Landmark CEO), John Mouton (Landmark co-founder) and Zycor founder Jim Downing are developing code for a “space-time dynamic model of the reservoir”. Currently a service offering, the software derives rock physics properties from production (not test) data. The first month of production data is analyzed for ‘perturbations’. The software will later be productized as “Resolve”.
Amtec’s TecPlot Reservoir Simulator (RS) post processor for VIP/Eclipse also displays seismic data, with animated and interactive slicing of the model. TecPlot RS is a joint development with ChevronTexaco and will be released early next year.
IRAP RMS V7
The IRAP reservoir modeling system (RMS) offers consistent fault handling, structural framework and facies modeling. These can be used to ‘bias’ petrophysical parameters in upscaling to the simulator model. Throughout, the tree-view workflow manager expands and contracts to show contextually relevant information. A movie can be viewed moving down and around the model to check consistency.
TACIT’s Knowledge Mail (TKM) extracts knowledge from emails, installing as an add-in to Microsoft Outlook. Users ask TKM a question; TKM checks its expert database and decides who should receive the mail. A click on the name brings up the expert’s CV. Tacit’s proprietary search technology performs noun and noun-stem searching – claimed to be more powerful than Google. Tacit builds its expert database automatically by reading emails and documents on disk. Tacit builds on existing taxonomies, augmented with its own analysis. TKM’s first customer is development partner Texaco.
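The expert-matching idea can be caricatured in a few lines: stem the terms of the question, stem the terms harvested from each user’s mail, and rank experts by overlap. A toy sketch (crude suffix stripping, nothing like Tacit’s proprietary algorithm):

```python
def stem(word):
    # Crude suffix stripping as a stand-in for a real stemmer,
    # so 'drilling' and 'drill', 'vibration' and 'vibrations'
    # all map to the same key.
    for suffix in ("ization", "ations", "ation", "ings", "ing", "ies", "es", "s"):
        if word.endswith(suffix) and len(word) > len(suffix) + 2:
            return word[: -len(suffix)]
    return word

def score_experts(question, profiles):
    # profiles: {expert_name: text harvested from mail/documents}.
    # Score each expert by the number of stemmed terms shared
    # with the question, highest first.
    q_terms = {stem(w) for w in question.lower().split()}
    scores = {}
    for name, text in profiles.items():
        p_terms = {stem(w) for w in text.lower().split()}
        scores[name] = len(q_terms & p_terms)
    return sorted(scores, key=scores.get, reverse=True)
```

The point of stemming over plain keyword match is that a question about ‘drilling vibration’ still reaches the colleague whose mail discusses ‘drill vibrations’.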
Core Laboratories is leveraging web hosting with its new CoreWeb front end to its database of seismic rock properties. CoreWeb is a reservoir information browser which lets users gather and use data in an HTML environment. CoreWeb offers thin client access to the large volumes of geological and petrophysical data associated with consortia projects. CoreWeb was developed around Rapid (acquired along with Reservoir Inc. – formerly Core Petrophysics). Rapid (Reservoirs Applied Petrophysical Integrated Data system) is used by around thirty major, independent and national oil and gas companies. CoreWeb is being used in CoreLab’s Deepwater GOM study, currently underwritten by 35 companies.
Aclaro Software’s PetroLook V3.0 reporting and data analysis tool has been developed on Microsoft’s .NET architecture. PetroLook is used to roll up and reconcile reserves and produce EIA and RIGS reports. PetroLook can be used with other vendors’ tools such as ARIES, PEEP and Excel. The security model is new in V3.0. Aclaro’s Christoph Faig told Oil IT Journal, “Microsoft’s .NET architecture is fantastic—it’s highly scalable and integrates well with non-Microsoft systems. Also, you only need a single Windows 2000 .NET server.”
One of the biggest changes in drilling over the last decade or so is the amount of science that goes into wellbore design. Economics, deepwater drilling and environmental pressures argue for improved well construction and design traceability. Advantek’s borehole mechanics software performs quality assurance of cuttings and produced water re-injection. The tool also computes fracturing, erosion and wellhead corrosion, and assures life cycle management of the injection process including CO2 and H2S disposal.
Advantek is currently working with the DeepStar consortium in the Gulf of Mexico on the major problem of production related fault reactivation. Varco unit CTES’s Cerberus software models forces acting on a completed well – computing buckling and tensile stresses. The software targets service companies working with jointed pipe and coiled tubing. Landmark’s PressGraph computes a 3D ‘pressure cube’ from seismic data. A WITSML compliant OpenWire link brings drilling data into PressGraph for real time analysis. Knowledge Systems’ ‘Safe Seal’ is a new tool for analyzing the efficiency of a reservoir’s top seal. Safe Seal is currently a service offering involving estimation of minimum stress and compartment fluid pressure.
PetroWeb has been contracted by Penn EnergyData to build and manage the ex-PI/Dwights dataset that the old PennPoint acquired. PetroWeb uses MapGuide technology (an ESRI port is available). PetroWeb can be used to dump data into an application, or to view over a thin client browser. The price is ‘a fraction of the cost of IHS Energy.’ Penn EnergyData has a staff of around 15, plus 5-10 scouts maintaining the database. The data model is described as ‘optimized—not PPDM’. XML is used to deliver data to clients.
PetrisWindsNow hosted ASP software now includes 55 products—from G&G, through project management and into the mid and downstream. These are supplied by 15 vendors including Computer Modeling Group, EPS, Inside Energy, Invensys and TH Hill. Petris CEO Jim Pritchett says, “ASP is coming of age in verticals where upfront cost and lead times are barriers to the adoption of niche software.” After nine months in business, WindsNow has 15 clients.
A new version of Halliburton-Landmark’s InSite Anywhere Drilling ASP was rolled out at the SPE. InSite began life as Sperry-Sun’s Intelligence Central—now incorporated into myHalliburton.com. InSite offers multiple, context sensitive interfaces—showing drilling gauge data from the drill floor in near real-time. The limits are satellite/microwave link latency. V 6.5 includes enhanced plots with API standard grid types and TechSmith’s SnagIt for cut and paste.
iReservoir’s iProject is a software tool for sharing data with clients and joint venture partners. An intuitive interface offers a tree view of data. New data can trigger an email report along with annotations. iProject also offers free text search, QC tools for common data types and simple viewers. An FTP-like function supports file transfer with resume, encryption and access control. Current uses are data sharing and annotation – true workflow and security will come next.
Shell’s Rodney Calvert asked in a keynote paper, “Can you do without 4D seismics today?” 4D data is most useful early in the life of a field – where it can influence critical well placement. The key to good 4D is to keep acquisition parameters identical between subsequent surveys—if necessary repeating acquisition ‘mistakes’. 4D take-up has been slow in part because of cultural barriers. Calvert suggests we need ‘genetically engineered merged geophysicist/petroleum engineers’!
This article is abstracted from a 26 page illustrated report on the 2002 SPE ACTE produced as part of The Data Room’s Technology Watch service. For more information, email firstname.lastname@example.org.
The big buzz at the SPE is the use of low cost wireless technology to provide SCADA-like monitoring of remote facilities. While the technology is currently ‘read-only’ most vendors plan to offer feedback and control ‘real soon now.’
Backed by Shell venture capital, vMonitor’s low cost battery operated data acquisition units collect real-time data in the field. This is transmitted over spread spectrum wireless—through concentrators and into the office. Real time data is viewed in a web browser with ‘smart alarms’ and links to legacy systems including Honeywell, Foxboro and Oracle. A 50-70% cost reduction on traditional SCADA is claimed. Clients include Saudi Aramco, Shell and ChevronTexaco.
Luna’s solar powered sensors measure beam pump activity, rpm, gas rate and tank levels. Data is beamed up through a Qualcomm satellite modem operating at 9600 baud. With judicious sampling and compression, meaningful daily data can be squeezed into a 96 byte package at a cost of 10 cents per transmission. A single well can be equipped for around $2,000 – cf. ‘$10,000 for SCADA’.
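The 96-byte budget is easy to picture: 24 hourly readings of four channels, quantized to one byte each, is exactly 24 × 4 = 96 bytes. A sketch of such a packing scheme (a hypothetical layout; Luna’s actual wire format is not described in the announcement):

```python
def pack_daily(samples, ranges):
    # samples: 24 tuples of 4 floats (e.g. pump activity, rpm,
    # gas rate, tank level); ranges: per-channel (lo, hi) bounds
    # used to quantize each reading to one unsigned byte.
    # 24 hours x 4 channels x 1 byte = a 96-byte payload.
    payload = bytearray()
    for hour in samples:
        for value, (lo, hi) in zip(hour, ranges):
            q = round(255.0 * (value - lo) / (hi - lo))
            payload.append(max(0, min(255, q)))  # clamp to a byte
    return bytes(payload)
```

One byte per reading gives roughly 0.4% of full-scale resolution, plenty for daily tank-level and rate trending, which is how ‘meaningful daily data’ fits a dime-a-day satellite link.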
Axonn’s ‘Axess’ off-the-shelf wireless package collects and transmits data from up to four sensors over one mile. Sensors are available for pressure, flow, temperature, strain etc.
Genetek was showing tightly integrated tools for AVO analysis built into its EarthWorks interpretation package. Synthetic seismogram generation, calibration and AVO form an integral part of seismic interpretation. The synthetic is generated where it should be – at the well location and within the seismic interpretation package. GeoModeling’s VisualVoxAT seismic interpretation system began life as a Statoil R&D project in 1996. VisualVoxAT performs dip cube, dip azimuth, waveform difference, spectral decomposition and other volume based processing as well as neural network-based classification. GeoPlus Corp.’s PetraSeis V2 2D/3D seismic interpretation package is to be released ‘within a month’. The new product incorporates a database and claims ‘accurate’ faults, multiple interpretations and full 2D/3D interpretation. PetraSeis can also pick on raster images.
Linux is ‘for real’
Landmark’s John Sherman believes, “Linux is very much for real – providing a 200% to 1100% performance advantage over Unix.” One 60-box Unix installation was advantageously replaced with 80 Linux machines – the cost went down from $12 million to $300K. Magic Earth’s demo used to take 45 minutes on Unix; it now runs in 15 minutes on Linux. Landmark’s EarthCube development is now handled by Mike Zeitlin in Magic Earth. Ødegaard was pushing its new consultancy workflow which connects time lapse seismic to reservoir modeling. Differences between 4D seismic measurements and reservoir model predictions are used to update the model. Paradigm introduced its new ‘trace to target’ workflows which facilitate and enforce cross-discipline working. Paradigm products are linked together by a common file format—and are interoperable with third party frameworks such as GeoFrame and OpenWorks.
A new Well Asset Manager spiders data from different sources and ‘pumps’ it into GeoLog. The University of Colorado’s BP Center for Visualization is releasing its Interactive Drilling Planner (IDP), originally developed by Arco’s R&D department, which will be commercialized through Earth Decision Sciences (ex-T-Surf) at end 2002. IDP displays ‘cones of uncertainty’ around well bore trajectories and is used in well path collision avoidance. IDP will be marketed as a GoCad plug-in.
Schlumberger’s DecisionPoint (DP) offers a customized workspace and ‘control at a glance’ through alerts, notifications and prioritized workflow. DP can show daily production key performance indicators and has also been used to provide seismic processing clients with workflow status reporting. In a marketing paradigm shift, Schlumberger no longer plans to ‘push’ applications; the intent is to leverage client investment in proprietary or competitor products and data stores. These will be woven into the customized ‘InfoStreams’ announced at the SIS Madrid Forum earlier this year. New InfoStream components include the Federator middleware, the ISurf browser and the ResultsDB, a new vendor-neutral repository for G&G information. Finder and AssetDB will remain as InfoStream components ‘when needed’. GeoWeb will be phased out and replaced with ISurf. The ResultsDB will be commercially available mid-2003. CGG has ‘nearly abandoned’ its data management business, although it continues with PetroVision in Russia.
SeisDB is also on the way out and will be replaced by WesternGeco’s Expeditor along with the Panther/SDMS catalog. Oracle was back at the SEG, touting its e-business tools Product Development Exchange (OPDX) and Oracle Sourcing. OPDX offers revision control and workflow, RFQs, purchase orders, and parts, people, document and change management. Hays IMS’ new E-Search Asset Management (ESAM) was derived from Open RSO and leverages Oracle’s interMedia. ESAM is a Java GUI (for power users) or native web client providing searching of configurable database fields. ESAM V1.1 offers a published API for hooking into enterprise IT and is amenable to GIS enablement. Kelman has invested $12 million in its seismic data management solution; by end 2002, Kelman will have around 50TB on its system. The Kelman Archive uses XML/SOAP to talk to PetroWeb’s mapping front end for web data delivery. A shopping cart/checkout process verifies that there is room for the data on the recipient’s machine – in an oil company or at a seismic contractor: B2B in action!
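Kelman’s capacity check is a nice touch. As a minimal sketch of the idea (not Kelman’s actual implementation, which runs over XML/SOAP between the archive and the recipient’s site), a delivery pre-flight might compare the dataset size against free space at the destination, with a safety headroom:

```python
import shutil

def can_deliver(dataset_bytes, dest_path="/", headroom=0.10):
    """Pre-flight check before data delivery: does the destination
    file system have room for the dataset plus a safety headroom?"""
    free = shutil.disk_usage(dest_path).free
    return free >= dataset_bytes * (1.0 + headroom)
```

The same check works whether the recipient is an oil company data room or a seismic contractor’s processing center.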
ModViz, a spin-off from Siemens, provides PC cluster-based 3D visualization leveraging TGS’ Open Inventor graphics. ReachIn’s haptic GeoEditor interface was developed for ChevronTexaco and Norsk Hydro and is now available as a plug-in for Earth Decision Sciences’ Gocad. SGI’s VizServer offers high-end interactive graphics over great distances; the screen, mouse and keyboard can be a couple of kilometers from the processor. SGI is also engaged in the open source movement and has offered its XFS journaling file system to the community. Notwithstanding this, SGI will ‘never’ abandon IRIX. SGI also partnered with Teraburst to demo high-quality visualization over private or public networks. Teraburst offers synchronized video, audio and control data on a Sonet optical carrier which has been tested over a 30,000-mile link! Paradigm is extending VoxelGeo’s capabilities to allow for visionarium-based well planning.
Multiwave Geophysical Company has developed towed streamer, seabed 4C cable and electromagnetic services. This kit has been deployed in Shell’s ‘game changer’ 4D seismic test on the North Sea Skua field. Engineering Seismology Group’s FracMap monitors fracture growth during fraccing operations by shutting in an offset well and deploying a VSP array. Data is processed in real time to give engineers an image of what is happening: the time difference between P and S wave arrivals gives range, while the horizontal components give azimuth. ESG offers a neat laptop-based 4D time-lapse visualization package along with the service.
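The range/azimuth logic is straightforward to sketch. The following illustration uses assumed P and S velocities and is not ESG’s actual processing, which must also handle noise, velocity models and 3-component rotation:

```python
import math

def event_range(dt_sp, vp=4500.0, vs=2600.0):
    """Source-receiver range (m) from the S-minus-P arrival time
    difference dt_sp (s), solving r/vs - r/vp = dt_sp."""
    return dt_sp * vp * vs / (vp - vs)

def event_azimuth(north, east):
    """Back-azimuth (degrees clockwise from north) from the
    horizontal components of P-wave particle motion."""
    return math.degrees(math.atan2(east, north)) % 360.0
```

Range and azimuth together place the microseismic event relative to the monitoring well; repeating this for many events maps the growing fracture.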
APPRO, Einux, IBM and RackSaver were showing PC clusters in various inventive geometries and specifications. Clusters are one real geophysical growth sector (see last month’s editorial). Also in good health is the network/storage space, with various OEM configurations on show from DataDirect Networks, Zambeel, RaidZone, PARS/EMC, Hybinette and LSI Logic. LSI was showing its new TrAM ‘luggable’ disk array, ‘2TB in a flight case’ for serious bandwidth users. NAS/SAN convergence is a hot topic – especially with the SGI SAN Server 1000, which incorporates a Brocade 2Gb/s Fibre Channel switch.
This article is abstracted from a 24-page illustrated report on the 2002 SEG Convention produced as part of The Data Room’s Technology Watch service. For more information, email firstname.lastname@example.org.
Brian Russell (Hampson-Russell) commented on trends in seismic survey size, multi-component analysis, pre-stack depth migration (PSDM), time-lapse seismics and visualization – all of which are extremely compute intensive. As hardware costs tumble, Russell sees the laptop PC as the future computer of choice, running Windows or Linux (for power users). One problem is finding top-notch programmers; it’s hard to interest youngsters in geophysics. Russell favors a ‘new focus’ on object-oriented programming and a move away from ‘Nintendo geology’ toward the application of more diagnostic science.
Schlumberger Fellow Craig Beasley wants to go beyond more data and faster machines – and into ‘better resolution’. Today’s seismic acquisition can involve around ½ TB/day. On the hardware front, Beasley notes that in terms of compute power per dollar ‘we are beating Moore’s law’. In 1972 an offshore survey cost $6,000 to process, the equivalent of $12,000 at today’s prices; the same survey could be processed for $1,000 today. The future will see ubiquitous high performance computing through technologies like grid computing – commoditized CPU cycles. The future will also bring ‘bullet proof’ computing (not just ‘fault tolerant’), pervasive 3D visualization, wireless computing and continuous field monitoring.
IBM researcher David Klepacki described IBM’s ambitious ‘Blue Gene’ Linux supercomputer. Blue Gene M is a switch-less, massively parallel 80,000 dual PowerPC node machine. IBM also has plans for a one million processor Blue Gene P deliverable by 2009.
Ulrich Neumann is director of the Integrated Media Systems Center (IMSC) at the University of Southern California – a $10 million/year facility for studies in streaming media, perception and cognitive modeling, computing and virtual reality. For Neumann, ‘we are infovores’ and require rich visual stimulation to ‘get our endorphins going’. IMSC studies the design of systems that harness engineering and science information to exploit the brain’s processing power. IMSC’s ‘Remote Media Immersion’ fuses internet browsing with an immersive theatre experience, using a 45 Mbit/s stream of HD video and 12 channels of audio. This is used by doctors for remote operations – or to facilitate negotiations ‘as if you were in the same room.’ In the future, ‘widespread tele-presence will replace email’.
Don Paul (ChevronTexaco) believes that the future will see even more integrated oil companies and more ‘reality’ in reservoir modeling. Other objectives are to augment the human ‘computer’ and to manage IT and business together as an ‘ecology.’ Paul observes ‘exponential’ desktop network traffic at ChevronTexaco (CT) – around 55GB/month currently. CT’s global IT spend splits as follows – 34% upstream, 53% downstream and 13% corporate and finance. Growth is seen in finance, security, communications, supply chain, video and visualization, real-time and ‘simulation of all kinds’. IT is ‘no longer just number crunching’.
Petrosys is to release a major new version of its high-end upstream mapping package next month. Petrosys V12 will offer support for the direct display of generic spatial data from Arc Shapefiles and Microstation DGN files. A new spreadsheet user interface has been added to manage map display layers.
Petrosys is also revising its marketing strategy in the light of emerging usage patterns of upstream data. Most upstream operators are moving away from the central data repository to managing data in application frameworks such as GeoFrame or OpenWorks. The role of Petrosys’ dbMap is to be downplayed as the company adopts a renewed focus on its mapping engine.
Upstream mapping is usually associated with ESRI’s mapping tools. Oil IT Journal asked Petrosys’ Tom Robinson how the company views the stiff competition from ESRI. Robinson agrees that, “ESRI is entrenched but it doesn’t have gridding and contouring—and doesn’t treat a seismic line properly—shotpoints are not fire hydrants! Petrosys software can co-exist with ESRI environments. We have used the ESRI dev kit to allow for ShapeFile generation.” One new target market for Petrosys is the large brigade of oil company ArcView developers.
The new release offers color and pattern selection based on database attributes, such as those associated with an ArcSDE layer. Similar functionality is extended to the many E&P groups with large amounts of information in Microstation DGN files.
Petrosys now supports direct connection of its mapping, querying and reporting facilities to databases from the Calgary data vendor International Petrodata Ltd., which are widely used in the Canadian petroleum industry.
Schlumberger Information Solutions (SIS) has licensed Venezuela-based PetroSoft CA’s FloMatic software. FloMatic is now part of the SIS production planning chain. FloMatic is described as a common economic evaluation platform for the analysis of field development projects.
FloMatic helps the planning engineer estimate oil and gas production profiles, physical characteristics of the products for sale and resulting cash flow for different economic scenarios. FloMatic provides a standard workflow within an organization, enabling like for like comparisons for project ranking and selection.
Jaime Barriga, SIS drilling and production portfolio manager, said, “FloMatic has proven to be a valuable planning tool for hundreds of users. It is the final link in the SIS production planning chain and completes the workflow from field development tools that define production forecasts to economic evaluation models.”
“FloMatic will enhance the Merak Value and Risk software suite by providing asset information required for comprehensive economic evaluations that lead to better long-term planning and more effective utilization of resources.”
Former PetroSoft president Robert Barletta has joined Schlumberger as FloMatic project manager. Barletta added, “As part of SIS, with its dedication to research and technology, and its worldwide support infrastructure, FloMatic will undoubtedly become the standard tool for asset development and production planning.”
Midland Valley Exploration (MVE) unveiled its latest modeling project, NMove, at its US user meeting, held at Colorado’s Stanley Hotel – said to be the inspiration for Stephen King’s ‘The Shining’. NMove is the working title for MVE’s rebuilt and extended software suite.
NMove is set to merge the capabilities of MVE’s existing software portfolio into a single component-based application. NMove will be scalable from ‘laptops to supercomputers’ and will leverage recent developments in multi-threading and advanced visualization. NMove will let users work with different document ‘views’ of the same data set, including 2D, 3D and volume views.
MVE director Alan Gibbs said, “This is the next big thing in geoscience software. NMove removes the artificial barriers between 2D and 3D models and the volume data in the earth model. All this provides users with the ability to move transparently through geological time. Seeing is believing, and NMove will shortly be a reality for both the PC and the visualization centre.”
Meanwhile, MVE’s new release of its 3D Move flagship is proving useful to geoscientists working in overthrust belts and in areas of complex salt tectonics. Here, 3D Move ‘wraps’ 3D point clouds with a single continuous surface, which can then be used for image ray tracing and depth conversion in what MVE claims is a paradigm shift in model building.
MVE software is now all available on Linux—allowing users to perform high-end modeling on laptop computers.
Canada’s Computer Modelling Group (CMG) has claimed a world record for its Imex simulator: the new parallel Imex successfully completed a 112 million cell black oil model on an 8-CPU IBM p690 with 256GB of RAM.
The three-component waterflood model contained 200 wells and 90 patterns. A PowerPC-based IBM eServer running IBM’s AIX 5.1 OS was used for the simulation. A preliminary, single-CPU test with CMG’s Aimsol linear solver modeled the reservoir for one time step (0.5 days of simulated time) in 4.7 hours of CPU time.
Next, CMG’s Parasol linear solver was run on 8 CPUs for 68 simulator time steps (322 days of simulated time) in only 32 hours. CMG claims that the test demonstrates the feasibility of full-field, single and dual porosity/permeability modeling of primary and secondary recovery processes for giant and supergiant fields.
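A back-of-envelope comparison of the quoted timings (time steps are not all the same length, so this is indicative only):

```python
# Aimsol, single CPU: one 0.5-day time step in 4.7 hours.
single_cpu_h_per_step = 4.7
# Parasol, 8 CPUs: 68 time steps in 32 hours.
parallel_h_per_step = 32.0 / 68.0

speedup = single_cpu_h_per_step / parallel_h_per_step
# Roughly a 10x reduction in wall time per step on 8 CPUs - better
# than the 8x CPU count alone would suggest, reflecting the change
# of solver as much as the parallelism.
```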
CMG also believes that its new parallel processing capability will set new benchmarks for reservoir property upscaling and permit the validation of geocellular-scale streamline simulation results. Work is ongoing to port GEM (CMG’s EOS-compositional reservoir simulator) and STARS (CMG’s thermal/chemical reaction/geomechanics simulator) to the IBM platform for single and multiple processor operation.
SimSci-Esscor (SSE) has released a new version of its PipePhase simulator. PipePhase models steady-state multiphase flow in oil and gas networks and pipeline systems and is used to perform a range of functions – from sensitivity analysis of key parameters in a single well, to a multi-year facilities planning study for an entire field.
The new release, Version 8, includes ‘lots of work’ on the user interface and on enhanced data exchange with Excel. PipePhase can now model subsea and surface pipe networks – and ‘advanced architecture’ multi-lateral wells.
A new pipeline device lets users copy and paste pipeline profiles from spreadsheets and automatically generates results in an Access database format. Version 8.0 also enables users to generate Vertical Flow Performance (VFP) tables or import them from third-party applications.
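A VFP table relates tubing-head conditions to bottom-hole pressure over a grid of flow rates. As an illustration of how a consuming application typically uses such a table – a generic bilinear lookup, not PipePhase’s internals – consider:

```python
import bisect

def interp1(xs, ys, x):
    """Piecewise-linear interpolation; xs must be ascending."""
    i = max(1, min(len(xs) - 1, bisect.bisect_left(xs, x)))
    x0, x1 = xs[i - 1], xs[i]
    t = (x - x0) / (x1 - x0)
    return ys[i - 1] + t * (ys[i] - ys[i - 1])

def vfp_bhp(rates, thps, bhp_table, rate, thp):
    """Bilinear lookup of bottom-hole pressure (BHP) from a VFP
    table indexed by flow rate (rows) and tubing-head pressure."""
    col = [interp1(thps, row, thp) for row in bhp_table]
    return interp1(rates, col, rate)
```

Real VFP tables add further axes (water cut, gas-liquid ratio, artificial lift), but the lookup principle is the same.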
SSE general manager Ken Brown said “PipePhase is one of the global industry standards among petroleum producers. SSE has continually invested in PipePhase to maintain its core strengths and add ever-increasing convenience and power for the end user. Version 8.0 demonstrates the company’s ongoing commitment to delivering leading-edge tools for petroleum producers.”
A new, COM-compliant API layer enables other COM-compliant applications, such as Excel, to control PipePhase execution externally. A new initialization ‘wizard’ helps users set up new problems.
CGG’s worldwide network of seismic data processing centers has passed the 10,000-CPU mark, with a worldwide total capacity of over 15 Teraflops.
The increased computing power follows the recent installation of an additional 1536 CPUs in Houston, 1088 in London and 320 in Kuala Lumpur. CGG will use the new compute power to support its ‘A+’ anisotropic pre-stack time and depth migration services and its WaveVista shot-domain migration.
Guillaume Cambois, Executive VP for Data Processing and Reservoir Services, said, “This increased computer capacity is in line with our commitment to provide clients with the best seismic products and the shortest possible turnaround times. It also demonstrates our mastery of PC cluster technology.”
All CGG processing centers, whether open or dedicated, have their own computing power and can draw additional capacity from any of the three regional “hubs” in Houston, London and KL.
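In round numbers, CGG’s quoted figures imply an average throughput of about 1.5 GFlops per CPU:

```python
cpus = 10_000
total_teraflops = 15.0

gflops_per_cpu = total_teraflops * 1000.0 / cpus
# About 1.5 GFlops per CPU - consistent with the commodity
# PC cluster nodes of the day.
```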
Anadarko Canada Corporation, EnCana Corporation and Nexen Inc. are now receiving and approving invoices in a live production environment, using Digital Oilfield (DO)’s OpenInvoice Internet-based solution. OpenInvoice is used to automate invoicing and field ticket reconciliation in the companies’ western Canadian field operations.
EnCana CIO Hayward Walls said, “Digital Oilfield’s technology is an important part of our strategy. We have been able to see results within a matter of months of the deal and have already transacted with a broad range of suppliers, from very small to very large.”
Anadarko Canada IT director Robert Austin commented, “Implementing OpenInvoice has improved our business workflow processes by integrating directly into our accounts payable system. This seamless field-ticket to ERP system process has allowed us to capture spend data right down to the line item, as well as to significantly reduce professional staff time that is better utilized elsewhere.”
The latest release of OpenInvoice (Version 3.6) now includes company specific business rules, support of PIDX XML document standards, and enhanced Credit Invoice functionality.
DO VP of development Doug Spackman said, “The new release lets operators configure business rules to control the coding of invoice line items. Code verification ensures that only valid codes are imported into the various ERP systems. For example, operators can define how their own AFE coding rules, cost centers and GL coding will be implemented.”
The new XML invoice import function allows a digital invoice to be created as an XML document that can be loaded directly into OpenInvoice. This feature saves the suppliers from manually entering invoice data that has already been entered in their own accounting systems, while at the same time taking advantage of workflow process improvement. DO has worked closely with operators and suppliers to leverage the API PIDX document standards.
PIDX RP 3901
The new release supports the PIDX RP 3901 XML transaction standards, using PIDX formats for invoice document exchange and RNIF 2.0 as the transport, routing and packaging protocol. Also included in Version 3.6 is enhanced Credit Invoice functionality, providing improved workflows in the browser-based user interface. In addition, Credit Invoices can be transmitted electronically via the PIDX XML standard.
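For illustration only, a digital invoice document of this kind might be assembled as in the following Python ElementTree sketch. The element names here are invented for the example; the real PIDX RP 3901 schema is considerably more elaborate and should be consulted directly:

```python
import xml.etree.ElementTree as ET

# Build a minimal invoice document (illustrative element names,
# not the actual PIDX RP 3901 schema).
inv = ET.Element("Invoice", number="INV-1001")
ET.SubElement(inv, "Seller").text = "Acme Oilfield Services"
line = ET.SubElement(inv, "LineItem", afe="AFE-0042", costCenter="CC-17")
ET.SubElement(line, "Description").text = "Wireline run, well 7-12"
ET.SubElement(line, "Amount", currency="CAD").text = "2500.00"

xml_doc = ET.tostring(inv, encoding="unicode")
```

Such a document could then be checked against the operator’s coding rules (AFE, cost center, GL) before being loaded into the ERP system – the code-verification step Spackman describes.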
Denver-based Exprodat Technology Inc. has just released ‘Project Administrator’ (PA), a web-based application for monitoring, diagnosing and reporting on Unix project environments. PA monitors Landmark and other projects from a single, integrated web console. The application was originally developed for Anadarko and Marathon Oil.
Anadarko’s geophysical data supervisor Susie Foss said, “PA offers a powerful set of utilities that replace all of the piecemeal, command-line scripts we previously used to gather information about our Unix projects. PA warns you if a full disk presents a critical problem and as new data is loaded, PA helps identify inactive projects that can be archived to free up valuable disk space."
Exprodat’s Technical Architect Bruce Rodney added, “Project Administrator fills the gray area between IT, application support and data management, where problems can be difficult to diagnose. PA lets you analyze ‘logical’ projects, where data is spread across several physical file systems and databases. The bottom line is reduced downtime costs and happier and more productive users.”
Exprodat claims that PA delivers an easier-to-manage project environment in which problems are anticipated and resolved proactively. PA offers a simple, customizable web interface, providing intuitive access to a range of reporting tools. Automatic email notification and aggregated ‘dashboard’ displays of key system indicators provide early warning of potential problems.
Project Administrator is the third of Exprodat’s web-based developments. Project Browser, developed in 2000, introduced rapid browsing of online project data, irrespective of format or location. The Browser was acquired by Landmark and is now sold as Web OpenWorks. Project Archiver, also acquired by Landmark, is now marketed as the ‘Corporate Data Archiver’. More from www.exprodat.com.