I attended the SPE session on ‘accelerating technology take-up’ this month. Apparently, the industry suffers from slow technology take-up, according to Ali Daneshy of the University of Houston. Daneshy reminded us of Geoffrey Moore’s oeuvre ‘Crossing the Chasm.’ The ‘chasm’ is the temporal gap between early adopters and broad take-up of a new technology. Moore (reference to whose work I have heard many times, but have so far spared you) presents an intriguing graph—a bell curve, whose horizontal axis is time (with no scale) and whose vertical axis is ‘take-up’ (again, no scale). Some way in from the left-hand, leading edge of the curve is the ‘chasm,’ an arbitrary chunk excised from the smooth curve.
OK, let’s give Moore a spin. Unfortunately, the wheel’s invention is too far back in time for us to have great insight into its early adoption. One can speculate as to what early mule cart travelers thought of the technology as they were pitched into the occasional chasm! But since then, the wheel has been pretty good business. A glance from my office window confirms my intuition that ‘take-up’ continues apace. In fact, if I were an inventor of curves, I submit that exponential growth would be a better fit than a normal distribution. I’m not convinced that Moore has all the answers.
Houston Technology Center
Getting back to the SPE session, a study from McKinsey for the Houston Technology Center purported to show that the ‘chasm’ is reached earlier and lasts longer in oil and gas than in other industries. Slow adoption means that everyone ‘leaves money on the table,’ which is ‘bad business.’ Vik Rao (Halliburton) showed a graph of patent applications since 1990. At that time, oils and service companies were granted an equal number of patents. By the late 1990s, oil company patents declined as service industry patents rose spectacularly.
This shift in technology spend has caused a shift in expertise and increased ‘IP sensitivity’. Today we have ‘information asymmetry,’ a lack of transparency between buyer and seller which ‘destroys economic value’. Three suggestions have been put forward to rectify this situation: 1) that an organization (the ‘ministry of truth’) be established to validate the utility of an R&D project, 2) that an insurance company underwrite R&D downside and 3) that there be established an ‘industry uptake award.’
Tom Bates (Lime Rock Partners) reported that R&D spending by US majors is 30-40% down over the past decade in real terms and the industry fares poorly compared with other sectors—GE spends 18% of net income on R&D; Microsoft, 95%; and Exxon a meager 2.6%. The service companies ‘outsource’ R&D by acquiring new technologies. US government spend is ‘insignificant.’ Bates opined that global procurement programs are viewed by entrepreneurs as a significant obstacle to the introduction of new technologies.
Shell Technology Ventures
Bill Dirks (Tecton Energy) described how Shell felt it was not receiving proper reward for its new technologies in the marketplace. A lot was spent on ‘things that no one asked for’ and there was a lack of discipline to kill off non-viable projects. Dirks recommends sharing risks and rewards and that operations should provide assessment and funding. But it is hard to counter the perception that while the bulk of risk is for the service industry, the bulk of the reward accrues to oils. Dirks suggests the SPE could have a role in developing and publishing E&P technical standards, promoting joint industry projects and perhaps even by setting up a technology ‘clearing house’.
A different slant on the impact of technology on E&P is to be found in a new publication from the International Energy Agency*. The report notes that, although picking technology winners is fraught with uncertainty, the oil and gas industry is ‘very active in pushing the technology envelope’. The IEA report, authored by Schlumberger’s Christian Besson, describes the industry as ‘relatively risk averse’ and suggests that this might be why ‘changes take time.’ Besson writes that while the R&D departments of key industry players are already working on technologies that will bring major changes to the industry, picking the winners ‘offers plenty of scope for error’. In the past 25 years, 3D seismic has proved a real winner, while the considerable investment in chemical recovery and oil shale technology has yet to show a commercial return.
Having worked for twenty years in exploration, I would have described the industry as gung ho rather than risk averse. What can be less ‘averse’ than going sole risk on a wildcat? Perhaps it is because the industry is so used to risk that it is inured to the ‘sure fire’ new play, whether in R&D or a new basin.
One obstacle to technology take-up that was not fully investigated by either the SPE or the IEA report is the effect of patent law. This has not been designed to accelerate take-up for some greater good, but to ensure that the inventor of a new technology gets a return on investment. But as we have seen in these columns (OITJ Vol. 10 N° 3), patents can be more marketing than science. Even when they are science, they are not necessarily innovations. Without wanting to embarrass anyone, I am thinking of some of the ‘vanity’ patents that oil companies have been granted for technologies that are common knowledge, or in some cases, already in the textbooks. Maybe there is a role here for the SPE, doing the job that the Patent Office seems incapable of: judging the intrinsic merits of new patents and putting gentle pressure on abusers or pointing out obvious ‘prior art’ to naïve applicants.
* ‘Resources to Reserves, Oil and Gas Technologies for the Energy Markets of the Future.’ International Energy Agency/OECD, 2005.
At the 2005 Society of Petroleum Engineers Annual Conference and Exhibition, held in Dallas this month, one vendor described the ‘perfect storm’ that is brewing around the standards that will underpin the digital oilfield. At issue is who is to be in charge of automated and optimized oil and gas production. There are currently two overlapping domains.
In the left corner is the upstream, the traditional bailiwick of the production engineer—aided and abetted by the reservoir engineering community. These folks are traditionally involved in the front-end engineering design (FEED) of production systems.
Over in the right corner there is the process control industry, whose mainstream activity is based in the factory, but whose technology, particularly SCADA, is widely used to monitor and control oil and gas production.
The standards storm is brewing because the upstream brigade is moving from FEED to real-time simulation and production optimization. At the SPE, Shell’s Ron Cramer introduced a new project for a PRODML modeling language targeting real time production operations—see our report from the SPE ACTE on pages 6&7 of this issue.
But the process control industry is simultaneously working on XML-based process control standards, notably the Instrumentation, Systems and Automation Society’s ISA SP95 and its XML manifestation—the Business to Manufacturing Markup Language—B2MML.
Recognizing that one size does not fit all, the PRODML community plans to model up to the data historian and to leave the automation to the downstream ‘black magic’ of SP95. But for the digital oilfield to really take off, greater visibility will likely be required across the ‘frontier’ of the data historian.
Whether we are heading for a perfect storm is a moot point. The vendor we spoke to saw the colliding standards as a great opportunity. While ‘wrapping’ different ‘MLs’ into a proprietary system may be good business for a software vendor, it is perhaps not really what the standards movement is about.
In another SPE presentation, Russell Foreman presented BP’s web services real time data architecture project RTAP (more in our SPE report on pages 6&7). Foreman recognizes another potential threat to web services deployment in the form of ‘boxed’ web services from vendors such as SAP. Foreman recommends sticking with the W3C standards as the best route to compliant, interoperable systems. Sounds like good advice in the face of a potential standards ‘storm.’
UK-based upstream software house Tigress Limited has acquired GeoBrowse from Integrated Solutions Australasia (ISA). GeoBrowse combines a desktop GIS with generic database query technology. GeoBrowse runs on Windows XP/2000 and offers rapid comparison of data stored in disparate databases including Landmark, Schlumberger, SMT, GeoPlus and IHS Energy. The acquisition is part of Tigress’s strategic move into E&P data management—complementing its ‘signature’ Tigress database and interpretation suite.
Tigress’s MD David Sullivan said, ‘This acquisition enhances our position as the supplier of the world’s most integrated E&P data management system. With the Tigress database, the TIES data exchange and QC system, and our integrated interpretation suite, we have the best tools in the industry.’
Tigress will be marketing GeoBrowse in the US through Houston-based Seisquest Data Management. Seisquest consultants are certified in data management implementations such as IBM Tivoli, Veritas NetBackup and EMC Legato. A further relationship with Sigma Solutions extends support to ADIC and StorageTek tape libraries.
OITJ—We’ve been tracking real time simulation and optimization for quite a while and were impressed with the Integrated Asset Management presentation at the Schlumberger Las Vegas Forum earlier this year. But we have been struggling to locate real-world examples of simulation and optimization in the upstream.
Lochmann—We have already shown in several midstream and chemicals installations that the concept of a complete model of the process, combined with the possibility of running simulation and optimization offline, works and gives operators visibility of the whole process. But it is a hard sell to the upstream—which is why AspenTech created our group, whose job is to translate these tools from process to oil and gas production. These are to be delivered as a part of AspenTech’s Open Simulation Environment (OSE) underpinned by Schlumberger’s PipeSim and Eclipse and our HySys Upstream—bridging processes between the compositional fluids of the downstream and the black oil of the upstream. This involves some rather sophisticated math for lumping and delumping fluids and for production allocation from limited tank measurements.
OITJ—Why aren’t more companies adapting to these new work processes?
Lochmann—Aspen has an advantage in the process industry because chemical engineers already think in terms of the complete process. Look at a refinery—the first thought is the big picture. But in oil and gas we tend to treat each well as a separate process—there is less of a holistic approach. This is exacerbated by the fundamental difference between reservoir engineers and petroleum engineers. To the RE, ‘correct’ reservoir management means ‘go slow’. To the PE—‘suck it out as quick as you can!’ At the core of the process is a conflicting priority—optimization becomes impossible. You can’t neglect the effect on production of topside facility sizing. This is where the Integrated Asset Management package comes in—when folks can ‘see’ the whole asset. We need to demonstrate to the upstream world that dealing with an asset as a whole will make managing it the same as managing a plant or refinery.
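Lochmann’s point about topside sizing can be illustrated with a toy back-of-the-envelope calculation (all figures invented for illustration—this is not AspenTech’s model):

```python
# Toy model: three wells could collectively deliver more than the
# topside separator can handle, so per-well rates must be choked back
# proportionally. Optimizing each well in isolation would overshoot.
well_potentials = [12_000, 9_000, 7_500]  # bbl/d each well could deliver
facility_capacity = 20_000                # bbl/d separator limit (invented)

total_potential = sum(well_potentials)
scale = min(1.0, facility_capacity / total_potential)
allocated = [round(p * scale) for p in well_potentials]
print(allocated, sum(allocated))  # [8421, 6316, 5263] 20000
```

Even this crude pro-rata choke shows why the asset has to be evaluated as a whole: 8,500 bbl/d of well potential is stranded by the facility constraint.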
OITJ—Is this good for both the onshore and offshore?
Lochmann—Offshore process is the same as onshore concentrated into a smaller footprint. There is no difference in function. In fact the lack of a holistic approach is even more applicable to the offshore. Onshore, if you need a separator, you can just call up someone and get pretty well any separator you need. Offshore is not like this and has far more rigorous front end engineering design requirements.
OITJ—We’ve heard a lot about the merits of the digital oilfield—just how important is this technology?
Lochmann—In sum, we are offering an order of magnitude improvement in a company’s ability to generate cash—potentially improving P&L by 25%. These numbers are supported by the CERA digital oilfield study and by internal oil company studies. Unfortunately though, things are not moving in that direction, even though such an investment in an intelligent production system could cost less than a new well. One hurdle is culture—the entrenched view that each scenario is different. Twenty years ago everything was separate—now integration and IT are bringing it together. Drilling and reservoir engineering are getting closer. I was in one major oil company office where they had a real time feed to an LWD* job. This indicated a top hadn’t come in quite as expected. In the next few minutes, the geoscientist adjusted the fault pick and generated a new set of predicted tops—all over a cup of coffee. Incredible—all thanks to real time model update.
* logging while drilling.
Matt Simmons of Simmons & Co., addressing the Petroleum Club of Houston, said that ‘We Are In A Deep Hole’ and that our ‘wake-up call is here!’ The consensus used to be that demand would peak and supplies would grow as energy costs fell. This was a dream. In reality, demand has grown, costs have doubled and new supplies have gotten smaller.
By August 2005, spare capacity was disappearing across the board—from drilling rigs through production facilities and pipelines to refineries. Perhaps even more important was the shortage of qualified personnel across the industry. Onto this stretched market came hurricane Katrina, which removed what little spare capacity remained. For Simmons, Katrina is ‘our energy 9/11,’ and the full impact of Katrina (and Rita) is still emerging. Gulf coast facilities have been crushed, refineries are broken and much of the workforce is AWOL!
The lessons learned from the hurricanes? ‘Just-in-time inventory’ was a mistake, as the impact of a local problem spreads nationwide. Concentrating America’s energy engines into the epicenter of hurricane alley was another error. The problem is even bigger, with demand forecast to grow to 120 million barrels/day by 2020. This is to be set against the ‘fantasy’ of both current reserve estimates and Saudi Arabia’s supposedly new 200 billion barrels of proven reserves. According to Simmons, ‘peak’ production may now be ‘past tense.’
Simmons believes that the US should put itself on a ‘war footing’ regarding energy supply and use and recommends a ‘Marshall Plan’ for energy including the following emergency measures: reduce transportation energy by moving goods by boat and train, stop long commuting, reduce distribution costs by eating locally produced food and manufacturing closer to home. Fixing the US situation regarding natural gas supply may be even harder. In general, an R&D ‘explosion’ is required along the lines of Alfred Loomis’ WWII Tuxedo Park effort. Read Simmons’ paper in full on simmonsco-intl.com.
ER Mapper has applied peer-to-peer technology, as used by Napster and others to share music on the internet, to the distribution of massive geo data sets. The technology, ‘GeoTorrent,’ is used to share large imagery datasets which can easily be many gigabytes in size. GeoTorrent.org uses BitTorrent technology to alleviate the traditional bottlenecks and costs of internet data distribution.
ER Mapper’s Richard Orchard said, ‘At ER Mapper, we deal with multi gigabyte geospatial images. A good quality air photo of a state is going to be a large file! Previously, you could download data, but in such small chunks that they are unusable. People don’t want a square kilometer, they want a whole region, state or country. Because of the amount of data involved, the traditional means of delivery were too expensive. GeoTorrent, which uses BitTorrent’s brilliant and perfectly legitimate technology, allows people to download the datasets they want quickly. It also allows them to share their own datasets.’
An indication of the bandwidth needed for traditional file distribution can be gained from the following math: 1,000 downloads of each of 100 5GB files would amount to 500 terabytes of bandwidth! The cost of serving this amount of data would be astronomical, and download speeds would likely be dire, taking weeks or more to download a single file.
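The quoted figure checks out—a minimal sketch of the arithmetic:

```python
# Bandwidth for conventional client-server distribution of the catalog.
files = 100          # number of datasets on offer
downloads = 1_000    # downloads of each file
file_size_gb = 5     # size of each file, in gigabytes

total_gb = files * downloads * file_size_gb
total_tb = total_gb / 1_000
print(f"{total_tb:.0f} TB served")  # 500 TB
```

With BitTorrent, by contrast, much of that load shifts to the downloaders themselves, so the origin server's share shrinks as the swarm grows.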
The answer is BitTorrent technology. Users who have downloaded data offer their untapped upload bandwidth to send data to other peers. Each new participant adds to available bandwidth at no cost to the system. A website, GeoTorrent.org started out as Orchard’s research project. GeoTorrent.org provides access to several terabytes of geospatial information, including continental and worldwide Landsat mosaics and complete vector datasets for the United States. New datasets are continually being added.
PetroChina’s R&D unit RIPED is to team with the French Petroleum Institute (IFP) to develop innovative technology for the upstream oil market. The deal was signed while RIPED president, Jia Chengzao, visited with IFP President, Olivier Appert at the IFP’s headquarters in the Paris suburb of Rueil-Malmaison.
The first projects to be kicked-off in 2006 will involve reservoir characterization and improved recovery with gas and polymer injection. The agreement also involves exchange of personnel between the two research establishments.
Beijing-based RIPED, established in 1958, employs some 1900 scientists working on activities spanning the upstream including IT and technology standardization. RIPED provides PetroChina with technology, decision support, high-tech R&D and professional education.
Earth Decision is to partner with O’Meara Consulting (OMC) to embed OMC’s Geo2Flow saturation modeling toolkit into its flagship GoCad geomodeling package. Geo2flow integrates data and interpretations from the fields of geology, petrophysics, geophysics and reservoir engineering to provide a ‘best-in-class’ solution for reserve estimation and reservoir modeling.
OMC president Dan O’Meara said, ‘Reserves are the crown jewels of our industry. But judging by most geological modeling solutions, you’d think we were selling rocks rather than oil and gas. Geo2Flow, in combination with GoCad, models both the rocks and the oil and gas within them. Our patented technology is a comprehensive solution for calculating reserves, modeling saturations, identifying reservoir compartments and predicting permeability.’
Earth Decision president Jean-Claude Dulac added, ‘Combining Geo2Flow’s patented technology into the GoCAD workflow provides an interdisciplinary approach to building holistic and consistent reservoir models. The collaboration with Geo2Flow extends our vision of a shared earth model in which numerous best-in-class technologies combine, leveraging our open development platform.’
The developers of GoCad, the 3D geomodeling package, have undergone a few changes in corporate identity over the last few years. The first commercial spin-off from the GoCad consortium was T-Surf. T-Surf changed its name to Earth Decision Sciences back in 2002. But while this collapsed snappily down to ‘EDS,’ the IT behemoth of the same name was none too pleased by the acronym’s proximity. EDS has now been reborn as ‘Earth Decision.’ Please note this should not be acronymized, as the hard discounter ‘ED’ is a household name in France!
Calgary-based Intervera Data Solutions has just launched the commercial version of the DataMerge package – a module in its flagship DataVera data QC application. DataMerge lets users connect to any data source, convert data on the fly while improving data quality. DataMerge leverages Intervera’s library of automated data quality solutions to rectify data issues during conversion. The new solution augments other DataVera modules including Data HealthCheck, released last year (OITJ Vol. 9 N° 11).
Well life cycle data
DataMerge, which works on both well life cycle and operations data, lets users perform complex conversions without writing custom programs or ‘throw-away’ scripts. A GUI-based interface and a pre-packaged QC rules library lets data managers and IT specialists consolidate multiple data stores, applying corporate standards and duplication checks. By preventing bad data from being loaded, DataMerge helps companies keep a high level of confidence in data quality. The technology is particularly suited to mergers and acquisition-related activity and loading third party data.
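A sketch of the kind of duplication check such a rules library might perform before loading well headers (the field names and matching logic here are assumptions for illustration, not Intervera’s actual implementation):

```python
# Hypothetical duplicate-well check: normalize well names before
# comparison so that formatting variants don't slip past as 'new' wells.
def normalize(name):
    """Strip punctuation and whitespace, fold case."""
    return "".join(ch for ch in name.upper() if ch.isalnum())

def find_duplicates(wells):
    """Return (existing, incoming) pairs whose normalized names collide."""
    seen = {}
    dupes = []
    for w in wells:
        key = normalize(w["name"])
        if key in seen:
            dupes.append((seen[key], w))
        else:
            seen[key] = w
    return dupes

wells = [{"name": "A-12 ST1"}, {"name": "a12st1"}, {"name": "B-7"}]
print(find_duplicates(wells))  # one pair: 'A-12 ST1' vs 'a12st1'
```

Real QC rules would of course go further—checking coordinates, UWIs and units—but the principle of catching bad data before load is the same.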
Intervera president Paul Gregory said, ‘Industry now can benefit from our own data conversion experience – with the same tools that we have been using for years. With DataMerge, our customers can complete data transformations three times faster than traditional methods.’
The sixth meeting of the informal National Data Repositories (NDR) group was held in TNO’s Utrecht headquarters last month. The NDR meetings used to be restricted to government and oil company attendees, but since 2004, they have opened up to include vendors. A shift has also taken place in recent years as NDR now extends beyond oil and gas to more generic geo data repositories. NDR6 offered a good summary of the state of play of national data banks and the data situation in member countries.
NDRs differ so much that it is almost impossible to categorize them. Data release rules, culture, pressure from industry all combine to make for as many NDR and pseudo-NDR geometries as there are countries. More in the case of the UK which has 3 or 4 more or less coupled repositories (DEAL, CDA, National Geoscience Archive and the UK Oil Portal).
Norway continues to show the way in national data management. Not only because of its decade-long history of an online data bank, DISKOS, but also because of its aggressive policy of ‘refreshing’ its commercial partners. DISKOS was originally run by an IBM-led consortium, which developed the PetroBank data management system. This was later taken over by Halliburton unit Landmark Graphics. The situation changed when the DISKOS contract came up for its first renewal. Landmark’s PetroBank was retained, but operations were awarded to Schlumberger. Now the DISKOS software contract is again up for renewal and this time, a Norwegian newcomer Kadme is on the shortlist, along with the usual suspects, Schlumberger and Landmark.
The Netherlands deploys large scale complex, multi domain databases to manage the low-lying country that is susceptible to geo hazards like flooding. The NDR meeting included a field trip to Rotterdam’s Maeslantkering storm surge barrier. Coordinated use of this facility is a prime example of the use of a multi-disciplinary geological database. The cross discipline DINO internet database offers over 200 geoscience data types. DINO is based on ‘e-government’ principles and works on the principle that free data stimulates economic activity.
In South America, Schlumberger has built an own-brand NDR, the Colombian EPIS. This uses Schlumberger Information Solutions’ Decision Point portal to expose Colombian E&P data and licensing information to the public.
In North America, NDR activity is severely hampered by federal and state divisions of labor. Culture means that commercial data release predominates. In Canada, the ‘Calgary model,’ of all-commercial data release also holds sway. Both Canada and the USA are in the process of trying to unite disparate data stores across their countries. Everywhere, the legal situation of speculative seismic data requires sensitive treatment. Governments are wary of damaging an industry that has proved a powerful motor in promoting exploration. In the US, trying to make states, Federal government and other organizations cooperate is like ‘herding cats.’
NDRs can force an unusual degree of collaboration between vendors and service providers. In its DISKOS operations, Schlumberger’s SINAS unit operates Landmark’s PetroBank software. PetroBank uses Petris’ Recall (ex-Baker Hughes) package to manage well log data. The multi-vendor aspect goes even further. In the SINAS’ Ordering System (SOS) for public domain data, the GIS web application is Kadme’s Integrated GeoStore (IGS) application. The SINAS website itself was ‘customized’ by Kadme using the open source PostNuke, a PHP-based web portal development environment.
Ron Meiburg outlined Shell’s vision for ‘borderless’ E&P data. Global data management was an ‘essential component’ that was missing from Shell’s previous, devolved business unit based structure. The aim is for a ‘fully digitally-integrated borderless E&P business’. Shell’s vision is illustrated by a Norwegian operator supplying gas to a reseller in the UK with a customer in Italy—all tied together by data and information. This requires standards for data, metadata, documents, reports, exchange formats and taxonomies. Data and documents need to be digitally stored in real time and with suitable control of entitlements.
This article has been abstracted from a longer report produced as part of The Data Room’s Technology Watch reporting service. For more on this service, please visit the Technology Watch home page or email firstname.lastname@example.org.
The SPE is in excellent financial health with some $57 million in total assets and net income of almost $5 million for the year ending March 2005. The ACTE is commensurately grandiose with over 500 papers and, this year, around 8,000 attendees. Outgoing president Giovanni Paccaloni reported massive membership growth to a global total of 64,000. As a not-for-profit, the SPE ‘redeploys’ all of the $2.8 million revenue from meetings to its distinguished lecturer program (there are currently 33 distinguished lecturers) and the online resources of spe.org. Some 2% of the SPE’s reserves are set aside for ‘innovations’ such as a new magazine for young engineers—The Way Ahead.
Incoming president Eve Sprunt reflected on current public perception of the oil business as a ‘sunset industry that pollutes.’ A perception that is adversely affecting recruitment—and one that needs to be redressed. Sprunt suggests talking to your children, other family members and friends to correct such perceptions.
10 year plan
Under Sprunt, the SPE is to kick-off a ten year plan with, inter alia, the aim of producing a ‘single set of reserves best practices’, bringing together the UN, the SPE and the International Accounting Standards Board’s work. The SPE will create new programs to address gaps in its continuing education program and will build the spe.org website, which, Sprunt insisted, ‘must break even financially’ (spe.org caused a $1.1 million write-down in last year’s accounts). Other key programs are the e-library and technical publications—with plans to reach out to other orgs—and a new SPE women’s professional network.
Russell Foreman’s paper covered BP’s web services-based Real Time Data Architecture Project (RTAP). Web services are ‘untangling the spaghetti’ of a multiplicity of data sources and applications where ‘IT has become the bottleneck.’ BP launched RTAP in 2003 (OITJ Vol 8 N° 12). RTAP includes a US onshore project that wraps legacy data sources such as P2000, SAMS, WonderWare, Oracle to client tools such as Microsoft Office and Matrikon’s ProcessNet. The system has been rolled-out at 30 locations. Foreman commented that, ‘We have yet to find a legacy system that was so ugly that we couldn’t wrap it!’ Foreman warned that because everything is so new there are few truly standard definitions. Problems were reported with ‘boxed’ WS solutions from SAP and others.
Cynthia Rees explained how ExxonMobil plans to ‘significantly’ improve its subsurface work environment by the year 2010 by adopting a global functional organization and global standards. The project, Exxon’s ‘EM2010 Vision,’ comprises three components: the Technical Computing Systems for engineers, the ‘WellWork’ data and process management system (the subject of Rees’ talk) and the RETR interpretation methodology presented at the Calgary AAPG earlier this year (OITJ Vol. 10 N°7/8). WellWork is built on Peloton’s WellView application, which Exxon has been using since 1996. Originally just a well sketch tool, WellView has evolved into a complete well data system. WellWork is the most widely used application in Exxon, with 18,000 operated wells and 500 users.
Ron Cramer (Shell Services) presented a new initiative to establish a standard, ‘PRODML’ to underpin the ‘digital oilfield’ concept. PRODML has backing from BP, Shell, Chevron, ExxonMobil, Statoil, Weatherford, Halliburton, Schlumberger, PETEX, Invensys, Microsoft, OSI Soft, Sense Intellifield, Kappa, TietoEnator and POSC. Scope of the new standard is the process control domain but the group intends to ‘keep out of SCADA.’ XML will work upstream of the data historian and will initially focus on gas lift optimization. The PRODML group is ‘in phase’ with the process control SP95 standard - but the latter is more relevant in the context of the factory. Harmonization is also being sought with the existing ProdML Norwegian production reporting standard. Once developed, POSC will be the ‘custodian’ of PRODML – due for publication in August 2006.
James Crompton (Chevron) described the digital oilfield’s impact on the petroleum engineer’s workflow. The challenge of the digital oilfield is linking across the traditional ‘silos’ of drilling, completion, petroleum engineering, surface facilities etc. Fields are designed statically, but production is a dynamic process and after a few years, ‘debottlenecking’ is usually required. We are still reacting to ‘red lights’—we don’t see the yellow lights. ‘Digital Oil’ is a huge opportunity to leverage reservoir surveillance, well performance etc. Today it is easier to acquire more data by adding measurements (tags). One major West African development has 30k tags already—and a lot of analysis capacity. But the real problem is the information ‘pipeline’ between data collection and analysis—who owns it, what is possible, what is affordable? Evaluating new technology is also getting harder. Crompton suggests a metric of return on technology deployed (ROTD), and asking vendors to ‘prove that your technology is better than my spreadsheet.’
i-Field asset management
Trond Unneland (Chevron) stated that our ability to measure is much greater than our ability to use data. In general, data is ‘weakly exploited and poorly managed.’ Tools, especially spreadsheets, are unsuited to real time. In answer to a survey of i-field technology, Chevron’s assets teams responded, ‘do not give us more data!’ So Chevron is trying to make the oilfield more like a factory, as the refining business has been doing for a decade or so. Chevron is creating a business case for a multi-well instrumentation program designed to minimize lost or deferred production, to accelerate first oil, avoid contractual penalties and to increase recovery. Chevron has already kicked-off several autonomous i-field R&D initiatives including ‘RTPRO’ (with Schlumberger) for visualization and modeling of real time data. Other initiatives target the application of artificial intelligence to production optimization, an ‘Asset Decision Environment’ for control rooms (with SAIC) and the CiSoft training program with the University of Southern California. (OITJ Vol. 8 N° 9). Chevron’s de-centralized organization means that such initiatives ‘can’t be forced down assets’ throats’. The solution? Encourage local i-field projects like continuous compressor optimization on the East Texas Carthage, heat management and resources optimization on the San Ardo field in California and a holistic approach to production management on the North Sea Captain field. Here the aforementioned Asset Decision Environment is used to support voidage and water management. Chevron’s partners in the i-field include Schlumberger, SAIC, Norwegian EPSIS and Microsoft. Unneland didn’t say what Microsoft’s take was on ‘eliminating’ the spreadsheet from production engineering!
Eric van Oort described Shell’s real time operations center (RTOC) in New Orleans. The business driver for the RTOC is the fact that operators waste a lot of money on non-productive time in drilling. IT is now so good that you have to capitalize on the possibility for remote control and optimization of operations that the RTOC offers. The RTOC is a Shell/Halliburton joint venture with 20 staff and 24x7 operations. RTOC staff is made up of software (application) specialists and Shell’s drilling engineers. The RTOC has managed up to fifteen concurrent rig operations including global ‘big cat’ wells. Global reach is achieved with regional hubs and satellites. The RTOC has helped Shell cull best practices from around the world and to react to emergencies, such as an abnormal pressure situation, with a cross-discipline team. To avoid the ‘big brother syndrome,’ whereby the RTOC is perceived as an ‘expensive intrusion’ by local operators, Shell rotates field staff through the RTOC. That is, they did until Katrina. Today the facility has been abandoned pending relocation to Houston.
Drill pipe data rates
Li Gao (Halliburton) presented a succinct analysis of data rates that could be achieved using acoustic transmission up the drill pipe. Current mud pulse technology can attain 2 to 10 bits per second (bps) but operators want to send more data up the drill pipe. By using state-of-the-art Discrete Multitone Demodulation, transmission rates of several hundred bps are achievable.
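The idea behind multitone transmission can be sketched with a toy example: each bit is assigned a carrier tone, the tones are summed onto the channel, and the receiver recovers the bits from the spectrum. All parameters below (tone frequencies, sample rate, threshold) are invented for illustration and bear no relation to Halliburton’s actual scheme.

```python
# Toy multitone (frequency-division) telemetry sketch. Hypothetical
# parameters only -- not the acoustic drill-pipe implementation.
import numpy as np

FS = 1000                     # samples per second (invented)
DURATION = 1.0                # one symbol period, seconds
TONES = [50, 100, 150, 200]   # one carrier tone per bit (invented)

def modulate(bits):
    """Sum a sine wave onto the channel for each '1' bit."""
    t = np.arange(0, DURATION, 1 / FS)
    signal = np.zeros_like(t)
    for bit, freq in zip(bits, TONES):
        if bit:
            signal += np.sin(2 * np.pi * freq * t)
    return signal

def demodulate(signal):
    """Recover bits by thresholding FFT magnitude at each carrier."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), 1 / FS)
    bits = []
    for freq in TONES:
        idx = int(np.argmin(np.abs(freqs - freq)))
        bits.append(1 if spectrum[idx] > len(signal) / 4 else 0)
    return bits

bits = [1, 0, 1, 1]
noise = 0.1 * np.random.default_rng(0).normal(size=int(FS * DURATION))
assert demodulate(modulate(bits) + noise) == bits  # survives mild noise
```

Four parallel tones at one symbol per second already carry 4 bps; real discrete multitone systems pack many more closely spaced carriers, which is how rates of several hundred bps become feasible on a channel where a single mud pulse carrier manages only 2 to 10 bps.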
SPE for managers
Ahsan Rahi from Schlumberger’s Integrated Project Management (IPM) unit proposes a scoring model for project complexity and risk. Projects may be complex but not risky and vice versa. Understanding such project metrics can help manage and mitigate project risk and also be used to assign project managers from the ‘HR pipeline’. Risk can also be transferred during project negotiations. A ‘bubble map’ of profitability in risk/complexity space proved ‘very insightful’ and has helped Schlumberger and its clients to mitigate these factors.
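A scoring model of the kind Rahi describes might look like the following sketch: each project is rated on separate complexity and risk factors, and the resulting scores position it in a risk/complexity ‘bubble map’ with bubble size proportional to project value. The factor names, weights and ratings here are invented for illustration; they are not Schlumberger’s IPM model.

```python
# Hypothetical project risk/complexity scoring sketch. Factor names
# and weights are invented, not Schlumberger IPM's actual model.

COMPLEXITY_WEIGHTS = {"well_count": 0.4, "partners": 0.3, "new_technology": 0.3}
RISK_WEIGHTS = {"geological": 0.5, "political": 0.3, "contractual": 0.2}

def score(ratings, weights):
    """Weighted average of 1-5 factor ratings, yielding a 1-5 score."""
    return sum(weights[f] * ratings[f] for f in weights)

def bubble(project):
    """Return (complexity, risk, value) for plotting on a bubble map."""
    return (score(project["complexity"], COMPLEXITY_WEIGHTS),
            score(project["risk"], RISK_WEIGHTS),
            project["value_musd"])

project = {
    "complexity": {"well_count": 4, "partners": 2, "new_technology": 5},
    "risk": {"geological": 2, "political": 1, "contractual": 3},
    "value_musd": 120,
}
c, r, v = bubble(project)
# A project can score high on complexity yet low on risk, as noted above.
assert c > 3 and r < 2.5
```

Scoring the two dimensions independently is what makes the ‘complex but not risky’ distinction visible: a many-well project with familiar geology lands in a different quadrant of the map than a simple but politically exposed one.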
This article has been abstracted from a longer, illustrated report from the SPE ACTE, produced as part of The Data Room’s Technology Watch reporting service. For more on this subscription-based service please visit oilit.com or email email@example.com.
At first glance, DecisionSpace/Nexus is Landmark’s new simulator, a replacement for VIP. Indeed, the marketing material focuses on Nexus’ new ‘PowerGrid’ algorithm and a reported ‘2 to 20’ times speedup over ‘industry simulators,’ including, by the way, VIP. But sit in on a Nexus demo and you see more than traditional fluid flow modeling. Developed with input from BP, which participated in the Nexus project, the package reflects a ‘bigger picture’ approach to asset design with simultaneous modeling of reservoir, well bore, gathering networks and surface facilities.
Along with PowerGrid, other hard-core reservoir engineering tools have been packaged into Nexus. These include a sophisticated layer lumping schema and a decision support environment for upscaling geological models prior to flow simulation. Users can define drilling targets and use dog leg severity and other constraints to figure out which targets are attainable. But the fun starts when you roll in a few gathering centers and the pipe network for true, holistic asset evaluation.
John Killough, Landmark’s Reservoir Simulation Research Fellow said, ‘Piloting of Nexus is under way on several diverse and complex assets around the world. We have realized enormous time savings and process efficiencies due to the robust solver and seamless data transfer abilities.’ A ‘direct access’ modeling environment leverages the VIP Data Studio, eliminating the need for data transfers and data reformatting.
Wellogix has raised $8 million in equity from a group of investors led by San Antonio-based First Capital Group. The cash will be used to fund global expansion and software development. First Capital’s Jeffrey Blanchard now has a seat on the Wellogix board. The company also announced the appointment of Thomas Harper as vice chairman and director.
Juan Szabó, Hussein Shehab and Mohamad Idris Mansor have been appointed to the advisory board of Stone Bond Technologies. Mansor was previously with Petronas, Szabó with PDVSA and Shehab with Kuwait Oil Co.
OpenSpirit Corp. has appointed Carlos Eiffel Arbex Belem, principal of IES Brazil Consultoria, as agent for Brazil.
MRO Software has appointed Terry Ray as Director, Industry Marketing for Oil & Gas. Ray was previously VP of Gartner’s Energy & Utilities Industry Advisory Service.
Fugro is to acquire Elcome Surveys Pvt. (India) and has signed a joint venture with Oceansatpeg in Brazil. Elcome has 69 employees; the Oceansatpeg JV, of which Fugro holds 62%, has 250. Arnold Steenbakker has joined Fugro’s Executive Committee with responsibility for the Onshore Geotechnical business.
The US Energy Department has named Carl Bauer as director of the National Energy Technology Laboratory (NETL).
Liu Xiao Bin has been named VP of Aveva’s new Guangzhou, China-based Process and Power Division.
An Enskilda Securities study found that E&P spending is to rise 15-25% in 2005 and by another 15% in 2006. Worldwide exploration spend as a percentage of E&P fell from around 25% in the late 1990s to a record low of 15.5% in 2004. It is now coming back and is forecast to exceed $5 billion (17.5%) in 2006.
Geosoft has added Shuttle Radar Topography Mission (SRTM) data sets to its Geosoft DAP public data server. Elevation data is available at 30 meter resolution for the US and at 90 meter resolution for the rest of the world. The near-global SRTM data was acquired by the Space Shuttle Endeavour in an 11-day mission in February 2000.
Divestco has acquired Canamera Corp. and Canamera Equities—collectively known as Drilling Records. The US portion of Drilling Records will be combined with Divestco’s previous acquisition, Petro Data Source. Divestco also acquired Focus Holdings and Investments, a Calgary-based management and technical consulting services provider. Divestco has secured $10 million in new long-term debt financing from Scotiabank unit Roynat Capital. The cash will be used to fund potential asset and share acquisitions and for general corporate purposes.
The Gas Technology Institute (GTI) and the Network of Excellence in Training (NExT) are to partner on a training program for the natural gas industry. NExT is a joint venture between Schlumberger and the universities of Heriot-Watt, Texas A&M and Oklahoma.
In a delayed 10K filing, Silicon Graphics’ (SGI) auditors’ opinion on recurring operational losses, negative cash flows and a stockholders’ deficit raised ‘substantial doubts’ about the company’s ability to continue as a going concern. SGI has retained the services of turnaround firm AlixPartners LLC to advise on ‘further expense reductions, liquidity improvement and other restructuring actions for fiscal 2006’. The company has also secured a $100 million asset-backed facility from Wells Fargo Foothill, Inc. and Ableco Finance LLC.
Energy Solutions International has named Roderick Hayslett CFO. Hayslett was previously with Florida Gas Transmission.
UK-based Peak Group has secured 11 million in financing from the Royal Bank of Scotland. The funds will be used to further develop the company’s turnkey multi-well, multi-client programs.
TGS-NOPEC is to acquire Aceca Ltd. for $10 million in cash and $2.5 million in TGS paper. London-based Aceca markets multi-client interpretation studies in northern Europe—distributed with Aceca’s web-enabled Facies Map Browser interface. TGS-NOPEC unit A2D has signed a five year agreement with Chesapeake Energy for unlimited access to A2D’s smartRASTER well log database. Chesapeake also will partner with A2D to expand its LAS log library.
C&C Reservoirs has launched a Structural Analogs Module co-developed with the French Petroleum Institute, IFP.
The New Zealand Ministry of Economic Development (MED) has awarded Landmark a contract to provide the technology for its geotechnical database and public web portal. Landmark will provide a single data storage solution for all of the Crown Minerals Group’s digital exploration and production data using its PetroBank Master Data Store (MDS) data analysis and management system. Secure internal and public data access will be provided through Landmark’s PowerExplorer and Team Workspace portal.
Crown Minerals Group manager Adam Feeley said, ‘This technology will help us provide access to all publicly available data on New Zealand’s petroleum, coal, and minerals exploration assets through a web portal. Open file data and public information on permits, including ownership, gazettals and block offers, will be available on the web, allowing New Zealand to provide data on block gazettals free of charge and to open up more of the country’s frontier exploration acreage.’
Team Workspace will be the public interface to Crown Minerals’ data stored in the PetroBank MDS. PowerExplorer provides web-based browsing and management of spatial and tabular data.
Landmark president Peter Bernard said, ‘PetroBank increases productivity and reduces project cycle times by 5% or better. This will greatly enhance MED’s effort to promote and encourage exploration in New Zealand.’
Norwegian Kadme AS and the Netherlands Organization for Applied Scientific Research, TNO, are to offer joint Information Management solutions for the upstream oil industry. The two organizations will work together to combine the E&P modules of TNO’s DINO software with Kadme’s web-based components into an integrated multi-client solution for E&P data repositories.
DINO is the Netherlands’ geotechnical database for all geo data types and embeds some 60 workflow processes and data streams to assess usage patterns and costs, allowing for process improvement and best practices deployment.
Kadme’s software components K-map and K-doc are built on open source based web services architecture using tools such as the OpenGIS-based MapServer from the University of Minnesota, web services for remote portlets, JBoss and mySQL.
Gianluca Monachese, Kadme’s MD said, ‘By combining DINO with our components we can deliver state-of-the-art, multi-client E&P data repositories. Our scalable, open solutions integrate with any E&P data source, from a single well to a National Data Repository. Our solutions scale from a single laptop to a Linux cluster running Oracle RAC 10g’.
Kadme is to market the solutions and will co-ordinate deployment, with technical support provided by TNO. The standards-based solutions offer internet-ready data repositories. End users will be able to securely access entitled data in the repository independently of the client application. An expert GIS user might access the data through ArcGIS, while casual users might prefer using Google Earth or a web browser. Corporate portals and client applications will be able to retrieve repository data via web services.
Landmark is to market Epoch Well Services’ unit Bengal Development’s AFE management system and rig scheduler. The applications will be packaged into Landmark’s Engineers Data Model (EDM) Infrastructure.
David Field, director of Landmark’s Drilling and Completions business said, ‘To deliver on the vision of a comprehensive well operations data and process management infrastructure, we are continually seeking out companies with critical applications that meet our clients’ needs. With this agreement, we hope to build on the success of Landmark’s Engineers Data Model and broaden the scope of the technical solutions we bring to the market.’ Bengal’s software targets CAPEX workflows, tracking, control and planning.
Epoch Well Services president Chris Papouras added, ‘Landmark and Bengal Development have complementary solutions that add value to the E&P industry. Together we offer solutions that leverage the power of Landmark’s integration platform and Bengal’s business application skills. This relationship will bring stronger technology synergies as our clients’ needs grow.’
Roxar has announced a new release of its Tempest reservoir simulation package. Tempest 6.2 now includes a novel technique for simulating fluid flow in naturally fractured reservoirs. The new single grid, dual porosity technique halves run times compared to conventional dual porosity models. A new ‘tensor permeability’ option is now used to describe complex heterogeneous reservoir systems, especially when dips do not conform to grid coordinates.
Tempest now runs on 64-bit Linux and 64-bit Windows as well as Unix. The state-of-the-art simulator engine is coded in C++ with a Java-based user interface. Tempest models a range of physical processes within a single application including black oil, compositional, dual porosity, steam, coal bed methane and polymer injection. Tempest can also act as a companion package to Roxar’s flagship Irap RMS modeling suite, reducing workflow complexity.
Roxar CEO, Sandy Esslemont said, ‘As industry focuses on reservoir management and production optimization, the need for a modern, fast, easy-to-use simulator that scales to the users’ needs, has become a critical success factor. Our proven track-record with high-profile customers such as Lukoil and our commitment to support means that Roxar’s clients will achieve greater understanding of their reservoirs and maximize reservoir performance for years to come.’ Reservoir engineers can also take advantage of Irap’s new engineering-focused capabilities such as fault-seal analysis and fracture modeling techniques, bridging the gap between geological and reservoir engineering workflows.
Houston-based Quorum Business Solutions has launched its ‘next generation’ upstream operations and accounting package, the Quorum Upstream Suite (QUS). QUS is built on Microsoft’s .NET and includes modules for land and lease management, division order, volume management, core financials and lease operating statements.
QUS can be deployed as an integrated suite or module by module. Quorum claims fifteen clients for various Upstream Suite modules. The modules’ design is based on requirements and best practices culled from large and small Quorum clients. Pervasive workflow automation throughout the suite enables oil and gas companies to manage growth while keeping operational and administrative costs down. All modules provide the security, configurability and process controls necessary for Sarbanes-Oxley compliance.
Quorum Business Solutions president Paul Weidman said, ‘Consistent with our philosophy of providing operational and staff efficiencies, QUS has been designed to let growing E&P companies achieve enterprise-wide data, transaction, and reporting integration. We will continue to provide first-rate customer support as well as flexible licensing options to help our clients to manage the total cost of ownership of their IT infrastructure.’ Quorum clients include Shell, TGO, Occidental, ConocoPhillips, Chevron and Anadarko.
A survey undertaken at the Kalido User Group last month found that almost two-thirds of Kalido customers see tackling master data management* as a top priority for their organizations even though the IT industry as a whole has neglected the issue.
MDM is needed to deploy standard business definitions and improve data governance and plays a central role in enabling accurate business intelligence and reporting. An emerging trend is for MDM ownership to move from the IT department to business executives.
Over the last decade Shell’s Lubricants division has moved from country-based units through geomarkets and now, to a global model. Mergers and demergers further complicated Shell’s product topology. Meanwhile, enterprise customers such as GM were ‘getting centralized’ and began demanding standardization across global procurement, something Shell was ill-equipped to provide.
Shell realized that the move to e-business would expose many of the inadequacies of its decentralized back office systems to customers. The solution was content standardization through rigorous MDM. Shell developed a centralized global catalog of a rationalized product portfolio—five times smaller than the initial product count.
Standard rules for product and attribute names were implemented and mapping between local and global nomenclatures meant that ERP logic would not be disturbed.
Shell has developed its own web-based master data management application, leveraging Kalido’s Adaptive Data Store technology. A relational structure for master data allows product teams to execute queries such as ‘find all products with the same attributes,’ a query that would have been impossible in the past due to the multitude of disparate systems across which product master data was stored.
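The power of a relational master data store for this kind of query can be sketched in a few lines of SQL. The table layout, product names and attributes below are invented for illustration; this is not Shell’s or Kalido’s actual schema.

```python
# Minimal sketch of an attribute query against a relational master data
# store. Schema and product names are hypothetical illustrations only.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE product (id INTEGER PRIMARY KEY, name TEXT)")
db.execute("CREATE TABLE attribute (product_id INTEGER, name TEXT, value TEXT)")
db.executemany("INSERT INTO product VALUES (?, ?)",
               [(1, "Lube A 5W-40"), (2, "Lube B 5W-40"), (3, "Lube C 15W-40")])
db.executemany("INSERT INTO attribute VALUES (?, ?, ?)",
               [(1, "viscosity", "5W-40"), (2, "viscosity", "5W-40"),
                (3, "viscosity", "15W-40"), (1, "base", "synthetic"),
                (2, "base", "synthetic"), (3, "base", "mineral")])

# One query finds every product sharing an attribute value -- trivial
# with a single master table, near-impossible when product data is
# scattered across dozens of local ERP systems.
matches = db.execute(
    "SELECT p.name FROM product p JOIN attribute a ON a.product_id = p.id "
    "WHERE a.name = 'viscosity' AND a.value = '5W-40' ORDER BY p.id").fetchall()
assert [m[0] for m in matches] == ["Lube A 5W-40", "Lube B 5W-40"]
```

The same join, pointed at a centralized catalog, is what lets a global customer ask for ‘the equivalent product in every country’ and get a single consistent answer.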
*Master data management concerns data that is shared across the enterprise. Not to be confused with metadata, header information or ‘data about data’.
A new study* from Wood Mackenzie, of 28 leading oil companies, found that many oil companies no longer replace reserves through their discoveries. Exploration investment has declined in response to growing technical risks and uncertain oil prices.
Author Andrew Latham, upstream VP with WoodMac, said, ‘Not only is exploration more expensive now, but it has become more difficult to achieve success, as the more accessible fields have been discovered.’ Inflationary pressures, increased competition for exploration areas and capacity constraints are continuing to drive up finding costs for new reserves. In this climate, only the top performers have been able to both replace reserves and create value through exploration.
Although the supermajors remain the top performers with high exploration success, their share of exploration spend peaked in 1998 and is 33% down—translating to an overall 50% decline in reserves replacement. But the bad news on reserves replacement is counterbalanced by the fact that improved margins mean that all companies now create value through exploration, especially top performers BP, Petrobras, Apache and Chevron. The latter is a ‘turnaround success’, moving from relatively mediocre performance in the late 1990s to top class value creation since the ChevronTexaco merger.
The study classifies companies as hunters, gatherers, grazers and foragers. Only the 25% in the ‘hunter’ category managed to fully replace production through new field exploration. Half fell into the ‘forager’ category, achieving neither high returns nor full reserves replacement.
* ‘Exploration Strategy and Performance.’ Wood Mackenzie, 2005. More from woodmac.com.
Wellogix and partner ChanneLinx have implemented a ‘purchase-to-pay’ e-business solution for Marathon Oil. The Marathon-branded ‘QuickTix’ process for supplies and complex oil field services routes transactions through Marathon’s global procurement process, improving efficiency and streamlining operations. SAP NetWeaver was used to integrate mySAP Supplier Relationship Management with Wellogix’ Complex Services Management Suite. The integrated package enables Marathon to improve complex services procurement and increase collaboration with suppliers.
Wellogix CEO Ike Epley said, ‘Marathon has invested substantial financial and human resources in its SAP solution. By combining mySAP SRM with our EFT module, Marathon has achieved significant leverage from the combined offering.’
SAP VP for the oil and gas vertical, Peter Maier said, ‘We recognize the need for industry partnerships to address complex services procurement. By leveraging Marathon’s existing ERP framework with point to point integration through mySAP SRM, enhanced by the use of industry standards (PIDX) and Wellogix’s industry knowledge and functionality, SAP and Wellogix have provided an economical and timely solution. Regarding the upstream, we see our partnership with Wellogix fitting well with our enterprise services vision. Wellogix is committed to build out its complex services procurement scenarios on the SAP composite application framework and xAPP product line.’
Invensys Process Systems (IPS) is rolling out new asset performance management solutions to monitor industrial assets, to maximize asset value and improve business performance. The solutions draw upon technology and expertise from Invensys units Avantis, Foxboro, SimSci-Esscor, Triconex and Wonderware.
IPS president Mike Caliel said, ‘Asset availability used to be the domain of the maintenance department while asset utilization was the domain of operations. As a result, vendor solutions were typically aimed at one or the other, depending on the vendor’s particular area of expertise. However, maximum asset value is derived by balancing asset availability and use in a manner that best meets current business objectives. This requires a holistic approach that, unfortunately for users, is beyond the ‘comfort zone’ of most automation vendors. Our portfolio encompasses measurement and control, safety systems, simulation and optimization, plant intelligence, real-time monitoring and enterprise asset management. So Invensys is arguably the only automation vendor that can deliver comprehensive asset performance management solutions that address both availability and utilization.’
IPS is rolling out a family of nine component and system-level asset performance management solutions, or ‘monitors,’ that will be expanded to encompass performance management solutions for vertical, industry-specific assets. Initial components include monitors for equipment, instruments, valves, pumps and process loops. The pump monitor acts like a ‘computerized stethoscope’ to provide a variety of on-line diagnostics. High frequency acoustic sensing technology filters out background noise to produce a graphic representation of pump health. An ‘intelligent’ alarm management monitor directs operators to the root cause of an alarm. Security management conforms to guidelines including ISA SP99, NERC CIP002-CIP009, and 21CFR11.
RPS Group has acquired ECL (Exploration Consultants Limited) for a 27 million cash and paper consideration. ECL is now part of RPS’ Energy Division. The UK-based group now operates from regional offices in the USA, Canada, Europe, SE Asia and Australia. The move is in response to developing market opportunities within the oil and gas consulting sector and will add ‘significant additional capability’ to RPS Energy’s offerings.
RPS Energy CEO Phil Williams told Oil IT Journal, ‘We have been building the RPS Energy business steadily over the last two years. The ECL acquisition will allow us to accelerate our development— particularly with our international operating bases in Houston, Perth and now Calgary. RPS has a real commitment to developing the leading consultancy business in the energy sector. This began with the acquisition of Hydrosearch two years ago and RPS Energy is now one of the largest energy sector consultancies.’
Management teams of both organizations are to remain with the businesses. In addition to the HydroSearch acquisition, RPS recently acquired UK-based Cambrian and Troy-Ikoda. RPS Group announced record results for the six months ended 30 June 2005 with profit before tax up 23% and earnings per share up 13%. Net revenue for the period, at 82.3 million, was up 35%.
Supply-side upstream e-business consortium, OFS Portal, is to deploy a content management and customer support solution from Heiler Software Corp. of Stuttgart, Germany. Heiler’s Premium Content Management (PCM) solution electronically receives, validates and publishes suppliers’ product information and delivers it to buyers. Heiler implemented, and now officially supports, the API’s Petroleum Industry Data Exchange (PIDX) catalog schema.
OFS Portal VP Randy Dutton said, ‘We needed a system that was both supplier and buyer centric. PCM is a scalable solution for our community and Heiler, a flexible partner that will help us meet the complex needs of the oil and gas industry.’
Heiler’s catalog and content management solutions integrate with corporate e-procurement systems such as SAP’s Enterprise Buyer Professional, mySAP Supplier Relationship Management and procurement solutions from CommerceOne and Ariba. PCM will automate OFS Portal’s catalog management and will distribute catalogs from suppliers’ back ends through OFS Portal to buyers.
OFS Portal will benefit from Heiler’s hosting and managed services, providing members with a dedicated environment and support. Deployment of PCM Premium Content Manager for OFS Portal was completed in two months.
Heiler COO Greg Wong added, ‘We are pleased to be working with OFS Portal to drive industry standards. Clients are looking for turn-key solutions where vendors take responsibility, not only for delivering software, but for making sure it works in their environment.’
Real-time process management specialist OSIsoft has teamed with NRX Global to offer users of SAP’s Enterprise Asset Management (EAM) solution access to real-time performance data, integrating disparate systems in a ‘role-based’ portal. The solution maps equipment tags and measurements to SAP records. OSIsoft’s real time data acquisition system RtPM will be coupled with NRX’s VIP, a ‘packaged composite application’ built on SAP’s NetWeaver infrastructure.
OSIsoft marketing VP Michael Saucier said, ‘The combined solution from OSIsoft and NRX bridges the information gaps between operations, maintenance and the supply chain. Organizations can support proactive enterprise asset management initiatives and optimize performance.’
NRX will incorporate OSIsoft’s PI data historian, the Advanced Computing Engine (ACE), and SAP’s ‘iViews’ java portal components. PI ACE is a Visual Basic .NET environment for programming calculations such as heat and material balances, data reconciliation and cost accounting. The ability to import and analyze data in real time gives asset owners fast access to performance data and enhances strategic planning. Asset efficiencies and other performance metrics can be fed back into SAP modules, such as the Business Information Warehouse.
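To give a flavor of the kind of calculation involved, the sketch below shows a classic least-squares data reconciliation on a single material balance (feed = out1 + out2). It is a generic textbook technique rendered in Python for illustration; it is not the PI ACE API, and the meter readings and uncertainties are invented.

```python
# Generic least-squares data reconciliation sketch (illustrative only;
# not PI ACE). Meter readings and uncertainties are invented.
import numpy as np

def reconcile(measured, sigma):
    """Adjust three flow measurements so the material balance
    feed = out1 + out2 closes exactly, minimizing the weighted
    squared adjustment. Constraint: A @ x = 0 with A = [1, -1, -1]."""
    A = np.array([[1.0, -1.0, -1.0]])
    W = np.diag(np.asarray(sigma, float) ** 2)   # measurement variances
    x = np.asarray(measured, float)
    # Classical reconciliation: x* = x - W A^T (A W A^T)^-1 A x
    correction = W @ A.T @ np.linalg.solve(A @ W @ A.T, A @ x)
    return x - correction

# Raw meter readings don't balance: 100.0 != 58.0 + 39.0
adjusted = reconcile([100.0, 58.0, 39.0], sigma=[2.0, 1.0, 1.0])
feed, out1, out2 = adjusted
assert abs(feed - (out1 + out2)) < 1e-9   # balance now closes
```

Note how the least reliable meter (the feed, with the largest sigma) absorbs most of the correction, which is exactly the behavior an operator wants from an automated reconciliation running against a real-time historian.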
NRX Global president John Burke concluded, ‘By combining the power of VIP with OSIsoft’s RtPM platform, SAP’s customers will now be able to measure business process improvements in real-time. Combining VIP’s IM capability with OSIsoft’s real time solution will let asset owners realize maintenance and materials management excellence.’
Newfield Exploration has extended its implementation of P2ES’ Enterprise Upstream business management suite to cover all of its North American locations. Newfield’s deployment links Enterprise Upstream with Oracle eBusiness Suite including general ledger, accounts receivable, accounts payable, joint interest billing, AFE and settlement. Initially, Enterprise Upstream Land, Production and Field Operations are to be deployed. Newfield plans to have its Malaysian office on the same system by the end of 2005.
Newfield Controller Brian Rickmers said, ‘Integrating our offices in Tulsa, Houston and Denver means we can track production, control costs and manage our domestic operations efficiently. By expanding the implementation internationally, our ability to manage our business processes world wide is even greater.’
Enterprise Upstream is a platform-independent web-based oil and gas business management solution that integrates with corporate ERP systems. The system includes modules for business intelligence and reporting, operations accounting, land management and volumes management—leveraging a centralized, cross business-domain data store.
Newfield’s initial P2ES deployment dates back to 2003 (OITJ Vol. 8 N° 7) when P2ES helped Newfield establish an enterprise datamart to provide a single, centralized source of operations and financial data to all of its office and field personnel. P2ES, a Petroleum Place unit, was created in January 2003, by the merger of Paradigm Technologies, Novistar and Petroleum Financial.