July-August 2008


Quantitative virtuality!

Ingrain’s Jack Dvorkin makes a compelling case for a new kind of special core analysis—combining a state-of-the-art 3-D NanoXCT x-ray tomographic scanner with high-performance computer modeling.

Modeling can be viewed as a spectrum of activities—from the speculative pre-discovery model of what may or may not prove to be a reservoir, to the near-physical reality of a refinery simulator. What separates the speculative from the ‘real’ is the ease with which the model can be checked against additional measurements. Ultimately, models can become so well constrained that they can be used to perform ‘experiments’ themselves—as in automobile crash testing.

In a recent presentation* to the Canadian Well Logging Society, Ingrain advisor Jack Dvorkin made an elegant case for a step change in reservoir analysis, with computed tomography (CT) scanning providing the constraints to high fidelity computer models of reservoir rocks. Dvorkin, whose other life is as a researcher in the geophysics department of Stanford University, describes the approach as ‘quantitative virtuality,’ a new way of relating measurements such as well logs or seismics to physical reality.

Stanford studies on the North Sea Troll gas supergiant have shown some counterintuitive results that were resolved by computer modeling. Compressional and shear wave velocity ratios varied widely for similar porosity ranges—depending on grain cementation. This was ‘understood’ by computer modeling of the whole process, from an unconsolidated sand through diagenesis and burial.

Permeability, an elusive parameter for conventional approaches, is usually categorized by expensive special core analysis in the lab. With quantitative virtuality, full forward modeling of the rock’s history has been used to relate permeability to grain size, pore topology and membranes. As Dvorkin says, ‘experiment, experiment, experiment...’

Ingrain’s 3-D NanoXCT** imaging system gives a ‘true’ 3D representation of the rock. Images of Saudi Arabian carbonates even show fossils in detail. Armed with such high resolution images, Ingrain’s scientists run fluid flow ‘experiments’ on the computer. Checks with lab measurements sometimes produce widely different results. In one case, a difference in permeability of two orders of magnitude was observed. It turned out that the computer was right and the lab was wrong! Multiphase flow is very hard to implement in lab experiments—such experiments can be performed more accurately, more cheaply and faster in the computer, where heterogeneities can be correctly categorized. Ultimately, Dvorkin sees modeling of rock physics right down to the quantum scale.
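To give a flavor of this kind of computation, here is a minimal Python sketch (entirely illustrative; Ingrain’s actual solvers are proprietary) that estimates porosity from a segmented micro-CT volume and derives a first-order Kozeny-Carman permeability from the pore-grain surface area. The synthetic volume, the 1 micron voxel size and the Kozeny constant are all assumptions for the example:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for a segmented micro-CT volume:
# True marks a pore voxel, False a mineral grain.
volume = rng.random((64, 64, 64)) < 0.2

# Porosity is simply the pore-voxel fraction of the imaged sample.
porosity = float(volume.mean())

# Specific surface area S: count pore/grain voxel-face transitions
# along each axis, then normalize by the sample volume.
voxel = 1e-6  # assumed voxel edge length of 1 micron
faces = 0
for axis in range(3):
    a = np.swapaxes(volume, 0, axis)
    faces += int(np.count_nonzero(a[1:] ^ a[:-1]))
S = faces * voxel**2 / (volume.size * voxel**3)  # m^2 of interface per m^3

# First-order Kozeny-Carman estimate (Kozeny constant taken as ~5).
# Uncorrelated noise gives an unrealistically fine pore structure,
# so the value comes out far lower than for a real rock.
k = porosity**3 / (5.0 * S**2 * (1.0 - porosity) ** 2)  # m^2

print(f"porosity: {porosity:.3f}")
print(f"permeability estimate: {k:.2e} m^2")
```

Real workflows segment measured x-ray attenuation rather than random numbers, and permeability comes from flow simulation on the connected pore network rather than from a bulk correlation.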

The computer can also be used to ‘decrack’ cuttings for numerical experiments or perform virtual experiments on different fluid fill—which would take weeks in the lab. Massive databases of rock properties can now be delivered in near real time.

* www.oilit.com/links/0807_1

** www.oilit.com/links/0807_2


Vista buys P2ES

San Francisco venture capital fund buys Petroleum Place Energy Solutions and The Oil and Gas Clearing House.

Venture capital fund Vista Equity Partners (VEP) of San Francisco has acquired the parent company of both P2 Energy Solutions (P2ES—formerly Petroleum Place) and The Oil & Gas Asset Clearinghouse (OGAC). P2ES is an upstream software and data boutique—with products including Enterprise Upstream and Tobin Land. OGAC is an asset auction house and negotiated transactions unit.

Vista founder Robert Smith said, ‘We are long-term investors in technology companies that are committed to market leadership. P2ES’ domain expertise and vertical market focus have allowed them to help clients address issues facing the global oil and gas industry. We look forward to building out the platform with additional solutions.’

P2ES applications are used by 340 E&P companies. The company’s geospatial data products and services have been sold to over 2,000 customers including over 70 percent of the top 50 public E&P companies. The OGAC has closed $7 billion in oil and gas asset and corporate transactions since 1992. VEP has $2 billion invested in technology-based organizations and is to ‘recapitalize’ P2ES.


What happened to oil and gas R&D?

Oil IT Journal editor Neil McNaughton back from the European Association of Geoscientists and Engineers reflects on how R&D used to be and how it is now. Are petroleum engineers really an endangered species? Why do oils lag so far behind other industries in R&D spend?

Once upon a time, large oil companies were hard to get into, but once inside, there was, how should I put it, an easier atmosphere in the workplace. Companies tolerated independent thinking, not to say eccentricity. They financed research—often with their own laboratories—investigating almost everything from the palynology of continental slope deposits, through time series analysis for seismic prospecting, and downstream to catalysts for refining. In fact the old approach to R&D was in many respects rather like the ‘new way’ as exemplified by the easy-going attitude of Google and other dot coms—with their indoor basketball games, pizzas delivered to every cubicle and an open attitude to research on the basis that something useful may turn up from even the craziest-seeming projects. Well maybe not, but you get the picture...

The special EAGE R&D session (see page 6 of this issue) showed just how much the picture has changed in the thirty years or so since the halcyon days I refer to above. Today, it is hard to categorize, let alone understand, the situation regarding R&D in the upstream. On the one hand, large oil companies have been through a long phase of reorganizing, of acquisitions and mergers, that has turned established in-house research inside out when it has not simply been wiped out. On the other hand, governments, looking to reduce the cost of universities, have rationalized out-of-favor subjects like geology and petroleum engineering and encouraged the financing of the remaining research programs by industry.

On the face of it, this should have led to a perfect match between a new, outsourcing paradigm and oils migrating their research effort into universities and consortia. But if Pat Corbett, professor of petroleum geo-engineering at Heriot-Watt University, is to be believed, this is not what has happened at all. In fact Corbett thinks we may even be on the edge of a Petroleum Engineering ‘mass extinction.’

A Financial Times survey this month* discovered that despite a 16% hike in R&D spending in 2007, western oil majors lag behind other industries, with ‘strikingly low’ expenditure of less than 0.3% of turnover—compared with 15% for technology and pharmaceutical companies. The big R&D spenders are Shell, Exxon, Total and Schlumberger.

I’d say that the R&D enigma is symptomatic of a wider malaise in the way large oils have been managed over the last few decades. In the latter half of the 20th century, the industry was pretty profligate across the board, with decades of growth into virgin exploration territory and rising oil prices in the 60s and 70s. The wheels came off in the late 80s and 90s as the oil price collapsed to around $10 per barrel, opening the way for a new approach to management. The cost cutting era began with huge losses in the workforce. Consultants made promises of economies from outsourcing. Benchmark studies showed that oils performed like dogs when compared with banking and financial services and later with the dot coms. The key performance indicator (KPI) era began with the notion that financial improvement could be driven from the shop floor.

Meanwhile the majors turned their attention to ‘giant hunting,’ only looking for really big stuff, and began to divest non-core assets. I’ve never really understood giant hunting. It’s a bit like Coke saying, ‘We don’t care about selling bottles—we’re going to focus on selling tanker loads.’ And non-core? To divest the smaller stuff says, ‘Despite our huge investment in IT, our outsourcing and cost cutting, we are not a very efficient operation and can’t really sweat the small stuff.’

Bottom line is, all the benchmarks and KPIs in the world won’t tell you what the oil price is going to be tomorrow. Investors understand that a share in an oil (or service company) is essentially a bet on the future oil price. Cost cutting, whether it is in R&D or in IT, is a bet on a low oil price and vice versa. The trouble is that companies have been slavishly following the cost cutting approach for so long that they are continuing with it even during the current boom. There is a lot of inertia in management fads, unfortunately.

But there is light at the end of the tunnel, thanks to the destructive and creative forces of capitalism. As big oils have shed their staff, many have gone on to work with or create independents. According to a PFC Energy study quoted in the Financial Times this month, recent exploration successes have been ‘mostly limited to small E&P specialists and national oil companies, which now control three-quarters of worldwide reserves.’ Ironically, the startups are ‘aggravating the talent shortage faced by large oil companies’ as experienced executives jump ship and form their own ventures.

Are they financing R&D? Probably not to the same extent as the majors used to. But some independents are pretty active in the application of cutting edge technology. It makes for a great differentiator and is a great tale to tell the investors too!

R&D can be viewed as a bellwether of oil and gas activity at large. During the last downturn most majors were wrong-footed. BP’s stance on R&D has gone through a complete cycle since the 1980s, when its world class R&D center was closed and the job was largely left to contractors. As Michelle Judson explained, BP has now returned to a more hands-on approach, with ‘flagship areas’ selected for in-house technology investment preceding outsourced ‘at scale’ implementation.

In the end it’s probably impossible for majors to go counter-cyclical in any significant way, although their current failure to replace reserves should be a warning against the across-the-board cost cutting mentality. It’s not just that NOCs have the oil, it’s also that they have a more recession-proof game plan than the western majors. For governments the picture is different. Governments have shorted oil in the past as they closed geology and PE departments. Geoscience funding is now targeting global warming and it’s easier to get funding for research on Precambrian weather systems than on maximizing oil recovery!

* www.oilit.com/links/0807_5


Oil ITJ interview—Tony Edwards, chair of Intelligent Energy

Tony Edwards (BG), who was chair of the 2008 Intelligent Energy conference, took issue with our April 2008 editorial categorizing the tradeshow as a ‘beauty contest’ and contrasting it with the supposedly more workmanlike PNEC data integration conference. Edwards believes that the digital oilfield is less about technology than about people and process.

Edwards—I think that your editorial missed a key point about Intelligent Energy (IE) 2008. The digital oilfield approach emphasizes the integration of people, process and technology to change the way we work. The digital oilfield (DOF) approach mirrors a similar evolution of thinking that has already taken place in the HSSE arena. This was initially addressed with standards and technology—but early failures led to the introduction of process and more people-related solutions. Technology is the enabler but not necessarily the key to success. DOF projects have shown that the integration of the work processes and the teams that run them is the critical factor in a successful outcome. If you integrate the people by taking a multidiscipline approach to a value-adding process such as production optimization, you can overcome the fact that there is imperfect data and system integration.

OITJ—But how can this happen if you haven’t solved the underlying data issues that PNEC shows are still prevalent?

Edwards—Of course the technology is important, and the smoother the data and system integration the easier it is, but the DOF approach centers on the people and process that add value even if the data is incomplete. In short, we are often in a position in E&P of having incomplete data and yet we still have to make a decision. What’s key are the processes we can set up—and how we organize to take these value-adding decisions while minimizing the risk attached to them.

OITJ—So we missed the point of IE2008?

Edwards—I think so. One day of IE2008 was devoted to integration through people and processes. One of my favorite books on the topic is James Surowiecki’s ‘The Wisdom of Crowds*.’ It describes how better decisions can be arrived at through collaboration—even with partial information. In my days at BP we made great use of these techniques—which also interrelate with game theory. DOF projects have shown that giving very simple data in real time to petroleum engineers working remotely from the operational site can make a big difference in understanding a well’s performance.

OITJ—So data is not the big issue?

Edwards—Sometimes an IT or technology-led approach has been taken where the project is deemed a technical success but in fact adds little or no value. This is usually because the process workflow has not been updated: the people working it resist the change, work as they always have, and no value is added. But once the ‘people’ issues have been addressed, improvements to data can ‘turbo charge’ the whole process. Data certainly is an issue. But as Chevron CTO Don Paul suggested at the 2008 Houston Digital Energy event, some data issues have proved rather intractable. Hence the need for a different approach.

OITJ—One of the IE2008 success stories and flagship DOF projects is the Norwegian Integrated Operations initiative. This surely is a technology-driven initiative with the development of a whole range of IT initiatives and standards.

Edwards—Not at all! Integrated Operations is primarily a people and process approach emphasizing collaboration as above. Like other successful DOF projects, it is an organizational and transformational program rather than an IT project. Taking this approach puts the emphasis on moving the people to a new way of working that is enabled by the technology. Often it is the simple technologies that have the biggest impact, e.g. video conferencing (VC). Having ‘always on’ VC between onshore and offshore allows the teams to build trust. This results in the teams including each other more readily in the decision making process. You have one team instead of two. Being able to see the same data at the same time is also key, but it is the change in the team’s behavior that really adds the value.

* www.oilit.com/links/0807_6


Book Review—Numerical Earth Models, Jean-Laurent Mallet

New EAGE publication provides math behind, and insights to, commercial earth modeling solutions.

Numerical Earth Models* (NEM) is a good blend of discussion and math, and, although we didn’t understand much of the latter, the subtext turns out to be bang on in terms of the actualité of commercial earth modeling. Indeed, Prof. Mallet, recently retired from his position at the Nancy, France, School of Mines and father of GoCad, has performed a remarkable feat in slipping in a compelling piece of marketing for Paradigm’s new ‘Skua’ technology under the guise of an EAGE ‘Education Tour Series’ publication.

Mallet takes it as a given that earth modeling is the basis of E&P decision making. He dismisses the ‘classical,’ i.e. engineering CAD approach as being inadequate for modeling geological complexity and presents his ‘discrete smooth interpolation’ approach as used in GoCad. NEM includes some pretty good illustrations of complex geologies to back up the mathematical exposé. The math is tough, but accompanied by copious and relatively clear explanations for the mathematically challenged.
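To give a flavor of the idea (a toy 1-D analogue, not Mallet’s actual formulation), a ‘discrete smooth interpolation’ can be posed as a least-squares problem: find node values on a mesh that minimize roughness (second differences) while honoring a few heavily weighted data constraints:

```python
import numpy as np

n = 50  # nodes on a 1-D 'mesh'
data = {5: 2.0, 25: -1.0, 44: 3.0}  # hard data: node index -> value

# Roughness rows: second differences penalize curvature.
rows = []
rhs = []
for i in range(1, n - 1):
    r = np.zeros(n)
    r[i - 1], r[i], r[i + 1] = 1.0, -2.0, 1.0
    rows.append(r)
    rhs.append(0.0)

# Constraint rows, weighted so the data are (almost) exactly honored.
w = 1e6
for i, v in data.items():
    r = np.zeros(n)
    r[i] = w
    rows.append(r)
    rhs.append(w * v)

# Solve the stacked system in the least-squares sense.
x, *_ = np.linalg.lstsq(np.vstack(rows), np.array(rhs), rcond=None)

# The interpolant passes through the data and is smooth elsewhere.
for i, v in data.items():
    assert abs(x[i] - v) < 1e-3
```

Broadly speaking, GoCad’s DSI generalizes this to irregular 3-D meshes with soft (‘fuzzy’) as well as hard constraints.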

Mallet moves on from topological models to geostatistics, offering many examples of stochastically generated channels and geobodies. A chapter on seismic interpretation is a little idiosyncratic as it appears to dismiss conventional Fourier approaches to propose ‘trigonometric polynomial’ modeling of the seismic trace.

But the payload of NEM is in combining all of the above in the shared earth model and its successors, blending seismically derived attributes with the geometrical model. Mallet makes a good case for the inadequacy of ‘conventional’ pillar grid approaches of Schlumberger’s Petrel and many other tools including previous incarnations of GoCad.

Mallet’s answer is the ‘unified earth model,’ an approach that removes structural influences before performing cellularization and attribute modeling. A final chapter extends all of the above to upscaling and fluid flow modeling.

NEM is an extraordinary publication that conveys Mallet’s enthusiasm for modeling the earth, seismics, fluid flow and just about everything else. Is it ‘commercial?’ Definitely, and the EAGE is to be congratulated for producing a great blend of product-oriented discussion and science.

* EAGE 2008/ISBN 978-90-73781-63-4 and www.oilit.com/links/0807_9


SKUA—‘Petrel eating’ software rolls out at EAGE

Paradigm’s new modeling package promises ‘depositionally correct’ attribute modeling.

Pre-announced at the SEG last year (OITJ Oct. 07), Paradigm’s Skua 2008 was officially released at the Rome EAGE. Paradigm’s flamboyant CEO John Gibson explained the significance of the awkwardly acronymic ‘subsurface knowledge unified approach.’ The skua is an Antarctic bird that eats petrels! Gibson described Skua as a ‘groundbreaking software development that exemplifies Paradigm’s scientific contribution* to subsurface modeling technology.’

Skua chief architect Jean-Claude Dulac said, ‘Skua embeds a native, full 3D description of the reservoir that removes the limitations of conventional pillar-based (like Petrel) methods. This makes possible hitherto impossible geomodeling and will change the way geoscientists work.’

Key to the Skua approach is its ability to model geological events in a ‘depositionally correct’ manner. We quizzed Dulac at the EAGE for more information on the product to learn that although Skua is not geodetically ‘aware,’ coordinate systems can be managed in Paradigm’s Epos data framework. Earlier GoCad usability issues are now addressed thanks to Skua’s structural and stratigraphic workflows. On the interoperability front, Paradigm is in the process of porting all of its applications to the cross-platform Qt toolkit ready for its next major release in 2009.

* See our review of Professor Mallet’s book on page 3.


Caesar Systems puts Smalltalk change logs to multiple use

Object-oriented programming tool used for scenario analysis, recovery and training.

Senior developer Leandro Caniglia, speaking at the Smalltalk Solutions 2008 conference* in Reno last month, described the use of the Smalltalk programming language by Caesar Systems. Smalltalk was first developed at the famed Xerox PARC R&D facility and was one of the earliest languages to use object oriented techniques and an integrated development environment. Caesar Systems uses Smalltalk in its PetroVR flagship application for analysis of worldwide oil and gas opportunities. Caniglia said, ‘Smalltalk gives Caesar Systems and its clients flexibility and reliability for critical decision support and risk analysis applications. PetroVR users adjust key parameters under various ‘what-if’ scenarios. By logging all user changes, PetroVR is able to play any sequence of commands back and forth, thus making the building process repeatable and auditable.’

Caniglia’s presentation showed how Smalltalk’s user change logs can be leveraged in a range of innovative ways—to script and replay complex user interactions, to recover system states, audit user changes, undo actions and to make training demos. Other less obvious uses include the recreation of old projects in upgraded systems, bug reporting and software regression testing. Change logs have also proved valuable in monitoring user ‘dedication and productivity’ and in seeing which functions are most used. PetroVR users include Anadarko, BHP Billiton, BP, Chevron, ConocoPhillips, Murphy Oil, Pioneer Natural Resources, Shell and Total.
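The change-log pattern itself is language-agnostic. A minimal Python sketch (PetroVR is Smalltalk and its internals are not public; all names here are invented) shows how logging every change makes state replayable, undoable and auditable:

```python
class Model:
    """Toy stand-in for a project model with named parameters."""
    def __init__(self):
        self.params = {}

class ChangeLog:
    """Records every user change so any past state can be rebuilt."""
    def __init__(self, model):
        self.model = model
        self.log = []  # entries of (name, old_value, new_value)

    def set(self, name, value):
        old = self.model.params.get(name)
        self.log.append((name, old, value))
        self.model.params[name] = value

    def undo(self):
        name, old, _ = self.log.pop()
        if old is None:
            del self.model.params[name]
        else:
            self.model.params[name] = old

    def replay(self, upto=None):
        """Rebuild a fresh model from the log: the basis for audit,
        recovery, training demos and regression testing."""
        m = Model()
        for name, _, value in self.log[:upto]:
            m.params[name] = value
        return m

log = ChangeLog(Model())
log.set("oil_price", 60.0)
log.set("capex", 1.2e9)
log.set("oil_price", 75.0)
snapshot = log.replay(upto=2)  # state after the first two commands
assert snapshot.params["oil_price"] == 60.0
log.undo()                     # roll back the last change
assert log.model.params["oil_price"] == 60.0
```

Because the log is itself data, the same record can drive audit trails, bug reproduction and regression tests: replay the log against an upgraded system and compare results.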

* www.oilit.com/links/0807_3


DrillingInfo Network Architecture rolls out

‘DNA’ announced as web-based knowledge and project management platform for E&P.

US oil and gas data provider DrillingInfo has announced the DrillingInfo Network Architecture (DNA), a web-based knowledge and project management platform to capture and store project information and documents. DNA organizes information into a tree structure and map view to offer E&P professionals a searchable knowledge base.

DNA is designed to store well files, land information and well production data. Data rooms can be built on the fly and data transferred to interpretation projects. Drilling Reports, lease information, well logs and elections can be shared with partners. Maps and reports can be attached to project areas, for competitor analysis. DNA targets users in operating oil companies, data vendors and the investment community.


Multicore and ‘petascale’ seismic processing analyzed

IBM paper sees promising future for multicore architecture in seismics. Repsol is the flagship implementer.

IBM researcher Mike Perrone has just released a paper* on the impact of multicore computing on seismic processing. Perrone notes that the seismic imaging industry deploys ‘some of the largest supercomputing clusters on the planet’ and has a ‘tremendous appetite’ for computing resources. He sees the adoption of petascale computing as ‘only a few months away,’ which will make oil and gas one of the first industries to make the significant changes it requires.

The move to multicore architectures is driving a ‘sea change’ in the computer industry and brings challenges such as algorithm design, integration of novel architectures with traditional computational systems and data management. Typifying the new breed of scientific computing engines is the IBM BladeCenter QS22 blade server with over 400 GFLOPS of peak performance per blade. The PowerXCell 8i processor on the QS22 has 9 cores optimized for parallel processing and consumes only 0.5 watts per GFLOP.
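Taking the quoted figures at face value, the power budget of a petaflop machine built from such blades is easy to sketch (back-of-the-envelope arithmetic only; it ignores memory, interconnect and cooling overheads):

```python
peak_per_blade = 400.0  # GFLOPS per QS22 blade, as quoted
watts_per_gflop = 0.5   # claimed efficiency of the PowerXCell 8i

watts_per_blade = peak_per_blade * watts_per_gflop          # 200 W
blades_for_petaflop = 1_000_000 / peak_per_blade            # 2500 blades
total_kw = blades_for_petaflop * watts_per_blade / 1000.0   # 500 kW

print(f"{watts_per_blade:.0f} W per blade")
print(f"{blades_for_petaflop:.0f} blades for 1 PFLOPS peak")
print(f"{total_kw:.0f} kW for the compute alone")
```

Half a megawatt for the processors alone illustrates why power efficiency, not just raw speed, drives the architecture choices Perrone describes.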

Perrone cited Repsol’s Kaleidoscope Project in Barcelona as an example of the way forward. ‘Unparalleled speed’ has been claimed for the existing Kaleidoscope setup on applications such as reverse time migration. The system will become a ‘pure’ QS22 cluster with next-generation Cell processors in 2010.

* www.oilit.com/papers/perrone.pdf


Software, hardware short takes ...

Zeh, Enigma, Mechdyne, Millennial Net, Exprodat, IDS, Auto-trol, Quorum, MapInfo, Neuralog, P2ES.

A new release of ZEH Montage Professional offers loading and zooming performance enhancements and automates the generation of multi-page PDF documents with page sizes of up to 200”.

Enigma Data Solutions has successfully tested its PARS geoscience archiving solution on the NetApp NearStore Virtual Tape Library (VTL) appliance. VTL lets users archive data on disk-based storage while maintaining the advantages of tape-based backup systems and compatibility with legacy systems.

Mechdyne has just announced the 3D Review Station, an integrated display and software system that uses a high definition TV display to make interactive 3D more accessible to technical applications. Mechdyne’s Conduit software is used to view images created in commercial applications for engineering, geophysical mapping and other technical fields—in fully interactive stereoscopic 3D. The system includes a 61” HD screen, a 3D graphics workstation, 3D glasses and a game controller for interaction. Conduit modules include support for Pro/ENGINEER, CATIA, ArcGIS, Google Earth and Autodesk Maya.

Millennial Net’s MeshScape 5 wireless mesh network introduces ‘virtually on’ technology to enable nodes to run at low power on batteries for several years. Active frequency hopping technology avoids interference in noisy environments and a new low latency capability reduces packet delivery time in multi-hop environments.

The 8.3 release of Exprodat’s NitroView ArcIMS helper app includes the ability to ‘re-symbolize’ vector layers and to add data from external data sources without first including it in the map service. NitroView offers ‘out-of-the-box’ integration with common E&P systems.

Completion Services’ Completion String Design (CSD) package now supports bi-directional data exchange with the IDS ProNet completions reporting database.

A new release of Auto-trol’s Konfig configuration and information management tool provides a central Oracle repository, globally distributed secure file vaults and web based clients for access by multiple users to corporate engineering data. Konfig 8’s smart workflows automate business processes according to predefined rules and metadata changes. A graphical workflow editor is used to build reusable templates for key business processes.

Quorum Land System 5.0 has achieved endorsement as a SAP business solution integrated with SAP ERP Financials.

Pitney Bowes has just announced Encom Discover V10.0, its geological mapping add-on to MapInfo. The new release notably offers support for very large grids such as continental SRTM data.

Endeeper has announced PetroLedge Suite for capturing and analyzing petrographic descriptions of sedimentary rocks. PetroLedge includes a high-level geological ontology for classification and query.

Midland Valley has just rolled out ‘Move2008,’ a fully integrated bundle of its structural geo-modeling tools 2DMove, 3DMove and 4DMove.

Neuralog has announced NeuraLaserColor—its dedicated color well log printing system. The new device offers continuous laser prints at up to 7” per second in ‘brilliant’ color. The printer is OEM’d from Lexmark International.

The R2.5 release of P2 Energy Solutions’ Enterprise Land toolset promises ‘complete’ integration with Enterprise Upstream and tighter integration with Tobin GIS Studio. A new lease acquisition module acts as a broker management system. The package is also now integrated with SAP’s NetWeaver SOA platform.

Paradigm has announced Interpret 2008, its new well test analysis package. New functions include well test deconvolution and multi-session formation tests. The package is now compatible with Paradigm’s Geolog petrophysical package.


SAP implementation fixes ‘uncoordinated’ plant maintenance

Accenture delivers SAP Netweaver-based solution to Korea National Oil Corporation.

Accenture has just completed a major SAP development for the Korea National Oil Corp. (KNOC). KNOC was faced with the challenge of ‘uncoordinated’ plant maintenance processes across disparate systems. Complex user interactions were resulting in high maintenance costs and low productivity.  Accenture’s solution was an SAP enterprise resource planning (ERP) solution that enabled KNOC’s maintenance professionals to carry out all of their job-related transactions via an SAP NetWeaver portal.

The deployment has optimized KNOC’s business processes and improved data visibility thanks to SAP’s Enterprise Service-Oriented Architecture approach. KNOC has now standardized on SAP NetWeaver as its web services platform and Accenture reports ‘better workforce productivity with increased reliability due to immediate access to asset information.’ Reuse of web services extends across all KNOC oilfield applications and plant maintenance costs have been reduced. In 2006, KNOC reported sales of 1 billion.


Suncor deploys EnergySolutions’ pipeline scheduling solution

PipelineTransporter to support US and Canadian oil sands pipeline operations.

Calgary, Alberta-based Suncor Energy has successfully deployed EnergySolutions’ PipelineTransporter (PT) application on its US and Canadian pipeline network. Suncor is using PT for planning, scheduling and volumetric accounting on its Rocky Mountain and Centennial pipelines in the US and the Oil Sands pipeline in Canada.

Ellen Lauersen, director, optimization, planning and crude QC at Suncor said, ‘PT has simplified our processes by eliminating or reducing manual calculations. Refineries and other customers are expected to benefit from more detailed and complete pipeline and volumetric reporting. EnergySolutions has deployed the interfaces we required for a more integrated scheduling and accounting system.’


EAGE 2008, Rome

A session at the European Association of Geoscientists and Engineers’ conference and exhibition discussed a possible ‘mass extinction’ of petroleum engineers! Landmark presented StatoilHydro’s ‘next generation’ seismic system. Schlumberger unveiled SharePoint ‘smart workflows’ and more.

Carlos Dengo (ExxonMobil) opened the ‘executive session’ on the state of R&D in the oil and gas industry noting the impact of ‘a changing business environment, increased competition, new technology and environmental concerns.’ Co-chair Salvatore Meli (ENI) added that hydrocarbons will dominate energy supply ‘for decades,’ which implies finding increasingly complex reservoirs and will require an ‘extraordinary R&D effort.’ Meli asked, ‘Is the R&D community ready?’

Pat Corbett, professor of petroleum geo-engineering at Heriot-Watt University, appears to think it is not. Corbett believes we may even be on the edge of a Petroleum Engineering ‘mass extinction.’ This flies in the face of plans for increased hydrocarbon production and what should be a golden age for R&D on the reservoir, on recovery, and on CO2 sequestration and use. But the reality is that there is little public funding, less IOC funding, an unhelpful educational strategy and now, ‘corporate universities.’ Corbett claimed, ‘None of us in this room believes in Peak Oil. We believe in the Shell blueprint with a peak circa 2030.’ But others, notably university planners, are less optimistic. Corbett cited Statoil’s Leif Magne Meling, who has equated the world’s needs to finding another Saudi Arabia in the next five years! Currently the UK government is putting 2% of R&D funding into oil and gas—about the same amount as into CO2. Even then, some argue that, with oil at $120, 2% is too much!

Len Srnka (ExxonMobil), citing ExxonMobil’s study on ‘The Outlook for Energy,’ predicted a 30 to 35% hike in oil demand by 2030 and concluded that a ‘perfect storm’ is brewing for R&D on various fronts including new data types such as controlled source electromagnetics (CSEM). High performance, sustained ‘petaflop’ computing will be here by around 2015. ExxonMobil expects to have several hundred TFLOP jobs running by 2012. Marine resistivity, Srnka’s speciality, has helped Exxon’s drilling success rate—and is ‘a powerful tool when combined with seismic.’ On the ‘CO2 challenge,’ ExxonMobil is ‘as concerned as anyone about what its impact may be on the environment.’ Srnka reckons we need to ‘retire all coal plants and replace them with nuclear and/or clean coal.’ Srnka concluded that the ‘sun is beginning to peek out from behind the storm clouds,’ thanks to data integration, next generation computing and new learning paradigms. For researchers, cash and ‘consistency’ are required. ExxonMobil’s funding is ‘decoupled from the oil price, possibly a bit contrarian and definitely long term.’

Rolf Helland enumerated several of StatoilHydro’s recent R&D projects emanating from last year’s merger. A ‘next generation’ seismic interpretation system has been developed in collaboration with Landmark (see below) and Statoil has been working on cluster-based high performance imaging for seismic processing. A Google Earth-based data visualization tool originally developed by Hydro has been enhanced and re-baptized StatoilHydro Earth (SHE). On the drilling front, StatoilHydro, with Shell, is backing an autonomous seabed drilling unit, the Badger Explorer.

Ibraheem Assa’adan described a key element of Saudi Aramco’s R&D as the goal of increasing recovery from the current 50% to 70%. For Aramco, a 1% hike in recovery equates to 30 billion barrels, i.e. one year of world consumption. Contributing to this effort are next generation petrophysics, the Powers II simulator (which will be capable of running 258 million cell models next year), the i-field and ‘extreme’ reservoir contact wells.

BP’s position on technology, according to Michelle Judson, has evolved from the time when BP just selected ‘best in class’ implementations from contractors. Today, BP identifies ‘flagship areas’ for technology investment. One example is seismic imaging—60% of BP’s Gulf of Mexico reserves are subsalt, and their discovery required a ‘leap of faith’ and a deep capability in modeling. BP collaborates with peers on fundamental research, but for Judson, there comes a time when you need a commitment to the proprietary development of a proof of concept, after which BP returns to a collaborative model for ‘at scale’ implementation. Examples of BP’s R&D successes include wide azimuth seismics, ‘Bright’ (a polymer that opens up like popcorn in hot zones, shutting off high permeability areas to redirect injection) and the ‘field of the future’ program. Questioned on BP’s stance on the intellectual property (IP) of its R&D effort, Judson described it as ‘defensive, not offensive.’ IP can go either way (oil company or contractor) but it needs to go to the ‘natural owner.’

In the general Q&A, Corbett asked why oil companies have stopped sponsoring much R&D and why they are still laying people off! ‘Western Europe has a problem—if you want to work in oil and gas, move to Saudi Arabia!’

After the R&D session we rushed over to the Landmark booth for the ‘next generation’ seismic interpretation system presentation. This $13 million, three year project is Landmark’s largest ever. Statoil’s Rolf Helland described how Landmark’s DecisionSpace interpretation toolset is being extended to support ‘all inclusive’ basin to prospect analysis with a common GUI, fast data access, third party integration and traceability. A demo showed how ‘continent-scale’ super regional work can begin, even in the absence of seismic and well data, by using maps and sections from published reports. The regional picture can be carried forward into conventional interpretation—a functionality that has been lost with the move to digital/workstation based interpretation. The question remains as to why the project stops at the prospect level and does not carry through to the reservoir.

Along with the R5000 DecisionSpace launch (see back page), Landmark unveiled its new EarthModel ‘hub’ product, scheduled for release next year. This extends GeoGraphix’ integration strategy from seismic to simulation and from reservoir to basin modeling. EarthModel will offer ‘in memory’ modeling with a small number of integrated applications rather than separate products as before. Neither will it be a ‘monolithic’ application. EarthModel promises a high resolution, scale independent geological model. The vision is for an easy to use, state of the art mapping system producing ‘simulator friendly’ grids, along with uncertainty and risk/portfolio analysis and custom workflows. EarthModel will also leverage Landmark’s ‘exclusive’ agreement with Geovariances for its geostatistics API.

Schlumberger was showing off the latest, 2008.1 incarnation of its Petrel modeling flagship, with what is described as a ‘unified’ seismic to simulation workflow. Fluid flow simulation history can now be imported into Petrel and ‘GeoFlows,’ streamline simulation models, can be run from inside the geological model. Multiple realizations can be performed to ‘map’ oil in place. According to Schlumberger, 18 of the top 20 oil producers use Petrel, which is deployed in some 600 companies. The product is showing ‘50% year on year growth.’ By our reckoning, that compounding would put Petrel in over 34,000 companies in ten years or so! A presentation at a pre-conference workshop highlighted Petrel’s integration capabilities—although one user pointed out that integration with third party applications was basically impossible without a plug-in.

Another Schlumberger presentation showed how DecisionPoint, running on Microsoft Office SharePoint Server (MOSS), could address more complex E&P tasks than Petrel. With stand-alone applications, data and documents can be lost and it may be hard to track decisions and understand the process. Enter MOSS ‘smart workflows’ for field development planning (FDP), developed under a Schlumberger-Microsoft alliance. MOSS tracks tasks and captures documents and structured data. An FDP demo showed activity in the Ship Shoal area, where MOSS supported weekly team meetings and asset reviews. Risk analysis and Petrel subsurface workflows could be kicked off from ESRI GIS or Google Earth. The project has been documented in a Shell paper (SPE 99482).

Exprodat rolled out Team-GIS Acreage Analyst (TGAA) at the show, an ArcGIS-based tool for ‘common risk segment analysis’ and acreage ranking. TGAA is used to evaluate and rank acreage opportunities. Ranking workflows can be standardized, automated and rapidly iterated to produce maps of basin and play fairways. Multiple inputs including depositional environment and paleo-geography can be combined to create component risk maps for key petroleum system elements such as reservoir, source and seal.


AspenTech 2008 European User Group, Berlin

KBR presents Hysys-based projects. Schlumberger’s Avocet models BG-operated Armada facility.

David Rassam (KBR) described the use of dynamic simulation to predict operational performance of offshore and onshore production facilities. Three case histories were presented. ‘Project A’ involved modeling an onshore gas processing terminal with AspenTech’s Hysys dynamic simulator. The model encompassed the onshore terminal, five satellite drilling platforms and a central production platform. Dynamic simulation was used to specify new features of the onshore terminal. During the project, Hysys was extended with an anti-surge controller algorithm from Dresser-Rand, stream multipliers and a component splitter. Head vs. flow rate plots were used to investigate compressor startup. The study concluded that suction throttling valves were required for the onshore facility and that hot gas bypass valves were not. The Dresser-Rand anti-surge controller algorithm allowed for optimal sizing of anti-surge valves and controllers.

‘Project B’ involved an offshore producer connected to a high pressure production manifold. Again, Hysys was used to model transient behavior of the gas compression system in the event of unplanned ‘upsets’ to normal operation, and was again extended with an anti-surge controller algorithm. The study confirmed plant stability as the plant inlet flow was ramped.

‘Project C’ investigated control and operations of a multiphase pipeline connecting a nine-slot platform to an onshore facility, in particular the ramp-down of production to a rate suitable for pigging. Hysys and Scandpower Petroleum Technology’s OLGA 2000 were used to model the multiphase pipeline. The study confirmed the feasibility of a reduced production rate and allowed for fine tuning of slug catcher pressure controller set points.
Rassam concluded that Hysys was a good tool for validating engineering design and for mitigating risk—eliminating the ‘element of surprise.’ Designers gained in confidence and were better able to meet environmental constraints and to define operating procedures.
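The anti-surge extensions mentioned above boil down to keeping the compressor operating point a safe distance to the right of the surge line in head vs. flow coordinates. A minimal sketch of such proportional action, with an invented surge line and invented tuning constants (this is not Dresser-Rand’s algorithm), might look like:

```python
def recycle_valve_demand(flow, head, k_surge=2.0e-3,
                         margin_sp=1.10, gain=4.0):
    """Fraction open [0..1] demanded of the anti-surge recycle valve.
    The surge line is assumed to be head = k_surge * flow**2 and the
    controller is plain proportional action on the surge margin; all
    constants here are invented for illustration."""
    q_surge = (head / k_surge) ** 0.5   # surge-line flow at current head
    margin = flow / q_surge             # > 1 means right of the surge line
    error = margin_sp - margin          # positive when too close to surge
    return min(1.0, max(0.0, gain * error))
```

With these constants, an operating point well to the right of the surge line leaves the recycle valve shut; as flow falls toward the surge line the valve ramps open.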

~

Martyn Johnston (Petrofac Consulting) and Taoufik Manai (Schlumberger) presented a paper on the use of Hysys to model the BG-operated ‘Armada’ facility. The Armada platform acts as a hub for six North Sea producing fields. Armada liquids are exported to Forties and gas to the Central Area Transmission System (CATS) gas pipeline. Armada handles gas, gas condensate, volatile and black oil. Hysys is at the heart of the Armada integrated asset model (IAM) and of Schlumberger’s ‘Avocet’ vision to ‘streamline the entire reservoir and production system into a single workflow.’ BG’s interest in the Armada IAM was for life-of-asset development planning, in calculating ‘back-out’ and to study the economics of proposed modifications to the facility. Other model components included Petroleum Experts’ GAP multiphase gathering network modeler, Calsep’s PVTSim and Schlumberger’s Eclipse fluid flow modeler. Legacy spreadsheet-based models were ported to the Avocet infrastructure. Here the main issue was making the model sufficiently robust for use in the IAM. The solution was to use Hysys’ ‘user variables’ to tune the model for a range of studies, including disconnecting network branches, shutting in wells and shutting in a whole field. Johnston concluded that integrated asset models are powerful tools for investigating the impact of proposed changes in reservoir management strategy. They are also useful in studying the effect of network hydraulics and facility sizing on lifecycle production and asset economics. The Schlumberger Avocet IAM package proved a flexible, user-friendly framework for integrating many different applications including Hysys. The Armada IAM has proved a flexible tool for asset management, capturing the complex interaction between reservoirs and the topside network.

~

‘Pinch technology’ was the subject of a presentation by Shell’s Oscar Aguilar and Ashok Dewan. Pinch technology uses thermodynamic modeling to optimize energy use across processes such as refining. This holistic approach to process design and optimization exploits the interactions between different plant units to optimize resources and minimize costs. Pinch technology was originally developed in the 1980s to reduce energy consumption but its use has now spread into other areas such as water and feedstock use and debottlenecking. Typically, pinch technology involves using the heat rejected by hot streams to warm cold plant processes.
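The energy-targeting calculation at the heart of pinch analysis, the ‘problem table’ cascade, is compact enough to sketch. The implementation below is a textbook version, not Shell’s; stream data and the ΔTmin value in any real study would of course be plant-specific.

```python
def problem_table(streams, dt_min=10.0):
    """Minimum-utility targets via the pinch 'problem table' cascade.
    streams: (t_supply, t_target, cp) tuples in degC and kW/K; hot
    streams have t_supply > t_target. Returns (Q_hot, Q_cold, pinch),
    where pinch is a *shifted* temperature (add/subtract dt_min/2 for
    the hot/cold side), or None if the problem has no pinch."""
    shifted = []
    for ts, tt, cp in streams:
        if ts > tt:  # hot stream: shift down by dt_min/2, supplies heat
            shifted.append((ts - dt_min / 2, tt - dt_min / 2, -cp))
        else:        # cold stream: shift up by dt_min/2, demands heat
            shifted.append((ts + dt_min / 2, tt + dt_min / 2, +cp))
    temps = sorted({t for ts, tt, _ in shifted for t in (ts, tt)},
                   reverse=True)
    # net heat deficit (cold demand minus hot supply) in each interval
    deficits = []
    for hi, lo in zip(temps, temps[1:]):
        cp_net = sum(cp for ts, tt, cp in shifted
                     if max(ts, tt) >= hi and min(ts, tt) <= lo)
        deficits.append(cp_net * (hi - lo))
    # smallest hot-utility input keeping the heat cascade non-negative
    cum = worst = 0.0
    for d in deficits:
        cum += d
        worst = max(worst, cum)
    q_hot = worst
    # rerun the cascade to locate the pinch and the cold utility
    cascade, pinch = q_hot, None
    for (hi, lo), d in zip(zip(temps, temps[1:]), deficits):
        cascade -= d
        if abs(cascade) < 1e-9:
            pinch = lo
    return q_hot, cascade, pinch
```

Run on the classic four-stream textbook example with ΔTmin = 20°C, this returns minimum hot and cold utility targets of 107.5 kW and 40 kW with a shifted pinch at 80°C.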


Folks, facts, orgs ...

News from Quorum, SEC, Michael Baker, Chevron, dGB-USA, Energistics, Geotrace, IES, Ingrain, Kalido, Midland Valley, Paradigm, Invensys, SensorTran, Boston Consulting Group, Spectraseis, Swagelok, Weeden & Co, WellPoint.

Quorum Business Solutions has hired Rich Schaeffer and Andy Fitz as directors of its new SAP Production Revenue Accounting Practice.

The US Securities and Exchange Commission has issued a request for information on the modernization of oil and gas reporting requirements (oilit.com/links/0807_8). The paper includes a discussion on the admissibility of computer methods in reserves estimates.

Michael Baker Corp. has appointed Michael White as CFO of its energy segment. White hails from Gexa Energy.

John McDonald, Chevron VP and CTO has been appointed to the board of the CTO Forum, a technology talking shop.

Kristofer Tingdahl is now CEO of dGB-USA and Friso Brouwer is VP technology and operations. Current president Fred Aminzadeh is leaving the company but will keep an advisory role.

A national data repositories work group has been established under the Energistics eRegulatory special interest group.

Darko Tufekcic has joined Geotrace as geoscience advisor to Eastern European clients. Tufekcic was previously with Western Geco.

Interactive Exploration Solutions has appointed Hector Sepulveda as VP business development.

Avrami Grader is now chief scientist with rock physics specialist Ingrain.

Kalido has appointed Mary Wells as VP marketing. Wells was previously with BEA systems.

Rosa Polanco-Ferrer has joined Midland Valley as structural geologist and Alison Martin as technical author.

Gary Morris has been named CFO of Paradigm. He was previously with Halliburton.

Invensys Process Systems, ENPPI and GASCO are to set up an industrial automation engineering centre of excellence in Cairo.

Mikko Jaaskelainen is now CTO of SensorTran. Jaaskelainen was previously with Shell. The company has also named John Rothermel (previously with Quantapoint) as EVP sales and marketing.

A study by the Boston Consulting Group has established that carbon capture and storage (CCS) is a feasible way of mitigating global warming. CCS could cut global emissions of carbon dioxide from stationary sources by ‘one-third.’

Spectraseis has appointed Ian Vann and Bjarte Fageras to its board. Vann was previously with BP. Fageras is chairman of Octio Geophysical.

William Winans has joined Swagelok as director, eBusiness and knowledge management.

Brokerage house Weeden & Co. has hired oilfield service sector specialist Geoff Kieburtz to its team of analysts.

WellPoint Systems has appointed Mickey Abougoush, Ben Mayberry and Don Wilson to its board. Abougoush is president of Teknica Overseas, Mayberry is CEO of Winston Sage Partners and Wilson senior VP of Stantec.


Done deals

ION, Ikon, Roxar, AGR Group, ARKeX, CDA, CiDRA, IHS, Spectrum, Fotech, Yokogawa and ABB.

ION Geophysical has acquired seismic equipment manufacturer ARAM Systems for CDN $350 million cash and stock.

~

Ikon Science has acquired the ‘GeoPatterns’ technology from Chroma Energy. GeoPatterns is a seismic data mining technique that uses pattern recognition to locate oil and gas prospects.

~

StatoilHydro signed a four-year global software contract with Roxar for the provision of its reservoir modeling software. The deal, worth some $5.9 million, is one of Roxar’s software unit’s largest ever contracts. Roxar previously had agreements with both Statoil and Hydro. The new combined contract represents a 10% revenue hike.

~

AGR Group has acquired all of the outstanding share capital of the TRACS consultancy, a provider of integrated services and training in geology and petroleum engineering. Initial payment on the NOK 204 million cash deal will be financed with NOK 125 million of loans and a share issue of NOK 34 million. The remainder will be paid in an earn out over 30 months.

~

Airborne gravity specialist ARKeX has raised $30 million from a venture capital group led by Ferd Venture of Oslo, Norway. The deal is said to be the largest venture round for an E&P service company in five years.

~

UK Common Data Access (CDA) is tendering for the building and management of a seismic data store. The contract is to be awarded before year end and operations are to start early in 2009.

~

CiDRA has sold its SonarTrac oil and gas metering business to Expro Group for $60.5 million. This is CiDRA’s third sale of a business unit since 2001. The metering system was developed in collaboration with BP to address multiphase flow measurement.

~

IHS has acquired the digital log and raster image assets of Reservoir Visualization for $4.1 million cash. The acquisition doubles IHS’ digital log inventory.

~

GGS ASA has floated its Spectrum seismic unit on the Oslo Axess exchange. The IPO raised nearly NOK 150 million.

~

Fotech has secured £6.5 million venture funding from Scottish Equity Partners, Energy Ventures and Saudi Arabia-based Shoaibi Group to commercialize its fiber optic solutions for monitoring oil and gas wells and pipelines.

~

Yokogawa has signed a strategic partnership agreement with JSC Gazprom Neft for the provision of production control systems and operation support software. Yokogawa is also to provide technical assistance in control systems design, delivery and operations.

~

Petrobras has awarded ABB a five year, $61 million frame agreement for the provision of process automation systems and services to eight oil refineries in Brazil.


Barco’s electronic display wall for China National Petroleum

54 OV-D2 80” screens make up ‘e-wall’ for Beijing pipeline command center.

Barco has supplied a state of the art display wall to China National Petroleum Corp. at the Beijing oil and gas pipeline control center, said to be the largest control center of its type in the world. The center controls long distance pipelines moving crude oil, product oil and natural gas across mainland China. 54 of Barco’s new 80” OV-D2 projection cubes in an 18 x 3 configuration make up the display wall. The cubes are driven by Barco’s new SCN system that allows the workstation image to be shared throughout the company’s network. The OV-D2 includes the latest DLP technology, high brightness, a large viewing angle and Ethernet connectivity.

CNPC automation manager Qi Guocheng said, ‘With Barco’s help, we have upgraded our monitoring process. In fact, when we laid out our requirements, Barco was the obvious choice in meeting our solution needs. The end result will be a superior visualization backdrop that places our control room among the finest in the industry, not just in China, but around the world.’

Barco China MD Frank Christiaens added, ‘When you walk through the control room it’s easy to take for granted the visual element of the center’s operation. But behind the scenes, there’s a huge amount of activity going on both technically and strategically. We worked closely with CNPC to deliver what is close to an ideal display solution.’


Pipeline Group teams with CeleritasWorks

New joint venture addresses pipeline public awareness, safety and API RP 1162 compliance.

The Pipeline Group (TPG), a specialist in pipeline public awareness and safety programs, has teamed with CeleritasWorks, provider of pipeline regulatory and compliance software, to help companies comply with the API’s Recommended Practice 1162 for public awareness and safety, now a mandatory requirement for Department of Transportation compliance.

Under the terms of the agreement, access to TPG’s database of compliance documentation for 35 states will be available on a version of Celeritas’ Public Awareness Manager software. Clients of both companies will have access to what is claimed as the largest database in the industry of emergency responder and excavator contacts and capabilities.

The companies have also teamed with Pier Systems to add a web-enabled, on-demand communications platform to enable operators to send, receive and manage critical information, ‘directly and transparently’ to stakeholders, including emergency responders, contractors, news media, residents and regulators.


IDC study on HPC shows US energy sector lead

Tier 1 US energy companies outpace others in use of high performance computing.

An IDC study* commissioned by DARPA, the DoE and the US Council on Competitiveness found that US ‘tier 1’ energy firms outpace other industries in integrating high performance computing into critical business functions. All tier 1 energy companies deploy HPC in R&D and design engineering and three quarters use HPC in ‘manufacturing.’ Tier 1 energy firms are ahead of their counterparts in life sciences, automotive and aerospace in large scale data management—an emerging application area with the potential for ‘enormous business payoff.’ Energy also bested the other sectors in applying technical computing to ‘help drive innovation.’ Energy companies reported under-use of HPC—suggesting that there may be ‘more innovative applications on the horizon than in other industries.’ Note that ‘HPC’ for the study was defined as ‘all servers used for technical computing tasks,’ covering machines with a price tag of $5,000 and up.

* Read the full study on www.oilit.com/links/0807_4, if only to see on what thin evidence it is based!


Cortex Business Solutions signs with mystery major

E-Business specialist to be ‘primary provider’ to undisclosed Canadian operator.

Cortex Business Solutions (formerly Electrobusiness) has entered into a memorandum of understanding with one of Canada’s top five integrated oil and gas companies to become its primary e-commerce solution provider. Under the terms of the MOU, Cortex will connect the major’s top 3,000 suppliers across North American upstream, midstream, downstream, oil sands and corporate business units. Implementation will begin immediately, with the majority of the initiative completed by year end.

Cortex VP Ryan Lailey said, ‘This contract validates the benefits of our approach of providing a low cost, efficient solution to solve an important business need. Over the next 12 months Cortex will experience significant growth in our customer base across North America and will provide the foundation to expand our service to additional companies of similar size.’ Cortex clients include ConocoPhillips and PetroCanada.


Moblize rolls out ‘DARP’ WITSML server for drilling data

Mehta—‘Current WITSML solutions slowing adoption by locking customers into premium solutions.’

Moblize has announced ‘DARP,’ an ‘enterprise class’ WITSML server for aggregating drilling data. WITSML is an XML-based standard for well information transfer from Energistics. Moblize principal Amit Mehta said, ‘WITSML promotes openness and interoperability. However, current WITSML solution providers want to lock clients into premium data management and visualization solutions. This attitude will slow WITSML’s adoption and value creation as a backbone for drilling solutions. Moblize provides clients with the flexibility to buy virtually any component of the WITSML solution from data to decisions.’

The DARP Central Server acts as a gateway to well data from multiple service companies. This central aggregator collects both real-time and historic well site data and stores it in Oracle, MySQL or SQL Server, providing a single source server for real-time and historic data. WITSML compliant viewers and applications can connect to DARP with controlled access for authorized users. DARP also provides a web-based query mechanism for specifying data formats.
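A WITSML query is itself a small XML document, passed as the QueryIn argument of the store’s WMLS_GetFromStore operation. A sketch of building one in Python follows; the uids and version string are illustrative, and DARP’s own query interface may differ.

```python
import xml.etree.ElementTree as ET

WITSML_NS = "http://www.witsml.org/schemas/131"  # WITSML 1.3.1.x namespace

def log_query(uid_well, uid_wellbore, uid_log=""):
    """Build a WITSML query document asking a store for log data.
    By WITSML convention an empty uid attribute or empty element body
    means 'return everything that matches.'"""
    ET.register_namespace("", WITSML_NS)
    logs = ET.Element(f"{{{WITSML_NS}}}logs", version="1.3.1.1")
    ET.SubElement(logs, f"{{{WITSML_NS}}}log", uidWell=uid_well,
                  uidWellbore=uid_wellbore, uid=uid_log)
    return ET.tostring(logs, encoding="unicode")
```

The resulting string would then be posted to the server’s SOAP endpoint; the reply (XMLout) is a populated document in the same schema.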


Gazprom signs Ventyx for market analytics software

Software and ‘simulation-ready’ database to support Russian gas major’s European expansion.

Ventyx, of Atlanta, GA, is to provide its market analytics software and services to Russian gas behemoth Gazprom’s Marketing and Trading unit in support of continuing European growth plans. The deal covers multiple licenses for Ventyx’ Market Analytics package—a ‘total solution’ for market modeling, transmission and generation analysis. Market Analytics also provides forecasts of energy prices, ancillary services and markets around the world.

A ‘simulation-ready’ database contains Ventyx’ 25-year European energy market outlook down to power station unit-level detail. Gazprom is to use the toolset to model the impact of key market drivers including deregulation and liberalization, energy policy, environmental legislation and capacity expansion plans across target markets for diversification. The Ventyx Energy Advisory services team will assist Gazprom in improving operational and financial performance.

Vitaly Vasiliev, CEO of Gazprom Marketing and Trading said, ‘We have an interest in the European power generation sector and this technology will assist us in our market analysis. Our goal is to trade in a variety of energy commodities and to get close to the downstream customer, in line with our aim of becoming a leading global energy player.’


Standards Stuff

Energistics—PPDM MoU, IHS provides well ids, UncertML announced, Chevron endorses PPDM 3.8.

Energistics (formerly POSC) and the Public Petroleum Data Model Association (PPDM) have signed a memorandum of understanding (MoU) and established reciprocal memberships to identify areas of cooperation. The MoU aims to ensure the organizations leverage each other’s strengths, ‘for the benefit of the upstream oil and natural gas industry as a whole.’ PPDM and Energistics will identify standards suitable for joint sponsorship or joint member participation. One such field is the development of connectors for ‘seamless’ data transfer between Energistics’ PRODML production data exchange standard and PPDM’s flagship data model.

~

It has been a long time coming (the project kicked off in 2004, Oil IT Journal March 2004), but IHS and Energistics have finally announced availability of the ‘first-ever’ standard for global unique well identifiers (GUWI). This provides a unique identifier for all known E&P wells outside North America. The GUWI replaces multiple naming conventions and spellings and should help companies find and match well data from multiple sources. The new service extends IHS’ existing identification services for its clients and will be ‘readily available to the industry as a whole.’ IHS VP Bob Stevenson explained, ‘The GUWI service is designed so that no oil company, service company or data vendor is precluded from obtaining a standard well identifier.’ The GUWI is made up of three components: a registration service for new wellbores, a matching service to tie existing wellbore records to the new standard and a proprietary master well index for IHS clients that contains metadata for IHS’ US, Canadian and international wells.
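To see why a matching service is needed, consider how many case, punctuation and spacing variants a single well name can have across data sources. A toy normalization scheme illustrates the idea; the rules and the identifier format below are invented, and IHS’ actual matching is far more sophisticated.

```python
import re

def well_key(name):
    """Collapse case, punctuation and spacing variants of a well name
    to a single matching key (slashes kept for block/quadrant names)."""
    s = re.sub(r"[^\w/ ]", " ", name.upper())  # punctuation to spaces
    return re.sub(r"\s+", " ", s).strip()      # collapse whitespace

def match_guwi(reported_name, index):
    """Look a reported well name up in a {key: identifier} index built
    from the registry; returns None when no record matches."""
    return index.get(well_key(reported_name))
```

Names that differ only in case, hyphenation or spacing then resolve to the same registry record, while genuinely unknown wells fall through to the registration service.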

~

Researchers at Aston University, UK, have come up with ‘UncertML,’ a new XML schema for exchanging uncertainty information. A paper* by Matthew Williams et al. describes a framework for interoperable communication of uncertainty in geospatial data and risk management. The project leverages OGC standards for spatial representation and sensor information and was funded by the European Commission under the Sixth Framework Program.

~

PPDM has announced V3.8 of its petroleum data model. The new version, the fruit of three years’ work by industry experts, enhances existing subject areas and adds new ones—reflecting a growing emphasis on data management. V3.8 adds nearly 500 tables to the model in equipment and facilities management, records management, HSE, well operations and classification. Arthur Boykiw (Petro-Canada and PPDM chairman) said, ‘This release and its adoption demonstrate PPDM’s leadership in providing an environment in which software, data and consulting service providers, governments and E&P organizations work together.’ PPDM reports a ‘rapid increase’ in the number of implementations of the PPDM data model, both as a master data store and as a component of master data management solutions. Chellie Hailes, global upstream information architect with Chevron, said, ‘PPDM membership can congratulate itself on the development of its most comprehensive data model to date. We are very pleased to make PPDM 3.8 the foundation of our upstream master data management.’

* www.oilit.com/links/0807_7


Chevron uses Iocom visualization for worldwide collaboration

Live link demos intercontinental connectivity with technology from Iocom and Impact Marcom.

A Chevron presentation in Aberdeen last month highlighted real time, onshore/offshore collaboration technology. The three way communications and visualization showcase offered a live link between Aberdeen, London and Stavanger using technology from Iocom and Impact Marcom.  Chevron has been using Iocom’s audio, video, and data communications technology since 2007 to keep onshore base operations in constant collaboration with field engineers, geophysicists and drill ships. Systems integrator Impact Marcom described how the audio-visual industry has evolved from simple video conferencing to ‘true collaboration capability.’

At the SPE Intelligent Energy event in Amsterdam earlier this year, BP also demonstrated its use of Iocom systems to connect its global command centers with its offshore rigs, including a live link between its command center in Azerbaijan and an offshore rig in the Caspian. Iocom operates over a variety of networks using scalable bandwidth to optimize connectivity to remote locations.


M2M Data takes iSCADA upstream

Hosted monitoring and alert service targets oil and gas producers. ‘iServices’ to optimize operations.

M2M Data Corp. of Denver, CO is to offer its iSCADA hosted suite of monitoring, control and alerting services to the oil and gas vertical, allowing producers to remotely monitor fluid flow, compressor performance, tank batteries, artificial lift devices and other related surface equipment. iSCADA provides remote tracking of individual wells and devices with customizable reporting and data export to a variety of applications and back office systems.

The new functionality allows producers to make real-time operational adjustments to minimize operating costs, reduce downtime, improve safety and increase production. Optional add-on services include preventive maintenance, machinery analysis and a hosted 24x7 manned control center.

Matt Begler, VP sales, said, ‘To extend our services to the upstream, we have hired Jim Bell as oil and gas sales engineer. Jim brings in-depth producer-based remote monitoring sales experience and is specifically tasked to support this market.’ M2M’s ‘iServices’ manage machine-to-machine communications for optimizing operations. The turnkey service includes communications with ‘low capital expense and service level guarantees.’


eSimOptimizer fine-tunes Enbridge’s gas processes

Simulation technology increases profitability at Texas processing plants.

US pipeline operator Enbridge Energy Partners has rolled out gas processing simulation technology from eSimulation Inc. at its Avinger and Longview, Texas gas processing facilities. eSimOptimizer, part of the eSimulation suite of predictive, model-based solutions, will be used for real-time process optimization and to maximize profitability. eSimOptimizer (eSO) calculates optimal process targets for a plant based on operational conditions, equipment capability, commodity pricing and contracts. Targets are then posted to a secure area on the eSimulation website for use by plant operations staff. A ‘value capture program’ provides weekly reporting and a forum for operators, engineers and managers.
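The ‘optimal process targets’ idea can be illustrated as a search over candidate operating points against an economic model. The sketch below uses invented coefficients and a deliberately simple margin function; eSimulation’s actual models are rigorous process simulations, not toy polynomials.

```python
def best_recovery_target(candidates, ngl_price, gas_price, fuel_cost):
    """Pick the ethane-recovery target maximizing plant margin under a
    toy economic model: deeper recovery shifts product from residue gas
    to NGL but burns more compression fuel. All coefficients invented."""
    def margin(r):
        ngl = 100.0 * r                # NGL recovered, bbl/h
        gas = 1000.0 - 20.0 * r        # residue gas after shrinkage, MMBtu/h
        fuel = 50.0 * r ** 2           # compression fuel burned, MMBtu/h
        return ngl * ngl_price + gas * gas_price - fuel * fuel_cost
    return max(candidates, key=margin)
```

The optimal target flips between deep and shallow recovery as the NGL/gas price spread moves, which is exactly the kind of pricing-driven re-targeting the text describes.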

Enbridge’s Charles Raiborn said, ‘Local management at our Henderson, Texas facility already use the recommendations from eSimOptimizer to manage plant profitability. The eSimulation team have structured their solutions to match our requirements and are eager to incorporate our ideas into their solutions.’

eSimulation president Mark Roop added, ‘Enbridge has made novel use of eSO and is our first client to display optimal plant targets alongside key process variables on the operations console. This integrated approach provides optimal targets to the operators in a timely fashion and in a single console window.’


El Paso protects pipeline network with ThreatScan

Solar-powered sonic intrusion detection system equips 11 mile section of Houston pipeline.

El Paso Corp. has deployed GE Oil and Gas’ ‘ThreatScan’ sensors on an 11-mile section of a natural gas pipeline in southeastern Houston to enable around the clock remote monitoring. ThreatScan’s solar powered acoustical sensors reduce the risk of third-party damage (often from construction machinery) to critical pipeline segments.

Jesus Soto, VP operations with El Paso said, ‘Third-party damage is the most significant integrity threat to both liquid and gas networks globally. Installing ThreatScan sensors on our Houston-area pipeline underscores our continued commitment to protecting the integrity of our assets.’

ThreatScan, developed by GE Oil and Gas’ PII Pipeline Solutions unit, transmits vibration data that may signal an impact via satellite to the ThreatScan call center in Houston. If appropriate, operators can be notified for further action and remediation. GE Oil and Gas operates a similar call center at its headquarters site in Florence, Italy to support European ThreatScan users.
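At its simplest, acoustic intrusion detection is a matter of flagging excursions above background vibration levels. A toy sketch follows; real systems such as ThreatScan must also classify events and reject nuisance noise such as traffic, which this deliberately ignores.

```python
def detect_impacts(samples, n_sigma=3.0):
    """Flag indices of samples deviating from the mean by more than
    n_sigma standard deviations, a crude stand-in for the event
    detection an acoustic monitoring system performs."""
    n = len(samples)
    mean = sum(samples) / n
    std = (sum((x - mean) ** 2 for x in samples) / n) ** 0.5 or 1.0
    return [i for i, x in enumerate(samples)
            if abs(x - mean) > n_sigma * std]
```

Quiet background noise produces no alerts, while a sharp excursion (a strike near the pipe) stands out immediately.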


Geospatial tools facilitate response to 2008 hurricane season

Erdas’ ‘Titan GeoHubs’ offered free to authorities for speedy emergency image sharing.

Erdas is offering free geospatial technology to assist US federal, state, and local government emergency management teams throughout the 2008 hurricane season. The offer centers on Erdas’ Titan GeoHubs (TGH) tool for publishing, indexing, searching, and retrieving geospatial information. TGH lets users share imagery and vector data in real-time and publish geospatial data in a collaborative online network. Deployed ahead of an emergency, TGH lets state agencies collaborate on planning; during an actual event, it provides situational awareness and management of emergency response efforts.

Eddie Pickle, Erdas’ content development director said, ‘When disaster strikes, agencies need a means of developing a common operational picture that can be shared among responders. By making it easy to distribute the most recent imagery and vector data in a wide range of formats, TGH provides imagery, flood data, fire perimeters, medical services, shelters and evacuation routes for headquarters, field offices and on site laptops.’

Erdas has also partnered with MCH GeoPoints to provide data on medical facilities and other key institutions such as schools and nursing homes. Such data is already supplied to the federal homeland security community and to state and local officials as ‘Places2protect’ and ‘People2notify’.
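The search-and-retrieve workflow a GeoHub-style catalog supports can be sketched as a bounding-box intersection test. This is a toy illustration with invented names, not Erdas’ implementation; production systems use proper spatial indexes rather than a linear scan.

```python
def bbox_search(catalog, query):
    """Return names of datasets whose bounding box (minx, miny, maxx, maxy)
    intersects the query box -- the core test behind a spatial catalog search.
    `catalog` maps dataset names to bounding boxes (hypothetical schema)."""
    qminx, qminy, qmaxx, qmaxy = query
    hits = []
    for name, (minx, miny, maxx, maxy) in catalog.items():
        # Two boxes intersect when neither lies entirely to one side of the other.
        if minx <= qmaxx and maxx >= qminx and miny <= qmaxy and maxy >= qminy:
            hits.append(name)
    return hits

# A responder searching the Houston area finds only the relevant imagery:
catalog = {
    "ike_flood_imagery": (-95.5, 29.0, -94.5, 30.0),
    "ca_fire_perimeters": (-122.0, 37.0, -121.0, 38.0),
}
print(bbox_search(catalog, (-95.0, 29.5, -94.0, 29.9)))
```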


Landmark’s DecisionSpace R5000 launches at Rome EAGE

New framework includes 70 products spanning the upstream. SDK offered to third party developers.

As reported in Oil IT Journal last April, Landmark has now officially rolled out the DecisionSpace ‘R5000’ release of some 70 products spanning the E&P lifecycle. CTO Chris Usher said, ‘R5000 will increase productivity and collaboration across the upstream. R5000 results from a multi-year integration effort by Landmark. In upgrading all applications to a common platform, we provide an enabling framework for the Halliburton ‘Digital Asset,’ allowing operators to model, measure and optimize their asset.’

Landmark’s ‘DecisionSpace’ framework supports workflows across in-house applications and other vendors’ offerings. New features include enhanced drag-and-drop functionality and common viewing for all applications. Knowledge capture capabilities provide audit trails of decision-making and preserve crucial information. ‘Customer-driven’ enhancements to project management address ‘rampant’ data duplication issues and enable faster project start up.

R5000 notably includes new functionality in Landmark’s PowerView (geoscience interpretation), GeoProbe (3D volume interpretation and visualization), OpenWells (drilling) and Nexus (development). A software development kit provides access to the R5000 applications and data stores. According to Landmark, ‘numerous’ third party developers are already leveraging the SDK to link their products to R5000 solutions and create novel workflows.


Fieldbus safety instrumentation demos at Shell Global Solutions

Interoperable equipment spec supports ‘proactive condition based monitoring.’

The Shell Global Solutions technology centre in Amsterdam was the venue for a recent Fieldbus Foundation Safety Instrumented Functions (SIF) demonstration with attendance from worldwide process automation end users and equipment suppliers. The SIF demo promoted adoption of Foundation-based safety instrumentation, best practices and interoperability test tools for control systems and instrumentation.

Fieldbus Foundation CEO Rich Timoney emphasized the use of proactive condition-based monitoring in emergency shut down (ESD) saying ‘Foundation SIF’s diagnostics improve safety functions and streamline testing. The technology offers new opportunities to optimize asset management and reduce operating costs.’ A safety shutdown demo rig incorporated technology from HIMA, Yokogawa, ABB, Siemens, Emerson and others. Parallel demos running at Saudi Aramco, Chevron, and BP featured systems and products from all of the major process automation suppliers.

Shell Global Solutions’ Audun Gjerde conducted the live SIF demo including high and low level trips, partial stroke valve tests, and a test that was interrupted by the ESD. Even in the middle of a partial stroke test the ESD could successfully take over and shut down the system. Two out of three voting was demonstrated using various Fieldbus SIF devices. Gjerde commented, ‘Shell expects SIF diagnostics and an integrated asset management system will result in early detection of dangerous device failures—and fewer wasted trips.’ Shell, Saudi Aramco, BP and Chevron presented at the event and attendees included Emerson, Invensys, Saudi Aramco, Siemens and Yokogawa.
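The two-out-of-three (2oo3) voting Gjerde demonstrated is a standard safety-instrumented-function pattern: the trip fires only when at least two of three redundant transmitters agree, so a single failed or drifting device neither causes a spurious shutdown nor masks a real one. A minimal sketch of the logic (illustrative only, not any vendor’s implementation):

```python
def vote_2oo3(readings, trip_limit):
    """Return True (trip) when at least two of three redundant sensor
    readings exceed the trip limit -- the classic 2oo3 voting scheme,
    which tolerates one failed or drifting transmitter in either direction."""
    votes = sum(1 for r in readings if r > trip_limit)
    return votes >= 2

# One spurious high reading from a faulty transmitter does not trip the plant:
print(vote_2oo3([98.0, 52.1, 51.8], trip_limit=90.0))
# Two independent high readings do:
print(vote_2oo3([98.0, 94.2, 51.8], trip_limit=90.0))
```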


Identec Solutions’ people on board system for Ekofisk

ConocoPhillips rolls out ‘WatcherSeries’ RFID-based personnel tracker following ‘intensive’ tests.

ConocoPhillips has installed a personnel tracking solution from Identec Solutions of Lustenau, Austria on Ekofisk, the earliest North Sea oil field—now in production for 35 years. Identec’s WatcherSeries (IWS) system survived ‘meticulous and intensive’ tests before roll-out. IWS provides employees with an active RFID tag. They are then tracked on a real time map of the facility. In the event of an emergency, real-time headcount and personnel locations support efficient and safe evacuation. Identec CEO Gerhard Schedler said, ‘The ConocoPhillips contract emphasizes our position as a leading provider of RFID solutions to the oil and gas industry. IWS sets the standard in offshore personnel tracking.’
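The emergency headcount such a system supports reduces to comparing each tag’s last-seen zone against the designated muster areas. A minimal sketch, with a hypothetical data model (Identec has not published the IWS schema):

```python
def muster_headcount(last_seen, muster_zones):
    """Split personnel into those whose RFID tag was last read in a muster
    zone and those still elsewhere on the facility. `last_seen` maps tag
    IDs to zone names; `muster_zones` is the set of safe assembly areas."""
    safe = sorted(t for t, z in last_seen.items() if z in muster_zones)
    missing = sorted(t for t, z in last_seen.items() if z not in muster_zones)
    return safe, missing

# During a drill, one worker has not yet reached a muster point:
last_seen = {"T001": "muster-A", "T002": "deck-3", "T003": "muster-B"}
print(muster_headcount(last_seen, {"muster-A", "muster-B"}))
```

A live system would update `last_seen` continuously from tag reads, so the missing list shrinks in real time as personnel reach the assembly points.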