July-August 2005


SMT thrashes Moore’s Law

SMT’s SURE reservoir flow simulation unit has achieved blistering performance using graphics cards for number crunching. The technology is set to outpace Moore’s Law by orders of magnitude.

There will be no need to wait on Moore’s Law to align CPU speeds with your reservoir simulation requirements if technology under development by Seismic Micro Technology’s SURE reservoir simulation unit comes good. The idea is simple: instead of using the computer’s CPU for number crunching, use the graphics processing unit (GPU) of a gaming card.

Smith

SMT president Tom Smith told Oil IT Journal, ‘You’ll wet your pants when you see the spectacular performance the GPU brings. A top-of-the-range computer CPU produces around 4 GigaFlops (billion floating point operations per second). But today’s GPUs can already run at around 30 GFlops. Performance is fast approaching the Teraflop—truly a supercomputer on a chip!’

Ganzer

SMT’s Leo Ganzer presented the technique at the Madrid EAGE last month, showing how reservoir simulation can now be performed in an interactive mode—as opposed to traditional batch processing. Interactivity is facilitated by making the simulator a part of the visualization application.

Moore beat

GPUs have evolved from fixed-function devices into programmable chips with high floating point operation counts and a competitive price/performance ratio. Current GPUs outperform CPUs on certain computations. But whereas ‘Moore’s Law’ is now slowing to a CPU speed doubling every 18 months, GPU speed is currently doubling every six months!
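
A back-of-the-envelope calculation (ours, not SMT’s) in Python shows how quickly those two doubling rates diverge, starting from the GFlops figures quoted above:

# Back-of-envelope comparison of the two doubling rates quoted above.
cpu_gflops, gpu_gflops = 4.0, 30.0        # starting points cited in the article
months = 36
cpu = cpu_gflops * 2 ** (months / 18)     # Moore's Law: doubling every 18 months
gpu = gpu_gflops * 2 ** (months / 6)      # GPU trend: doubling every 6 months
print(f"After {months} months: CPU ~{cpu:.0f} GFlops, GPU ~{gpu:.0f} GFlops")
# After 36 months: CPU ~16 GFlops, GPU ~1920 GFlops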

NVidia

The simulator was developed on dual NVidia GeForce cards using NVidia’s C for graphics (Cg) language. Grid block properties and transmissibilities are pre-computed and stored in GPU textures. The program reads all input from the grid using texturing operations, calculates the coefficients, solves the equations and updates the dynamic properties for the new time step. At the end of a render pass, textures are updated via a copy from the frame buffer to the texture. Between steps, interaction with the model is possible by changing well production/injection rates or the time step sizes during the simulation—using slider bars for input.
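
The Cg shader source was not published. The sketch below (our own illustration in plain Python and NumPy, not SMT’s code) mimics the render loop described above: a static array stands in for the texture of pre-computed transmissibilities, a second array for the dynamic pressure texture, and each call to time_step plays the part of one render pass, with the well rate adjustable between steps just like the slider bars. On the GPU the same read-update-write cycle runs as a fragment shader over all grid blocks in parallel, which is where the speed comes from.

import numpy as np

# Arrays stand in for the GPU textures: transmissibility is pre-computed and
# static, pressure is the dynamic property re-written after each 'render pass'.
nx, ny = 64, 64
trans = np.full((nx, ny), 0.1)        # pre-computed transmissibilities (static texture)
pressure = np.full((nx, ny), 300.0)   # dynamic property (updated texture), bar
well_rate = 5.0                       # adjustable between steps, like the slider bars
dt = 1.0

def time_step(p, t, rate, dt):
    """One 'render pass': read neighbouring grid blocks (the texturing step),
    apply the coefficients and return the updated dynamic property."""
    flux = np.zeros_like(p)
    flux[1:, :] += t[1:, :] * (p[:-1, :] - p[1:, :])    # flow across west faces
    flux[:-1, :] -= t[1:, :] * (p[:-1, :] - p[1:, :])   # ...conserved on the other side
    flux[:, 1:] += t[:, 1:] * (p[:, :-1] - p[:, 1:])    # flow across south faces
    flux[:, :-1] -= t[:, 1:] * (p[:, :-1] - p[:, 1:])
    flux[nx // 2, ny // 2] -= rate                      # a single producing well
    return p + dt * flux

for step in range(100):
    pressure = time_step(pressure, trans, well_rate, dt)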

Gpgpu.org

GPUs are highly parallel in nature and because the time between new generations of GPUs is currently much less than for CPUs, it is anticipated that the technology has enormous future potential. The technique leverages open source code for GPU processing from the General Purpose Computation Using Graphics Hardware Forum. More from www.gpgpu.org.


P2ES bags Qbyte

P2 Energy Solutions has acquired IBM Canada’s QByte division, expanding its software portfolio, client base and geographic reach.

P2 Energy Solutions (P2ES) has acquired IBM Canada’s Qbyte unit. Qbyte supplies management information systems to some 140 upstream companies in Canada. Qbyte’s portfolio includes Qbyte Financial Management (ex Petro-Lab), Land Management, Prism (production accounting) and Data Retriever. PetroLab provides a GIS front end to data in the other Qbyte systems, which are also available in a hosted application service provider (ASP) mode.

Gosbee

George Gosbee, president of P2ES parent company, Tristone Energy Services, said, ‘This acquisition advances our corporate goal of becoming the leading provider of upstream financial and technology services in North America.’ Other Tristone companies include Tristone Capital and The Oil & Gas Asset Clearinghouse.

Vickers

P2ES CEO Gary Vickers added, ‘Our strategy is to provide solutions that allow oil and gas companies to expand their view and control of assets, finances and operations. With the addition of Qbyte, we will be able to provide our customers with cross-border services that support the merger and acquisition process, including integration of properties, people and back-office systems.’


Data—to crunch or not to crunch?

Oil IT Journal editor Neil McNaughton welcomes a new crop of sponsors for the www.oilit.com website before reflecting on the never-ending growth of computer modeling in geology and the revival of the corporate data store. He also reports on a new book on ‘Data Crunching’ and extends an invite to all Oil IT Journal readers to visit us at the Paris AAPG International meet in September.

First of all, a big thanks to the new sponsors of the oilIT.com website. 2005-2006 sponsors are:

· Biznet Solutions

· Exprodat

· Foster Findlay Associates

· geoLOGIC

· Georex-AT

· HP

· Kelman Technologies

· Landmark Graphics

· OFS Portal

· Petris Technology

· Tigress

· TGS/Nopec—A2D

~

At the 2005 AAPG Conference and Exhibition in Calgary, I was struck by the ever-growing computerization of geology. You may find this surprising. I mean, this is Oil IT Journal, I am a geologist (of sorts) and I should know about these things. Well I do, but I am still surprised. Just as things seem to be settling down around a shared earth model paradigm—or around a single ‘platform’—the granularity of application software goes down by an order of magnitude. Let me explain…

Models, models everywhere

Recently, we have been treating the 3D earth model as the focus and ‘finality’ of the interpretation process. There is no doubt that the choice of a 3D modeling environment is a tough one—both technically and commercially. But my bird’s eye view of the technology on display at the AAPG—and of that behind many of the papers presented—makes me doubt that convergence on a model is either possible or desirable.

Granularity

The models that caught my eye at the AAPG were, at first glance, at the periphery of the interpretation process. There were niche models for fault plane seal analysis, for plate tectonic reconstruction, for structural balancing and palinspastic restoration, for geochemical modeling of basin evolution and reservoir fractionation. I could go on. But what seems to be happening is that these peripheral ‘niche’ applications are getting more and more polished, the science behind them is getting better and their use is on the up. The periphery is moving in on the mainstream.

Standard model returns

Each of these models of course has its own, possibly very exotic, data requirements. The geochemical characteristics of a source rock are unlikely to be found in the average E&P data store—although paradoxically, they may relate to measurements made in the refinery! How could we ever have thought that one standard model would fit all? Well we did once, and paradoxically, the corporate data model is back with a vengeance! But I’ll have to explore why that is so in another editorial. In the meantime, vive le paradoxe!

~

On the topic of data, another trend is for vendors to deprecate those ‘hard to maintain’ shell scripts and what have been described as ‘bubble gum and bailing wire’ solutions to data management. The subtext here is of course, ‘Throw away your shell scripts and buy our solution.’ But wait a minute, who says scripting is unmaintainable and who says a ‘solution’ is necessarily better? Not Greg Wilson, author of the excellent book ‘Data Crunching*’. Wilson clearly has a lot of hands-on experience of the ‘unglamorous’ activity of turning inconsistent ASCII-based data files with the odd typo into clean data in XML or maybe a database.

No unifying theory

Wilson offers no ‘grand unifying theory,’ just a lot of pretty up-to-date advice on using modern tools like Java, Python, Perl, Ruby and XML to process and condition data. There is good treatment of the XML Document Object Model, of XPath and XSLT—and how these can be put to good effect crunching and managing your data. Use of older tools, the Unix shell and SQL, is also presented, with some good advice on when to give up on XML and use a database. If your SA still lets you see a shell prompt, and you have not yet bought the ‘solution’ that does it all, this book is for you.
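
By way of illustration (our own minimal example, not one of Wilson’s), here is the kind of ‘crunch’ he describes, turning a scruffy ASCII listing into XML and querying it, in a few lines of Python using only the standard library:

import xml.etree.ElementTree as ET

# A typical 'unglamorous' input: whitespace-delimited ASCII with stray blank lines.
raw = """
WELL-1  Forties  2500
WELL-2  Brent    3100

WELL-3  Forties  1800
"""

root = ET.Element("wells")
for line in raw.splitlines():
    if not line.strip():
        continue                       # skip blank lines and other junk
    name, field, depth = line.split()
    ET.SubElement(root, "well", name=name, field=field, depth=depth)

# A simple XPath-style query: all wells in the (hypothetical) Forties project.
print([w.get("name") for w in root.findall(".//well[@field='Forties']")])
# ['WELL-1', 'WELL-3']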

~

Finally, if you are looking for something to do in September, you may like to consider the AAPG International Conference and Exhibition to be held in Paris from September 11th to 14th. I say this because for the first time Oil IT Journal will be exhibiting at a major trade show. Hope to see you there!

* Data Crunching: Solving Everyday Problems Using Java, Python and More. Greg Wilson, The Pragmatic Bookshelf, 2005. ISBN 0-9745140-7-1 (www.pragmaticprogrammer.com)


Letters to the Editor

Our short piece on the Recall OpenSpirit link in last month’s Oil IT Journal raised a few hackles...

A propos of our piece on Recall and OpenSpirit in last month’s Journal, Chris Hamley (Petris) writes—‘I don’t know where this misinformed stuff came from. You refer to a prototype Recall server that we wrote for OpenSpirit three years ago. This was successfully demonstrated to several clients. This did not use the OpenSpirit dev kit. Today, OpenSpirit is integrating Recall as a data source for OpenSpirit, using our development kit and software support services. Later this year, Recall data will be accessible from any OpenSpirit-enabled application.’

OpenSpirit CEO Dan Piette adds, ‘We are in the process of writing a Recall data server. Yes, it was probably complicated to write a data server in the past, which is why we did it ourselves. And yes we have purchased a dev kit to write a Recall data server. So soon Recall will be serving up data to any application that is OpenSpirit enabled.’

David Gorsuch (SIS) points out that the first version of Ocean was the 2003 ‘North Sea’ release. The ‘Atlantic’ release is due out at the end of 2005.


Interviews—Lange, Neri, Pferd and Smith

Our AAPG interviews cover Windows-based interpretation from Geographix and SMT, web services-based data management from Petris and Paradigm taking-on Petrel’s 3D earth model.

Martin Lange (VP Sales, GeoGraphix)—Halliburton is in the process of ‘re-igniting’ GeoGraphix, its PC-based interpretation product line, and is broadening the application spectrum from G&G to engineering. Most importantly, we are putting GeoGraphix Discovery directly on top of the OpenWorks (OW) database. Clients can use GeoGraphix, OW or even GeoProbe on the same data. In Q3 2005 clients will be able to throw a switch to ‘run on gas or diesel’ with no change in performance.

OITJ—Did you use the OW dev kit?

Lange—For some stuff, but we used proprietary code for enhanced performance. This is not available for dev kit users, giving us a competitive advantage.

OITJ—Do you link to the Engineering Data Model?

Lange—There are plans to link to EDM and to bring Aries and DSS into the GeoGraphix brand. These will remain stand-alone products but will integrate with GeoGraphix Discovery for workflows like production surveillance.

OITJ—How will you demarcate OW and Geographix?

Lange—GeoGraphix is moving into engineering, production and economics. The next step is to move to flow simulation, with a VIP ‘Lite’ due out towards the end of 2006. Landmark’s focus is around DecisionSpace with workflow management and risk tracking. Mainstream interpretation will increasingly be with GeoGraphix. We are taking mature Landmark tools and giving them new life. Already, Aries and DSS clients are often GeoGraphix users. The OW classics like SeisWorks will remain.

OITJ—Who uses Geographix?

Lange—Flagship clients include Chevron, ExxonMobil, ConocoPhillips and EnCana. We have around 750 corporate clients in 3,500 locations.

OITJ—What’s your 3D capability?

Lange—We no longer have a 3D offering and this is a big political hot potato. I’d like to see GeoGraphix ‘reignited’ with 3D.

~

Phil Neri (CTO, Paradigm)—Our new ‘SE’ Epos release introduces multi-survey, 3D and 2D seismic interpretation. This leverages a disk cache so that you can work with 60 or 80 GB of data on a Linux box. You don’t need an expensive 64 GB of RAM—which has proved a show-stopper for our competitors.

OITJ—You prefer Linux to Windows?

Neri—Seismic interpretation is all on 64 bit platforms, Linux, AMD, Solaris and AIX. Windows is the preserve of drilling and engineering although Geolog is now native Windows. Clients are driving us to standardize more on Windows—this will be complete by 2007.

OITJ—All of your clients?

Neri—No. The move to Windows is driven by a couple of key clients whose CIOs are desperate to ‘remove anything with an X’ from their deployed operating systems. We are also being driven by Petrel, which is shaping up as a strong contender. We still have misgivings as to Windows in high performance computing and visualization. Other clients say ‘no way’ to an all-Windows desktop and have no problem with the cost of keeping two OSs. In fact some majors see the high performance computing and visualization potential of Unix/Linux as a significant advantage. Others are less obsessed with operating systems and will deploy ‘whatever it takes to do the job’. Actually, only two major clients are fully behind the Windows movement, another is hesitating and the rest are not bothered. We have concerns with support of multiple Linux versions and Red Hat’s increasing costs and are planning a switch to Fedora. We can’t spend man months testing everything. Another good OS contender is Sun Solaris, the X release is a well packaged system. Linux has been an ambiguous success.

OITJ—How are you doing in 3D?

Neri—We are on a par with, if not ahead of GoCad in solid modeling. Paradigm users are always building the same model for re-use. VoxelGeo is now a specialized high end tool. Paradigm is no longer just a ‘point solution’ company. It’s not just VoxelGeo, not just Stratimagic or Geolog. Epos is the key to our integration strategy.

~

Jeff Pferd (CTO, Petris) —Petris has evolved over the last 8 years. Our early involvement in web technologies, especially with Anadarko, gave us insight into federating data repositories and data workflows. Anadarko’s business philosophy was to offer enterprise-wide data visibility and support best of breed applications. We started working with XML back in 1999. This set the scene for our loosely coupled, services-oriented architecture. Our research with Anadarko led to a process patent applied for in 2002. Petris was able to pursue this thanks to its privately-funded status. The product, PetrisWINDS Enterprise, is now stabilized and is being deployed in Saudi Aramco.

OITJ—What’s the data footprint?

Pferd—Initially well data, but we are working with Pemex to transfer interpretation and survey data and, soon, seismic trace data. We are also integrating with document management systems and are building a data cleanup solution. We use a business process modeling-like (BPML) language to articulate our workflows.

OITJ—Who are your partners?

Pferd—We deploy an ESRI map-based interface and we use INT’s Viewers, WebDataView from ZebraGeosciences and middleware from BEA. Our text search solution is Oracle.

Oil ITJ—How is Recall integration going?

Pferd—Great. Recall clients tell us they need the functionality that PetrisWINDS Enterprise (PWE) brings—GIS, spatial search, a metadata catalog and more connectivity. Our plans include PWE integration and support for the Recall API for other integration solutions. Another synergy is with PetrisWINDS NOW!, our ASP, or Software as a Service (SaaS), offering. We see this as a way to reach a new market—beyond Recall’s traditional client base.

~

Tom Smith (President, SMT)—We are engaged in a ‘deep’ re-development of our Kingdom Suite in Microsoft .NET. We are finding that .NET’s rapid application development needs a new discipline; it’s easy to pull in unanticipated dependencies. But we do see web services as our stepping stone to a multi-user environment.

OITJ—Where are you today with databases and connectivity?

Smith—We offer three database options: Access, SQL Server and Oracle. Our data model is an amalgam of POSC Epicentre and OpenWorks. We are also working on ‘Open Kingdom,’ which is a GIS-based data management solution and a new data server. We will also be opening up access to well data in Kingdom through a publicly available dev kit. We are also releasing a 64-bit Windows version ‘real soon now.’ In fact we are waiting on Mercury and FlexLM to port their components to 64-bit Windows.


Blue Marble’s off-the-shelf map server

OGIS and EPSG standards leveraged in new, low cost internet mapping package.

Blue Marble Geographics (BMG) has just released an off-the-shelf internet map server – the Blue Marble GeoMapServer (GMS). GMS is an Open GIS Consortium-compliant re-write of BMG’s BeyondGeo MapServe. GMS includes a web mapping server, a map creation and display tool, and a customizable web interface for on-line display of maps. GMS will also shortly take over as the engine behind the BeyondGeo IMS service currently hosted by Blue Marble.

Cunningham

“The great thing about GeoMapServer is its scalability,” stated Blue Marble’s President Patrick Cunningham. “If you’re looking to create cutting edge Internet mapping this is the product for you. Its open architecture will encourage users to develop new applications on top of the mapserver and we are all for that!”

EPSG

Blue Marble’s Kris Berglund told Oil IT Journal, ‘GMS embeds the European Petroleum Survey Group (EPSG) coordinate system database. Based on the popularity of our other products in the oil and gas industry and our low price point we expect to have some adopters soon.’

Affordable

Blue Marble is to offer GeoMapServer at ‘a fraction’ of the cost of ESRI’s IMS package. The standard license model also offers ‘affordable pricing’ for developers creating new products or services. Blue Marble claims 130,000 customers.


OpenSpirit adds generic PPDM server

The new 2.8 release of E&P middleware extends data footprint and adds data store connectors.

E&P middleware provider OpenSpirit has released a new version of its application and data integration solution. OpenSpirit V2.8 adds support for non-seismic grids and 2D seismic interpretation, more robust data moving and synchronization functionality via the CopySync utility, and enhanced GIS integration through the Scan utility and ArcGIS extension. New, user-requested data store connections are also included.

Piette

OpenSpirit president Dan Piette said, ‘Version 2.8 adds support for two new data types and four new data stores. We have more applications that we are supporting and more companies are using our product. OpenSpirit is the only platform-independent, standards-based and vendor-neutral integration solution for upstream data on the market.’

End users

End users can now read, write, update and delete non-seismic mapping grids in OpenSpirit-enabled applications and associated viewers. Enhanced GIS integration automates the creation of a spatial catalog of sub-surface data, including non-seismic horizon grids, which can then be viewed in GIS applications. Non-seismic grids and 2D interpretation are also accessible through the OpenSpirit Developer’s Kit, allowing upstream application developers to access these data types and speed the development of their applications.

Connectors

Five new data store connectors are currently under test for a generic PPDM database, GeoPlus Petra, ESRI SDE culture data, SMT’s Kingdom and Recall. In 2004, OpenSpirit reported over 120 oil company users and 1,000 licenses.


Spotfire, Geomark team on geochemistry

Webcast shows DecisionSite use in oil trading—leveraging ad hoc data mining capability.

Spotfire and GeoMark Research are teaming to demonstrate the use of Spotfire DecisionSite for interactive visual analysis of geochemical data. Geochemical applications span the upstream and downstream petroleum business. Spotfire is used to mine log data for overlooked pay zones and to optimize commercial strategy in the downstream.

Impurities

Petroleum purchasers discount oil prices depending on the impurities in the well sample. Purchasing strategies for a portfolio of samples require careful geochemical analysis. DecisionSite facilitates such analysis by connecting to multiple data sources—offering ad hoc data mining with minimal IT support. Analysts can deploy best practices as DecisionSite ‘guides’ to users, and can share and report results through DecisionSite’s Posters or by export to PowerPoint.


Statoil, ‘SBED boosts revenue by millions’

Following its cash injection, Statoil is promoting Geomodelling’s technology enthusiastically.

Following its $2.5 million investment in Geomodelling last month, Statoil is now ramping up the technology with the enthusiasm of a Texas wildcatter! Trygve Lægreid, advisor to Statoil’s Offtech Invest arm, said, ‘Statoil has been using SBED in Norway and internationally. The detailed models contribute to improved oil recovery and provide better estimates of how much oil and gas can be produced from new discoveries. Geomodelling software is used in both exploration and production, and estimates from Statoil’s own operations teams show that they can help to boost revenues by hundreds of millions of Kroner.’


Structural restoration from IFP and EDS

New ‘Kine3D’ package embeds French Petroleum Institute’s structural technology in GoCAD.

The French Petroleum Institute (IFP) is further diversifying its software partners with the announcement of an agreement with Earth Decision Sciences (EDS) for the development and marketing of Kine3D, a new GoCAD-based 3D structural restoration package.

Plug-ins

The application will be released as three GoCAD plug-ins – a toolkit for ‘coherent’ 3D geological interpretation, a 2D balanced cross-section generator and a 3D volume restoration package that offers a combined geometrical and geomechanical approach.

Dulac

EDS president Jean-Claude Dulac said, ‘By combining our 3D modeling capabilities with the IFP’s structural restoration expertise we have solved this technical challenge. The EDS development framework and our plug-in architecture provides clients or other software providers with an integrated framework for their proprietary modules.’

Friès

IFP deputy manager Gérard Friès added, ‘The new alliance combines EDS’ global network with our R&D capabilities to bring geomodeling and basin modeling to a wider audience. The Kine3D alliance has sponsorship from several oil and gas majors.’


W3C releases simple knowledge standard

SKOS, a ‘simple knowledge organization system,’ promises shareable, machine-readable taxonomies.

Many companies and organizations are rolling out taxonomies to capture name lists of business objects as diverse as well logs and pumps. But once you have your list, how do you deploy it in a way that will integrate with other folks’ lists? How do you standardize taxonomy publishing? An answer may be the new, simple knowledge organization system ‘SKOS’ emanating from the World Wide Web Consortium’s (W3C) Semantic Web work group.

SKOS

SKOS is a simple language for expressing the structure and content of thesauri, classification schemes, taxonomies and other controlled vocabularies in a machine-understandable form. SKOS uses the W3C’s Resource Description Framework (RDF), which facilitates linking and merging lists with other lists or data, so that distributed sources of data can be ‘meaningfully composed and integrated’.
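
To give a flavor of what this looks like in practice, here is a minimal, hypothetical two-concept well log vocabulary built with the open source Python rdflib library (our own sketch, not W3C sample code):

from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF

SKOS = Namespace("http://www.w3.org/2004/02/skos/core#")
EX = Namespace("http://example.org/taxonomy/")     # hypothetical namespace

g = Graph()
g.bind("skos", SKOS)
g.bind("ex", EX)

# Two concepts, with preferred labels and a broader/narrower relationship.
g.add((EX.WellLog, RDF.type, SKOS.Concept))
g.add((EX.WellLog, SKOS.prefLabel, Literal("Well log", lang="en")))
g.add((EX.GammaRay, RDF.type, SKOS.Concept))
g.add((EX.GammaRay, SKOS.prefLabel, Literal("Gamma ray log", lang="en")))
g.add((EX.GammaRay, SKOS.broader, EX.WellLog))

print(g.serialize(format="turtle"))    # RDF that other folks' tools can read and merge

Because the output is plain RDF, a second organization’s vocabulary can be loaded into the same graph and queried alongside this one, which is the kind of ‘meaningful composition’ the W3C has in mind.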

Quick start

RDF has had a poor reception in the IT community which has preferred XML-based lists over the semantic technologies. The workgroup is trying to counter this with a ‘quick guide’ to publishing a thesaurus on the semantic web which describes how to express the content and structure of a thesaurus, and metadata about a thesaurus in RDF. More from www.w3.org/2004/02/skos.


Paradigm’s SolidGeo 3D models ‘as quick as maps’

‘Breakthrough technology’ moves structural modeling from expert to average Joe geologist!

Paradigm is claiming a breakthrough for new technology that helps automate the structural modeling process. Model building, once the preserve of the expert, is now accessible to the interpreter, thanks to a new release of Paradigm’s SolidGeo 3D modeling solution.

Complex

Paradigm has spent over a decade developing, using and mastering the complex modeling technology. Extensive experience in a wide variety of geological regimes has been gained from the use of SolidGeo technology in all of Paradigm’s velocity model building and time-to-depth conversion solutions. SolidGeo now enables interpreters to create 3D models as quickly as they make maps.

Neri

Paradigm VP Phil Neri said, ‘The ability to present a comprehensive, 3D model of the interpreted data greatly enhances the geological accuracy of the interpretation. It facilitates a unified and geometrically-consistent approach for the integration of geological and geophysical data, and helps reduce the risk of missing a drilling target or a dry hole.’

Epos

Paradigm’s technology leverages its Epos 3 SE infrastructure which enables integrated line, volume and spatial interpretation, mapping and solid model building in a single 3D window. Epos allows interpreters to create a solid model concurrently with interpretation, and refine it as their insight progresses. More on Paradigm’s solutions in our Interviews on page 3 of this issue.


Mercury’s Open Inventor cluster edition

‘Transparent scalability’ promised for large 3D seismic and voxel modeling applications.

Mercury Computer Systems (formerly TGS) has just released a version of its visualization programming toolkit, Open Inventor, tuned for computer clusters. Open Inventor Cluster Edition brings ‘transparent scalability’ to visualization applications like 3D seismics and large voxel models.

Linux

The Cluster Edition is available on 64-bit Linux systems based on either AMD’s Opteron or Intel’s Xeon EM64T. 32 and 64-bit Windows versions will be available later this year. Open Inventor is embedded in systems from Foster Findlay Associates, Landmark, Schlumberger, SMT, Jason, Paradigm and Roxar. Oil company developers include Total and BGP (China).


AAPG 2005 Calgary—post-boom record falls

With 7,748 in attendance, this was the largest AAPG since the 1980s boom (the all-time record is 13,000). In a caricature of industry demographics, the average age at the Awards Ceremony might have registered on the Phanerozoic time scale! A more youthful (and feminine) audience was visible at the environmental/Kyoto session. On the exhibition floor, we noted a plethora of applications for modeling anything—from plate tectonics to sand juxtaposition across fault planes (see this month’s editorial). The same models cropped up in several of the talks we attended, where increasingly, the ‘demonstration’ paradigm of the learned journal is usurped by a presentation of the results of a computer model. Some modeling techniques, such as deriving a stereo net of fault distributions, really turn on the geologists. As do seismically-derived palinspastic models—especially when spiced up with some migration pathway modeling. A significant driver in the current software push comes from BP’s geological toolkit run-off and ‘co-visualization’ research effort. We were impressed by ExxonMobil’s ‘RETR’, the interpretational equivalent of ‘extreme programming’. Schlumberger’s new display was a model of roominess and frequentation. Roxar was conspicuous by its absence. And Landmark was somewhat obscured by the smoke coming from a ‘re-ignited’ GeoGraphix.

Sidney Powers awardee, Ken Glennie (ex-Shell), author of the definitive Petroleum Geology of the North Sea gave an entertaining account of early work with Shell, investigating an apparent association between oil in Oman and nearby outcrops of oceanic basalts – which turned out to be tectonic serendipity. Declining a posting back to the UK in 1972, Glennie was banned from promotion for life, reflecting the ‘autocratic times’ of the day [Today, you’d just get fired!]. Glennie went on to become an educator, within Shell and outside—with publications on the petroleum geology of NW Europe, the desert of SW Arabia and most recently of the southern North Sea’s Permian basin.

Reserves Session

Mary Van Der Loop (Ammonite Resources) quoted Shell’s Walter van de Vijver as being ‘sick and tired of lying about the extent of our reserves issues’ during the write-down debacle. Because reserve growth is one of the best indicators of market returns, it is an important KPI for the ‘mom and pop’ investor. Van der Loop therefore tracks companies’ proven undeveloped (PUD) reserves as obtained from SEC 10K postings. These ‘cut through the glowing discourse of the Annual Report’. Shell’s PUD to Proven Developed ratio climbed to 56% in 2002 before dropping to 48% on revision, bringing it into line with Exxon. Elsewhere Van Der Loop notes ‘gyrations’ in PUD around a general upward trend. So companies are either finding great stuff ‘or stretching the truth.’ Oil price rises move the ‘dogs’ into reserves. A large fraction of PUD increase is due to pressure from Wall Street. But the real big question is, ‘Is there someone in Saudi Arabia saying “I’m sick and tired of lying…”.’ Dan Tearpock (SCA) heads up the joint AAPG/SPEE investigation into the ‘reserves question’ and a possible certification program. A decision is to be made later this year. Whatever that is, it will be the fruit of a stupendous number of committees. Five top level committees will investigate reserve definitions, ethics, qualifications etc. Below them nested sub committees will study recommended practices, geoscience and modeling issues etc. With worldwide reserves at around $600 trillion, the key question for Tearpock is, ‘How much of this is real?’

Schlumberger

Schlumberger’s talk on the future of E&P IT made some interesting claims. Like the imminent ‘irrelevance’ of Moore’s Law. Seemingly we will soon have all the CPU bandwidth we want. In the 1990s seismic processors were all waiting on the CPU. Today only 20% of the wait is CPU-dependent, with more time spent on QC, visualization, testing new processing paradigms and on interpretation. A ‘GUI rebellion’ is underway—against the way the computer makes us work. Soon we will use pens and vocal control of ‘learnable’ software. SIS is working with the MIT Media Lab on ‘tangible computing,’ using the whole body as input—a touted remedy for carpal tunnel syndrome. Other futuristic pronouncements seemed little more than optimistic extrapolations from the situation today – thus integration will be a given, as will ‘pervasive’ optimization and ‘ubiquitous’ geomechanics etc. All enabled by SIS’ data and consulting unit – its fastest-growing business sector. Other innovations include digital structural analogs (from the University of Alberta), artificial intelligence on real-time data, decision support, ‘social network analysis’ and ‘beyond databases’ and into ‘workflow repositories,’ part of Shell’s brave new smartfields world.

Anadarko

Henry Posamentier (Anadarko) presented a fireworks display of seismic geomorphology leveraging VoxelGeo, StratiMagic and image processing with ER Mapper. Seismic geomorphology has undergone a revolution – moving from the 2D internal reflection mapping of the past to 3D seismic volume interpretation. By changing cube transparency, fluviatile patterns or carbonate patch reefs are revealed. A video of the Alaskan shelf showed stupendous imagery of sedimentary structures like crevasse splays, revealed by changing vertical relief and illumination angle. Such techniques can also identify ‘FLTs’ (funny looking things—a ‘technical term’). Quantitative seismic geomorphology leverages geoprocessing with ESRI tools to study channel sinuosity and center line mapping for automated channel counts.

Geographix

As a part of its ‘re-ignition,’ GeoGraphix demonstrated its new native OpenWorks connectivity with data from BP’s Wytch Farm field. GeoGraphix accessed interpretation data from SeisWorks and PetroWorks leveraging its map-centric approach. A right click on the map highlights wells with tops/production data. Production data and decline curves are easily obtained, as is rather more tabular data than most folks would want to see in a demo! Thematic mapping can be contextualized to user preferences and the cross section display (recently acquired from A2D) is indeed pretty nifty. Data management-wise, the association of GeoGraphix and OpenWorks could be a compelling solution for some clients.

Apache

Alan Clare (Apache), speaking on the Schlumberger booth, described a 140 million cell model of the N. Sea Forties field, created in Petrel. For simulation, this was ‘dumbed down’ to a 3 million cell model for use in the 3D StreamLine simulator. Layer-conformable scaling and grid refinement preserved as much heterogeneity as possible in the unswept periphery of the field, where seismic data quality is poor. Production data was used to refine the correlation. Modeling required the introduction of a permeability barrier at the base of the channel system, which had not been detected in the logs. Results were encouraging, with STOIIP up from 4.2 to 5 billion bbl in place, but all the remaining oil lies in a very poor quality reservoir. A history match run takes 28 hours and total CPU time for the study was around 1,700 hours—producing a well-by-well match to production within 1%. A streamline time-of-flight video showed unswept areas behind shale barriers. The big question for Apache now is whether to shoot another 4D survey.

ExxonMobil

Lester Landis presented the E&P equivalent of ‘Extreme Programming,’ Reservoir Evaluation Time Reduction (RETR). This has produced a ‘step change’ in cycle time reduction for reservoir evaluation ‘without compromising interpretation quality’. Key enablers are Schlumberger’s Petrel and Exxon’s EPSIM and Power simulators. The idea is simple—build a simple model quickly and then refine it. EPSIM builds a single common scale property model, Power does the fluid flow simulation. The initial ‘low frequency’ model, built in the ‘discovery’ pass is expected to be modified—so the model is designed to accommodate subsequent change. The later ‘flow unit’ pass rolls in sub-seismic features like shale barriers. Landis advocates using computer power to move faster through the workflow, rather than to build bigger models. He also advocates using a common scale model, rather than upscaling from a fine scale model to a flow simulator.

Renewable Energy Seminar

Talking about Kyoto in the oil and gas business is a risky proposition. It was therefore with some circumspection that Steve Ball (Conestoga-Rover) presented a ‘Kyoto 101’. Since Kyoto, and despite US reticence, many oil companies have adopted renewable carbon credits and are investing in wind farms and other diversifications. In Alberta, energy deregulation is making wind farms an economic proposition. For Ball, the consensus is that ‘global warming is happening.’ Greenhouse gases are rated in CO2 equivalents—so that methane is 21 CO2e and SF6 (used in transformers) a whopping 23,900 CO2e! The main body administering carbon trading is the UN Framework Convention on Climate Change (UNFCCC). Kyoto goals are to reduce emissions to below 1990 levels with the onus on developed countries. Results are mixed: Canada is up 20% on 1990 so far and the UK is 15% down. Mechanisms are being introduced to trade carbon credits with a global carbon credit stock market. Other projects aim at better carbon management—landfill gas collection and use, CO2 injection and sequestration, reforestation and, of less interest to geologists, manure management.
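
To see what those CO2-equivalent ratings mean in practice, here is a trivial worked example in Python (the emission tonnages are hypothetical, the global warming potentials are those quoted above):

# CO2-equivalent total for a hypothetical facility, using the GWPs quoted above.
gwp = {"CO2": 1, "CH4": 21, "SF6": 23_900}                  # tonnes CO2e per tonne of gas
emissions_tonnes = {"CO2": 1_000.0, "CH4": 50.0, "SF6": 0.1}
co2e = sum(gwp[gas] * t for gas, t in emissions_tonnes.items())
print(f"{co2e:,.0f} tonnes CO2e")     # 1,000 + 1,050 + 2,390 = 4,440 tonnes CO2e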


This article has been taken from a longer, illustrated report produced as part of The Data Room’s Technology Watch service. Subscription info from tw@oilit.com or visit www.oilit.com and follow the Technology Watch link.


Interviews—Art Paradis, Bjorn Wygrala

Dynamic Graphics’ EarthVision to enhance well planning. IES has 70% of 3D basin modeling market.

Art Paradis (CEO, Dynamic Graphics)—We have been working with Baker Hughes on ‘next generation’ well planning software. This will allow better handling of faults and geomechanics during well planning, letting drillers control the angle of attack with respect to potential hazards such as faults.

OITJ—Whose geomechanics do you use?

Paradis—Baker Hughes is supplying this technology, which will also add torque and drag calculations to our mainstream product.

OITJ—What about real time?

Paradis—EarthVision can import real-time data. The current version uses an internal Baker Hughes format, but we are working on a WITSML-based protocol. We have to; BP is our biggest client!

OITJ—What are your selling points?

Paradis—Structural accuracy, complex faults ‘as nature intended.’ The thorny problem of multiple z-values is handled gracefully and simply. We also have great connectivity with reservoir simulators. We were a founding member of the RESCUE initiative, which has been a great success, providing rich contacts with other vendors.

~

Bjorn Wygrala (CEO, IES)—IES has doubled its revenue in three years, adding clients like ENI/AGIP, Petronas, Sinopec and Woodside. We now have around 70% of the 3D market (Beicip-Franlab is the only challenger). Basin modeling is now part of the workflow because source rocks are the main exploration risk. We now link to structural modeling through our association with Midland Valley. This is driving sales of our 2D package, which works in thrust belts.

OITJ—How do you sell ‘risk’?

Wygrala—Our PetroRisk add-on is so technologically advanced that we have to educate our clients—asking if they can really trust their models. PetroRisk uses computer clusters to run multiple 1D/2D or 3D analyses many times—to check if the initial results were a fluke. Tornado plots show which uncertainties dominate the analysis. Chevron is a ‘power user’ of our tools—along with its in-house software. PetroRisk is due for release in a year.


Folks, facts, orgs etc…

Abu Dhabi takes up XBRL, changes at the top of Baker Hughes, Aspen Technology, Kelman, Seitel and Aker Kvaerner. News from NetApp, CERA, ADMA, Neuralog, Scotiabank, GITA and WoodMac.

The Abu Dhabi Stock Exchange is to adopt the XBRL standard for coding items of financial information. XBRL is a standards-based financial reporting protocol - see www.xbrl.org.

~

Baker Hughes Inteq president Ray Ballantyne is to retire after 30 years of service. Baker Atlas President Martin Craighead is to replace him. David Barr is to take on Craighead’s role on an interim basis.

~

Aspen Technology has appointed Frederic Hammond as senior VP and General Counsel. Hammond was previously with digital video specialists Avid Technology.

~

A report from Cambridge Energy Research Associates (CERA) has determined that oil production capacity is set to increase dramatically over the rest of this decade. Supply could exceed demand by as much as 6 to 7.5 million barrels per day later in the decade.

~

Rene VandenBrand is now president and CEO of Kelman Technologies. The company also reports that it has raised $1 million in private financing from four of its officers.

~

The Geospatial Information & Technology Association (GITA) has been awarded a $700,000 grant from the U.S. Department of Labor to conduct a study of workforce readiness for jobs in the geospatial sector.

~

Network Appliance has unveiled two midrange storage systems, the FAS3020 and FAS3050. The FAS30xx systems are said to be suited for a variety of energy industry demands.

~

Neuralog has announced NeuraWellTool (NWT), a well log access, correlation and log markup application. NWT interfaces with online and local data sources, as well as with the NeuraScanner log scanner and the NeuraLaser printer. The company also reports major sales to Pemex and PDVSA.

~

The Benchmarking Network is to initiate new studies of the accounts payable processes and IT procurement. More from www.ebenchmarking.com.

~

Ade Audifferen is to head-up GE Energy’s Oil and Gas service center in Port Harcourt, Nigeria.

~

POSC has updated its units of measure dictionary and quantity class conversion data. The information is available on www.posc.org/refs/poscUnits20.xml and www.posc.org/refs/poscUnitsClasses.xml.

~

Aker Kvaerner is restructuring its Oil, Gas, Process and Energy (OGPE) business. The EPC capital projects and technology solutions business for oil and gas clients is now headed by Les Guthrie.

~

Abu Dhabi National Oil Company unit ADMA has upped the capacity of its Linux geophysical processing cluster to 120 Itanium 2 nodes.

~

Scotiabank has acquired oil and gas financial advisors Waterous & Co. with the intent of creating a one-stop oil and gas mergers and acquisitions shop.

~

Seitel has appointed William Restrepo as executive VP, CFO and company secretary. Restrepo hails from Schlumberger where he was CFO for the Americas. Marcia Kendrick continues as senior VP, treasurer and chief accounting officer.

~

A new report from Wood Mackenzie finds that ‘life is getting harder for explorers who are hit by rising finding costs and the falling value of discoveries.’


Time, Malibu merge to Decision Dynamics

Time Industrial takes over Malibu Engineering to create a new software house for the energy vertical.

Malibu Engineering and Software has been taken over by Edmonton-based Time Industrial. The companies are now to merge into a new unit called Decision Dynamics. Malibu, a privately held, Calgary-based enterprise software company, provides workflow automation applications and services – notably the Wellcore well data capture and management solution.

Chrapko

Time Industrial founder and Executive Director Evan Chrapko told Oil IT Journal, ‘Decision Dynamics is a public shell being used to effect a transaction between Time Industrial and Malibu. Decision Dynamics will issue stock to all shareholders of T.I in exchange for all their shares of T.I. Decision Dynamics will issue stock and some cash (approx C$2.8 million) to all shareholders of Malibu in exchange for all their shares of Malibu. Former T.I. shareholders will comprise the biggest group of shareholders in the resulting entity. In other words we will have performed a reverse takeover’.

Contractor management

Time provides utility and oil and gas refinery companies with services and software systems to aid in the management of outside contractors during outages, turnarounds, shutdowns, capital construction and maintenance projects. Time’s services and software are designed to address what can be a complex commercial relationship between utilities and oil and gas companies and their contractors, notably issues relating to timing, scale and cost.

BP/Chevron

Time currently serves BP/Chevron joint venture Central Alberta Midstream (CAM). Time’s services increase cost visibility and progress control on turnarounds done at CAM’s gas plants. CAM employs dozens of contractors and runs its operations around the clock.

Hubick

CAM VP finance Kim Hubick said, ‘We looked at other approaches to the problem of accessing timely, accurate cost control information but found nothing else as fast, user-friendly, and valuable as the Time Industrial service. We are saving on admin costs—but the far bigger savings and value-creation come from contractor invoice control and project decision-making.’ The new company will be headquartered in Calgary and its stock will be traded on the Toronto Stock Exchange/Venture sub-exchange under the ticker ‘DDY’.


US NETL recaps 2004 successes

Government money for Rock Solid Images, Maurer Engineering and Advanced Resources.

The US Office of Fossil Energy’s National Energy Technology Laboratory has just published its annual ‘Accomplishments Report’ for 2004 outlining some of the organization’s ‘advances’ in fossil energy related research and technology transfer. Several upstream software houses and Universities benefited from the US Government’s largesse.

Rock Solid Images

Houston-based Rock Solid Images has developed a ‘greatly improved’ method of predicting and quantifying oil and gas saturation distributions by ‘interpreting seismic attributes such as velocity and impedance in terms of inelastic rock properties.’ Two U.S. patent applications have been filed covering various aspects of the technology. The California Institute of Technology, Cornell University and GeoGroup have published a geochemical model to map source and migration pathways of hydrocarbons within a geologic basin. The model also tracks chemical changes that can occur during migration, such as phase separation and gas washing.

University of Texas

The University of Texas at Austin has improved the efficiency of seismic inversion algorithms for the estimation of petrophysical parameters, leveraging Bayesian Stochastic Inversion.

Maurer Engineering

Maurer Engineering’s electronic tally sheet program for Pocket PCs was upgraded and improved with NETL’s help. The New Mexico Bureau of Geology and Mineral Resources has released a new GIS-based digital oil-play portfolio of the Permian Basin of West Texas.

Advanced Resources

Advanced Resources International, with help from Chevron, has successfully used AI-based cluster analysis to interpret porosity and permeability trends in a low-cost decision-making tool.

Argonne

An interactive Drilling Waste Management Information System has been developed for the DOE by Argonne Labs, Chevron and Marathon.

Guthrie

Finally, NETL reports that NETL manager Hugh Guthrie (85) was named ‘Outstanding Older Worker for the State of West Virginia’ by Experience Works, a provider of ‘older-worker training and employment’. Good on ya Hugh!


Halliburton’s ‘Centrino’ wireless frac job

Joint presentation with Intel—engineering breakthrough or product placement?

A joint Halliburton and Intel presentation at the Calgary AAPG described a wireless fracture job carried out using Intel’s Centrino-based communications devices. The idea is to replace the conventional distributed control systems (DCS) with a WiFi 802.11 network. A video showed a bunch of red trucks whose heavy duty cables were apparently replaced with a WiFi ‘hotspot’.

Product placement

Why the consumer-oriented WiFi technology was used and why Intel technology was deployed was not explained. In fact the whole presentation had more than a whiff of product placement. Speakers seemed to be under instructions to weave ‘Intel’ into every talk. Halliburton Canada’s Darcy Cuthill drew the short straw and had to ‘place’ Intel into a presentation on a horizontally-drilled ‘u-bend’ of a well. Cuthill concluded her talk with a dutiful, ‘Behind the scenes all this is enabled by the digital highway. Intel is certainly one of the most important things we can say today about the kind of technology that made this possible.’


Biodiversity database available from IHS

IHS Energy gets exclusive rights to market world conservation biodiversity database to oil sector.

The UN Environment Program World Conservation Monitoring Centre (UNEP-WCMC) and IHS Energy are to launch a Global Biodiversity database. The new ‘information module’ delivers desktop access to data on protected areas and other sensitive terrestrial and marine habitats, including World Wildlife Fund ‘Ecoregions.’ Biodiversity data will be delivered ‘in context’ via IHS Energy’s datasets and map-based analysis tools.

Mobed

IHS Energy president and COO Ron Mobed said, ‘Widespread access to this integrated information fills a gap between energy companies’ biodiversity goals and the decision-support data required for compliance. Combining biodiversity and E&P data in a map-view creates a powerful, early warning resource for our customers, helping them understand the footprint of planned projects and how they might impact sensitive ecosystems.’

Johnson

UNEP-WCMC deputy director Tim Johnson said, ‘We seek to make biodiversity data available to the users that need it most. Combining the biodiversity module with IHS Energy’s applications will help oil and gas operators make informed decisions on biodiversity and managing growing energy demands.’ The agreement gives IHS Energy exclusive rights to market and license the UNEP Module to the oil and gas industry. ‘Significant’ revenues will be returned to UNEP-WCMC to update and enhance the biodiversity datasets.


Kongsberg gets BP Plutonio OTS contract

Kongsberg’s ASSETT and Scandpower’s OLGA 2000 form heart of operator training solution.

BP has awarded Norway-based Kongsberg Maritime (KM) a contract for the operator training simulator (OTS) and associated services for its Angola Block 18 Greater Plutonio development. The OTS, which is slotted for delivery in April 2006, is based on a customized version of Kongsberg’s ASSETT dynamic process model. ASSETT will be linked to a controller emulator from Compressor Control Corporation, a DeltaV SimulatePro control system from Emerson Process Management, and an OLGA 2000 multiphase pipeline simulator from Scandpower Petroleum Technology.

Training

Along with its operator training role, the OTS will provide assurance of staff competence in situations including normal operations, start-up and shutdown, emergencies and critical situations, process trips and other ‘upsets’.

FPSO

Greater Plutonio, located in 1,300 m of water, consists of an FPSO vessel with all production, gas and water injection wells subsea. Topside facilities comprise a three-stage gas-oil separation plant with desalting, sized to produce 220,000 bbl/d.

Sivertsen

KM’s Kurt Roger Sivertsen said, ‘We are pleased that BP has selected the ASSETT dynamic process simulation technology for this project. Our simulators meet the highest requirements of real-time process simulation systems. This contract award is also KM’s first operator training simulator project for BP and our first project in Angola.’


M2M Corp FlowAlert remote monitoring

New Application Service Provision (ASP) flow meter service targets small natural gas producers.

M2M Data Corp. has just announced two new satellite-based monitoring and control services for remote assets. FlowAlert, part of M2M’s new AlertServices product line, is a service offering that monitors operations of natural gas production wells. The basic service provides monitoring of flow status, with options for data reads of local electronic flow meters, additional input/output, and remote control.

Wallace

M2M Data CEO Don Wallace said, ‘FlowAlert targets small-sized natural gas production operations and includes 24x7 remote system monitoring and customer support. As with our other AlertServices products, FlowAlert is low power and simple to install.’

DataAlert

The second new offering from M2M, DataAlert extends M2M’s monitoring service beyond simple run status to include up to ten measured variables from either direct input/output or serial connection to existing control panels. Support for demand poll lets users request status and data information as required. M2M technology is a ‘cost-effective alternative’ to proprietary SCADA and telemetry systems traditionally used by energy producers to manage remote assets.


ROMeo deployed in Kazakh plant PDMS

SimSci-Esscor’s modeling and simulation package now offers consistency across plant lifecycle.

Invensys unit SimSci-Esscor has just released a new version of its ROMeo plant modeling and optimisation package. ROMeo 4.0 promises greater ease-of-use and productivity and a common environment across off-line and on-line simulation applications. ROMeo is deployed in refineries, petrochemical plants, and other process facilities.

SIM4ME

ROMeo is now integrated within SimSci-Esscor’s SIM4ME modeling environment providing a consistent user interface, common data model, consistent thermodynamics and common software modules across all simulation applications. All of which increases productivity while minimizing operator training needs.

Whittaker

SimSci-Esscor EMEA Sales Manager David Whittaker told Oil IT Journal, ‘Oil refining and upstream oil and gas are our main areas of focus and most of the major oil companies are users of our software. ROMeo is currently being implemented, along with some of our other software, in a Production Data Management System (PDMS) for a large oil and gas production, gathering and processing facility in Kazakhstan and for a production planning and optimization tool in Norway. ROMeo provides data reconciliation and rigorous process modeling functions and also has the capability for on-line (closed-loop) or loop advisory (open-loop) optimization.’

FPSO

‘Our SIM4ME technology is the core of our Operator Training Simulator (OTS) offering and is used on a number of upstream applications, notably for several FPSOs. SIM4ME incorporates our first-principles dynamic process simulator, DynSim, along with the architecture to enable various calculation engines, including those from third parties, to be combined into a single modeling system,’ Whittaker concluded.

Patented

ROMeo’s patented technology provides users with a ‘focused strategy’ of data re-use across the full range of simulation applications, from steady-state process design to dynamic simulation and optimization. A PRO/II process model can be run from within the ROMeo application, allowing users to take existing off-line models on-line for performance monitoring and optimization, while retaining equity in previously developed models.


Statoil deploys SAP Solution Manager

Service portal proved critical to successful mySAP Customer Relations Manager (CRM) roll-out.

While setting up its mySAP CRM system last year, Statoil availed itself of the SAP Consulting unit’s SAP Solution Manager. SAP Solution Manager is a service portal for the implementation, operation and optimization of an SAP solution. As an SAP ‘Ramp-Up’ customer, Statoil was offered the Solution Manager free of charge.

Grahnstedt

Statoil’s IT project manager Trond Grahnstedt said, ‘Starting from scratch with mySAP CRM, we knew the Solution Manager would help roll-out. We developed our business blueprint and documented the exact configuration of our CRM system, including a description of processes.’

Central repository

SAP Solution Manager is delivered with a library of preconfigured business processes and a central document repository. Statoil’s implementation was completed in under a year and reduced the need for external consultants.


Emerson predicts equipment failure

New ‘Predictive Analyst’ software monitors current equipment performance and forecasts how it will evolve.

New technology from Emerson Process Management (EPM) promises to predict future performance of mechanical and process equipment. Version 4.0 of Emerson’s AMS Suite Equipment Performance Monitor (AMS PM) introduces a predictive analytical capability that supports proactive maintenance strategies, leading to increased plant availability and performance, and reduced maintenance costs. The new release expands the predictive technology that powers Emerson’s PlantWeb digital plant architecture.

Degrade

Typical AMS PM deployments include predicting the date when a steam turbine’s performance degrades by 10MW from the current value, or when a gas turbine’s performance will decrease to 2% below design. AMS PM analyzes current performance of equipment as compared with original design, and then predicts how equipment will be operating in future months.
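
Emerson does not say how the forecast is computed. As a rough indication of the idea (our own sketch in Python, not the AMS PM algorithm), even a simple linear fit over recent performance history yields an estimate of when a threshold will be crossed:

import numpy as np

# Hypothetical monthly output history for a steam turbine, in MW.
rng = np.random.default_rng(0)
months = np.arange(12)                                    # months since baseline
output = 250.0 - 0.8 * months + rng.normal(0.0, 0.3, 12)  # slow degradation plus noise

slope, intercept = np.polyfit(months, output, 1)          # linear degradation trend
threshold = 250.0 - 10.0                                  # 10 MW below design, as above
months_to_threshold = (threshold - intercept) / slope
print(f"Threshold reached around month {months_to_threshold:.0f}")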

Llewellyn

EPM Asset Optimization unit president Craig Llewellyn said, ‘We are constantly looking at methods and technological advantages to help customers run their process more efficiently and save both operational and maintenance costs. This latest version of AMS Performance Monitor provides another mechanism to enable the move to proactive maintenance. The Predictive Analyst capability maximizes the benefits of looking ahead and planning maintenance strategies to improve the availability of critical plant assets.’


Shell Oil trials English language query

An interpreter converts oil trading business rules expressed in English to complex SQL queries.

Ted Kowalski (Shell Oil US) and Adrian Walker (Reengineering Llc.) have been working on a natural English query language to SQL translator and its use in oil product trading. The present use case involves matching customer demand for a particular quantity of product with an optimal supply source available to the trader. Many factors affect optimum product composition including the season, the locations of available equivalent products, and the availability of suitable transportation.

SQL

A competitive supply chain plan depends on knowledge of the above factors, on business policy knowledge, and on inventory facts in SQL databases. Because the situation can change rapidly, it can be difficult to write conventional application programs and SQL queries to optimize profitability. It is possible however to express the knowledge needed to optimize fulfillment using open vocabulary business rules expressed in English. These expressions are automatically translated into the appropriate SQL queries to produce a suggested supply chain solution. Even in simple examples, the generated SQL queries are too complex for a programmer to write reliably. However, it is easy to change the business rules to specify a new policy, and the generated SQL then changes automatically. A feature of the technology is that a supply chain solution can be explained, at the business level, in hypertexted English.

An example

Say a target region needs 1,000 gallons of product ‘y’ in October 2005. We then ask what alternative routes and modes of transportation (truck, train, boat, pipe) we have to get that product to the region. We are also interested in the proximity of a refinery with available capacity, and we may need a delivery plan that is optimized to deliver on time, make a profit and beat the competition. Plain English rules such as ‘estimated demand some-id in some-region is for some-quantity gallons of some-finished-product in some-month of some-year’ are drafted to describe the optimization process.
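
The SQL generated by the Kowalski-Walker interpreter is far too long to reproduce here, but the toy Python sketch below (our own, with hypothetical table and column names, and nothing like the real system’s open vocabulary parser) gives the flavor of the mapping from rule placeholders to a query:

import re

# One rule in the style quoted above; each 'some-xxx' placeholder names a data item.
rule = ("estimated demand some-id in some-region is for some-quantity gallons "
        "of some-finished-product in some-month of some-year")

# Each placeholder becomes a column of a hypothetical estimated_demand table.
columns = [c.replace("-", "_") for c in re.findall(r"some-([\w-]+)", rule)]
sql = ("SELECT " + ", ".join(columns) + "\n"
       "FROM estimated_demand\n"
       "WHERE region = :region AND month = :month AND year = :year")
print(sql)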

SQL complexity

Even a few straightforward English language rules combine to make SQL queries of considerable complexity. Moreover, the English language business rules reflect the underlying business process more clearly than the generated SQL and are more amenable to editing and further optimization.

The current status of this project in Shell is unknown. The full Kowalski-Walker paper is available on www.oilit.com/papers/kowalski-walker.pdf.


Wellogix first out of SAP starting blocks

Netweaver certification set to make upstream e-commerce plug and play with SAP.

Wellogix has achieved SAP NetWeaver certification for a new version of its eField Ticket oil country e-commerce solution. The certification has been awarded for Wellogix’ Complex Services Management solution. Additional components will be certified in the near future.

CSM

Complex Services Management (CSM) is a complete end-to-end solution for the planning, procurement and payment of complex oil field services. Running in the NetWeaver Enterprise Portal, CSM is designed for deployment across a company’s engineering, procurement and financial functions. Customers using Wellogix CSM have achieved an average 67% reduction in payment processing costs. By utilizing NetWeaver technology, Wellogix is able to offer a complete web services-based business solution that can be leveraged by a client’s Enterprise Services Architecture (ESA) strategy.

Epley

Wellogix CEO Ike Epley said, ‘Last year, we introduced the CSM suite to global energy companies. By participating in the adoption of SAP’s NetWeaver and developing the CSM suite on the NetWeaver platform, we are able to offer our customers significant business benefits derived from the latest SAP enterprise integration capabilities.’ Wellogix has received the ‘Powered by NetWeaver’ certification and has been an official SAP partner since March 2005.

Take-up

Epley added, ‘Many energy companies are embracing NetWeaver, which was a real factor in our decision to build CSM on the SAP NetWeaver platform. SAP NetWeaver allows Wellogix to combine our industry expertise with a leading technology-enabling business solution. This new business package offers our customers a high-value solution with lower total cost of ownership.’

eField Ticket

The ‘Powered by SAP NetWeaver’ certification was awarded to Wellogix’ eField Ticket Service 5.0 which has been certified for J2EE deployment on SAP WebAS 6.40 and for Business Package integration into Enterprise Portal 6.0. Wellogix is now in the process of certifying its business processes that use the XI (eXchange Infrastructure) content standard and the PIDX (Petroleum Industry Data eXchange) standards.


MRO jumps on web services bandwagon

Web Services Interoperability compliance announced for Maximo Enterprise Suite.

MRO Software’s Service Oriented Architecture (SOA)-based solution, Maximo Enterprise Suite (MXES), has met the requirements of the Web Services Interoperability Organization’s (WS-I) Basic Profile. MXES claims to be one of the first enterprise software applications to leverage an SOA. The WS-I compliant web services framework is embedded into the MXES platform. Customers can deploy web services for all MXES processes and can also add their own custom processes to the platform.

WS-I interoperability

WS-I is an open industry organization chartered to promote interoperability across platforms, operating systems and programming languages.

Young

MRO Software executive VP Jack Young said, ‘We are pleased to be one of the first application software vendors to offer an SOA solution that has achieved WS-I compliancy. A lot of hype has surrounded SOA and Web services, so it’s important to share with customers and prospects the real benefits of these standards. A web services based solution like MXES can provide a big competitive advantage over older or partially modified systems and processes.’

Proof of pudding

Of course a web services solution from a single vendor misses the point. What the industry is really after is cross-vendor interoperability. MRO Software appears to have addressed this issue too, with a position paper on Maximo and SAP NetWeaver integration. Moreover, SAP also claims WS-I compliance for NetWeaver. Is this interoperability Nirvana or what?


Tigress upgrades Russian applications

Interpretation suite now tailored to Russian marketplace. CGG joint venture shares software.

UK-based Tigress has released a new version of its Russian-localized GeoTIGG application. GeoTIGG 2.0 was developed in conjunction with Russian partner and CGG affiliate, GeoLeader. GeoTIGG targets upstream workers in the Commonwealth of Independent States, leveraging Tigress applications and the Project Data Store (PDS) and adding a Russian-specific petrophysical application, GeoTOP. GeoTOP is tuned to traditional Russian logs, with their own data types, corrections and specific interpretation algorithms.

GeoQC

New in release 2.0 is the mapping program GeoQC, based on CGG’s FastQC application, a PetroVision component. GeoQC offers a range of mapping, statistical analysis and kriging functions. Both GeoTOP and GeoQC read and write their data directly to the PDS. The next release of GeoTIGG will include another program popular in the CIS market, GeoMIG, a 3D seismic interpretation tool.

Ingeoservices

Last year Tigress established a Russian unit, Tigress Ingeoservices, so that its financial, product support and product development businesses could be handled locally, with Russian staff and in compliance with local procedures and regulations.

