December 2007

Teradata for oil and gas

Manchester University researcher comes from left field with data warehousing solution to upstream data management—including seismic processing, production, financials—the whole shebang!

Speaking at the Petroleum Exploration Society of Great Britain’s Data Management Conference, Duncan Irving (Manchester University) presented a seminal pilot study using Teradata’s data warehouse engine to attack complex storage and retrieval issues associated with massive upstream oil and gas data—seismic in particular.


Irving’s study began with a problem set by Hydro: automating 3D channel location. Experts from the UK’s National Computing Centre figured the answer was to load subsets of the data to graphics processing units (GPUs) for fast pattern matching. But the early work failed for lack of a robust storage infrastructure.


Irving was then approached by Teradata with a proposal for a data warehousing approach. Teradata’s technology is designed for very large data volumes and is used by Wal-Mart to store and analyze its real-time sales data (OITJ June 2006). The approach involved an earth-model-based data structure that could become the focal point of enterprise technical computing.


Seismic data is stored by voxel and referenced by hashing at various spatial scales for speed of retrieval à la Google Earth. The data warehouse has the potential to support reservoir and financial modeling, rolling in weather, production data and more.
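The multi-scale hashing idea can be sketched as follows. This is a minimal, hypothetical Python illustration; the key scheme and tile sizes are assumptions for clarity, not Teradata’s actual schema.

```python
def voxel_key(x, y, z, level):
    """Group neighboring voxels into one 'tile' whose edge length
    doubles with each coarser level (coarse tiles for quick overview
    queries, fine tiles for detail, a la Google Earth)."""
    cell = 2 ** level  # tile edge length in voxels at this level
    return (level, x // cell, y // cell, z // cell)

# Index a couple of voxels: both fall in the same level-2 tile,
# so a single key retrieves the whole neighborhood at once.
index = {}
for (x, y, z), amplitude in [((5, 9, 2), 0.13), ((6, 9, 2), 0.11)]:
    index.setdefault(voxel_key(x, y, z, 2), []).append(amplitude)
```

A retrieval at a coarse level touches few hash keys; zooming in simply re-queries the same data at a finer level.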


Teradata uses a massively parallel ‘shared nothing’ database architecture with ‘Access Module Processors,’ physically distinct units of parallelism, each with exclusive access to a portion of the database. Seismic data is stored as a 10 byte header and a binary large object. Spatial data management locates traces in 3D space, and further data (such as velocities) and methods can be attached to a trace. Rendering and compression middleware can be farmed out to other hardware—perhaps a Microsoft Xbox! Irving is currently working on geospatial query and is looking for tie-ins with other vendors.
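The trace layout described above (a small header plus a blob, hash-distributed across the parallel units) might look something like this sketch in Python. The field layout and the AMP count are illustrative assumptions, not the actual Teradata implementation.

```python
import hashlib
import struct

N_AMPS = 8  # number of parallel units in this toy configuration

def pack_trace(inline, xline, samples):
    """Pack a 10 byte header (two 4 byte line numbers plus a 2 byte
    sample count) followed by the samples as a binary large object."""
    header = struct.pack(">iih", inline, xline, len(samples))
    blob = struct.pack(">%df" % len(samples), *samples)
    return header + blob

def owning_amp(inline, xline):
    """Hash the spatial key so traces spread evenly over the parallel
    units, each of which owns its portion of the data exclusively."""
    digest = hashlib.md5(("%d:%d" % (inline, xline)).encode()).digest()
    return digest[0] % N_AMPS
```

Because the hash alone decides placement, a lookup by (inline, crossline) goes straight to one unit with no coordination between units, which is the point of the shared-nothing design.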


Irving concluded by noting that data warehousing has been around for a couple of decades—but the oil and gas vertical has yet to ‘get to grips with it.’ There are many spin-off benefits from enterprise-class data warehousing.


Version control means that nothing is ever overwritten—good for Sarbanes-Oxley compliance and workflow management. Data warehouses are amenable to fuzzy logic and neural net processing. The Manchester/Teradata solution supports high performance computing for flow modeling and real time monitoring of production measurements. Teradata’s Data Warehouse is used, albeit in a non-technical role, by Repsol-YPF as a business intelligence system to analyze sales data.

SAIC’s SOA play

Consultants to embed work performed for major oil and gas accounts in public service-oriented architecture-based interoperability toolkit.

Science Applications International Corporation (SAIC) has announced an initiative to collaborate on service oriented architecture (SOA) interoperability standards for the energy market. SAIC’s IM practice is already working with major oil and gas producers and public utility companies and has seen an opportunity for a set of industry-wide interoperability standards that will enable operators and oilfield services companies to collaborate on hydrocarbon development and production projects.


Doug Charles, SAIC senior VP said, ‘We see this as a fast-cycle development project that will utilize existing technology standards and practices by publishing an agreed set of standards by the end of second quarter 2008.’


SAIC will draw on its experience with several industry standards groups to develop the roadmap and produce a ‘well-documented suite of actionable standards.’ The announcement follows SAIC’s acquisition of Indian upstream data specialist Scicom earlier this year. More on SAIC’s initiative in our exclusive interview with SAIC’s Dean Forrester on page 3 of this issue.

Our message for 2008—move up!

Oil IT Journal editor Neil McNaughton witnesses a rare moment of drama on the UK data scene, notes confusion in the tape standards department and suggests a ‘stretch goal’ for 2008.

There was a moment of drama at the Petroleum Exploration Society of Great Britain’s biennial Data Management conference this month. To set the scene for those of you who are not familiar with the UK data scene, and to simplify the picture somewhat, there is already one commercial national E&P data repository—Common Data Access (CDA)—and another, the National Hydrocarbon Data Archive (NHDA), managed by the British Geological Survey. If you like, you can read more about the UK’s data efforts in my July 2000 editorial, ‘.com vs. .gov?’

A surprise...

But back to the PESGB conference... Protagonists from both CDA and the NHDA had finished their presentations (more of which in next month’s Oil IT Journal) and had settled down to listen to a few more talks before tea and biscuits. We were all in for a surprise.


Charlotte Norlund from Southampton University’s IT Innovation Centre outlined the ‘Avatar-m’ project. This £3 ($6) million project, now running for two years, is seeking to establish a digital data repository of the UK’s seismic data set. The project promises web services-based access to a long-term archive of field and processed data. The project is also designed to support the BBC’s needs for long-term archival of its digital video library. But Avatar-m has a credible seismic partner in Ovation Data Services. How Avatar-m, the NHDA and CDA will relate in the future is a good question. But it is probably not one that the project partners were too keen to answer when they spotted the opportunity of £3 million of UK Government largesse. One can only admire such entrepreneurial opportunism.

SEG-Y? Why not?

In case you think that such stories, from what Slashdot’s editors would call the right-hand-doesn’t-know-what-the-left-hand-is-doing department, are unusual, here is another one—also about seismic data storage, by the way. I was surprised to read in the October issue of The Leading Edge (TLE) of a proposed revamp of the venerable SEG-Y seismic data standard. The proposal came from Energistics and ONGC and, in essence, suggests reworking SEG-Y to embrace XML and extending the standard to include more processing information and whatever. What was puzzling—and I have this on good authority—is that the SEG Standards committee was completely unaware of this initiative before it was published in TLE. The moral of the tale? I don’t know—maybe that if you have an idea and an audience, then go for it. Oh, and by the way, we have a more measured account of the ONGC SEG-Y proposal in our report from the Energistics Houston Standards Summit on page 7.


Talking of learned societies, I attended a very entertaining talk by Schlumberger Fellow Oliver Mullins about downhole fluid analysis. The talk was given under the auspices of the SPE’s Distinguished Lecturer Series and if you get the chance, you should definitely hear him. The subject itself is a bit off-topic for Oil IT Journal (OK, when did that ever stop us!), but Mullins is a thought-provoking iconoclast. For instance, he concluded by stating that today’s standard deepwater development workflow is ‘not scientific.’ Facilities are planned and built with too few wells and there is no test of the reservoir description in the current workflow—‘You don’t assess error on a computer! You assess error by measurement—by predicting properties and running logs to check.’


On a completely different subject, like many I have been havering over a possible ‘upgrade’ to Vista and Office 2007. I used to enthusiastically upgrade with the expectation that niggling bugs in previous versions of software would be fixed and life would be easier. Some publishers would even produce a list of bugs fixed so that you could see what you were getting. But as time went on the ‘bug fix’ aspect of the upgrade has been written out of the spec sheet by the marketing department.

One bug I would like to see fixed is the ‘dialog of the deaf’ that Windows occasionally engages in over unsaved changes: ‘Do you want to save them?’ ‘No thanks, I didn’t make any.’ ‘Where do you want to save them?’ ‘I have no idea.’ And so on. Other issues I would like to see fixed are the differences in text handling across Excel, Word and Publisher that make content management impossible. But worst are the bugs that come and go. Firefox is always fighting with Windows over embedded hotlinks—fixes are unfixed with every new version. Either the Firefox folks are very bad or Bill is very devious...


One of my favorite ‘coming and going’ bugs is the way an incoming email message occasionally combines its DNA with yours (I guess) to produce a cute ‘invisible’ font in your reply. It would be nice to upgrade to Vista safe in the knowledge that all these bugs had been fixed. But it would be even nicer if they had been fixed a long time ago.

Move up!

I don’t want to end the year on a downbeat note. In case you get the feeling that our buggy software is getting us nowhere, I would encourage you to read our report from the 2007 EU Plant Engineering conference on page 6. Derek Middlemas was demonstrating Intergraph’s engineering design capability with a ‘move up’ command on an FPSO* model. A whole deck of the FPSO slid up and reconnected itself on the fly. How it decided what pipes connected with what I don’t know. What happens if you move the flare stack down to the bottom deck? But even if ‘move up’ is more image than reality, it’s a great concept. A ‘stretch goal’ for 2008 maybe? Move up!

* Floating production, storage and offloading.

Oil IT Journal interview—Dean Forrester, SAIC

We quiz SAIC assistant VP of commercial business services about the new SOA architecture for energy. The soon-to-be released public spec leverages work performed for clients including BP.

What’s the idea behind your service-oriented architecture (SOA) based Interoperability Standards (IS) initiative?

Many clients are turning to SOA to remove the IT bottleneck that stands in the way of a timely response to market and legislative change. But they waste a lot of energy defining the basic SOA ‘wiring.’ We believe that a foundational set of standards is an essential first step to SOA deployment and that there is little competitive advantage to be had by reinventing the wheel. Also, subtle differences in these developments actually impede SOA deployment.

What will the IS entail?

We will provide an interoperability baseline, a common transport language for organizations to communicate between each other and a stable foundation for vendors’ products.

Who is involved in the services-oriented architecture (SOA) project?

We are working with companies in the upstream, downstream and utility sectors and have recognized similar problems in their strategic programs—whether they are field, refinery or utility ‘of the future.’ Participants include several supermajors, regional US utilities and major vendors.

Can you give some more details about what level your Interoperability Standards will work at?

The Interoperability Standards (IS) will allow organizations to quickly ‘get off the blocks’ with SOA by specifying a discrete set of existing W3C and web services standards that operators and vendors can leverage. The idea is to codify a small subset of the SOA standards and quickly get to agreement amongst the participants on the basic set of transport-level protocols that will be used. These will be vendor agnostic. No product recommendations will be made. The approach will increase consistency in the development of services, both inside and outside of operator organizations and reduce some of the friction incurred when organizations attempt to communicate.

Can you name any of your contributing clients?

As a way of ‘seeding the pot’, BP has offered its SOA Interoperability Standards (which we helped develop) to the industry. These are the same standards that were offered to the PRODML community for V1.0 of the production standard and which formed the basis for the original PRODML Reference Architecture. From our own experiences with PRODML (we were involved in the Reference Architecture, pilot implementations and the Deployment Toolkit), we believe that an agreed set of interoperability standards would have significantly reduced the time spent developing the standard.

What’s the relationship with WITSML and PRODML?

We see the Interoperability Standards as a way to ensure that PRODML, WITSML and others can be consistently delivered by anybody. Future domain standards can then focus on the business logic being implemented rather than on a standardized transport layer.

Will the IS embrace device-level integration of SCADA systems—and in this context, what process control standards are leveraged—OLE, UA?

The standards will define transport-level, SOA base protocols for the transfer of any data type. The standards are not intended to replace COM/DCOM, OLE or even OPC. They will supplement these and provide an open way of accessing data that is not restricted by the traditional network or domain boundaries that hinder these solutions.

The press release mentions your work with industry standards groups?

SAIC participates on committees of the major standards bodies such as W3C and OASIS. We have also worked on industry standards groups such as PRODML and have deep experience in implementing all these standards for our clients. SAIC invests significant funds in joint R&D with our clients each year to encourage the development and adoption of standards and help our clients understand how and where they should focus their efforts.

You have also mentioned a ‘well-documented suite of actionable standards’—will these be made public?

Yes. Our intent is to build consensus around a core set of standards and then publish these standards to the industry. It is likely that we will base the distribution around an ‘open source’ style license to allow broad distribution and a vehicle for future updates.

Will the development process be open?

Yes. We have made the BP document available on the SAIC internet site. Interested parties can ask to be involved with the initiative there.


ONGC, Chevron select Paradigm

Paradigm has major sales this month to India’s ONGC and Chevron.

A license agreement provides ONGC with a ‘central server’ license for all Paradigm’s interpretation, modeling, reservoir characterization, well planning and drilling software. The deal is valued at $3 million and a separate maintenance and support contract is currently being negotiated.


ONGC exploration director Dinesh Kumar Pande said, ‘We are leveraging competitive advantages in R&D and technology by integrating enterprise-level software including that from Paradigm. Cost-effective, real-time access to these tools will provide a valuable link throughout all exploration sectors of our corporation.’ Paradigm COO Jorge Machnizh added, ‘The central server provides access to software for 27 geographically dispersed units and represents a cost effective solution to ONGC’s requirements.’


Chevron has also signed a multi-year, ‘million dollar’ deal with Paradigm for the supply of a ‘next generation’ interpretation and modeling solution. Paradigm’s interpretation and modeling tools will be integrated with Chevron’s proprietary E&P systems. Paradigm will partner with Chevron Energy Technology Company (ETC) to implement the solution worldwide and provide further technology development. Another agreement extends Chevron’s Geolog license.


ETC CIO Peter Breunig said, ‘After an extensive internal assessment and market review, we selected Paradigm for its strength in advanced interpretation and modeling workflows. Paradigm will be one of the key technologies deployed globally throughout our E&P organization and their framework will be integrated with our proprietary technology.’

HPC news from Schlumberger, Microsoft, SGI and SC07

Microsoft HPC now ‘mainstream.’ IBM and SGI head the Top500 high performance computing list.

At the SPE ATCE last month, Schlumberger and Microsoft were showing off Windows Compute Cluster Server (WCCS), which is set to ‘eliminate isolated, hard-to-maintain Linux islands.’ Schlumberger showed how WCCS was used to run multiple Eclipse fluid flow models and geostatistics.


The solution targets small companies with a workgroup cluster of 8-32 CPUs serving a few engineers working in a Windows environment. Schlumberger’s experience is that WCCS offers ‘similar performance to a Linux machine, at least for up to 8 processors*.’ Microsoft announced similar ‘HPC’ arrangements with CMG, Roxar and ScandPower. WCCS 2003 is 64-bit only so, paradoxically, many developers are waiting on WCCS 2008 for its 32-bit capability. Microsoft unveiled WCCS 2008 at the Super Computing (SC07) show in Reno.


SC07 was also the venue for the release of the TOP500 list of supercomputers—the yardstick of HPC, at least for information in the public domain. IBM Blue Gene machines took the N°1 and 2 slots, with SGI coming in an unexpected 3rd with a quad core Xeon-based Altix. On the operating system front, Linux’s domination of the TOP500 continues with an 85% share (Windows came in with 1%).

Supercomputing challenge

New at the show this year was the Cluster Challenge (sponsored by Chevron and Western Geco), in which university teams backed by cluster vendors were set the task of assembling a cluster running various benchmarks. The challenge was won by a team from the University of Alberta with a 64-core SGI Altix.

* A lackluster cluster?

ESRI ArcGIS Image Server supports CenterPoint Energy

ESRI responds to the Google Earth challenge with enterprise image management for utility.

Natural gas delivery company CenterPoint Energy (CPE) has deployed ESRI’s ArcGIS Image Server (AGIS) to process and deliver large image data sets to its multiple users. AGIS provides quick access to, and visualization of, Landsat and other high volume image data, with image processing performed on demand at the server. CPE has been using AGIS since late 2006 to support mission-critical applications including large-scale natural disaster response such as a hurricane disruption to its service.


Cynthia Salas, CPE GIS manager, said, ‘ArcGIS Server’s fast processing and delivery has fulfilled our goals for image data use. Our designers, technicians and digitizers were pleased with the processing time and image resolution.’ AGIS provides a range of processing options and supports third-party client applications such as AutoCAD and MicroStation, providing images in the applications that end users are familiar with. AGIS’s central server model is claimed to reduce data storage overhead and provide an ‘instant’ return on investment. One driver for AGIS deployment has been the popularity of Google Earth, which has increased users’ expectations of an image backdrop to applications.

Chevron adds Petrel to E&P software mix

Supermajor integrates Schlumberger flagship with in-house technology and deploys globally.

Chevron has deployed Schlumberger’s Petrel ‘seismic-to-simulation’ package as one of its global interpretation and modeling solutions. Chevron business units around the world will now have Petrel available for their G&G workflows.


Peter Breunig, CIO of Chevron Energy Technology Company, said, ‘Following an internal needs study and an assessment of available products, we selected Petrel to enhance our teams’ ability to analyze data quickly and effectively in a collaborative environment and to reduce exploration risks. Petrel will be integrated with our proprietary technology and deployed globally throughout our E&P organization.’

Earth model

Petrel’s plug-in architecture and the Ocean API will be used to embed Chevron’s proprietary tools and to standardize its processes around Petrel’s earth model. Chevron and Schlumberger are also developing a ‘next generation’ reservoir simulator, ‘Intersect’ (OITJ September 2005).

Husky deploys Invensys cyber security solution

Real time monitoring protects upgrader facility’s networks—but prevention is better than cure...

Husky Energy will deploy a cyber security solution from Invensys Process Systems at its upgrader facility in Lloydminster, Saskatchewan. Invensys is to provide design, implementation and real-time management of plant and production networks. Services include site and vulnerability assessment, security architecture and policy development, and ongoing optimization.


Invensys will also provide 24/7 network monitoring and management, leveraging German IT security specialist Integralis’ global network of Security Operations Centers. Invensys cyber security guru Doug Clifton said, ‘Our offering goes beyond a simple firewall—it is a managed solution, focusing on threat prevention, real-time monitoring and mitigation.’


Robin Mayo, Integralis Americas president added, ‘An emphasis on prevention keeps the actual threat detection and management to a minimum. But if suspicious activity does get through, we have cyber security specialists on the case 24 hours a day with advanced tools and processes.’

Software, hardware short takes ...

Ikon, eLynx, ESRI, Aveva, Beicip-Franlab, Petrosys, TradeCapture, Zeh, Dresser Wayne, Killetsoft, A-Prime.

Ikon Science has ‘beefed-up’ its quantitative interpretation (QI) offering with new workflows and personnel at its locations in Houston, Durham, London and Perth. QI adds geopressure, reservoir and time lapse capabilities to Ikon’s rock physics-to-reservoir line of business. New workflows include rock physics-calibrated reservoir characterization, geopressure analysis, time-lapse (4D) reservoir monitoring and fractured reservoir characterization in association with Golder Associates.


eLynx Technologies has just released WellLynx, a remotely configurable communications device for oil and gas wells. WellLynx uses cellphone technology to transmit flow, alarms and other information from producing wells over the ‘ubiquitous’ public digital network. GSM technology supports wireless internet access over 90% of the US coastal regions. Users can check compressor status, tank levels, casing and tubing pressures, flow rates and valve positions from a browser.


ESRI has also announced a new release of ArcGIS Explorer which now includes better map customization, improved ways of communicating map feature information and ‘instantaneous’ navigation to target areas around the globe à la Google Earth. In fact another Explorer enhancement is better support for Google’s KML.


AVEVA has just released a new package, ‘ReviewShare,’ for collaborative design review and markup of 3D engineering models. ReviewShare works at any scale—from valve, to pump, compressor, or an entire process plant in full 3D. The package supports email and document-based review and annotation tools to record reviews on an ‘intelligent’ 3D model.


French Petroleum Institute unit Beicip-Franlab has launched the ‘OpenFlow Suite,’ a next generation platform for geosciences and reservoir engineering. Early release OpenFlow components include FracaFlow, for fractured reservoir characterization, CondorFlow, for assisted history matching, and PumaFlow, the IFP’s new reservoir simulator. All share a common, Windows-based user interface, data model and visualization environment.


Version 16 of Petrosys’ eponymous mapping package provides ‘significant enhancements’ to coordinate reference system (CRS) handling—notably long term data integrity thanks to enhanced metadata management and the storage and referencing of CRS information against the authoritative European Petroleum Survey Group (EPSG) standard model and dataset.


TradeCapture has announced ‘new and refreshed’ software products for scheduling and physical trading of liquid hydrocarbons—building on its flagship ICTS Symphony product line. Symphony streamlines operations, reduces transaction costs and manages futures, exchange options, OTC options, swaps and physical trades.


A new release of ZEH Software’s Montage Professional for Windows adds offline working and support for multiple print servers. Other enhancements include ‘fit to plotter’ and improved plot annotation. Montage Professional is used to combine CGM and raster graphics into a single montage of unlimited canvas size.


Dresser Wayne has received ‘PCI 2.0’ level certification from the Canadian Payment Card Industry Security Standards Council for its fuel dispenser secure payment solution. The certification applies to unattended terminals such as automated fuel dispensers that accept PIN-based transactions.


German software house Killetsoft has announced GeoDLL, a plug-in for geodetic software development. GeoDLL offers 2D and 3D coordinate transformation, geodetic datum shift and other geodetic functions for a vast range of current and historical European and North American coordinate and reference systems. GeoDLL is fully documented and can be called from most commonly used programming languages or supplied as C++ or Computer Associates Visual Objects source code.


A-Prime Software has announced CrossView, an ArcGIS plug-in for creating cross sections and profile views from inside ArcGIS. The package targets occasional users of ‘heavyweight’ packages such as AutoCAD, EarthVision or Illustrator who want to create cross sections of topographic or geoscience data. CrossView expands the ESRI toolset and removes the need to spatially adjust files. A-Prime is the newly created software arm of GIS consultants and developers DPRA. The first CrossView sale went to the US Geological Survey.

$70 oil price required to match 2005 $30 projects

WoodMac study finds $70 oil needed to match returns made in 2005 with $30 oil

A new analysis by UK-based consultants Wood Mackenzie has found that the average return on exploration for conventional hydrocarbons in the past three years was under 15%. This figure assumes a future oil price of $70 per barrel. Back in 2005, WoodMac found that the same return was achieved for a $30 oil price.


WoodMac VP Andrew Latham explained, ‘In the last two years, development costs have risen, especially offshore. The cost of acquiring new acreage in prospective basins has also increased and many of the most petroleum rich parts of the world are now off-limits to international companies. The average cost of a well has risen by 60% since 2004 and rising taxes have adversely affected returns. Many changes in contract terms have yet to impact revenues because the bulk of recent exploration is conducted on licenses negotiated in the 1990s, with more favorable fiscal terms. Ultimately, accepting moderate returns at high oil prices may be the only way for companies to continue to be active in conventional exploration for hydrocarbons.’

EU Plant Engineering and Design Conference, The Hague

The EPEDC conference focuses on the oil and gas, process and power verticals. This year saw renewed interest in the ISO 15926 data standard (especially for handover) which has been leveraged successfully by Malaysian power generator Malakoff. Other presentations focused on commercial tools from Aveva, InnoCielo, Intergraph, ShareCAT and Dassault Systèmes. Presentations covered deployments on ExxonMobil’s North Sea assets and the Shell-operated Sakhalin II project.

Miguel Muñoz’ European Plant Engineering and Design Conference (EPEDC) was held last month in The Hague. Muñoz has been a strong supporter of plant engineering since the enthusiasm of the Plant Information Management (PIM) conferences of the 1990s. A theme of Muñoz’ conferences is the plant and process standards initiative, the POSC/CAESAR (ISO 15926) plant data standard. Although interest in the standard waned in the early years of this century, there is now renewed interest in both plant data management and in the standard itself.


Aveva VP Derek Middlemas’ keynote contrasted the progress that supply chain management has made in retail and in discrete manufacturing with the ‘haphazard’ approach to information management (handover) in the plant construction business. The plant industry is very information intensive, much more so than the retail or banking sectors. Supply chain integration in the automotive sector means that your ‘blue BMW’ is made to order. Why can’t we use information as effectively as other verticals? It comes down to the project execution strategy of owner operators and engineering contractors. There is little incentive for contractors to improve in a cost-plus environment. Short termism and pressure to deliver reduce institutional learning and the ability to optimize. Contracting strategies are hampered by a fragmented supply chain.


Such issues can only be solved by owner operators. Unfortunately, many oils have outsourced the engineering department without changing their processes. But they still ‘own’ the risk. Middlemas suggests the only workable solution is a proactive IM strategy, including managing your own data. We need ‘information engineers,’ but they don’t exist inside owner operators. They are inside companies like ShareCat, Pearson Harper and others. To quote Bill Gates, ‘Today, virtually everything in business is an undifferentiated commodity. How you manage information determines whether you win or lose.’


Eirik Fjelde (KTB Consultants) described ExxonMobil’s use of InnoCielo’s Meridian engineering document management system to support operations and maintenance (O&M). InnoCielo Meridian (ICM) is used on ExxonMobil’s production facilities and exploration activity, crossing multiple departments and external partners. The system offers secure, onshore and offshore access and revision management. This allows document re-use in maintenance. Data can be exchanged with vendors, leveraging an engineering numbering system based on the NORSOK standard and linking to Maximo, SharePoint and Documentum. The system is also used in Exxon’s Operating Integrity Management System for procedures, manuals and revision control. Exxon believes that ‘all parties should work on one system.’


This was borne out in a major incident that occurred in 2004. A helicopter spotted a gas leak which was a serious hazard to shipping. The Norwegian authorities were contacted and they informed all nearby facilities. Jotun A responded as they saw pressure drop in the feeder connecting to StatPipe. An expansion loop on the sea bed had been exposed and caught by a trawler. Meridian allowed an onshore team to access all relevant documentation in under 30 minutes for an effective shut down. The effectiveness of the data management system was proven in this single incident.

Sakhalin II

Chris Mitchell (AMEC) outlined some of the information management challenges on the Shell-operated Sakhalin II project. AMEC provided the topside FEED (front end engineering and design), design, engineering and procurement. Two platforms have been built, LUN-A (22,000 tonnes) and PA-B (28,000 tonnes)—both world records when built. Information management scope spans engineering design and document management, document control, and engineering systems including PEGS, PDMS and Intools (Intergraph), AVEVA and ShareCAT. Systems have been interfaced with AMEC’s corporate IT in support of the information handover guide. IM customers and locations are spread around the world in a complex supply chain of vendors, software tools, databases and workflows for data capture and QC. ShareCat has been deployed as the key application for supply chain coordination. The tool was first used by AMEC to gather vendor documentation. Now Shell has adopted the tool and expanded its scope to include documents, tag data, content and spare parts. A Shell International audit revealed a need for better IM support, focused on improving handover understanding by both AMEC and SEIC. The audit also determined that information loss due to poor IM would amount to about 1% of capex—i.e. $250 million. A full time SEIC information manager was appointed in late 2004. Some 192,000 tag numbers have been generated and 1.5 million data attributes assigned. These learnings have been rolled into Shell’s current electronic information sharing design and engineering practice.


Malakoff

Rosli Abdul Hamid (Malakoff Corp.) presented the results of a major real-world ISO 15926 deployment by a power plant operator. ISO 15926 is a key component of Malakoff’s lifecycle asset information management (LCAIM) project. LCAIM leverages ISO 15926 from an owner operator perspective to combat IM challenges such as plant information locked in unstructured documents, hard-copy data management and low ‘as built’ data quality. Data is hard to maintain when the native formats of engineering schematics are missing, especially when acquiring a 15 year old plant. LCAIM leverages IM developments such as open source software, XML, portals and standards from PCA and FIATECH. LCAIM offers a structured approach to handover from EPC*** to O&M**** by harmonizing IM across O&M and engineering. On the business process side, EPC deliverables leverage ‘intelligent’ engineering applications based on the ISO 15926 data model. 90% of the LCAIM data model is reusable across projects. There has been a shift in thinking from documents to structured data management. EPCs and vendors should align on ISO 15926. Competition in construction is hotting up—especially from China. Owner operators with good IM and risk management will have a competitive advantage.

Pearson Harper

Steve Pearson (Pearson Harper) believes that if you build a data warehouse without a content management system you are wasting your time. Much of today’s engineering involves information loss. A jet engine has been described as an ‘information degradation machine.’ Digital data is turned into scanned documents and a thousand-page PDF. ‘Intelligence’ is lost and information becomes hard to navigate. We need to stop these silly practices from spoiling the information age. Pearson Harper (PH) clients, which include BP, SASOL and Shell, are all buying the same equipment, so this is a great opportunity to share information. EPCs tend to work in silos, so PH captures their information into its ‘PHASSET’ database, which is ‘ISO 15926-like.’ Client systems including AVEVA, SAP and Maximo can then be populated in an orderly manner with QC’d data. PHASSET now holds all of BP Angola’s Block 21 engineering data. In the old days, project information arrived at the owner operator (OO) 12 months after first oil! Now, information is available before commissioning, so operations work with the same data from the outset. Data completeness has risen from 50% to 95% and accuracy from 11% to 90%. Incidentally, BP had no tag management system in Texas City before the disaster—they do now! PH has already mapped the PH dataset to ISO 15926, ‘a charitable contribution to industry.’ One problem with the standards process is that when a contributor like PH submits new data to ISO, ‘they expect us to pay for the review process.’ For an SME like PH this is ‘a non-starter.’

Dassault Systèmes

Rolf Gibbels described Dassault Systèmes’ plant lifecycle management (PLM) joint venture with IBM. This was initiated in 1981 and now embraces IBM’s Maximo MRO unit. Dassault brands include SolidWorks, Catia, Simulia, Delmia, Enovia and 3DVia, plus the new CIMAGE acquisition. A click-through from CATIA pulls up information from the Enovia parts database. A modern FPSO weighs in at some 10,000 tonnes and 135,000 parts—somewhere between an Audi (2 tonnes, 10,000 parts) and a Boeing 787 (240 tonnes, 1 million parts).* Until recently the plant, process and petrochemicals sector has lagged in PLM—but today, ‘the time is right.’ Dassault’s PLM Plant supports OO data ownership throughout the project, linking engineering, construction and maintenance from day one. EPEDC presentations are available on

* Automotive is not perhaps the best analogy when you consider information handover to the ‘owner operator’ i.e. the driver!

** Front end engineering design.

*** Engineering and procurement contractor.

**** Operations and maintenance.

This article is based on a longer, illustrated report from The Data Room. More information on this subscription-based service from

Energistics ‘Standards Summit,’ Houston

Updates on WITSML, PRODML, global well identifiers, seismics and e-Permitting.

About 100 attended Energistics’ (formerly POSC) second ‘Standards Summit’ in Houston. The standards body has seen significant (44%) growth this year and is now finalizing its post-rebrand reorganization prior to ‘completing the mission’ in 2008 and developing an ‘end-to-end’ methodology for standards collaboration and corporate deployment. Mark Greene (Accenture) has joined the board.


The meeting focused on Energistics’ flagship WITSML and PRODML standards. Jon Curtis (Petrolink) noted that WITSML has yet to replace its binary ancestor, WITS, which is still in use. Typically, rigs transmit WITS data to the data center for aggregation into a WITSML data stream. WITSML needs to get faster to replace WITS. Julian Pickering (BP) gave strong backing to WITSML, comparing the standard’s situation with that of the process control industry’s OPC protocol a decade ago. But WITSML has to develop from an ‘interesting tool’ into the way of doing business. Melissa Symmonds announced that Schlumberger’s operations are now WITSML-enabled with an InterAct API. A WITSML data link was also added to the Schlumberger Operations Support Center in 2003, and Petrel has a link that enables real-time evaluation of drilling trajectory in conjunction with the geological model.
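For readers unfamiliar with the format, WITSML is an XML dialect. The Python sketch below shows the kind of aggregation Curtis describes, wrapping WITS-style depth/ROP records into a WITSML-like log document. The element names are simplified from the WITSML log schema and the well identifiers are invented, so treat this as illustrative only.

```python
import xml.etree.ElementTree as ET

def wits_to_witsml(records):
    """Wrap a list of (depth_m, rop_m_per_hr) tuples in a minimal
    WITSML-style <log> document (element names simplified)."""
    logs = ET.Element("logs")
    log = ET.SubElement(logs, "log", uidWell="W-001", uid="LOG-1")
    ET.SubElement(log, "indexType").text = "measured depth"
    data = ET.SubElement(log, "logData")
    ET.SubElement(data, "mnemonicList").text = "DEPT,ROP"
    for depth, rop in records:
        # one comma-delimited row per WITS record, as in WITSML logData
        ET.SubElement(data, "data").text = f"{depth},{rop}"
    return ET.tostring(logs, encoding="unicode")

doc = wits_to_witsml([(1500.0, 12.3), (1500.5, 11.8)])
```

A real aggregator would also carry units, time indices and server-side versioning, which the WITSML API (not shown here) layers on top.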


Laurence Ormerod (Weatherford) described PRODML as the ‘most viable route to production interoperability.’ To date, pilots have been carried out on production reporting (Statoil), water flow management (Chevron) and DTS data management (Weatherford). The road-map for the next three years includes daily reporting (2007), Norwegian joint venture production reporting (2008), ESP wells (2009) and SRP wells (2010). By 2010, PRODML will offer an extended language and multiple protocols with professional support. Rick Morneau (Chevron) opined that combining WITSML and PRODML would be a good idea: ‘It seems crazy to fragment, we need to work together, we need a standard ML.’

GWUI, Seismics, e-Permitting

Nick Duncan (IHS) outlined the final stages of the global well unique identifier project. This is to be operated by IHS as a public registration service available to non-clients. Ashok Kumar Tyagi (ONGC) outlined a new initiative to update SEG-Y with an Energistics Geophysical SIG. This is to investigate storage of observers’ reports and ancillary information in SEG-Y headers and XML formats for data exchange and capture of velocity and processing parameters.
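The header storage under investigation builds on SEG-Y’s fixed layout: a 3200-byte EBCDIC textual header followed by a 400-byte big-endian binary header. The Python sketch below reads three commonly used binary header fields at their byte positions per SEG-Y rev 1; the synthetic test file is of course invented.

```python
import struct
from io import BytesIO

def read_segy_binary_header(f):
    """Read three much-used fields of a SEG-Y binary file header.
    A SEG-Y file begins with a 3200-byte EBCDIC textual header,
    followed by a 400-byte big-endian binary header."""
    f.seek(3200)                 # skip the textual header
    hdr = f.read(400)
    # 1-based file byte positions 3217-3218, 3221-3222, 3225-3226
    interval_us, = struct.unpack(">h", hdr[16:18])  # sample interval, microseconds
    n_samples, = struct.unpack(">h", hdr[20:22])    # samples per trace
    fmt_code, = struct.unpack(">h", hdr[24:26])     # 1 = 4-byte IBM float
    return interval_us, n_samples, fmt_code

# Build a synthetic file: blank textual header + minimal binary header
buf = bytearray(3600)
buf[3216:3218] = struct.pack(">h", 4000)   # 4 ms sampling
buf[3220:3222] = struct.pack(">h", 1500)
buf[3224:3226] = struct.pack(">h", 1)
fields = read_segy_binary_header(BytesIO(bytes(buf)))
```

The SIG’s proposal would supplement these fixed binary slots with richer XML payloads, which is exactly the kind of ancillary information the fixed header cannot hold.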


Alan Doniger’s presentation on e-Permitting recapped earlier work on a WITSML-based electronic permitting standard—leveraging software licensed from the US Groundwater Protection Council (GWPC). This project was hit by a federal funding cut earlier this year and Energistics is working with the Colorado Oil and Gas Association (COGA), to take the work forward. Today, permitting involves much paperwork and can take 30-60 days. The resulting standard will bring efficiency gains and will likely be used by other US regulators.

Folks, facts, orgs ...

Shell, Baker Hughes, SAIC, Roxar, TietoEnator, DPTS, cc-Hubwoo, CGGVeritas, Tudor Pickering, DecisionStrategies, Petris, Geotrace, Halliburton, TradeCapture, IES, Knowledge Systems.

According to media reports, Shell is to slash around 2,000 IT jobs worldwide in a major reorganization and a shift to outsourced technical IT.

Nelson Ney has been appointed president of Centrilift and VP of Baker Hughes succeeding Charlie Wolley who is leaving the company.

SAIC has promoted Senior VP Amy Alving to CTO. Following the completion of the CorrOcean/Roxar transaction, Øystein Narvhus is stepping down as COO.

Tieto-Enator has appointed Hannu Syrjälä as President and CEO. Syrjälä hails from GE.

The National Oil Corporation of Kenya has awarded a tape transcription contract to Ovation unit DPTS.

cc-Hubwoo has appointed Mark Williams CEO replacing Alain Andréoli who is now EAME region president of Sun Microsystems.

Thierry Le Roux, group president and COO of CGGVeritas, is now also president of geophysical services following the departure of Christophe Pettenati-Auzière.

Tudor, Pickering has appointed David Cunningham as MD, Investment Banking, ‘spearheading’ the firm’s effort in the oilfield services sector.

Decision Strategies has announced several appointments: Kevin Carpenter as COO, Tony Hamer as VP development and Chris Reinsvold as MD of the oil and gas practice.

Tom Williams joins the Petris board—he was previously with Noble Corp. The company has strengthened its Operations Center product line with the appointment of Debi Castellanos and CJ Pomeroy. Philip Yang has also joined as a developer.

Steve Svatek has joined Geotrace’s technical staff to work on Geotrace’s ‘Diamond’ project, a new integration platform. Svatek was previously with Hydro/Spinnaker.

Andy Lane is to retire as Halliburton’s Executive VP and COO. The COO position will be eliminated and Dave Lesar will assume direction of the company’s Eastern and Western Hemisphere divisions. Executive VP Cris Gaut, previously CFO, becomes president of the Drilling and Evaluation Division. Mark McCollum is now Executive VP and CFO and Evelyn Angelle is VP and Corporate Controller.

David Newton has been promoted to CEO of TradeCapture.

IES has hired Sandra Bergers as administrator, sedimentologist Matthias Greb to its Mexico unit, petroleum geologist Fujian Ma for Asia and Wolfram Rosenbaum, a numerical methods specialist, to the development group.

Knowledge Systems has hired Monica Danna as director of marketing.


Martin Gainville of the French Petroleum Institute (IFP) has corrected our account of his presentation at last month’s SPE ATCE. Gainville explains ‘We have not yet incorporated the third party software packages (Eclipse, PumaFlow, Reveal, etc.) that your report implied. These were mentioned as representative of typical workflows and to illustrate the need for standards to ensure numerical and data consistency.’

New paper

Gainville has also kindly provided a short paper that better explains the scope of the CAPE-OPEN standard. This is available on

Standards news from ECCMA, OGP, CIDX, OGC, COMCARE

New announcements on metadata, cyber security, asset management, GIS, emergency response.

Peter Benson, executive director and CTO of the Electronic Commerce Code Management Association (ECCMA) has authored a paper describing the new ISO 8000 standard for data quality. ISO 8000 targets ‘next generation’ high speed and high relevance internet searches that rely on accurate and unambiguous descriptions. The first published part of the new spec covers requirements for the exchange of quality master data on people, organizations, locations, goods, services, rules and regulations. Read Benson’s paper on

Cyber security

The Control Systems Cyber Security Vendor Forum (CSCSVF) is to ‘align’ with the Process Control Systems Forum (PCSF) to improve control systems security. The CSCSVF fosters a culture of security in the process control industry by working on non-vendor specific security-related issues and acting as a ‘catalyst’ to mitigate risks.


The International Association of Oil and Gas Producers (OGP) has published a document showing the equivalence between standards from the API, ISO and OGP that concern the oil and gas vertical. The document is updated every six months and covers a vast range of activities including reliability and maintenance information, drilling and production hardware and control systems and construction materials specifications—the (long) list is available on


The chemical industry data exchange standards body CIDX is to cooperate with ISA (the Instrumentation, Systems, and Automation Society) on plant asset lifecycle management, considered to be ‘the next frontier in operations profitability for chemical companies.’ Jointly developed standards target a ‘smooth and consistent flow of critical information between plant assets and enterprise management systems.’ Specific areas of collaboration include CIDX endorsement of the ISA 95 enterprise and control integration standard.


The Open Geospatial Consortium (OGC) has been busy this month with finalization of the ‘official’ KML 2.2 spec for Google Earth data. OGC has also endorsed the work of the Global Earth Observing System of Systems (GEOSS), which provides real-time information on the land, oceans, atmosphere and biosphere. A GEOSS pilot implementation is also available. The pilot leverages the OpenGIS Web Map Server, Web Feature Server, Web Coverage Server and Catalog Services Web implementation specifications.
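KML 2.2 itself is a compact XML format. The sketch below generates a minimal, illustrative single-Placemark document (the placemark name and coordinates are invented); note KML’s longitude-first coordinate order.

```python
import xml.etree.ElementTree as ET

KML_NS = "http://www.opengis.net/kml/2.2"  # the KML 2.2 namespace

def placemark_kml(name, lon, lat):
    """Emit a minimal KML 2.2 document containing one Placemark.
    KML coordinates are ordered longitude,latitude[,altitude]."""
    ET.register_namespace("", KML_NS)
    kml = ET.Element(f"{{{KML_NS}}}kml")
    doc = ET.SubElement(kml, f"{{{KML_NS}}}Document")
    pm = ET.SubElement(doc, f"{{{KML_NS}}}Placemark")
    ET.SubElement(pm, f"{{{KML_NS}}}name").text = name
    pt = ET.SubElement(pm, f"{{{KML_NS}}}Point")
    ET.SubElement(pt, f"{{{KML_NS}}}coordinates").text = f"{lon},{lat},0"
    return ET.tostring(kml, encoding="unicode")

kml_doc = placemark_kml("Campos basin wellhead", -40.5, -22.5)
```

Loading the resulting string into Google Earth (or any KML 2.2 consumer) would display the named point, which is the interoperability the OGC standardization secures.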

Emergency response

The COMCARE Emergency Response Alliance is to work on interoperable, inter-organizational communications and information sharing for emergency preparedness, response and recovery. A Core Services initiative promises ‘sophisticated, shared information technology services’ in the form of a provider directory and locator service and identity management and access control services. More from

MicroStation at heart of Petrobras obstacle management system

GIS-based obstacle inventory aids pipeline route planning and safe operations.

Petrobras’ Obstacle Management System (OMS) was highlighted at the recent Bentley user conference in Los Angeles. The $20 million project involved the mapping of 40,000 surface and deep-sea obstacles in Brazil’s Campos, Santos, Espírito Santo and Rio Grande do Norte basins. Petrobras’ Ruy Santos Cova said, ‘Our new MicroStation-based system enables us to simply and quickly visualize in 3D all surface and deep-sea obstacles in areas of interest to help us research their characteristics. In addition to cutting pipeline installation times, this capability allows us to plot the most cost-efficient pipeline paths. With undersea piping costing about $1,000 per meter, savings from our use of shorter piping paths can be significant.’ By mapping obstacles, including rigid and flexible pipes, manifolds, platforms, pipeline terminals and wellheads, Petrobras has been able to prevent accidents, decrease time spent managing pipelines, prevent environmental disasters and increase the security of offshore operations.
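The economics are easy to make concrete. A back-of-the-envelope calculation: only the ~$1,000/m cost figure comes from Petrobras; the route lengths below are illustrative.

```python
def routing_savings(direct_km, detour_km, cost_per_m=1_000.0):
    """Dollar savings from choosing the shorter, obstacle-aware route
    over a detour, at roughly $1,000 per meter of undersea pipe."""
    return (detour_km - direct_km) * 1_000 * cost_per_m

# A hypothetical 3.5 km shortening of a subsea route
saved_usd = routing_savings(direct_km=18.0, detour_km=21.5)
```

At these (invented) lengths, trimming 3.5 km from a route saves $3.5 million, which is why obstacle mapping pays for itself quickly against a $20 million project cost.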

Rowan deploys SARS Intellitrax fleet management

Driller extends asset tracking system to onshore rigs and offshore hurricane response.

Rowan Companies, Inc., a provider of international and domestic contract drilling services, is extending its use of SARS Intellitrax to monitor its fleet of drilling rigs, buoys and helicopters. The Intellitrax tracking service now includes Rowan’s land-based drilling rigs. Rowan drilling operations president David Russell said, ‘With Intellitrax, we can monitor all of our assets in a single view, greatly improving the efficiency and safety of our operations.’


Rowan uses SARS to track offshore oil rigs, helicopters and buoys marking sunken rigs. During the hurricane season, SARS locates rigs that have moved off anchor so that they can be returned to service swiftly. SARS also manages an international network of Automatic Identification System (AIS) receivers to deliver real-time vessel traffic information for key ports and waterways, as well as marine ‘domain awareness’ around offshore rigs. AIS range is currently 25 to 40 miles, but this is to be increased to over 100 miles leveraging technology from ShineMicro.

GeoFields User Group

Pipeline GIS and data management presentations from SoCal Gas, Kern River and ESRI.

The recent GeoFields user group conference provided a snapshot of the state of the art in pipeline software and data management. GeoFields software is used to manage some 200,000 miles of pipe—around a third of the US network. As Shaun Healy (SoCal Gas) stated, PHMSA* reporting is driving the industry in terms of technology requirements. For SoCal, this has meant a shift of focus away from the data model and toward data management and end-user applications, redefining data management to support SoCal’s integrity management program.


Notwithstanding SoCal’s data model defocus, GeoFields supports all ‘open’ industry models including PODS**, PODS Spatial, ESRI’s APDM and GeoFields’ own GFDM (based on PODS). A translator aggregates data across multiple dissimilar models. Data management is provided by the DataFrame and FacilitiesExplorer products, while RiskFrame offers data uncertainty and results management.


Pipeline’s killer application is of course GIS and GeoFields is something of a poster child for ESRI. As John Alsup (ESRI) stated, one challenge for ESRI is that ArcGIS’ rich feature set makes it too powerful for many end users. GeoFields adds value to ESRI’s APDM enterprise spatial data model by ‘tuning’ and enhancing GIS for pipeline, creating pipeline-specific workflows that operators understand.

High Consequence Area

Al Brown provided an in-depth analysis of how GeoFields software addresses PHMSA requirements. A detailed workflow establishes steps to conformance and fulfills reporting requirements. GeoFields has put 180,000 man-hours into HCA development and analysis and has a terabyte database of HCA data. HCA analysis rolls up pipeline product, topography, drainage and soil nature for impact analysis. A spill model provides data from the pipeline side (drain volume, valve closure time, product, flow) and a GIS-based overland/hydrology model adds terrain and mitigation response-time impact buffers. Digital terrain models are used to visualize the overland spread of a spill. In view of the huge pipeline mileage and the complexity of the interactions, semi-automated geoprocessing is required to prioritize efforts to areas of high impact potential.
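Such prioritization can be sketched as a simple ranking. The weighting below is entirely hypothetical—it is not GeoFields’ actual model—but it illustrates the roll-up of product hazard, drain volume and proximity to a high consequence area.

```python
def hca_priority(segments):
    """Rank pipeline segments for HCA analysis by a composite impact
    score (hypothetical weighting): spill-volume proxy x product
    hazard factor x proximity-to-HCA factor."""
    def score(seg):
        drain_m3 = seg["drain_volume_m3"]
        hazard = {"crude": 1.0, "hvl": 1.5, "refined": 1.2}[seg["product"]]
        proximity = 1.0 / max(seg["distance_to_hca_m"], 1.0)
        return drain_m3 * hazard * proximity * 1_000
    return sorted(segments, key=score, reverse=True)

# Invented example segments: B is closer to a populated area than A
ranked = hca_priority([
    {"id": "A", "product": "crude", "drain_volume_m3": 50, "distance_to_hca_m": 2000},
    {"id": "B", "product": "hvl", "drain_volume_m3": 40, "distance_to_hca_m": 200},
])
```

Real HCA geoprocessing replaces each of these scalar factors with a spatial model (terrain, hydrology, buffers), but the triage logic is the same: score, sort, focus effort on the top of the list.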

Kern River

Jeff Dickey showed how Kern River Gas Transmission Co. has perfected the art of database-derived alignment sheet generation, using DataFrame to produce very complete sheets showing land ownership, pipeline class definitions, map and schematic tracks, KEG tables etc. ArcMap creates an attractive look and feel with appropriate symbology. Kern River’s sheets are ‘constantly evolving’ to fulfill field users’ increasingly sophisticated requirements and to embed the latest data maintenance functionality from GeoFields.

* Pipeline and Hazardous Materials Safety Administration—

** Pipeline Open Data Standard—

Qinetiq OptaSense for pipeline security

New multi-mode system detects intrusion, damage, mechanical failure and flow issues.

UK-based Qinetiq has just launched ‘OptaSense,’ a border, pipeline and cable security system. OptaSense detects multiple, simultaneous disturbances over a 40 km length of either existing or newly installed optical fiber with a 10 meter resolution. The multi-mode (fiber-optics, hydrophones and accelerometers) system protects critical infrastructure against accidental or malicious damage—a challenging task in view of the distances involved.
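Distributed fiber sensing of this kind locates a disturbance OTDR-style, from the round-trip time of backscattered light in the fiber. A sketch of the arithmetic, assuming a typical silica fiber refractive index of about 1.468 (an assumption, not a Qinetiq specification):

```python
C_VACUUM = 299_792_458.0   # speed of light in vacuum, m/s
N_FIBER = 1.468            # assumed refractive index of silica fiber

def disturbance_position(round_trip_s):
    """Distance along the fiber to a disturbance, from the round-trip
    time of the backscattered probe pulse: d = (c/n) * t / 2."""
    return (C_VACUUM / N_FIBER) * round_trip_s / 2.0

def zone_count(fiber_length_m=40_000, resolution_m=10):
    """40 km of fiber at 10 m resolution gives 4,000 independent sensing zones."""
    return fiber_length_m // resolution_m

pos_m = disturbance_position(100e-6)   # a 100 microsecond round trip
```

A 100 µs round trip places the event roughly 10.2 km down the fiber, and the quoted 40 km/10 m figures imply some 4,000 simultaneously monitored zones per fiber.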


Partner in the solution is UK-based Sensoptics which has provided the fiber optic technology behind OptaSense. OptaSense combines standard single mode optical fiber with control and analysis software that continuously monitors the entire fiber to detect, locate and categorize multiple, simultaneous disturbances. The high quality acoustic signature can also be monitored through headphones.


Qinetiq MD Mike O’Connor said, ‘OptaSense detects intruders onshore and also monitors mechanical failure in critical underwater infrastructure.’ OptaSense provides coordinates and cues other sensors, such as cameras, to ensure appropriate follow up. The system’s acoustic sensitivity can be used to ‘listen’ to the flow through a pipe, so that blockages and leaks can be located.

Canadian Pipeline industry unveils ‘Pipelines 101’ website

New website provides ‘balanced’ information on midstream operations, safety to public.

The Canadian Energy Pipelines Association (CEPA) has just announced a new website ‘Pipelines 101’. The idea is to provide stakeholders with access to information about industry operations so that they can make ‘informed decisions.’ The site was also set up to address concerns regarding health, social, safety, security and environmental impacts of pipeline construction and operation.


Topics covered include pipelines’ role, types of pipelines and pipeline operators, safety, stakeholder engagement, policy and regulatory affairs and pipeline design, construction and operation. A glossary of pipeline terminology and links to other sources of information about pipelines are also included.


Pipelines 101 aims to offer ‘balanced, factual information on Canada’s pipelines,’ noting in particular that ‘pipelines are the safest and most efficient means of transporting crude oil and natural gas from producing fields to refineries and processing plants and of distributing petroleum products and natural gas to the consumer.’ The website attempts to redress ‘negative perceptions about potential health, social, safety, security and environmental impacts of pipeline construction and operation.’ Test drive Pipelines 101 online. You might also like to visit the eponymous US website, representing the views of the American Petroleum Institute and the Association of Oil Pipe Lines.

Petrel supports GPGPU-based number crunching

New package from Mercury adds NVIDIA TESLA compute engine to Ocean toolkit.

At the Super Computing 2007 show in Nevada last month, Mercury Computer Systems was showing a prototype of graphics processing unit (GPU) based number crunching from inside Schlumberger’s Petrel interpretation flagship. Mercury’s Open Inventor-based 3D visualization engine (a component of Petrel and Schlumberger’s Ocean development platform) has been interfaced to NVIDIA’s Tesla high performance computing (HPC) engine to address the computation and visualization of large 3D seismic data sets.


The resulting ‘Probe’ technology was also on view at the SEG last month, where the HPC toolkit was used in direct reservoir identification and for ‘WYSIWYG’ geobody extraction and editing (OITJ October 07). Ocean embeds Mercury’s VolumeViz LDM high-performance volume visualization technology for large seismic data sets.


The new technology results from a deal between Mercury and NVIDIA to add the GPU-based ‘CUDA’ code set to the Open Inventor toolkit. This effectively opens up general purpose GPU (GPGPU) HPC to all users of the Ocean development kit.


Mercury VP Jean-Bernard Cazeaux said, ‘3D visualization has revolutionized the understanding of seismic data, thanks to the performance provided by the GPU. But now GPUs do more than visualization, they provide amazing computing capabilities for interactive applications.’

Beyond Moore

GPU-based computing is showing promise in HPC because the number-crunching capacity of graphics chips is fast outstripping the Moore’s-law growth of the conventional CPU. Tesla, which comes in several form factors from a PCI Express card to a rack mount, is a high-performance hardware device that adds supercomputer power to a CPU-based workstation. The CUDA C compiler allows optimized development on the GPU using the C programming language.
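What CUDA standardizes is a thread-indexing model that maps one lightweight thread to each data element. The pure-Python emulation below illustrates only that indexing scheme—a real kernel is compiled C running across thousands of GPU threads—and the gain-application ‘kernel’ is a hypothetical example:

```python
def launch(kernel, grid_dim, block_dim, *args):
    """Emulate a 1-D CUDA kernel launch in pure Python: every
    (block, thread) pair invokes the kernel once, mirroring how
    the CUDA runtime schedules threads on the GPU."""
    for block_idx in range(grid_dim):
        for thread_idx in range(block_dim):
            kernel(block_idx, thread_idx, block_dim, *args)

def scale_traces(block_idx, thread_idx, block_dim, samples, gain, out):
    """'Kernel' body: one thread applies a gain to one seismic sample."""
    i = block_idx * block_dim + thread_idx   # CUDA-style global thread index
    if i < len(samples):                     # guard against overrun
        out[i] = samples[i] * gain

samples = [0.1, -0.4, 0.25, 0.9]
out = [0.0] * len(samples)
launch(scale_traces, 2, 2, samples, 2.0, out)   # a grid of 2 blocks x 2 threads
```

On a GPU the two nested loops disappear: every (block, thread) pair runs concurrently, which is where the days-to-hours speedups quoted by NVIDIA come from.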


The CUDA/Tesla combo provides application developers with access to high-performance seismic processing capability with simultaneous visualization of results in 3D. Andy Keane, general manager of the GPU computing business at NVIDIA, added, ‘Some geophysical tasks necessitate supercomputer power. By providing this on a standard workstation, we are enabling a fundamental transition in the way you define your workflow. By reducing some computation times from days to hours, NVIDIA Tesla is a significant disruption to HPC.’

HART bus underpins OrmenLange predictive maintenance

HART Communications and Emerson AMS Suite cited as key enablers of state of the art diagnostics system.

StatoilHydro’s Ormen Lange Norwegian gas field was highlighted as a state-of-the-art deployment of the HART communications standard at the annual PowerGen show in New Orleans this month. HART is a bi-directional protocol for communicating between intelligent field instruments and host systems.


Emerson’s AMS Suite of diagnostic software applications and field devices was cited as the key enabler in StatoilHydro’s predictive diagnostics program on Ormen Lange’s subsea installations, onshore terminal and the world’s longest subsea gas pipeline. The network delivers diagnostics from 350 continuously monitored digital valve controllers, elements of Emerson’s PlantWeb digital plant architecture.


Craig Llewellyn, president of Emerson’s Asset Optimization division, said, ‘StatoilHydro worked creatively with Emerson to execute its vision for predictive maintenance in an open architecture, maximizing asset availability and performance.’ StatoilHydro received the 2007 HART ‘Plant of the Year’ award for Ormen Lange. HART Communication Foundation members include Emerson, GE, Invensys, Honeywell, Siemens, Toshiba and Yokogawa. 20 million HART-enabled devices are installed worldwide.

‘Kanak’ enhances Schlumberger gas operations

WITSML-based data aggregator from Moblize now component of Production Watcher solution.

Moblize has just released ‘Kanak 3.0,’ a ‘100% WITSML-compliant’ dashboard for wellsite operations. Kanak aggregates data from multiple sources including SCADA, drilling and completion applications and field devices. Data is consolidated into WITSML or PRODML for storage in corporate systems or the Moblize Enterprise Gateway Server (MEGS). Kanak supports tag selection, data extraction, compression and validation.


Schlumberger was quick off the mark to sign up with Moblize to leverage Kanak to improve operations of onshore gas fields. MEGS data will flow to Schlumberger’s Production Watcher solution, a liquid loading workflow and gas well production management system. Schlumberger VP Alex Cisneros said, ‘We see value in this new generation of production processes, with decision-ready data used in new workflows, often by users that had no access to this information before. Realizing this value requires working closely with clients on their projects, advanced ‘last mile’ IT capabilities, and geo-technical strength with a broad, technology-led service portfolio.’

Yokogawa augments VigilantPlant with GE monitoring package

Plant Resource Manager—a field of the future enabler—now adds rotating equipment diagnostics.

A new release of Yokogawa’s Plant Resource Manager (PRM) offers integration with GE Energy’s System 1 platform. System 1 is the software front end to GE’s Bently Nevada series of condition monitoring hardware, adding real-time diagnosis of assets such as turbines and rotating machinery. PRM is a component of Yokogawa’s VigilantPlant/Asset Excellence initiative covering maintenance, condition-based monitoring and historical data.


The System 1 interface adds a supervisory window to PRM for machinery and field instrumentation such as transmitters and flowmeters. PRM integrates alarms and event messages generated by System 1 and provides additional information such as root cause and countermeasures advisories. The interface is said to widen PRM’s asset management capabilities, helping operators make decisions by providing ‘easy access to rich information on the health of their critical assets.’


Another upgraded VigilantPlant component is the FieldMate Versatile Device Management application that is used to set, adjust, and troubleshoot parameters of field devices using either the HART or Foundation Fieldbus protocols. VigilantPlant was recognized as a key component of BP’s Field of the Future initiative while GE’s System 1 saw a major deployment at Syncrude Canada’s Mildred Lake operation.

WellPoint extends Axapta ERP with IDEACA

Companies cooperate on sales and roll-out of Microsoft Dynamics enterprise resource planning.

Calgary-based WellPoint Systems has signed a partnership agreement with IDEACA Knowledge Services (also of Calgary) to extend the reach of WellPoint’s Microsoft Dynamics-based ERP system (OITJ January 2005). The deal covers sales and implementation of WellPoint’s modular platform to the Canadian oil, gas and mining industries.


WellPoint director Ian Harrill said, ‘Our common focus on the energy industry is allowing us to collaborate on something new—a platform upon which companies will create industry-centric business processes that improve operational efficiency, promote collaboration and information sharing.’ IDEACA will help with WellPoint’s sales and implementation.


WellPoint also announced that it has issued 1,382,981 common shares to Blackmont Capital Inc. in consideration of a $525,000 fee relating to the financing of the Bolo acquisition (OITJ September 07).

Schlumberger wins exclusive MetaCarta distribution to E&P

Deal augments information management offering with lexicon-driven map search.

Schlumberger has obtained exclusive distribution rights to the oil and gas sector for MetaCarta’s map-based geographic information search technology. MetaCarta provides an industry-specific geographical lexicon, map-based search, temporal filtering and data visualization of both structured and unstructured content.

Le Peuch

Schlumberger Information Solutions president Olivier Le Peuch said, ‘MetaCarta is the perfect complement to our IM offering. With the amount of information held in unstructured form, such as documents, presentations and web content, MetaCarta’s geographically-specific access to unstructured content brings new power to petrotechnical professionals. In combination with our geoscience and engineering information management solutions, now petrotechnical professionals will be able to rapidly incorporate all available information that is relevant to their prospect or field.’


MetaCarta president and CEO Ron Matros added, ‘Schlumberger’s global sales and support will accelerate the expansion of MetaCarta solutions into oil and gas.’ Schlumberger will acquire all existing contracts for MetaCarta in the oil and gas sector, resulting in a single source for sales and support of this technology in the industry.

MapInfo acquires Encom

Following its own acquisition by Pitney Bowes, deal adds geoprocessing and data management.

Pitney Bowes unit MapInfo has acquired Australia-based mining and petroleum data and mapping specialist Encom. Encom’s GP Info Division provides desktop data visualization and data manipulation packages tailored to the Australasian petroleum industry. The GP Info database of petroleum exploration information includes up-to-date permit, well and company data for Australia, New Zealand and Papua New Guinea.


MapInfo international VP John O’Hara said, ‘This acquisition expands our operations in Australasia and brings new geoscience and location intelligence decision support technology and services to our global customers.’ The deal adds multi-dimensional analysis, surface generation, 3D visualization, geophysical modeling and data integration to MapInfo. Encom’s Compass Enterprise tools offer advanced metadata cataloguing, data management and visual spatial data discovery.


Encom MD Dave Pratt added, ‘This will enable us to accelerate product growth and enhance support across our complete range of products and services.’ Encom has 50 employees and over 1,000 corporate clients. MapInfo was acquired by Pitney Bowes earlier this year in a $408 million cash transaction.

Landmark gets exclusive rights to ISATIS geostatistics

Geovariances’ high end geostatistical package to underpin ‘next generation’ earth modeler.

Geovariances has signed an exclusive technology partnership agreement with Halliburton’s Landmark software arm. The five year deal will allow Landmark to introduce Geovariances’ ISATIS geostatistical earth-modeling technology to the E&P industry.

Next Generation

Landmark is to incorporate ISATIS’ geostatistical algorithms in a range of workflows supporting both ‘classic’ methodologies and innovations. The relationship will play a major role in Landmark’s ‘next generation’ earth modeling offering.


Geovariances has seen a rapid recent rise in the use of geostatistics in the oil industry due to its ‘significant contribution’ to data QC, spatial data analysis and risk assessment. Geovariances has its origins in Georges Matheron’s pioneering work at the Paris School of Mines and maintains close ties with the school’s Geostatistics Center located in Fontainebleau, south of Paris.
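At the heart of Matheron’s geostatistics is the experimental semivariogram, which quantifies how data values decorrelate with distance. A minimal implementation of the classical estimator, applied here to hypothetical well data:

```python
from itertools import combinations

def semivariogram(points, lag, tol):
    """Matheron's classical estimator: gamma(h) = (1/2N) * sum of squared
    value differences over the N sample pairs whose separation falls
    within lag +/- tol. points is a list of (x, y, value) tuples."""
    sq_diffs = []
    for (x1, y1, v1), (x2, y2, v2) in combinations(points, 2):
        h = ((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5
        if abs(h - lag) <= tol:
            sq_diffs.append((v1 - v2) ** 2)
    return sum(sq_diffs) / (2 * len(sq_diffs)) if sq_diffs else None

# Hypothetical wells: (x_m, y_m, porosity_percent)
wells = [(0, 0, 10.0), (100, 0, 12.0), (200, 0, 15.0), (0, 100, 11.0)]
gamma_100 = semivariogram(wells, lag=100, tol=10)
```

Fitting a model variogram to such experimental points is what drives kriging and conditional simulation, the workhorses of packages like ISATIS.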

Third party

ISATIS provides interfaces to third party software including Schlumberger’s Petrel (OITJ October 2007) and Beicip-Franlab’s RML.

CO2 consortia move from hot air generation to sequestration

US and Norway lead the way in tonnage sequestered—UK initiates ‘evaluation’ project.

Activity in CO2 sequestration is accelerating with new ventures on both sides of the Atlantic. The US Department of Energy’s (DOE) Regional Carbon Sequestration Partnership Program has just awarded $66.7 million to the Midwest Geological Sequestration Consortium (MGSC) for the fourth large-scale carbon sequestration project, located in the Illinois Basin. The partnership, led by the Illinois State Geological Survey, will demonstrate CO2 storage in the Mount Simon Sandstone Formation, which has the potential to store more than 100 years of carbon dioxide emissions from major sources in the region. One million tons of CO2 from Archer Daniels Midland’s ethanol plant in Decatur will be injected.


A CO2 seminar was held last month in Norway to assess sequestration initiatives including StatoilHydro’s 11 year long Sleipner test, a new facility at the Mongstad power plant, Gaz de France’s K12-B North Sea project and Total’s experiment at Lacq, South West France where 150,000 tonnes of CO2 will be pumped over a two year period.

UK study

In the UK, the CO2 Aquifer Storage Site Evaluation and Monitoring (CASSEM) project kicks-off in 2008. The two year project is funded by the Department for Business, Enterprise and Regulatory Reform. Partners include AMEC, Schlumberger, Marathon and BGS.

© 1996-2024 The Data Room SARL. All rights reserved.