January 2007

HP bags Knightsbridge

HP’s high-profile creation of a new ‘Business Information Optimization’ unit leverages its recent Knightsbridge Solutions acquisition and extends both companies’ oil and gas consulting footprints.

HP has just announced the formation of a ‘Business Information Optimization’ (BIO) unit within its software organization. The BIO will comprise two groups, business intelligence and information management.


The business intelligence group will leverage HP's acquisition of Knightsbridge Solutions, also announced this month. Chicago-headquartered Knightsbridge specializes in business intelligence, data warehousing, data integration and information quality. HP's IM group is to deliver 'solutions that archive and manage customers' corporate data.'


Knightsbridge formalized its energy practice last year (Oil ITJ Vol. 11 N° 2) under John Roddy’s leadership. Just before the HP acquisition, the company segmented the practice with an upstream unit headed up by Shannon Tassin. The unit is to build on customer engagements with BP, Devon and others.


HP has recruited Ben Barnes as VP and general manager of the business intelligence group. Barnes was previously CEO of ActivIdentity, a provider of digital identity solutions, and Intraspect Software, a collaboration and content management software developer. Barnes’ early career included managerial positions with IBM and Teradata.


HP head of software Thomas Hogan said, ‘The new unit is an important step in our commitment to establish HP as the trusted advisor to CIOs engaged in the value of their information. This suite of solutions will help businesses maximize the value and impact of their IT investments spanning systems, applications, networks and information.’

Master data management

Knightsbridge has previously been active in the field of oil and gas master data management in association with Shell spin-off Kalido (Oil ITJ Vol. 10 N° 12).


More recently, Tassin and co-author Robert Friedrich presented Knightsbridge’s approach to oil and gas data management in a presentation at last year’s Petroleum Network Education Conference (PNEC) data management meeting in Houston.


In the PNEC presentation, Knightsbridge described an automated data quality certification process that applied data standards through a process of ‘verify, remediate, transform, certify and publish’.
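The certification workflow described above can be illustrated as a simple pipeline. This is a minimal sketch with hypothetical record fields, rules and fixes, not Knightsbridge's actual process:

```python
# Sketch of a 'verify, remediate, transform, certify, publish' data
# quality pipeline. Record fields, rules and fixes are hypothetical,
# not Knightsbridge's actual process.

def remediate(record):
    """Apply simple automated fixes, e.g. normalize the well name."""
    record = dict(record)
    record["well_name"] = record["well_name"].strip().upper()
    return record

def transform(record):
    """Convert to the corporate standard (here, feet to meters)."""
    record = dict(record)
    record["depth_m"] = record.pop("depth_ft") * 0.3048
    return record

def verify(record, rules):
    """Return the names of the rules the record fails."""
    return [name for name, check in rules.items() if not check(record)]

def certify_and_publish(records, rules, store):
    """Only records passing every rule are certified and published."""
    for rec in records:
        rec = transform(remediate(rec))
        if not verify(rec, rules):
            rec["certified"] = True
            store.append(rec)   # 'publish' to the certified data store
    return store

RULES = {
    "has_name": lambda r: bool(r["well_name"]),
    "depth_positive": lambda r: r["depth_m"] > 0,
}
raw = [{"well_name": "  bp-101 ", "depth_ft": 9500.0},
       {"well_name": "", "depth_ft": 8200.0}]   # fails 'has_name'
certified = certify_and_publish(raw, RULES, [])
```

Records that fail verification are held back for further remediation rather than published, which is what makes the store 'certified.'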


Knightsbridge further advocates extensive process metrics and the development of ‘actionable service level agreements and certified data stores.’ Tassin is a member of the PPDM user group steering committee.

IHS buys tops data

Picked geological horizon tops for the onshore US, from the Geological Data Services acquisition, are to be bundled with the Petra interpretation package.

IHS has acquired Geological Data Services (GDS) of Addison, Texas, a provider of formation tops data for the US Permian Basin, mid-continent and Rocky Mountain regions. The GDS dataset provides correlated tops for some 640,000 wells, picked using a ‘consistent and rigorous’ methodology.


IHS VP Mark Rose said, ‘This acquisition, along with others, will increase the value we deliver to our customers by deepening our US content offering and enhancing our customers’ decision-support tools.’ The GDS tops will be integrated with the IHS E&P data and software solutions, notably the PETRA geological interpretation package acquired last year.


The acquisition also brings technical expertise to IHS, as the geologists who developed the GDS dataset will continue to enrich and maintain these assets, 'ensuring consistency and continuity going forward.'

Comment—upstream software vendors have generally eschewed the provision of data along with their database offerings, notwithstanding the obvious usefulness of such a combination. IHS' Petra acquisition, along with the provision of a populated database, must make for a compelling solution for many.

How do you evaluate R&D quality?

Oil IT Journal editor Neil McNaughton asks, ‘just how do we evaluate the quality of research?’ He finds he is in the good company of the RAND Corporation and the Society of Petroleum Engineers. A Geological Society event on digital geology leads him to revisit a ‘letter to the editor’ ...

As companies and governments around the world 'downsize' and look everywhere to cut costs, one good subject for a big cut is usually R&D. In terms of government-funded R&D, the debate is often limited to a battle of statistics, with the share of GDP devoted to research a popular yardstick. This approach fits neatly into conventional party lines—with the left (at least in Europe) generally arguing that more public money should be devoted to R&D and the right arguing that the market should do more. This is a very good argument to have, as some R&D, like particle physics, is unlikely to progress far if left to market forces.

Oil & gas R&D

Other fields, like oil and gas R&D, would appear to be a more natural fit with industry funding. But even in the US, as our article on page 11 shows, there are considerable public funds available for upstream R&D. The newly-established Research Partnership to Secure Energy for America is to receive some $375 million of US taxpayers’ largesse for distribution to oil and gas research establishments around the country.


The public-private funding debate fits neatly with the left-right paradigm of politics and gets a lot of attention. In fact I myself gave this issue some column-inches in my March 2003 editorial. But there is, I think, a much more interesting question, one that is sometimes drowned out by the above debate: how do you evaluate the quality of research, irrespective of how it is funded?

Peer review

Many university research departments get funding on the basis of their publication record. Individual researchers' careers, likewise, advance or stagnate according to the number of papers they publish in certain scientific journals. Now I'm not talking about getting something printed in Oil IT Journal here, nor indeed in the Journal of Petroleum Technology. To gain R&D street cred, you have to get published in a heavyweight like Nature, or the Journal of the American Medical Association. To get into these illustrious publications, a paper has to undergo peer review—usually by anonymous reviewers. From the excellent Wikipedia entry on peer review, we learn that Nature only publishes 5% of the papers it receives. In other, more specialized journals, the acceptance rate may be much higher. What of oil and gas-related publications? The Journal of Petroleum Technology has the good grace to indicate when a paper has not been peer reviewed—as is usually the case. Other more heavyweight publications from the SPE and SEG are presumably peer reviewed. But I imagine that finding a reviewer for some of the specialized papers in oil and gas R&D must be problematic.


The topicality of this subject is demonstrated by a new report* from the RAND Corporation, 'Measuring the Benefits from Research,' which addresses these issues and also enumerates other ways of evaluating R&D including economic rate of return and benchmarking. The SPE is to hold its first ever R&D Symposium** in San Antonio this April to 'discuss the big challenges facing the energy industry and the R&D requirements to meet them.'

Digital outcrops

All of which makes for a rather awkward segue into my next topic—the excellent meeting held under the auspices of the Geological Society at Manchester University this month on digital outcrop geology or, to give it its official title, 'From Outcrop to Asset.' We'll be reporting from this event in next month's Journal, but the theme of the event was LIDAR and 3D photogrammetric mapping of geological outcrops, and the subsequent manipulation of these interesting datasets in the computer or visualization center.


This is an exciting field and I think it’s fair to say that it is revolutionizing field geology. In the old days, geologists were limited to what could be chipped off with a hammer and chisel, mapped with binoculars, or captured on a plane table. Today, LIDAR/GPS along a sea cliff, mountain or quarry face generates a few gigabytes of 3D data that can be manipulated just like a 3D geo-model. Moreover, the dataset can be augmented by conventional measurements like sedimentological descriptions, shallow cores, ground penetrating radar and so on.

New data types

From the IT standpoint, this work introduces new data types to manage. The LIDAR generates 'point cloud' data in formats that are more or less manufacturer-specific. End users may be more interested in visualization in VRML or in geomodelers like GoCad. But there is a bigger issue here and that is the capture and publication of a geological outcrop 'model.' One presenter at the Outcrop to Asset meeting described scanning and measuring sections from an article in the Journal of the Geological Society and incorporating the results into the geomodel. This is a familiar pattern to those working in the upstream. Digital data, even if it's just a spreadsheet, is published on paper, then laboriously scanned or re-keyed into digital form, prior to further publication ... on paper! There has to be a better way!
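For illustration, a point cloud boils down to a large array of x, y, z coordinates. A minimal sketch, assuming a whitespace-delimited ASCII XYZ export (real scanner formats are, as noted, largely manufacturer-specific):

```python
# Minimal sketch: a LIDAR point cloud as an (N, 3) array of x, y, z
# coordinates. A whitespace-delimited ASCII XYZ export is assumed;
# real scanner formats are largely manufacturer-specific.
import numpy as np

def load_xyz(lines):
    """Parse 'x y z' text lines into an (N, 3) float array."""
    return np.array([[float(v) for v in ln.split()[:3]] for ln in lines])

def decimate(points, keep_every=10):
    """Thin the cloud for interactive visualization."""
    return points[::keep_every]

def bounding_box(points):
    """Spatial extent of the scan (e.g. of an outcrop face)."""
    return points.min(axis=0), points.max(axis=0)

pts = load_xyz(["0 0 0", "1 2 3", "4 5 6"])
lo, hi = bounding_box(pts)
```

Once in array form, the scan can be decimated for display or passed to a geomodeler alongside conventional measurements.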

Letter to Editor

In a letter*** to the editor of Geoscience last year I suggested how online geo-publishing could improve on tradition. A Wikipedia-like 'publish and subscribe' model, with a more transparent peer review, might leverage an as yet undefined GeolRSS format. This would greatly improve on the current paper/pdf publication paradigm. Ironically, my letter was accepted for online publication only, somewhat limiting its impact! But after the Manchester event, I am convinced that the web has much more to offer geological publication—by publishing and sharing the digital data sets of the original observations. We'll get there eventually!

* www.rand.org/pubs/research_briefs/RB9202/

** www.spe.org/meetings/

*** www.geolsoc.org.uk/letters/

ADMC—Cutler explains all

New technology from Cutler Technology Corp. has demonstrated a 3% improvement in process throughput in one Chevron refinery. Cutler’s trick is to model a larger ‘chunk’ of the process than was possible with industry-standard PID controllers.

Process control guru Charles Cutler's new Adaptive Dynamic Matrix Controller (ADMC) has just been licensed by Invensys Process Systems. Process optimization is a holy grail—both for refiners and advocates of the digital oilfield of the future. Cutler offered Oil IT Journal these insights into the new controller which has already proved its worth in a Chevron refinery—where a significant 3% improvement in process efficiency was observed. We kicked off the exchange by asking Cutler for a simple example of how the ADMC works.

Distillation column

The maximum feed that a distillation column can handle is reached when the feed control valve is wide open. But this will mean that the top condensing system cooling water flow, which controls the column pressure, is also at 100%. This in turn requires that column pressure control be achieved by adjusting the steam flow to the reboiler or feed preheater. Today's PID* controller models are in general not valid for wide-open valves, and conventional multivariable controllers (MVCs) cannot achieve this additional capacity since it would invalidate their model. The ADMC optimizer moves process control up the hierarchy—so that a larger 'chunk' of the process is managed and optimized. The ADMC's dynamic model of the whole column can calculate the feasibility of opening the feed and cooling water flows to 100%, while maintaining the products on specification.
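For readers unfamiliar with matrix controllers, the prediction idea can be illustrated with a toy step-response model: future outputs are predicted by superposing the unit step-response on each planned input move. This is a generic dynamic matrix control sketch with made-up coefficients, not Cutler's ADMC implementation:

```python
# Toy step-response prediction, the core idea behind dynamic matrix
# control. Coefficients are made up; not Cutler's ADMC implementation.
import numpy as np

def predict(step_response, input_moves, y0=0.0):
    """Predict the output trajectory by superposing the unit
    step-response on each planned input move."""
    n = len(step_response)
    y = np.full(n, y0)
    for j, du in enumerate(input_moves):
        for i in range(j, n):
            # A move at interval j affects the output from interval j on
            y[i] += step_response[i - j] * du
    return y

# First-order-like response settling at 1.0 after a few intervals
a = np.array([0.3, 0.6, 0.8, 0.9, 1.0, 1.0, 1.0, 1.0])
# Open a (notional) valve two units, then close it one unit later on
y = predict(a, [2.0, 0.0, -1.0])
# Steady state settles at 2.0 - 1.0 = 1.0
```

An optimizer built on such a model can test candidate move sequences (e.g. feed and cooling water to 100%) against predicted outputs before committing them to the plant.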

Open loop

The ADMC does all this by using an 'open loop' model of the process, eliminating the PID controllers from the control hierarchy. Removing the PID controllers solves many problems, such as those associated with sticking valves—increasingly an issue in modern plants as operators over-tighten valve mounts to comply with emission control regulations. Removing the PID controllers from the control hierarchy also eliminates the model errors due to tuning and configuration changes.


The real dynamic components of a process are determined by the capacitances and transport lags which are set by the size of vessels, liquid levels, catalyst inventories, and the length and size of piping.


Experience shows that 20 to 30 percent of the degrees of freedom of an optimized controller are used at valve constraints. MVCs need to maintain some flexibility for the PID controller to move its output, or the valve will saturate. Multivariable controller valve position constraints are therefore set to keep the PID controller functioning—usually at two to three percent of the effective span of the PID controller. In other words, the traditional MVC falls short of the economic optimum by at least two to three percent of the process's potential value.


Invensys has signed a marketing agreement with Cutler Technology for the ADMC. At a press briefing at the Invensys Process Systems user group last month in Dallas (see pages 6-7), the ADMC was described as 'the first major enhancement in process control for 30 years.'


In a separate deal, Houston-based Mustang Engineering signed a deal last year with Cutler Technology to be the ‘preferred independent service provider’ for the ADMC.

* Proportional-integral-derivative controller—see http://en.wikipedia.org/wiki/PID_controller.

Detect compressor faults before they happen

Matrikon’s Lauren Neal uses statistical techniques to identify problems with offshore compressors before they happen.

The right monitoring and preventative maintenance strategies can mitigate compressor failure and shutdown. In this study, a statistical analysis was performed on data recorded from compressors before offshore installation. The main statistical tool, principal component analysis (PCA), was used to identify patterns and to distinguish different operating modes in the recorded data. PCA techniques are available in statistical packages like StatSoft*. A related technique, partial least squares (PLS) analysis, was used to predict outputs from measured input variables. PLS can be used to create 'soft sensors' or 'virtual gauges,' which are also used to monitor abnormal behaviors such as unexpected changes in bearing temperature. Matrikon's models were built using data like suction and discharge pressures and temperatures, and other variables such as bearing temperatures and gas composition.
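To illustrate the PCA part of the approach, the sketch below fits principal components to 'healthy' data and scores new samples by their squared prediction error (SPE), i.e. their distance from the learned subspace. The data and thresholds are synthetic assumptions, not Matrikon's models:

```python
# Sketch of PCA-based condition monitoring: learn the correlation
# structure of healthy data, then score new samples by their squared
# prediction error (SPE). Synthetic data only; not Matrikon's model.
import numpy as np

def fit_pca(X, n_components):
    """Return the column means and top principal directions (via SVD)."""
    mean = X.mean(axis=0)
    _, _, vt = np.linalg.svd(X - mean, full_matrices=False)
    return mean, vt[:n_components]

def spe(x, mean, components):
    """Squared prediction error of one sample against the PCA model."""
    d = x - mean
    projection = components.T @ (components @ d)
    return float(((d - projection) ** 2).sum())

# 'Healthy' data: two sensors that move together (e.g. suction and
# discharge temperature), plus a little noise
rng = np.random.default_rng(0)
t = rng.normal(size=(200, 1))
healthy = np.hstack([t, 2.0 * t]) + 0.01 * rng.normal(size=(200, 2))

mean, comps = fit_pca(healthy, n_components=1)
normal_spe = spe(np.array([1.0, 2.0]), mean, comps)   # follows the pattern
fault_spe = spe(np.array([1.0, -2.0]), mean, comps)   # breaks the pattern
```

A sample that breaks the learned correlation scores orders of magnitude higher than a healthy one—the trigger for an alarm or closer inspection.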

Equipment modes

After cleansing, data was partitioned into five equipment modes representing shutdown, idling, start up, full load and normal operation. This allowed other recorded data sets to be automatically classified according to the operating mode. It then became relatively easy to identify abnormal operating modes such as that encountered when there was a problem with a bearing.
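A minimal sketch of the automatic classification step—assigning each new sample to the nearest mode centroid—with hypothetical centroids and measurements for the five modes:

```python
# Sketch of automatic operating-mode classification: assign each sample
# to the nearest mode centroid. The five modes follow the article;
# the centroids and the two measurements are hypothetical.
import numpy as np

MODES = ["shutdown", "idling", "start up", "full load", "normal operation"]

# Rows: one centroid per mode; columns: (scaled suction pressure, speed)
CENTROIDS = np.array([[0.0, 0.0],   # shutdown
                      [0.2, 0.3],   # idling
                      [0.5, 0.6],   # start up
                      [1.0, 1.0],   # full load
                      [0.8, 0.9]])  # normal operation

def classify(sample):
    """Return the mode whose centroid is closest to the sample."""
    distances = np.linalg.norm(CENTROIDS - sample, axis=1)
    return MODES[int(distances.argmin())]
```

Samples that sit far from every centroid would flag an abnormal operating mode, such as the bearing problem mentioned above.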

Compression risk

Offshore data is often compressed before transmission to shore, but this should be avoided: models must be able to detect small changes in the data, and these are exactly what compression obscures. Most current techniques rely on thermodynamic equipment models. Data-driven statistical models are more robust and easier to retrain. In reality, a combination of statistical and thermodynamic models would likely make for a truly effective approach to condition monitoring.
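The compression risk is easy to demonstrate. The toy deadband compressor below (a common historian archiving technique; numbers are illustrative) silently discards a slow bearing-temperature drift that a statistical model would need to see:

```python
# Toy deadband compression: a sample is archived only when it differs
# from the last archived value by more than the tolerance. Numbers
# are illustrative.
def deadband_compress(samples, tolerance):
    kept = [samples[0]]
    for s in samples[1:]:
        if abs(s - kept[-1]) > tolerance:
            kept.append(s)
    return kept

# A bearing temperature creeping up by 0.05 degrees per sample
signal = [80.0 + 0.05 * i for i in range(10)]
compressed = deadband_compress(signal, tolerance=0.5)
# The entire 0.45-degree drift falls inside the deadband and is lost
```

The archive faithfully reproduces the big swings but erases the incipient trend—which is precisely the early-warning signal condition monitoring depends on.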

* See www.statsoft.com/textbook/stpls.html.

Shell serves Flare’s web E&P Catalog

Flare Solutions has partnered with IBM on global deployment of its E&P Catalog.

Flare’s E&P Catalog, originally developed for Shell, provides web-based publishing and retrieval leveraging a ‘knowledge map’ of E&P terminology. During publication, information can be manually tagged by users. Existing documents can be automatically classified by intelligent agents, adding value to legacy information.


As a metadata store, the E&P Catalog complements and adds value to existing systems, rather than replacing them. Both documents (hardcopy and electronic) and ‘chunks’ of information are indexed in the catalog—allowing for a true ‘single search’ of data in databases, archives, external content, maps and vendor data. The Catalog’s target audience is new business, exploration, well and development engineering staff.


The system is deployed using an application service provision (ASP) model, hosted either externally or on the intranet. In one company over 3,500 staff use the system from multiple geographical locations. The E&P Catalog is web based, with installation on the server; end-users access the search and publish screens from their web browser. For Gaz de France, the E&P Catalog is hosted by Landmark in an ASP model as part of an overall IM service, allowing staff to search disparate information and order items. In Shell, the E&P Catalog is hosted from Amsterdam.


IBM Global Business Services has partnered with Flare on a global deployment in an oil company where the catalog is used to track electronic deliverables from the top 70 projects.


A selected subset of the taxonomy used by the E&P Catalog will be released later this year to Energistics (POSC) so other companies can benefit. The adoption of standard ‘Product Types’ and context will enable more self-classifying cataloging, particularly from operations and external vendors.

Slice and dice

Dynamic folder structures let users 'slice and dice' information and view graphical representations of search results against the knowledge map for new information discovery. A module captures key workflows for future re-use and also provides project milestone reporting.


The E&P Catalog uses the web services description language (WSDL) to provide standard content, integrating the Catalog with real time external information from sources such as Deloitte's Petroleum Services Group. The Catalog has also been integrated with a number of different IT systems including Schlumberger's DecisionPoint, Landmark's TeamWorkSpace, ESRI's ArcGIS and MetaCarta. Other integrations include the OpenText Livelink document management system, Microsoft Desktop, Enigma PARS and PetrisWINDS Enterprise for RECALL, among other systems.

Computer Society

In December last year Flare won the 2006 British Computer Society European Knowledge Management Project Award for its deployment of the E&P Catalog at Shell.

Consortium develops PSDM for OpendTect

Geokinetics and de Groot-Bril team on pre-stack depth migration plug-in. SSIS phase II announced.

Houston-based Geokinetics and de Groot-Bril (dGB) have launched a consortium to develop a commercial system for velocity model building and pre-stack depth migration (PSDM) from within dGB’s OpendTect environment. The system will provide seismic processors and interpreters with a ‘state of the art’, interactive 3D visualization system. Phase one will last one year and will deliver a pre-stack enabled system, with commercial plug-ins for model building, Kirchhoff PSDM, and tomographic inversion.


Current sponsors include OMV and Gaz de France. The consortium will be open to new sponsors through the first quarter of 2007. dGB is also inviting companies to join Phase II of its Sequence Stratigraphic Interpretation System (SSIS) project, which will extend and improve the OpendTect interpretation system. OpendTect SSIS Phase I was also a sponsored project, resulting in the release of a commercial plug-in to OpendTect last August.

Eurotech supplies IT services to Newfield

Specialist IT staff from UK-based consultancy provide ‘on site call-out’ service for holiday period.

UK-based IT services group Eurotech has provided support over holidays and peak periods of activity to Newfield Petroleum UK. Eurotech provided personnel with oil and gas experience rather than ‘generic’ IT staff in a time-allocation contract providing a ‘flexible and cost effective service.’


Newfield IT coordinator Richard Inwards said, 'Eurotech's support is professional and knowledgeable with excellent documentation of activities performed. We will certainly use Eurotech in the future to cover additional IT requirements.' Newfield's activity is ramping up and demand has increased for IT and data management services relating to applications, databases and back office support systems. Eurotech's time allocation contract provides Newfield with an 'onsite call-out' support service to cover unforeseen absence of key personnel and to undertake project work as required.

Software, hardware short takes

News from Fluent, Geomodeling, Peakstream, DeLorme, Blue Marble, Lasser, PODS, Hyperion ...

A white paper from computational fluid dynamics specialist ANSYS-Fluent shows how computer simulation was used to perform risk analysis of a proposed LNG terminal in California. The study investigates the effect of a breach in an LNG tanker’s wall. The spectacular resulting fire can be seen on www.fluent.com/elearning/resources/whitepapers/ensight-fire.avi.

Geomodeling Technology has released VisualVoxAt 6.0, adding synthetic seismic generation to the seismic calibration and interpretation tool. Another new feature, the 'S-transform,' enhances thin-bed detection.

PeakStream is shipping its development environment targeting high performance processors such as multi-core CPUs, graphics processor units (GPUs) and Cell processors (Oil ITJ Vol. 11 N° 10).

RAE Systems has been awarded a $2 million contract by PetroChina for the deployment of a 1,600 point combustible and toxic gas detection system at the Dushanzi petrochemical complex.

GPS specialist DeLorme has announced XMap 5.0, an ‘affordable GIS.’ XMap integrates geospatial data with DeLorme’s core GPS mapping framework, adding support for a variety of raster and vector GIS file formats.

ER Mapper has licensed Blue Marble’s GeoCalc 6.3 C++ software developer toolkit to extend coordinate system support in its mapping package.

Generalitat Valenciana has announced release 1.0 of gvSIG, its open source GIS. gvSIG is presented as an integrated GIS and spatial data infrastructure that offers web services mapping and geo-processing.

Lasser Inc. has announced V3.0 of its Lasser Production Data (LPD) package. The major release adds gas disposition report and export, and extensive integration with Microsoft MapPoint for well spot maps and production bubble maps.

The Pipeline Open Data Standards (PODS) association has kicked off a 'One Call' working group to address damage prevention data requirements. PODS has also finalized its draft corrosion data standard and is working with the National Association of Corrosion Engineers (NACE) on ratification. PODS is also working on spatialization of its data model, leveraging Oracle's spatial technology.

The Open Geospatial Consortium (OGC) has teamed with the Biodiversity Information Standards Authority on information standards.

Spotfire has announced a new version of its enterprise analytics platform, Spotfire DXP. V 1.1 introduces a new data relationship finder, a metrics summary table and the ability to save and bookmark workflows.

Tecplot has announced a new version of ‘Edge,’ its development environment. Tecplot Edge 3.0 is a customizable version of Tecplot 360 for adding XY, 2D and 3D plotting capabilities into solvers, simulators and data analysis packages.

Ultera Systems has released its Mirage Data Recorder (MDR), targeting seismic and real-time data acquisition. MDR uses twin redundant RAID 1 disks rather than tape. The device has been tested in the field by a major oilfield service company.

Stop Press !!!

Hyperion has acquired Crystal Ball developer Decisioneering Inc.

GITA Geospatial Technology Report 2006

Annual member survey shows OpenGIS progress, rising COTS data use and SCADA as driving GIS deployment.

The Geospatial Information & Technology Association (GITA) has just released the 2006 edition of its Geospatial Technology Report, a comprehensive survey of its membership. Some 386 companies took part in the survey. While GITA's membership is predominantly utilities and public sector, 18 pipeline companies and 23 gas companies participated in the study, of which 8 could be considered integrated oil and gas.

OpenGIS advance

Companies report deployment of heterogeneous GIS platforms, facilitated by the ‘advancement of Open GIS data standards from OGIS, MultiSpeak and PODS.’ The report concludes that ‘the ability to exchange data across platforms is enabling users to buy the application that is right for the job, even if it means running more than a single GIS platform.’


The primary business driver for implementing GIS at utilities is automation. SCADA has shot up to take the number one slot in GITA’s ‘Top 10 Applications’ chart, from number 6 in 2005, showing the increasing integration of SCADA with GIS projects in the pipeline industry. Another ‘Top 10’ rank of GIS Technologies showed a significant rise in Open GIS (ISAT/PODS) from fourth place last year to this year’s number one slot.


The 2006 pipeline data shows a significant increase in 'full-use seats' of ESRI (from 17% to 53%) and DeLorme software, and an increase in ESRI and GE 'view-only' seats. Most pipeline companies keep their core GIS costs in the $0-150,000 range, although one respondent cited an investment in the $1.5-2 million range! Likewise 67% gave GIS application costs of $0-150,000, with a couple of 'outliers' in the $1-1.5 million range.

Rise of COTS data

The report notes the rise of data from a ‘wide variety of providers marketing commercial off-the-shelf (COTS) map data.’ Companies increasingly use satellite remote sensing as a base, adding value to imagery by combining it with other available sources of topographic, planimetric, and cadastral data. One analysis shows the high level of education in today’s GIS professionals with 55% having a four year college degree.

Google Earth?

The detailed 140-page report contains a lot more on implementation strategies, land base and facilities management, data accuracy, conversion costs and more, and is great value at $449 (less for GITA members). We did spot one omission though: where was Google Earth, the 'big thing' in GIS in 2006? But overall one has to commend the not-for-profit GITA on doing a great job collating all this information from its membership. Makes you wonder why the upstream orgs haven't done likewise for GIS in oil and gas. Order the report on www.gita.org.

Invensys Process Systems User Group 2006, Dallas

Around 850 attended the Invensys Process Systems User Group in Dallas last month. Invensys has a very large scope—from 'discrete' manufacturing through process engineering and refining to the upstream. We report on the growth of process modeling and simulation, the new InFusion decision support system, the closing gap between process, ERP and MRO systems, and the rise of wireless. A presentation on Shell Nigeria's $3.6 billion Bonga development showed how the operator training simulator's role is expanding to project design checkout QC. Targa Resources reported on what is believed to be the first oil country deployment of SmartSignal—a data mining-based alarm system developed by the University of Chicago in the wake of the Three Mile Island incident. TransCanada showed how Avantis' 'management operating system' is driving process improvement.

Chris Lynden and Don Clark offered an introduction to the process control business. In the 1960s, a typical refinery had perhaps 5,000 control points and a ratio of outputs per point of between 1 and 1.5. Today, a refinery may have 150,000 points and 25 outputs per point. Refining has become an ‘I/O-centric’ business where price per control point is critical.

Process not manufacturing!

Describing SimSci's Process Engineering Suite, Tobias Scheele noted that the global process industry is worth around $6 trillion and involves some of the most sophisticated plants in the world. Process is different from 'discrete' manufacturing (like automobiles) and deploys a different set of technologies. 'If you can't model it, you won't understand your process, you won't be able to improve it and you won't be able to compete.' Models predict performance and underpin decisions. Invensys' flagship SIM4ME spans operator training simulators, front end engineering design and operations.

Haverly Systems

Nancy Delhommer (Haverly Systems) noted that many models tend to ‘sit there gathering dust’ because changing stream data to new processes is so time consuming and error prone. A new PRO/II interface transfers crude assay data from H/CAMS (Haverly’s crude assay library) into PRO/II’s Excel libraries. Chevron and BP have huge libraries that can be licensed.


But big changes are on the way. Studies by the ISA predict a big shift to wireless connectivity and new technologies like control loops in the field. Invensys expects a proliferation of wireless instruments and a surge in applications. For instance, a wireless human interface might, as an operator approaches a valve, push maintenance information or specifications to a mobile device. ERP vendors are also very interested in this space.

Shell Bonga

Invensys’ Greg McKim presented a paper on behalf of Shell describing Shell Nigeria’s Bonga training simulator (BTS). The BTS was used for controls checkout during engineering design as well as for operator training. Today, the simulator lifecycle should equate to the plant lifecycle—OTS is about more than operator training. Everything beneath the control bus is emulated in a few PCs with the DynSim plant process model and FSIM virtual controllers, allowing for ‘very accurate controls checkout.’ Testing in a dynamic environment made it possible to see how control systems work together and to train operators at the same time.

Virtual commissioning

AMEC was the EPC for the Bonga project, with Invensys' Foxboro unit the main automation supplier. The simulator models flow lines, subsea well heads and gas compression. The whole process—dubbed 'virtual commissioning'—cost less than one day's production. The system is being extended to include static design in PRO/II.


At the plenary session, acting IPS president Ken Brown made a bold claim for Invensys having 'integrated its brands into a single company focused on asset performance management.' Company finances are 'strong' and will allow for investment in R&D and in the new InFusion Enterprise Control System. Facilities that operate over a 20-30 year period present specific challenges such as legacy equipment and a 'relentless pressure to improve performance.' Brown noted that while users 'may only know IPS through just one of our constituent companies,' the company is increasingly looking to 'tie all the pieces of the puzzle together,' while continuing to provide 'best of breed' solutions in the verticals.


IPS VP Peter Martin described how globalization ‘blindsided’ companies to plant issues as managers were too busy with mergers. A recent survey of plant management found ‘aging equipment and an aging workforce.’ ‘In the 1970s, the average age of field engineers was my age then. Today, the average age is still my age!’ It is hard to understand, let alone measure, the ROI of control systems. This means they are relegated to ‘costs,’ and costs get downsized. Lifecycle benefits are not currently measured.

Harvard Business School

Invensys’ real time accounting work with the Harvard Business School uses focus groups to try to answer questions like ‘Who measures the business?’ Invensys wants to bridge plant to finances with real time performance measurement, production management and real time accounting, leveraging dashboards, scorecards and portals to present validated financial KPIs to operators in real time. Operators’ dashboards will show how the price of energy changes every hour. Plant ‘conflict’ is represented by contrasting attempts to maximize availability (maintenance) with attempts to maximize utilization (operations). These are ‘inverse functions’—they fight, especially if they are doing a good job! The idea is to perform asset management and process optimization together.

Decision support

Decision support gets more important as the number of points rises. Remote expertise will compensate for the ‘people problem.’ Data mining will emerge for purchasing and providing answers to questions like, ‘How many pumps in similar service have failed?’ ‘How is this catalyst performing?’ Clark described the current situation as a ‘nexus’ of technologies which are set to fulfill the simulation modeling vision. Pressure for interoperability means that Invensys and others are ‘highly invested’ in standards, the Internet, XML, Java and portals. Invensys’ flagship application InFusion offers an ‘enterprise perspective on performance’.
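Questions like these amount to simple queries over maintenance records. A minimal sketch, assuming hypothetical record fields (none of these names come from Invensys):

```python
# Hypothetical maintenance records -- field names are illustrative only.
pumps = [
    {"id": "P-1", "service": "crude transfer", "failed": True},
    {"id": "P-2", "service": "crude transfer", "failed": False},
    {"id": "P-3", "service": "water injection", "failed": True},
]

def failures_in_service(records, service):
    """Answer 'how many pumps in similar service have failed?'
    by counting failed units in the named service category."""
    return sum(1 for r in records if r["service"] == service and r["failed"])
```

In practice such queries would run against a plant historian or CMMS database rather than an in-memory list, but the shape of the question is the same.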


Marketing director Grant Le Sueur stated that InFusion has transitioned ‘from Power Point to reality.’ InFusion offers an enterprise view of a process, monitoring materials consumed and calculating production costs in near real time.
InFusion’s Historian ‘captures everything.’ Microsoft BizTalk is used to orchestrate business processes, conveying messages from the data Historian to business applications with ‘no programming involved.’ Invensys is offering solution starter templates, ‘Lego’ building bricks that capture and deploy manufacturing best practices across the enterprise. InFusion takes its inputs from PLCs, SCADA systems and DCSs, which may come from third party systems.

Wireless (again!)

According to Hesh Kagan, wireless is ‘ready for prime time.’ To overcome the problem of multi vendor wireless solutions, Invensys is leveraging Apprion’s middleware to normalize and secure wireless systems from virtually any vendor. The expectation is that at some future date standards like SP 100 will help, but ‘there will always be multiple protocols.’ Apprion’s infrastructure can integrate VoIP, WiFi, WiMax, copper and more. New applications are available with wireless, like condition monitoring with cheap sensors—the foundation for model-based predictive performance monitoring. But first, you need security. The system is deployed as ‘ION,’ a wireless canopy over plants and facilities.

Energy Solutions

Energy Solutions president Alan Jacob stressed the importance of pipeline safety and integrity. The US Office of Pipeline Safety statistics for 2004 revealed 206 leaks and safety incidents resulting in $175 million in damages. A 2005 report from the National Transportation Safety Board recommends training, including simulations for recognition of abnormal conditions, and the installation of computerized leak detection. Pipeline companies are working hard to reduce costs, replacing out of date SCADA systems and connecting operational data to ERP. Pipeline management software is migrating from legacy in-house developed tools to ‘best of breed’ vendor solutions. TBG Brazil has deployed an internet-based pipeline management system leveraging Wonderware to integrate control room and commercial management. PetroChina is using artificial intelligence to predict gas load and pipeline usage growth.

Integrated Asset Management

Invensys’ upstream VP Shaughn Wright described the new ‘Integrated Asset Model Management’ system (IAMM) that integrates third party models along with Invensys’ own Upstream Optimization Suite. The IAMM now extends to production planning and optimization with third party tools like HySys and Olga. The idea is to be able to model and optimize ‘from reservoir sand face to wells, platforms, FPSO and on.’


An interface with Schlumberger’s ECLIPSE allows for automated input of production profiles that can be used as constraints on the rest of the system. Optimization is achieved by forward and backward simulations across wells and manifold constraints. Examples of IAMM deployment include Total Netherlands’ collaboration centers for 26 offshore platforms, Karachaganak Petroleum Operating’s ‘right-time’ production allocation and the ExxonMobil-PdVSA-Veba Cerro Negro project’s safety systems and integration.


Another deployment, for Chevron, uses SCADA to link 500 remote platforms, unmanned wells and pipelines into the New Orleans central engineering facility. This involves a ‘multitude’ of local networks, PLCs and PCs. The IAMM solution here deploys the Wonderware InTouch MMI. Operating groups can now manage their own facilities while New Orleans staff can monitor the entire asset base.


Clay Nobel described how Targa Resources’ Chico shale gas plant in North Texas was able to leverage the installed base of a digital control room equipped with Wonderware, SCADA and Maximo. Invensys recommended operational improvements including dynamic performance measurement, SCADA and HMI cleanup, and bringing accounting EFR measurement into SCADA. SmartSignal was deployed for alarms. SmartSignal, developed by the University of Chicago following the Three Mile Island incident, leverages ‘similarity-based modeling,’ a data mining technique that produces early warnings of anomalies and deviations. The SmartSignal compressor SCADA monitor generated a rate of return of 37%. This is believed to be the first use of SmartSignal in the oil patch.
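Similarity-based modeling estimates what a ‘healthy’ sensor vector should look like as a weighted blend of historical healthy-state snapshots, then flags large residuals between expected and observed readings. The sketch below is a generic illustration of the technique, not SmartSignal’s proprietary implementation; all names and thresholds are illustrative:

```python
import numpy as np

def sbm_estimate(D, x, eps=1e-6):
    """Similarity-based estimate of the expected sensor vector.
    D: (n_sensors, n_exemplars) matrix of healthy-state snapshots.
    x: current sensor vector. Exemplars closer to the observation
    receive larger weights (inverse-distance weighting)."""
    d = np.linalg.norm(D - x[:, None], axis=0)  # distance to each exemplar
    w = 1.0 / (d + eps)
    w /= w.sum()
    return D @ w  # expected 'healthy' reading

def residual_alarm(D, x, threshold):
    """Raise an alarm when the residual between observed and
    expected readings exceeds a threshold."""
    r = x - sbm_estimate(D, x)
    return bool(np.linalg.norm(r) > threshold)
```

A reading near a known healthy state reconstructs almost exactly (small residual, no alarm); a reading far from every exemplar cannot be reconstructed and triggers the early warning.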

TransCanada Corp.

Victor Dix-Cooper described use of Avantis’ ‘management operating system’ (MOS) on TransCanada’s 41,000 km pipeline network and electricity plants. Avantis provides management with planning, scheduling, implementation, reporting and evaluation. Costs, reliability and safety have been optimized by ‘running departments like a business.’ The system distinguishes between ‘tool in hand time’ (when an operator is actually working on a project), ‘non tool in hand time’ (other stuff) and ‘priority work’ in a ‘prioritization model.’ Results are analyzed with MOS scorecards in terms of compliance with objectives and how often folks are distracted by breaks in work. The method applies to all, from maintenance workers to head office. Webex is used to train folks in the use of Avantis MOS. This has ‘changed the way leaders and employees interact.’ There has been some reluctance to use the system, but ‘opting out is not an option.’

This article has been taken from a 15 page illustrated report produced by The Data Room’s Technology Watch program. More information and sample reports from tw@oilit.com.

Folks, facts and orgs…

Baker, CERA, Decision Strategies, FMC, Iron Mountain, Knightsbridge, PGS, Invensys, OFS Portal ...

Didier Charreton has been appointed VP HR with Baker Hughes, replacing retiree Greg Nakanishi. Charreton hails from Coats Plc.

B2B e-business hub cc-Hubwoo is to raise 9.5 million Euros with a rights issue. Proceeds will be used to launch a professional services business targeting supply chain management and to ‘help suppliers to grow their business with cc-Hubwoo’s large buyers.’

Cambridge Energy Research Associates has promoted David Hobbs to VP/MD Global Research. Before joining CERA, Hobbs was with Hardy Oil and Gas.

CiDRA has appointed Paul Khuri VP of its Oil and Gas business unit. Khuri was previously with Roxar.

Tony Hamer has joined Decision Strategies as MD Emerging Businesses. Hamer was previously with the Monitor Group.

Deloitte Petroleum Services has hired Magnes Grael to work in its Rio de Janeiro office.

Nanne Hemstra is to head up de Groot-Bril’s new Indian branch office in Mumbai.

Emerson Process Management is to hire 15 R&D engineers in a ‘multi-million’ dollar renovation and expansion of its Fisher control valves R&D facility in Marshalltown, Iowa. Emerson also announced the appointment of Peter Zornio to the new post of chief strategic officer. Zornio moves from Honeywell.

John Gremp has been appointed Executive VP of FMC Technologies’ Energy Systems businesses. Gremp is supported by senior VPs Robert Potter and Tore Halvorsen.

Mike McEvilly has joined Helix as VP Capital Projects.

Iron Mountain has promoted John Connors to President, Americas, and John Clancy to president of Iron Mountain Digital. Joseph DeSalvo has joined the company as Senior VP and Chief Security Officer.

Paul Haines has joined Knightsbridge (now a part of HP) as a Principal. Haines was previously with Kerr-McGee Oil & Gas.

Sean Mauk is business development manager of Knowledge Systems’ new office in Perth, Australia. Mauk was previously with Aker Kvaerner. Other hires include Sue Pritchett (from A2D Technologies), Steve Vest (from PGS) and Joshua Evans (from Texas River Expeditions).

Spatial data specialist Laser-Scan is changing its name to 1Spatial reflecting the fact that ‘Laser-Scan does not do Laser Scanning!’

Storage specialists LSI Logic and Agere Systems have merged in a $4 billion all-paper deal.

Hilde Merete Aasheim is to supervise the integration of Hydro’s oil and gas activity with Statoil.

Thomas Shope has been named Principal Deputy Assistant Secretary for Fossil Energy at the US Department of Energy.

The Open Geospatial Consortium has just become a member of the World Wide Web Consortium (W3C).

Paulett Eberhart has been appointed CEO and president of Invensys Process Systems. Acting president Ken Brown re-assumes the role of General Manager of the Measurements & Instruments business unit. Eberhart was previously with EDS.

PGS has opened a satellite Data Processing Centre in Villahermosa, Mexico. The center is equipped with PGS’ CubeManager software and HoloSeis 3D visualization. A dedicated high-bandwidth link connects the satellite to PGS’ Compute MegaCenter in Houston.

Following the merger with CGG, Veritas president Thierry Pilenko has moved to the position of president of French oilfield service company Technip, replacing Daniel Valot.

PetroCanada and Quantum Resource Management have joined the OFS Portal e-business community.

Roxar and Sonar Ltd. have formed a joint venture in Lagos, Nigeria.

Schlumberger has promoted Simon Ayat to CFO, replacing retiree Jean-Marc Perraud.

Sense Intellifield has appointed Bill Chmela as V.P. Sales and Marketing for software products. Paul Lachin replaces Chmela as Regional Manager for the Americas region.

Serafim Ltd. has appointed Brian Arcedera as reservoir engineer. Arcedera was previously with Unocal Philippines.

SpectrumData has appointed David Veale as general manager. The company has also just opened its first international data management facility in Jakarta, Indonesia. SpectrumData was recently awarded an unusual contract by NASA for the recovery and storage of data recorded by the Lunar Dust Detector Experiment conducted during the Apollo 11 and Apollo 12 moon landings.

Weatherford International has appointed Lee Colley COO.

Shell/Halliburton joint venture WellDynamics has acquired Halliburton’s Reservoir Performance Monitoring business.

Energistics calls for national data standards

Stuart Robinson wants ‘simple standards’ for regulators and national data repositories.

In a recent publication from POSC/Energistics, Stuart Robinson, CIO of the UK’s oil and gas regulator, the Department of Trade and Industry, called for greater cooperation on standards for national data repositories and regulatory bodies. Robinson argues that regulators share many problems associated with reporting and data exchange.

Easily understood

On the other side of the equation, oil companies have to deal with multiple regulators and regulations across different oil provinces, leading to a demand for a low cost, easily understood set of standards for reporting and data exchange.


Energistics, of which Robinson is a director, has facilities on its new web site to facilitate and publicize developing and established standards. Initially, Energistics would sponsor and support this work from its current budget but expects that participating regulatory bodies would ultimately join up.

Outline of Proposal

The proposal is to agree a minimum set of data requirements which will form the basis of an open standard. Agreement will also be sought on a set of standard reference entities and an initial test data set. Robinson suggests that initial data sets could target daily drilling, production and well completion data.


Working on the ‘keep it simple’ principle, Robinson suggests a simple paper-based description of exchange files together with detailed XML specifications such that vendors could deliver code. The aim is to provide a set of ‘off the shelf’ procedures that define the common reporting requirements for any oil province. The work will leverage previous Energistics projects in the US, Norway and the UK.

OpenSpirit 3.x promises flex data footprint

A major re-design in the upstream middleware promises configurable data access—real soon now!

OpenSpirit V 3.0 represents a significant design change over previous releases of the upstream data integration layer. The changes will enable future expansion of the geotechnical data footprint, ultimately giving geoscientists and data managers access to new data types and data sources. The 3.0 release promises streamlined installation, a new intuitive user interface and improved usability.


Installing OpenSpirit no longer requires Oracle, significantly reducing installation complexity and ongoing maintenance. A Windows ‘master installation’ option is also available to offer more flexible deployment options. A new geographic coordinate service has been developed leveraging the European Petroleum Survey Group’s coordinate reference system library and the ESRI Projection Engine.


OpenSpirit president Dan Piette said, ‘Version 3.0 is a significant redesign of the OpenSpirit infrastructure, and is the stepping stone for upcoming OpenSpirit functionality. Our current product provides great support for application developers but limits our ability to extend the data footprint into new geotechnical areas. We invested considerable time and resources to this release, the real beauty of which is not so much in the existing functionality, but in the underpinnings that will support OpenSpirit releases for years to come.’


Planned future enhancements include new tools for reference data management and mapping data across POSC, GeoFrame and OpenWorks catalogs. In-house developed reference data can also be incorporated and mapped over to vendor data sources.

Units of measure

Version 3.1 will also see the arrival of units of measure management and a data model mapping tool, an option that will allow data managers to view and modify OpenSpirit’s mappings—a potentially dangerous activity for which ‘special training is advised’!


For developers, a new data access application programming interface (API) promises a standardized query language (JDBC, ADO.NET, SQL99), support for OGIS-based geometry types and geoprocessing by way of spatially-constrained queries. Later releases will also expose native vendor models to developers and will provide a sophisticated ‘metamodel’ service leveraging data store-specific capabilities.
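The simplest form of a spatially-constrained query is a bounding-box filter expressed in plain SQL. The sketch below illustrates the idea only; the table, column and function names are hypothetical and bear no relation to OpenSpirit’s actual schema or API:

```python
import sqlite3

# Hypothetical well table -- names and columns are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE well (name TEXT, x REAL, y REAL)")
conn.executemany("INSERT INTO well VALUES (?, ?, ?)",
                 [("W-1", 1.0, 1.0), ("W-2", 5.0, 5.0), ("W-3", 2.0, 2.0)])

def wells_in_bbox(conn, xmin, ymin, xmax, ymax):
    """Spatially-constrained query: return wells inside a bounding box,
    the simplest geometry filter a spatial data API exposes."""
    cur = conn.execute(
        "SELECT name FROM well"
        " WHERE x BETWEEN ? AND ? AND y BETWEEN ? AND ?"
        " ORDER BY name",
        (xmin, xmax, ymin, ymax))
    return [row[0] for row in cur.fetchall()]
```

Real geometry types (points, polylines, polygons) replace the x/y columns in a production system, but the query pattern, a geometric predicate in the WHERE clause, is the same.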


The same technology that allows managers to modify and extend the data mappings will be leveraged by OpenSpirit to progressively extend the OpenSpirit data footprint which, by mid 2008, should embrace production, reservoir engineering and drilling domains.

MetaCarta geo-text search engine for SPE

The Society of Petroleum Engineers is to offer geographical, taxonomy-based search of e-Library.

The Society of Petroleum Engineers is to deploy MetaCarta’s geOdrive geographic text search engine to enhance the information held in the SPE’s eLibrary. The SPE’s eLibrary holds over 42 thousand technical papers relating to oil and gas exploration and production. The new tool lets users search and categorize data found in unstructured technical documents by geography and visualize the results in a geographical information system.


SPE CIO Robert Wyatt said, ‘The largest companies in the oil and gas industry rely on the SPE eLibrary to help their employees find the research they need. We now offer users a search engine that is optimized for research. MetaCarta adds geographic relevance to search of unstructured data.’


MetaCarta geOdrive combines MetaCarta Geographic Text Search (GTS) with a Geographic Data Module (GDM) tailored to the energy industry, combining full text search with geographic and temporal factors.


Mike O’Dell, VP of MetaCarta’s energy division said, ‘MetaCarta has a proven track record of providing geographic intelligence solutions to large companies in the oil and gas industry, and we are thrilled to now offer the same solutions to individuals working in the field through our relationship with the SPE.’

SPT Group acquires Sharp E&P Solutions

Rebranded Scandpower Petroleum Technologies acquires Houston-based multiphase specialist.

Scandpower Petroleum Technology has rebranded itself as the acronymic ‘SPT Group’ and, to celebrate the fact, has acquired Sharp Exploration and Production Solutions (SEPS) of Houston. SEPS was founded by Richard Sharp in 2002 and is now recognized as an independent advisor on multiphase meter (MPM) selection and deployment for oil and gas companies. SEPS also advises clients on production allocation techniques.


Richard Sharp and SPT CEO Dag Rian plan to offer a ‘complete allocation solution,’ combining SPT’s OLGA and EDPM software with SEPS’ multiphase metering business. The deal will leverage SPT’s ‘transparent pipeline’ concept which provides ‘soft sensor’ measurement at any location in the system to build profiles of pressure, temperature and liquid holdup. SEPS will join SPT Group’s Houston office as a new ‘Metering and Allocation’ business unit, headed by Sharp.

Energy Solutions unveils Pipeline Manager 3.0

Pipeline software house enhances user interface, adds leak detection and eases data access.

Energy Solutions unveiled the latest release of its PipelineManager package this month. PipelineManager 3.0 includes a new ‘state-of-the-art’ graphical user interface, ‘VisualPipeline,’ which offers visualization of long-term trends, 3-D profiles and historical line fill with time sliders and animation. VisualPipeline also provides instant access to thermo-hydraulic properties anywhere in the network.

Leak detection

PipelineManager 3.0 sports enhanced leak detection capabilities, such as leak location accuracy, deviation based leak detection and compensated volume balance. The device library has been expanded to encompass a detailed pump model with PID controls, various tank models, heaters and coolers, and control valves.
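Compensated volume balance compares inlet and outlet flows after allowing for the change in stored line pack; a persistent imbalance then points to a leak. A minimal sketch of the principle, assuming hypothetical function names and thresholds (this is not Energy Solutions’ algorithm):

```python
def volume_balance_alarms(q_in, q_out, linepack, dt, threshold):
    """Flag a leak when inlet flow minus outlet flow, compensated for
    the rate of change of stored line-pack volume, exceeds a threshold.
    q_in, q_out: flow-rate series; linepack: inventory series (same length);
    dt: sample interval. Returns one alarm flag per interval."""
    alarms = []
    for k in range(1, len(q_in)):
        dpack = (linepack[k] - linepack[k - 1]) / dt  # pack change rate
        imbalance = q_in[k] - q_out[k] - dpack
        alarms.append(imbalance > threshold)
    return alarms
```

The compensation term matters: a pipeline being packed (rising inventory) legitimately takes in more than it delivers, and an uncompensated balance would raise a false alarm.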


Energy Solutions has also released a new version of its steady-state and transient hydraulic modeling pipeline design tool PipelineStudio. New features include the ability to define multiple pipe wall layers, the specification of valve closure characteristics and the ability to compute pipe wall expansion.

‘Hi PI’—high availability Historian from OSIsoft

High availability PI system adds fail-over and fault tolerant operation to data Historian.

OSIsoft has just announced a new ‘high availability’ (HA) version of its PI System data historian. PI System is described as an ‘enterprise class real-time historian platform.’ The HA release enhances real time data capture with fault tolerant software, interface failover and hardware-based buffering and server replication.


Maureen Coveney, OSIsoft VP Marketing said, ‘The PI System provides a real-time infrastructure with powerful data management and intuitive decision support capabilities that enable continuous improvement and real competitive advantage.’


The HA release addresses what are described as ‘unavoidable conditions’ that can trigger data loss or render data inaccessible. These can occur during unplanned downtime, when network cable damage may bring down a system momentarily.

M2M and Opto 22 team on telemetry

Hardware and service platform for monitoring and management of fixed and mobile assets announced.

Industrial automation supplier Opto 22 has teamed with internet SCADA specialist M2M Data Corp to extend the reach of Opto 22’s ‘SNAP’ programmable automation controllers (PAC) with M2M’s iSCADA connectivity. The deal heralds what is claimed as a ‘complete hardware and service platform for comprehensive monitoring and management of fixed and mobile assets.’

Remote monitoring

The companies will deliver remote monitoring and data acquisition solutions to customers leveraging connections from Opto 22 SNAP PACs to M2M Data’s iSCADA web-based monitoring and data management portal. The solution claims reduced cost of deploying asset management-type applications by single sourcing hardware, software, communications and off-site data hosting.


Opto 22 VP Bob Sheffres said, ‘We provide interfaces to virtually any mechanical, electrical or electronic asset. This alliance with M2M will extend our SNAP PAC systems into the iSCADA market, offering customers an integrated hardware, communications and software solution.’ Opto 22 was founded in 1974 and today claims over 85 million Opto 22-connected devices worldwide. M2M Data’s iSCADA is used by energy companies, utilities, and government contractors developing and operating the United States ‘critical infrastructure’ program.

WoodMac reports on decommissioning

Report estimates a $42 billion spend in North Sea platform decommissioning through 2031.

A report from Wood Mackenzie, ‘Decommissioning in the North Sea’ examines the challenges and issues associated with decommissioning and discusses ‘areas of uncertainty’ surrounding regulation, tax and decommissioning liability rules.

$ 42 billion

Wood Mackenzie forecasts that the remaining decommissioning expenditure in the North Sea sector will be US $42 billion with the bulk to be spent in Norway (48%) and the UK (40%). To date 40 fields have been abandoned. A further 66 fields are in the process of being decommissioned or await abandonment. The number of fields to be abandoned will increase dramatically in the next decade.


WoodMac analyst Malcolm Ricketts said, ‘We anticipate the majority of future decommissioning expenditure to be between 2015 and 2031. However, operators can delay spending by extending the economic life of the field for example with enhanced oil recovery or by converting platforms into processing hubs for nearby fields.’

Emerson announces 2.4 GHz SmartWireless

Self-organizing wireless mesh extends reach of PlantWeb to devices deep in plant’s metal ‘canyons.’

Emerson Process Management has announced a 2.4 GHz ‘Smart Wireless’ solution for plant and process control applications. Wireless extends the reach of asset optimization and predictive maintenance, enabling operators to avoid costly process interruptions and shutdowns. Response to safety incidents is improved and emissions reduced.

Self organizing

The new technology brings a self-organizing network to Emerson’s PlantWeb digital plant architecture, capturing data from assets that were previously out of physical or economic reach. Self organizing networks open communication pathways to devices deep within the metal ‘canyons’ of the plant.

100,000 devices

Smart Wireless supports networks of up to 100,000 devices which may be Emerson’s own Rosemount brand or third party devices via industry standards including SP100, Modbus and OPC. Security is assured through encryption, authentication, verification, anti-jamming and key management technology.

Maximo Oil and Gas Council deliberations

MRO Software’s oil and gas focus group looks to future as IBM embeds Maximo in Tivoli portfolio.

Maintenance specialists MRO Software’s Oil & Gas Advisory Council met last month in BP’s Houston office to deliberate on industry concerns such as the aging workforce, training, new frontier operations, refinery turnarounds and planning ‘virtualization.’ The council, set up in 2005, tracks industry trends and acts as a focus group for future development of the Maximo product line.


MRO VP Jack Young said, ‘Our work with the Council has been integral in the development of features in our Maximo industry solutions.’ MRO provides asset management solutions for over 200 oil and gas customers. The package targets operation efficiency by lowering the cost of acquiring and managing assets, managing compliance and reducing risk.


MRO Software was acquired by IBM last year and is now part of IBM’s Tivoli portfolio. The 6.2 release extends the Tivoli service management portfolio beyond data center management to embrace assets such as refineries and production facilities. Maximo clients include BP, CNOOC, ADNOC, Kuwait Oil, Occidental and Chevron.

Enterprise Engineer for Petro-Canada

Lifecycle engineering document management system supports oil sands operations.

Petro-Canada has selected McLaren Software’s Enterprise Engineer (EE) suite to ‘drive’ document-management best practices across the company. The roll-out follows a pilot of EE at Petro-Canada’s Fort Hills oil-sands project. EE will support the company’s engineering processes, providing a single point of access to manage CAD drawings, standard operating procedures and email throughout the asset lifecycle.


McLaren CEO Paul Muir said, ‘Delivering controlled processes around the creation and use of business-critical information is important in oil-sands operations. Petro-Canada is reducing costs by leveraging EE features like built-in business rules, processes and security tools. These allow users to automate the use of engineering content across the company and with third-party contractors and customers.’

$375 million for US oil and gas R&D

‘Pork’ promises energy technology ‘nexus’ status to Sugar Land and Fort Bend County.

The Sugar Land, TX-based ‘Research Partnership to Secure Energy for America’ (RPSEA) has landed a 10-year, $375 million contract from the US Department of Energy to manage R&D into new technology for oil and gas exploration and production. RPSEA is to award research contracts to universities, research institutions, national laboratories and industry partners.


Congressman Hall introduced the original legislation in 2001 to provide research and development funding for natural gas supplies. Hall commented, ‘I am pleased that the Department of Energy has awarded the contract to begin this research program which will keep Texas at the forefront of energy research. The development of new technologies and resources will help sustain our nation’s energy needs.’

Energy Policy Act

The natural gas and oil supply R&D program was created by the Energy Policy Act of 2005 and is funded from oil and gas production royalties. One target of the R&D Program is the development of natural gas and other petroleum reserves from ultra-deepwater and non-conventional onshore provinces in the US.

90 members

Non-profit RPSEA has 90 members, including 19 universities, five national laboratories and energy producers across 21 states. RPSEA’s $37.5 million annual budget will be allocated, via competitive bidding, to member institutions. According to the release, Sugar Land and Fort Bend County are to become the ‘nexus’ of new and critical energy technology development.

BP onshore fields ‘hyper-connected’

Karl Cisk unpacks BP’s ‘Field of the Future’ acronym collection at SPE Gulf Coast section meet.

BP’s Field of the Future (FoF) guru Karl Cisk, speaking at an SPE Gulf Coast Section meeting this month, described how BP has been piloting ‘smart’ automated SCADA systems across its onshore US production. BP’s goal is to create automation standards for control systems, communications and operator interfaces leveraging Foundation Fieldbus.


BP’s Integrated Subsurface Information Systems (ISIS) project targets real time production surveillance, sending alerts from wells to operators, creating a proactive reservoir management culture. Data is displayed in graph/plot form for immediate comprehension and action.


Data to Desktop (D2D), the counterpart of ISIS for facilities, subsea and pipelines, provides operators with dashboards, monitoring of plant performance and automation of repetitive tasks. ISIS/D2D is built on web-based applications, extensively configured using BP’s own business logic and processes. One problem that BP has encountered with real time is bearing the full lifecycle cost of high-end components like smart devices.

The ACE age

RT data can only be exploited if operations adapt to RT decision making. Enter BP’s Advanced Collaborative Environments (ACE), an active BP program with 10-15 installations in place or planned. Cisk described the ACE as the FoF’s ‘engine room.’ IT now underpins BP’s business and the ACE is helping BP become a ‘hyper-connected, highly visual collaborative company’.

TGS to blitz Indonesia frontier basins

Multi-acquisition ‘megaproject’ includes seismic, potential field, Multibeam and sampling program.

TGS-NOPEC has started acquisition of a speculative ‘megaproject’ over 16 Indonesian frontier basins. The multi measurement ‘blitz’ program includes new seismic data, Multibeam SeaSeep (MSS), potential field, cores, geochemical analyses and heat flow probes. The Megaproject has industry prefunding and will take eighteen months to complete.


The project has support from the Indonesian Directorate General of Oil and Gas (MIGAS). MIGAS director R. Priono said, ‘This program will generate a huge amount of new data and ideas on Indonesian geology.’ MSS data provides a high-resolution view of seafloor topography and detects hydrocarbon seeps.


TGS’ Asia-Pacific manager Paul Gilleran added, ‘Modern, integrated data can help open these frontier basins to new, ‘smart’ exploration.’ Making this diverse data set usable is quite a data management challenge. We will be bringing you more on how TGS-NOPEC plans to address this in a future edition of Oil IT Journal.

Intertek expands in production allocation

Caleb Brett unit acquires hydrocarbon accounting consultancy Smith Rea Energy.

Intertek’s Caleb Brett unit has acquired hydrocarbon accounting specialists Smith Rea Energy Ltd. (SRE). SRE provides hydrocarbon allocation and production reporting consultancy services to international oil and gas companies and develops fiscal accounting models and commercial agreements for complex production and allocation systems.


Caleb Brett CEO Mark Loughead said, ‘The expertise and services provided by SRE expand our capability in providing high-end research and development, consulting and support to major global industries including oil and gas.’


Other SRE services include the development and auditing of commercial and contractual agreements, the development and implementation of allocation, simulation and modeling software, specialist allocation and metering consultancy services and a fully outsourced operational management service.

Kongsberg Maritime buys Sense Intellifield

$45 million deal brings Sense’s WITSML technology alongside Fantoft Process Technology.

Kongsberg Maritime has acquired e-field specialist Sense Intellifield (SI) in a $45 million cash transaction. The purchase has no effect on Intellifield’s sister rig equipment company Sense EDM.


SI’s solutions target integrated operations, a.k.a. ‘e-fields’ leveraging real-time data – particularly WITSML. SI was also involved in the development and testing of the POSC/Energistics production optimization specification ProdML.


SI joins Fantoft Process Technologies in Kongsberg’s oil and gas portfolio. Kongsberg Maritime president Torfinn Kildal said, ‘The SI acquisition broadens our offering in integrated operations.’


SI president Børge Kolstad added, ‘There is good synergy in products and markets which will further improve the range of products available to our oil and gas clients in future.’ SI has 70 employees and reported 2006 sales of $18 million.
