September 2009


ADAM changes @ BP

A review of BP’s application development and application maintenance (ADAM) across upstream, downstream, trading and corporate businesses sees five major outsourcing deals struck.

Following a year-long evaluation of its information technology, BP has initiated sweeping changes, consolidating its application development and application maintenance (ADAM) vendors across its three businesses - Exploration and Production, Refining and Marketing and Alternative Energy. The evaluation has resulted in the award of five major outsourcing contracts, announced in Bangalore, India last month.

BP Group CIO Dana Deasy told Oil IT Journal—’Previously we had in excess of 40 existing application development and maintenance suppliers with more than 80 contracts. We are simplifying our application suppliers to 5 strategic vendors—Accenture, IBM, Infosys, TCS and Wipro. This will simplify and standardize the way BP does application development and application maintenance. The five awards span a broad mix of general business and oil-specific applications.’

Accenture has been chosen to provide SAP application development services, helping BP to ‘simplify, standardize and consolidate’ its SAP-based applications and to meet BP’s goals of organizational efficiency and managed IT costs. Accenture has been working for BP for over 20 years. Deasy said, ‘BP will benefit from Accenture’s SAP know-how and deep understanding of the oil and gas industry. The contract will help accelerate BP’s implementation of SAP across the organization, and lower our overall cost base.’

IBM gets a less glamorous five-year contract to manage and run BP’s enterprise applications and integrated service desk, although IBM notes that this contract is the largest of the five. IBM is to be the strategic ADAM vendor for global SAP maintenance and system development. IBM’s Global Business Services unit will manage BP’s SAP applications running both from BP locations and from IBM’s network of global delivery centers.

Infosys Technologies is to manage and operate ‘a large portion’ of BP’s business systems in a five-year applications outsourcing and support deal. The Infosys component covers ADAM work across BP’s supply, trading and E&P businesses.

Tata Consultancy Services (TCS) has been named as a strategic partner in the transformation of BP’s downstream and corporate IT program, including its fuels value chain from upstream to trading. TCS plans to leverage its global network delivery model in support of BP’s objectives.

Another Indian outsourcing behemoth, Wipro Technologies, has been selected to provide ADAM services to BP’s fuels value chain and corporate businesses globally. Deasy again stated that the Wipro component was ‘an opportunity to simplify and streamline processes to drive out cost.’


IBM rolls-out IIF

IBM is officially rolling out its long anticipated ‘semantic model-based’ Integrated Information Framework for chemicals and petroleum.

IBM’s Chemical and Petroleum Integrated Information Framework (a.k.a. the IBM Chem and Petro Framework) is a software platform that lets users build integrated solutions leveraging common processes and services to offer real-time visibility of equipment, provide enterprise-wide access to real-time processes and key performance indicators, federate information across multiple assets and increase production through enterprise-wide, event-based information.

The ‘standards-based’ platform leverages a reference semantic model of process equipment that is custom built for each facility. IBM’s service oriented architecture blends model components and real time data. Semantic models have been built for scenarios including monitoring well field and platform rotating equipment, turnaround optimization and integrated asset management.
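
As an illustration only, the flavor of such a semantic equipment model can be sketched with the open source rdflib toolkit; the namespace, classes, properties and historian tag below are invented for the example and are not IBM’s reference model.

```python
# Minimal sketch of a semantic model of process equipment, assuming rdflib
# is installed. The EX namespace, classes and properties are hypothetical
# illustrations, not IBM's actual reference model.
from rdflib import Graph, Namespace, Literal, RDF

EX = Namespace("http://example.org/plant#")
g = Graph()

# Describe a platform, a pump located on it, and a real-time tag for the pump.
g.add((EX.PlatformA, RDF.type, EX.Platform))
g.add((EX.Pump42, RDF.type, EX.RotatingEquipment))
g.add((EX.Pump42, EX.locatedOn, EX.PlatformA))
g.add((EX.Pump42, EX.hasTag, Literal("PI:PUMP42.VIBRATION")))

# Query the model: which rotating equipment sits on PlatformA, and which
# historian tag should a monitoring dashboard subscribe to?
q = """
PREFIX ex: <http://example.org/plant#>
SELECT ?equip ?tag WHERE {
  ?equip a ex:RotatingEquipment ;
         ex:locatedOn ex:PlatformA ;
         ex:hasTag ?tag .
}
"""
for equip, tag in g.query(q):
    print(equip, tag)
```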

IBM’s ILOG JViews Diagrammer graphical environment is used to design and execute the models—providing a control-room like view of the asset. ILOG, a French software house, was acquired by IBM earlier this year. More on the IIF in our July 2009 issue and on IBM’s SOA offering on page 4.


What’s in a name?

Now that IBM has re-named its ‘obscurely’ titled RedBook on the ‘Chem and Petro’ Framework, Editor Neil McNaughton is on the warpath, looking for more misleading or ambiguous nomenclature. Examples include files on your hard disk, a new book on ‘viral data,’ a metadata initiative, and the first name he thought of for his own company.

While setting up my company fourteen years ago, I benefitted from a government-sponsored scheme that encouraged the unemployed (remember $10 oil?) to get off their butts and create their own businesses. As part of the process, heads of earlier startups that were still in business devoted some of their time to picking over business plans such as mine and offering free advice. This proved very valuable to me, but before I explain why, I am going to elaborate on what seems to be a growing tendency to give products or services, not so much a bad name, more of a dumb name. One that, in the interest of a perceived marketing benefit, actually provides potential buyers with a complete bum steer as to what is on offer.

Last month we reviewed a book titled ‘Discovering the Value of an Integrated Information Framework’ and described the book’s title as ‘obscure.’ It was only through some idle delving into the booklet (page five, in fact) that it emerged that the book was about the now officially released IBM ‘Chem and Petro’ Framework (page 1). Since then, the obscure title has gone and IBM has released the RedBook with a more appropriate title, ‘Discovering the Business Value Patterns of Chemical and Petroleum Integrated Information Framework.’ Neither grammatical nor snappy—but at least it tells you what’s inside.

We encounter naming issues in everyday life, as witnessed by the ubiquitous ‘my-whatever’: mySpace, My Yahoo or, as my son is currently working on, ‘my CV.’ All such possessiveness is fine so long as it remains on your disk, or in your Face Space or whatever—but once emailed, ‘my CV’ loses its implicit ‘context.’ Is this my CV or your CV? An easy fix is to add a name—better still, your name and the company/job you are applying for. The name really needs to provide all the context—you never really know what ambiguities might arise in future use. Better still, nail all this down in the document properties before signing and sealing the whole thing as a pdf. Capture the context as metadata.
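
For the pdf case, document properties can be set programmatically. A minimal sketch, assuming the open source pypdf library; the file names and field values are illustrative only.

```python
# Minimal sketch: stamp context into a PDF's document properties before
# sending it out. Assumes the pypdf library; file names and values are
# illustrative only.
from pypdf import PdfReader, PdfWriter

reader = PdfReader("cv.pdf")
writer = PdfWriter()
for page in reader.pages:
    writer.add_page(page)

# Capture the context as metadata rather than relying on the file name.
writer.add_metadata({
    "/Title": "CV - John Smith - application for data manager, Acme Oil",
    "/Author": "John Smith",
    "/Subject": "Application for data manager position, September 2009",
    "/Keywords": "CV, data management, Acme Oil",
})

with open("cv_with_metadata.pdf", "wb") as f:
    writer.write(f)
```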

Which brings me to the interesting development chez Energistics, which is launching a ‘use of metadata’ initiative this month (page 10). A laudable field of investigation for the oil and gas standards body—but I do have a comment to make on the nomenclature…

The ‘metadata’ tag has pretty well been hijacked by the geo community, to the extent that ‘metadata,’ to anyone who works anywhere near geographical information systems, means ‘geographical metadata.’ As an ex-geophysicist with a more general interest in ‘data about data,’ I am in another camp that sees metadata as stuff that gives context to just about any information. I assumed that Energistics would be advancing a similarly broad church definition.

Having read through the Energistics literature, I’m not so sure. I read in the prospectus that ‘the initial focus is on structured and unstructured information resources that contain explicit spatial coordinates.’ So I conclude that we are talking about ‘spatial metadata.’ But later we have, ‘the intent is to develop a standard that can be expanded over time to support additional content types, place name locations, and other segments of the energy industry.’ The Roadmap also mentions Dublin Core—a rather horizontal metadata initiative that comes out of the document community.

And I am now puzzled. Are we going to work our way from ISO 19115 to a new metadata standard for seismic? The ambiguity in the initiative’s title may be designed to produce an open-ended project. But going from GIS to production data and seismic is putting the cart before the horse. Again—it is all in a name—or not!

Curiously, while the top-level folks play fast and loose with the nomenclature, this is not so for the lowly computer programmer. In the old days of programming, i, j and k were automatically assigned integer status by the compiler—leading to unexpected behavior and occasional havoc. Today, professional programmers use long-winded names that summarize what is being counted, how high you can count, and maybe a notion of the variable’s scope in the program. Context, context and context is to programming what location is to real estate!
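
A contrived illustration of the point (nothing here comes from any particular codebase): the same summation written first with context-free names, then with names that carry the context.

```python
# Contrived illustration: the same loop, with and without context in the names.

# Old-school: i and t say nothing about what is being counted.
def total(d):
    t = 0
    for i in d:
        t += i
    return t

# Descriptive: the names carry the context (what, units, scope).
def total_daily_oil_production_bbl(daily_well_rates_bbl):
    """Sum per-well daily rates (barrels) into a field total."""
    field_total_bbl = 0.0
    for well_rate_bbl in daily_well_rates_bbl:
        field_total_bbl += well_rate_bbl
    return field_total_bbl
```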

Bad naming can be the result of an expensive marketing study that has gone wrong or a last-minute author’s brain fart - what shall I call this editorial, for instance?

When I asked for a review copy of the new book* ‘Viral Data in SOA**’ I imagined that this was viral as in ‘viral marketing,’ i.e. a good thing. Not so. The book is about the risks of bad data to the enterprise. Seemingly these are heightened by the interconnectivity that SOA brings about. The book is actually about data quality (page 3). But the title, once unpicked, holds the subliminal message that a) SOA is widely deployed (is it really?) and b) somehow bad data gets worse in an SOA environment (how is that now?).

Now to get back to the advice I got when starting my own company. I had decided to set up a consultancy working in data management for the French oil and gas sector and had prepared the ground with a survey and business plan. But the original company name was something of a brain fart. In my business plan I called my company ‘Données-Moi!,’ a bad pun on the French for data (données) and the French for ‘give’ (donner). Perhaps an English equivalent might be ‘Data R Us.’ My advisor intimated that this silliness was in poor alignment with someone of my age and general seriousness! Good advice on two counts—the bad name and the fact, unknown to me then, that age had conferred seriousness on me. Hey, fourteen years down the road I must be really serious now!

* Viral Data in SOA, An Enterprise Pandemic, by Neal Fishman, IBM Press/Pearson, ISBN 978-0-13-700180-4.

** Service-oriented architecture.


Oil IT Journal Interview—Lars Olav Grøvik, StatoilHydro

StatoilHydro’s information architect tells of a constantly rising bar in data management, of creative chaos in the workplace and of ‘sustainable’ data management on the digital oilfield.

This interview was sparked off by the May 2008 ‘Tale of two tradeshows’ editorial*.

Yes. I feel that something is missing from data management conferences. We discuss a lot of practical day-to-day problems, maybe even look ahead a little, but we never really seem to lift the horizon to a few years out, to the level implied by ‘digital oilfield’ deployments. I am interested in having a cross-industry dialogue looking at the medium term of data and information management. We need a roadmap to get from how data is managed today to how it will be managed in the ‘razzmatazz’ world of the digital oilfield as on show at the Intelligent/Digital Energy events.

What’s wrong with the status quo?

Today, the reality is that management is screaming, ‘why isn’t IM working despite the millions invested?’ Now the short answer to this is that management has ‘raised the bar.’ We are now dealing with more complex data, in larger data sets and shorter cycle times than before, all in the face of staff reductions in the library and elsewhere—and we are still in business! As we are constantly raising the bar we will never get rid of quick fixes like spreadsheets and PowerPoints. We will always have clever engineers and geologists creating value. If we deprive users of the tools they like we damage creativity. The real question is how to avoid spreadsheets creating chaos—how to capture good new ideas into the corporate workflow—moving the chaos and creativity to a new level. The problem is that almost all data management solutions are ‘static.’ What we need is an approach that accommodates this constant raising of the bar. What happens, for instance, when a new data set arrives and needs to be ‘managed’? How do you handle the legacy data?

What’s the answer?

That’s a good question!

How does the Intelligent/Digital Energy paradigm actually work then?

Today’s Digital Energy implementations work because they leverage a highly customized infrastructure. But before you even get to this situation, you may need to spend hours cleansing the data for a single asset. You should ask the IE/DE protagonists how long they spent getting data from the asset to the corporate database. This usually involves blending data from both clean and unclean sources, so you either exclude the latter, clean the data, or reduce the scope!

Are these fixes done for the lifetime of the asset or for the demo?

They can be both! Some really are examples of how StatoilHydro and other operators really work—examples of ‘sustainable’ data management. But for some of these, you need to check to see how many support people are involved. For some flagship NOC digital oilfield deployments, the support requirements are pretty amazing—more than would be possible for a company like StatoilHydro.

But if that’s what it takes to ‘digitize’ a major asset why not bring in a significant IT staff?

One reason is that a lot of bulk data is stored near to the end user, so you can’t really outsource to cheap IT providers in India or elsewhere. Although there are some interesting attempts to develop a ‘global data store concept.’ There is no doubt that we need to go beyond the ‘pain barrier,’ but it’s hard to achieve this while still doing business. It’s like trying to change the wheels on a moving vehicle!

What about production data—that’s gaining traction at ECIM?

This is a very active field that is in the same situation as geoscience was a decade ago. But the direct connection between production systems and reporting makes data quality a sensitive issue. Discrepancies between reports and field data can be brushed under the table.

It’s interesting though that business systems like SAP scored highly in quality**?

This could be a case of garbage in, garbage out I’m afraid!

Semantics have got a lot of traction in Norway—what’s your take?

There is a lot of new technology coming up and there is value here. Semantics, like other new technologies, is following a fairly classic path. At first a few ‘believers’ actively promote the technology, which appears to provide all the answers and explain everything (like sequence stratigraphy!). Then the pendulum swings. The semantic pendulum is still swinging! Some think they have seen the light and are very positive, some remain skeptical. Semantics will likely find its place—and this may come sooner than we think. It depends a lot on data providers. If service companies and authorities take up the technology, it may come faster.

But semantics is not a done deal in the context of integrated operations as it sometimes seems?

No. We are looking at technology choices.

What is happening in real time data and automation?

I should be more involved in real time data than I actually am. We have been too busy with the mergers. Some semi-automated processes have been developed using Perl scripts to move real time data around. These tools may or may not be maintained. This has led to a kind of loose automation.

And master data—a shared subset of corporate data? Such ideas were posed as a panacea a couple of years ago.

We have had lots of discussions around these issues. If you have two databases with similar kinds of data, you can opt to standardize them and merge the two. Or you can keep the two databases running, especially if you need extra parameters on different fields or other use cases. You could then blend information from the two by sharing top level information. StatoilHydro’s approach is for an ‘official’ dataset (the first solution). But I can see cases where the second approach is valid. But a well master data footprint does have to be relatively large. And sensor data may or may not be standardized and can blend high and low frequency information. It is a complex issue. But as I said, StatoilHydro’s approach is for a single master data set. One thing is for sure, without a minimum set of parameters ‘masterized,’ you can’t have true integrated operations. Having said that, today’s reality is one of manual or semi-automated data links that are very hard to sustain.

And PPDM/Energistics. What’s their role here?

We are involved in Energistics’ projects—but here again the bar is being constantly raised in the workplace. Standards bodies’ work will never be on a par with a major development from Schlumberger or Landmark, nor will it measure up to a super database from a major oil company.

I am curious to know how other industries manage—pharma, internet, mobile telephony all must have similar issues. All seem to manage to constantly raise the bar without interfering with their ongoing business. How other industries achieve this would be an interesting study topic for a future-looking meeting.

* Oil IT Journal May 2008.

** ECIM presentation.


Book Review—Viral Data in SOA, An Enterprise Pandemic

Neal Fishman’s book is an entertaining ramble on data quality and remedial methodologies.

The first question about this little book is, what is ‘viral data’? Is it good—as in ‘viral marketing’? Or bad, as in ‘swine flu’? Apparently it is bad. In fact the introduction by IBM-er Tim Davis suggests that the 2008 collapse of the world’s financial markets was due to ‘a perfect storm of viral data,’ which was news to us. A definition of ‘viral data’ as ‘[producing] undesirable effects by engaging information though a service’ does not get us very far either—although the claim that ‘left unchecked, viral data in a service oriented architecture can reach epidemic proportions in an enterprise’ is a bit more promising—even if it does have a perfume of good ole FUD (fear, uncertainty and doubt).

The book is a bit of a ramble, but the theme is that with SOA, bad data gets much better visibility than it does in other systems—so data quality is doubly important.

Fishman claims ‘the overall quality of information in organizations continues to be suspect and poor’ and his book sets out to address ‘the treatment and prevention of harmful data.’ Does it succeed?

Well, it is certainly not a recipe book, more a collection of anecdotes and remarks, some of which are relevant to data quality, and many of which, while interesting (a rant about PowerPoint, the origins of the Federal Reserve System), are not.

The contents page offers a semblance of structure that is not really borne out in the book. In fact a cover to cover read is necessary to capture all the nuances and anecdotes. Is it worth the effort?

I would say yes, if you have the time and are prepared for some frustrating digressions. Fishman’s credentials (IBM program director, DAMA board member) are pretty impeccable. We enjoyed his observation that ‘organizations adopt specific protocols to drive measurable improvement—but suffer from the illusion that they can assure quality by a specification like ISO 9000 [..] this is the ultimate naiveté.’ There is some good advice to data modelers—about the need to get data structures ‘right’ and the risks of subsequent changes to tightly-coupled architectures. The chapters on data governance and reference models were eye openers to us as non-practitioners, touching on methodologies like DQA3 and CIDER. But Fishman suffers from a tendency to qualify just about every statement made. Four ‘howevers’ on a page is too much! Spend a few hours reading VD in SOA and you will probably want to call in the consultants—which is maybe the object of the exercise!

* Viral Data in SOA, An Enterprise Pandemic, by Neal Fishman, IBM Press/Pearson, ISBN 978-0-13-700180-4.


Meyer floats consortium on Marcellus shale fracture modeling

Move seeks to ‘collapse learning curve’ of hydraulic fracturing in emerging major gas province.

Meyer & Associates, Inc. is floating the idea of a consortium of energy companies active in the Marcellus Shale to form the Marcellus Shale Fracture Technology Consortium (MSFTC). The objective of the consortium is to assist in collapsing the ‘learning curve’ in hydraulic fracturing practices in the Marcellus Shale. This will help lead to optimal well production, increased estimated ultimate recovery and maximized economic returns for its contributors.

Meyer has also just released its new ‘MShale’ application, a specialized fracturing simulator designed to simulate multiple, cluster, and discrete type fractures in shale and coal bed methane formations. Discrete fractures in naturally fractured or faulted formations can be modeled by specifying a fracture network grid to simulate fracture propagation in the major vertical, minor vertical, and horizontal fracture planes. MShale can also be used as a diagnostic tool to compare Discrete Fracture Network (DFN) numerical results with microseismic data. More from www.oilit.com/links/0909_3.


Three new services extend hosted rig site data offering

IDS Data Net has introduced modules for well completion, logistics and data analysis.

IDS has extended its DataNet2 hosted drilling data reporting system (OITJ Jan 09) with modules for completions and well intervention reporting, logistics and planning and data analysis. All three components share information in the DataNet2 database.

ProNet, the new completions and well intervention reporting service launched by IDS, adds an integrated drawing package and campaign life-cycle manager to IDS’ reporting tools. The Campaign Manager component, IDS’ life-cycle and well integrity tracking service, gives the completions engineer a full history of the well from spud to abandonment.

The hosted service offers ‘Web2’ functionality to provide, for example, ‘time-tracked’ borehole visualization, an on-screen walk-through of the entire well construction and intervention history accessible through a slider bar.

The StockNet component, a logistical planning solution, exposes a master equipment list that can be shared by all stakeholders. The StockNet purchasing tool allows users to raise purchase orders, whilst a link to the IDS CostNet service allows the raising of invoices. Manifests and load-out lists are created by drag and drop, eliminating the problems of creating multiple spreadsheets. StockNet integrates with other purchasing systems including SAP.

IDS’ Business Development Manager Yew Huey Kang commented, ‘Our experience tells us that if a system is too cumbersome then sooner or later people will revert to using a spreadsheet. With so many inconsistencies and variables in our industry, there is a real need for reliability and a consistency of approach.’

Finally, VisNet adds ‘advanced data analysis’ to IDS’ offering. Features include trend analysis, a drag-and-drop query builder and more. IDS CTO Reuben Wee claimed, ‘VisNet enables clients to really make the most of their data in a fast and simple manner.’ More from sales@idsdatanet.com.


Software, hardware short takes

WellEz, Fugro, Petris, Aveva, Ikon, Rose, AGM, Hampson-Russell, JOA, SDI, Neuralog, Petrolink and more.

WellEz Information Management has announced a new entry-level service of pre-configured reports for its hosted WellEz.NET data collection service.

Airo Wireless has rolled-out an ‘intrinsically safe’ A25 cell phone for hazardous environments. The A25 offers secure telephony, ‘Push-to-Talk,’ GPS and telematics in a ruggedized Windows Mobile PDA handset.

IPC Petroleum Consultants has published a model and report on medium and long term worldwide oil and gas supply and demand including appraisal of oil and gas reserves and assessments of offshore and unconventional potential reserves.

Fugro-Jason has released a new version of the Jason Geoscience Workbench (JGW) with AVO attribute analysis, full waveform synthetics and enhancements including a new spectral decomposition module.

PetrisWINDS Enterprise (PWE) V 6.2 has been upgraded to leverage the ArcGIS engine for geo-processing. A ‘Google-like’ function allows for search across enterprise data assets. The 5.3 release of Recall includes workflow, search and uncertainty analysis. New ‘Raven’ and ‘Impetus’ modules provide data quality and entry-level image log analysis for casual users. Petris has also released a movie (www.oilit.com/links/0909_4) showing its PWE ‘OneTouch’ data explorer.

Transpara has extended its Visual KPI wireless data link to provide Palm Pre users with on-demand operations, financial and infrastructure data via the new phone’s Palm webOS-based browser.

Aveva has enhanced its Plant portfolio with across the board product improvements. Aveva PDMS users can now share data with users of Aveva’s Marine portfolio, providing offshore oil and gas users with enhanced standardization and flexibility in the way they work with project partners.

Golden Software has released V9.0 of its ‘Surfer’ entry-level contouring and 3D mapping package. Surfer converts XYZ data into contour, 3D surface, 3D wireframe, vector, image, shaded relief, and post maps. New features include map layer opacity, georeferenced image (GeoTIFF) support and on-the-fly projection system conversions. Surfer 9 costs $699 or $229 for an upgrade.

Ikon Science has released a 64 bit RokDoc plug-in for Petrel 2009 for Vista and XP.

Output from Rose and Associates’ MMRA package can now be mapped using Priemere’s ‘Power Tool for ArcGIS.’

The 3.1 release of Austin Geomodeling’s Recon geological interpretation application introduces ‘cascade technology’ for multiple ‘what-if’ structural scenario evaluation.

Golder Associates has announced FracMan ‘reservoir edition’ with discrete fracture network-based workflows for reservoir characterization, reserve evaluation and well and EOR planning.

Gaea Technologies has announced StrataExplorer, a subsurface mapping and resource or contaminant evaluation package for use in the environmental, geotechnical, mining, oil sands, and petroleum industries. Optional modules are available for well logs, cross sections, gridding and contouring.

Hampson-Russell is migrating its voluminous training material to ‘GeoBook,’ an adjunct to its new ‘paperless’ AVO/lithology workshops. GeoBook, a laptop PC with note-taking functionality, is included in the price of HR’s training sessions.

IPCOS has announced ‘INCA Discovery’ a new model identification utility for the management and analysis of process data and dynamic process models. Discovery provides model post processing, trending and improved process realism.

JOA’s JewelSuite 2009 has achieved Windows 7 compatibility and has won a ‘Compatible with Windows 7’ logo from Microsoft. Windows 7 is scheduled for release ‘real soon now.’

SDI now supports Neuralog’s NeuraLaserColor printer in its Print Master and Office APS products offering unlimited length printing from Solaris, Linux and Windows.

Octaga’s Enterprise 2.2.5 release now offers a ‘pipeline follow,’ used to visualize a pig transiting any line in a processing plant. The feature was based on requirements from Chevron and service provider Epic Integrated Solutions Inc.

Meridium’s 2.1 release of its joint SAP development, Reliability Centered Maintenance and Optimization (RCMO), integrates fully with SAP’s Enterprise Asset Management solution. Users include Abu Dhabi Gas.

Canary Labs has announced Trend Historian 8.0 with a 3.6 million updates/second write bandwidth and scalability to 1 million tags and 100 ns resolution.

Petrolink has released its PetroDAQ system to convert rig sensor output to WITSML. The system is currently under test with ADCO in Abu Dhabi.

WellPoint Systems has announced the Intelligent Dashboard, a front end to corporate data and KPIs powered by SQL Reporting Services and data cube technology that aggregates data and measurements from WellPoint’s and third party applications.

Calsep has teamed with Neotec to couple PVTsim’s flash module with Neotec’s PipeFlo, WellFlo and ForGas simulators.


SAS unit DataFlux fleshes-out E&P data quality offering

Webcast discusses data QC and remediation strategies across geotechnical, business data.

Joe Rademacher (DataFlux) and Clay Harter (OpenSpirit) presented a webcast this month introducing the DataFlux/OpenSpirit Interactive Adapter, the fruit of a year-long collaboration (OITJ November 2008) between the horizontal data quality specialist and the geotechnical data integration vendor. The webcast showed how DataFlux’s dfPower Studio can be deployed to assess and rectify data quality issues spanning geotechnical computing and enterprise/ERP data sources.

DataFlux advocates an ‘Analyze, Improve and Control’ methodology and dfPower Studio has tools for each phase of a quality improvement project, along with ongoing sustainable management. Projects typically start with a clean-up of multiple similar attributes like company and formation names, with the option of deriving standards from the data. dfPower Studio targets business analysts who know their data but don’t necessarily have an IT background. Once established, quality rules are captured to the DataFlux Unified Repository and quality knowledge base.
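
As a purely illustrative sketch, not DataFlux rule syntax, the kind of name standardization rule described above might look like the following; the reference list, test values and fuzzy-match cutoff are invented.

```python
# Illustrative sketch (not DataFlux syntax) of the kind of rule described:
# standardize company-name variants against a reference list derived from
# the data, and flag values that match nothing for a data steward.
import difflib

STANDARD_COMPANIES = ["StatoilHydro", "Schlumberger", "Halliburton"]

def standardize_company(raw_name, cutoff=0.8):
    """Return (standard_name, ok_flag) for a raw attribute value."""
    candidate = raw_name.strip().rstrip(".")
    matches = difflib.get_close_matches(candidate, STANDARD_COMPANIES,
                                        n=1, cutoff=cutoff)
    if matches:
        return matches[0], True
    return candidate, False   # exception: route to remediation

for value in ["Statoil Hydro", "Schlumberger Ltd.", "Haliburton", "Acme Oil"]:
    print(value, "->", standardize_company(value))
```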

SAS unit DataFlux’ background is in business data quality, but the challenges of integrating the idiosyncratic world of geotechnical data have been largely solved by the OpenSpirit integration framework. This allows quality strategies and rules to span geotechnical data, ERP systems, text/XML files and enterprise databases.

Post clean up, business rules control ongoing data capture, triggering user defined events when exceptions are encountered. These can be logged or possibly spark-off a more complex remedial process involving geotechnical applications or other DataFlux jobs. More from www.dataflux.com.


Expert Center for Information Management 2009, Haugesund

Norway’s ECIM upstream data conference, with 300 attendees, is billed as the largest in the world. We report from a sample of the eight parallel ‘workstreams’ on moves in production data management, Petrel’s database extension and an upstream cloud computing trial.

The 2009 Expert Center for Information Management (ECIM) conference saw some 300 information architects and data managers from over 30 companies gather in Norway. The head count led the organizers to claim that ECIM is now ‘the biggest oil and gas data management conference in the world.’ Certainly it is the data management show with the largest number of workstreams—eight!

Knut Mauseth (Shell) provided the keynote address—a limpid resume of Shell’s analysis of the future of the industry, mankind and the planet. Shell’s analysis sees world population growth as inevitable—and, looking out to 2050, has the world population rising by 50%. The analysis further considers that ‘energy raises folks from poverty’ and that oil and gas will continue to play a major role. This inevitably leads to concerns about CO2 and global warming. Shell sees two polar scenarios—a ‘scramble’ (everyone for themselves) and a ‘blueprint’ for sustainability. The latter may be hard to achieve in a democracy—but will entail a combination of carbon tax, regulation, cap and trade, sequestration, energy savings and renewables. In case you think this is pie in the sky, Mauseth introduced the large scale carbon capture and sequestration experiment at Mongstad*, Norway where StatoilHydro, along with partners including Shell, is testing different flue gas capture technologies from Alstom and Aker.

Ian Barron (RoQC) described how the Statoil-Hydro merger has involved a ‘reduction in truth en route to higher data quality.’ The title is slightly misleading as the data merge targeted a reduced number of ‘versions of the truth’ rather than a decrease in truth per se. StatoilHydro’s data merge started by identifying all versions of the truth for each data item, determining the validity of each and finding and flagging the best. The result was a single data set of much higher quality and value than any of the input data sets. En route, data sets with uncertain provenance and lacking in metadata were discarded. The process was partially automated with lookups of approved Landmark project and stratigraphic names. A technical evaluation of the ‘best’ data sets was set off against what users wanted to keep. This involved a major data mapping exercise across GeoFrame, OpenWorks and Petrel projects—all collated to a staging database before filtering, standardization and final QC prior to capture in the corporate master OpenWorks database.

Jan Erik Martinsen (KPMG) and Morten Mønster Jensen (Abbon) investigated mature field data management. Setting the scene, Martinsen revealed that a 2009 KPMG survey of oil and gas CFOs determined that the financial crisis is indeed impacting the upstream, with companies in ‘wait and see’ mode, focused on cost cutting. A third expect no profit/loss in the next three years.

Jensen estimated that in the North Sea, a typical brownfield stands to gain a few percent of production with better analysis of production data. A holistic decision support system is needed to span production and pipeline data management, terminal management and FPSO accounting. A better common understanding across departments is needed to optimize production. The goal is to align production data management with gain/loss management and to be able to perform holistic analysis and recommend loss mitigation strategies. This implies a more integrated asset model—with tuned and verified network models. Enter Abbon’s productized solution ‘Optimum Online.’

Hafsteinn Agustsson’s presentation covered StatoilHydro’s PetrelRE (reservoir engineering) data management. PetrelRE inhabits a workflow that includes OpenWorks, Statoil’s ‘Prosty’ application and various reports and ‘unofficial’ databases. PetrelRE sees a move from a workflow controlled by multiple ASCII files to a cleaner hierarchy of data objects manipulated through a single interface. This automates job ordering and ‘forces users to structure data.’ On the downside, there are compatibility issues with RMS, with RESCUE transfer and legacy ASCII flat files. Discussions are underway with Schlumberger to add more data functionality to Petrel. The issues are of considerable import as ‘official’ models need archiving for stock exchange reporting compliance. Schlumberger’s Ocean infrastructure is being leveraged to add in other tools such as MEPO—used to control PetrelRE for history matching. Agustsson concluded that ‘for the first time we have a single application that does the whole job—from pre-processing through simulation and post-processing—replacing multiple legacy applications.’ This has led to more holistic model treatment at the expense of some data management issues currently being addressed.

Todd Olsen presented the Petrel ‘DBX’ database extension, Schlumberger’s answer to Petrel data management issues. Olsen acknowledged that ‘data managers see Petrel differently from users.’ Data management will be different from OpenWorks or GeoFrame and Schlumberger plans to support users with best practices and education on Petrel and its data. But as users already know, Petrel tends to create a multiplicity of projects. Users may be ‘successful’ while the organization may struggle. The key to Petrel data management is the Windows globally unique identifier (GUID), a unique, machine-generated identifier assigned to each and every version of every object in Petrel. When an object (well, horizon) is loaded, it gets a GUID that is persistent throughout Petrel. Petrel uses GUIDs to ‘remember’ objects and workflows that were used to build other objects, providing an audit trail of activity. Olsen presented a typical workflow involving multiple interpretations of the same seismic data set combined into reservoir models. The example made it clear that even with the GUID, managing even a relatively simple data use case is not for the faint of heart. The upside though, according to the oft-repeated Schlumberger mantra, is that ‘it’s not just data, it’s Petrel data!’ In other words, the GUID is a window into a slug of potentially informative metadata about every component of an interpretation.
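
To make the GUID idea concrete, here is a toy lineage sketch; it is not the Ocean API or Petrel’s actual data model, just an illustration of how GUID-keyed records of parent objects yield an audit trail that can be walked back from a model to its inputs.

```python
# Toy sketch (not Petrel's data model): every object version gets a GUID,
# derived objects record the GUIDs of their inputs, and the audit trail can
# be walked back from a model to its ultimate sources.
import uuid

registry = {}   # guid -> {"name": ..., "parents": [guids]}

def register(name, parents=()):
    guid = str(uuid.uuid4())
    registry[guid] = {"name": name, "parents": list(parents)}
    return guid

seismic = register("Seismic survey v1")
horizon_a = register("Top reservoir pick (interpreter A)", [seismic])
horizon_b = register("Top reservoir pick (interpreter B)", [seismic])
model = register("Reservoir model 2009-09", [horizon_a, horizon_b])

def lineage(guid, depth=0):
    """Print an object and, indented beneath it, everything it was built from."""
    entry = registry[guid]
    print("  " * depth + entry["name"] + "  [" + guid[:8] + "]")
    for parent in entry["parents"]:
        lineage(parent, depth + 1)

lineage(model)
```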

The other facet of Petrel DBX is of course the database, a new Seabed-based Oracle database that is used to mirror whole Petrel projects. This is currently a subset of Petrel objects, but with time the idea is to expand the footprint and support full bandwidth data management of everything seen in a Petrel project. Petrel DBX will be available with the 2010.1 release (December 2009) with limited data type support.

In the other corner, or rather ‘workstream,’ Landmark’s Susan Hutchinson presented OpenWorks R5000 data management strategy. Prior to R5000, OpenWorks data duplication was widespread. R5000 introduces an underlying Oracle database and from now on, OpenWorks projects become views into the data. This hides complex reference system and other data issues from end users. Interpreters ‘see through’ the database to seismic data files. R5000 also expands coverage—especially in seismic data management—and adds GeoTIFF, interwell attributes, fracture job monitoring and basin modeling from a joint development with StatoilHydro.

Jan Åge Pedersen (Tieto) described StatoilHydro’s production data management effort, which began in earnest in 2007. Production data management has proved harder to achieve than geoscience data management. This is because production assets are independent and people are not incentivized. Production data is heterogeneous, and people are entrenched and resistant to new tools. Excel and PowerPoint are the tools of choice. Production data comes from many incompatible sources and is combined in an ad hoc way to suit engineers’ requirements. Real time data will be even more ‘messy.’

StatoilHydro is now rolling out a ‘vendor and asset neutral’ production data model, leveraging Energistics’ ProdML. Data from SCADA systems, the historian and production accounting are normalized and pushed to the enterprise service bus. Initial management has been supplied by the central data management group, but ultimately the plan is to involve asset personnel in localizing the production models and handing over to a new race of project production data managers (PPDMs). Pedersen noted that ‘it is proving hard to find PPDMs with the appropriate skill set, you can’t just extend the exploration data management function.’ Statoil already has a network of project data managers for exploration data.
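
A purely illustrative sketch of the normalization step described above; this is generic Python, not ProdML or StatoilHydro’s actual model, and the well name, unit conversion and bus stub are invented.

```python
# Illustrative sketch only (not ProdML): normalize readings from different
# sources into one vendor-neutral record before publication on a message bus.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class ProductionVolume:
    well: str
    start: datetime
    end: datetime
    oil_sm3: float
    source: str          # e.g. "scada", "historian", "accounting"

def from_scada(daily_rate_bbl, well, day):
    """SCADA reports barrels per day; convert to standard cubic metres."""
    BBL_TO_SM3 = 0.158987
    start = datetime(day.year, day.month, day.day, tzinfo=timezone.utc)
    return ProductionVolume(well, start, start.replace(hour=23, minute=59),
                            daily_rate_bbl * BBL_TO_SM3, "scada")

def publish(record, bus):
    """Push the normalized record to the enterprise service bus (stub)."""
    bus.append(record)   # a real system would post XML/JSON to the ESB

bus = []
publish(from_scada(12500.0, "34/10-A-12", datetime(2009, 9, 1)), bus)
print(bus[0])
```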

David Holmes and Jamie Cruise (Fuse Information Management) wound up the proceedings with an enthusiastic presentation of potential uses for cloud computing in the upstream. In a sense, cloud computing is not really new to the upstream, as users of Norway’s Diskos data set know. But Fuse has been pushing the envelope of cloud computing with a test bed deployment of a seismic data set on Amazon’s Elastic Compute Cloud. This offers compute resources for 10 cents per hour and similarly economical data storage. The Fuse test resulted in a monthly bill of $14! A more serious proposition, storing a petabyte in the cloud, would come out at something in the range of $2 million over three years. ‘Traditional’ data hosting would cost 50% more. Amazon also offers bulk data loading and unloading services at $40/TB, avoiding potential ‘lock-in’ costs.
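
A back-of-envelope check of those figures, using only the numbers quoted above; the per-terabyte monthly storage rate is an assumption chosen to match the quoted three-year total.

```python
# Back-of-envelope check of the figures quoted above. All inputs are the
# numbers given in the talk except the monthly storage rate, which is an
# assumption chosen so 1 PB over 36 months matches the ~$2 million quoted.
compute_rate = 0.10           # $/hour for a cloud compute instance, as quoted
hours_per_month = 24 * 30

petabyte_tb = 1000            # 1 PB expressed in TB
storage_rate_tb_month = 55.0  # $/TB/month (assumed, see above)
months = 36

cloud_storage_cost = petabyte_tb * storage_rate_tb_month * months
traditional_cost = cloud_storage_cost * 1.5        # '50% more'
transfer_cost = petabyte_tb * 40 * 2               # $40/TB, loaded in and back out

print(f"One instance, full time:    ${compute_rate * hours_per_month:.0f}/month")
print(f"1 PB in the cloud, 3 years: ${cloud_storage_cost / 1e6:.1f} million")
print(f"Traditional hosting:        ${traditional_cost / 1e6:.1f} million")
print(f"Bulk load + unload of 1 PB: ${transfer_cost:,.0f}")
```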

* www.oilit.com/links/0909_1.

This article is an abstract from The Data Room’s Technology Watch from the 2009 ECIM. More information and samples from www.oilit.com/tech.


PPDM User Meet—Perth, Australia

Chevron, Petrosys, Laredo Energy and Geoscience Australia discuss PPDM implementations.

Some 60 attended the Australian meet of the Professional Petroleum Data Management association in Perth this month. Chellie Hailes gave an update on Chevron’s use of the PPDM data model in a master data repository (OITJ January 2008).

Petrosys MD Volker Hirsinger described a more shrink-wrapped approach to PPDM deployment—via Petrosys’ dbMap application. Hirsinger outlined drivers for corporate PPDM master data stores—these include joint venture data sharing, geographically dispersed teams and ‘the quest for a digital oilfield.’ Case histories included one company that wanted to rationalize a substantially duplicated data set following an acquisition and rapid data growth. M&A is a frequent driver in PPDM deployments.

IT challenges vary across deployments; Hirsinger reported some issues catering for Oracle access from diverse Windows desktops. Elsewhere, outsourcing solves a lot of potential problems, including loading of legacy data. Petrosys noted that end user acceptance varied from ‘somewhat disappointing’ through to users becoming so enthusiastic that they were pushing for scope creep in the data management project. Data management can be challenging, especially when merging legacy data sets with different coordinate reference systems. Finding staff to perform geodetic reconciliation can be hard. Overall, PPDM was reported to be ‘more effective than using a major vendor’ on cost.

Steve Jaques (Laredo Energy) and Steve Cooper (PPDM and EnergyIQ) described deployment of a PPDM 3.8 well master data repository alongside a suite of data management tools and ESRI’s ArcGIS Server. The system provides SMT Kingdom project builds to Laredo’s interpreters and well and land data feeds to Laredo’s in-house developed ‘iOps’ data browser. PPDM was selected as a ‘comprehensive, reliable, and flexible’ data model with good data management support including data quality and auditing. Also critical was the fact that PPDM is supported by a large number of vendors and an active community of experts.

Laredo’s database contains over 50,000 well records and close to 10 million production records. Daily updates from IHS run in under five minutes. Laredo encountered a potential issue implementing the PPDM master store on Microsoft’s SQL Server—which has some limitations on the number of foreign keys that can be attached to a single table. Laredo’s PPDM deployment only covers a small subset of the tables and so this was not a fatal issue.

SpectrumData CEO Guy Holmes provided an entertaining analysis of data ‘routes to the graveyard,’ subtitled ‘getting away with murder.’ Data ‘dies’ in three ways—from ‘natural causes,’ i.e. ones that can be anticipated such as media aging, from ‘manslaughter,’ i.e. accidental destruction, and from ‘premeditated murder.’ Paradoxically, ‘murder’ is the correct way for a data ‘lifecycle’ to end—i.e. with ‘willful destruction of data after rationally considering the timing and method of doing so.’ Natural causes and data manslaughter are the ones to avoid! Holmes presented an exhaustive list of ways your data can die and recommended strategies for mitigation. In the case of a failure to implement such a program, Holmes suggests you ‘get a good solicitor.’

David Rowland presented Geoscience Australia’s new data management and data entry system, parts of which leverage the PPDM data model. The GA well data download site is at links/0909_2. GA has executed a four person-year effort to QC its well data set. Some 88,000 data values from 2,200 wells have been checked and fixed. Corrections were made to 60% of GA’s well locations, many significant, and thousands of comments on values and data sources were added. GA’s improvement project leveraged the PPDM audit model. The result is a ‘trustworthy data set with verified and verifiable data—eliminating the need for repeated data checking.’ More from www.ppdm.org.


Folks, facts, orgs ...

This month’s newsmakers hail from ABB, Aclaro, Ernst & Young, Coreworx, Calsep, CGGVeritas, TAQA, FMC Technologies, Gas Certification Institute, D-RoTH, Geokinetics, Getech, Halliburton, Hampson-Russell, IDS, KBR, Landis+Gyr, Merrick, Meyer & Assoc., Norsar, Object Reservoir ...

Brice Koch has been appointed Head of Marketing and Customer Solutions for ABB.

Scott MacFarlane has joined Aclaro as president and COO.

Energy trading software house Amphora, Inc. has named John Beaty as president, Americas. Beaty was formerly with Zytax.

Ernst & Young has appointed Dale Nijoka to lead its Global Oil & Gas practice.

Coreworx has appointed Erik Vander Ahe VP of Professional Services, Paul Harapiak Chief Architect and Nick Clemenzi as CFO.

Pashupati Sah heads-up Calsep’s new location in Kuala Lumpur, Malaysia.

CGGVeritas/TAQA’s ARGAS Middle East joint venture has opened a Technology Center in Al Khobar, Saudi Arabia. CGGVeritas has also signed an R&D agreement with the Western Australian Energy Research Alliance.

Jay Nutt has been appointed Vice President of FMC Technologies.

The Gas Certification Institute is partnering with alarm management specialist D-RoTH to provide training in fundamentals of SCADA.

Executive VP Jim White is to resign from Geokinetics in February, 2010.

Richard Tyson has joined Getech as a senior geochemist, and Peter Kovac as structural geologist.

Nance Dicciani has been named to Halliburton’s board. Dicciani was formerly president and CEO of Honeywell’s Specialty Materials unit.

Liang Shen is geophysics advisor at Hampson-Russell’s new Beijing office.

Douwe Franssens, General Manager at IDS has been elected to the newly created WITSML Executive Team.

KBR has appointed Mitch Dauzat President Gas Monetization and Roy Oelking Executive VP Oil and Gas. Both units are part of the company’s newly formed Hydrocarbons Group.

Landis+Gyr has appointed Heath Thompson as VP and CTO, North America. Thompson was previously with ISS and IBM.

Merrick Systems has named Norman Kroon Business Development Manager. Kroon hails from Schlumberger.

Larry Eakin is joining Meyer & Associates.

NetApp has appointed Tom Georgens as president and CEO, to succeed Dan Warmenhoven.

Norsar Innovation has recruited Carlos Eiffel Arbex Belem, principal of Ies BrazilConsultoria, as Brazilian and South American representative, and Patricia Lugao of Stratalmage for technical support. Mike Branston is to head up the new office in Kuala Lumpur, Malaysia, and Hugo Moen and Chris Watts join as Sales Managers for, respectively, Europe, Africa, Asia and Australia, and North and South America.

Object Reservoir has announced a Collaborative Exploitation Project (CEP) for the Montney Shale in Western Canada.

Schlumberger’s PetroMod unit has hired Matthias Gross as Software Developer, Jaron Lelijveld as geologist, Alan Monahan as Technical Writer and Nicola Tessen as Product Champion.

Millennial Net has joined the MIT Energy Initiative (MITEI).

James May has joined Universal Well Site Solutions as VP marketing. May was formerly with Nalco Company’s Energy Services.


Done deals

Baker Hughes, Fugro, Geoservices, IHS, Macquarie Group, Wood Group, Recon, WellPoint, more.

Baker Hughes is to acquire BJ Services in a $5.5 bn deal.

Fugro is acquiring marine EM data specialist Interaction AS and has also entered into a global cooperation agreement with Electromagnetic Geoservices (EMGS). Fugro is providing a NOK 150 million interest-bearing loan to EMGS. The company also acquired General Robotics Limited (GRL), a supplier of simulation and visualization software, for an undisclosed sum.

Geoservices Group has acquired seismic and real-time pore pressure and rock property analysis specialist Petrospec Technologies. Mark Herkommer is now CEO of Geoservices’ Petrospec Division, reporting to Jean-Pierre Poyet, executive VP and CTO.

IHS has bought LogTech (Canada), for CA$3.3 million. The deal includes the LogArc software suite which adds a log data management solution to IHS’ offerings.

Macquarie Group has completed its acquisition of Calgary-based Tristone Capital Global.

Wood Group unit Mustang International is acquiring a majority interest in Al-Hejailan Consultants through a newly established joint venture, Mustang Al-Hejailan Engineering. The unit will provide engineering and project management services to Saudi Arabia’s oil, gas and chemicals industries.

PanGeo Subsea has secured a multi-million dollar investment to support commercialization of its acoustic imaging technology and to increase its product offerings. The investment is led by Lime Rock Partners and CTTV Investments LLC, the venture capital arm of Chevron Technology Ventures.

Chinese oil and gas automation services provider Recon Technology has announced its initial public offering of 1,700,000 ordinary shares at $6.00 per share. Shares trade under the ‘RCON’ ticker on the Nasdaq.

The French Strategic Investment Fund has acquired a 5% stake in Technip.

WellPoint Systems is to receive a minimum US$3.3 million payment from Export Development Canada in respect of an accounts receivable insurance policy covering commercial risks associated with a ‘South American customer.’


17th GeoInformatics 2009 meet looks at oil and gas GIS

Presentations on GeoScience Information Network, oil spill detection, large scale data visualization.

Jess Kozman (CLTech), with co-authors from Schlumberger and the Arizona Geological Survey, showed how MetaCarta’s geographic place-name-based search engine can leverage the content of the newly created GeoScience Information Network (GIN). GIN is a unified front end to over 3,000 data stores maintained by US geological surveys. GIN leverages open source standards from the Open Geospatial Consortium, a data exchange standard based on the GeoSciML geoscience mark-up language and a ‘prototype’ catalog web service for data discovery. Kozman argued that the use of natural language processing and geographic search applications like MetaCarta improves on keyword search of more structured data sets. The technology can be considered to provide the US with a ‘virtual’ National Data Repository.

Anwar Dahish (Malaysia Technological University) showed how remote sensing with space borne radar has been used to delineate oil spills in coastal areas. Despeckling, image segmentation and digital classification, and geo-referencing of remote sensing data were used to determine the relative impact of spills of different origin. Digital filtering enabled fingerprinting of light and heavier petroleum products on the water surface.

Upendra Dadi (Oak Ridge National Laboratory) described the query and visualization of extremely large network datasets over the web using ‘quadtree’ based KML. Quadtrees, a spatial indexing technique, allow for scale-dependent selection of image or vector features.
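
The idea can be illustrated with a tile-key sketch (not the authors’ code): a point is mapped to a quadtree key whose length equals the zoom level, so a viewer need only request the detail appropriate to the current map scale.

```python
# Minimal quadtree-key sketch (not the authors' code): map a lon/lat point
# to a tile key at a given zoom level. Longer keys mean finer tiles, so a
# viewer can request only the detail appropriate to the current map scale.
def quadkey(lon, lat, zoom):
    # Normalize lon/lat to [0, 1) in a simple equirectangular frame.
    x = (lon + 180.0) / 360.0
    y = (90.0 - lat) / 180.0
    key = ""
    for _ in range(zoom):
        x, y = x * 2, y * 2
        ix, iy = int(x), int(y)
        key += str(iy * 2 + ix)        # quadrant digit 0..3
        x, y = x - ix, y - iy          # recurse into the chosen quadrant
    return key

# Nearby points share a key prefix; prefix length reflects proximity.
print(quadkey(5.32, 59.41, 8))    # Haugesund area, coarse tile
print(quadkey(5.33, 59.41, 12))   # same area, finer tile
```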


GE explains SemStar5 contribution to ‘digital oilfield’

Chau Nguyen explains how improved bandwidth is moving control from subsea to topside.

More from www.ge.com/oilandgas.


BP Logix Workflow Director standardizes business processes

Oilfield chemicals provider moves from email to workflow-based HR, accounting and IT.

Oilfield chemicals provider Multi-Chem has selected BP Logix’ Workflow Director application to improve its business processes. Workflow Director will be used to create workflows and forms to support business users across HR, accounting and IT. Previously, Multi-Chem users accessed information on capital expenditures, new hire authorizations, time off, training requests and expense forms through its company portal. But Multi-Chem wanted to improve on email as a means of tracking the status of a request and turned to a workflow-based solution, based on currently used processes and controls.

Fernando Coronado, Multi-Chem’s applications manager, said, ‘All of our processes are now captured with Workflow Director and are more fluid and efficient. Our forms are less likely to get lost - and everyone knows where to find the forms in order to complete them. Take-up of the new intuitive, browser-based system has been great. Workflow Director is a very powerful tool.’ More from www.bplogix.com.


ExxonMobil’s programming requirements

Job posting offers peek at what’s hot in supermajor’s R&D unit.

Job ads and Facebook postings provide an interesting way of finding out what companies are up to in the IT field. The usually somewhat secretive ExxonMobil opened up a little in a recent job advertisement in the venerable Oil and Gas Journal. Exxon’s Annandale, NJ-based Research and Engineering company is seeking to hire optimization and logistics engineers to work on supply chain optimization using ‘large-scale, mixed integer programming.’ Required tools of the trade include C++/Java, and SQL on DB2, Teradata and Informix for data analysis. As if that didn’t narrow down the field enough, a further requirement of ‘experience of railroad industry forecasting’ should just about nail down the perfect candidate. But will a railroad industry forecaster be reading the O&GJ’s small ads, one wonders?
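
For readers wondering what a mixed integer program looks like in a supply chain setting, here is a toy sketch using the open source PuLP library; the refineries, depots, demands and costs are all invented.

```python
# Toy mixed-integer program (invented plants, depots and costs) of the kind
# used in supply-chain optimization. Assumes the open source PuLP library.
from pulp import LpProblem, LpVariable, LpMinimize, lpSum, LpBinary

plants = {"refinery_A": 300, "refinery_B": 200}            # capacity
depots = {"depot_1": 180, "depot_2": 120, "depot_3": 150}  # demand
cost = {("refinery_A", "depot_1"): 4, ("refinery_A", "depot_2"): 6,
        ("refinery_A", "depot_3"): 9, ("refinery_B", "depot_1"): 5,
        ("refinery_B", "depot_2"): 3, ("refinery_B", "depot_3"): 7}
fixed_cost = {"refinery_A": 400, "refinery_B": 250}        # cost of running a plant

prob = LpProblem("supply_chain", LpMinimize)
ship = {pd: LpVariable(f"ship_{pd[0]}_{pd[1]}", lowBound=0) for pd in cost}
open_ = {p: LpVariable(f"open_{p}", cat=LpBinary) for p in plants}

# Objective: transport cost plus fixed cost of the plants we actually run.
prob += lpSum(cost[pd] * ship[pd] for pd in cost) \
      + lpSum(fixed_cost[p] * open_[p] for p in plants)

for d, demand in depots.items():              # meet every depot's demand
    prob += lpSum(ship[(p, d)] for p in plants) >= demand
for p, cap in plants.items():                 # respect capacity, only if plant is open
    prob += lpSum(ship[(p, d)] for d in depots) <= cap * open_[p]

prob.solve()
for pd, var in ship.items():
    if var.value() and var.value() > 0:
        print(pd, var.value())
```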


Sales, contracts and deployments

AGR, Chevron, Fluor, Aveva, Clariant, AMEC, FMC, CGGVeritas, GEDCO, GE Oil & Gas, Energy Solutions, Invensys, Landmark, IBM, IDS, Oildex, Quorum, Siemens, VMware and more...

AGR Subsea has signed a contract with Chevron USA to perform Integrated Project Management and Engineering for the build, deployment and prove-up of the Dual Gradient Drilling System.

Fluor has extended its contract with AVEVA, and now has on-demand access to the full AVEVA Plant portfolio.

Clariant Oil Services has been awarded a three-year contract valued at about US $20 million by ConocoPhillips China for integrated services at its Peng Lai 19-3 oil field in Bohai Bay, China.

AMEC has been selected by Esso Exploration Angola to provide design, procurement and logistics support services for the Kizomba Satellites project on Block 15, offshore Angola. The project will be executed by Paragon Angola, AMEC’s joint venture with Prodiaman in Luanda.

StatoilHydro has awarded a $73 million contract to FMC Technologies for the manufacture and supply of subsea equipment for its Gullfaks oil and gas field in the Norwegian North Sea. FMC was also awarded a $90 million contract by Petrobras to supply a subsea separation system at the Campos Basin’s Marlim field.

CGGVeritas’ manufacturing unit Sercel has established a strategic cooperation with software house GEDCO. The agreement lets Sercel resell Gedco’s hardware and the OMNI 3D survey design and VISTA seismic processing packages. Sercel and GEDCO are to unveil a ‘complete seismic acquisition package’ at the SEG Annual Meeting in Houston next month.

GE Oil & Gas has been awarded a contract to provide Esso Exploration Angola with subsea production equipment for the Kizomba Satellites deepwater oilfield.

Energy Solutions International is to provide its PipelineManager application to Ecopetrol’s Oleoducto de los Llanos Orientales unit for pipeline management and leak detection on the 235km, environmentally sensitive Rubiales-Monterrey pipeline in Colombia.

Invensys is to supply a SimSci-Esscor-based operator training simulator to Petronas’ Tiga Sdn Bhd site, the ‘world’s largest integrated LNG plant.’ A total of eight production trains will provide a 23 million tons per annum capacity. The solution also embeds Invensys’ Dynsim LNG modeling technology and Foxboro I/A Series emulators.

Landmark has renewed its global technology and services agreement with BP. The three-year extension gives BP continued access to a broad suite of Landmark technology and petro-technical consulting services. Software covered in the agreement includes OpenWorks applications for seismic processing, geophysical and geological interpretation, reservoir simulation and drilling engineering.

IBM has signed a seven-year strategic IT outsourcing agreement with Korean refiner S-OIL Corp. IBM and S-OIL are to establish a ‘Value Creation Center’ to ‘capitalize on IBM’s expertise and S-OIL’s industry knowledge and market insights.’ The VCC’s first project is an ‘advanced IT system for the energy industry.’

IDS has signed a new contract to provide DrillNet-based drilling reporting services to Addax Petroleum.

Newfield Exploration has adopted Oildex’ Spendworks to streamline its invoice workflow and approval processes.

Mid American Natural Resources has licensed Quorum’s gas marketing, query and reporting toolset to manage its gas marketing, fair value mark-to-market accounting and gas scheduling activities.

Siemens has concluded a partnership agreement with virtualization specialist VMware for the delivery of ‘innovative virtualization solutions.’


Standards stuff

New from PODS, US Dept of Transportation, API, Energistics and SEG

Speaking at the fall meet of the Pipeline Open Data Standard Association (PODS), John Jacobi of the US Department of Transportation’s Pipeline and Hazardous Materials Safety Administration (PHMSA) reported on PHMSA’s transformation into a ‘risk-based and data-driven organization.’ This centers on a new ‘One Rule’ regulatory reporting environment that is intended to ‘promote consistency in incident reporting for gas transmission and distribution systems, liquid transmission systems, and LNG facilities.’ The goal is for PHMSA to collect the better data needed to analyze trends in pipeline damage and incidents. The new regime sees the creation of a National Registry of Pipeline and LNG Operators, LNG incident reporting and new forms for safety and offshore pipeline condition reporting. One Rule also makes electronic report filing mandatory. More from links/0909_5.

The American Petroleum Institute (API) has rolled out a new website to provide users of API-certified products with an opportunity to ‘confidentially communicate’ concerns they may have regarding nonconformance with API’s specifications and standards. Whistleblowers can log on at api.org/ncr.

Energistics has kicked off a new ‘Standards for Energy Industry Use of Metadata’ initiative that sets out to provide stakeholders with an opportunity to improve operational efficiency within the community through ‘pragmatic and judicious adoption of metadata standards and best practices.’ The initiative promises a future in which all stakeholders exploit a common set of metadata standards and guidelines to enable efficient cataloging, discovery, evaluation and retrieval of available information resources, regardless of whether those resources are hosted internally or externally to their organization. More from links/0909_6.

Energistics is also working on Version 2.0 of the PRODML standard’s services, notably with specifications for ‘limited capabilities of a shared asset model concept.’ The new release will include the means to query instances of shared asset model services for available components.

Project leader Jill Lewis (Troika) announces that the SEG-D 3.0 seismic acquisition standard has been submitted for final approval and ratification. Lewis emphasizes that processors, data managers and navigation specialists should familiarize themselves with the new standard as soon as possible.


Invensys outsources 400 employees, rolls-out Olga/Dynsim interface

Cognizant now ‘global development partner’. Another deal links DynSim with SPT’s Olga.

Invensys has signed a five-year agreement with Cognizant, a provider of consulting, technology and business outsourcing, to help improve the design, development and delivery of its products worldwide. Cognizant will serve as Invensys’ global development partner.

Cognizant is to recruit 400 Invensys development and maintenance employees. The outsourcing includes Invensys’ Hyderabad development team. Invensys CEO Sudipta Bhattacharya said, ‘This relationship will accelerate our ability to expand capacity and move into adjacent markets. Customers and partners should see no difference in our products or their working relationship with us. By leveraging our combined expertise and capabilities customers should benefit in terms of faster time-to-market on new solutions, innovation, resources and a deeper focus on the design and development of our strategic product portfolio.’

In a separate announcement, Invensys has introduced an improved multi-phase flow simulator for the upstream oil and gas market. The new solution links Invensys’ DynSim simulator with SPT Group’s flagship Olga multi-phase simulator of wells, pipelines and receiving facilities.

Invensys VP Tobias Scheele said, ‘The new interface includes improved numerical integration, drag-and-drop configuration and the ability to view dynamic profiles of key Olga parameters within DynSim.’

Use cases targeted by the new solution include design, control system checkout and operator training of platforms, FPSOs, subsea-to-beach and the new floating LNG plants. Flow assurance engineers can now build holistic models of such complex assets and ‘virtually test’ the plant’s integrated process and control systems - reducing control system commissioning time. More from www.invensys.com.


Expand Networks to optimize oil and gas WAN

Oil and gas wholesaler Jim Woods Marketing deploys Virtual Accelerator at data center.

Oklahoma-headquartered oil and gas wholesaler Jim Woods Marketing (JWM) has deployed ‘virtual network accelerator’ technology from Expand Networks of Slough, UK. Jim Woods has virtualized applications at its data center with VMware and is now leveraging Expand’s WAN optimization technology to enhance performance. JWM uses a bandwidth-intensive logistics application whose performance degraded following virtualization.

JWM IT director, John Parnell, said, ‘With the virtualization project holding the business data over 100 miles away, WAN users were faced with latency and congestion challenges that compromised business efficiency. Bandwidth was inconsistent and choppy and application performance was sub-optimal.’ JWM compared Expand with Riverbed’s accelerators before selecting Expand for its capability, cost and low TCO.

Expand’s Virtual Accelerator uses byte-level caching and dynamic compression to optimize bandwidth. ‘Layer 7’ quality of service has also enhanced JWM’s SIP-based VoIP calls, which were previously prone to packet loss and jitter. Parnell concluded, ‘The Accelerator has significantly improved user productivity and business efficiency at the company. Expand is delivering four times the bandwidth on the same infrastructure.’ More from www.expand.com.
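For the curious, the principle behind byte-level caching is simple enough to sketch. The Python snippet below is a toy illustration and in no way Expand’s implementation: both ends of the link keep an identical store of previously seen data chunks, a short hash reference is sent when a chunk repeats and new data falls back to zlib compression.

    # Toy byte caching + compression for WAN optimization (illustrative only).
    import hashlib, zlib

    class ByteCache:
        """Both ends of the link maintain an identical chunk store."""
        def __init__(self):
            self.store = {}

        def encode(self, chunk: bytes):
            key = hashlib.sha1(chunk).digest()
            if key in self.store:              # seen before: send a 20-byte reference
                return ("ref", key)
            self.store[key] = chunk            # new data: cache it and compress it
            return ("raw", zlib.compress(chunk))

        def decode(self, kind, payload):
            if kind == "ref":
                return self.store[payload]
            chunk = zlib.decompress(payload)
            self.store[hashlib.sha1(chunk).digest()] = chunk
            return chunk

    sender, receiver = ByteCache(), ByteCache()
    for chunk in [b"SELECT * FROM shipments" * 50, b"SELECT * FROM shipments" * 50]:
        kind, payload = sender.encode(chunk)
        assert receiver.decode(kind, payload) == chunk
        print(kind, len(payload), "bytes on the wire vs.", len(chunk), "bytes of data")

Repetitive traffic, such as a logistics application fetching similar records over and over, is exactly the kind of redundancy that WAN accelerators exploit.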


3D intelligent CAD/CAM meets PLM on rig build

Paper by ShipConstructor IM guru outlines the state of the art in ‘intelligent’ design and construction.

Oskar Lee of Victoria, BC, Canada-based ShipConstructor Software has supplied Oil IT Journal with a paper* describing the state of the art in combined computer aided design/manufacturing (CAD/CAM) and intelligent product lifecycle management (PLM) as applied to the design and construction of offshore oil rigs. Lee describes the ongoing ‘revolution,’ driven by the offshore engineering boom, in 3D engineering models and associated databases. Today, database-driven CAD/CAM systems are used to build different designs of jackup rigs and production vessels, managing multiple complex engineering components and accommodating repetitive and last-minute design revisions from owner operators and classification societies. Such systems can also integrate with Enterprise Resource Planning (ERP) systems to drive supply chain efficiencies and optimize investments.

Lee suggests that some technological innovation amounts to a ‘distressed purchase,’ in that it is only when an engineering company is faced with near-insurmountable documentation, management and resource problems that a move to an integrated manufacturing system is initiated. Modeling the oil rig construction process involves everything from design to maiden voyage. Clients want an efficient product, designed and engineered to the highest standard and built to the best quality within a planned timescale. But such simple models fail to embrace the complex interactions between stakeholders such as designers, class societies and owner-operators. Even the simple task of ensuring that all project documentation is available to the appropriate stakeholders in a timely and accurate fashion is a challenging objective in view of the immense data volumes. The aerospace industry migrated to computer-based management tools when it realized that the sheer weight of documentation was overtaking the weight of the airplane!

Lee’s paper outlines a number of real-world builds where ShipConstructor’s AutoCAD-based engineering package and ‘Database Driven Relational Object Model’ were successfully deployed. The common project database approach links into specialists’ design and management tools and produces a quality product through a well coordinated production process. The initial pain of migration to CAD/CAM/PLM is worth it in the long term.

* www.oilit.com/papers/shipconstructor.htm.


Knowledge Support Systems introduces hosted fuel pricing

Software-as-a-Service automates fuel pricing and customer relationship management.

Manchester, UK-headquartered Knowledge Support Systems (KSS) is rolling out ‘PriceNet SaaS,’ a hosted solution that offers fuel price management without upfront technology investments. The new solution provides automated fuel price management leveraging the software-as-a-service (SaaS) paradigm. All applications and customer databases are managed by KSS on its servers. SaaS eliminates the need for end users to deploy servers and other software and cuts IT maintenance costs. A ‘pay as you go’ licensing model includes software and model maintenance and support—allowing users to fine-tune the system to optimize performance.

KSS also announced that it has received a patent from the U.S. Patent and Trademark Office for the ‘intellectual property and processes’ underpinning its RackPrice fuel price management system for wholesalers. The patent covers the RackPrice algorithms, calculations and formulas that help visualize the competitive landscape for each fuel site, review historical pricing data and analyze that information in real time to determine the optimal fuel price that will maximize demand and improve profits for any location at any given time.

KSS CEO Bob Stein said, ‘Like our PriceNet system for fuel retailers, RackPrice was built from the ground up using advanced scientific and mathematic formulas at the core of the application. These were tuned to the fuel pricing process to give wholesalers, jobbers and marketers the automated tools and information needed to efficiently and accurately set competitive wholesale fuel prices.’

RackPrice 2.0 was released in February 2009 with enhancements to usability and pricing efficiency. The new features let wholesalers continuously review the competitive landscape and optimize pricing. More from www.kssg.com.
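To make the price-optimization idea concrete, here is a deliberately simple sketch in Python that is in no way KSS’s patented algorithm: assume a constant price-elasticity demand curve calibrated to one historical observation, then search for the retail price that maximizes daily margin. All numbers are invented.

    # Toy fuel price optimization: pick the price that maximizes expected margin.
    # Illustrative only - not KSS's patented RackPrice method; all figures invented.
    COST = 2.30          # wholesale cost per gallon
    BASE_PRICE = 2.60    # reference retail price
    BASE_VOLUME = 10000  # gallons/day sold at the reference price
    ELASTICITY = -5.0    # assumed constant price elasticity of demand

    def volume(price):
        """Constant-elasticity demand curve calibrated to the reference point."""
        return BASE_VOLUME * (price / BASE_PRICE) ** ELASTICITY

    def profit(price):
        return (price - COST) * volume(price)

    # Grid search over candidate prices in one-cent steps, $2.40 to $3.00
    candidates = [round(2.40 + 0.01 * i, 2) for i in range(61)]
    best = max(candidates, key=profit)
    print(f"best price ${best:.2f}, daily margin ${profit(best):,.0f}")

A commercial system would of course fit the demand model per site from historical and competitor data and refresh it continuously, which is where the real-time analysis described above comes in.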


EON Reality—VR training for geologists, seismic profilers

Virtual reality specialist supplies rock outcrop and vertical seismic profile simulators.

Irvine, CA-based EON Reality has developed a virtual reality training system (VRTS) for Saudi Aramco. The system simulates real-world objects such as drilling rigs, borehole sub assemblies and even rock outcrops. Aramco has prioritized two use cases for the VRTS, in rock outcrop simulation and planning vertical seismic profiles (VSP).

The rock outcrop simulator leverages high resolution LIDAR data to provide multiple levels of detail, texture mapping and interactive navigation and manipulation of the 3D model. VSP is a key technique used to calibrate and tie a seismic survey to well data. The new solution provides a ‘realistic and detailed’ environment for demonstrating and training users on VSP acquisition. The simulator embeds the basic physics of wave propagation through rock and ‘fully functional’ equipment as used on-site. The VRTS simulates the VSP acquisition process with real-time feedback in a scalable 3D environment and interfaces to existing geoscience applications.
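To make the tie-in idea concrete, the following sketch, unrelated to EON’s software, shows the checkshot-style arithmetic at the heart of a VSP: cumulative vertical travel time through a layered velocity model, which is what lets interpreters convert well depths to seismic two-way time. The velocity model is invented.

    # Toy VSP / checkshot calculation: vertical travel time through layered rock.
    # Illustrative only - the layered velocity model is invented.
    # (top_depth_m, bottom_depth_m, interval_velocity_m_per_s)
    layers = [(0, 500, 1800), (500, 1500, 2400), (1500, 2500, 3200)]

    def one_way_time(depth_m):
        """One-way vertical travel time (s) from surface to a receiver depth."""
        t = 0.0
        for top, bottom, v in layers:
            if depth_m <= top:
                break
            thickness = min(depth_m, bottom) - top
            t += thickness / v
        return t

    # Two-way time is the domain in which the surface seismic section is displayed
    for depth in (500, 1000, 2000):
        print(f"receiver at {depth} m: two-way time = {2000 * one_way_time(depth):.0f} ms")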

The system is now being extended to other applications such as training in and retention of drilling procedures, borehole assembly and surface facility design, and improved safety. The VRTS provides a scalable 3D VR environment, from laptop, through 3D power wall, to immersive four-wall ‘ICUBE’ environments. EON Reality clients include Atlas Copco, BP, Bechtel, Boeing and Intel. More from www.eonreality.com.


Palantir advocates application of game theory to E&P

Bart Willigers to present game theory paper at upcoming SPE New Orleans meet.

Palantir Economic Solutions has been advocating the application of game theory to E&P decision support, with road shows in Europe, the US and an upcoming presentation at the New Orleans SPE meet. Palantir’s Bart Willigers, along with researchers from the University of Stavanger, has been working to apply game theory (made famous by the book and film ‘A Beautiful Mind’) to a variety of upstream scenarios. The idea behind game theory is that the profit of each stakeholder depends on the strategies of all others. Willigers suggests using game theory to model decision making in joint venture operations and government relations.
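As a hedged illustration of the idea, and not Willigers’ model, consider a toy game in which two joint venture partners each choose to ‘accelerate’ or ‘delay’ development of adjacent blocks. The payoffs are invented and the pure-strategy Nash equilibria are found by simple enumeration.

    # Toy two-player game: each JV partner chooses to Accelerate or Delay.
    # Payoffs (partner A, partner B) in $ million are invented for illustration.
    payoff = {
        ("Accelerate", "Accelerate"): (40, 40),
        ("Accelerate", "Delay"):      (70, 20),
        ("Delay",      "Accelerate"): (20, 70),
        ("Delay",      "Delay"):      (55, 55),
    }
    strategies = ["Accelerate", "Delay"]

    def is_nash(a, b):
        """Neither player can gain by unilaterally switching strategy."""
        pa, pb = payoff[(a, b)]
        best_a = all(payoff[(alt, b)][0] <= pa for alt in strategies)
        best_b = all(payoff[(a, alt)][1] <= pb for alt in strategies)
        return best_a and best_b

    for a in strategies:
        for b in strategies:
            if is_nash(a, b):
                print(f"Nash equilibrium: A={a}, B={b}, payoffs={payoff[(a, b)]}")

The single equilibrium in this toy is the familiar prisoner’s-dilemma outcome: both partners accelerate even though joint delay would leave each better off, exactly the kind of interaction a single-decision-maker analysis cannot capture.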

The E&P industry develops complex economic models to gain insight into the commercial attractiveness of joint ventures, but these ignore the influence of other stakeholders. In fact, Palantir notes that of the SPE’s library of 50,000 papers, some 6,300 cover ‘risk,’ around 800 cover real options and only one covers game theory! This despite the fact that game theory has generated two Nobel prizes.

Palantir suggests moving forward from classical decision analysis that focuses on a single decision-maker facing an uncertain environment to a paradigm that takes account of multiple competing players with ‘diverging interests, objectives, and influence.’ Palantir’s offering in this space is an extension of its Rapid Portfolio Evaluation (RPV) methodology. This leverages Palantir’s Spotfire-powered CASH and PLAN packages along with third party data from WoodMac and IHS. More from www.palantirsolutions.com.

