November 2008


Total’s Field Monitoring

An ‘agile’ IT infrastructure centered on OSIsoft’s PI System and Microsoft BizTalk is key to Total’s drive to maximize production and reserves through more efficient oilfield management.

Speaking at the OSIsoft user group in Amsterdam this month, Total’s Pierre-Henry Tincelin and Xavier Lacoux described upstream monitoring as ‘complex,’ with data coming from diverse sources such as wells, subsea networks, 4D seismic and production. Communications are very different compared with the downstream—operators have to deal with low bandwidth and high latency satellite connections.

Total’s Field Monitoring (FM) R&D program was launched two years ago to maximize production and reserves by more efficient field management and to optimize monitoring with an ‘agile’ IT architecture. Reservoir, well and plant monitoring use a variety of applications such as production analysis and gas lift optimization. The program started in Qatar in 2004 with the development of a gas lift tool. FM involves making connections to multiple systems—well simulators, transient and steady state pipeline networks. A framework of identical tools is used at all E&P sites—with a constellation of localized applications built around the core. Total uses marketplace software ‘as far as possible.’

Energistics has been involved since last year with the development of XML-based objects used across applications. The challenge is to promote collaboration around quality real time data. Previous attempts have involved the ‘spaghetti approach,’ with many-to-many connectivity. Total is now moving towards a more integrated solution with two focal points, PI and BizTalk. XML data exchange is making systems more flexible and reactive, ‘nearly plug and play.’

A data layer includes the PI real time database, lab, production and reservoir data and SAP. Above this sits the application layer, with data analysis tools and business applications such as SAP PM. An integration layer comprises the asset model and BizTalk for orchestration, routing, transformation and connectivity. Web services are used to encapsulate databases.

OSIsoft’s latest PI AF 2.0 technology is used for asset models and metadata. PI AF allows Total to handle dynamic asset models—changing manifold status, routing and separators. PI RtWebParts provide a tree view of the infrastructure and deliver trends and synoptics. Total has developed its own custom web parts to integrate its SharePoint/BizTalk infrastructure. PI-ACE notifications are broadcast to users through the enterprise service bus. For example, during a well test, PI sends an indication that flow stability has been reached. The enterprise service bus routes notifications to email or smart phones.
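
To make the notification pattern concrete, here is a minimal publish/subscribe sketch of routing over a service bus—our illustration, not Total’s code; the topic name, message and delivery channels are invented:

    from collections import defaultdict

    class ServiceBus:
        """Toy enterprise service bus: fans notifications out to subscribers."""
        def __init__(self):
            self.subscribers = defaultdict(list)   # topic -> list of callbacks

        def subscribe(self, topic, callback):
            self.subscribers[topic].append(callback)

        def publish(self, topic, message):
            for callback in self.subscribers[topic]:
                callback(message)

    bus = ServiceBus()
    # Stand-ins for the email and smart phone gateways.
    bus.subscribe("well_test.flow_stability", lambda m: print("EMAIL:", m))
    bus.subscribe("well_test.flow_stability", lambda m: print("SMS:", m))

    # A PI-ACE-style calculation decides stability is reached and publishes.
    bus.publish("well_test.flow_stability",
                "Well A-12: flow stable for 30 minutes, test data ready")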

Total is bullish about its PI/BizTalk-based SOA architecture—‘it works!’ Total’s time line for FM means that two years after deployment, ‘everyone is using it.’ More from the OSIsoft user group on page 6 of this issue.


Open Source WITSML

Petrolink is releasing ‘Powerstore,’ a WITSML-based well site data server and query engine, as open source software to ‘promote take-up across E&P.’

Petrolink is to release its ‘Powerstore’ WITSML server and toolset as open source software. The release includes the WITSML store for Oracle and SQL Server, along with the WITSML Query Workbench. The initial release will be an ‘open license’ runtime with source code to follow once Petrolink’s lawyers have advised on the license model. A new website, witsml.net, has also been announced for public discussions and software download.

Petrolink CEO Jon Curtis said, ‘This move is meant to promote WITSML take-up across the E&P community, reducing startup costs for both service companies and operators. Our core business is service—we are not selling software.’

We asked Petrolink if the open source offering would extend beyond the Microsoft .NET implementation that has to date prevailed. Petrolink’s CTO Giorgio Drei told us, ‘Today, Powerstore uses Microsoft .NET technology exclusively—running on Windows 2003 Server or later and a SQL-Server or Oracle database. We have no plans for a Java interface or core—but it is quite possible that the release as open source to the community will stimulate contributions from passionate developers in many directions—including Java.’
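
For readers unfamiliar with WITSML, a client retrieves data by sending an XML ‘template’ to the store’s WMLS_GetFromStore operation—empty elements mark the fields to be returned. The sketch below shows the shape of such a query; it illustrates the protocol in general rather than Powerstore’s own interface, and transport details (SOAP endpoint, credentials) are deliberately omitted:

    # A WITSML 1.3.1-style 'get' query asking a store for the UIDs and names
    # of all wells. The empty <name/> element requests that field; the empty
    # uid attribute matches any well.
    query = """<?xml version="1.0" encoding="UTF-8"?>
    <wells xmlns="http://www.witsml.org/schemas/131" version="1.3.1.1">
      <well uid="">
        <name/>
      </well>
    </wells>"""

    # In practice the query is passed, with the object type ('well') and an
    # options string, to the server's WMLS_GetFromStore SOAP operation, which
    # returns a similarly shaped document with the requested fields filled in.
    print(query)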


On ‘productivity,’ keyboard shortcuts, futzing and automation

Oil IT Journal editor Neil McNaughton wonders what it means when an oil company ‘standardizes’ on Microsoft. Futzing with Office 2007, he asks what happened to ‘productivity,’ elaborates a conspiracy theory and offers a snapshot of the interoperability situation in oil and gas IT today.

Reading through this month’s issue I was struck by the extent of Microsoft’s penetration into the oil and gas vertical—from Total’s use of BizTalk orchestration (page 1) to Saudi Aramco’s SharePoint Server and the ubiquitous WebParts in evidence at the OSIsoft user group conference (page 6). It is either frustrating, fantastic or just puzzling (depending on your viewpoint) to reflect on the fact that Microsoft’s penetration is riding high on the web services bandwagon. The fact is that the web services ‘standard’ adopted by the upstream seems to be almost all Microsoft .NET. I’m not sure that ‘standardizing’ on Microsoft’s technology was what was intended when the World Wide Web Consortium (W3C) started laying down the SOA law.

But there is another interesting facet of Microsoft’s penetration into our technologically focused vertical—the paradox of a user-focused operating system penetrating an automation-focused environment. Let me try to explain...

To take a really high level view of IT, you might ask what information technology is really about. Is it about helping knowledge workers do their jobs, or about eliminating tiresome, repetitive jobs (and users) entirely? A tough question. One significant claim for IT is increased productivity. Although this is something of a contentious statement, it is pretty clear that it holds true in many fields. A good example is e-commerce. When you buy something online, a plethora of IT systems replaces old fashioned human intervention with electronic payment, inventory management, mail routing and web-based tracking systems. In the end, the employment picture is not too bleak since, although automation has eliminated many jobs, others have appeared—as, for instance, banking personnel are re-allocated from checking checks to selling insurance and household alarm systems at the front desk. But I digress.

What is the effect of eliminating jobs on a vendor of personal computer software? It means fewer ‘bums on seats’ and fewer licenses sold! Microsoft’s answer has been to re-invent ‘productivity,’ notably through the marketing of its flagship Office suite. I submit that ‘productivity’ in this context does not mean removing the person sitting at the keyboard! In fact, my recent experience with Vista and Office 2007 suggests that productivity is actually on the wane.

Ask any secretary (oops—they went with the IT revolution too!) what slows down typing and they (would) reply, the mouse! Keyboard shortcuts are the way to write fast and accurately. But keyboard shortcuts get dreadful support from Microsoft, with inconsistencies across different tools—and this seems to have gotten worse with Office 2007. It sounds like a crazy conspiracy theory, but I wonder if there are people in Redmond saying, ‘if we drop this and introduce that, we’ll increase our worldwide bums on seats time by so many zillion hours per year!’ How long can we count on CTRL-X, C, V and Z?

So if ‘productivity’ is in the eye of the beholder, what of ‘standards?’ It’s pretty much the same thing I’m afraid. Standards, like the above cut and paste shortcuts, are also about productivity. By standardizing stuff, you hope that your knowledge workers will be able to get on with their real jobs instead of futzing with work-arounds and data re-formatting. SOA is the latest manifestation of a long battle for interoperability between different hardware and software vendors’ tools. What has caused previous attempts at upstream interoperability to come unstuck has been the devil in the detail. Fine-grained incompatibility between different versions of databases, middleware and operating systems plagued previous COM and CORBA attempts to make it work. The drive to a ‘service oriented architecture’ involved decoupling systems to make them less prone to the versioning ‘gotchas.’

I was therefore surprised as I traipsed around the SPE and SEG tradeshow floors to hear from several WITSML/PRODML vendors that these protocols are also prone to versioning problems. Maybe that’s why, as Chevron’s Jim Crompton reported at the Energistics Standards Summit (page 5), WITSML take-up is somewhat less than hoped-for.

But assuming that all the detailed devilry can be overcome, there remains an interesting tension between the ‘bums on seats’ business model and automation. Software marketing departments are doing a great job of selling notions like ‘empowerment,’ ‘collaboration’ and ‘visualization.’ These notions go down better than ‘efficiency,’ ‘head count reduction’ and ‘automation,’ the last of which is almost taboo in the upstream.

What makes the whole thing so fascinating is that Microsoft has had a pretty successful role in automation too, as Christian Roller described at the OSIsoft meet, with the first Windows-based HMI in 1985 and the later OLE for process control (OPC) standard in the 1990s.

To temper the impression that Microsoft has taken over the upstream I offer the following. First, despite Microsoft’s arrival in the ‘Top 10’ (page 3 of this issue), it’s still Linux that ‘dominates’ high performance computing in oil and gas. Second, in the interpretation arena, we report (page 12) on a study by Welling & Co.* that found a majority of users in the majors see Linux as the ‘way forward’ for geoscience interpretation—although this perception is reversed for smaller companies. And finally, following pressure from automation vendors, the latest flavor of the OPC Foundation’s standard for process control—the ‘unified architecture’ (UA)—consolidates a move away from the older Microsoft OLE-based technology to a vendor-neutral standard.

In the end, standards are a cat and mouse game—with the purists trying to level the playing field and push technologies into the ‘commodity’ category. The vendors, even when they are ‘on board,’ are genetically programmed to wriggle their way out of this commoditization. But to ‘standardize’ on a proprietary infrastructure like .NET seems a bit like letting the cat catch the mouse!

* 2008 Welling Survey of the Worldwide Seismic Market—www.welling.com/studies/seismic.html.


Anadarko’s financials mapped to US GAAP XBRL taxonomy

A team from the Coles College of Business finds XBRL suited to reserves and financial reporting.

Tim Mahon and Ernest Capozzoli (Coles College of Business at Kennesaw State University) have just completed a proof of concept study of the use of the new US GAAP Taxonomy and XBRL standard for oil and gas financial reporting. Capozzoli’s team used publicly available information published by Anadarko Petroleum to show how the ‘unique’ reporting requirements of a large oil and gas producer can be captured in XBRL (OITJ September 2007).

The US GAAP financial reporting taxonomy is now complete and has been recommended for filings with the SEC. This new taxonomy will transform the process of financial reporting. Another significant change for the oil and gas industry is the CIFR 2008 recommendation to adopt XBRL for SEC filings. Organizations with a market capitalization in excess of $5 billion must file XBRL-formatted financial statements for periods ending on or after 15 December 2008. Other publicly traded companies that file with the SEC using US GAAP will phase in XBRL through 2009 and 2010.

XBRL is said to ‘vastly improve the timeliness, accuracy and flexibility of data in financial statements and other business reports.’ However, the scope and impact of XBRL on financial reporting is largely unknown. To determine the extent to which oil and gas disclosures are covered in the XBRL taxonomy, Anadarko Petroleum’s disclosures for 2006 were tagged using ‘Dragon Tag,’ an add-on to Excel from Rivet Software. The whole process took only three to four hours.

XBRL ‘scenarios’ were used to distinguish between Anadarko’s US and overseas operations and to differentiate between different operations and aggregates. XBRL’s expressiveness was used to tag a text block explaining in detail Anadarko’s Oil and Gas Reserves, including information about two major acquisitions.
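
The ‘scenario’ mechanism is easy to picture: an XBRL fact points at a context, and the context can embed scenario elements that qualify it—here, US versus overseas operations. The fragment below is a toy built with Python’s standard library; the element names, identifier and text are invented for illustration and are not Anadarko’s actual tags:

    import xml.etree.ElementTree as ET

    XBRLI = "http://www.xbrl.org/2003/instance"
    ET.register_namespace("xbrli", XBRLI)
    root = ET.Element("{%s}xbrl" % XBRLI)

    # A context names the entity and period and, optionally, a 'scenario'
    # qualifying the facts that reference it.
    ctx = ET.SubElement(root, "{%s}context" % XBRLI, id="US-2006")
    entity = ET.SubElement(ctx, "{%s}entity" % XBRLI)
    ET.SubElement(entity, "{%s}identifier" % XBRLI,
                  scheme="http://www.sec.gov/CIK").text = "0000000000"  # placeholder
    period = ET.SubElement(ctx, "{%s}period" % XBRLI)
    ET.SubElement(period, "{%s}instant" % XBRLI).text = "2006-12-31"
    scenario = ET.SubElement(ctx, "{%s}scenario" % XBRLI)
    ET.SubElement(scenario, "region").text = "United States"   # toy segment tag

    # A text-block fact of the kind used for the reserves narrative.
    # The element name is invented, not a real US GAAP taxonomy tag.
    fact = ET.SubElement(root, "OilAndGasReservesTextBlock", contextRef="US-2006")
    fact.text = "Proved reserves increased following two major acquisitions..."

    print(ET.tostring(root, encoding="unicode"))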

Capozzoli concludes that the US GAAP taxonomy and its XBRL manifestation covers the majority of the information required by SFAS No. 69. Anadarko’s supplemental information was relatively easy to tag and was presented in the same general format as the taxonomy.

Overall, XBRL is still improving, but already—as John White, director of the SEC’s Division of Corporation Finance, stated in a 2007 speech to the AAPG/SPE International Multidisciplinary Reserves Conference—XBRL allows end users to drill down into the information in XBRL-filed documents and allows companies to produce and quickly analyze their disclosures. XBRL has made great strides recently and, with the support of the SEC and the release of the new US GAAP taxonomy, may soon be the required filing standard for public companies. Read the full paper on www.oilit.com/papers/Capozzoli_0811_1.pdf.


Paradigm issues InnoCentive challenge for fault representation

Open Innovation Marketplace to solve thorny problem in reservoir characterization.

In his keynote address to the SPE Digital Energy conference in Houston earlier this year, Paradigm CEO John Gibson vaunted the merits of the net for innovation—citing Eli Lilly’s innocentive.com website. InnoCentive lets technology providers post problems to the site and invite solutions from the community at large. Paradigm has now signed a contract with InnoCentive to use its Open Innovation Marketplace to attempt to solve a thorny problem in reservoir characterization—that of fracture network description.

Paradigm is seeking a way to combine fault interpretation picks into ‘high performance’ representations of 3D fracture networks. The network should support geometric query and understand topologies such as fault displacement and connectivity.
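
A toy example gives the flavor of the topology side of the challenge (this is our sketch, not Paradigm’s representation): fault segments become graph nodes, intersections become edges, and connectivity falls out of a graph traversal. The hard part, left out here, is doing this at scale with full 3D geometry and displacement:

    from collections import defaultdict

    # Toy fracture network: nodes are fault segments, edges are intersections.
    intersections = defaultdict(set)

    def connect(a, b):
        intersections[a].add(b)
        intersections[b].add(a)

    connect("F1", "F2")
    connect("F2", "F3")
    connect("F4", "F5")

    def connected_component(start):
        """All fault segments reachable from 'start' via intersections."""
        seen, stack = set(), [start]
        while stack:
            node = stack.pop()
            if node not in seen:
                seen.add(node)
                stack.extend(intersections[node] - seen)
        return seen

    print(connected_component("F1"))   # {'F1', 'F2', 'F3'} -- one network
    print(connected_component("F4"))   # {'F4', 'F5'} -- isolated from the first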

Paradigm CTO Duane Dopkin said, ‘15 years ago, we instituted a program which allowed oil industry geophysicists to contribute algorithms—effectively ‘etching’ their signature on a global and commercially available system. By working with InnoCentive today, Paradigm is again challenging the broader E&P community to recognize open innovation as a way of accelerating technology development.’ A $10,000 prize awaits anyone who can fix Paradigm’s fractured networks!


Microsoft now at N° 10 as NVIDIA Tesla enters TOP500

Roadrunner breaks petaflop barrier as Microsoft and NVIDIA make significant moves in HPC.

This month’s supercomputer ‘TOP500’ list is headed up by the US Department of Energy’s Los Alamos ‘RoadRunner,’ one of two machines to break the petaflop barrier, with 1,105 teraflops. The most significant novelty is the arrival of Microsoft Windows HPC 2008 at number 10 with the Dawning 5000A system at the Shanghai Supercomputer Center. But before you rush out to buy one, note that the Dawning is equipped with a hefty 122 terabytes of memory!

The only dedicated oil and gas machine in the list is Total E&P’s 106 teraflop (and a more modest 20 TB of memory) SGI Altix ICE 8200. But the absence of other machines is probably a reporting issue, as many seismic contractors have better things to do than tune their clusters for Linpack.

We tried to normalize some of the TOP500 statistics by dividing the Rmax teraflops ratings by the aggregate number of processors. On this measure, Microsoft’s 2008 HPC offering achieves 6.6 GFLOPS per processor, three times its 2003 offering and in line with the Linux average. The most efficient machines on this composite rating run IBM’s AIX (10.4 GFLOPS/processor), SGI (9.4) and OpenSolaris (9.1).
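
For the record, the normalization is simple arithmetic—a sketch with illustrative sample figures (the published tables are on top500.org):

    # Rmax (teraflops) divided by processor count gives GFLOPS per processor.
    # The figures below are illustrative samples, not the published table.
    systems = [
        ("System A (Windows HPC 2008)", 180.6, 30720),
        ("System B (Linux)", 106.0, 10240),
    ]
    for name, rmax_tflops, processors in systems:
        print(f"{name}: {rmax_tflops * 1000 / processors:.1f} GFLOPS/processor")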

Another significant development is the arrival of the GPU-based heterogeneous cluster from NEC and Sun installed at Tokyo Tech. The ‘Tsubame’ (in at N° 29) includes 170 Tesla S1070 1U systems and produces 77.5 teraflops of Linpack performance. More from www.top500.org.


DataFlux, TerraSpark and Intervera sign up for interoperability

Data quality specialists and seismic interpretation vendor join OpenSpirit.

Three companies have recently signed with OpenSpirit, the upstream data connectivity specialist. Horizontal data quality specialist DataFlux is to use OpenSpirit’s middleware to access third-party geosciences applications and data stores. End users will be able to extract well data from OpenSpirit-enabled data stores to perform rapid data analysis and quality control tasks utilizing the DataFlux Enterprise Integration Server and business rules monitoring engine.

Geoscience software vendor TerraSpark is also developing an adapter with the OpenSpirit software development kit (SDK) to connect its ‘Insight Earth’ seismic interpretation flagship to third party data stores. According to TerraSpark CEO Geoff Dorn, this will enable ‘best-of-breed’ workflows across multi-vendor interpretation and data management suites.

Intervera is embedding OpenSpirit connectivity in its upcoming DataVera 8.0 release. This includes a DataVera Monitor module that provides a web services framework for integration with other middleware applications such as OpenSpirit. More from sales@openspirit.com.


CyrusOne to host Repsol’s Gulf of Mexico ‘Kaleidoscope’

Houston data center to house IBM Cell-BE-powered seismic supercomputer.

Repsol is to deploy its ‘Kaleidoscope’ technology, originally developed at the Barcelona Supercomputing Center (OITJ December 2006), in Houston for use by its Gulf of Mexico interpretation team. The Kaleidoscope project applies IBM’s Cell Broadband Engine-based computing to solve the numerically intensive process of reverse-time migration (RTM). A tenfold speedup over ‘conventional’ technology is claimed.
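
Reverse-time migration typically propagates a wavefield through a velocity model with a finite-difference scheme—the kind of regular, data-parallel arithmetic that maps well onto the Cell’s vector units. A one-dimensional toy time step (our sketch, nothing like a production RTM kernel) looks like this:

    # Toy 1D acoustic finite-difference time step: u_next depends on the
    # current and previous wavefields and the local velocity. Production RTM
    # runs schemes like this in 3D, forward and in reverse time, over huge
    # grids--hence the appetite for Cell/GPU-class hardware.
    def fd_step(u_prev, u_curr, velocity, dt, dx):
        u_next = [0.0] * len(u_curr)
        for i in range(1, len(u_curr) - 1):
            laplacian = (u_curr[i - 1] - 2 * u_curr[i] + u_curr[i + 1]) / dx**2
            u_next[i] = (2 * u_curr[i] - u_prev[i]
                         + (velocity[i] * dt) ** 2 * laplacian)
        return u_next

    n = 101
    u_prev = [0.0] * n
    u_curr = [0.0] * n
    u_curr[n // 2] = 1.0                  # impulsive source at the center
    v = [2000.0] * n                      # constant 2000 m/s velocity model
    u_next = fd_step(u_prev, u_curr, v, dt=0.001, dx=5.0)
    print(max(u_next))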

Repsol’s head of geophysics, Francisco Ortigosa, said, ‘We are pleased to launch Kaleidoscope’s exploration operations in the Gulf of Mexico and Brazil as the project proves the success of the collaborative approach to research we have pursued for the past two years. The combination of the IBM PowerXCell 8i processor-powered supercomputer and RTM imaging puts Kaleidoscope at the cutting edge of exploration technology.’

The Kaleidoscope machine is to be replicated at CyrusOne’s Houston data center. The CyrusOne facility is said to be one of the few data centers in the region capable of accommodating Repsol’s installation, which requires some 750 watts per square foot of electrical power. Repsol’s RTM algorithm was a joint development between Repsol, the BSC and FusionGeo. FusionGeo was formed this month by the merger of Fusion Geophysical and 3DGeo.


Petris and Warrior form strategic alliance around UnRiskIT

Risk management package to be embedded in PetrisWinds Enterprise and Recall.

Petris has teamed with Warrior Technology Services to market Warrior’s UnRiskIT risk management package. The deal gives Petris the right to embed UnRiskIT into its own solutions as well as non-exclusive rights to resell UnRiskIT on a standalone basis.

UnRiskIT provides Monte Carlo-based simulation of the well planning process and root cause analysis of drilling problems. Time-based and financial risk analysis allows for alternative scenario planning such as sidetracking and other ‘remediations.’ Data exchange with Microsoft Excel is supported.
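
The Monte Carlo idea is easily sketched: sample uncertain phase durations many times and read P10/P50/P90 risk estimates off the resulting distribution. The toy below is our sketch, not UnRiskIT’s model—the phases and ranges are invented:

    import random

    # Toy Monte Carlo well-time model: each phase duration is sampled from a
    # triangular (min, most likely, max) distribution, in days.
    phases = [
        ("drill surface section", 3, 4, 7),
        ("drill reservoir section", 5, 7, 14),
        ("completion", 4, 5, 10),
    ]

    def one_realization():
        return sum(random.triangular(lo, hi, mode) for _, lo, mode, hi in phases)

    trials = sorted(one_realization() for _ in range(10000))
    p10, p50, p90 = (trials[int(len(trials) * p)] for p in (0.10, 0.50, 0.90))
    print(f"well duration P10={p10:.1f}  P50={p50:.1f}  P90={p90:.1f} days")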

Petris will integrate UnRiskIT with its data management offering comprising PetrisWinds Enterprise and the well log management package, Recall. UnRiskIT will also be bundled with engineering applications including DrillNet and a new AFE Management solution, a component of PetrisWinds Operations Center. UnRiskIT will let project planners identify and rank risk factors and develop a risk management and mitigation strategy to minimize non-productive time. UnRiskIT clients include Shell, Chevron, Hess and Halliburton.


ffA and Mercury team on high-end visualization

‘Next generation’ tools for seismic interpretation to leverage GPU-based computing.

Foster Findlay Associates (ffA) has teamed with Mercury Computer Systems to deliver ‘next-generation’ visual computing tools for seismic interpretation. ffA’s 3D seismic analysis software is to be ported to graphics processing unit (GPU) based compute engines using Mercury’s Open Inventor and VolumeViz LDM 3D graphics software development kits.

The companies expect that GPU-computed seismic analysis and GPU-based surface extraction will increase the performance of interactive and automatic identification of faults, horizons and geo-bodies. ffA’s seismic image processing tools SVI Pro and SEA 3D Pro are to benefit from the new technology.

Mercury plans to integrate GPU-based geometry and volume processing capability in Open Inventor by 2009, providing a ‘high-level framework’ for integration of GPU visualization and computation. Visualization components will be directly linked to GPU-computed data, avoiding transfer between CPU and GPU.

ffA CTO Steve Purves said, ‘GPU-based computing will bring a new wave of highly interactive data driven interpretation tools for the analysis of very large 3D data sets.’ In an earlier announcement (OITJ June 2008) ffA teamed with NVIDIA to leverage GPU-based number crunching in SVI Pro.


Software, hardware short takes ...

News from OpenGeoSolutions, CartoPac, ffA, Intervera, OHM/RSI, Perigon, WesternGeco, SMT and Tecplot.

Calgary-based OpenGeoSolutions has ported its spectral decomposition technology to NVIDIA’s Tesla C1060 GPU-based compute engine, bringing an ‘order of magnitude’ performance hike. OGS president Jim Allison said, ‘We are seeing unprecedented speedups with processing time down from two hours to two minutes. A single Tesla C1060 delivers the same performance as our 64 CPU cluster.’ Along with its processing offering, OGS provides support to users of the open source FreeUSP seismic processing toolset.
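
Spectral decomposition is, at its core, a windowed Fourier transform applied independently to every trace—embarrassingly parallel work, which is why it ports so readily to GPUs. A toy single-trace version (our sketch, using numpy) gives the idea:

    import numpy as np

    # Toy windowed-FFT spectral decomposition of one synthetic trace.
    # Production codes refine this considerably; this shows the core idea.
    fs = 500                                 # samples per second (2 ms)
    t = np.arange(0, 2.0, 1.0 / fs)
    trace = np.sin(2 * np.pi * 30 * t) * np.exp(-((t - 1.0) ** 2) / 0.02)

    win = 64                                 # 128 ms window
    spectra = []
    for start in range(0, len(trace) - win, win // 2):   # 50% overlap
        segment = trace[start:start + win] * np.hanning(win)
        spectra.append(np.abs(np.fft.rfft(segment)))
    tf_panel = np.array(spectra)             # time-frequency amplitude panel
    print(tf_panel.shape)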

~

CartoPac’s new Field Server automates and streamlines many field data collection processes, providing centralized data storage and internet connectivity. The solution targets, inter alia, field workers at oil and gas companies and can be used as a stand-alone product or with ESRI ArcGIS Server, ArcSDE and other data stores. Field Server provides geo data QA/QC, GPS post processing, work order generation, printable field reports and database loading.

~

ffA has released SEA 3D Pro 2008, bringing ffA’s Windows seismic imaging application SVI Pro to the Linux workstation. SVI is now interoperable with GeoProbe, adding volume interpretation workflows to Halliburton’s seismic interpretation flagship.

~

Intervera Data Solutions has announced the ‘pre-release’ of DataVera 8.0, its data quality solution. The new release includes a ‘Monitor’ module that leverages an industry standard web services framework to interoperate with SOA-enabled applications such as middleware and workflow tools. The pre-release includes 200 new E&P business rules focused on master data management.

~

OHM—Rock Solid Images has added rock physics electromagnetics (EM) to RSI’s iMOSS modeling tool. iMOSS-em was developed under the industry-funded WISE JIP.

~

Aberdeen, UK-based Perigon Solutions has added OpenSpirit connectivity to its iPoint Visualization System. iPoint can now access well data from any OpenSpirit-enabled data store and share interaction events among applications.

~

WesternGeco has just announced ‘UniQ,’ its ‘next-generation’ land seismic acquisition system. The point-receiver acquisition and processing system combines an ‘extreme’ channel count (up to 150,000 channels at a 2 ms sample rate) with support for multiple simultaneous sources. Our thumbnail calculations (sketched below) suggest the UniQ system is capable of producing a few terabytes of seismic during a day’s shooting. Seismologists might like to brace themselves for the data onslaught.
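
The back-of-envelope arithmetic, under stated assumptions:

    channels = 150_000
    samples_per_second = 500      # 2 ms sample rate
    bytes_per_sample = 4          # 32-bit samples assumed
    shooting_hours = 4            # notional hours of actual recording per day

    bytes_per_day = (channels * samples_per_second * bytes_per_sample
                     * shooting_hours * 3600)
    print(f"{bytes_per_day / 1e12:.1f} TB per shooting day")   # ~4.3 TB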

~

Seismic Micro Technology’s 8.3 release of its Kingdom Suite interpretation suite adds AVO conditioning, forward modeling and patent-pending neural network-based log ‘replacement’ technology. The latter is claimed to ‘identify areas of pay, even where records have been lost, through the automated generation of missing log curves.’ According to SMT, in a recent survey by Welling & Co., geoscientists rated Kingdom as ‘the interpretation software they recommend most.’ More from the Welling survey on page 12.

~

Tecplot RS 2009 has just been announced with ‘deeper analysis and clearer visualization,’ new platforms (64 bit Vista and Linux) and added support for the VIP and NEXUS simulators. A FLEXnet License Manager has also been added for enhanced installation and license management. User requirements from Chevron and International Reservoir Technology drove the upgrade.


Energistics Standards Summit

AspenTech, Chevron, ONGC and Oracle present at upstream standards meet.

Around 90 turned out for the 3rd annual ‘Standards Summit’ hosted by Energistics in Houston last month. CEO Randy Clark reported that corporate membership has doubled in the last two years to over 100. Mike Strathman (AspenTech) summarized standards of importance to the upstream, tracing the evolution of the POSC/Caesar data model into the present day ISO 15926—described as a ‘somewhat seamless’ vendor-neutral interface used notably in tools such as Aveva’s PDMS. In production and operations the situation is more complex, with the ‘challenge’ of the process industry ISA-S95 standard—said to be ‘very active.’

Chevron fellow Jim Crompton described the challenge of standards deployment in a major oil company. Chevron has been involved with POSC for some time but still struggles with the move from pilot to adoption. Crompton noted issues such as standard ‘ownership’ and support, corporate culture, and the frequent mismatch between those who bear the cost of change and those who reap the benefit of adoption. Producers may think, ‘Our job is to produce oil, not to have good IT.’ And all of this is happening in the face of explosive growth in data volumes. Crompton suggests that ‘planning for a journey, not a sprint’ is part of the answer, noting that ‘sprints always fail, however interesting. Well log data is still on WITS and has not moved to WITSML!’ The real challenge is to see a standard all the way through to adoption.

PSN Kutty (ONGC) suggested there are opportunities to reduce cycle times in geophysical work processes with enhancements to the SEG’s standards portfolio. ONGC estimates it could save 83 person-days per year with better standards. These should address acquisition, logistics and velocity data and XML-based data exchange. Improved SEG standards are also required for seismic processing.

Michael Rowell’s (Oracle) presentation on global standardization was well received. Rowell outlined Oracle’s approach to standards—in particular the UN CEFACT Core Components Technical Specification* (CCTS) e-business standard. CCTS core and infrastructure standards are stable but standards content ‘is like the wild west!’ Rowell sees hope from growing cross industry collaboration needs—particularly in the context of services-oriented architecture ‘governance.’

* ebxml.xml.org/node/163.


OSIsoft/PI System user group 2008, Amsterdam

Presentations from Shell, Iberdrola, RasGas, Saudi Aramco and NiSource put OSIsoft’s PI System at the heart of their enterprise production monitoring effort. Microsoft SharePoint Server is an infrastructure component for some—while niche applications from Magion (for data quality) and Intellution (for connectivity) add to the OSIsoft process monitoring ecosystem.

The OSIsoft User Group held in Amsterdam last month had over 650 attendees, including 100 OSIsoft personnel. OSIsoft founder and CEO Pat Kennedy’s keynote described the ‘meltdown economics’ of the current global situation. Oil price uncertainty means that projects will be canceled or scaled back and that ‘cash is king.’ There will be a slowdown in projects and a shift to ‘low hanging fruit’ in the form of small projects. Companies will seek to leverage information. Small projects can accumulate into interesting results. Back in the last downturn, in 1982, OSIsoft noticed that sales increased as companies sought to optimize their business in challenging times. More recently, one client combined hundreds of mini projects targeting energy savings, which led to a $30 million per year saving. The same kind of approach should help companies plan for an oil price in the $50-$200/barrel range and make sure their projects work ‘in all scenarios.’

While process control is a very broad church, spanning continuous, batch and discrete/manufacturing activities, OSIsoft’s PI System, with its focus on real time data capture, has a particular affinity with both upstream and downstream oil and gas operations. It may be something of a paradox for the IT professional that while the PI System is at the heart of process monitoring, it has little to do with process control. By way of an explanation, we start this report with a paper that presents the situation at the ‘coal face,’ the interface between PI and process...

Maria Aniorte of the Spanish utility Iberdrola outlined the problem of optimizing a complex plant with some 10,000 PI System tags and hundreds of process control loops. If control loops are not working properly the result is poor plant performance and oscillations in output. But optimizing across hundreds of process control loops and coupled valves can get very complicated—requiring ‘deep math.’ Iberdrola’s market analysis of optimization tools resulted in the acquisition of Expertune’s Plant Triage (PT). Iberdrola now monitors around 600 control loops through a central PT server that rolls up information into a summary dashboard showing plant performance. Insights from PT have allowed Iberdrola to engage equipment vendors to adjust things like pump parameters. Savings can be ‘high to immeasurable’ if accidents are avoided. Asked if there was bi-directional data flow to the plant, Aniorte replied no—‘We make recommendations to the plant operators. The people in the plant take action and make the changes. There is no Plant Triage output to the DCS.’

OSIsoft’s Deter Van Wordragen described how PI System is being used for power monitoring in the data center. Flagship client Microsoft is currently spending $4 billion on four new data centers. One, located in Chicago, will run MSN/Hotmail/Virtual Earth and is to consume some 60 MW of power. Microsoft’s servers are managed with PI for power efficiency—one data center has around 600,000 PI tags.

In a panel session, Dick Wernsing outlined how PSE&G is extending its maintenance cycle by adjusting the maintenance interval to operational requirements—in a shift from ‘emotional’ to data-based decision making. This has been achieved by a link from the SAP asset registry to the PI historian to produce ‘just in time’ work orders. PI gives visibility into what’s happening in the plant so that a minor problem is fixed with an inexpensive repair before it results in a costly failure. It is also important to make sure that information gets noticed—so that, for instance, if a compressor runs over two hours/week (indicative of a leak), it gets attended to. ‘Don’t let the engineers get in the way, they make rules but should not be able to change them every time they get a notification.’

John de Koning described how Shell Chemicals is standardizing its multiple systems and laying down the foundation of an operational data infrastructure. The key is a standard data acquisition layer. But for Shell, PI System ‘standardization’ is about more than just installation, it involves standard processes and design. Often PI is deployed to address a particular technical issue. But how do you justify continued deployment once a problem is fixed? It is rather like the justification for a road—you have to have a foundation to build on. Equipment is getting more intelligent. Rotating equipment and high pressure pumps deliver huge amounts of data. It required a significant investment at a brownfield site to get all this into the PI System. The situation is easier for greenfield sites. Shell’s PI systems hold 400,000 points, which leads to data quality assurance issues—‘the more data, the harder it is to assure quality.’

Sarah Al-Aqaily told how PI is the foundation of the integrated control and information management (ICIM) system for RasGas’ North Field. RasGas is the largest LNG producer and transporter in the world, with exports to Japan, Taiwan, the US, India and the EU. By 2010, capacity is planned to reach 37 million tonnes/year from seven gas trains and two helium recovery plants. The ICIM vision is to expand the scope of automation and information management across all of RasGas’ activities. The system assures data integrity and avoids data duplication. Information flow is outbound to other systems—the ICIM can’t write to or control the plant. Real time data from the process, laboratory information management systems, hydrocarbon allocation, revenue and SAP/ERP for operations, commerce, shipping and finance is all collected into a real time information system (RTIS) which assures the accuracy of metering data passed on to financial systems.

Christian Roller traced Microsoft’s history of involvement in automation, dating back to the Windows-based HMI (1985) and COM/OPC (1990s). Today, Microsoft supports a ‘rich’ vendor ecosystem including OSIsoft, AspenTech, Siemens, Emerson and many others. Pasha Ahmed (OSIsoft) joined Roller to sketch out how Microsoft technology was deployed by Saudi Aramco. Aramco addressed the challenges of scalability and data validity by implementing PI System and Microsoft tools—notably SharePoint. Aramco appears to hold the record for tag count at around 2 million, all funneled into the control room for display on a huge 70 x 3 meter screen.

Richard Coomber presented Shell’s Production Portal architecture. This leverages OSIsoft’s latest AF 2.0 technology. PI AF provides a consistent representation of a plant for analysis with tools such as ProcessBook and Excel. The AF SDK is used to tailor a system to an organization’s specific requirements. Shell found that the move to AF revealed considerable data quality issues with ‘non standard’ tag names. Data quality management is now plumbed into the network—notably using the µ-QA tool from Magion. ProcessBook provides a geographical view of assets with drill down to offshore meter data and status. The Production Portal is built atop SharePoint, PI RtWebParts and ESRI ArcGIS. µ-QA checks that meters are calibrated, that the network is up and that the tag data is good. When a problem is detected, an SAP workflow is triggered.

Nancy Shifflet described an interesting use of PI when a NiSource compressor station was hit by a tornado. Using Intellution’s iFIX on a PC and an interface to PI, NiSource got the station back in operation in a couple of days despite limited communications and power. As late as 1998, NiSource had mostly paper-based SCADA logs and physical archives. Since then a PI System has been deployed with ‘human’ tag names representing location and facility names. Today PI is used throughout the company—by HSE, facility planning, field services and marketing. Applications include regulatory reporting, asset performance, lost and unaccounted-for gas and fuel consumption. PI scope has expanded from gas control SCADA to embrace electronic measurement, weather data (for planning and operational support)—and modeling. AF has brought a global data view to optimization analysts. FERC, state and federal and environmental reporting is all done instantly from PI. ‘PI provides connectivity, performance, scalability and ease of use to gas controllers, engineering and field services and environmental groups.’ In the Q&A, Shifflet was asked how much value NiSource put on its real time infrastructure over the previous SCADA system. She replied, ‘If I took PI away, users would kill me! It is so embedded. Otherwise we now know much more about our capacity although it’s hard to put a dollar value on this. PI is used from execs to roustabouts.’

OSIsoft VP R&D Ray Verhoeff took a peek at the future of standards, in particular the soon-to-be-released OPC Unified Architecture (UA). This exposes a new address space, OPC objects and relations from an OPC server. Mapping has been achieved between AF and UA. The UA architecture ‘promotes’ an information model—exposing nodes and relationships such as well, tubing and casing nodes as a hierarchy. Different industry groups need to standardize definitions and then ‘the software will comply.’ Verhoeff offered a long list of standards—he has been personally involved with PRODML. While this group is not yet ready to endorse UA, it is recognized as a ‘strong offering’ and the situation may change in a year or so. UA also addresses security issues. UA transport profiles are no longer COM-based and so allow for non-Windows platforms through XML-based web services. UA XML is defined as an XML Schema plus WSDL running on HTTP/HTTPS or UA binary. PI System can act as an OPC UA server to feed Windows, Linux or embedded OS clients. OSIsoft’s new architecture has been released as V1.0 with a PI JDBC Bridge on Linux (Ubuntu 7.10) talking to PI OLEDB (Windows Foundation Class RDS) on the server.
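
The ‘address space’ notion is simply nodes plus typed references, browsable as a hierarchy. A toy illustration follows (ours, not the UA stack; node and reference names are invented in the spirit of the wells/tubing/casing example):

    # Toy UA-style address space: nodes linked by typed references.
    nodes = {}

    def add_node(node_id, parent=None, ref_type="HasComponent"):
        nodes[node_id] = {"refs": []}
        if parent:
            nodes[parent]["refs"].append((ref_type, node_id))

    add_node("Well-A12")
    add_node("Tubing", parent="Well-A12")
    add_node("Casing", parent="Well-A12")
    add_node("TubingHeadPressure", parent="Tubing", ref_type="HasProperty")

    def browse(node_id, depth=0):
        """Walk the hierarchy the way a UA client browses a server."""
        print("  " * depth + node_id)
        for _, target in nodes[node_id]["refs"]:
            browse(target, depth + 1)

    browse("Well-A12")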

Michael Lamb (Xcel Energy) believes the time is ripe to ‘blow up Edison’s electricity generation utility’ and move from load following to load shaping. Xcel is the US’ N° 1 wind provider, with 3 GW installed capacity. Smart Grid City (SGC) is a project to create the ‘utility of the future’ in Boulder, CO. Xcel has 3.3 million residential customers, currently with one meter read per month—but there could be 20-30 readings every 5 minutes. A typical Smart Grid activity could involve a plug-in Prius used overnight as a 10 kWh energy source for load shaping.

Finally, our salesperson of the show award goes to Transpara’s indefatigable Mike Saucier, who manned the coffee machine, monitoring who was drinking what and displaying a running total on his BlackBerry with Transpara’s Visual KPI!

This report is an extract from The Data Room’s Technology Watch report of the OSIsoft user group—more from tw@oilit.com.


BP’s Inglis—‘no demographic crisis,’ ‘oil is technology leader’

Andy Inglis minimizes impact of ‘baby boomers’ imminent retirement but warns of ‘capability gap.’

At the 2nd Annual Energy Industry Director Conference at Rice University in Houston last month, BP’s head of E&P Andy Inglis gave a talk titled ‘The Changing of the Guard.’ Inglis stated that he does not agree with the proposition that an ‘insoluble demographic crisis’ will be caused by the retiring baby boomer generation. A far bigger challenge is the ‘capability gap,’ a paucity of ‘technology, skills and know-how.’ Bridging the gap will be necessary if the industry is to respond to IEA forecasts of a 50% hike in energy demand by 2030. BP is working to attract graduate talent with its ‘Challenge’ program, which currently has 1,200 ‘Challengers’ from all over the world.

Inglis also described some ‘misconceptions’ about the industry—notably the idea that it is ‘low tech and out of date’ when set against other verticals such as IT, media and pharmaceuticals. According to Inglis, ‘Nothing could be further from the truth.’

BP has been honing its technology as a means to plug the capability gap. BP’s technology showcase is the Advanced Collaborative Environment (ACE), where real time data gathered from oil and gas fields is analyzed offshore and onshore simultaneously. ACE and its embedded integrated surveillance information system (ISIS) were discussed at the SPE ‘Intelligent Energy’ event in Amsterdam earlier this year (OITJ April 2008). Inglis’ presentation included a graphic showing ISIS as built around OSIsoft’s PI System with a constellation of BP-developed tools and third party applications including Matrikon’s ProcessNet (now OperationalInsight) and SPT Group’s Advanced Warning System. AWS tracks production data streaming to the PI System and compares it with model forecasts for a variety of monitoring and optimization applications. Inglis outlined an ACE case history in which an estimated $3 million of deferred production was saved by the timely creation of an ad-hoc team of troubleshooters. Today some 35 of BP’s assets have ACEs.


Folks, facts, orgs ...

Wellstorm, Aclaro, Aveva, Spectraseis, Coade, Coreworx, CygNet, US DoE, Energy Navigator, Exterran Holdings, ffA, FreeWave Technologies, FuelQuest, SRC, Caesar Systems, SAIC, Spectrum, TietoEnator, LITE, IFP.

David Archer has been appointed president of Wellstorm. Archer was previously with Petris Technology.

Brian Evans has joined Aclaro as an integration consultant in Houston. Evans previously worked for Ocean Energy, Devon and Peoples Energy.

Santiago Pena is to head up Aveva’s new Latin America office in Rio de Janeiro.

Dale Blue has joined Zurich-based Spectraseis as software product manager. Blue was previously with Schlumberger.

Chris Bowd is to head up Coade Inc.’s new Japanese subsidiary. Bowd was president of Spatial Technology’s Japanese unit prior to Spatial’s acquisition by Dassault Systèmes.

Coreworx has hired Paul Haynes as COO. Prior to joining Coreworx, Haynes was a Managing Partner at Omazo Ventures, an investment firm focusing on early stage software firms, and President of Ever America, an ECM vendor.

CygNet Software has established a board of customer advisors (BoCA) to steer its product development process and solution design. BoCA members include David Wray (Anadarko), Darin Molone (Atlas Pipeline), Jason Offerman (Chesapeake) and Jim Wahrenberger (Devon).

The US Department of Energy has named Victor Der as principal deputy assistant secretary for fossil energy.

Tim Loser is to head up Energy Navigator’s new Houston location. Loser was previously with Spotfire/TIBCO.

Exterran Holdings has elected Chris Seaver to its board of directors. Seaver was previously president and CEO of Hydril prior to its sale in 2007.

Jonathan Henderson, ffA’s MD, is to head up the company’s new Houston office.

Daniel Steele is to lead FreeWave Technologies’ business development effort in the Rocky Mountain Region. Steele was previously with Bluewave Antenna Systems.

Matt Tormollen is to succeed Rich Cilento as president and CEO of FuelQuest. Cilento becomes Chairman of the company’s advisory board. Tormollen was previously with Pavilion Technologies, now part of Rockwell Software.

The Saskatchewan Research Council (SRC) is building an oil sands research laboratory, which will include a 3D scaled physical model for testing. The facility is being built thanks to a $1 million contribution from Oilsands Quest Inc.

Alan Jaschke has joined Caesar Systems as client services manager.

Michael Wells has joined SAIC’s Information Technology & Network Solutions (IT&NS) Group as senior VP business development. Wells was previously with Siemens.

Seismic services provider Spectrum has hired Rhys Edwards as CFO. Edwards was previously CFO of oil trading software provider OILSpace Ltd.

TietoEnator has appointed Leonid Bliachov as a strategic advisor for Russia and CIS countries. Bliachov was previously a director of Lukoil Technology Services.

The Louisiana Immersive Technologies Enterprise (LITE) has announced several management changes. Carolina Cruz-Neira is to serve as chief scientist. COO Henry Florsheim becomes interim CEO. Albert Baker has been appointed director of business development. Baker comes to LITE from ABC Virtual Communications.

In an internal move, Jean-Pierre Burzynski has been appointed director of the French Petroleum Institute’s (IFP) refining and petrochemicals technology business unit replacing Patrick Sarrazin.


Done deals

CGGVeritas, Cartasite, Siemens, FMC, IHS, ION, National Oilwell, Schlumberger, Quorum, Teledyne.

Following the failure of its sale to TGS-NOPEC, Wavefield Inseis has found another suitor in CGGVeritas, which is to pay approximately $310 million for the company—a 31% premium on the share price. The Wavefield board has ‘unanimously welcomed’ the offer.

~

Cartasite and Aragon ST have teamed on the provision of oil and gas field monitoring. Through its Global Data Solutions Group, Aragon is to provide satellite and cellular radios, wireless sensors and communications services to Cartasite for use in its new ‘fieldFlow’ oilfield monitoring system.

~

Siemens has acquired German process engineering software boutique Innotec. The deal will enhance Siemens’ industrial automation offering. The purchase price was not disclosed.

~

FMC Technologies has acquired 10.26% of Roxar’s outstanding share capital for an undisclosed amount.

~

Halliburton has acquired the IPR, assets and existing business of Screen Imaging Technology, a provider of seismic depth imaging services and software.

~

IHS’ Herold market research unit and Jane’s Information Group have teamed to provide country risk assessments to the energy business. Clients can now access Jane’s ‘Sentinel’ country risk service from the IHS Herold website. IHS has also announced the completion of its acquisition of Global Insight, a provider of research and forecasts of worldwide economic, financial and political information.

~

ION Geophysical has teamed with Cairo, Egypt-based Guide Geoscience Technologies to provide advanced imaging and reservoir-related services to oil & gas companies operating in North Africa.

~

National Oilwell Varco and Schlumberger are to create a joint venture to develop wired drill string telemetry systems. The deal centers on the IntelliServ system, originally developed by Grant Prideco (OITJ December 2006).

~

Quorum Business Solutions has acquired Integra Solutions, adding business intelligence to its energy software portfolio.

~

Teledyne Technologies has acquired UK-based Cormon and its Cormon Technology unit. Cormon manufactures subsea and surface sand and corrosion sensors, as well as flow integrity monitoring systems for oil and gas production. Terms of the transaction were not disclosed.


Social networking in oil and gas

Microsoft evangelist Zain Naboulsi advocates blogs, Twitter, Wikis and ‘hybrid’ tools for oil vertical.

Around 80 attended the SPE-sponsored ‘Social Networking in Oil and Gas’ lunchtime session in Houston to hear from Microsoft ‘social’ evangelist Zain Naboulsi. Naboulsi’s talk was more about what social networking might bring to oil and gas than an account of how oil co execs Twitter each other or buff up their Facebook pages. The idea is that social networking is set to address some of the key issues in oil and gas, like the big crew change/talent gap, and to help with the data problem by ‘turning data into knowledge.’ The premise is that today’s youth are different from the over-40s ‘as they have always been online.’ Industry needs to open up to these users of social tools like blogs, wikis, forums and ‘multimedia.’

Naboulsi straw-polled the audience, asking ‘Who has their own blog?’ About five raised their hands. Blogs can be divided into personal and corporate. The latter can have different scope—for a team or the whole company. Companies have to decide on blogging guidelines. Microsoft, for example, is ‘very open,’ although bloggers must sign an NDA. SharePoint can be used for blogging—although Naboulsi admitted that this was not ‘superb’—but more likely a dedicated blogging application will be used. In a blog, information stays there permanently, whereas in a ‘micro-blog’—the best example is Twitter—it is only visible for a limited time. Twitter and other instant messaging software make for ‘real time information sharing’ of short text messages. Companies can also monitor Twitter channels for competitive intelligence.

A Wiki represents the ‘democratization’ of information, providing a ‘persistent’ knowledge resource that can dynamically change. Anybody can work on a Wiki and, miraculously, it is not total chaos! It takes around six months for a Wiki to build momentum. Naboulsi cited Scholarpedia and Congresspedia as examples of note*.

In the Q&A, Naboulsi was asked about the ‘rumor mill’ aspects of blogs and Wikis. He responded that this was a question of corporate culture and requires a degree of ‘pain and compromise.’ Clear guidelines need to be established and respected. But ultimately, the success of social networking is the proof of the pudding.

One route to Wiki development is to have a young intern sitting in with an older executive person—matching a technophile with a knowledge worker. More formal social networking is achieved with a Forum—usually on a specialized topic.

Finally, hybrid social networking tools such as the popular LinkedIn offer aggregating portals of social networking technologies—providing a considerable time saving. Naboulsi finished with a pointer to his own oeuvre at http://blogs.msdn.com/zainnab, where there are ten secrets to successful social networking.

* Naboulsi might have mentioned the API Petroleum Industry Data Dictionary Wiki on http://wiki.pidx.org.


Sintef, Hitec team on eDrilling Solutions software unit

Startup to develop computer aided guidance and control of drilling and well operations.

The Norwegian Sintef research organization has allied with Hitec Products Drilling to set up a new company, eDrilling Solutions, to develop software tools for computer aided guidance and control of drilling and well operations. The first product to be released is the ‘eDrilling’ package, for simulating, monitoring and visualizing drilling operations in 3D and in real time.

eDrilling was co-developed by Sintef and Hitec, with help from Aker Solutions, during a three-year, NOK 35 million ($5.2 million) project financed by ConocoPhillips Norge and the Research Council of Norway. ConocoPhillips is currently testing the tool.

eDrilling business development manager Sven Inge Ødegård said, ‘With well costs of around $150 million, significant savings can be expected from process optimization. We expect that the improved decision support provided by the eDrilling package will bring a 10% plus reduction in non productive time.’ The company will have offices in Stavanger and Bergen and expects to employ about ten people.


Riversand fixes ConocoPhillips’ master data in global SAP rollout

Global purchase to pay project eliminates ‘free form’ procurement and manual transactions.

Speaking at the US API/PIDX fall member meet in Houston last month, Raul Rom outlined Riversand’s work on master data and inventory/spares management for ConocoPhillips. The company was trying to implement a single global instance of SAP to span its worldwide operations and $188 billion in revenues. A global purchase to pay project was kicked off in 2007 to reduce ‘free form’ purchases and manual intervention in transactions. The project was six months behind schedule—partly because of data conversion and change management issues, as business continuity was essential throughout the project.

The solution was the restructuring of ConocoPhillips’ catalog to align with the PIDX classification and schema, along with the development of processes and tools for ongoing catalog management. An idea of the scale of the task can be had from a few statistics: some 1.2 million items were catalogued with 3,400 templates; on average, each catalog item had around 20 attributes; and the work involved the creation or modification of 3,000 PIDX templates. Finally, the process successfully matched some 200,000 individual free-text purchase orders.
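
Matching 200,000 free-text purchase orders to a structured catalog is, at heart, a text-similarity problem. A naive token-overlap matcher (a sketch only—Riversand’s tooling is far more sophisticated, and the items below are invented) gives the flavor:

    import re

    # Toy matcher: score free-text purchase order lines against catalog
    # descriptions by token overlap. Items and descriptions are invented.
    catalog = {
        "VAL-001": "valve gate 2 inch class 600 flanged steel",
        "PMP-007": "pump centrifugal 50 hp horizontal split case",
    }

    def tokens(s):
        return set(re.findall(r"[a-z0-9]+", s.lower()))

    def score(text, description):
        a, b = tokens(text), tokens(description)
        return len(a & b) / len(a | b)           # Jaccard similarity

    po_line = "gate valve, 2 inch, 600 class, flanged"
    best = max(catalog, key=lambda item: score(po_line, catalog[item]))
    print(best, round(score(po_line, catalog[best]), 2))   # VAL-001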


Sales, contracts and deployments

Brainware, Quorum, Emerson, Energy Solutions, P2ES, Paradigm, TriplePoint, Roxar, SMT.

Brainware has signed a $2.6 million follow-on contract with an unnamed private sector oil company—‘one of the world’s largest.’ The client uses Brainware Distiller as a front-end data capture tool for its global ERP rollout. Distiller will process millions of invoice pages per year from a hundred countries from shared services centers in Europe and the US. Brainware customers include Anadarko, ConocoPhillips, Halliburton and Shell. More on ConocoPhillips e-business on page 9 of this issue.

~

Quorum Business Solutions has sold its Quorum Pipeline Transaction Management (QPTM) solution to two natural gas pipeline operators, bringing total major-corporation deployments to 11. Mustang Fuel Corp. has acquired a license to Quorum Gas Marketing—the fifth client for the ‘end-to-end’ marketing solution. In a third deal, an integrated super major has licensed Quorum Land Suite to manage its leases and land-related GIS processes.

~

Emerson Process Management has signed a long-term support and service agreement with Qatargas for its expanding liquefied natural gas (LNG) operations. The deal includes asset management services, advanced skills training, and ‘comprehensive’ parts management.

~

Energy Solutions has made two major sales of its PipelineManager flagship this month. Alyeska Pipeline Service Co. is to deploy the tool for leak detection and slack flow monitoring on the Trans Alaska Pipeline System (TAPS), and PEMEX Gas y Petroquimica Basica is to manage its 12,000 kilometers of natural gas, LPG and petrochemical pipelines with the tool. The total number of deployments is now 150.

~

Quintana Minerals Corp. is upgrading its land management infrastructure from P2 Energy Solutions’ Tobin LandSuite to P2ES’ Enterprise Land 2.5. The migration, performed using standard migration scripts, will address various reporting and operational requirements.

~

Dubai Petroleum Establishment, the national oil company, has extended its strategic consulting contract with Paradigm’s Earth Decision Sciences unit. Paradigm’s consultants are to build a new petrophysical database with ‘clean, consistent data and parameters’ to support DPE’s drilling and reservoir studies. The company also announced that Woodside Energy (USA) has chosen its ‘next-generation’ interpretation and velocity modeling solutions for its Gulf of Mexico exploration program.

~

Parallel Petroleum Corp. is rolling out Triple Point Technology’s ‘Commodity XL for Fair Value Disclosure’ solution to provide credit-adjusted ‘mark-to-market’ fair value level assignment for its energy commodity derivative transactions. Other TPT clients include Copano Energy, Plains E&P, Dominion and Range Resources.

~

Roxar has received an order for NOK 21 million worth of topside meters and software for production data analysis from a major Malaysian offshore development. The deal includes Roxar’s Wetgas meters and its new Fieldwatch and Fieldmanager software that ‘bridges the gap’ between Roxar’s IRAP RMS reservoir software and its instrumentation solutions.

~

Calgary-based Talisman Energy has chosen Seismic Micro Technology’s Kingdom Suite for its global new ventures team. The software will be used to screen farm-in opportunities, conduct acreage evaluations and assess opportunities ‘within short timeframes.’


Standards Stuff

Energistics SOA, PPDM name change, Adobe joins Fiatech, OASIS DITA, eBusiness interop.

Energistics’ Web Services Interoperability Standards (WSIS) V 1.0 release candidate is available for member and public review and comment through December 15. This, the first deliverable from Energistics’ Technical Architecture Work Group, sets out to guide application developers and buyers of future interoperable solutions. The work derives from SAIC’s SOA for Energy initiative (OITJ September 2008). More on WSIS from www.energistics.org.

~

The Public Petroleum Data Model Association (PPDM) is changing its name to the ‘Professional Petroleum Data Management Association.’ PPDM CEO Trudy Curtis explained, ‘Industry now looks to the PPDM for leadership in data management and governance, business knowledge and master data management. PPDM’s activity now includes best practices, certification and training. Our standards, including the PPDM 3.8 data model, have now been accepted by companies of all sizes.’ PPDM is also changing its rules to allow individuals to become members. The Association has also announced the creation of new workgroups for well description, ‘data focused business rules’ and data quality metrics. A comprehensive educational program to support professional development and certification is also being developed. More from www.ppdm.org.

~

Adobe has joined the FIATECH standards body and is working to promote its ‘de facto’ PDF format, now an ISO standard, to address engineering and construction business challenges. More from fiatech.org.

~

The OASIS organization has kicked off a program to leverage its Darwin Information Typing Architecture (DITA) as a ‘standard for standards.’ The new ‘DITA for Technical Standards Subcommittee’ sets out to leverage DITA for the reuse of common elements, to develop common technical glossaries and a DITA toolset for maintaining and publishing technical standards. More from www.dita.xml.org.

~

The European CEN/ISSS eBusiness Interoperability Forum (eBIF) has kicked off a ‘Global eBusiness Interoperability Test Bed.’ Partners include the Enterprise Interoperability Centre, ETSI, the US National Institute of Standards and Technology (NIST) and industry bodies. More from www.cen.eu/isss/ebif.


Hess signs up for RapidRatings’ supplier scorecard

Financial health ratings monitor ‘corporate counterparty’ risk exposure.

Hess Corp. has signed up for Rapid Ratings International’s (RRI) suite of products for measuring corporate counterparty risk exposures. Under the agreement, Hess will use Rapid Ratings’ Financial Health Ratings (FHRs) to evaluate and manage the risks associated with its public and private suppliers.

RRI’s ratings provide comprehensive measurements of the financial health of a company, along with predictions concerning its ability to remain competitive within its sector. The ratings include ‘forward-looking’ indicators of changes in share and bond prices, in credit default swap spreads and in credit rating agencies’ evaluations.

Greg Cortez, director of credit risk control with Hess said, ‘RRI’s risk management tools for counterparty exposures rate public and private companies on the same basis. In today’s volatile credit environment, risk management is crucial to profitability. We believe Rapid Ratings will be a valuable addition to our current process.’

RRI Chairman and CEO James Gellert added, ‘Given the current climate of increased uncertainty and weakened confidence, relying on the large rating agencies and unreliable default tools that incorporate market pricing is too risky. Smart operators in the market demand better tools. The FHR methodology meets this need by telling businesses who they can trust.’


SAS rolls out ‘PAM’—Predictive Asset Maintenance

New ‘maintenance-centric’ data model builds on service intelligence architecture.

Business analytics specialist SAS Software is rolling out a new Predictive Asset Maintenance (PAM) offering built atop its Service Intelligence Architecture data integration server. SAS PAM promises ‘optimized, sustainable maintenance strategies’ and improved equipment performance and availability. PAM leverages near-real-time monitoring and alerts generated by predictive models to ‘proactively address’ potential performance issues before they cause downtime or increase the length of planned shutdowns.

SAS is to leverage its predictive data mining capabilities to drive ‘continuously improved’ reliability and equipment efficiency. The analytics also identify which of the hundreds or thousands of sensor tags and other measurements really affect equipment performance.

SAS’ ‘maintenance-centric’ data model captures data from legacy and modern MES, ERP and CMMS systems. PAM then transforms and cleanses the data for consumption by a wide range of stakeholders. The PAM data model is claimed to overcome the barriers imposed by ‘siloed’ operational systems. PAM’s automatic engine continuously monitors asset health, testing new sensor or condition data against defined rules and thresholds; the results are then analyzed using SAS’ JMP front end. PAM is designed for use by both operations and maintenance workers and senior-level managers responsible for quality, productivity and supply chain costs.
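
To make the rule-and-threshold testing concrete, here is a minimal Python sketch. It is not SAS code and PAM exposes no such API; the Rule class, tag names and limits are all invented for illustration.

# Minimal sketch of threshold-based condition monitoring, assuming
# hypothetical sensor tags and limits (not SAS's actual engine).
from dataclasses import dataclass

@dataclass
class Rule:
    tag: str     # sensor tag the rule applies to
    low: float   # alert below this value
    high: float  # alert above this value

def check(rules, tag, value):
    """Test one (tag, value) reading; return alerts, empty if healthy."""
    return [f"{tag}: {value} outside [{r.low}, {r.high}]"
            for r in rules
            if r.tag == tag and not (r.low <= value <= r.high)]

# A vibration reading breaching its upper threshold:
rules = [Rule("pump_01.vibration_mm_s", low=0.0, high=7.1)]
print(check(rules, "pump_01.vibration_mm_s", 9.3))
# -> ['pump_01.vibration_mm_s: 9.3 outside [0.0, 7.1]']

In a production system, alerts of this kind would feed the predictive models and dashboards described above rather than a simple print.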


MegaPath broadband for Shell’s ‘CoolBand’ retail network

Deal brings managed security, VPN for credit card processing, cash transfers and tank monitoring.

Costa Mesa, CA-based MegaPath is to provide broadband network and security services to Shell retailers and partners via Shell’s ‘CoolBand’ network. CoolBand provides broadband connectivity and integration services to support credit and debit card processing and in-store applications via a secure network. Operators can connect stores to headquarters over the private network, eliminating costly phone lines for ATMs, cash transfers, tank monitoring systems and other analog devices. The system includes the latest point-of-sale systems with web-based reporting, inventory and labor management. The network also supports video surveillance and digital signage, all over a single broadband connection.

MegaPath senior VP Dan Foster said, ‘The Shell CoolBand program delivers value to local Shell operators, allowing them to increase productivity and security without requiring significant new investments.’ The deal includes MegaPath’s managed security, VPN and broadband services.


EICDataStream tracks oil and gas industry projects

UK Energy Industries Council’s database provides business intelligence on 6,500 upstream projects.

The London-based Energy Industries Council (EIC), a trade association of energy industry suppliers, has just announced EICDataStream, a new database that tracks major global energy projects. EICDataStream currently holds 6,500 international oil and gas sector projects, providing members with a ‘timely’ business intelligence solution.

EIC CEO Mike Major said, ‘EICDataStream provides quality information on a wide range of global projects. Users can search projects using parameters such as company, contract type, value range, project status, region and country.’ Members have web-based access to the database and can produce reports and graphs tailored to their individual needs. Completed projects are archived and remain accessible. Project information is sourced by the EIC’s international network of offices in London, Billingham, Aberdeen, Rio de Janeiro, Houston, Singapore and Dubai and by its links with major operators and contractors. EIC currently has 550 corporate members.
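
The parameterized search Major describes might look like the following Python sketch. The field names and sample records are invented for illustration and do not reflect the EIC’s actual schema or API.

# Hypothetical illustration of parameterized project search;
# field names and data are invented, not the EIC's schema.
projects = [
    {"name": "Field X FPSO", "country": "Brazil", "status": "FEED",
     "value_musd": 850, "contract_type": "EPC"},
    {"name": "Gas plant Y", "country": "Qatar", "status": "construction",
     "value_musd": 2400, "contract_type": "EPCM"},
]

def search(projects, country=None, status=None,
           min_value=0, max_value=float("inf")):
    """Filter projects on country, status and value range."""
    return [p for p in projects
            if (country is None or p["country"] == country)
            and (status is None or p["status"] == status)
            and min_value <= p["value_musd"] <= max_value]

print(search(projects, country="Brazil", max_value=1000))
# -> the Brazilian FEED project only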


West Energy rolls out esi.manage spend management system

3esi toolset supports planning, budgeting and spend tracking and analysis for E&P company.

West Energy Ltd. has chosen 3esi’s ‘esi.manage’ package to support its planning, budgeting and spend tracking processes. Graeme Bloy, VP Exploration, said, ‘esi.manage improves visibility of our budget data. All the information is collected in one place and everyone sees what is happening, not only for our wedge production and capital but also for our base production wells. esi.manage has also improved communications between our different departments and executive management.’

esi.manage improves planning, tracking and forecasting of capex, production, opex and reserve additions. Key performance indicators such as finding and development costs, exit rate and lifting costs can be followed in real time.
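
Of the KPIs mentioned, finding and development (F&D) cost is conventionally computed as exploration plus development capital divided by reserve additions over the period. A minimal Python sketch of that standard industry formula follows; the figures are illustrative and this is not 3esi’s code.

def fd_cost(exploration_capex, development_capex, reserve_additions_boe):
    """Return F&D cost in $ per barrel of oil equivalent (boe)."""
    if reserve_additions_boe <= 0:
        raise ValueError("reserve additions must be positive")
    return (exploration_capex + development_capex) / reserve_additions_boe

# $40 million exploration plus $110 million development spend
# adding 10 million boe of reserves:
print(f"F&D cost: ${fd_cost(40e6, 110e6, 10e6):.2f}/boe")  # -> $15.00/boe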

The latest esi.manage release adds enhanced capital management workflows such as multiple AFEs per asset/expenditure and web services integration with third party AFE systems. Other additions include XML export to business intelligence and data warehouse systems.
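
As a rough illustration of such an XML export, the following Python sketch serializes AFE budget records with the standard library; the element and attribute names are invented, not 3esi’s actual schema.

# Hedged sketch of AFE data export to XML for a BI system;
# element and attribute names are hypothetical.
import xml.etree.ElementTree as ET

afes = [{"afe": "AFE-2008-041", "well": "14-22-048-12W5", "budget_usd": 1.8e6}]

root = ET.Element("afe_export")
for a in afes:
    e = ET.SubElement(root, "afe", number=a["afe"], well=a["well"])
    ET.SubElement(e, "budget_usd").text = f"{a['budget_usd']:.0f}"

print(ET.tostring(root, encoding="unicode"))
# -> <afe_export><afe number="AFE-2008-041" well="14-22-048-12W5">...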

Other esi.manage clients include Bronco Energy and Delphi Energy Corp. For its sister product ‘rapt’ (risk analysis and performance tracking), 3esi counts Repsol-YPF, Marathon Oil, E.ON and Tecpetrol amongst its licensees. Rapt was developed in conjunction with Rose & Associates.


‘CAMS,’ Capital Asset Maintenance Streamliner announced

Program Framework rolls out oil and gas project management toolkit.

Enterprise Project Management specialist Program Framework has launched ‘CAMS,’ its capital asset maintenance ‘streamliner,’ a scheduling engine for maintenance in oil and gas production operations. The UK-based company, a Microsoft certified partner, is a specialist developer of solutions built atop Microsoft’s Enterprise Project Management toolset.

Program Framework MD Paul Major said, ‘Optimizing maintenance is critical for 24/7 operations in oil and gas, where the cost of lost time through shutdowns is high. CAMS goes beyond maintenance scheduling, showing what manpower and equipment is required, flagging critical safety issues and indicating how a specific task fits with other planned and reactive maintenance activities. For example, scheduled maintenance may mean shutting down other systems, providing an opportunity to perform maintenance on them at the same time.’
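
The opportunistic grouping Major describes can be sketched in a few lines of Python. This is not Program Framework’s implementation; the system dependencies and task list are hypothetical.

# Illustrative sketch: when a planned shutdown takes other systems
# offline anyway, pull their pending maintenance into the same window.
# The dependency map and tasks below are invented.
affects = {"compressor_A": {"compressor_A", "glycol_unit", "export_pump"}}

pending = [
    ("glycol_unit", "replace filter"),
    ("export_pump", "seal inspection"),
    ("water_injection", "valve overhaul"),
]

def piggyback(shutdown_system, pending_tasks):
    """Return tasks that can ride along with the planned shutdown."""
    offline = affects.get(shutdown_system, {shutdown_system})
    return [t for t in pending_tasks if t[0] in offline]

print(piggyback("compressor_A", pending))
# -> [('glycol_unit', 'replace filter'), ('export_pump', 'seal inspection')]

A real scheduler would also weigh manpower, equipment and safety constraints, as Major notes.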

CAMS was developed from earlier project management work performed by Program Framework for Marathon Oil. Last year (OITJ April 2007), Marathon commissioned Program Framework to design an enterprise project management (EPM) solution for its North Sea Brae asset. Brae comprises three production platforms and multiple onshore engineering teams. More from lawrie.siteman@programframework.com.


Welling study—‘Linux key to future of geoscience interpretation’

Worldwide Seismic Market study compares G&G software, platforms and price leadership.

The 2008 edition of the 300-page biennial Welling Survey of the Worldwide Seismic Market* includes a 50-page section on geological and geophysical software. The Welling survey compared G&G software from the major vendors based on some 200 responses from companies of various sizes and locations.

Welling’s researchers asked which operating systems users perceive as key to the future of geoscience interpretation. Linux is seen by most (50%) as the way forward, with 36% betting on Windows. Perceptions for Unix (excluding Linux) continue to decline, now at only 8%. The picture is more complex for selected sub-samples: majors and NOCs vote 80% Linux to 15% Windows, but for small to medium-sized companies the picture reverses, with 21% Linux vs. 60% Windows. The Welling ‘value map’ for G&G software shows all vendors close to the median value for relative performance. There is however a considerable spread in value, with one company leading by a short head in ‘relative performance’ but winning hands down on price. Unfortunately, Welling won’t let us tell you which company this is. They want you to buy their study—which kind of figures!

* www.welling.com/studies/seismic.html


AspenTech patents key aspects of aspenONE data model

Plethoric claims for data sharing, ‘amalgamation’ and vendor neutrality for ‘federated’ data model.

AspenTech has announced a ‘recently issued’ patent covering ‘key aspects’ of its aspenONE master data model. The master data model is said to provide a single enterprise view of process data associated with engineering design and supply chain operations and is integrated with control systems and enterprise resource planning systems via a ‘vendor neutral’ interface.

AspenTech’s patent, granted in April 2008, is somewhat broad in scope, covering a ‘system and method for organizing and sharing of process plant design and operations data.’ The method involves a ‘respective class view for each of multiple software applications, a composite class view, a conceptual data model and a resulting consolidated multi-tier data model. The model enables sharing of engineering and other data from software applications with other process and plant engineering applications and programs.’ The patent also covers an ‘amalgamator’ that synthesizes class and composite views, the conceptual and the multi-tier data model. aspenONE’s ‘federated’ data model promises connectivity for 3rd party applications such as ERP and transaction-based systems. The ‘enter once - share by all’ model allows process data to be shared between design, optimization and decision support. aspenONE claims adherence to industry standards including S95, ISO 15926, B2MML and Open Operations and Maintenance.
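
Reading the patent language loosely, the ‘amalgamator’ merges per-application class views of the same plant item into one composite record. The Python sketch below is a speculative illustration of that idea, not AspenTech’s method; all attribute names are invented.

# Hypothetical 'enter once - share by all' merge of per-application
# class views; later views win on conflicts, which are flagged.
design_view = {"tag": "P-101", "duty_kw": 75, "material": "316L"}
ops_view = {"tag": "P-101", "run_hours": 12840, "duty_kw": 80}

def amalgamate(*views):
    """Merge class views into one composite record, flagging disagreements."""
    composite, conflicts = {}, []
    for view in views:
        for key, value in view.items():
            if key in composite and composite[key] != value:
                conflicts.append(key)
            composite[key] = value
    return composite, conflicts

record, conflicts = amalgamate(design_view, ops_view)
print(record)     # composite view of pump P-101
print(conflicts)  # -> ['duty_kw'], design and operations disagree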

