March 2009


GIS for deepwater FEED

Instead of using computer aided design (CAD) for front end engineering design (FEED), Chevron’s engineers are using a geographic information system to design the company’s deepwater Indonesian hubs.

Speaking at the 2009 ESRI Petroleum User Group last month, Chevron’s Cory Moore, along with presenters from several of Chevron’s suppliers, unveiled an ArcGIS-based, 3D simulation life of field planning tool. Originally developed for Chevron’s deepwater Indonesian Gendalo and Gehem hubs, the tool is used to kick off front-end engineering design (FEED) and to address subsea layout of field components. The system is used for clash checks and for flow assurance with regard to submarine slopes. Desktop metrology enables jumper dimensions to be computed early in the design process—speeding fabrication. Components such as trees and jumpers are held in a growing oilfield equipment library—established from real world metrology. These ‘snap’ to hubs in a ‘spatially correct’ manner—a submarine equivalent of Visio! New objects can be customized from engineering diagrams and added to the library.

GIS was chosen over a conventional CAD approach because of the complexity of engineering design in the deepwater environment. Field planners have to cater for changing bathymetry, reservoir geometry and geohazards such as old risers on the seabed and sea scarps. GIS allowed for pipe and flow line routing and investigation of thermal interactions. GIS’ power as an integration tool was illustrated by incorporating completion data from Landmark, pipes from AutoCAD and high resolution bathymetry.

At the heart of the system is a database developed by Ellis Geospatial containing Chevron’s standard unique identifiers for equipment tag numbers and linking these across AutoCAD, ESRI and other systems. Chevron’s Global Information Link (GIL), a worldwide computing and communications infrastructure, is used to provide web-based access to approved external contractors.

One neat use case is in planning remotely operated vehicle (ROV) activity. GRI Simulations is using the model in ArcScene for a 3D representation of the proposed design. This includes dynamic interaction, bumping into objects, and a simulated virtual ROV-camera view complete with fish. The ‘VROV’ simulator is used by ROV pilots to practice jumper installation. Farallon Geographics built the password-protected geodatabase, which is accessible from anywhere, 24/7.

The system also interacts with conventional FEED toolsets including Documentum’s eRoom and AutoCAD. Workflows have been developed to automate AutoCAD updates after GIS design changes. The demos were pretty impressive—realistic representations of complex objects are manipulated with seeming ease and the resulting design has immediate real-world use in construction planning and training.

More from the ESRI PUG in next month’s Oil IT Journal.


CSC for BHP Billiton

Computer Sciences Corp. has signed a $53 million IT services contract with BHP Billiton covering worldwide infrastructure, desktop, HPC and telecoms.

BHP Billiton Petroleum has awarded Computer Sciences Corp. a five-year, $53 million information technology services contract for the management of BHPB’s global IT operations. The enterprise-wide infrastructure includes help desk, desktop, midrange, network, high-performance computing and telecommunications services along with applications management. CSC CEO Mike Laphen said, ‘We have supported BHPB for several years and are now adjusting our solution and engagement model to address the company’s new decentralized approach to IT.’

BHPB CIO Zhanna Golodryga added, ‘This agreement extends our previous relationship with CSC to provide scalable services and an enhanced IT experience for our users.’ CSC has provided IT support services to various organizations within BHPB for the past eight years. Last month (OITJ February 2009) CSC announced a petroleum enterprise business intelligence solution with Oracle Corp. Other CSC clients include BP, Conoco, Chevron and ExxonMobil.


So that’s the ‘semantic’ web! Now I understand!

Oil IT Journal editor Neil McNaughton takes note of a groundswell of interest in semantics in the upstream—just as Tim Berners-Lee seems to be redefining the nature of the beast!

I am loath to bug you about the semantic web again, but since in the current issue of Oil IT Journal we have no fewer than five independent mentions of the technology, it looks like we may have a groundswell on our hands, if not yet a tsunami. We hear from Norwegian Kadme on the use of semantic web technology in the new ArcticWeb portal. In our report from the Microsoft Global Energy Forum we hear Dan Ranta (ConocoPhillips) describe semantics as ‘very useful’ and Peter Breunig mention Chevron’s use of the technology. In a piece on the upcoming SPE1 Real Time Optimization Technical Interest Group’s Joint Venture Reporting workshop, to be held at next month’s Digital Energy Conference in Houston, mention is also made of the use of a ‘semantic web (ontology) to capture work flow use cases.’ And in our report from the Houston chapter of the POSC Caesar Association we hear again from the mother of all oil industry semantic efforts, the ISO 15926 reference data library.

But what really sparked off this editorial was a presentation by Tim Berners-Lee to the 2009 Technology, Entertainment, Design (TED) conference in Long Beach, California last month. No, I wasn’t there, but the magic of the webcast2 allows me to summarize what TBL told the entertainers.

TBL (‘I invented the World Wide Web’) made a passionate plea, asking for help to ‘reframe’ the web. TBL described how, when working at CERN3, he had some difficulty selling the project to management. The idea was hard to explain before the web came into existence. TBL had to get folks to imagine what it could be like. Some did, and the rest is history. Now nobody thinks twice about putting their documents online.

Today, TBL is on another mission. He wants people to put their data on the web. The idea is that once there is lots of data out there, it can be mashed up to provide insights and innovation. TBL invited the TED audience to reflect on ‘a world of linked data.’ The idea is very simple. Just as a web document has an ‘http name,’ the concept can be extended to data. Thus people, things and events all have ‘http names.’ Users will then be able to retrieve standard formatted data from a ‘thing’ and will be able to derive useful relationships. Linked data allows, for instance, a person born in Berlin to be linked to data about the city—which is just another ‘thing’ with more http relationships. And that is all there is to it! TBL noted en passant that President Obama has said that US government data will be on the internet. He hopes that it will be deployed as ‘linked data.’ Data from business and science, about events and talks, is all amenable to linking. TBL wants to ‘make the world run better by making data available and by avoiding database hugging.’ OK, you can craft a beautiful website, but first, ‘give us your unadulterated raw data.’ TBL had the TED crowd enthusiastically chanting ‘raw data now!’
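To make the ‘http names for things’ idea concrete, here is a minimal sketch using the Python rdflib library. The URIs, the person and the ‘bornIn’ property are invented placeholders for illustration, not anything TBL actually showed.

from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import FOAF, RDF

EX = Namespace("http://example.org/terms/")             # hypothetical vocabulary
person = URIRef("http://example.org/people/anna")       # an 'http name' for a person
berlin = URIRef("http://dbpedia.org/resource/Berlin")   # an 'http name' for the city

g = Graph()
g.bind("foaf", FOAF)
g.bind("ex", EX)
g.add((person, RDF.type, FOAF.Person))
g.add((person, FOAF.name, Literal("Anna Example")))
g.add((person, EX.bornIn, berlin))   # the link: person and city are both just 'things' on the web

# Serializing as Turtle yields standard formatted data that anyone can retrieve and follow
print(g.serialize(format="turtle"))

Following the Berlin URI would in turn return more statements about the city (population, country and so on), which is all that ‘linked data’ amounts to.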

I have to say that I was impressed by the clarity of TBL’s presentation. But I was equally surprised that it included no mention of the semantic web! Could it be that TBL is backtracking from the ‘semantic’ positioning of his new web? I mean ‘put your data online’ is quite a different, and infinitely more understandable, suggestion than the ‘ontologies,’ ‘reasoning’ and ‘meaning’ of the semantic web. The very word ‘semantic’ implied that there was to be some kind of machine-based ‘understanding’ of text. To quote TBL from a much earlier presentation4, ‘[semantic] search engines will start indexes of assertions that might be useful for answering questions or finding justifications.’ Quite a different proposition from ‘raw data now!’

Perhaps the ‘semantic’ side of the new web was a red herring. After all, if the intent was just to make as much data available as possible, the W3C might have encouraged data owners to expose their databases through a simple http version of SQL, or something along the lines of the one-line ‘APIs’ that retrieve Google or Amazon data. Instead the W3C elected not just to encourage data to be made available, but also to specify the RDF5 protocol, which was to make it more amenable to analysis, semantic or otherwise.

This was probably a mistake. If you are encouraging third parties to put their data in the public domain, then imposing, or even asking for, a particular format is a burden too far—and an excuse for not doing anything. The situation reminds me of an upstream data project whose objective was not dissimilar to TBL’s. An EU government decided to open up exploration and told the incumbent National Oil Company to put bags of its data in the public domain. The NOC then proceeded to acquire hardware (the StorageTek tape robotics were particularly impressive) and develop software, using a comprehensive and incomprehensible data model. After several years, public data delivery was minimal, governments changed and it was business as usual. The lesson? It’s an IT classic—too much focus on the technology and not enough on the ‘business.’

How does the ‘business case’ of linked data apply to oil and gas? Where are all the massive public data sets of interest to oil and gas? There aren’t any in RDF, although there are many potential candidates from institutions like the MMS, CDA and Diskos.

So if it’s not about linked public data, why the interest in ‘semantics?’ The reason is that with web access, enterprise IT is now a microcosm of the world wide web. Data that folks inside the firewall do want to link to is held in multiple, incompatible sources and applications. In this context, the semantic web’s RDF is experiencing modest take-up as a lingua franca for master data management. Disparate data sources can be remapped to RDF and ‘semantic’ tools used to bring it all together. Is this the best way of achieving what should be quite a simple task? I really don’t know!

1 Society of Petroleum Engineers.

2 www.oilit.com/links/0903_1.

3 European nuclear research establishment.

4 In TBL’s foreword to Spinning the Semantic Web, MIT Press 2003, but referring to a talk given in 1997!

5 Resource Description Framework.


Gould unpicks recession’s impact on upstream

Schlumberger CEO reduces head count—but sees supply cuts fuelling future turn around.

Speaking at the 37th Howard Weil Energy Conference held in New Orleans this month, Schlumberger Chairman and CEO Andrew Gould offered some thoughts on the oil and gas business in a receding world economy. For Gould, ‘We are entering a period that will be very different from the last five years [..] of spectacular growth for Schlumberger.’ The world has turned from one of stretched supply to faltering demand. Demand, governed by economic activity, is the overriding driver of oil and gas prices. Until the world economy stabilizes, we can expect demand to reduce further and this, more than OPEC production cuts, will govern prices.

Gould went on to analyze the consequences of all this for the industry in 2009 and beyond. Recent cost increases mean that only conventional oil and current deepwater projects remain profitable at today’s prices. Heavy oil, enhanced oil recovery and ultra-deepwater no longer cut it, let alone shale oil or coal to liquids. Much Canadian tar sand activity has been cancelled and some national oil companies have slowed their heavy oil projects. The price decline has also affected exploration and the credit crunch is accelerating the decline in activity.

What does this all mean for Schlumberger? WesternGeco, Schlumberger’s seismic arm, will be badly affected by the reduced exploration spend, although the unit will be cash-flow positive in 2009. One ‘bright spot’ is in data processing where reverse time migration and full waveform inversion are ‘creating new markets.’ Deep-water activity is relatively unscathed and generally, Schlumberger’s measurement-based technologies should remain in demand. Gould also sees positives in Middle East natural gas and resilience in Latin America and parts of the Far East.

In anticipation of the slowdown, Schlumberger is cutting its head count by some 10%. The company is reducing or cancelling many non-essential projects and has removed some levels of management. However, to prepare the company for when activity picks up, some investments need protecting. Paradoxically, in view of the staff cuts, the first of these is people. Schlumberger has recruited some 11,000 engineers in the last five years, so cutting back will be relatively easy. The plan is to maintain a modest recruitment program and to ‘manage’ the retirement of a large number of baby-boomers.

Schlumberger also plans to protect its investment in R&D, the ‘fuel’ for the technology of tomorrow. Opportunistic corporate acquisitions may be envisaged especially as valuations are now ‘falling more into line.’

The silver lining on the cloud is the effect of the downturn on supply. Already, cutbacks have reduced short-term production capacity by anywhere up to two million barrels per day. A longer period of low spending will mean a dramatic fall off in capacity and, once demand recovers, a steep recovery in price. Gould did not say when this is going to happen, ‘Judging the moment demand will turn is still almost impossible—your estimate is as good as mine.’


SPE—joint venture production reporting workflow workshop

Houston Digital Energy Conference to check out semantic web/ontology for use case capture.

A half day workshop at the upcoming Society of Petroleum Engineers Digital Energy Conference (DEC) in Houston next month is to present a joint venture production reporting (JVPR) investigation sponsored by the SPE’s Real Time Optimization (RTO) technical interest group, the IT Technical Section and the University of Houston. The JVPR was initiated at last year’s DEC to develop a ‘workflow perspective’ on JV reporting and to develop a library of use cases and reference workflows.

The project also sets out to analyze the economic impact of the workflow and to evaluate the use of a ‘semantic web (ontology)’ to capture the workflow and use case. JVPR was selected as it offered a ‘quick win,’ particularly in the areas of invoice reconciliation and avoiding penalties due to reporting irregularities. JVPR workflows should also benefit from improving real time production data exchange between operators and their partners. Project member subject matter experts have documented a number of JVPR workflows in considerable detail. These are available on a project Wiki at http://jvpr.wikidot.com/table.


DNO commissions real time visualization solution from Epsis

Originally developed for Kern River, Epsis Real Time Assistant now deploys to Iraq.

Norwegian oil independent Det Norske Oljeselskap (DNO) International has commissioned Epsis (headquartered in Bergen, Norway) to deploy a 3D data co-visualization tool for use on its Tawke field in Iraq’s Kurdistan province. Epsis will be combining its Epsis Real-Time Assistant (ERA) Visual and ERA Connect applications to provide DNO’s technical staff with access to ‘all relevant information and field data.’ Epsis president Jan-Erik Nordtvedt said, ‘After years focusing on product development of the ERA platform for the US market, we now want to target Norwegian operators.’

ERA’s flagship deployment is as a component of Chevron’s Master Schedule View, part of Chevron’s ‘Minerva’ data infrastructure. ERA Connect pulls up applications such as Excel, PowerPoint, video and domain-specific tools. The collaboration tool is used to add in a participant’s PC and share a workspace. Workflows and workspaces can call any application such as Google Earth or E&P-specific applications. Virtual teams can be created across remote locations, sharing all information on screen and allowing for active participation. Chevron uses ERA to co-visualize real-time information on its Kern River field in California.


Whereoil selected for ArcticWeb project

Semantic technology from Norwegian Kadme will power arctic information portal.

ArcticWeb, a Norwegian joint industry project, has selected technology from Kadme to link disparate web resources of use to the consortium. ArcticWeb is to harmonize and publish data from the Norwegian Geological Survey, the Meteorological Institute, the NPD and others. Oil company end users will use the ArcticWeb portal to browse environmental data from the above sources. Data types include oil spill information, oil and gas facilities, pipelines, fishing, recreational areas, wrecks and sailing routes.

Kadme’s Whereoil (formerly K-Map) technology works by adding a semantic layer to public data sources, enabling information to be collated in a single interface for end users. The Kadme ‘Virtual Warehouse Framework’ was used to power a joint Kestrel-Kadme solution for metadata management (OITJ September 2007). ArcticWeb members include BG, ConocoPhillips, ENI, Lundin, StatoilHydro and Shell. More from www.oilit.com/links/0903_2.


OGP to update position formats

Major initiative seeks to harmonize UKOOA, SEG and other oil and gas navigation standards.

The Surveying & Positioning Committee of the International Association of Oil & Gas Producers (OGP) is to revamp its positional data exchange formats. Back in 2005, ownership of the authoritative geodetic data set developed by the European Petroleum Survey Group was transferred to the OGP. The plan is for a major revision of the positioning formats, originally developed by the UK Offshore Operators Association (UKOOA), to be conducted in cooperation with the Society of Exploration Geophysicists (SEG). The SEG’s own positioning formats are now deprecated and will be replaced by the revised OGP P-formats. The aim is for a ‘single, global source of positional advice, guidance and format provision for the upstream.’

Under consideration for revision are data exchange formats for processed (post plot) coordinate data, raw marine positioning data, seismic binning grids and well deviation data. Stakeholder mapping and scoping workgroups are already under way. The Surveying & Positioning Committee is seeking support, technical contribution and comment from interested parties involved in data acquisition, processing, software development, data management and quality control. The OGP is also revising the MODU site survey guidelines and, in partnership with the International Marine Contractors Association (IMCA), updating the guidelines for the use of differential GPS in offshore surveying. More from OGP on http://info.ogp.org.uk/geodesy/.


Help for UK supply chain, new energy jobs website

Oil & Gas UK offers help to distressed supply chain. Energy Institute launches job seekers’ website.

Oil & Gas UK, an industry trade body, has launched a confidential payment help-line for the oil and gas supply chain. The service lets the supplier community give feedback on purchaser behaviors that are ‘adversely affecting their ongoing business viability,’ particularly in respect of the speed at which invoices are being paid.

Paul Dymond, operations and supply chain director with Oil & Gas UK, said, ‘We would like to encourage all member and non-member companies within the oil and gas supply chain to contact us through these links to let us know their experience with payment terms, so that we can identify appropriate industry responses. Information will be held in confidence and only shared in a generic, non-attributable form unless with the express written sanction of the provider.’

The Energy Institute (EI), another UK industry body, is also reacting to a period of ‘unprecedented change’ with the creation of an online recruitment service at www.yourenergyjobs.com. According to EI, ‘thousands of new jobs’ will be created over the coming years thanks to government investment in nuclear and renewables. The site offers advertising options and a CV database. It is not clear how the EI’s effort will be received by existing energy recruitment boutiques. But if they see anything ‘adversely affecting’ their business viability, they can always go talk to Oil & Gas UK.


DEAL tender up for renewal

Common Data Access is to re-tender Digital Energy Atlas and Library contract.

UK-based Common Data Access has announced that it will be re-tendering for the management of its Digital Energy Atlas and Library (DEAL) catalogue of UK offshore geoscience data. The current DEAL management contract (held by the British Geological Survey) expires at the end of December 2009. CDA intends to issue a competitive tender on the 6th April 2009 and to award a new contract in September 2009.

Bidders must be able to demonstrate appropriate domain knowledge and track record, a suitable ‘internet-facing’ application, qualified personnel and financial stability. DEAL data types include wells, seismics, licences, infrastructure, cultural and environmental. Services expected from bidders include data management, database management, subscription services and online access.


Software, hardware short takes

News from ISS Group, Schlumberger, PRCI, WellEz, Paradigm, Roxar, Tecplot and WellPoint.

ISS Group has released a new version of BabelFish Verify (previously Operations Data Recorder) adding an enhanced GUI, a new installer and updated documentation. BFV is a data quality stage gate that checks operational data before it is distributed to production reporting, allocation and accounting systems.

The 2009.1 release of Schlumberger’s Ocean for Petrel development environment offers workflows in seismic, geological, modeling and simulation—all now accessing data types and functions in Petrel. The release includes best practices documentation, the first 64-bit implementation and ‘custom’ coordinate systems—described as the first step in the roadmap towards an ESRI implementation of Petrel.

The Pipeline Research Council International (PRCI) has announced a new release of its Submarine Pipeline Stability Analysis and Design package, PRCI-OBS. The software was developed by KBR under contract to PRCI and is considered a world standard for the design and analysis of subsea pipelines. PRCI is a non-profit corporation comprising 38 pipeline companies in the US, EU, Canada, South America and the Middle East. PRCI-OBS V3.0 is available from PRCI’s marketing partner Technical Toolboxes.

WellEz Information Management has updated its eponymous flagship service that integrates field reports and third party applications. The upgrade serves information from the field to accounting and technical applications, positioning the WellEz reporting service as an enterprise-wide source of operational information. The WellEz reporting solution is provided as a hosted, software-as-a-service solution.

Engineering design and simulation software developer Ansys has installed a high performance computing solution from HP. Two systems with a total of 76 server nodes and 576 cores will support increasingly compute-intensive engineering simulation workloads. The systems comprise 28 HP ProLiant DL 165/160 server nodes in the US and 48 HP ProLiant BL465c blade server nodes in Germany, and are based on quad-core processors from AMD and Intel.

Paradigm released Sysdrill 2009 at the SPE/IADC 2009 Drilling Conference and Exhibition this month. Sysdrill 2009 combines well planning and drilling engineering in a single application. The 2009 release eases data entry and third-party data load, adds ‘result-driven analysis for rapid identification of drilling problems and now incorporates geological data for improved planning and visualization.’ Sysdrill is now tightly integrated with Geolog forming a ‘real-time, contractor-independent geosteering solution.’

Roxar’s Tempest 6.5 release comes with extended parallel processing capabilities and speed improvements in both serial and parallel simulations. Improved visualization capabilities, streamlines for each phase and 3D cross sections make reservoir simulation accessible beyond the specialist reservoir engineering community.

Tecplot RS 2009 R2 now automatically determines history match factors for multiple simulation runs, allowing objective assessment of the accuracy of a reservoir model.

A new release of WellPoint Systems’ WellPoint Integrated Suite (WIS) heralds a port to Microsoft Dynamics AX 2009. WIS 5.0 leverages core AX capabilities including role based front end, business intelligence, workflow and requisitions. WIS integrates WellPoint’s Energy Broker and Energy Financial Management into a single package.


PPDM Houston Member Meeting

PPDM back in business after ‘near liquidation’ with quality, MDM and GIS initiatives.

A couple of years back, the then Public Petroleum Data Model (PPDM) Association was almost in liquidation. A change of management and serious money from Chevron and ConocoPhillips halted the decline. The new Professional Petroleum Data Management Association is back in financial health and the board notes that ‘there is no longer an overlap with Energistics’ activity.’

CEO Trudy Curtis proposed a new workgroup on data quality and business rules. The project will assess the importance of different types of rules and make recommendations as to how rules should be collected, validated, published, and retrieved. The idea is to develop a web based application along with a ‘starter set’ of rules for well header information management and a practical solution that is easy to understand and deploy. Funding of around $150,000 is sought for the two phase project.
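By way of illustration only, a well header business rule of the kind envisaged might look like the following Python sketch; the rule wording, field names and thresholds are hypothetical, not part of any PPDM deliverable.

from datetime import date

def validate_well_header(header):
    # Return a list of rule violations for a (hypothetical) well header record.
    violations = []
    if not header.get("well_name"):
        violations.append("Well name must not be empty")
    lat = header.get("surface_latitude")
    if lat is None or not -90.0 <= lat <= 90.0:
        violations.append("Surface latitude must be between -90 and 90 degrees")
    spud, completion = header.get("spud_date"), header.get("completion_date")
    if spud and completion and spud > completion:
        violations.append("Spud date must precede completion date")
    return violations

# One record passes all three rules, the other fails all three
good = {"well_name": "Example 1", "surface_latitude": 29.76,
        "spud_date": date(2008, 3, 1), "completion_date": date(2008, 6, 1)}
bad = {"well_name": "", "surface_latitude": 123.4,
       "spud_date": date(2008, 6, 1), "completion_date": date(2008, 3, 1)}
print(validate_well_header(good))   # []
print(validate_well_header(bad))    # three violations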

Integrashare’s Gus Nodwell advocated a toolset for the creation and administration of a PPDM database. The idea is to build a graphical user interface (GUI) front end for administrators. This could be a web application or an API. Nodwell invited vendors and PPDM to work together on a cost effective toolset that would be of help to all.

Paul Haines of Noah Consulting noted that a falling oil price was creating resistance to master data management projects. This can be mitigated by clearly articulating the value of MDM to achieve business buy-in and by managing MDM project scope and process ‘focus.’ Haines cited his work on Marathon’s MDM, presented at last year’s PNEC, as a particularly successful project with business buy-in, focus and a proof of concept deployment prior to at-scale implementation. Looking forward, the economic downturn will impact projects, but Haines believes that careful ‘re-balancing’ of scope and use of in-house resources will help, as will stretching out program spend through 2009.

For anyone who doubts the usefulness of a PPDM-based MDM solution, Laredo Energy CIO Steve Jaques offered a convincing analysis of Laredo’s GIS-based data management solution. Laredo uses a PPDM 3.8 well master data store to coordinate data across a range of public and in-house data sources. ESRI GIS adds mapping to which Laredo has added iOps, an end user application blending land and well data and documentation. Jaques warns that smaller companies deploying PPDM need to be aware that tools are limited—and invites interested parties to join up with Laredo’s iOps initiative.

An interesting comment in the Q&A noted shifting ethics in the new generation of knowledge workers who are not ‘detail-oriented’ and are going to ‘kill the E&P industry!’ ‘Detail takes too much time and effort.’ Fortunately, there are plenty of contractors who are willing to spend the time and effort required. Anyone for outsourcing?


6th Microsoft Global Energy Forum, Houston

Around 500 attendees hear from oils including BP, Chevron, Marathon, ConocoPhillips and Shell on matters like knowledge sharing, digital oilfields, business intelligence and operations information management. Star of the show was Microsoft’s SharePoint, now deployed up and downstream.

Dan Ranta explained that for ConocoPhillips (CP), the key to successful knowledge sharing was the link to remuneration. This allowed CP to start collecting testimonials, including knowledge sharing stories. CP has been using communities of practice networks since 2004. Now knowledge sharing is ‘nested’ in functional areas and has become part of people’s ‘day job.’ Semantic analysis has proved very useful. CP now has a strategy for retaining corporate knowledge and a ‘FAST’ process for Finding information, Asking colleagues, Sharing expertise and Trusting global relationships. CP’s own semantic search technology has been a great equalizer for non-native English speakers in the company. Ranta warns that networks can be silos too! They need to reach out to other network streams; for instance, the downstream fixed equipment network now talks to its upstream equivalent, generating synergies.

Peter Breunig stated that Chevron’s IT is now part of every component of the energy value chain. While the ‘cool stuff’ gets the press, it is the ‘not so cool stuff’ that runs the business and pays the bills. Chevron currently makes good decisions that require manual effort. The company now wants to automate and optimize these proven business processes. The ‘cool stuff’ includes iFields, maintenance, production and reservoir asset management and, in the downstream, optimizing value in tanks and supply/value chain optimization. In Bakersfield, Chevron’s data is now in good enough shape to identify opportunities. Teams use SharePoint Team Sites and there is no more emailing of PowerPoints. Chevron wants to leverage the ‘wisdom of crowds.’ More employees can work from home. Chevron has adopted SharePoint and will be adopting My Sites. The idea is to let users span different disciplines and systems, to ‘blend poroperm with ERP!’ Semantics also ran. Chevron’s IM architecture is treated as an asset and data is managed as an asset. This stuff is not ‘cool,’ but if you don’t do this the rest won’t work. Curiously, at a Microsoft event, Breunig also argued in favor of doing things in an ‘open source-ish’ way.

Chris van Dyke presented Microsoft’s ‘Contoso Oil and Gas’ proof of concept oil production scenario, jointly developed by ESRI and SAIC. The fictitious Contoso uses SharePoint and Microsoft’s Business Intelligence (BI) stack to display key performance indicators such as wells operating below forecast. An ESRI web part pulled in satellite data from Virtual Earth and production data from SQL Server. An instant message was sent out to a co-worker to go check things out. A production ‘Wiki’ was involved. Wells behaving badly were broadcast to a production chart tool. SharePoint, Wikis, KPIs, forms, search, business intelligence also ran. Microsoft is now ‘committed to platform integration,’ ‘taking the integrating burden away from clients.’ All the above can be rolled in with other services or hosted to a greater or lesser degree in a ‘blended’ offering.

If Contoso was less than a tour de force, Bob Newton’s presentation of Marathon’s ‘ViewPoint’ came close. ViewPoint started life as a SharePoint-based proof of concept that was designed to ‘succeed or fail fast!’ It succeeded. ViewPoint was built according to the following design principles: 1) read all data from a system of record, 2) don’t move data around, 3) create role-based views, 4) use iterative developments and mash-ups, 5) be open—deployment is based on the ‘need to share,’ not on a ‘need to know.’ Finally, ‘minimal to no training’ should be required to use the system. Marathon now has a standard SCADA system and a global data historian and has extended ViewPoint to upstream marketing, reservoir engineering and accounting. ‘Gold standard’ databases feed into a ‘DataView’ middleware layer that supports ArcGIS/ViewPoint and other viewers. Marathon has standardized human machine interaction (HMI) across 10 domestic business units. ‘Situational awareness’ means more ‘eyes on data’ and improved data quality. Folks are more careful about what they capture. Field and well views are available. One big win has been in exposing reliability metrics with drill down to work orders, without having to ‘traverse’ to the ERP system. ViewPoint shone a spotlight on data quality and on Marathon’s business processes. ViewPoint also highlighted how much of Marathon’s business was previously done in Excel and other non-systems of record.

Steve Walker showed how Chevron is leveraging the Microsoft Business Intelligence (BI) stack in its downstream operations. BI is integrated with Chevron’s master data management initiative. Part of the business is standardized on a SQL Server/SharePoint/PerformancePoint stack. PerformancePoint lets ‘non SQL geeks’ write their own reports. Chevron uses the tool to track corporate performance against set goals. BI was used to enhance Chevron’s gas estimation process using various SQL Server reporting tools. BI is now used to monitor Chevron’s refineries. Walker logged on to the El Segundo refinery to view KPIs of crude in, yield and planned production. The system embeds a mass balance dashboard. OSIsoft’s SigmaFine also ran.

ConocoPhillips is also using SharePoint to access multiple software silos in its refining operations, as Zane Barham explained. Mergers and acquisitions have made for multiple inherited strategies and applications. ConocoPhillips is now working towards consistency by making data accessible, even to casual users. The idea is to offer KPIs across the value chain including a commercial view, refinery view, unit view and an equipment view for maintenance, repair and operations. The SharePoint-based operations information system (OIS) is being rolled out globally over a 2007-2010 time frame across 10 refineries. AspenTech’s Operations Manager has been adapted along with SharePoint and other .NET/C# developments. A ‘target board’ shows live values (200-500 items) against targets for optimization. The system connects to ConocoPhillips’ LIMS systems, and providing access to this previously inaccessible data has proved popular. ConocoPhillips now has a single code base for all its refineries—a change from the days when one refinery was run from a 64-sheet Excel workbook!

Among the exhibitors we spotted a curiosity from Votum in the form of a solid steel 2D Matrix ‘barcode’ designed to ‘tag’ drill collars and other oil country tubulars. The company thinks that RFID is inappropriate for the tough downhole environment and offers this ‘hardware’ solution for downhole inventory management.

This article is an abstract from The Data Room’s Technology Watch report from the Microsoft GEF. More information from www.oilit.com/tech and tw@oilit.com.


SMi E&P Information and Data Management, London

Opposing views expressed on the merits and otherwise of indexing vs. ‘Google’ type search. Talks from Saudi Aramco, Hess and Pemex generally support tagging. OMV’s ‘ISIS’ project federates multiple data sources. Dong analyzes data access rights. A heartfelt cry for ‘some new ideas!’

About 80 attended the 11th SMi E&P Information and Data Management conference in London last month. Schlumberger’s Eric Abecassis’ presentation introducing Petrel Data in Context sparked off an interesting discussion on the merits and otherwise of indexing and cataloging versus ‘Google’ type search. Classifying E&P documents is a very difficult task that ‘would take forever!’ Hence the automated tagging approach of Data in Context/MetaCarta.

However, Caspar Schoorl and Karen Blohm from data management specialist Fugro Data Solutions came down on the index/catalog side of the debate, particularly in the context of mergers and acquisitions and the need to combine different corporate data systems. For Fugro, metadata use has been increasing significantly since 2000.

Al Kok (Saudi Aramco) likewise argued in favor of cataloguing, advocating ‘right-size’ metadata capture which can be manual or automated. Saudi Aramco’s Exploration Legacy Data project ran from 2002 to 2006 and included document scanning and metadata capture to the database. Workflow and processes have been developed for indexing and data loading, assuring ‘rule-based metadata management.’ Search can now be by metadata, full text and/or GIS based. A data governance program monitors roles, responsibilities and data policies.

Martin Turner presented Hess’ PPDM-based master data store, federated GIS ‘data nodes’ and data governance procedures. Hess is also a ‘cataloguer,’ leveraging a modified FGDC metadata standard in a move from paper-based maps to a lightweight, thin client web browser accessing data in a federation of a PPDM well master database, Tobin Land Suite, SAP, seismics and production data. Federation is achieved through a combination of ETL, SDE, ArcGIS Server and web publishing tools. Geodatabase governance uses a simple model that ‘expects users to think, that’s their job.’ Data is spatialized in the global Hess GIS geodatabase and stored using the GIS ‘node’ concept. A GIS data node is a combination of GIS data and application files—all stored in a common folder structure. ‘Node keepers’ determine folder structure and manage data and access. Nodes transform loose data concepts into ‘something that is useful to Hess corporate.’ Turner reports that ‘attitudes towards metadata capture and GIS data management are changing.’

Tarun Chandrasekhar (Neuralog) outlined Pemex’ deployment of a PPDM-based data store in the form of NeuraDB. Pemex has developed well data lifecycle processes and a ‘satisfactory’ division of labor between IT and the business. Pemex’ well log library dates back over 90 years and includes a huge Mylar and paper archive. Today, Pemex’ digital well log repositories span physical documents, network drives, application databases, DMS and custom and commercial well log repositories. NeuraDB is now the hub of a well log ‘knowledge factory,’ capable of supporting the log lifecycle from field logs through processing and interpretation. Work order and data quality management are supported via the PPDM database. IT has been involved in the project to assure data QC. This has resulted in the publication of a ‘Guide for Certification of Analog and Digital Geophysical Data’ by Pemex’ IT/Operations unit. Pemex’ drilling department is now custodian of the well log repository, and uses ‘well researched’ QC practices to provide quality processed data for Pemex’ interpreters.

Achim Kamelger described OMV’s ‘ISIS’ project, a master data store built around Schlumberger’s Seabed data model. OMV’s acquisition of Romanian oil company Petrom resulted in ‘too many databases,’ with multiple links, synchronization issues and the need to duplicate data across applications. Proliferating Petrel projects led to multiple sources of ‘almost the same’ data. Directional surveys in Excel required a huge editing effort prior to loading into Petrel, which then produced multiple trajectories!

ISIS is a set of federated databases with master data in a shared master data store (MDS). The MDS holds master data for structured, unstructured and spatial data. Dataflows are driven by processes and standards. OpenSpirit and ProSource are currently under test. Schlumberger was the development partner. A proof of concept MDS was built using the Seabed data model accessing data in GeoFrame, OpenWorks, ArcGIS and a DMS (Documentum). The idea was to ‘buy not build’ and to ‘configure not program.’

‘Hard and soft facts’ emerged from the proof of concept. The key is that you need to deal with people issues and to focus on the end user’s needs. OMV has developed a cook book with Schlumberger addressing issues such as naming conventions. Scaling up from the proof of concept was a potential pitfall. This was addressed by user acceptance testing, service level agreements and ‘SP3R2’ (standard processes, roles and responsibilities). OMV underestimated users’ resistance to the DMS. If a secretary does not support the DMS, the boss will not use it!

Dong’s Fleming Rolle noted a ‘disconnect’ between data rights management and access/copy control, especially with regard to data ‘outside the firewall’ used in data rooms and partner meetings. Today, you can take all logs from a new well on a $50 SD card, or a 3D survey on a 250GB portable drive! We need more rigor regarding ‘informal’ copying and sharing. The problem is, if you deploy ‘ultimate security,’ people won’t be able to do their jobs and they will probably find ways around the security system anyhow. We protect our large data sets with ASP/DSP, VPN, Citrix/ThinAnywhere solutions. But what about the CEO’s PowerPoint presentation to shareholders? Dong has developed a data ownership model that defines data types, and ownership roles for ‘advisor,’ ‘strategy owner’ and ‘business owner.’ The model is being integrated into corporate ‘stage gate’ workflows, so that the company knows where data is at any point in time. Dong is using ISO 27001 and 27002. Rolle recommends that this 40 page document be read and understood by IM and IT. It deals with HR issues, encryption and access control.

The conference generally reflected industry consolidation around master data, GIS, data quality and governance. Not exactly rocket science or, as one observer put it, ‘Where are the new ideas?’

This article is an abstract from The Data Room’s Technology Watch report from the SMi E&P Information and Data Management conference. More information and samples of this subscription-based service from www.oilit.com/tech and tw@oilit.com.


Folks, facts, orgs ...

AMEC, Baker Hughes, CERA, CGGVeritas, Chevron, Devon, Energy Navigator, ENGlobal, Enventure, FMC, GE, Geokinetics, MMS, Octaga, OpenSpirit, Perficient, Geoservices, Sensornet and more...

AMEC Paragon has named John Harrower Senior VP Operations at its Americas oil and gas business.

Baker Hughes has named Andy O’Donnell VP Western Hemisphere operations. Derek Mathieson is VP Products and Technology. Martin Craighead is Senior VP and COO. Art Soucy is VP Supply Chain.

Bhushan Bahree has joined CERA as a Senior Director Middle East. Bahree was previously with the Wall Street Journal and Medley Global Advisors.

Ramy Kozman is managing CGGVeritas’ processing and imaging hub in Cairo.

Gary Yesavage has been appointed president of Chevron Global Manufacturing. He was previously manager of Chevron’s El Segundo refinery.

David Hager, formerly COO of Kerr-McGee, is now VP E&P for Devon Energy.

Dan Shikiar is heading up Energy Navigator’s new office in Denver, CO.

Roy Dowd is VP Construction with ENGlobal Corp.

KJ Tan heads up Enventure’s new office in Beijing.

FMC Technologies has elected Claire Farley and Thorleif Enger to its board. Farley is advisory director at Jeffries, Randall & Dewey and Enger is a former CEO of Yara International.

GE Oil & Gas has appointed Shaun Kelly as Canadian Regional Sales Leader for its PII Pipeline Solutions division.

Lee Parker is now Executive VP Operations with Geokinetics.

Industrial Defender has named James Blaschke VP Worldwide Sales and TC Lau VP of Professional Services.

David Liddle is now operations director and Tony Zaccarini is the new business development manager of the UK’s Industry Technology Facilitator.

Wayne Hampel has joined James W. Sewall Co. as VP Marketing & Sales. Hampel hails from Rolta International.

Ron Brinkman has been appointed Senior Staff Geophysicist in the Minerals Management Service’s Resource Evaluation Office. David Trocquet is the new District Manager in New Orleans.

Ole Christian Lappen is now Sales Director with Octaga. Lappen comes from Nokia’s Trolltech unit.

Mehdi Belrhalia is Business Development Manager in OpenSpirit’s Abu Dhabi, UAE office. Belrhalia was previously with Schlumberger Information Solutions.

Perficient has appointed John Hamlin and David May to its board. Hamlin is president and managing partner of Bozeman Limited Partnership and May is co-founder and portfolio manager of Austin-based Third Coast Capital.

Geoservices has appointed Jean-Pierre Poyet as CTO.

John Dick has been named VP Europe, FSU and Africa with Sensornet. Dick hails from Baker Hughes.

Tektonisk has changed its UK trading name to ShareCat Solutions Ltd. to align with its flagship ShareCat software.

Claus Kampmann is to retire from the position of Chairman and Director at TGS-Nopec. His ‘likely’ replacement is retiring CEO Hank Hamilton. Robert Hobbs now becomes CEO. Hobbs joined TGS last year from Marathon.

VisionMonitor has hired David Pierce as Executive Director of Sales and Marketing. Pierce was previously with Primavera.

Suhaka Consulting, an oil and gas database management company, has changed its name to Wofda, LLC. The company is headed up by Dave Kotowych (president) and Brenda Sigurdson (VP Operations). The pair co-founded A2D Technologies in 1993.

Gil Weisberger is directing John Wood Group’s new Mustang operating unit in Abu Dhabi, UAE.

Richard Slack has been appointed to the WellPoint Systems board. Slack is also president and CEO.


Done deals

Aker, ACEC, Berkana, Waterfall, Geosoft, Golder, GE, ID, Spectraseis, Dexa, Sensornet, Siemens...

Aker Solutions has partnered with the Arabian Consulting Engineering Centre (ACEC) to provide engineering services in the Kingdom of Saudi Arabia. Aker is also forming a joint venture with ACEC owner Sheikh Bugshan for in-Kingdom engineering and construction projects.

Berkana Resources Corp. has signed with Waterfall Security Solutions for the provision of sales and support for Waterfall’s security solutions in the United States.

Geosoft Inc. has signed a memorandum of understanding with Golder Associates to deliver data management solutions based on Geosoft DAP server technology and Golder’s GIS expertise.

GE Energy has signed a global OEM agreement with Industrial Defender to provide its cyber risk protection services to critical infrastructure industries.

Microresearch Corp. has acquired the stock of Rebel Testing Inc. of Gillette, Wyoming.

Saudi Makamin Co., the Al-Khobar, Saudi Arabia-based oilfield services group has taken an equity stake in Zurich-based Spectraseis alongside major shareholders Warburg Pincus and StatoilHydro Ventures. The companies have formed a new Dhahran-based joint venture to deliver low frequency geophysical solutions to the Middle East.

Dexa Systems has acquired Schlumberger’s Enterprise Security business. Dexa is now Schlumberger’s preferred partner for identity management and security services in oil and gas. Mehrzad Mahdavi will become Dexa Systems president and CEO.

Sensornet has signed an exclusive agreement with Abu Dhabi-based AlMansoori Specialized Engineering for the provision of thermal profiling solutions to the oil and gas vertical.

Siemens Energy and Industry units have signed an agreement with Fluor Corp. establishing Siemens as a preferred supplier to Fluor and its global projects business.

Process Systems Enterprise has appointed Hyperion Systems Engineering as agent for its gPROMS process modeling system in the Kingdom of Saudi Arabia.

Schlumberger has succeeded in its patent nullification action against competitor EMGS. The UK Patents Court followed an earlier ruling by the European Patent Office in September 2007 confirming Schlumberger’s position that the patents were obvious.


SPT Group EDPM to monitor Marlim production

Real time multiphase production management system to address flow assurance challenges.

Petrobras is to deploy a real-time production monitoring and management system (PMMS) based on SPT Group’s eField Dynamic Production Management system (EDPM) and the OLGA dynamic multiphase flow simulator. The PMMS will be installed at the deepwater P-35 Marlim field to optimize production and address flow assurance challenges. The system includes an advanced dynamic gas lift solution to optimize subsea well performance.

Marlim, once billed as the largest subsea development in the world, is located in the northeastern Campos Basin about 110 km offshore Rio de Janeiro. Water depths range up to 1,000 m. Today eight floating production units produce oil from upwards of a hundred wells.

The PMMS project is a first for Petrobras, with the coupling of a dynamic flow simulator and a real-time data acquisition system. The ‘landmark’ project is part of an ongoing Petrobras investigation into tools for improving production, minimizing risk and providing ‘true monetary value to the end user.’ SPT Group has deployed 20 similar flow assurance solutions in the past three years and claims that the EDPM/OLGA combo is becoming an industry standard for multiphase production and real time production management systems.


StreamSim on massively parallel multi-core architecture

Threefold speed-up reported with OpenMP-based code parallelization—bests cellular approach.

StreamSim Technologies and the Fraunhofer Institute have adapted StreamSim’s commercial reservoir fluid flow simulator to run on parallel multi-core architectures leveraging the OpenMP programming model. The work was presented at the Society of Petroleum Engineers Reservoir Simulation Symposium held in The Woodlands, Texas last month.

Parallelization was facilitated by the fact that the bulk of the serial program’s run time involved computing independent solutions for each streamline. Fraunhofer’s in-house software was used to identify ‘thread-safe’ variables that would migrate to the multi-core architecture with minimal rewrite of the existing code base. The test code was run on an AMD Opteron 8218-based machine with eight 2.6 GHz dual-core processors—for up to 16 OpenMP threads on RedHat Linux.

The parallel simulator was tested on several models including a Forties field model and a Middle East dual-porosity model. A speed-up of between 2.5 and 3.5 fold was seen with eight threads—reducing run time for these large models from around 12 hours to under four hours. Beyond eight threads, the serial portion of the code became the limiting factor. The authors believe that the inherently parallelizable nature of streamline simulation makes it better suited to several modern reservoir evaluation workflows than traditional finite difference, cellular simulators. The full paper can be obtained from www.spe.org.
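Why streamline simulation parallelizes so readily can be seen in the following minimal Python sketch, which uses multiprocessing as a stand-in for the OpenMP parallel loop described in the paper; the solve_streamline function and the data are hypothetical placeholders, not StreamSim code.

from multiprocessing import Pool

def solve_streamline(streamline):
    # Placeholder for the independent 1D transport solve along one streamline;
    # nothing shared is written to, so each call is trivially 'thread-safe'.
    return sum(streamline)

if __name__ == "__main__":
    # Dummy data standing in for streamlines traced from the pressure field
    streamlines = [[0.1 * i, 0.2 * i, 0.3 * i] for i in range(10000)]

    with Pool(processes=8) as pool:      # analogous to running eight OpenMP threads
        solutions = pool.map(solve_streamline, streamlines)

    # A serial step (e.g. the global pressure solve and mapping results back to
    # the grid) would follow here; by Amdahl's law that portion is what caps
    # the speed-up beyond around eight threads.
    print(len(solutions))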


POSC Caesar Association US member meet, NorHub announced

Semantic oil and gas platform, information validation services and next generation integrated ops.

Norway came to Houston last month for the US member meet of the POSC Caesar Association. PCA develops and manages engineering lifecycle standards used in the offshore construction industry. Nils Sandsmark traced the history of PCA which started as a Norwegian R&D project in 1993 and which now has 34 corporate members in 8 countries. PCA originally leveraged the EPISTLE engineering data model of the 1990s which has seen real world deployment by a significant number of major projects, engineering contractors and owner operators. EPISTLE has since evolved into the ISO 15926 data standard which is to be finalized in 2010.

Development of PCA standards is largely driven by Norway’s Integrated Operations projects—leveraging standards from the world wide web consortium. In particular, automation of the next generation IO phase will utilize the W3C’s Semantic Web technologies (OITJ January 2009) of ontologies for automated reasoning and rule-based inference.

Much of PCA’s activity revolves around the establishment of reference data sets of engineering information. PCA has teamed with the US-based FIATECH organization on the development of reference data services, with governance for interfacing with ISO and for maintenance of the reference data.

In Norway, the next generation of standards will come from a new Integrated Operations in the High North (IOHN) project that will look into unmanned drilling and production, production optimization and sub-ice operations. These are to be facilitated by further development of the ‘semantic oil and gas platform’ and ‘information assurance networks’ built around web services, subsea sensor networks and control systems and embedded information quality systems.

The semantic oil and gas platform will see enhancements to ISO 15926 based oil and gas ontology and the development of a prototype information validation service. A prototype web services platform will be designed to support automatic monitoring, simulation and optimization. A ‘semantic platform’ is at the heart of the whole process. IOHN participants include StatoilHydro, IBM, DNV, National Oilwell Varco, FMC, SAS Institute, Kongsberg, PCA and Norwegian trade body OLF. The project has a forecast budget of $10 million.

PCA, ShareCat and the E&P Information Management Association (EPIM) are working on a commercial model for disseminating ISO 15926 reference data and ShareCat’s catalog and documents. The plan is to provide equipment information for projects from ‘NorHub,’ a hosted database of equipment data for the offshore industry. NorHub should be operational later this year.


Sales, contracts and deployments

P2ES, Pansoft, ISS Group, Aker, Barco, Chiyoda, Coreworx, NVIDIA, Paradigm, Roxar, Octaga ...

AB Resources has licensed P2 Energy Solutions’ Excalibur energy management system, an oil country ERP and accounting package.

A Shanghai-based Sinopec unit has signed a $1.4 million contract with ERP software developer Pansoft of Shandong province for a customized capital management solution for its overseas subsidiaries. Pansoft’s development effort was assisted by a $150,000 local government award.

Egyptian Operating Company for Natural Gas Liquefaction has signed with Perth, Australia-based ISS Group for a BabelFish-based Plant Information Management System. The contract is valued at approximately AUD 2 million over five years.

Dong E&P Norge has awarded Aker Solutions a NOK 400 million contract for a subsea production system on the Trym field.

The US Minerals Management Service has selected Barco as prime solutions provider for its new visualization center. The facility is built around a Barco CADWall, twin Galaxy 3D stereo projectors and Barco’s XDS-1000 display management system.

RasGas has awarded Chiyoda’s Almana Engineering unit a long term engineering, procurement and construction contract for several LNG and gas processing plants for a total contract value of approx. $300 million.

Acorn Energy unit Coreworx has seen its contract with Fluor Enterprises extended, reaffirming Coreworx as a key supplier for Fluor’s project collaboration and document management applications.

The University of Texas’ Bureau of Economic Geology is building a seismic data processing solution around NVIDIA’s Quadro FX Pro GPUs.

Sasol Technology has awarded Foster Wheeler South Africa a framework contract for feasibility studies and front-end engineering and design services.

StatoilHydro has extended its use of Octaga’s Enterprise 3D VR solution for the second phase of its Ormen Lange development.

Paradigm announced sales of Geolog to DONG Energy and a multi-year technology access contract with StatoilHydro. The latter provides for global use of Paradigm’s subsurface applications. The company also sold its drilling and geosteering software, Sysdrill, Geolog and GeoSteer, to Williams Production.

Russian Salym Petroleum Development has awarded Halliburton a $100 million, four-year contract for provision of directional-drilling, measurement-while-drilling and logging-while-drilling services.

The program includes 400 ‘S’ wells, plus directional and extended-reach wells.

Dutch EPC Iv-Oil & Gas is extending its use of Intergraph’s SmartPlant Enterprise, integrating its existing solutions with SmartPlant Materials and SmartPlant Reference Data, streamlining clients’ supply chains.

BP, acting on behalf of the Azerbaijan International Oil Co., has awarded KBR the front-end engineering and design contract on the Chirag Oil field, Azerbaijan.

Kuwait National Petroleum Company has cancelled its contract with Fluor Corp. for work on the utilities and offsites for the al-Zour refinery. $2.1 billion will be removed from Fluor’s backlog in Q1 2009.

Roxar has won a NOK 10 million contract with a Middle Eastern client for the supply of Tempest, EnABLE and Geomodeling’s VisualVoxAT.

China Oilfield Services Ltd. has awarded a contract to Siemens Marine Solutions for the provision of electric-propulsion and automation packages on two new seismic vessels. The order is worth about €25 million.


Standards Stuff

Eurostep Java toolbox, Energistics geophysics SIG, API Rosettanet revamp, W3C rolls out OWL 2.

Engineering standards specialist Eurostep has announced a beta release of a STEP AP233 toolbox for the Java platform. The AP233 toolbox is intended for creating interfaces to legacy engineering systems for data import and export. The Eurostep toolbox enables rapid development and deployment of ISO 10303-233 based applications and interfaces. AP233 is a STEP-based ISO standard for the exchange of engineering data covering structural analysis and design, electronic assembly, interconnect and cable harnesses. While AP233 is predominantly a plant/manufacturing standard, it does extend to the process industry. Eurostep clients include Shell, BP and StatoilHydro.

Energistics’ Geophysics Special Interest Group heard from India’s ONGC this month on the need for an update of the Society of Exploration Geophysicists’ (SEG) seismic data exchange formats. ONGC’s J.V.S.S.N. Murthy noted that customization of existing SEG formats has led to proliferating ‘standards’ and large data sets stored in custom formats. Murthy further noted that there are no standards for the exchange of velocity data or of derived seismic attributes and interpretations. ONGC is proposing that the Energistics Geophysics SIG collaborate with the SEG, the RESCUE SIG and the OGP (see page 4) on the standardization of seismic-related data for ‘seamless data flow and application-independent interoperability.’
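
Murthy’s point about proliferating variants is easy to appreciate: the meaning of a SEG Y trace hinges on a few binary header fields that writers frequently repurpose. The sketch below assumes a standard SEG Y Rev 1 layout (3,200-byte textual header followed by a 400-byte big-endian binary header) and reads two of those fields; files written to a ‘customized’ variant defeat exactly this kind of reader. The file name is hypothetical.

    # Minimal sketch: read two SEG Y binary-header fields, assuming the standard
    # Rev 1 layout (3,200-byte textual header + 400-byte big-endian binary header).
    import struct

    def read_segy_header(path):
        with open(path, "rb") as f:
            f.seek(3200)                      # skip the textual header
            binary_header = f.read(400)
        # Byte positions per SEG Y Rev 1: 3217-3218 and 3225-3226 (1-based).
        sample_interval_us = struct.unpack(">H", binary_header[16:18])[0]
        data_format_code = struct.unpack(">H", binary_header[24:26])[0]
        return sample_interval_us, data_format_code

    if __name__ == "__main__":
        dt, fmt = read_segy_header("line_001.sgy")   # hypothetical file
        print(f"sample interval: {dt} microseconds, data format code: {fmt}")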

The American Petroleum Institute’s PIDX e-business standards body has ‘unanimously’ adopted the first revision to the PIDX RosettaNet (RN) specification since it first appeared in 2002. PIDX Vice-Chair Tim Morgan said, ‘RN provides data confidentiality, integrity, and non-repudiation—all of which are unaffected by the changes. The new standard does not introduce any new requirements. Rather, it addresses ambiguities and inconsistencies that existed in the original document and provides a cleaner implementation guide.’

The World Wide Web Consortium (W3C) has announced a new release of its web ontology language. OWL 2, the Web Ontology Language, is described as an ontology language for the Semantic Web with ‘formally defined meaning.’ Ontologies are formalized vocabularies of terms, often covering a specific domain and shared by a community of users. Definitions include relationships with other terms in the ontology. OWL 2 is designed to facilitate ontology development and data sharing, with the ultimate goal of making web content more accessible to machines.
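
To make the notion concrete, the following sketch uses the open-source rdflib package to build a two-term ontology fragment and print it as Turtle. The class names and example namespace are invented for illustration; OWL 2 layers richer constructs (property chains, keys, extended datatypes) on top of exactly this kind of class-and-property vocabulary.

    # Minimal ontology fragment with rdflib (class names and namespace invented).
    from rdflib import Graph, Namespace, Literal
    from rdflib.namespace import OWL, RDF, RDFS

    EX = Namespace("http://example.org/oilfield#")   # hypothetical namespace
    g = Graph()
    g.bind("ex", EX)
    g.bind("owl", OWL)

    # Two classes and a relationship between them.
    g.add((EX.Well, RDF.type, OWL.Class))
    g.add((EX.ChristmasTree, RDF.type, OWL.Class))
    g.add((EX.hasTree, RDF.type, OWL.ObjectProperty))
    g.add((EX.hasTree, RDFS.domain, EX.Well))
    g.add((EX.hasTree, RDFS.range, EX.ChristmasTree))
    g.add((EX.Well, RDFS.comment, Literal("A borehole completed for production.")))

    print(g.serialize(format="turtle"))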


Meridium, APT team on asset strategy development

Deal combines equipment failure mode database with asset management strategy software.

Meridium Inc. and Asset Performance Technologies (APT) are teaming on what is claimed to be the world’s largest ‘asset strategy library’ for industrial equipment. The deal sees the integration of APT’s library of industry best practices with Meridium’s Asset Performance Management solutions.

APT’s database includes failure modes and preventative maintenance plans for more than 200 equipment classifications. The library covers over 13,000 failure modes and 1,400 preventative maintenance activities which have been developed and refined by panels of experts from plant and equipment vendors.

APT CEO David Worledge explained, ‘Traditional preventative maintenance improvement starts with a blank sheet of paper. This is not practical for large scale deployment in real production environments which is why we created the Asset Strategy Library. Now, maintenance and reliability engineers can start with an expert-recommended strategy and a sound technical basis.’

Customers use the library to tune operations, manage risk and costs. The resulting strategy can be transferred to corporate EAM or CMMS systems, such as SAP, via Meridium’s Asset Strategy Implementation (ASI) module. Meridium’s customers include Chevron, Marathon Oil and Xcel Energy. APT content is currently in use in power generation, oil and gas, and other major industries.


CygNet for Energy Transfer Partners’ Houston Pipeline

‘Non disruptive’ deployment provides single management console for whole network.

Energy Transfer Partners (ETP) is deploying a SCADA solution from CygNet on its Houston Pipeline asset. ETP SCADA manager Joe Schmid explained, ‘Our old system involved numerous servers, applications and interfaces for production and disaster recovery. Applications came from different vendors. Some were even written by Enron’s programmers! Using and managing the system was cumbersome and time-consuming. The new CygNet platform is far easier to manage and supports our aggressive growth and the massive amounts of data we need to manage. We now have a single management console for all three pipelines and the opportunity to use CygNet to improve business processes such as remote asset management.’

ETP completed its Houston Pipeline and another project in just 14 months in what is described as a ‘non disruptive’ deployment. ETP has over 17,000 miles of pipeline in service, with another 750 miles under construction. CygNet provides a high-availability infrastructure for gas control, easy-to-use system tools, bulk data management, replication and validation, and standard protocols for communicating with field devices.
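
The ‘standard protocols for communicating with field devices’ are protocols of the Modbus and similar families. Purely as an illustration (Modbus TCP is an assumption here, not a statement about CygNet’s drivers, and the host and register map are invented), a minimal poll of a remote terminal unit looks like this:

    # Minimal sketch of polling a field device over Modbus TCP, one common
    # 'standard protocol' a SCADA platform speaks to RTUs and flow computers.
    import socket
    import struct

    def read_holding_registers(host, start_addr, count, unit_id=1, port=502):
        """Read 'count' holding registers starting at 'start_addr' (function code 3)."""
        # MBAP header (transaction id, protocol id=0, length, unit id) + PDU.
        request = struct.pack(">HHHBBHH", 1, 0, 6, unit_id, 3, start_addr, count)
        with socket.create_connection((host, port), timeout=5) as sock:
            sock.sendall(request)
            mbap = sock.recv(7)
            remaining = struct.unpack(">H", mbap[4:6])[0] - 1   # bytes after unit id
            pdu = sock.recv(remaining)
        if pdu[0] != 3:
            raise IOError(f"Modbus exception, function code {pdu[0]:#x}")
        return struct.unpack(f">{count}H", pdu[2:2 + 2 * count])

    if __name__ == "__main__":
        # Hypothetical RTU address and register map.
        values = read_holding_registers("10.0.0.50", start_addr=0, count=4)
        print("register values:", values)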

CygNet CEO Chris Smith added, ‘CygNet provides the oil and gas industry with an advanced platform and a unified information ecosystem. Users see improved management of operational data and consolidation of information from every corner of the enterprise.’


EleSy deploys Iconics solution on Transneft’s pipelines

Genesis32 control system monitors country-wide network of 400 pumping stations and 100 tank farms.

Transneft, the Russian state-owned entity that manages the country’s largest pipeline network, has deployed a Genesis32 control system from Foxborough, Massachusetts-based Iconics. Installation was performed by systems integrator EleSy of Moscow. The system monitors 400 pumping stations with 1,000 tanks in 101 tank farms.

The centralized dispatch and SCADA system runs on 500 PCs running Microsoft Windows NT and Windows 2000. Overall, some 800,000 digital and analog tags communicate over microwave, landline telephone and 22 satellite links using standard TCP/IP. The 2,100 operator screens required to operate the pipeline have an average response time of between three and five seconds from any point in the system. Genesis32 is web-enabled and integrates with large network applications.

Transneft has noted a substantial reduction in leak detection time and in the time taken to dispatch repairs. Transneft’s network carries 99.5% of all the oil used in Russia. The system supplies 35 refineries and ships oil products to Europe and China.


Schlumberger announces Orion II telemetry

Downhole mud pulse telemetry rates reach new high—three bits per second in a 10km hole.

Schlumberger’s Orion II measurement while drilling system uses new downhole compression algorithms, enhanced signal detection and noise cancellation methods to detect weak signals at higher telemetry rates than before. Downlink commands can be sent in real time while drilling without affecting delivery of measurement data to surface.

Orion II was used last year on Maersk Oil Qatar’s Al-Shaheen extended-reach well (over 12 km) to provide mud pulse bit rates of 3 bps at 10 km and 1.5 bps from 10 km to final depth. This translates into a possible recording bandwidth of 60 curves at around 8 data points per meter while drilling at 100 m/hour.
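
A rough back-of-the-envelope check (assuming, for the sake of argument, that all 60 curves are telemetered in real time rather than only recorded to downhole memory) shows why the compression matters:

    # Back-of-the-envelope check of the quoted figures. Assumption: the 60
    # curves are telemetered in real time rather than recorded downhole.
    rate_bps = 3.0            # mud-pulse telemetry rate at 10 km
    rop_m_per_hr = 100.0      # rate of penetration while drilling
    curves = 60
    points_per_m = 8

    bits_per_metre = rate_bps * 3600.0 / rop_m_per_hr   # channel capacity per metre drilled
    samples_per_metre = curves * points_per_m            # data generated per metre drilled
    print(f"{bits_per_metre:.0f} bits/m for {samples_per_metre} samples/m "
          f"= {bits_per_metre / samples_per_metre:.2f} bits per sample")
    # => roughly 108 bits/m for 480 samples/m, i.e. well under one bit per
    #    sample, which is only plausible with the downhole compression
    #    algorithms described above.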

Schlumberger’s Ian Falconer said, ‘With Orion II more quality data and higher resolution images can be transmitted from greater depths, enabling better drilling decisions to be made in real time. Both drilling efficiency and geosteering accuracy are enhanced by the system, which holds the records for the world’s deepest downlink, deepest logging and measuring while drilling transmission and deepest directional control.’


WiMAX coverage for Gulf of Mexico, Colorado

IDT Spectrum has won FCC approval for a new oil and gas-targeted broadband service.

IDT Spectrum has received approval from the Federal Communications Commission (FCC) to operate 46 WiMAX base stations in the 3.65 GHz band across Texas, Louisiana and Colorado. The company plans to offer WiMAX-based communication services to headquarters and regional offices of oil and gas exploration and services firms, enabling ‘cost effective, virtual connections with their rigs and platforms in the field.’ The area covered by the recent FCC approvals includes approximately 350 oil rigs.

IDT Spectrum CEO Mike Rapaport said, ‘Real-time streaming of well head data, VoIP, fax services, video surveillance and other types of information from the rigs and platforms can be transferred over a WiMAX network to onshore teams for monitoring and analysis. In the Gulf and on the mainland, enhanced communications between the on-site crew and off-site technicians are a key productivity driver.’

IDT Spectrum holds over 1,250 FCC licenses for fixed wireless spectrum, including a nationwide 3.65 GHz license. The company leases spectrum for point-to-point links and for large geographic areas of coverage. IDT Spectrum’s parent, IDT Corp., a telecommunications and energy holding company, is also engaged, through an affiliate, in an oil shale project in western Colorado. WiMAX promises bandwidth of up to 70 Mbit/s over tens of kilometers.


Kappa—use of hacked software leads to ‘damaging’ decisions

Pirated versions of software may seem right—but come with a health warning.

Pirated versions of Kappa Engineering’s Ecrin and Emeraude packages were recently posted to an illegal website. Kappa has initiated legal action against the site’s host and warns that users of the pirated versions are infringing its intellectual and commercial property rights. Kappa has posted a warning on its website that while the ‘cracked’ versions of the software appear to perform correctly, they will produce incorrect results. Users of the illegal versions run the risk of legal action. Moreover, using the results may lead to ‘damaging’ decisions.

The pirates used a well-known weakness in FlexLM that makes it relatively easy to bypass the protection. Kappa’s encryption and checksum tests were also bypassed to unlock the software. However, Kappa anticipated such an attack when it implemented its protection scheme and built in some tricky code. This means that although the pirated versions produce correct results for the sample runs used in Kappa’s training material, they will give incorrect answers in real situations.
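
Kappa has not published its scheme, but the general idea of a self-check that silently degrades results when tampering is detected can be sketched as follows. This is a purely conceptual illustration: the function names, checksum method and perturbation are invented and bear no relation to Kappa’s actual code.

    # Conceptual sketch of a tamper self-check of the kind described above
    # (names, digest and perturbation invented for illustration only).
    import hashlib
    import random

    # Recorded at build time for the genuine licensing module (placeholder here).
    EXPECTED_DIGEST = "0" * 64

    def licence_module_intact(path="licence_check.bin"):
        try:
            with open(path, "rb") as f:
                return hashlib.sha256(f.read()).hexdigest() == EXPECTED_DIGEST
        except OSError:
            return False

    def pressure_derivative(samples):
        """Compute a result; silently degrade it if the protection was patched out."""
        result = [b - a for a, b in zip(samples, samples[1:])]
        if not licence_module_intact():
            # Plausible-looking but subtly wrong output on anything beyond canned demos.
            rng = random.Random(len(samples))
            result = [x * (1.0 + rng.uniform(-0.05, 0.05)) for x in result]
        return result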

The company further warns users of pirated software of embedded code that allows illegal use to be tracked, even years after the event. The latest releases of Ecrin and Emeraude will also complain if a file from an older pirated license is opened, and the program will halt. Kappa invites users of illegal versions of the software to ‘fess up’ with a view to a settlement. As the company points out, it is going to do whatever it takes to protect its €5 million annual investment in its code.


Nalco Chemicals deploys hosted data historian from OSIsoft

Software plus services model centralizes operations data in Managed PI/SharePoint combo.

Nalco Chemicals has deployed a hosted version of OSIsoft’s PI System data historian to automate data collection from its global oil, gas and chemicals sites. Operations data from remote sites is now streamed to a centralized ‘Managed PI’ repository and analytics engine. The system is hosted on Nalco’s servers and managed by OSIsoft. Staff and customers access data, analytics and process KPIs through a SharePoint front end.

Nalco’s marketing manager for automation John Schlitt said, ‘In the chemical business, it’s important to document the performance of the chemicals that are used by our customers. OSIsoft’s Managed PI approach has let us centralize data collection for our customers.’

The solution leverages a complete Microsoft stack with SharePoint, SQL Server, Excel and PerformancePoint services, along with Office Communications Server and Communicator. In what is described as a ‘software plus services’ deployment, real-time information and analytic tools are hosted at the central location, where Nalco technicians and customers can collaborate and analyze data in real time.


Octaga Enterprise extended for Ormen Lange, Agbami

Toolset transforms huge CAD plant models into virtual reality training and visualization systems.

Since its first trials of Octaga’s 3D virtual reality viewing system (OITJ September 2006), Shell’s Norwegian unit has been working with Octaga to support very large plant data models such as the one developed by Aker Solutions for the Ormen Lange field. The results of this collaboration have been commercialized as Octaga Enterprise, now capable of visualizing the 35 GB onshore processing plant model with over 1.5 million plant items. For the offshore end, a 50 GB Pro/ENGINEER model includes two complete templates, 16 Christmas trees and a digital terrain model of the seabed.

The same technology is now being deployed by Chevron on its Agbami floating production, storage and offloading facility, offshore Nigeria. Chevron has used Octaga Enterprise to visualize the 15 GB CAD model, with 160,000 equipment items, and to provide access to Agbami’s training material, developed by Houston-based Epic Integrated Solutions. Octaga Enterprise was further extended at Chevron’s request to allow for manual avatar placement, access-way highlighting and navigation, pipeline following and linking to PowerPoint presentations.

