June 2009


Chevron’s Keystone MDM

Chevron Fellow Jim Crompton notes ongoing data challenges in the upstream and outlines ‘Keystone,’ a master data management initiative that is to underpin its Information Architecture.

Speaking at the 2009 PNEC Data Integration Conference (report on pages 6&7) in Houston last month, Chevron Fellow and Master of Upstream Architecture Jim Crompton asked, ‘Why are we still challenged by data management?’ Crompton noted that many issues discussed at earlier PNECs remain. Some folks believe things were better 30 years ago before we went digital. The good news is that management is now concerned about data management but the bad news is that management usually turns to IT and says ‘fix it.’ This is not the way forward.

Meanwhile, the need for current data is growing. Users want data now and they don’t want to spend time making sure it’s right. This is especially true in the context of production and real time data. A multi-disciplinary approach and asset-level partnership is needed. To date there have been some IM successes in restricted domains and geographies. But things fall apart when a production engineer needs data from the drilling system. For work in a collaborative environment such as an RTOC, many IM problems need to be addressed.

Crompton showed a complex spider diagram of multiple systems of reference (SoR). Chevron has identified around 40 SoRs needed to run the business. So far only around 20 SoRs are under control—the rest are on spreadsheets or otherwise not managed! Chevron’s information architecture (IA) is an overarching approach to IM including quality improvement and governance across the information value chain. ‘You can build a portal in 6 weeks but it might take 6 months to figure out and solve the underlying data.’

The cornerstone of IA, the Keystone master data management (MDM) server, provides the ability to interoperate across key information objects. Here, ‘we are talking about the kind of data that most users don’t want to know about.’ In other words, ‘referential standards’ rather than ‘how much is my well producing.’ It is hard to get funds to attack this sort of problem and even harder to define cross-discipline semantics in a way that is satisfactory to all. But without a common semantic there are ‘too many good meanings.’ Problems go from hard to ‘really intricate’ and the question is, ‘when do you stop analyzing?’ Keystone also extends to safety issues. ‘Bad data can hurt people. It is not just dollars and cents and the ROI of MDM may not be obvious to all. What we are attempting to do involves well, legal, land and more—all the semantics of our industry.’
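By way of illustration, the cross-referencing at the heart of an MDM hub can be sketched in a few lines of Python. The record layout and system names below are our own invention, not Chevron’s design:

    # Minimal sketch of master data cross-referencing: a hub maps each
    # master entity to its identifiers in the various systems of
    # reference (SoR). All names are illustrative.
    MASTER_WELLS = {
        "W-0001": {  # hypothetical enterprise master ID
            "name": "Example Well 1",
            "xref": {"drilling": "DRL-778", "production": "PRD-4412",
                     "land": "LND-09-A"},
        },
    }

    def resolve(system, local_id):
        """Translate a system-local well ID to the enterprise master record."""
        for master_id, rec in MASTER_WELLS.items():
            if rec["xref"].get(system) == local_id:
                return master_id, rec
        raise KeyError("no master record maps %s:%s" % (system, local_id))

    # A production engineer's application can look up a well known to the
    # drilling system without knowing drilling's identifier scheme:
    print(resolve("drilling", "DRL-778"))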

Last December we reported from the Chevron-hosted Semantic Technology in Oil and Gas event where there was talk of using semantics to manage enterprise master data. Crompton told Oil IT Journal however, ‘While Chevron has several research projects around the semantic web, the Keystone project is not planning on using this technology for our Upstream Master Data Management approach at this time.’


SMT, JOA team

JOA JewelSuite’s ‘true vertical gridding’ is embedded in SMT’s Kingdom Suite, which now offers a ‘modeling while interpreting’ capability.

Seismic Micro Technology (SMT) has added geomodeling to its PC-based seismic interpretation package in a deal struck with Delft, The Netherlands-based JOA Oil and Gas. JOA’s JewelSuite is to provide a ‘modeling while interpreting’ capability to users of SMT’s Kingdom Suite.

SMT president and CEO Arshad Matin said, ‘Previously, interpretation and modeling were dysfunctional, separate silos managed by different teams. Kingdom Geomodeling (KG) changes this with support for 3D modeling from inside the interpretation application.’

JOA president Gerard de Jager added, ‘JewelSuite’s orthogonal gridding algorithm has proven its worth in the engineering domain. Our partnership with SMT brings the true vertical gridding method and a seamless workflow to seismic interpreters.’

KG creates sealed structural models, 3D geocellular grids and properties. Tight integration means that there is no need to export interpretation data to the modeler. ‘Next generation’ gridding provides better mapping of complex structures than pillar gridders such as Petrel. SMT and JOA are to share technology and co-market the new solution. KG will see ‘limited availability’ in Q4 2009. More from ron.boese@jewelsuite.com.


Back to the future—the return of the standard data model?

Editor Neil McNaughton reports from a PNEC panel comprising BP, Chevron, ConocoPhillips, ExxonMobil and Shell.

This time of the year we roll over the sponsors of our website www.oilit.com. It therefore behooves me to thank the following renewing sponsors:

Exprodat

geoLOGIC

Geosoft

Georex

Neuralog

OFS Portal

Petris Technology

Petrosys

... and to welcome the following newcomers:

Austin Geomodeling

LMKR

OSIsoft.

Oil IT Journal’s publisher, The Data Room, has also taken up a sponsorship slot and, to celebrate, has placed the full text of all of its Technology Watch reports from 2007 into the public domain on www.oilit.com/tech.

~

There was an interesting panel session at the 2009 PNEC Data Integration Conference last month (full report on page 6 of this issue) on the topic of ‘standard data models.’ Twenty years ago or thereabouts the standard data model was all the rage and very considerable amounts of industry money were poured into efforts such as Epicentre, Project Discovery, and POSC/Caesar. A smaller amount of money (but a lot of time) was also put into PPDM. What did the panel have to say on the subject in 2009?

The panel comprised co-chairs Cora Poché (Shell) and Mark McDermott (ConocoPhillips Alaska) along with Rusty Foreman (BP), Mike Ryan (ExxonMobil) and Mike Underwood (Chevron).

McDermott stated that ConocoPhillips is trying to globalize support and is getting interested in data models and integration. But the panel’s chosen topic, the ‘standard data model’ (SDM), has failed to gain traction. It has proved hard to overcome the lack of business sponsorship and recognition of the importance of the SDM. There are trade-offs between short term cost and long term benefits, and interoperability may be in users’ interests but not necessarily in the interest of a vendor! Perhaps a data integration layer is the way forward—anyway, ‘you need to put in your own effort.’

Poché’s field is seismic data management. Shell’s SDM (built around Landmark’s CDS solution) has been implemented across several business units and makes for easier support. Cross discipline data integration is easier with the SDM. The issues are cost and complexity.

Foreman stated that BP has looked into the SDM, but ‘people want what they want—and may have many reasons why they want different stuff from a standard.’ Foreman’s unit is working on a common solution for wells, seismics and logs. But BP’s decentralized model meant that some potential early adopters had ‘rolled their own’ and opted out of the standard. Foreman’s team identified some serious deficiencies in the assets’ own solutions and eventually they came on board. The moral is that, left to their own devices, business units will do their own thing. But if we can show a better answer they will buy in.

Ryan noted that all data has entropy. Its natural state is chaos. If you leave data alone, it will ‘mutate, replicate and consume all available resources!’ Even the SDM is unstable. It is against nature and needs energy input to keep it in a stable condition. Exxon views data as a corporate asset. Oils no longer own drilling rigs. ‘We are knowledge workers who turn data into decisions.’ The bottom line is that businesses must own the data and pay the bill for keeping entropy at bay. Conversely, an SDM must support business processes.

Underwood agreed—good data drives better decisions. All are aware of the promise of the ‘digital oilfield.’ But are we keeping up with managing the data? Chevron experiences problems sharing and synchronizing data between disciplines. Requests for information can take days to fulfill. Data silos offer resistance to data flows. These problems have been ignored for too long. Noting previous failures, Underwood questioned whether the SDM was the right answer. The need to manage data remains but you need the right technology. Chevron’s thinking is that master data management is the way forward (see our report on Chevron’s Keystone MDM initiative on page 1 of this issue).

I thought these answers seemed to duck the issue a bit. The bona fide industry SDMs hardly got a mention. We haven’t heard from POSC/Energistics’ Epicentre SDM in almost a decade but we have frequently reported on putative deployments of PPDM. I formulated such musings into a question—who is using PPDM?

The answers were not as clear cut as I had hoped. Exxon is in the process of finishing a major upgrade of its own corporate data store. This has taken five years to tune to business workflows and will take another year to deploy, migrate, plug in data feeds and train. This has been a huge investment and is ‘not going to be swapped out in the near future.’ BP uses Schlumberger’s Finder in most business units, with a PPDM instance in Canada. There is no SDM in BP today although there is a growing recognition of the need to get to ‘a common place.’ Foreman noted that despite PPDM’s broad coverage, there will always be other systems on the edge of the SDM’s core.

Schlumberger consultant Steve Hawtin disagrees strongly with the whole SDM concept. According to Hawtin, Epicentre failed because each subject matter domain needs different things in its database. Even different instances of the same commercial database can be inconsistent. ‘There is no perfect platonic representation of a well!’ Hawtin does like systems of record and master data and wondered if there was any chance of seeing such in our lifetimes. Foreman thought that an E&P SDM was unlikely. Ryan and Underwood were less skeptical. IHS’ Richard Herman wondered if the whole SDM thing was ‘just too damn hard.’ But a database schema, hidden behind the firewall, is a ‘necessary evil.’ Data can then be delivered in any format over the web. This met with general approval from the panel. Flexible data access, round the world and round the clock, is the answer. How this is achieved just doesn’t seem to matter anymore.


New tools and workflows for the i-Field

Olivier Houzé (Kappa Engineering) aims to eliminate the ‘dumb’ bits of digital oilfield workflows.

Kappa Engineering MD Olivier Houzé presented a paper on ‘New tools and workflows for dynamic data processing in the i-field’ to the Association of French Oilfield Professionals (AFTP) this month. Oilfield dynamic data analysis involves some kind of disturbance to flow which produces pressure changes that can be compared with a model. Such techniques can be used for operational decisions and for real time forecasts. They are an essential component of today’s ‘intelligent’ fields. The technology toolset includes permanent gauges, production logs, temperature measurements and production data. Kappa is currently working with Total on a joint industry project to understand fiber optic temperature data which is ‘complex and poorly constrained.’ But it is expected that the huge data volumes streaming from instrumented wells will bring up something useful.

Production data itself is potentially a longer term, more representative measurement. But the new tools and data are creating their own problems as resources are stretched. Houzé estimates that engineers waste 50% of their time ‘fighting the data,’ a disgraceful situation given the ‘i-field’ buzz. On this topic Houzé is circumspect. The buzzwords (i-field, field of the future and the like) are a dangerous trend. We run the risk of re-inventing the wheel and revisiting the whole ‘buy or build’ debate of the 1980s. Deployment is further complicated by demarcation lines between engineering and IT and between the human brain and automation. Here Houzé is clear: we need to use all available techniques, plus data models, plus integrated workflows. But in the end it is human intelligence that needs to be leveraged in the decision making process.

For Houzé, automation has a role—in data preparation and analysis. For instance, the ‘dirty data’ of production monitoring is amenable to pressure transient analysis. It can be cleaned up with tools like Kappa’s Ecrin, deconvolved or wavelet-filtered, and compared with models to present engineers with actionable information. Deconvolution can also be used to combine two subsequent well tests into a valuable ‘pseudo long term test’ that provides an estimate of reservoir extent.

The i-field workflow today involves a background, automated data collection process, back allocation of production, history matching and ‘virtual metering.’ The model is used as a proxy of the reservoir, generating artificial production data that is compared with the cleaned-up real time field data. If the two diverge, alarms are sent to the engineers for in-depth analysis. The aim is to replace the ‘dumb’ bits of the workflow and supply engineers with high quality, actionable information.
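The divergence test at the heart of such a workflow can be as simple as a moving-window comparison of model and measured rates. A minimal sketch (ours, not Kappa’s; window length and tolerance are illustrative assumptions):

    # Toy divergence alarm: compare model-proxy rates with cleaned-up
    # field rates over a sliding window and flag sustained mismatch.
    def divergence_alarm(model_rates, field_rates, window=24, tol=0.10):
        """Return True if the mean relative error over the last
        'window' samples exceeds 'tol'."""
        m = model_rates[-window:]
        f = field_rates[-window:]
        errors = [abs(fi - mi) / mi for mi, fi in zip(m, f) if mi]
        return sum(errors) / len(errors) > tol

    # Hourly rates: a steady model proxy and a stream with a shortfall.
    model = [1000.0] * 48
    field = [1000.0] * 24 + [850.0] * 24
    if divergence_alarm(model, field):
        print("Divergence detected - route to engineer for analysis")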

Houzé was pressed in the Q&A on the way the i-field is sometimes ‘sold’ as a deskilling process—with IT and automation replacing knowledge workers. Houzé observed that today’s engineers are not necessarily poorly qualified, but they do have a very high workload. Hence the requirement for workflows that minimize data handling. More from infos@kappaeng.com.


Athens Group report finds software root cause of rig downtime

Drilling Control System software ‘invisible root cause’ of equipment failure and safety incidents.

Speaking to the International Association of Drilling Contractors this month, Athens Group CTO Don Shafer presented research into the root causes of drilling equipment failure and safety incidents. The conclusions of the Athens Group study are somewhat alarming for those involved in information technology, since the report’s main conclusion is that software is the ‘invisible root cause of equipment failures and safety incidents.’

The culprit is the drilling control system (DCS) that manages ‘75-95%’ of the equipment whose operation causes most injuries on the rig. The Non Productive Time benchmark survey* pinpoints the causes of such equipment failure and makes recommendations as to how software integrity can be improved, reducing DCS-related equipment failures and safety incidents.

Safety aside, NPT has an ‘unacceptable’ economic impact, with the annual cost of NPT on a high-spec offshore rig in the $100-150 million range, of which 20-30%, i.e. some $20-45 million per rig per year, is DCS related. Shafer recommends more education on the importance of quality software, more thorough testing of the DCS including commissioning by drilling software experts, industry-wide adoption of software quality standards and standardization of equipment interfaces. Customers for Athens Group’s Drilling Technology Assurance Knowledge Base include BP, ExxonMobil, Noble Drilling, ConocoPhillips, Pride, Shell and Transocean.

* www.oilit.com/links/0906_8.


Bull teams with ffA, new family of HPC blades rolls out

Deal targets data-driven seismic interpretation. Bull-X Extreme Computing announced.

Bull and Foster Findlay Associates (FFA) have teamed to port FFA’s 3D seismic analysis software to Bull’s GPU-based hybrid supercomputers. FFA’s HPC-tuned software and a ‘data driven’ approach to 3D seismic interpretation are claimed to provide a step change in geoscience interpretation productivity.

FFA MD Jonathan Henderson said, ‘Bull brings an innovative approach to systems design, incorporating computational power and scalability with advanced interactive visualization features. The challenge of extracting meaningful information from seismic data will be significantly advanced by using automated HPC techniques in real or near real time.’

Bull’s latest ‘extreme computing’ engine, the Bull-X family of blade supercomputers, was announced this month. Bull-X promises scalability from teraflop to multi-petaflop performance in a variety of configurations including Intel Xeon 5500 Nehalem processors and GPUs. Bull-X runs under a cluster controller developed by Bull using ‘open source and standard components.’ More from www.ffa.co.uk, www.oilit.com/links/0906_9 and pascale.bernier-bruna@bull.net.


Platte River, Exprodat team on GIS-based play fairway analysis

Deal heralds EU support for toolset of common risk segment and probability of success mapping.

Platte River Associates (PRA) has teamed with Exprodat Consulting to offer EU-based support for PRA’s GIS-based play fairway analysis tool, PetroAnalyst. Exprodat is to offer GIS support and training services to PRA’s European customers. PetroAnalyst is an ESRI ArcMap extension originally developed for Pemex. The package leverages ArcMap’s mapping and geoprocessing capability for analysis of both conventional and unconventional plays. The toolset provides common risk segment maps and probability of geological success evaluations. More from www.platte.com and www.exprodat.co.uk.


TerraSpark Geosciences to market 3D seismic Data Showcase

Cut-down edition of InsightEarth provides vendors with seismic data visualization for the laptop.

TerraSpark Geosciences is offering a cut-down edition of its InsightEarth 3D seismic data interpretation package to seismic data vendors and others who require a high end ‘read-only’ display of seismic data volumes. TerraSpark Data Showcase helps seismic processing companies and data vendors communicate the results of their imaging efforts to clients and prospective customers.

Data Showcase displays 3D volumes along with land grids, lease blocks and other cultural information on a laptop. Clients can scroll through data volumes, previewing data and selecting an area of interest for purchase. Multiple attribute volumes can be displayed side-by-side or co-rendered for comparison. Data is encrypted prior to installation on the PC and decrypted on the fly for display. More from www.terraspark.com.


NVIDIA, Supermicro launch Tesla server. Petrobras backs GPU

Petrobras has deployed 190 Tesla node GPU cluster. New server class claims 2 teraflop/1U rack.

Petrobras has endorsed seismic data processing on NVIDIA’s ‘Tesla’ graphics processing units (GPU) and has invested in a GPU-based cluster consisting of 190 NVIDIA Tesla nodes. Petrobras’ geophysical technology manager Neiva Zago said, ‘With our GPU cluster we are getting performance improvements of 5x to 20x over our traditional multi-core CPU-based cluster. We expect that the continued use of GPUs in our business will result in significant reduction in processing time as well as savings in power consumption and datacenter floor space.’ Petrobras expects scalable increases in GPU performance to continue as it hikes its datacenter capacity to over 400 teraflops.

In a separate announcement, NVIDIA has teamed with Supermicro to deliver a ‘new class’ of server combining Tesla GPUs with multi-core CPUs. A 12 fold performance hike over a quad-core CPU-based server is claimed. The SuperServer 6016T-GF was unveiled at Computex 2009 in Taiwan this month. The 1U rackmount unit offers a claimed 2-Teraflop capacity. More from www.nvidia.com.


Energistics WITSML Public Forum and Exhibition

BP and Total back well data standard. Wellstorm, Moblize, Perfomix innovations on show.

Speaking at the Energistics WITSML Public Forum and Exhibition* in Houston earlier this year, Energistics CEO Randy Clark bemoaned the fact that although WITSML is available to some 90% of high value, real-time, difficult drilling operations, ‘the value of WITSML is barely known to industry at large.’ For some companies though, such as StatoilHydro, WITSML is becoming ‘business as usual’ for real-time drilling operations. These companies are saying ‘no WITSML, no work for you!’

BP’s Julian Pickering presented an analysis suggesting an ROI of 200% for real-time standards and payback within a week! BP’s WITSML systems are tested for security in SAIC’s Aberdeen, UK facility. BP is also looking for ‘renaissance’ engineers who combine knowledge of their data with a desire to ‘use something besides a pivot table in Excel.’

Total described its Alwyn field, discovered in 1974, as a brownfield now entering the ‘craftsman’ phase. Alwyn is a test bed for new technologies including WITSML. One problem with new technology is that in a greenfield, ‘it’s too new’ and in a brownfield ‘it’s too late!’ More on Total’s Alwyn testbed in next month’s Oil IT Journal.

Wellstorm showed improvements to its automated rig state detection system and Moblize demonstrated wireless remote control of directional drilling. Geologix has defined a WITSML mudlog object containing information on lithology, formation tops and show data. Such orphan data types have been held back in the past by poor formatting and the fact that they are not usually available outside of emails. Perfomix’ ‘PetroSocial’ social networking/communities of practice toolset alerts management during geosteering projects. The analytical toolset comes with pre-configured cubes, exposed as web parts and portlets in SAP or SharePoint. Web 2.0 technology now replaces the morning meeting. Schlumberger reports that it now uploads 60,000 WITSML files per month to its Interact site and has 60 Operational Service Centers for internal QC of drilling and production data. Baker Hughes uses WITSML internally to manage job data and build up an electronic well file at job start-up. The goal is to reduce the number of interfaces between storage and operational data. Challenges include applications that don’t interface with WITSML servers, large data sets, and the need to build additional query functions to retrieve data by depth range.
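WITSML objects are XML documents, so even ‘orphan’ data types like the mudlog become machine-readable once standardized. A minimal sketch of reading lithology and show data from a mudlog-style document with Python’s standard library follows; the element names are simplified for illustration and are not the normative Energistics schema:

    import xml.etree.ElementTree as ET

    # Illustrative mudlog-style XML; real WITSML mudLog documents follow
    # the Energistics schema and are namespace-qualified.
    doc = """
    <mudLog well="Example-1">
      <geologyInterval mdTop="2100" mdBottom="2150" uom="m">
        <lithology>sandstone</lithology>
        <show>good oil show</show>
      </geologyInterval>
      <geologyInterval mdTop="2150" mdBottom="2200" uom="m">
        <lithology>shale</lithology>
        <show>none</show>
      </geologyInterval>
    </mudLog>
    """

    root = ET.fromstring(doc)
    for gi in root.findall("geologyInterval"):
        print(gi.get("mdTop"), gi.get("mdBottom"),
              gi.findtext("lithology"), "|", gi.findtext("show"))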

* www.oilit.com/links/0906_10.


Software, hardware short takes

Paradigm, Badleys, OpenSpirit, Helix Wind, C&C Reservoirs, Hampson-Russell, SMT, Encom, K-Reservoir ...

Paradigm announced Rock & Fluid Canvas 2009 (including StratEarth, a new well correlation and geologic interpretation tool) and Epos 4.0 this month. The new releases promise ‘concurrent, seamless access to data in an integrated, multi-disciplinary software environment.’ More from www.askparadigmhow.com.

~

Badleys has released Trap Tester V6.0 with new structural modeling and fault-seal analysis functionality. A new ‘CubeXplorer’ offers 3D seismic rendering and transparency with multiple volumes and interactive probes. More from www.badleys.co.uk.

~

OpenSpirit has announced the 8.4 release of its SMT/Kingdom data connector and LoadIT WITSML 2009 for real time rig to office collaboration. More from www.openspirit.com.

~

Helix Wind has teamed with CheckPoint Process Pumps and Systems on a turnkey, wind-powered 2-4.5 kW electricity generator for remote field locations. More from www.helixwind.com.

~

C&C Reservoirs’ reservoir performance analog knowledge system (REPAX) covers both mature and ‘rejuvenated’ fields along with geologic and engineering databases. More from www.ccreservoirs.com.

~

CGGVeritas’ Hampson-Russell unit has released a 4D seismic inversion algorithm that simultaneously utilizes all available 4D data. More from www.cggveritas.com/StratiSI4D.

~

Seismic Micro Technology (SMT) has repackaged its PC-based, ‘Kingdom’ seismic interpretation suite. The 8.4 release now includes an optional ‘product line extension,’ Kingdom Advanced with high-end functions including advanced autopicking, ‘illuminator’ technology and support for surface and volume curvature analysis. More from www.seismicmicro.com.

~

Encom has announced V11.0 of its Discover tool with new functionality for multiple LIDAR datasets, multi-gigabyte grids and other geoprocessing options. More from www.encom.com.

~

McLaren Software’s Enterprise Engineer is now certified for use with IBM’s FileNet P8 4.5 Enterprise Content Management Platform. New compatibility features have shown a 30% performance hike for commonly used actions. More from www.mclarensoftware.com.

~

The latest release of Fugro-Jason’s Geoscience Workbench includes new AVO analysis and full waveform synthetics. More from www.fugro-jason.com.

~

Knovel has introduced Knovel Math for PTC Mathcad with fully documented Mathcad worksheets for engineering calculations from Knovel’s suite of trusted reference works. The solution is available as a web service. More from www.knovel.com.

~

Knowledge Reservoir has announced the 2009 release of its deepwater Gulf of Mexico knowledge base, ReservoirKB. The upgrade includes a new GIS interface and usability enhancements. More on the 1000-well strong hosted subscription service from www.knowledgereservoir.com.

~

The new 3.1 release of Transpara’s Visual KPI operational intelligence package has been localized for the European market with support for European dates, times and number formats. More from www.transpara.com.


Tobin SuperBase adds web-based GIS to Swift’s IT stack

P2 Energy Solutions User Meet presentations cover data management and the parlous state of the API number.

Brad Kaufman, IT manager with Swift Energy, described how Tobin SuperBase and a data management infrastructure are helping Swift’s evolution towards asset teams, where geoscientists, engineers, landmen and support staff ‘commune’ and office together. Swift deploys a heterogeneous software lineup with elements from Landmark, IHS, SMT, ESRI, P2ES and Schlumberger. The company is mid-way through organizing its well data. Data management is an ‘ongoing’ process where ‘100% done is a myth!’ You need to decide what is ‘good enough’ to roll out to business decision makers.

A hierarchy of preferred location information is used, with Swift or other trusted sources at the top, then Tobin SuperBase, then IHS Enerdeq (Swift’s assets are located in Texas and Louisiana). In 2007, Swift’s CEO asked for online maps for competitor analysis and Swift’s own asset position. To assure live data, a link to Swift’s Enterprise Upstream (P2ES’ ERP solution) was developed. Microsoft Virtual Earth was selected as a front end, with Visual Fusion used to connect to ESRI SDE. VE’s close integration with SharePoint simplified unstructured and relational data access. Stonebridge Technology developed the solution, linking the VE-based front end to Enterprise Upstream, SDE and Infostat’s Rimbase wellsite information system. The project launched in May 2008 and was delivered in six months.
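This ‘preferred source’ hierarchy is easily expressed in code. A toy sketch (the source ordering follows the article, the logic is ours):

    # Resolve a well location from the most trusted available source.
    PRECEDENCE = ["swift", "tobin_superbase", "ihs_enerdeq"]

    def best_location(candidates):
        """candidates: dict of source name -> (lat, lon)."""
        for source in PRECEDENCE:
            if source in candidates:
                return source, candidates[source]
        return None, None

    print(best_location({"ihs_enerdeq": (29.70, -95.40),
                         "tobin_superbase": (29.71, -95.41)}))
    # -> ('tobin_superbase', (29.71, -95.41))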

A P2ES presentation* underscored the parlous state of US well data management. Although they may not realize it, all US operators will be negatively impacted by recent developments at the American Petroleum Institute. Most operators rely on the API’s universal well identifier (UWI), a.k.a. the API number, as their prime ‘integration vehicle.’ However, there is a lack of uniformity in API number assignments from different data providers and from state/federal agencies. The API standard was first published in 1966 and revised in the late seventies. Since then, no updates have been published and there is no provision in the current standard for horizontal drilling. Today, despite its importance to operators, the API is to discontinue support of its well number standard and is seeking a sponsor to take over. One problem facing current users of the API standard is the fact that, according to P2ES, 15-20% of wellbores were never attributed an API number. The status quo means that ‘geoscientists and engineers unknowingly gamble on a routine basis, assuming that the data they are using has been tied to the right wellbore.’ More from www.oilit.com/links/0906_15.
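The API number’s positional layout is what makes it attractive as an integration key. Here is a sketch of splitting the common 10, 12 and 14-digit forms; the field layout reflects our reading of the standard and should be checked against your data provider:

    def parse_api(api):
        """Split a 10-, 12- or 14-digit API well number into fields:
        state, county, unique well, sidetrack, event sequence."""
        digits = api.replace("-", "")
        if len(digits) not in (10, 12, 14) or not digits.isdigit():
            raise ValueError("unexpected API number: %r" % api)
        return {
            "state": digits[0:2],        # e.g. 42 = Texas
            "county": digits[2:5],
            "well": digits[5:10],
            "sidetrack": digits[10:12] or None,
            "event": digits[12:14] or None,
        }

    print(parse_api("42-501-20130-03-00"))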

* www.oilit.com/links/0906_7.


PNEC Data Integration Conference, Houston

Although some companies are at what one observer described as the ‘Alfred E. Neuman’ stage of data management (Excel and shared drives), most are making some headway. We report on data initiatives from Saudi Aramco, Highmount, Marathon and Shell, on Teradata’s brave new data world, and on new developments from Noah Consulting, Schlumberger, LogicalCat and Halliburton. We also hear two ‘boomers’ expound on ‘generational prejudice’ and how to entice Gen-Y into the E&P fold.

Phil Crouse’s PNEC Data Integration Conference got a respectable turnout of around 250 attendees from 22 countries. Not bad in the face of the recession and the swine flu epidemic. As we reported last month, PNEC saw the commercial launch of Petris’ WINDS OneTouch E&P ‘knowledge portal.’ Petris’ technology got a pretty good endorsement from Saudi Aramco’s Turki Al-Ghamdi, who described deployment in the context of lifecycle seismic data management. Aramco has analyzed the seismic workflow in terms of a long term asset cycle (LTAC) that embeds shorter term operational cycles from acquisition through processing and interpretation. The LTAC approach is improving communication between the ‘producer’ and ‘consumer’ cycles. LTAC includes a data governance process, reports, QC and an authoritative data store. Underpinning LTAC is Petris’ WindsEnterprise, described as ‘a plug-in, metadata-driven, vendor independent integration platform.’ The system complements Aramco’s Oracle and Documentum environments, avoiding ‘costly migration from legacy systems.’ The ‘data services solution’ (DSS), conceived by Aramco, deploys ‘smart’ business objects which allow new data types and workflows to be added on the fly. Al-Ghamdi concluded by recommending a division of labor whereby ‘the vendor handles the technology and the company handles the business.’

Tina Warner described Highmount E&P’s ‘Incentive’ data environment that couples an IHS PIDM database with data QC from Schlumberger’s Innerlogix unit. The corporate PIDM repository receives nightly updates ‘pushed’ through the firewall that respect in-house ‘preferred’ data. Innerlogix’ data QC tools automate quality data delivery to users, replacing a previously inefficient manual process. Innerlogix has turned the PIDM database into a ‘trusted system of record’ and provided business units with tools and a strategy for data management. Data flows from PIDM 2.5 to an OpenWorks master and on to Petra and OpenWorks projects. Warner noted some ‘friction points,’ such as inadvertently overwriting data that had already been ‘fixed.’ Highmount estimated the cost of a failure to implement at around $3 million a year. The project took around nine months to implement.

Petrosys’ Alec Kelingos noted that ‘data is worthless if you don’t know where it is located.’ The oil and gas industry lags in terms of coordinate reference system (CRS) management. V16 of Petrosys’ mapping package comes bundled with the European Petroleum Survey Group (EPSG) dataset’s ‘well known text’ renderings and unique IDs. All 200 Petrosys client sites have been spidered to identify CRSs in use. Kelingos recommends using a cross discipline team of geodesists, IT and users. Pitfalls abound—units of measure, round-off errors and novel, undefined CRSs. Petrosys’ CRS crawler provides an audit report. ‘Spatial needs a unique data management strategy.’
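Today the EPSG dataset keys each CRS to a unique ID and a ‘well known text’ definition. A short sketch, assuming the open source pyproj package (the article does not say what Petrosys uses internally):

    # EPSG-keyed CRS handling with pyproj (our choice of tool).
    from pyproj import CRS, Transformer

    wgs84 = CRS.from_epsg(4326)          # unique EPSG ID...
    print(wgs84.to_wkt()[:72], "...")    # ...and its 'well known text'

    # Units and round-off pitfalls surface in transformations like this:
    to_utm = Transformer.from_crs(4326, 32631, always_xy=True)
    print(to_utm.transform(2.2945, 48.8584))  # lon/lat to UTM 31N metres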

Two ‘boomers,’ Schlumberger’s Richard Johnston and Dwight Smith, did a good job of stepping into the shoes of ‘Gen-Y’ to report on ‘generational prejudices’ and a future world where collaboration at a distance is a given. Combining Twitter, YouTube and resources such as the Schlumberger Expert Directory might allow multi-million dollar prospects to be developed on Facebook from a cell phone! The ‘millennials’ will see a paradigm shift from our map and hierarchy-based displays to multi-dimensional displays, artificial intelligence agents, shared search and automatic translation.

Hector Romero presented Shell E&P Co.’s well log data environment, which builds on another Petris product, the Recall petrophysical lifecycle data management system. Header data comes from Shell’s corporate data store to a Recall staging database for edit and curve selection, and then on to OpenWorks, Techsia’s TechLog and the Recall master. Shell’s ‘EPICURE’ quality rules are applied en route. A variety of reformatting strategies allow log data to be browsed in Landmark’s PowerExplorer and a lot of work has gone into automating data loading. Some Shell units handle thousands of new curves per month. Shell is now working on tight integration between TechLog and Recall.

Niall O’Doherty gave another enthusiastic plug for Teradata’s visionary environment for seismic data. O’Doherty envisages the development of a ‘Google Earth for seismic data’ leveraging Teradata’s spatially registered data structure and a new logical data model. The benefit would be an environment that supports both analytics and data management, pushing the boundary such that users could ask any question at any time. O’Doherty advocates a ‘data refinery’ with a redefined demarcation of what is done in the database versus what is done in the application. Teradata’s poster child is eBay, which uses Teradata to process its 50TB/day incremental data flow. To get traction, Teradata has a lot of legacy systems and ideas to displace. O’Doherty notes that, ‘In the mind of an expert there are many established ideas. But in the mind of a beginner there may be a few new ones!’

Shari Bourgeois outlined how Marathon’s ‘Midas’ well master data solution was developed, leveraging HP’s ‘IQM’ data quality methodology. The Midas ‘golden’ database includes workflows for AFE, peer review, rig on site and well spud. Midas uses database triggers to initiate workflows of ‘mind-boggling’ complexity. Marathon had a ‘three day fist fight’ over thorny questions such as ‘what is a well?’ VP sponsorship was necessary to convince skeptical asset teams of the project’s value.

Noah Consulting’s Shannon Tassin proposed an enterprise architecture for business intelligence and real time information quality management. This addresses the problem of multiple data silos and real time data that is ‘stuck’ in the operational historian. Noah advocates a ‘rationalized metadata repository,’ a.k.a. the ‘glue that binds.’ A further recommendation is for federated MDM rather than a single centralized solution.

Rail Atay Kuliev (Saudi Aramco) observed that, with around 10 million bopd production, ‘a percentage point is a lot.’ Aramco’s production data was previously stored in multiple, isolated legacy systems. Correcting bad data was complex and no real time data was available. Now production data is served centrally from a web-based ‘multisystem’ for use by engineers and management. Real time data is now accessible from centralized production allocation and includes reliable on/off time information. A single field was used as a pilot in 2007, which boosted confidence in the system. Subsequently, the project was segregated into phases and areas. The same team was used throughout the life of the project, which helped with knowledge sharing.

Jess Kozman, formerly with Schlumberger and now heading up Carbon Lifecycle Technology Consulting, outlined the state of play of Schlumberger’s analysis of oil and gas companies’ data management ‘maturity.’ Until relatively recently, ‘mid tier’ US oil companies did not really have a data ‘issue.’ But they sure do now, with hundreds of PCs and complex data workflows. Analysis with Schlumberger’s data management maturity model shows some at the Level 1 stage, a.k.a. ‘Alfred E. Neuman data management!’ These likely deploy Excel and shared drives and have no best practices. Progress up the data maturity matrix is possible but is often interrupted when a small, lower maturity company is swallowed up by a larger, higher maturity acquirer. Schlumberger found that in the couple of years following a merger, from 20 to 70% of data management staff were ‘lost.’ Kozman noted that the next round of M&A will focus on PC-based companies, which will be forced into more structured data management environments. In the Q&A it emerged that Schlumberger has not yet convinced any majors to do enterprise-level surveys, so the data set is skewed away from the supermajors. Outside of oil and gas, the medical, military and intelligence sectors have more data and higher maturity.

Bryan Hughes’ LogicalCat software startup is engaged in ‘fighting entropy with vendor-neutral search.’ LogicalCat’s technology targets mid-size companies which may not have OpenWorks and likely only have PCs. They are unlikely to have a master database and may in fact have a couple of hundred de facto systems of record! Projects (SMT, Petra, GeoGraphix, SeisWare) make up the fundamental ‘document.’ As such they can, with the right technology, be indexed and ‘googled.’ Shapefiles are still the de facto GIS standard. Hughes likens the approach to the ‘e-discovery’ of the legal profession. The result is near real-time reporting that can be tied into tools like Spotfire for analytics.

Charlotte Burress described the migration of Halliburton’s Baroid drilling chemicals unit’s legacy knowledge management community of practice (CoP) to a Microsoft Office SharePoint Server (MOSS) environment. Each Baroid product has its own community where members can ask questions or learn on demand. Baroid’s ‘KM 1.0’ environment was built with Plumtree portlets. Baroid wanted to move away from this top-down designed, ‘busy’ portal of links, images and collaboration tools towards a ‘Web 2.0’ mentality à la Wikipedia, MySpace, LinkedIn or Flickr. This involved a two way conversion—of the site itself and of the users. The spec called for enhanced usability and the ability to find anything in less than two seconds. Burress emphasized that ‘content management is a process, not a software package.’ The development was done on a ‘vanilla’ MOSS, not the high end ‘Enterprise’ edition with more bells and whistles.

This article is an abstract from The Data Room’s Technology Watch from the 2009 PNEC. More information and samples from www.oilit.com/tech.


InnerLogix User Group

Data quality management case histories from Anadarko and Chevron show quantifiable benefits.

InnerLogix* is now part of Schlumberger, but it still gets to hold its own user meeting. This year, around 50 heard client presentations on data quality case histories.

Anadarko’s Marty Davis described data preparation for the upcoming OpenWorks R5000 migration. This includes legacy Kerr-McGee projects (yes, they still have those!) with over 1 million wells and 4 million tops. Davis noted that it only takes about two weeks for projects to degrade back to their previous quality level after six months of QC work. To counter this, users are now asking for enforced naming standards on horizon picks, after years of resisting the idea! The data quality management (DQM) project has reduced the number of projects by 43% and the number of wells by 31% (half by removing duplicates). Future plans include capturing value added data and adding it to the corporate PPDM database (PPDM was a recurring theme this year). Davis said his largest OpenWorks project was about 300,000 wells, and that SeisWorks would not launch with that many wells loaded.

Mike Underwood (Chevron) is still one of the industry’s biggest cheerleaders for InnerLogix. Chevron maintains its focus on G&G data quality with established rules at the corporate, business and asset levels. Quality standards are aligned with other concepts like establishing for each data type both a ‘system of record’ and a ‘first source.’ Quality data has led to reduced time spent moving A2D logs from the corporate data repository to OpenWorks.

Patti Bush (Anadarko) presented the new data tracking interface in Petris’ Recall log data management system. The new release has enhanced search, visualization and reporting. Real time notification of new data reception and synchronization with ESRI’s SDE have also been added. Anadarko adds around 3,000 new well curves a week. The old LAS batch process took four hours and has now been reduced to 30 minutes. Direct export to OpenWorks and Petra is also supported. Rapid search for information such as mud weight in a log header is now feasible, as is data export to (what else?) Excel.

During the panel session, one observer remarked that, ‘despite the still heavy focus of data quality on E&P, for many segments of an oil company, a well isn’t a well until it is in SAP.’

* www.oilit.com/links/0906_12.


Folks, facts, orgs ...

Aker, Argus, CapRock, Cedigaz, Celerant, Hampson-Russell, CygNet, Daratech, Energistics, Combustion Safety, Hess, P2ES, Industrial Defender, IPS, Lazard, National Oilwell, Object Reservoir, Schlumberger, Palantir, Quantapoint, Rival, Ryder Scott, SMT, Technip, Universal, Chevron.

Aker Solutions has named Ida Helliesen and Mikael Lilius as directors replacing Heidi Petersen who resigned recently.

Ron Mobed has left IHS to join Argus as a non-executive board director.

CapRock Communications has appointed Pal Jensen as president of its Maritime Division.

Thierry Rouaud has been appointed Secretary General of Cedigaz, an EU gas industry trade body.

Celerant Consulting has appointed Jan Erik Johansson as VP energy, including E&P, operations and refining. Johansson joins Celerant from Schlumberger.

CygNet Software has appointed Tom Ordes as director, pipeline market. Ordes comes from Telvent.

Daratech Plant has changed its name to ‘The Digital Plant Conference.’

Alex Trower from the University of Kansas is Energistics’ summer intern. He will update the website and work with the WITSML and ProdML SIGs.

Combustion Safety has announced a Twitter feed on safety and risk. Subscribe at twitter.com/combustsafety.

Michael Turner has been named Senior VP, production with Hess Corp. Turner joins Hess from Shell.

Diana Lovshe has joined P2 Energy Solutions as director of marketing. Lovshe was previously with Halliburton.

Industrial Defender’s new team includes Jerry Fudge, Regional Sales Manager, Frederick Lenihan, Senior Quality Assurance Analyst, and Carl Zemke, Security Specialist.

Invensys Process Systems has joined Intergraph’s SmartPlant Alliance.

Jack Lentz has joined Lazard as Chairman, International Oil & Gas. Lentz was previously head of Lehman Brothers’ energy practice.

National Oilwell Varco has named Hege Kverneland corporate VP and CTO, replacing Bob Bloom who is retiring.

Pashupati Sah is to head-up Calsep’s new location in Kuala Lumpur.

Object Reservoir has teamed with DeGolyer and MacNaughton on ‘practical prescriptions’ for exploration of unconventional resources such as the Haynesville and Marcellus Shales. Object Reservoir’s unconventional Collaborative Exploitation Projects start this year.

Schlumberger has announced ‘Ocean for Academia’ in collaboration with Rice, Stanford, Texas A&M and Unicamp (Brazil) universities. The project will develop new technology based on its proprietary Ocean/Petrel data platform.

Chris Gibson has joined Palantir Solutions from Merck & Co.

Quantapoint has appointed Paul Hackleman business development director for its Western Region. Hackleman was previously with P2 Energy Solutions.

Rival Technologies has announced that Brendon Billings is the first member of its TRU Oiltech pilot project advisory team.

Jeffrey Wilson has been named to the Ryder Scott board. The company has also hired Gabrielle Guerre as PE, from ExxonMobil.

SMT has appointed Mourad Ait-kaci, adding French and Arabic language support from its London office.

Thierry Parmentier has been named director, HR with Technip.

Bobby Jarrell is to head-up Universal Well Site Solutions’ new Tuscaloosa office.

Ram Sona and Heng Li have joined Chevron from USC’s CiSoft department.


Done deals

EMGS, Bridge, Cameron, Natco, Eagle, Energy Solutions, ECIL, ION, VSG, Weatherford, GTCR.

Electromagnetic Geoservices (EMGS) has spun out its Petrel EM plug-in to a new company, Bridge Software AS. Paul Hovdenak heads up the unit, which is developing the Bridge EM Data Integrator. Bridge is backed by Blueback Reservoir and Vestfonna Geophysical.

Cameron is to acquire oil and gas process specialist Natco in a paper-only deal that values the company at approximately $780 million. Natco’s 2,400 employees generated revenues of over $650 million in 2008.

Eagle Geophysical and its wholly-owned subsidiary, Eagle Geophysical Onshore, have filed voluntary petitions under Chapter 11 of the US bankruptcy code. The companies continue to operate as ‘debtors in possession.’

Energy Solutions International and Electronics Corporation of India Ltd. (ECIL) are teaming to promote deployment of PipelineManager to the Indian market. ECIL clients include BPCL and PCCKL.

ERF Wireless has completed its acquisition of the assets and operations of Frontier Internet LLC and iTexas.net, adding 1,800 customers and ‘new oil and gas revenue capabilities.’

ION Geophysical and WesternGeco are trading lawsuits in regard to WesternGeco’s Q-Marine system and ION’s patented technology. ION’s suit also alleges that WesternGeco ‘tortiously interfered’ with customer contracts and breached a confidentiality agreement. The suit follows an earlier claim against ION by WesternGeco alleging infringement of its own patents.

A management buy-out of the Visualization Sciences Group of Mercury Computer Systems (VSG) sees the unit’s return to independence. The MBO was supported by IRDI-ICSO Private Equity.

Weatherford is to acquire TNK-BP Oil Field Services in exchange for 24.3 million Weatherford shares. TNK-BP’s OFS business provides drilling and other well services in Russia. 2008 OFS revenues exceeded $650 million.

Private equity house GTCR and Ken MacDonald have announced the formation of ReSurge Ltd. ReSurge is to create a seismic data licensing company in Western Canada. GTCR plans to invest up to $150 million in the unit, which is headed up by MacDonald, a former Pulse Data CEO.


FreeWave uses heterogeneous radio networks for connectivity

Oil and gas specialist Jim Gardner advocates wireless-based plunger lift control.

Speaking at the annual Southwestern Petroleum Short Course held earlier this year at Texas Tech, FreeWave Technologies oil and gas guru Jim Gardner presented a paper on leveraging wireless and hybrid networks in applications like plunger lift optimization. FreeWave builds on recent efforts by automation manufacturers to optimize pumps, maximizing production and minimizing maintenance and energy costs.

For larger companies, buying a technology and network solution from a single vendor is a valid option. But for small and mid-size companies with a diverse installed base, this is not practical. Gardner advocates leveraging existing networks alongside ‘heterogeneous’ radio networks to provide remote connectivity. FreeWave’s license-free spread spectrum and licensed band radio and wireless data solutions can be used to build a single ‘cohesive’ network. FreeWave uses bi-directional Modbus-based systems. One use case showed a solar powered kit system recording casing and tubing pressure and plunger data, with two valve control channels. The system can be set up in a couple of hours at considerable savings over wired systems. Tank farms, flowlines and separators are other potential candidates for wireless. More from www.freewave.com.
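For the curious, a Modbus/TCP ‘read holding registers’ poll is simple enough to sketch with the Python standard library. Register addresses and meanings are invented for illustration; a production implementation would need robust framing and retries:

    import socket
    import struct

    def read_holding_registers(host, unit, address, count, port=502):
        """Modbus/TCP function 3, 'read holding registers'."""
        # MBAP header: transaction id, protocol id (0), length, unit id.
        request = struct.pack(">HHHB", 1, 0, 6, unit)
        request += struct.pack(">BHH", 3, address, count)  # the PDU
        with socket.create_connection((host, port), timeout=5) as s:
            s.sendall(request)
            header = s.recv(7)
            _, _, length, _ = struct.unpack(">HHHB", header)
            body = s.recv(length - 1)
        if body[0] != 3:
            raise IOError("Modbus exception response")
        nbytes = body[1]
        return struct.unpack(">%dH" % count, body[2:2 + nbytes])

    # e.g. casing and tubing pressure in two consecutive registers:
    # print(read_holding_registers("10.0.0.5", unit=1, address=100, count=2))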


Steve Robb on SCADA/Enterprise IT integration

Cygnet Software frees SCADA from ‘engineering-centric’ role, integrates IT infrastructure.

In a webcast this month, CygNet Software VP business development Steve Robb described how SCADA data can be made available throughout the enterprise. SCADA is no longer limited to its traditional role of delivering real-time data from process applications like oil and gas pipelines. SCADA is breaking out of its ‘engineering-centric’ role and is seeing traction as a component of an IT infrastructure blending engineering data with business information. This lets companies use third-party systems to enhance operational, maintenance and financial business decisions. Robb noted, ‘Integrating existing SCADA systems with the IT infrastructure has proved complex and expensive due to incompatibility of time-series and transactional systems. SCADA has failed to deliver real-time information throughout the enterprise. Today, the technology is there to close the loop. CygNet’s model-driven integration approach and enterprise service bus means that integration is easy and can deliver cost-effective, accurate real-time information to anyone who needs it.’ CygNet has just signed a partnership deal with ESRI to combine its ‘next generation’ SCADA capabilities with ESRI’s geographic information systems. CygNet clients include Apache, ConocoPhillips, Devon, El Paso, ExxonMobil and Shell. More from www.cygnetscada.com.


European API PIDX meet hears from Cap Gemini, StatoilHydro

Cap Gemini—master data is root cause of IT project failure. StatoilHydro’s materials data cleanup.

The EU meet* of the American Petroleum Institute’s Petroleum Industry Data Exchange (PIDX) was held earlier this year in Stavanger, Norway. Keynote speaker Jim Abery (CapGemini) explained that although the process of converting data to information looks simple, it is full of pitfalls—from incomplete data capture, through poor data cleansing en route, to a general acceptance of such limitations by stakeholders. Master data is no exception, with multiple legacy systems, inconsistent schemas and poor governance. All of which results in duplicates and inaccuracies and a ‘vicious circle’ of corrupt data with major costs, downtime and unnecessary inventory. Abery recommends a two stage fix combining a ‘retrospective’ intervention on legacy data and processes with forward planning and prioritization of future IM requirements. This includes design and implementation of a master data governance model, training and data cleansing. According to Abery, studies suggest that 80% of IT investment fails to meet expectations. Poor data, rather than IT, is increasingly seen as the root cause of such failure.

Bjorn Tesdal related StatoilHydro’s experience of synchronizing materials master data. This two-stage process involved legacy data clean-up and the implementation of SAP Master Data Management. The process produced some quick wins for StatoilHydro. Some 30% of 247,000 material masters were identified as obsolete in the clean-up process, resulting in estimated savings of NOK 185 million. Tesdal warned, ‘MDM is not magic. It requires hard work and cooperation between the central team and end users.’

* www.oilit.com/links/0906_11.


Industrial Defender signs OEM agreement with GE Energy

Deal to protect ‘Smart Grid’ from cyber attack.

GE Energy and Industrial Defender have signed a global OEM agreement whereby GE will offer Industrial Defender’s technology suite along with its Smart Grid automation technologies, including energy management systems, distribution management systems and substation automation solutions. The Industrial Defender technology suite is a value-added solution that helps utilities meet cyber security requirements and NERC CIP regulations. The alliance enables utilities to address cyber-security issues across the emerging ‘Smart Grid,’ enhancing the reliability, availability and efficiency of the power grid. More from www.industrialdefender.com.


Sales, contracts and deployments

Amec, Amphora, Tieto, Knowledge Reservoir, AspenTech, AVEVA, FMC, FuelQuest, Granherne ...

BP has awarded AMEC a contract for engineering services on its deepwater Gulf of Mexico Tubular Bells and Kodiak developments. AMEC is to evaluate development options prior to front-end engineering design. The deal is the first deepwater work to be executed under a global agreement with BP.

~

Salt Lake City-based Sinclair Oil has selected Amphora’s ‘Symphony’ Energy Trading and Risk Management platform for its crude oil and derivatives trading business. Symphony is a hydrocarbons trading and risk management system that operates across the transaction chain.

~

Tieto and Neste Oil have renewed their framework and service agreements. Tieto provides Neste with ICT services related to the internet, customer relationship management, order and delivery, logistics and corporate-wide knowledge management.

~

OMV has signed a frame agreement with Knowledge Reservoir for the provision of consultancy services and integrated geoscience studies across its international asset portfolio.

~

Microsoft has teamed with AspenTech to help process industry companies optimize their engineering, manufacturing and supply chain operations. The training modules will be a component of Microsoft’s Industry Solution University series and will leverage ‘Vista-compliant’ AspenOne process optimization software.

~

Dubai’s National Petroleum Construction Company has chosen AVEVA Plant for use on the Integrated Gas Development Project-Habshan Platform Offshore Facilities for ADMA. AVEVA PDMS provides multi-discipline capabilities across large projects, from front end design through to construction engineering.

~

TGS-NOPEC’s Geological Products and Services division is to make a complete set of well logs from the State of Ohio’s Department of Natural Resources available online.

~

Kosmos Energy has selected FMC’s ‘Subsea on Demand’ fast track development process for its Ghanaian ‘Jubilee’ deepwater development. The project includes horizontal subsea trees, production and injection manifolds and control systems. FMC also recently received a $60 million award from ENI for its Timor Sea Kitan field development.

~

Kansas-based Jones Oil has signed a five year deal for use of FuelQuest’s hosted Fuel Management System (FMS). FMS is to improve Jones’ fleet utilization and increase visibility into tank inventory and usage trends. FuelQuest also reports deployment of its ‘Zytax’ tax determination and compliance solution to refiner CountryMark.

~

StatoilHydro has chosen KBR’s Granherne unit to perform a ‘conceptual study’ as part of its ‘Gullfaks 2030’ project to extend the gas field’s life. The study builds on a previous Granherne evaluation of subsea wet gas compression and a tie-back of a satellite field to Gullfaks C.

~

UAE-based Petrofac has selected Intergraph’s SmartPlant engineering design solution for its portfolio of software tools used in the execution of its global design and engineering projects. The ‘substantial,’ multi-year agreement with Intergraph’s Process, Power and Marine division is to help Petrofac meet tight schedules and realize productivity gains.


Standards stuff

Safety Users Group and IEC, OSHA admonishes industry, PPDM—Business Rules OK?

The Safety Users Group, with representatives from Emerson, Shell and DuPont, has engaged with the International Electrotechnical Commission (IEC) to explore the maintenance of the IEC 61511 safety standard. Members shared practical experiences of the standard and its adoption in the chemical process, oil refining, tank storage and offshore industries. A 2007 Frost and Sullivan report, ‘World Safety Systems Markets for Process Industries,’ estimated the safety systems market at around $1 billion annually. However, despite the boom in programmable safety systems and growing awareness within industry, the IEC 61511 standard is still misinterpreted. The results of the group’s deliberations are available as a series of free online videos. More from www.safetyusersgroup.com.

~

The US Labor Department’s Occupational Safety and Health Administration (OSHA) has written to over 100 US-based oil refineries emphasizing the need to comply with all applicable OSHA standards, particularly the Process Safety Management of Highly Hazardous Chemicals standard. OSHA’s Refinery National Emphasis Program identified several compliance issues and OSHA is now ‘urging’ refiners to comply with their obligations under the process safety management (PSM) standard. The standard requires employers to develop and incorporate comprehensive, site-specific safety management systems to reduce the risks of fatal or catastrophic incidents. More from www.oilit.com/links/0906_13.

~

Speaking at the 2009 PNEC Data Integration Conference, PPDM CEO Trudy Curtis described the organization’s work on business rules and data quality. These topics are getting attention from oil and gas companies, which are increasingly aware of the business consequences of bad data. Curtis’ paper reflects a shift of emphasis for PPDM and its flagship database. For Curtis, the PPDM data model is really a collection of terms and rules ‘that just happens to be a database too.’ Focus today is increasingly on the business itself, notably via the PPDM Business Rules workgroup. PPDM’s ‘What is a well?’ workgroup is underway, also under the business rules umbrella. Curtis proposes to bring all rules together in a central location, based on the PPDM 3.8 data model and possibly leveraging the OMG’s Semantics of Business Vocabulary and Business Rules (SBVR) standard. More from www.oilit.com/links/0906_14.
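Expressed as code rather than prose, a business rule is just a named, testable predicate over the data. A toy sketch (ours, not PPDM’s or SBVR’s formalism):

    # 'Business rules as data': rule names and checks are invented for
    # illustration and do not reflect PPDM work products.
    RULES = [
        ("well_has_uwi", lambda w: bool(w.get("uwi"))),
        ("spud_before_completion",
         lambda w: not (w.get("spud") and w.get("completion"))
                   or w["spud"] <= w["completion"]),
    ]

    def audit(well):
        """Return the names of the rules a well record violates."""
        return [name for name, check in RULES if not check(well)]

    well = {"uwi": "42-501-20130", "spud": "2008-01-10",
            "completion": "2007-12-01"}
    print(audit(well))   # -> ['spud_before_completion']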


IFS launches EPC toolkit with ISO 15926 support

ERP package scales to ‘mega’ engineering projects, provides ‘eco-footprint’ analysis.

Swedish component-based enterprise resource planning (ERP) software supplier Industrial and Financial Systems (IFS) has announced a new solution for oil and gas contractors. IFS’ EPC* Toolkit, described as a ‘standardized, agile tool,’ provides control of projects of any size. The EPC Toolkit supports EPC operational processes including project management, engineering, material management and job costing.

The EPC Toolkit is designed to displace today’s ‘fragmented’ IT solutions, in-house development and ‘standard’ enterprise applications that lack EPC support. The Toolkit leverages IFS’ experience of Norway’s NORSOK standards and includes ISO 15926 in its portfolio of supported standards. The Toolkit includes an ‘eco-footprint’ manager to analyze and document projects’ environmental impact and suggest mitigation measures.

IFS Director of Oil and Gas Carl-Magnus Adamsson said, ‘With project management, document management, support for fabrication, installation and go-live, the EPC Toolkit gives large EPC contractors an enterprise solution that integrates with legacy solutions for financials and human resource management. Integrating the Toolkit is easy thanks to its open, service-oriented architecture.’ More from www.oilit.com/links/0906_3.

* Engineering, procurement and construction.


MatrikonOPC’s SNMP for OPC plugs IT into process control

New Agent technology brings hitherto ‘inaccessible’ digital assets onto network administrators’ radar.

MatrikonOPC has released a new Simple Network Management Protocol (SNMP) to OPC* bridge to link the worlds of information technology (IT) and process control. Matrikon’s SNMP Agent lets network administrators monitor and manage previously ‘inaccessible’ assets including digital control systems (DCS) and programmable logic controllers (PLC) alongside conventional IT assets.

Sean Leonard, Matrikon VP of OPC products, said, ‘Historically, the IT and process control worlds have been separate. We have been working to bring them closer together. The SNMP Agent gives our customers the power to manage IT and process assets in real time, wherever they are located.’ SNMP Agent includes a wizard-based configurator and can be downloaded from the Matrikon website at www.oilit.com/links/0906_4.

* OLE for Process Control.
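
For readers on the IT side of the divide, the following minimal sketch shows what polling an SNMP-exposed asset looks like in practice. It uses the open source pysnmp library rather than Matrikon’s own tooling, and the host address and OID are placeholders:

    # Requires the open source pysnmp package (pip install pysnmp).
    from pysnmp.hlapi import (SnmpEngine, CommunityData, UdpTransportTarget,
                              ContextData, ObjectType, ObjectIdentity, getCmd)

    def snmp_get(host, oid, community="public"):
        """Fetch a single value from an SNMP v2c agent."""
        err_indication, err_status, _, var_binds = next(getCmd(
            SnmpEngine(),
            CommunityData(community, mpModel=1),  # mpModel=1 selects v2c
            UdpTransportTarget((host, 161)),
            ContextData(),
            ObjectType(ObjectIdentity(oid))))
        if err_indication or err_status:
            raise RuntimeError(str(err_indication or err_status))
        return var_binds[0][1]

    # e.g. sysDescr.0 from a gateway exposing a PLC (placeholder address):
    # print(snmp_get("192.0.2.10", "1.3.6.1.2.1.1.1.0"))

A network management system would presumably issue similar requests against whatever OIDs the SNMP Agent maps onto process data.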


Weatherford teams with Oilennium on risk management training

Drilling contractor and e-learning provider team on online safety management system.

Weatherford International has announced the global deployment of a multilingual risk management training program developed for its Drilling Hazard Mitigation (DHM) unit by UK-based Oilennium. Weatherford DHM global QHSE manager Adrian Houlbrook said, ‘Oilennium’s online Learning Management System (LMS) provides significant benefits in terms of efficient delivery and cost reduction—eliminating the need for classroom presentations. LMS also lets supervisory staff monitor employee progress. Oilennium managed to provide us with this valuable resource for a modest investment—a critical consideration in the current economic environment.’

Oilennium MD Kevin Keable added, ‘The Risk Management program was a massive undertaking involving an important and complex subject. Our partnership with Weatherford has produced a benchmark training standard for the industry.’

Oilennium notes that although offshore work is hazardous, the industry is actually among the safest when compared with others, thanks to the health and safety standards that have been developed and implemented. Safety management systems build on this HSE standard set to maintain and monitor a safe working environment. The six-module system is available in English and Spanish. More from www.oilit.com/links/0906_5.


Implicit Monitoring radio service tariff tied to Henry Hub gas!

‘Intellisite’ first mile Ethernet solution offers ‘best path communications’ for remote assets.

Dallas-based asset performance monitoring specialist Implicit Monitoring has announced a novel pricing scheme for its radio monitoring service. In response to lower natural gas prices and the need for oil and gas producers to reduce operating costs, Implicit is offering an expanded EFM* service whose monthly price is tied to the NYMEX Henry Hub natural gas spot price.
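
Implicit has not published its indexing formula. As a purely hypothetical illustration, a gas-indexed tariff with a floor and a cap might be computed as follows—base fee, index factor and limits are invented for the example:

    def monthly_fee(henry_hub_spot, base=100.0, factor=20.0,
                    floor=50.0, cap=250.0):
        """Monthly fee in dollars, indexed to the spot price in $/MMBtu.
        All four parameters are invented for this illustration."""
        fee = base + factor * henry_hub_spot
        return max(floor, min(cap, fee))

    print(monthly_fee(3.80))  # -> 176.0

The floor and cap matter commercially: they keep the vendor whole in a price collapse and keep the producer’s bill bounded in a spike.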

Implicit’s radio-based monitoring solution uses an enhanced polling engine to acquire and display data on Implicit’s flagship ‘Intellisite’ reporting platform or other industry applications. The radio service includes data acquisition, poll on demand capabilities and support for the API 21.1 data collection standard.

Implicit president Scooter Beachum said, ‘By relating the service cost to the price of natural gas, Implicit is continuing to help our customers manage operating expenses. By utilizing existing radio infrastructure, we can deliver cost effective solutions below typical market rates. Our ‘Best Path Communication’ service and application solutions leverage satellite, cellular or radio-based monitoring and the largest field service network in the industry.’ Implicit also offers an asset tracking service with ‘geofencing’ capabilities from an easy to use web interface. More from www.oilit.com/links/0906_6.

* Ethernet in the first mile.


IT/comms onboard Halliburton’s Stim Star Angola newbuild

Remote collaboration and software portfolio supports high-end frac, sand control for deepwater.

Halliburton has kindly supplied Oil IT Journal with more information on the IT and communications infrastructure that kits out its new Stim Star Angola vessel, launched earlier this year. Stim Star Angola will provide Angolan operators with advanced stimulation technology such as acidizing, fracturing, sand control and conformance solutions for the deepwater market. Frac jobs can be monitored remotely from Halliburton’s Real Time Operation Centers (RTOC) or at a client’s site over a secure Insite Anywhere link. Real time data on pressures, slurry rates and proppant concentration is streamed during treatment into Halliburton and third party frac analysis applications such as FracProPT, MFrac, StimPlan and Gohfer. Real time data transfer is enabled via the Insite Anywhere RT Data Exporter. Engineers in the onboard Tech Command Center collaborate with more senior technical professionals and customer representatives at the RTOC.
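
As a rough illustration of what consuming such a treatment data stream involves, the sketch below parses a wholly invented record format and flags a pressure excursion—the actual Insite Anywhere export format is not public:

    import csv, io

    # Wholly invented record format: elapsed time, treating pressure,
    # slurry rate and proppant concentration (ppa = lb proppant added/gal).
    SAMPLE = "\n".join([
        "elapsed_s,pressure_psi,slurry_rate_bpm,prop_conc_ppa",
        "0,5200,0.0,0.0",
        "60,7450,35.2,0.5",
        "120,7600,38.0,1.2",
    ])

    def read_stream(stream):
        """Yield each record with all fields converted to float."""
        for row in csv.DictReader(stream):
            yield {k: float(v) for k, v in row.items()}

    for rec in read_stream(io.StringIO(SAMPLE)):
        if rec["pressure_psi"] > 7500:  # naive alarm threshold
            print(f"High pressure at t={rec['elapsed_s']:.0f}s: "
                  f"{rec['pressure_psi']:.0f} psi")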

The centers enable experts to work concurrently on wells in different parts of the world, minimize HSE exposure by reducing the number of people on board, and speed decision making. They also provide an ideal environment for experienced staff to train and mentor the new generation of Halliburton engineers. More from www.halliburton.com.


Enigma announces ‘InService’ hosted oil country parts catalog

Electronic catalog of oil and gas equipment to facilitate part and service information delivery.

Burlington, Mass.-based aftermarket service specialist Enigma Inc. has released an electronic parts catalog for oil and gas equipment manufacturers. The Enigma InService catalog targets manufacturers and their customer base of distributors, operators and field service engineers. InService automates the publishing and delivery of parts and service information, ensuring accuracy and simplifying part identification and procurement. The tool is claimed to help original equipment manufacturers (OEM) increase aftermarket share and profitability.

InService offers access control and multilingual support, allowing for secure collaboration between OEMs, distributors, operators and field service engineers. Content and application functionality can be delivered separately based on a user ID/log-on. InService integrates with back-end ERP/EAM systems, order processing, diagnostic, warranty and parts inventory systems.
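
A minimal sketch of what per-log-on content delivery might look like—the roles and catalog entries below are invented for illustration, not drawn from InService itself:

    # Same catalog, filtered by role at delivery time.
    CATALOG = [
        {"part": "P-1001", "doc": "exploded view",
         "roles": {"oem", "distributor", "field"}},
        {"part": "P-1001", "doc": "warranty terms",
         "roles": {"oem", "distributor"}},
        {"part": "P-1001", "doc": "cost sheet",
         "roles": {"oem"}},
    ]

    def visible_docs(role):
        """Catalog entries a given log-on role is allowed to see."""
        return [e["doc"] for e in CATALOG if role in e["roles"]]

    print(visible_docs("field"))        # -> ['exploded view']
    print(visible_docs("distributor"))  # -> ['exploded view', 'warranty terms']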

Enigma president Jonathan Yaron said, ‘Aftermarket customer satisfaction drives an OEM’s market share. InService’s user-friendly environment streamlines part identification and ordering for service technicians and parts managers.’ More from www.oilit.com/links/0906_1.


Idea Integration brings upstream data into SharePoint

OpenSpirit leveraged to extend ‘Constellation’ data integration footprint.

Stafford, TX-based IT consultancy Idea Integration is extending its ‘Constellation’ geographic information system (GIS), leveraging its experience of GIS/Microsoft SharePoint integration in the upstream. The company has licensed the OpenSpirit software developer kit to extend Constellation’s data footprint to OpenSpirit-enabled geoscience applications and to provide a ‘unified view and dashboard of upstream assets.’

The Constellation framework, a software and services offering around ArcGIS Server and SharePoint, provides spatial search, charting, reporting and integration via an ‘industry-standard’ dashboard. The framework can be configured to access seismic, well, land, pipeline and other data sources.
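
To give a feel for what configuring such a framework against multiple data sources might involve, here is a hypothetical sketch—class names and connection strings are invented, and the real framework is configured through ArcGIS Server and SharePoint rather than coded:

    from dataclasses import dataclass

    @dataclass
    class DataSource:
        name: str        # display name in the dashboard
        kind: str        # 'well', 'seismic', 'land', 'pipeline', ...
        connection: str  # invented connection strings below

    SOURCES = [
        DataSource("Wells", "well", "openspirit://prod/openworks"),
        DataSource("3D surveys", "seismic", "openspirit://prod/seisworks"),
        DataSource("Leases", "land",
                   "https://gis.example.com/arcgis/rest/services/land"),
    ]

    def sources_for(kind):
        """Sources feeding a given dashboard panel."""
        return [s for s in SOURCES if s.kind == kind]

    print([s.name for s in sources_for("well")])  # -> ['Wells']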

Idea Integration Senior VP Adarsh Karia said, ‘The OpenSpirit data store connectivity layer lets us focus on clients’ portal solutions and gives them fast, vendor-neutral integration and access to their subsurface data.’ Idea Integration’s upstream clientele includes Shell E&P Co. Speaking at the 2007 ESRI PUG*, Wnjoo Choi described how a prototype of Constellation was used to get a handle on Shell’s global GIS deployment, a ‘humungous data management infrastructure.’ More from www.idea.com.

* The Data Room TW 0703_16.


New workstation runs GeoFrame and Petrel simultaneously

HP Z800, Intel Xeon 5500, NVIDIA SLI and Parallels’ virtualization cross Windows/Linux divide.

Schlumberger was showing off some new technology at the European Association of Geoscientists and Engineers (EAGE) meeting this month (a full report is to appear in next month’s Oil IT Journal). The new solution provides full bandwidth graphics on a single machine running both Linux and Microsoft Windows. Thus a user of both GeoFrame and Petrel can have windows open on the two applications at the same time.

The technology stack begins with Intel’s new Virtualization Technology for Directed I/O (VT-d), a feature of the Intel Xeon 5500 CPU. The demo machine was the new HP Z800 workstation, which pairs the new Intel chip with NVIDIA SLI Multi-OS-enabled Quadro FX graphics cards. The enabling virtualization layer is Parallels Workstation 4.0 Extreme, which allows for full resolution graphics in both environments. Schlumberger’s Russ Sagert said, ‘Our engineers were blown away by the performance of this system, even under extreme workloads that stressed every aspect of the system.’
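
For the curious, here is a minimal sketch (Linux only) of checking whether a machine exposes the hardware features the demo stack relies on—VT-x via the ‘vmx’ CPU flag and VT-d via the ACPI DMAR table:

    import os

    def has_vtx():
        """True if the CPU advertises Intel VT-x (the 'vmx' flag)."""
        with open("/proc/cpuinfo") as f:
            return any("vmx" in line for line in f if line.startswith("flags"))

    def has_vtd():
        """True if the firmware exposes an ACPI DMAR table (VT-d)."""
        return os.path.exists("/sys/firmware/acpi/tables/DMAR")

    print("VT-x:", has_vtx(), "VT-d:", has_vtd())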

Curiously, the EAGE demo, spectacular as it was, also served to underline interoperability issues at the Linux/Windows divide. Data exchange between the two applications is by file transfer! Things might get more interesting with OpenSpirit running on the Z800.

More from www.oilit.com/links/0906_2.


© 1996-2021 The Data Room SARL. All rights reserved.