October 2008


i-Connect therefore IAM

Halliburton, Scandpower and Yokogawa are prime software providers to Chevron’s Agbami ‘i-Field’ integrated asset management program (IAM). i-Connect middleware helps tame the data tsunami.

Speaking in the ‘Digital Collaboration’ session of the Society of Petroleum Engineers’ Annual Technical Conference and Exhibition*, Tope Adeyemi acknowledged the contributions of Halliburton, Scandpower and Yokogawa to Chevron’s Agbami ‘digital oilfield’ development. The Nigerian supergiant, located 200km offshore Port Harcourt, has recoverable reserves of around a billion barrels produced from a floating production, storage and offloading (FPSO) facility with two million barrels of storage.

The last year has seen the development of a ‘relevant time’ asset management strategy for the flagship ‘i-Field’ development with its four stacked reservoirs of ‘uncertain’ connectivity. One i-Field application is to mitigate early gas breakthrough by monitoring fluid gradients from multiple zones and across the flanks of the main thrust fault.

Relevant time asset management uses digital oilfield concepts and ‘intelligent’ well completion to optimize recovery. Downhole control valves and instrumentation have created a data ‘tsunami’ with some 200,000 data tags sampled every few seconds. Chevron is taming the tsunami with the phased deployment of an asset decision environment (ADE) for interdisciplinary collaboration between office and field.

Data from applications including Halliburton’s AssetObserver, Yokogawa’s historian and Scandpower’s Topaz is channeled through Chevron’s i-Connect middleware for ‘integrated, multi-disciplinary work processes.’ The i-Connect data integration layer also provides access to maintenance, repair and operations (MRO) data, enterprise resource planning (ERP) and other systems of record. Initially the project provides pressure and flow monitoring, allowing for field management by exception.

Adeyemi noted significant safety gains from the i-Field with fewer people on board the FPSO. In the near term this advantage is to be pushed home with increasing automation. The plan is to control the field from the office with a further head count reduction—mirroring a similar drive to automation and remote operations in the hurricane-prone Gulf of Mexico. IT infrastructure has proved critical to the project’s success and significant effort went into assuring the bandwidth required for production optimization. A 95% uptime was reported at first oil. Data redundancy is assured with a mirrored infrastructure onshore. Future phases of the project will increasingly integrate information stored in Chevron’s AvevaNet database of facilities information—an example of a successful handover of data and information from construction to operations (OITJ, July 2007). More from the 2008 SPE ATCE on pages 6&7 of this issue.

* SPE paper number 11536.


Wellcore for P2ES

Petroleum Place Energy Solutions has acquired the Wellcore software assets from Decision Dynamics. The package is to integrate with P2ES’ land offering.

Following its recapitalization earlier this year by San Francisco-based VC fund Vista Equity Partners, P2 Energy Solutions (P2ES) is now in acquisition mode with the purchase of the Wellcore application and Software Platform from Calgary-based Decision Dynamics (formerly Malibu Engineering and Software).

Wellcore is a well life cycle information management solution spanning G&G, drilling, completion and production operations. Wellcore captures information in a central database for analysis and interdepartmental workflows. P2ES has assumed ownership of all assets (including intellectual property) and personnel associated with Wellcore development, marketing, sales and support. The product is already widely used within P2ES’ client-base.

P2ES VP Engineering Solutions, Gerry Conroy, said, ‘The incorporation of Wellcore brings integration opportunities spanning land, production and sales volumes and operating and financial data.’ Wellcore users include Husky, Petro-Canada, Shell, Devon, ExxonMobil, EnCana, Talisman, heavy oil specialists Petrovera Resources and ‘virtual’ E&P shop—Stylus Exploration.


On gas guzzling, CO2, horsepower and ‘green’ ...

Oil IT Journal editor Neil McNaughton allows himself an ‘off topic’ editorial, wondering how come with all the data available in car ads, folks don’t really understand the basic physics of getting energy out of gas and turning it into horsepower. How can a 200HP BMW be ‘green’?

The SPE always entreats its members to proselytize in favor of the industry—you can even download an educational kit from www.energy4me.org. Personally, I find it hard to slip the topic into the conversation, let alone set up a PowerPoint show. I’m not sure whether what follows is aligned with the SPE’s general drift, nor even whether it is really what people want to hear during the dinner party—which is mostly ‘is oil really running out?’ and ‘where is the oil price going?’—questions that I will refrain from answering right now.

My proselytizing this month is only tangentially about the industry and not at all about information technology so I apologize in advance for this abuse of my editorial position. But when I have a bee in my bonnet I just have to let rip.

A casual remark was at the origin of this investigation. A Houston-based colleague questioned the extent to which driving faster increased fuel consumption. I was surprised by this because I ride a bike and I can assure you that it takes a lot more effort to ride at 30 than at 20 mph! There is probably a formula with an exponent or so relating speed to consumption, but that is not quite my concern here. What interests me is how, in the face of the amazing amount of information available in car ads and reviews, there can be such a lack of understanding of the basics.

The matter is not helped by a degree of dissimulation by motor manufacturers worldwide on the subject of fuel economy and its corollary, CO2 emissions. Which is my first point. It may have escaped you—it seems to have escaped just about every advertising copywriter, regulator and ecologist in the world—but the carbon in the ‘CO2’ can only come from one place: the carbon in the hydro-carbon, gasoline or diesel, that goes into your tank. In other words, fuel consumption (in gallons per mile or liters per 100km) is equal to a constant times the CO2 produced!

But it gets better. Where does the much vaunted horsepower generated by the engine come from? Unless your car has pedals, it can only come from the same place as the CO2—from the calorific value of the fuel consumed. In other words, the engine’s horsepower at a particular regime is equal to a constant times the fuel used.

To summarize the foregoing, we have the following equation which modesty precludes me from calling ‘McNaughton’s Law*,’ although you are free to do so if you wish.

Horsepower = a × Fuel = b × CO2

Now you have to ask why car ads quote three figures—fuel efficiency, horsepower and CO2—when one would suffice. Could it be marketing, whereby vendors of performance cars present their vehicles as both powerful and fuel efficient—essentially a contradiction? One such ad that caught my eye is for the BMW 123d, described as ‘the first car with more than 200 HP and under 140 g/km CO2.’ How does BMW square the circle of low CO2 and heaps of horsepower? Is it because the BMW 123d has achieved a breakthrough in efficiency? I suspect not. The answer lies more in what could be complicity, or stupidity, on the part of the regulator—probably both. To arrive at three numbers—power, fuel and CO2—that are intrinsically related, the regulator allows manufacturers to use different tests. It is obvious that the horsepower number is achieved with the accelerator flat on the floor. The fuel efficiency and CO2 numbers are derived from three standardized driving tests (urban, extra urban and motorway). Where’s the catch? By standardizing on a series of programmed driving cycles, no attempt is made to use the car as intended—i.e. benefiting from its ‘sporty’ nature.

This allows both manufacturers and regulators to appear to be ‘green.’ Manufacturers can target fuel efficiency at specific engine regimes and leave the playing field wide open for the same old gas guzzling technology that we have gotten so used to that we no longer notice the absurdity of 200 HP and a 70 mph speed limit—or whatever it is in your jurisdiction!

While the relationship between CO2 and liters or gallons of fuel consumed is easy and unequivocal, tying in measurements of power—HP or kW—is harder. This is because, on the one hand, the petrol engine is rather inefficient—producing a lot of heat—so a lot of the calorific horsepower is wasted. On the other hand, manufacturers measure horsepower at an intermediate point in the engine—somewhere around the clutch I believe—although every boy racer wants his HP at the wheels. According to an article in the current issue of MIT Technology Review, some 74% of the energy available from the calories in the gasoline is wasted as heat, leaving 26% for measurement on the test bed. If, as is the case for the 123d, the test bed horsepower is around 200, this implies that there were around 770 ‘HP’ going in. From this number and the calorific value of the fuel, we can figure fuel consumption at max power. Fortunately, BMW also provides us with the vehicle’s top speed—which one presumes is achieved by pressing the accelerator to the floor and working the engine at its maximum power. Going back from top speed through max power (losing 74% of it on the way), we can figure how many oil calories were burned in the process and, from there, the fuel burned and CO2 produced. My calculations are available on www.oilit.com/links/0810_1. It turns out that CO2 emissions at full power are around 600 g/km, equal to around 25 liters per 100km or, if you like, 9 miles per US gallon. Just as a sanity check, I Wikipedia’d for data on some real gas guzzlers, Formula 1 cars, which burn around 75 liters per 100km or 3 mpg US.
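For the curious, the whole chain of reasoning fits in a dozen lines of Python. This is a back-of-envelope sketch with round-number constants of our own choosing (calorific value, CO2 per liter, efficiency and an assumed top speed)—not the spreadsheet on the website—but it lands in the same ballpark as the figures above.

```python
# Back-of-envelope check on the full-power arithmetic. All constants are
# round-number assumptions, not BMW's or MIT's figures.
HP_TO_KW = 0.7457            # one horsepower in kilowatts
DIESEL_MJ_PER_L = 36.0       # assumed calorific value of diesel, MJ per liter
DIESEL_CO2_G_PER_L = 2640.0  # assumed grams of CO2 from burning one liter
EFFICIENCY = 0.26            # fraction of fuel energy reaching the test bed
TOP_SPEED_KMH = 238          # assumed top speed of the 123d

shaft_kw = 200 * HP_TO_KW               # ~149 kW at the clutch
fuel_kw = shaft_kw / EFFICIENCY         # ~574 kW of fuel energy going in
liters_per_hour = fuel_kw * 3600 / (DIESEL_MJ_PER_L * 1000)  # kJ/h over kJ/l
l_per_100km = liters_per_hour / TOP_SPEED_KMH * 100
co2_g_per_km = l_per_100km / 100 * DIESEL_CO2_G_PER_L
print(f"{l_per_100km:.0f} l/100km, {co2_g_per_km:.0f} g CO2/km at full power")
```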

All this profusion and confusion of numbers has served one purpose—to ‘educate’ the consumer, especially in the US, into the belief in a free lunch where ‘green’ and ‘power’ can happily co-exist. Motors run ‘green’ in the test and then the twin turbos kick in for lift off! It would be a good idea if we could standardize on a more telling metric. How about making manufacturers quote fuel consumption (and CO2) at maximum power?

* Not to be confused with the ‘McNaughton Rules’—a good Google if you want a laugh at the expense of my clan.


Microsoft HPC Server, Linux or both?

Microsoft rolls out Windows HPC Server 2008. Cluster Resources announces ‘Moab’ Linux hybrid.

Microsoft was showing off its Windows High Performance Cluster Server (HPCS) 2008 at the Society of Petroleum Engineers ATCE last month with the promise of ‘more accessible HPC capabilities for computer-aided engineering (CAE) in the upstream.’ Microsoft partner JOA Oil and Gas was running its ‘Jewel Suite’ field development planning application on a Windows HPCS machine from Nor-Tech. Nor-Tech’s ‘Portable Cluster’ systems run up to 128 compute cores on two 120VAC 20-amp circuits, using AMD’s new low-power Quad-Core Opteron HE processors. The clusters can optionally be delivered as fully sealed units for use in harsh environments, using SprayCool’s liquid cooling technology.

Nor-Tech’s VP engineering, Dominic Daninger told Oil IT Journal, ‘Interest in portable supercomputing is being driven by the availability of multi-core CPU technology that allows high compute density from standard electrical outlets. Portable/work group clusters are much more economical than their big brothers and cost of ownership is lower.’

We asked Daninger how Windows HPC Server was faring against the ‘incumbent’ HPC operating system, Linux. Daninger explained, ‘In the past we found that Linux usually beat the previous version of Microsoft’s HPC operating system (CCS 2003). With the introduction of Windows HPCS 2008, Microsoft has significantly improved node-to-node networking latency by using remote direct memory access. On the same cluster we have seen benchmarks improve 25 to 30% between CCS 2003 and HPC Server 2008. But many other factors come into play when comparing Linux and Windows, such as how math libraries have been optimized. These can make direct comparison challenging.’

Visual Studio 2008 now provides a ‘comprehensive’ parallel programming environment with support for OpenMP, the message passing interface (MPI) and web services. HPCS supports third-party numerical library providers, performance optimizers, compilers and debugging toolkits.
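By way of illustration—and not Microsoft sample code—the message-passing style such clusters run looks like this. We use the mpi4py Python bindings purely for brevity; the partial-sum example is our own.

```python
# Minimal message-passing sketch using the mpi4py bindings; run with
# e.g. "mpiexec -n 4 python partial_sum.py".
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()   # this process' index within the job
size = comm.Get_size()   # total number of processes launched

# Each rank computes a partial sum; rank 0 gathers and reports the total.
partial = sum(range(rank * 1000, (rank + 1) * 1000))
total = comm.reduce(partial, op=MPI.SUM, root=0)
if rank == 0:
    print(f"sum across {size} ranks: {total}")
```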

Microsoft announced a new partnership with supercomputer leader Cray to introduce a new compact supercomputer, the Cray CX1 with a starting price of $25,000. Windows HPCS 2008 itself costs $475 per node.

In Houston earlier this month, HP and Cluster Resources were showing off an alternative, ‘hybrid’ approach to HPC for the oil and gas vertical. The ‘Moab’ hybrid cluster dynamically switches cluster servers between Linux and Windows based on workload, defined policies and application needs. Moab embeds virtualization technology from PCPC Direct and now runs Windows HPCS 2008. The hybrid cluster on show was built around an HP Cluster Platform Workgroup System (CPWS). HP product manager Ed Turkel said, ‘Customers operating high-performance clusters need a flexible infrastructure. The HP BladeSystem ‘c-Class’ coupled with Cluster Resources’ Moab solutions enables customers to maximize HPC investments by eliminating the complexities associated with analyzing large volumes of scientific data.’


BGS GSI3D 2nd International Meeting

British Geological Survey’s package used to produce BG Libya’s surface maps and well prognoses.

The second international GSI3D meeting was held at the British Geological Survey’s (BGS) offices in Keyworth with around 100 in attendance. The Geological Surveying and Investigation in 3D (GSI3D) package is used by the BGS and others to make not just geological maps, but a 3D geology database. GSI3D is a component of a much larger BGS initiative, ‘DigMapGB,’ to move from traditional maps to the database. GSI3D aims to make geological information understandable to users with flexible 3D viewing. The tool also offers geologists a ‘traditional’ approach to capturing field data from boreholes, cross sections and other observations. BGS project manager Holger Kessler claims a ‘paradigm shift’ in the way the geological survey communicates with the population at large. Kessler sees ‘Lithoframes,’ 3D models of the subsurface, as the ‘natural successors to geological maps.’

Hans Georg Sobisch of Insight Köln originally developed GSI3D for the Lower Saxony Geological Survey. BGS bought a license in 2001 via its £5 million, 3 year Digital Geoscience Spatial Model (DGSM) R&D program. GSI3D blends traditional geological mapping techniques with digital terrain models and controlled lists of formation names. The package presents survey geologists with familiar workflows and tools for interactive cross section drawing, fence diagrams, envelope (coverage) construction and mapping.

Andrew Newell (BGS) presented a project performed under contract to BG Libya for the generation of a GSI3D model in support of oil exploration. The model of the South Sirte basin was initially conceived to source groundwater for BG’s drilling program. Newell noted that ‘all ingredients for an overseas 3D model can be rapidly gathered over the internet,’ along with paper data sources. The digital terrain model came from the free global coverage of NASA’s SRTM InSAR mission. Previously such work would have involved much complex downloading and data re-formatting. Today, a King’s College London add-in for Google Earth lets you pull up SRTM data in ARC, ASCII or GEOTIFF format—providing ‘high quality data for free.’ Base maps in the form of Landsat ETM tiles are a one-click download from https://zulu.ssc.nasa.gov/mrsid and borehole data came from Libya’s ‘Great Man Made River’ project. Deliverables included well geology prognoses and contour maps of the main horizons. The model, which took only six days to complete, is now also used for exploration well prognoses.

The GSI3D meet finished with a demo of ‘GeoVisionary’ a 3D stereoscopic visualization package developed by Virtalis to browse BGS’ multi-terabyte data set. More from www.wikipedia.org/wiki/GSI3D. The full text of The Data Room’s Technology Watch report from the GSI3D conference is also available on www.oilit.com/tech.


Upstream information management survey announced

Venture Information Management invites participants in benchmark survey of business performance.

According to UK-based Venture Information Management, recent studies in other verticals have shown that top performing companies have a strong information management culture. Benchmarking tools such as ‘information orientation’ can now be used to measure information value against business performance. However oil and gas companies often find it hard to ‘show the value’ of good data and information management.

Research by IMD Lausanne professor Donald Marchand has shown a link between business performance and three IM/IT capabilities—information behavior and values, IM practices and IT practices. These measures have been rolled up into the ‘information orientation’ model. This work has been adapted to the upstream by researchers from the University of Stavanger and applied in a pilot survey of six E&P companies. Initial results confirm the correlation between IM/IT capabilities and business performance.

Venture is now extending the survey to the UK upstream, incorporating its information management best practices and lessons learned from previous IM projects. The survey is open for E&P company employees to complete until 31st December 2008. Initial results will be presented at PETEX. More from robert.day@venture.co.uk.


Jewel Suite 2008, DIANA ‘matrix free’ geomechanical solver

Update to JOA Oil and Gas’ reservoir modeling toolkit includes ‘dynamic’ uncertainty management.

JOA Oil and Gas has released Jewel Suite 2008, an enhanced version of its integrated reservoir modeling toolkit. CEO Gerard de Jager said, ‘We expect this release to see rapid take-up by existing customers and also to open new markets where users are looking for cost effective and intuitive reservoir modeling tools.’ New functions include dynamic uncertainty management, improved handling of large seismic datasets, smart data interfacing and macro tools for automated modeling.

Jewel Suite now also includes Diana, a ‘matrix free’ solver for geomechanical analysis developed by the Dutch R&D organization TNO. Matrix free methods are claimed to offer orders of magnitude speed-up in compute intensive work and are particularly amenable to parallelization on compute clusters (see page 3 of this issue for more on JOA’s role as partner in the launch of Microsoft’s Windows HPC Cluster Server 2008). At the Society of Petroleum Engineers’ ATCE, JOA was showing Diana running on a $100,000, 100-core machine from Nor-Tech. More from info@jewelsuite.com.
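For readers wondering what ‘matrix free’ means in practice: iterative solvers such as conjugate gradients only ever need the action of the operator on a vector, so the (huge) stiffness matrix is never assembled or stored. A minimal sketch of the idea follows—our own illustration on a toy 1D operator, not TNO’s DIANA code.

```python
import numpy as np

# "Matrix-free" sketch: the solver only sees the *action* A(v) of the
# operator on a vector. Here A encodes a 1D Laplacian stencil [-1, 2, -1];
# a real geomechanical operator is of course far more elaborate.
def apply_A(v):
    out = 2.0 * v
    out[:-1] -= v[1:]
    out[1:] -= v[:-1]
    return out

def conjugate_gradient(apply_op, b, tol=1e-8, max_iter=1000):
    """Solve A x = b using only operator applications, never the matrix."""
    x = np.zeros_like(b)
    r = b - apply_op(x)
    p = r.copy()
    rs = r @ r
    for _ in range(max_iter):
        Ap = apply_op(p)
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

x = conjugate_gradient(apply_A, np.ones(100))
```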


‘Tigress Live!’ real time data management for SMEs

Geotrace’s revamped data management and interpretation toolset claimed as low cost, low overhead solution.

Geotrace has revamped its E&P data management solution for small and medium-sized companies with the release of Tigress Live!, an integrated, real-time framework at a ‘fraction of the cost’ of existing solutions and with ‘less technical overhead.’

Geotrace CEO Bill Schrom said, ‘Our surveys show that the biggest problem with the current generation of data management tools is still the amount of time it takes to find data. We developed Tigress Live! to offer a more streamlined approach to managing data and capital.’

Tigress Live! supports the entire exploration lifecycle in real time, integrating data sources such as existing physical and digital archives, interpretation systems and geo-streaming services. Geotrace’s ‘drop box’ technology is used to pull data together for production reporting and real-time well performance analysis. Static models can be automatically updated with fresh data. Closed-loop simulation can be performed with Tigress’ embedded fluid flow simulator—said to be an early derivative of ECLIPSE. More from www.oilit.com/links/0810_2.


Chesapeake video details SharePoint Server 2007 deployment

Microsoft Office SharePoint Server 2007 underpins independent’s collaboration and decision support.

Chesapeake Energy, the US’ largest producer of natural gas, has deployed Microsoft Office SharePoint Server (MOSS) 2007 as the foundation of its ‘MyChk’ enterprise portal. MOSS underpins Chesapeake’s intranet, extranet and internet presence, streamlining collaboration across project teams and providing a foundation for business intelligence, content management and external communications. Hardware includes twin front-end web servers, index and search servers and a database server running a 64-bit SQL Server database along with MOSS 2007.

Working with Microsoft, Chesapeake rolled out the enterprise solution in under three months. Lori Garcia, manager of IT business systems at Chesapeake, said, ‘Training users on how to find information with the new solution has been easy and can be done in a single afternoon.’ Garcia and Chesapeake CIO Cathy Tomkins take starring roles in a snazzy video* explaining how Microsoft’s .NET infrastructure has eased MOSS integration with other Microsoft applications. For Chesapeake, MOSS goes beyond file sharing, with secure multi-user ‘TeamSite’ documents and out-of-the-box web parts. Garcia singled out Chesapeake’s ‘Well Detail’ decision support ‘mash-up’ that has replaced some 8-10 applications. MOSS has empowered Chesapeake’s users to manage their own sites and presentations—with a central repository of corporate artwork. Tomkins concluded that data integrity and central analytics mean that ‘everyone including field-based employees can access the same information as employees working at headquarters. This is a real productivity booster.’ More from www.microsoft.com/oilandgas.

* www.oilit.com/links/0810_3


Software, hardware short takes ...

Fugro, Erdas, SPT, Control Micro, Energy Solutions, Invensys, Primavera, SensorTran, Intertec.

Fugro-Jason has rolled out PowerLog 3.0, a new version of its Windows-based petrophysical interpretation suite. New features include a ‘robust’ data manager and enhanced interactivity between processes. In place editing allows users to correct errors as they are identified. Also new is a basemap viewer of cartography, contours and attributes and multi-well, multi-user support. History tracking ensures that actions can be undone and that, months or years later, the project methodology can still be understood. Users can add their own routines developed in C# or VB.Net.

ERDAS has debuted ERDAS 2009, its geospatial business systems portfolio, with new versions of ER Mapper and ERDAS Image Web Server. A new tool, Erdas Apollo, enhances the management of large volumes of geospatial data scattered around the organization, simplifying the use of vector, raster and terrain data.

SPT Group has released V6.0 of its Olga simulator with a new simulation engine. Olga now incorporates results from the Olga verification and improvement JIP—with better quality results for gas condensate cases. Modeling is improved with more consistent results between the preprocessor and the dynamic simulator. A new framework allows for integration of compatible tracking modules. Dynamic reservoir/wellbore interactions can be studied with the RocX module. GUI and plotting improvements complete the picture. SPT has also enhanced Mepo with a Python preprocessor, objective functions and compatibility with Platform’s LSF workload scheduler.
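As a taste of what Mepo’s new Python hooks might make possible—a hypothetical history-matching objective function, with invented data; this is not Mepo’s actual API.

```python
import numpy as np

# Hypothetical history-matching objective function of the kind Mepo's new
# Python hooks could evaluate; function name and data are invented.
def misfit(observed, simulated, sigma):
    """Weighted least-squares mismatch between field data and one run."""
    r = (np.asarray(observed) - np.asarray(simulated)) / np.asarray(sigma)
    return float(r @ r)

# e.g. three months of measured oil rate versus one simulation run
print(misfit([120.0, 115.0, 98.0], [125.0, 110.0, 101.0], sigma=[5.0, 5.0, 5.0]))
```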

Control Microsystems has announced the release of ClearSCADA 2009, its SCADA host. The new release adds support for 64-bit operating systems, a .NET 2.0 API and integration with Kepware’s KEPServerEX OPC Server.

Energy Solutions has received SAP Certification for the interface between its PipelineTransporter Gas (PLTG) 4.0 package and SAP’s ERP Release 6.0. PLTG has also achieved ‘SAP Solution Manager Ready’ status.

Invensys has released a new version of its Dynsim dynamic simulation program. Dynsim 4.3 improves modeling of distillation and compressors and enhances process visualization and ease of use.

Primavera’s Inspire integration module for managers of large capital projects is now a SAP-endorsed business solution. Inspire provides connectivity between SAP’s project management and plant maintenance solutions and Primavera’s P6 package. P6 provides transparent resource, schedule and financial information of past and future projects. By aligning data in both SAP and Primavera’s systems, P6 supports business forecasts with complete visibility into key project milestones, deadlines and resources.

SensorTran’s new ‘Gemini’ distributed temperature sensing (DTS) platform introduces the company’s ‘PerfectVision’ multilaser technology to the oil and gas vertical for use in hot-well monitoring. A Gemini DTS system deployed in a 6km well is capable of making single-ended measurements with a temperature resolution of 0.005°C in less than one hour, at a one meter spatial resolution.

Intertec’s passively cooled enclosure uses a novel phase-change material to hold electronics equipment some 10 degrees below peak ambient temperature with zero power requirements. Applications include pressure transmitters and process analyzers used on oil and gas pipelines in desert environments.


ECIM 2008—Petrobank user meet

StatoilHydro is ‘taking control’ of its Petrel projects. New E&P IM e-regulation body established.

Following last year’s merger, StatoilHydro has standardized on one common OpenWorks/SeisWorks (OW/SW) platform for G&G data—with Schlumberger’s Petrel used for reservoir modeling. Petrel, essentially a ‘single user’ tool, has proved difficult to integrate into corporate environments. Cathrine Gunnesdal described how StatoilHydro is ‘taking control’ of Petrel, moving it into a multi-user environment and developing a methodology for systematic Petrel project building from its OW/SW master data sets, for capturing model interpretations into a results database and for back-populating reference data to the corporate store.

The new workflow begins with project loading—using approved reference and horizon lists from the OW/SW database into local Petrel projects. Project building currently takes around 16 days—but this is expected to shorten with experience. Petrel projects have read/write access to a local ‘results’ database. When the modeler is through, a project data manager intervenes to quality control the data prior to upload to the reference database and back-population of the OW/SW store.

The workflow relies on defined roles and responsibilities—with a dialogue between end users and data managers for data snapshots at significant milestones and ‘decision gates.’ Naming conventions, Petrel file folder structures and access rights are standardized and closely controlled. All Petrel projects are globally stored using the standard disk structure. Petrel coordinate reference systems are inherited from the OW/SW via the OpenSpirit interface. Schlumberger’s ProSource with its GIS viewer is used to browse the Petrel projects on disk.

The technique was successfully piloted on StatoilHydro’s Jeanne D’Arc project—with work shared between Oslo and Calgary. Enterprise level deployment and training is now underway.

~

Ann-Christin Schill (StatoilHydro) introduced the new ‘E&P Information Management’ (EPIM) organization, an umbrella group tasked with taking control of three of Norway’s prior e-regulatory initiatives—LicenseWeb, AuthorityWeb and EnvironmentWeb. EPIM’s objective is to develop IT solutions that facilitate information flow between authorities and licensees involved in exploration, production and transport on the Norwegian Continental Shelf (NCS). EPIM began operation in January 2008 and has administrative support from the Norwegian OLF trade body. To date 33 operators have joined the association. E-regulation and reporting has benefited from the same Secured Oil Information Link (SOIL) network that has been the backbone of Norway’s DISKOS/Petrobank data sharing initiative for more than a decade—although cost considerations and new technology mean that alternatives to the dedicated SOIL network are being evaluated. More from www.epim.no.


SPE ATCE 2008, Denver

Attendees to the ‘Unconventional Resources’ session learned that these are fast becoming ‘conventional.’ A ‘Data and Decisions’ session discussed rig scheduling and the application of MWD techniques to well intervention. Chevron describes critical path-based ‘under rig floor’ logging.

In the plenary ‘Unconventional Resources’ session, Marshall Atkins (Raymond James Energy Group) recalled previous cyclicity: the downturn of the 1980s effectively killed off most unconventional projects. At today’s prices, wind, fuel cells and so on make sense. But if oil doesn’t stay high, the situation for unconventionals will be a return to the 1980s.

Chuck Stanley (Questar) recalled that only a decade or so back most US gas came from conventional reservoirs. We still dream of finding such reservoirs, but reality is that unconventional is becoming conventional. Typical tight gas wells show initial decline rates of over 60% and 50% of reserves are produced in 5 years. Gas in place was hugely underestimated in these exotic reservoirs. According to a CERA study, half of US gas comes from tight sands, shales and coal.

Glenn Vawter (NOSA) described oil shales, a.k.a. ‘the rock that burns’ as a huge, untapped resource. In parts of the Piceance basin there are 2 million barrels per acre. Shale oil is a huge resource with great long term potential but it is ‘certainly not a short term play.’

In the ‘data to decisions’ session, Morten Irgens (Actenum) presented work done for Saudi Aramco on scheduling its drilling rigs. Actenum blends artificial intelligence and operational research to increase knowledge workers’ productivity. Today’s decision support applications, such as the ubiquitous Microsoft Project, do not really cut the mustard. What’s needed is a tool that helps by critiquing decisions that may be too complex for humans to grasp unaided. Saudi Aramco used the tool to plan an optimal drilling sequence for its 130 rigs. Schedules were required to meet targets, minimize transport costs and maximize productivity. Actenum’s tool analyzed 1,500 well locations with 32,000 constraints and competing objectives. There are more ways of scheduling Aramco’s rigs than atoms in the universe. The answer is a ‘two experts’ approach—man and machine sharing the decision.
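To see why brute force is hopeless, consider even a toy version of the problem. The greedy heuristic below—our own invention with made-up data, emphatically not Actenum’s algorithm—assigns wells to the earliest-free rig and already shows the shape of the combinatorics.

```python
import heapq

# Toy greedy pass over an invented rig-scheduling problem: longest wells
# first, each assigned to whichever rig frees up earliest.
wells = [("W%03d" % i, 10 + i % 20) for i in range(30)]  # (name, days to drill)
rigs = [(0.0, "R%02d" % r) for r in range(4)]            # (free-at day, rig id)
heapq.heapify(rigs)

schedule = []
for name, days in sorted(wells, key=lambda w: -w[1]):    # longest jobs first
    free_at, rig = heapq.heappop(rigs)                   # earliest-free rig
    schedule.append((rig, name, free_at, free_at + days))
    heapq.heappush(rigs, (free_at + days, rig))

for rig, well, start, end in schedule[:5]:
    print(f"{rig} drills {well} from day {start:.0f} to day {end:.0f}")
```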

You would think that in a costly well intervention, such as fishing for lost or stuck equipment, operators generally know what is happening at the tool face. Not so, according to Sid Huval (Baker Hughes). With traditional techniques, knowledge of what’s going on downhole is often little better than guesswork. The situation is changing with real time logging during well intervention. New downhole sensors have been added to traditional service tools, with the results displayed on the rig floor. A compelling case was made for real time observation of milling, debris cleaning and fishing jobs. ‘Now we can see what we’re doing downhole—we need this in every complex well.’

Paul Gregory (Intervera Data Solutions) listed some data management scare stories, such as the one where a rig was on standby for five days while engineers tried to confirm that the correct depth had been reached. The information was stored in a drilling and completions database but had been associated with the wrong well. Data forensics allowed the correct well data to be located. A similar, expensive mix-up occurred during a fishing job, over data that belonged to a different lateral. Afterwards a detailed check of the well database revealed that 5% of the client’s wells had similar issues. In all cases the problem was fixed with data quality profiling (DQP). For Gregory, DQP is like anti-virus software and should be running in the background, continuously monitoring data.
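A minimal sketch of what such ‘background’ profiling might look like—the rules and records below are invented for illustration:

```python
# Toy data-quality profiler: flag records that break simple rules of the
# kind that caused the mix-ups above. Table layout and rules are invented.
def profile_wells(wells):
    """Yield (well_id, issue) pairs for records that break basic rules."""
    seen_locations = {}
    for w in wells:
        if w["td"] is None or w["td"] <= 0:
            yield w["uwi"], "missing or non-positive total depth"
        key = (w["lat"], w["lon"])
        if key in seen_locations and seen_locations[key] != w["uwi"]:
            yield w["uwi"], f"location duplicates well {seen_locations[key]}"
        seen_locations[key] = w["uwi"]

wells = [
    {"uwi": "100/01-01", "td": 2450.0, "lat": 56.1, "lon": 3.2},
    {"uwi": "100/01-02", "td": None,   "lat": 56.1, "lon": 3.2},
]
for uwi, issue in profile_wells(wells):
    print(uwi, "->", issue)
```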

Milad Ershaghi (USC, now with Schlumberger) presented a GIS tour de force combining data from the Minerals Management Service’s website with published work from the SPE digital library and other sources, using ArcScene and ArcGlobe for 3D visualization of the whole Gulf of Mexico data set. 30,000 wells and deviation surveys were analyzed with geoprocessing to produce probability surfaces of sands and suggest new targets, using the objective function to forecast production indices.

Schlumberger was showing early results from the integration of ISS’ BabelFish production data access toolset with its Avocet asset management environment. BabelFish plugs a gap in Schlumberger’s offering with customizable visualization of production operations data. One initial target workflow is gas lift optimization. BabelFish provides a map of the operation along with key performance indicators, traffic lights and so on. Drilling further down into the display pulls up well schematics—and allows engineers to fire up applications such as PipeSim in context for nodal analysis.

Julian Pickering outlined the design of BP’s Advanced Collaboration Environment (ACE) for onshore drilling on the Tangguh LNG development, Indonesia. This is the first time BP has used an ACE for drilling performance enhancement rather than to reduce offshore headcount. WITSML has provided an enhanced data environment suitable for ‘faster and better informed decision making.’ The ACE provides high value problem solving while crossing faults and allows for learning from other operations. Located in Jakarta, the ACE houses an immersive environment with curved screens, built on the control room paradigm. Drilling and completions workflows leverage Halliburton’s DecisionSpace, engineering data model and OpenWorks. The ACE has provided ‘hard benefits’ such as improved recovery from non-productive time and abnormal situations. Teething troubles with some WITSML implementations have led BP to set up a WITSML test environment in Aberdeen where vendors’ implementations can be assessed. The test bed currently runs the SiteCom WITSML server—but BP expects Energistics certification to replace in-house testing eventually.
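For the record, WITSML is XML over the wire. Stripped of its namespaces and SOAP plumbing, consuming a log is no more exotic than the following sketch (element names are simplified for illustration; real WITSML 1.x documents are namespaced):

```python
# Minimal sketch of consuming WITSML-style XML with the standard library.
import xml.etree.ElementTree as ET

doc = """<witsml><well uid="W-1"><log><data>
<row depth="1000.0" rop="25.3"/>
<row depth="1001.0" rop="24.8"/>
</data></log></well></witsml>"""

root = ET.fromstring(doc)
for row in root.iter("row"):   # print depth vs. rate of penetration
    print(float(row.get("depth")), float(row.get("rop")))
```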

Eric Upchurch presented the results of ‘under rig floor’ logging that Chevron has been using in the Gulf of Thailand for the past couple of years. The idea is simple: the rig is never left idle. After drilling the surface section of one hole, the unit is skidded over to drill a second hole while logging operations progress on the first. Such ‘batch drilling’ operations require modifications to the rig and ‘a very good knowledge of drilling operations.’ Chevron’s wells now take 4½ days to drill, an 11% saving on rig time. Fewer items in the critical path make for smoother, safer operations.

This report is a brief extract from The Data Room’s Technology Watch report from the 2008 SPE ATCE. More from www.oilit.com/tech.


SPE Information Technology Technical Section

Integration and IT/engineering ‘hybrids’ a.k.a. ‘digital engineers’ discussed by SPE luminaries.

Yanni Charalambous (Oxy), who chairs the SPE oilfield integration workgroup, described the multiple integration opportunities that exist across data, process, asset and people systems. The intent is to facilitate the development of integration solutions through organizations such as Energistics and PPDM. An online library of use cases is being developed in conjunction with the University of Houston.

Bertrand du Castel (Schlumberger) wondered why computing in oil was so separate from the rest of the IT universe. Du Castel believes that Berners-Lee’s Semantic Web is going to take us to a ‘contextual and intelligent’ web where we can ‘look for meaning rather than key words.’ The World Wide Web Consortium got in touch with du Castel a year ago and the kick-off Semantic Web in Oil and Gas conference will be held later this year.

Kamel Bennaceur (Schlumberger and SPE IM director) believes that information dissemination is both a key role for the SPE and a significant IM enabler. The SPE spends a lot on dissemination. The spe.org website has seen a major revamp, with the ‘OnePetro’ e-library. A new review process is accelerating the time from presentation to publication and the ‘Petabytes for Asset Management’ forum has been kicked off. A new journal on ‘decision making’ is mooted.

Peter Breunig (Chevron) asked whether integration was an obvious target for the SPE. Du Castel questioned the premise of the meeting—today we can build systems that automate entire oilfield systems. We should be looking less to integrate systems and more to automate human processes. Ron Cramer (Shell) expressed concern over the enormous scope of what was being discussed—suggesting homing in on an ‘80/20’ solution. Matthew Kirkman (BP) suggested a focus on IT infrastructure/framework was required, saying, ‘We are realistically still 5 years away from SOA and still need standards to support this.’ Kirkman also questioned why the industry was so far behind retail, asking, ‘Why is there no barcode on my 9 5/8th casing?’

A second panel, of Don Paul (Chevron), Herb Yuan (Shell), Don Moore (Oxy) and Iraj Ershaghi (CiSoft) deliberated on the concept of a ‘Digital Petroleum Engineer.’ Paul noted that IT is engaged and ‘blended’ into the upstream. Managing upstream information systems increasingly involves domain knowledge and business content. Historically, people involved in this activity were ‘raised’ internally—engineers became IT folks. But it is now necessary to look outside of the organization.

Yuan is a petroleum engineer by training and is now an IT manager. Yuan thinks that industry has ‘tolerated’ IT amateurs and vice versa. But today we need a new generation that can combine both IT and domain skills. The problem is that in Shell, people can make career progress either as an engineering or an IT professional—but not as a ‘hybrid.’ Ershaghi described the Chevron-supported USC/CiSoft ‘experiment’ of bringing IT and petroleum engineering students together to create a ‘renaissance’ or ‘hybrid’ engineer. But do you take engineers and teach them IT, or take IT students (who couldn’t do physics) and teach them engineering? Ershaghi’s answer is unequivocal, ‘Do the former—teach domain specialists the IT they need. Youngsters may be great on an Xbox, or with Excel, but this is not enough—which is why we have a degree program combining engineering with IT.’

The ensuing debate ranged widely around the subject. One major reported ‘losing control’ when it went over to full IT supported projects—IT should be an enabler. Another ‘hybrid’ reported that HR couldn’t track his move from IT to drilling. This can adversely affect one’s career prospects. It’s preferable to be in drilling, production or subsurface—especially in a downturn! An IT specialist offered that ‘you can’t turn an engineering prototype into an enterprise system in six weeks.’ There is a problem of mutual recognition of competencies.


Oil IT Journal interview—Fatmir Hoxha, VP R&D, SeismicCity

Seismic processing guru reports on seismic imaging tests on NVIDIA’s Tesla GPU-based computers.

Fatmir Hoxha—Our current R&D focus is on depth imaging and pre-stack migration. We started using GPUs last year, learning how to program them, and we got our prototype running earlier this year. The ‘killer app’ is finite difference modeling—the key to reverse time migration (RTM)—where we were surprised to find that GPUs gave an immediate 30 times improvement over the CPU.

What exactly is compared here?

A single core of a 64 bit AMD CPU against a single NVIDIA Tesla C870 GPU Card with 128 cores.

Is that fair?

That’s not the point—it is consistent. We were interested in testing code across two architectures; the NVIDIA card does its own parallelization—parallelizing across multi-core CPU architectures is a different story! In a production environment, taking account of the restricted on-board memory of the GPU cards (limited to 1.4GB), we still found a 10-fold speedup—with no tweaking of how the code runs across the 128 cores.

So are GPUs to replace the thousands of clusters in seismic processing shops around the world?

Probably not. GPUs are great for programming some tasks such as RTM. But even this does require a lot of GPU-specific development using NVIDIA’s CUDA API, for memory management and scheduling. For companies with masses of legacy code it is unrealistic to imagine that this can be effortlessly ported to CUDA. Apart from anything else, there is a skills shortage and companies don’t want to get dependent on a few CUDA developers.

But won’t there be a parallelizing Fortran or C compiler for CUDA?

Probably but this will never remove the need to tune code to the GPU architecture. It’s unlikely that the benefits will be that easy to realize.

What about floating point and double precision math?

CUDA provides single precision floating point math, which we find sufficient for RTM.
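For readers curious about the arithmetic at the heart of RTM, here is a minimal finite-difference time step for the 2D acoustic wave equation—in NumPy on the CPU for clarity, emphatically not SeismicCity’s CUDA code:

```python
import numpy as np

# Minimal 2D acoustic finite-difference time step (2nd order in time and
# space, wrap-around boundaries). Grid size, dt, dx and velocity are
# invented, CFL-stable values.
def step(p_prev, p_curr, vel, dt, dx):
    """Advance the pressure field by one time step."""
    lap = (-4.0 * p_curr
           + np.roll(p_curr, 1, 0) + np.roll(p_curr, -1, 0)
           + np.roll(p_curr, 1, 1) + np.roll(p_curr, -1, 1)) / dx**2
    return 2.0 * p_curr - p_prev + (vel * dt) ** 2 * lap

n, dt, dx = 200, 0.001, 10.0
vel = np.full((n, n), 2500.0)          # constant 2500 m/s medium
p0, p1 = np.zeros((n, n)), np.zeros((n, n))
p1[n // 2, n // 2] = 1.0               # impulsive point source
for _ in range(100):
    p0, p1 = p1, step(p0, p1, vel, dt, dx)
```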


Folks, facts, orgs ...

Movers this month hail from Baker Hughes, BearingPoint, Black Elk Energy, Caesar Systems, CERA, USC, Deloitte, Emerson, Fusion, Geomodeling, IES, Techsia, New Digital Business and Iron Mountain.

Baker Hughes has appointed Clif Triplett as VP and CIO. Triplett joins Baker Hughes from Motorola where he served first as VP and CIO in the network and enterprise group and most recently as VP, Global Services.

BearingPoint has been recognized by SAP as a ‘preferred partner’ in Russia for the oil and gas and metals and mining industries. Natalia Krasnoperova is VP and practice leader, BearingPoint Russia. Joint BearingPoint and SAP customers include Lukoil, TNK-BP, Urals Energy, SUEK, Severstal-Resource, Metalloinvest Holding and Evraz Group S.A.

Houston-based Black Elk Energy has appointed Terry Clark as Executive VP and CTO. Clark sold his privately held consultancy, Atlantis E & P Services to Hamilton Engineering last year. Clark was previously with Amoco Production Company (now BP America).

Caesar Systems has named Alan Jaschke as client services manager. Jaschke is an expert PetroVR modeler for large deepwater and unconventional oil and gas projects.

Cambridge Energy Research Associates (CERA) has announced two new hires. Jonathan Parry is Director Natural Gas Supply and Fabien Roques, Associate Director, European Gas and Power with a specialization in the EU power and carbon dioxide (CO2) markets. Parry hails from Shell International and Chevron, Roques from the International Energy Agency.

Following his retirement as CTO Chevron, Don Paul is to become a senior advisor to the University of Southern California (USC). Paul is president and managing director of Energy and Technology Strategies LLC, an advisory group that is to work to expand USC’s new Energy Institute. Paul previously helped establish the USC/Chevron ‘CiSoft’ partnership.

Graham Sadler has been named head of Deloitte’s Petroleum Services unit following the departure of Ken McKellar.

Emerson has announced several internal promotions: Steven Sonnenberg to executive VP, Emerson, and business leader of Emerson Process Management (EPM); John Berra to EPM chairman; Mike Train to president of the Rosemount unit; Sabee Mitra to president of EPM Asia Pacific, relocating to Singapore; and Dave Tredinnick to president of EPM Middle East.

David Latin, VP E&P Technology with BP, was elected chairman of the Energistics board. Mark Greene, executive partner in Accenture’s resources-energy consulting practice, was elected vice-chair.

Fusion Petroleum Technologies has appointed Kevan Hanson VP for the Arabian Gulf region. Hanson was previously with PGS.

Geomodeling has appointed Jeff Donnellan as VP R&D and CIO and John Sherman as VP marketing and business development. Donnellan was previously with EDS, Sherman with Digital Earth.

G&G consultants Interactive Exploration Solutions has hired David Quintanilla (senior geophysicist and project manager) and Michelle Fullen (geophysicist and business development).

Jean-Etienne Jacolin has joined Techsia as support and studies engineer. Jacolin hails from the French Petroleum Institute (IFP) unit Beicip-Franlab.

Mike Pollock is now operations manager at New Digital Business in Aberdeen. Pollock was previously with SAIC in Azerbaijan.

Marc Duale has been named president, international and William Brown CIO of Iron Mountain.


Done deals

Deals involve RPS Group, Paras, Geokinetics, Ikon Science, Ingrain, Siemens, Innotec and Senergy.

RPS Group has acquired the UK-based information management consultancy Paras Ltd. in a cash and paper deal worth up to £6.4 million. Consideration paid at completion was £4.7 million, comprising £3.5 million in cash and 513,095 new RPS shares at a price of £2.34 per share, for a total value of £1.2 million. Subject to certain operational conditions being met, further payments are scheduled over the next three years. In the year ended 31 October 2007, Paras had revenues of £3.0 million and £1 million profit before tax. Paras’ clients include BP, ConocoPhillips and BG Group.

~

Seismic acquisition and data processing specialist Geokinetics has raised $30 million from Avista Capital Partners which now holds approximately 38% of the company’s voting stock. Proceeds from the sale will be used to fund the Geokinetics capital expenditure budget, recently increased from $64.7 million to $80.0 million, for working capital required to support the Company’s growth initiatives and for general corporate purposes.

~

Fleming Family & Partners’ Private Equity (FPE) has bought into Ikon Science’s capital base and is now its largest shareholder. FPE plans to make further capital resources available to Ikon in order to fund the expansion of its business and has placed Henry Sallitt, an FPE director, as non executive Ikon director. UK investment bank KBC Peel Hunt acted as intermediary in the deal.

~

Ingrain has secured its second round of funding, totaling $15 million, from international investors and Stanford University. The funds will go to advancing the deployment of Ingrain’s digital rock property measurement technology (OITJ, July 2008) and to continued international expansion into Canada, South America and the Middle East.

~

Siemens has acquired Innotec GmbH of Schwelm, Germany. Innotec is an international vendor of digital engineering software and services for the process industry. Innotec will be integrated into Siemens’ Industry Automation unit.

~

Well engineering consultancy Leading Edge Advantage (LEA) has been acquired by Aberdeen-based Senergy—the company’s third acquisition this year. LEA adds advanced drilling techniques expertise to Senergy’s wellbore and well performance consulting.


CapRock announces AssetTrax rig tracker

International communications specialist rolls out GIS-based asset location for the lobby!

Houston-based CapRock Communications is launching ‘AssetTrax’ to monitor the position of vessels and drilling rigs in real-time. AssetTrax is a map-based tracking service that monitors the position and movement of critical assets around the world. CapRock VP Ron Long explained, ‘Customers told us that they wanted to walk into the lobby of their office and see a plasma screen with dots all over the world representing the locations of their global drilling rigs. AssetTrax lets them see this 24/7 in the lobby, or from any desktop with an internet connection.’

Cal Dive VP Allan Palmer added, ‘We manage a fleet of 27 diving and construction assets in the Gulf of Mexico. AssetTrax allows us to quickly identify the closest asset for response to urgent situations, such as a pipeline leak or marine emergency. AssetTrax was invaluable during Hurricane Ike when we were dispersing our fleet to safe harbors across the Gulf.’
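The ‘closest asset’ query behind such a service is essentially a great-circle distance calculation. A sketch, with invented vessel positions:

```python
# Nearest-asset lookup via the haversine great-circle distance.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two lat/lon points."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = (sin(dlat / 2) ** 2
         + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2)
    return 2 * 6371.0 * asin(sqrt(a))

fleet = {"DSV Kestrel": (28.9, -89.4), "DSV Osprey": (27.8, -91.2)}  # invented
incident = (28.2, -90.5)   # e.g. a reported pipeline leak
closest = min(fleet, key=lambda v: haversine_km(*incident, *fleet[v]))
print("dispatch", closest)
```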

CapRock uses field-proven VSAT satellite technologies to deliver highly reliable managed communication services for broadband networking, real-time video and digital telephony to the world’s harshest and most remote locations. Its global infrastructure includes five international teleports and eleven regional support centers across the US, Central and South America, Europe, West Africa and Asia Pacific.


Industrial Defender, Kepware team on intrusion detection

Host intrusion detection package to augment KEPServerEX security agent for automation.

Kepware is teaming with Industrial Defender to develop a version of Industrial Defender’s host intrusion detection software as an add-on to the KEPServerEX security agent. The new plug-in will monitor cyber security parameters of the environment including failed login attempts, file system consistency, removable media access, registry modifications, process and socket violations, and other SCADA parameters. When an Industrial Defender Security Event Management (SEM) console is present, it will aggregate, correlate, store and report security events. Industrial Defender’s ‘Defense-in-Depth’ offering is now part of the ‘Connected with Kepware’ partner program.
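A flavor of what host intrusion detection boils down to—a toy failed-login monitor with an invented log format; the real agents watch far more than this:

```python
# Toy host-intrusion rule: raise an alert after repeated failed logins.
# The log format and threshold below are invented for illustration.
from collections import Counter

THRESHOLD = 3
failures = Counter()

def on_log_line(line):
    if "LOGIN FAILED" in line:
        user = line.split()[-1]
        failures[user] += 1
        if failures[user] >= THRESHOLD:
            print(f"ALERT: {failures[user]} failed logins for {user}")

for line in ["2008-10-01 LOGIN FAILED operator",
             "2008-10-01 LOGIN FAILED operator",
             "2008-10-02 LOGIN FAILED operator"]:
    on_log_line(line)
```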

A new release of Industrial Defender’s ‘Gauntlet,’ V2.2, adds NERC CIP critical infrastructure protection compliance—protecting against internal and external cyber attacks and vulnerabilities. Gauntlet was added to the Industrial Defender product suite with the acquisition of Teltone Corp. New features include virtual router redundancy, cyber asset flagging and IP domain lockdown.


PPDM User Group Meet, Perth, Australia

PPDM and master data management, Woodside’s EpiDB, Petrosys CEO on unstructured data.

Speaking at the Public Petroleum Data Model Association’s user group meet in Perth, Australia last month, CEO Trudy Curtis noted growing interest in upstream master data management (MDM). MDM plays a role in data collection, quality assurance and distribution throughout the organization. Curtis believes that the collective industry investment in the PPDM data model, which she puts at $75 million, along with its scope and user base, makes it a logical starting point for an MDM strategy.

Hélène de Beer described how Woodside has built its EpiDB corporate upstream database around the PPDM standard. Work on Woodside’s data infrastructure began in 1996. EpiDB is still the focal point of Woodside’s information management system and has recently been extended with new applications for well data management, petrophysics and Petrel project management. Woodside’s ‘eWell,’ a SharePoint development, provides central data loading and continuous data QA. Woodside uses OpenSpirit to source basic data for Petrel reference projects. Corporate reference data is loaded using a combination of the EpiDB Browser and Excel. Work on well data quality metrics has led Woodside to tweak the PPDM data model with the addition of new data types, along with automatic triggers and scripts to update the database with information from SharePoint. Data management has given ‘clear value and benefit’ to Woodside, providing continuity in the face of new software, data and people with new ideas.

Petrosys CEO Volker Hirsinger compared the merits of different strategies for linking structured master databases to unstructured document collections. There is a trade-off between difficulty of implementation, accuracy of search and ease of use. Structured data management enforces uniqueness and encourages ‘final decisions’ but can be time consuming. Unstructured (document) data management allows knowledge to be shared in its most ‘natural’ form, but this can be compromised by varying formats and units, making cross referencing of numeric data hard. One solution is to hotlink to documents from the database, providing drill down into specialized or technically complex information. Indirect, Google-type search can be enhanced with standard classification schemes or used superficially. Free text search links can be established without data management overhead—they may allow new data relationships to be discovered or, conversely, obscure important connections through irrelevant ‘noise.’ Spatial search has proved so popular that it has created a whole new, costly layer of data management. For instance, the PPDM records management module comprises 52 tables. Spatial search is eased with tools such as the Petrosys asset module, naturellement.
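A sketch of the hotlink strategy Hirsinger described—structured records carrying pointers into the document store, with all names and paths invented:

```python
# "Hotlink" sketch: a structured query filters the master records, then
# follows pointers into the unstructured document store.
wells = {
    "W-1": {"td_m": 3120.0, "docs": ["reports/W-1_completion.pdf"]},
    "W-2": {"td_m": 4480.0, "docs": ["reports/W-2_completion.pdf",
                                     "logs/W-2_petrophysics.xlsx"]},
}

def docs_for_deep_wells(min_td_m):
    """Structured filter first, then drill down to the source documents."""
    return {w: rec["docs"] for w, rec in wells.items() if rec["td_m"] >= min_td_m}

print(docs_for_deep_wells(4000.0))   # -> W-2's documents only
```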


Sales, contracts and deployments

IDS for Petrofac SPD, Amalto for Chevron, BASF’s virtualization, Panasas and WellPoint Systems.

IDS is to provide reporting tools to Petrofac’s SPD unit, a UK headquartered well services provider, in support of its southern North Sea Chablis operation. SPD is to deploy IDS’ DrillNet, a WITSML-enabled drilling data service that offers analysis and reporting tools and an integrated search engine. SPD operations manager Neil Robertson said, ‘DrillNet is the perfect solution for us as a company with globally dispersed operations. The system enables all types of operational data to flow freely through our Partner Portal and the secure web link allows full transparency in a safe environment. The system allows our engineers and clients in Aberdeen, London and Johannesburg to have up-to-date information 24/7 which is essential for multiple operations across the world.’

Chevron USA has selected Amalto Technologies’ ‘b2een’ e-commerce solution to enable electronic transactions with its suppliers. Amalto b2een combines a light web 2.0 client, encrypted document exchange between trading partners and a software as a service (SaaS) paradigm that validates and routes transactions. Amalto CEO, Jean-Pierre Foehn, said, ‘Amalto offers mid-size companies a new way to exchange purchase orders, invoices and field tickets with their larger clients. With Chevron’s support, we have demonstrated that the benefits of electronic transactions are accessible to trading partners of all sizes.’

Chemical giant BASF is AspenTech’s first client to use Microsoft’s Application Virtualization (MAV) technology to improve compute performance of its AspenOne process modeling suite. BASF uses AspenOne to simulate its chemical processes to improve plant design and optimize production. Virtualization decouples applications from operating systems and runs software as network services, making applications available whenever and wherever they are needed, even when disconnected from the network. Peter Michael Gress, senior VP Engineering at BASF said, ‘We run some of the largest and most complex process simulations in the chemical industry and need the right IT infrastructure and applications. Virtualizing AspenOne has improved the speed with which we can investigate plant and equipment behavior, letting us optimize decision making, reduce costs, and accelerate time-to-market.’ Blair Wheeler, AspenTech senior VP marketing added, ‘This initiative will make our software easier to deploy and use across the enterprise. The world’s leading process manufacturing companies rely on the powerful optimization functionality of AspenOne and our partnership with Microsoft enables us to deliver more value to industry-leading manufacturers such as BASF.’ According to Microsoft’s Chris Colyer, virtualizing AspenOne has increased the speed of deployment by more than 50%, reduced software conflicts and the risk associated with migrating to new versions.

Polish seismic contractor Geofizyka Krakow (GK) has deployed a Panasas AS3000 parallel storage system to support its seismic processing jobs. The company reports a six-fold speed-up in turnaround times as a result. GK runs WesternGeco’s Omega seismic processing package. Leszek Boryczko, GK computer systems manager, said, ‘The Panasas AS3000 parallel storage solution has made our IT workflow more manageable.’

International crude oil marketer Masefield Canada has selected WellPoint Systems’ Energy Broker to manage its crude oil marketing business and enhance its trading business. WellPoint’s integrated oil and gas software solution is built atop Microsoft’s Dynamics AX ERP system. WellPoint Energy Broker provides contract and price building, logistics, marketing and settlement in a multi-commodity, currency and language environment.


Standards Stuff

Open Geospatial Consortium releases spatial data quality survey. CAPE-OPEN gets industry backing.

The Open Geospatial Consortium has released the results of its spatial data quality survey. The survey, of several hundred GIS professionals around the world, is intended to guide the OGC’s development of geographic data quality metrics and a spatial data quality model. After a laborious enquiry into participants’ company type, location and industry, the survey asks how more accurate or more consistent spatial data would help organizations and whether data is considered fit for purpose. The sections on data quality and management appear to have been answered by respondents but are frustratingly incomplete. However we did learn that 56% of respondents have no standards for data quality and 52% do not list data quality as a requirement in their contracts. Even more worryingly, some 25% of the sample said that they would not be prepared to pay for assured data quality levels in their contracts. The (incomplete) survey results are a free download from www.oilit.com/links/0810_4.

~

Michel Pons, who heads up the CAPE-OPEN organization, reports ‘major progress’ in CAPE-OPEN implementation and support from vendors including COMSOL, ProSim SA and SimSci-Esscor. The CAPE-OPEN standard promises ‘plug and play’ interoperability of chemical engineering applications used in hydrocarbon process modeling. COMSOL Multiphysics V 3.5 features a CAPE-OPEN thermodynamic and physical properties interface enabling exchange of property calculations with other compliant packages. ProSim also has a CAPE-OPEN interface for its Simulis Thermodynamics package—available as an Excel plug-in, a MATLAB toolbox or a DLL. Interoperability has been demonstrated with COUSCOUS (AmsterCHEM) and Xist (HTRI). Invensys Process Systems (IPS) has participated in the development of CAPE-OPEN standards and implementations. PRO/II 8.2, released in July 2008, heralds improved CAPE-OPEN support, achieved through a close working relationship with the CAPE-OPEN Laboratories Network (CO-LaN) and customers. IPS has participated in several CO-LaN special interest groups and is working to extend CAPE-OPEN with new interfaces for refinery reactor simulation.
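
For the curious, here is a minimal sketch of what ‘plug and play’ property calculations look like from the calling side. It assumes a Windows machine with a COM-registered CAPE-OPEN 1.0 property package; the ProgID and material object helper are hypothetical, and the method names follow our reading of the CO-LaN thermodynamics specification, which should be consulted for the definitive interfaces.

    # Minimal sketch, not vendor code: calling a CAPE-OPEN 1.0 thermo
    # property package from Python via COM on Windows.
    import win32com.client

    # Any compliant package registered on the machine could be swapped
    # in here without changing the calling code -- that is the point.
    pkg = win32com.client.Dispatch("Example.CapeOpenPropertyPackage")  # hypothetical ProgID

    mat = pkg.CreateMaterialObject()  # hypothetical helper; in a real
                                      # simulator the PME supplies this
    mat.SetProp("temperature", "overall", None, "", "K", [350.0])
    mat.SetProp("pressure", "overall", None, "", "Pa", [101325.0])
    mat.CalcProp(["enthalpy"], ["Vapor"], "mixture")
    print(mat.GetProp("enthalpy", "Vapor", None, "", "J/mol"))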


Wireless network upgrades for PDVSA and BP

Emerson deploys ‘self organizing’ wireless networks on oilfields and refineries of the future.

Emerson Process Management (EPM) reports growing take-up of its ‘Smart Wireless’ mesh networking, with new deployments at PDVSA and BP. In a multimillion dollar automation upgrade, PDVSA is to deploy the ‘self-organizing’ wireless network in the Morichal district oil fields to monitor 180 wells. 600 networked devices will provide pressure and temperature data to predict well performance and report on the health of wellhead drives, helping avoid well shutdowns.

EPM VP wireless, Bob Karschnia said, ‘Operators are using more and more measurement to improve operations. Cost benefit analysis demonstrates that wireless is often preferable to traditional wired systems.’

In a separate announcement, BP is to deploy a Smart Wireless mesh at its R&D unit in Naperville, Illinois and at its Cherry Point, Washington refinery. Smart Wireless will be used at Cherry Point to mitigate fan and conveyor failure in the calciner unit. BP’s Mark Howard said, ‘Wireless is an important enabler for our refinery of the future program. It helps us deploy the instrumentation, sensors, and analytical devices that we need for condition monitoring and predictive maintenance.’ Emerson has also announced new open standard WirelessHART devices. Karschnia told Oil IT Journal that he expects take-up of WirelessHART to match that of the earlier wired HART devices that now dominate the sensor marketplace.
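
For readers wondering what ‘self-organizing’ means in practice, the toy Python sketch below (emphatically not Emerson’s algorithm) shows the core idea: each device need only know which neighbors it can hear; shortest-hop routes to the gateway fall out of a breadth-first traversal, and the routing table is simply rebuilt when a node drops off the network.

    # Toy illustration of mesh self-organization (not Emerson's code).
    from collections import deque

    def build_routes(links, gateway="GW"):
        """links maps each node to the set of neighbors it can hear.
        Returns each node's next hop toward the gateway (BFS)."""
        next_hop, frontier = {}, deque([gateway])
        while frontier:
            node = frontier.popleft()
            for nbr in links.get(node, ()):
                if nbr != gateway and nbr not in next_hop:
                    next_hop[nbr] = node  # route traffic via 'node'
                    frontier.append(nbr)
        return next_hop

    links = {"GW": {"A", "B"}, "A": {"GW", "C"},
             "B": {"GW", "C"}, "C": {"A", "B"}}
    print(build_routes(links))  # {'A': 'GW', 'B': 'GW', 'C': 'A'} (or C via B)

    # If node A fails, delete it from 'links' and re-run build_routes:
    # C transparently re-routes via B, the essence of self-organization.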


Cyber security awareness month announced

US Department of Homeland Security sets out to minimize vulnerabilities.

It’s national cyber security awareness month in the US. The Department of Homeland Security’s (DHS) National Cyber Security Division (NCSD) is engaging public and private sector partners ‘to increase awareness and minimize vulnerabilities.’ Homeland Security Secretary Michael Chertoff announced the initiative earlier this month. In the Q&A, Bob Connors (Raytheon) opined that the private sector ‘understands the risks and is investing a lot of energy in cybersecurity.’ But Connors was skeptical regarding the ability of ‘the tens of millions of folks out there with home computers’ to treat cyber security seriously—’They don’t get it and they don’t want to get it.’ Chertoff was also asked what the DHS is doing to protect information supplied under the National Infrastructure Protection Plan, especially concerning ‘high-risk chemical facilities.’ Chertoff responded that the DHS was attentive to its internal systems’ security. The DHS is ‘getting its house in order,’ implementing ‘Einstein,’ its network intrusion detection system. Chertoff has resisted efforts to release some chemical data under the Freedom of Information Act—’The danger is that, if we put it out there, it ends up being read in caves somewhere in South Asia as well as on your home computer.’ More safety tips from www.us-cert.gov, www.onguardonline.gov and www.staysafeonline.info!


Rebranding of enterprise single sign-on reflects expanded scope

Passlogix CEO convinced that ESSO has broad application in dispersed oil and gas operations.

Passlogix has re-baptized its flagship v-GO Sign-On Platform as the v-GO Access Accelerator Suite to reflect the expanding scope of its enterprise single sign-on (ESSO) offering. Back in 2006 (OITJ, September 2006), Passlogix claimed ‘industry standard’ status for v-GO in the oil and gas vertical, with sales to Chevron and others, notably through a non-exclusive marketing agreement with Schlumberger.

v-GO provides user access to business systems and applications both inside and beyond the firewall and claims to have solved the conundrum of proliferating application passwords. v-GO now also supports access for the ‘extended enterprise,’ including partners, suppliers and contractors.

Passlogix CEO Marc Boroditsky believes that v-GO’s functions are particularly apropos to the oil and gas vertical, whose decentralized operations extend to remote oilfields and distant offshore drilling platforms. ‘Field personnel at a rig are often casual about sharing passwords. This creates the potential for unauthorized access to applications and sensitive data files.’ Almost half of the majors use ESSO to combat these and other problems. ESSO also aids compliance with the Sarbanes-Oxley Act and other data protection regulations. More from info@passlogix.com.
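
Stripped to a toy Python sketch (our illustration, not Passlogix’s design), the ESSO principle is a credential vault unlocked by a single master authentication; a production system would encrypt the stored secrets and inject them into application login screens automatically.

    # Toy single sign-on vault: one master secret unlocks per-app
    # credentials. Illustrative only; real ESSO encrypts the store.
    import hashlib

    class CredentialVault:
        def __init__(self, master_password):
            # Keep only a hash of the master secret for verification.
            self._master = hashlib.sha256(master_password.encode()).hexdigest()
            self._store = {}  # app name -> (username, password)

        def enroll(self, app, username, password):
            self._store[app] = (username, password)

        def credentials_for(self, app, master_password):
            if hashlib.sha256(master_password.encode()).hexdigest() != self._master:
                raise PermissionError("master authentication failed")
            return self._store[app]  # the SSO agent replays these

    vault = CredentialVault("one-strong-passphrase")
    vault.enroll("SCADA-console", "jsmith", "s3cr3t")
    print(vault.credentials_for("SCADA-console", "one-strong-passphrase"))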


TrigPoint, Syntag announce heavy duty RFID tags

PromptT from TrigPoint equips Helmerich & Payne’s drilling fleet. Syntag tags Peerless’ chains.

Two companies have announced heavy duty solutions for RFID*-based asset management this month. TrigPoint Solutions is to provide its ‘PromptT,’ oil country-specific RFID tagging solution to Helmerich & Payne International Drilling Company (HPIDC) for asset management on its drilling fleet. PromptT helps ensure compliance with corporate tracking, preventative maintenance, certification and safety procedures. Earlier this year, Ensign US Drilling also signed up with TrigPoint.

TrigPoint combines RFID technology, ruggedized mobile hand-held devices and interactive front and back-end applications in a rapidly deployable, user-friendly system. RFID tags permanently attached to rig assets validate ‘proof-of-presence.’
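
A minimal sketch of the ‘proof-of-presence’ idea (hypothetical Python, not TrigPoint’s API): the scanned tag ID keys into the asset register and the inspection is time-stamped, evidencing that the inspector was physically at the equipment.

    # Hypothetical proof-of-presence logging for RFID-tagged rig assets.
    from datetime import datetime

    ASSETS = {  # tag ID -> asset record (made-up data)
        "E200-3411": {"asset": "Top drive #2", "last_cert": "2008-06-01"},
    }
    inspection_log = []

    def record_inspection(tag_id, inspector, status):
        asset = ASSETS[tag_id]  # an unknown tag raises KeyError
        entry = {"tag": tag_id, "asset": asset["asset"],
                 "inspector": inspector, "status": status,
                 "timestamp": datetime.utcnow().isoformat()}
        inspection_log.append(entry)
        return entry

    print(record_inspection("E200-3411", "jdoe", "pass"))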

In a separate deal, Syntag Manufacturing announced that it is to supply The Peerless Chain Co. with synthetic, durable RFID chain tags. The tags connect items of equipment to computer-driven inspection systems and asset tracking programs. Along with its corrosion-resistant RFID-equipped chain sling identification tags, Syntag offers RFID ‘Identiplates’ for heavy service equipment and machinery.

* Radio Frequency Identification


RigEye IP Video for rig site surveillance

Ping and power are all that’s needed for secure, remote surveillance of drilling and unmanned facilities.

RigEye’s visual communication offering for the oil field is now fully operational. Ping and power are all that is needed to support RigEye’s IP-based video monitoring systems. RigEye claims to be the ‘Swiss army knife’ of rig site video, with applications including mitigating mud losses, near misses, accidents and vandalism. The system also supports remote monitoring of critical operations such as bit inspections and coring. License plate recognition can be used to identify and approve site visitors, and the system avoids costly trips to the field since operators can log in securely with RigEye’s web-based video software. RigEye also has applications in safety training, using archived video to replay accidents and near misses.

RigEye’s oil country-specific videocams offer pan, tilt and zoom functionality and can be installed on onshore and offshore rigs or in refineries and plants. A top drive video system is available for a bird’s eye view of drilling activity.
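
In practice, ‘ping and power’ boils down to reading a standard IP camera stream. A minimal Python sketch using the open source OpenCV library (the camera URL is a placeholder; this is not RigEye’s software):

    # Grab and archive a frame from an IP camera over RTSP (OpenCV).
    import cv2

    cap = cv2.VideoCapture("rtsp://camera.rig.example/stream1")  # placeholder URL
    ok, frame = cap.read()
    if ok:
        cv2.imwrite("snapshot.jpg", frame)  # archive for later replay
    cap.release()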


IBM adds Hubwoo’s e-sourcing toolset to outsourcing offering

Software as a Service (SaaS) e-business hub adds e-procurement to business process outsourcing.

IBM has selected Hubwoo’s SAP ‘source-to-pay’ solutions for integration with its procurement outsourcing offering. Hubwoo’s solution spans spend analysis, e-sourcing, procurement and invoicing, content management and supplier connectivity, delivered from a hosted, multi-client ‘software as a service’ (SaaS) technology platform.

Mark Williams, Hubwoo CEO said, ‘We have already signed our first joint customer—the alliance builds on existing outsourcing agreements between SAP, IBM and ourselves. Our expertise in hosted source-to-pay technology now benefits from IBM’s overall outsourced procurement offering.’

Hubwoo’s trading hub annually processes 4.5 million transactions representing €7 billion in customer spend value. For the first six months of 2008, the company reported revenues of €14.9 million (of which €11.4 million from SaaS) for a net loss of €5.3 million. Williams explained the ‘deep’ first half loss as ‘a consequence of our revenue transition on Trade Ranger customers, and restructuring costs.’ Looking forward, he expects organic growth to return the company to profitability ‘in short order.’ Two thirds of Hubwoo’s SaaS revenues come from its US operations. Hubwoo clients include Total, Shell, EcoPetrol, ConocoPhillips, ENI, Repsol YPF and Statoil.


WorleyParsons, Intergraph, Devon and Canadian megaprojects

Data re-use key to increased adoption of SmartPlant for tar sands megaprojects.

Devon Energy has awarded WorleyParsons Canada an engineering services contract for its Jackfish II 35,000 bpd steam assisted gravity drainage (SAGD) project in Alberta. WorleyParsons’ MEG Division is to provide detailed engineering design services and procurement for Jackfish’s central processing facility. The deal is worth an estimated CDN$32.6 million.

In a separate announcement, WorleyParsons has upped its adoption of Intergraph’s plant management software—particularly the SmartPlant engineering and design solution. WorleyParsons’ director Lindsay Wheeler said, ‘SmartPlant supports global workshare and provides data in a format that customers can re-use, allowing for more efficient design solutions with improved integrity.’

WorleyParsons’ 2,000 oil sands specialists have been involved in over 3,000 oil sands upgrading and extraction projects, from feasibility studies to mega-project construction. WorleyParsons’ ‘EcoNomics’ initiatives target ‘sustainability’ in engineering with, for instance, the replacement of natural gas as fuel for steam and power generation and the application of novel water treatment technologies.


WeatherBug StreamerRT for energy markets

Energy traders to benefit from web-based GIS front end to National Weather Service data.

‘WeatherBug,’ a brand of AWS Convergence Technologies, has announced ‘StreamerRT,’ a real time weather visualization tool targeting the energy market. StreamerRT provides detailed, live weather feeds and GIS mapping. The web-based tool gives energy traders custom views of current and forecast weather conditions likely to impact energy markets.

WeatherBug senior VP sales, John Doherty, said, ‘A small rise in temperature can mean the difference of thousands of dollars for real-time traders in the energy markets. With the ability to quickly assess the impact of weather conditions, instantly and accurately, StreamerRT offers a distinct competitive advantage, enabling energy traders to better prepare and respond with greater precision and performance.’

Live weather data and forecast information are collected and aggregated from the National Weather Service (NWS) and the WeatherBug Network, a national network of professional-grade weather stations and cameras located at schools, public safety facilities and in rural and residential areas throughout the country, including on- and offshore locations along the entire Gulf Coast. StreamerRT also includes severe weather alerts from the WeatherBug Network, such as lightning, heavy precipitation and heat index, in addition to NWS watches and warnings. StreamerRT is described as a ‘true’ web-based application with no downloads or plug-ins required.

