June 2008


BP trials virtual reality

Advanced Collaboration Environment blends physical world with models for remote troubleshooting activities. Technology components come from Qwaq and HP’s Knightsbridge unit.

At the SPE Digital Energy conference in Houston, HP unveiled its Advanced Collaboration Environment (ACE) for Oil and Gas, a virtual reality-based environment that BP has been trialing for a couple of years. HP’s Paul Helm told Oil IT Journal, ‘ACE for Oil and Gas targets the need for collaboration in short bursts—to solve problems by bringing people together. Our Palo Alto R&D center has applied Web 2.0 technology to information rich environments. Some key oil and gas accounts wanted to add VR to their collaborative workspaces to bring everyone involved in drilling a well together.’

Avatars

ACE blends information from the physical world with applications and data. Avatars, simple representations of individuals, inhabit ‘rooms’ whose walls show information from diverse sources. 3D objects—such as a CAD model of a drill bit—are available for discussion. The ACE leverages technology developed by VR startup Qwaq, which was set up to develop software and services around the OpenCroquet project. Croquet, originally developed by IT legend Alan Kay, provides an API that guarantees that distant users of server-less, peer-to-peer systems all see the same information. HP has developed wrappers so that non-Croquet apps can run on the ACE’s walls.
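Croquet’s trick is worth a sketch. Rather than a server shipping scene updates to clients, every peer runs the same deterministic computation on the same ordered message stream, so replicas cannot diverge. A minimal illustration (names and operations invented for the purpose):

```python
# Two peers apply an identical, ordered message stream to identical
# deterministic state: no central server, yet no divergence.
class Peer:
    def __init__(self):
        self.state = {'bit_rotation_deg': 0}

    def apply(self, message):
        # All state changes are deterministic functions of the message.
        if message['op'] == 'rotate':
            self.state['bit_rotation_deg'] += message['degrees']

houston, baku = Peer(), Peer()
for msg in ({'op': 'rotate', 'degrees': 90}, {'op': 'rotate', 'degrees': -15}):
    for peer in (houston, baku):  # same messages, same order, everywhere
        peer.apply(msg)

assert houston.state == baku.state  # both users see the same 3D scene
```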

Azerbaijan

Helm continued, ‘High bandwidth connections between Houston and Azerbaijan have allowed engineers to track the drilling decision process, clearing up ambiguities. Geologists manipulate models in front of drillers and live data streams in from Halliburton feeds. Mikes and stereo headsets allow for user interaction as if everyone were in the same room. Simulators can model physical processes, and messages are passed between applications running across the ACE.’

Nuyens

Qwaq CEO Greg Nuyens told Oil IT Journal, ‘The technology is also used in training. Experts walk through complex processes on the rig alongside new hires to familiarize them with hazardous environments. We have three ongoing knowledge management projects in large energy companies. Croquet provides object synchronization and replicated computations. HP has extended the framework to include a dozen or so vertical applications from Schlumberger and Halliburton.’

Knightsbridge

Helm concluded, ‘The ACE competes with Second Life or Sun’s Darkstar, but this is an enterprise tool with scalability and security that go beyond social networking. What is also key is that we tie all these applications together using techniques developed by our IM Practice (IMP—formerly Knightsbridge Solutions). Applications running in the ACE are synchronized using IMP’s metadata orchestration. We are now working to include VRcontext’s WalkInside product to allow for smart click-through to underlying plant data.’


ISS, Schlumberger team

Exclusive rights to market BabelFish production integration suite granted in $16 million deal—ring-fenced to the oil and gas upstream.

Perth, Australia-based ISS Group Ltd. has sold Schlumberger exclusive rights to market its ‘BabelFish’ production integration software to the upstream. The deal is worth at least $16 million over ‘several’ years—although revenues from maintenance and future sales will likely increase this amount.

Bowman

Schlumberger Information Solutions president Tony Bowman said, ‘In combination with our Avocet production data management, surveillance, analysis and modeling solutions, ISS’ BabelFish web-based visualization framework will give production engineers and managers immediate access to actionable information—to proactively monitor and make adjustments to operations systems.’

Attwell

ISS MD Shane Attwell added, ‘This deal is ring fenced to the upstream oil and gas industry and excludes our midstream and downstream oil and gas activity and the broader manufacturing sector, which has great potential for BabelFish applications.’ Last year, Saudi Aramco kitted out some 380 wells with ISS Group’s technology as a component of its ‘iField’ initiative (Oil ITJ July 2007). Other ISS clients include BG, Hess, Santos, ConocoPhillips and Petronas.


Oil IT Journal welcomes 2008/2009 sponsors

Oil IT Journal editor Neil McNaughton thanks the renewing sponsors of Oil IT Journal’s website—www.oilit.com—and welcomes two newcomers. He then investigates some grand claims made by ExxonMobil’s researchers in a TV ad, concluding that the marketing folks have run amok. In his curmudgeonly fashion he bemoans the encroachment of marketing on science.

It’s that time of the year again—when the oilit.com website sponsors roll over. As in previous years, we have had a great renewal rate—with ten out of our last year’s twelve sponsors signing up for another year. So first I’d like to thank the following companies for their continued support ...

Exprodat Consulting

geoLOGIC Systems

Georex

Geotrace/Tigress

Geosoft

Ikon Science

OFS Portal

Petris Technology

Petrosys

TGS-Nopec/A2D

... and also to welcome our two new sponsors

Neuralog

Paradigm.

Don’t forget to visit these great upstream software and service providers regularly from the www.oilit.com home page.

~

I was watching CNN the other day and saw an ad from ExxonMobil vaunting the merits of its researchers’ efforts. I was surprised when the last ExxonMobil engineer came on and stated that ‘we have found a way of transporting up to 80% more liquefied natural gas (LNG).’ An interesting claim, I thought, as LNG is a liquid and liquids are somewhat incompressible. How could such a claim be made? The secret was undoubtedly in the differential equations that were flying around onscreen as the engineer gave her spiel. They flew by too fast for me to write them down, so I tried another tack.

Efficient

On the ExxonMobil website there is a feedback form which I filled out to ask for more on the discovery. A few days later, the remarkably efficient public relations team gave me chapter and verse on the ‘breakthrough’ technology. I invite you to read the ExxonMobil paper on www.oilit.com/links/0806_1. Bottom line is, to achieve the 80% hike in transported LNG, you build a vessel with 80% more capacity!

Dumbing-down

Well, I didn’t need a differential equation to figure that out. ExxonMobil’s engineers no doubt did a good job in designing the fuel-efficient boat and all that. But somehow the TV ad encapsulates the dumbing-down of science in the media. The ad has a veneer of being educational. But you are not meant to understand anything—least of all the equations, which are complete bull.

Smokescreen

In the landscape of upstream IT there is a similar trend toward dumbing-down. Science hides behind a smokescreen of marketing spiel. While all of this is happening, oils and service companies alike are bemoaning the fact that science grads aren’t what they used to be, that more needs to be put into educating new PEs and so on. But you have to ask—is an ad that portrays science and engineering as something clever that you won’t understand the way to go about it? And is ‘slick’ marketing that tries to sell software like soap powder any better?


Letters to the editor

Feedback on the ‘Data Management 101’ editorial in the April 2008 issue of Oil IT Journal.

The discussion was very interesting. I agree with the fundamental terminology used (master data and metadata). I have learnt this already and propagate these ideas to my clients. Recently, as you rightly said, some vendors have come into the E&P domain and are confusing things with terminology such as ‘data mining,’ ‘data warehouse’ and ‘business intelligence.’

Olabode Ojoade, Schlumberger.

~

I thought this was an interesting piece. What a shame we are here in 2008 and it is still required, but I see why—it is those horizontal vendors messing up our already complicated vertical. Here are a few comments. I find the term master data very misleading and I’m not sure you have clarified it here. You say ‘master data is what ties different data stores together’ and that the concept is confused by ‘the fact that the concept came, not from E&P but from the data warehouse community.’ I think we have an opportunity to clarify and differentiate.

We talk about ‘master tapes’ and remastering. Surely this term means the accepted best copy. Therefore I suggest that raw data is the data as the acquisition company captured it. Master data is the accepted best version, i.e. what the expert produced from the raw version once it was corrected and quality controlled. Reference data is the data that ties different stores together. Wells, licenses, seismic surveys etc. should all have a standard accepted set of reference data values. We can have strings of reference values: a well log referenced to the ‘well name,’ the well to the field, field to license, license to country, country to division of company. Metadata is data about data (ouch, I got punched!) and should contain the reference value if there is one.

Master data management as a discipline has to include the maintenance of both reference and master data. I’d suggest it should also ensure each data type has a ‘data definition’ defining quality, naming, metadata, storage standards and security classification. On a lighter note, I’m currently consulting in Copenhagen—my Danish colleagues here have asked me to point out that words such as obfuscation only add to the obfuscation, but you have no need to worry as they have no pugilistic tendencies!
David Lecore, Schlumberger.
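For readers who like their taxonomies concrete, here is the editor’s own minimal sketch (not Lecore’s) of the proposed distinctions, with invented values:

```python
# Raw data: as the acquisition company captured it.
raw_log = {'curve': 'GR', 'values': [112.0, 118.0, 640.0]}   # note the spike

# Master data: the accepted best version after QC and correction.
master_log = {'curve': 'GR', 'values': [112.0, 118.0, 96.0]}

# Reference data: the standard accepted values that tie stores together.
reference = {'well': '36/7-1', 'field': 'Example', 'license': 'PL001',
             'country': 'Norway'}

# Metadata: data about data, carrying the reference value where one exists.
metadata = {'units': 'API', 'sample_interval_ft': 0.5,
            'well': reference['well'], 'security': 'internal'}
```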


Book Review—‘In Pursuit of the Perfect Plant’

New publication provides recipe for ‘perfection’ in process and manufacturing. But will it work?

There is an old IT joke (we’ve told it before) that goes, ‘How come God only took seven days to make the world?’ The answer: ‘No installed base.’ In Pursuit of the Perfect Plant* (IPPP) is about the corollary: most plants, whether they produce petroleum or paper, include a huge and heterogeneous installed base, much of whose technological ancestry goes back 40 years—and which makes for quite an obstacle in the pursuit of perfection!

What is the perfect plant? For the authors it is one that is capable of reacting to the ‘pull of demand,’ moving faster or slower depending on customers’ needs, and one with a supply chain that can keep pace. Most of today’s plants are ‘big black boxes’ where orders are processed—but where ‘nobody has a clue about what’s going on inside.’ Perfection is about visibility and flexibility, about full capacity, quality, change management and getting the right information in the right place at the right time so that ‘decisions are based in information not instinct.’

Lofty stance?

Cynics might see in this work, authored by thought leaders from OSIsoft and SAP, an attempt to sell software (see below**). But the book has a good bash at taking a loftier stance—perhaps too lofty in fact, as coverage is vast. Dissertations on membership of the local Rotary club, trades union activity and carbon cap and trade may be interesting, but make for just a bit too much scope. A more serious failing is that the book is without an index, an unforgivable omission in the days of digital authoring. The table of contents and layout are also lacking in pertinence. Chapter titles like ‘making progress to perfection’ and ‘making it happen’ were not very helpful for the hard-pressed reviewer. In fact the only way to get anything from IPPP is to read it from cover to cover, which is more or less what we did. The book has a very large number of contributors and authors—although the editorial team from Evolved Media has hidden all this behind a fictional dialog between an executive VP Manufacturing and three ‘analysts’ making up the Perfect Plant Research and Communication team. This makes for quite an easy read—but it is sometimes frustrating not to know exactly who is speaking.

Pitfalls

The dialog includes a lot of advice and caveats—pitfalls in plant revamps abound—investment decisions are often made without input from the plant and systems may be ‘inflicted’ on a plant to solve problems that may not exist. Such unwanted ‘solutions’ can come in the form of software—but also procedures for safety, audits and inspections. Such systems fail in their objective of improving performance and instead ‘cause more work than they’re worth’.

Oil industry ‘inflexible’

The authors note that although the oil industry is in a phase where capacity is ‘sold out at high prices,’ it now suffers from inflexibility (the key to refinery optimization), having blown the last 15 years or so of its margins on ‘compliance and cutting costs.’ The Purdue Reference Model is used to analyze different levels and vintages of systems deployed in a plant—from ERP, through MRP, to enterprise manufacturing intelligence (EMI). Standards coverage gives the impression that the plant world is at the dawn of a new age in interoperability as ‘new standards are just starting to affect the way products are created.’ The commercial message is the promise of service-oriented architecture in the plant. Although here there is a mixed message, as legacy interoperability through OPC is pretty good and is credited anecdotally as ‘doing a better job of sharing information in a plant than web services do in corporate computing!’

IP-based

Elsewhere, legacy network topologies are considered to hold back progress towards the perfect plant. The authors advocate IP-based networks but recognize that this might stress older dedicated machines on the network and introduces the risk of viruses and other malware. ‘Connecting a network to the plant freaks out some of the engineers***.’ But increased visibility—one of the tenets of the perfect plant—requires a ‘converged, common infrastructure.’

Slow loop and fast loop optimization is discussed, along with the thorny issues of data management, Excel hell and the lack of a single version of the truth.

Architecture

A disappointingly slim chapter introduces architecture standards and interoperability. Plant networks were not designed to support a world of ‘any information, any time any place’ but you can’t afford to ‘tear everything out and start over.’ It can cost $17,000 to install a new transmitter in a refinery—and it’s unclear if the data will even be monitored. The remedy is to encapsulate legacy data sources into ‘stateless chunks,’ wrapping legacy PLC, DCS and Historian data and exposing it to ‘mashups.’ Another round on standards and interoperability paints a complex picture of plethoric standards—ISA S95 for ERP and MES, ISA S88 for batch, OAGIS for messages and Mimosa for maintenance.
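What a ‘stateless chunk’ might look like in practice: each request is self-contained, with no session or server-side cursor, so the result can be cached, proxied and combined into mashups. A hypothetical sketch (the historian call below is a stand-in, not a real API):

```python
import json

def read_historian_tag(tag, start, end):
    # Stand-in for a stateful legacy Historian/DCS call.
    return [{'t': start, 'v': 42.0}, {'t': end, 'v': 43.1}]

def tag_history_chunk(tag, start, end):
    # Self-contained request in, self-describing document out.
    samples = read_historian_tag(tag, start, end)
    return json.dumps({'tag': tag, 'start': start, 'end': end,
                       'samples': samples})

print(tag_history_chunk('FIC-101.PV', '2008-06-01T00:00Z', '2008-06-02T00:00Z'))
```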

Intractable?

To sum up, the book’s unusual format is a mixed blessing. It is readable, but rather wordy. Its authors have an axe to grind—retooling the plant to a modern IP based infrastructure. But this is not oversold and there are plenty of caveats and a fair presentation of the subject’s complexity. In fact, there are so many caveats, and the technology ‘sell’ is so soft, that the authors fail in their perfect plant advocacy. This reviewer concluded that the problems of multiple ‘standards,’ interoperability and a lack of visibility across the plant appear to remain rather intractable. But it is certainly worthwhile spending a few hours with IPPP for the insights and contradictions that its assuredly well qualified authors offer.

*ISBN: 978-0-9789218-6-6. More from orders@evolvedtechnologist.com.

**In a recent press release, SAP describes its Perfect Plant strategy to ‘bring together core SAP solutions with the software, hardware and services offerings of ecosystem partners to drive innovation for discrete manufacturers.’

*** A report in the Washington Post this month described how the Hatch Nuclear Power Plant near Baxley, Georgia was forced into a 48-hour emergency shutdown when a computer on the plant’s business network was rebooted after an engineer installed a software update. According to the Post, the operator was investigating cyber vulnerabilities when it was realized that the business network and the plant were communicating, causing the malfunction. Since then, plant engineers have physically removed all network connections between the affected servers.


ImageConnect Oil and Gas hikes geodata offering

DigitalGlobe has announced high resolution imagery for surveillance and infrastructure development.

DigitalGlobe, provider of high-resolution satellite imagery and geospatial information, has announced ‘ImageConnect’ Oil and Gas (ICOG). ICOG provides on-demand access, via GIS and OGC Web Map Services (WMS), to coverage in areas of global oil and gas exploration. ICOG provides high resolution imagery of oil basins, refineries, pipelines and ‘high-interest geological areas.’

Tremblay

DigitalGlobe SVP Marc Tremblay said, ‘By accessing our advanced imagery online, oil companies can visualize acreage, select the best location for infrastructure placement in remote and rugged terrain and monitor facilities and reclamation areas—reducing the time and costs of onsite monitoring and surveying.’

QuickBird

DigitalGlobe operates the QuickBird high resolution satellite constellation and launched its first ‘next-generation’ satellite WorldView-1 this year. DigitalGlobe claims one of the world’s largest image libraries and provides online search and retrieval, production ready image layers, development tool-kits for internet enabled applications and devices, and software solutions for integration with GIS products and services.

Strategy

DigitalGlobe targets high-interest areas around the world to add new images from its sub-meter resolution satellite constellation. By identifying and collecting geographical areas that impact geospatial decisions for the oil and gas community, DigitalGlobe plans to provide imagery solutions that deliver a ‘consistent, accurate, real-world perspective.’ An ImageConnect subscription provides direct connectivity to the library from desktop applications such as ESRI ArcGIS, MapInfo Professional, Autodesk Map 3D or any Web Map Service (WMS) enabled client.
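The WMS route means any client can fetch imagery with a plain HTTP GetMap request. A sketch using the standard OGC WMS 1.1.1 parameters (the endpoint and layer name below are placeholders, not DigitalGlobe’s actual values):

```python
from urllib.parse import urlencode

params = {
    'SERVICE': 'WMS', 'VERSION': '1.1.1', 'REQUEST': 'GetMap',
    'LAYERS': 'icog:basins',               # hypothetical layer name
    'SRS': 'EPSG:4326',
    'BBOX': '-95.5,29.5,-95.0,30.0',       # lon/lat box near Houston
    'WIDTH': '1024', 'HEIGHT': '1024',
    'FORMAT': 'image/jpeg',
}
url = 'https://wms.example.com/wms?' + urlencode(params)
print(url)  # any WMS-enabled client (ArcGIS, MapInfo...) sends the same keys
```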


ffA, Mercury Computing leverage Nvidia’s CUDA technology

Seismic imaging and processing gets performance boost from graphics processing units.

Seismic image analysis specialist Foster Findlay Associates (FFA) has teamed with Nvidia to leverage graphics processing unit (GPU)-based number crunching in its SVI Pro seismic volume imaging and 3D visualization package. Adding Nvidia’s compute horsepower means that FFA can provide a scalable and portable solution for volume-based attribute and object generation on large data volumes.
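FFA has not published its kernels, but the appeal of GPUs here is easy to see: volume attributes are computed per trace or per voxel, independently, which maps directly onto thousands of GPU threads. A CPU-side numpy sketch of one such computation, a windowed RMS amplitude (illustrative only):

```python
import numpy as np

volume = np.random.randn(64, 64, 500).astype(np.float32)  # il, xl, samples

def rms_amplitude(cube, window=25):
    # Each output voxel depends only on a short window of its own trace,
    # so every trace can be processed in parallel, which is ideal for CUDA.
    kernel = np.ones(window, dtype=np.float32) / window
    smoothed = np.apply_along_axis(
        lambda tr: np.convolve(tr, kernel, mode='same'), 2, cube ** 2)
    return np.sqrt(smoothed)

attr = rms_amplitude(volume)
print(attr.shape)  # same geometry as the input cube
```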

Purves

Steve Purves, FFA’s Technical Director, said, ‘The advanced processing capabilities within our SVI Pro and SEA 3D software applications together with the HPC capabilities provided by NVIDIA CUDA-enabled graphics processing units (GPUs), will have a direct impact on hydrocarbon exploration and production by changing the way in which geoscientists work with and analyse 3D Seismic data.’ FFA will integrate NVIDIA CUDA technology with SVI Pro on both Windows and Linux platforms over the next 18 months.

Mercury

In a separate announcement, Mercury Computer Systems’ Visualization Sciences Group is also leveraging CUDA to deliver massively parallel computation and 3D visualization capabilities to E&P applications. Mercury has integrated CUDA with its Open Inventor 3D development toolkit for compute-intensive E&P data analysis, interpretation and simulation—providing ‘interoperability between 3D visualization and the computation on the fly.’ Open Inventor components include VolumeViz LDM (visualization of large pre and post stack seismic data) and ReservoirViz LDM (reservoir modeling and simulation).

Tesla 10

Finally, on the GPU hardware front, NVIDIA has announced the Tesla 10 series with ‘double the precision and performance.’ The top-of-the-range Tesla S1070 provides up to 4 teraflops per 1U system, double precision IEEE 754 math and 16 gigabytes of memory per 1U system. The 1U S1070 will retail at $7,999 when it ships in August 2008.


Total enters TOP 10 high performance computing list

A new 10,240 core SGI Altix ICE provides Total with access to a 2 petabyte seismic data set.

Total has purchased an SGI Altix ICE system with compute bandwidth of 123 teraflops and a petabyte of storage. The machine is to be used by Total’s geophysicists in Pau, France on high-end seismic imaging. The system runs SUSE Linux across 10,240 Intel Xeon cores, each with 2GB memory. Storage comprises 500TB of SGI InfiniteStorage plus another 500TB in a distributed file system based on Sun Microsystems’ Lustre technology. Total’s new machine came in at number 10 on the June 2008 TOP500 list with a LINPACK performance of 106 teraflops.
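The quoted figures imply a sustained-to-peak ratio typical of a well-tuned Xeon cluster; a quick back-of-envelope check:

```python
peak_tflops = 123     # vendor-quoted compute bandwidth
linpack_tflops = 106  # measured Rmax, June 2008 TOP500
print(f'LINPACK efficiency: {linpack_tflops / peak_tflops:.0%}')  # ~86%
```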

Chalon

Total E&P SVP Philippe Chalon said, ‘Drilling cost increases justify significant investment in seismic processing capabilities to better model the subsurface. Total has invested in one of the most powerful computers in the world and will continue adding compute capacity over the next years. Our investment in SGI storage solutions gives us optimal flexibility in accessing and managing up to 2 petabytes of seismic data on current and future exploration prospects.’ More from www.sgi.com/industries/energy.

Comment

It should be noted that the TOP500 is not an exhaustive listing of HPC deployments—in fact this is the first time we have seen an oil company machine in the list. It is therefore likely that there are many HPC installations in oil and gas of the caliber of Total’s new supercomputer. Large machines used in seismic processing houses would likely fill up the top ten if they all entered.


Software, hardware short takes …

MVE, QuickWells, SPT, CD-adapco, Exprodat, Galdos, Landmark, OpenSpirit, PSE, Fugro, Merrick, Perpetuum, CAP-XX, Trango, Badleys and Troika.

Midland Valley Exploration (MVE) is about to ship a ‘pre-release’ version of ‘Move2008,’ a common desktop for its 2, 3 and 4D structural analysis packages.

QuickWells has announced a toolset for the design, procurement and management of advanced completions. All completion data is held in a SQL Server database and a plugin development environment is available.

SPT Group and CD-adapco are to collaborate on optimizing computational flow assurance processes using their OLGA and STAR solvers.

Exprodat has released a new version of its Team-GIS Acreage Analyst, a data independent play fairway and acreage ranking ArcGIS 9.2 extension.

Galdos Systems has announced the free KML Validator on kmlvalidator.com for Google Earth data files.

Landmark is working with OpenSpirit to develop a data connector for its GeoProbe volume interpretation tool.

A new release of Process Systems Enterprise’s gPROMS process modeling suite adds multi-layered flow-sheeting and on-flowsheet results in the form of tables, plots and colored animations of PVT data.

A new release of Fugro-Jason’s Geoscience Workbench includes ‘StatMod,’ a Monte Carlo statistics module developed for Jason’s reservoir characterization consultancy. StatMod combines geostatistics with seismic inversion to ‘reliably quantify uncertainty.’

The 2008 release of Merrick Systems’ eVIN mobile oilfield data capture tool offers route and stop management, capture of daily data, run ticket, and tank volumes and alarms and graphs for problem resolution. eVIN 2008 is used by Dallas-based Exco Resources to manage its 15,000 active US wells.

Perpetuum has teamed with CAP-XX Ltd. on a vibration energy-harvesting micro-generator and ‘supercapacitor’ hardware combo to provide battery-free wireless based condition monitoring of rotating equipment.

Fugro’s Trango unit has updated its Trango Manager to the .Net Framework 2.0 and the integrated GIS mapping module has been upgraded to ESRI ArcObjects. A web interface, Trango Expo is also available for ESRI ArcGIS Server back ends.

Badley Geoscience has announced new features to appear in TrapTester 6 later this year. A ‘TrapAnalyst’ faulted trap delineation and analysis tool completes the fault-seal analysis workflow. The new CubeXPlorer infrastructure facilitates data management and visualization of large seismic data volumes.

Troika has announced ‘Minima,’ an XML-based seismic data utility for managing SEGY and SEGY Rev 1.0. Minima reads SEGY navigation data for merging and workstation data loading. The tool is also suited to managing seismic data across high capacity media, disks and networks.


BabelFish Aspect for web based spatial data integration project

ISS Australia’s data integration environment for the web blends Santos’ GIS data and documents.

Speaking at the ESRI Australia Resources Symposium this month, Grant Eggleton outlined ISS Australia’s work for Santos on the development of a bespoke spatial data integration environment for the web. Santos is using ISS’ BabelFish Aspect (BFA) data integration toolkit to combine geospatial and spatially-referenced unstructured data such as maps, documents, reports and real-time telemetry.

Multi-source

BFA combines information from multiple data-sources into a single, user-defined integrated view. Data types include geological cross sections, seismic sections, wireline logs, documents and ESRI GIS maps. A tree menu facilitates navigation and display. According to Eggleton, the ability to store and overlay information from different systems enhances knowledge management and retention.

Real time

BFA combines GIS data with non-spatial data sources, such as real-time systems and document repositories. Map data, documents and real-time data are selected and BFA combines the information into a single web view. The tool also tracks static and mobile assets such as wells, equipment and vehicles. More from enquiries@issgroup.com.au.


TOP500—Department of Energy’s ‘Roadrunner’ breaks petaflop

Microsoft Windows Server 2008 reaches Number 23 with NCSA’s ‘Abe’ 9,600 core dual-boot machine.

The big news from the June 2008 TOP500 was the breaking of the petaflop barrier by ‘Roadrunner,’ a $100 million ‘hybrid’ supercomputer designed and built by IBM for the US Department of Energy’s Los Alamos National Laboratory. The hybrid tag indicates that Roadrunner’s architecture comprises 6,562 dual-core AMD Opteron chips and 12,240 IBM Cell Broadband Engine processors. The system has 98 terabytes of memory. Computations are routed to the Cell processors. Roadrunner runs open-source Linux from Red Hat.

Microsoft

Also of note was Microsoft’s entry into the Top500—at number 23. NCSA’s ‘Abe’ is a dual boot Linux/Windows Server 2008 machine with 9,600 cores. Running in its Windows mode, Abe was clocked at 68 teraflops. However, despite Microsoft’s grand claims of ‘dominance’ of HPC (Oil ITJ March 2007) its overall share of the TOP500 ‘cake’ is only 1%. We quizzed Mike Showerman of the NCSA as to the relative merits of Windows and Linux in HPC. Showerman said, ‘At the moment, our workloads are separate, so I do not have any relevant measures of comparison. We hope to have some in the future. Even comparing TOP500 runs would be quite misleading due to some significant differences in supporting libraries and optimization efforts*.’

Top 10

IBM held five of the top 10 slots and SGI two (including the Total machine reported on page 4 of this issue), with Sun, Cray and HP sharing the remaining places.

* A criticism that could probably be made of the whole TOP500 effort.


Digital Energy, Houston

The Society of Petroleum Engineers’ ‘Digital Oilfield’ conference heard Don Paul (Chevron) defend oil country IT as outpacing other verticals. Shell and BP report on groundbreaking production surveillance and gas lift support. Panel sessions debate data, standards and living with $130 oil.

Previous SPE-supported events have often touted the idea that the oil industry is a technology ‘laggard.’ Not so according to keynote speaker Don Paul (Chevron), who cited a recent study by the US Council on Competitiveness that found that Tier 1 US energy companies outpaced other sectors. Oil and gas has embedded IT ‘all the way to the front line of the business.’ Chevron had 5,000 terabytes of storage in 2007—growing at an annual rate of 80% for technical and 60% for business data. Paul also analyzed the current supply and demand situation to conclude that demand growth is likely to continue apace—and that for the industry, ‘this is as good as it gets.’

Shell

Tom Moroney described Shell’s ‘next generation’ production surveillance deployed in the deepwater Gulf of Mexico (GOM). Key to the new Production Operations Management Center (POMC) is that it is ‘exception based,’ providing operating limits, drawdowns and performance curves for topsides and subsea equipment. The idea is to understand conditions as they occur. A virtual asset team can be constituted from POMC staff, operations and process engineers. Technology in the POMC includes a portal, advanced alarm tool, workflow automation, dynamic reporting and the knowledge repository. Alarm notifications pop up PI Historian trends and well test data for analysis and ‘situational awareness’—all managed in the workflow engine. After an intervention, the system captures new parameters and performs short cycle optimization before issuing recommendations to operations.
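‘Exception based’ is simply stated in code: compare live values against operating limits and surface only the violations. A minimal sketch, with invented tags and limits:

```python
operating_limits = {  # hypothetical limits per monitored tag
    'tubing_head_pressure_psi': (1200, 3400),
    'drawdown_psi': (0, 900),
}

def exceptions(live_values):
    for tag, value in live_values.items():
        lo, hi = operating_limits[tag]
        if not lo <= value <= hi:
            yield tag, value, (lo, hi)

live = {'tubing_head_pressure_psi': 3520, 'drawdown_psi': 640}
for tag, value, limits in exceptions(live):
    # In the POMC this would pop up historian trends and well test data.
    print(f'ALARM {tag}={value} outside {limits}')
```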

BP

Sergi Sama (AspenTech) outlined model-based optimization on BP’s Azeri field. This uses a Hysys model and facility simulator coupled with well models (Invent/Prosper), a well test database and the IP21 Historian. The optimizer has simplified validation and reconfiguration of the plant as wells come on and off stream. The asset-wide model/optimizer calculates wells, separation trains, compression and pumping trains for onshore and offshore facilities. Business workflows have been ‘canned,’ e.g. first optimize oil volumes, then commercial gas etc. Slight tweaks increased production by 3% (15,000 bbl/day). The optimizer supports condition monitoring, ‘what if’ and engineering studies.

Panel sessions

A couple of interesting panel sessions debated the state of the art. Katya Casey (BHP Billiton) said that interoperability is not the problem it used to be—for instance ArcSDE works well with Oracle Spatial. But there are too many ‘standards’ like SOA where everyone does their own thing! ‘We need coordination down to operating systems, middleware etc.’ The engineering community has huge amounts of data, but is only now thinking about databases rather than Excel!

Tony Edwards (BG) saw ‘confusion’ as to what should be standardized, outsourced and automated. Edwards believes standards should be limited to HSSE, integrity, procurement etc., but that well placement and other ‘creative’ processes should be approached differently. If you put too much in a standardized box you can inhibit creativity.

Steve Fortune (BP) ventured that information management was more important than digital oilfield technology. This was an interesting observation in so far as others, Don Paul included, considered the data management problem as more or less intractable. BP has a parallel IM track in its field of the future program. IM is ‘a massive problem that requires a level of engineering input that is not there.’ BP has re-introduced document controllers and data managers—roles that went out in the low oil price days. This was a big piece missing from the organization that has now been brought back in. Fortune also noted that the digital oilfield was moving from pilots into ‘full scale, value generating deployments.’

Don Moore (Oxy) said that the oil and gas industry doesn’t get as much credit as it should for getting to where it is today. A decade ago, the young folks from the technology side were telling us ‘you oil and gas guys just don’t get it!’ Then the tech bubble burst. ‘Today you should be encouraged how fast the gap closed. And this with far less people.’

Paul noted that high oil prices have a direct impact on operations. ‘For Chevron, every barrel not produced must be bought. At $130/barrel, this means direct financial consequence for every shortfall. Environmental liabilities are no small deal. We are constantly observed by the government and NGOs. Hence the need to predict and prevent.’

Washington Salles explained that Petrobras’ approach is to blend in-house solutions and service providers with an integrated database for geology and production data—this is being extended to include real time digital oilfield data.

Shell Brunei

Ron Cramer (Shell Global Solutions) described gas lift optimization on Shell Brunei’s Champion field. Gas lift involves injecting gas into the casing so that it lightens the oil on its way back up the production string. The technique is amenable to optimization and is now considered a classic brownfield operation. Gas lift optimization (GLO) the old way involved awkward setting of downhole valves in the well. In 2003, fluid control valves and smart multi-variable flow transmitters with onboard diagnostics were installed. These are now linked to twin FieldWare/Production Universe systems, one for monitoring and one for optimization.

For Cramer, the key thing is to ‘get your instrumentation good and keep it good.’ Shell could not afford multiphase flow meters in each well at around $500,000 each! Multi-rate well tests were used to build a ‘data driven’ model—deemed ‘more sustainable’ than a physical model. These real time estimates were then used to optimize gas lift. A configurable objective function sits on top of Production Universe (PU), which can be set to maximize production, minimize OPEX or target other KPIs. PU is not a control system—it provides ‘advisory’ set points when something changes. Today the loop is not closed because of lack of confidence and the fact that operators are there anyhow.

Even though the system is not automated, it is ‘miles better than the old system,’ when it might take weeks to re-optimize after a compressor went down. Gas injected now exceeds the historical figure by 20-30%. There is still potential for improvement—especially in saving gas (less gas equals less compression). Gas injection is now controlled from the surface with far fewer wireline operations. The business benefits include reduced gas lift fees (40% down on one platform) and a sustained 20% production hike.
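To make the ‘configurable objective function’ idea concrete, a hypothetical sketch: a data-driven well response (standing in for PU’s model fitted to multi-rate well tests) under an objective that can be switched between KPIs:

```python
def well_oil_rate(gas_lift):
    # Stand-in for a data-driven model fitted to multi-rate well tests:
    # oil rate (bbl/d) as a function of gas injection (MMscf/d).
    return 1000 + 400 * gas_lift - 60 * gas_lift ** 2

def objective(gas_lift, mode='max_production', gas_cost=25.0):
    if mode == 'max_production':
        return well_oil_rate(gas_lift)
    # 'save_gas' trades oil against compression/injection cost.
    return well_oil_rate(gas_lift) - gas_cost * gas_lift

# Scan candidate injection rates and report an *advisory* set point.
best_value, best_rate = max((objective(q / 10), q / 10) for q in range(60))
print(f'advisory gas lift set point: {best_rate:.1f} MMscf/d')
```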

Exhibitors

Many exhibitors turned out to be Chevron’s suppliers. Coreworx was showing its engineering document management, workflow management and collaboration toolset. Chevron has 100,000 documents and 5,000 active workflows stored in a single Coreworx project covering worldwide projects.

EPSIS’ Realtime Assistant (ERA) is a component of Chevron’s Master Schedule View, which plugs in to Chevron’s Minerva data infrastructure, pulling together work order data from real time sources. The ERA Visual component provides visualization of key information from the field including terrain model, sea surface, sea bottom, geological horizons and well trajectories. 3D icons show objects of interest such as buildings, constructions, equipment and vehicles. These are connected to data sources to track vehicles, vessels and sensors/alarms.

IOCOM (formerly InSors) supplies Chevron with collaboration room software. The IOCOM Grid runs Windows across conference rooms, command centers, laptops and tablet PCs. A single server supports up to 20 simultaneous connections in multiple meetings for collaboration via VoIP/SIP and traditional video conferencing.

Other exhibitors included Credant Technologies, whose Mobile Guardian provides centrally managed, policy-based mobile data security and management for data on laptops, desktops and mobile devices. Infonic’s Geo-Replicator plugs a gap in Microsoft’s offering with replication of SharePoint content across remote sites, mobile workers and low-bandwidth networks.

Optelligent Solutions’ OSViz provides data visualization and analysis of production, injection, reserves and decline trends. The software or service solution can be coupled with IHS or proprietary data sets, which are presented spatially and with time series animation. The tool was originally developed for Canadian oil company Esprit Energy Trust, where it was used to optimize fluid flood.

P2 Energy Solutions has re-designed its Tobin LandSuite, now called Enterprise Land. A land ‘DataMart’ decision support tool uses technology from Informatica and Hyperion to provide reporting and analysis tools for drill down.

A shiny truck in the parking lot turned out to be Cisco’s Network Emergency Response Vehicle (NERV), a mobile communications and command center for disaster management. The system provides instant voice, video and data communications. Cisco’s IPICS technology allows disparate radio systems to communicate with each other, with TelePresence, video surveillance, Wi-Fi, satellite communications and IP telephony on board. The NERV was used during last year’s San Diego fires to patch fire and Sheriff’s radio systems. The DSS satellite dish brought in television news, which was then encoded to Windows Media Player for the Sheriff’s PCs.


IBM’s ‘Pulse’ conference hosts Maximo for Oil and Gas users

UK KP3 report shows North Sea assets in poor condition. BP’s goal—‘never another Texas City.’

IBM’s new ‘Pulse’ conference was held last month for its combined Tivoli and Maximo customer base. A session on Maximo in oil and gas brought together users from clients including ADCO, Petroci, OMV, BP, Shell and Repsol. IBM’s Terry Ray explained the need for maintenance, repair and operations in oil and gas, quoting a report from the UK regulator which stated, ‘More than half of the oil and gas industry’s assets in the North Sea that have been inspected over the last three years are in poor condition, and companies will face closure or prosecution if they do not improve safety standards. Safety related incidents had occurred because of the poor upkeep of basic structures and some maintenance backlogs were unacceptable.’ The ‘KP3’ report noted that ‘senior managers are not making adequate use of integrity management data and are not giving ongoing maintenance sufficient priority.’ A new theme for Maximo is the digital oilfield, described as a ‘step change in available data and visualization techniques.’ Reference was also made to IBM’s Integrated Information Framework (IIF) and its smorgasbord of supported standards.

Lifecycle management

Ed Popko’s paper demonstrated the benefits of integrating maintenance data with product lifecycle management (PLM). This provides manufacturers and owner/operators with a shared view of key information and automated processes such as repair vs. replace decision support, alternate parts and disposal and handling procedures. A demo showed the combined use of MatrixOne (Dassault Systèmes) and Maximo to analyze a faulty pump on an oil platform. A WebSphere-based integration automatically notifies engineering that there is a recurring problem with specific asset types, helping to reduce warranty costs and improve product support. These techniques are being field tested by both Halliburton and Schlumberger.

BP Anadarko

Paul Millburg (BP) described Maximo’s work management functionality as fitting BP’s safety and operational integrity goal of ‘never another Texas City.’ BP has 38 instances of Maximo worldwide, ‘directly affecting’ some 10,000 staff. These customized systems are proving expensive to maintain and upgrade and BP is looking to consolidate its operations and maintenance activities at the enterprise level with a global work management solution using the latest Maximo technology. A multidisciplinary team built a work management template and toolkit and BP has been working with IBM to influence the development of the Maximo tool and the oil and gas solution. Both template and toolkit have been tested in a pilot at BP’s Anadarko basin unit. The results were very positive—zero incidents (due to increased visibility of work), a $220,000 annual OPEX saving, a 20% decrease in unscheduled activity and a tenfold increase in documented work. For BP, ‘Maximo is a way of life.’ The 38 instances in E&P today will be consolidated to five and moved to one global Maximo standard.

Turnaround tracking

Mike Sims (Placid Refining) described turnaround tracking in Maximo. A turnaround involves the shutdown of a plant or section of a plant for inspection or upgrade. The aim is to achieve as much work as possible in a tight time window to minimize lost production. Turnaround challenges include managing a large number of outside contractors and temporary labor, making effective use of equipment, handling unplanned work and managing a large amount of materials that are needed in a short time frame. A significant data collection and organization effort is a prerequisite and data needs to be kept up-to-date in real time. Training is critical—but ‘elegant solutions are only as good as the people using them.’ Maximo partner Electronic Data Inc. was integrator for the project.


Folks, facts, orgs ...

Movers and news this month from API, Absoft, Aker, Cortex, CRA International, CSM, dGB, Badleys, CMG, HTC, Fugro, Marathon, Information Logic, Paras, MicroSeismic, Ryder Scott, Utilipoint, SensorTran, SMT, Techsia and WellPoint Systems.

Jack Gerard, currently president of the American Chemistry Council, is to replace the retiring Red Cavaney as president and CEO of the American Petroleum Institute.

Don Valentine has been named leader of Absoft’s new oil and gas business practice.

Greg Ross is MD of Aker Solutions’ new subsea business in Perth, Australia.

Following a request from the Alberta Securities Commission, Gord Herman has resigned from Cortex’ board of directors.

CRA International is expanding its chemicals and petroleum practice with the addition of Bob Peterson as VP. Peterson was previously with Schlumberger Business Consulting.

Roel Snieder is the new director of the Center for Wave Phenomena at the Colorado School of Mines.

Chris Collacott is to head up Deloitte’s new Petroleum Services bureau in Sydney, Australia.

Layton Payne has been named director of marketing at dGB-USA. Payne hails from Fugro-Jason. Otelindo Medina and Farrukh Qayyum have joined dGB as geoscientists. Farrukh was previously with Schlumberger. Nageswara Rao has joined dGB’s software team in Mumbai and Venice Kostandy is now office manager for dGB-USA.

Pete Boult is now the official representative of Badley Geoscience in Australia.

Computer Modelling Group has named Dan Dexter as VP Canada and Marketing and Jim Erdle as VP USA and Latin America. Ron Kutney is now VP Eastern Hemisphere in Dubai.

Deena Carstens has joined the Houston Technology Center as director of marketing and communications. Carstens was previously with Halliburton.

Edward Cherednik is now business manager for Fugro-Jason’s Central and Eastern European unit. Cherednik was previously with Landmark Graphics.

Linda Capuano has been appointed VP Emerging Technology with Marathon. Capuano hails from Solectron Corp.

Matthew West has been named Principal Consultant with Information Logic. West was formerly with Shell.

Ned Voelcker and Daniel Robinson have joined Paras’ team of consultants. Voelcker hails from Chevron, Robinson from IQPC.

Cameron Crook is to head-up MicroSeismic’s new Calgary office. Crook was previously with Shell Canada.

Victor Hein has joined Ryder Scott as a senior petroleum engineer from Albrecht and Associates. Scott Quinell has also joined the company as petroleum engineer. He was previously a reservoir engineer at GLJ Petroleum Consultants.

Martin Dunlea, formerly CIO at Bord Gais Éireann, has joined Utilipoint as VP Consulting, Europe.

Sensornet CEO Neale Carter is now chairman of Flotech Ltd.

Mikko Jaaskelainen is now CTO of SensorTran, he was previously with PGS.

Bill Stephenson has been named VP sales with SMT, he was previously with Symantec.

Jacques Ita and Ibrahim Al-Quseimi have joined Techsia as petrophysicists. Al-Quseimi was previously with Shell’s PDO unit.

WellPoint Systems has appointed Richard Slack as COO. Slack was previously with the recently acquired Bolo Systems.

LAS Reader

‘I just found a request for my LAS Reader software in your November 2006 edition. If anyone is still interested in LAS Reader it is available from http://www.seismatters.com/LAS Reader.html.’ Ian Vincent.


Done Deals

Schlumberger, Decision Strategies, TGS-Nopec, Quest, Acorn, Aker, Fugro, HSE, Boots & Coots, IDS, Global Energy Services, Perficient, Roxar, RPS Group, Sense EDM, Triple Point, Invensys, Petrosys, ION and KBR.

Schlumberger has acquired Integrated Exploration Systems (IES). IES’ Aachen, Germany location will become a Schlumberger center of excellence for petroleum systems modeling.

Decision Strategies is to acquire oil and gas consultants RMI, in what is described as the first of several planned acquisitions for 2008.

TGS-Nopec has acquired the staff and assets of Center Line Data Corporation. The purchase price included approximately US $5 million in cash and 30,600 shares of TGS.

Quest Energy Services (a subsidiary of Al-Qahtani Marine & Oilfield Services) has acquired the outstanding common shares of Production Enhancement Group.

CGGVeritas’ manufacturing unit Sercel has acquired Metrolog, a provider of high pressure, high temperature gauges and downhole instrumentation.

Acorn Energy is to acquire Software Innovation—developer of the Coreworx tool for capital project collaboration.

Aker Solutions has signed a contract with Shell Deutschland for engineering services on refineries in Germany. The five year contract is worth around €100 million.

Fugro has acquired satellite mapping specialists NPA.

Halliburton is to acquire the remaining 49% of its WellDynamics joint venture with Shell Technology Ventures.

HSE Integrated and Boots & Coots International Well Control are to create a jointly owned company, Boots & Coots HSE Services, to provide oilfield safety services.

Independent Data Services has partnered with Global Energy Services to offer a ‘seamless, integrated hardware/software package for well site rig activities.’

Invensys has made it back into the FTSE 100 Index. The company’s classification has moved from ‘Electronic Equipment’ to ‘Software.’

Perficient has announced an expansion of its credit facility with Silicon Valley Bank and KeyBanc Capital Markets of up to $75 million. The new line of credit permits stock buybacks of up to $50 million.

Petrosys has appointed Fugro-Jason as business partner for the Commonwealth of Independent States (CIS). The company has also signed an agency agreement with Reservoil, an oil country supplier on the Indian subcontinent.

Global Geo Services has spun-off the majority of its seismic business to a wholly owned subsidiary, Spectrum ASA—valued at MNOK 275.

Roxar reports a software sale to a ‘major operator in the Asia Pacific region’ with a $10 million value over three years.

RPS Group has acquired the Geocet consultancy in a $2.3 million cash transaction.

Sense EDM is to acquire Wellquip for NOK 70 million cash plus a contingent earn-out of NOK 20 million.

Triple Point Technology has acquired Investment Support Systems, a supplier of treasury management and regulatory compliance solutions.

ION (previously Input/Output) has teamed with Moscow-based Largeo to offer seismic data processing services to the Russian market.

KBR and its Granherne and GVA units have been awarded a front end engineering and design contract by BP for offshore projects across the globe.


XML Converter update set to ‘encourage standard adoption’

DataDirect’s bi-directional Edig@s to XML converter opens up gas transmission data.

Progress Software Corp. unit DataDirect Technologies has released a new version of its DataDirect XML Converter, with expanded support for the European Edig@s standard. Edig@s is a protocol for operational data and messaging between gas dispatching centers, based on the EDIFACT e-commerce standard. DataDirect’s XML Converters convert EDI, flat files and other legacy data formats to XML, making data accessible to virtually any application. Java and .NET components convert legacy data formats into XML and back without retooling Edig@s members’ applications.
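The conversion itself is mechanical: EDIFACT segments split into elements on ‘+’ and components on ‘:’, which map naturally onto nested XML. A generic illustration in Python (not DataDirect’s API):

```python
import xml.etree.ElementTree as ET

def segment_to_xml(segment):
    # Turn one EDIFACT-style segment, e.g. 'QTY+220:5000', into XML.
    tag, *elements = segment.split('+')
    seg = ET.Element(tag)
    for i, element in enumerate(elements, start=1):
        el = ET.SubElement(seg, f'E{i:02d}')
        for j, component in enumerate(element.split(':'), start=1):
            ET.SubElement(el, f'C{j:02d}').text = component
    return seg

print(ET.tostring(segment_to_xml('QTY+220:5000'), encoding='unicode'))
# <QTY><E01><C01>220</C01><C02>5000</C02></E01></QTY>
```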

Innocenti

DataDirect’s Carlo Innocenti said, ‘Our converter insulates an application from the details of the EDI message so that developers can quickly integrate EDI messages into downstream systems. Simplified support for Edig@s should encourage adoption of the standard and enhance interoperability.’ DataDirect notes the ‘persistence’ of the EDI standard for data exchange and views its bi-directional converters as providing a ‘best of both worlds’ solution that shields an organization from having to choose between XML and EDI.

IT infrastructure

Innocenti added, ‘Gas distributors now have a way to consume an EDI message and manipulate the information as XML, which is easily handled by the rest of the IT infrastructure. Documents and databases can be pulled together and converted to an Edig@s message and sent to other parties involved.’


IDS Datanet teams with Global Energy Services on rig IT

Single software and hardware solution for rigsite data acquisition and reporting announced.

Adelaide, Australia-headquartered IDS is teaming with Global Energy Services Ltd. to create a single software and hardware solution for rig sites operating in the Western Canadian oil patch. The deal combines IDS’ flagship DataNet package with Global’s hardware and communications solutions for the rig site.

DataNet

DataNet is a suite of reporting tools for the upstream oil industry. The web-based system is claimed to facilitate data acquisition, management and sharing over the web. DataNet’s tools support drilling project life-cycles from concept to decommissioning.

Global

Global adds hardware and rig site communications solutions to the mix. Recently, IDS signed an internet connectivity and hosting relationship with NTT Singapore—extending the reach of its communications infrastructure across the Asia Pacific region.


Saudi Aramco views billion cell model with OcTreemizer

Fraunhofer’s volume rendering tool supports move from mega to ‘gigacell’ simulator.

Writing in the Spring 2008 issue of Saudi Aramco’s Journal of Technology, Ali Dogru (head of computational modeling at Aramco’s E&P Advanced Research Center) described Aramco’s progress from mega cell to giga cell simulators. The current version of Aramco’s POWERS fluid flow modeler has already been used to study a 258 million cell model of the Ghawar field—the largest in the world. Models run on a 256-node Linux AMD cluster.

Gigacell

The next generation POWERS II is to target gigacell models. Early testing of the concept is investigating visualization of billion cell models using Fraunhofer’s OcTreemizer, a volume rendering toolkit for roaming through very large volume data sets on standard PCs. Aramco is planning a dedicated ‘virtual reality (VR) room.’ Dogru argues that VR is underutilized in oil and gas as compared with other industries. Aramco plans to interact with huge data sets using sound and ‘immersive’ technology. Again, Fraunhofer is involved, supplying novel technologies such as autostereoscopic displays with gesture control and haptic devices. Such force feedback tools might bring a new meaning to the expression ‘massaging the data!’
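The octree idea behind tools like OcTreemizer: store the volume as a hierarchy of bricks, with coarse averages for distant regions and full resolution only where the viewer is roaming. A toy sketch of the subdivision (not Fraunhofer’s code):

```python
import numpy as np

def build_octree(vol, min_size=32):
    # Each node keeps a coarse summary; leaves keep full-resolution bricks.
    node = {'mean': float(vol.mean()), 'shape': vol.shape, 'children': []}
    if max(vol.shape) > min_size:
        zs, ys, xs = (s // 2 for s in vol.shape)
        for z in (slice(0, zs), slice(zs, None)):
            for y in (slice(0, ys), slice(ys, None)):
                for x in (slice(0, xs), slice(xs, None)):
                    node['children'].append(build_octree(vol[z, y, x], min_size))
    return node

tree = build_octree(np.random.rand(128, 128, 128))
print(len(tree['children']))  # 8 child bricks per subdivided node
```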


CERA Upstream Capital Costs Index up 6% in 6 months

Costs have doubled since 2005. For Dan Yergin, ‘a new fundamental driving the oil price.’

The latest Upstream Capital Costs Index (UCCI) published by IHS unit Cambridge Energy Research Associates (CERA) shows costs have risen by 6% in the last six months and have doubled since 2005. CERA chairman Dan Yergin said, ‘Rising costs have become one of the new fundamentals driving the oil price. This is a serious concern and a major challenge for oil and gas companies and is leading to delays and postponements of many projects.’ Exchange rate fluctuations and the weakening dollar also contribute as do rising costs for raw materials and transportation. Specialized deepwater equipment showed the largest increase of any area on the index. Costs vary regionally with highest rises in areas of high activity including the Middle East, West Africa, South America and Australia. Track the UCCI on www.ihsindexes.com.
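Illustrative arithmetic only, to put the two quoted numbers on a common footing:

```python
doubling_years = 3  # 'doubled since 2005', 2005 to mid-2008
print(f'implied average annual inflation: {2 ** (1 / doubling_years) - 1:.0%}')
print(f'latest six-month rise, annualized: {1.06 ** 2 - 1:.1%}')  # ~12.4%
```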


eDynamic retail franchise management for Suncor

Application management system handles surge in demand for North American gas stations.

Suncor Energy Products has deployed eDynamic’s franchise application management system (FAMS) to handle the increasing number of applications for franchises of its gasoline retail operations in North America. FAMS is a customized version of eDynamic’s Apps-In-Motion System (AIMS), a hosted solution for workflow automation. Suncor’s business was expanding rapidly, with some 40 applications per day from potential franchisees. Screening applications was ‘cumbersome,’ so Suncor contracted with eDynamic for an application to manage the whole selection process, reducing unproductive ‘paper processing’ work.

DeRusha

Suncor’s Linda DeRusha said, ‘FAMS has helped us focus on the applicants that have the appropriate profile and experience. It has made the whole recruitment process much more manageable, and time efficient.’ Following deployment, the franchise recruitment team was reduced from four to one and Suncor has expanded its reach at a ‘much reduced’ cost. New York headquartered eDynamic is a global IT and outsourcing consultancy.


Hyperion simulates GASCO expansion project

High fidelity dynamic simulator for Habshan Gas Complex for design and operations.

Hyperion Systems Engineering has completed an ‘engineering-grade’ dynamic simulation study of the Abu Dhabi Gas Industries’ (GASCO) Habshan Gas Complex expansion project on behalf of prime contractor Fluor Corp. The study verified the new compressor anti-surge protection systems and the performance of the load-sharing and control systems. The simulator was also used to check equipment prior to commissioning.

High fidelity

The study investigated the performance of the new plant and the potential impact on existing equipment. Sixty scenarios were studied and recommendations made for control system changes to optimize overall plant design. A ‘high-fidelity’ dynamic model of the new and existing compressor trains was used to study both summer and winter operating conditions.

Flood

Fluor project engineer Steve Flood said, ‘Dynamic studies help predict how the new and existing machines will interact and allow anti-surge and load sharing considerations to be taken into account during the design stage of the project. The predictions of the dynamic simulations also provide valuable reference data for start up and commissioning of the machines.’


Standards Stuff

Energistics, PPDM, ISA 100, PIDX, Open Geospatial, Workflow Management Coalition, PODS.

Long time sparring partners Energistics (formerly POSC) and the Public Petroleum Data Model Association (PPDM) have made-up with the signing of a memorandum of understanding (MOU) covering reciprocal memberships and ‘areas of cooperation.’ The organizations plan to develop connectors for ‘seamless data transfer’ between Energistics’ ProdML production data exchange and PPDM’s E&P Data Model.

~

The ISA100 standards committee on wireless systems for automation has created a new subcommittee to address options for convergence of the ISA100.11a and WirelessHART standards. The subcommittee’s goal is to merge the best of both standards into a single converged subsequent release of the ISA standard. ISA100 committee co-chair, Pat Schweitzer of ExxonMobil, said, ‘This new subcommittee is the next logical step in helping industry achieve the significant benefits of wireless technology.’ The ISA has also set up a Wireless Compliance Institute to develop compliance programs for ISA100 standards. Institute members include Shell, Chevron, Honeywell, Invensys and Yokogawa.

~

Speaking at the Energistics public meet in Houston last month, Lars Olav Grøvik lent support to Energistics’ WITSML standard for rigsite data exchange. Statoil’s experience in Norway shows that WITSML has solved many of the problems with the old binary WITS standard. WITSML is a great asset for real time drilling and i-field initiatives. StatoilHydro is now ‘100% WITSML’ and the protocol is considered ‘vital for the i-field,’ where the idea is to ‘get the best of automation and human quality control.’ Shell’s Omar Awan offered an equally enthusiastic if slightly less bullish account of WITSML uptake in Shell. WITSML is used today and is delivering value, but not yet for all wells nor for all data flows. There remain untapped opportunities for use in real time validation, model optimization and automated reporting. Shell’s global standard architecture requires WITSML enabled/certified software solutions. Stricter compliance is required to improve uniformity of WITSML implementation in solutions before they displace legacy WITS in Shell.

~

The PIDX Classification Workgroup has released version 5.0 of its Product Classification Templates. The template count now stands at nearly 4,000, thanks to some major corporations that converted their entire Material Master to the PIDX classification. The PIDX review and approval process has been streamlined from 180 days to 21.

~

The Open Geospatial Consortium and the Workflow Management Coalition have signed a memorandum of understanding to cooperate in advancing standards-based, interoperable workflow and web-enabled geospatial content sharing, modeling and visualization.

~

The Pipeline Open Data Standard Association’s (PODS) workgroup has approved the joint PODS/NACE External Corrosion Direct Assessment (ECDA) integrity data exchange standard. The standard is a data interchange database structure for standardized integration and reporting of ECDA data within the pipeline industry; until now there have been no standards for the submission and management of ECDA data. Meanwhile the PODS community is wrestling with the proposed future transition to GUIDs and the likely impact on database performance.
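
The worry is not PODS-specific: B-tree indexes pack sequentially assigned keys into the same pages, while random GUIDs scatter inserts across the whole index. A minimal illustration using SQLite from Python (a generic sketch, not a PODS benchmark):

    # Minimal sketch: random GUID keys vs. sequential keys of the same length
    # in a B-tree index (generic illustration, not PODS-specific).
    import sqlite3, time, uuid

    def timed_insert(key_factory, n=100_000):
        db = sqlite3.connect(":memory:")
        db.execute("CREATE TABLE t (id TEXT PRIMARY KEY, payload TEXT)")
        t0 = time.perf_counter()
        db.executemany("INSERT INTO t VALUES (?, 'x')",
                       ((key_factory(i),) for i in range(n)))
        db.commit()
        return time.perf_counter() - t0

    # Both key styles are 32 characters, so only insert ordering differs.
    seq = timed_insert(lambda i: f"{i:032d}")        # appends to same pages
    rnd = timed_insert(lambda i: uuid.uuid4().hex)   # scatters across the tree
    print(f"sequential {seq:.2f}s, random GUID {rnd:.2f}s")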


Oil Search reports successful leak detection test

EnergySolutions’ PipelineManager provides leak warning and accurate leak location.

Oil Search, operator of all of Papua New Guinea’s oil and gas fields, has deployed and tested EnergySolutions’ PipelineManager package. In two tests, Oil Search received a warning within seconds of the test valve being opened and a full alert within minutes. A location was determined within 10 minutes, accurate to 0.44 km in one leak test and 0.67 km in the other.
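
EnergySolutions does not disclose PipelineManager’s algorithms, but a classic way to locate a leak is negative pressure wave timing: the pressure drop travels from the leak towards both ends of the line at roughly the speed of sound in the fluid, and the difference in arrival times fixes the position. A minimal sketch in Python with invented numbers:

    # Minimal sketch: locating a leak from the arrival-time difference of the
    # negative pressure wave at the two ends of a pipeline (invented figures).

    def leak_position(length_km: float, wave_speed_kms: float, dt_s: float) -> float:
        """Distance of the leak from the upstream sensor.

        dt_s is (upstream arrival - downstream arrival); the wave travels
        x/a to one end and (L - x)/a to the other, so x = (L + a * dt) / 2.
        """
        return (length_km + wave_speed_kms * dt_s) / 2.0

    # Example: 60 km line, ~1.0 km/s wave speed in crude, wave reaches the
    # upstream sensor 20 s before the downstream one.
    print(f"Leak at ~{leak_position(60.0, 1.0, -20.0):.1f} km")  # ~20.0 km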

Webber

EnergySolutions CEO Jo Webber said, ‘Oil Search was experiencing an unacceptably high rate of false alarms with its custom-built system and looked to us for a field-proven off-the-shelf solution. PipelineManager meets the requirements for determining the severity and location of pipeline leaks without false alarms.’

Kayess

Oil Search Project Manager Lara Kayess said, ‘PipelineManager’s leak detection function provided the sensitivity, robustness, reliability and accuracy we were after. PipelineManager provides significant improvements over our previous rupture detection system.’ More, including a case study for download, from www.energy-solutions.com.


Honeywell simulates for Suncor Voyageur expansion

Operator training simulator for Northern Alberta upgrader has dual use in plant optimization.

Suncor Energy has awarded Honeywell the contract for the operator training simulator on its Voyageur oil sands upgrader in Northern Alberta. Honeywell’s UniSim technology will be a key component of Suncor’s ‘operational readiness’ initiative covering staff training and safe plant start-up. The Voyageur upgrader will process bitumen into crude oil with a targeted production capacity of 200,000 barrels of oil per day by 2012.

Macleod

Honeywell’s regional general manager Tom Macleod said, ‘Operator training is at the heart of plant safety, reliability and efficiency—all of which are critical for a successful startup and continuing operations at greenfield facilities such as Voyageur. Preparing operators for potential problems mitigates the risk of lost production and catastrophic accidents.’

Optimize

The simulator can be used beyond its training role to accelerate time to first oil and, once the plant is operating, to further optimize overall production. Suncor pioneered bitumen extraction and refining in the late 1960s and is still the single largest investor in Canada’s oil sands industry. Northern Alberta’s oil sands are the world’s second-largest oil reserve, after Saudi Arabia’s.


Shell’s ‘Downstream One’ reference data architecture

Paper by Matthew West outlines data infrastructure for Shell’s globalization initiative.

Speaking at the Vega Group’s Integrated Enterprise Architecture Conference in London earlier this year, Shell International’s reference data architecture and standards manager, Matthew West, presented a seminal paper on Shell’s downstream information management. The context of West’s presentation is Shell’s ‘Downstream One’ initiative to globalize Shell’s downstream business around a single set of processes and systems. Downstream One aims for more accurate and responsive customer interactions, the elimination of errors and rework, lower costs through less ‘noise’ in business processes and, in general, ‘proven and simpler’ ways of doing things. It also aims to reduce the number of operational systems by over 90% and to leverage consistent reference data, a critical element of business integration according to West.

Daratech

Citing a Daratech survey, West broke out the relative cost components of an information system. Software, hardware and systems integration weigh in at around 10% each of overall project cost. Training accounts for another 20%. But the lion’s share, the remaining 50%, is accounted for by, you’ve probably guessed, data costs. West then discussed the key contributions to quality data and how companies can self-evaluate their information management in terms of an IM maturity model.

Staged improvement

IM maturity is a measure of the quality of information management in the enterprise. West stressed that ‘you cannot leap from having poor information management to having great information management in one go.’ A staged approach is required with consolidation at each level before moving on. Fortunately, each stage can deliver incremental business benefits. Shell is working towards information nirvana by first putting the IM Landscape infrastructure in place and using it. The next step is to change ‘practices and attitudes’ to enterprise information to assure take-up.

Nirvana

West, using an approach developed by data warehouse guru Larry English, outlined the different stages of IM maturity. Nirvana, the level at which Shell is or will be a world leader, supposes that management regards IM as an essential part of the enterprise, that change management processes are in place and that the enterprise architecture is routinely used to address information quality issues as they arise. West outlined Shell’s roadmap to IM nirvana through ‘recognizing, specifying, managing and on to optimizing.’ Key enablers in the process are reference data, the corporate data model, data quality and standards. For West, IM maturity is a powerful tool to assess where you are and what you need to do next in building your information management landscape. Enterprise architecture is a key part of the landscape required to improve information quality.


Hess goes on IT shopping spree

Infrastructure and applications software deals with Paradigm, Schlumberger and IBM/Cognos.

Hess has been out shopping for IT infrastructure and application software. An agreement with Paradigm provides Hess’ upstream division with enterprise access to Paradigm’s Epos database and infrastructure along with Geolog (petrophysics) and Focus and GeoDepth (seismic processing and imaging). The ‘multiyear’ contract includes training and consulting services. Scott Heck, Hess’ SVP of E&P technology, said, ‘This new relationship with Paradigm is an investment in optimizing our petrotechnical software toolkit.’

Schlumberger

Hess has also signed a deal with Schlumberger for application software, data management services and workflow support. This agreement adds GeoFrame (reservoir characterization), Petrel (interpretation), ECLIPSE (fluid flow modeling), Merak (economics), OFM (production) and Drilling Office.

Cognos

Meanwhile, Hess downstream has signed with IBM/Cognos for the provision of business intelligence software. Hess is to standardize performance management on Cognos to deliver unified access to critical information in its downstream operations. Hess will use Cognos 8 Business Intelligence (BI) within its retail, refining, finance, energy marketing and corporate divisions.

Steinhorn

Hess CIO Jeff Steinhorn said, ‘Cognos gives our managers self-service access to information that will streamline operations and decision-making through a single, intuitive business intelligence solution.’


Free online data management maturity checker announced

Paras Consulting provides online capability analysis tool for data standards, ownership evaluation.

UK-based Paras Consulting is offering a web-based tool to assess corporate data management maturity. The free online questionnaire provides a quick benchmark of data management capabilities and allows for comparison with industry peers. Capabilities are evaluated across seven areas and the results presented graphically as a ‘spider chart’ showing areas of strength and weakness. Paras assures confidentiality—no company names are required or revealed.

Subject areas

Data topics include management, ownership, standards and nomenclature, data architecture and loading, physical data management, quality, and access. The questionnaire rounds off with a section on backup and restore, employees’ data ‘culture’, document management and the value-add from a dedicated data team. If a more detailed analysis is required, Paras consultants are ready to check out the submission and provide feedback. More from paras@paras-consulting.com.
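
Such a chart is easily reproduced. Here is a minimal sketch in Python with matplotlib (assuming the library is available), using invented scores rather than Paras’ actual scoring model:

    # Minimal sketch: a 'spider chart' of data management scores across the
    # seven areas named above (scores invented, not Paras' methodology).
    import math
    import matplotlib.pyplot as plt

    areas = ["Management", "Ownership", "Standards", "Architecture/loading",
             "Physical DM", "Quality", "Access"]
    scores = [3, 2, 4, 3, 5, 2, 4]          # hypothetical 1-5 self-assessment

    angles = [2 * math.pi * i / len(areas) for i in range(len(areas))]
    angles += angles[:1]                    # close the polygon by repeating
    scores += scores[:1]                    # the first point

    ax = plt.subplot(polar=True)
    ax.plot(angles, scores)
    ax.fill(angles, scores, alpha=0.25)
    ax.set_xticks(angles[:-1])
    ax.set_xticklabels(areas)
    ax.set_ylim(0, 5)
    plt.show()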


StatoilHydro, BP early Smart Wireless adopters

Emerson’s PlantWeb wireless extension described as ‘the new frontier in asset management.’

Emerson Process Management reports on early adoption of its Smart Wireless (SW) solution set, an extension of its PlantWeb digital architecture. SW provides communications spanning field networks, business and plant applications. BP uses the technology to monitor well-head annular pressure at its UK Wytch Farm field and StatoilHydro performs surveillance of wellhead and heat exchanger pressures.

Karschnia

Bob Karschnia, Emerson VP wireless, said, ‘Diagnostic information from wireless devices such as pressure, temperature and vibration transmitters is captured over a self-organizing network based on WirelessHART and brought in through the SW gateway. Hardened Cisco outdoor access points are deployed to support applications such as VoIP, video surveillance and people location to improve workforce productivity.’ A cost reduction of ‘up to’ 90% is claimed over wired technology. Craig Llewellyn, president of Emerson’s Asset Optimization division, described wireless as ‘the new frontier in asset management.’


Rapid Solutions’ master well list ‘appliance’

Prime West Energy Trust uses Rapid Integration Appliance as single source of truth for well data.

For companies struggling with well identity management, Calgary-based Rapid Solutions has come up with a hardware and software ‘appliance’ to serve a master well list throughout the enterprise and to orchestrate connectivity across well lifecycle workflows.

Granger

Rapid has experience in delivering master well lists to clients such as Prime West Energy Trust whose COO Tim Granger said, ‘Our focus is a definitive master well list for the entire company to bring efficiencies to land, accounting and production systems.’

RIA

The Rapid Integration Appliance (RIA) provides a ‘single source of truth’ of well information for cross functional business processes. The RIA provides prebuilt integration with legacy departmental applications and a data foundation for ERP, BI and BPM solutions. Data flow and processes can be automated through workflow/notification engines and a balanced scorecard measurement system.

Integration

The RIA can talk to and coordinate master data in SAP, Oracle, Sybase, Informix, Cognos, Crystal Reports and other applications that ‘talk’ .NET, J2EE, XML, SOAP and more. Prebuilt oil and gas interfaces are available for production, land, engineering, reserves, economics, drilling, completions, geology and more.
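
Rapid does not document its matching logic, but the first step in any master well list is usually normalizing the differently formatted identifiers that departmental systems hold for the same well. A minimal, hypothetical sketch in Python:

    # Minimal sketch: reconciling well identifiers from departmental systems
    # into one master list (hypothetical normalization, not Rapid's algorithm).
    import re

    def normalize_uwi(raw: str) -> str:
        """Strip punctuation and whitespace from a well identifier so that
        '100/04-11-082-05-W6/00' and '100041108205W600' compare equal."""
        return re.sub(r"[^0-9A-Z]", "", raw.upper())

    systems = {
        "land":       ["100/04-11-082-05-W6/00"],
        "accounting": ["100041108205W600"],
        "production": ["100 04-11-082-05 W6 00"],
    }

    master = {}
    for system, wells in systems.items():
        for raw in wells:
            master.setdefault(normalize_uwi(raw), []).append((system, raw))

    for uwi, sources in master.items():
        print(uwi, "->", sources)  # one master key, three departmental aliases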


Mobile wireless networks for Medco Energi’s drillers

MeshDynamics’ wireless solution links field base to mobile rigs operating in the Karim Small Fields area, Oman.

Jakarta, Indonesia-based upstream operator Medco Energi has deployed a wireless solution linking its field base with several drilling rigs in the Karim Small Fields area of the Sultanate of Oman. Medco uses the network for drilling data transmission, Internet and email, with provision for future voice and video. Adding to the challenge, the rigs move around an area of 250 square miles (650 square kilometers), typically every 15-20 days.

Hussam

Muscat-based Hussam Technology Company (HTC) designed and supplied the turnkey solution built around MeshDynamics’ MD4000 third-generation Structured Mesh wireless technology. The mesh network comprises wireless hops of seven to nine kilometers, using both point-to-point and point-to-multipoint links. MD4000 nodes operate in a wide variety of frequency ranges; here the 5.8 GHz band was used, as local telco regulations allow higher transmit power in this band.
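
Whether a seven to nine kilometer hop closes at 5.8 GHz is a link budget question. Free-space path loss alone is 20·log10(d km) + 20·log10(f MHz) + 32.44 dB; here is a minimal sketch in Python with invented radio and antenna figures (not HTC’s actual design):

    # Minimal sketch: free-space path loss for a 5.8 GHz mesh hop and a crude
    # link margin (radio/antenna figures are invented, not HTC's design).
    import math

    def fspl_db(distance_km: float, freq_mhz: float) -> float:
        """Free-space path loss: 20*log10(d_km) + 20*log10(f_MHz) + 32.44."""
        return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

    tx_power_dbm, antenna_gain_dbi, rx_sensitivity_dbm = 30, 24, -74  # assumed

    for d in (7, 9):
        rx = tx_power_dbm + 2 * antenna_gain_dbi - fspl_db(d, 5800)
        print(f"{d} km: path loss {fspl_db(d, 5800):.1f} dB, "
              f"margin {rx - rx_sensitivity_dbm:.1f} dB")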

Bandwidth

The MeshDynamics technology provides bandwidths of 36 Mbps to 54 Mbps for email, Internet and drilling data exchange. The MD4000 family supports up to four radios in a rugged, weatherproof enclosure about the size of a hardbound novel, well suited to Medco’s demanding application. Solar power systems were installed at a number of locations to power the mesh nodes.

daCosta

MeshDynamics CTO Francis daCosta said, ‘Our business continues to grow with natural resources enterprises; customers now include seismic exploration, mining and petroleum exploration firms worldwide. These markets are growing rapidly with these industries’ increased focus on worker safety and efficiency.’ Other HTC communications solutions leverage free space optics (FSO), millimeter-wave, microwave, WiMAX, WiFi mesh networks and outdoor and indoor wireless LAN. More from www.meshdynamics.com.


Baker Hughes licenses Discovery Wells WITSML technology

Kongsberg Intellifield’s SiteCom provides secure communications for drilling and production data.

Baker Hughes has signed a long-term contract with Kongsberg Intellifield for its SiteCom, Discovery Wells and Real-Time Intelligence modules. Baker Hughes is to leverage the Kongsberg tools in web-based delivery of real-time visualization and analysis of drilling operations. Discovery Wells enables remote visualization of WITSML data streams from the well site. Multiple inputs of time- or depth-based, real-time or historical data can be blended and viewed in a configurable graphical user interface. Other data sources such as MWD, LWD, mud, cement and weather can be incorporated in real time.

SOAP

SiteCom, one of the first testbed platforms for WITSML development, is a real-time data management platform built on W3C SOAP technology for cross-platform, web services-based data exchange. SiteCom provides Baker Hughes with an integrated, secure environment for gathering, distributing and managing drilling, formation evaluation, completion and production data.
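
For the technically minded, the WITSML STORE interface is a handful of SOAP operations (WMLS_GetFromStore, WMLS_AddToStore et al.). Below is a minimal sketch of a GetFromStore call using only the Python standard library; the endpoint URL is hypothetical and the envelope details follow our reading of the public WITSML 1.2 API WSDL, not Kongsberg’s deployment:

    # Minimal sketch: a WITSML WMLS_GetFromStore SOAP call (hypothetical
    # endpoint; envelope per the public WITSML STORE WSDL, to our reading).
    import urllib.request
    from xml.sax.saxutils import escape

    ENDPOINT = "https://sitecom.example.com/witsml/store"  # hypothetical URL

    # An 'empty' query asks the server for all wells it will show us.
    query = '<wells xmlns="http://www.witsml.org/schemas/131"><well uid=""/></wells>'

    envelope = f"""<?xml version="1.0" encoding="utf-8"?>
    <soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
      <soap:Body>
        <WMLS_GetFromStore xmlns="http://www.witsml.org/wsdl/120">
          <WMLtypeIn>well</WMLtypeIn>
          <QueryIn>{escape(query)}</QueryIn>
          <OptionsIn></OptionsIn>
          <CapabilitiesIn></CapabilitiesIn>
        </WMLS_GetFromStore>
      </soap:Body>
    </soap:Envelope>"""

    req = urllib.request.Request(
        ENDPOINT,
        data=envelope.encode("utf-8"),
        headers={"Content-Type": "text/xml; charset=utf-8",
                 "SOAPAction": '"http://www.witsml.org/action/120/Store.WMLS_GetFromStore"'})
    with urllib.request.urlopen(req) as resp:
        print(resp.read(500))  # XMLout in the response carries the well list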

Model

The infrastructure is planned as a ‘model’ WITSML deployment that will be of interest to international and national oil and gas companies looking to deploy their own secure, enterprise-wide infrastructure. In addition to gathering, distributing and managing data in real time, it will bring the additional benefits of standardizing real-time and automated global processes, including data QC, seamless flow of data into interpretation packages and analytical calculations on real-time data. More from bill.chmela@kongsberg.com.

