I have long been an admirer of erstwhile Byte magazine correspondent Jerry Pournelle, whose tales from ‘Chaos Manor’ can now be read in Dr. Dobb’s Journal. With a cupboard full of ‘decommissioned’ laptops, a three-hub network and seven PCs online, it struck me that if Pournelle can get away with it every month, then I should be able to indulge myself with a hardware story in celebration of the 100th issue of Oil IT Journal. But before I get into the story, you need to understand something of my psychological make-up.
Exchange and Mart
When I was a kid, one of my favorite magazines was Exchange and Mart—a UK collection of small ads of dubious provenance. As an example, there was the ‘seebackroscope,’ a monocle fitted with a mirror so that spotty youths could ogle young ladies without being seen (except for being seen with a strange black gizmo stuck in their eye). To my chagrin, I never got a seebackroscope. I did acquire kits for building crystal sets, later transistor radios, headphones from WWII and so on. Above all, Exchange and Mart catered to what the psychologists call the ‘spectrum’ male in me, fostering a life-long habit of poring over magazines in a mindless quest for enlightenment and new toys.
Many years later, the combination of a modest amount of what I like to consider ‘disposable wealth’ and the advent of eBay has led to another attack of my Exchange and Mart disorder. Although eBay comes somewhere between Spider Solitaire and online poker in time wasting and expense, the small add-ict in me was reawakened when I spotted a snazzy, 10,000 rpm SCSI disk with a 160MB/s LVD interface. Now for those of you who don’t care, know or want to know about this sort of stuff, you might as well stop reading. It gets worse.
A 10krpm LVD SCSI disk is nothing too fancy. But this is technology of a couple of years ago, now being sold really cheap as folks move up to 320MB/s SCSI and SATA. For me, it seemed like an opportunity to add a cheap, high performance SCSI subsystem to my IDE machine.
A long story
This was, as the French say, ‘le doigt dans l’engrenage’—the finger in the gear train. When my new disk arrived, I needed a SCSI card. Seeing as how eBay doesn’t have everything you need ‘in stock’, the card was purchased, for real money, from an online vendor. When it arrived, a problem emerged. The card was a 64-bit (long) PCI card of the sort used in servers. This meant that instead of an upgrade, I was suddenly looking at a new server architecture. Being of a philosophical nature I thought, in for a penny, in for a few Euros. I bided my time watching out for a suitable motherboard to appear on eBay.
Soon enough a board came along with a couple of long PCI slots—and indeed, sockets for dual Xeon processors. This was getting to be more of ‘in for a penny, in for a few thousand Euros.’ But what the heck! The board’s arrival was followed by the realization that, apart from the Xeons, this kind of board needs a) a special kind of rather expensive memory and b) a very big box with a fancy EPS12V power supply.
There ensued a combination of fortuitous eBay purchases (the Xeons, the heatsinks and fans for the Xeons, various SCSI connectors and terminators) and some buys-in-desperation from online merchants (memory and the big box). Parts jetted in from the four corners of the world. Esoteric manuals were culled from the internet. Acronyms were deciphered. An assembly plant took over the dining room table.
Next came the big decision—which OS? Another dreadful confession: this is actually the third machine I have built this year. I have been playing with OS installations—Red Hat, Suse and Windows 2003 Server. The only OS that a) installed without too much hassle and b) allowed me to use all four processors (yes, four: in hyper-threaded processor math, two Xeons make 1+1=4) was Mandrake Linux. Which neatly segues into the next editorial in this series: operating systems I have known, from mainframes through calculators to desktops, back and forth from Unix to Windows to Linux. Or will it be on languages, from Fortran through Basic to the Unix shell, LAMP and Perl? I may even have something to say about web services!
BP, following in Shell’s footsteps, has installed its first IT ‘Mega Data Center’ in the Far East. BP’s ‘Most of the World’ (MoW) Mega Data Center, the first of three worldwide, was opened at the SingTel Telecom Complex in Singapore last month.
BP’s MoW data center is a 10Gbps, Tier 4 facility which will provide BP’s businesses with an ‘improved quality of service’. All of BP Singapore’s current digital infrastructure will be moved to the new Center which will also assure ‘enhanced’ disaster recovery thanks to a high speed link with a mirror facility. The Center will house upwards of 100, mostly Intel-based, servers.
BP CIO John Leggate said, ‘We are always looking for opportunities to enhance productivity. The consolidation of servers into the Singapore Mega Data Center will help our businesses in Asia to move at an even greater pace. In many ways, Singapore is at the commercial crossroads of the world.’
Teo Ming Kian, Chairman of the Singapore Economic Development Board, added, ‘We are pleased that BP has chosen Singapore as the location for its ‘Most of the World’ Mega Data Center. The decision is a testament to Singapore’s reputation as a trusted and reliable IT and business hub.’
Following BP’s disposal of its local refining interests last year, the company decided to turn its Singapore unit into a service center for the region, providing key functions such as legal, tax, audit, and digital business. BP Singapore’s role has since expanded to service BP’s regional and global business interests. An estimated 10,000 MoW users will be connected to the Center from BP’s upstream, midstream and downstream segments.
BP Singapore president Wu Shen Kong said ‘BP Singapore has transformed into a knowledge hub for Asia Pacific, playing a strategic role in BP’s activities including trading and service support functions.’
BP Singapore expects to grow from its current 430 employees to over 500 next year, especially as it scales up its global trading activities. BP’s second data center is under construction in the UK, with a third planned for North America. Shell inaugurated its first Mega Center in Malaysia’s ‘Silicon Valley’ back in 2002 (see Oil IT Journal 7 N°1).
Phoenix-based Honeywell International has updated its Enhanced Ground Proximity Warning Systems (EGPWS) database for helicopter operations. The latest update has added 5,000 Gulf of Mexico oil rigs and 4,000 land-based European structures to the helicopter terrain database.
Dan Barks, Director of Customer Marketing for Honeywell Commercial Electronic Systems said, ‘The addition of the new data on these obstacles will add significantly to the safety of EGPWS-equipped helicopters flying in the Gulf of Mexico and Europe, and we encourage every operator to add the update as soon as possible.’
EGPWS compares the aircraft’s location, which is constantly updated from a Global Positioning System, to a built-in database of terrain and obstacles and provides the flight crew with a moving map display. If an aircraft approaches too close to an obstacle or terrain, the system displays a brightly colored warning icon and sounds an audio alert in the cockpit. The update is provided on a small data card.
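As a toy illustration of the comparison EGPWS performs, the sketch below checks an aircraft position and altitude against a small obstacle table. The coordinates, thresholds and function names are all invented for the example and bear no relation to Honeywell’s actual implementation.

```python
import math

# Toy obstacle database: (lat, lon, height_ft). Values are invented.
OBSTACLES = [
    (27.75, -90.20, 450.0),  # hypothetical Gulf of Mexico platform
    (27.90, -90.55, 300.0),
]

def distance_nm(lat1, lon1, lat2, lon2):
    """Great-circle distance in nautical miles (haversine formula)."""
    r_nm = 3440.1  # mean Earth radius in nautical miles
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r_nm * math.asin(math.sqrt(a))

def check_proximity(lat, lon, alt_ft, radius_nm=1.0, clearance_ft=200.0):
    """Return obstacles that are close laterally and not cleared vertically."""
    return [
        (olat, olon, oheight)
        for olat, olon, oheight in OBSTACLES
        if distance_nm(lat, lon, olat, olon) <= radius_nm
        and alt_ft < oheight + clearance_ft
    ]
```

The real system refreshes this check continuously against the GPS feed and drives both the map display and the audio alert from the result.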
‘I read your article about mind-numbing PowerPoint presentations with great amusement. I think you should claim a patent on your formula, there must be some money in this idea. I've always applied a less rigorous formula, thinking that it is ok to spend as much time preparing for the presentation as the combined time the audience is putting in. I then limit myself to 2 days work for a 45 min talk, but I had never thought of applying a formula to the actual content quality. It is quite scary really, because your reverse banality test (RBT) makes almost everything obsolete. However if you apply it to your own article then there is also not much left of it either, but I guess that just proves the point. Edward Tufte wrote in his book, ‘Envisioning Information,’ that PowerPoint has annihilated the quality of information and should be banned from schools and universities. My boss just asked me for input for a presentation he is giving, which is awkward as I can’t get the RBT test out of my mind: ‘The purpose of data integration is to integrate information so that it is not disparate’—you get the picture. All the best.’ Thierry Gregorius, Shell.
Alan Doniger (POSC CTO) heads the POSC/WITSML Special Interest Group (SIG), which now has 30 company members. The XML and XSD schemas have been released and an API is available for active servers and repositories. The big news in the WITSML community is the arrival of Version 1.3. WITSML is expanding from its drilling origins and is to replace other early POSC MLs, including WellLogML.
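For readers unfamiliar with what a WITSML document looks like, the following sketch builds and parses a minimal well list. The element shapes loosely follow the WITSML well object, but the exact tags and the omitted XML namespace are illustrative, not schema-exact.

```python
import xml.etree.ElementTree as ET

# A minimal WITSML-style well list. Tags and content are invented for
# illustration; real documents are validated against the published XSDs.
doc = """<wells version="1.3.1.0">
  <well uid="w-001">
    <name>Demo Well 1</name>
    <country>UK</country>
  </well>
  <well uid="w-002">
    <name>Demo Well 2</name>
    <country>NO</country>
  </well>
</wells>"""

root = ET.fromstring(doc)
# Index wells by their uid attribute, as a consuming application might
names = {w.get("uid"): w.findtext("name") for w in root.findall("well")}
```

A server or repository exposing the WITSML API would return documents of this general shape in response to a query template listing the elements the client wants filled in.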
WITSML in BP
Matthew Kirkman (BP) said that WITSML has succeeded in creating an active community to exchange drilling and completion data. It automates tasks, leveraging new IT technologies such as data push and sharing between operators, contractors and partners. It is also impacting statutory reporting, trading and data input to partner databases. All this is done with a single ‘freely available’ standard. Today, deployment is restricted to the operational data store and applications, and to high-cost wells – it has not yet seen take-up on the rig itself. Kirkman wants WITSML to be ‘more ubiquitous.’ WITSML is ‘working toward’ a taxonomy – a common terminology for data and context movement, providing a history of what happened to a well bore. BP is also funding a project for the inclusion of time-based (daily report) data.
Danny Bush, ChevronTexaco, is a long time member of POSC but found that WITSML has proved to be ‘the most compelling business case/activity in the POSC portfolio’. CTC has been using a vendor’s WITSML server for deepwater drilling MWD/LWD projects. CTC is working with INT to develop a Microsoft .NET-based well bore data viewer.
Peter Nielsen described how Statoil is linking offshore wells to its onshore Stavanger base with all MWD, FEWD service providers connecting over a WITSML data exchange link. Engineers load depth based drilling data to project databases and the drilling database with Landmark’s OpenWire. 80 wells and side tracks have been loaded to date with up to 7 simultaneous operations. The system also allows time-based data to be loaded to the drilling database. Real time QA/QC is performed on deviation data. Future developments will include a notification system to alert users of new data, a streaming mode with a publish and subscribe mechanism, enhanced data rates and 24x7 real time operations.
For Robert Aydelotte, ExxonMobil’s Technical Computing (TC) infrastructure is about the ‘convergence of science, data and workflows’. TC is run by G&G, not computing – WITSML is a good example of workflow driven optimization, providing a generalized method for describing what data means – providing context, how to read/write and where it should be stored. ExxonMobil is developing a WITSML mud log object. Aydelotte also notes the need for standardization of well deviation data and recommends a European Petroleum Survey Group-based representation of well locations and well bore paths including geodetic transform and processing requirements. POSC’s Well Schematic ML was developed a few years ago and thus pre-dates WITSML and XML schema. This has been updated and re-written in a WITSML style, allowing for planned and ‘as run’ data on tubulars, cementation, perforation, gas lift etc.
Stewart Robinson said that the UK regulator, the Department of Trade and Industry, is ‘interested’ in WITSML and could mandate its use for data exchange with industry. Robinson made a plea for better documentation of WITSML schemas. The DTI intends to deploy a UDDI, web services-based infrastructure for pre-application validation – checking metadata for wells, licensing through XML download and delivery to DTI.
Melissa Symmonds presented Schlumberger’s WITSML offering, a component of its ‘InterAct’ real-time drilling infrastructure. InterAct exposes mud, well-bore and log objects accessible through the WITSML API. Schlumberger applications including DataLink, Drilling Office, RT GeoSteering, Directional Drilling Toolbox, PorePressure, DrillViz and GeoFrame are now WITSML-enabled. WITSML ‘is a key enabler’ of the Operations Support Center and ‘iCenter.’ In the Q&A, Symmonds said that third party data could, ‘with difficulty’, be hosted from the InterAct server.

According to Sheldon Harbinson, Landmark is more accommodating to third party data in its OpenWire WITSML solution, which connects to Landmark’s Engineering Data Model (EDM), OpenWorks and GeoFrame. OpenWire is seen as the future of connectivity to Landmark’s Real Time Operations Center (RTOC). Landmark will add ‘a significant number’ of WITSML objects and is to provide an API for real time, bi-directional exchange with the EDM, accessible to the WITSML community and exposing Landmark’s engineering data.

John Shields outlined how Baker Hughes’ (BHI) RigLink aggregation server collects WITS and WITSML from BHI and third party feeds. This is exposed to the world as WITSML data, providing support to third party applications. A recent example of interoperability is KSI’s DrillWorks Predict connection via WITSML.

Rune Skarbo’s company, Sense Intellifield, runs operations centers for BP, ConocoPhillips and Statoil. Sense’s WITSML building blocks ‘give customers ownership of their data’. Sense’s SiteCom V3.0 includes a real time database for curve data and a relational database for all other WITSML data.
WITSML has been a shot in the arm for interoperability, for the standards movement in general and has revitalized POSC. For those of you who, like us, want to try WITSML hands-on, visit http://w3.posc.org/demos/witsml/store_client/quickstart.html.
This article was taken from a 9 page report in The Data Room’s Technology Watch series. More from email@example.com.
HP is targeting the seismic interpretation workstation market with a high-end PC workstation, the xw9300. ‘Targeting’ is perhaps the wrong word since demand for a 64 bit AMD-based machine came from HP’s oil and gas clients. Interpretation workstations are memory hungry beasts—thanks to 64-bit Linux, the 9300 supports up to 16GB memory.
The system offers dual PCI Express x16 graphics and dual processors. Graphics are NVidia-based, with the option of Quadro SLI-capable boards announced for early 2005. SLI technology supports multiple graphics processing units (GPUs), hitherto only available on very high-end machines.
The new machine will compete with IBM and Sun workstations and to a lesser degree with HP’s own PA/RISC-based series. AMD clients in oil and gas include Veritas, Shell, Petrobras, Pemex, IBM, Sun and Microsoft (for development). Red Hat is the first available 64 bit OS for the 9300, with Suse certification to follow.
The AMD 64bit Opteron-based system sells at a bargain-basement price of ‘from’ $1,888 in the US. But a system which exploits the 9300’s expansion to the full will cost many times that amount. Using the HP ‘buy online’ configurator we priced a top-of-the-range machine with dual processors and 16GB memory at around $20,000.
TGS-Nopec unit A2D Technologies and Landmark Graphics have signed a data reseller agreement to offer a data and software bundle to clients.
A2D’s Rod Starr said, ‘We have adapted our solution to the exploration community’s workflows. This improved delivery channel provides the end-user with high-quality, readily available well log data.’
Landmark VP Doug Meikle added, ‘This agreement reinforces each company’s position and strength in the market. The current arrangement recognizes Landmark as a leading application provider and A2D as one of the industry’s premier data vendors. This new relationship encourages continued cooperation between the organizations and optimizes resource use in both companies.’
A2D recently completed the full integration of the data it acquired from Riley Electric Log. Of the 1.8 million wells in the hardcopy inventory acquired with Riley, 1.3 million now have digital raster images. A2D remains on schedule to complete this effort by mid-2006.
Two German companies, seismic software house TEEC and cluster manufacturer Megware have teamed to provide a stand-alone, cluster-based seismic processing solution. TEEC has ported its 3D seismic processing package to Megware’s Linux SuperCluster.
TEEC’s 3D Common Reflection Surface (CRS) software is said to be suited to areas of strong geological complexity and/or poor signal-to-noise ratio. It is considered an alternative or complement to traditional pre-stack time and depth migration.
Megware’s Slash Five rack system is particularly suited to the compute-intensive nature of 3D-CRS. In a footprint of less than one square meter, a Slash Five unit can be equipped with up to 80 servers and 160 CPUs (Opterons, Xeons or Itanium 2s). The compact units include PCI-card extensibility and users can choose between Myrinet, SCI or Infiniband interconnects.
SeisWare 6.1, the 10th version of Zokero’s seismic interpretation solution, includes new workflow-centric tools for ‘advanced’ cross correlation, log editing, enhanced synthetics, fault interpretation and improved mis-tie analysis.
An OpenSpirit link now couples SeisWare to the major interpretation platforms. Zokero worked with OpenSpirit Corp. to enable application interoperability via a SeisWare plug-in.
The cross correlation enhancements let users define and cross correlate multiple wavelets in a seismic volume to generate ‘complex pseudo-facies’ maps.
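As a rough sketch of what wavelet cross correlation involves (not SeisWare’s algorithm), the following computes a normalized zero-lag correlation and assigns a trace to its best-matching wavelet, which is the basic operation behind a pseudo-facies classification. Function names and data are invented for the example.

```python
import math

def xcorr(a, b):
    """Normalized zero-lag cross correlation of two equal-length traces."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    da = math.sqrt(sum((x - ma) ** 2 for x in a))
    db = math.sqrt(sum((y - mb) ** 2 for y in b))
    return num / (da * db)

def best_match(trace, wavelets):
    """Index of the reference wavelet the trace correlates with most strongly."""
    return max(range(len(wavelets)), key=lambda i: xcorr(trace, wavelets[i]))
```

Running this for every trace in a volume against a set of user-defined wavelets yields a map of class indices, i.e. a pseudo-facies map.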
Calgary-based Divestco has announced the integration of its three interpretation tools, WinPICS (seismic), GeoVista (data browser) and a new 3D visualization tool, EnvisionVSX.
Dynamic data links between the products allow interpreters to access current well, production, land and pipeline data from the seismic workstation. EnvisionVSX provides ‘spectacular’ new techniques to view seismic in 3D using Divestco’s light and composite density methodology.
Shannon Niemi, VP sales and marketing said, ‘Exploration teams work closely together and so should the tools and technology they use. Our integrated geophysical and geological interpretation system is a result of Divestco’s drive towards integrated software tools.’ The current integrated releases are WinPICS 5.1, GeoVista V4.2, and EnvisionVSX 1.0.
As part of its National Hydrocarbons Data Archive (NHDA) initiative, the UK Department of Trade and Industry (DTI) has issued a consultation document on the archiving of geological and reservoir models. Kerr-McGee has provided data from its Hutton Field as a test case, supplying the geological and reservoir models in ASCII formats based on Dynamic Graphics’ EarthVision and Schlumberger Information Solutions’ Eclipse.
Because such formats are neither aligned with the UK Government’s mandatory XML format for data, nor with the POSC RESCUE geological and reservoir data model, the DTI is sounding out the industry as to best practices for model data capture.
The POSC-initiated RESCUE project is funded by BP, Exxon, Shell, Statoil and Total and has wide support from software vendors (including Dynamic Graphics and Schlumberger!). But according to the DTI, RESCUE has seen relatively little take-up in the UK.
Once in a lifetime
The DTI therefore hopes that interested parties take this ‘once in a lifetime’ opportunity of getting RESCUE recognized in the UK by responding quickly to the consultation. A special invitation is extended to the RESCUE sponsors.
POSC has annual revenue of around $1.5 million, of which $250,000 comes from Special Interest Groups (SIGs) and the remainder from membership fees. POSC’s 50 corporate members include 9 oil companies. Currently active SIGs are WITSML, Practical Well Logs, Data Stores, e-Field (Integrated Operations), e-Regulatory and National Data Repositories. Other activity of note is the Well ID service, which promises an online, globally unique well identifier service.
POSC CEO David Archer described WITSML 1.2’s successful deployment on some 120 wells to date. POSC’s Practical Well Logs SIG V2.1 results will be out by year end 2004. A new XML specification ‘ImageCal’ for well log scanned images has been submitted by A2D and IHS Energy for review in 2005. The Integrated Operations SIG has defined initial specifications and subsurface, surface and business data streams are ‘beginning to converge on common standards.’ IntOps, which has links to the Norwegian IIP Project, aspires to be the ‘WITSML’ of production. Looking to the future, Archer anticipates significant growth in WITSML take-up for 2005 with new applications in mud logging and production.
Herb Yuan (Shell and POSC chair) described Shell’s Digital EP Business as enabled by ‘Process Portals’ delivering role and context-based information, connecting users to the processes and tools needed to do their jobs. Shell’s Well Delivery Project is leveraging WITSML to connect drillers to a range of repositories. WITSML enables connection between Halliburton’s InSite Service, RightTime Server and Landmark’s drilling applications. Over the past year, POSC has increased its visibility and is keen to improve its marketing and certification programs. POSC’s remaining challenge is the delivery of a ‘digital EP business’ infrastructure supporting technical-to-business integration via data and real-time standards. This will be achieved by members’ time commitment to SIGs and by improved coordination between standards bodies, such as work with PPDM. According to Yuan, ‘POSC is enabling the future data and real time standards needs of the EP industry. Shell is committed to make POSC work.’
Peter Breunig said that ChevronTexaco’s (CTC) active technical data storage reached 800 TB in 2004, ‘a huge and ongoing increase’. Enabling technologies like POSC’s projects are unlikely to get ‘mega funding’. To make them work, ‘less lofty’ near-term goals are required, to demonstrate quick wins. WITSML is one good example of how this can work. ProdML should likewise ‘help accelerate implementation of smart field technologies.’ Breunig concluded, ‘Long term strategies are important, but given the business environment we are in, it behooves us to show results.’ ChevronTexaco joined POSC because of WITSML.
Chairman Lester Bayne (Schlumberger Information Solutions (SIS)) kicked off the proceedings by noting that hitherto, industry focus has been on cost cutting rather than on ‘adding value’. Nonetheless, a ‘relentless focus’ on lowering finding costs has translated to measurable gains. Today, industry can afford to be more ‘adventurous,’ to improve core E&P processes. With the spotlight on company valuations, Bayne pointed out that these are based on reserve estimates which are projections. Company assets equate to an audit trail of such projections.
Xavier Divisia (SIS) is project manager of Total’s Production Data Management Systems (PDMS Global) project. The PDMS combines a ‘commercial, market-proven solution’ (i.e. Schlumberger’s FieldView) with a consolidation of Total and SIS’ best practices. The project uses a ‘Program Office’ model whereby management is jointly owned by Total and SIS. FieldView data capture and production and accounting modules (PAR) are deployed along with OilField Manager (OFM). Following a successful pilot in Qatar, Total is rolling out worldwide in Brunei, the Netherlands, France and the Congo. A site assessment is underway in Venezuela. To date 12 affiliates have signed up to the project. Divisia concluded that the Program Office vision works, reconciling project management tensions – ‘the joint ownership concept works’.
‘No more paper’
Stuart Robinson (UK DTI) asked, ‘Do we really need paper logs, seismic tapes, hardcopy reports and maps?’ Today’s IT systems offer commoditized access control and security, as witnessed by online banking. A significant step in e-transacting is the development of an approvals workflow involving submission, review, approval and a legally admissible digital signature. In this vision there is no more paper, ‘everything is digital’. The move to ‘all digital’ will help relieve companies of their permanent obligation to ‘deliver data in any format at any time from now to eternity.’
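The submission, review, approval and signature sequence can be pictured as a simple state machine. The sketch below follows that sequence from the talk; the class, the state names and the hash stand-in for a real PKI signature are all illustrative.

```python
import hashlib

# Ordered workflow states from the submission-to-signature sequence
STATES = ["draft", "submitted", "reviewed", "approved", "signed"]

class Document:
    def __init__(self, content):
        self.content = content
        self.state = "draft"
        self.signature = None

    def advance(self, actor):
        """Move to the next state; attach a signature on the final step."""
        i = STATES.index(self.state)
        if i + 1 >= len(STATES):
            raise ValueError("document already signed")
        self.state = STATES[i + 1]
        if self.state == "signed":
            # Stand-in for a PKI signature: a hash over content plus signer.
            # A real deployment would sign with an asymmetric key held on a
            # smart card or in software, as the article describes.
            self.signature = hashlib.sha256(
                (self.content + actor).encode()).hexdigest()
```

The legal admissibility rests on the signing step, which is exactly the part a PKI replaces with key-based cryptography rather than a plain hash.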
Richard Mapleston described Shell’s collaboration on the DTI’s digital signature initiative, a corollary of ‘no more paper’. Already, digital signatures are a key component of UK document management and could extend worldwide for a company like Shell. Other industries, such as defense, aerospace and pharmaceuticals, are currently deploying digital signatures using a Public Key Infrastructure (PKI). The UK oil industry is implementing PKI through the Oil and Gas Trust Scheme. PKI also matters to e-invoicing, since most countries accept (or will accept) digitally signed documents. The DTI and Shell prototype uses either signed PDF or XML documents and a secure device (smart card or software) to hold the signature. A commercial signing plug-in, ‘Squiggle’, is used in the DTI. Mapleston underlined the effort involved in maintaining a trust-based system: pharma has 20 people managing its ‘SAFE’ signature program.
IM in Saudi Aramco
According to Ibrahim Al-Ghamdi (Saudi Aramco), ‘Data and information are as important as the entities they represent.’ The key components of the E&P IT system have been identified as process, people, technology and data. Of these, technology is ‘not a challenge,’ but the other three are. Al-Ghamdi analyzes Aramco’s processes and stakeholders into ‘supplier’, ‘input’, ‘output’ and ‘customer’ – the Six Sigma SIPOC paradigm. This was illustrated with a walk through Aramco’s Well Approval Package (WAP) workflow, where a single system allows progress to be monitored and measured. Aramco’s experience shows that process improvement is the key, and that this is achieved by workflow mapping and measurement.
IT Metrics in Statoil
Kjetil Tonstad’s paper asked how companies can realize the value of their IT investments. Studies have shown that this is impossible without re-tooling business processes to adapt to the new possibilities offered by IT. Tonstad focused on the merits of ‘front-end’ data loading – i.e. getting data in a project-ready format from vendors and government data sources. Other work has shown that ‘soft factors,’ such as employee values and behavior, are often overlooked when measuring IT performance. These studies have led to a ‘breakthrough’ in IT metrics – the field of ‘Information Orientation’ developed by IMD professor Don Marchand. Marchand uses the product of ‘Deployment’ and ‘Effective use’ to estimate the business value of IT.
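Marchand’s metric is simply the product of the two scores. Assuming, for illustration only, that both are normalized to the range 0 to 1 (the actual Information Orientation scoring scale may differ), the calculation looks like:

```python
def it_business_value(deployment, effective_use):
    """Product metric in the spirit of Information Orientation.
    Both scores are assumed normalized to [0, 1] for this illustration."""
    if not (0.0 <= deployment <= 1.0 and 0.0 <= effective_use <= 1.0):
        raise ValueError("scores must lie in [0, 1]")
    return deployment * effective_use
```

The multiplicative form captures the point of the studies: heavily deployed but poorly used IT (or vice versa) scores low, since either factor near zero drags the product down.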
Business Process Portals
According to Erik van Kuijk (Shell), ‘There’s more to data management than we thought in the 1990s.’ Shell’s latest analysis sees the workplace as a factory for work products. Unfortunately, many workers don’t realize that everyone has ‘customers’ and fail to produce the right work products, leading to wasted effort and inefficiency. Van Kuijk suggested that a rational division of labor would be around 30% on people, 30% on process, 30% on KID (knowledge, information and data) and work products, and only 10% on tools. Today’s ‘people’ challenges include ‘balancing individual freedom against the interests of the Enterprise’; functional silos continue to cause problems, as does a ‘lack of integrated thinking’. Other issues include poor definitions affecting process management, too many databases and applications, and poor knowledge sharing.

Today, it can take a year to implement required system changes. Van Kuijk reasons that ‘if we can react in one week, this will give Shell a competitive advantage.’ This can be done by taking the business logic out of the IT system so that it can be changed as required. Such an approach is described as ‘Business Process Management’ (BPM), where processes are managed as a portfolio, providing a unified framework for the enterprise. Business processes can be stored in a database, a.k.a. the ‘meta KID repository’. Shell’s global well delivery process was identified as a candidate for business process automation. This has resulted in a ‘global well portal’ built using SAP xIEP, Netweaver-based xApps for the upstream (see last month’s Oil IT Journal).
ENI’s data life-cycle
Mario Marco Fiorani outlined ENI’s objectives as ‘to speed data preparation and allow more time for analysis’. ENI has a ‘federation’ of databases, with a corporate master database (CDB) in Milan. The CDB holds interpretation results, entitlement information and real data from countries ‘at data risk’. Other federated databases are deployed in the North Sea and North Africa. Access is through the MyENI SAP portal which offers tabular, hierarchical or GIS-based data access. Both Landmark’s PowerExplorer and Schlumberger’s DecisionPoint are deployed as sub-portals in ENI’s system – leveraging XML data links to PetroBank and the Schlumberger Data Management Center (DMC). ENI also offers ASP data delivery to remote units.
GIS in Shell
Roger Abel gave an update on Shell’s global ESRI-based GIS data management initiative. GIS is a ‘great front end’ for E&P data search, offering an 80% hit rate. This can be increased by using natural language and full-text search. Catalogs and taxonomies, which imply consistent publishing, are another way forward. Shell’s GIS deployment can be accessed through Schlumberger’s DecisionPoint or the SAP Portal (which also has ESRI-based GIS). Spatial is expensive in IT overhead. Shell would like to use Open GIS, but this is not realistic today.

Problems arise from the vagueness of some spatial information: Is a well a surface location or a 3D track? What is the spatial extent of an oil field? What geodetics were used? Answering such questions requires a good understanding of data and geodetics. A trawl of Shell’s well databases found that the main issues were with different coordinate reference systems (CRS). Projects have been known to ‘run for weeks’ with the wrong CRS! ESRI’s GIS solutions have proved ‘the most complex Windows application that Shell has scripted for global deployment’. Shell’s standard legend ‘symbology’ is to be released digitally into the public domain via the EPSG. In Shell ‘nothing happens without a SAP entry,’ so there is a GIS-to-SAP link key for assets. A map showing flow lines in Oman can be used to generate an SAP work order, which can be sent out to teams in the field. Other Shell GIS initiatives include IHS Energy data in ESRI’s ArcGlobe, a ‘biodiversity’ tool and a database of global prospects and leads.
Paul Gregory (Intervera) cited a Gartner survey of data quality which showed that ‘1 in 4 pieces of information is flawed’. Moreover, 50% of data warehouses will experience ‘limited acceptance or failure’. Another survey (Deloitte Consulting’s Upstream Data Quality) found that ‘despite POSC, PPDM, various conferences and organizations, good data management is still hard to implement.’ One problem is trying to convince the CIO to spend non-allocated funds. An internal Intervera study found users did not trust the data quality of one drilling and completion application. The solution was Intervera’s DataVera ‘Health Check’ data quality tool.
John Shields (Baker Hughes) recapped WITSML’s history and purpose (see our report on page 3 of this issue). The big news in WITSML is the arrival of Version 1.3. This now points to the EPSG website for geodetic information. A new well log object is designed to replace LIS or DLIS, ‘old standards, getting to the end of their life’. This uses XML text for bulk data. WITSML uses simple zip compression, resulting in a file density comparable to a LIS binary. Some web servers and browsers perform compression themselves. A ‘Risk object’ has been added to help well planners flag zones of potential hazard, to report ‘incidents and risks.’ Units of measure validation is part of the V1.3 schema.
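The density claim is easy to check in outline: compress comma-separated curve text with zip-style (zlib) compression and compare it with its uncompressed size, alongside a LIS-like baseline of 32-bit binary floats. The curve values and encodings below are invented for the illustration; WITSML’s actual bulk-data layout differs in detail.

```python
import struct
import zlib

# Invented log curve: 1,000 depth/value pairs
rows = [(1000.0 + 0.5 * i, 85.0 + (i % 7) * 0.25) for i in range(1000)]

# WITSML-style bulk data: comma-separated text, one sample per line
text = "\n".join(f"{d:.1f},{v:.2f}" for d, v in rows).encode("ascii")

# A LIS-like binary baseline: two big-endian 32-bit floats per sample
binary = b"".join(struct.pack(">ff", d, v) for d, v in rows)

compressed = zlib.compress(text, 9)
```

Printing the three lengths shows the compressed text shrinking well below the raw text and approaching the binary size, which is the article’s point: text plus cheap compression keeps the readability of XML without the bulk penalty.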
Paul Maton (POSC) enumerated the proliferating ‘standards’ for well path information which POSC is to ‘bring together’. POSC recommends adoption of the EPSG coordinate reference system, itself a component of ISO TC/211. A well actually needs three CRSs, local, geodetic horizontal CRS and geodetic elevation CRS. The EPSG offers 2,700 CRS and 1,100 transforms. A web-based service is planned for Q4 2005. UKOOA P7/2000 is to be embedded in the commercial release of WITSML V1.3 and will include planned and post-processed well paths.
Malcolm Fleming, OBE (CDA) reminded UK license holders, ‘past and present,’ of their ‘permanent legal obligation to provide seismic and well data to the government in any format, at any time from now to eternity.’ The UK National Hydrocarbon Data Archive (NHDA) was set up to remove this perpetual obligation by building a permanent digital archive of UK data. Companies who provide data to the Archive are absolved of their data liabilities to the government. The NHDA has run into familiar data banking problems such as entitlements and obligations, especially when a license is sold on. Other issues arise with seismic group shoot data. Data modeling is problematic, with disconnects between well, permit, seismics etc. These are mostly due to a historical failure to use consistent names for wells, surveys, etc. and to poor record keeping. The DTI is to launch a definitive license database this year and will be issuing new guidelines to reinforce naming standards. Archive costs have been around £4,000 per well, with baseline storage for the same data at around £900-2,000. Despite a 2-5 year payback, business drivers are weak because of the voluntary nature of the process. BG has archived one well to date; others (BP, CTC, Shell, Total) are to start archiving in 2005.
This article has been taken from a 13 page report produced as part of The Data Room’s Technology Watch Reporting Service. More from firstname.lastname@example.org.
Kelman Technologies has reshuffled its board following the resignation of CFO Blake Lyon. Michael Van Every will take Lyon’s seat on the Board. John Paul is interim president and Debbie Garbutt interim CFO.
Kevin Gerlitz has been transferred to Hampson-Russell’s Jakarta office. Gerlitz is a technical support geoscientist. The company is also looking to recruit a new support person in Jakarta.
Energy Solutions has appointed Tony Botterweck to its Board of Directors. Botterweck was previously with Koch Industries.
Bob Ge has joined Geotrace as Sr. Depth Imager in Houston. Ge was previously with Veritas GeoServices. Les Mitchell has also joined the company’s London office as EAME and CIS general manager. Mitchell was previously with Halliburton and Landmark Graphics.
Charles Zeltser is to manage CGG’s new Mumbai, India-based seismic processing center.
Anadarko has joined the BP Center for Visualization.
Tigress is looking to hire support geoscientists for work in Russia, CIS, Indonesia, West Africa and Iraq.
TGS Imaging has appointed Young Kim to head up its R&D effort. Kim was previously with ExxonMobil Exploration.
Norwegian consulting house Kadme AS is expanding operations with six recent hires—and is still searching for new employees.
Glen Murphy has resigned as president of WellPoint Systems. Randy Kuehn, VP Strategy, is also leaving. Both will continue to work in a consulting capacity.
Blair Wheeler has joined Aspen Tech as senior VP, Marketing. Wheeler started his career with Amoco.
Schlumberger Information Systems is to hold its annual Technical Symposium at the MGM Grand Hotel, Las Vegas from April 12-14.
Morten Tønnesen is to head up Roxar’s new Asia Pacific headquarters in Kuala Lumpur, Malaysia.
Baker Hughes’ income for 2004 was $528.2 million (up from $177.9 million in 2003). Chairman Chad Deaton said, ‘2004 was an excellent year for Baker Hughes and we expect 2005 to be even better. The oil industry has recognized the need to invest to meet consumer demand, offset depletion, rebuild inventories and restore productive capacity. We expect customers’ investments to grow in Russia, the Caspian area and in the Middle East and to remain strong in North America.’
CGG’s un-audited 2004 total revenues are estimated at €693 million, up 13% in euros and 25% in US$ over 2003. A weak dollar is making life hard for CGG and other non-US operators. A recent Standard & Poor’s study revealed that 85% of CGG’s billings are in dollars, but 50% of its costs are in Euros. A 10 cent difference in the exchange rate translates to a $15 million reduction in CGG’s EBITDA.
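The S&P sensitivity quoted above is simple arithmetic: dollar billings are unchanged when the euro strengthens, but euro-denominated costs translate into more dollars. A back-of-envelope sketch (the EUR 150 million cost figure is illustrative, not from CGG’s accounts):

```python
def ebitda_fx_impact(eur_costs, rate_move):
    """Dollar EBITDA change when EUR/USD rises by `rate_move`:
    euro costs cost more dollars, so EBITDA falls."""
    return -eur_costs * rate_move

# With roughly EUR 150 million of euro costs, a 10 cent rise in EUR/USD
# takes about $15 million off EBITDA -- the order of magnitude S&P cites.
print(ebitda_fx_impact(eur_costs=150e6, rate_move=0.10))  # -> -15000000.0
```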
Computer Modelling Group (CMG) reported revenues of $10.5 million for the nine months to December 31, 2004, 15% up year on year. CMG’s total software license revenues were $7.9 million.
Fugro’s un-audited figures show turnover of approximately €1.02 billion for 2004, up from €0.83 billion.
Input/Output reported a net loss of $3.0 million, on revenues of $247.3 million for 2004 compared to a net loss of $23.2 million on revenues of $150.0 million for 2003. Two acquisitions, GXT and Concept Systems, added revenues of $59.1 million in 2004 and combined were ‘marginally profitable’. VectorSeis System sales amounted to $31 million during 2004. Fees associated with Sarbanes-Oxley compliance added over $1 million in cost. CEO Bob Peebler said, ‘For 2005, we are more closely aligning sales and product responsibilities and are focusing on E&P companies and contractors who want better seismic results. We expect 2005 revenues to range between $320 and $365 million.’ I/O recently completed a private placement of $30 million of a newly designated class of preferred stock to Fletcher Asset Management unit Fletcher International.
Privately held OSIsoft does not publish its financials but president Mark Hughes said the company was ‘riding the crest of a banner year’ in 2004. In 2005, OSIsoft, a.k.a. the ‘PI guys,’ is to reinforce and expand its partnerships, taking advantage of Cisco’s dominance on the Ethernet factory floor, and Microsoft’s Information Worker strategy. OSIsoft also plans to become ‘an integral part of customers’ SOA and IT strategies.’
TGS Nopec reported consolidated net revenues for 2004 of $171.6 million, up 25% from 2003. CEO Hank Hamilton said, ‘These results set new records for net revenues and earnings. Current market conditions give us reason for continued optimism in growing our business.’ TGS’ non-exclusive seismic activity accounted for 87% of business while its A2D digital well log unit accounted for 7% of fourth-quarter consolidated net revenues. A2D completed its integration of Riley Electric Log and achieved its goal of 30% full-year revenue growth. Of the 1.8 million wells in the hardcopy inventory acquired with Riley, 1.3 million now have digital raster images. According to analysts, oil companies plan to up E&P spend in 2005 by between 6% and 15%.
Finnish IT behemoth TietoEnator has spotted the turn around and is to direct its development investments towards the oil and gas sector, where it plans to expand its customer base. In 2004, Energy Components, the unit’s spearhead product family, ‘continued to develop positively’ and these systems were delivered to several oil and gas industry customers around the world.
Oil well monitoring service provider eLynx is to deploy CygNet’s Enterprise Transaction Management software to host process control application services for oil and gas. ETM allows SCADA/real-time applications to be served over the internet, corporate intranets or private networks.
CygNet VP Joe Cusimano said, ‘eLynx is the leading application service provider (ASP) in the oil and gas market, providing remote management of well and pipeline assets via applications hosted at their headquarters. Using CygNet ETM as an application service tool gives eLynx’s customers the flexibility required to optimize production, reduce operating costs and increase compressor and well run-times.’
Doug Redmond, eLynx VP product development added, ‘CygNet software is ideal for use in hosting and managing application services. It’s the highest performance and most scalable real-time data management software in the industrial marketplace. ETM allows us to integrate with a wide variety of field controllers and flow computers, a boon to clients with disparate, legacy automation systems.’
ETM is a suite of integrated applications for enterprise-wide data acquisition, process control, data delivery and archiving. The ETM SCADA server will supply data to the eLynx Framework for ASP well-monitoring application.
New software from Aberdeen-based Helix RDS promises ‘non-intrusive’ measurement of corrosion in mature wells. Previously, assessing downhole corrosion required caliper survey. Helix RDS now promises a ‘VirtualCaliper’ (VC) corrosion prediction service. An ‘empirical corrosion model’ is based on field-measured carbon steel tubular corrosion rates as observed by caliper and pigging surveys.
The model considers both laboratory test and field monitoring data along with corrosion predictions and other variables including water cut rate changes and well geometry. The outputs are wall thickness or corrosion rates versus depth and time. Corrosion rates can be predicted from anticipated well lifecycle conditions.
North Sea test
VC has performed well in blind trials, allowing prioritization of remedial work by early identification of at-risk wells. One North Sea operator used VC for due diligence and to optimize workovers. Another used the program to analyze a 6-year-old producing well to determine if the well would retain its integrity for its remaining life.
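The wall-thickness-versus-time output described above can be sketched in a few lines. The rate model here (a constant corrosion rate scaled by water cut) is a placeholder of our own invention, not Helix RDS’s proprietary empirical model; the figures are likewise illustrative.

```python
def wall_thickness(initial_mm, base_rate_mm_yr, water_cut, years):
    """Remaining tubular wall thickness after `years` of corrosion.
    The water-cut scaling is an assumed, illustrative relationship."""
    rate = base_rate_mm_yr * (0.5 + water_cut)
    return max(initial_mm - rate * years, 0.0)

# Predicted wall thickness (mm) at two-year intervals for a 9.5 mm tubular.
for yr in range(0, 7, 2):
    print(yr, round(wall_thickness(9.5, 0.3, 0.4, yr), 2))
```

Running a model like this against anticipated lifecycle conditions (rising water cut, say) is how a corrosion rate versus depth and time profile would be produced without running a caliper.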
Engineers from Baker’s Centrilift unit and BP Alaska have developed a simulator to predict ESP (electrical submersible pump) run time and reliability. The run life simulator (RLS) uses various production and operational parameters to optimize ESP deployment and maintenance. The software, which allows for dual- and triple-ESP systems, was calibrated on a large number of ESP installations.
In 2004, the RLS successfully predicted inventory requirements for a large South American operator. A simulation run in February predicted that 75 units would be pulled by July. In fact 73 units were pulled. Improving system reliability brings the largest return in reducing workover frequency and lost production. The effect of dual and triple installations with standby redundant units in a well is ‘dramatic’ and should be considered whenever workover costs are high.
In BP’s Milne Point field in Alaska, the critical factors affecting pump life were found to be the set-up of field units. See Earl Bruce Brookbank’s article in the current issue of Baker Hughes’ InDepth magazine for more on the RLS.
Houston-based Interactive Network Technologies (INT) has just released GeoToolkit 3.1, a set of graphic components (widgets) tailored to the upstream. GeoToolkit’s C++ based library lets software engineers ‘fast-track’ E&P software development under Windows, Unix or the cross platform environment, Qt.
INT president Olivier Lhemann said, ‘GeoToolkit now fully integrates Trolltech’s Qt environment. Developers now have access to all the power of the Qt environment, including integrated hardcopy output and support for Qt Designer.’
MRO Software has just announced its next-generation solution, Maximo Enterprise Suite (MXES). MXES provides asset and service management, leveraging Maximo’s IT Infrastructure Library (ITIL). MXES includes contributions from partners InteQ, Getronics and UNICCO. Users of Maximo Oil & Gas will migrate to the new platform in the coming months.
InteQ president Yash Shah said, ‘Including partners in MXES development allows for a solution that extends beyond our traditional expertise. MXES is a well-designed solution for the asset management and service desk markets.’
MXES helps ensure compliance with contracts, service level agreements, internal corporate standards, and government regulations such as Sarbanes Oxley. MXES consolidates asset management systems into one platform spanning production, facilities, transportation and IT assets, with access from a single service desk.
MRO Software executive VP Patricia Foye added, ‘By combining our background in asset and service management with our partners’ expertise, MXES provides deep functionality for IT-based asset management.’
The OPC Foundation (OPCF) has joined with the Foundation Fieldbus organization to provide a common, ‘unified architecture’ for open data exchange based on the IEC 61804-2 device description language. The OPCF joins the three existing IEC signatories, Fieldbus, Profibus and Hart Foundations to ‘extend the reach’ of electronic device description (EDD) into OPCF’s architecture.
Electronic Device Description Language (EDDL), a text-based language for describing the digital communication characteristics of intelligent control devices, underpins millions of installed field instruments. Device suppliers use EDDs to provide information on parameters and other data in a device.
The working group has developed extensions enabling robust organization and graphical visualization of device data, and provided support for platform independent, persistent data storage. When finalized, the extensions will be integrated within the respective control network technologies and added to the IEC 61804-2 standard.
OPC originally leveraged Microsoft’s OLE technology, but has since evolved through COM, DCOM and latterly .NET. It remains a primarily Microsoft-based specification. Signing up to Fieldbus means that OPC will be able to integrate non-OPC devices and vice-versa.
Fieldbus president Richard Timoney said, ‘This agreement will result in a vastly simplified approach for users to access and distribute performance measurements and process data such as alarms. Users will also take advantage of increased system interoperability and cost-effective control system integration.’
OPCF president Tom Burke added, ‘By working with standards like OPC and EDDL, users can take advantage of open systems and be assured of connectivity and interoperability.’
Landmark’s Consulting and Services unit has just announced a new Upstream Technology Review Practice to leverage Landmark’s domain expertise with a ‘classical consulting’ approach. Several Landmark clients are engaging in ‘comprehensive upstream technology reviews’ (UTR).
Practice director Rick Mauro told Oil IT Journal, ‘The UTR practice leverages Landmark’s in-depth domain knowledge from bits and bytes through portfolio management. We begin with extensive interviews to analyze the root causes of clients’ problems. UTR scope spans IT backbone, technical computing, knowledge management and best practices. The aim is to optimize how people use technology to address business needs.’
Landmark’s consultants typically have 15-20 years of hands-on oil and gas experience. The UTR begins with interviews of executives and asset managers to understand key business drivers and known technological challenges. Asset team members are then interviewed to analyze workflows and tools and to identify pitfalls, bottlenecks and ‘what keeps them awake at night’.
George Kronman, quoted in Landmark’s Solutions newsletter, said, ‘The outcome of a UTR is a coherent, sustainable digital technology vision and strategy.’
Early UTRs found that technical experts were spending around 30% of their time finding, moving, reformatting, loading, quality-controlling and otherwise dealing with data. ‘Software chaos’ and inconsistent databases are ‘epidemic’.
Oklahoma City-based Enogex has deployed simulation/optimization (simopt) technology from eSimulation Inc. of Houston. The eSimOptimizer process optimization service has undergone successful testing at Enogex’s Custer gas plant, justifying a more extensive deployment.
Dan Harris, COO of Enogex, said, ‘eSimulation’s process optimization service has contributed to our improved performance over the last three years. We like their business model whereby our technology investment adds value to their business in a sustained fashion. We like their focus on documenting post audit results to justify further deployment.’
eSimOptimizer determines optimal liquid recoveries based on energy costs, feed conditions, capacity, contract structure and commodity prices. Results are published in the form of operational ‘setpoint suggestions’ via a secure web page. eSimulation president Mark Roop added, ‘As our first client, we appreciate Enogex’s willingness to embrace web-based optimization to drive business results.’ eSimOptimizer stores plant data and optimization results in a historical database. Web-based access provides decision support and data management tools for plant personnel, engineers, sales and business managers.
A key differentiator in eSimulation’s business model is the inclusion of engineering services required to keep the economic and process models aligned with current conditions.
Houston-based Quorum Business Solutions is to release a new software package for oil and gas volumetric data management. Quorum Volume Management (QVM) combines traditional field data capture and oil and gas measurement capabilities with a centralized data repository.
The software manages and validates volume data including oil, gas, water and petroleum products and offers data editing, recalculation and volumetric allocation. QVM’s web-based reporting supports multiple units of measure and languages.
Quorum VP Roland Labuhn said, ‘We have identified a segment of the oil and gas market that is not adequately served by existing software products. QVM’s functionality will help oil and gas companies struggling to manage volumetric data with disjointed software tools.’
QVM is an integrated component of the Quorum Energy Suite, a set of integrated business applications for energy companies. The package includes Quorum Upstream (land management, GIS, production accounting etc.), Quorum Midstream (including TIPS gas plant accounting) and Quorum Pipeline for pipeline transaction, integrity and right-of-way management. Quorum has 160 staff operating out of offices in Houston, Dallas and Calgary and has 20 Fortune 500 clients.
Houston-based pipeline management software house Energy Solutions International (ESI) has reported two new clients for its pipeline decision support technology. Both Calgary-based Suncor Energy Services and Oleoducto de Crudos Pesados (OCP) of Ecuador have licensed ESI’s PipelineTransporter (EPT) package.
ESI VP Rene Varon said, ‘Suncor’s project is part of a corporate-wide initiative to standardize key processes and technologies. ESI is helping Suncor meet its objectives by focusing on production planning and scheduling processes, common points of integration and standardization.’
EPT streamlines information flow between the transportation company, its shippers and clients, gathering nominations and lifting schedules through a flexible yet secure web interface. These are validated against physical and contractual constraints, allowing the pipeline scheduler to program resources to meet the shipper’s requirements. The software includes an integrated hydraulic model for capacity studies, DRA usage, operating costs, fuel usage and downtime reduction.
OCP chief engineer Andrés Mendizabal said, ‘ESI started doing business with OCP during pipeline construction, providing leak detection, offline simulation and operator training applications. ESI’s technology has proven its worth under this pipeline’s challenging conditions of extreme elevation and a very heavy product.’ OCP’s 500-kilometer pipeline crosses the Andes, reaching an elevation of 4,064 meters (13,333 ft.). The contract includes software configuration to OCP’s receipt, transportation, storage and delivery processes. EPT interfaces with third party components, such as SCADA, optimization/modeling, billing and accounting systems including SAP and JDEdwards.
BP’s Gulf of Mexico Deepwater Production Unit has selected P2 Energy Solutions’ Enterprise Upstream Volumes Management (EUVM) suite for deployment in the Gulf of Mexico. Implementation will be jointly managed by P2 Energy Solutions and Science Applications International Corporation (SAIC) and is to include ‘comprehensive functional and technical consulting services’.
BP chose Enterprise Upstream for its ability to handle complex deepwater production handling agreements with commingled production. Enterprise Upstream also offered BP a good fit with other engineering systems in use. The software handles the complex simulations used to allocate production and gas energy according to BP’s partner contracts.
BP will upload lease-level volumes into SAP accounting, which is performed by IBM under an outsourcing agreement. BP is migrating to EUVM to achieve one-time data entry and Sarbanes-Oxley compliance. EUVM also offers joint venture partners and regulatory bodies an audit trail. BP’s GoM production tops 300k boe/d and is expected to double by 2007.
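At its simplest, allocating commingled production means splitting a measured total back to individual wells pro rata to their well-test rates. The sketch below shows that principle only; real deepwater production handling agreements layer contract terms, shrinkage and energy content on top, and the figures are invented.

```python
def allocate(total, test_rates):
    """Split a measured `total` across wells in proportion
    to their well-test rates (pro-rata allocation)."""
    s = sum(test_rates.values())
    return {well: total * rate / s for well, rate in test_rates.items()}

# Illustrative: 300,000 boe/d commingled, three wells with recent tests.
print(allocate(300_000, {"A": 120_000, "B": 90_000, "C": 90_000}))
# -> {'A': 120000.0, 'B': 90000.0, 'C': 90000.0}
```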
IHS Energy has upgraded its Que$tor Offshore (QO) economics package to enable cost, facilities and reservoir engineers to evaluate complex schedules for capital and operating expenses. QO V8.0 also heralds a port to .NET.
QO user and evangelist, Steve Johnson of Doris Inc., said, ‘When we chose QO for concept-level capex studies on deepwater field developments, we were concerned about the time required to run the large number of simulations required. In fact, we managed to evaluate 25 cases in a few days using a beta release of QO.’
QO’s cost calculation engine now embeds IHS Energy’s field and basin databases, as a starting point for project evaluation. The regional cost databases are updated every six months with real-world project data from oil companies, fabricators, vendors and service companies.
One additional capability receiving high marks from beta customers is the entry of data in any units, not just metric or Imperial, and the ability to change the units of any data item at any point.
Other capabilities include enhanced field development plan graphics, access to the technical algorithm databases (allowing users to tailor these with their own technical data) and sensitivity analyses. IHS Energy claims over 500 QO users in 40 countries. The development team is now working on an onshore version for delivery later this year.
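Mixed-unit entry of the kind beta users praised is usually implemented by storing values in a canonical unit internally and converting on entry and display. A minimal sketch of that pattern (the conversion table and data model are our assumptions, not IHS Energy’s implementation):

```python
# Factors to a canonical SI unit per quantity; standard conversions.
TO_SI = {"m": 1.0, "ft": 0.3048, "km": 1000.0}

def convert(value, from_unit, to_unit):
    """Convert via the canonical unit, so any pair of known units works."""
    return value * TO_SI[from_unit] / TO_SI[to_unit]

print(round(convert(1000.0, "ft", "m"), 1))  # -> 304.8
```

With this structure, changing the display unit of any data item at any point is just a different `to_unit` argument; no stored data changes.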
Wholly-owned OSIsoft unit WiredCity has received certification from SAP for its ‘IT Monitor’ ERP performance interface. IT Monitor, based on OSIsoft’s RtPM real-time platform, monitors the end-to-end performance of SAP-based IT environments, allowing companies to reduce network maintenance costs and improve system performance through ‘complete’ system visibility. IT Monitor integrates with SAP R/3 as an ABAP add-on.
WiredCity CEO Martin Otterson said, ‘IT Monitor helps managers and administrators optimize their IT environment by providing an end-to-end view of all real-time and historical SAP R/3 Enterprise performance data. IT Monitor’s data collection and resolution capabilities minimize data loss and meter IT performance.’
IT Monitor relates SAP performance metrics to infrastructure components including network, server, database and devices. Users can drill-down through their data to analyze the causes of poor system performance. Powered by OSIsoft’s PI System, IT Monitor provides IT managers with accurate capacity planning and root cause analysis.
The software also tracks issues such as version upgrades, module additions and ‘general issues impacting overall IT performance’. IT Monitor displays the real-time status of any network, device or application, identifying ‘hot spots.’ By analyzing event history, managers can improve future performance, reliability and security.
Clean Air Act
Another new OSIsoft development helps organizations comply with new ‘Title V’ legislation which requires facility owners to certify environmental compliance. The new US rules shift the burden of compliance monitoring from regulatory agencies to the owner/operator. Companies must keep records proving compliance and issue exception reports when required. OSIsoft’s Real-time Performance Management Platform (RtPM) enterprise-wide data visualization and reporting solutions have already been used to demonstrate compliance by Wasatch Energy Systems.