We often hear calls for ‘standardization’ in the upstream IT context. For some this may mean laudable efforts to use published data standards. For others, a ‘standard’ may mean using Microsoft Windows as opposed to Linux or Unix—or vice versa! In the marketing department these two rather different usages are deliberately confused. I am not sure if anybody in the upstream actually believes that by using Microsoft’s software they are facilitating alignment with, for instance, the SEG’s tape standards, data standards from POSC or PPDM or any other vertical data formats. But this is the intent behind the marketing drum which is bang, bang, banging on—almost deafening anyone who attempts to speak to the real issues.
A good example of the practice can be seen in the upcoming Schlumberger Information Solutions ‘Open Technology Symposium.’ The preliminary agenda includes a Schlumberger presentation on ‘Delivering Innovation through Open Standards’ and another entitled ‘How Microsoft supports openness for the industry.’ You couldn’t make it up.
Moving the standards debate from where it belongs (in the business) to the application and operating system arena is rather like claiming that it is easier to communicate in Latin than in ancient Greek. This is not a bad analogy because learning an ancient language is hard—just like learning a computer system. And in the end, speaking a particular language has nothing to do with what you say and whether or not it makes sense. In fact, IT generally has very little to say about how computers are used to run a business. Most of IT is about arcane issues that are important if you are writing code, but which can actually get in the way of your business. But I digress.
A sad tendency of the orgs, the SPE, SEG etc. is to offer a lofty perch upon which the IT vendors can bang the marketing drum. At the SEG for instance, you will hear far more references to standardizing on Microsoft than on the Society’s own tape standards. Nevertheless, we listened attentively to what Microsoft had to say at the excellent SPE Gulf Coast Section’s Digital Energy 2006 conference—more of which in next month’s Oil IT Journal.
Mike Brulé, CTO with Microsoft’s Energy vertical, queried whether SOA* could be considered the ‘new silver bullet’. Brulé stated that, ‘SOA is orthogonal and says nothing about data management, data quality and system performance.’ Well I will credit Brulé with using ‘orthogonal’ which is a nice word. But as anyone from the WITSML community will tell you, SOA—in so far as it is represented by the real time data validation offered by the SOAP infrastructure—actually says a lot about data management and data quality. Incidentally, SOA probably also says a lot of bad things about system performance—but that is another matter.
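The kind of validation the WITSML community has in mind is easy to sketch. The fragment below checks a drilling-data message for required elements, units of measure and numeric values; the element and attribute names are invented for illustration and are not the actual WITSML schema.

```python
import xml.etree.ElementTree as ET

# Hypothetical drilling-data message. Element and attribute names are
# invented for illustration -- this is not the actual WITSML schema.
MESSAGE = """
<log well="A-12">
  <depth uom="m">3251.5</depth>
  <rop uom="m/h">14.2</rop>
</log>
"""

def validate(xml_text):
    """Return a list of validation errors; an empty list means clean data."""
    errors = []
    root = ET.fromstring(xml_text)
    for tag in ("depth", "rop"):
        el = root.find(tag)
        if el is None:
            errors.append(f"missing element: {tag}")
            continue
        if el.get("uom") is None:
            errors.append(f"{tag}: no unit of measure")
        try:
            float(el.text)
        except (TypeError, ValueError):
            errors.append(f"{tag}: non-numeric value")
    return errors

print(validate(MESSAGE))  # []
```

A service endpoint running checks like these on every incoming message is what turns a mere transport layer into a data quality gate.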
Google was also present at Digital Energy 2006 with another strong marketing message. Google’s Enterprise Solutions Director Michael Lock deprecated the old way of doing business with ‘cumbersome manual tagging’ to enable search. Google is offering its pretty good search technology to enterprise clients (OITJ Vol. 9 No. 11) but Lock is playing to the gallery by suggesting that you no longer need to add metadata to your documents for accurate search. Google itself makes great use of the tags and keywords that conscientious web meisters add to their pages. When Google comes up with a load of junk it is because such context is not there. Google is good, but it’s not magic!
You may detect a degree of irritation in this editorial and I have to confess that both Brulé’s remark on SOA and Lock’s deprecation of tagging were in complete contradiction with the ideas I developed in my talk at the IQPC Data and Knowledge Management conference held in London last month (see page 6 of this issue). My thesis, which I will be further developing in a talk to be given at another IQPC conference in Bahrain this April, is that SOA is a game changer. But also that metadata, tags if you will, will be increasingly important if data managers are to keep their heads above water in the data ‘tsunami’. I should add hastily that I claim no exclusivity for these ideas. They are widely held and form the basis of much of the current W3C’s work.
Strong endorsement for SOA also came from Schlumberger Fellow and co-inventor of the Java Card, ‘the most used computer in the world,’ Bertrand du Castel, who emphasized that ‘web services is not a fad.’ Du Castel stressed the twin fundamental advances of XML and ontologies and their particular applicability to oilfield automation.
Looking ahead, when some of these shiny new technologies are in widespread use, you might like to ask how easy it will be to find information, and what its quality will be when you’ve found it. Well I would like to suggest that finding things will be easy and accurate if they are tagged right in the first place. And that information quality will be high if data is stored in a robust, self-describing format that has been validated at every stage of its life cycle thanks to web services.
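The tagging argument is easy to demonstrate with a toy example. The two ‘documents’ and their tags below are invented; the point is that a plain text search returns a false positive which a tag search avoids.

```python
# Toy illustration of tag-based versus free-text search. The documents
# and tags are invented. A text search for 'well' hits both records;
# a tag search returns only the record an archivist tagged as well data.
DOCS = [
    {"id": 1, "text": "Final well report, 31/2-8, Haltenbanken.",
     "tags": {"well-report", "norway"}},
    {"id": 2, "text": "Minutes: we may well revisit the budget.",
     "tags": {"meeting-minutes"}},
]

def text_search(term):
    return [d["id"] for d in DOCS if term in d["text"].lower()]

def tag_search(tag):
    return [d["id"] for d in DOCS if tag in d["tags"]]

print(text_search("well"))        # [1, 2] -- a false positive
print(tag_search("well-report"))  # [1] -- precise
```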
But the hard bit—and it is still going to be hard—is getting the data tagged and pushing the vendor community to perform the validation. SOA, as exemplified by WITSML, is a good start, but it probably needs a generation or two of further development before its full potential is reached. As for tagging, well, what is at issue here is not really data management, knowledge management or even information management. It is just plain old ‘management’: getting people to do what they should be doing. If that means filling out a few boring old forms then so be it. Who said life was fun?
* Service-oriented architecture.
Arcapita Ltd., the European private equity arm of Bahrain-based Arcapita Bank, has acquired Roxar from its previous owners, Lisme AS, a Smedvig and Lime Rock Partners investment vehicle. The transaction is valued at around $200 million and is Arcapita’s first investment in the oil and gas sector and its largest investment in Europe to date. Royal Bank of Scotland and Barclays Bank provided senior and mezzanine financing of US$91 million for the acquisition, together with a US$25 million working capital facility. Roxar senior management, led by CEO Sandy Esslemont, is to retain a 1.2% shareholding in the company.
Arcapita Executive Director Mounzer Nasr said, ‘We see substantial growth opportunities as operators embrace new ways of maximizing production and implement reservoir management techniques earlier in the production cycle. We believe Roxar is well positioned to take advantage of this trend.’
Arcapita’s John Madden told Oil IT Journal, ‘We have been looking at the oil and gas vertical for a few years and were particularly interested in the offshore and technology sectors. So when Roxar was put up for sale, we jumped on it.’
Arcapita typically holds its investments for three to five years. But Madden explained, ‘We are not a fund and are not constrained by fund life. We can take a longer view. We also see this as a growth sector and will be looking for smaller companies to fit into the Roxar group which could add to either the equipment or software portfolios or boost the services offering. We are enthused with our first oil services technology investment and we want more!’
Esslemont added, ‘The fundamentals for our markets, particularly sub-sea multiphase metering and 3D reservoir modeling, have never been stronger and continue to improve. Arcapita gives us the financial support that will help us develop our competitive position in these markets and increase our ability to research and invest in the oil and gas technologies of the future.’
Roxar has two distinct lines of business (LOB), software and real-time flow measurement. Roxar’s flagship software package is its Irap RMS reservoir modeler. Its flow measurement offering includes high tech hardware for subsea multi-phase flow metering and data acquisition. Roxar’s 2005 turnover was around $130 million, with almost 70% from its metering division.
The big question for Roxar’s new owners is whether to split the two LOBs or to try to bridge the gap between metering and modeling. For more on Roxar’s business, read our 2004 interview with Sandy Esslemont (OITJ Vol. 9 N° 4) and the update to appear in next month’s issue.
Norwegian data consultancy and software developer Kadme AS and the Dutch geological survey, TNO are setting up a joint venture, Lerya, to ‘liberate’ E&P data from proprietary formats. Kadme and TNO have been collaborating on open source technology for use in National Data Repositories since 2002.
Last year (OITJ Vol. 10 N° 1) the companies announced a standards-based IM solution for the upstream, an international version of TNO’s DINO geo data repository as used to house Dutch geological and environmental data. Lerya’s software embeds Kadme’s K-map and K-doc web services components that leverage open source tools such as the OpenGIS-based MapServer from the University of Minnesota, web services for remote portlets, JBoss and the mySQL database.
Lerya is bidding for the renewal of the prestigious Norwegian DISKOS data management contract, currently run by Schlumberger’s SINAS unit using technology from Landmark, Petris and Kadme itself. Contract award is expected mid April. The word ‘lerya’ is taken from ‘Elvish’, one of several languages invented by Lord of the Rings author J.R.R. Tolkien. Lerya means ‘to set free, to liberate’.
How does Oildex SpendWorks fit into the e-commerce picture?
Operating oil companies (OpCos) receive a huge number of invoices, especially while drilling. A midsize to large independent operator might have 12,000 to 20,000 invoices per month (see for instance the Chesapeake article, OITJ Vol. 11 N°1). This can involve quite complex activity between remote field offices and HQ. In the US this also means Sarbanes-Oxley traceability. OpCos make perfect candidates for a move away from the invoice payment ‘paper chase’. Studies show that payment of an invoice costs in the region of $15-20. Oildex brings this down to $5 with our ASP* service. There are also considerable business intelligence benefits to be gained from going digital.
How does it work?
Oildex supports a network of thousands of suppliers submitting e-invoices. Some larger suppliers like Halliburton and Schlumberger are used to providing e-invoices but not everybody is equipped for this. Our three-tiered service offering allows all companies to benefit from e-commerce. Companies submit invoices through the Oildex website either electronically or by keying them in. Big companies subscribe via XML and HTTPS using industry standards. For smaller companies, QuickBooks is the prevalent technology. Next we offer workflow support of the payment process. Invoices are ‘moved around’ digitally for approvals. Accounting codes are tied to well field tickets and distributed around the country from field to head office. All this used to be done with internal company mail, scanning invoices and using document management systems. Today, SpendWorks lets people do this in a Microsoft Outlook-type interface with electronic workflows. This offers significant scaling benefits as industry activity ramps up. One Oildex customer quadrupled procurement in three years without hiring anyone.
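The ‘XML and HTTPS’ route for big companies can be sketched as follows. All element names here are invented for illustration; the real PIDX invoice schema is far richer.

```python
import xml.etree.ElementTree as ET

# Sketch of a minimal XML e-invoice of the kind a supplier might POST
# over HTTPS. All element names are invented for illustration; the real
# PIDX invoice schema is far richer.
def build_invoice(invoice_no, well, lines):
    inv = ET.Element("Invoice", number=invoice_no)
    ET.SubElement(inv, "Well").text = well
    total = 0.0
    for desc, qty, unit_price in lines:
        item = ET.SubElement(inv, "LineItem")
        ET.SubElement(item, "Description").text = desc
        ET.SubElement(item, "Quantity").text = str(qty)
        ET.SubElement(item, "UnitPrice").text = f"{unit_price:.2f}"
        total += qty * unit_price
    ET.SubElement(inv, "Total").text = f"{total:.2f}"
    return inv

inv = build_invoice("INV-0042", "Smith 1-H",
                    [("Drilling mud, bbl", 120, 85.50),
                     ("Rig day rate", 3, 14500.00)])
print(inv.find("Total").text)  # 53760.00
```

Because every line item travels as structured data rather than a scanned image, the downstream business intelligence described below comes almost for free.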
And your third tier?
In oil and gas, sales are rather straightforward; the gas price is determined by referring to the Henry Hub. But there is considerable potential for improvement on the bottom line. Our new business intelligence offering, which is just about to be released, addresses the issue of reporting timeliness. In the past, reporting lagged spend by at least thirty days. The level of detail of traditional business intelligence also leaves a lot to be desired. There may be a hundred line items in a frac job, all of which got lost doing it the old way. Now all such information is there, available for analysis. Oildex is working on a new data warehouse and GUI so that anyone can look at and analyze the data. Real time business intelligence is now available within 48 hours of the activity.
Who uses your technology?
We were the first to go to market in 2002 and have three years of use by early adopters. We are now entering the mainstream.
In your Chesapeake announcement last month you made much of Software as a Service (SaaS) and Application Service Provision (ASP). How important is this to your offering?
Critical. Our e-payables offering is entirely enabled by ASP/SaaS technology, including our workflow offering. Even larger companies with their own workflow tools (like Chesapeake) will use some of our SaaS technology, for instance to map accounting codes to line items.
Some of this has been attempted before—like with PetroCosm and Trade Ranger.
Yes, we saw them flame out! We did the same thing as the portals but with private VC funding. These e-marketplaces brought in technology from outside; we built an industry-tuned solution. Oil and gas is invoice-driven, not purchase-order driven. A verbal order, or a master contract-driven ordering process, is the norm. You only know the real cost of a job after the fact.
Do you leverage the API’s PIDX standards in this space?
Sure, we are an active member of OFS Portal and PIDX, where we participate in the business messaging workgroup. The PIDX XML standard for invoices is in use by all OFS Portal members.
What do you use for your own formats?
XML is interesting technology. All our internal stuff is in XML. This is also deployed in our other line of business—Joint Venture reporting with CheckStub Connect. Production from 500 companies (40% of the US onshore total) is reported through Oildex. The technology started out as an EDI-based solution operated through General Electric. Today SpendWorks touches around 1,000 companies.
Who are your flagship e-payables clients?
Anadarko, Chesapeake, and mid market companies.
So how do the majors do it?
They have adopted private exchanges. Chevron for instance, like most majors, is an SAP shop. It acquired technology from Ariba for gathering XML data from OFS Portal members. Smaller suppliers key their data into the system. But even the majors lack our market penetration with suppliers.
Can you co-habit with SAP?
Our clients tend to look to us to manage the spend process. But we do integrate SAP systems if required. Some companies just use us for e-invoice delivery. The ecosystem is evolving—majors are customizing SAP workflows. As I said before, our latest offering, the next big thing, is a new business intelligence solution. The software is currently in beta.
Tell us more about SaaS.
Something was lost with the move from client-server to the web. ASP applications have been criticized as unresponsive. They are not so rich and seem ‘clunky’. This is why travel agents don’t use Travelocity! One weak link in many current ASPs is Citrix. So we are looking at newer technologies which put some of this performance back in the system. There are multiple lines of attack on this problem—AJAX, Smart Clients and much of .NET sets out to address these issues. Microsoft VISTA is another big push in this space.
What technology do you advocate?
Funny you should ask! Our workflow solution makes significant demands on the client and users want performance. Users would also like to work offline while in the field. Our new SpendworksUnplugged provides an Outlook-style interface—going beyond the browser, leveraging much more mature technology.
* Application service provision—hosted software.
Seismic Micro Technology (SMT) has just released V 4.0 of its (RC)2 geologic modeling package. SMT now claims a fully integrated product line comprising Kingdom, (RC)2 and Sure. (RC)2 builds geostatistical models for use in streamline flow modeling or in the SURE reservoir simulator. A free thirty-day evaluation copy of (RC)2 is available. SMT has also announced a Beta version of OpenKingdom, its new data management toolkit for Kingdom projects.
dGB has released Stable Release 2.2.2 of OpendTect, adding new functions developed as part of the OpendTect User Friendliness (OUF) project. These include random lines through well tracks, shortcuts for moving inlines/crosslines and new display options. Plug-ins are available for lithology classification from well logs, along with a Beta version of a link to ArkCls. Gaz de France supported the OUF project.
OpenSpirit has announced help for Canadian companies who are migrating spatial data from NAD27 to the more accurate geocentric-based NAD83. The Coordinate Migration Service used in all OpenSpirit products includes the NTv2 datum shifts, meets government and CAPP requirements, assures spatial data integrity and minimizes datum migration costs.
Blue Marble Geographics has released V 3.0 of its Geographic Map File Translator, now with Flex LM-based licensing. The Translator supports ESRI Shapefiles, MapInfo TAB and MIF, AutoDesk DWG and DXF through v2006, Microstation DGN, and import of ESRI .e00 format. The Translator offers full EPSG database support, and creation of custom coordinate systems. The map file viewer window now supports raster formats for co-rendering of spatial and DOQQ image libraries.
Electronics Corporation of India Limited (ECIL) has chosen Energy Solutions International’s PipelineManager package for four major pipeline projects in India.
UK-based GIS data transformation specialist Laser-Scan has just announced its Radius Studio spatial data quality application. Radius Studio is a component-based data integration and quality management tool for applications that use Oracle Database 10g. Users can build and manage rule-based processing workflows for spatial data.
UK headquartered SDC Geologix has just released WellXP, a secure, hosted wellsite operations and reporting package. WellXP captures and manages well-related information in a sharable off site store. Entitlements are configured for role based access so that stakeholders can only see data to which they are entitled.
WellXP stores basic well documents from initial prognosis to final report along with associated data on a remote hosted server. Geologix claims that the easy access to information improves the decision making process and decreases the time required to make operational adjustments. WellXP collects digital information that is often dispersed across the enterprise and poorly organized. WellXP encourages information re-use, for instance in planning subsequent offset well programs.
A simple folder structure and Table of Categories allows any digital format to be stored and shared with team members, management, joint venture partners and regulatory or government agencies. WellXP provides users with concurrent access to the most up to date information. WellXP’s interface is opened from a standard web browser. On completion of a well, information may be archived or downloaded to a browseable DVD. Geologix’ servers are physically located in a secure server room and protected 24x7x365 with biometric technology. The secure website, outside of the corporate firewall, assures network integrity. All files are scanned for viruses during upload and download.
The European Petroleum Survey Group (EPSG) has been reformed as the Surveying and Positioning Committee of the International Association of Oil and Gas Producers (OGP). The former EPSG Geodesy Working Group is now the Geodesy Subcommittee of the OGP Surveying and Positioning Committee. This Subcommittee’s membership is unchanged and will continue to maintain the EPSG dataset. The dataset will continue to be known as the EPSG dataset.
The EPSG has also just released version 6.9 of its geodetic parameter dataset, a free download from epsg.org. V 6.9 includes transformations published by the US National Geospatial Intelligence Agency (NGA) for oceanic islands not previously included in the EPSG database. Area table records for maritime countries based on ISO 3166 have had their area of use clarified to cover the whole country, both onshore and offshore.
As in previous releases the V6.9 dataset is available in two forms: either an MS Access database which also includes a data reporting capability, or as a series of SQL scripts for populating an alternative relational database application. More from www.ogp.org.uk and www.epsg.org.
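Loading the SQL-script form of the dataset is straightforward in any relational database. The sketch below uses in-memory SQLite with a two-row stand-in; the table and column names are simplified, not the actual EPSG schema, though 4267 and 4269 are the genuine EPSG codes for NAD27 and NAD83.

```python
import sqlite3

# The EPSG SQL scripts can populate any relational database. This loads
# a two-row stand-in into in-memory SQLite; the table and column names
# are simplified, not the actual EPSG schema. Codes 4267 and 4269 are
# the genuine EPSG codes for NAD27 and NAD83.
SCRIPT = """
CREATE TABLE coord_ref_sys (code INTEGER PRIMARY KEY, name TEXT);
INSERT INTO coord_ref_sys VALUES (4267, 'NAD27');
INSERT INTO coord_ref_sys VALUES (4269, 'NAD83');
"""

db = sqlite3.connect(":memory:")
db.executescript(SCRIPT)
row = db.execute("SELECT name FROM coord_ref_sys WHERE code = 4269").fetchone()
print(row[0])  # NAD83
```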
Calgary-based GeoLogic Systems has acquired the PetroCube product line from developer, PetroSleuth. PetroCube is a web-based, petroleum analysis service that provides subscribers with current, unbiased decision criteria for exploiting both new and existing petroleum assets in the Western Canadian Sedimentary Basin. PetroCube provides financial, technical and geostatistical analytical tools.
GeoLogic has also signed a multi-year development agreement with AJM Petroleum Consultants, the prior marketer of PetroCube and related services. AJM conducts acquisition and divestiture studies and evaluations of non-conventional resources such as coal bed methane.
GeoLogic president David Hood said, ‘We are combining our expertise in adding value to data products with the AJM consultancy to provide a service that is greater than its individual parts. PetroCube customers will be able to access more resources and solutions as we move ahead.’
Robin Mann, CEO of both AJM and PetroSleuth added, ‘GeoLogic has components that we would have had to develop. Our clients will retain our expertise since AJM will be an integral part of the development of the knowledge-based product that PetroCube has become.’
Houston-based Rock Solid Images (RSI) has added OpenSpirit data integration to its iMOSS flagship rock physics-driven seismic modeling and analysis package. RSI has licensed the OpenSpirit application and data integration Developer’s Kit to provide integration with third-party applications and data stores.
iMOSS applies rock physical principles to the study of seismic amplitudes and is used in high-end applications including 4D time-lapse seismics, AVO and impedance interpretation and pore-pressure calibration. OpenSpirit provides access to input data and also allows for back population of geophysical applications and data stores for further interpretation.
RSI VP Gareth Taylor said, ‘Rock physics models and conditioned log data for synthetic seismic generation is a key to exploration success. iMOSS technology, paired with OpenSpirit, will provide efficient access to this data, enabling geoscientists to understand subtle relationships between reservoir properties and their seismic expression.’
OpenSpirit reports that 30 software vendors have licensed its dev kit and there are some 1500 oil company end users in more than 50 countries using its framework.
PGS has just laid claim to a 3D seismic production record for its Ramform Valiant seismic vessel which acquired 4,656 sail kilometers last December in the Camamu Almada Basin, Brazil. The Valiant towed a ten x 6,000 meter streamer array and achieved a peak production rate of 2,328 square kilometers per month.
Jeroen Hoogeveen supplied some technical tid-bits to Oil IT Journal readers, ‘9 seconds of 2 ms, 24-bit data was dual-recorded in SEGD on IBM 3590 cartridges. Production data rates reached 6.8 MB/s and daily data volume was around 375 GB. During the record month the total data volume was 11.66 TB.’
PGS has successfully recorded 1ms data to 3590, but the 12MB/s data rates push this technology to the limit. PGS is moving to 300GB capacity 3592 tapes. Navigation was achieved with a network of 80 acoustic ranging devices, 220 compasses and 18 GPS units. The navigation solution was calculated in real time.
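A back-of-envelope check shows the quoted figures hang together, assuming decimal units and a 31-day record month:

```python
# Back-of-envelope check on the quoted PGS figures (decimal units,
# 31-day record month assumed).
month_bytes = 11.66e12                   # 11.66 TB for the month
daily_gb = month_bytes / 31 / 1e9
print(round(daily_gb))                   # 376 -- matches 'around 375GB'

avg_mb_s = month_bytes / 31 / 86400 / 1e6
print(round(avg_mb_s, 1))                # 4.4 MB/s average, comfortably
                                         # below the 6.8 MB/s peak

# 9 s records sampled at 2 ms give 4,500 samples per trace
print(round(9 / 0.002))                  # 4500
```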
Saudi Aramco unit, Aramco Overseas Co. has teamed with Earth Decision Sciences (EDS) to commercialize its in-house developed GeoMorph package. Aramco’s patented geosteering tool takes real-time logging while drilling data into the visualization center and earth model. GeoMorph is now integrated with EDS’ geomodeling flagship GoCad and was used successfully during drilling operations on Saudi Arabia’s supergiant Ghawar field.
Aramco’s Ahmed Metwalli said, ‘GeoMorph excels in real-time earth modeling and geosteering and enables us to achieve pin-point accuracy at the reservoir contact.’ GeoMorph inventor Roger Sung added, ‘In complex geological settings, adjusting the direction of the drill bit in real time with an ‘on the spot’ roadmap can make or break the success of a well.’
EDS CEO Jean-Claude Dulac concluded, ‘GeoMorph shortens rig downtime by enabling models to be re-interpreted in minutes and hours, rather than days and weeks. Drilling plans are easily updated and adjusted, providing the flexibility so critical to field development.’
Oil IT Journal editor Neil McNaughton kicked off IQPC’s Knowledge and Data Management conference with a talk on ‘10 years of data management—from data management to managed data.’ Recently there has been a step change in how a ‘standard’ can be defined in IT terms. New technologies offer on the fly data validation. XML can be used to package, for instance, seismic trace data with unambiguous positional information as in OpenSpirit’s ‘managed’ seismic data format. Data validation can be applied to HTML—and McNaughton showed some interesting validation metrics from vendor and oil company home pages. The SOAP infrastructure that underpins WITSML can be used to QA data flows between drilling rig and visualization center. But it is in the context of the digital oilfield that these technologies will come into their own, along with emerging standards for deploying machine-understandable taxonomies like the W3C’s Simple Knowledge Organization System (SKOS).
Social Network Analysis
Repsol YPF’s Augustin Diz described the ‘medieval’ stage of data collection where ‘everything is preserved but relatively little is used’. The result is that knowledge decreases over time. Repsol now has 15 widely-used E&P communities of practice (CoP). CoPs may be perceived as outside the ‘process’ and issues that are not of local significance may receive poor middle management support. Nevertheless, Repsol has published its ‘better’ practices and is benefiting from the new possibilities of ‘social network analysis’ (SNA). Using data from CoPs and network traffic, it is possible to find out who people really talk to—their boss, or someone on the other side of the world. Repsol is now performing process analysis and K-mapping to support and leverage future SNA.
Statoil’s Liv Maeland described three $100 million IM projects: BRA, a major finance/admin project and SAP migration completed in 2000; SCORE, an E&P technical computing project which ran from 1998 to 2001; and a third, ‘email@example.com’, which Statoil is now finishing up. This is a ‘major major’ project introducing comprehensive metadata across G&G.
Maeland cited Prof Marchand’s (IMD Lausanne) work on ‘Navigating Business Success’. Marchand’s ‘Information Orientation’ work investigates ‘soft factors’ such as how the interaction between people, information and technology affects business performance.
Loc Vo’s presentation covered Saudi Aramco’s plans for data management in the I-field. Making sense of the huge amount of data collected during a field’s lifetime is a major facet of the I-field. Aramco has complete field histories and plenty of locations to pilot test. Challenges include the large amount of production data that is hard to optimize and a limited infrastructure for staging and transmitting RT data. Critical technologies are still under development and there is a lack of experienced people. Data access is an issue across disparate IT systems. When data is needed for decision making, it may not be there! The reality is that while it is not an impossibility, the I-field requires serious investment. But the potential pay-off is tremendous in terms of enhancing asset value.
Jake Booth showed an internal ExxonMobil promotional video for its upstream technical computing system ‘concept car.’ The idea is simple: to bring data to where you need it when you need it, without manual intervention. Access control is through role-based security with smart card single sign-on. The card allows the system to recognize a user’s role and provide access to entitled data. Data is found through a map-based interface (built on SharePoint Portal), allowing users to drill down, monitor wells and set triggers. The plan is to replace data silos with integrated, asset-oriented workflows. UTCS is also a key component of the digital oilfield and will support real time trending of well performance, automated analysis and alerts—expanding the use and value of RT data. Automation will detect problems and send alerts to engineers by wireless. Everyone sees the same information at the same time. A shared earth environment brings everything into the collaborative environment of XOM’s large visualization center. Steve Comstock, CIO and chief architect of the UTCS, claims it is ‘unique among competitors, all talk about it, none have achieved it.’ The video concluded with a disclaimer along the lines of ‘your mileage may vary’ as Comstock drove off into the sunset in, not a concept car, but a 1947 Chrysler ‘Woody’!
IQPC’s conference leaves a lot of time for panel discussions and round tables and these really took off with some great exchanges of views. Santos reported that it used to have little or no DM and that it was necessary to ‘educate’ management with IM and data metrics, a risky strategy with the potential of embarrassing management regarding the state of corporate data. But now ‘they understand’; the drafting department performs GIS DM and the librarians manage taxonomies etc.
Santos uses Datalogix/Innerlogix for QC and data metrics. The company pays attention to training and has separated technical and scientific workflows, ‘so that folks don’t just learn how to push buttons.’ Santos wants to avoid people drifting into exploration through a software package.
Shell also stresses training with its own institutions that teach stuff like the theory behind synthetic seismogram generation. There is a general feeling in the industry for tightening rules, for data policing. Although Shell has deployed software for G&G field sheets, folks didn’t use them; ‘they can’t be bothered to enter a hundred or so parameters’. But from the management position there is a strong drive to ‘get these field sheets filled in’. This will be achieved by domain specialists pre-populating some fields and performing data QC.
While the National Oil Companies (NOC) may possess very rich data assets, their needs differ from the majors. Aramco has 30-40 years of production data and PVT analyses and implements strict data access controls which can be a problem for expatriates as they may not realize what is there. Kuwait Oil likewise provides a fine degree of control on what data is seen by an individual. People may only see a small piece of the whole data asset.
ExxonMobil is constantly seeking ways of getting meaningful coupling between business initiatives and computing. Approved projects are carried through to ‘gate 2’ and companies are ‘told to commit monies and people.’
Danish DONG has experienced ‘initiative overload’ in the last few years and is cutting out a few of them. Statoil is trying to structure data management with its ECIM initiative and is to sponsor a data management Masters degree program in Norway.
At SMi’s E&P Data and Information Management conference, BG Group’s Karen Moore and Schlumberger’s Steve Miller described an e-field development on BG’s Tunisian Miskar field. This presented specific challenges with 400 operated wells, gas allocations, scheduling and blending. Although very high data volumes were coming in from the wells, little use was being made of them. The field proved a good test bed for a standards-based production data management system.
BG’s production engineers previously relied mainly on Microsoft Excel as a production management tool. Excel spreadsheets were emailed onshore, copied, cut and pasted into other documents for reporting. ‘Data gymnastics’ was the norm. One engineer had 27 spreadsheets to perform daily reporting. SCADA data was not used or even exposed to decision makers. Well performance and nodal analysis tools were used but these suffered from a big ‘data disconnect’.
Moore and Miller developed a business case that demonstrated direct benefits from improved data management to the field, the asset and BG Corporate. The system now takes data flows from a new Emerson Delta V DCS/SCADA network to OSIsoft’s PI data historian and Schlumberger’s Avocet repository for allocation and sales. Data then flows on to Schlumberger’s Decide for analysis and reporting.
Wells have pre-defined performance curves stored in the historian which are compared with daily production. Spreadsheets have given way to graphical reporting from PI. The performance curves link to Avocet DM. The Tunisia implementation is a work in progress and will go live in the next few weeks. Moore admits that BG is a late adopter of some of these technologies and has been slow to embrace the digital gas field. Tunisia is a pilot for global rollout.
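The curve-versus-actual check described above can be sketched in a few lines. This is a hypothetical illustration only, not BG’s or Schlumberger’s implementation: the well names, rates and the 90% tolerance are all invented.

```python
# Hypothetical sketch: compare each well's daily production with the
# expected rate from its stored performance curve, and flag wells
# producing below a tolerance threshold.

expected = {"well-01": 12.5, "well-02": 8.0, "well-03": 15.0}  # expected daily rate
actual   = {"well-01": 12.1, "well-02": 5.5, "well-03": 15.2}  # measured daily rate

TOLERANCE = 0.9  # flag wells producing below 90% of expectation

underperformers = [
    well for well, target in expected.items()
    if actual.get(well, 0.0) < TOLERANCE * target
]
print(underperformers)
```

In a production historian the ‘expected’ side would be evaluated from the stored curve at current operating conditions rather than read from a static table, but the exception-reporting logic is the same.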
Shell’s enthusiasm for corporate GIS has if anything increased over the two years since Thierry Gregorius first presented Shell’s GIS data ‘Swiss army knife.’ Shell uses GIS as a lightweight integration mechanism for corporate data so that it can be used across silo boundaries to visualize facilities, the subsurface, proximity to infrastructure, legal geo-constraints—just about anything.
Standards can bridge the gap between heavy applications, portals and lightweight GIS integration. Shell is quite a way along the path of standardizing its IT infrastructure—especially with its heavily locked-down desktop. This was initially ‘very painful.’ Users could not even set their own wallpaper! But the effort is now paying off and software can be deployed easily across Shell’s 11,000 PCs worldwide.
Standards-based metadata underpins Shell’s data sharing. Tools crawl and index documents which can now be ‘spatialized’ with tools like Metacarta. Gregorius was skeptical of XML as a data panacea, seeing a lot of marketing spin in the way XML is used to put ‘lipstick on the data pig.’ More generally, IT offers only partial support for the wide-reaching infrastructure Shell is trying to achieve. Someone only has to update a piece of software in one place to cause things to ‘fall over.’ Standards are undoubtedly the answer, but in GIS particularly, their uptake is slow—and almost non-existent in oil and gas.
Gregorius believes that the industry is in an interoperability crisis. One problem is that GIS data standards are often run by academic groups who may lack a sense of urgency. But there is hope. New York had been talking about an enterprise GIS for years before 9/11. After the attack, the system was in place within a week.
This article has been taken from reports produced as part of The Data Room’s Technology Watch Reporting Service. For more information on this service please email firstname.lastname@example.org.
OpenSpirit Corp. is to team with Calgary-based E&P technology specialists GeoSeis Inc. to help Canadian clients with their deployment of the OpenSpirit application and data management solutions.
The Minerals Management Service of the US Department of the Interior has just published its ‘Green Book’ guide to ‘Leasing oil and natural gas resources on the outer continental shelf.’ The 63 page guide is a free download from www.mms.gov/ld/PDFs/GreenBook-LeasingDocument.pdf.
Calgary data specialist Intervera has appointed Ian King as Director of Client Services. King was previously manager of Schlumberger’s Calgary Data Centre and is on the Board of Directors for the Public Petroleum Data Model Association.
The American Petroleum Institute (API) has named Bob Greco to the post of Group Director, Industry Operations and Upstream. Greco will be responsible for managing oil and natural gas issues pertaining to exploration, production, marine and related industry operations. Greco is currently API Director for Policy Analysis. Before joining API, Greco was an environmental engineer with the U.S. Environmental Protection Agency.
Halliburton unit Landmark has been awarded a two-year ‘Global Purchase Order’ by Petrobras for support, training and consulting on software and IT services.
UK-based consultants Paras is expanding, hiring John Packer, Katerina Gunningham and Olivier van Belle to its team of consultants. Marketing executive Jodie Gillespie has left the company and is replaced by Tracey Dancy. Gillespie has joined Australian oil executive search specialists Gerard Daniels.
Microsoft’s Worldwide Oil and Gas unit has appointed Craig Hodges as director, oil and gas sales. Hodges was previously with Dell Computer.
Richard Weindel is the manager of Seismic Micro Technology’s (SMT) newly opened office in Singapore.
Charlie Har is to head-up PGS’ new data processing center in Kuala Lumpur, Malaysia.
Paradigm has appointed Elijio Serrano as CFO. Serrano hails from EGL Inc. Prior to his stint with EGL, Serrano spent 17 years with Schlumberger.
Quorum Business Solutions has appointed Ken Sauers as Senior Project Manager with its Midstream Products Practice.
HP has signed a memorandum of understanding with the Alberta government for $8 million worth of IT R&D. The University of Calgary is to investigate advanced data center operations. The University of Alberta (UA) will work on 3D videoconferencing. Pierre Boulanger, UA iCORE industrial research chair is to join with HP’s Palo Alto R&D effort.
Hampson-Russell is looking for a Senior Geophysicist in Jakarta. Invensys is looking for a new CFO following the departure of Adrian Hennah. Paras is seeking consultants to support and develop its global client base. OpenSpirit is looking for experienced software engineers for its development effort.
AspenTech’s Q2 2006* saw the company’s highest operating profit and margin since Fiscal 1998. Total revenue of $76.4 million was up 7% from the same period last year with license revenue of $41.7 million, up 13%. Mark Fusco, President and CEO, said, ‘We have returned the Company to top-line growth in all parts of our business, including our best performance to date with our aspenONE solutions suite.’
Computer Modelling Group reported revenues of $4.3 million for its Q3 2006 quarter that closed 31st December 2005, up from $4 million. Earnings were $0.9 million, down from $1 million. Software license revenues were $3.5 million, up from $3.2 million. CEO Ken Dedeluk reported product demand as increasing, with $1 million growth in software license revenues over a nine month period.
In its guidance for fiscal 2006, Calgary-based Divestco is projecting revenue for fiscal 2006 of $75-80 million, up from $37 million in 2005. The revenue hike is due to acquisitions and ‘record organic growth.’
TGS CEO Hank Hamilton, reporting on record net revenues of $89.3 million (up 53%), spoke of ‘a robust and broad-based market response to TGS projects in all geographic locations.’
Invensys reported revenue of 628 million (up 2%) and operating profits of 48 million (up 4%). CEO Ulf Henriksson, said, ‘Process Systems produced an excellent result including a 14 million profit increase and a 17% improvement in orders at CER.’
PGS CEO Svein Rennemo reported on ‘a strong upward trend in world-wide E&P activity and spending,’ with ‘unparalleled strength both in Marine Geophysical contract and multi-client demand continuing in 2006.’ PGS revenues were $340.3 million, up 15%, with an overall net loss of $80.4 million, including a one-time charge of $103.8 million from the refinancing completed in December.
CGG also reported a revenue hike—to €263 million (up 34%). CEO Robert Brunck said: ‘The fourth quarter was strong as anticipated, driven by continuing improved market conditions. In view of these figures, we can confirm our target for full year 2005 operating results.’
*All results concern the last quarter of 2005 calendar year unless otherwise stated.
Petroleum Geo-Services (PGS) is to expand its relationship with The Information Store (iStore), developer of the PetroTrek data management suite. PGS has been using PetroTrek to manage its multi-client seismic data library since 1997 (OITJ Vol. 3 N° 6).
PGS’ Marcia Starr said, ‘Without the right technology, it would be very difficult to accurately manage our contract/data order history. Using these tools, we can better track our customers’ orders, manage their contracts, and provide fast and efficient service.’ The present agreement concerns the worldwide deployment of the PetroTrek Direct Sales module. PetroTrek offers a web-accessible map interface to view and place orders, visibility of a client’s purchase history, access to contract details and automated generation of paperwork associated with an order.
iStore president and CEO Barry Irani said, ‘The maturity of the seismic industry has forced service companies to become more efficient. With PetroTrek, PGS can manage both seismic and customer data much more easily via our secure, Web-based solutions. The results are considerable cost-savings and better service to customers.’
Labrador Technologies Inc. (LTI) has added wireless data retrieval to its eTriever data access suite (OITJ Vol. 11 N° 1). The new module supports wireless data retrieval for hand-held devices, such as BlackBerries, Treos, and smart phones. eTriever now offers mobile staff and field personnel real time access to their data.
Labrador CTO Tim Breitkreutz said, ‘Having all of your oil and gas data available at your fingertips is highly convenient. The ability to have an oil and gas well-ticket right in the palm of your hand, borders on the revolutionary.’
LTI also announced $200,000 of investment financing, for the purchase of 2,000,000 shares, through LTI’s non-brokered, private placement. LTI will issue between 2 and 4 million Labrador units at $0.10 a pop—a total of up to $400,000.
Oslo-based software house IPRes has just won a contract extension for the delivery of well planning software to Statoil. IPRes’ IPRiskWell will now be deployed in all Statoil’s field units.
IPRiskWell was developed and tested in cooperation with Statoil’s North Sea Gullfaks asset, where it serves as an integration platform for well planning. The software rolls up risks and uncertainties from all involved disciplines and systematizes the decision-making process. Projects can be evaluated consistently, and financial metrics, drilling costs, reserves and production can be presented consistently to decision makers and compared across alternative projects and scenarios.
IPRes MD Arvid Elvsborg said, ‘This agreement shows that our software is easy to use and can support a complex, information intensive decision making process involving many parameters. By continuing to partner with IPRes, Statoil shows that specialized, smaller companies can be niche market leaders.’
Schlumberger is to deploy satellite network management software from UK based Parallel. Parallel’s SatManage will be used to manage both client and internal satellite networks and provide a front-end for network performance transparency. SatManage will consolidate data and integrate with the existing infrastructure to provide a single interface for analysis and reporting. Clients will be given access to the system via Schlumberger’s myNetwork Portal.
SatManage gives end users transparent access to historical network data and lets them view their own ‘trouble tickets’ and get an exact location of their mobile assets. Schlumberger is also to optimize overall network performance using SatManage Signal Analyzer.
Two recent Invensys contract wins show how the process control industry is evolving in gas storage and oilfield automation. OMV Austria E&P has awarded Invensys a contract for the revamp of the SCADA control system on its Schönkirchen-Reyersdorf 1.8 billion m3 gas storage facility. OMV has upgraded its legacy SCADA system to handle modern device control technologies including field device tool (FDT) and device type manager (DTM). OMV is leveraging these systems as part of its move to an open-systems based solution using commercial off-the-shelf (COTS) technology. The SCADA system will communicate with OMV’s 121 remote gas wells, solar turbines and a remote dispatch center, located 20 km from the site.
In a separate announcement, Invensys revealed technology that SonaHess, a joint venture between Sonatrach and Amerada Hess, is to deploy on the Gassi El Agreb (GEA) Algerian redevelopment. Invensys was selected by EPC Bechtel to supply Triconex triple modular redundant (TMR) technology for safety instrumentation in GEA. TMR controls emergency shutdown in four gas compression and re-injection facilities using three isolated, parallel control systems. The system uses two-out-of-three voting to provide high-integrity, error-free, uninterrupted process operation.
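The two-out-of-three vote at the heart of a TMR system is simple enough to sketch. The following is an illustration of the principle only, not Triconex code; in a real safety system the vote runs in redundant hardware on every scan cycle.

```python
def vote_2oo3(a: bool, b: bool, c: bool) -> bool:
    """Two-out-of-three majority vote over redundant channels.

    The voted output follows the majority, so a single failed
    channel can neither cause a spurious trip nor mask a real one.
    """
    return (a and b) or (a and c) or (b and c)

# One channel stuck high (c) is out-voted by the two healthy channels:
print(vote_2oo3(False, False, True))

# A genuine trip condition seen by two channels passes the vote
# even if the third channel has failed low:
print(vote_2oo3(True, True, False))
```

The same majority logic generalizes to other M-out-of-N architectures; 2oo3 is the common choice because it tolerates one fault in either direction while keeping the process running.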
Speaking at the SPE Workshop on Business Process Management (BPM) last December, Kurt McCaslin presented Kerr-McGee’s (KMG) OpenText Livelink platform, the basis for a variety of internally-developed applications. McCaslin compared an ‘out of the box’ Livelink forum with KMG’s finalized E&P Discussion Forum to stress the importance of the user interface. This now includes a variety of domain-specific functionality to enhance the user experience.
BPM is being driven in part by the requirement for Sarbanes-Oxley compliance. The technology has eliminated documents previously lost in routing. Similar systems are used to track IT change management and compliance. User training time is down by a whopping 80% thanks to efficient communication and self paced learning.
HR - AFE
Another application helps in hiring and retaining new staff through high quality ‘orientation’ of new employees. A web site gives candidates access to information on the hiring process. Finally, a new AFE workflow has turned a complex, 11-page document into a streamlined, one-page web form. Attachments can carry additional detail including spreadsheets, presentations, maps, etc. Since all the information is full-text indexed, it is all searchable. McCaslin concludes that the BPM process is a balance between information collection and usability. The trick is to focus on those attributes absolutely necessary to enhance searching and data integration, to build a great user interface and to get a good handle on the underlying data.
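Full-text indexing of the kind that makes AFE attachments searchable boils down to an inverted index mapping each word to the documents containing it. The sketch below is a toy illustration with invented AFE numbers and text, not KMG’s or OpenText’s implementation.

```python
# Toy inverted index: map each word to the set of documents
# containing it, so any term can be looked up in one step.

from collections import defaultdict

docs = {
    "AFE-1001": "drilling cost estimate for gulf prospect",
    "AFE-1002": "workover approval with cost spreadsheet",
}

index = defaultdict(set)
for doc_id, text in docs.items():
    for word in text.split():
        index[word].add(doc_id)

# Both hypothetical AFEs mention 'cost', so both come back:
print(sorted(index["cost"]))
```

A production search engine adds tokenization, stemming and relevance ranking on top, but the word-to-document mapping is the core data structure.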
Nils Sandsmark, General Manager of Norway’s POSC Caesar Association (PCA), gave an update on the plant data standards body’s activity at the Daratech Plant Conference in Houston last month. The conference theme was ‘Better Decisions Through Data Standardization’. Sandsmark’s presentation showed how today’s ‘self-sustainable fields’ will give way to successive future generations of integrated work processes.
Generation One Future Fields will emerge in the 2005-2010 time frame and include integrated onshore and offshore centers and processes and continuous onshore support. Generation Two—on the 2010-2015 time horizon—will see integrated operation centers for both operators and vendors, ‘heavily automated’ processes, digital services and 24/7 operations.
The ISO 15926 plant data standard will be at the heart of these developments as it integrates lifecycle data for process plants including oil and gas production facilities. Common ontologies, XML schemas such as WITSML and PRODML will also provide cross-silo data communications.
Sandsmark outlined key PCA initiatives including the Integrated Information Platform for reservoir and subsea production systems (IIP—OITJ Vol. 11 N° 1) and the Intelligent Data Sheets and Collaborative Work Process—a $3 million initiative to share information across ISO, API, or NORSOK data sheets.
PCA has published its 2006 work program on its web site. The reference data library is to be augmented with an improved ‘reference data system’. PCA is to take part in the EU-funded DEPUIS e-learning project, developing course material for ISO 15926. Collaboration with the Petrotechnical Open Standards Consortium (POSC) and the US plant data FIATECH organization are to continue. Work on the Integrated Information Platform and Intelligent Data Sheet projects is to continue.
Korea National Oil Corporation (KNOC) has selected MRO Software’s Maximo asset management system to optimize asset management, work management, material and purchasing at the Rong Doi gas field, located approximately 320 km off the coast of Vietnam. The agreement was completed through MRO Software’s Vietnamese reseller, Avenue IT Solutions.
MRO Software’s flagship Maximo Enterprise Suite allows customers to manage the complete life cycle of strategic assets including: planning, procurement, deployment, tracking, maintenance and retirement. Maximo is delivered on a ‘web-architected platform.’
In a separate announcement, MRO Software has teamed with management consultants BearingPoint to ‘identify and implement strategies that increase return on investment from enterprise assets.’ The alliance will leverage BearingPoint’s ProvenCourse deployment tools and the Maximo Enterprise Suite (MXES).
BearingPoint MD Jill Goff said, ‘This alliance answers clients’ questions on how to better manage their companies’ assets, including IT systems and facilities. Downtime, for instance to figure out how to process work orders, costs companies money. BearingPoint and MRO Software can cooperate to offer clients software solutions and services to enable them to improve their businesses’ efficiency while reducing costs.’
Tulsa-based Samson Investment Co. is to use Quorum Business Solutions’ Quorum Land package for its land management activity. Samson is a privately held E&P company with operations in the United States, Canada, and Venezuela. Quorum Land offers a lifecycle approach to land and lease management from field data capture through disposal and caters for the needs of landmen, lease administrators and production and finance personnel.
Quorum Land’s web-enabled design distributes land related information across the organization and allows non-technical users to access real-time data and create ad-hoc reports. The software captures, maintains, and distributes land information, including leases, easements, fee property, and contracts. Quorum Land is part of the Quorum’s Upstream Software Suite that includes modules for mapping, accounting, finance, volume management and gas marketing. Other Quorum tools address midstream and pipeline operations.
Calgary-based Digital Oilfield (DO) has released new technology for rigsite approval of daily charges. DO’s Daily Charge Recorder (DCR) is a touch-screen application that provides operating companies and suppliers with a simple daily charge entry and approval mechanism. DCR creates an electronic record of daily charges at the wellsite, so that goods and services are all reviewed at the rig on a regular basis. The daily charge items can then be rolled up to form the approved daily charges for efficient invoice generation.
DO president and CEO Rod Munro explained, ‘DCR solves a key problem that operators experience at the wellsite by automating an onerous manual process, reducing errors and the time spent resolving disputes at the end of the well. DCR is in action on over 700 rigs throughout Western Canada and adoption is expanding rapidly.’
Netherlands-based plant data specialist InfoWeb has just announced a new ISO 15926 Knowledge Base that it is currently developing. The website 15926.org is a Knowledge Base ‘dedicated to the practical implementation of, and information about ISO 15926.’ This site is a utility for the development of guides, procedures, and software for information handover and data exchange on capital facilities projects. Capital facilities include industrial facilities, commercial and institutional facilities, infrastructure facilities and residential facilities.
This site contains ontology listings in RDF/XML format and in N-Triples format. Ontologies underpin information exchange and integration. The site breaks new ground as an industrial deployment of the somewhat controversial semantic web RDF technologies. These re-introduce richness in data modeling that some have considered missing in XML, but at the expense of some complexity. The promise of the semantic web is machine to machine communication and data discovery.
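The RDF model underlying those N-Triples listings reduces everything to subject-predicate-object statements, which is what enables the machine-to-machine communication mentioned above. The sketch below illustrates the triple model in plain Python; the identifiers are invented for illustration and are not actual ISO 15926 URIs.

```python
# Minimal illustration of the RDF triple model: a graph is just a
# set of (subject, predicate, object) statements, and any pattern
# can be matched against it. Identifiers here are hypothetical.

triples = [
    ("ex:Pump-101", "rdf:type", "iso15926:CentrifugalPump"),
    ("ex:Pump-101", "iso15926:hasDesignPressure", "ex:Pressure-16bar"),
    ("ex:Pump-101", "iso15926:installedIn", "ex:Facility-A"),
]

def objects(subject, predicate, graph):
    """Return all objects matching a (subject, predicate, ?) pattern."""
    return [o for s, p, o in graph if s == subject and p == predicate]

# What kind of thing is Pump-101? The graph answers without any
# schema baked into the query:
print(objects("ex:Pump-101", "rdf:type", triples))
```

Real deployments would use an RDF library and SPARQL queries rather than list comprehensions, but the uniform statement structure is what lets independently developed systems merge and query each other’s data.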
Chicago-based Knightsbridge, a professional services specialist in business intelligence and data warehousing, has just ‘formalized’ its Energy Practice. Knightsbridge, which has offices in Houston and London, has been working for oil and gas and utility clients such as BP, Devon Energy Corporation, and Williams since 2004. Services span E&P, midstream operations, energy trading, and refining and marketing.
Energy Practice Area Leader John Ruddy said, ‘Mergers and acquisitions, regulatory compliance, energy price volatility pressures, and market demands require that energy companies manage their operations using accurate and reliable data. This needs to be delivered where and when it’s needed to help management make actionable decisions.’
Knightsbridge’s oil and gas offering supports data mining of digital oilfield data. The company advocates deployment of standard data models and processes. The company recently joined the Calgary-based Public Petroleum Data Model Association (PPDM). Standards models support ‘improved information consistency, accuracy, and reporting.’ Other facets of Knightsbridge’s offering include well and field performance management solutions, consolidated portfolio management solutions for energy trading, regulatory compliance reporting and cash flow forecasting.
Knightsbridge has embarked on an active recruiting effort to support the rapid expansion of the firm’s Energy Practice and other areas. Knightsbridge currently employs more than 600 people.
The University of Colorado’s Technology Transfer Office has signed a spin-out and intellectual property agreement with the management of what was previously the BP Center of Visualization which has been re-branded as ‘TerraSpark Geosciences.’
TerraSpark is now capitalized as an independent research Center at the University of Colorado as part of both the College of Engineering and Applied Sciences and the College of Arts and Sciences. Sponsored by the Departments of Aerospace Engineering Sciences, Computer Science and Geological Sciences, TerraSpark will conduct R&D on advanced visualization technology across a range of disciplines including the oil and gas industry.
TerraSpark manages two industry-funded research consortia in drilling visualization and geoscience interpretation. The first program uses 3-D visualization to reduce drilling costs and to mitigate associated risks. The drilling program integrates geological and geophysical data with engineering algorithms through 3-D visualization. The geoscience interpretation consortium focuses on using 3-D visualization to improve the efficiency, accuracy and completeness of interpretation of depositional systems in 3-D seismic data. This research integrates attributes designed to improve imaging of depositional systems with new algorithms to extract and analyze the systems in 3-D.
R&D for hire
TerraSpark undertakes contract R&D and consulting for corporate clients and licenses prototype software to third parties. TerraSpark fault propagation technology is remarketed by Paradigm. TerraSpark also partners with IT companies to provide integrated solutions to E&P companies.
Startup Cobalt International Energy has awarded Halliburton unit Landmark Graphics a three-year contract for the provision of a complete, outsourced hardware and software solution for its ‘virtual workplace’ for geotechnical applications.
James Painter, VP Exploration, said, ‘As a result of our decision to outsource the support functions considered non-core to our business, we needed a company capable of providing a one-stop shop for all our software, hardware and IT service needs. Within a week of engaging Landmark, our staff was fully functional, interpreting data and finding prospects.’
The solution includes Application and Data Services Provision built on Landmark’s PetroBank technology. The hardware platform is supplied by HP, through a strategic alliance with Landmark.
Halliburton senior VP Peter Bernard added, ‘Cobalt’s decision to outsource its IT infrastructure gave us the opportunity to deploy our virtual technical workplace. This ensures knowledge workers have on-demand access to business critical data and applications. The scalable solution will alleviate the need for Cobalt to manage and maintain a technical infrastructure and enable them to focus on their primary goal: oil and gas exploration.’ Cobalt’s exploration effort was kicked off last December with a $500 million equity injection from Carlyle/Riverstone and Goldman Sachs.