According to Tarek Shahawy, Technology Manager with Oracle's Middle East division, companies face the problem of complex legacy silo applications that are hard to adapt and maintain and which often duplicate functionality. This is compounded by diverse asset teams and 'IT spaghetti' complexity, to the extent that IT can't answer simple questions.
Oracle is therefore proposing a new approach to business process management (BPM), leveraging an enterprise service bus (ESB). Connections to E&P applications leverage the OASIS-backed business process execution language (BPEL), a layer on top of the W3C's web services description language (WSDL), to make calls from the BPM system.
All this is built into an E&P Asset Data Hub offering connectivity to SCADA systems, operations, maintenance, SAP financials, custom developments, HSE and E&P applications. The idea is to have a single source of the truth and to leverage BPEL to automate business processes. Typically, these would include process control alarms, engineering data, and possibly geology, drilling and workover activity.
Oracle's BPEL Process Designer is used to build dashboards of real time KPIs, analytics, forecasts, alerts etc. Data flows in from sensors to the Oracle Sensor Edge Server and through to applications. Assets can 'call home' for help. For instance, if a sensor's temperature is over 200° for more than 5 minutes, an automated work order might be created. Operator intervention can be verified and spare parts reordered automatically. Oracle 11i, Retek and Oracle Fusion middleware also ran.
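The sensor-to-work-order rule described above can be sketched in a few lines of plain Python. This is purely an illustration of the threshold logic, not Oracle Sensor Edge Server code; the function names, data layout and threshold are assumptions.

```python
from datetime import datetime, timedelta

# Illustrative alarm rule: temperature over 200 degrees for more
# than 5 minutes triggers a work order. Names and structure are
# invented for the sketch, not Oracle APIs.
THRESHOLD_DEG = 200
DURATION = timedelta(minutes=5)

def needs_work_order(readings):
    """readings: list of (timestamp, temperature), sorted by time.
    Returns True if the threshold was exceeded continuously for
    at least DURATION."""
    breach_start = None
    for ts, temp in readings:
        if temp > THRESHOLD_DEG:
            if breach_start is None:
                breach_start = ts
            elif ts - breach_start >= DURATION:
                return True
        else:
            breach_start = None
    return False

t0 = datetime(2006, 6, 1, 12, 0)
hot = [(t0 + timedelta(minutes=i), 210) for i in range(7)]
print(needs_work_order(hot))  # True
```

In a real deployment this rule would live in the BPEL process or edge server configuration rather than application code, with the work order raised by a call into the maintenance system.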
In the Q&A following Shahawy's talk, one skeptic questioned whether this idealistic vision could overcome the many barriers to process automation. Shahawy acknowledged that the technology is only part of the equation. 'You need the database and connectors to synch with applications. But we also provide validation and filtering tools which embed Oracle Consulting's best practices.'
Another question concerned the potential challenge that Oracle's E&P Asset Data Hub represents to the major E&P application vendors. Shahawy pointed out that the latest manifestations of Landmark and Schlumberger's applications tend to offer web services interfaces. 'This makes our life much easier.' Elsewhere, clients should ask vendors to implement web services to ease integration with Oracle back office tools.
Oil IT Journal made several requests to Oracle for more on the Hub, with, so far, no replies. Is the E&P Asset Data Hub real? Or another Project Synergy, Oracle's earlier attempt to solve the E&P interoperability problem? (Oil ITJ Vol. 4 N°2).
Petris Technology has acquired the software and support assets of Maurer Technology, a Noble Corp. unit. Petris is to assume world-wide responsibility for sales, service and future development of the Maurer drilling and completion software.
Maurer offers 18 individual applications covering wellbore trajectories, cementing and casing string design, torque and drag analysis, wellbore hydraulics, stability and control and various coiled tubing and completion tools. Maurer's software holds data in Galaxy, a shared Microsoft Access database.
Petris has been offering Maurer's software via its applications-on-demand service, PetrisWINDS Now, and also through the Society of Petroleum Engineers' E&P Software Toolbox.
Petris CEO Jim Pritchett said, 'With the current outlook for the drilling and completion market, and our good relationship with Maurer, we see a great future for this venture. Our integration technology platform, PetrisWINDS Enterprise, will help take the Maurer toolkit to the next level of interoperability, performance and ease-of-use.' Petris is to continue to use the Maurer name.
First, a big thanks to our renewing website sponsors:
Georex Assistance Technique,
TGS NOPEC/A2D and
Also a big welcome to our two new sponsors for the 2006/2007 year:
Ikon Science and
OilIT.com currently receives about 1,600 daily visits.
Phil Crouse, organizer of the Petroleum Network Education Conferences (PNEC) Data Integration conference, which we report from in this month's Oil IT Journal, pulled off a considerable coup this year by persuading Wal-Mart CTO Nancy Stewart to present a paper on the retail behemoth's data management systems. To my mind, Stewart's talk was the highlight of 10 years of PNECs. Instead of editorializing, this month I offer you a summary of her presentation, which I believe may mark a turning point in thinking on the under-resourced and often overlooked activity that is oil and gas data management.
25 million customers
Stewart's presentation showed how great attention to data and information management underpins Wal-Mart's business. Wal-Mart maintains strong data models of what's happening in its stores on a daily basis. No mean feat when you consider that in a single day there are up to 25 million customers and 12 million credit card transactions. Equally strong communications technology (that bypasses telcos and the banks) allows the company to process credit card transactions from any location in the world in under one second!
Manic about data
Wal-Mart is manic about data: 'We keep everything! Data is the great enabler.' Data is kept online for 2 years and then it is paged out to a second tier of storage. Wal-Mart reviews 300 million items a day to see if they need to be restocked and keeps track of on-hands for 700 million items.
Outsource? No thanks!
'We do not outsource or offshore because of our tightly integrated centralized model.' Wal-Mart's centralized Information Systems Division (ISD), located in Bentonville, Arkansas, does a lot of its own development, tuning its technology to support its business.
The company has a heterogeneous IT environment with 11 single system image IBM mainframes, each with 2,084 processors. There is one replenishment system for 700 million items, one HR system for a million employees and a system supporting worldwide trucking.
Each of its two redundant centers ('plexes') has 40k mainframe MIPS compute bandwidth. Overall, network availability KPIs are 99.997%. Last December Wal-Mart achieved six nines for its global network, 'and that includes Microsoft!' In fact, Wal-Mart gets better performance in China by managing its data in Bentonville!
Performance is crucial for Wal-Mart's business. The company loses $1,000 per hour per cash register that is down. 'A four hour outage means we start to throw away perishables.' Once, a six hour outage was deemed so critical that, 'We filled the sky with planes to get it fixed.'
Bentonville is located in a tornado zone, so everything is built hardened and there is a second plex for 'tornado mode' running. This was used most recently only a couple of months ago. To make sure everything works, Wal-Mart fails over between the two plexes once a month.
This year, Wal-Mart is upgrading its NCR/Teradata data warehouse with nearly 900 terabytes of storage. The aim is for a central, single version of the truth. Users can ask 'any question, any time.' Wal-Mart develops its own systems because the sheer size of its transaction volumes and data mart stresses commercial tools to breaking. The company is always asking 'will we break the architecture?', especially of its single image store.
4 billion row table!
A 4 billion row table is used daily for sourcing and running the business. Wal-Mart exploits collaborative planning, forecasting and replenishment (CPFR) to anticipate orders. Merchants are always finding better ways of interrogating the data, and the decision support systems now use 60% of overall CPU time. Data mining recognizes that each store is unique and is tuned to its clients' needs. Rather than simply putting stuff that is likely to sell into stores, Wal-Mart uses weather mapping to re-route merchandise to where it is needed.
Stewart slaughtered several sacred cows of the IT business during her 20 minute presentation. Remember 'buy not build'? Forget it. Distributed databases? Forget them too. Outsourcing? You got it. If you want to leverage IT to understand and improve your business, build your own systems that do exactly what you want them to do. Sure, Wal-Mart uses components from many vendors in its solution. Open systems are prominent, with Apache web servers and Linux deployed. Along with Teradata and IBM, SAS, Microsoft and HP got a mention. But Wal-Mart's data volumes, query requirements and need for precision are hard for a standard product to fulfil. All this need not cost a fortune. Wal-Mart's IT spend is significantly under half a percent of turnover and the ISD gets the job done with only 70 full time employees.
Oil and Gas?
Wal-Mart's enthusiasm for data contrasts with oil and gas which, as attendees at PNEC know, has neglected data management. In oil and gas, you can ask any question any time, but you maybe won't get an answer right away! Oil companies are likely to offer their knowledge workers several different versions of the truth to choose from! The Wal-Mart case history provides much food for thought for oil and gas majors and argues in favor of a radical overhaul of IT/IM strategy. With oil at $70, this ought to be a better investment than share buy-backs.
What do the new regulations entail?
Dreiblatt: The US Environmental Protection Agency's Ultra Low Sulfur Diesel (ULSD) program mandates a 15ppm sulfur content as of June 1st 2006. Limits on particulate matter will be introduced next year.
How does the EPA do this?
Dreiblatt: The EPA adopts a custodial approach. Fuel custodians must indicate the sulfur content of their fuel. Enforcement is through spot checks carried out by the EPA.
Do they have enough resources for this?
Dreiblatt: No, there are very few EPA checkers compared with the custodians. But that's the point of the IT-based compliance program. Each fuel transaction is accompanied by a Product Transfer Document (PTD) with a designated recipient and trace of provenance. A PTD is issued every time fuel changes hands. The information goes to the EPA and is stored for five years. While the PTD could be a paper document, the number of transactions and the reporting requirement mean that it has to be digital.
Whats on the PTD?
Dreiblatt: The PTD has information on transactions, volumes and dates. Specific codes identify all custody holders and the originating and destination facilities involved in the transfer. The truck volume report is submitted electronically to the EPA's database. And here the volumes must tally. If I send you 1,000 barrels, you must report reception of the same. A discrepancy equates to a violation and is detected within minutes.
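The volume reconciliation Dreiblatt describes can be illustrated with a minimal sketch. The field names and the zero tolerance are assumptions for the example; the article does not describe the EPA's actual schema or matching rules.

```python
# Hypothetical sketch of PTD volume reconciliation: the sender's and
# recipient's reported volumes must tally; any mismatch or missing
# receipt is flagged as a potential violation.
def check_transfers(sent, received, tolerance=0.0):
    """sent/received: dicts mapping PTD id -> reported volume (bbl).
    Returns a list of (ptd_id, reason) discrepancies."""
    violations = []
    for ptd_id, vol_sent in sent.items():
        vol_recv = received.get(ptd_id)
        if vol_recv is None:
            violations.append((ptd_id, "no matching receipt"))
        elif abs(vol_sent - vol_recv) > tolerance:
            violations.append(
                (ptd_id, f"sent {vol_sent}, received {vol_recv}"))
    return violations

sent = {"PTD-001": 1000.0, "PTD-002": 500.0}
received = {"PTD-001": 1000.0, "PTD-002": 480.0}
print(check_transfers(sent, received))  # flags PTD-002
```

In practice such a check would run as transactions arrive, which is how a discrepancy can be 'detected within minutes.'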
What happens if there is a violation?
Dreiblatt: Everybody upstream of the violation is considered in violation, another reason why robust systems are required to prevent mistakes. In a sense, the EPA considers you guilty until proven innocent.
What are the IT controls used here?
Dreiblatt: Companies can validate the PTD by checking against other reports (such as pricing). Companies can build in system-based controls that recognize and flag violating situations. But note that today, there is no third party software to do this, although there are some products under development. Such tools exist for reformulated gasoline (RFG) reporting. But the reality is that today ULSD is far too reliant on manual checks. Folks are walking on eggshells now that the regulations are live.
What is BearingPoints offering in this space?
Dreiblatt: Before the go-live of the new regulations we found that readiness was uneven. We spent a lot of time checking reporting and tracking systems and enhancing the IT. We also work with software vendors to install and configure their systems.
What packages are used?
Dreiblatt: We do a lot of business process mapping. But this is not off-the-shelf stuff, the more so because the new regulations allow for sulfur credit trading.
Like carbon trading?
Dreiblatt: It's the same idea, although this is not done through an exchange. Companies who produce fuel that beats the new spec can sell sulfur credits to third parties. The idea is to encourage companies to make a serious effort to be cleaner. All this of course puts a greater burden of reporting to the EPA: certain language needs to be added to contracts and other reports are involved.
The Society of Petroleum Engineers (SPE) and the UN Economic Commission for Europe (UNECE) have signed a Memorandum of Understanding to develop one globally applicable harmonized standard for reporting fossil energy reserves and resources. The standard will ensure consistency and transparency in financial reporting.
The SPE, World Petroleum Council (WPC) and the American Association of Petroleum Geologists (AAPG) have developed definitions for reserves and resources. In 2004, the UN Economic and Social Council passed resolution 2004/233, inviting UN member states to ensure worldwide application of the UN Framework Classification for Fossil Energy and Mineral Resources (UNFC).
The UNECE has now created an Ad Hoc Group of Experts on the Harmonization of Fossil Energy and Mineral Resources Terminology in which the SPE plays a key role. The Group of Experts provides a forum for stakeholders to assist in defining the needs to be met by the classification, its definitions, specifications and guidelines, and a vehicle for recommending their application. Under the MOU, SPE will facilitate the development of the texts of a globally harmonized common standard.
Meanwhile, speaking at the 2006 Ryder Scott reserves conference last month, John Ritter, chairman of the SPE Oil and Gas Reserves Committee, said the SPE is about to publish what will be a key document in reserves definition, a handbook of practical examples using the SPE 2007 definitions. The summer issue of the Ryder Scott newsletter contains a lengthy report from the conference, including an alphabet soup of stakeholder organizations such as CERA, the IASB, the US FASB, the minerals industry's Combined Reserves International Reporting Standards Committee (CRIRSCO) and the UN Framework Classification mentioned above. More reserves soup from ryderscott.com.
Blue Marble's GeoCalc 6.2 release supports the latest version of the OGP's EPSG Geodetic Parameter Dataset, v6.10. GeoCalc leverages XML data formats and the OpenGIS Consortium's well-known text (WKT) definitions.
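For readers unfamiliar with WKT, here is what such a definition looks like and a toy parser that pulls out the object name and EPSG authority code. This is a sketch only: real tools like GeoCalc handle the full WKT grammar, and the regexes here cover just this simple case.

```python
import re

# A well-known text (WKT) definition for the WGS 84 geographic
# coordinate system, carrying its EPSG authority code (4326).
WKT = ('GEOGCS["WGS 84",DATUM["WGS_1984",'
       'SPHEROID["WGS 84",6378137,298.257223563]],'
       'AUTHORITY["EPSG","4326"]]')

def wkt_summary(wkt):
    """Extract the top-level object type/name and trailing
    authority code from a simple WKT string."""
    name = re.match(r'(\w+)\["([^"]+)"', wkt)
    auth = re.search(r'AUTHORITY\["(\w+)","(\d+)"\]\]$', wkt)
    return {
        "type": name.group(1),
        "name": name.group(2),
        "authority": f"{auth.group(1)}:{auth.group(2)}" if auth else None,
    }

print(wkt_summary(WKT))
# {'type': 'GEOGCS', 'name': 'WGS 84', 'authority': 'EPSG:4326'}
```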
Encom has just announced a new version of Discover, its MapInfo based geoscience GIS environment. Discover 8.0 adds 2D and 3D data visualization and analysis grid creation enhancements, data selection tools, statistical reports and other productivity tools.
Enersight's WellSpring 2.0 adds risk and sensitivity analysis to its network economics. Multiple risked development scenarios can be compared in terms of their impact on expected values and reserves. Assets can have multiple decline parameter sets defined. Drilling programs can be designed to take account of rig availability, drilling time and facility capacity. Support for coal bed methane allows for varying gas composition and well deliverability as the reservoir pressure declines.
Invensys Process Control has announced 'Move to the Mesh,' a bundled I/A Series system upgrade incentive. Legacy I/A Series users can upgrade to a bundle of field-mounted I/A Series Control Processors, Windows XP-based flat-panel LCD workstations, software licenses and redundant I/A Series ATS modules. The I/A Series system also allows users to deploy Invensys' new InFusion enterprise control system (ECS). InFusion ECS adds enterprise information and integration technologies from Microsoft and SAP to process control infrastructures. A 16-page brochure describing InFusion ECS is available from infusionecs.com.
M2M Data Corp. announced that it is providing maintenance scheduling service for over 4,000 assets only 10 weeks after release of its iPM Maintenance Scheduler Service product, claimed to be the fastest ever adoption of a new service from the company.
Petrospec Technologies has released Equipoise 2006, its pore pressure and wellbore mechanical forecast and analysis while drilling package. The new release includes modules for real-time analysis of wellbore stability and formation properties as well as integrated Internet security for real-time updates at remote network servers. Recent Equipoise licensees include Pioneer and Energy Partners Ltd.
Roxar has announced a port of its flagship 3D reservoir modeling package, IRAP RMS to the Windows 64-bit platform. IRAP RMS is now available on Linux 64-bit, UNIX 64-bit and Windows 64-bit platforms. Benefits include increased data volumes, faster computation and lower operating costs. IRAP RMS now also runs in native mode on all supported hardware platforms and operating systems, without the need for emulation.
Wellogix has just announced a new Complex Services Management Suite (CSM Suite), a solution for planning, procurement and payment of complex oil field services. The Wellogix CSM Suite underwent rigorous architectural design review at Microsoft's Technology Center in Austin. The new release takes advantage of Microsoft's 'information worker' strategy and 2007 Microsoft Office SharePoint Portal Server and will embed Microsoft SQL Server 2005, Microsoft .NET Framework 2.0, SharePoint and Microsoft Office Business Scorecard Manager 2005. Wellogix was granted US patent N° 7,043,486 for its complex business project workflow technology.
Safe Software's 'FME in Oil and Gas' conference was held in Calgary last month. Safe's feature manipulation engine (FME) is a spatial extract, transform and load (ETL) tool that includes a GIS data integration environment. FME supports over 160 different raster and vector formats and databases. Mark Stoakes, of Safe's Professional Services unit, enumerated some of the many navigation data formats encountered in the Canadian oil industry, from legacy formats like GenaMap through SDE, SDO and raster formats to the emerging Google Earth KML. Surveys of Safe users have shown that while ESRI formats account for around 25% of overall usage, there are over 30 other formats in use.
Oil and gas
Stoakes described a few typical oil and gas uses of FME. EnCana's MapWiz, an application originally developed by Safe for PetroCanada, merges local GIS data from a central warehouse with IHS data. Talisman Energy's land lease mapping application uses Safe's ArcSDEQuerier to merge lease polygons with attribute data, creating line geometry for data resident in its PPDM well and seismic database. Devon Energy updates its SEGP 2D/3D, Shape 2D/3D and ArcSDE in one process, a two day task reduced to 20 minutes. Kerr-McGee integrates ArcIMS and data in FME's SpatialDirect format in real time, performing on-the-fly transformation to other formats including Tobin TDRBM II. At Nexen, FME runs every 30 minutes to check the PPDM master database for changes and rerun transformations.
EnCana's MapWiz application has been productized as SpatialDirect, part of FME's Spatial ETL Server family of applications. This comprises web based components for creating sites where users can download spatial data in any format or coordinate system. SpatialDirect can integrate web mapping solutions such as ArcIMS, MapGuide, GeoMedia WebMap and MapInfo's MapXtreme.
In the summing up, it was noted that more formats are always coming. Today's novelties include Google Earth KML, Google SketchUp, FalconView and Oracle Raster. Organizations are building web services and service oriented architectures leveraging WFS, GeoRSS, REST and SOAP/XML. According to Safe, FME is the data integration/aggregation environment for consuming spatial web services. More from www.safe.com and the FME Wiki on www.fmepedia.com.
Shell E&P has rolled out a bespoke ER Mapper-based solution to make its twelve terabyte archive of satellite imagery available to users throughout the world. The Image Web Server-based solution was built by ER Mapper's Enterprise Services team. Imagery can be imported into desktop applications such as ER Mapper, Microsoft Word or ESRI's ArcGIS. Remote sensing, GIS and CAD specialists can use the imagery in their desktop applications, and non-specialists can embed imagery from Shell's central repository into a document or presentation.
Shell remote sensing consultant Richard Eyers said, 'Specialist and non-specialist users can now quickly find and access imagery from the archive, providing us with an increased return on our investment in remotely sensed data.'
Two G&G software boutiques have signed with Schlumberger this month to build software add-ons for Schlumberger's flagship interpretation application, Petrel. UK-based Ikon Science is to develop a data link between its RokDoc package and Petrel. Australian Petrosys will do likewise for its petroleum mapping system. Ikon, with sponsorship from BG Group, is to develop a version of its 2D, rock physics-based seismic modeler, RokDoc, which will be embedded in Petrel. SIS' Microsoft .NET-based Ocean framework will be used to build a link between the two packages.
Petrosys is also to leverage the Ocean framework to integrate its mapping solution with Petrel. Petrel users will be able to display their interpretations, along with information from other data sources, through the Petrosys map interface.
Storewiz has been trialing its data compression appliance with two unnamed oil and gas accounts to determine the feasibility of lossless seismic data compression. Tests ran on a NetApp FAS980C/980 network attached storage (NAS) array over a Gigabit network. The results, according to Storewiz, demonstrated storage savings of 60-70% and a corresponding reduction in NAS CPU use.
Processing from the compressed datasets was not penalized by the compression. In fact, bandwidth actually improved slightly, benefiting from server parallelism. Best results were obtained with larger file sizes. But even for small files with low compression, process duration was little changed. The Storewiz appliance remained transparent to both client applications and storage systems. According to Storewiz, the technology should allow companies to save on storage resources or streamline processing by pulling seismic data from tape to disk.
In a Decisioneering webcast last month, BP's Hugh Williamson showed how BP's drillers use Crystal Ball to optimize time and cost estimates, using Monte Carlo analysis (MCA) as a risk management tool. BP is currently drilling 1,000 wells per year at a cost of $5bn (30% of capex). BP's answer to risk management is a well costing template and workflow, the Single Well Estimator (SWE), a.k.a. 'Monte Carlo for the masses,' a Microsoft Excel/Crystal Ball workbook developed by Williamson. Williamson insists on the importance of using a standard template rather than letting users develop their own. Even though engineers like to do their own thing, it is best to rein them in!
Evaluation starts by finding suitable analog wells for comparison. These are wells in similar circumstances, not necessarily close to the target well. BP tries to avoid automated estimation. The key is conversation around possibilities. Automation stifles conversation.
Cost/risk estimation starts by prioritizing top sensitivities in a risk register table. All significant risks are included in the probabilistic cost estimate. Monte Carlo is not random but focuses the mind on what is important in the analysis. The P10/P90 estimates of well costs represent the range of likely cost. BP always asks, 'are these predictions sound or aspirational?' Deliverables from the SWE include a drilling and completion uncertainty statement with a time and cost summary along with MC plots of estimates. Supplying MC to the masses implies continuous development, training and support. BP's technical management now expects to see, understand and criticize such estimates. Asked if BP applies MC to its business as a whole, Williamson stated, 'We are starting to do this but things get complex with options, and there are non-technical problems: decision makers may not want MC results or they may not find them useful. Opinions differ, but I think that all-inclusive MC is the best way of understanding the big picture. Specialists like me are trying to push this forward.'
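The flavor of a Monte Carlo well-cost estimate with a risk-register item can be sketched in standard-library Python. The cost items, distributions, probabilities and the percentile convention below are invented for illustration; they are not BP's actual SWE template, which is an Excel/Crystal Ball workbook.

```python
import random

# Toy Monte Carlo well-cost model: two triangular cost items plus
# one discrete risk-register event, summarized as percentiles.
# All numbers ($MM) are illustrative assumptions.
random.seed(42)

def simulate_well_cost(n=10_000):
    totals = []
    for _ in range(n):
        drilling = random.triangular(8.0, 15.0, 10.0)    # low, high, mode
        completion = random.triangular(3.0, 7.0, 4.0)
        # risk-register item: 20% chance of a $2MM trouble event
        trouble = 2.0 if random.random() < 0.2 else 0.0
        totals.append(drilling + completion + trouble)
    totals.sort()
    # percentiles of the sorted outcomes (P10 = low-side here)
    return {"P10": totals[int(0.10 * n)],
            "P50": totals[int(0.50 * n)],
            "P90": totals[int(0.90 * n)]}

est = simulate_well_cost()
print(est)
```

Note that the oil industry often uses P10 for the high-side (10% chance of exceedance); the sketch above uses plain percentiles of the cost distribution, so read the convention carefully when comparing.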
Paul Haines reported on Kerr-McGee's program to realize the value of its unstructured data. For Haines, 'There is no magic in data management, it's just work and it can be hard to quantify its ROI.' Kerr-McGee's data management vision is of a 'single doorway to data.' Priority is given to unstructured data. A standards-based high level corporate taxonomy has been implemented and roles and responsibilities assigned to business users and data gatekeepers. Data acquisition goes through the gatekeeper before archival. The corporate taxonomy is held in OpenText's LiveLink. Moving files to the EDMS has benefited from the standard taxonomy, search, version control and document management. This has positive spin-off regarding Sarbanes-Oxley and records and information management (RIM) compliance. A Kerr-McGee developed application, WellArchiver, manages well files and metadata capture. Search and retrieval leverage WellExplorer (Geologix) and NitroView (Exprodat). The ILX Viewer (InnerLogix) is also used to spider the Kerr-McGee repository for well log files and also provides access to CoreLabs' off-site data store.
Wes Baird (Data Matters) outlined Nexen's ongoing data management project. The data and process landscape shows up many point-to-point connections. Proprietary data is used in many areas and there is a lack of consistent business rules. The plan is to move from ad-hoc processes to scheduled processes with shared data, leveraging the Carnegie Mellon capability maturity model (CMM). This starts with interviews to capture business rules. Early results show a 'data hell' of in-house PPDM, OpenWorks and IHS data stores, a 'reference hell' of inconsistent naming and a 'security hell' of access constraints (tables, rows, roles). Baird described similar hells for interface, process and maintenance. Baird's solution involves 'data chains,' simple tables showing data, schema, server, process, roles and responsibilities and what links to what. Nexen has now implemented a PPDM data model, has monthly metadata capture and is in the process of capturing business rules and linking everything together into a repository ready for questions.
Philip Lesslar, Shell
Managers and users lack feedback on the severity of data quality problems, so Shell is going for a single rolled-up data quality KPI per organizational unit, addressing the problem of declining 'energy level' as data goes through its cycle. Various dispersed data quality efforts were grouped into Shell's EPiQ project, which resulted in the development of Shell's IQM tool. IQM offers query management, procedures and global metric sets, developed with local businesses to ensure take-up. EPiQ shows color coded quality metrics along with trend indicators. The project is aligned with Shell's global standard taxonomies. Change management remains an issue.
Steve Cooper (IHS) outlined the conclusions of a recent survey of 50 IHS clients, which found that data volumes are doubling every 6 to 12 months. Customers are creating master data stores using PPDM 3.7, which is emerging as the standard data model for the industry worldwide. Data movement is being revolutionized by web services, 'a game changer,' and data exchange standards developed around XML. These can be simple wrappers to existing data servers that let customers go in and grab just what they need. Cooper gave the example of the Accenture xIEP applet developed around SAP's NetWeaver. Business process mapping, workflow engines and the IM framework are coming together.
Clay Harter (OpenSpirit) asked, could Google Earth (GE) be used in the oil and gas business? GE Enterprise rolls in shape files and raster images through Fusion, which blends the GE database and data on in-house servers. Google's KML is an XML format for GIS data: points, polylines etc. This can be from static or dynamic sources. A zip version (KMZ) embeds raster imagery. OpenSpirit (OS) has leveraged its integration framework to tie into GE by dynamically creating KML/KMZ from OS sources that can be consumed by GE. Harter showed a movie demo of GE in action and a new OpenSpirit Web (OSW) product. OSW browses OS data sources in lists as HTML. A button allows for KML creation and visualization in a GE client. OS does the transform to WGS84 (the coordinate system assumed by GE). GE fills the need for a lightweight, easy to use 3D browser. GE can be used as an OS data selector.
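Generating KML from a list of locations, as the OpenSpirit-to-GE bridge does dynamically, can be sketched in a few lines. The well names and coordinates are invented; a real export would first transform source coordinates to WGS84, as the article notes.

```python
# Minimal KML generation: one Placemark per well. Illustrative only;
# OpenSpirit's actual KML/KMZ export is not shown in the article.
def wells_to_kml(wells):
    """wells: list of (name, lon, lat) tuples, WGS84 decimal degrees.
    Returns a KML document string with one Placemark per well."""
    placemarks = "\n".join(
        f'    <Placemark><name>{name}</name>'
        f'<Point><coordinates>{lon},{lat},0</coordinates></Point>'
        f'</Placemark>'
        for name, lon, lat in wells)
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<kml xmlns="http://www.opengis.net/kml/2.2">\n'
            '  <Document>\n'
            f'{placemarks}\n'
            '  </Document>\n'
            '</kml>')

kml = wells_to_kml([("Well A-1", -95.37, 29.76),
                    ("Well B-2", -96.80, 32.78)])
print(kml)
```

Saved with a .kml extension (or zipped as .kmz with imagery), the output opens directly in a Google Earth client.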
AJAX in Oil and Gas
David Terribile described Kerr-McGee's use of Web 2.0 technologies, a.k.a. AJAX, to tie its diverse data sources together. Kerr-McGee has built a master data store, a cheap, simple structure to expose master well headers with pointers to raw data locations. One quick win application lets a user enter an API number and retrieve the corresponding DIMS UWI. Kerr-McGee leverages Oracle Dynamic Collections (ODC) and AJAX components for serious data management. An example of an AJAX component is the DbNetGrid which, with six lines of code, makes a highly interactive user interface. The grid component is deployed as a front end to Kerr-McGee's geopolitical database and used to filter queries with drop down pick lists, pre-populated from standard values. Kerr-McGee's WellArchiver and WellExplorer apply a similar philosophy, with widgets for printing and export to HTML, Word or Excel. AJAX has been a key enabler to add GUI functionality and to fire up other apps in context. Asked where Kerr-McGee was on the buy not build scale, Terribile answered that this project was more of a configuration exercise. 'In fact there is less configuration here than for a normal GeoFrame installation. We're not AJAX/CSS experts. The displays come out with a professional look and feel because of the components.'
Dan Shearer reported how Burlington Resources (BR), now part of ConocoPhillips, challenged its IM specialists to raise geoscience productivity by 10%. Multiple corporate acquisitions had caused a software application explosion. A Global Tech Review (performed with Landmark) added some software to fill gaps and turned off maintenance on specialty software. Savings were put into a kitty for subsequent lease of specialty software in ASP mode if needed. OpenIT's application usage monitoring technology showed Burlington that although folks said, 'we use that package all the time,' it was actually last used nine months ago! Maintenance was reduced by 36% in 2003 over 2002, with a reduced data management effort. Application status was classified in terms of currency of use. Geoscientists can now shop from Burlington's own list of 250 approved applications. Burlington has evolved from a 'cost conscious cult' to a 'disciplined value cult.' This has targeted shortening project lifecycles with a 3D earth representation and by preserving analyses. A study can then pop up if a subsequent oil price rise makes it economic. Burlington sets aside $14,000 per year per geoscientist for training, has joined Nautilus and hosted a creative solutions conference. A data SWAT team has been formed, composed of 50% geoscientists and 50% software engineers.
This article has been taken from a 10-page report produced as part of The Data Room's Technology Watch reporting service. More from email@example.com.
At the Spring meetings of the Public Petroleum Data Model (PPDM) and the Pipeline Open Data Standards (PODS) associations, held in Houston, metadata held the spotlight. PPDM CEO Trudy Curtis showed how the Dublin Core metadata standard supports records and information management (RIM) functionality such as retention and content life cycle, as well as considerations such as geographic coverage, format and language. PPDM is building support for ontologies, taxonomies and metadata into the PPDM data model, mapping the Flare Catalogue into a W3C-compliant ontology.
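For readers unfamiliar with Dublin Core, the standard defines 15 metadata elements, several of which (coverage, format, language) are mentioned above. A minimal sketch of tagging a record and checking its element names follows; the record content itself is invented.

```python
# The 15 elements of the Dublin Core metadata element set.
DUBLIN_CORE = {
    "title", "creator", "subject", "description", "publisher",
    "contributor", "date", "type", "format", "identifier",
    "source", "language", "relation", "coverage", "rights",
}

def invalid_elements(record):
    """Return any keys in a metadata record that are not
    Dublin Core elements."""
    return [k for k in record if k not in DUBLIN_CORE]

# A hypothetical well-report record tagged with Dublin Core elements.
record = {
    "title": "Well completion report, 15/9-F-12",
    "format": "application/pdf",
    "language": "en",
    "coverage": "North Sea, block 15/9",   # geographic coverage
    "date": "2006-05-01",
}
print(invalid_elements(record))  # [] -- all element names are valid
```

Retention and life-cycle rules would then key off elements such as date and type rather than being Dublin Core elements themselves.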
PPDM is planning to publish XML schemas for a range of activities from acquisition and divestment, through business associations, seismic data processing and well information. The schemas will be developed through architectural collaboration with other organizations including GML and POSC. The issue of standard data content continues to exercise the community. Current thinking is that if relevant value sets exist, then PPDM should point members to the list rather than take on the burden of maintenance.
Wanda Jackson (WHL Information Solutions) explained how the Taxonomy and Metadata workgroup is setting out to create PPDM modules to manage semantic information and taxonomies. The workgroup will also implement mappings between taxonomies and metadata in PPDM and generate mechanisms for sharing and exchanging such information. A terminological gear change saw Hakan Sarbanoglu (Kalido) speak about similar issues in terms of the Enterprise Data Warehouse and Master Data Management. Sarbanoglu, who previously delivered a federation of Kalido Data Warehouses across Shell Oil Products' 21 European units, believes that technology is mature enough to make the master data management concept succeed where previous corporate database initiatives have failed.
Greg Smith (New Century Software) addressed the issue of data exchange between PODS and other pipeline industry stakeholders such as the inline inspection (ILI) community. PODS RP-0102 offers a standard data structure for the exchange of inspection data between vendors and pipeline operators. The project will embrace anomaly classification through Hunter McDonnell's Anomaly Library for Inspection Assurance Standards (see pipelinealias.com). PODS is also working with the National Association of Corrosion Engineers (NACE) on a new External Corrosion Direct Assessment (ECDA) data exchange standard (RP 0502). Alan Herbison (Kinder Morgan) reported on the PODS Spatial Implementation Working Group's extension that can be implemented in various commercial mapping packages.
Harry Schultz (OilWare) explained how units of measure (UOM) are handled by various data initiatives including PPDM. The oil industry is a minefield of different unit systems and usages. The current PPDM UOM initiative will provide a unique UOM identifier in the database and will support original UOM data along with approved conversion constants and an indication of conversion accuracy. Schultz regretted, en passant, the passing of the powerful API RP66 format, which allowed conversions to be derived by parsing the data. This is not the case for PPDM, although UOM conversions can be put into stored procedures. Schultz noted that storing data in a default UOM with a pointer to the original units is useful 'but very confusing!'
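The scheme Schultz describes can be sketched in a few lines, assuming a simple registry table. The unit identifiers, conversion constants and accuracy flags below are illustrative only, not PPDM's actual content:

```python
# Sketch of a UOM registry along the lines Schultz describes: every value
# keeps its original unit, and conversion to the default unit uses an
# approved constant plus an accuracy indicator. All entries illustrative.
UOM_REGISTRY = {
    # unit_id: (factor, offset, accuracy) for conversion to the default unit
    "m":    (1.0, 0.0, "exact"),
    "ft":   (0.3048, 0.0, "exact"),              # international foot (exact)
    "degF": (5.0 / 9.0, 255.37, "approximate"),  # to kelvin, rounded offset
}

def to_default(value, unit_id):
    """Convert a stored value to the default UOM, reporting accuracy."""
    factor, offset, accuracy = UOM_REGISTRY[unit_id]
    return value * factor + offset, accuracy

# Keeping both the original (value, unit) and the converted default value
# sidesteps the 'default UOM with a pointer to original units' confusion.
depth_m, acc = to_default(1000.0, "ft")
print(round(depth_m, 1), acc)  # 304.8 exact
```

Stored procedures, as Schultz notes, can play the role of `to_default` inside the database itself.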
Last month, PPDM announced the Alpha release (V2) of its next major model, PPDM 3.8. The new release embeds the results of the data management, metadata, taxonomy and well operations workgroups.
Kerr-McGee has promoted Frank Patterson to VP exploration and technology.
Norwegian Ability Group (AGR) has acquired The Peak Group (TPG) for £26 million cash. Tom Conlon has joined TPG as operations manager for Dubai.
Mike Larsen returns to Paras after a stint with Schlumberger Business Consulting.
AspenTech has received a Wells Notice from the SEC warning of possible civil action regarding its 2004-2005 accounts. In March, AspenTech settled a class action with a $5.6 million payment to some shareholders.
Visean has appointed Lloyd Taylor as non-executive director. Taylor was formerly CEO of Fletcher Challenge Energy.
CGG/Baker Hughes joint venture VS Fusion has acquired microseismic monitoring specialist Magnitude.
Saudi Aramco has awarded Halliburton the oilfield services component of the 300 well Khurais mega project.
Norske Shell has awarded Norwegian Octaga a contract for real-time virtual reality visualization of its Ormen Lange facility.
Rock Solid Images is to team with OHM on the integration of seismic and well log data with OHM's CSEM technology.
Don Roesle is CEO of Ryder Scott, John Hamlin is now managing senior VP and Joe Magoto has joined the board of directors. Jennifer Fitzgerald has left ExxonMobil to take a petroleum engineering position with the company.
Sun Microsystems and Stanford University have created the Stanford Computational Earth and Environmental Science (CEES) center for simulation and prediction of complex earth processes.
Valero LP has selected Energy Solutions' PipelineStudio for its offline simulator.
Veritas has sold its land seismic acquisition business to Matco Capital Ltd.
Canadian Oil Sands Trust is to deploy WellPoint's Oil Marketing System to manage its Syncrude participation.
Helix Energy Solutions has announced an IPO for its wholly-owned Cal Dive unit.
Cygnet has appointed Henry Hickey to its Application Engineering group. Hickey was previously with Oil & Gas Equipment.
Technologies is offering a 6% finder's fee for fulfillment of a non-brokered private placement of up to CDN$500,000.
Statoil is to become carbon neutral by buying quotas for CO2 emissions to offset its business travel and office heating.
Casey Johnson is North America Account Manager with Knowledge Systems. Johnson was previously with Mercury.
Tom Readinger has retired as Associate Director of the US Offshore Minerals Management Program. Readinger chaired the OMM's Information Management Committee and promoted e-government.
Datalog CTO Dave Hawker pointed out a couple of errors in our April article on Datalog's Anax 500 logger. Anax runs on OpenBSD, not QNX as we wrongly stated. And although MySQL is open source, Hawker points out that Datalog's commercial use is under license.
IHS Energy, now rebranded as a segment of parent IHS, has been busy this month in the IT space. IHS has acquired GeoPlus Corp., developer of the Petra interpretation package, joined Texas A&M's Crisman Institute for Petroleum Research and, in collaboration with Veritas, has launched SuperGrid, a NAD83-compliant dataset of Canadian survey data.
Petra is a PC-based geological, engineering and petrophysical analysis tool. A companion product, PetraSeis, offers seismic interpretation coupled with the Petra database. IHS is to offer these tools as front ends to its online data sets, providing users with near real-time project data refresh.
IHS president and COO Ron Mobed said, For Petra users, our ultimate aim is to refresh the project models for their full inventory of drilling prospects, automatically. This will enable faster and lower-risk decision making by our customers as they compare, evaluate and select new assets to drill, based on the latest information. The Petra software and customer service teams will remain in Tulsa and an IHS data integration team will work on the flow of IHS data into Petra. IHS will also maintain Petra's support of data formats for a variety of data and software suppliers. Petra is capable of serving data to the OpenSpirit data exchange bus.
IHS Canada has teamed with Veritas to offer SuperGrid, a NAD83-compliant E&P survey grid for oil and gas companies, mapping professionals and industry-related software vendors. SuperGrid is available through IHS applications, from the IHS data hub or from third party vendors. SuperGrid currently covers the Western Canada Sedimentary Basin and results from an integration of government source points and IHS databases. The package includes the Western Canadian Dominion Land Survey grids and federal government theoretical grids (NTS, FPS). An Oracle database of more than 1.1 billion grid points allows for custom-built queries, layers and applications at specified point resolutions based on unique customer needs.
Following ongoing donations of its software, and gifts of free access to some of its data, IHS has become a member of the Crisman Institute for Petroleum Research at Texas A&M University. The Institute comprises the Halliburton Center for Unconventional Resources, the Chevron Center for Well Construction and Production, the Schlumberger Center for Reservoir Description and Dynamics and the Center for Energy, Environmental and Transportation Innovation. Other Institute members include ConocoPhillips, Anadarko, Baker Hughes, Total, Newfield, Devon, and Saudi Aramco.
Schlumberger Information Solutions (SIS) has acquired the rights to Quorum Business Solutions' Volume Management (QVM) package. QVM provides field operations data management and is used on several heavy oil projects in North America and one brownfield installation in Asia.
SIS president Olivier Le Peuch said, QVM is compatible with our next-generation technology framework and this partnership will accelerate our strategic initiative to deliver a complete production offering that integrates with back office operations.
Quorum CEO Paul Weidman added, Quorum is focusing its efforts on delivering back-office solutions. We will retain our production and revenue accounting tools. Under the terms of the agreement, SIS acquires the rights to QVM as well as key development personnel. SIS will develop its production volumes management solution around QVM and will partner with Quorum to integrate with its accounting workflows.
Permedia Research Group has teamed with Altair Engineering to offer a grid technology based version of Permedias MPath petroleum migration and reservoir fill simulator. MPath, which is used in basin modeling and reservoir characterization, uses a high-resolution reservoir fluid pressure, continuity and mixing solver.
MPaths Distributed Risk Module (DRM) generates and ranks hundreds of realizations to constrain cases where large uncertainties exist in the input parameters. To speed this compute intensive application, Altairs PBS Professional grid technology is now available to run multiple simulations concurrently on a distributed computing grid.
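The generate-and-rank loop at the heart of the DRM can be sketched in miniature. This is a toy stand-in, not MPath itself: the simulator is replaced by a hypothetical misfit function, the uncertain parameters are made up, and the grid is mimicked with a local thread pool:

```python
from concurrent.futures import ThreadPoolExecutor
import random

def simulate(params):
    """Stand-in for one realization run: returns a hypothetical misfit
    measuring how poorly the inputs honor the constraints (lower is better)."""
    perm, pressure = params
    return abs(perm - 250.0) + abs(pressure - 30.0)

def generate_realizations(n, seed=42):
    """Draw uncertain inputs: permeability (mD) and pressure (MPa)."""
    rng = random.Random(seed)
    return [(rng.uniform(50, 500), rng.uniform(10, 60)) for _ in range(n)]

def rank_realizations(n_realizations=200, workers=8):
    """Run all realizations concurrently, then rank by misfit."""
    cases = generate_realizations(n_realizations)
    with ThreadPoolExecutor(max_workers=workers) as pool:
        misfits = list(pool.map(simulate, cases))
    return sorted(zip(misfits, cases))  # best (lowest misfit) first

best_misfit, best_params = rank_realizations()[0]
print(f"best of 200 realizations has misfit {best_misfit:.1f}")
```

On a real grid, the `pool.map` call is what a scheduler such as PBS Professional replaces: each `simulate` becomes a batch job farmed out to a compute node.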
PBS Professional is a workload management solution for high-performance computing (HPC) environments. The software intelligently schedules computational workloads across available hardware and software resources through policy-based grid technology.
Permedia's Dan Carruthers said, Powerful, easily accessed compute resources are essential to exploration decisions that have significant monetary consequences. By integrating PBS Professional with MPath, our customers can run more simulations with their existing hardware and make better decisions.
Landmark (which has now dropped the Graphics to become a brand of the Halliburton Energy Services Group) has been awarded a two-year contract by Bermuda-based E&P company Energy XXI to provide total upstream information technology outsourcing, hosting and consulting services. The contract includes data loading, application hosting, data management, disaster recovery, geophysical, geological and production optimization software, field development consulting services and onsite IT services.
Landmark VP Doug Meikle said, This contract demonstrates the value of Landmark's IT outsourcing, data management and our hosted environment.
Energy XXI chairman John Schiller added, By outsourcing these services to Landmark, we can focus on our core business without having to support local infrastructure, data management and IT staff.
The American Petroleum Institute (API) has released an interactive computer-based training course on the Basic Principles of Petroleum as part of its continuing education program for oil and gas professionals. The eLearning course can be taken on-line or delivered on CD-ROM.
API business director Kathleen Combs said, The course was designed to address the learning needs of newly-hired personnel but will also benefit cross-training of staff in engineering and support roles. Course modules include petroleum geology, exploration, drilling operations, production, refining, and distribution. An individual license to the course can be ordered through the API University website at api-u.org for $145.
United Arab Emirates-based Netways has automated the requisition process for the Lamnalco Group of Companies. Lamnalco provides marine support services to oil and gas terminals and ports worldwide. The business process automation project leverages Microsoft Dynamics GP (formerly Great Plains), Captaris Workflow and Microsoft Visual Studio.
A state-of-the-art requisition model is seamlessly integrated with Microsoft Dynamics GP and now allows Lamnalco to initiate requisitions, approve them according to company rules, monitor inventory, automate RFQs and create purchase orders from within Dynamics GP.
Captaris business process workflow is applicable to any repetitive activity. Tasks are automatically assigned and delivered to individuals or groups. Tasks remain in the task lists until they are completed. Escalation rules allow unfinished tasks to be re-assigned to managers or overflow teams.
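A minimal version of such an escalation rule might look like this. This is a generic sketch: the task fields, names and re-assignment policy are assumptions for illustration, not Captaris's actual API:

```python
from datetime import datetime, timedelta

class Task:
    def __init__(self, name, assignee, due, manager):
        self.name, self.assignee = name, assignee
        self.due, self.manager = due, manager
        self.done = False  # stays on the task list until completed

def escalate_overdue(tasks, now):
    """Re-assign unfinished tasks that are past their due time."""
    reassigned = []
    for task in tasks:
        if not task.done and now > task.due:
            task.assignee = task.manager
            reassigned.append(task.name)
    return reassigned

now = datetime(2006, 6, 1, 12, 0)
tasks = [
    Task("approve RFQ", "buyer1", now - timedelta(hours=2), "purch_mgr"),
    Task("check stock", "stores", now + timedelta(hours=4), "ops_mgr"),
]
print(escalate_overdue(tasks, now))  # ['approve RFQ']
```

A production workflow engine runs a rule like this on a timer, with escalation chains (manager, then overflow team) instead of a single re-assignment step.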
Floating production storage and offloading systems (FPSO) specialist SBM Offshore has signed a multi-year license agreement to expand its use of AspenTech's engineering solutions. The new agreement provides access to HySys Upstream, which SBM uses to support design and optimization of its oil and gas production facilities. The HySys Upstream option adds industry standard methods for handling petroleum fluids to the base HySys engineering package.
SBM Chief Engineer Mike Wyllie said, AspenTech's simulation applications are tailored to our industry. Providing our engineers with flexible access to more tools will enable us to make better design and investment decisions, in addition to increasing our engineering efficiency on upstream projects. Other SBM Offshore units, GustoMSC, SBM-Imodco and Single Buoy Moorings, have already standardized on HySys. The new agreement expands usage and adds support for other applications including dynamic simulations and the design of heat transfer equipment.
AspenTech senior VP Blair Wheeler added, HYSYS is recognized as the leading platform for simulation and optimization in the oil and gas industry. SBM's decision reflects the value it can deliver to clients by optimizing the design and performance of production systems.
The Open Geospatial Consortium has released a Simple Features Profile for its Geography Markup Language (GML) specification. This specifies 2D geometry features such as Point, Curve and Surface, and is claimed to ease development of GML-based software for communities that share a common data model.
Interoperable Web Services, another OGC spec, were successfully demonstrated in the Global Earth Observing System of Systems (GEOSS). GEOSS showed complex 4D wind data and online meteorological processing services being published, discovered, and accessed over the internet.
A new standard for geo-enabling RSS feeds, GeoRSS, was shown at the Location Intelligence 2006 conference. GeoRSS embeds location into RSS feeds for display in a GeoRSS-enabled client. GeoRSS is a simple XML format for associating a point, line, boundary or bounding box with an RSS item. More from georss.org.
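A GeoRSS-Simple point is just one extra element alongside the usual RSS item fields. A minimal sketch using Python's standard library (the item title and coordinates are made up for illustration):

```python
import xml.etree.ElementTree as ET

# GeoRSS-Simple namespace; a point is encoded as "lat lon" in one element
GEORSS = "http://www.georss.org/georss"
ET.register_namespace("georss", GEORSS)

# a hypothetical RSS item carrying a GeoRSS point
item = ET.Element("item")
ET.SubElement(item, "title").text = "Wellhead A-1 status"
ET.SubElement(item, f"{{{GEORSS}}}point").text = "63.45 5.30"

print(ET.tostring(item, encoding="unicode"))
```

Lines, boundaries and bounding boxes work the same way, swapping `georss:point` for `georss:line`, `georss:polygon` or `georss:box` with more coordinate pairs.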
A new White Paper, authored by Vibhor Gupta (Wipro), outlines likely applications for RFID technology in oil and gas. Along with the regular identification number (ID), some RFID tags contain kilobytes of data on, for instance, temperature history. RFID data can be read by an operator with a hand-held device or transferred directly to a computer for automated inventory management.
RFID, also referred to as Automatic Identification and Data Capture (AIDC), embeds digital memory that can be programmed using radio signals. This can take place in tough production environments at high speed. Most RFID devices have a latency of 1/10th second.
Tags may be active (battery-powered) or shorter-range passive tags that receive their energy from the radio frequency field supplied by the reader. Tag readers talk to individual items thanks to sophisticated anti-collision algorithms that uniquely identify each tag.
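One widely used anti-collision approach is the binary query tree: the reader broadcasts an ID prefix, and if more than one tag replies, the collision triggers recursion into the 0- and 1-extended prefixes. A simplified sketch (real protocols add timing, CRC and tag-silencing details omitted here; tag IDs are assumed unique):

```python
def query_tree(tags, prefix=""):
    """Singulate a set of unique binary tag IDs. One matching reply
    identifies a tag; more than one is a collision, so the reader
    recurses on the 0- and 1-extended prefixes."""
    matching = [t for t in tags if t.startswith(prefix)]
    if not matching:
        return []            # silence: no tag in this branch
    if len(matching) == 1:
        return matching      # clean reply: tag identified
    return query_tree(tags, prefix + "0") + query_tree(tags, prefix + "1")

tags = ["0011", "0101", "0110", "1100"]
print(query_tree(tags))  # ['0011', '0101', '0110', '1100']
```

Each tag is read exactly once, whatever the collision pattern, which is what lets a reader inventory a pallet of tagged items in one pass.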
In-Depth Systems, a technology company established by Marathon Oil, offers RFID solutions for oil well applications, including accurate perforating gun triggering. Actuant unit Hydratight uses RFID tags to ensure correct assembly of pipe joints. Finally, there are multiple applications for RFID in e-business and cashless transactions. Read the full Wipro White Paper on www.oilit.com/papers/wipro.pdf.
Calgary-based Decision Dynamics has just announced Oncore, a real-time project cost management package for oil and gas. Oncore, formerly known as Time Industrial, tracks labor, equipment, materials and other costs for capital and operations/maintenance projects by line item and provides robust analytics for contractor performance monitoring. Business benefits include reduced post-project audit costs, improved owner/contractor relationships achieved through fewer invoice disputes, and increased accuracy of future estimates based on historical project data maintained in the Oncore database.
DD CTO Andrew Jarman said, Traditional financial accounting systems report project costs at a summary level, often 60 to 90 days after the work is done. This makes accurate root cause analysis, problem resolution and proactive prevention of serious cost and schedule deviations impossible. Oncore tracks line items in real time and represents a breakthrough in cost control and financial performance. In one six-week Oncore pilot project, an oil and gas company discovered $200,000 in duplicate charges for hours billed. ERP systems may check for duplicate invoices but lack the line-item granularity that flags this kind of problem. Oncore is available both as installed software and as an application hosted in Decision Dynamics' data center.
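Line-item granularity makes the duplicate-hours check itself straightforward. A sketch with made-up fields (not Oncore's actual data model):

```python
from collections import Counter

# hypothetical line items: (contractor, worker, date, hours, rate)
line_items = [
    ("ACME", "J. Smith", "2006-05-02", 10, 85.0),
    ("ACME", "J. Smith", "2006-05-02", 10, 85.0),  # billed twice
    ("ACME", "R. Jones", "2006-05-02", 8, 85.0),
]

def duplicate_charges(items):
    """Flag identical line items billed more than once, returning the
    overcharge (hours * rate for each extra occurrence)."""
    counts = Counter(items)
    return {item: (n - 1) * item[3] * item[4]
            for item, n in counts.items() if n > 1}

print(duplicate_charges(line_items))
# {('ACME', 'J. Smith', '2006-05-02', 10, 85.0): 850.0}
```

A summary-level invoice (one total per contractor per month) collapses the two identical rows into one number, which is exactly why ERP-level checks miss this class of error.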
Dutch standards body USPI-NL is in the process of implementing the ISO 15926 standard for integration, sharing, exchange and hand-over of plant data. Part 7, now implemented in OWL, the W3C's Web Ontology Language, defines implementation methods required to use ISO 15926 to its full extent.
Fluor Corp.'s Onno Paap, implementer of the RDF/OWL version of ISO 15926, told Oil IT Journal that the OWL/RDF mapping got a thumbs-up from none other than the father of the World Wide Web, Tim Berners-Lee. Berners-Lee has been pushing RDF as the key technology behind the Semantic Web (OITJ Vol. 9 N° 1) and considers the ISO 15926 initiative a benchmark test of the Semantic Web. InfoWeb transforms rules from legacy EXPRESS-based models into OWL axioms. Check out the results on www.infowebml.ws.
Interserve Industrial Services (IIS), which manages BP Exploration's Dimlington, UK gas terminal, has selected UK-based Tadpole Technology's i-Plan to manage an upcoming plant shutdown. i-Plan is a one-stop technology solution for the plant shutdown process. The package was jointly developed by BP, Interserve and Tadpole to reduce the cost and effort involved in plant shutdown.
IIS Director Kevin Heilbron said, Ready access to accurate information is the key to shutdown management. i-Plan ensures that current information is available wherever and whenever it is needed for planning, ongoing optimization of resource management and a streamlined shutdown.
Tadpole specializes in the application of geospatial technology for office-based and mobile users. The company has previously developed GIS applications, workflow management solutions and geospatial data synchronization technologies that are used by national mapping agencies, utilities and government. Tadpole Technology is an ESRI business partner.
Three awards have been made under the US Department of Energy's $1.3 million Low-Impact Natural Gas and Oil (LINGO) scheme, managed by the National Energy Technology Laboratory. The University of Arkansas and the Argonne National Laboratory will develop a web-based package to enable small E&P companies to generate development plans for work in sensitive ecosystems within the bounds of the Fayetteville Shale play.
A second award goes to Michigan Tech for a new strategy to satisfy a state environmental regulation that places large tracts of the Michigan Basin off-limits to E&P. The Interstate Oil and Gas Compact Commission (IOGCC) gets a third award to develop an Adverse-Impact Reduction Handbook to help E&P companies identify onshore barriers to E&P and provide viable approaches to minimize impacts. The handbook will serve as a best practices guide with case studies, field research and broad stakeholder input on overcoming opposition or delays to E&P activity. IOGCC is partnered by ALL Consulting, Devon Energy and various State agencies.
Schlumberger, through its Information Solutions (SIS) unit, has acquired exclusive rights to develop products and solutions for the upstream drilling and production market using Aspen's Asset Builder. Asset Builder is an integration framework that enables dynamic asset models to be built from components such as Aspen's own HySys, Schlumberger's Avocet and other tools including Microsoft Excel.
SIS president Olivier Le Peuch said, Our goal is to drive operational efficiency by eliminating the divide between upstream and downstream business processes. The oil and gas industry has responded favorably to our integration effort. This step will provide the upstream market with accelerated technology delivery and a unified market presence.
In a separate agreement, SIS acquired the rights to develop or embed another Aspen package, Operations Manager (OM), in its own upstream products. OM provides infrastructure capabilities including role-based process data visualization, an open simulation environment, performance scorecarding and event management.
Petrobras is to take part in a Securities and Exchange Commission (SEC) pilot program to use interactive data in its financial statement filings. The pilot program will enable the participants to determine the benefits of using interactive data, provide feedback to the SEC and enable investors and analysts to assess new techniques for analyzing interactive data reports submitted to the SEC in the eXtensible Business Reporting Language (XBRL) format.
SEC Chairman Christopher Cox said, As the number of companies submitting interactive data grows, it's clear that making information available to investors in a more useful way is cost effective. At a roundtable on interactive data held in Washington this month, Cox said that the XBRL initiative avoids the drudgery of digging into today's dense documents to fish out the data. Cox invited participants to imagine the possibilities if all company and mutual fund financial information were available to everyone for free in real time, directly from the source.
Interactive data will level the playing field between well-endowed financial firms that can afford to compile comparative data and the individual investor. Companies interested in joining the test group should visit the interactive data spotlight page at www.sec.gov/spotlight/xbrl.htm.
French geophysical contractor Compagnie Générale de Géophysique (CGG) is to deploy IBM BladeCenter JS21 clusters at its data processing centers in France, London, Kuala Lumpur and Houston. CGG expects its total compute bandwidth to grow to a theoretical peak of 113 teraflops. The supercomputers will be used to speed CGG's seismic imaging effort.
The PowerPC-based supercomputers leverage IBM's new BladeCenter JS21 systems. Initially, CGG has expanded its facilities by deploying some 2,800 JS21 nodes, comprising dual-core PowerPC 970MP 2.5 GHz processors with a 1MB L2 cache per core, in 200 BladeCenter H chassis. CGG's Geocluster seismic processing package has been optimized for the VMX vector accelerator in the JS21.
CGG VP Guillaume Cambois said, Technology is our number one ally and we renew our calculating capabilities every 24 months. BladeCenters will increase our calculating power and provide clients with more accurate, faster imaging.
Canadian Natural Resources Ltd. (CNRL) has contracted Emerson's Process Management unit for the automation of its $9.4 billion Horizon oil sands development in Alberta. The first phase of the Horizon project will allow CNRL to produce 110,000 barrels per day. Later phases of the project will raise total production to 232,000 barrels per day upon completion in 2012.
Bob Wing, project coordinator at CNRL, said, Risk mitigation was a primary concern of the Horizon project team. We chose the PlantWeb architecture based on Emerson's experience in oil sands projects and global engineering expertise. Emerson project members will join our Automation Integration Team to concentrate our ten process areas into a single control room.
EnCana Corp. also awarded Emerson a contract for a PlantWeb deployment on its Steeprock natural gas processing plant in BC. Steeprock will process 190 MMSCFD of raw gas. PlantWeb was selected in part because of its smart safety instrumented system technologies and for the predictive nature of the basic control system.