June 2006


Oracle’s E&P Asset Data Hub

Oracle’s Tarek Shahawy, speaking at IQPC’s Data and Knowledge Management conference, lifted the lid on a new ‘E&P Asset Data Hub,’ an Oracle Fusion-based infrastructure for the upstream.

According to Tarek Shahawy, Technology Manager with Oracle’s Middle East division, companies face the problem of complex legacy ‘silo’ applications that are hard to adapt and maintain and which often duplicate functionality. This is compounded with diverse asset teams and IT ‘spaghetti’ complexity—to the extent that IT ‘can’t answer simple questions’.

BPM

Oracle is therefore proposing a new approach to business process management (BPM), leveraging an ‘enterprise service bus’ (ESB). Connections to E&P applications leverage the OASIS-backed business process execution language (BPEL), a layer on top of the W3C’s web services description language (WSDL), to make calls from the BPM system.

Data Hub

All this is built into an ‘E&P Asset Data Hub’ offering connectivity to SCADA systems, operations, maintenance, SAP financials, custom developments, HSE and E&P applications. The idea is to have a ‘single source of the truth’ and to leverage BPEL to automate business processes. Typically, these would include process control alarms, engineering data, and possibly geology, drilling and workover activity.

Process Designer

Oracle’s BPEL Process Designer is used to build dashboards of real time KPIs, analytics, forecasts, alerts etc. Data flows in from sensors to the Oracle Sensor Edge Server and through to applications. Assets can ‘call home’ for help. For instance, if a sensor’s temperature is over 200° for more than 5 minutes, an automated work order might be created. Operator intervention can be verified and spare parts reordered automatically. Oracle 11i, Retek and Oracle Fusion middleware also ran.
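
To make the ‘call home’ pattern concrete, here is a minimal sketch in Python (rather than BPEL) of the kind of rule Shahawy described. The sensor feed and the work order call are hypothetical stand-ins, not Oracle APIs.

```python
# Minimal sketch of the 'call home' rule: if a sensor reads over 200 degrees
# for more than five minutes, raise a work order. The feed and the
# create_work_order() call stand in for the Oracle Sensor Edge Server and a
# BPEL process endpoint.
import time

THRESHOLD_DEG = 200
DURATION_SEC = 5 * 60

def create_work_order(sensor_id, started_at):
    # A real deployment would invoke a BPEL process through its WSDL interface.
    print(f"Work order raised for {sensor_id}, over-temperature since {started_at}")

def monitor(readings, sensor_id="pump-101"):
    """readings: iterable of (timestamp, temperature) tuples."""
    breach_start = None
    for ts, temp in readings:
        if temp > THRESHOLD_DEG:
            breach_start = breach_start or ts
            if ts - breach_start >= DURATION_SEC:
                create_work_order(sensor_id, breach_start)
                breach_start = None  # avoid duplicate orders for the same event
        else:
            breach_start = None

if __name__ == "__main__":
    now = time.time()
    # Simulated feed: ten one-minute samples, running hot from minute three.
    monitor([(now + 60 * i, 150 if i < 3 else 230) for i in range(10)])
```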

Skeptics

In the Q&A following Shahawy’s talk, one skeptic questioned whether this ‘idealistic vision’ could overcome the many barriers to process automation. Shahawy acknowledged that the technology is only part of the equation. ‘You need the database and connectors to sync with applications. But we also provide validation and filtering tools which embed Oracle Consulting’s best practices.’

Vendors

Another question concerned the potential challenge that Oracle’s E&P Asset Data Hub represents to the major E&P application vendors. Shahawy pointed out that the latest manifestations of Landmark and Schlumberger’s applications tend to offer web services interfaces. ‘This makes our life much easier.’ Elsewhere, ‘Clients should ask vendors to implement web services to ease integration with Oracle back office tools.’

Silence

Oil IT Journal made several requests to Oracle for more on the Hub, so far without reply. Is the E&P Asset Data Hub real? Or another Project Synergy, Oracle’s earlier attempt to solve the E&P interoperability problem? (Oil ITJ Vol. 4 N°2).


Petris buys Maurer

Petris has acquired the software assets of Noble Technology’s Maurer Technology division. The portfolio includes 18 drilling and completion tools.

Petris Technology has acquired the software and support assets of Maurer Technology, a Noble Corp. unit. Petris is to assume world-wide responsibility for the development, sales and service of the Maurer drilling and completion software.

Galaxy

Maurer offers 18 individual applications for wellbore trajectories, cementing and casing string design, torque and drag analysis, wellbore hydraulics, stability and control, and various coiled tubing and completion tools. Maurer’s software holds data in Galaxy, a shared Microsoft Access database.

PetrisWinds Now

Petris has been offering Maurer’s software via its ‘applications on demand’ service, PetrisWINDS Now, and also through the Society of Petroleum Engineers’ E&P Software Toolbox.

Pritchett

Petris CEO Jim Pritchett said, ‘With the current outlook for the drilling and completion market, and our good relationship with Maurer, we see a great future for this venture. Our integration technology platform, PetrisWINDS Enterprise, will help take the Maurer toolkit to the next level of interoperability, performance and ease-of-use.’ Petris is to continue to use the Maurer name.


Data management underpins Wal-Mart’s business

Oil IT Journal editor Neil McNaughton welcomes sponsors to the www.oilit.com website for 2006/7. He then reports from the 2006 PNEC, where Wal-Mart CTO Nancy Stewart gave an epiphanous talk describing Wal-Mart’s ‘manic’ approach to data and its iconoclastic, cutting-edge IT.

First a big thanks to our renewing website sponsors—

Exprodat Consulting,

geoLOGIC Systems,

Georex Assistance Technique,

Hewlett Packard,

Petris Technology,

OFS Portal,

Kelman Technologies,

Landmark,

TGS NOPEC/A2D and

Tigress.

Also a big welcome to our two new sponsors for the 2006/2007 year,

Ikon Science and

GeoSoft.

OilIT.com currently receives about 1,600 daily visits.

~

Phil Crouse, organizer of the Petroleum Network Education Conferences (PNEC) Data Integration conference, which we report from in this month’s Oil IT Journal, pulled off a considerable coup this year by persuading Wal-Mart CTO Nancy Stewart to present a paper on the retail behemoth’s data management systems. To my mind, Stewart’s talk was the highlight of 10 years of PNECs. Instead of editorializing, this month I offer you a summary of her presentation which I believe may mark a turning point in thinking on the under-resourced and often overlooked activity that is oil and gas data management.

25 million customers

Stewart’s presentation showed how great attention to data and information management underpins Wal-Mart’s business. Wal-Mart maintains strong data models of what’s happening in its stores on a daily basis. No mean feat when you consider that in a single day there are up to 25 million customers and 12 million credit card transactions. Equally strong communications technology (that bypasses telcos and the banks) allows the company to process credit card transactions from any location in the world in under one second!

Manic about data

Wal-Mart is manic about data, ‘We keep everything! Data is the great enabler.’ Data is kept online for 2 years and then it is paged out to a second tier of storage. Wal-Mart reviews 300 million items a day to see if they need to be restocked and keeps track of ‘on-hands’ for 700 million items.

Outsource? No thanks!

‘We do not outsource or offshore because of our tightly integrated centralized model.’ Wal-Mart’s centralized Information Systems Division (ISD), located in Bentonville, Arkansas, does a lot of its own development, tuning its technology to support its business.

11 mainframes

The company has a heterogeneous IT environment with 11 ‘single system image’ IBM mainframes each with 2,084 processors. There is one replenishment system for 700 million items, one HR system for a million employees and a system supporting worldwide trucking.

‘plex

Each of its two redundant centers (‘plexes) has 40k mainframe MIPS compute bandwidth. Overall, network availability KPIs are 99.997%. Last December Wal-Mart achieved ‘six nines’ for its global network ‘and that includes Microsoft!’ In fact, Wal-Mart gets better performance in China by managing its data in Bentonville!

Performance

Performance is crucial for Wal-Mart’s business. The company loses $1,000/hour per cash register that is down. A four-hour outage ‘means we start to throw away perishables.’ Once, a six-hour outage was deemed so critical that, ‘We filled the sky with planes to get it fixed’.

Tornado

Bentonville is located in a tornado zone so everything is built hardened and there is a second ‘plex for tornado-mode running. This was used most recently only a couple of months ago. To make sure everything works, Wal-Mart fails over between the two ‘plexes once a month.

Teradata

This year, Wal-Mart is upgrading its NCR/Teradata data warehouse with nearly 900 terabytes of storage. The aim is for a central, single version of the truth. Users can ‘ask any question any time’. Wal-Mart develops its own systems because the sheer size of its transaction volumes and data marts stresses commercial tools to breaking point. The company is always asking ‘will we break the architecture?’, especially of its single image store.

4 billion row table!

A 4 billion row table is used daily for sourcing and running the business. Wal-Mart exploits collaborative planning, forecasting and replenishment (CPFR) to anticipate orders. Merchants are always finding better ways of interrogating the data and now the decision support systems use 60% of overall CPU time. Data mining recognizes that each store is unique and is tuned to clients’ needs. Rather than putting stuff that is ‘likely’ to sell into stores, Wal-Mart uses weather mapping to re-route merchandise to where it is needed.

Sacred cows

Stewart slaughtered several sacred cows of the IT business during her 20 minute presentation. Remember buy not build? Forget it. Distributed databases? Forget them too. Outsourcing? You got it. If you want to leverage IT to understand and improve your business, build your own systems that do exactly what you want them to do. Sure, Wal-Mart uses components from many vendors in its solution. Open systems are prominent, with Apache web servers and Linux deployed. Along with Teradata and IBM, SAS, Microsoft and HP got a mention. But Wal-Mart’s data volumes, query requirements and need for precision are hard for a standard product to fulfil. All this need not cost a fortune. Wal-Mart’s IT spend is ‘significantly under’ half a percent of turnover and the ISD gets the job done with only 70 full time employees.

Oil and Gas?

Wal-Mart’s enthusiasm for data contrasts with oil and gas which, as attendees at PNEC know, has neglected data management. In oil and gas, you can ask any question any time, but you maybe won’t get an answer right away! Oil companies are likely to offer their knowledge workers several different versions of the truth to choose from! The Wal-Mart case history provides much food for thought for oil and gas majors and argues in favor of a radical overhaul of IT/IM strategy. With oil at $70 this ought to be a better investment than share buy-backs.


OITJ Interview—Adam Dreiblatt, BearingPoint

According to the US Environmental Protection Agency, reducing diesel engine emissions is one of the most important public health challenges facing the country. Diesel emissions are associated with premature mortality, cancer, heart and lung disease and respiratory complaints. BearingPoint’s Adam Dreiblatt spoke to Oil IT Journal about new regulations that came into force this month that mandate the use of clean, ultra-low sulfur diesel (ULSD) fuel in highway diesel engines. Dreiblatt explains how the EPA is monitoring compliance, how some oil companies are ‘walking on eggshells’ in respect of the new requirements and how IT can be leveraged for compliance.

What do the new regulations entail?

Dreiblatt—The US Environmental Protection Agency’s Ultra Low Sulfur Diesel (ULSD) program mandates a 15ppm sulfur content as of June 1st 2006. Limits on particulate matter will be introduced next year.

How does the EPA do this?

Dreiblatt—The EPA adopts a ‘custodial approach’. Fuel custodians must indicate the sulfur content of their fuel. Enforcement is through spot checks carried out by the EPA.

Do they have enough resources for this?

Dreiblatt—No, there are very few EPA checkers compared with the custodians. But that’s the point of the IT based compliance program. Each fuel transaction is accompanied by a Product Transfer Document (PTD) with a designated recipient and trace of provenance. A PTD is issued every time fuel changes hands. The information goes to the EPA and is stored for five years. While the PTD could be a paper document, the number of transactions and the reporting requirement mean that it has to be digital.

What’s on the PTD?

Dreiblatt—The PTD has information on transactions, volumes and dates. Specific codes identify all custody holders and the originating and destination facilities involved in the transfer. The truck volume report is submitted electronically to EPA’s database. And here the volumes must tally. If I send you 1,000 barrels, you must report reception of same. A discrepancy equates to a violation and is detected within minutes.
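
The tallying rule Dreiblatt describes amounts to a simple reconciliation across each PTD. The sketch below, a hedged illustration in Python, flags any transfer where the sender’s and recipient’s reported volumes do not match; the field names and tolerance are assumptions, not the EPA’s actual schema.

```python
# Illustrative volume tally: every PTD pairs a sender's reported volume with
# the recipient's, and any mismatch (or missing side) is flagged as a
# potential violation. Field names are invented for the example.
from collections import defaultdict

TOLERANCE_BBL = 0.0  # the article implies volumes must tally exactly

def find_violations(ptd_records):
    """ptd_records: list of dicts with ptd_id, role ('sender'/'recipient'), volume_bbl."""
    by_ptd = defaultdict(dict)
    for rec in ptd_records:
        by_ptd[rec["ptd_id"]][rec["role"]] = rec["volume_bbl"]
    violations = []
    for ptd_id, sides in by_ptd.items():
        sent, received = sides.get("sender"), sides.get("recipient")
        if sent is None or received is None or abs(sent - received) > TOLERANCE_BBL:
            violations.append(ptd_id)
    return violations

if __name__ == "__main__":
    records = [
        {"ptd_id": "PTD-001", "role": "sender", "volume_bbl": 1000.0},
        {"ptd_id": "PTD-001", "role": "recipient", "volume_bbl": 1000.0},
        {"ptd_id": "PTD-002", "role": "sender", "volume_bbl": 500.0},
        {"ptd_id": "PTD-002", "role": "recipient", "volume_bbl": 480.0},  # discrepancy
    ]
    print(find_violations(records))  # ['PTD-002']
```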

What happens if there is a violation?

Dreiblatt—Everybody upstream of the violation is considered in violation—another reason why robust systems are required to prevent mistakes. In a sense, the EPA considers you ‘guilty until proven innocent’.

What are the IT controls used here?

Dreiblatt—Companies can validate the PTD by checking against other reports (such as pricing). Companies can build in system-based controls that recognize and flag violating situations. But note that today, there is no third party software to do this, although there are some products under development. Such tools exist for reformulated gasoline (RFG) reporting. But the reality is that today ULSD is far too reliant on manual checks. Folks are walking on eggshells now that the regulations are live.

What is BearingPoint’s offering in this space?

Dreiblatt—Prior to the go-live of the new regulations we found that readiness was uneven. We spent a lot of time checking reporting and tracking systems and enhancing the IT. We also work with software vendors to install and configure their systems.

What packages are used?

Dreiblatt—We do a lot of business process mapping. But this is not off the shelf stuff. The more so because the new regulations allow for sulfur credit trading.

Like carbon trading?

Dreiblatt—It’s the same idea although this is not done through an exchange. Companies who produce fuel that beats the new spec can sell sulfur credits to third parties. The idea is to encourage companies to make a serious effort to be cleaner. All this of course adds to the burden of reporting to the EPA—certain language needs to be added to contracts and other reports are involved.


UNECE and SPE team on reserves

A memorandum of understanding between the UN and the SPE paves the way for a global reserves system.

The Society of Petroleum Engineers (SPE) and the UN Economic Commission for Europe (UNECE) have signed a Memorandum of Understanding to develop ‘one globally applicable harmonized standard’ for reporting fossil energy reserves and resources. The standard will ensure ‘consistency and transparency in financial reporting.’

WPC

The SPE, World Petroleum Council (WPC) and the American Association of Petroleum Geologists (AAPG), have developed definitions for reserves and resources. In 2004, the UN Economic and Social Council passed resolution 2004/233 inviting UN member states to ensure worldwide application of the UN Framework Classification for Fossil Energy and Mineral Resources (UNFC).

Ad-Hoc Group

The UNECE has now created an Ad Hoc Group of Experts on the Harmonization of Fossil Energy and Mineral Resources Terminology in which the SPE plays a key role. The Group of Experts provides a forum for stakeholders to assist in defining the needs to be met by the classification, its definitions, specifications and guidelines, and a vehicle for recommending their application. Under the MOU, SPE will facilitate the development of the texts of a globally harmonized common standard.

Ryder-Scott

Meanwhile, speaking at the 2006 Ryder Scott reserves conference last month, John Ritter, chairman of the SPE Oil and Gas Reserves Committee, said the SPE is about to publish what will be a key document in reserves definition, a handbook of practical examples using the SPE 2007 definitions. The summer issue of the Ryder Scott newsletter contains a lengthy report from the conference, including an ‘alphabet soup’ of stakeholder organizations such as CERA, the IASB, the US FASB, the minerals industry’s Combined Reserves International Reporting Standards Committee (CRIRSCO) and the United Nations Framework Classification mentioned above. More reserves soup from ryderscott.com.


Software/hardware, sales, new releases ...

Software news this month from Blue Marble (GeoCalc 6.2), Encom (Discover 8.0), Enersight (WellSpring 2.0), Invensys (InFusion), M2M Data Corp (iPM), Petrospec Technologies (Equipoise 2006), Roxar (64-bit IRAP RMS) and Wellogix (CSM Suite and new US patent).

Blue Marble’s GeoCalc 6.2 release supports the latest version of the OGP’s EPSG Geodetic Parameter Dataset v6.10. GeoCalc leverages XML data formats and the OpenGIS Consortium’s well-known text (WKT) definitions.

Encom has just announced a new version of Discover, its MapInfo-based geoscience GIS environment. Discover 8.0 adds 2D and 3D data visualization and analysis, grid creation enhancements, data selection tools, statistical reports and other productivity tools.

Enersight’s WellSpring 2.0 adds risk and sensitivity analysis into the network economics. Multiple risked development scenarios can be compared in terms of their impact on expected values and reserves. Assets can have multiple decline parameter sets defined. Drilling programs can be designed to take account of rig availability, drilling time and facility capacity. Support for coal bed methane allows for varying gas composition and well deliverability as the reservoir pressure declines.

Invensys Process Control has announced ‘Move to the Mesh,’ a bundled I/A Series system upgrade incentive. Legacy I/A Series users can upgrade to a bundle of field-mounted I/A Series Control Processors, Windows XP-based flat-panel LCD workstations, software licenses and redundant I/A Series ATS modules. The I/A Series system also allows users to deploy Invensys’ new InFusion enterprise control system (ECS). InFusion ECS adds enterprise information and integration technologies from Microsoft and SAP to process control infrastructures. A 16-page brochure describing InFusion ECS is available from infusionecs.com.

M2M Data Corp. announced that it is providing a maintenance scheduling service for over 4,000 assets only 10 weeks after release of its iPM Maintenance Scheduler Service product, claimed to be the fastest ever adoption of a new service from the company.

Petrospec Technologies has released Equipoise 2006, its pore pressure and wellbore mechanical forecast and analysis while drilling package. The new release includes modules for real-time analysis of wellbore stability and formation properties as well as integrated Internet security for real-time updates at remote network servers. Recent Equipoise licensees include Pioneer and Energy Partners Ltd.

Roxar has announced a port of its flagship 3D reservoir modeling package, IRAP RMS to the Windows 64-bit platform. IRAP RMS is now available on Linux 64-bit, UNIX 64-bit and Windows 64-bit platforms. Benefits include increased data volumes, faster computation and lower operating costs. IRAP RMS now also runs in native mode on all supported hardware platforms and operating systems, without the need for emulation.

Wellogix has just announced a new Complex Services Management Suite (CSM Suite), a solution for planning, procurement and payment of complex oil field services. Wellogix’ CSM Suite underwent ‘rigorous’ architectural design review at Microsoft’s Technology Center in Austin. The new release takes advantage of Microsoft’s information worker strategy and 2007 Microsoft Office SharePoint Portal Server and will embed Microsoft SQL Server 2005, Microsoft .NET Framework 2.0, SharePoint and Microsoft Office Business Scorecard Manager 2005. Wellogix was granted US patent N° 7,043,486 for its ‘complex business project workflow’ technology.


Safe Software’s FME in Oil and Gas

Oil and Gas users discuss Feature Manipulation Engine applications in upstream GIS.

Safe Software’s FME in Oil and Gas conference was held in Calgary last month. Safe’s feature manipulation engine (FME) is a spatial extract, transform and load (ETL) tool that includes a GIS data integration environment. FME supports over 160 different raster and vector formats and databases. Mark Stoakes of Safe Software’s Professional Services unit enumerated some of the many navigation data formats encountered in the Canadian oil industry, from legacy formats like GenaMap through SDE, SDO and raster formats to the emerging Google Earth KML. Surveys of Safe users have shown that while ESRI formats account for around 25% of overall usage, there are over 30 other formats in use.

Oil and gas

Stoakes described a few typical oil and gas uses of FME. EnCana’s MapWiz, an application originally developed by Safe for PetroCanada, merges local GIS data from a central warehouse with IHS data. Talisman Energy’s land lease mapping application uses Safe’s ArcSDEQuerier to merge lease polygons with attribute data, creating line geometry for data resident in its PPDM well and seismic database. Devon Energy updates its SEGP 2D/3D, Shape 2D/3D and ArcSDE in one process – a two day task reduced to 20 minutes. Kerr-McGee integrates ArcIMS and data in FME’s SpatialDirect format in real time, performing on-the-fly transformation to other formats including Tobin TDRBM II. At Nexen, FME runs every 30 minutes to check the PPDM master database for changes and rerun transformations.
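
The Nexen workflow boils down to a poll-and-rerun loop. A generic sketch follows, with SQLite standing in for the PPDM master and a stub in place of the FME workspace run; none of it reflects Nexen’s or Safe’s actual implementation.

```python
# Poll a master database on a schedule and rerun a translation when anything
# has changed. SQLite stands in for the PPDM store; run_translation() would
# launch an FME workspace in a real deployment.
import hashlib
import sqlite3
import time

POLL_SECONDS = 30 * 60

def table_fingerprint(conn, table="well"):
    rows = conn.execute(f"SELECT * FROM {table} ORDER BY 1").fetchall()
    return hashlib.md5(repr(rows).encode()).hexdigest()

def run_translation():
    print("change detected - rerunning spatial translation")

def watch(conn, cycles, poll_seconds=POLL_SECONDS):
    last_seen = None
    for _ in range(cycles):
        fingerprint = table_fingerprint(conn)
        if fingerprint != last_seen:
            run_translation()
            last_seen = fingerprint
        time.sleep(poll_seconds)

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE well (uwi TEXT, surface_lat REAL, surface_lon REAL)")
    conn.execute("INSERT INTO well VALUES ('100/01-01-001-01W5', 51.0, -114.0)")
    watch(conn, cycles=1, poll_seconds=0)
```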

Spatial Direct

EnCana’s MapWiz application has been productized as SpatialDirect, part of FME’s Spatial ETL Server family of applications. This comprises web-based components for creating sites where users can download spatial data in any format or coordinate system. SpatialDirect can integrate web mapping solutions such as ArcIMS, MapGuide, GeoMedia WebMap and MapInfo’s MapXtreme.

Wiki

In the summing up, it was noted that more formats are always coming. Today’s novelties include Google Earth KML, Google SketchUp, FalconView and Oracle Raster. Organizations are building web services and Service Oriented Architectures leveraging WFS, GeoRSS, REST and SOAP/XML. According to Safe, FME is the data integration/aggregation environment for consuming spatial web services. More from www.safe.com and the FME Wiki on www.fmepedia.com.


Shell’s global remote sensing image server

ER Mapper’s consultants have developed a web-based image server for satellite imagery.

Shell E&P has rolled out a bespoke ER Mapper-based solution to make its twelve terabyte archive of satellite imagery available to users throughout the world. The Image Web Server-based solution was built by ER Mapper’s Enterprise Services team. Imagery can be imported into desktop applications such as ER Mapper, Microsoft Word or ESRI’s ArcGIS. Specialist and non-specialist users can now access the archive: remote sensing, GIS and CAD specialists can use the imagery in their desktop applications, while non-specialists can embed imagery from Shell’s central repository into a document or presentation.

Eyers

Shell Remote Sensing consultant Richard Eyers said, ‘Specialist and non-specialist users can now quickly find and access imagery from the archive, providing us with an increased return on our investment in remotely sensed data.’


Ikon, Petrosys jump on Petrel bandwagon

Boutique software houses to leverage Schlumberger’s ‘Ocean’ development platform.

Two G&G software boutiques have signed with Schlumberger this month to build software add-ons for Schlumberger’s flagship interpretation application, Petrel. UK-based Ikon Science is to develop a data link between its RokDoc package and Petrel. Australian Petrosys will do likewise for its petroleum mapping system. Ikon, with sponsorship from BG Group, is to develop a version of its 2D, rock physics-based seismic modeler, RokDoc, which will be embedded in Petrel. SIS’ Microsoft .NET-based Ocean framework will be used to build a link between the two packages.

Petrosys

Petrosys is also to leverage the Ocean framework to integrate its mapping solution with Petrel. Petrel users will be able to display their interpretations, along with information from other data sources, through the Petrosys map interface.


Storewiz seismic compression benchmarks

Contrary to received views, seismic data does compress without loss, by as much as 60-70%.

Storewiz has been trialing its data compression appliance with two unnamed oil and gas accounts to determine the feasibility of lossless seismic data compression. Tests ran on a NetApp FAS980C/980 network attached storage (NAS) array over a Gigabit network. The results, according to Storewiz, demonstrated storage savings of 60-70% and a corresponding reduction in NAS CPU use.

Performance

Processing from the compressed datasets was not penalized by the compression. In fact bandwidth actually improved slightly, benefiting from servers’ parallelism. Best results were obtained with larger file sizes. But even for small files with low compression, process duration was little changed. The Storewiz appliance remained transparent to both client application and storage systems. According to Storewiz, the technology should allow companies to save on storage resources or streamline processing by pulling seismic data from tape to disk.
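
For readers who want to sanity-check the ‘lossless’ claim on their own data, the sketch below shows the kind of back-of-envelope test involved: compress, decompress, verify the bytes are identical and report the ratio. It uses zlib on a synthetic trace and says nothing about Storewiz’s own algorithms or hardware.

```python
# Back-of-envelope losslessness and ratio test on a synthetic seismic-like
# trace packed as 32-bit floats. Real field data will compress differently.
import math
import struct
import zlib

samples = [math.exp(-0.002 * i) * math.sin(0.05 * i) for i in range(100_000)]
raw = struct.pack(f"{len(samples)}f", *samples)

compressed = zlib.compress(raw, 9)
restored = zlib.decompress(compressed)

assert restored == raw  # lossless means bit-for-bit identical after round trip
print(f"compressed size is {len(compressed) / len(raw):.0%} of the original")
```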


BP’s Crystal Ball templates for well risking

Hugh Williamson explains how BP risks its drilling decisions with a standard Crystal Ball template.

In a Decisioneering webcast last month, BP’s Hugh Williamson showed how BP’s drillers use Crystal Ball to optimize time and cost estimates, using Monte Carlo analysis (MCA) as a risk management tool. BP is currently drilling 1,000 wells per year at a cost of $5bn (30% of Capex). BP’s answer to risk management is a well costing template and workflow, the Single Well Estimator (SWE), a.k.a. ‘Monte Carlo for the masses,’ a Microsoft Excel/Crystal Ball workbook developed by Williamson. Williamson insists on the importance of using a standard template rather than letting users develop their own. Even though engineers like to do their own thing, ‘it is best to rein them in!’

Analogs

Evaluation starts by finding suitable ‘analog wells’ for comparison. These are wells in similar circumstances, not necessarily close to the target well. BP tries to avoid automated estimation. The key is ‘conversation’ around possibilities. Automation stifles conversation.

Register

Cost/risk estimation starts by prioritizing top sensitivities in a ‘risk register’ table. All significant risks are included in the probabilistic cost estimate. Monte Carlo is not ‘random’ but focuses the mind on what is important in analysis. The P10/90 estimates of well costs represent the range of likely cost. BP always asks whether these predictions are sound or ‘aspirational’. Deliverables from the SWE include a drilling and completion ‘uncertainty statement’ with a time and cost summary along with MC plots of estimates. Supplying MC to the masses implies continuous development, training and support. BP’s technical management now expects to see, understand and criticize such estimates. Asked if BP applies MC to its business as a whole, Williamson stated ‘We are starting to do this but things get complex with options, and there are non technical problems—decision makers may not want MC results or they may not find them useful. Opinions differ, but I think that all-inclusive MC is the best way of understanding the big picture. Specialists like me are trying to push this forward.’
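
To show what a single well estimate of this kind produces, here is a minimal Monte Carlo sketch in Python rather than Excel/Crystal Ball. The activities, triangular distributions and day rate are invented for illustration; BP’s actual SWE template is far richer.

```python
# Minimal single-well Monte Carlo: sample activity durations, convert to cost
# and report P10/P50/P90. Distributions and the day rate are invented.
import random
import statistics

DAY_RATE_USD = 250_000  # hypothetical spread rate per day

def one_realization():
    drill_days = random.triangular(20, 45, 28)    # low, high, most likely
    complete_days = random.triangular(5, 15, 8)
    trouble_days = random.triangular(0, 20, 2)    # a 'risk register' item
    return (drill_days + complete_days + trouble_days) * DAY_RATE_USD

def cost_percentiles(n=10_000):
    costs = sorted(one_realization() for _ in range(n))
    return costs[int(0.10 * n)], statistics.median(costs), costs[int(0.90 * n)]

if __name__ == "__main__":
    p10, p50, p90 = cost_percentiles()
    print(f"P10 ${p10/1e6:.1f}M  P50 ${p50/1e6:.1f}M  P90 ${p90/1e6:.1f}M")
```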


PNEC 2006, Houston

The Petroleum Network Education Conferences (PNEC) Data and Knowledge Integration Conference continues to thrive, with some 370 attendees this year, almost half of them oil company employees. The highlight of the 2006 PNEC was Nancy Stewart’s (Wal-Mart CTO) address (see this month’s editorial). Kerr McGee (now Anadarko) has been working on unstructured data management and using AJAX technologies to enhance its users’ experience. Shell continues to enhance (and measure) data quality. More metrics underpin Burlington Resources’ (now ConocoPhillips) application portfolio rationalization. Finally, OpenSpirit has been quick to jump on the Google Earth bandwagon and offers the popular GIS front end as a data browser for OpenSpirit-enabled data sources.

Paul Haines reported on Kerr McGee’s program to realize the value of its unstructured data. For Haines, ‘There is no magic in data management, it’s just work and it can be hard to quantify its ROI’. Kerr McGee’s data management vision is of a single doorway to data. Priority is given to unstructured data. A standards-based high level corporate taxonomy has been implemented and roles and responsibilities assigned to business users and data ‘gatekeepers.’ Data acquisition goes through the gatekeeper before archival. The corporate taxonomy is held in OpenText’s LiveLink. Moving files to the EDMS has benefited from the standard taxonomy, search, version control and document management. This has positive spin-off regarding Sarbanes-Oxley and records and information management (RIM) compliance. A Kerr-McGee developed application, WellArchiver, manages well files and metadata capture. Search and retrieval leverages WellExplorer (Geologix) and NitroView (Exprodat). The ILX Viewer (InnerLogix) is also used to spider the Kerr-McGee repository for well log files and also provides access to CoreLab’s off-site data store.

Nexen

Wes Baird (Data Matters) outlined Nexen’s ongoing data management project. The data and process landscape shows up many point to point connections. Proprietary data is used in many areas and there is a lack of consistent business rules. The plan is to move from ad-hoc processes to scheduled processes with shared data, leveraging the Carnegie Mellon capability maturity model (CMM). This starts with interviews to capture business rules. Early results show a ‘data hell’ of in-house PPDM, OpenWorks and IHS data stores, a ‘reference hell’ of inconsistent naming and a ‘security hell’ of access constraints (tables, rows, roles). Baird described similar hells for interface, process and maintenance. Baird’s solution involves data ‘chains,’ simple tables showing data, schema, server, process, roles and responsibilities and what links to what. Nexen has now implemented a PPDM data model, has monthly metadata capture and is in the process of capturing business rules and linking everything together into a ‘repository ready for questions.’

Philip Lesslar, Shell

Philip Lesslar noted that managers and users lack feedback on the severity of data quality problems, so Shell is going for a single rolled-up data quality KPI per organizational unit. Lesslar also described the problem of declining ‘energy level’ as data goes through its life cycle. Various dispersed data quality efforts were grouped into Shell’s EPiQ project, which resulted in the development of Shell’s IQM tool. IQM offers query management, procedures and global metric sets, developed with local businesses to ensure take-up. EPiQ shows color coded quality metrics along with trend indicators. The project is aligned with Shell’s global standard taxonomies. Change management remains an issue.

IHS

Steve Cooper (IHS) outlined the conclusions from a recent survey of 50 IHS clients which found that data volumes are doubling every 6 to 12 months. Customers are creating master data stores using PPDM 3.7, which is emerging as the standard data model for the industry worldwide. Data movement is being revolutionized by web services, a ‘game changer,’ and data exchange standards developed around XML. These can be simple wrappers to existing data servers that let customers go in and grab just what they need. Cooper gave the example of the Accenture xIEP applet developed around SAP’s NetWeaver. ‘Business process mapping, workflow engines and the IM framework are coming together.’

OpenSpirit

Clay Harter (OpenSpirit) asked, ‘Could Google Earth (GE) be used in the oil and gas business?’ GE Enterprise rolls in shape files and raster images through Fusion, which blends the GE database with data on in-house servers. Google’s KML is an XML format for GIS data – points, polylines etc. This can be from static or dynamic sources. A zip version (kmz) embeds raster imagery. OpenSpirit (OS) has leveraged its integration framework to tie into GE by dynamically creating kml/kmz from OS sources that can be consumed by GE. Harter showed a movie demo of GE in action and a new OpenSpirit Web (OSW) product. OSW browses OS data sources in lists as HTML. A button allows for KML creation and visualization in a GE client. OS does the transform to WGS84 (the coordinate system assumed by GE). GE fills the need for a lightweight, easy-to-use 3D browser. GE can be used as an OS data selector.
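
The kml/kmz generation step is simple enough to sketch. The example below, a hedged illustration rather than OpenSpirit’s code, turns a list of well locations (already in WGS84 longitude/latitude) into KML placemarks and zips them into a kmz for Google Earth.

```python
# Build KML placemarks for a few wells and package them as a kmz archive.
# Well names and coordinates are invented; the datum transform to WGS84 is
# assumed to have been done upstream.
import zipfile

def to_kml(wells):
    placemarks = "\n".join(
        f"  <Placemark><name>{name}</name>"
        f"<Point><coordinates>{lon},{lat},0</coordinates></Point></Placemark>"
        for name, lon, lat in wells
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<kml xmlns="http://earth.google.com/kml/2.1"><Document>\n'
        f"{placemarks}\n</Document></kml>\n"
    )

def to_kmz(wells, path="wells.kmz"):
    with zipfile.ZipFile(path, "w", zipfile.ZIP_DEFLATED) as kmz:
        kmz.writestr("doc.kml", to_kml(wells))

if __name__ == "__main__":
    to_kmz([("Well A-1", -95.37, 29.76), ("Well B-2", -94.90, 29.30)])
```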

AJAX in Oil and Gas

David Terribile described Kerr McGee’s use of ‘Web 2.0’ technologies, a.k.a. AJAX, to tie its diverse data sources together. Kerr McGee has built a master data store, a cheap, simple structure to expose master well headers with pointers to raw data locations. One ‘quick win’ application lets a user enter an API number and retrieve the corresponding DIMS UWI. Kerr McGee leverages Oracle Dynamic Collections (ODC) and AJAX components for ‘serious’ data management. An example of an AJAX component is the DbNetGrid which, with ‘six lines of code,’ makes a highly interactive user interface. The grid component is deployed as a front end to Kerr McGee’s geopolitical database and used to filter queries with drop down pick lists, pre-populated from standard values. Kerr McGee’s WellArchiver and WellExplorer apply a similar philosophy with widgets for printing and export to HTML, Word or Excel. AJAX has been a key enabler to add GUI functionality and to fire up other apps in context. Asked where Kerr McGee was on the ‘buy not build’ scale, Terribile answered that this project was more of a configuration exercise, ‘In fact there is less configuration here than for a ‘normal’ GeoFrame installation. We’re not AJAX/CSS experts. The displays come out with a professional look and feel because of the components.’

ConocoPhillips

Dan Shearer reported how Burlington Resources (BR), now part of ConocoPhillips, challenged its IM specialists to raise geoscience productivity by 10%. Multiple corporate acquisitions had caused a software application explosion. A Global Tech Review (performed with Landmark) added some software to fill gaps and turned off maintenance on specialty software. Savings were put into a kitty for subsequent lease of specialty software in ASP mode if needed. OpenIT’s application usage monitoring technology showed Burlington that although folks said, ‘we use that package all the time,’ it was actually last used nine months ago! Maintenance was reduced by 36% in 2003 over 2002 with a reduced data management effort. Application status was classified in terms of currency of use. Geoscientists can now ‘shop’ from Burlington’s own list of 250 approved applications. Burlington has evolved from a ‘cost conscious cult’ to a ‘disciplined value cult’. This has targeted shortening project lifecycles with a 3D earth representation and by preserving analyses. A study can then ‘pop up’ if a subsequent oil price rise makes it economic. Burlington sets aside $14,000/year/geoscientist for training, has joined Nautilus and hosted a ‘creative solutions conference.’ A data SWAT team has been formed composed of 50% geoscientists and 50% software engineers.

This article has been taken from a 10 page report produced as part of The Data Room’s Technology Watch reporting service. More from tw@oilit.com.


PPDM Spring meeting highlights metadata

The joint meeting of the Public Petroleum Data Model (PPDM) and the Pipeline Open Data Standards (PODS) associations debated metadata, semantics, data exchange and units of measure.

At the Spring meetings of the Public Petroleum Data Model (PPDM) and the Pipeline Open Data Standards (PODS) associations, held in Houston, metadata held the spotlight. PPDM CEO Trudy Curtis showed how the Dublin Core metadata standard supports records and information management (RIM) functionality such as retention and content life cycle, as well as considerations such as geographic coverage, format and language. PPDM is building support for ontologies, taxonomies and metadata into the PPDM data model, mapping the Flare Catalogue into a W3C-compliant ontology.

Schema

PPDM is planning to publish XML schemas for a range of activities from acquisition and divestment, through business associations, seismic data processing and well information. The schemas will be developed through architectural collaboration with other organizations including GML and POSC. The issue of standard data content continues to exercise the community. Current thinking is that if relevant value sets exist, then PPDM should ‘point members to the list’ rather than take on the burden of maintenance.

Semantics

Wanda Jackson (WHL Information Solutions) explained how the Taxonomy and Metadata workgroup is setting out to create PPDM modules to manage semantic information and taxonomies. The workgroup will also implement mappings between taxonomies and metadata in PPDM and generate mechanisms for sharing and exchanging such information. A terminological gear change saw Hakan Sarbanoglu (Kalido) speak about similar issues in terms of the Enterprise Data Warehouse and Master Data Management. Sarbanoglu, who previously delivered a federation of Kalido Data Warehouses across Shell Oil Products’ 21 European units, believes that technology is mature enough to make the master data management concept succeed where previous corporate database initiatives have failed.

PODS

Greg Smith (New Century Software) addressed the issue of data exchange between PODS and other pipeline industry stakeholders such as the inline inspection (ILI) community. PODS’ RP-0102 offers a standard data structure for the exchange of inspection data between vendors and pipeline operators. The project will embrace anomaly classification through Hunter McDonnell’s Anomaly Library for Inspection Assurance Standards (see pipelinealias.com). PODS is also working with the National Association of Corrosion Engineers (NACE) on a new External Corrosion Direct Assessment (ECDA) data exchange standard (RP 0502). Alan Herbison (Kinder Morgan) reported on the PODS Spatial Implementation Working Group’s extension that can be implemented in various commercial mapping packages.

UOM

Harry Schultz (OilWare) explained how units of measure (UOM) are handled by various data initiatives including PPDM. The oil industry is a minefield of different unit systems and usages. The current PPDM UOM initiative will provide a unique UOM identifier in the database and will support original UOM data along with approved conversion constants and an indication of conversion accuracy. Schultz regretted, en passant, the powerful API RP66 format which allowed for conversions to be derived by parsing the data. This is not the case for PPDM, although UOM conversions can be put into stored procedures. Schultz noted that storing data in a default UOM with a pointer to the original units is ‘useful but very confusing!’
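
A toy illustration of the storage pattern Schultz describes follows: keep the value in a default unit alongside a pointer to the original unit and an approved conversion constant. The units, factors and field names are a tiny invented subset, not PPDM’s actual tables.

```python
# Store a measurement in a default UOM while keeping the original value,
# original unit and a note on conversion accuracy. Illustrative only.
from dataclasses import dataclass

TO_METRES = {"m": 1.0, "ft": 0.3048, "km": 1000.0}  # approved length factors

@dataclass
class Measurement:
    value_default: float      # value stored in the default UOM
    default_uom: str          # e.g. 'm'
    original_value: float     # value as reported
    original_uom: str         # pointer back to the reported unit
    conversion_accuracy: str  # e.g. 'exact' or 'approximate'

def store(value, uom, default_uom="m"):
    factor = TO_METRES[uom] / TO_METRES[default_uom]
    return Measurement(value * factor, default_uom, value, uom, "exact")

if __name__ == "__main__":
    td = store(9_500, "ft")   # a total depth reported in feet
    print(td)                 # stored as 2895.6 m, original units preserved
```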

3.8 Alpha

Last month, PPDM announced the Alpha release (V2) of its next major model, PPDM 3.8. The new release embeds the results of the data management, metadata, taxonomy and well operations workgroups.


Folks, facts and orgs…

News from Kerr McGee, Ability Group, Paras, AspenTech, Visean, VS Fusion, Halliburton, Octaga, Rock Solid Images, Ryder Scott, Sun, Stanford University, Energy Solutions, Ceritas, Matco, WellPoint, IFP, Helix, Fugro, Cygnet, Labrador, Statoil, Knowledge Systems, MMS and Datalog.

Kerr McGee has promoted Frank Patterson to VP exploration and technology.

Norwegian Ability Group (AGR) has acquired The Peak Group (TPG) for £26 million cash. Tom Conlon has joined TPG as operations manager for Dubai.

Mike Larsen returns to Paras after a stint with Schlumberger Business Consulting.

AspenTech has received a ‘Wells Notice’ from the SEC of possible civil action regarding its 2004-2005 accounts. In March AspenTech settled a class action with a $5.6 million payment to some shareholders.

Visean has appointed Lloyd Taylor as non-executive director. Taylor was formerly CEO of Fletcher Challenge Energy.

CGG/Baker Hughes joint venture VS Fusion has acquired microseismic monitoring specialist Magnitude.

Saudi Aramco has awarded Halliburton the oilfield services component of the 300 well Khurais mega project.

Norske Shell has awarded Norwegian Octaga the real-time virtual reality visualization of its Ormen Lange facility.

Rock Solid Images is to team with OHM on the integration of seismic and well log data with OHM’s CSEM technology.

Don Roesle is CEO of Ryder Scott, John Hamlin is now managing senior VP and Joe Magoto has joined the board of directors. Jennifer Fitzgerald has left ExxonMobil to take a petroleum engineering position with the company.

Sun Microsystems and Stanford University have created the Stanford Computational Earth and Environmental Science (CEES) center for simulation and prediction of complex earth processes.

Valero LP has selected Energy Solutions’ PipelineStudio for its offline simulator.

Veritas has sold its land seismic acquisition business to Matco Capital Ltd.

Canadian Oil Sands Trust is to deploy WellPoint’s Oil Marketing System to manage its SynCrude participation.

Helix Energy Solutions has announced an IPO for its Cal Dive wholly-owned unit.

Cygnet has appointed Henry Hickey to its Application Engineering group. Hickey was previously with Oil & Gas Equipment.

Labrador Technologies is offering a 6% finder’s fee for fulfillment of a non-brokered private placement of up to $CDN500,000.

Statoil is to become ‘carbon neutral’ by buying quotas for CO2 emissions to offset its business travel and office heating.

Casey Johnson is North America Account Manager with Knowledge Systems. Johnson was previously with Mercury.

Tom Readinger has retired as Associate Director of the US Offshore Minerals Management Program. Readinger chaired the OMM’s Information Management Committee and promoted e-government.

Correction

Datalog CTO Dave Hawker pointed out a couple of errors in our April article on Datalog’s Anax 500 logger. Anax runs on OpenBSD, not QNX as we wrongly stated. Although MySQL is ‘open source,’ Hawker points out that its commercial use is under license.


IHS steps-up IT pace with acquisition

GeoPlus’ Petra interpretation package now in IHS fold, NAD83-compliant SuperGrid announced.

IHS Energy, now rebranded as a ‘segment’ of parent IHS, has been busy this month in the IT space. IHS has acquired GeoPlus Corp., developer of the Petra interpretation package, joined Texas A&M’s Crisman Institute for Petroleum Research and, in collaboration with Veritas, has launched ‘SuperGrid’ a NAD83-compliant dataset of Canadian survey data.

Petra

Petra is a PC-based geological, engineering, and petrophysical analysis tool. A companion product, PetraSeis, offers seismic interpretation coupled with the Petra database. IHS is to offer these tools as front ends to its online data sets, providing users with near real-time project data refresh.

Mobed

IHS president and COO Ron Mobed said, ‘For Petra users, our ultimate aim is to refresh the project models for their full inventory of drilling prospects, automatically. This will enable faster and lower-risk decision making by our customers as they compare, evaluate and select new assets to drill, based on the latest information.’ The Petra software and customer service teams will remain in Tulsa and an IHS data integration team will work on the flow of IHS data into Petra. IHS will also maintain Petra’s support of data formats for a variety of data and software suppliers. Petra is capable of serving data to the Open Spirit data exchange bus.

SuperGrid

IHS Canada has teamed with Veritas to offer ‘SuperGrid’ a NAD83 compliant E&P survey grid for oil and gas companies, mapping professionals and industry-related software vendors. SuperGrid is available through IHS applications, from the IHS data hub or from third party vendors. SuperGrid currently covers the Western Canada Sedimentary Basin and results from an integration of government source points and IHS databases. The package includes the Western Canadian Dominion Land Survey grids and federal government theoretical grids (NTS, FPS). An Oracle database of more than 1.1 billion grid points allows for custom-built queries, layers and applications at specified point resolutions based on unique customer needs.

Crisman Institute

Following ongoing donations of its software, and gifts of free access to some of its data, IHS has become a member of the Crisman Institute for Petroleum Research at Texas A&M University. The Institute comprises the Halliburton Center for Unconventional Resources, the Chevron Center for Well Construction and Production, the Schlumberger Center for Reservoir Description and Dynamics and the Center for Energy, Environmental and Transportation Innovation. Other Institute members include ConocoPhillips, Anadarko, Baker Hughes, Total, Newfield, Devon, and Saudi Aramco.


Schlumberger acquires volume management

Information Solutions unit acquires production management technology from Quorum.

Schlumberger Information Solutions (SIS) has acquired the rights to Quorum Business Solutions’ Volume Management (QVM) package. QVM provides field operations data management and is used on several heavy oil projects in North America and one brown field installation in Asia.

Le Peuch

SIS president, Olivier Le Peuch said, ‘QVM is compatible with our next-generation technology framework and this partnership will accelerate our strategic initiative to deliver a complete production offering that integrates with back office operations.’

Weidman

Quorum CEO Paul Weidman added, ‘Quorum is focusing its efforts on delivering back-office solutions. We will retain our production and revenue accounting tools.’ Under the terms of the agreement, SIS acquires the rights to QVM as well as key development personnel. SIS will develop its production volumes management solution around QVM and will partner with Quorum to integrate with its accounting workflows.


Altair, Permedia team on HPC simulator

New version of MPath basin modeling package tuned for distributed grid computing.

Permedia Research Group has teamed with Altair Engineering to offer a grid technology based version of Permedia’s MPath petroleum migration and reservoir fill simulator. MPath, which is used in basin modeling and reservoir characterization, uses a high-resolution reservoir fluid pressure, continuity and mixing solver.

DRM

MPath’s Distributed Risk Module (DRM) generates and ranks hundreds of realizations to constrain cases where large uncertainties exist in the input parameters. To speed this compute intensive application, Altair’s PBS Professional grid technology is now available to run multiple simulations concurrently on a distributed computing grid.

PBS Pro

PBS Professional is a workload management solution for high-performance computing (HPC) environments. The software intelligently schedules computational workloads across available hardware and software resources through policy-based grid technology.
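
Farming the DRM’s realizations out to a PBS cluster typically means submitting one job per realization. The sketch below shows the pattern using the standard qsub command; the simulator command line, job script and resource request are invented and do not reflect Permedia’s or Altair’s actual integration.

```python
# Submit one PBS job per realization with qsub. Only qsub itself is standard
# PBS usage; the script contents and simulator command are hypothetical.
import subprocess
import tempfile

def submit_realization(seed):
    script = f"""#!/bin/sh
#PBS -N mpath_{seed}
#PBS -l nodes=1:ppn=2
mpath_run --realization-seed {seed} --model basin.mdl
"""
    with tempfile.NamedTemporaryFile("w", suffix=".pbs", delete=False) as f:
        f.write(script)
        path = f.name
    subprocess.run(["qsub", path], check=True)

if __name__ == "__main__":
    for seed in range(200):   # a few hundred realizations, as in the DRM
        submit_realization(seed)
```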

Carruthers

Permedia’s Dan Carruthers said, ‘Powerful, easily accessed compute resources are essential to exploration decisions that have significant monetary consequences. By integrating PBS Professional with MPath, our customers can run more simulations with their existing hardware and make better decisions.’


Landmark to provide Energy XXI’s IT

Bermuda-based E&P company to outsource upstream information management to Halliburton unit.

Landmark (which has now dropped the ‘Graphics’ to become a brand of the Halliburton Energy Services Group) has been awarded a two-year contract by Bermuda-based E&P company Energy XXI to provide ‘total upstream information technology outsourcing, hosting and consulting services’. The contract includes data loading, application hosting, data management, disaster recovery, geophysical, geological and production optimization software, field development consulting services and onsite IT services.

Meikle

Landmark VP Doug Meikle said, ‘This contract demonstrates the value of Landmark’s IT outsourcing, data management and our hosted environment.’

Schiller

Energy XXI chairman John Schiller added, ‘By outsourcing these services to Landmark, we can focus on our core business without having to support local infrastructure, data management and IT staff.’


eLearning—basic principles of petroleum

American Petroleum Institute University offers on-line learning.

The American Petroleum Institute (API) has released an interactive computer-based training course on the ‘Basic Principles of Petroleum’ as part of its continuing education program for oil and gas professionals. The eLearning course can be taken on-line or delivered on CD-ROM.

Combs

API business director Kathleen Combs said, ‘The course was designed to address the learning needs of newly-hired personnel but will also benefit cross-training of staff in engineering and support roles’. Course modules include petroleum geology, exploration, drilling operations, production, refining, and distribution. An individual license to the course can be ordered through the API University website at api-u.org for $145.


Netways’ requisition solution for Lamnalco

Microsoft Dynamics and Captaris Workflow support Mid East oilfield service company’s procurement.

United Arab Emirates-based Netways has automated the requisition process for the Lamnalco Group of Companies. Lamnalco provide marine support services to oil and gas terminals and ports worldwide. The business process automation project leverages Microsoft Dynamics GP (formerly Great Plains), Captaris Workflow and Microsoft Visual Studio.

Seamless

A state of the art requisition model is ‘seamlessly integrated’ with Great Plains and now allows Lamnalco to initiate requisitions, approve them according to company rules, monitor inventory, automate RFQs and create purchase orders from Microsoft Dynamics GP.

Captaris

Captaris business process workflow is applicable to any repetitive activity. Tasks are automatically assigned and delivered to individuals or groups. Tasks remain in the task lists until they are completed. Escalation rules allow unfinished tasks to be re-assigned to managers or overflow teams.


SBM chooses AspenTech for FPSO design

HySys technology key to simulation and optimization of floating production systems.

Floating production storage and offloading systems (FPSO) specialist SBM Offshore has signed a multi-year license agreement to expand its use of AspenTech’s engineering solutions. The new agreement provides access to HySys Upstream which SBM uses to support design and optimization of its oil and gas production facilities. The HySys Upstream option adds industry standard methods for handling petroleum fluids to the base HySys engineering package.

Wyllie

SBM Chief Engineer Mike Wyllie said, ‘AspenTech’s simulation applications are tailored to our industry. Providing our engineers with flexible access to more tools will enable us to make better design and investment decisions, in addition to increasing our engineering efficiency on upstream projects.’ Other SBM Offshore units GustoMSC, SBM-Imodco and Single Buoy Moorings have already standardized on HySys. The new agreement expands usage and adds support for other applications including dynamic simulations and the design of heat transfer equipment.

Wheeler

AspenTech senior VP Blair Wheeler added, ‘HYSYS is recognized as the leading platform for simulation and optimization in the oil and gas industry. SBM’s decision reflects the value it can deliver to clients by optimizing the design and performance of production systems.’


New standards from OpenGIS, GeoRSS, GEOSS

System of Systems GIS interoperability demo, GeoRSS location-aware news feed standard unveiled.

The Open Geospatial Consortium has released a Simple Features Profile for its Geography Markup Language specification. This specifies 2D geometry features such as Point, Curve, Surface etc. and is claimed to ease development of GML-based software for communities that share a common data model.

GEOSS

Interoperable Web Services, another OGC spec, were successfully demonstrated in the Global Earth Observing System of Systems (GEOSS). GEOSS showed complex 4D wind data and online meteorological processing services being published, discovered, and accessed over the internet.

GeoRSS

A new standard for geo-enabling RSS feeds, GeoRSS, was shown at the Location Intelligence 2006 conference. GeoRSS embeds location into RSS feeds for display in a GeoRSS-enabled client. GeoRSS is a simple XML format for associating a point, line, boundary or bounding box with an RSS item. More from georss.org.
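
The GeoRSS-Simple encoding is little more than one extra element per item. The snippet below builds an illustrative feed in Python; the news content is invented, and only the georss:point element (latitude then longitude, space separated) and its namespace follow the standard.

```python
# Attach a GeoRSS point to an ordinary RSS item. Feed content is invented.
def georss_item(title, link, lat, lon):
    return (
        "<item>\n"
        f"  <title>{title}</title>\n"
        f"  <link>{link}</link>\n"
        f"  <georss:point>{lat} {lon}</georss:point>\n"
        "</item>"
    )

feed = (
    '<rss version="2.0" xmlns:georss="http://www.georss.org/georss">\n'
    "<channel>\n"
    + georss_item("Rig move, block 15/25", "http://example.com/news/1", 58.41, 1.92)
    + "\n</channel>\n</rss>"
)
print(feed)
```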


White paper explores RFID use in oil and gas

Wipro’s Vibhor Gupta describes multiple applications of radio frequency ID tags in the oil patch.

A new White Paper, authored by Vibhor Gupta (Wipro), outlines likely applications for RFID technology in oil and gas. Along with the regular identification number (ID), some RFID tags contain kilobytes of data on, for instance, temperature history. RFID data can be read by an operator with a hand-held device or transferred directly to a computer for automated inventory management.

AIDC

RFID, also referred to as Automatic Identification and Data Capture (AIDC), embeds digital memory that can be programmed using radio signals. This can take place in tough production environments at high speed. Most RFID devices have a latency of 1/10th second.

Anti-collision

Tags may be active (batteries included) or shorter range, passive tags that receive their energy from the radio frequency field supplied by the reader. Tag readers talk to individual items thanks to sophisticated ‘anti-collision’ algorithms that uniquely identify each tag.

Marathon

In-depth systems, a technology company established by Marathon Oil, offers RFID solutions for oil well applications including accurate perforating gun triggering. Actuant unit Hydratight uses RFID tags to ensure correct assembly of pipe joints. Finally, there are multiple applications for RFID in e-business and cashless transactions. Read the full Wipro White Paper on www.oilit.com/papers/wipro.pdf.


Real time cost management for oil and gas

Decision Dynamics’ Oncore enables highly granular cost control of oil and gas projects.

Calgary-based Decision Dynamics has just announced ‘Oncore’, a real-time project cost management package for oil and gas. Oncore, formerly known as Time Industrial, tracks labor, equipment, materials and other costs for capital and operations/maintenance projects by line item and provides robust analytics for contractor performance monitoring. Business benefits include reduced post-project audit costs, improved owner/contractor relationships achieved through fewer invoice disputes, and increased accuracy of future estimates based on historical project data maintained in the Oncore database.

Jarman

DD CTO Andrew Jarman said, ‘Traditional financial accounting systems report project costs at a summary level, often 60 to 90 days after the work is done. This makes accurate root cause analysis, problem resolution, and proactive prevention of serious cost and schedule deviations impossible. Oncore tracks line items in real time and represents a breakthrough in cost control and financial performance. In one six-week Oncore pilot project, an oil and gas company discovered $200,000 in duplicate charges for hours billed. ERP systems may check for duplicate invoices but lack the line item granularity that flags this kind of problem.’ Oncore is available both as installed software and as an application hosted in Decision Dynamics’ data center.
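
The duplicate-hours example comes down to a line-item level check that summary-level invoice matching misses. A hedged sketch of the idea follows; the field names and data are illustrative, not Oncore’s schema.

```python
# Flag line items where the same contractor bills the same hours for the same
# task on the same day, even across different invoices. Fields are invented.
from collections import Counter

def duplicate_line_items(line_items):
    """line_items: iterable of dicts with contractor, work_date, task, hours."""
    keys = [(li["contractor"], li["work_date"], li["task"], li["hours"])
            for li in line_items]
    return [key for key, n in Counter(keys).items() if n > 1]

if __name__ == "__main__":
    items = [
        {"contractor": "ACME Welding", "work_date": "2006-05-02",
         "task": "pipe tie-in", "hours": 12, "invoice": "A-101"},
        {"contractor": "ACME Welding", "work_date": "2006-05-02",
         "task": "pipe tie-in", "hours": 12, "invoice": "A-117"},  # duplicate
        {"contractor": "Delta Crane", "work_date": "2006-05-02",
         "task": "lift", "hours": 6, "invoice": "D-330"},
    ]
    print(duplicate_line_items(items))
```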


InfoWeb posts OWL version of ISO 15926

InfoWeb’s semantic model of the ISO plant data standard gets thumbs-up from Berners-Lee.

Dutch standards body, USPI-NL, is in the process of implementing the ISO 15926 standard for integration, sharing, exchange, and hand-over of plant data. Part 7, now implemented in OWL, the W3C’s Web Ontology Language, defines implementation methods required to ‘use ISO 15926 to its full extent.’

Berners-Lee

Fluor Corp.’s Onno Paap, implementer of the RDF/OWL version of ISO 15926 told Oil IT Journal that the OWL/RDF mapping got a thumbs-up from none other than the father of the World Wide Web, Tim Berners-Lee. Berners-Lee has been pushing RDF as the key technology behind the Semantic Web (OITJ Vol. 9 N° 1) and considered the ISO 15926 initiative as a benchmark test of the Semantic Web. InfoWeb transforms ‘rules’ from legacy EXPRESS-based models into OWL’s ‘axioms’. Check out the results on www.infowebml.ws.
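
For a flavor of what ‘rules into axioms’ means in practice, the sketch below expresses a made-up EXPRESS-style rule (a pump is a kind of rotating equipment and must have a rated power) as OWL triples with the rdflib library. The namespace and class names are invented and are not the actual ISO 15926-7 mapping.

```python
# Express a toy 'rule' as OWL axioms: a subclass assertion plus a minimum
# cardinality restriction. Names are invented for illustration.
from rdflib import BNode, Graph, Literal, Namespace, RDF, RDFS

EX = Namespace("http://example.org/plant#")
OWL = Namespace("http://www.w3.org/2002/07/owl#")

g = Graph()
g.bind("owl", OWL)
g.bind("ex", EX)

# Class hierarchy: Pump is a subclass of RotatingEquipment.
g.add((EX.Pump, RDF.type, OWL.Class))
g.add((EX.RotatingEquipment, RDF.type, OWL.Class))
g.add((EX.Pump, RDFS.subClassOf, EX.RotatingEquipment))

# The 'rule' as an axiom: every Pump has at least one ratedPower value.
restriction = BNode()
g.add((restriction, RDF.type, OWL.Restriction))
g.add((restriction, OWL.onProperty, EX.ratedPower))
g.add((restriction, OWL.minCardinality, Literal(1)))
g.add((EX.Pump, RDFS.subClassOf, restriction))

print(g.serialize(format="xml"))
```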


Tadpole’s i-Plan for BP shutdowns

BP, Interserve and Tadpole Technology have developed i-Plan, a ‘one stop’ technology for plant shutdown.

Interserve Industrial Services (IIS), which manages BP Exploration’s Dimlington, UK gas terminal, has selected UK-based Tadpole Technology’s i-Plan to manage an upcoming plant shutdown. i-Plan is a ‘one-stop’ technology solution for the plant shutdown process. The package was jointly developed by BP, Interserve and Tadpole to reduce the cost and effort involved in plant shutdown.

Heilbron

IIS Director Kevin Heilbron said, ‘Ready access to accurate information is the key to shutdown management. i-Plan ensures that current information is available wherever and whenever it is needed for planning, ongoing optimization of resource management and a streamlined shutdown.’

GIS

Tadpole specializes in the application of geospatial technology for office-based and mobile users. The company has previously developed GIS applications, workflow management solutions and geospatial data synchronization technologies that are used by national mapping agencies, utilities and government. Tadpole Technology is an ESRI business partner.


Michigan Tech speaks DoE’s LINGO

US Department of Energy awards $1.3 million to projects to mitigate environmental impact of E&P.

Three awards have been made under the US Department of Energy’s $1.3 million Low-Impact Natural Gas and Oil (LINGO) scheme, managed by the National Energy Technology Laboratory. The University of Arkansas and the Argonne National Laboratory will develop a web-based package to enable small E&P companies to generate development plans for work in sensitive ecosystems within the bounds of the Fayetteville Shale play.

Michigan Tech

A second award goes to Michigan Tech for a new strategy to satisfy a state environmental regulation that places large tracts of the Michigan Basin off-limits to E&P. The Interstate Oil and Gas Compact Commission (IOGCC) gets the third award, to develop an Adverse-Impact Reduction Handbook to help E&P companies identify onshore barriers to E&P and provide viable approaches to minimizing impacts. The handbook will serve as a best practices guide, with case studies, field research and ‘broad stakeholder input,’ to overcoming opposition to, or delays in, E&P activity. IOGCC is partnered by ALL Consulting, Devon Energy and various state agencies.


Schlumberger markets AspenTech to upstream

Asset Builder and Operations Manager to link upstream and downstream business processes.

Schlumberger, through its Information Solutions (SIS) unit, has acquired exclusive rights to develop products and solutions for the upstream drilling and production market using Aspen’s Asset Builder. Asset Builder is an integration framework that enables dynamic asset models to be built from components such as Aspen’s own HySys, Schlumberger’s Avocet and other tools including Microsoft Excel.
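
As a purely hypothetical illustration of the adapter-style integration such a framework implies, the sketch below chains two wrapped ‘components’ behind a common interface, each reading from and adding to a shared asset state. The class names, methods and numbers are invented and bear no relation to Asset Builder’s actual API.

```python
from abc import ABC, abstractmethod

class AssetComponent(ABC):
    """Common interface an integration framework might impose on wrapped tools."""
    @abstractmethod
    def run(self, state: dict) -> dict: ...

class ProcessSimAdapter(AssetComponent):
    # Stand-in for a wrapped process simulation model (names invented).
    def run(self, state: dict) -> dict:
        return {"separator_outlet_rate": state["well_rate"] * 0.92}

class SpreadsheetAdapter(AssetComponent):
    # Stand-in for a wrapped spreadsheet economics model (names invented).
    def run(self, state: dict) -> dict:
        return {"revenue_usd": state["separator_outlet_rate"] * 365 * 60.0}

def run_asset_model(components, state):
    """Chain the components, each reading from and adding to the shared state."""
    for component in components:
        state.update(component.run(state))
    return state

print(run_asset_model([ProcessSimAdapter(), SpreadsheetAdapter()], {"well_rate": 12_000}))
```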

Le Peuch

SIS president Olivier Le Peuch said, ‘Our goal is to drive operational efficiency by eliminating the divide between upstream and downstream business processes. The oil and gas industry has responded favorably to our integration effort. This step will provide the upstream market with accelerated technology delivery and a unified market presence.’

Operations Manager

In a separate agreement, SIS acquired the rights to develop or embed another Aspen package, Operations Manager (OM), in its own upstream products. OM provides infrastructure capabilities including role-based process data visualization, an ‘open simulation environment,’ performance scorecarding and event management.


Petrobras takes part in SEC XBRL trial

Only oil and gas company involved in SEC benchmark test of ‘level playing field’ financial reporting.

Petrobras is to take part in a Securities and Exchange Commission (SEC) pilot program to use interactive data in its financial statement filings. The pilot program will enable the participants to determine the benefits of using interactive data, provide feedback to the SEC and enable investors and analysts to assess new techniques for analyzing interactive data reports submitted to the SEC in the eXtensible Business Reporting Language (XBRL) format.
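
For readers unfamiliar with the format, the sketch below builds a minimal XBRL-style instance fragment (one reporting context, one unit and one tagged fact) using Python’s standard library. The taxonomy element, entity identifier scheme and figures are invented for illustration and do not reflect Petrobras’ actual filings or the SEC taxonomies.

```python
import xml.etree.ElementTree as ET

XBRLI = "http://www.xbrl.org/2003/instance"   # XBRL instance namespace
DEMO = "http://example.org/demo-taxonomy"     # invented taxonomy namespace
ET.register_namespace("xbrli", XBRLI)
ET.register_namespace("demo", DEMO)

root = ET.Element(f"{{{XBRLI}}}xbrl")

# A context ties each fact to a reporting entity and period.
ctx = ET.SubElement(root, f"{{{XBRLI}}}context", id="FY2005")
entity = ET.SubElement(ctx, f"{{{XBRLI}}}entity")
ET.SubElement(entity, f"{{{XBRLI}}}identifier",
              scheme="http://example.org/tickers").text = "PBR"
period = ET.SubElement(ctx, f"{{{XBRLI}}}period")
ET.SubElement(period, f"{{{XBRLI}}}startDate").text = "2005-01-01"
ET.SubElement(period, f"{{{XBRLI}}}endDate").text = "2005-12-31"

# A monetary unit and one tagged fact referencing the context and unit.
unit = ET.SubElement(root, f"{{{XBRLI}}}unit", id="USD")
ET.SubElement(unit, f"{{{XBRLI}}}measure").text = "iso4217:USD"
fact = ET.SubElement(root, f"{{{DEMO}}}Revenues",
                     contextRef="FY2005", unitRef="USD", decimals="0")
fact.text = "1000000"

print(ET.tostring(root, encoding="unicode"))
```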

Cox

SEC Chairman Christopher Cox said, ‘As the number of companies submitting interactive data grows, it’s clear that making information available to investors in a more useful way is cost-effective.’ At a roundtable on interactive data held in Washington this month, Cox said that the XBRL initiative ‘avoids the drudgery of digging into today’s dense documents to fish out the data.’ Cox invited participants to ‘imagine the possibilities if all company and mutual fund financial information were available to everyone for free in real time, directly from the source.’

Key facts

Interactive data will level the playing field between the ‘well-endowed’ financial firms that can afford to compile comparative data and the individual investor. Companies interested in joining the test group should visit the interactive data spotlight page at www.sec.gov/spotlight/xbrl.htm.


CGG buys IBM PowerPC supercomputer

Geophysical company renews compute hardware on a two-year cycle—113 teraflop peak announced.

French geophysical contractor Compagnie Générale de Géophysique (CGG) is to deploy IBM BladeCenter JS21 clusters at its data processing centers in France, London, Kuala Lumpur and Houston. CGG expects its total compute capacity to grow to a theoretical peak of 113 teraflops. The supercomputers will be used to speed CGG’s seismic imaging effort.

Power PC

The PowerPC-based supercomputers leverage IBM’s new BladeCenter JS21 systems. Initially, CGG has expanded its facilities by deploying some 2,800 JS21 nodes, each comprising dual-core PowerPC 970MP 2.5 GHz processors with a 1MB L2 cache per core, in 200 BladeCenter H chassis. CGG’s Geocluster seismic processing package has been optimized for the VMX vector accelerator in the JS21.
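
The headline figure is roughly consistent with a back-of-the-envelope calculation, assuming two dual-core 970MP chips per JS21 blade and four floating-point operations per core per clock cycle; these assumptions are ours, not CGG’s or IBM’s.

```python
# Back-of-the-envelope check of the announced ~113 teraflop theoretical peak.
# Assumptions (ours): two dual-core 970MP chips per JS21 blade, and four
# double-precision floating-point operations per core per clock cycle.
blades          = 2_800
blades_per_box  = 14          # BladeCenter H chassis capacity
cores_per_blade = 2 * 2       # two dual-core processors
clock_ghz       = 2.5
flops_per_cycle = 4

chassis = blades / blades_per_box
peak_tflops = blades * cores_per_blade * clock_ghz * flops_per_cycle / 1_000

print(f"{chassis:.0f} chassis, theoretical peak ~{peak_tflops:.0f} teraflops")
# 200 chassis, ~112 teraflops, close to the 113 teraflop figure announced
```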

Cambois

CGG VP Guillaume Cambois said, ‘Technology is our number one ally and we renew our calculating capabilities every 24 months. BladeCenters will increase our calculating power and provide clients with more accurate, faster imaging.’


Twin Canadian wins for PlantWeb

Canadian Natural Resources and EnCana select Emerson’s automation and safety technology.

Canadian Natural Resources Ltd. (CNRL) has contracted Emerson’s Process Management unit for the automation of its $9.4 billion ‘Horizon’ oil sands development in Alberta. The first phase of the Horizon project will allow CNRL to produce 110,000 barrels per day. Later phases of the project will raise total production to 232,000 barrels per day upon completion in 2012.

Wing

Bob Wing, project coordinator at CNRL, said, ‘Risk mitigation was a primary concern of the Horizon project team. We chose the PlantWeb architecture based on Emerson’s experience in oil sands projects and global engineering expertise. Emerson project members will join our Automation Integration Team to concentrate our ten process areas into a single control room.’

EnCana

EnCana Corp. has also awarded Emerson a contract for a PlantWeb deployment at its Steeprock natural gas processing plant in BC. Steeprock will process 190 MMSCFD of raw gas. PlantWeb was selected in part for its ‘smart’ safety instrumented system technologies and the predictive nature of the basic control system.

