September 2010


G&G visibility for XHQ

Siemens’ Energy unit extends its XHQ Operations Intelligence ‘digital oilfield’ offering into the subsurface with the OpenSpirit development kit. Siemens Energy’s Dave Horn explains how.

We have long argued that the absence of the major equipment providers has undermined the credibility of the ‘digital oilfield’ effort. This may be about to change as Siemens has acquired a license to the OpenSpirit software development kit (SDK) and will be offering visibility of geology and geophysics data to users of its XHQ Operations Intelligence dashboard. Siemens Energy’s Dave Horn told Oil IT Journal, ‘The partnership with OpenSpirit will offer access to a wide range of data resources and represents a significant step on the path to a true digital oilfield.’ Horn explained that initially, Siemens will leverage the SDK to offer visibility of OpenSpirit-enabled geoscience data to activities such as root cause analysis, operations and maintenance, conducted from the XHQ dashboard. XHQ provides plant and process industries with aggregated operational and business data in real-time, offering configurable dashboards and KPIs for performance management and decision support. XHQ’s use in oil and gas began in the refinery, with flagship clients such as Chevron and Saudi Aramco. But the technology is moving upstream—notably with Aramco’s deployment of XHQ in its Enterprise Management Solution.

The deal will extend XHQ’s i-field data visualization capability, pulling in key well data to, for instance, tune a model for performance enhancements. Results could then be passed to another OpenSpirit-enabled application like Petrel for more modeling before updating the XHQ performance dashboard. Other use cases include field planning and facilities design. Access to geoscience data types will be provided either by launching a native application or by using one of OpenSpirit’s basic data viewers. Siemens is also talking with INT about integrating some of its G&G visualization technology. Siemens recognizes that some of its planned use cases will only be fully realized when OpenSpirit further develops a web services-based information bus. Indeed it may be a while before the OpenSpirit footprint extends to embrace engineering data from operations and maintenance.

The press release made great play of the role of Microsoft’s Upstream Reference Architecture (MURA) in the Siemens/OS deal. We put it to Horn that Siemens developers will be able to use the SDK without any input from MURA. He agreed but noted ‘Siemens is a long term Microsoft partner and key XHQ functionality is based on Microsoft products. We could do all of this without MURA but we anticipate that the MURA products and services will bring more functionality to the table.’ Finally, Siemens were as surprised as we were to learn of OS’ acquisition by Tibco (see below). More from www.oilit.com/links/1009_1.


Tibco bags OpenSpirit

Unexpected deal sees ownership move from Chevron, Paradigm, Shell and Schlumberger into hands of horizontal data integration specialist.

In a surprise move, Tibco Software has acquired upstream data integration specialist OpenSpirit from its shareholders, Chevron’s Technology Venture unit, Paradigm, Schlumberger and Kenda Capital (representing, inter alia, Shell and the Abu Dhabi Investment Authority). Terms of the deal were not disclosed. OpenSpirit evolved from a Shell/POSC (now Energistics) project to develop ‘business objects’ for the upstream. Today, the company provides integration technology for most major upstream data stores and applications to its 200 clients in 57 countries.

Tibco provides a real time data infrastructure to (mainly) the financial services industry but has dabbled in the upstream previously, notably with its 2003 Well Development Optimization package (Oil ITJ June 2003) and more recently with its May 2007 acquisition of the risk modeling toolset Spotfire and its DecisionSite offering. OpenSpirit’s CEO, Dan Piette, has joined Tibco to lead the development of the oil and gas business. More from www.tibco.com.


More (or less?) on MURA and on secretaries and social networking

Editor Neil McNaughton puzzles over the contribution Microsoft’s Upstream Reference Architecture made to the Siemens/OpenSpirit deal in this month’s lead. He then turns to social media and offers a few words of advice to would-be bloggers and LinkedIn groupies.

Pondering the Siemens/OpenSpirit announcement in this month’s lead, and passing, for the moment, on what Tibco’s acquisition means for the deal, for OpenSpirit and its clients, I was fascinated by the insight that this deal gives us into the Microsoft Upstream Reference Architecture—MURA. MURA’s contribution to the deal can be ascertained by solving a couple of equations as follows. First, we have something like: the outcome (‘O’) of the Siemens/OpenSpirit deal equals the sum of the contributions of the parties to the deal, i.e. Siemens (S), OpenSpirit (OS) and MURA (M). So we have...

O = S + OS + M.

But we also learn from Siemens that the same functionality would have been achieved without MURA. This gives us another equation

O = S + OS.

Now I have been staring at these two equations for a while, trying to remember my elementary algebra. I finally came to the conclusion that by a judicious bit of subtracting, we can solve for M to give ...
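
(For the record, the judicious subtraction runs: O - O = (S + OS + M) - (S + OS), whence 0 = M.)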

MURA = 0!

(The ‘!’ is an exclamation mark, not a factorial, by the way, although this does not make much difference to the math.)

~

A while back, when working for an employer of a rather established olde worlde nature, I was surprised when, following some behind-the-back murmurings and complaints, I was gently reprimanded by the CEO. My crime? Writing my own reports on the computer, rather than going through the proper channels of the secretarial pool. My reaction was to ignore what seemed to me to be complete nonsense. And insofar as the big picture goes, I was right. Secretarial pools are no more, and everybody now uses the computer to write just about anything. As far as the smaller picture goes, I was arguably wrong, as I was soon ‘let go’ by said organization for this and other similar misdemeanors.

Those were the days though. When you had a big chair behind your own desk and a couple of little ones to make your guests feel small. At the time, even a middle manager would have his (or, much less likely, her) own secretary who, when not typing out a memo, did other cool stuff like ‘managing’ a phone call. The idea was always, even if you were actually making the call, to have the other guy hanging around at the other end of the line, while you seemingly busied yourself with business before just managing to attend to the call. This exercise led to pitched battles between secretaries in the two camps as they made sure that their adversary was actually waiting on the line before putting their boss through.

Later, but in a similar vein, when an email came in, it was beneath the dignity (and probably beyond the capability) of a manager to reply. So an incoming mail from manager A’s secretary would be printed out, placed in an in-tray, examined by boss B who would probably get his secretary to reply by fax. Nowadays, everyone is in front of the computer all the time and such silliness has gone—or has it?

Actually, a similar phenomenon is taking place today as somewhat out-of-touch managers wrestle with the newfangled ‘social media.’ Now I have to admit that there is a lot of social media that I don’t really ‘get’ myself. I have been around too long in IT and it seems that a lot of it is not actually very new. I met a friend at a recent tradeshow who has become a Facebook aficionado and brandished her Blackberry at me saying, ‘see how I am kept informed about all this stuff. Our users can even see what the current state of our printer is and plan their usage accordingly.’ I peeked over her shoulder at the magical device and saw what looked to me like an RSS feed—something Oil IT Journal has offered for about five years already. But I digress…

There is one facet of social media that I believe I do understand—blogging. I even started a blog once. A short-lived thing, it stopped when it occurred to me that blogging, at least for me, was exactly the same as writing an editorial. As I had been doing this for several years before blogging was invented, I saw no reason to change and stayed with my present medium. Having blogged/editorialized for a while puts me in a position to offer some advice to those in a corporate or organizational environment struggling with the new technology.

So what makes a blog a good one? Two things: focus and disclosure. This is what your audience is looking for—whether you are a socialite telling all or a technologist unveiling the next i-Thing. Let’s turn this simple notion around and see how companies and organizations perform at the intersection of focus and disclosure...

Companies and orgs may or may not have focus and they may or may not want to disclose what it is. The blog is potentially a window into the program, business plan, emerging technology and other stuff of a sensitive nature. The modern CEO seeking to ‘leverage’ social media can no longer turn to his or her secretary. So instead, they engage public relations ‘boutiques’ offering social networking services. This means that blogs are filtered and diluted. Information passes from those who know through those who may or may not understand, and what often results is an insipid, inaccurate post. Dilution of the most ludicrous kind occurs when postings diverge completely from the blog’s focus. To give just one example, the virtual takeover of the SPE LinkedIn discussion group by a news outlet.

It is curious to note that the instigators of such blogs and LinkedIn groups are so enthused at having amassed a head count of a few hundred users. What they should be looking at is the alignment of social media with their goals. Is it focused? Is good stuff being disclosed? Has the blog been hijacked by an individual or a competitor?

Right, that’s that editorial finished—just got to get my secretary to type it up and post it to my blog...


Oil IT Journal interview—Tom Smith, GeoInsights

Founder of SMT and fledgling Geophysical Insights expounds on new pet topic—neural networks.

What have you been up to since selling Seismic Micro Technology (SMT)?

I stayed on for a few months to assist with the reorganization, I’m still on the SMT board. After that I began doing what I enjoy most—solving geophysical problems. The late Tury Taner of Rock Solid Images was inspirational and he helped me understand the theory and practice of neural net techniques—that was a high point in my professional career.

Geophysical Insights (GI) is separate from SMT?

Absolutely. GI is involved in geophysical research. We will be presenting a paper at next month’s SEG.

What are your other R&D areas of interest?

I can’t talk about this yet but we are working on challenges and on truly fundamental geophysical research questions. We are not taking the university/JIP route. Our clients are looking for a vehicle for research collaboration on such issues. We are also working with Sven Treitel on some pretty deep topics.

Is GI’s neural net technology pre or post stack or what?

It is essentially multi-attribute work. People can easily generate 20, 30 or 50 attribute volumes so for any time (or depth) you have a multiplicity of spatially distributed attribute values. Now I defy anyone to interpret more than 2-3 attributes at a time! You can do flicker type comparisons—but these remain a challenge to interpret. Interpreters like to leave no stone unturned—and need help from something like a neural net to do, say, spectral decomposition, with up to 20 frequency bands—and then to ‘ground truth’ the results against well logs. We like to think of seismics as a cylinder of data surrounding a well bore—maybe with 30 or so attributes. These need to be matched with another set of ‘attributes’ coming from borehole data—extending cross plots to many dimensions.

We have often wondered just how many truly independent (and meaningful) attributes can be computed from the limited number of field measurements in the seismic trace…

This is the ‘saliency’ issue in neural nets and in machine learning. How much redundancy do you need and what makes up truly independent attributes? One tool to investigate this is principal component analysis. This has been around for a while but has had limited application. It is easy to run the wheels off the trolley! We use an unsupervised neural net and look for natural clusters in the data. The answer is, it all depends: some attributes are significant some of the time. Elsewhere, in other geographies, they may not work so well.
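
By way of illustration, here is a minimal sketch in Python of the general approach described above: principal component analysis to weed out redundant attributes, then unsupervised clustering to find natural groupings. The data is synthetic and k-means stands in for the unsupervised neural net (typically a self-organizing map); this is a generic illustration, not Geophysical Insights’ code.

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.cluster import KMeans

    # Synthetic stand-in: 10,000 seismic samples, each carrying 20 attribute
    # values (amplitude, frequency bands, coherence and so on).
    rng = np.random.default_rng(0)
    attributes = rng.normal(size=(10000, 20))

    # Principal component analysis: how many attributes are truly independent?
    pca = PCA(n_components=0.95)  # keep components explaining 95% of variance
    reduced = pca.fit_transform(attributes)
    print(pca.n_components_, "components explain 95% of the variance")

    # Look for natural clusters in the reduced attribute space.
    labels = KMeans(n_clusters=8, n_init=10, random_state=0).fit_predict(reduced)
    # Each sample now carries a cluster label that can be mapped back onto the
    # seismic volume and 'ground truthed' against well log facies.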

Any flagship client/projects?

Not that we can talk about yet. But Tury was working on this for a decade—using neural networks for lithology identification.

Basically, neural networking boils down to data mining…

Exactly. Supervised neural nets have been used in optical character recognition and in bibliographic text mining through thousands of scientific articles—using key word counts and clustering in terms of relevance into topic maps. It is easy to see how neural nets can be used to extend attribute analysis across a company’s 3D data resource—using it on all 3D surveys to understand different areas of the world and build a knowledge base. Neural net techniques can also be run across a well log repository to check for a particular response and build company-wide wisdom. Note that there has been some reticence towards neural nets and artificial intelligence. But such techniques are not intended to ‘replace’ the interpreter. We just want to get past the drudgery of interpretation—rather like when we went beyond manual digitization to the workstation. Neural nets can also be used to find business trends in data. Wal-Mart is said to have a larger database than the CIA!
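
The keyword count and clustering approach is simple to sketch in Python (a generic illustration with made-up abstracts, not a Geophysical Insights product):

    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.cluster import KMeans

    # A handful of invented article abstracts.
    docs = [
        "neural network lithology classification from well logs",
        "spectral decomposition attributes for channel detection",
        "well log facies prediction with supervised learning",
        "frequency attributes and thin bed tuning analysis",
    ]

    # Key word counts per document ...
    counts = CountVectorizer(stop_words="english").fit_transform(docs)

    # ... clustered by relevance into topics.
    topics = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(counts)
    print(topics)  # e.g. well log papers vs. seismic attribute papers

More from www.geoinsights.com.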


Symantec 2010 Information Management Health Check

Survey of 1,680 execs shows good understanding but poor execution. 5-step action plan proposed.

Last month Symantec released its 2010 ‘Information Management Health Check’ survey of 1,680 senior IT and legal executives in 26 countries. The report includes three main findings and Symantec’s recommendations for a fix. While the study covers many industries, its findings will have echoes for those working in the technical data field.

The first finding was that there is a huge gap between information management goals and practice, between what enterprises realize they should do and what they actually do. 87% of respondents realize that there is a need for an information retention strategy, but only 46% have such a plan in place.

Finding number two is that the gap between goals and reality is driving common mistakes. Enterprises are ‘over-retaining’ information, keeping everything ‘just in case.’ 75% of backups have infinite retention or are on legal hold. Current estimates are that there might be as much as 38 petabytes of backup tape dedicated to retaining enterprise information forever in a format that is difficult to access and manage. Many companies are performing legal holds incorrectly, using their backup systems inappropriately instead of an archival system. While most companies prohibit employees from creating their own archives, 65% admit that end users routinely do so anyway.

One reason for the large amount of unnecessarily stored information is the fact that it is 1,500 times more expensive to review data than it is just to store it. But the burgeoning data volumes blindly backed up mean that backup windows are ‘bursting at the seams.’ Some ‘weekend’ backups take more than a single weekend. The situation for recovery from a backup is even worse as the time taken to restore the backup behemoths brings a disaster recovery program to its knees.

Symantec offers a simple 5-step action plan to fix information management. This involves the creation of a formal information retention plan. Stop using backup for archiving and legal holds—backup is for disaster recovery and should hold just a few weeks of data—the rest goes to archive. De-duplication further reduces the data storage footprint. And finally, delete data according to the plan. The survey, performed by Applied Research-West, is a free download from www.oilit.com/links/1009_2.


All the offshore Norway data in 35 seconds?

Bold Petrel ad claim contrasts with data loading benchmark from EMC.

The Petrel ad makes the bold claim of ‘loading all the offshore Norway data in 35 seconds,’ but some users report less stellar performance. A white paper from EMC provides some chapter and verse on data load from network attached storage (NAS), in particular, EMC’s Celerra hardware and the ‘Upstream Application Accelerator’ (UAA), a ‘branded’ version of the Celerra Multi Path File System (MPFS).

To cut to the chase, EMC’s tests with multiple high-end Dell workstations running against the Celerra/UAA showed data load times of ‘less than a fifth of the time needed for traditional NAS solutions using CIFS.’ The system ‘even’ beat local disk performance, albeit by a modest 12%.

Elapsed time to prefetch the 30 GB dataset dropped from 22.3 minutes to 3.6 minutes. Tests on 64-bit Vista (which, like Windows 7, includes SMB 2.0, an enhanced CIFS-type connection) improved prefetch time by 207% over the same ‘traditional NAS.’ Prefetch time for the 30 GB dataset dropped from 11.1 minutes to just 3.6 minutes.
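
A little arithmetic of our own, using the figures quoted: 30 GB in 3.6 minutes is a sustained rate of around 140 MB/s. Loading the 155 terabyte dataset cited below at that speed would take nearly two weeks; doing it in 35 seconds implies a sustained 4.4 TB/s.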

All of which begs the question, how can you load ‘all the Norway data in 35 seconds?’ We learned at this month’s ECIM that the Norwegian DISKOS data repository now holds some 155 terabytes of data... Download the whitepaper from www.oilit.com/links/1009_3.


Subsurface Applications Interoperability Review

New Digital Business JIP addresses subsurface challenge of poor inter-application data flows.

UK-based consultants New Digital Business have announced the Subsurface Applications Interoperability Review (SAIR), a joint industry project that sets out to address the ‘subsurface challenge’ of poor inter-application data flows that hinder interpretation. While most software has considerable data exchange functionality built-in, using it is fraught with problems such as the nitty-gritty of which data exchange format and version is appropriate for a given workflow. This leads to a situation that NDB describes as ‘data atrophy’ and sub-optimal ‘stop/start’ interpretation practices. Reducing the number of applications may help—but this has its limits as today’s subtle traps and fast moving exploration targets require specialized algorithms and tools. NDB believes that software vendors are not in a position to thoroughly test workflows involving third party products.

Enter SAIR, a secure, online knowledge base that contains test information on inter-application data exchange. SAIR highlights cross-application I/O deficiencies—particularly as component applications are upgraded with new versions. Users can log in and record their own experiences with data exchange or see how other members fared with similar tasks. A matrix of applications and data types shows claimed versus real-world exchange successes. SAIR subscribers currently include Premier Oil, Centrica and Nexen.
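
The underlying structure is straightforward; the sketch below (ours, in Python, with invented application names and findings, not SAIR’s actual schema) shows one way of representing the matrix:

    # One matrix cell per (source application, target application, data type).
    sair_matrix = {
        ("AppA 2010.1", "AppB r5", "fault polygons"): {
            "claimed": "supported",
            "observed": "attributes lost on export",
        },
        ("AppB r5", "AppA 2010.1", "deviation surveys"): {
            "claimed": "supported",
            "observed": "works as advertised",
        },
    }

    # Members look up how an exchange actually fared before committing
    # to a workflow.
    cell = sair_matrix[("AppA 2010.1", "AppB r5", "fault polygons")]
    print(cell["observed"])

More from www.oilit.com/links/1009_5.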


IPEGG leg-up for Elfen

Leeds University research unit develops link between geomechanical package and Roxar’s Tempest.

The University of Leeds, UK announces completion of its multidisciplinary Integrated Petroleum Engineering, Geomechanics and Geophysics (IPEGG) research project. IPEGG set out to ‘better integrate’ software and workflows for reservoir engineering. A noteworthy result of IPEGG was the integration of Emerson/Roxar’s Tempest reservoir fluid flow modeling suite with Rockfield Software’s Elfen geomechanical modeling tool.

Elfen is a 2D/3D numerical modeling package for multi purpose finite element and discrete element analysis computation. Elfen was developed in collaboration with the Institute of Numerical Methods in Engineering at the University of Wales Swansea.

Quentin Fisher, professor of petroleum engineering and IPEGG project lead, said, ‘We coupled Tempest to Elfen with a message passing interface. This approach makes for better integration and faster simulation workflows than depending on restart files to transfer data between the programs.’
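
The pattern is easy to visualize with a sketch in Python using mpi4py (the two-rank layout, field names and update rule are our illustrative assumptions, not Rockfield or Roxar code): a flow process and a geomechanics process advance in lockstep, swapping fields by message rather than via restart files.

    # Run with: mpiexec -n 2 python couple.py
    import numpy as np
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()
    ncells = 1000  # shared grid size (illustrative)

    for step in range(10):  # coupled timestep loop
        if rank == 0:  # stand-in for the flow simulator
            pressure = np.random.rand(ncells)        # ... solve flow here ...
            comm.Send(pressure, dest=1, tag=step)    # pressures to geomechanics
            porosity = np.empty(ncells)
            comm.Recv(porosity, source=1, tag=step)  # updated porosity back
        else:  # stand-in for the geomechanical model
            pressure = np.empty(ncells)
            comm.Recv(pressure, source=0, tag=step)
            porosity = 0.2 - 1e-3 * pressure         # ... solve stress here ...
            comm.Send(porosity, dest=0, tag=step)

More from www.roxar.com and www.rockfield.co.uk.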


Westheimer proposes spatial architecture maturity model

Oil IT Journal contributed white paper tracks progression from ‘obstructive’ to ‘managed.’

A whitepaper from the new upstream data management consultancy, Westheimer, proposes a spatial architecture maturity model (SAMM) to evaluate corporate spatial metadata management (MDM). Westheimer’s thesis is that proper spatial MDM gives end users confidence that spatial data on a map comes from a vetted repository—as opposed to a collection of unmanaged shape files, personal geodatabases, CAD files and un-referenced imagery. Repositories such as PPDM and Schlumberger’s Seabed can store geometry data in tables. But they lack robust methodologies to generate ‘true’ geometric objects. Enterprise-level master spatial repositories such as ESRI’s SDE and Oracle Spatial address cartographic metadata issues such as coordinate reference systems, versions and obscure legacy data formats. SAMM divides the maturity space into six levels, from Level 0, ‘obstructive,’ where counter-productive work practices are imposed by management, to Level V, where spatial data is managed and QC’d on an ongoing basis.

The whitepaper analyses corporate GIS deployments—notably in BP—and describes the ‘compelling’ use of GIS in the National Oceanic and Atmospheric Administration’s (NOAA) GeoPlatform website in response to the Deepwater Horizon oil spill. Public interest in oil and gas spatial information is at an all-time high due to the recently raised profile of the industry as drilling continues to move into new onshore shale plays and deepwater. More from www.oilit.com/links/1009_4.


Software, hardware short takes

IDS, OSIsoft, Aveva, Caesar Systems, ESRI, FaultSeal, ffA, Geomechanics International, Golden Software, Iconics, Kalido, Kappa Engineering, LianDi Clean Technology, Meyer, Midland Valley, NVIDIA, CAS, Visual Sciences Group, Quorum, Roxar, P2 Energy Solutions.

IDS is now using Energistics’ WITSML drilling data standard to auto-populate daily reporting in its DataNet2 package—www.idsdatanet.com.

The 2010 edition of OSIsoft’s PI System combines PI’s time-series data archival with asset metadata management in SQL Server and connectivity to Microsoft Office, SQL Server, SharePoint and SAP systems—www.osisoft.com.

The 4.5 release of Aveva Net Portal automatically identifies and dynamically indexes relationships between engineering objects and associated documents. The new GUI can be configured as role-specific dashboards to display information on a ‘need-to-know’ basis—www.aveva.com.

Caesar Systems has upgraded its PetroVR decision support package. Version 8.2 is faster and includes enhancements to its Monte Carlo-based risk management such as a convergence graph to track results across iterations. Other new functions cover EOR and unconventional gas development—www.caesarsystems.com.

At the FOSS4G conference in Barcelona this month, ESRI released the Open GeoServices REST specification, a standard way for browsers and other web clients to interact with geographic information systems (GIS). Developers can expose a GeoServices API from ArcGIS Server or non-Esri servers or geo-processors. The JSON-based spec is usable in many client-side development environments including JavaScript, Flex, Silverlight, iOS, and Android—www.esri.com/opengeoservices.

V3.0 of FaultSeal’s FaultRisk faulted prospect volumetrics calculator now runs on both Windows XP and Mac OS-X. The package is available in workstation, enterprise and hosted versions. FaultRisk generates Allan Maps, calculates Leak Points and integrates this information with depth area plots to produce distributions of trapped hydrocarbons—www.faultseal.com.

New releases of Foster Findlay Associates’ SVI Pro and SEA 3D Pro (both at V2010.2) include an optional ‘Link’ module for OpenSpirit connectivity. Link for OpenSpirit lets users create new projects, transfer volumes to and from third party data stores, and load well bores, well velocity and well picks into their SVI Pro / SEA 3D Pro projects—www.ffa.co.uk.

GeoMechanics International (GMI) has released GMI PressCheck for pore pressure and fracture gradient prediction. PressCheck includes tools for import, manipulation and combination of log data along with filtering and log calculations. New features include centroid and buoyancy analysis—www.geomi.com.

Golden Software has announced Strater 2, an entry-level package for well log plotting. Strater comes at the ‘unbeatable’ price of $449—www.goldensoftware.com.

Iconics has announced Genesis64 V10.5, a new version of its 64-bit HMI/SCADA integration system. Genesis leverages the latest OPC Unified Architecture (OPC-UA) and 64-bit hardware. IT professionals can now integrate real-time SCADA information into a common, web-enabled dashboard. Compatible with Windows 7, the new release offers Silverlight 3.0 support, ‘multi-touch’ connectivity, cover flow, carousel navigation and custom ribbons. Genesis64 is integrated with Microsoft Bing Maps—www.iconics.com.

Kalido has launched an online data governance maturity benchmarking service, an interactive tool to determine the current state of corporate data governance and offer advice on advancing to the ‘next level.’ Take the test on www.oilit.com/links/1009_6.

The 1.1 release of Leica Geosystems’ Zeno Office, a GPS/GIS suite for mobile workers, includes an OEM version of ESRI ArcPad 8 and a ‘one-click’ automated workflow between field and office—www.oilit.com/links/1009_7.

Kappa Engineering’s Emeraude V2.60 has just rolled out with new processing for multi probe production log analysis, new optimization algorithms, a new temperature model and a steam injection design option—www.kappaeng.com.

LianDi Clean Technology has received software copyright certificates (Chinese patents) for its stock replenishment, resource scheduling and distribution planning optimizers. The certificates provide 50 years copyright protection—www.china-liandi.com.

Meyer & Associates has released the 2010 edition of its hydraulic fracturing software suite with Windows 7 compatibility and support for analyzing the production and economic characteristics of horizontal wells. The company has also embarked on a multi-year project to open and document its file formats. Meyer was acquired by Baker Hughes this month—www.mfrac.com.

Midland Valley has developed a link to Landmark R5000, linking Move 2011 with OpenWorks—www.mve.com.

Nvidia’s 3D Vision Pro solution leverages its Fermi architecture and Quadro GPUs to speed computational simulation eight-fold. OEMs using the new technology include Dell, HP, Lenovo, and NextComputing. The top of the range, 36 megapixel Quadro Plex 7000 carries a $14,500 price tag.

Lodz, Poland headquartered CAS has released a free OPC UA Viewer for integrators and developers to connect to OPC UA servers, read data and browse the model structure. CAS also provides an online OPC UA handbook—www.cas.eu and www.oilit.com/links/1009_8.

The 8.1 release of Visual Sciences Group’s Open Inventor includes support for OpenCL-based computation and geo-referenced objects using the X3D GeoVRML specifications. The system is a native implementation for Linux and Windows and performance has been optimized for Nvidia’s Quadro Plex—www.vsg3d.com.

Quorum Business Solutions has released a gas plant accounting and allocation forecasting module, Quorum Forecasting, for prediction of current and future production month processing results using volume extrapolation algorithms and margin analysis tools— www.qbsol.com.

Roxar’s RMS 2010.1 now includes seismic volume visualization and new quality control for property models—www.roxar.com.

The new Sitrans FST020 clamp-on flowmeter from Siemens promises 1-2% accuracy and ease of deployment—www.siemens.com/flow.

P2 Energy Solutions’ Tobin Enterprise Land 3.2 supports complex acreage scenarios with overlapping leases of varying depths, such as are encountered in US shale plays—www.p2es.com.


ECIM E&P Data Management 2010—Haugesund, Norway

Strong turnout for the 2010 edition of Norway’s Expert Center for Information Management E&P information management conference. Highlights include Shell and Statoil’s migration to Landmark’s R5000, PetroChina’s vision for the digital oilfield, Chris Bradley on BP’s enterprise architecture, Nexen on GIS-based data management and ExxonMobil on GIS for geologists.

Registration for the 2010 edition of Norway’s Expert Center for Information Management (ECIM) E&P information management conference in Haugesund this month topped out at over 330 making ECIM the largest upstream data conference in Europe. International (non-Norwegian) participants made up 45% of the attendees.

Focus was on Statoil’s migration from its legacy OpenWorks/GeoFrame data environment to Halliburton/Landmark’s R5000 data infrastructure. Dynamic Consulting’s Egil Helland outlined the move to ‘one platform to manage it all.’ Since the 2007 Statoil Hydro merger the number of applications has been halved. But until last year, Statoil was still running Halliburton and Schlumberger’s data infrastructures—a state of affairs that was deemed ‘costly, complex and inefficient.’ Moving to R5000 will save Statoil around NOK 30 million ($5 million) per year through lower maintenance, support and data loading costs. Various reorganization scenarios were envisaged such as a (costly) storage upgrade, demobilizing data to ‘non G&G’ storage at half the cost, and data deletion. To date, eight projects have been upgraded and some 600 more are in various stages of decision taking and preparation. Cross-framework migration is tricky as data models really are different. Work done for one asset may not be applicable to another where things are done differently.

In a joint presentation, Catherine Gunnesdal and Wim Ahuis offered more details on, respectively, Statoil’s and Shell’s R5000 implementations. Statoil reports that while the R5000 database works ‘satisfactorily,’ PowerView is currently neither stable nor performant. SeisWorks is poorly integrated with R5000 and it is ‘time to phase it out.’ The move to R5000 requires major changes to exploration workflows. Shell reported limited use of Landmark applications, but the move to R5000 was an opportunity for a major data cleanup exercise.

Speaking through an interpreter, Xun Ma outlined PetroChina’s ‘multi year, multi million’ upstream IT and data management revamp. This involved a move from scattered, siloed systems to centralized systems by domain, sharing a common earth model. Following a successful pilot at the Daqing oilfield business unit, Landmark’s Information Management & Infrastructure consultants rolled out an enterprise-wide information management system at 14 sites throughout China. The solution included Landmark’s Engineering Data model, extended with input from PPDM to include geoscience and production data. To date, 210 TB of seismics and 230,000 wells have been loaded to the system. A PowerExplorer front end gives access to the whole of China’s basin-wide geological and reservoir data set for modeling and future exploration. PetroChina’s vision is for more automated data capture to a ‘digital oilfield, digital basin and a digital PetroChina.’

Chris Bradley (IPL and author of ‘Data Modeling for the Business’) described an attempt to deploy a consistent, enterprise information architecture (EIA) at BP, where a ‘vertical silo-based approach dominates’ and ‘the accountants rule.’ The application environment consisted of many different SAP deployments, some 5,000 ‘other’ applications and a ‘huge’ Microsoft Office SharePoint environment. The move to a service architecture mandated XML model management (with ER/Studio) and a set of consistent definitions in the EIA. These were based on PPDM and used a ‘data virtualization’ approach. Bradley bemoaned the fact that local ‘empowerment’ meant that many don’t follow corporate guidelines. Some see data modeling in a bad light—so subterfuge was required, with ‘data models’ called a ‘business glossary,’ ‘data dictionary,’ etc. A data wiki proved popular with folks keen to provide feedback on definitions.

Bradley railed against the evils of data ‘mine-ing’ and data ‘ours-ing,’ advocating a move away from the hoarding mentality. Other key tools in the EIA effort include DataFlux (quality), Composite Software (integration), Business Objects and Kalido. Prior to 2006, anarchy ruled—anyone could do anything, projects created their own models and definitions. Today BP has a global ER/Studio license with 300 users and a models repository available through SharePoint. BP is engaged in a quality ‘push’; there is no more ‘make a new model for every project.’ But there is a ‘constant battle with the bean counters.’

Geographical information systems (GIS) continue to interest the ECIM community—with a full track devoted to GIS, ending with a new GIS User Forum and Workshop. Andrea Le Pard (Nexen) advocated the use of GIS as a tool for corporate data management with ESRI’s ArcGIS as a ‘common industry-standard platform.’ Data suppliers, vendors and government all supply ‘GIS-ready’ data. But this is incompatible with Petrel, Landmark and Petrosys, which ‘has its own non-ESRI GIS.’ Nexen uses OpenSpirit to bridge the GIS gap—particularly for applications, like Petrel, that are not spatially aware. Le Pard made a plea for Petrosys to ‘get together with OpenSpirit.’ On the downside, Le Pard noted that ‘GIS is 80% about data management and only 20% about playing with the data.’

Grahame Blakey and Bernie South (ExxonMobil) gave an insightful presentation on the evolution of GIS over the past couple of decades (in 1989, Exxon was ESRI’s 26th client!). By way of a caveat to GIS enthusiasts, Blakey put things into context with the fact that ‘true enterprise integration is SAP—50% of ExxonMobil’s IT support is around SAP.’ Geologist South described geology as a ‘4D problem, not just maps but time.’ The challenge is to synthesize all regional geological disciplines and here, GIS is a big hitter. Exxon has managed to standardize on 13 INFO tables used for all ARC-based geoscience data sets. Windows migration means a move ‘from Perl and vi to Python, Silverlight and .NET.’ But the new programming tools have driven a wedge between G&G and the programmer. South found life easier back in the day of AML. Operating systems were more stable than today’s Windows systems: ‘Unix was a rock, a chunk of granite!’ Performance in Windows is another headache. Database joins and relates are less robust today. There are benefits in the modern GIS in terms of usability and a reduced requirement for hands-on data management. Visualization is hugely improved but Exxon warns of the danger of form over substance. Today it is too easy to ‘swallow’ bad data. More from www.ecim.no.

This article is an abstract of a longer Technology Report produced by Oil IT Journal’s publisher, The Data Room. More from www.oilit.com/tech and tw@oilit.com.


ESRI 2010 International User Conference, San Diego

Users hear from Qatar Petroleum, Petrobras, Exprodat and on Data Interoperability, 3D and Python.

Qatar Petroleum’s Rob Ross showed the extent to which the ArcSDE Geodatabase can be used in a geological context. GIS can capture a vast range of data types—from soil samples to geophysical horizons, digital terrain models and satellite imagery. The quality and source of input datasets can be captured as metadata and geoprocessing techniques can be applied to the data. Ross showed how surface geological studies revealed a two meter sea-level highstand in the Holocene. GIS was used to visualize the ancient sea. In contrast, a 100 meter lowstand during the Ice Age dried up the Persian Gulf completely. Such techniques have been applied to autoclassification of satellite imagery for geotechnical studies including mapping of near-surface karsts. Ross believes that the key to GIS is in the judicious exploitation of metadata.

Sidney Pereira unveiled Petrobras’ pipeline Land Property Management System (LPMS), an ArcGIS Server application running on top of an Oracle database. The LPMS addresses the appraisal, negotiation, legal and accounting processes. Development of the system benefitted from close cooperation with business users. Understanding and debugging workflows before starting to code was also key. Systems should assume minimal GIS knowhow from end users. Petrobras is now working on a Google Earth Enterprise front end for the LPMS integrating high resolution imagery, search and 3D visualization.

Paola Peroni showed off Exprodat’s exploratory spatial data analysis and uncertainty workflows around ESRI Geostatistical Analyst 10 (GA10). Much geoscience activity revolves around deriving a 2D surface from sparse data. The resulting uncertainty may not be effectively captured and carried through the workflow—with results of uncertain quality. Using test data (the Johansen data set from the Norwegian SINTEF CO2 Sequestration program—www.oilit.com/links/1009_9) Peroni demonstrated that many common assumptions about data (stationarity, normal distributions) are not true. The investigation highlighted areas of high uncertainty and used ‘stochastic concepts’ to manage uncertainty throughout the modeling process. GA10 ‘compares favorably with established E&P mapping packages’ although this was somewhat at odds with another of Peroni’s findings—that GA has a limited capacity for handling faults.

Larry Phillips (San Antonio Water Systems) demonstrated the ArcGIS Data Interoperability (DI) module, pulling data from CAD (Microstation), land (Access) and accounting systems. DI provides support for hundreds of data formats, and ETL* functionality. A canvas lets users graphically assemble data workflows between sources and applications with scripting for transformations and ad-hoc conversions. Phillips offered some compelling metrics as to the merits of bulk transformations of GIS data sets with DI as opposed to manual changes.

ArcGIS 10 novelties announced at the show include the embedding of time in GIS data for time-lapse visualization and analysis. ArcGIS is now also a ‘true’ 3D GIS, offering 3D data models, editing, analysis, and visualization on the local machine. Users can now do virtually everything they do in a 2D environment in a 3D environment. Python scripting was highlighted for automating common tasks and analyses. Python lets users combine ArcGIS functionality with other scientific programming to extend their solutions.
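
By way of example (our sketch, with a hypothetical geodatabase and dataset names, not demonstration code from the show), a repetitive geoprocessing chore reduces to a few lines of Python with the arcpy site package:

    # ArcGIS 10 exposes its geoprocessing tools through the arcpy site package.
    import arcpy

    arcpy.env.workspace = "C:/data/acreage.gdb"  # hypothetical geodatabase

    # Buffer every feature class in the workspace - a chore through the GUI,
    # three lines in Python.
    for fc in arcpy.ListFeatureClasses():
        arcpy.Buffer_analysis(fc, fc + "_500m", "500 Meters")

More from www.esri.com.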

* Extract, transform and load.


Major web services development sees GIS embedded in SAP

NetWeaver SOA and Flex-based GUI underpins Saudi Aramco’s land management solution.

Speaking at the 2010 ESRI User Conference in San Diego last month, Mostafa Abou-Ghanem demonstrated how Saudi Aramco is supporting its enterprise land management effort by integrating ESRI GIS with SAP ERP tools and EMC’s Documentum document management system. Aramco’s land management solution (LMS) automates some 17 business processes. It is worth noting that when Aramco talks about ‘land management,’ it takes a very broad view of its activity, where a common operating framework covers oil production, refining, transportation and export.

Most business data has a geographic location component that can be viewed on a map. This allows for understanding and interpretation of data in ways that are not possible through a spreadsheet or table. Co-visualization of data from GIS and ERP systems helps Aramco make informed decisions and increase efficiency. Integration of software such as ESRI and SAP has been made possible with the advent of the service-oriented architecture (SOA).

Online GIS viewers are integrated with SAP to let cartographers, supervisors, and managers analyze and review land use requests from the public or private sectors. Web viewers offer high resolution satellite imagery, allowing for ‘proactive’ detection of illegal utilization of land properties reserved for oil exploration. The combined functionality of the system goes beyond mapping, providing true positional awareness and support for analysis across multiple sources of information.

The LMS integrates five sub components, SAP Case Management, SAP Real-Estate, Documentum, ESRI ArcGIS Desktop and ESRI ArcGIS Server web viewer application. SAP Process Integration (PI), a NetWeaver component, was used as a services broker to pass information between the five systems. SAP ABAP programming was used to embed a GIS mapping application inside the SAP GUI and a SAP logon ticket was used for single-sign-on. ArcGIS geo-processing and the Flex API were used in Adobe Flash-based applications to perform feature creation online from the SAP UI without the need to switch to the ArcGIS Editor and also to edit maps during an SAP transaction.

For technophiles, Aramco’s infrastructure included ArcGIS Server 9.3.1 Flex APIs and the Flex 3 framework on the front end, Java web application technology on the back end and ArcSDE 9.3 with Oracle RDBMS for the geodatabase. ArcGIS Server 9.3.1 geoprocessing services and the REST API were used for spatial analysis and data processing. BlazeDS was used for server-side Java remoting. Aramco reports that, ‘All systems communicated perfectly. SAP-PI supported message persistence, ensuring no data loss when a sub-system went offline.’ Abou-Ghanem concluded by observing that, ‘embedding GIS functionality inside operational applications and processes that drive the business makes GIS more operational and easier to use.’
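
The REST API in question is plain HTTP plus JSON. A minimal query (our sketch in Python; the service URL, layer and field names are hypothetical) might look like this:

    import json
    from urllib.parse import urlencode
    from urllib.request import urlopen

    # Hypothetical land parcel layer published by ArcGIS Server.
    url = "https://gis.example.com/arcgis/rest/services/LMS/Parcels/MapServer/0/query"
    params = urlencode({
        "where": "STATUS = 'RESERVED'",  # parcels reserved for exploration
        "outFields": "PARCEL_ID,AREA",
        "f": "json",                     # ask the REST API for JSON output
    })

    with urlopen(url + "?" + params) as resp:
        features = json.load(resp)["features"]
    print(len(features), "reserved parcels")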


Folks, facts, orgs ...

ARM Oil and Gas, Atos Origin, GITA, AVEVA, Blue Marble, Open Geospatial Consortium, CGGVeritas, Crown Minerals, CSC, CygNet Software, EMC, Enventure, Epsis, Expro, GE Oil and Gas, Appro, Midland Valley, Ikon, Palantir, Petris, Quorum, RPS, RigNet, Rose & Associates, Senergy, more ...

ARM Oil & Gas Solutions has opened a new office in Pittsburgh, headed up by Gary Stiegel, and will shortly be opening an office in Scranton/Wilkes-Barre led by Andrew Strassner.

Atos Origin has set up offices in Egypt and the Gulf Cooperation Council countries and appointed Samir El Awadi CEO for the region.

Robert Austin is now president of the Geospatial Information & Technology Association (GITA).

Trond Straume heads-up AVEVA’s new Operations Integrity Management center in Stavanger.

Victor Minor, CTO of Blue Marble Geographics, has been appointed chair of the Coordinate Reference System Working Group of the Open Geospatial Consortium.

Denis Ranque is to sit on the CGGVeritas Board of Directors on behalf of the French Government’s Fonds Strategique d’Investissement.

Kevin Rolens, formerly of British Gas Group, Anadarko and Amoco, has joined Crown Minerals as new Petroleum Manager.

CSC has named Mark Rasch director of Cybersecurity and Privacy Consulting. Rasch was previously with the US Department of Justice, investigating cyber and high-technology crime.

CygNet Software has appointed Kevin Rowley VP of engineering. He was formerly with Invensys.

EMC Corp has appointed Jeetu Patel as CTO of the company’s Information Intelligence Group. Patel is a former partner of Doculabs.

Enventure Global Technology has appointed David Crowley President and CEO. Previously of Precision Drilling Oilfield Services and Schlumberger, Crowley succeeds Ray Ballantyne, who is retiring.

Todd Clark has joined Epsis as US Sales Manager, working out of the Houston, Texas office. He hails from Polycom US.

Former Schlumberger executive Charles Woodburn has been appointed CEO of Expro.

John Lammas is VP and Engineering and Technology Leader at GE Oil & Gas.

Ronald D. Cayon has been appointed interim CFO of Geokinetics replacing Scott McCurdy who has resigned.

Greg Hess has joined Appro International, as a Business Development and Sales Professional.

Marcos Gallotti heads-up Ikon Science’s new Brazilian joint venture with Geonunes.

Josh Strasner has been appointed CEO for Logica’s operations in North America. Strasner was previously with EDS and BearingPoint.

Ivan Guerra has joined Midland Valley as structural geologist, and Joanne McMenemy is joining the marketing department.

Paul Ravesies has joined Northern Offshore as Senior VP, business development. He was previously with Pride International.

Robert Minson has been appointed by Palantir Solutions to oversee software implementation and prepare for Palantir’s expansion in Australia.

Petris Technology has appointed Jerry Martin as VP Western Hemisphere, and recruited Greg Palmer and Stephan Dumothier as product managers.

Scott Leeds, President and co-founder of Quorum Business Solutions, has assumed the title of CEO, succeeding co-founder Paul Weidman, who will remain Chairman. Chief Commercial Officer Perry Turbes has assumed the title of COO.

RigNet has opened a service center in Dickinson, North Dakota.

Former BP VP of Geoscience and Exploration, Peter Carragher has joined Rose & Associates as Senior Associate in Houston.

Graeme Simpson heads-up RPS’ new Abu Dhabi office.

Øystein Roti is MD of Senergy Norway.

Richie Miller is president of Spectrum Geo. He was previously with CGGVeritas.

Roger Peterson has joined TTI Exploration as VP Human Resources, and David Jones as CIO.


Done Deals

Baker Hughes, Meyer, Managed Pressure, NGP Energy, Nabors, Superior Well Services, IHS, Access Intelligence, Veronis, Quorum, Carlyle Group, Riverstone, RPS, Boyd Exploration Consultants, Kongsberg, Odfjell, Expro, Production Testers, Fugro, ERT, Seawell, Allis-Chalmers, Dice, Rigzone.

Baker Hughes has acquired Meyer & Associates, developer of the MFrac hydraulic fracturing tool.

Managed Pressure Operations International has announced a growth capital investment from NGP Energy Technology Partners.

Nabors Industries has completed its tender offer for Superior Well Services. The acquisition is to be completed as a ‘short-form merger.’

IHS has acquired assets from the chemical and energy portfolio of Access Intelligence LLC, a business-to-business information company owned by private equity investment firm Veronis Suhler Stevenson.

Quorum Business Solutions has received a ‘significant’ capital investment from The Carlyle Group and energy-focused private equity firm Riverstone Holdings. Financial terms were not disclosed.

RPS has acquired Boyd Exploration Consultants, an oil and gas and mining consultancy, for a maximum consideration of C$13.9 million (£8.5 million).

Kongsberg has acquired Odfjell Consulting which is now a wholly owned subsidiary of Kongsberg Oil & Gas Technologies.

Expro has completed the acquisition of Production Testers International.

Fugro has acquired Edinburgh-based consultants ERT (Scotland). The company will become part of Fugro GeoConsulting’s integrated earth science consultancy.

Seawell is to acquire Allis-Chalmers Energy in a $980 million transaction.

Dice Holdings has purchased Rigzone for an initial cash consideration of $39 million with $16 million to come if operating and financial goals are met. The Jordan, Edmiston Group acted as financial advisor to RigZone.


ISAPP phase 2 mooted at SPE Miri applied technology workshop

TNO’s computer-aided history matching JIP leverages JOA Jewel Suite and CMG’s flow modeler.

Speaking at the recent SPE Applied Technology Workshop in Miri, Malaysia, researchers from the Netherlands-based TNO organization provided an update on TNO’s Integrated System Approach to Petroleum Production (ISAPP), with a focus on improving production from channelized reservoirs. TNO presented an integrated workflow for forecasting production from reservoirs with complex geometries and inter-channel connectivity.

The preferred history matching technique is an ensemble Kalman filter (EnKF) that has been integrated with the Impala multipoint statistics property modeling functionality of JOA’s JewelSuite to simulate multiple geological realizations. These are then evaluated for flow performance using CMG’s simulation tools.
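
The EnKF analysis step at the heart of such workflows is compact. The following sketch in Python shows the textbook update with perturbed observations (a generic formulation, not TNO’s implementation):

    import numpy as np

    def enkf_update(X, H, d, R):
        """X: n x N ensemble of model states (N realizations),
        H: m x n observation operator, d: m observations, R: m x m error covariance."""
        n, N = X.shape
        A = X - X.mean(axis=1, keepdims=True)         # ensemble anomalies
        C = A @ A.T / (N - 1)                         # sample covariance
        K = C @ H.T @ np.linalg.inv(H @ C @ H.T + R)  # Kalman gain
        # Perturb the observations so the updated ensemble keeps its spread.
        D = d[:, None] + np.random.multivariate_normal(np.zeros(len(d)), R, size=N).T
        return X + K @ (D - H @ X)                    # history-matched ensemble

    # Toy usage: three state variables, ten realizations, one measurement.
    X = np.random.rand(3, 10)
    H = np.array([[1.0, 0.0, 0.0]])
    X_updated = enkf_update(X, H, d=np.array([0.5]), R=0.01 * np.eye(1))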

The approach is said to be amenable to reservoir control in the light of increasing information streaming from the digital oilfield and intelligent wells. The resulting ensemble of history-matched models is then used for production forecasts and uncertainty analysis. EnKF came out of TNO’s program of fundamental and exploratory research into closed-loop concepts and computer-assisted methods. The first ISAPP program concluded in 2009. TNO is now inviting interested companies to participate in phase 2 of the research effort. The (noble) goal is to increase oilfield recovery by ‘10% or more’ by bringing ‘improved methods and concepts into the reservoir management workflow.’ More from www.isapp2.com.


Knowledge Ops’ Total Asset Visibility

Rapid Response Institute spinout addresses new safety regulations post Deepwater Horizon.

Houston-based Knowledge Ops has announced a new ‘Total Asset Visibility’ solution targeting emerging offshore drilling operational safety requirements and regulatory compliance. TAV addresses issues raised after the Deepwater Horizon blowout and positions drilling companies for compliance with forthcoming regulations such as the 2010 Blowout Prevention Act. TAV provides certifying decision makers with information covering all aspects of the drilling process, including equipment and response criteria. It also helps regain control over the ‘myriad’ spreadsheets used in field operations and certifies that documentation and safety standards are met.

Knowledge Ops expects the legislation, which is modeled on the 2002 Sarbanes Oxley Act, to be passed into law real soon now. CEO Scott Shemwell said, ‘Operators and drilling company partners are facing new challenges, not just in the Gulf of Mexico, but worldwide. Demands for operational excellence come from local communities, regulators and the marketplace.’ TAV also acts as a platform for disaster recovery. Knowledge Ops was formed as a spinout from Monmouth University’s Rapid Response Institute. More from www.knowledgeops.com.


ERP systems and A&M—towards IT ‘nirvana’

Deloitte white paper investigates ERP options during oil and gas mergers and acquisitions.

A white paper* from consultants Deloitte looks at the aftermath of oil and gas acquisitions and mergers from the viewpoint of the CIO. Deloitte’s thesis is that to create a ‘high-functioning’ organization from two or more disparate businesses, first you have to adopt a single enterprise resource planning (ERP) solution. This can be achieved either by starting from scratch and transforming business processes during an ERP consolidation, or by adopting the business processes and platform of one of the pre-A&M companies. Deloitte advocates the latter, ‘integrate, then transform,’ approach as allowing post-merger companies to begin working off the same page much more quickly than would be the case with the ‘transform, then integrate’ approach.

The study describes a state of IT ‘nirvana’ enabling executives from the separate organizations to ‘work rapidly and harmoniously to […] choose the best possible business processes.’ Companies ‘work tirelessly to transform their businesses into a single, high-functioning organization partially enabled by a single, effective ERP solution.’ Such ‘nirvana’ may or may not be achievable given real world constraints of sub-optimal processes and budget limitations. Time constraints and the complexity of the oil and gas business may make the goal of an effective solution elusive. Technologies evolve and ‘ideal’ solutions mutate into ‘fix-it-quick’ remedies such as offline spreadsheet analysis of financial reporting and customer sales forecasts.

Rapid integration of IT systems is crucial to the creation of a single corporate culture out of two merging companies. If management chooses to delay this process by implementing an ‘optimal’ system, companies run the risk of going through the culture clash and expense of merging all over again when the new system rolls out! Deloitte claims to offer a ‘comprehensive methodology’ designed to help companies put merger integration initiatives on the fast track. Standard activities, milestones and tools have been developed to help companies at such critical times.

Comment—The white paper is more sales pitch than analysis, but as we heard recently, over 50% of Exxon’s IT spend is SAP related. ERP really is the elephant in the room—in both A&M and in operations too. More from www.deloitte.com.

* www.oilit.com/links/1009_10.


Sales, contracts, partnerships and deployments

IceWeb, Cortex, CygNet Software, Emerson Process Management, Kongsberg Maritime, Exprodat, Paradigm, GE Oil & Gas, Al Shaheen, IBM, Object Reservoir, RigNet, Stratos Global, Theta Oilfield Services, eLynx Technologies, Venture Information Management, WellPoint Systems, Wood Group.

IceWeb has provided the US authorities with an emergency geospatial data storage solution to help the Deepwater Horizon recovery team effort—www.IceWEB.com.

Apache Canada has ‘gone live’ on the Cortex Trading Partner Network which now services over 3,000 companies. Apache’s US unit will follow suit later in the year—www.cortex.com.

Future releases of CygNet Software’s CygNet Gas Measurement Repository will include support for Flow-Cal Common File Exchange format for electronic flow measurement data—www.CygNetscada.com.

The University of Texas at Austin-operated Separations Research Program pilot plant has been upgraded with Emerson Process Management’s ‘PlantWeb’ digital automation technology—www.emerson.com.

Statoil has selected Emerson Process Management as one of three preferred automation and safety systems suppliers in a five-year frame agreement.

Kongsberg Maritime also received a five-year frame agreement covering safety and automation systems (SAS) for future Statoil green field projects both on and offshore—www.kongsberg.com.

Epsis has signed reseller agreements with FotoPhono and AVC Media Enterprises for sales of the Epsis TeamBox in the Norwegian and UK markets, respectively—www.epsis.no.

Premier Oil has purchased Exprodat’s Team-GIS KBridge and Team-GIS Directory extensions to ESRI’s ArcGIS Desktop. The software provides browsing and bi-directional data transfer between SMT’s Kingdom Suite and ESRI’s ArcMap—www.exprodat.co.uk.

Gazprom has signed a multi-year software licensing deal with Paradigm covering seismic processing, interpretation, reservoir characterization and modeling. Paradigm will also provide training to Gazprom users—www.pdgm.com.

PII Pipeline Solutions, a GE Oil & Gas and Al Shaheen joint venture, is to supply GE’s PipeView Integrity pipeline data monitoring and management package to DBNGP of Perth, Australia, operator of the 1,400-kilometer Dampier-to-Bunbury pipeline—www.ge.com/oilandgas.

Sunoco has contracted with IBM for a wide range of managed business process services and application support services. IBM’s global operations centers will provide application enhancement, maintenance, finance and indirect procurement services, leveraging IBM’s experience with ‘hundreds of client engagements including many with oil and gas companies’—www.ibm.com/services.

Object Reservoir has started work on a ‘collaborative exploitation project’ investigation of the NE Pennsylvania Marcellus shale, leveraging its ‘Resolve’ finite element modeling and shale knowledge base. Participants include East Resources, EOG Resources, EXCO Resources (with JV partner BG Group), and Ultra Resources—www.objectreservoir.com.

Upstream communications solutions provider RigNet has signed an agreement with Stratos Global to become an authorized distributor of Inmarsat and Iridium mobile satellite services including Fleet Broadband and BGAN, and Iridium’s 9555 handheld phones and OpenPort Global IP data service—www.rignet.com.

Theta Oilfield Services and eLynx Technologies have signed an agreement under which eLynx will integrate Theta’s rod pump optimization and SCADA software, XSPOC, with eLynx’s web-based SCADA application, SCADALynx—www.elynx.com.

Venture Information Management has been awarded a contract by CDA (Common Data Access Ltd) to conduct a study into the possibilities of a collaborative approach to managing Production Data across the Oil and Gas UK community—www.ventureim.com.

WellPoint Systems has been chosen as the provider of back-office IT solutions for Midland, Texas-based Legacy Reserves, LP—www.wellpointsystems.com.

The UK’s Oil Spill Prevention and Response Advisory Group (OSPRAG) has awarded a contract to assess subsea capping and containment options for the UK continental shelf (UKCS) to Wood Group Kenny—www.woodgroup.com.


Standards Stuff

Statoil’s jWITSML, Setiri Group’s nWITSML, Petrolink, API, PODS, Energistics, PPDM.

Statoil has released ‘<JWitsml/>,’ a Java WITSML client, as open source under the Apache 2.0 license, with support for WITSML versions 1.2, 1.3 and 1.4—download the SDK from www.jwitsml.org.
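
For the technically minded, here is a minimal sketch of the kind of client code such an SDK enables: connect to a WITSML store and walk its wells and wellbores. The class and method names below (WitsmlServer, getWells and so on) are illustrative assumptions made for this sketch, not necessarily the actual jWitsml API; see www.jwitsml.org for the real interfaces.

    // Illustrative Java only. Type and method names are assumptions,
    // not necessarily the published jWitsml API.
    import java.util.List;

    public class WitsmlWalk {
        public static void main(String[] args) throws Exception {
            // Connect to a WITSML 1.3 store (hypothetical endpoint and classes)
            WitsmlServer server = new WitsmlServer(
                "https://witsml.example.com/store", "user", "secret",
                WitsmlVersion.VERSION_1_3);

            // Walk wells and their wellbores
            List<WitsmlWell> wells = server.getWells();
            for (WitsmlWell well : wells) {
                System.out.println("Well: " + well.getName());
                for (WitsmlWellbore bore : server.getWellbores(well))
                    System.out.println("  Wellbore: " + bore.getName());
            }
        }
    }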

Setiri Group has published ‘nWitsml,’ a .Net port of the (above) jwitsml.org project, released under the same open source license. nWitsml is a ‘read-only’ implementation. Setiri is also developing a full-featured client/server library as a separate commercial product—www.nWitsml.org.

Petrolink has announced a lithology editor for Witsml Connect, enabling capture of cuttings and interpreted lithologies into a Witsml Mudlog object—www.petrolink.com.

The American Petroleum Institute is to provide free online access to key industry safety standards covering refining, offshore drilling, hydraulic fracturing and pipeline safety—www.api.org.

The PODS 5.0 ESRI Spatial GeoDatabase has been released following a three-year collaboration between PODS, APDM and ESRI. PODS ESRI Spatial includes long transaction management via GeoDatabase versioning, built-in history management and management of complex geometric networks and topologies—www.pods.org.

PODS and Energistics have signed a Memorandum of Cooperation to share activities for the benefit of their members and the oil and gas industry at large. The two bodies will identify standards development and deployment activities suitable for formal or informal joint sponsorship and/or joint member participation. More from www.pods.org and www.energistics.org.

Deliverables from the PPDM Association’s Well Status and Classification effort have been posted to the test site for review. Finished versions will be available to members within a month. Wellbore status and fluid type have been combined in a ‘logical’ standard symbol set, available for download in raster and vector formats. More from www.ppdm.org.


ISO 15926 under OpenPlant hood

Bentley OpenPlant claimed as first commercial implementation of Fiatech’s iRing data framework.

Bentley has released new products in its ‘OpenPlant’ suite, said to be the first commercial implementation of Fiatech’s ISO 15926 Real-Time Interoperable Network Grid a.k.a. the ‘iRing.’ The iRing project (Oil ITJ January 2009) set out to enable real-time interoperability of data and information using the ISO 15926 standard for plant and process industries. The V8i releases of OpenPlant Modeler, ModelServer and Isometrics Manager share the ISO 15926 data foundation.

Bentley Software senior VP Bhupinder Singh said, ‘Last year the iRing prototype showed how interoperability could be achieved with ISO 15926. Leveraging our MicroStation and ProjectWise platforms, OpenPlant enables users to achieve real-time data interoperability.’

User Jeanine Hargis, senior technologist with CH2M HILL added, ‘We recently completed testing the new OpenPlant products. We are a global delivery firm in six key markets, so having data-open software is something we’re looking forward to.’

The next version of Bentley’s PowerPID will also leverage the new data infrastructure and is claimed to be the ‘only commercial software for process and instrumentation diagrams to be based on a completely open data model.’ More from www.oilit.com/links/1009_13.

Comment—Being the ‘only’ toolset to use an ‘open’ data model is not perhaps where the industry wants to be. The interoperability ‘fun’ will only happen when lots of tools use the same model!
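
To make that comment concrete: ISO 15926 Part 8 specifies an RDF/OWL representation, so ‘using the same model’ can mean any tool querying any other tool’s data with standard semantic web machinery. The sketch below uses the open source Jena library; the data URL is a made-up placeholder and the query is deliberately generic; it is not Bentley’s or Fiatech’s code.

    // A minimal sketch: query an ISO 15926 Part 8 RDF dataset with Jena.
    // The data URL is a placeholder; the query simply lists labeled things.
    import com.hp.hpl.jena.query.*;
    import com.hp.hpl.jena.rdf.model.Model;
    import com.hp.hpl.jena.rdf.model.ModelFactory;

    public class IRingQuery {
        public static void main(String[] args) {
            // Load an RDF rendering of plant objects (hypothetical URL)
            Model model = ModelFactory.createDefaultModel();
            model.read("http://iring.example.org/plant/objects.rdf");

            // List every resource that carries an rdfs:label
            String sparql =
                "PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#> " +
                "SELECT ?thing ?label WHERE { ?thing rdfs:label ?label }";
            QueryExecution qe =
                QueryExecutionFactory.create(QueryFactory.create(sparql), model);
            ResultSet results = qe.execSelect();
            while (results.hasNext()) {
                QuerySolution row = results.nextSolution();
                System.out.println(row.get("thing") + " -> " + row.get("label"));
            }
            qe.close();
        }
    }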


Tecgraf carries ISO 15926 torch to Brazil

POSC/Caesar presentation highlights engineering opportunities for Petrobras.

Speaking at the recent POSC/Caesar Association member meeting in Forus, Norway, Gabriel Lopes introduced his company, Tecgraf, a.k.a. the Computer Graphics Technology Group. Tecgraf was created in May 1987 by Petrobras’ Cenpes R&D unit along with the Pontifical Catholic University of Rio de Janeiro (PUC-Rio). Tecgraf is carrying the ISO 15926 torch to the Brazilian oil and gas industry—having joined the Austin, TX-based Fiatech organization last year.

Tecgraf sees ISO 15926 as an opportunity to address ‘everyday’ interoperability issues and to brush up its R&D effort in semantic web technology with oil and gas funding. The plan is to give Petrobras control over its data and over its choice of software solutions by driving the adoption of ISO 15926 in Brazil.

PUC-Rio is to host a ‘high level’ ISO 15926 workshop for Petrobras in November 2010, where Tecgraf plans to promote ISO 15926 pilot project opportunities and semantic web technologies. More from www.tecgraf.puc-rio.br.


iResponse emergency management for Oiltanking

Cleveland Process Designs supplies map-based EMS to international tank storage operator.

Hamburg-headquartered Oiltanking has signed a five-year licensing agreement with UK-based Cleveland Process Designs. The agreement covers software and services around Cleveland’s ‘iResponse’ Emergency Management Solution (EMS) at 75 Oiltanking sites around the world.

iResponse is a map-based system designed to improve pre-planning and training and to ensure that managers and responders are well prepared to confront an emergency. iResponse is designed to be accessed from incident control rooms, emergency control centers, fire engines and command vehicles. It provides decision makers and response personnel with information such as risk contours (thermal, overpressure, atmospheric dispersion) and decision support tools for emergency management.

iResponse’s map view shows the relative location of resources and existing infrastructure. Mapping tools enable information such as distances and asset details to be determined quickly. Tools such as foam calculators, plume dispersion modelers, firewater bund volume calculators and hose run modelers are provided to support the determination of incident impact and resource deployment requirements. iResponse helps assure compliance with current fire engineering standards and regulations. Cleveland clients include BP, ExxonMobil and Qatar Petroleum. More from www.oilit.com/links/1009_12.
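
As an aside, the arithmetic behind a bund volume check is simple enough to sketch. A common regulatory convention requires secondary containment to hold at least 110% of the largest tank’s contents; the figures and the net-volume formula below are illustrative only and have nothing to do with Cleveland’s actual implementation.

    // A toy sketch of the arithmetic behind a bund volume check.
    // The 110% containment rule is a common regulatory convention;
    // figures and the net-volume formula here are illustrative only.
    public class BundCheck {
        public static void main(String[] args) {
            double bundLengthM = 40.0, bundWidthM = 30.0, wallHeightM = 1.5;
            double largestTankM3 = 1200.0;   // capacity of largest tank in bund
            double tankFootprintM2 = 150.0;  // plan area of other tanks in bund

            // Net bund volume: plan area times wall height, less the volume
            // displaced by other tanks standing inside the bund
            double netBundM3 = bundLengthM * bundWidthM * wallHeightM
                             - tankFootprintM2 * wallHeightM;

            // Required capacity: 110% of the largest tank's contents
            double requiredM3 = 1.10 * largestTankM3;

            System.out.printf("Net bund: %.0f m3, required: %.0f m3 -> %s%n",
                netBundM3, requiredM3,
                netBundM3 >= requiredM3 ? "PASS" : "FAIL");
        }
    }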


Accenture review of carbon reporting in oil and gas

Clean Energy unit’s study finds uncertainty as to reporting future—lists software providers.

A white paper from Accenture’s Clean Energy division summarizes the results of a recent Review of Carbon Accounting and Reporting in the Oil and Gas Industry. The study was based on a survey of several IOCs and a comparison of their reports under the 2009 voluntary Carbon Disclosure Project.

A key finding is that while mandatory reporting is coming, there is uncertainty as to what regulation will bring. The report offers a maturity matrix showing three ‘archetypes’ of carbon management and reporting. The top-level archetype has a strong CO2 management function, integrated measurement processes and a single system for CO2 data management and reporting, tailored to local requirements which differ widely around the globe.

Currently, most companies rely on spreadsheets to collate, analyze and report emissions. Increasingly, however, companies are implementing environmental sustainability software to support functional and regulatory requirements specific to the carbon reporting domain. Specialist software providers include Carbonetworks, Ecofys, EnviroData, IHS and Schneider Electric. Read the Review on www.oilit.com/links/1009_11.


Deepwater Horizon—a ‘data spill’ too?

LeClairRyan attorney William Belt speculates on massive amount of ‘disclosable’ e-documentation.

Writing in the Westlaw Journal, LeClairRyan attorney William Belt described the challenge posed by the massive volume of electronic documents that will be available to litigants. Back in the Exxon Valdez days, paper documentation associated with the case filled a warehouse. Now, the 21st Century ‘imperative’ for parties to hand over all electronic documents means that the e-discovery challenges will be huge. Today, electronic ‘documents’ include instant messages, e-mail trails and potentially ‘tweets,’ web cams and more.

Belt ventures that, ‘given its complexity, seriousness and the proliferation of electronic files involved,’ disclosable data could cross the ‘mind-bending’ petabyte threshold. Under current e-discovery guidelines, companies have an explicit responsibility to preserve digital information under their custody and control. Noting that it took some twenty years for the Exxon Valdez case to reach a final settlement, Belt speculates as to the limits of discoverable information, whether ‘data sources’ like Facebook, Twitter and YouTube fall under a court’s jurisdiction, and what they will look like in twenty or thirty years’ time. More from www.LeClairRyan.com.


FLACS explosion modeler linked to SmartPlant

CAD system links to GexCon’s explosion analysis for safety-based engineering and compliance.

Intergraph has interfaced its SmartPlant 3D computer-aided design (CAD) software with GexCon’s FLACS explosion analysis package. SmartPlant’s rules-based design software builds plant safety into the design process, enforcing regulation and engineering standards. FLACS is claimed to be the leading tool for explosion consequence analysis of offshore oil and gas installations.

FLACS was originally developed in a joint industry project involving 10 majors and NOCs at Norway’s Christian Michelsen Research (CMR) unit and was spun out as GexCon in 1998. The safety management software is used in oil and gas, power and other plants involved in the handling or manufacturing of explosive or flammable materials or pressurized liquids and gases. The tool is used to identify potential risks early in the design phase and to improve safety and prevent costly corrections during construction and operations.

Validating Smart 3D models in FLACS now takes hours instead of months. Users can iterate design options with higher model accuracy to analyze gas dispersion and explosion impacts.

Smart 3D is a component of Intergraph’s Plant Enterprise, a suite of integrated solutions for the design, build and data management of large-scale process, power, marine and offshore projects. A ‘life cycle’ data management approach smooths handover from EPCs to owner operators and helps the latter maintain, refurbish or modify their plants. More from www.intergraph.com/go/smartsafety.


LogRhythm webinar on energy cyber security challenges

Paul Reymann and Caleb Wright on critical asset security and imminent NERC regulatory update.

In a recent webinar titled ‘Cyber-Security Challenges in the Energy Industry,’ Reymann Group CEO Paul Reymann and LogRhythm’s Caleb Wright provided an update on NERC cyber security requirements for critical assets. Connectivity between today’s SCADA systems and office applications—and soon the ‘smart grid’—is making for an ‘inverted’ security model, i.e. one that is increasingly open to cyber attack (as shown by the Stuxnet worm—Oil ITJ July 2010). The answer is ‘intelligent situational awareness and automated cyber security solutions.’

NERC’s CIP V4 will roll out in Q4 2010 and targets users of the bulk electric system. The general approach and risk evaluation methodology should be of interest to operators of process industry SCADA systems, including oil and gas pipelines and plants. Operators have a wealth of raw material in the form of device, system and database logs and audit trails. These are leveraged by security information and event management (SIEM) specialist LogRhythm’s suite of tools for event log management and analysis, which provides alerts, automated aggregation and correlation tools along with visualization, search, trend analysis and data mining. Pre-packaged reports are available for regulatory regimes including NERC CIP. LogRhythm clients include the US Department of Energy. More from www.logrhythm.com.
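
For readers unfamiliar with SIEM jargon, ‘correlation’ usually boils down to rules evaluated over a stream of parsed log events. The sketch below shows the general technique, a sliding-window threshold rule, in plain Java; it is a generic illustration of the concept, not LogRhythm’s implementation.

    // A generic sketch of a SIEM-style correlation rule: flag a host when
    // failed-login events exceed a threshold within a sliding time window.
    // Illustrates the technique only; not LogRhythm's implementation.
    import java.util.ArrayDeque;
    import java.util.Deque;
    import java.util.HashMap;
    import java.util.Map;

    public class CorrelationRule {
        private static final long WINDOW_MS = 60_000; // one-minute window
        private static final int THRESHOLD = 5;       // alert on 5+ failures

        private final Map<String, Deque<Long>> failuresByHost = new HashMap<>();

        /** Feed one parsed log event; returns true if an alert fires. */
        public boolean onFailedLogin(String host, long timestampMs) {
            Deque<Long> times =
                failuresByHost.computeIfAbsent(host, h -> new ArrayDeque<>());
            times.addLast(timestampMs);
            // Evict events that have fallen out of the window
            while (!times.isEmpty() && timestampMs - times.peekFirst() > WINDOW_MS) {
                times.removeFirst();
            }
            return times.size() >= THRESHOLD;
        }

        public static void main(String[] args) {
            CorrelationRule rule = new CorrelationRule();
            long t0 = System.currentTimeMillis();
            for (int i = 0; i < 6; i++) {
                if (rule.onFailedLogin("scada-hmi-01", t0 + i * 5_000)) {
                    System.out.println("ALERT: brute-force pattern on scada-hmi-01");
                }
            }
        }
    }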


Petrosys comes into Paradigm’s Epos fold

Connector leverages OpenGeo dev kit to link Petrosys’ mapping tools with geoscience repository.

The plug-in/interoperability space is hotting up with the announcement of a ‘Petrosys Connector’ to Paradigm’s Epos data infrastructure. The new connector will allow Petrosys users to access data in Epos repositories. Paradigm senior VP technology Duane Dopkin described the move as driven by Petrosys’ customers, adding that it reflected the growth of Epos as a ‘repository of choice’ for Paradigm’s customers: ‘Integration between Paradigm and Petrosys will add presentation-quality mapping of Epos data and demonstrates our commitment to openness and cross-vendor connectivity.’

Petrosys is to develop and maintain the links, which will be built using Paradigm’s Epos OpenGeo programming toolkit. Petrosys MD Volker Hirsinger added, ‘The integration with Epos will provide our mutual clients with more ways of summarizing and mapping current project information.’ According to Paradigm, the OpenGeo programming toolkit makes it possible to extend workflows to include solutions from multiple vendors. More from www.pdgm.com and www.petrosys.com.au.
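
Since the OpenGeo toolkit’s API is not something we can quote, here is a purely hypothetical sketch of the connector pattern at work: the mapping package codes against a repository interface and each vendor supplies an implementation. None of these types are Paradigm’s or Petrosys’ actual classes.

    // A purely hypothetical sketch of the connector pattern described above.
    // None of these types belong to Paradigm's actual OpenGeo toolkit.
    import java.util.List;

    interface EposConnector {                      // hypothetical interface
        List<String> listProjects();
        List<double[]> getHorizonGrid(String project, String horizon);
    }

    class PetrosysMapLayer {
        private final EposConnector connector;

        PetrosysMapLayer(EposConnector connector) {
            this.connector = connector;
        }

        // Pull a horizon grid from the repository and hand it to the mapper
        void displayHorizon(String project, String horizon) {
            List<double[]> grid = connector.getHorizonGrid(project, horizon);
            System.out.println("Mapping " + grid.size() + " grid nodes from "
                               + project + "/" + horizon);
        }
    }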

