January 2011


Shell’s Reference Architecture

Shell’s global data architecture V1.0 rolled out mid 2010. Under the hood are IBM’s System Architect and Metastorm’s ProVision. Inspiration came from the DAMA DM-BOK and The Open Group’s TOGAF.

At the IRM-UK data management conference in London late last year, Andrew Schultze1 described Shell’s journey along the road to a global data architecture (GDA) strategy. Shell has a long history of data modeling—as witnessed by our review of Matthew West’s new book on page 3. But before GDA, the behemoth’s (100,000 employees and $13bn income in 2009) decentralized business meant that its data architecture was divvied up between multiple units. The new initiative—described as an ‘ongoing journey’—set out to provide a terminological framework, an enterprise data catalogue that will allow subsequent ‘deep dive’ data modeling into specific areas.

The top down approach began with a management decision that an enterprise data strategy was needed spanning the whole upstream-downstream spectrum. Shell’s initiative stemmed from a business ‘appetite’ for improved data quality along with the realization (from its PDO unit) that data architecture can contribute to better data quality. Shell has now developed an enterprise data catalogue (EDC) that graphically represents its data at a high level. The color-coded EDC shows which business leads each subject area, where data quality dashboards exist and where ‘deep dives’ have been executed or are planned. The EDC lets Shell scope projects and identify overlaps—avoiding ambiguity with a single terminology. The EDC provides Shell with thematic direction, ‘big rules’ that programs and projects can follow. Users are encouraged to ‘be pragmatic, boiling the ocean will not be accepted.’

Tools used included IBM System Architect and Metastorm’s ProVision, a toolset for business and information architecture that claims to deliver a strategy-driven business and IT architecture (ProVision is also used by Hess, Talisman and RomPetrol). Part-way through the project Shell decided also to leverage the DAMA Data Management Body of Knowledge (reviewed in Oil IT Journal in June 2010) as a source for terminology.

The EDC is currently far from complete and is to be expanded, ‘borrowing’ from industry standards such as ISO 15926, ProdML and others. V1.0 of the EDC was released in April 2010.

Shell is now working on its data management processes and on master data management—starting with upstream data from its onshore US gas business, turning to client and customer data in 2011. Product, pricing and materials data are also in the GDA—as are records management and ‘e-discovery.’ Shell’s acronymic bonanza is topped off with a project delivery framework (PDF), hooked into the GDA.

1 The presentation was made by Lars Gåseby and followed on from Shell’s 2009 IRM-UK paper where Johan Krebbers described how Shell’s enterprise architecture has been constructed leveraging The Open Group’s ‘TOGAF’ framework—www.opengroup.org/togaf.


2010 bumper year

2010 was a great year for oil & gas M&A according to IHS. BDO reports huge net cash inflows. Oil IT Journal notes accelerating ‘done deals.’

A new study1 from IHS puts worldwide upstream oil and gas asset acquisitions at a record $107 billion high for 2010—a 160% hike over 2009. The rise was driven in part by ‘sustained high oil prices, global expansion by national oil companies, divestitures by BP, ConocoPhillips, Suncor and Devon and by spending on North American unconventionals.’

A survey2 of 100 oil and gas CFOs by accounting and consulting house BDO USA likewise noted major capital inflows to the oil and gas sector, with 56% reporting ‘similar or better’ access to capital compared with 2009. BDO also observes ‘record’ M&A activity. News for employees is also good as 95% of CFOs expect to ‘maintain or increase’ their headcount.

Oil IT Journal reports this month on a significant uptick in service sector M&A with, inter alia, the acquisition by GE of SmartSignal, new equity partners for OSIsoft, a mega initial public offering for FleetCor, and the acquisition, by GlobaLogix, of Blast Energy Services’ satellite division, augmenting its ‘wellhead to website’ automation solutions. More on page 9.

1 www.ihs.com.

2 www.bdo.com.


Data modeling special issue

Editor Neil McNaughton comments on similarities and differences in data modeling approaches from BP, Shell and others as observed at IRM-UK’s data conference and SPAR Europe/PlantTech, and from his reading of Matthew West’s new book ‘Developing high quality data models.’

Sometimes I have to pinch myself to see if I am not dreaming, but attending a dozen or more conferences in a year really does give one some insights—not to mention some pretty interesting cross-checking possibilities. This month’s insights began at the excellent IRM-UK Data Management1 conference held last November, but that is not where I am going to start my account.

Matthew West’s new book ‘Developing High Quality Data Models2’ pinpoints an issue that every data modeler has come across—the way that objects evolve over time. This can lead to modeling ‘gotchas’ that make for constant tweaks to a data model. West’s answer is ‘4 dimensional’ modeling using an ‘ontology’ that seeks out the meaning of things at a deep, near-philosophical level. More on page 3.

West’s examples come from the downstream—but one can dream up some more from other subject areas. Consider an exploration permit. We will not delve too deeply into the fact that one person’s ‘permit’ is another person’s ‘license’ or indeed ‘licence,’ although such subtleties are often stumbling blocks. Let us suppose that the permit has a certain shape described by a set of points or vertices of a polygon. Again we will move on quickly even though most will spot another boat-load of pitfalls here as a ‘polygon’ has no meaning without mucho metadata as to projection system and datum and what scheme (great circle etc.) is implied in ‘joining up the dots,’ something that some jurisdictions do not even bother to define. Leaving that aside for the moment, let’s keep things simple—they will get hard enough as it is.

This permit has been awarded to a joint venture comprising three partners. In a couple of years’ time, a farmout results in another company coming in and financing a well for the joint venture. The well is successful and a field is discovered. This is the sort of thing that catches the eye of larger companies with lots of cash and a portfolio to manage—so a new predator comes in and acquires its share of the field from one of the original joint venture partners—who, being of a crafty nature, sells all its data in a separate deal. Some of this may include speculative, non-exclusive data whose ownership is the subject of debate—again, mileage will vary in different jurisdictions.

Meanwhile the original exploration license may or may not (again depending on jurisdiction) change shape—as it is renewed or recast as a development license. It changes its ‘spatio-temporal’ extent, as West would say. Meanwhile too, more money is spent, by the original joint venture partners, by the farminer and by the acquirer. Meanwhile too (I forgot to tell you this), the discovery was made in a country belonging to a currency area whose members, after pulling this way and that as some worked hard and retired late in life while others spent all their time in the café, have created a new currency, devaluing in the process. The company that acquired a share of the field is co-located in countries both inside and outside the currency area whose currency has changed its ‘spatio-temporal extent.’

Now the boss comes in and asks you questions like ‘what is the book value of our stake in the XYZ field?’ or ‘how much is our data resource worth now?’ If you are like me, a kind of entry-level SQL practitioner, you are probably thinking, ‘Yikes! I have just had to change my data model 15 times and I’m still not sure if it is any good.’
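
To make the pain concrete, here is a minimal sketch (purely illustrative, with hypothetical table and column names, and neither Shell’s nor West’s model) of how an entry-level practitioner might start out, and the restructuring the above scenario forces almost immediately. It uses Python’s built-in sqlite3 module:

import sqlite3

# A naive 'snapshot' data model: one row per permit, partners in a column,
# geometry without temporal extent. Illustrative only.
con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.execute("""
    CREATE TABLE permit (
        permit_id   INTEGER PRIMARY KEY,
        name        TEXT,
        polygon_wkt TEXT,   -- no datum, projection or edge-join rule recorded
        partners    TEXT    -- 'A,B,C' breaks on the first farmout
    )""")

# The farmout, the license renewal and the acquisition each force a structural
# change. A first step towards West's 'spatio-temporal extent' is to version
# every shape and relationship with validity dates.
cur.executescript("""
    CREATE TABLE permit_shape (       -- the shape now has a life of its own
        permit_id   INTEGER,
        polygon_wkt TEXT,
        crs_code    TEXT,             -- projection and datum made explicit
        valid_from  DATE,
        valid_to    DATE
    );
    CREATE TABLE interest (           -- partners and their changing stakes
        permit_id   INTEGER,
        company_id  INTEGER,
        fraction    REAL,
        valid_from  DATE,
        valid_to    DATE
    );
""")
print("schema version 2 of N...")     # and we have not touched currencies yet
con.close()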

If you are Matthew West you have probably figured out what the ontological sense of ‘currency’ is, how the interactions of shareholdings and license extents coalesce and diverge, and you are ready for anything. If you work for SAP or Kalido you probably have an answer too.

Shell’s data modeling approach, as described in this month’s lead, appears to eschew the ontological approach—instead striving for taxonomic homogeneity across the enterprise—and reserving domain specifics for ‘deep dives.’ I’m not sure why, but this makes me think of the common entreaty not to ‘hold your breath’ while diving is in progress...

My take is that it is unrealistic to expect any data model to cater for all eventualities. Even fundamental questions—like ‘what is the book value?’—get hard, if not impossible, to answer without much more context. This is not an academic notion. The book value of an asset is a challenging topic that itself has ‘spatio-temporal extent’ as different jurisdictions debate ‘fair value’ accounting and the like.

Even production data is likely to originate as a hotchpotch of more or less accurate information gleaned from meter readers with their minds on something else, from a poorly calibrated multi-phase meter or a ‘fiscal’ meter with a stuck needle.

The underlying ‘meaning’ may be rather hard to get at, which is probably why the top down approach appears so attractive. Complex models that cater for local niceties home in on a local version of the truth—but in a way that may not be amenable to upscaling to the enterprise level. But starting at the top and working down to an understanding of everything is indeed a herculean task. At the end of the day, management and the taxman want you to ‘just give me the number.’

Such issues are echoed in another IRM-UK presentation. Frances Dean described BP’s earlier attempts to ‘boil the ocean’ with mega data modeling projects. BP’s Trade Completion back office data modeling appears to be more of an attempt to align computing systems with the business. More bottom up than top down perhaps, but no mention here of ontologies or grand abstractions—although they may be under the hood!

I almost forgot, but there is even more on data standards in our report from the 2010 Spar Europe/PlantTech event we attended last month on page 8.

Well, I hope that this editorial has whetted your appetite for more from this data-modeling packed issue. Happy reading.

1 www.oilit.com/links/1101_21.

2 Morgan Kaufmann, ISBN 9780123751065, 2011—www.oilit.com/links/1101_13.


Book Review—Developing high quality data models

Matthew West’s new book summarizes a lifetime’s work on data modeling in oil and other verticals.

The introduction to Matthew West’s new book, ‘Developing high quality data models1’ (DHQDM) sketches out his career, from early work computerizing Shell’s refineries to a later role as data architect for Shell’s 1990 ‘Key Thrust’ on data management. In 1993 West joined the Process Industries Step consortium Pistep to work on a standard for engineering design data handover. This led to an EU ‘consortium of consortia’, Epistle, with West as chair.

There remained ‘niggling’ problems with the Epistle data model. Epistle, like Shell’s earlier models, was a static snapshot of part of the enterprise and neglected the fourth dimension, time. Another realization was that West was developing not a data model but an ‘ontology.’ The 4D/ontological approach underpinned Shell’s spin-out Kalido and also a 2004 Shell global program to develop IT systems for its downstream operations. Epistle itself eventually forked into the ISO 15926 standard for lifecycle integration of process and oil and gas production facility data.

DHQDM builds on West’s vast experience but a warning is necessary. If you are looking for a book that walks you through what West calls the ‘traditional normalization approach’ you will be disappointed. West’s starting point is the Express data modeling language2 and its graphical manifestation Express-G. A lot of this is rather hard going. Despite his emphasis on the importance of data definitions, West doesn’t do a great job of defining terms like entity, entity type, entity data type, instance, class and so on. If you have no idea what these are, this book is not for you. If you do, let’s hope it is the same as West’s. DHQDM also suffers from overly granular and intrusive paragraphing—4.5.5, ‘Data quality standards,’ is, for instance, a minute tautology.

West’s thesis is twofold: that 4D modeling and the ontological approach together give more robust models than conventional data modeling. The 4D issue is easy to grasp. West uses examples such as feedstocks coming into a chemical plant and emerging as batches of product which in turn may be saleable as different products.
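
To caricature the idea (our own sketch in Python, not an excerpt from the book), the feedstock, the batch and the saleable product can all be treated as temporal ‘states’ of the same stuff, each with its own spatio-temporal extent, rather than as a row that gets overwritten in place:

from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class State:
    """A 4D 'state': some stuff, somewhere, for a period of time."""
    whole: str                  # the individual this is a temporal part of
    description: str            # e.g. 'naphtha feedstock', 'cracker batch 42'
    location: str
    begins: datetime
    ends: Optional[datetime]    # open-ended while the state persists

# The same material participates in successive, non-overlapping states.
# Nothing is updated in place, so history is preserved by construction.
feed = State("tank-farm lot 7", "naphtha feedstock", "jetty tank 3",
             datetime(2010, 6, 1), datetime(2010, 6, 2))
batch = State("tank-farm lot 7", "cracker batch 42", "unit 100",
              datetime(2010, 6, 2), datetime(2010, 6, 3))
product = State("tank-farm lot 7", "gasoline blendstock", "storage sphere 2",
                datetime(2010, 6, 3), None)

for s in (feed, batch, product):
    print(s.description, s.begins.date(), "to", s.ends.date() if s.ends else "open")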

The ontological approach builds on philosophical notions of ‘meaning’ in a scheme for modeling everything—through time, space, the universe and in fact, other universes that may or may not exist! West’s framework was used to develop the ISO 15926 data model for oil and gas equipment lifecycle integration and also for a conceptual data model for Shell’s downstream business.

This book explains clearly many of the pitfalls of real world data modeling and how an ontological approach is used to get to the bottom of what is being modeled. There are many insightful use cases that should make data modelers pause for thought and seek a deeper meaning in what they are trying to achieve. But the framework is not for the faint hearted. Getting closer to ‘meaning’ means more abstraction from reality and tortuous constructs such as how a customer might become an ‘instance of a class_of_state_of_biological_system_component.’

In oil and gas, Express is recognized as a powerful data modeling tool. Along with STEP and ISO 15926 it was used by Energistics (then POSC) to model the upstream in the Epicentre data model. But the rub is that Epicentre is practically no more and ISO 15926 appears to be migrating to the very different RDF/OWL modeling language. So for DHQDM to see take-up, an Express revival is required. Given the dominance of the ‘normalizers’ and the emergence of RDF/OWL this is a tough call. But West’s advice and understanding of modeling are applicable across all environments. Without using these words, West appears to be saying, ‘This is the way to do data modeling properly and Express is the best tool for the job.’ In an ideal world, the power of Express appears to offer a lot. But the ideal world is, as it were, only one of many.

1 Morgan Kaufmann, ISBN 9780123751065, 2011— www.oilit.com/links/1101_13.

2 www.oilit.com/links/1101_12.


Enterprise data architecture at the Pru

John Ewan’s talk at IRM-UK outlines how data management is done in financial services.

The Prudential’s business (and hence its data model) is essentially unchanged since its agents began selling policies to clients 160 years ago. In 2003 the Prudential Data Architecture team was created to define a data architecture and a ‘canonical’ information model. Before that, ‘data silos’ were built for projects. The Pru’s enterprise data architecture (EDA) comprises an overarching ‘business concepts’ model that maps into a corporate data model. Next comes a logical data model, fully attributed with detailed definitions and finally the physical layer. The Pru has only one logical model, implemented and optimized in different ways.

Conceptual modeling is performed with stakeholders using whiteboards and Post-it notes rather than CASE tools. Ewan likes to play the ‘dumb’ architect and involve people further up the food chain. It is important not to explain data modeling—rather to get users to write down entity names themselves. Ewan prefers Richard Barker’s ERD notation over ERwin for capturing the business.

The Pru has been using master data since before the term was coined. Master data is at the core of the architecture and can be accessed directly from a data store of reference or copied if needed for analytics. There can be tension with users who may prefer data in SalesForce.com over the master data repository. Tying third-party applications into the data infrastructure can be hard.

Left to their own devices, ‘stove pipe’ business areas lead to conflicting requirements and quality levels. Data ownership concepts blur over time. Prudential has well attended data governance forums around its key businesses. Business users bring issues to the table.

The Pru is now working on data quality profiling. Even without the funding to fix all issues, it can be useful to report declining quality to key stakeholders on a monthly basis. The company is also developing a ‘canonical’ information model for a services-oriented architecture. While SOA mandates a common agreement on semantics, it is ‘virtually impossible’ to have the same canonical model for all requirements. The idea is to create ‘semantically consistent’ multi-tiered models that can generate XSD and then build compound business types from the core definitions. All of the Pru’s work is platform independent with development in Java. Ewan likes to focus on solving business problems with senior stakeholder involvement—adding value, sometimes even without a data model.
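
As an illustration of the ‘compound business types from core definitions’ idea (our own sketch with hypothetical type names, not the Pru’s actual canonical model), a few lines of Python can assemble an XSD complex type from a handful of core simple types:

import xml.etree.ElementTree as ET

XS = "http://www.w3.org/2001/XMLSchema"
ET.register_namespace("xs", XS)

# Hypothetical core definitions (the 'canonical' layer).
core_types = {"PolicyNumber": "xs:string", "PremiumAmount": "xs:decimal",
              "EffectiveDate": "xs:date"}

schema = ET.Element(f"{{{XS}}}schema")
for name, base in core_types.items():
    simple = ET.SubElement(schema, f"{{{XS}}}simpleType", name=name)
    ET.SubElement(simple, f"{{{XS}}}restriction", base=base)

# A compound business type assembled from the core definitions only.
compound = ET.SubElement(schema, f"{{{XS}}}complexType", name="PolicyPremium")
seq = ET.SubElement(compound, f"{{{XS}}}sequence")
for element, core in [("policyNumber", "PolicyNumber"),
                      ("premium", "PremiumAmount"),
                      ("effectiveFrom", "EffectiveDate")]:
    ET.SubElement(seq, f"{{{XS}}}element", name=element, type=core)

print(ET.tostring(schema, encoding="unicode"))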


PETEX 2010

OSPRAG and the UK North Sea post Macondo, Statoil and WoodMac on shale gas, Integrated asset modeling and a call for more ‘robust’ and transparent production forecasting.

Petex, the biennial conference and exhibition of the Petroleum Exploration Society of Great Britain1, saw around 100 exhibitors and 2,500 registrants for its 2010 edition last month. Mark McAllister, chair of the UK Oil Spill Response Advisory Group (OSPRAG) and CEO of Fairfield Energy, presented the UK’s response to Macondo. Preventing such incidents requires cooperation between trades unions, operators and contractors. OSPRAG’s sub-groups cover Technical matters (well design, inspection and control, first response), Spill Response (response capability and remediation) and Financial & Insurance (liability, indemnity and insurance provisions). The group is also looking at North Sea-wide regulations and response mechanisms. OSPRAG has developed a well capping device. A test of a national contingency plan is slotted for May 2011.

Malcolm Brown, Senior VP Exploration with BG Group, pondered the new ‘giants’ of the 21st century—shale gas, Cretaceous fans, rift plays, pre-salt and coal seam gas. US shale gas is ‘developable’ at $5/Mcf. In the session on unconventionals, Statoil’s Iain Scotchman related its $3.3 billion deal with Chesapeake. This covered some 1.8 million acres of the Marcellus shale in Pennsylvania, West Virginia and New York states. Initial large gas flows of up to 10 MMcf/day decline rapidly as free gas is replaced by the slower desorption process. Wells have an estimated productive life of 30 to 60 years with reserves in the range of 3 to 8 BCF per well. Scotchman observed, ‘To be successful, the play requires the continual drilling of new wells to replace those on the decline2.’ Technology-wise, 3-D seismic and micro-seismics ‘are becoming increasingly important in the locating of wells, avoidance of geo-hazards and frac monitoring.’
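
For a feel of what those numbers imply, the following sketch runs a simple Arps-style decline with made-up parameters (ours, not Statoil’s) to show how a 10 MMcf/day initial rate that falls off steeply can still cumulate to within the 3 to 8 BCF per-well range quoted above:

# Illustrative only: a hyperbolic (Arps) decline with invented parameters.
qi = 10.0          # initial rate, MMcf/day
Di = 3.0 / 365.0   # nominal initial decline, per day (steep, shale-like)
b = 1.0            # hyperbolic exponent (harmonic in this case)

def rate(t_days):
    return qi / (1.0 + b * Di * t_days) ** (1.0 / b)

# Numerically accumulate daily rates over a 40-year productive life.
cum_mmcf = sum(rate(t) for t in range(40 * 365))
print(f"rate after one year: {rate(365):.1f} MMcf/day")      # about 2.5
print(f"40-year cumulative:  {cum_mmcf / 1000.0:.1f} BCF")    # about 5.8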

Rhodri Thomas (Wood Mackenzie) noted that unconventional gas has taken off in North America and eastern Australia. The US ‘could now be self sufficient’ while in eastern Australia, the coal bed methane play has had a similar impact, albeit in a much smaller market. WoodMac puts European resource potential at 1,400 TCF—a potential made especially attractive by Europe’s high gas price. Subsurface ‘deliverability’ and ‘above ground factors3’ will hamper development. China is the next big unconventional playground.

David Aron (Petroleum Development Consultants) noted that the potential for integrating sub-surface knowledge and topside design has been recognized for years. Amoco used a two-phase gas reservoir simulator coupled to a surface network to manage production of a number of Southern North Sea gas fields. More recently, software has evolved from in-house developed tools to off-the-shelf software such as Petrel, Eclipse and other dynamic modeling tools such as HySys, Resolve and Prosper. Aron advocates deployment of such integrated software ensembles on a ‘parallel virtual machine’ (PVM). Schlumberger’s Open Eclipse provides the linkage between Petrel, Eclipse and the HySys process simulator from AspenTech. Petroleum Experts also leverages a PVM to couple Resolve with Eclipse, Prosper and Gap and other tools. Elsewhere Halliburton’s Nexus simulator has been linked with HySys.

Such comprehensive integration creates new problems such as very large run times when models with a large cell count are used, or when a large number of hydrocarbon components are modeled. Shell has worked around this by simplifying the models. Cross-discipline integration brings new communication problems between reservoir and facility engineers. ‘Process engineers are not used to dealing with uncertainty.’

Aron cited some strong claims for the financial benefits of the integrated approach. ENI reports that the integrated approach gave a 6.5% hike in produced oil thanks to better allocation of gas lift rates than was achieved by standalone reservoir models. Halliburton reported a 14.2% increase in oil production from integrated studies that ‘helped asset teams make optimum field development plans.’ Petrobras reported on use of a ‘next generation’ reservoir simulator and integrated asset optimizer to optimize a production platform’s location. The best case of 200 simulations showed an NPV that was 4.6 times that of the original location.

University of Edinburgh professor Ian Main believes that reservoir simulations should be subjected to testing against ‘publicly documented field data.’ These should be ‘blind’ tests conducted by a third party. Such an approach would reduce bias and provide concrete estimates of uncertainty. Until we know quantitatively how well current technology is faring, the utility of increasing the simulator grid-block count, better reservoir descriptions and improved history-matching cannot be judged. Main advocates using the same approach as is used by meteorologists to monitor the accuracy of weather forecasts. This has led to the development of the ‘statistical reservoir model’ (SRM).

The SRM uses production data and the Bayesian information criterion to identify correlated well pairs. Bayesian dynamic linear modeling is then applied to generate a parsimonious model in which ‘only 5-25% of the wells in a field determine the rate history at each subject well.’ The method has been trialed on several North Sea fields. A blind test on the Gullfaks field, by Reservoir Deformation Research (RDR), found that 70% of the production figures for individual wellbores lay within the predicted 95% intervals. SRM sponsors include Amerada Hess, BG Group, BP, ConocoPhillips, DTI, Kerr-McGee, Statoil, Shell and Total.
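
By way of illustration only (synthetic data and our own toy code, not the SRM itself), the first step of such a scheme, screening candidate well pairs with the Bayesian information criterion, might look something like this:

import numpy as np

def bic(y, y_hat, k):
    """Bayesian information criterion for a least-squares fit with k parameters."""
    n = len(y)
    rss = float(np.sum((y - y_hat) ** 2))
    return n * np.log(rss / n) + k * np.log(n)

rng = np.random.default_rng(0)
months = np.arange(60)
producer = 1000.0 * np.exp(-0.03 * months)                # synthetic decline
neighbour = 0.5 * producer + rng.normal(0, 20, size=60)   # correlated well
unrelated = rng.normal(400, 50, size=60)                  # uncorrelated well

def pairwise_bic_gain(x, y):
    # Model 1: y explained by x (slope and intercept); model 0: constant only.
    slope, intercept = np.polyfit(x, y, 1)
    bic1 = bic(y, slope * x + intercept, k=2)
    bic0 = bic(y, np.full_like(y, y.mean()), k=1)
    return bic0 - bic1                                    # positive = keep the pair

print("producer vs neighbour:", round(pairwise_bic_gain(producer, neighbour), 1))
print("producer vs unrelated:", round(pairwise_bic_gain(producer, unrelated), 1))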

John Wingate described Baker Hughes’ new casing while drilling (CwD) technique that has been successfully used to drill and set casing in a 2,500 m well. CwD uses disposable polycrystalline diamond bits mounted on the casing string. 90% of the onshore well was drilled using CwD.

Adrock continues to vaunt the merits of its improbable ‘Lidar-like’ imaging spectrometer that uses ‘invisible laser light’ to identify subsurface lithologies. While the Petex paper did not make quite so strident claims for depth penetration as at Adrock’s 2009 EAGE presentation, the phenomenal claim that light reflections can be obtained at depths of ‘up to 4 kilometers and beyond’ is repeated on Adrock’s Wikipedia page. More from www.oilit.com/links/1101_16.

1 www.oilit.com/links/1101_15.
2 Why are we thinking, ‘Bernie Madoff?’
3 See www.oilit.com/links/1101_10 (in French).
4 www.oilit.com/links/1101_11.


Software, hardware short takes

AVEVA, Blue Marble, Emerson, Geophysical Insights, INT, Wellsite Data Solutions, Peloton, Petris, TerraSpark Geosciences, Seismic Micro Technology.

Aveva has launched the Instrumentation Business Value Calculator, an interactive tool that calculates the savings achieved through the use of Aveva Instrumentation—www.aveva.com.

Blue Marble’s Java API for GeoCalc is now in beta—www.bluemarblegeo.com.

Emerson’s FloBoss S600+ panel-mount flow computer for hydrocarbon metering handles up to ten simultaneous streams, reducing the ‘cost and complexity’ of fiscal metering. Connectivity includes ultrasonic and Coriolis meters, chromatographs, printers and metering supervisory systems—www.emerson.com.

Geophysical Insights is launching ‘Insight to Foresight’ clinics led by president Tom Smith. The clinics consist of single-day engagements introducing the use of unsupervised neural nets for seismic interpretation—www.geoinsights.com.

The 4.2 release of INTViewer adds depth indexing of seismic data, 2D/3D display of 2D seismics and well trajectories, on-the-fly volume outline calculation, a new ‘propagation’ pick mode, EPSG codes for coordinate reference systems and a plug-in for displaying Microsoft Bing Maps—www.int.com.

Wellsite Data Solutions’ new Oil & Gas Explorer ‘allows companies and professionals in the E&P sector to remain updated and connected’—www.oilgasexplorer.com.

Peloton’s SiteView 4.0 enhances analysis, usability, data auditing and administration. A Site Schematic tool leverages geo-spatial imagery. Data records can be visualized over high resolution imagery and geo-tagged. New Excel templates add pivot tables and graphs to multi-site reporting. A data cleanup tool is available to normalize historical data—www.peloton.com.

PetrisWINDS DataVera 8.31 adds a Microsoft SQL Server 2008 data store option for projects and results. A connection wizard lets users profile data using verified models and rules—www.petris.com.

TerraSpark Geosciences has released Insight Earth 1.5 with improved fault extraction, automated ‘flooding’ of geobodies and enhanced coordinate conversion and SEG-Y data handling—www.terraspark.com.

SMT has joined Microsoft’s Upstream Reference Architecture Initiative—www.seismicmicro.com.


Halliburton announces DecisionSpace Well Review

First DecisionSpace for Production ‘SmartFlow’ application targets performance analytics.

Halliburton’s Landmark unit has announced ‘Well Review Management,’ a structured environment for well performance analytics, problem solving and reporting. According to the release, identifying underperforming wells is a ‘painstaking’ process, involving large data volumes from disparate systems. Well Review blends data from wells, gathering systems and pumps along with engineering and economic analyses to produce actionable information that can be integrated with budgeting processes. The process starts with candidate recognition, using decline analysis and performance metrics. Next multiple intervention scenarios are evaluated prior to planning and reporting. Well Review provides a collaborative workspace, workflow automation and management by exception.

Work process standardization exposes asset status and performance and assures consistent results through KPIs, standard methods, opportunity screening and reporting. Well Review is the first of Landmark’s ‘SmartFlow’ applications—workflow solutions that extend the DecisionSpace for Production data and application infrastructure with a ‘rich and interactive’ user interface. DecisionSpace for Production leverages a Microsoft stack of Windows Server, SQL Server and the .NET Framework. Client side applications require Windows 7 or XP. More from www.oilit.com/links/1101_14.


XMap/EarthMate combo democratizes oil country field mapping

Eagle Information Mapping tailors Delorme’s handheld GIS/satellite link to pipeline workers.

A Delorme blog posting reports on use of its Earthmate mobile mapping solution and ‘XMap’ GIS in the oil and gas sector by Houston-based Eagle Information Mapping. Eagle has built a ‘lightweight’ application for its clients around XMap and the Earthmate PN-Series GPS. Eagle VP Tracy Thorleifson noted that many pipeline field technicians were using PN-Series devices in the field, but they were not integrated into enterprise workflows. The device proved popular because of its SPOT satellite communicator that provides SMS data connectivity and position reporting in areas that are out of cell phone coverage.

Thorleifson sees XMap as a potential replacement for ‘high-cost, complicated, survey-grade GPS units’ for field data collection—particularly because most fieldwork is done at previously surveyed locations. In such circumstances, GPS is just used to locate the facility prior to field data collection. Here, Eagle’s value proposition is to tune XMap Forms to a client’s data collection needs.

High quality data thus captured can then stream into Eagle’s enterprise GIS workflows, along with the auditable QA/QC functions that are essential in complying with the new Management of Change (MOC) regulations from federal and state authorities. More from www.oilit.com/links/1101_8 (Delorme) and www.oilit.com/links/1101_9 (Eagle).


IRM-UK Data Management Conference

Oil IT Journal branches out into ‘horizontal’ data management to hear presentations from BP, Shell and data gurus including DAMA president Peter Aiken. Topics include data management in the cloud, Nestlé’s OneGlobe SAP program, taming the ‘spreadmart’ and data standards for the ‘cloud.’

We broke new ground last year, attending the ‘horizontal,’ cross-industry data management conference organized by IRM-UK1. This co-located the Data Warehousing/Business Intelligence and Information Quality conferences as well as the EU meets of DAMA and IAIDQ. Our first surprise was that there was a decent sprinkling of oil and gas folks in attendance, notably from BP, Statoil and Shell.

DAMA president Peter Aiken (Data Blueprint) gave a keynote on ‘monetizing data management.’ In other words, how to sell DM projects to the CEO. One issue is that schools teach students how to build new (Oracle) databases while industry needs people who can harmonize old databases! Today, many IT projects are done without data managers’ involvement. Data migration project costs are often underestimated by a factor of 10. Moreover, 7 out of 10 projects and investments have ‘small or no tangible returns.’ If a project has hundreds of thousands of attributes that each take one minute to map, you have a train wreck in the making!

Frances Dean and Hugh Potter from BP described its integrated supply and trading modeling initiative—a.k.a. the ‘systems transaction pipe.’ A previous data modeling initiative— described as a ‘big boil the data ocean project that failed’—resulted in serious ‘data fatigue.’ A global data management business team was formed in the Trade Completion back office. Dean noted ‘We are not IT and we are not a project.’ Hugh Potter outlined the mechanics of physical oil trading—the transaction pipe—from deal, through confirmation to scheduled movements.

Previously, the process lacked visibility of the ‘big picture’ and was increasingly defect-prone. The new system has been running successfully for three years now. Dean’s initial attempts to leverage the metrics it was producing did not go down well with the traders—resulting in angry team meetings and people unwilling to talk about performance. Metrics need to be chosen judiciously so that they are not a threat but simply provide visibility. You need to ‘hold up a mirror to let folks decide if they need a haircut.’ Now BP has established targets for metrics (working capital) and a dashboard for managers. Some feedback from the business has corrected anomalies and metrics have been re-cast. But these are used in decision support: ‘you can’t go on tweaking indefinitely—you need a stable system and metrics.’ Potter continued, noting the overlap between data quality metrics and business KPIs. These allow BP to track a trading team’s activity and a deal’s ‘criticality.’ Operations managers use the system to see how well traders are doing their job. Metrics are now used to optimize the portfolio and supervise deals, and to find out why some transactions go wrong. Teams are less defensive as a result and can investigate ‘known issues’ separately. Dean concluded, ‘It’s never about the data—the value is in the conversations. This started as a bottom up amorphous grass roots project. Don’t expose metrics to management without having the team on board. Let them tell the story their own way.’

Malcolm Chisholm’s (RefDataPortal) presentation heralded the arrival of data management in the cloud. Cloud computing is a return to the mainframe-based time sharing of the 1970s. But access is even easier now as the web browser has replaced the TTY. Cloud computing’s poster child is the ubiquitous SalesForce.com. But you can now also rent a database for a few cents/hour. The cloud offers open source software and is potentially a way around the ‘3 evil axes’ of the corporation—procurement, legal and IT. But security is a big problem for the cloud. There is a greater need for data housekeeping in the public space of the cloud—data governance is the key. Personnel need to be redeployed from data centers to manage data in the cloud. Tasks include provisioning—how to push data to the cloud consistently. Users need to be prevented from going straight to the cloud sans supervision. RACI2 matrices are recommended for data governance, introducing notions like the provisioning, activation, use and deactivation of a server. Data removal and backup have to be considered and ‘zombie’ instances need to be killed—otherwise it’s easy to overrun costs. You may also have to educate your programmers in the use of non-relational technologies such as Google’s Bigtable, MapReduce, Hadoop and others. These non-relational data stores offer a path to scalability, fast retrieval and data time stamps but they are quite a paradigm shift. There is no data modeling, ‘no Ted Codd.’ In fact, they are a step backwards in data management maturity.
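
To illustrate the paradigm shift (a toy of our own, not Bigtable or Hadoop), a ‘wide column’ store addresses values by row key, column and timestamp and imposes no schema at all:

from collections import defaultdict

class WideColumnStore:
    """Toy Bigtable-flavoured store: (row key, column) cells with versioned values."""
    def __init__(self):
        self._cells = defaultdict(list)        # (row, column) -> [(timestamp, value)]

    def put(self, row, column, timestamp, value):
        self._cells[(row, column)].append((timestamp, value))

    def get(self, row, column):
        """Return the most recent (timestamp, value) for a cell, or None."""
        versions = self._cells.get((row, column), [])
        return max(versions) if versions else None

store = WideColumnStore()
store.put("well:34/10-A-12", "rate:oil", 20110101, 812.0)
store.put("well:34/10-A-12", "rate:oil", 20110102, 797.5)
store.put("well:34/10-A-12", "meta:operator", 20110101, "ACME E&P")  # ad hoc column

print(store.get("well:34/10-A-12", "rate:oil"))    # (20110102, 797.5)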

Barry Devlin (9sight Consulting) described spreadsheets as the bane of the data warehouse. ‘Spreadmarts’ are prone to logic and data errors that may take years to discover and result in having to restate the accounts. In 2008, Credit Suisse was fined £5 million for duff, spreadsheet-based closing3! But getting rid of Excel is not an option. Spreadsheets can be a sandbox for playing with stuff—you just need to nurture and fence off the playground. Then you can leverage users’ analyses and migrate them to mainstream BI. Devlin advocates an ‘adaptive analytic cycle’ using an enterprise data warehouse (EDW). This is augmented with a ‘playmart’ using certified EDW data plus other, possibly unsafe, information sources. There is ‘more control in the playmart than in a spreadmart.’

Karsten Muthreich described Nestlé’s decade-long ‘OneGlobe’ SAP program that set out to ‘transform independent markets into one global company.’ This has now been achieved with a standard data system supporting 160,000 users in 800 factories. Nestlé’s main issue was the shift from ‘my’ data to ‘our’ data—there are still issues with some local units. Before OneGlobe there were 2 million vendor records across multiple files. Now one master file holds 600,000 records. DM is evolving from local to global with 350 data standards defined. Data busts—like ‘15 meters’ instead of 15 mm—got users thinking about data quality.

1 www.oilit.com/links/1101_21.

2 www.oilit.com/links/1101_25.

3 More in similar vein from www.oilit.com/links/1101_18.


Homeland Security meets Stuxnet

Fall meet of Industrial Control Systems Joint Working Group hears from Trusted Computing Group, Schweitzer Engineering, Industrial Defender and the ISA Security Compliance Institute.

As Eric Byres of the Trusted Computing Group warns, ‘If you can ping it, you can own it!’ Speaking on behalf of the Metadata Access Point standards body, a 100 member-strong organization that sets out to lock down network communications and certify products, Byres emphasized that, particularly in the post-Stuxnet era, network access control (NAC) is crucial for industrial control and SCADA systems. NAC is not magic: it implies a holistic approach to user authentication, identity management and endpoint health. At the heart of trusted computing is MAP, the metadata access point. MAP targets security coordination use cases with an authenticated, asynchronous publish/subscribe mechanism supporting, inter alia, real-time data flows between sensors, flow controllers and other industrial equipment. MAP’s public key infrastructure provides lifecycle cryptographic identity management for SCADA and industrial control systems which are otherwise vulnerable to certificate expiry, revocation or spoofing. Oil and gas sector testers include ExxonMobil, Shell, GE, Honeywell, Siemens and Yokogawa. More from www.oilit.com/links/1101_2.

David Whitehead of Schweitzer Engineering Labs asked, ‘How do you know if your control system has been compromised?’ Control system complexity offers plenty of entry points to hackers and malware, making this a tricky question to answer. A multi-pronged approach is necessary to monitor SCADA systems, network appliances and intelligent electronic devices. This is done by constant surveillance of alarms, sequence of events recorders, event reports, operating system and network logs. Secondary communications paths in the network should be established to notify users when a probe or attack is underway. Cryptographic best practices protect serial and Ethernet connections. Network segments are connected via firewalls and control networks isolated with DMZs. Patches need managing.

Industrial Defender’s Walt Sikora’s presentation, ‘How Stuxnet changed the world,’ noted that, ‘Your friends now know what you do for a living and that you no longer have to justify your cybersecurity budget.’ But the reality is that the world of control systems has not changed at all. There is much more interest in who perpetrated Stuxnet and why, than on what we should be doing to prevent similar attacks on our systems. The proof that our control systems can be compromised has not hit home. More awareness is needed of control system peer-to-peer communications, shares or the ‘seven other ways it could move about.’ More information is also needed on the Microsoft ‘zero day’ vulnerabilities Stuxnet has exposed and on the way it seeks out and disables anti-virus software. For Sikora, asset owners are hiding their heads in the sand. ‘There are thousands of threats that could compromise a control system.’ These include ‘drones,’ APT-capable worms and botnets. In fact, ‘It is possible that your system is already compromised and owned by an adversary!’ Moreover, anti-virus software, compliance with industry standards and other current preventative measures would not have prevented Stuxnet. What might is, ‘A complete understanding of your automation system, a secure configuration and system baseline and locking-down and denying access to everything by default.’ More from www.industrialdefender.com.

Notwithstanding Sikora’s skepticism, the standards movement continues its best efforts to assure plant cyber security. Andre Ristaino provided an update on the ISA Security Compliance Institute’s (ISCI) activity. ISCI has top-level support from ExxonMobil, Chevron, Rockwell, Honeywell, Yokogawa, Siemens and Invensys. The organization’s secure designation trademark provides ‘instant recognition of product security characteristics and capabilities’ similar to the IEC 61508 safety integrity level certification. Of particular interest to readers of Oil IT Journal are the emerging software development security assessment program and the related reference standards for secure software development. A stack of verification and validation protocols checks that software has been developed following appropriate engineering practices, is fit for purpose and incorporates a minimum set of security features needed to prevent common security threats. These standards are described in IEC 61508 and in Mike Howard’s book The Security Development Lifecycle1. More from www.oilit.com/links/1101_3.

Presentations available on www.oilit.com/links/1101_1.

1 www.oilit.com/links/1101_4.


CO2 storage modeling and risk analysis—a treatise from NETL

A comprehensive report on geological and fluid flow modeling and modern risk analysis.

The US National Energy Technology Lab’s new report1 ‘Risk Analysis and Simulation for Geologic Storage of CO2’ is a veritable treatise on modeling fluids in rock—including fluid flow, thermal, geo-mechanics, chemistry and risk analysis. While carbon capture and storage (CCS) borrows much technology from oil and gas, there are notable differences. Not the least is that while an oil or gas reservoir has a trapping mechanism, the same cannot be assumed for CCS in saline formations where ‘the existence and effectiveness of the confining zone must be demonstrated through careful characterization before injection and monitoring after injection begins.’ The study leverages commercial software including Schlumberger’s Eclipse, GMI’s geomechanics tools, Dassault’s Simulia and CMG’s GEMS along with tools developed by NETL and other government labs.

Risk analysis is achieved using the DOE’s Certification Framework, a ‘simple and transparent’ methodology to estimate the risks of CO2 and brine leakage in CCS operations. The Framework is used at the DOE’s Westcarb and Secarb test sites. The 110-page oeuvre offers as good an overview of geological and fluid flow modeling and modern risk analysis as we have seen in the public domain.

1 www.oilit.com/links/1101_17.


PlantTech/Spar Europe 2010

3D Laser scan for facility revamp (Sofresid). Mobile survey (M3DM). ‘Kaizen’ in engineering supply chain (Toshiba). Engineering data standards updates from USPI-NL, ISO 15926, eCl@ss and Prolist. Spheron’s ‘high dynamic range’ camera—plant technology meets crime scene investigation!

In 2010, the Spar organization acquired PlantTech, rolling together the facilities engineering IT conference with Spar’s 3D/laser scanning event. This offered the oil and gas facilities community some enlightening glimpses at related activities such as mobile, high precision survey and crime scene investigation technology.

Mustapha Yahia (from Saipem/ENI’s Sofresid Engineering unit) set the scene in his keynote, showing how 3D laser scanning is used across oil and gas and shipbuilding at various stages of the engineering lifecycle. In 2 out of 3 major revamp projects, CAD and 3D models are ‘non existent or obsolete—especially in oil and gas.’ This, the main challenge in a revamp, used to be corrected by time-consuming and inaccurate manual survey. Sofresid has been using laser surveying since 2003. A 4-floor, 20 m cube platform was surveyed in a day, enabling engineering to maximize prefabrication of structural, electrical, and instrumentation components. The new module can be checked for clashes and the installation process walked through in the virtual environment. Installation is ‘a very delicate operation on an FPSO.’ Check out Urbica’s point-cloud fly-through on www.oilit.com/links/1101_19.

Erik Siemer from M3DM presented an eye-opener of a paper on the state of the art of mobile survey. Today, it is possible to acquire high-accuracy data while driving ‘at traffic speeds.’ Objects such as bridges, lamp posts and other infrastructure visible from the road can be captured with an accuracy of 1 mm relative and 5 mm absolute. A helicopter is used to fill in around the rear of houses. Data acquisition on the Schiphol 06-2 runway, which would have taken 8 nights of conventional survey, actually took 8 hours and gave far more information. The days of static scanning, let alone the theodolite, are numbered!

Masaaki Kamei (Toshiba Japan) spoke of the lack of ‘kaizen1’ in the engineering supply chain. Every plant is ‘one of a kind,’ many stakeholders are involved in design and construction, information volumes are large and complex. Toshiba’s vision is for project-oriented information management with more IT involvement and standard-compliant data. ISO 15926 is there—but Toshiba is having a hard time figuring which of the 11 parts is relevant!

It would be nice to be able to report that the EU standards movement is moving towards rationalization and alignment. Unfortunately, as Edwin Stötefalk (LyondellBasell) reported, there is the ‘bubble’ problem—of a large number of overlapping standards. There has been significant attention on the ISO 15926 process data standard in recent years—but there are around 50 other standards describing the bigger picture. The CEN orchestration of industrial data (Orchid) project wound up in June 2010, having delivered a framework, implementation guide and standards landscape for the process industries—including oil and gas. Orchid leverages the Dutch USPI-NL data readiness assessment. More from www.oilit.com/links/1101_27.

The most remarkable device on show was undoubtedly the Spheron optical scanning camera. This is used to capture scenes of crime or part of a plant in a 360° x 180° spherical view. The device claims a 26 f-stop dynamic range—so everything is perfectly exposed—from a sunlit scene outside of a window to the darkest nook of the room. Once a scan has been acquired, the camera can be jacked up 20 cm or so to take a second image for photogrammetric work. The resulting image allows for measurements to be made. Software is SceneWorks—a ‘visual content management system.’ Images can be manually tagged to add information and click through to related documents. More from www.oilit.com/links/1101_20.

Peter Zgorzelski (Bayer International) returned to engineering and plant maintenance standards with an update on Prolist. Prolist provides a list of properties leveraging ISO standards (not 15926). A process engineer starts with a Prolist interface and searches for, say, a Coriolis flowmeter. A mouse click sends an XML enquiry file to suppliers who can compare requirements with their specs and respond with an offer. Each XML document has a unique ID so users can compare and track offers. Once a part has been selected, details can be exported to the plant material database for build. Prolist tools are available for the whole workflow on the basis that engineering data is entered once for each device. Emerson, Siemens and ABB are on board and others like Invensys can supply Prolist data. There is also a role for the standard in EPC, supplier and owner operator data handover. Zgorzelski worked through the potential savings of Prolist use in a process control engineering project with 5,000 control loops. With an estimated saving of 20 minutes per device, this tots up to a €250k saving. But time saved is actually less important than the improvement in plant documentation quality and in avoiding fitting the wrong valve!

The USPI-NL data integration group (DIG) meeting heard from purveyors of several different ‘bubbles.’ The ‘Who’s who’ of the engineering standards community saw representation from USPI, POSC/Caesar, Prolist, eCl@ss and end user companies. The DIG’s objective is to seek a consensus for engineering supply chain management data exchange. Ina Lingnau presented the eCl@ss spec, which covers some 40 industries including process/plant. Again the ‘bubble’ problem of multiple overlapping standards was raised. Onno Paap (Fluor) argued that the way forward is to expose eCl@ss (and the ‘1,000 or so other relevant engineering standards’) to the semantic web as ontologies. The problem is that there are so many standards and owners of standards. Paap prefers ‘interfacing’ and interoperability of different standards—suggesting that Prolist could leverage the ISO 15926 class library. Also the ISO 15926 Part 4 class library could be slimmed down to exclude non-owned ‘foreign’ standards. ISO 15926 Part 8 provides a how-to guide for RDF implementation. Paap concluded, ‘One standard will never be the mother of all standards.’

The standards movements may appear disorganized—but this is due to the huge geographic and domain scope of process and plant construction. There is a natural tendency for different sectors and geographies to push ahead—sometimes in different directions. But the case for engineering standards in the downstream is possibly even more compelling than in the upstream—even though many of the obstacles are shared. More from www.sparpointgroup.com.

1 www.oilit.com/links/1101_26.


Folks, facts, orgs ...

OSIsoft, Palantir, Fiatech, Ikon, McLellan, ENGlobal, Murphy Oil, OHM, Petrofac, Rothschild, Atwood Oceanics, Badger Explorer, Mustang, Total, PETEX, Verdande, WellPoint Systems, Univa.

Jake Reynolds (TCV) and Ben Kortlang (KPCB) are to join the OSIsoft board of directors. Founder Pat Kennedy continues as chairman.

Charles Lewis has re-joined Palantir Solutions as VP, Business Development having completed an MBA in Energy Risk Management.

Reg Hunter has been named Senior Program Manager for Fiatech, in charge of integration and automation of procurement and supply networks.

Murray Christie has been appointed President of Ikon Science Americas, based in Houston, Texas. Christie hails from Exova.

Former founder and president of Advanced Geotechnology, Pat McLellan, has set up McLellan Energy Advisors in Calgary to offer geomechanics and reservoir characterization services with focus on unconventionals and carbon sequestration.

Tim Rennie has been appointed Executive VP of ENGlobal’s Engineering and Construction segment.

Tom McKinlay has been promoted to Executive VP, worldwide downstream operations, Murphy Oil, replacing Hank Heithaus who is retiring.

Offshore Hydrocarbon Mapping has named Michael Frenkel VP of R&D. Frenkel hails from EMGS Americas.

Andy Inglis has joined Petrofac as chief executive of its energy developments and production solutions businesses. Inglis was formerly with BP.

Rothschild has hired Bob Gibson and Paul Moynihan as oil and gas sector MDs at its new Calgary office. Both come from Mustang Capital Partners.

Atwood Oceanics has appointed Mac Polhamus as VP Operations and principal operating officer. He hails from Transocean where he was most recently MD, West Africa South.

Badger Explorer has appointed Dr. Wolfgang Mathis as Product Manager. He was formerly with Thonhauser Data Engineering.

Wood Group unit Mustang has appointed Gordon Stirling and Chet Nelson as regional directors.

Total has appointed Arnaud Breuillac and Michel Hourcard to its management committee.

John Hoopingarner has been named director of the University of Texas at Austin’s Petroleum Extension Service. He was formerly with Emergent Technologies, Inc.

Morten Vinther heads up Verdande Technology’s new operational hub in Abu Dhabi.

WellPoint Systems has elected Michael Goffin and David Sefton to its Board of Directors. Both are partners of Arch Capital Partners.

The Sun/Oracle Grid Engine team, including founder and original project owner Fritz Ferstl, are joining Univa. Ferstl will be Chief Technology Officer and lead Univa’s EMEA business.


Done Deals

OSIsoft, ABB, Altair, Black Pearl, Bolt, Dover, FleetCor, GlobaLogix, GSE, Kayne Anderson, NEOS.

OSIsoft has announced that Technology Crossover Ventures and Kleiner Perkins Caufield & Byers have made a ‘significant’ minority investment in OSIsoft totaling $135 million.

ABB is to acquire software provider Obvient Strategies, adding Obvient’s solutions to its recently acquired Ventyx software portfolio. ABB is to retain the Obvient team and place its executives in key roles within the Ventyx product management organization.

Altair Engineering has acquired computational fluid dynamics specialist Acusim Software.

Black Pearl Capital Partners has acquired oilfield industry equipment specialist Canadian Energy Equipment Manufacturing. CEEM founder Tyler Hague continues as CEO.

Bolt Technology Corp has acquired all of the outstanding stock of SeaBotix, designer and manufacturer of the ‘Little Benthic Vehicle’ ROV systems.

Dover has acquired manufacturer of down-hole rod pumps Harbison-Fischer for $402.5 million. Harbison-Fischer will become part of Norris Production Solutions, an operating unit within Dover’s Fluid Management segment.

Global provider of specialized payment systems to fleets, major oil companies and petroleum marketers FleetCor Technologies has completed its IPO of 14,576,250 shares at $23 a pop.

GE Intelligent Platforms has completed the acquisition of SmartSignal, a software company specializing in remote monitoring and diagnostics solutions for oil and gas and other verticals. SmartSignal’s solutions are to be aligned with GE’s ‘Proficy’ architecture.

GlobaLogix has acquired Blast Energy Services’ satellite division, allowing it to integrate satellite communications into the design and installation of its data gathering and reporting systems.

GSE Systems has bought EnVision Systems for $1.2 million cash plus an extra $3 million contingent amount.

Kayne Anderson Energy Funds has sold Energy Contractors to Nabors Well Services.

LianDi Clean Technology has established a new subsidiary, HongTeng WeiTong Technology, headed by Jian Feng Yang, to develop software for petrochemical companies.

Neos GeoSolutions has closed its $60 million financing round. New investors include Energy Capital Group and Bill Gates.


Sales, contracts, partnerships and deployments

SolArc, Amor Group, Cameron, CGGVeritas, Venture Information Management, eLynx Technologies, Emerson Process Management, Saudi FAL, Houston Technology Center, Oiltech Investment Network, Inova Geophysical, Intermap, OYO Geospace, Reservoir Group, ESG, FMC, Fluor Corp.

Swiss ‘virtual’ integrated petroleum and petrochemicals company Kolmar Group has selected SolArc’s RightAngle integrated application suite to manage its global trading and risk operations.

Oil and gas logistics specialist Asco Group has awarded Amor Group a £15 million, five year contract for the management of its information systems.

Cameron has won a $74 million order from Petrobras for 27 subsea trees and related equipment.

CGGVeritas and BG Group have signed a technology cooperation deal for the development of ‘advanced seismic solutions’ including broadband acquisition and processing for sub-salt, deep targets and unconventional gas.

Hess has awarded Venture Information Management a contract for the improvement of data quality on approximately 20,000 West African wells. Venture will be using its in-house developed ‘V-KIT’ data quality tool kit.

eLynx Technologies has signed a deal with Patara Oil & Gas for its web based hosted SCADA service. The system will monitor and control 96 recently acquired wells and three gas processing facilities in the Rocky Mountain Paradox Basin.

Iraq’s South Oil Company has awarded Emerson Process Management a contract to provide crude oil metering systems and related technologies for the new Al-Basra Oil Terminal.

Saudi FAL and Emerson have opened a 9,000 square meter manufacturing facility in Jubail, Saudi Arabia. The facility has an annual production capacity of 2,500 Fisher control valves and up to 6,000 Rosemount transmitters.

The Houston Technology Center and Oiltech Investment Network have formed a strategic partnership to ‘accelerate the growth’ of technology companies in the exploration, production, and oilfield services sector.

Inova Geophysical Equipment has won a $15 million order from China National Petroleum Corp. unit BGP. The deal included over 20,000 channels of its ‘Scorpion’ cable-based recording system. The company has also sold 13,000 channels of its Aries II land recording system to Dubai-based Terraseis. The deal includes 39,000 SM-24XL geophone strings from Ion Geophysical’s Sensor unit.

Intermap Technologies has won a $2 million contract to supply high-resolution digital elevation models to an unnamed ‘large producer of natural gas’ in the United States.

OYO Geospace has won a $7.4 million contract from BGP for 7,000 single-channel GSR wireless seismic data acquisition units.

Reservoir Group has teamed with Canadian Quest Coring for distribution of its large-bore wireline coring technology.

Engineering Seismology Group has successfully completed what is claimed as the ‘largest microseismic hydraulic fracture monitoring program to be conducted in Canada.’ The project, for Nexen, in the Horn River Basin, consisted of real-time monitoring for a total of 143 fracture stages in eight horizontal wells.

FMC Technologies has signed an $80 million deal with Total E&P Angola for the manufacture and supply of subsea production equipment for the ‘Girassol Resource Initiative Project.’

Santos has awarded Fluor Corp. a $3.5 billion engineering, procurement and construction contract for its Gladstone Liquefied Natural Gas project in Queensland, Australia.


Standards Stuff

POSC/Caesar, Fiatech, NIST, Madagascar, Energistics, Inspire, Joint Research Centre.

The POSC Caesar Association and FIATECH have published the business plan for a Joint Operational Reference Data (JORD) project that sets out to leverage the ISO 15926 plant and process data standard’s shared reference data. More from www.oilit.com/links/1101_22.

The US National Institute of Standards and Technology (NIST) is to develop standards for key technologies including cloud computing, emergency communications and energy saving. More from www.nist.gov.

Jeff Godwin has released a ‘pure-Python’ graphical user interface for the Madagascar open source seismic processing library. More from www.oilit.com/links/1101_23.

Energistics has rolled out the ResqML V1.0 specifications, including a Business Overview, Use Case and Technical Usage Guides. ResqML addresses the transfer of geophysical, geological, and engineering structural data between software applications used in reservoir modeling and simulation. It builds on the earlier WitsML and ProdML standards, adding storage for cellular data in an attached HDF5 file. More from www.energistics.org.
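The split between descriptive metadata and bulk cellular arrays is the heart of that design. Here is a minimal sketch of the pattern in Python with h5py; the file name, dataset path and attribute are hypothetical and do not follow the official Energistics schema.

    # Sketch only: metadata travels as XML while bulk cellular arrays live in a
    # companion HDF5 file. Names below are illustrative, not the ResqML schema.
    import numpy as np
    import h5py

    porosity = np.random.rand(50, 40, 30)            # one value per grid cell

    with h5py.File("reservoir_model.h5", "w") as f:
        dset = f.create_dataset("grid/porosity", data=porosity, compression="gzip")
        dset.attrs["uom"] = "fraction"                # unit of measure as an attribute

    # The XML part of the exchange would reference "reservoir_model.h5" and the
    # "grid/porosity" path so that receiving applications can locate the array.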

The EU open source (geographic) metadata editor (Euosme) creates Inspire-compliant metadata in 22 European languages. Euosme was developed by the Joint Research Centre as part of the EuroGeoss project. The tool, written in Java using the Google Web Toolkit, embeds the Inspire metadata validator service (www.oilit.com/links/1101_23) and can be downloaded from www.oilit.com/links/1101_24.


Safety first ...

‘Step Change’ spill initiative, helicopter ‘multilateration’ for North Sea, flexible pipe integrity report.

‘Step Change in Safety,’ the UK’s offshore safety initiative, has committed to a 50% reduction in the number of accidental hydrocarbon releases. Such releases have almost halved since 1997 thanks to workshops, the sharing of best practice on asset integrity and the development of guidance documents. However, progress has stalled in recent years; hence the new initiative to ‘kick-start’ a further downward push on the statistics. More from www.oilandgasuk.co.uk.

A ‘ground-breaking’ (well we hope not literally) helicopter safety system has just gone live in the North Sea, backed by Oil & Gas UK and NATS, the UK’s air navigation service provider. The system claims a ‘world first’ in the operational use of wide area multilateration for tracking flights. Multilateration uses receivers on 16 offshore platforms to pick up helicopters’ transponder signals, tracking aircraft at a ‘much greater range than radars.’ Helicopters are now visible to air traffic controllers from takeoff to landing.
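Wide area multilateration infers an aircraft’s position from differences in the arrival time of the same transponder signal at several receivers of known position. A minimal 2D sketch of the idea in Python with NumPy and SciPy follows; coordinates and timings are invented for illustration and bear no relation to the actual NATS installation.

    # Toy 2D multilateration: estimate a position from time differences of
    # arrival (TDOA) at fixed receiver sites. All numbers are invented.
    import numpy as np
    from scipy.optimize import least_squares

    C = 299_792_458.0                                  # propagation speed, m/s
    receivers = np.array([[0.0, 0.0], [40_000.0, 5_000.0],
                          [10_000.0, 35_000.0], [45_000.0, 40_000.0]])
    true_pos = np.array([25_000.0, 18_000.0])

    dist = np.linalg.norm(receivers - true_pos, axis=1)
    tdoa = (dist[1:] - dist[0]) / C                    # measured TDOAs vs. receiver 0

    def residuals(p):
        d = np.linalg.norm(receivers - p, axis=1)
        return (d[1:] - d[0]) / C - tdoa

    estimate = least_squares(residuals, x0=np.array([20_000.0, 20_000.0])).x
    print(estimate)                                    # recovers true_pos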

A new report from Oil & Gas UK encourages the correct use of flexible pipe in the area of subsea and floating oil and gas developments. The SureFlex report, compiled by Wood Group Kenny, comprises updated guidance on flexible pipe integrity assurance and state-of-the-art analysis of pipe integrity technology. The report can be obtained from www.oilit.com/links/1101_7.


Going, going, green...

Energy saving at UAE Petroleum Institute, Quorum manages CO2, Open Source grid software.

The United Arab Emirates’ Petroleum Institute has selected Verisae’s Sustainability Resource Platform (SRP) energy management package for its campus, hoping to realize a 10% savings in energy consumption. The Institute will initially leverage Verisae’s ‘open’ web-based energy management and analytics solutions. The PI also plans to use Verisae as an educational tool for students. More from www.verisae.com.

Quorum Business Solutions’ Quorum TIPS Gathering now manages CO2 processes for enhanced oil recovery. The package is said to be the ‘only solution’ for life cycle CO2 management, from production and transportation to re-injection or flooding. The application tracks stakeholder entitlements and obligations along with production imbalances, offers clients a choice of ‘cashout’ options and handles complex joint interest billing. The enhancements were developed for ‘the world’s largest user of carbon dioxide for EOR.’ More from www.qbsol.com.

Green Energy Corp. has announced the ‘Total Grid’ open source community to ‘foster innovation and collaboration’ in Smart Grid-related software. The community will host Smart Grid projects including applications, protocols, and ‘Reef,’ the open source version of Green Energy’s commercial middleware, GreenBus. More from www.totalgrid.org.


Deloitte buys Altos, rolls into new MarketPoint offering

Center for Energy Solutions gets upgraded with MarketBuilder decision support and data.

Deloitte & Touche is expanding its ‘Center for Energy Solutions’ with the acquisition of ‘substantially all of the assets’ of Altos Management Partners and its sister software house MarketPoint. Altos, a consultancy to energy companies, was founded in 1995. Its MarketPoint unit is the developer of MarketBuilder, an analytic suite for energy market modeling and price forecasting. The acquisition provides Deloitte’s clients with ‘a better understanding of market fundamentals’ for energy commodities, including oil, gas and refined products.

Deloitte partner Andy Dunn said, ‘The acquisition is the foundation of a new Deloitte MarketPoint offering inside our Center for Energy Solutions. This will provide the industry with decision support solutions, including MarketBuilder, models, data and consulting services.’ MarketPoint founder Dale Nesbitt has now joined Deloitte. Nesbitt’s modeling tools include the North American Regional Gas Model, the World Gas Trade Model, the World Oil Model and the Western European Gas Model. More from www.oilit.com/links/1101_6.


Telular acquires SMARTank business from SmartLogix

M2M wireless tank monitoring business bought in $6 million plus deal.

Chicago-based Telular Corp. has acquired SmartLogix’ ‘SmartTank’ line of business in a deal valued at over $6 million. SmartLogix was previously the largest value added reseller of Telular’s TankLink tank monitoring solutions with 400 end-user customers and nearly 16,000 billable tanks. TankLink is a machine-to-machine (M2M) wireless solution for automated tank replenishment. TankLink monitoring helps eliminate emergency or unnecessary deliveries. Real-time, 24/7 tank level information is transmitted over the nationwide cellular network. Tank monitoring devices deliver alerts on re-order thresholds, critical product level, and product delivery notifications. Data is hosted in a TankLink operations center and can be accessed via an internet-based inventory management system.
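The alerting described, re-order thresholds, critical product level and delivery notification, amounts to simple rules evaluated against each transmitted reading. A minimal Python sketch of the idea follows; threshold values and names are hypothetical and not TankLink’s actual API.

    # Sketch of tank-level alerting: classify each transmitted reading against
    # configured thresholds. Field names and values are hypothetical.
    def classify_reading(level_pct, previous_pct=None,
                         reorder_pct=30.0, critical_pct=10.0):
        alerts = []
        if level_pct <= critical_pct:
            alerts.append("critical product level")
        elif level_pct <= reorder_pct:
            alerts.append("re-order threshold reached")
        # a sharp rise since the last reading suggests a delivery took place
        if previous_pct is not None and level_pct - previous_pct > 20.0:
            alerts.append("product delivery detected")
        return alerts

    print(classify_reading(8.5, previous_pct=9.0))     # ['critical product level']
    print(classify_reading(65.0, previous_pct=28.0))   # ['product delivery detected']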

Terms of the transaction include a purchase price of $6.0 million and up to $2.4 million in additional performance-related payments. The purchase price comprises $1.5 million cash and conversion of an existing receivable.

Telular estimates the deal will add $2.6 million of revenue and $1.4 million of net income before non-cash items to its consolidated results. Telular also updated guidance for fiscal 2011. CFO Jonathan Charak said, ‘Net income before non-cash items was originally forecast at $8.0-9.0 million for fiscal 2011. The acquisition means we are increasing this to $8.5-9.5 million.’ More from www.telular.com.


Clariant ticks REACH boxes

Phase 2 of EU chemicals safety program kicks-off.

REACH, the European Community’s program for ‘registration, evaluation, authorization and restriction’ of chemical substances saw its first round of registration close last month. REACH sets out to protect humans and the environment from dangerous compounds used in industry by identifying constituent substances. Manufacturers are required to register information on the properties of their chemicals in a central database run by the European Chemicals Agency (ECHA) in Helsinki. The Agency is working to evaluate ‘suspicious’ chemicals and is building a public database in which consumers and professionals can find information on hazardous products. Last month saw some 24,000 files submitted to the agency.

Swiss chemical giant Clariant claims to have led the field in the initial REACH phase, leveraging its toxicological, ecotoxicological and chemical know-how to lead 80 out of the 150 consortia and substance information exchange forums (SIEFs) submitting to REACH. The next REACH deadline concerns less hazardous substances and will close in June 2013. Complying with REACH is a prerequisite for a company’s ability to operate in Europe. Substances not registered by the deadlines set for each phase will not be allowed to be produced, imported or sold in the EU. More from www.oilit.com/links/1101_5.


Apache deploys GIS-based environmental monitoring

Spatial Energy develops ‘first of a kind’ satellite-based asset performance tracking system.

GIS data and software boutique Spatial Energy has created a remote sensing-based environmental monitoring program covering Apache Corp.’s global assets. Apache VP HSE David Carmony explained, ‘Environmental stewardship is central to our operations and reputation. We are using satellite imagery to implement an enterprise-wide environmental monitoring program and make our operations more efficient with on-going inventory monitoring.’

The system will be used for ‘pro-active’ environmental monitoring of land and offshore operations, leveraging an ‘enterprise-wide workflow’ and operational program that covers worldwide exploration and production activities along with acquisitions and divestitures.

Spatial Energy president Bud Pope added, ‘This first-of-its-kind energy industry initiative will monitor the environmental impact and safety of Apache’s operations in areas of current and historical development, as well as in environmentally sensitive areas. We will provide the latest high resolution satellite and aerial data, advanced algorithms and delivery methods, along with the processing support needed to meet Apache’s requirements.’ More from info@spatialenergy.com.


BP Shipping & CapRock Communications

Deal extends VSAT contract from Gulf of Mexico to global shipping operations.

BP Shipping has selected service provider Harris CapRock Communications to deploy its ‘SeaAccess’ very small aperture terminal (VSAT) solution on its global tanker fleet. The turnkey VSAT services will extend BP’s corporate IT network and applications to vessels and provide ‘attractive crew welfare solutions.’

BP’s Wasim Kayani said, ‘We are increasing the services we can provide to our vessels while lowering the total cost of ownership. SeaAccess provides the platform we need to get the most out of our communications.’

SeaAccess extends BP’s corporate office capabilities, enabling ship captains to send real-time reports on vessel operations, logistics and routes. The service will also support crews’ telephone traffic.

BP conducted extensive testing at CapRock’s facility in Aberdeen. The engineering team developed a time division multiple access (TDMA) demonstration with test circuits for BP’s experts to conduct real-time data transfers, make telephone calls and see CapRock’s wide area network optimization service in action.

CapRock president Peter Shaper added, ‘Currently we’re providing communication services to BP’s offshore assets in the Gulf of Mexico and in West Africa. BP continues to count on us for reliable communications with critical operations.’ Contract term is 39 months. More from www.caprock.com.


SocGen rolls-out ‘state-of-the-art’ IT platform

Bank’s commodity arm acquires trading platform from RBS Sempra.

Societe Generale’s corporate and investment banking unit has extended its energy offering with acquisitions in North America and Europe. Assets acquired from RBS Sempra Commodities include a ‘state of the art’ IT platform, developed over the last twenty years, which is now claimed to be ‘one of the industry’s benchmark platforms.’ Some 130 energy traders will join SocGen’s commodity market activities, led by Edouard Neviaski.

SocGen Americas CEO Diony Lebot said, ‘With this new development of our commodities business, we will be able to offer the full range of solutions to our North American clients in the energy sector. Clients will benefit from the expertise of Jacqueline Mitchell and Michael Goldstein from RBS Sempra, and from RBS Sempra’s talent and information technology.’

In Europe, following the end of its partnership with GDF Suez last September, the bank has developed its natural gas and electricity market capabilities through the joint venture Gaselys. Around fifty senior salespeople, analysts, traders, logistics staff, and support functions have already been recruited. More from www.socgen.com.

