I was surprised by the content of the Digital Plant event held in Houston late last year (review on page 6 of this issue) and I don’t think I was alone. I was expecting to see and hear from equipment vendors, but the show had been hijacked by the engineering design community—many of whom seem to have been drinking from the ISO 15926 cup. The show’s centerpiece was the iRing demonstrator showing the exchange of plant design data between Aveva, Bentley, IBM and the CIEAM*.
Having just attended the Fiatech meeting in The Hague, I confess to a little ISO 15926 fatigue—which, if you are a regular reader, you may share. But this was a great opportunity to check out the progress made on the engineering data exchange front, and on the success or otherwise of the semantic technology under the hood.
In my previous columns, I have been pretty dubious about the claims made for ISO 15926 and semantics. This stems from my card-carrying ‘old skeptical fart’ status as an observer of previous upstream interoperability initiatives. What appears to be going on in the iRing demonstrator is a file exchange leveraging a standard list of parts. Asking around, I did not get the impression that the data was transferred in RDF—although in a subsequent LinkedIn discussion, I was corrected: Bentley does leverage RDF/OWL in its data exchanges.
Another iRing protagonist put it to me that RDF is not particularly germane to the interoperability effort, which is achievable using other formats like XML. But this then means that interoperability is achieved by everyone using the same format. This is what I call ‘vanilla interop.’
The idea behind the semantic web is not that everyone uses the same data format, but rather that data is exposed in such a way that it is ‘discoverable’ and can be ‘linked’ with other data sets. How to publish linked data is explained in a paper from the Web-based Systems Group at the Free University of Berlin.
Data discovery is an interesting idea. It suggests a casual use of a data set by a third party. The approach is used to ‘mash up’ data from different providers—for instance from public data sources like dbPedia and GeoNames. This informal use is rather different from ‘industrial’ usage where more in-depth knowledge of the data sources is usually both available and necessary to achieve what is required. Hence the more rigid iRing approach.
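The mashup idea can be sketched in a few lines. The triples and identifiers below are invented for illustration (the GeoNames ID and the property names are not real dbPedia or GeoNames terms); the point is that two independently published datasets, joined on a shared ‘owl:sameAs’ link, yield facts that neither holds alone:

```python
# Hypothetical mini 'linked data' mashup. Two datasets are published
# separately; a shared URI lets a consumer join them without either
# publisher knowing about the other.

# Triples from a dbPedia-like source: (subject, predicate, object)
dbpedia = [
    ("dbpedia:Ekofisk", "rdf:type", "dbpedia-owl:OilField"),
    ("dbpedia:Ekofisk", "dbpedia-owl:operator", "dbpedia:ConocoPhillips"),
    ("dbpedia:Ekofisk", "owl:sameAs", "geonames:7729881"),
]

# Triples from a GeoNames-like source (the ID is made up)
geonames = [
    ("geonames:7729881", "geo:lat", "56.55"),
    ("geonames:7729881", "geo:long", "3.21"),
]

def mashup(a, b):
    """Follow owl:sameAs links in dataset a into dataset b."""
    merged = list(a)
    for s, p, o in a:
        if p == "owl:sameAs":
            # Any fact about the linked URI is also a fact about s
            merged += [(s, p2, o2) for s2, p2, o2 in b if s2 == o]
    return merged

graph = mashup(dbpedia, geonames)
# Ekofisk now has coordinates sourced from the 'other' dataset
print([(p, o) for s, p, o in graph if s == "dbpedia:Ekofisk" and p.startswith("geo:")])
# [('geo:lat', '56.55'), ('geo:long', '3.21')]
```

Real linked data uses full URIs and an RDF store rather than Python tuples, but the join-on-shared-identifier mechanics are the same.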
My semantic musing was heightened with the arrival of Honeywell’s ‘Intuition’ on the upstream semantic scene (see this month’s lead). Honeywell, a major automation contractor, is taking an interest in the upstream! And in the semantic web!! And in Microsoft’s MURA**!!!
Another interesting contribution to the semantic debate came across my radar recently in the form of a paper titled, ‘A description logic primer.’ Description what? I hear you say. Let me explain. I am indebted to my friend Bertrand du Castel, Schlumberger fellow, initiator of the first online oilfield ontology and author of the oeuvre ‘Computer Theology’ (CT). I have so far failed to find the strength to read all of CT, let alone write a review, but my take-home from this ambitious homily to things semantic is that the current interest in ontologies stems from a late 20th century ‘breakthrough’ in something called description logics.
The Description Logic primer*** by three Oxford University researchers looked like a promising starting point to learn more about the breakthrough. The 16-page Primer sets the scene, defining description logics (there is more than one) as ‘a family of knowledge representations used in ontological modeling [...] that provide one of the main underpinnings of the web ontology language, OWL.’ Description logics (DLs) provide ‘formal semantics [that] allows humans and computer systems to exchange ontologies without ambiguity [...and...] makes it possible to infer additional information from the facts stated in an ontology. This important feature distinguishes DLs from other modeling languages such as UML.’
Apart from the fact that the definition assumes an understanding of ‘ontology’ which you may or may not possess, this looks promising and echoes the semantic web ideals of unambiguous information exchange and ‘reasoning.’
The Primer introduces the building blocks of DL ontologies using symbols indicating, for example, ‘inclusion’ as in
Mother ⊑ Parent.
The Primer, using the ‘SROIQ’ DL, shows how DLs handle incomplete information and the ‘open world’ assumption, and how adding more axioms like the one above constrains reasoning. A segue from SROIQ to OWL involves both a slimming down to a ‘lite’ DL and an expansion to a more web-friendly OWL. Those with access to Oracle 11g may like to try the OWL RL reasoning extension to SQL data.
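For the curious, the kind of inference the Primer describes can be caricatured in a few lines of Python. This is a toy, not a DL reasoner—the class names and the single assertion are invented—but it shows how an axiom like Mother ⊑ Parent lets a machine conclude Parent(mary) from the bare fact Mother(mary):

```python
# Toy flavor of DL reasoning. TBox = terminological axioms (C ⊑ D),
# ABox = assertions about individuals. Names are illustrative only.

subclass_of = {            # TBox: Mother ⊑ Parent, Parent ⊑ Person
    "Mother": "Parent",
    "Parent": "Person",
}
assertions = {"mary": "Mother"}   # ABox: Mother(mary)

def classes_of(individual):
    """All classes an individual belongs to, following ⊑ upward."""
    inferred = set()
    c = assertions[individual]
    while c is not None:
        inferred.add(c)
        c = subclass_of.get(c)   # climb the inclusion hierarchy
    return inferred

print(sorted(classes_of("mary")))  # ['Mother', 'Parent', 'Person']
```

A real reasoner handles far more than chains of inclusions—disjointness, cardinality, open-world negation—but the principle of deriving facts not explicitly stated is the same.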
Does DL constitute a breakthrough? The promise of unambiguous data exchange and reasoning sounds fantastic. Mashups of disparate ‘open data’ sources look promising. Yahoo SearchMonkey’s use of multiple vocabularies in dataRSS feeds is a real-world use case, and shows how terminological ambiguity can be minimized. Whether it leads to ‘reasoning’ is another question. What is reasoning anyhow? Is a ‘select’ statement reasoning?
OK, I am now well and truly out of my depth. Fortunately I can see the bottom of the column approaching. You may want to pursue these ideas on the very semantic W3C oil, gas and chemicals business group.
* Center for Engineering Asset Management.
** Microsoft Upstream Reference Architecture.
*** By Markus Krötzsch et al., January 2012. Cornell University Library arXiv:1201.4089v1.
So far, the automation contractors have been absent from the ‘intelligent operations’ marketing frenzy. This is surprising in that it is their equipment that powers the digital oilfield. The situation is about to change for Honeywell which is embarking on a major rebrand of its Matrikon line of business, which is morphing into a new intelligent operations offering targeting the upstream oil and gas business.
A foretaste of the new ‘Intuition’ offering comes in an intriguing whitepaper authored by Honeywell’s principal architect Jay Funnell, titled, ‘Demystifying the Intuition semantic model.’ The whitepaper addresses the siloed information that bedevils the digital oilfield, with key data scattered across ERP systems, maintenance applications and historians. Effective operations intelligence requires seamless access to all of the above, which in turn implies familiarity with a variety of industry data protocols from ISA-95, through OPC to ISO 15926 and beyond.
Honeywell’s Intuition solution embeds a semantic model a.k.a. a ‘virtual repository’ where data from isolated silos is ‘stitched together’ on demand. Data stays in its original sources and formats, and Intuition performs the federating ‘acrobatics.’
The white paper provides an introduction to semantic modeling, showing how OWL/RDF offers a flexible alternative to the relational database. A semantic ‘wrapper’ around the data silos exposes enterprise data in a consistent manner, allowing on-the-fly translation of different naming conventions and data idiosyncrasies. Intuition ‘creates a layer on top of native data sources and re-shapes them into a standard model.’ Intuition, it is claimed, lets companies ‘embrace standards in a non-standard environment.’ Dashboards and reports can then be written ‘as if the native data was using ISA-95, ISO 15926, Mimosa or other standards.’
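To make the ‘virtual repository’ idea concrete, here is a minimal sketch—emphatically not Honeywell code, with all field and tag names invented—of how per-silo mappings can translate native records into one canonical model on the fly:

```python
# Illustrative federation sketch (not Intuition itself): each silo
# keeps its native schema; a per-silo mapping renames fields into a
# canonical model and records are stitched together on a shared key.

historian = {"TAG_NAME": "K-101", "VAL": 87.2}           # process historian
maintenance = {"equip_id": "K-101", "status": "open WO"}  # CMMS record

MAPPINGS = {   # silo field -> canonical field (assumed names)
    "historian":   {"TAG_NAME": "equipment", "VAL": "reading"},
    "maintenance": {"equip_id": "equipment", "status": "work_order"},
}

def federate(silos):
    """Stitch silo records together on the canonical 'equipment' key."""
    view = {}
    for silo, record in silos:
        canonical = {MAPPINGS[silo][k]: v for k, v in record.items()}
        view.setdefault(canonical["equipment"], {}).update(canonical)
    return view

view = federate([("historian", historian), ("maintenance", maintenance)])
print(view["K-101"])
# {'equipment': 'K-101', 'reading': 87.2, 'work_order': 'open WO'}
```

The data never leaves its silo of origin in this scheme; only the mapped view is materialized, which is the ‘federation rather than replication’ argument in a nutshell.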
Honeywell also reveals that Intuition is ‘closely aligned’ with the Microsoft Upstream Reference Architecture (MURA). This linkage appears to be based on the fact that MURA’s marketing-driven specifications include ‘semantic data services.’ This is the first time that MURA has been associated with the RDF/OWL technology stack. The white paper features impressive slideware showing how Intuition dovetails with MURA. Other MURA members may be intrigued by this marketing land grab! Intuition is the fruit of ‘Project Nemo,’ a rebranding of Honeywell’s Matrikon line of business. Those interested in learning more about Honeywell’s semantic-based digital oilfield are invited to sign up to express interest. More comments on the state of play of semantics in oil and gas in this month’s editorial on page 2.
Engineering design software house Altair is offering users of its HyperWorks platform access to a ‘complexity quantification and management system’ provided by Como, Italy-based Ontonix. OntoNet will be available via Altair’s HyperWorks partner alliance (HWPA). Ontonix founder and CTO Jacek Marczyk observed, ‘High complexity makes for inefficient design. Our technology lets engineers conceive new solutions while quantifying and managing the associated risks.’
In a separate announcement, Ontonix has applied its real time complexity monitoring solution to process refinery sensor data and produce a ‘single holistic KPI used to anticipate compressor faults.’ A formula established by Ontonix proposes that the product of complexity and uncertainty in a system equates to ‘fragility.’ Such fragility may appear in a process, plant or indeed in the economy at large.
Ontonix has applied its fortune telling technology to forecast the ‘dissolution’ of the Euro in Q3 2013 and the ‘demise’ of the US in 2018. Altair clients include AMEC, Chevron, Conoco-Phillips, Landmark, Petrobras, Schlumberger, Shell and Technip. More ‘complexity,’ risk and interminable blogging from Ontonix. More on HyperWorks from Altair.
Oil IT Journal reported on the expanding use of geomechanics across a range of upstream activities in our December 2008 editorial. We were therefore very interested to receive a review copy of Jean-François Nauroy’s new book* and to see if the bases were covered.
Chapter one, ‘elements of rock mechanics,’ takes up about a third of the book, offering a reasonable balance between fairly hairy math and explanatory narrative in what appears to be an exhaustive presentation of the basics. Logging, leak-off test analysis and core measurements are treated in a clear if introductory manner (the book includes 15 pages of references for those after more). A section on modeling offers more advanced math, but no discussion of algorithms or software.
Chapter two, ‘drilling and production’ discusses drillability, wellbore stability, sand production and hydraulic fracturing with again a good balance of narrative and math. Narrative is pitched at a level suitable for execs or engineers from other disciplines who wish to bone up on geomechanics. As for the level of the math, that depends on what you did at school (this reviewer dropped out of math shortly after div grad and curl dropped in).
Chapter three, ‘geomechanics and the reservoir’ focuses on geocellular modeling and fluid flow simulators. Here there is more focus on applications with some illustrations of what appears to be a pet theme—that of coupling of geomechanical effects and production—causing ‘constants’ like permeability to change over time.
Environmental aspects are treated with illustrations of decametric subsidence in California’s San Joaquin Valley (caused by farming, not oil and gas production) and less spectacular subsidence at Groningen and Ekofisk (which is due to oil and gas). Well abandonment, CO2 sequestration, and reservoir monitoring techniques are mentioned briefly in the final few pages. Geomechanics provides a good introduction to the basics along with broad brush coverage of applications in this increasingly important field. More from Editions Technip.
* Editions Technip, 2011, 224p. $131. ISBN 9782710809326.
‘How collaboration with records management can simplify your job;’ a new white paper from the American Records Management Association (ARMA) authored by Brian Barnier of ValueBridge Advisors, offers help to IT leaders seeking to align IT with records and information management (RIM). The whitepaper suggests a practical path to a better use of proven industry practices in both records and information management. Barnier proposes a ‘streamlined, cross-silo’ approach that addresses the needs of the business by targeting a ‘stable IT solution design point.’ The IT/RIM interface is critical in companies faced with legal ‘e-discovery’ demands and compliance requirements. These have ‘spidery legs’ that touch everything, catalyzing IT’s coordination with RIM professionals. Elsewhere, product development or geographic expansion drives new uses of records and information. These may involve issues that are visible to IT leaders such as cloud computing, social media, mobile devices and data migration.
RIM’s involvement in such projects is often focused on compliance and legal issues, usually motivated by evidence or retention needs. This is an opportunity for IT to engage RIM leaders and collaborate on business-driven projects. A Forrester Research/ARMA survey carried out late last year determined that only 20% of RIM leaders report to IT (simplifying cooperation). In the other 80%, CIOs need to go through other executives to tap into the expertise of the RIM team. The six page whitepaper offers a lot of advice on getting these two communities to understand each other and collaborate. In particular, Barnier offers a mapping between the RIM framework of ARMA’s generally accepted recordkeeping principles (GARP), the information governance maturity model and corresponding IT-focused guidance such as COBIT, TOGAF and ITIL.
If you are a geologist or engineer, chances are that you at some time in your education watched sand and silt being swashed around in a flume tank. French geostatistics specialist Geovariances is now offering an opportunity to manufacture sedimentary bodies digitally with ‘Flumy,’ a new geological modeler for meandering channelized systems. Current approaches to geological modeling fail to reproduce depositional heterogeneity and sand body continuity. Flumy is claimed to faithfully capture reservoir continuity and heterogeneity, making for more accurate flow simulation and history matching.
Flumy’s geological concept driven algorithm combines stochastic and process-based approaches, using geological parameters derived from logs and a priori knowledge of the depositional environment. Forward modeling incorporates channel migration and erosional truncation. Sedimentary bodies such as point bars, mud plugs and crevasse splays are generated. Models can be quality controlled as they are created with Isatis’ 3D Viewer. Flumy originated in research at the Paris School of Mines and is a complement to Isatis’ multi-point statistics and pluri-Gaussian simulators. More from Geovariances.
Speaking at the Oil & Gas Producers Association’s Geomatics Committee meeting late last year, Cairn Energy’s Paul Nolan endorsed the OGP’s Geospatial Integrity of Geoscience Software (GIGS) review process (Oil ITJ October 2011) and announced that ESRI has committed to perform a GIGS self-assessment of its forthcoming ArcGIS 11 release.
Chairman Richard Wylde (ExxonMobil) outlined ongoing work in the field of oil spill response and collaboration with OGP’s Metocean and Environment Committees. Wylde also reported from the European Space Agency-sponsored Oil and Gas Earth Observation (OGEO) workshop in Frascati last November where 200 earth observation and oil and gas industry specialists met to discuss plans for an oil spill response joint industry project. An OGEO website is planned to disseminate earth observation knowledge into the OGP community. Wylde also reported take-up of GIGS in aviation and defense via the Open Geospatial Consortium.
UK-based Information Processing Ltd. (IPL) has been working with Statoil on a data governance framework designed to manage Statoil’s master data. The project centered on the development of a novel ‘model-driven’ framework for management of Statoil’s master data, including ‘technically complex’ subject areas relating to its upstream oil and gas business.
Following its international expansion, Statoil was faced with the challenge of managing its master data across its global operation. Project scope includes business-related subject areas like customer and product along with more technically complex master data required for oil and gas production operations.
IPL helped Statoil make a business case for master data management, established the data landscape and engaged with process owners and executives to define ownership, governance and stewardship responsibilities. The framework has now been implemented across Statoil’s global operation and the program has been ‘socialized’ across the organization.
IPL director Chris Bradley claims its approach avoids the ‘field of dreams’ approach to MDM by delivering ‘just in time’ master data services that support the business. Bradley will be presenting a keynote address on the project at the upcoming Enterprise Data World 2012 conference and is conducting a workshop on information management challenges and opportunities in oil and gas. He also chairs the Embarcadero ER/Studio user group. More from IPL.
The Information Store (iStore) has announced the extension of its PetroTrek data management flagship to oil and gas production surveillance and optimization. PetroTrek OPS addresses the digital oilfield challenge of data access and integration with a web-based offering that connects operations and field staff to relevant information in real-time charts, key performance indicators, and maps. The workflow-driven applications guide users through activities such as well performance reviews and maintenance opportunity prioritization. The announcement was made at the recent Microsoft Global Energy Forum in Houston and was endorsed by Ali Ferling, head of Microsoft’s oil and gas segment.
Curiously though, the release made no mention of Microsoft’s chimerical upstream reference architecture (MURA), though iStore was an early backer of MURA. We asked iStore’s Ben Parker who assured us that ‘iStore is still an active participant in the MURA initiative which we support in a number of ways, including development of products and solutions that incorporate open standards, such as PPDM which is now part of each PetroTrek offering. In fact, iStore is the only software vendor that has successfully implemented the full PPDM data model (not just a thin footprint) along with full referential integrity on Microsoft SQL Server.’ More from iStore.
Oil and Gas UK kindly provided Oil IT Journal with review copies of three new publications—Guidelines on relief well planning for subsea wells, Guidelines on competency for well personnel and an Example competency profile.
The relief well planning guide was produced by the well lifecycle practices forum (WLPF) and covers compliance with the UK’s oil pollution preparedness response regulations. These now mandate advance relief well planning. The 22 page document provides a top level checklist of well complexity assessment, relief well design and planning.
The competency guidelines, also from the WLPF, distill the recommendations of the oil spill prevention and response advisory group (OSPRAG) in a 30 page document. The Example profile is another 30 pages. The guidelines strike a fair balance between narrative and checklist, are well written and relatively acronym free. More from Oil and Gas UK.
VSG has announced Avizo Fire 7, a 3D visualization and analysis package for digital rock physics and core analysis. The new release adds tools and workflows for analyzing, modeling and characterizing rock samples, from pore-scale to core-scale. An XLab Hydro tool computes absolute permeability—VSG.
The 9.5 release of Canary Labs’ Enterprise Historian includes replication and web services data access. A new mobile app provides for handheld data delivery. An Excel add-in and ODBC interface enable third party business systems to access data via SQL style queries—Canary Labs.
FaultSeal’s FaultRisk 4.0 has been ported to Java and modularized so components can be marketed separately. A FaultRisk plug-in for Landmark’s DecisionSpace Desktop has been announced—FaultSeal.
Fugro-Jason’s 3.3 release of PowerLog adds multi-interpreter collaboration, capillary pressure analysis and enhanced presentations—Fugro-Jason.
INT’s J/GeoToolkit 3.2 also leverages Java for portability. The new release adds new chart types including rose diagrams and polar plots, support for CSEGY format seismics and the DLIS well log data format. The contour libraries offer improved labelling strategies and collision avoidance—INT.
Baker Hughes’ 2011 SP1 Jewel Suite release claims enhanced data management, usability, performance and new multi-point statistics for facies modeling—Baker Hughes.
Neuralog has announced NeuraLabel, a continuous color laser label printer that meets all GHS and DOT requirements. NeuraLabel produces standardized labels for chemicals and hazardous materials that are resistant to abrasion, marine immersion, and UV exposure—Neuralog.
WesternGeco has released a log and seismic conditioning plug-in for Petrel. The plug-in includes log preparation for sonic calibration, wavelet extraction and synthetics generation—WesternGeco.
Terra 3E has released a plug-in for Petrel 2011. The Opus Terra toolbox builds proxy models of reservoir simulators, solves inverse problems and performs sensitivity and uncertainty analysis—Terra 3E.
Petris is about to release Winds DrillNET 2.0, adding usability enhancements, a default database repository, improved trajectory planning, casing stress checks and pore pressure prediction—Petris.
Imaging the complex subsurface of offshore West Africa or the Gulf of Mexico requires increasingly complex acquisition geometries and, above all, more traces. Turning the multi-terabyte acquired data into something usable by interpreters and engineers requires massive raw compute power.
The seismic imaging workflow is mission critical to Total and is increasingly seen as giving the company a competitive edge. Following years of collaboration with SGI, Total has now acquired a state-of-the-art high performance computer (HPC) that puts it in a leading position in terms of compute power. The machine, currently under construction at Total’s Jean Féger Scientific and Technical Centre in Pau, southwest France, is a 2.3 petaflop SGI ICE-X.
The ICE-X is deployed in an IP 115 ‘sandwich’ blade configuration that doubles processor and memory density. The ICE-X comprises some 100 racks and takes up approx. 70 sq m of floor space including power and memory—not counting disk storage. While the exact configuration has not been revealed, a single ICE-X rack can house 2,304 processor cores and provides ‘up to’ 22 teraflops.
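A quick back-of-envelope check (assuming the quoted per-rack figures apply uniformly across the installation, which SGI has not confirmed) shows the numbers hang together:

```python
# Sanity check of the quoted ICE-X figures: ~100 racks at 'up to'
# 22 teraflops and 2,304 cores each, versus the 2.3 petaflop total.
racks = 100
tflops_per_rack = 22
cores_per_rack = 2304

print(racks * tflops_per_rack / 1000, "petaflops peak")  # 2.2 petaflops peak
print(racks * cores_per_rack, "cores")                   # 230400 cores
```

So 100 fully populated racks land within a whisker of the announced 2.3 petaflops, suggesting the headline figure is a peak rather than sustained (Linpack) number.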
Total develops much of its own seismic code and uses SGI’s Management Suite and Performance Suite along with LSF—with an option on Altair’s PBS Professional for job scheduling and workload management. SGI interim CEO Ron Verdoorn observed, ‘The need for compute-intensive data processing in the oil and gas industry increases constantly. With data files exceeding ten petabytes, technological innovation for seismic imaging relies on both HPC and storage architectures. Both are areas where SGI offers a complete, integrated solution including services.’
Total is active in the oil and gas HPC community—with its head of seismic processing, Henri Calendra, a keen advocate for high end architectures including GPUs for seismic processing. But the release makes no mention of hybrid GPU-based computing. This could well be because of the imminent arrival on the HPC market of Intel’s ‘Knights Ferry’ multi-core HPC on a chip a.k.a. ‘MIC’ (Oil ITJ October 2011). These are scheduled for release later this year and promise a more straightforward programming paradigm than NVIDIA’s current CUDA offering.
The ICE’s operating system is Novell/Suse Linux Enterprise Server 11. SGI gave up on its own Irix OS a while back and now works with Novell on a scalable version of the open source operating system. As of last November, Linux-based operating systems make up around 97% of the Top500 machines. Only one machine in the November 2011 list ran Microsoft’s ‘high performance’ Windows Server, coming in at a now modest 230 teraflops.
Total’s system is currently under construction with the likely first flop scheduled for April 2012. The reported 2.3 petaflop performance would put the machine in the top 5 of the current Top500 listing. But HPC is a rapidly evolving area. Total’s claim to be N° 1 in corporate HPC is credible, but it compares a future machine with current petaflop architectures already rumored to exist chez other majors and seismic processing shops like WesternGeco and CGGVeritas. Perhaps Total’s openness will spark off some more announcements from other owners of ‘private’ supercomputers. More from SGI.
The 17th (and seemingly the last) Digital Plant event was held late last year in Houston. We already reported on Katherine Frase’s entertaining presentation of IBM’s Watson with its potential application of artificial intelligence to oilfield data. This was followed by presentations from plant IM vendors, offering an interesting contrast in marketing techniques.
Anne Marie Walters (Bentley) was exhaustive, rolling in Pointools’ LIDAR/point cloud data acquisition, iPad apps for 3D model viewing developed with Bluebeam’s Revu, the SACS finite element modeler (another acquisition) and a ‘competition through collaboration’ paradigm focused on Bentley’s ISO 15926 OpenPlant. Joseph Krol made a more straightforward pitch for Siemens’ plant information management offering focusing on Comos, a database for all plant data. No mention at all of the XHQ operations intelligence platform—marketed by a separate division!
The ‘future technology’ roundtable heard from Tad Fry of Anheuser-Busch, which is trying to rationalize its core software. The brewer has virtualized its tens of thousands of servers and moved to a thin client infrastructure—reducing its energy needs and costs by 50%.
Jorge Vanegas (Texas A&M) made a spirited attack on some obstacles to the roll out of new, ‘disruptive’ technologies. Knowledge custodians are often a barrier to information sharing. There is a need to ‘de-siloize’ the knowledge resource base—and particularly to open it up to academia. Moreover there is a whole generation of knowledge workers leaving industry who are ‘not allowed to teach’ and whose case histories are not available. Another sacred cow is the fixed schedule for classes which must be ‘totally obliterated.’ Vanegas cited Stanford’s global distributed teaching initiative as a ‘great success.’
The panel was invited to comment on their strategy for the ‘Cloud’ with some interesting results. For Ray Cline (formerly with SAIC, now at the University of Houston), ‘The Cloud model has been around for a long time, it is called ‘outsourcing.’ It is just about making data available to external stakeholders.’ Cline, a Dropbox fan, is skeptical about current implementations and suggests we ‘wait for Cloud 3.0.’ Siemens’ Ulrich Loewen was equally dismissive—putting the Cloud ‘somewhere between bastard and innovation.’ Collaboration is all very well but it is the process that drives IT (not vice versa) and the process needs to be improved. The debate turned to the use of smartphones, iPads and other devices in the enterprise. Anheuser-Busch lets employees bring in their personal devices which access a guest wireless system locked down with technology from Zachry. Zachry’s Todd Sutton observed that when the board of directors bring in their own devices, IT may not have much of a say in the matter. Siemens has a ‘very strict’ IT strategy, at least in manufacturing. R&D may need more freedom. This can be a challenge to the younger generation which expects the latest technology.
Cliff Pedersen is CIO of the North West Redwater (NWR) partnership. NWR is active in the Alberta oil sands with an ambitious plan for a greenfield refinery that will convert bitumen directly to ultra low sulfur diesel fuel in a ‘one step’ upgrading process with CO2 capture and use in EOR. The plant will be built 45 km NE of Edmonton, Alberta over the next couple of years. Pedersen observed that the plant IM business is full of silos and that companies are still seeking the holy grail of a ‘single harmonious work environment.’ This presupposes software interoperability—missing from current operations and maintenance tools. Handover is still very problematical and is achieved at best with PDFs. ERP systems are ‘not connected to a damn thing.’ Today, the gap between ERP, real time, operations and maintenance is filled with custom ‘point to point’ software. What is needed is a data service bus transport offering federation rather than replication. Pedersen sketched out a bewildering standards landscape spanning engineering, procurement, construction and O&M. Pedersen’s analysis derived largely from BP’s Integrated Subsurface Information Systems project (Oil ITJ Jan 2007), a precursor to OpenO&M. A demonstrator of the NWR concept ran on the exhibition floor showing information exchange between IBM, Bentley, Aveva and CIEAM.
Rick Morneau, whose ‘VR Summit’ was co-located with Digital Plant, had a good stab at estimating the dollar value of virtual reality on an FPSO. Savings accrue from a speedier first oil, from better uptime due to ‘just in time’ maintenance and from a better understanding of dependencies. It is puzzling why VR is not more widely used. A virtual reality shoot-out began with Marc de Buyl’s presentation of VR Context’s work for Total on the Pazflor FPSO, using its WalkInside simulator and data portal. The Pazflor training simulator includes metadata for three million objects along with photo realistic scenery and real time navigation. WalkInside has proved itself on Pazflor and will soon be deployed on other Total assets. Harry Daglas took the stage to show Dassault Systèmes 3DVia, a VR system linked to plant and simulation data. An Exalead portal provides links to SAP. The system includes realistic scenes with water ripple, day/night, fire, smoke and a hand held flashlight for your avatar. But it was Dan Lejerskar’s demo of Eon Reality’s ‘iCube’ VR cave that most impressed us, with its attention to detail in a VR walk-around of a compressor station. Eon claims 160 installations of its iCube ‘Cave’ around the world. With 3D projectors available for $800, there is ‘no excuse not to buy.’ Interaction is also getting cheaper—gesture-based valve control is now available with Microsoft’s Kinect. Even better is the iPhone with its onboard GPS/gyroscope combo providing plant situational awareness.
Dow Chemical CIO Jerry Gipson described a move away from proprietary tools to allow for information sharing with stakeholders, citing Fiatech as an example (Gipson is chair of the Fiatech board of advisors). The ISO 15926 standard has been delivered, the challenge now is execution.
Ryan Cormac has been investigating the role of process, systems and tools at Worley Parsons (WP). WP delivers asset management services, particularly for brownfield redevelopment, through its ‘Improve’ offering. This is powered by an in-house developed enterprise management system (EMS) including a risk-based framework for project execution and cost-time-resource analytics. The EMS embeds tools from Aspen, Aveva, Autodesk, Bentley, Intergraph and Primavera. Palisade’s @Risk is used alongside Quest/Kbase for risked cost estimating. Cormac described the NISTIR 7259 Capital Facilities Information Guide as a good roadmap. He also referred to the NIST GCR 04-867 cost analysis of inadequate interoperability—although the EMS itself appears to show that interoperability may not be as big an issue as some pretend! Cormac concluded with a value triangle—showing that the main contribution to PMC project value was hosted information management (by WP, naturellement). In the Q&A Cormac was quizzed on post execution data handover. He observed that data handover should be limited to what is in scope. If the client only wants documents, ‘there is no point in doing a big database project.’
The handover issue was the subject of a roundtable moderated by Dow Chemical’s Bob Donaho, who opined that data handover may not be sexy, but it is critical and is no longer about paper documents—it is increasingly about re-use. Donaho referred to the Dow/Saudi Aramco Sadara project (formerly Ras Tanura—Oil ITJ May 2010) as being on the ‘bleeding edge’ of standards-based asset lifecycle management. ‘It is a huge challenge to make this happen.’ More on the future of Digital Plant from TradeFair Group.
The inaugural SMi Oil and Gas Cyber Security Conference was held in London late last year. In his presentation on securing industrial control systems against cyber threats, David Alexander (Regency IT) asked ‘How did we end up here?’, with vulnerable software and inconsistent application of patches. People are concerned about the risk of attacks but generally don’t know enough about mitigation. Meanwhile the business is pushing for remote access to information, fewer people and reduced costs.
Adrian Davis of the Information Security Forum has investigated supply chain security challenges. With the internet, intellectual property can go anywhere in the world quickly—exposing companies to data and information loss, as it is now easy to write compromising snooping software. The ‘cloud’ is perceived as a way of ‘getting rid of the IT guys.’ But companies are increasingly dependent on ‘just in time’ supply chains and are critically dependent on information. One problem is that ‘standards don’t talk to each other.’ Davis recommends planning for the endpoint at the beginning—i.e. for receiving data in a format you understand. In this context he suggests leveraging the draft ISO/IEC 27036 Part 3 ICT standard.
Adam Laurie (Aperture Laboratories) recommends ‘going in blind’ to a cyber audit—making no assumptions about what measures are in place. Ask device suppliers for source code or use reverse engineering. But if a supplier doesn’t want to give you the code, you should ask yourself why. Perhaps it is because the code is bad, random number generators are not random or perhaps there is stuff on the silicon that shouldn’t be there!
Justin Searle’s (Utilisec) live hacking demo over the phone showed how much information can be retrieved from a field device when you know how. Depending on what type of information is retrieved, a hacker can adapt his strategy to attack the infrastructure. The same technique used to break keys on Blu-ray/DVDs can be applied to crack encryption keys.
Danny Berko’s company, Waterfall Security Solutions, offers a range of security technologies including routing tables and physical/IT security. Waterfall offers a link between industrial control systems and the business network that, it claims, eliminates hacking. This is achieved by a unidirectional communications link using a laser and photocell combination. The Unidirectional Security Gateway allows the control system’s server to be replicated in the business environment but makes it impossible to write back. Waterfall is in partnership with OSIsoft, GE and Siemens. Berko cited a recent cyber attack on a Norwegian oil company as a wake-up call.
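The principle behind such a ‘data diode’ can be illustrated in a few lines of code: the plant side pushes snapshots across the one-way link while the business-side replica offers no write path at all. This is a minimal sketch of the concept only; class and tag names are invented and bear no relation to Waterfall’s actual products.

```python
# Sketch of unidirectional ("data diode") replication: the control-system
# side exports tag values one way; the business-side replica cannot write
# back. All names here are illustrative, not Waterfall's API.

class ControlSystemHistorian:
    """Source of truth on the plant network."""
    def __init__(self):
        self._tags = {}

    def write(self, tag, value):
        self._tags[tag] = value

    def snapshot(self):
        # Only this one-way export crosses the gateway.
        return dict(self._tags)

class BusinessReplica:
    """Read-only copy on the business network."""
    def __init__(self):
        self._tags = {}

    def receive(self, snapshot):
        # Data arrives over the one-way (laser -> photocell) link.
        self._tags = dict(snapshot)

    def read(self, tag):
        return self._tags[tag]

    def write(self, tag, value):
        # There is physically no return channel, so writes must fail.
        raise PermissionError("unidirectional gateway: no write-back path")

plant = ControlSystemHistorian()
office = BusinessReplica()
plant.write("PUMP_01.PRESSURE", 42.5)
office.receive(plant.snapshot())        # replicate plant -> business
print(office.read("PUMP_01.PRESSURE"))  # -> 42.5
```

Business applications read the replica exactly as they would the live server, which is why the approach is attractive for historian data.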
Joel Langhill (SCADA Hacker) reported a Stuxnet-type attack on a US water company the previous week, with a pump being turned on and off! Even control systems that are not connected to the internet are connected to other systems that are! A plant worker can unknowingly launch an attack, e.g. via a link in an infected PDF document. Because this happens on an internal network, it is ‘trusted’ and can utilize open communication channels. Siemens was the Stuxnet victim, but this year GE, ABB and Honeywell have all disclosed vulnerabilities. Half of reported vulnerabilities are in Microsoft Windows-based systems, half in embedded systems. But there is hope: these issues can be addressed by simple security measures, e.g. preventing the Acrobat attack with a web proxy, using a unidirectional gateway and applying good patch management.
A panel session debated the Stuxnet worm, developed, seemingly, by the Chinese, or was it Israel, perhaps with help from the Americans? The code quality is ‘amazing engineering.’ Iran was not the only victim, US and German systems were infected. The USB-key based attack showed that there is ‘no such thing as an air gap.’
Phil Jones described the increased security burden imposed on GDF Suez’s UK unit as it became an operator. ISO 27001 compliance is critical to keeping a license to operate. For GDF, the security team is ‘everyone in the organization.’ All need to understand security risks and the primary objective of ‘protecting the lives of the people who work for us’: in other words, security as an HSE issue. Networking everything is ‘not necessarily a good idea.’ Geologists, for instance, may have their own network. More from SMi Conferences.
Kjell Erik Drevdal, co-founder and former CEO of Badger Explorer, has resigned and is replaced by David Blacklaw.
Black & Veatch has appointed John Chevrette as President of its Management Consulting Division.
Richard Keyser has joined Boardwalk Pipeline Partners as Senior VP of Operations. He hails from NiSource Gas Transmission and Storage.
Susan Herrington has been appointed to Canary Labs as sales manager for oil and gas.
CyrusOne has opened a new 3,200 sq. ft. data center in Singapore.
Curt Terje Espedal is the new European Regional Manager for Emerson unit Roxar Software Solutions. He was formerly with Landmark Graphics.
Duncan Junor (Halliburton) and Steve Roberts (BP) have been elected Chair and Vice Chair of Energistics’ Board of Directors. Jana Schey has been promoted to Senior Director, Operations, and former Shell manager Joey Magsipok has joined as technical analyst.
EnergyNet has promoted Chris Atherton to VP Business Development. He is currently Chairman of the SPE business development study group.
Matthew Milne is to head-up Expro Group’s new operations base in Shekou, China.
Founder and CEO of Technoguide, Jan Grimnes, has joined the Board of ffA as a Non-Executive Director.
FreeWave Technologies has hired Greg Veintimilla as CTO and VP of Engineering. He was formerly President and CEO of Vintrana.
Founder and former CEO of Apollo Sales & Marketing Group Mike Eyre has joined Fugro-Jason as Global Sales Manager.
Gas Natural has named Dean A. Ward to the new post of Chief Information Officer.
Intertek has opened a new purpose-built petroleum laboratory in Takoradi Port, Ghana.
Tim Hackett has joined video conferencing solutions provider IOCOM as CEO.
Perry Harris is new President of JW Wireline Company, part of JW Energy Company. He hails from Halliburton.
Zac Nagle has joined KBR as VP Investor Relations and Communications.
Kongsberg Maritime is leading a new ‘Situation awareness and decision support tools for demanding marine operations’ project to reverse the trend of growing complexity in marine systems with a ‘user centered design approach.’
Bernie Wolford is Senior VP operations of Noble Corp.
Object Reservoir has appointed Mark Miller as Chief Technology Officer.
Charles Goodman has been promoted to President and CEO of P2 Energy Solutions, replacing Bret Bolin who served as CEO since 2008 and who will remain on the board. Goodman was formerly COO for Ventyx. Bolin is now President and CEO of Turaz, another Vista Equity Partners’ portfolio company.
Ann Dowling is to join the Board of BP as a non-executive director.
Adil Toubia is CEO of the Oil & Gas Division of Siemens Energy, succeeding Tom Blades. Toubia was most recently a partner in Energy Capital Group and prior to that held executive positions at GCC Energy Fund and Schlumberger.
Helle Kristoffersen has been appointed Senior VP Strategy & Business Intelligence at Total. Ms. Kristoffersen joined Total last year from Alcatel-Lucent.
Christy Turner has joined Petris in its Calgary office. She hails from Encana Natural Gas. Yoann Molina and Sven Mundorf have joined the Hendon office staff. Jimmy Kirk is joining the professional services team in Houston.
Ridge Global has appointed former secretary of the Department of Environmental Protection James Seif as principal of its new energy consulting services practice.
Alison Greene has joined Subsurface Consultants & Associates as a Marketing and Communications Advisor.
Senergy has appointed Mike McEwan VP of Finance and Information Systems.
Tecplot has appointed former CEO of Ivey Imaging, Rich Stillman, as president and CEO, succeeding co-founder Mike Peery who will remain as chairman of the board.
Baker Hughes has acquired the assets and circumferential magnetic flux leakage inspection technology of Intratech Inline Inspection Services.
Complete Production Services has sold its Canadian wireline operations for $45 million cash.
IBM has completed its acquisition of Toronto-based Platform Computing.
Kongsberg Maritime has acquired a 10.7% share in Prediktor through a private placement.
OYO Geospace is to sell 1,122,565 shares of its common stock owned by OYO Corp. USA. Goldman, Sachs and Credit Suisse Securities are joint book-running managers for the offering.
Petrofac has acquired the share capital of UK high-end subsea pipeline consulting and engineering services business KW Limited. KW’s 56 employees will join Petrofac’s engineering and consulting unit.
CGGVeritas’ unit Sercel has acquired the assets of provider of downhole sensors and gauges Geophysical Research Co.
Steel Excel Inc. has acquired the business and assets of Eagle Well Services.
Technip now holds 98.60% of Cybernetix’ share capital and is to file a ‘squeeze out’ request with the French Markets Authority. On completion, Cybernetix will be delisted from Paris’ Euronext exchange.
IHS and OSIsoft have announced a strategic alliance to help clients achieve their sustainability goals. The deal targets ‘surging’ global demand for data collection, integration and information management solutions focused on enterprise sustainability management (ESM). OSIsoft president Bernard Morneau said, ‘Organizations are under pressure to make informed decisions affecting the sustainability of their business. IHS offers a flexible combination of software and content that will help clients address these new business imperatives.’
Currently, ESM challenges are addressed with ‘disparate manual processes, spreadsheets and one-off legacy systems.’ These are failing to aggregate today’s increased data volumes. IHS senior VP sustainability Woody Ritchey added, ‘Clients need to leverage real-time process and event data. We have integrated OSIsoft’s enterprise-wide data collection and archiving solution with our applications to make sustainability an essential part of their business.’ The announcement cites one Asian utility using the IHS and OSIsoft solution to process five million real-time records per month along with ‘vast quantities’ of historic data, tuning production to fuel inventory, demand forecast and environmental impact. More from OSIsoft and IHS.
Oilfield automation service provider Failsafe Controls has deployed OPC-UA-based technology from Kepware on an unnamed oil and gas production company’s major project. The client had decided to repatriate a previously outsourced process control and data acquisition operation to an in-house solution. The move stemmed from a desire to ‘empower teams with the ability to recognize trends, identify problems and optimize production.’ Cost saving was also a factor. The project covered 1,800 well sites and about 120,000 I/O points throughout North America.
Failsafe developed a web-based application that provided bi-directional access to data in the client’s SCADA system. Data sharing was enabled using the new OPC UA cross-platform, service-oriented architecture. Kepware’s OPC Connectivity Suite was deployed as an OPC UA wrapper to ‘vanilla’ OPC servers. Failsafe’s Albert Lambert said, ‘The ease of collecting the data through Kepware was a key factor in winning this bid.’ Inductive Automation’s ‘Ignition’ SCADA system was another key piece of the puzzle, offering ‘no-install’ clients, database connectivity and a commercial OPC UA server. The customer utilizes ABB Totalflow and Bristol Babcock flow computers to monitor gas/liquid flow rates, pressures and temperatures. All of the company’s devices now communicate with historians, Ignition, and other OPC-enabled applications and devices. More from email@example.com.
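The ‘wrapper’ pattern at work here—a single uniform tag namespace fronting heterogeneous legacy devices—can be sketched as follows. This is a conceptual illustration only: the class names and tag addresses are invented and are not Kepware’s or Ignition’s APIs.

```python
# Sketch of the gateway/wrapper pattern: one uniform read interface in
# front of dissimilar legacy device protocols, analogous to fronting
# "classic" OPC servers with a single OPC UA endpoint. All names are
# illustrative, not a vendor API.

class LegacyTotalflowLike:
    """Stand-in for a flow computer polled by register name."""
    def poll(self, register):
        return {"GAS_RATE": 1250.0}[register]

class LegacyBristolLike:
    """Stand-in for a device read by numeric point id."""
    def read_point(self, point_id):
        return {7: 88.4}[point_id]

class UnifiedGateway:
    """Expose every device through one tag namespace."""
    def __init__(self):
        self._routes = {}

    def map_tag(self, tag, reader):
        # reader is a zero-argument callable hiding the device protocol.
        self._routes[tag] = reader

    def read(self, tag):
        return self._routes[tag]()

gw = UnifiedGateway()
tf, bb = LegacyTotalflowLike(), LegacyBristolLike()
gw.map_tag("WELL01.GAS_RATE", lambda: tf.poll("GAS_RATE"))
gw.map_tag("WELL02.TUBING_PRESSURE", lambda: bb.read_point(7))
print(gw.read("WELL01.GAS_RATE"))  # -> 1250.0
```

Clients such as historians or HMIs then address every one of the 120,000 I/O points through the same namespace, regardless of the device behind it.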
Santa Clara, CA-based Picarro has announced a ‘first-of-its-kind’ solution for natural gas pipeline leak detection and measurement, based on its cavity ring-down spectroscopy (CRDS) technology. The vehicle-mounted Picarro Surveyor detects methane in the air and alerts users and repair teams in real-time. The technique was demonstrated last month by anchor client, Pacific Gas and Electric, the first utility to deploy the solution. The solution combines ultra-trace methane concentration measurements in air with high-resolution GPS location, a time stamp and wind speed and direction. Methane isotope signatures distinguish leaks from false positives due to naturally occurring methane. Wind speed and direction analysis provides an indication of the likely location of a leak.
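The use of wind direction to indicate a likely leak location can be illustrated with a toy back-projection: a plume detected at the vehicle implies a source upwind of it. This is a rough geometric sketch under invented assumptions (a fixed search distance, simple spherical-earth offsets); Picarro’s actual analysis is proprietary.

```python
# Back-of-envelope sketch: point from a mobile methane reading toward the
# likely leak using the wind direction. Distances and coordinates are
# illustrative only.
import math

def upwind_offset(lat, lon, wind_from_deg, distance_m):
    """Return a point distance_m upwind of (lat, lon).

    wind_from_deg is the meteorological wind direction (where the wind
    blows FROM, degrees clockwise from north), so the source of a plume
    detected here lies in that direction.
    """
    earth_r = 6371000.0  # mean earth radius, meters
    bearing = math.radians(wind_from_deg)
    dlat = distance_m * math.cos(bearing) / earth_r
    dlon = distance_m * math.sin(bearing) / (earth_r * math.cos(math.radians(lat)))
    return lat + math.degrees(dlat), lon + math.degrees(dlon)

# Methane spike detected while driving, wind blowing from due north:
# the candidate leak lies ~50 m north of the vehicle.
leak_lat, leak_lon = upwind_offset(37.3861, -122.0839, 0.0, 50.0)
```

In practice repeated passes, concentration gradients and the isotope screening described above would narrow this down much further.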
Picarro Surveyor includes instrumentation and Picarro’s ‘P-Cubed’ cloud-based processing platform. PG&E executive VP operations Nick Stavropoulos said, ‘This solution will transform leak detection programs by increasing the frequency of leak surveys and reducing false positives, while radically reducing the cost per survey.’ More from Picarro.
Shell Global Solutions has signed with Ortec Consulting of Gouda, The Netherlands for worldwide collaboration on resource optimization and consulting services. The deal addresses supply chain optimization and ‘strategic decision making’ in both up and downstream engineering including planning and scheduling, production capacity optimization, spare parts management, and cost estimating. The agreement consecrates a 25 year long history of collaboration between the companies. CEO Lambert van der Bruggen said that Ortec would be supporting Shell’s decision-making with ‘modeling and fact-based consultancy services.’ Other Ortec clients include BP, ExxonMobil, Q8 and Repsol. More from Ortec.
Energy Solutions International has implemented its Synthesis logistics management solution for Belle Fourche and Bridger Pipelines—more.
GSE has delivered a ‘Harmony’ oil blending training simulator to Statoil’s Mongstad refinery. The simulator leverages ABB HTS as a virtual DCS—more.
Halliburton and Petronas Carigali have signed a framework agreement for a shale technical centre of excellence in Kuala Lumpur—more.
Total E&P UK has signed a ten-year contract with Iron Mountain to manage its geoscience and general data including core samples, seismic sections, surveys, maps and reports—more.
ISS Group is to provide Oil Search (PNG) with a production data management solution based on its BabelFish product. The contract is valued at approx. $600,000 over nine months—more.
Aker Solutions has been awarded a three year, NOK 870 million, frame agreement for engineering, procurement, installation and commissioning for Talisman Energy’s projects in Norway—more.
The BP-operated Azerbaijan International Operating Company has awarded Emerson Process Management a multi-million-dollar contract to automate the Chirag oilfield—more.
GE Oil & Gas’ surface and subsea production equipment, control systems and services have been selected for Nexen’s Golden Eagle project in the central North Sea. Contract value is put at $170 million—more. GE was also selected by Petrobras to equip a new-build FPSO for the Guará Norte part of the Tupi oilfield—more.
Wood Group has won a 75 million, two-year extension from Shell UK for engineering and construction at the St. Fergus and Mossmorran gas plants—more.
Intertek has signed a frame agreement with Shell Global Solutions for the provision of asset inspection services and quality assurance projects—more.
Doris Engineering has selected Jee for specialist subsea and riser engineering on the Angola deepwater Kaombo pipeline system—more.
A joint venture between JGC Corp., KBR and Chiyoda Corp. has signed a $15 billion contract with INPEX and Total for engineering, procurement and construction activities on the Ichthys LNG Project in Northern Australia—more.
Oilennium has been awarded a contract by Aberdeen-based EnerMech to provide a learning management system (LMS) for its 1,000 staff and clients—more.
Oil Price Information Service has signed an agreement with PortStorage Group to combine the OPIS/Stalsby North American bulk liquid terminal database with TankTerminals. The deal promises a ‘worldwide picture of the petroleum storage landscape’ with access to some 3,500 terminals—more.
Shell Brasil Petróleo has signed a $15 million deal with OYO Geospace for a 100 km. seismic reservoir monitoring array at the BC-10 field offshore Brazil—more.
Moscow based Rock Flow Dynamics has signed with Pioneer Natural Resources for provision of its ‘tNavigator’ fluid flow reservoir simulator—more.
Russian Surgutneftegas has deployed SAP’s Hana ‘in memory’ computing engine to analyze ‘millions of records’ making up its 230 GB operational data set spanning upstream drilling, production and transport procurement. Following initial tests at HQ, Hana is now being rolled out across all sectors—more.
Atos has announced a testing and acceptance management service for SAP solutions, with automated testing across the application lifecycle including roll-out—more.
A Bentley Systems-sponsored site ‘iRINGToday’ promises to cover the latest developments, news, and events surrounding ISO 15926 and iRING—more.
CEN, the European Committee for Standardization has announced a framework for testing e-business standards. The Global e-Business Interoperability Test Bed, launched in 2007, has developed a comprehensive testing framework to assess e-business applications in terms of their compliance with standards and interoperability. Project partners include industry and technology stakeholders from Europe, Asia and North America including NIST—more.
Energistics reports that its PRODML protocol is ‘ramping up.’ The 2.1 release adds a business overview, centralized documentation and other enhancements such as support for wireline formation tests, productivity index elements and a ‘relative identifier’ for internal facility parameters—more.
The Open Geospatial Consortium has released a set of best practices for sharing geographic names. The OGC Gazetteer Service Best Practice specifies a web feature service that serves a geography markup language schema based on the ISO 19112 information model. This exposes the gazetteer for access and update of geographic features across the web. Inter alia, the USGS has used the protocol. The best practice is available for review.
The OGC is also asking for public comment on its WaterML 2.0 time series encoding standard. WaterML is used in the representation of hydrological observations and is the fruit of international cooperation between hydrological and government agencies, software providers, universities and research organizations from Australia, the USA, Canada, Germany, the Netherlands and other countries. Review the spec on the OGC website.
Late last year, P2 Energy Solutions surveyed US oil and gas companies with revenues under $700 million, asking financial and senior management about industry trends and attitudes. The 115 responses showed four primary ‘back office’ concerns—spreadsheets, real time reporting, workflow automation and training.
Many companies depend on spreadsheets to manage their business, but the approach has disadvantages. Spreadsheets make it hard to enforce auditable controls, and their use to manage joint interest billing and AFEs is ‘difficult and burdensome.’ Using multiple spreadsheets in monthly closures takes too long. Generally, respondents considered that spreadsheets ‘should not be used to run a business.’
77% indicated that the absence of an integrated information system puts actionable, dashboard-level intelligence beyond their reach. For many companies, workflow ‘management’ equates to moving paper around and losing track of information is commonplace. Training new hires in both the business and on software use is increasingly critical. Around 60% of those surveyed thought their accounting systems could be improved and were prepared to invest in technology.
P2ES believes the answer to almost all of the above lies in its business intelligence solutions. These integrate with Excel for analysis and presentation and provide real-time data and key performance indicators. P2ES further advocates using a hosted solution resident in the ‘cloud’ running on dedicated servers. The survey concludes with a shameless plug for P2ES’ Excalibur Online, which is claimed to tick all the boxes. More from P2 Energy Solutions.
Real time financial services data specialist Selerity is to distribute Genscape’s Cushing Oil Storage Report over a low-latency feed to electronic trading markets. The offering targets ‘sophisticated, technology-centric investors engaged in the energy markets.’ Such users blend assessment of economic conditions with raw energy consumption data to guide algorithmic and high-frequency trades.
Subscribers gain programmatic access to the bi-weekly Genscape report that provides precise measurements of crude oil in storage at the Cushing, Okla., facility, the physical delivery point for the Chicago Mercantile Exchange light crude futures contracts. Genscape gathers its readings each Tuesday and Friday by flying over the facility and using sophisticated infrared technology to measure the capacity in every storage tank. The data is then delivered over Selerity’s real-time low-latency network to electronic trading firms every Monday and Thursday. Data is available in all of Selerity’s co-location points of presence at Chicago, New York and Frankfurt. More from Selerity.
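The arithmetic behind turning overflight readings into barrels in storage is straightforward: a floating-roof tank’s roof height tracks the liquid level, and volume follows from cylinder geometry. The sketch below uses invented tank dimensions and heights; Genscape’s actual methodology is proprietary.

```python
# Toy conversion of surveyed liquid heights into stored crude volume.
# Tank dimensions and heights are invented for illustration.
import math

BBL_PER_M3 = 1 / 0.158987  # 1 barrel = 0.158987 cubic meters

def tank_volume_bbl(diameter_m, liquid_height_m):
    """Volume of a vertical cylindrical tank, in barrels."""
    radius = diameter_m / 2.0
    volume_m3 = math.pi * radius ** 2 * liquid_height_m
    return volume_m3 * BBL_PER_M3

# Ten hypothetical 60 m diameter tanks with roof heights from the survey:
heights = [8.2, 3.5, 12.0, 6.1, 9.9, 0.5, 14.3, 7.7, 4.4, 11.2]
total_bbl = sum(tank_volume_bbl(60.0, h) for h in heights)
```

Summed across the hundreds of tanks at Cushing, this yields the facility-wide storage figure that traders compare against the futures market.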
Høvik, Norway-based Det Norske Veritas (DNV) has kicked-off a joint industry project (JIP) to develop a unified approach to subsea lifting. The JIP aims to increase efficiency and safety during equipment design, operations and maintenance. DNV’s Bob Oftedal explained, ‘Existing standards and regulations do not meet the challenge. The JIP will develop a unified safety approach. So far 14 key international offshore players have joined the project.’
Subsea lifting standards and regulations have not kept pace with the demand for increased lifting capacity at ever increasing water depths. Oftedal added, ‘Instead, the required safety level has been defined by clients’ specifications, technological boundaries and manufacturers’ considerations, rather than regulatory documents acknowledged by all the stakeholders.’ DNV is inviting industry to develop a unified approach to subsea lifting design which will result in a recommended practice. Current JIP members include Statoil, Petrobras, Marathon, Technip, Subsea7, Saipem and Heerema. More from DNV.
A report commissioned by Industrial Defender from Pike Research titled ‘Convergence in automation systems protection’ identifies three key problem areas in automation—cyber security, regulatory compliance and operations. New IT-oriented control system technologies offer visibility and control, but introduce operating systems, applications and hardware that can be attacked in new ways. Many deployments ignore cyber security during installation and may require expensive and disruptive security retrofits to the live production environment. Compliance with regulatory requirements such as Sarbanes-Oxley, Payment Card Industry and Critical Infrastructure Protection standards may be patchy. A balance needs to be struck between enabling plant function and keeping system complexity manageable.
Many SCADA systems have evolved over decades without a ‘master plan.’ It is rare that an automation system is based on a single architecture with consistent policies for protection and monitoring. Companies are faced with a choice between purchasing multiple point solutions and engineering the component interfaces or buying an integrated suite of components from a single vendor. The Pike report (actually more of a 14 page long editorial) concludes with an endorsement of the single vendor approach. How this is to integrate the complex legacy frameworks described above is unclear—presumably that is where Industrial Defender comes in? More from Industrial Defender and Pike Research.
ISS Group has received a significant award from Chevron for the deployment of its BabelFish production data monitoring technology in the operations information center of the $46 billion Gorgon LNG project. The contract, valued at $2.5 million over 12 months includes the BabelFish product line, services and support. BabelFish integrates data from disparate real-time systems into a ‘holistic’ overview of complex processes.
Other recent wins for ISS include a six month software and services upgrade contract for Maersk Oil Qatar. ISS management reports that its sales pipeline is the strongest for a decade. Analyst Microcap forecasts $22 million sales for 2012 and gives ISS a ‘strong buy’ rating. ISS’ share price has risen by 28% since the announcement.
ISS Group has a resale agreement with Schlumberger for sale of BabelFish to the upstream oil and gas segment. The contract is estimated at a $20 million value over the five year period that ends in 2013. Other clients include BP, ConocoPhillips, Saudi Aramco, Shell, Hess, Apache and almost all Australian independents. More from ISS Group.
Dallas-based Authentix, a provider of fuel marking ‘authentication,’ has renewed a partnership with SGS Group for an ongoing integrity program in the Republic of Senegal. The fuel marking program is helping the Senegal government recover tax money that would otherwise have been lost to fuel fraud, while assuring fuel quality.
Authentix provides ‘nano-scale engineering’ and cutting-edge sensors. Low-tax fuels such as kerosene are commonly used to adulterate legal, tax-paid fuels. Authentix combats such activity by adding its markers to the likely adulterants. Proprietary testing technology is then used to determine if fuel has been adulterated.
The company claims to have helped recover $11 billion in lost revenue for its clients. SGS provides inspection and certification services from a network of 1,250 offices and laboratories around the world. More from Authentix and SGS Group.
Following its successful investigation of shale gas source rocks for ExxonMobil (Oil ITJ July 2011), researchers from the University of Pau’s Open & Experimental Center for Heavy Oil (CHLOE) have applied a similar approach to model different flow regimes in computerized tomographic (µCT) imagery of sandstones. The research, presented at the Comsol user conference late last year, targets upscaling of fundamental physical observation and modeling from pore scale to real-world flow simulation at the macro scale.
Starting with µCT images constructed with Simpleware’s ScanIP, the models were imported into Comsol for study. Simulation leverages Navier-Stokes equations and their extension to a diffuse interface. Image processing with Avizo Fire is performed on the µCT volumes to evaluate porosity, connectivity and other parameters.
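One of the simplest parameters extracted from a segmented µCT volume, porosity, is just the pore-voxel fraction. The toy volume below stands in for a real ScanIP/Avizo segmentation; it is an illustration of the calculation, not of the CHLOE workflow itself.

```python
# Porosity from a segmented (binary) micro-CT volume: voxels marked 1 are
# pore space, 0 are grain. The tiny nested-list volume is a stand-in for
# a real segmented image stack.

def porosity(volume):
    """Fraction of pore voxels in a 3D binary volume (list of z-slices)."""
    pore = total = 0
    for slab in volume:      # z-slices
        for row in slab:     # y-rows of voxels
            pore += sum(row)
            total += len(row)
    return pore / total

# 2x2x2 toy volume with 3 pore voxels out of 8 -> porosity 0.375
vol = [[[1, 0], [0, 1]],
       [[0, 0], [1, 0]]]
print(porosity(vol))  # -> 0.375
```

Connectivity and permeability estimates build on the same segmented volume but require graph or flow analysis rather than simple voxel counting.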
The research, sponsored by Total, includes streamline simulation with Comsol Multiphysics of single phase flow as well as more complex investigations of dual phase flow. The latter has been applied to problems such as viscous fluid front fingering, trapping and remobilization and reactive transport. Lead researcher Igor Bogdanov believes that such direct numerical simulations at pore-scale will in the future be widely used in modeling of porous media, particularly for petroleum applications. The full paper is available from Comsol.
Invensys’ Operations Management unit is claiming an industry first for its virtualization technology, now certified for high availability, disaster recovery and fault tolerance in supervisory control applications. Maryanne Steidinger, director of product marketing, explained, ‘The ArchestrA/InTouch 2012 releases leverage either VMware or Microsoft’s Hyper-V virtualization technology to offer high-availability and disaster-recovery.’
Virtualization means that a single physical machine can run multiple, separate instances of an operating system and software. Hardware resource use is optimized and applications can run on whatever operating system they require. Virtualization is claimed to ease the movement of applications between different host computers, enabling various fail-safe scenarios to be implemented. The system was demonstrated at the 2011 OpsManage event when a primary system in Florida failed-over to a backup disaster recovery system in California.
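The Florida-to-California failover in that demonstration follows a standard heartbeat pattern: the backup takes over when the primary stops responding. The sketch below is a toy model of the pattern only; node names are invented, and real HA stacks (VMware HA, Hyper-V Replica) implement this at the hypervisor level.

```python
# Toy heartbeat-based failover illustrating the disaster-recovery pattern:
# if the primary site's heartbeat is missed, traffic moves to the backup.
# Names are invented for the sketch.

class Node:
    def __init__(self, name):
        self.name = name
        self.alive = True

def heartbeat_ok(node):
    # Stand-in for a real liveness probe (ping, lease renewal, etc.).
    return node.alive

def active_node(primary, backup):
    """Route to the primary while it is healthy, else fail over."""
    return primary if heartbeat_ok(primary) else backup

primary = Node("florida-primary")
backup = Node("california-dr")

print(active_node(primary, backup).name)  # -> florida-primary
primary.alive = False                     # simulate site failure
print(active_node(primary, backup).name)  # -> california-dr
```

The hard parts in production are avoiding split-brain (both sites believing they are active) and replicating state fast enough that the backup is current when it takes over.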
In a separate announcement, Invensys reported that consultants Arc Advisory Group found that the Wonderware human machine interface (HMI) software is a worldwide leader in market share. Rashesh Mody, senior VP software, observed, ‘This progress is all down to organic growth as Invensys has not acquired any other HMI companies in recent history. Our HMI installed base now approaches 700,000 licenses.’ More on the study from Arc Advisory Group.