In my editorial of April 2008 I offered a Data Management 101. Not exactly a ‘body of knowledge’ but, I believe, a good start for those not blessed with the hardened nose that comes from years of managing upstream data. This sparked off a flurry of correspondence (well, two letters actually) and a slightly contrarian view on some of the terms used. David Lecore’s thoughtful contribution made me sit up and think. But the nature of the editorialist’s life is such that I was not ready to respond to his well-made points. I was waiting for inspiration and clarification.
Last month we reviewed the DAMA Data Management Body of Knowledge (DM-BOK) and now (page 9), the companion DAMA Data Dictionary. I hoped that these authoritative oeuvres would put to rest the meta/master data controversy. Well of course they didn’t. Input from DAMA, which I suppose is largely the financial services community, rather than providing insights, has clouded the issue. It is hard enough trying to analyze one’s own data into a logical structure. It is even harder trying to shoehorn it into a ‘one size fits all’ analysis.
Lecore pointed out that my data/meta data/master data breakdown did not recognize that category of data that is placed in a ‘master data set,’ a cleaned-up version of say, well headers or perhaps a definitive list of formation tops rather than the plethoric provisional, idiosyncratic or just plain wrong versions that may exist in project data stores. What does DAMA say here?
As I understand it, the situation in the financial services community is that the ‘granularity’ of the ‘authoritative’ master data is pretty much the same as the granularity of the subordinate data sets. Cleaned-up ‘master’ data touches each record, matching Fred Bloggs with F. Bloggs and Frederick E. Bloggs as they appear in his bank’s database, the Social Security records and wherever else.
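This kind of record matching can be sketched in a few lines. The normalization rules and the ‘same surname plus same first initial’ heuristic below are simplifying assumptions for illustration, not anyone’s production matching logic:

```python
# Hypothetical sketch of the name-matching ('master data') problem above.
# Normalization rules and the matching heuristic are illustrative assumptions.
import re

def normalize(name):
    """Lower-case, strip punctuation and split into tokens."""
    return re.sub(r"[^a-z ]", "", name.lower()).split()

def probably_same(a, b):
    """Crude match: same surname and compatible first initial."""
    ta, tb = normalize(a), normalize(b)
    if not ta or not tb:
        return False
    return ta[-1] == tb[-1] and ta[0][0] == tb[0][0]

variants = ["Fred Bloggs", "F. Bloggs", "Frederick E. Bloggs"]
master = variants[0]
assert all(probably_same(master, v) for v in variants)
```

A real master data management system would of course add address, date-of-birth and other corroborating fields before declaring a match.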
In E&P, the situation is different. Managing voluminous complex data types involves compromise and ‘creaming off.’ You are not going to build a ‘master’ data set including every seismic sample. A project will contain such, but these will likely be in a file, not a database. As you go up the data hierarchy to a ‘master’ set, believe me, if this contains a definitive list of line names and endpoints you will have done very well indeed, better than most majors I believe. E&P data management is about compromise and the art of the possible.
One of the funny things (to a technical data manager) that one comes across when rubbing shoulders with the data managers from financial services is the use of ‘data marts,’ ‘data warehousing’ and the like. What is happening here is that if you are processing a very large number of transactions (folks using their credit cards in ATMs all around the country), then the last thing you want is to have some smart ass ‘analyst’ hit the transactional database with a show stopping ‘query from hell’ (that one did come from the DAMA dictionary!) To avoid such disasters, data from the transactional system is replicated into another system where queries can be performed without breaking anything. This arrangement has spawned a lot of technology and terminology that is occasionally seen in the upstream—as witnessed by our story from Composite Software and Netezza on page 3 of this issue.
In the earliest days of data management it was noted that the same terms—notably a well identifier—cropped up, often in slightly different versions in different databases around the enterprise. Early workers—especially the smaller outfits like iStore, Petris and OpenSpirit, who were obliged to live with the major vendors’ databases, soon learned how to map between different formats and provide more or less seamless access to data in different data stores. Of course ‘seamless’ does not necessarily mean all your worries are over. There are issues of data quality and cleansing which I’ll deal with in a future article. But what is key is that sooner or later, as you cross the domains and silos you are into data reconciliation and ‘mapping.’
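At its simplest, such cross-store mapping boils down to an alias table keyed on a canonical identifier. The store names and well identifiers below are invented for illustration; real frameworks add versioning, units and coordinate handling on top:

```python
# Illustrative sketch of cross-store identifier 'mapping' as described above.
# Store names, well IDs and the alias table are invented for the example.
alias_table = {
    # canonical ID -> per-store identifiers
    "WELL-0001": {"vendor_a": "W1", "vendor_b": "15/9-F-1"},
    "WELL-0002": {"vendor_a": "W2", "vendor_b": "15/9-F-4"},
}

# Build a reverse lookup so any store's local ID resolves to the canonical one
reverse = {
    (store, local_id): canonical
    for canonical, stores in alias_table.items()
    for store, local_id in stores.items()
}

def resolve(store, local_id):
    """Map a store-specific well ID back to the canonical identifier."""
    return reverse.get((store, local_id))

assert resolve("vendor_b", "15/9-F-1") == "WELL-0001"
```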
Now this is hard work, it’s not something that is easily shrink wrapped or ‘sold’ to management. Which is probably the reason why the very successful, proven technology of mapping across different vendor stores has gotten relatively little attention—and why this wheel is constantly being re-invented. Today, there is a wide choice of solution providers and offerings from the major outsourcing/consulting houses who are capable of performing the mapping and building a more or less bespoke ‘framework’ for the upstream.
So what I hear you say? Well the ‘what’ in this context is that this, the data mapping ‘problem,’ is what IBM’s Integrated Information Framework and Microsoft’s Upstream Reference Architecture pretend to address. While the former’s history has more visibility (funded by Statoil and targeting operations and maintenance), MURA’s background in the upstream is less clear—that is until you see the white papers from Wipro1 and Pointcross2. Hidden behind the marketing jargon and kowtowing to Microsoft’s architectural ‘promised land’ are enterprise-scale data mapping projects performed for clients.
In conclusion I have to admit that the cause of straightforward definitions has not been advanced very much. There is a parallel here between data management technology and terminology. The biggest problem in managing data is handling legacy data. And the biggest problem with trying to pin down the terminology is the ‘legacy’ of overlapping techniques and usage. On the other hand, this is what makes things so interesting. As I trust this issue of Oil IT Journal shows—with reports on Petrel data management from Apache, on data ‘virtualization’ from Composite Software, interviews with thought leaders from OpenSpirit and Roxar and developments on the ‘semantic’ front from the POSC/Caesar Association and our review of a rather good book on the subject.
As data volumes rise, load times are going through the roof—a serious issue for companies like Apache Corp. whose (total) online seismic data volume has grown sevenfold to 3.5 petabytes over the last four years.
Bradley Lauritsen, manager of exploration computing with Apache said, ‘Our users’ geological models used to take 20 minutes to open in Schlumberger’s Petrel 2009. The 64-bit address space lets them access very large files, creating data transfer bottlenecks.’
Apache has now deployed a 10 Gigabit Ethernet backbone (with 1 GbE workstation links), Windows’ Server Message Block 2 (SMB2) network file-sharing protocol and a storage system comprising NetApp’s FAS and the Unix-derived ‘Ontap’ operating system.
Solid state ‘Flash Cache’ modules improve I/O throughput by reducing the need to pull data from Apache’s back-end SATA drives. Cache is proving an effective solution to seismic data transfer. Apache reports a ‘near 70%’ cache hit rate—reducing access to the slower drives.
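A back-of-envelope calculation shows why a high cache hit rate matters. The latency figures below are assumptions for illustration, not Apache or NetApp numbers:

```python
# Back-of-envelope effect of a ~70% cache hit rate on average access time.
# The latency figures are assumed values, not vendor measurements.
flash_ms, sata_ms, hit_rate = 0.1, 10.0, 0.70

# Weighted average of fast (cache hit) and slow (disk) accesses
effective_ms = hit_rate * flash_ms + (1 - hit_rate) * sata_ms
speedup = sata_ms / effective_ms
print(f"effective access ~{effective_ms:.2f} ms, ~{speedup:.1f}x faster than disk alone")
```

Under these assumed latencies the average access drops to about 3 ms, a roughly threefold improvement, which is consistent in spirit with the load-time reductions Apache reports.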
Apache uses multiple 512GB NetApp Flash Cache cards in five FAS6080 non-blocking storage systems in its Houston headquarters. Local sites deploy FAS3170 systems as a backend to Petrel and other interpretation systems. Lauritsen reports that what took 20 minutes to load before now takes only 5 minutes. He described the NetApp storage solution as the ‘missing link’ that allows its geoscientists to take full advantage of Petrel 2009.
Curiously, the foregoing has been reported as a Windows success story by both Microsoft1 and Schlumberger2, and used to knock Linux-based interpretation systems (read Paradigm and Landmark). But the reality is a little more complex. The advent of SMB2 in the latest 64-bit Windows operating systems means that these can now ‘talk’ to industrial-strength storage systems like NetApp’s.
We asked NetApp’s solid state product marketing manager Mark Woods if getting data over the network was really faster than a local disk. He replied, ‘No, loading a file from a local disk is typically going to be faster than downloading it from networked storage. This is likely still the case in the new environment. Apache wanted to enable geoscientists to work in parallel on the same data files. Changing from local to networked storage made this practical.’
In conclusion, this story is more about data management than performance. In which context, Apache is also leveraging NetApp’s SnapMirror and SnapVault data protection and replication technology to synchronize data between headquarters and affiliates. Last month NetApp reported total (all clients) sales of one petabyte of cache memory. More from www.oilit.com/links/1007_8.
A recent study from IBM’s Chemicals and Petroleum unit compares C&P CEOs’ attitudes to ‘complexity’ with those of a peer group from other industries. In the executive summary1, IBM reveals, ‘In an increasingly complex world, CEOs are learning to master complexity in countless ways—redesigning operating strategies for speed and flexibility and considering how best to take advantage of global efficiencies while addressing local needs.’
C&P CEOs are ‘slightly ahead of the crowd in this respect.’ 52% are focusing on simplifying operations to ‘manage complexity more effectively.’ The ‘most dexterous’ CEOs are more intent on reducing fixed costs and increasing their variable costs (sic), so that they can rapidly scale up or down.
A ‘surprising’ number of peer group CEOs feel ill-equipped to cope with a ‘world of increasing complexity.’ But the C&P CEOs are confident in their future. ‘Only’ 64% expect complexity to grow and 59% think they can handle it. The apparent complacency of the C&P CEOs reminds us of the old adage that the best business in the world is a well run oil company. And the second best is a badly run oil company!
Exhibiting at last month’s PNEC, Composite Software described some of the work it has been doing for a Gulf of Mexico client and for Shell/ExxonMobil’s Aera Energy joint venture. Some digital oilfield operators (such as BP—Oil IT Journal September 2008) have deployed Netezza Performance Servers (NPS) running data warehouse applications to support offshore operations. One such client asked Composite Software to build a business intelligence layer atop multiple Netezza appliances, federating data into larger enterprise data sharing efforts.
Use cases supported by Composite’s business intelligence solutions include a ‘virtual’ well maintenance, repair and operations (MRO) application combining rig status, staffing availability, best practice procedures, maintenance records and flow rates to support real-time dispatching of repair resources.
The Aera Energy example blends real time data streams with historical data from over 10,000 producing wells, optimizing deployment of repair crews and equipment. Composite data virtualization here includes SAP’s maintenance management system with historical surface, subsurface, and business data from the enterprise data warehouse.
Last month, Netezza and Composite announced the Netezza Data Virtualizer (NDV), a 'whitelabelled' appliance powered by Composite’s software. The NDV will be licensed and supported by Netezza and available later in the year. More from firstname.lastname@example.org.
The POSC/Caesar Association met last month in Stavanger. PCA is a Norwegian standards body that has developed the ISO 15926 standard for oil and gas plant data and is in the forefront of the use of ‘Semantic Web’ technology in the upstream. The ‘POSC’ part of PCA comes from a long forgotten link with Energistics’ previous manifestation. The Caesar connection refers, not to Julius, but to an earlier Norwegian standards initiative.
As a Norwegian entity, much of PCA’s activity is driven by Statoil whose Oscar Fredagsvik traced the evolution and deployment of ISO 15926 from the 2008 Global Operations Data Integration (GODI) project that set out to ‘refine plant data into knowledge.’ Under GODI, IBM developed its OMM reference architecture a.k.a. the IIF with its ‘reference semantic model’ spanning the whole standards smorgasbord from ISA 88/95 to WITSML passing through Mimosa, UN/CEFACT and more. GODI, which has now blended into Statoil’s Master Program IT (MapIT) promises ‘non intrusive integration’ with existing data sources. Statoil is now working on another component of MapIT—a real time visualization framework of information workspaces and collaboration tools. A use case showed how a simple query for well head pressure trends currently implies knowledge of tag numbers, database information and maybe even a request to operators for clarification. MapIT is to fix such issues and will provide information seamlessly into Statoil’s applications.
Jennifer Sampson (Statoil) presented work done on combining RFID technology with ISO 15926 (a joint industry project backed by OLF1, BG, BP, ConocoPhillips, GDF Suez, Shell and Statoil.) The project set out to define the requirements of the oil and gas industry for RFID deployment in areas including personnel monitoring, cargo tracking, drill string components tracking and the management of equipment. Again, ISO 15926 was used to develop an RFID ontology. So far 65 concepts have been defined and mapped in the Protégé ontology workbench. The resulting ISO 15926 RFID ontology can be extended to other industry domains. The spec is date and time aware—so that equipment can be tracked in time for dispatch, reception, inspection etc. A spatial representation allows for location tracking. A set of nine OLF-approved guidelines for the deployment of Radio Frequency Identification is now available (links/1007_2).
Richard Harris recapped Woodside’s voyage through engineering standards and data management. Early work with EPISTLE and STEP failed to gain traction as these projects were not really aligned with Woodside’s requirements, EPC deliverables and the engineering software needed to support Woodside’s ‘mega’ projects. Since then, Woodside has identified critical documents and data sets required for its asset life cycle information system (ALIS). ALIS houses generic engineering deliverables—with no ‘gold plating.’ Harris believes that the owner operator world is maturing and today is in a position to define and expect quality information deliverables. Moreover, some project design tools have mutated into production systems for operating and modifying facilities. The trick is to ‘Keep it short and straightforward.’ For Woodside, the reliable, quickest to implement, easiest to use and lowest cost solution is ALIS—which embeds AVEVA’s VNET portal for a single point of data entry and access. And the key part of ISO 15926, for Woodside, is the pragmatic Part 4—the parts list.
Alan Johnson of the Open O&M Consortium drew a demarcation line between ISO 15926 as a ‘reference environment,’ while MIMOSA/Open O&M is an ‘operations environment.’ Between the two a ‘transformation engine,’ running on an ‘information bus,’ is required. The ISO 13374 standard for machine condition assessment data processing and information flow also ran. Echoing the upstream ‘what is a well’ initiative, Johnson described a MIMOSA/PCA joint venture that is trying to define a globally unique ID for assets. This should link up with the RFID work above and all be rolled into the Open O&M Information System Bus—a.k.a. the ‘intergalactic systems bus.’ More from www.posccaesar.org.
1 Oljeindustriens Landsforening—the Norwegian oil and gas trade association.
SAP has teamed with GTZ1, acting on behalf of the German Federal Ministry for Economic Cooperation and Development, to help the government of Ghana meet the global standard for transparency in the oil and mining industries as set by the Extractive Industries Transparency Initiative (EITI). SAP is to donate an enterprise performance management (EPM) solution from its SAP BusinessObjects portfolio which the Ghanaian Ministry of Finance will use to monitor and analyze payments and revenue flows from future oil and gas production. The solution will enable Ghana to efficiently quantify revenue flows to government and other stakeholders, identifying reasons for any shortfalls or overshoots.
Deputy Minister of Finance, Seth Terkper said, ‘EITI is a useful tool in improving accountability and transparency. The addition of an IT system to the EITI process will be a significant addition to the process in Ghana and will tie in our public financial management reforms and the existing Ghanaian integrated financial management information system.’
SAP’s contribution includes the SAP BusinessObjects Financial Consolidation application and business intelligence solutions. Jim Hagemann Snabe, SAP co-CEO said, ‘We’re pleased to bring our software and business process expertise to help Ghana meet the global standard for transparency in the oil and mining industries.’ More from www.sap.com and www.eiti.org.
1 Deutsche Gesellschaft für Technische Zusammenarbeit, a federally owned organization ‘working in the field of international cooperation for sustainable development.’
‘Oaths of allegiance’ sworn by Petris/Wipro and Pointcross to Microsoft’s Upstream Reference Architecture (now officially ‘MURA’) shed light on what it all really means. A white paper1 from Petris/Wipro describes PetrisWinds OneTouch as ‘taking advantage of’ MURA. This itself is curious since OneTouch was announced in May 2009 (Oil IT Journal) while MURA’s official roll-out was in March 2010 when it was described as ‘currently at a very early stage of development.’ In the Petris/Wipro context, MURA is something of a framework in a framework and appears to mean the use of Microsoft tools including SharePoint, Silverlight, FAST and Web Parts from third parties such as OSIsoft. We conclude that OneTouch with MURA inside is pretty much the same thing as it was before MURA was mooted!
Turning to Pointcross, a similar picture emerges. Pointcross was adapting its Orchestra framework to the upstream back in 2005 as the ‘Oil Company in a Box’ (Oil IT Journal December 2005.) The company became a MURA founder late in 2009. The Pointcross whitepaper2 similarly enumerates Microsoft technologies as above, adding Biztalk and Microsoft Office Business Applications, all of which blend into its new Integrated E&P Suites (IEPS).
For MURA to mature into something close to its marketing promise i.e. to provide a definitive mapping across ISA/ISO/WITS/PRODML and more, the next step ought to be a merger of the frameworks—both the above and the ever growing list of others (OpenSpirit, iStore …). Is this possible—let alone likely? It would be fun to be a fly on the wall in a MURA meet. We wonder if the talk is of objects, relations, of data footprints and mappings. But we suspect that the debate is more about how to craft the next press release promising the integration ‘moon,’ while treading on a minimum number of partners’ toes! More from www.oilit.com/links/1007_22.
Significant happenings in the world of geophysical FOSS1 this month as upstart Madagascar hits V1.0 and a 3.0 revamp is planned for the venerable Seismic Un*x. Madagascar 1.0 was announced at this month’s Madagascar School of Reproducible Computational Geophysics event where it was described as a big event in Madagascar’s history, the first stable version of the tool. Madagascar is more than seismic processing as it delves into the realm of multidimensional data analysis and reproducible computational experiments.
We asked Madagascar founder and lead developer Sergey Fomel for an explanation. He said, ‘Madagascar is not tied to seismic and has been used on other multidimensional data such as geostatistics [notably by Shell’s Jim Jennings] and reservoir modeling. ‘Reproducibility’ is not a goal in itself, just a path to productivity, collaboration and technology transfer from research to production. In my opinion, all scientific software should adopt some form of reproducibility if it wants to evolve technologically.’ Madagascar has seen 10,000 downloads to date. Speakers at the Madagascar School hailed from KAUST, Saudi Arabia, the University of Texas at Austin, BP and the Center for Wave Phenomena (CWP) at the Colorado School of Mines (CSM). More from www.reproducibility.org.
The CWP is also home to John Stockwell’s SU project. SU has been installed in more than 3,300 sites world-wide in 68 countries. A revamp, provisionally dubbed ‘S3,’ is under consideration. It will likely develop in parallel to SU, will be delivered under a GNU license (the current SU is ‘owned’ by the CWP) and will embed the GNU Scientific Library in what is planned as ‘SU for the 21st century.’ More from www.oilit.com/links/1007_5.
1 Free and Open Source Software.
The latest version of SAIC’s GeoRover geospatial data tracker, an ArcGIS plug-in, blends GPS data and field collected multi-media files in GPX format, automatically georeferencing this over maps and imagery. GeoRover now accepts data from ‘modern geotagging devices.’
P2 Energy Solutions has rolled out ‘P2 Analytics,’ extending its Excalibur suite with a business intelligence dashboard powered by Tibco Spotfire.
Emerson Smart Wireless has embedded Tyco’s TraceTek technology in its Rosemount 702 product line, providing hydrocarbon leak detection and monitoring of tank farms and pipelines.
The 7.2 release of AspenTech’s AspenOne adds support for multibillion dollar ‘megaprojects,’ with rapid economic evaluation for owner-operators and EPCs.
Energy Navigator’s Value Navigator 5.2 includes a Google Map view, ‘in situ’ capabilities for unconventional hydrocarbon plays and improvements to forecasting.
Nokia/Qt has announced an ‘Ambassador Program,’ a showcase for ‘live’ Qt development projects. More from links/1007_14.
The 6.7 release of Geolog, Paradigm’s log and well data management solution introduces model-based petrophysical uncertainty to ‘accurately and scientifically quantify uncertainty in a hydrocarbon column,’ particularly during geosteering operations.
Petris Technology’s ZEH Plot Express now supports the new high speed iTerra TL1290 multi-width log printer from iSys.
Release 15.0 of PTC’s Mathcad includes design of experiment functions, expanded reference libraries and integration with third-party tools including True Engineering’s ‘Truenumbers.’ True’s technology is designed to ‘communicate values across applications and the organization without loss of quantity or unit integrity. Results and values can be moved outside of Mathcad onto different document types—enabling easy sharing of the data.’
A new partnership between Statoil, Nacre and Norway’s Sintef R&D organization is to develop ‘next generation’ hearing protection and communication technology for the offshore energy industry. According to the release, ‘workers on offshore oil platforms face the risk of not only permanent hearing loss from exposure to high levels of hazardous noise over extended periods of time, but also challenges in maintaining adequate levels of speech communication and situational awareness, which are essential to maintaining safety.’ Nacre’s Quietpro system is already in use with US, NATO and Scandinavian armed forces.
At last, a book that tells what the semantic web can do, rather than explaining how the world might be if, overnight everything on the web suddenly became ‘semantic.’ As stated in Programming the Semantic Web1’s (PtSW) introduction, significant research funds have been sunk into the semantic field and sometimes ‘the noise from the R&D community obscures the fact that this is not rocket science.’ You do not need to be into philosophy or artificial intelligence to use the semantic web.
Despite the ‘semantics’ name, the semweb is not about natural language. It is rather about sharing data between communities and machines. Thus the starting point is data modeling. In fact, perhaps our main learning from PtSW is that RDF/OWL is a data modeling language—with no magical powers beyond SQL or Express.
PtSW kicks off with a limpid explanation of the path from tabular data through relational databases, and how hard these are to maintain and evolve. While database schema complexity is manageable in well understood industries, it gets somewhat intractable when there are ‘hundreds or thousands of datasets all talking about different things.’ It would be hard and labor intensive to put all the world’s data into a single schema—i.e. the RDBMS suffers from poor ‘scaling to complexity.’
Enter the ‘key/value’ schema—a more flexible data model that handily maps into the Resource Description Framework’s ‘triple.’ This is the ultimate stripped-down data model where relationships are described by the data itself. Before you know it (page 23) you are building a simple triple store in Python2.
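To give a flavor of what such a triple store looks like, here is a minimal sketch in the spirit of PtSW’s example. The class name, the well-themed triples and the wildcard query convention are our own illustrative assumptions, not code from the book:

```python
# Minimal triple store sketch: every fact is a (subject, predicate, object)
# triple and None acts as a wildcard in queries. Example data is invented.
class TripleStore:
    def __init__(self):
        self.triples = []

    def add(self, sub, pred, obj):
        self.triples.append((sub, pred, obj))

    def query(self, sub=None, pred=None, obj=None):
        """Return all triples matching the non-None pattern fields."""
        return [t for t in self.triples
                if (sub is None or t[0] == sub)
                and (pred is None or t[1] == pred)
                and (obj is None or t[2] == obj)]

store = TripleStore()
store.add("well:15/9-F-1", "hasOperator", "Statoil")
store.add("well:15/9-F-1", "hasStatus", "producing")
assert store.query(pred="hasOperator") == [("well:15/9-F-1", "hasOperator", "Statoil")]
```

The point of the exercise is that the ‘schema’ lives in the data: adding a new relationship is just another `add` call, with no table migration.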
Having built a few semantic silos—the next big thing is tying them together with shared keys and overlapping graphs. But nota bene—there is no magic here: the problems of inconsistent naming and mis-spelt data have not gone away. PtSW even has a new term for disambiguating multiple records of the same thing, ‘smushing.’
The section on ‘Just enough OWL’ probably would not satisfy the purist but the idea is that OWL brings back the stuff we threw away when we abandoned the RDBMS, relations and data modeling. But modeling anything of moderate complexity in RDF/OWL quickly becomes ugly. Enter Protégé, an industry standard tool for managing such complexity.
If you are interested in understanding the technology that underpins the ISO 15926 standard (see page 3) this is a great starting point. But there is a caveat. The stripped down data modeling functionality of the semantic web only moves the complexity in your data into the Protégé model. Even simple objects translate into ‘graphs’ of mind numbing complexity. Determining even simple stuff like a unit of measure involves a trip around a graph. RDF is not really conducive to encapsulating data into ‘building bricks.’ Those involved in process modeling might like to check out OntoCAPE. More from www.semprog.com.
1 Programming the Semantic Web, Segaran et al. O’Reilly. ISBN 978 0 596 15381 6.
Bogdan Motoc of Calgary-based Allied Bionics gave a poster presentation, ‘Too connected to fail,’ that used ‘complexity theory’ in a metaphoric sort of way, seeing the enterprise as a ‘life form’ whose purpose is self perpetuation, rather than some altruistic good. An analysis of healthcare units found that they ‘work’ for their own survival and need a lot of social constraint to moderate and metrics to optimize. Factors impacting oil and gas include the scientific community, public and investor pressure. Such agencies operate at widely different speeds. Public pressure ‘travels’ much faster than an oil and gas business’s capacity to respond, as events in the Gulf of Mexico are demonstrating. What does this mean for oils? Oil and gas is ‘too connected’ to the rest of our ecosystem to fail—‘collapse is not an option.’ We need to speed up governance processes and educate the public on what can realistically be achieved. One way of doing this is through social network analysis, which can help identify actors of importance and open up new parallel paths of communication.
Landmark was busy with the roll out of its DecisionSpace Desktop/EarthModel (DSD), a ‘unified work space’ spanning geology, geophysics and modeling—and based on OpenWorks R5000. The demo showed a new Project Designer ‘brainstorming’ tool, a workflow manager and other novelties including geostatistics and ‘vertical cell walls’ (not pillar gridding!) Models can be sent straight to the simulator sans upscaling. Will DSD be Landmark’s Petrel killer? Not without a more enthusiastic marketing push. Maybe this will come with the release of a Windows port.
Schlumberger Information Solutions president Olivier Peyret gave an enthusiastic booth presentation of the Ocean development environment and application store. At first it seemed like he was working for Apple, waving his iPhone around. But his praise was for the Apps and the App Store not the hardware. ‘Petrel is Schlumberger’s iPhone’ he explained adding ‘even our fiercest competitors are developing plug-ins’ using the Ocean dev kit which is also used internally by Schlumberger. Schlumberger bet on Microsoft’s .NET development platform 10 years ago and ‘we don’t regret it.’ Flagship client Shell bought Ocean in 2003 and now has developed 80 plug-ins. Our guess though is that Peyret is, at heart, a Mac aficionado. As the Ocean Store demo proceeded he was heard to say, ‘It scrolls better on the iPad!’
We interviewed OpenSpirit president Dan Piette next for the skinny on its support of Microsoft’s Upstream Reference Architecture (MURA). Piette pointed out that Microsoft’s market presence made the offer of an association something that could not be refused. Microsoft plans to use OpenSpirit to add G&G focus and extend integration to drilling and production. Piette admitted that, ‘It will take time to define something concrete. Meanwhile we will continue to focus on our clients’ needs.’ These are growing apace as OpenSpirit has 450 sites and 1000 users in 66 companies—representing around $15 million in sales. Despite MURA there are no plans to de-focus, ‘We will not be going to SQL Server-based business data access.’
OpenSpirit now has a poster child in the form of Seismic Micro Technology, which is ‘white labeling’ OpenSpirit’s data infrastructure to provide Kingdom-to-Kingdom data access (previously provided by the ‘Tunnel’ point-to-point links). This use of OpenSpirit begs the question as to why the toolset is not getting more visibility as a data management solution, something it is obviously rather good at. Piette intimated that there were certain sensitivities regarding OpenSpirit’s shareholders’ existing solutions in the data management space.
Kjetil Fagervik, Roxar’s VP product development was our next interlocutor. He explained that although Emerson acquired Roxar mainly for its flow measurement business, the software arm is considered ‘a huge bonus’ and is being rolled into Emerson’s vision of ‘intelligent fields.’ Novelties include seismic data in the RMS flagship—using Hue Space’s technology and a new ‘.ROX’ framework. This includes new domain models and services and an open source database for transactions and persistence. We asked if .ROX was Roxar’s Ocean. Fagervik said, ‘We have not yet decided—but it could become one! We are aiming for ease of use, functionality and massive parallelism.’ Roxar seems less keen than some others on the lemming-like dash to Windows. Roxar’s Rob Smallshire advocates staying multi-platform. Roxar has even trialled RMS on the Amazon EC2 cloud. Reflecting on this we did notice that most EAGE demos we attended were running on Red Hat Enterprise Linux. Maybe Roxar has something...
SPT was showing a Mepo plug-in for Petrel. Petrel is a monolithic app and can be very slow on compute-intense tasks, ‘folks are complaining.’ Now Petrel users can hand off number crunching to Solaris/Linux clusters and run, say, 2,000 simulations at once. A Petrel link to Olga is also under development.
Total’s Raphaële Henri-Bailly reported progress on Resqml (RML), Energistics’ geomodeling data standard. RML uses the binary HDF5 format and also embeds coordinate reference data. While Rescue occupies a niche between the model and the simulator, RML targets the whole workflow loop, a dozen or so activities expanding upstream to real time data. Real money is involved—with a funded port of HDF5 into 64-bit Java. In a back to back presentation, Jean-Francois Rainaud showed how the French Petroleum Institute is adding semantic ‘certified’ annotation to RML.
Paradoxically, one project that suffers from a lack of funding is Saudi Aramco’s effort to apply model based control to production optimization. Othman Taha reckons that the technology could hike Aramco’s recovery rates by 15%! Aramco’s fields have the instrumentation and communications infrastructure required to apply the same control technologies as used in gas plants and refineries. But the project is languishing in universities and R&D departments. Taha said, ‘Every time I say ‘optimization’ someone says ‘we tried that 20 years ago.’’
This article is an abstract of a longer Technology Report produced by Oil IT Journal’s publisher, The Data Room. More from www.oilit.com/tech.
There was a quality turnout for OpenSpirit’s Barcelona Technical Symposium with representatives from Statoil, Shell, Idemitsu Petroleum, OMV and KOC amongst others. 2009 was a good year for the OpenSpirit (OS) interoperability framework. Past year highlights include the Ocean-Petrel Application Adapter (OPAA) and ‘new partners by the dozen.’
Kay Sutter (SMT) admitted that integrated workflows had been something of a ‘hiccup’ for Kingdom in the past. Another issue for Kingdom users is the proliferation of projects; users want to consolidate, merge and subset them. Noting that OpenSpirit (OS) could do all this and more, SMT has opted to leverage OpenSpirit not just for connectivity with ‘foreign’ data stores but also for Kingdom-to-Kingdom data exchange, data management and data QC. Sutter reports that on-the-fly coordinate conversion is particularly appreciated, allowing ESRI cultural data to serve as a project backdrop and grids to be dragged and dropped from ArcGIS. The ‘next generation’ Kingdom data management release—with ‘white-labeled’ OS inside—will be available later this year.
Surendra Bhat’s presentation went into more detail on the new data management functionality. The 3.2.2 ‘next generation’ desktop comes with plug-ins for data selection, scan and copy. Rules can be written in the rule editor and a new model view function used to show, for instance, a subset of database tables. Rules can be developed for data quality initiatives and run as bulk edits—these can be versioned and run on selected source data. Regular expressions and a cron-type scheduler complete the picture, providing a single entry point for data modification with a log of changes and exceptions.
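The rules-plus-regular-expressions approach described above can be sketched in a few lines. The rule below (normalizing well names while logging changes and exceptions) is purely illustrative—it is not SMT’s actual rule syntax or API.

```python
import re

# Hypothetical bulk-edit rule: normalize well names of the form
# "well 12" / "Well_7" to "WELL-12" / "WELL-7". Records that do not
# match the rule are flagged as exceptions for manual review.
RULE = (re.compile(r"^well[\s_-]*(\d+)$", re.IGNORECASE), r"WELL-\1")

def apply_rule(records):
    pattern, template = RULE
    cleaned, changes, exceptions = [], [], []
    for name in records:
        stripped = name.strip()
        if pattern.match(stripped):
            new = pattern.sub(template, stripped)
            if new != name:
                changes.append((name, new))  # audit log entry
            cleaned.append(new)
        else:
            exceptions.append(name)  # rule did not apply
            cleaned.append(name)
    return cleaned, changes, exceptions

cleaned, changes, exceptions = apply_rule(["well 12", "Well_7", "A-99"])
```

Versioning such rules and running them from a scheduler against selected source data would give the single, logged entry point for data modification described above.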
Rune Hagelund described Roxar’s workflow integration with the new ‘.ROX’ framework and OpenSpirit. .ROX promises ‘seamless’ integration of applications and data, letting a user work with the tool best suited to a given task while offering access to functionality in external applications—all while working on data in the ‘optimal’ domain: GIS, geology, petrophysics or seismic. .ROX offers access to Roxar data and services and a dev kit including scripting with Python. Roxar is contracting with OpenSpirit for the development of an application adapter for the .ROX environment. The OS common model is being extended with better 3D grid handling, inter alia. More from email@example.com.
According to Honeywell’s Don Morrison, payback from advanced process control (APC) and optimization in oil and gas is the best of all of Honeywell’s verticals—from zero (immediate) to a mere two months. ROI benefits include increased production, lower operating costs, improved quality, flexibility and process stability. Honeywell’s Profit Suite targets process improvement by providing a ‘flexible and scalable control and optimization technology infrastructure spanning linear and nonlinear control through to multi-plant optimization.’ Profit Suite provides rigorous process models and simulations, and critically, engages the operator as the first line of support.
Jack Kiefert of the Enraf metering unit described terminals as ‘cash registers for high value liquid products.’ As such, accurate and safe integrated solutions are required to maximize operating efficiencies. What happens when an operator gets it wrong was illustrated by the Buncefield disaster in the UK. Buncefield was the fifth largest oil-products depot in the UK, with a capacity of about 1.7 million barrels of product. The 2005 explosion and subsequent fire generated over £1 billion in insurance claims. The root cause of the Buncefield incident (like the later Alyeska Pipeline incident) was a tank overflow. A UK government inquiry resulted in a lengthy report (www.oilit.com/links/1007_13) describing a chain reaction of equipment and operator failures. Kiefert recommends that ‘a printed copy of the Buncefield report should be in every office of a terminal or tank farm.’ The conclusions of the Buncefield report have informed Honeywell’s Tank Management system, which has been designed to provide additional, independent protection layers to mitigate overfill risk. These derive from Buncefield Recommendation N° 8 and include high-level detection that is independent of components internal to the storage tank, and more dependable, validated tank-level gauging systems with warnings and fault detection. One such component is the new OneWireless Flexline system for overfill prevention—claimed to be ‘the only wireless solution approved by the UK government.’ The new Flexline is under test with Chevron, ConocoPhillips, Total and Exxon. A real time interface to ERP systems from SAP and Oracle/JD Edwards is also available.
Tom Moroney, Manager of deepwater technology and production systems for Shell’s Upstream Americas unit, described how digital technology has been ‘harnessed’ to transform Shell’s production surveillance capability. Shell is developing an exception-based surveillance (EBS) capability leveraging technology from Matrikon, acquired by Honeywell in May 2010 in a $142 million deal. Shell’s ‘EBS Bridge’ is centered on an onshore collaboration center located in Shell’s downtown Houston offices. The Bridge collates data from engineering ‘doing tools’ such as Energy Components, PI System, Production Universe and Oilfield Manager. These feed into a ‘solid IT infrastructure’ comprising alarming, workflow integration, situational awareness and knowledge management (storage). Key Matrikon applications under the Bridge’s hood include Equipment Condition Monitoring, Operational Insight, Alarm Manager and the Net Objects business hierarchy model. Looking forward, Moroney sees continued development of Shell’s EBS capability in four areas—deeper analytics and modeling, SAP integration (maintenance and logistics), extended workflows spanning onshore and offshore engineering expertise and closed loop field management leveraging model-based optimization and control. More from www.honeywell.com.
Jean-Georges Malcor has been appointed CEO of CGGVeritas. Robert Brunck continues as Chairman of the Board. Apache Corp. has promoted David Carmony to VP Environmental, Health and Safety (EH&S).
A senior database administrator for GEXA Energy in Houston has been sentenced to 12 months in prison and fined $100,000 for hacking his employer’s computer network.
Energy Ventures has appointed Shantanu Agarwal to associate and Kristian Lier to investment manager. Agarwal hails from Schlumberger.
Shelly Leedy is ENGlobal’s SVP business development and Steve Kelly is interim president of the automation segment.
Tim Cejka is to retire as president of ExxonMobil Exploration. He will be replaced by the president of ExxonMobil Upstream Research, Steve Greenlee.
The Founder Institute has launched a training program for technology entrepreneurs in Houston. The school will be run by Enterprise Builders.
FMC Technologies has elected Eleazar de Carvalho Filho to its board. Carvalho served on the Petrobras and Vale boards.
Eric Gebhardt is senior general manager and VP at GE Energy Services.
Bill Moll has joined Geokinetics as VP and corporate secretary. He was formerly with Patterson-UTI Energy. Scott Zuehlke, formerly of Hercules Offshore, has joined as director of investor relations.
The French Petroleum Institute IFP has changed its name to ‘IFP New Energy,’ to more closely reflect the increasingly ‘green’ nature of its research programs.
Ingrain has promoted Marcus Ganz from COO to CEO, replacing co-founder Henrique Tono, who becomes chairman.
Idea Integration has joined the Microsoft Upstream Reference Architecture Initiative.
Saudi Aramco and GeoFields received an ESRI award for their work on Aramco’s pipeline integrity management system.
Isilon has appointed Eric Brodersen Senior VP marketing. He hails from Quantum. Former Openwave CEO Don Listwin has joined the Isilon board.
Roger van Lier has been named Mahindra Satyam business development executive for the Benelux at its Rijswijk office.
Marie-Ange Debon, General Secretary of Suez Environment Group, has been named administrator of Technip.
Weatherford has named former US Secretary of Energy Sam Bodman, Guillermo Ortiz and Emyr Jones Parry to its board.
David Pyles has been promoted to Director of the Chevron Center of Research Excellence.
Landmark has launched R5000User.com, a ‘read only’ blog for R5000 technical information. Users can add their own comment—by email!
Joe Craver now heads-up SAIC’s Infrastructure, Energy, Health and Product Solutions Group which includes what used to be IT and Network Solutions.
Remote information systems provider Skymira has appointed Mary Dyer Business Development Manager.
Gary Lundeen is heading up Westheimer Inc. in Houston.
Microsoft informed us that Schlumberger is indeed a founder member of the Microsoft Upstream Reference Architecture (MURA). A Schlumberger spokesperson added, ‘Schlumberger is a partner in Microsoft’s Upstream Reference Architecture. We do not have any further statements.’
Seismic Micro-Technology founder Tom Smith has just launched a new company, ‘Geophysical Insights,’ to develop advanced technology for seismic data interpretation with neural nets.
Reservoir Group has acquired Enigma Data Solutions. Terms of the deal were not announced. The acquisition will reinforce RG’s information services division, which includes previously acquired data management specialist Info Asset. More from www.reservoir-group.co.uk.
Fugro has entered into an agreement to purchase $20 million of senior secured bonds from Electromagnetic Geoservices (EMGS). The deal ‘represents Fugro’s interest in establishing marine EM as a valuable tool in oil and gas exploration.’ The bonds yield quarterly coupons of Libor + 8% and are secured against revenues from a recently awarded Pemex work program.
HitecVision is to acquire Statoil’s Tampnet unit. Established in 2001, Tampnet’s fiber backbone and radio links provide high bandwidth communication to some 60 offshore platforms operated by a multitude of oil companies. RBC Capital Markets advised Statoil on the transaction.
ABB has acquired level detection specialist K-TEK, which will be rolled into ABB’s Process Automation division. K-TEK claims over 350,000 installations worldwide.
Merrick Systems has received funding from Main Street Capital Corp. The amount of the ‘senior secured term debt’ was not disclosed. GulfStar Group advised Merrick on the transaction.
Seven venture capital investors have announced the formation of the Oiltech Investment Network to discover and bring to market new E&P technologies. The Oiltech line-up is Chevron Technology Ventures, Energy Ventures, Epi–V, Lime Rock Partners, SEP, Shoaibi Group and Viking Venture. The pioneering network is managed by OTM Consulting—more from www.oiltechinvest.com.
Total Safety has acquired Houston 2-Way Radio offering a range of portable and fixed wireless communication products and solutions.
Following our review of the Data Management Body of Knowledge (DM-BOK) last month, we ordered the companion volume—the Dictionary1, sold as a CD-ROM containing a single PDF file. We were curious to see if this work would be the Linnaean classification of data management that we have been looking for—and, more prosaically, how its definitions stack up against Wikipedia. First impression is that it is rather a lightweight production: the 850 definitions occupy 140 pages of rather spaced-out text, probably no more than 30,000 words in all.
In the introduction, the authors note—and we can only concur, that the industry is ‘in great need of clarity in its terminology.’ They further observe that ‘While experts may never reach 100% agreement, we believe there is general agreement with these definitions across the data management profession.’ Well that would be nice if it were true, but one of Wikipedia’s strengths is the way it captures enlightening differences of opinion.
So what’s it worth? The definition for ‘online analytical processing’ is ‘The use of multi-dimensional databases and/or analytical tools to support the analysis of business trends and development of business projections.’ This is OK as far as it goes—but it lacks narrative. It fails to explain the relationship between a normalized database’s poor query performance and the case for OLAP. Wikipedia hits the nail on the head in its first sentence viz., ‘an approach to swiftly answer multi-dimensional analytical queries.’
How about ‘master data?’ This is ‘synonymous with reference data’ and is ‘data that provides the context for transactional data including details of internal and external objects involved in business transactions.’ That is OK but not much use outside of the business/financial services environment of DAMA.
The synonym, ‘reference data,’ curiously has a slightly different definition: ‘data used to categorize other data, or for relating data to information beyond the boundaries of the enterprise—see master data.’
Semantic ‘has to do with meaning, usually of words and/or symbols’ while a semantic data model (synonymous with an ontology) is ‘a [..] data model for non-tabular data [that] makes meaning explicit so that a human or software agent can reason about it.’ Another lexical run-around occurs when you check out ontology—‘a semantic data model [as above]’ and ‘a synonym for schema.’
I am afraid that the DAMA Dictionary’s usefulness is rather limited. It might be useful for looking up the odd acronym or checking out a business-oriented definition of a term. But Wikipedia beats it hands down for depth and, we believe, authority. Wikipedia, unlike the Dictionary’s CD-ROM/PDF, is also more likely to be to hand when you need it.
1 The Data Dictionary of Data Management 1st ed. 2008, Mark Mosley, Editor, Technics Publications. 140 pages. $44.95.
New ‘customer spotlight’ white papers from Visualization Sciences Group (formerly TGS/Mercury) show how Petrosys and SeisWare have leveraged VSG’s Open Inventor (OI) graphics library to extend their software to 3D. Open Inventor is an object-oriented, cross-platform 3D graphics toolkit for the development of interactive applications using C++, .NET or Java. OI comes as an application programming interface (API) for rapid prototyping and development of 3D graphics applications. OI extends the OpenGL graphics standard with a library of components for visualization and GPU-based rendering.
Petrosys’ new 3d-viz application extends its geoscience mapping package to 3D and supports a complex collection of surfaces and industry-specific objects. 3d-viz offers horizontal and vertical clipping planes, multiple light sources, a raster hard copy, movie making facilities and full interactive controls from Windows or Linux desktops.
The latest 7.0 release of SeisWare’s eponymous seismic interpretation suite introduced a 3D Seismic Visualizer. SeisWare leveraged its knowledge of OI’s C# interface in the development. The new 8.1 OI release offers a display rate ‘close to graphics hardware peak performance.’ A dynamic resolution mechanism applies rendering progressively, refining image quality on the fly. The new release is optimized for Nvidia QuadroPlex systems. More from www.vsg3d.com.
We popped in on Interactive Network Technologies at the EAGE to check out its visualization components a.k.a. ‘widgets.’ INT’s widgets are widely used by upstream software developers to display seismics, logs, well schematics and more. The latest OpenGL/Java offering from INT farms out rasterizing to a GPU renderer. The result is a sharp seismic display with pan, zoom and annotation capability and various combinations of color, variable area and wiggle. But the fun starts when the display switches to 3D. Seismics, logs—even elaborate multi-track composites—can be viewed and manipulated in 3D. Geobodies can be contoured and annotated. By changing a single line of code, developers get GPU speedup and an extra dimension! The new INTViewer 4.1 leverages the NetBeans rich client platform. GIS display enhancements provide support for projections when loading shape files. INTViewer relies on Nokia Qt for cross platform functionality on Windows, Mac, Linux and UNIX operating systems. More from www.int.com.
Noble Americas has implemented Amphora’s Symphony ETRM solution to help manage its global crude and derivatives trading.
Avere Systems has announced that Ion unit GX Technology has deployed six FXT 2700 appliances, each with eight 64GB SLC Flash SSDs. GXT reports a 2.5-fold speedup of some of its seismic processing jobs.
IHS is partnering with Tyler, Texas-based provider of surface-ownership data OneMap. IHS will market and license OneMap’s data, with initial coverage in Texas including the Permian Basin, Gulf Coast and Northeast Texas regions, and will expand to include other states, primarily focused on the shale plays.
P2 Energy Solutions is partnering with Trinity Management Consulting to support P2’s Excalibur product line.
Knowledge Reservoir and 3GiG have signed with Hess Corporation to implement a global Asset Management Portal. The initiative uses 3GiG’s Prospect Director solution in the design and implementation of an enterprise-wide, business information management system to support the global asset management process at Hess.
The Technip/Samsung consortium has appointed Emerson Process Management as main automation contractor for Shell’s ‘Prelude,’ the world’s first floating liquefied natural gas (LNG) development, in the remote Browse Basin off the coast of Western Australia.
Fugro General Robotics has won an order from the Haugesund Simulator Center for a DeepTouch ROV simulator to be integrated with a Kongsberg offshore vessel simulator providing an operations training platform.
Whitestar and PennWell Corp.’s MAPSearch unit have partnered to create a ‘complete data solution for energy industry cartography and analysis.’ Whitestar base layers, well and culture data and digital imagery blend with MAPSearch’s energy infrastructure layers. The collaboration will start with the co-marketing and selling of their respective product lines. Both companies offer complete coverage of North America.
Toyo Engineering Corp. is the first engineering company in Japan to purchase a perpetual version of SPT Group’s OLGA multiphase dynamic simulator to provide flow assurance services to the upstream oil and gas market.
Accenture has signed a five-year business process outsourcing (BPO) contract with Statoil, to manage the company’s Accounts Payables processes. Financial details were not disclosed.
Integra Group has teamed with Schlumberger’s WesternGeco unit to create ‘IG Seismic,’ offering land seismic acquisition, data processing and interpretation services in Russia, Kazakhstan, Uzbekistan and Turkmenistan. WesternGeco holds a minority 25% stake in the venture. The new unit will operate a combined capacity of 40 seismic crews and 900 CPUs of computer processing power.
ABB and Statoil have signed a $23 million three year framework agreement for modification and maintenance of telecommunications systems for all offshore installations on the Norwegian continental shelf. ABB’s oil, gas and petrochemical business unit also provides pilot studies, construction, testing, installation and commissioning of modifications to telecommunications systems. Other deliverables include data networks, telephony, radio (UHF and VHF), public address, meteorology and radar systems.
Following an ‘extensive and careful selection process,’ Siemens has chosen Information Builders’ WebFOCUS as its global business intelligence solution for new reporting applications and will add the BI platform to its list of standard platforms.
Energistics is inviting comment on Version 1.4.1 of its WITSML well data transfer standard. The new release extends the data footprint with a well fracturing service and ‘StimJob’ data object. This provides a ‘comprehensive summary’ of a stimulation operation and can be scaled to carry the amount of detail required for service company or operator reporting. The object expands the range of workflows available to assist wellbore construction operations. Comments on the new release are due by September. Energistics expects the new version to be published in December. More from www.energistics.org/witsml-standard.
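As a rough illustration of how a consuming application might read such an object, the snippet below parses a skeletal WITSML-style XML document with Python’s standard library. The element names and structure here are placeholders—consult the published 1.4.1 schema for the actual StimJob layout.

```python
import xml.etree.ElementTree as ET

# Illustrative, simplified WITSML-style document. Element names
# (stimJobs, stimJob, nameWell, jobType) are assumptions for the
# sketch, not the normative 1.4.1 schema.
DOC = """<stimJobs version="1.4.1">
  <stimJob uid="sj-001">
    <nameWell>Example-1</nameWell>
    <jobType>fracture</jobType>
  </stimJob>
</stimJobs>"""

root = ET.fromstring(DOC)
# Collect (uid, well name, job type) for each stimulation job.
jobs = [(j.get("uid"), j.findtext("nameWell"), j.findtext("jobType"))
        for j in root.iter("stimJob")]
```

Because WITSML is plain XML, any schema-aware toolkit can validate such documents against the published XSDs before they enter a reporting workflow.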
The Society of Exploration Geophysicists (SEG) is forming a Technical Section to establish guidelines for ‘Best Practices in Geophysical Data Management.’ Troika’s Jill Lewis considers that the initiative has ‘critical mass’ and invites interested parties to contact her to join. More from www.troika-int.com/contact-us.
The W3C has published a new standard for web based ‘rule systems.’ The Rule Interchange Format (RIF) was developed to provide cross system interoperability. More from www.oilit.com/links/1007_19.
A live DVD with the open source Madagascar seismic processing package installed is available at www.oilit.com/links/1007_20.
A new Oil IT Journal security section this month, inspired by the advent of the Stuxnet trojan. Stuxnet malware is distributed on USB sticks and affects Siemens’ Simatic WinCC SCADA data browser. WinCC provides process visualization and ‘plant intelligence.’ The act of viewing the content of a USB stick can infect a system. According to Siemens1, no cases of damage to customers’ sites have been reported to date. Siemens recommends running Trend Micro’s Sysclean to remove the malware. Some have speculated2 that Stuxnet may herald a wave of new control system attacks.
With what appears to be good timing, Houston-based process software specialist PAS has released ‘Integrity Recon,’ a package that improves reliability of Microsoft Windows-based automation systems and process control networks. Integrity Recon monitors and reports KPIs and control system vulnerabilities, improving infrastructure management and helping monitor aspects of the control systems, such as license management. Recon also aids with common operating environments (COE) compliance. COEs are approved configurations of hardware and software that enable security, reliability, and facilitate troubleshooting. Hardware and software upgrades are also facilitated by identifying existing components and dependencies. More on such ‘exploits’ from the US National Vulnerability Database www.oilit.com/links/1007_18.
Tibco has announced a new ‘Silver Spotfire’ offering of business intelligence and analytics from the compute cloud. Silver supports remote creation of custom dashboards, data visualization and reports. According to Tibco, the cloud paradigm brings a ‘social’ dimension to business intelligence—adding collaboration and sharing of data, reports and calculations. Silver integrates with social media, allowing users to embed live dashboards into their business blogs and online articles. Silver Spotfire is available at no charge for the first year. Monthly hosting options are available for ongoing usage thereafter. More from www.tibco.com.
Erdas has released ‘Apollo on the Cloud’ (AOC), a cloud-based geospatial data management and delivery solution. A turnkey, pay-as-you-go solution, AOC lets organizations maintain entire geospatial serving operations in a secure, scalable environment, eliminating the need for in-house hardware, IT head count and expertise. President Joel Campbell said, ‘Many GIS users do not have the capability or infrastructure to support a geospatial server. AOC brings a cloud-based geospatial solution to this group of customers.’ AOC is a hosted version of Erdas’ Apollo geospatial data management solution. Erdas has plans for more cloud-based offerings in the future. More from www.erdas.com/apollocloud.
ESRI’s ArcGIS Server is now available on the Amazon EC2 cloud. Preconfigured ‘virtual’ images of ArcGIS Server running on the cloud are available to licensed ESRI customers on request. More from www.oilit.com/links/1007_15.
Honeywell has announced a greenhouse gas (GHG) emissions management offering, a portfolio of energy services that uses ongoing measurement, monitoring and real-time emissions data to meet sustainability goals and comply with US Environmental Protection Agency (EPA) regulations covering organizations that produce more than 25,000 metric tons of GHGs per year. Few organizations have the in-house resources needed to catalog current emissions and even fewer have the ability to use the information to develop a strategic plan to reduce their environmental footprint. The service offers monitoring and reporting, web-based dashboards that pull real-time data into a single view, and communications tools and support. Honeywell also helps clients sell offsets, carbon credits and renewable energy credits.
Statoil threw the switch on its Gjøa project this month—heralding Norway’s first floating platform to get electricity from the mainland. The company plans to avoid 210,000 tonnes of carbon dioxide (CO2) emissions per year by providing the Gjøa platform with electric power from the shore. A 90,000-volt, 100km long cable links Gjøa to the mainland. ABB helped Statoil develop the technology that replaces 40 megawatts of gas turbine power generation.
Beijing-based LianDi Clean Technology has announced two new software sales to Karamay Petrochemical Company and Fushun Petrochemical Company. LianDi’s dynamic simulation training system will be used to ‘improve utilization and increase energy efficiency’. The deals amount to $2.7 million.
Paradigm has unveiled its plans for a 64 bit Windows 7 version of its ‘seismic-to-reservoir’ application suite along with its Epos integration and interoperability framework. The Windows version will offer the same look and feel as Paradigm’s Linux offering. The update complements existing Windows versions of Paradigm software for structural and reservoir modeling and engineering, formation evaluation, geological cross-section and correlation and drilling engineering. Paradigm CTO Duane Dopkin said, ‘Our track record of delivering modeling, geological, formation evaluation and drilling solutions on Windows has allowed us to leverage in-house expertise to help with the port.’
Phil Neri, Paradigm’s head of marketing, told Oil IT Journal, ‘The move to Windows follows demand from some of our clients whose IT departments are driving the move from Linux. Not all companies are doing this however. We achieve the Windows port thanks to Nokia’s Qt cross platform development kit. We are fundamentally a multi-platform company. Windows 7 is just another port for us, the heavy lifting has already been done by Qt. And our EPOS data infrastructure is very good at hiding multi-OS dependencies.’ Paradigm’s roadmap envisages product availability on Windows 7 by mid 2011. More from www.pdgm.com.
Barcoding Inc. has just received a US patent for ‘e-Action,’ a wireless, internet-based system for alert transmission and action. Designed to be worn or carried by the user, e-Action transmits critical commands from any location on the worksite. The system enables an individual who witnesses an emergency or spots dangerous irregularities to immediately stop all mechanical systems. In addition to manual control and override by a user, the e-Action system can be configured to detect and react to pre-defined physical or biological parameters. Corrective action can be taken when such baselines are exceeded, shutting down equipment immediately.
Originally designed for use in manufacturing and distribution, the customizable solution is adaptable to other users of heavy machinery including oil and gas drilling and production. e-Action, which was developed by Steinmetz and engineering consultant Larry Cuthie, will be available in late 2010. More from www.barcoding.com.
Seahawk Drilling has signed with Houston-based RigNet for the provision of managed communications for its fleet of offshore rigs including telephony and internet services. Seahawk is also utilizing RigNet’s ‘TurboNet’ service, that leverages Riverbed’s Application Accelerator appliance to improve performance of applications running over high latency satellite links. More from www.rig.net.
As the release asks, ‘Would you want to do serious process design on your phone?’ Alph, RedTree Development’s process calculator for the iPhone and iPod Touch (Oil IT Journal Jan 2010) showed that modern smart phones are capable of solving sophisticated flowsheet and distillation problems. But some engineers still doubt the smart phone’s capacity to do ‘real work.’
RedTree thinks the 1.3 Alph release, which now runs on an Apple iPad, may change such perceptions. RedTree president Craig Morris (the original author of the Hysim process simulator) said, ‘The iPad just seems like more of a work device and is friendlier for fat-fingered folk. And with eight times as much screen area as the iPhone, the iPad certainly provides a much less cramped workspace.’
The program embeds a subset of the ‘industry proven’ thermodynamics in Virtual Materials Group’s database of compounds along with the Peng Robinson and Steam95 property packages. Not quite a replacement for enterprise class simulators but a useful tool for light hydrocarbon and petro-chemical problem solving and an introduction to VMGSim—the high end tool. More from www.virtualmaterials.com and www.redtree.com.
Shell’s game is changing with bewildering speed. Its latest technological innovation hails from UK defense contractor QinetiQ in a ‘swords-to-ploughshares’ deal that is to convert QinetiQ’s ‘OptaSense’ perimeter security technology for downhole use.
OptaSense uses standard telecom fiber-optic cable to detect, discriminate and locate acoustic events during wellbore operations ‘far beyond what is currently available.’ OptaSense turns a standard fiber strand into an array of ‘virtual microphones’ spaced between 1m and 15m apart, over fibers up to 50km in length. From one location, OptaSense can monitor 100km of asset, creating 10,000 sensors instantaneously without the need for any in-field operations. The technology has been tested in an 18-month field trial in Canada and is now being used in onshore field development and exploration—especially in acoustic monitoring of hydraulic fracturing of shale gas reservoirs.
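A quick back-of-envelope check shows the quoted figures are mutually consistent: a 50km fiber interrogated at 5m channel spacing (mid-range of the 1m-15m quoted) yields the 10,000 virtual sensors claimed.

```python
# Sanity check of the figures quoted above. The 5 m spacing is an
# assumption taken from the middle of the 1-15 m range in the text.
fiber_length_m = 50_000      # 50 km fiber run
channel_spacing_m = 5        # assumed virtual-microphone spacing
virtual_sensors = fiber_length_m // channel_spacing_m  # 10,000
```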
QinetiQ’s OptaSense is already used in onshore pipeline intrusion monitoring. Shell VP R&D Jeroen Regtien said, ‘Given that most of this technology is in existence today in the defense and security industries, we anticipate a relatively low development risk and expect to deploy the first system soon.’ More from www.qinetiq.com.