As you will no doubt remember, before Macondo, BP was in the firing line following a complaint by a whistleblower, aided by an activist group, Food and Water Watch (FWW), that BP failed to properly manage the engineering documentation on the Atlantis production platform in the Gulf of Mexico. The whistleblower alleged that BP did not properly maintain the engineer-approved as built drawings of Atlantis systems and structures and that this increased safety risks for the facility and its personnel.
We wrote about this at the time (Oil ITJ July 2009) but we were practically alone. Despite FWW’s best efforts at calling press conferences (which were rather well attended by the news services and mainstream media) the story got little coverage. Not, that is, until Macondo, when the mainstream media and the blogosphere all piled in with ‘evidence’ that Atlantis was another Macondo waiting to happen and that production should be shut in forthwith.
Back in 2009 we commented as follows, ‘Speakers at document and data conferences across the upstream, from geoscience to construction, have bemoaned the parlous state of their data and the difficulty of getting adequate resources for its management. Some have forecast exposure to regulatory risks and non compliance ‘issues.’
Many companies are struggling to address the problem of maintaining up-to-date engineering documentation across the complex design, build and commission life cycle of the modern offshore facility. Whatever the outcome, the FWW case will be music to the ears of engineering document management software vendors’!
Early this month the Bureau of Ocean Energy Management, Regulation and Enforcement (Boemre) released the findings of its investigation into Atlantis’ engineering documentation. Their report1 should be compulsory reading for all involved in IT, data and/or engineering. The report tracks how BP coordinated engineering documentation from a large number of design and build contracting organizations and how it handled a significant modification to the subsea system, when inspection identified a metallurgical anomaly in a pipeline termination. Ironically, it was the decision, by BP, to rebuild the manifold in the interests of ‘preventing an inadvertent discharge into the Gulf of Mexico’ that led to the charge that its engineering documents were not up to date!
Changes in engineering specification and rebuilding part of a facility—retrofits, revamps and the like—are, on the one hand, well understood processes that happen all the time. On the other hand, they are rather tricky to track with IT systems that may or may not have been designed for such work. We are in Matthew West country here (see my January 2011 editorial) with the issue of ‘four dimensional’ data modeling that captures all successive states of a model. If you don’t believe me, check out the offerings from Sword CTSpace and Aveva on page 12 of this issue. As the Aveva release states, ‘The reality of the situation for most Owners Operators is that information quality and accessibility is still a significant challenge to manage...’
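For readers of a programming bent, the ‘successive states’ idea is easy to sketch. The Python fragment below is illustrative only (the class and field names are our own invention, not Matthew West’s formalism): each drawing revision is a new record with a validity interval, so any historical state of the model can be retrieved.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class DrawingState:
    """One immutable state of an engineering drawing (hypothetical model)."""
    doc_id: str
    revision: str
    valid_from: date
    valid_to: Optional[date]  # None means this is the current state

class DrawingHistory:
    def __init__(self):
        self.states = []

    def supersede(self, doc_id, revision, on):
        # Close the currently open state (if any) and open a new one.
        for s in self.states:
            if s.doc_id == doc_id and s.valid_to is None:
                s.valid_to = on
        self.states.append(DrawingState(doc_id, revision, on, None))

    def as_of(self, doc_id, when):
        # Return the state that was valid on a given date, or None.
        for s in self.states:
            if (s.doc_id == doc_id and s.valid_from <= when
                    and (s.valid_to is None or when < s.valid_to)):
                return s
        return None
```

Nothing more than ‘never delete, always supersede’—but as the Atlantis saga shows, enforcing even this simple discipline across dozens of contractors is another matter.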
The headline news is that Boemre cleared Atlantis as fit for service. But this conclusion was reached in a rather tortuous way. The executive summary of the report states that, ‘The electronic document database that BP used to store documents developed during the design, construction and installation of Atlantis was disorganized and inadequate to handle the large volume of documents generated by BP and its contractors. In addition, BP used a confusing labeling system for engineering drawings in the project files. Those drawings had other defects and deficiencies, including undated and missing stamps and signatures, and inconsistent titles for types of drawings.’
However Atlantis was judged safe because ‘the documents in the electronic database were not the materials relied upon by Atlantis operations personnel.’ This sounds familiar: rather than relying on some half-baked ‘database,’ BP’s engineers were rolling their own—or as Boemre put it, ‘the process for transferring the drawings and documents used by operations personnel on the Atlantis facility involved the use of systems handover packages, [...] engineering documents and drawings compiled by a team [...] including engineers with knowledge of each component being handed over [...] prior to the start of production.’
In other words, the document handover process is more ‘standard operating procedure’ than all-singing-and-dancing ‘digital oilfield’. While the digital oilfield aficionados like to ‘dis’ Excel as a data repository, what is important is that the documents are correct, whatever the format. Of course handing over a bunch of Excel files and CAD documents is not a perfect solution, hence Boemre’s qualifying remarks above. But as Boemre director Mike Bromwich concluded, ‘Although we found significant problems with the way BP labeled and maintained its engineering drawings and related documents, we found the most serious allegations to be without merit, including the suggestion that a lack of adequate documentation created a serious safety risk on the Atlantis facility. We found no credible evidence to support that claim.’
However, the legal action drags on. Late this month a federal judge declined to dismiss a lawsuit from the whistleblower claiming that BP ‘took $10 billion worth of oil and gas belonging to the U.S. government by submitting false documents about operations at its Atlantis rig.’
The Boemre report is interesting in that it shows how attractive the ‘digital oilfield’ approach is in that much of the naming issues could be fixed by a database. But an intellectually unchallenging task like document workflow becomes nightmarishly complex as the real world maps to the IT system. And the shift from clunky but tried and tested manual processes to a shiny new database is tricky to say the least.
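To see why even a simple database helps with the labeling problem Boemre described, consider this hedged sketch. The naming convention below is entirely hypothetical (BP’s actual scheme is not public), but once any pattern is agreed, auditing a document store against it is a few lines of code.

```python
import re

# Hypothetical convention: AREA-DISCIPLINE-SEQUENCE-REVISION,
# e.g. 'ATL-PIPE-0042-B'. The pattern is illustrative, not BP's.
LABEL = re.compile(r'^[A-Z]{3}-[A-Z]{3,5}-\d{4}-[A-Z]$')

def audit(labels):
    """Partition drawing labels into conforming and non-conforming lists."""
    ok, bad = [], []
    for lbl in labels:
        (ok if LABEL.match(lbl) else bad).append(lbl)
    return ok, bad
```

The hard part, of course, is not the regex but getting thousands of legacy documents relabeled, which is where the ‘nightmarish complexity’ comes in.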
Santos has turned to the open source community for a remote visualization solution for its geoscientists accessing an in-house web portal from their Red Hat Linux desktops. Various proprietary thin client solutions were trialed and found wanting. However, two open source projects, VirtualGL1 and TurboVNC2, offered high performance, hardware-accelerated 3D graphics deployable across the network.
Santos signed up as a VirtualGL project sponsor in May 2010 and has since been working to upgrade the TurboVNC solution to ‘rock-solid’ enterprise standards.
The thin client solution outperforms the ‘traditional’ 64 bit workstations to the extent that geoscientists are running their Red Hat desktop via TurboVNC exclusively, even requesting that the workstations be removed to give them more desk space.
Santos now caters for more users by adding servers to the TurboVNC farm and has decommissioned infrastructure in its interstate offices. Now all Linux and associated geoscience applications (such as Paradigm’s seismic interpretation and visualization tools) run in the Adelaide cloud, benefiting from more processing power and shared NFS storage. Data management has been simplified as users now see the same applications, disks and databases. The solution has brought immediate capex savings of around $1.8 million and a reduction of $750,000 in yearly admin costs.
Geoscience Systems Specialist Darren Stanton said, ‘Our sponsorship of the TurboVNC and VirtualGL projects gives us direct access to the technical brains that have made this all possible. Any bug-fixes or feature enhancements are dealt with quickly, and it’s not uncommon to have a new version of code sitting in our inbox ready for testing the morning after emailing a request to the programmers. The move to Open Source thin-client deployment has been a huge success for us in so many ways, and we would encourage other companies to adopt and support Open Source technology.’
TurboVNC hides network latency by decompressing and drawing a frame on the client while the next frame is being fetched. The system provides authentication with one-time passwords or Unix login credentials. Over 30 megapixels/second of image data can transit a 100 Mbps LAN with ‘imperceptible’ loss of quality.
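A quick back-of-the-envelope calculation shows why compression is doing the heavy lifting here. Assuming 24-bit RGB pixels (our assumption, not a published TurboVNC figure), 30 megapixels/second of raw imagery would need over seven times the capacity of a 100 Mbps LAN:

```python
# Back-of-envelope: raw bandwidth of the cited pixel rate vs. the LAN.
# 24-bit RGB is an assumption; TurboVNC's JPEG encoding varies with
# content and quality settings.
pixels_per_s = 30e6           # 30 megapixels/second, per the text
bits_per_pixel = 24           # uncompressed RGB
raw_mbps = pixels_per_s * bits_per_pixel / 1e6   # raw Mbit/s
lan_mbps = 100                # the 100 Mbps LAN cited above
ratio = raw_mbps / lan_mbps   # compression factor needed to fit the pipe
print(raw_mbps, round(ratio, 1))
```

In other words a sustained ~7:1 effective compression, which JPEG-style encoding of screen imagery achieves comfortably at the ‘imperceptible’ quality levels claimed.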
Santos is now sponsoring the migration of the TurboVNC codebase to ‘TigerVNC,’ based on the newer RealVNC 4 and X.org codebases. Early work on both VirtualGL and TurboVNC benefited from sponsorship from Sun Microsystems and Landmark Graphics, which also contributed seed IP to the project.
1 www.oilit.com/links/1103_1 and www.oilit.com/1103_2.
Following recent incidents at offshore installations, refineries and storage facilities, the UK-based Energy Institute (EI) has published a ‘high level framework’ for process safety management (PSM), a 40 page download1. The PSM framework provides a ‘simple and systematic approach’ to defining what organizations need to get right in order to assure the integrity of their operations.
Process safety is a blend of engineering and management skills focused on preventing catastrophic accidents such as structural collapse, explosions, fires and dangerous releases.
PSM builds on the findings of the Baker Report on BP’s Texas City refinery fire which noted that ‘The passing of time without a process safety accident is not necessarily an indication that all is well!’
The Framework defines four ‘focus areas’—‘process safety,’ ‘risk assessment,’ ‘risk management’ and ‘review and improvement.’ Alongside these, a set of guidelines is provided to ensure adequate leadership, compliance, workforce selection and involvement and other aspects of PSM. The EI is also working on an online Process Safety Survey (EIPSS) self-assessment and benchmarking service to assess compliance with the framework.
Microsoft CEO Steve Ballmer visited Houston this month—speaking at both the Houston Technology Center and at CERA Week. Ballmer opined that, to satisfy energy demand growth of around 40% by 2035, ‘We’re not going to be able to just conserve our way into our energy future. Something has got to happen, on the back of not just efficiency and conservation, but on the back of innovation, underpinned by information technology.’ Ballmer’s thesis is that consumer technology, such as instant messaging, migrates into the business environment. ‘Opportunities open up with these technologies in business and specifically, in energy.’
Current ‘powerful ideas’ include augmented reality, ‘helping people assemble and fix complex machinery,’ technology that ‘bends and folds and shapes,’ and wall-sized displays that support real time and interactive training, communication and analysis. Some of this will be available in the very near future while other components will remain unrealized for a decade or so.
What the industry really wants is to model the physical world in the virtual world, and use that modeling to guide behavior and decisions. Enter the ‘most important technology shift of our generation’: the transformation of the IT backroom by cloud computing, ‘combining the power of intelligent devices like personal computers, phones, even electric cars, with the breadth of the internet and the programmability and security that we all expect today in our own datacenters.’
Ballmer envisages ‘seas of computers (so cheap and inexpensive you throw them away) that collaborate to store massive amounts of data at low cost and high computational efficiency.’ In turn this brave new world will require a ‘reinvention of the way we think about building and distributing software.’ This will leverage social networking and cloud storage à la Picasa. Ballmer even managed to segue into the X-Box 360 Kinect controller whose gesture capture and voice control functionality could be harnessed and applied anywhere, from a classroom to an offshore platform to a conference room.
‘We think about putting this kind of technology in an oil and gas collaboration room in Houston, and having engineers in the farthest flung parts of the world be able to communicate with one another to safely investigate and manipulate, for example, a subsea reservoir model and collaborate with the offshore engineering teams with the kind of precision that is as easy with words and hand gestures but perhaps very difficult with a keyboard.’ Virtual world models can be manipulated by rotating your hand and the results used in real time to perform actions at a remote location.
Entire ecosystems of service providers will be connected virtually, working off the same data and providing secure collaboration and data integration for remote service and repair. A variety of computing devices will let you visualize and understand the data that helps you make smart, safe decisions quickly, as you manage digital oilfields, smart grids and other critical infrastructure.
Ballmer cited Baker Hughes as a poster child in the context of reservoir simulations that ‘used to take nine months to build and run.’ Now the same job takes less than 30 days in the Windows Azure cloud. Ballmer wound up with a brief plug for the Smart Energy Reference Architecture and a rather low-key mention of similar work in the upstream. More from www.oilit.com/links/1103_47.
Speaking at the Tibco Spotfire Energy Forum in Houston this month (more in next month’s issue), Scott Biagiotti described Hess’ ‘Opportunity Register,’ an in-house developed application that manages Hess’ exploration and development opportunities. Opportunity Register allows Hess to evaluate worldwide investment opportunities in as impartial a manner as possible—even in the face of disparate, unstructured information from diverse sources. The Opportunity Register was built with 3GiG’s Prospect Director toolkit.
Information is entered directly into the system by Hess’ knowledge workers, capturing opportunities at a fairly detailed level. Opportunities can be evaluated with a multiple scorecard system according to NPV, reserves added, risks and impediments and costs saved.
The philosophy behind the register is that the most valuable information for evaluating an opportunity is as likely to reside in the head of a subject matter expert as in a corporate database. Hence the system addresses the capture of ‘soft’ information and subjective ranking criteria. The idea is to avoid evaluating a prospect or opportunity on the basis of a ‘good’ presentation. The Register turns qualitative evaluations into quantitative analyses that are amenable to further study using Spotfire—in particular the Spotfire Web Player, which allows for collaborative analytics from a web browser. More from www.oilit.com/links/1103_27 (Spotfire) and www.oilit.com/links/1103_28 (3GiG).
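To illustrate how such a scorecard turns qualitative judgment into a ranking, here is a minimal sketch. The criteria, weights and ratings are invented for the example (Hess’ actual metrics and weightings are not published); ratings run 0-10 with higher always better, so ‘risk’ here is really risk manageability.

```python
# Illustrative weighted scorecard -- criteria and weights are our own
# invention, not the Opportunity Register's actual configuration.
WEIGHTS = {'npv': 0.4, 'reserves_added': 0.3, 'risk': 0.2, 'cost_saved': 0.1}

def score(opportunity):
    """Collapse per-criterion ratings (0-10, higher is better) into one number."""
    return sum(WEIGHTS[k] * opportunity[k] for k in WEIGHTS)

prospects = {
    'Block A': {'npv': 8, 'reserves_added': 6, 'risk': 4, 'cost_saved': 5},
    'Block B': {'npv': 5, 'reserves_added': 9, 'risk': 7, 'cost_saved': 3},
}
# Rank opportunities by weighted score, best first.
ranked = sorted(prospects, key=lambda p: score(prospects[p]), reverse=True)
```

The resulting numbers are only as good as the subjective inputs, of course, which is why tools like the Spotfire Web Player matter: they let reviewers interrogate the assumptions behind the ranking rather than just admire a ‘good’ presentation.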
Paradigm reports industrial-strength use of the computationally intense technique of reverse time migration using code running on high performance graphics processing units (GPU) from Nvidia Corp. Reverse time migration (RTM) is a preferred solution for imaging complex structures such as are encountered in the deepwater Gulf of Mexico. RTM (actually a depth imaging solution) code has been around for a while, but it is only in the last few years that computing power has been capable of imaging large 3D data sets.
Acceleware’s RTM library leverages Nvidia’s CUDA compiler that allows the same code to run on either GPUs or CPUs as appropriate. The Echos RTM module supports anisotropic imaging. The code can be run in ‘modeling’ mode to generate synthetic shot records, and can be coupled with Paradigm’s ‘Skua’ interpretation flagship. More from www.oilit.com/links/1103_33.
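For the curious, the kernel at the heart of such modeling codes is a finite-difference time loop over the wave equation. The toy below is a 1D acoustic sketch in pure Python, for illustration only; production RTM codes are 3D, anisotropic and GPU-bound, which is precisely why the recent hardware matters.

```python
import math

# Toy 1D acoustic finite-difference 'modeling mode': propagate a Ricker
# source through a constant-velocity medium and record a synthetic trace.
# Illustrative only -- grid, wavelet and velocity are arbitrary choices.
nx, nt = 201, 400                 # grid points, time steps
dx, dt, c = 10.0, 0.001, 2000.0   # m, s, m/s; CFL number c*dt/dx = 0.2
r2 = (c * dt / dx) ** 2

def ricker(t, f=25.0, t0=0.04):
    """Ricker wavelet source term."""
    a = (math.pi * f * (t - t0)) ** 2
    return (1 - 2 * a) * math.exp(-a)

prev, curr = [0.0] * nx, [0.0] * nx
trace = []                        # the synthetic 'shot record' at one receiver
src, rec = nx // 2, nx // 2 + 40  # source at centre, receiver 400 m away
for it in range(nt):
    nxt = [0.0] * nx
    for ix in range(1, nx - 1):   # second-order FD update of the wavefield
        nxt[ix] = (2 * curr[ix] - prev[ix]
                   + r2 * (curr[ix + 1] - 2 * curr[ix] + curr[ix - 1]))
    nxt[src] += ricker(it * dt)   # inject the source wavelet
    trace.append(nxt[rec])
    prev, curr = curr, nxt
```

Reverse time migration runs essentially this loop twice per shot, forward for the source and backward for the receivers, then cross-correlates the two wavefields, which is what makes it so compute-hungry.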
President Obama has requested $358.4 million (a 50% hike) to fund the Bureau of Ocean Energy Management, Regulation and Enforcement in fiscal year 2012. The budget is designed to implement organizational and regulatory reforms in the wake of the Deepwater Horizon disaster. The report from the National Commission on the BP Deepwater Horizon Oil Spill noted that the agency has been ‘historically and dramatically underfunded.’ The hike will be partially offset by $65 million in fees charged to industry.
The latest edition of BP’s Energy Outlook is the first the company has made public. Energy Outlook (EO) complements BP’s Statistical Review with a projection of future energy trends out to 2030. EO sees little deviation from the current trends in CO2 emissions—something of a ‘wake up call’ to BP CEO Bob Dudley. The report sees energy use as flat in the OECD but rising sharply in line with population and growing economies elsewhere. Overall energy use is set to grow by 40%. Coal, oil and natural gas remain by far the predominant fuel sources. Oil’s share in the mix continues to decline, replaced by natural gas and, to a lesser extent, by renewables.
Calvert Asset Management is pressing the SEC to implement the Dodd-Frank Wall Street reform act so that oil, gas and mining companies intensify their disclosure efforts in countries with poor governance, weak rule of law and high levels of corruption.
Grant Thornton LLP’s ninth annual Survey of Upstream US Energy Companies found a bullish outlook for employment levels in 2011. Fallout from the Macondo well is likely to spark a 10% or greater hike in Gulf of Mexico drilling costs.
P2 Energy Solutions’ informal survey of attendees at the 2011 NAPE Expo found that commodity prices were the number one concern followed by a ‘chorus of unhappy voices about politics and government policies.’ Concerns about drilling equipment and field supplies came in third. Fourth was the ‘challenge of hiring good people.’
Prophet’s annual Corporate Reputation study saw former high flyer BP tumble in the face of the Gulf oil spill. BP fell from number 78 to the very bottom of the ranking at 145. On the tech side, Sony moved up to No. 5, ahead of Amazon (9) and Apple (13). Google, however, slid to 28 from 10. The oil and gas industry overall saw a 10-point drop. ConocoPhillips was its high performer with a ranking of 127.
Trace International has launched an anti-bribery specialist accreditation program, a framework for anti-bribery compliance training and certification—www.oilit.com/links/1103_32.
Shell’s Energy Scenarios look ahead to 2050, by which time global energy demand could triple from its 2000 level. Improvements in energy efficiency could moderate demand by 20%, while supply growth could bring a 50% production hike. This still leaves a 400 EJ/a gap—roughly the size of the whole industry in 2000. This ‘Zone of Uncertainty’ will be bridged by ‘some combination of extraordinary demand moderation and extraordinary production acceleration.’ I.e. it will be either a zone of extraordinary opportunity or extraordinary misery!
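The arithmetic behind the gap is easy to reconstruct. Taking the whole industry in 2000 as roughly 400 EJ/year (our inference from the text, since the gap is said to be ‘roughly the size of the whole industry in 2000’):

```python
# Reconstructing Shell's 'Zone of Uncertainty' arithmetic.
# The 400 EJ/year 2000 baseline is inferred from the text, not quoted.
baseline_2000 = 400                   # EJ/year, whole industry in 2000
demand_2050 = 3 * baseline_2000       # demand could triple -> 1200
demand_moderated = demand_2050 * 0.8  # 20% efficiency moderation -> 960
supply_2050 = baseline_2000 * 1.5     # 50% production hike -> 600
gap = demand_moderated - supply_2050  # -> 360, 'roughly' 400 EJ/a
```

Which lands within rounding of the 400 EJ/a figure, so the headline numbers hang together.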
The Association for the Advancement of Artificial Intelligence has an informative article1 on how IBM’s Watson was built. Last month Watson won out over human competition on the popular Jeopardy quiz show. Watson’s motor, IBM’s DeepQA architecture, uses the open source Apache Unstructured Information Management Architecture (UIMA), Hadoop big data analytics and RDF data sources (see also Heaton Research2).
Landmark continues to add functionality to its new interpretation flagship Decision Space Desktop (DSD—OITJ October 2010). The port to Microsoft Windows is now complete. Product Manager Gene Minnich told Oil IT Journal, ‘Feedback from early adopters is good. Folks like the underlying OpenWorks data management infrastructure, the integrated dynamic earth modeling and the fact that all the ‘classic’ OpenWorks tools are still accessible from the Desktop.’
Unfaulting and unfolding software from Kees Rutten (Slokkert Consultancy) has been added to the seismic interpretation module, enhancing seismic model building. Other new developments include support for raster logs, user-defined petrophysical calculations and the incorporation of geomechanics in the Well Planning module for ‘factory’ drilling of shale gas plays. A fully bi-directional link with ESRI helps plan wells around faults.
DSD now sports a software development kit (à la Ocean for Petrel) allowing developers of DSD plug-in apps to leverage all the above data and interpretation functionality. The primary language for the DSD SDK is Java, which typically runs in the open source Eclipse integrated development environment. This allows Landmark to deploy its software on multiple target platforms, leveraging various open source tools to port code, e.g. to Windows.
Halliburton’s InSite rig connectivity is now available from DSD and the dev kit. DSD is also now used within the Halliburton organization, notably with the integration of multiple drilling products into a new ‘Optimized Drilling Performance’ (ODP) suite. ODP targets integrated bit and bottom hole assembly design, mud characteristics and real-time optimization of the drilling process. Decision Space adds real-time well placement and real time updates of the earth model.
Senior VP Jonathan Lewis said, ‘Teams now work with this proprietary workflow and the digital infrastructure in our real-time operating centers in every region. ODP reduces nonproductive time and improves performance for our customers.’ More from www.oilit.com/links/1103_29.
The 2011 edition of Geovariances’ Isatis includes local optimization, an automatic variogram fitting tool and multi-point statistics leveraging Ephesia’s Impala library. Also new is an interface to Roxar’s RMS modeler—www.oilit.com/links/1103_35.
ClearEdge3D has announced ‘Edgewise Plant,’ which automatically extracts pipe geometry from 3D point cloud data and exports it as a CAD-ready DXF file—www.oilit.com/links/1103_36.
The latest release of Flare’s E&P Catalog includes a stand-alone GIS display of query results. The interface shows ‘publishing activity’ to track key documents as they are published and updated—www.oilit.com/links/1103_37.
The 4.2 release of Fugro-Jason’s EarthModel FT includes a Petrel data import/export plug-in, an OpenSpirit link, improved upscaling and more—www.oilit.com/links/1103_38.
IDS’ VisNet 2.0 solution for drilling data visualization, analysis and reporting includes both ‘static’ and customizable complex queries. A dashboard displays multiple tables and charts for monitoring rate of penetration and non-productive time, which can be compared rig to rig, well to well or tour to tour—www.oilit.com/links/1103_39.
IHS has announced ‘IHS Energy Executive Insider,’ a web-based subscription and daily alert of ‘decisive’ energy issues of the day, along with analysis of and perspective on industry developments and business implications—www.oilit.com/links/1103_40.
Mathcad Prime offers a new ‘intuitive task-based’ interface and ‘intelligent’ units support. A new design of experiment capability reduces the number of physical or virtual experiments. The new Mathcad better integrates with other PTC tools and with CAD and CAE tools—www.oilit.com/links/1103_41.
Qbase has upgraded its MetaCarta GeoSearch toolkit for the Solr open source enterprise search engine. Users can index and search documents by keyword within a defined geographic area. MetaCarta reports a ‘growing’ number of users of Solr and also of Lucene—www.oilit.com/links/1103_42.
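For the record, Solr’s standard spatial filtering looks like the sketch below. The ‘geofilt’ filter is genuine Solr syntax; the field name, core and endpoint are our assumptions for illustration, not MetaCarta’s actual API.

```python
from urllib.parse import urlencode

# Illustrative Solr 'keyword within a geographic area' query. The 'location'
# field, 'docs' core and localhost endpoint are assumptions for the sketch.
def solr_geo_query(keyword, lat, lon, radius_km):
    """Build a Solr select URL restricting keyword hits to a circle."""
    params = {
        'q': keyword,
        # geofilt: filter to points within radius_km of (lat, lon)
        'fq': '{!geofilt sfield=location pt=%s,%s d=%s}' % (lat, lon, radius_km),
        'wt': 'json',
    }
    return 'http://localhost:8983/solr/docs/select?' + urlencode(params)
```

Presumably the GeoSearch toolkit adds the hard part on top of this, i.e. extracting and disambiguating place names from unstructured documents so there is a ‘location’ field to filter on in the first place.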
The 16.9 release of Petrosys’ eponymous flagship adds direct connections to Paradigm EPOS 2D and 3D seismic data and direct access to SeisWare interpretations and 3D grids. An upgrade to Petrel connectivity brings ‘significant’ speed improvements—www.oilit.com/links/1103_43.
Stingray Geophysical’s new fibre-optic ‘FosarFocus’ system is a permanently installed broadband multi-component seabed seismic array for time-lapse monitoring of flood fronts—www.oilit.com/links/1103_44.
Emerson/Roxar has released new versions of its Tempest flow simulator and EnAble, its history match assistant. Tempest 6.7 adds new CO2 flood capability, better memory usage and new grid transforms and visualization aids. EnAble 2.4 includes a console log that keeps track of all user activity, a progress indicator and an improved navigation tool bar. Both products run on Windows and Linux desktops with multi-core CPUs or on Linux clusters and Windows HPC servers—www.oilit.com/links/1103_45.
V 9.0 of Invensys’ SimSci-Esscor PRO II steady state process simulator sports a new customizable user interface and support for Microsoft App-V virtualization. Updated thermodynamic correlations include a link to NIST’s properties database—www.oilit.com/links/1103_46.
Following in the footsteps (OITJ November 2009) of the Minerals Management Service (now Boemre), the US Department of the Interior’s Office of Natural Resources Revenue’s (ONRR) new eCommerce reporting web site went live this month. ONRR has streamlined revenue reporting from Federal and American Indian mineral leases, improving collection of royalties, rents, and production data. ONRR records some 700,000 lines of data every month. Online training is available from the ONRR website. Under the hood, the system uses Ipswitch’s MessageWay for EDI file transfers and a Microsoft .Net framework for web services. The electronic reporting website is an Oracle/PeopleSoft system.
The new system has consolidated ONRR’s editing process and ‘significant’ cost savings have been achieved over the previous contracted out electronic reporting system. Today, around 98% of ONRR reports come through the eCommerce system. More from www.oilit.com/links/1103_25.
Statoil is embedding Kadme’s Whereoil inside its ‘GeoTracker’ well reporting and information management system. GeoTracker will be localized to Statoil’s individual country requirements and used to report data to regulatory authorities worldwide. Whereoil tracks Statoil’s internal data flows through stage gates and quality control steps before capture to corporate data stores and transmission to third parties. The project kicked off last month and will go live in September 2011.
Statoil’s Leading Advisor on Data Management, Lars Olav Grøvik, said, ‘GeoTracker will be an important piece in Statoil’s data management system. Its well data tracking functionality will ensure Statoil complies with internal and external reporting requirements.’
Automated notifications will ensure that users meet reporting deadlines. Data flow tasks are based on predefined, country-specific rules. Monitoring and tracking capabilities allow supervisors to check and report on data status. Whereoil will connect Statoil’s internal systems to outside information sources such as the Norwegian Petroleum Directorate fact pages and the Diskos National Data Repository.
Whereoil also powers the Colombian EPIS National Data Repository and the oil company-sponsored Arctic Web data portal. More from www.oilit.com/links/1103_26.
The 10th meeting of the Energistics-sponsored National Data Repositories (NDR) organization held this month in Rio de Janeiro saw some 130 delegates representing 28 countries. Magda Chambriard, director of Brazil’s ANP oil and gas regulator underlined the importance of good quality, accessible data in attracting newcomers to the Brazilian E&P scene. Following the softening of Petrobras’ monopoly, some $650 million worth of data has been transferred to the Brazilian Data Center BDEP1, endowed with around $3.5 million of hardware. BDEP’s staff of around 50 currently manages over three petabytes of data. Some 78 companies (40 Brazilian) utilize the resource. BDEP deploys a Petrobank solution from Halliburton’s Landmark unit along with an internally developed tool, ‘PowerQC’ for well data. This checks incoming data for conformance with ANP’s own versions of LIS and DLIS formats and validates curve mnemonics with the ANP standard nomenclature. ANP is now working on a tender for a completely new solution.
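A PowerQC-style mnemonic check is simple to sketch. The ‘standard’ list below is illustrative only; ANP’s actual nomenclature (and its LIS/DLIS conformance rules) is not reproduced here.

```python
# Sketch of a curve-mnemonic validation pass, PowerQC-style.
# The standard set is illustrative -- not ANP's actual nomenclature.
STANDARD_MNEMONICS = {'GR', 'DT', 'RHOB', 'NPHI', 'CALI', 'SP'}

def validate_curves(curve_mnemonics):
    """Return the mnemonics that fall outside the standard nomenclature."""
    return [m for m in curve_mnemonics
            if m.upper() not in STANDARD_MNEMONICS]
```

Trivial in code, but the value lies in running it at the point of data submission, so non-conforming well data is caught before it pollutes the repository.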
On the other side of the ocean, and of the vendor fence, Sonangol’s Angolan NDR leverages Schlumberger technology as Rick Johnston related. The Angolan NDR was designed as a showcase for the country’s high volume deepwater data assets. Schlumberger tools including the Seabed database, eSearch, Petrel plugins and ProSource data management are accessible through a secure, encrypted data access layer although, ‘encryption requires enormous compute time.’
NDR 10 featured several breakout sessions offering a platform for operators and suppliers to exchange views. The old chestnut of data standards continues to fuel the debate. Many upstream standards like SEG-Y and current versions of Witsml are not very standard themselves. Energistics is working to tighten up the 1.4.1 Witsml release to enhance interoperability.
Some NDRs mandate their own standards and have adopted a draconian ‘if data doesn’t conform, we send it back’ approach. There is a general recognition of the high cost of ‘standardizing’ and managing data. Standardization extends beyond data formats to include naming conventions for well and seismic data objects and their digital files. Older seismic data has proved especially problematic. One NDR operator reported 50,000 SEG-Y tapes with no identifiers. Even when standards are set and laws are made, enforcement can be problematic.
Media disposal is another problem shared by NDR operators and operating companies. On the one hand space is a problem, especially for older media. But not all are prepared to bite the bullet and go for destruction of legacy media. The transcription process is never entirely risk-free. ANP reports some 200,000 tapes ready for disposal. These were initially offered back to the operating companies but they didn’t want them! ANP, like others, is now faced with the issue of how to dispose of or recycle their old media.
Jerry Hubbard noted Energistics’ growing role in the NDR arena, with five agencies as Energistics members. Hubbard encouraged NDR operators to ‘think of the economics behind standardization’ and leverage the three flagships of Witsml, Prodml and the embryonic Resqml. For Hubbard, ‘proprietary standards have no value.’ Energistics, with help from the UK’s DECC regulator is mooting a collaboration portal for NDR operators.
Remastering seismic data is another hotly debated topic. One experienced user noted that it is important to perform the critical matching process—tying seismic and navigation data with observers reports before touching the tapes. Otherwise a laborious load into a datastore can result in large volumes of orphaned and/or duplicate data.
For ANP, copying to new media is a ‘never ending’ process as manufacturers only offer a 10 year guarantee for their media. New technology is offering ever increasing capacity with terabyte discs and multi gigabyte media.
Tape vs. disk storage is another hotly debated issue. Some advocate staged storage, with high performance media for online data and cheaper media for long term storage. Falling disk prices mean that many are considering all-disk storage, although others question its suitability as an archival medium. On the other hand, the longevity of tape hardware has been questioned too! Such decisions are often taken in the context of a resource and cash constrained environment.
LMKR runs the Pakistan Petroleum Exploration and Production Data Repository2 as a public-private partnership with government. Saeed Akhtar described this early adopter of Petrobank, which is now a disk-based online solution. All investments are made by LMKR ‘at its own risk,’ with cost recovery through future sales before a profit sharing split with the government. A major upgrade and office automation program was part-financed under a USAID program. Today the system holds the digital equivalent of 300,000 tapes and 22 million feet of log curves. The system can be configured to provide a digital ‘E-Room’ for data review during bidding rounds and also provides an active upstream scouting service3 on Pakistan’s activity. Along with the more usual benefits cited by NDR operators, PPEPDR is used to ‘groom’ and increase utilization of in-country resources.
Individual country reports showed a range of projects at varying stages of maturity. India is in the tender stage for its five year ‘INDR’ project to store its country-wide data on disk and a 250 slot LTO4 robotics system. All data will be available in digital form with optional self-service uploading by operators. Kenya has scanned 90% of its paper data archive into the Norwegian Petrad web-based asset management system. Netherlands-based TNO offered a different perspective on NDR rationale. As larger company exploration declines, the Dutch government wants to maintain activity levels by encouraging smaller companies and sees easily available online data as a plus. Paper data and 3D seismic are now available online in the DINO system. This was set up according to EU directives, leveraging Inspire for spatial data and the UN framework for energy resources and risks. DINO has a €2 million budget for the next three years.
Fugro won the Azerbaijan NDR Tender in 2009 and its Traxx software underpins ‘Odlar,’ the oil data library of Azerbaijan Republic. Odlar offers controlled access, encrypted login, strong password validation and data entitlement with all transactions recorded. Running of Odlar will be taken over by Socar after the pilot load phase.
Asim Hussain outlined Kuwait Oil Co.'s e-business and enterprise asset management integration program, a.k.a. the 'eBeams' project. This is said to be one of the 'largest and most ambitious' implementations of IBM Maximo ever. eBeams has integrated and streamlined KOC's asset life cycle from design through acquisition, operations, maintenance and write-off for a variety of assets. eBeams saw the retirement of over one hundred standalone and legacy applications including Indus (now Ventyx) Passport. Data from the legacy apps was cleansed, standardized and migrated into the Maximo enterprise asset management system. A suite of customized software tools has been built around eBeams, which went live in December 2010. KOC has spent the last three years developing eBeams and anticipates a 'wide spectrum' of benefits to users and to KOC's business in terms of process improvements, work simplification and functional integration. Roll-out included a four-week-long campaign, along with 'town hall' sessions with interaction between the project manager and KOC personnel. eBeams includes business-to-business functionality for interactions with external business partners. The system will 'lower the costs of goods and services by facilitating price comparisons, streamline interactions with contractors and enhance procurement and tendering.' eBeams includes interfaces with Primavera and Microsoft Project and a link with KOC's ERP system for cost control.
Richard Hopkinson traced Tullow Oil's spectacular growth over the last six years. The company has two 'world class' oil fields, in Uganda and Ghana, and is now in the FTSE Top 30. Tullow has launched its biggest IT project ever in support of its 150-strong supply chain staff. Project Chombe targets the transformation of Tullow's supply chain, addressing the issue of poor data availability with a supply chain management system built on a single global instance of IBM Maximo V7. Chombe's scope extends to master service agreements, reporting, standard operating procedures, national content and HSE. Maximo is the backbone of all this. The same hosted instance supports all of Tullow's activity in Europe, Africa and elsewhere. The Maximo instance is hosted by IBM in the US, with support provided by IBM's offshore team in Pune, India.
Chombe includes a supply chain operating model and business processes. The system ties together Tullow's 'Phoenix' LiveLink-based document management system, Maximo and Infor's SunSystems financial package. IBM adds application service provision and hosting, including workflow management and delegation of authority. Maximo purchasing, materials and asset management have been implemented. Tullow has added custom development to the mix—to enhance desktop requisition and stores handling, and for the SunSystems link. Maximo has been tweaked to handle mixed units for oil country tubular goods. The multi-organization, multi-site environment necessitated a fair amount of customization. Tullow's own work processes needed updating, with more liberal delegation of authority for newly empowered users. The hosted system provides '99.7% availability' averaged across 13 countries. The system is tuned to available network capacity and is reliant on local ISPs. But the feedback is, 'so far so good.' Maximo touches everybody in Tullow. All have been trained on the system, from geologist to finance manager. A traveling user can log in and approve from anywhere. 'Super users' are embedded in all lines of business and are 'told to attend all seminars etc.' Change management made all the difference thanks to town halls, posters, leaflets, T-shirts and cheat sheets.
Tullow can now track demand and transactions, control spend and maintain a cost recovery audit trail. Moving procurement down the chain has freed up managers for higher added-value work and analytics. Hopkinson noted that 'data readiness' was key to project success. The master data management process was very important, performed by an integrated cross-functional team of specialists. Thorough, objective checks of business readiness at each locality before go-live were critical, as was just-in-time training. This may be hard to arrange, but it does avoid the need for refresher training and follow-up. Future development will likely include asset management and supply chain reporting with (probably) Cognos. Tullow is now working to develop scorecards to track and improve its spend process—work that used to be done with 'a blitz of consultants looking at thousands of invoices!'
Oneok has consolidated 15 apps including legacy Maximo and multiple spreadsheets in an effort to improve business processes and correct inefficiencies in its work and asset management. Today six facilities are live on Maximo. Developing a leak management application required particular care in this safety-oriented and highly regulated environment. Recent ‘incidents’ are driving changes in regulatory requirements and the business wanted to go further and consolidate into a set of best practices.
Oneok's guiding principle for application development is 'do not customize,' preferring Maximo configuration over custom Java code. A guided process means that users don't have to fill in all fields, enhancing data integrity. Notifications are fired off when records approach non-compliance, and field labels change depending on the type of leak.
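The configuration-over-code pattern described above (conditional required fields, deadline-driven notifications, type-dependent labels) can be illustrated outside Maximo. Here is a minimal Python sketch; the rule names, leak grades and 30-day threshold are invented for illustration, not Oneok's actual configuration:

```python
from datetime import date, timedelta

# Hypothetical rule set: required fields and display label, by leak type.
RULES = {
    "grade-1": {"required": ["location", "reading_ppm", "repair_due"],
                "label": "Hazardous leak"},
    "grade-3": {"required": ["location", "reading_ppm"],
                "label": "Monitored leak"},
}

def validate(record):
    """Return the fields still missing for this record's leak type."""
    rule = RULES[record["type"]]
    return [f for f in rule["required"] if not record.get(f)]

def compliance_alerts(record, today, warn_days=30):
    """Fire a notification when a record nears its compliance deadline."""
    due = record.get("repair_due")
    if due and due - today <= timedelta(days=warn_days):
        return [f"{RULES[record['type']]['label']}: repair due {due}"]
    return []

rec = {"type": "grade-1", "location": "MP 12.4",
       "reading_ppm": 850, "repair_due": date(2011, 4, 15)}
print(validate(rec))                            # [] - record is complete
print(compliance_alerts(rec, date(2011, 4, 1)))
```

The point of the pattern is that the RULES table, not compiled code, drives behavior, so new leak types or notification thresholds are a data change rather than a software release.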
Once business requirements have been defined, they are fine-tuned by subject matter expert testers who find and fix bugs before roll-out to users. Initially a system may be rolled out with limited functionality, with features added in subsequent smaller releases. This approach allows the business to feed back on functionality as it grows.
Mike Van Gemert (West Engineering Services) noted the challenge of new 'big ideas' that are not aligned with standards. Rig systems are increasingly complex, IT is challenged by the offshore environment and security models are not keeping pace with threats. Critical documents and drawings may not be available in their most up-to-date form on board, and high day rates often determine priorities and maintenance strategy. The answer, according to Van Gemert, is the Norwegian integrated operations initiative and its taxonomies and XML schemas for interoperability based on ISO 15926. West worked with the IBM Centre of Excellence in Norway on integrated O&M, now embedded in the Maximo for oil and gas solution. More from Pulse on www.oilit.com/links/1103_48.
Nalco VP Ying Yeh has joined ABB’s Board of Directors.
Advantage IQ has appointed Seth Nesbitt as VP and chief marketing officer. Nesbitt was formerly VP marketing with Parallels.
Ajei Gopal has been appointed to Ansys’ Board of Directors.
Apache Corp. has appointed Rod Eichler as president and COO, and Roger Plank as president and CCO.
Captain Lee and Keane Sia head-up the new API office in Singapore.
Former BP CIO Simon Orebi Gann has been appointed to AspenTech’s board of directors.
The new US Ocean Energy Safety Advisory Committee will be chaired by former Sandia National Lab. director Tom Hunter. Industry representatives include Charlie Williams (Shell), Paul Siegele (Chevron), Jo Gebara (Technip) and Don Jacobsen (Noble).
Kevin Montagne is now COO of Dynamic Drilling’s Apollodart unit.
EOC has appointed Jon Dunstan as COO. He was formerly with London Marine Consultants.
Stéphanie Berthelin has joined Ephesia Consult. She was formerly with France’s CNRS R&D establishment.
John Gremp has succeeded Peter Kinnear as President and CEO of FMC Technologies.
Honeywell has nominated former U.S. Senator from New Hampshire Judd Gregg to stand for election as a director at the 2011 Annual Meeting of Shareowners.
Ikon Science Americas has appointed Mark Kittridge as VP Technology. Kittridge was previously with ConocoPhillips.
Ryan McPherson heads-up the new Abu Dhabi arm of the oil and gas Industry Technology Facilitator. Dave Liddle is Strategic Technology Director and Dorothy Burke, Operations Director.
Allen Johnson has joined Kalido as VP of marketing and channels. VP of strategy and business development, Winston Chen is now CTO.
Patrick Sullivan and Sotheavy Loch have joined NDB.
OSIsoft has named Susanna Kass as new COO. She was formerly COO of Trilliant Networks. Dennis Lin has joined as China country manager.
Palantir Solutions has created a free guide to global oil and gas fiscal terms—links/1103_49.
Simon Kendall has joined Reservoir Group as Data MD. Wade McCutcheon is Geo VP and Dave Clark VP corporate development. Sara Macauley is marketing and communications manager.
Tor Magne is Norway country manager for Corpro Systems. He hails from Odfjell Drilling Technology.
Stephen Wright is VP EAME operations with Solomon Associates. He was formerly with KBC Process Technology. Bill Trout has been promoted to VP of refining studies.
Stewart & Stevenson has named Tony Petrello, Robinson West and Micki Hidayatallah to its board.
Tomas Arroyo heads-up T.D. Williamson’s new Colombia unit. Juan Chacin is director, Latin America.
Michael Tiller is moving to Dassault Systèmes.
Morten Vinther will head-up Verdande Technology’s new Abu Dhabi branch.
Philippe Theys points out that in last month’s review we mistakenly titled his book ‘Quest for Data Quality.’ The correct title is ‘Quest for Quality Data.’ Our apologies.
CGGVeritas has teamed with PT Elnusa on a marine seismic joint venture in the Asia Pacific region.
Reservoir Group has acquired Houston-based The Mudlogging Company.
Fugro has acquired subsea inspection, repair and maintenance provider TS Marine Group. Simmons & Co. represented Fugro in the transaction.
EMAS AMC, part of Ezra Holdings Group, has completed its acquisition of Aker Marine Contractors AS from parent Aker Solutions AS.
Dresser-Rand is to acquire Grupo Guascor SL for € 500 million including the assumption of approx. € 125 million debt. A ‘significant portion’ will be paid in stock.
Johnson Controls is to acquire EnergyConnect.
Kappa Engineering has acquired Grooviz SAS, developer of the ‘RealityLounge’ geoscience and engineering visualization systems.
Midas Medici Group (owner of UtiliPoint) has acquired data center provider Consonus Technologies. The transaction was paid for with approx. 5 million shares of Midas common stock.
NetApp is to acquire LSI Corp.’s Engenio external storage systems business for $480 million cash.
Geoscience and engineering training specialist Nautilus has joined RPS Group.
Shaw Group has acquired Coastal Planning & Engineering Inc. for approx. $26 million.
Siemens is to acquire Poseidon Group and Bennex Group from Subsea Technology Group AS.
AMEC has acquired project delivery specialist Qedi. Simmons & Co. represented Qedi in the transaction.
Teradata Corp. is to acquire Aster Data Systems’ business, IP and technology. Teradata acquired 11% of Aster last year and is to pay approx. $240 million for the remaining 89%.
Trelleborg is to acquire Veyance Technologies’ Brazilian unit for approx. SEK 40 million.
Weatherford stock dropped over 10% after the company announced that it would restate its financial statements and delay its annual report because of accounting problems. Weatherford said it expects to make a $500 million 'adjustment' to its financial statements and Q4 2010 earnings, covering the period from 2007 to 2010.
Chevron hosted a gathering of standards bodies working in the field of oil and gas plant and process engineering last month. Billed as the OpenO&M/MIMOSA and PCA Forum (Americas), the event also rolled in automation standards from OPC-UA. The event focused on the benefits of using OpenO&M and ISO 15926 in a 'coordinated fashion.' A tough call, given the considerable differences and overlap between these two independently developed solutions to what is essentially the same problem.
Alan Johnston of the OpenO&M and Mimosa Consortia offered a perspective on interoperability for critical infrastructure including energy and chemicals facilities. Johnston noted the different perspectives (and standards deployed) in capital projects and operations and maintenance. He went on to a problem statement of multiple non-communicating systems and temporal (handover) barriers, and made the case for standards. Large enterprises are now spending 15 times the cost of license fees on integration efforts. A standards-based interoperability model would 'dramatically reduce these direct costs.'
What is needed is an 'open information management architecture' comprising a reference information environment and an execution environment. So that nobody is left out, Johnston suggests that ISO 15926 should be the principal standard for reference information, with OpenO&M as the execution environment. Over time, complex equipment models will be stored in ISO 15926, but right now OpenO&M's registries, schema and services are how the execution environment actually works. This gives a 'pragmatic way of moving forward from where we are to where we want to be in an incremental and risk managed fashion.' An open, event-oriented message bus provides 'platform neutral' messaging between OpenO&M and the ISO 15926 environment. Notwithstanding Johnston's positive spin, the integration slideware is not for the faint of heart. Now Mimosa, OpenO&M, Fiatech, PCA, IBM and the Australian CIEAM have begun collaboration on 'improved approaches to open standards-based interoperability for asset management through an industry use-case driven solutions process.' All under the auspices of yet another ISO body, TC 184.
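The appeal of an event-oriented message bus is that each system writes one adapter to the bus instead of point-to-point links to every other system. A toy publish/subscribe sketch in Python illustrates the shape of the idea; the topic name, tag and adapter names are invented for illustration and are not part of OpenO&M or ISO 15926:

```python
from collections import defaultdict

class MessageBus:
    """Minimal topic-based pub/sub: systems talk to the bus, not to each other."""
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self.subscribers[topic].append(handler)

    def publish(self, topic, message):
        # Deliver the platform-neutral message to every interested system.
        for handler in self.subscribers[topic]:
            handler(message)

bus = MessageBus()
received = []

# A maintenance system subscribes to equipment condition events (hypothetical topic).
bus.subscribe("equipment.condition", received.append)

# An adapter for a condition-monitoring system publishes a neutral message.
bus.publish("equipment.condition",
            {"tag": "P-101", "status": "vibration-high", "source": "cms-adapter"})

print(received[0]["tag"])  # P-101
```

With n systems, the bus needs n adapters rather than the n×(n−1) pairwise interfaces of direct integration, which is where the claimed reduction in integration cost comes from.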
Mimosa CTO Ken Bever provided a technology overview of Mimosa’s open system architecture for enterprise application integration (OSA-EAI). This is an attempt to address the O&M problem with an ‘open, unambiguous data language’ coupled with a ‘standards-based abstraction and integration layer.’ The Mimosa OSA for condition based maintenance (OSA-CBM) also ran.
Clifford Pedersen, CIO of Northwest Upgraders and Mimosa director, outlined a 'use case,' or what will be one when deployed, of OpenO&M at a Canadian heavy oil upgrader. The plan is to leverage Mimosa across construction, handover, upgrades and on to O&M including 'semi-automatic triggering of condition based maintenance and early warning notifications.' Pedersen observed that 'no single standard has broad support for capturing all of the required process, structural, mechanical, electrical, electronic and software information elements. This means that multiple engineering reference standards are likely to continue to evolve in order to solve the entire problem and that much of the information is still being managed either through proprietary extensions to a standard or through completely proprietary means.' Pedersen's vision for the Northwest Information Management System (NIMS) is of Mimosa, ISA-95 and OPC used together to support O&M, with OPC UA as the 'data pipe.' What is needed is an 'open architecture solution that integrates process/operational, maintenance, and business systems, applications and processes that can be used by everybody, not more research!' The proposed solution is 'field proven' as it was used in BP's 'eRTIS' downstream enterprise portal. To make this work, software vendors will have to write adapters to 'talk' OpenO&M.
Ian Glendinning (Glenco Information Services) has provided us with the following from his presentation on ISO 15926/Joint Operational Reference Data (JORD). ISO 15926 users are dependent on well-managed reference data. JORD kicked off in 2010 with a 'front-end' project to scope out a scalable and sustainable business model that could support JORD delivery. In February 2011 the PCA and Fiatech boards agreed on the main JORD phase, with a kick-off meeting planned for May 2011. Harmonization with other standards like Mimosa is achievable through collaboration, without the need to merge them into ISO 15926. Harmonization of Mimosa/OpenO&M with ISO 15926 will depend on accessible reference data and will therefore mandate successful execution of the JORD project. More from www.oilit.com/links/1103_34.
A teaming between nuclear cyber security specialist, DevonWay and Curtiss-Wright’s Scientech unit heralds a move to the oil and gas vertical. The current partnering addresses mandatory security assessment of nuclear power plants with DevonWay’s ‘CyberWay’ software bundled with Scientech’s expertise in mapping controls to assets for assessment ‘walk-downs.’
The approach will likely extend to other critical infrastructure including oil and gas processing facilities, where Curtiss-Wright is an equipment supplier. DevonWay VP Sally White told Oil IT Journal, 'Bob Felton, our CEO and president, and SVP Rich MacAlmon founded Indus International (now Ventyx). To date our focus has been on the nuclear industry where we have developed a suite of safety and regulatory products. Our workflow solutions transcend other industries and we anticipate entering the oil and gas vertical next year.' More from www.oilit.com/links/1103_23 (DevonWay) and www.oilit.com/links/1103_24 (Scientech).
Saudi Aramco has awarded Foster Wheeler and partners Sofcon and SCEC K&A a five year contract relating to the company’s general engineering services plus (GES+) initiative. The deal covers engineering design and procurement services for oil and gas, refining, petrochemicals and infrastructure projects.
Aramco has awarded another GES+ contract to Wood Group unit Mustang Al-Hejailan Engineering.
Verizon Wireless is to provide Duke Energy with a telecommunications network connecting smart meters in its ‘Envision: Charlotte’, a public-private energy efficiency drive. Real time energy use data will stream over Verizon Wireless’ 4G LTE network to ‘large interactive lobby-level screens’ provided by Cisco.
Composite Software has teamed with IPL on the provision of data virtualization services in the US and Europe. The services include best practices developed for ‘a global top-five integrated energy company.’
Schlumberger has named INT a ‘preferred’ Ocean development partner. Third parties can contract with INT for the development of proprietary plug-ins, leveraging their own IP. Schlumberger has also licensed INT’s GeoToolkit.NET which will be integrated with Petrel 2011.
Knowledge Reservoir has teamed with market intelligence specialist Quest Offshore Resources to provide consulting services and integrated reservoir, production, facilities, and costing data solutions to the upstream. The deal teams KR’s reservoir knowledge base with Quest’s field development infrastructure data to provide detailed analysis of reservoir performance, field design, economics and asset modeling.
FMC Technologies has signed a five-year global frame agreement with BP to provide subsea production systems and life-of-field services for BP’s subsea developments.
Hess Corp. has joined the OFS Portal community. OFS Portal and its trading partners leverage the API PIDX electronic business standards to drive electronic business within the petroleum industry.
Contango has awarded Moblize a contract for real time data services on an exploration well in the Gulf of Mexico.
IPCOS has completed a six month audit of Dolphin Energy's facilities in Qatar and the UAE. Dolphin's automation systems were benchmarked against the ARC Advisory database of oil, gas and petrochemical plants worldwide.
Sercel has been awarded a contract by Compania Mexicana de Exploraciones S.A. de C.V. (Comesa) for the purchase of 10,000 digital sensor units. The company also reports the sale of a second Unite cable-free acquisition system to Mitcham Industries. The 3,000-channel order follows a previous purchase of 7,500 channels last September.
Cameron Subsea Systems has signed a global agreement with BP for engineering, procurement, construction and operational support of subsea production systems.
KBR is to provide FEED services on KM LNG's Kitimat, British Columbia liquefied natural gas plant. KM LNG is a unit of Apache Corp. Kitimat is anticipated to be 'a fully electric-driven' plant.
GDF Suez has awarded Technip a € 45 million engineering, procurement, construction and installation contract for its Norwegian North Sea Gjøa development.
Aker Solutions is one of two finalists competing on the Woodside-operated Browse LNG development.
Norwegian EPIM has signed off on a Logica-developed substitute for the LicenseWeb and AuthorityWeb reporting standards. The result, 'L2S,' went live last month with a peak of 500 simultaneous users—www.oilit.com/links/1103_50.
Sunoco Logistics has updated its geographical information system leveraging the Petroleum Open Data Standard (PODS) data model. A decision support system offers users an internet map interface to facilities, pipelines and other PODS data. The development used ESRI tools and a Microsoft Silverlight GUI. PODS also reports the imminent release of its 5.1 model version—www.oilit.com/links/1103_51.
CEN, the EU standards organization, has announced its small and medium-sized enterprise standardization toolkit, SMEST2, which will 'heighten awareness of standardization' with a new website for knowledge sharing—www.oilit.com/links/1103_52.
A ‘position paper’ from the UK Government states that, ‘When purchasing software, infrastructure and other ICT goods and services, Cabinet Office recommends that Government departments should [..] deploy open standards in procurement.’ More from www.oilit.com/links/1103_53.
API PIDX is inviting volunteers to join the new project to develop a supplier KPI data transmission standard. Target benefits include automated scorecard completion, improved data quality and faster processing—www.oilit.com/links/1103_54.
CapeOpen/CO-Lan is inviting comments on the latest version of its unit standards. The 6.21 release 'clarifies numerous points while not changing the underlying design of the interfaces.' More from www.oilit.com/links/1103_54.
Eliis' PaleoScan consortium has been extended for a year after the initial two year phase. Current members GDF-Suez, BG Group, Wintershall, ENI and Maersk remain on board. The 2011 work program targets optimization of the geomodel engine for larger seismic volumes and faster processing. New tools are to be developed to constrain models within horizon and fault boundaries for 'seismic to reservoir' characterization. More from www.oilit.com/links/1103_15.
The Scottish carbon capture and storage consortium issued its report this month. The study investigates the potential of CCS in the deep saline aquifers of depleted North Sea oil and gas fields, augmenting an earlier 2009 study with a detailed quantitative analysis of the storage potential of the sandstone of the Captain field in the Moray Firth. Schlumberger's Petrel was used to build the geological model, with CCS simulation performed using the Eclipse 300 CO2Store module. Modeling allowed CO2 migration to be tracked out to 5,000 years in the future. However, the modeling also showed that injection rates may only be sustainable for a few decades before pressure build-up significantly restricts capacity. Total capacity of the Captain sandstone is around 1.5 billion tonnes, or roughly 3 trillion kWh. Download the free 72 page report from www.oilit.com/links/1103_14.
Shell has awarded Logica and FleetCor Technologies a €300 million contract for the supply of a ‘managed’ fuel cards service. The 10 year deal covers Shell’s commercial fleet fuel cards program in Europe and Asia. Business-to-business fuel cards provide operators of commercial vehicle fleets with ‘convenient, cost-effective and secure’ payment for fuel and other on-road services.
Logica is prime contractor to Shell and will deploy FleetCor’s Global FleetNet (GFN) card processing platform which is to replace a Shell legacy system. A pilot is due for completion by April 2012 with full roll-out scheduled for year-end 2013.
The system will then be operated by Logica and FleetCor as a managed service on behalf of Shell. The project covers 35 countries in Europe and Asia. Logica CEO Andy Green said, ‘Our understanding of the oil and gas industry and experience in payments and fuel cards gave us the edge in developing a compelling proposition for Shell.’ Logica already provides e-commerce services to Shell via a managed Postilion electronic funds transfer service. More from www.oilit.com/links/1103_16 (Logica) and www.oilit.com/links/1103_17 (FleetCor).
BP’s Russian joint venture, TNK-BP, has selected Houston-based Amphora’s (formerly TradeCapture) ‘Symphony’ energy trading and risk management (ETRM) platform as its international crude and oil products trading platform. TNK-BP CIO Iain Greig said, ‘We needed a system to support our international expansion and our new global trading operations in Geneva. Key requirements were for an accurate, intuitive system that could be implemented in a short time frame by a partner with whom we could build a long term relationship. After a review of the market, it was clear that Symphony provided the greatest functional fit and that Amphora understood our business.’
Amphora’s flagship ETRM solution supports the entire life cycle of all physical and financial transactions for oil and other energy commodities. Symphony pinpoints the value of a position in real-time and tracks the total risk position under any given set of transaction variables. More from www.oilit.com/links/1103_18.
The Norwegian EqHub was announced last year (Oil ITJ May 2010) as a ShareCat-based exchange for suppliers to the Norwegian offshore market. EqHub has support from four Norwegian operators, several prime contractors and some 20 suppliers. Epim, EqHub's operator, has just announced that Autronica Fire and Security is the first company to qualify as an approved supplier on the hub, and as such a provider of pre-qualified, classified and quality-assured information to the repository. Norwegian Autronica is a provider of fire detection and extinguishing systems.
Autronica’s Ole Bredal said, ‘The EqHub certificate of approval means that Autronica’s documents are on the hub and will help customers looking for accurate and timely product information. We seek to reduce the time traditionally spent sending standard documents and expect EqHub to be a cost effective solution here. Being first is always a challenge and we would thank representatives from ConocoPhillips and EqHub in this initiative.’
EqHub is based on the ISO 15926-based POSC Caesar Association standard, with input from DNV (quality assurance), Achilles (prequalification services) and ShareCat (technology). EqHub is owned and operated by EPIM and funded from fees paid by Norwegian operators. Several other suppliers are in the pipeline for approved EqHub supplier status. Current operator members are ConocoPhillips, Lundin, Total and BP; on the supplier/EPC side, members include ABB, BVS, Endress+Hauser and Wilhelmsen Callenberg. More from www.oilit.com/links/1103_19.
Shell now figures amongst Microsoft’s poster children for its cloud-based Business Productivity Online Suite (BPOS). BPOS comprises messaging and collaboration tools, delivered as a subscription service. The hosted components include Microsoft Exchange, SharePoint, Office Communications for ‘presence,’ instant messaging, and peer to peer audio calls and Office Live Meeting for web and video conferencing.
Kurt DelBene, president of Microsoft's Office division said, 'Cloud apps let people collaborate from any location and device. Our online productivity solutions work with tools that organizations already use, simplifying their transition to the cloud.'
Shell is to start using SharePoint Online alongside its SharePoint 2010 instances to enable global, online collaboration among employees. These tools provide a central platform for applications such as document and content management, as well as project work. Through Microsoft SharePoint, Shell will 'appreciate considerable cost reductions and flexibility.' BPOS is now available in 40 countries and regions.
Sword CTSpace has announced a new release of its FusionEnterprise (FE) engineering content management system that allows engineering companies to blend and manage information stored in multiple systems. The latest FE release combines engineering content from separate EMC Documentum, IBM FileNet P8 and Microsoft SharePoint 2010 systems. FE provides support for capital projects and operations by adding engineering and compliance features to a ‘basic’ ECM repository. Features include CAD support, transmittals management and audit trails.
Sword VP Tim Fleet said, ‘Larger customers wanted to deploy a common engineering content application across multiple legacy ECM systems in different parts of their organization. FE now provides the flexibility needed to federate engineering content.’ An engineering company might use SharePoint for capital projects and Documentum for records management. FE ‘leverages the strengths of the individual systems,’ avoiding data compatibility issues.
FusionEnterprise is available as an online software-as-a-service or an on-site solution. Sword Group claims over $270 million in revenues. The Sword client list reads like a who's who of the engineering and plant vertical. More from www.oilit.com/links/1103_9.
A white paper from Aveva, ‘Information integrity management for oil and gas owner operators1’ describes how the recently acquired WorkMate and Technical Integrity Manager (TIM) have extended Aveva’s Digital Information Hub with what is claimed as a ‘comprehensive strategy’ for operations integrity management. TIM and WorkMate add functionality for change management, data validation, maintenance and procurement.
The 15 page document also provides a good 'problem statement' of the operational realities surrounding handover of the 'digital asset.' Multiple suppliers around the globe, working with information from diverse systems, make for a complex environment. Processes and data standards are 'local' to a supplier rather than 'global' to the project, and the 'commercial motivations' of stakeholders differ. Information handover standards are part of the solution, but few tools can aggregate and validate supplier information against them. Enter Aveva's information-centric approach to engineering requirements, a.k.a. the Aveva Net 'digital information hub.' This exposes handover data issues at the tag and attribute level, rather than at the document level. The digital hub's open interface is said to meet the 'latest' ISO standards. More from www.oilit.com/links/1103_14.
1 www.oilit.com/links/1103_12 (login required).
Glen Sartain (Teradata) and Olivier Germain (Halliburton) took part in a Hart Energy webinar this month that investigated the potential application of 'next generation' database technology in drilling. The thesis is that existing systems are, or soon will be, overwhelmed by the data volumes streaming in from the drilling rig—particularly with the advent of wired drill pipe such as National Oilwell Varco's IntelliServ1/IntelliPipe downhole network, streaming 'up to' 1,000,000 times more data to the rig. Even today, a rig can create around 4 terabytes of data a day, more than can be analyzed in real time by a database/flat-file combo. Sartain believes we have to change the way we value and view data—moving from a 'reporting' standpoint to an environment where data is constantly analyzed and compared with historical data during operations, blending strategic and operational intelligence and moving towards 'autonomous' drilling.
Enter Teradata's 'shared nothing' parallel database with its 'in place' analytics. Teradata instantiates analytical models inside the database, removing data latency. Germain sees this as a logical progression from the introduction of systems like Pason's AutoDriller, which was met with skepticism when it came to market. Tests comparing humans on the brake against the AutoDriller showed the latter to perform much better. Artificial intelligence and parallel databases will take such automation to the next level. There is also a feeling that, post Macondo, it will become mandatory to capture real time data to a 'pristine' environment for root cause analysis inter alia. Webinar replay available on www.oilit.com/links/1103_11.
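The shift Sartain describes, from after-the-fact reporting to continuous comparison of live data against history, can be sketched as a streaming check of each new reading against historical statistics. The Python fragment below is illustrative only; the channel, baseline values and 3-sigma threshold are invented, and a real in-database implementation would run this logic inside the analytic engine rather than in client code:

```python
# Hypothetical historical baseline for one drilling channel
# (e.g. standpipe pressure from past wells, in psi).
HIST_MEAN, HIST_STD = 2500.0, 120.0

def z_score(value, mean=HIST_MEAN, std=HIST_STD):
    """How many standard deviations a reading sits from the historical mean."""
    return (value - mean) / std

def monitor(stream, threshold=3.0):
    """Yield (value, z) for readings that deviate from the historical baseline."""
    for value in stream:
        z = z_score(value)
        if abs(z) > threshold:
            yield value, round(z, 2)

# One synthetic anomaly at 2990 psi among normal readings.
live = [2480, 2510, 2495, 2990, 2505]
print(list(monitor(live)))  # [(2990, 4.08)]
```

The generator processes readings as they arrive rather than batch-loading them for later reporting, which is the essence of the 'operational intelligence' argument: the comparison with history happens during drilling, not after it.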