When I hear that a company is ‘moving towards’ a ‘services-oriented’ architecture my skeptical mind interprets this as meaning that it is not in a hurry to get there. This suspicion is borne out by a recent release from Petrolink (page 10) which includes the statement, ‘By using neutral interfaces, operators are less reliant on ‘integrated offerings’ from the larger service providers and can deploy best of breed solutions from other specialized vendors’. In other words, web services will likely work against the major vendors—hence their reluctance to adopt them.
It was therefore with some surprise that I stumbled across a community whose very existence is based on web services, on openness and on a seeming disregard for commercial considerations. My encounter took place at the ‘Geo Mash-Up’ event hosted by the Ordnance Survey in Southampton, UK.
What is a mash-up? I was lucky to get a ride from the airport with Charles Kennelly who, as well as being a program manager with ESRI UK, is a DJ. He showed me how two pieces of music are mashed into one, matching their pitch and rhythms. Thus a rapper can rap over a more ‘classical’ oeuvre—the Steve Miller band being a popular choice. Translating the mash-up concept into software, all you need is some freely-available web resources. A few lines of code are used—either to leverage an application programming interface if there is one—or failing that, to ‘scrape’ websites for their content, perhaps with a few lines of Perl.
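The scraping step can be sketched in a few lines. Here is a minimal Python (rather than Perl) example; the HTML fragment is a made-up stand-in for a real site, and the attribute names are purely illustrative:

```python
import re

# A made-up page fragment embedding geographic coordinates in its markup.
sample_html = """
<div class="poi" data-lat="50.9097" data-lon="-1.4044">Southampton</div>
<div class="poi" data-lat="37.7749" data-lon="-122.4194">San Francisco</div>
"""

def scrape_points(html):
    """Return (name, lat, lon) tuples 'scraped' from the page."""
    pattern = re.compile(
        r'data-lat="(?P<lat>-?[\d.]+)"\s+data-lon="(?P<lon>-?[\d.]+)">(?P<name>[^<]+)<')
    return [(m.group("name"), float(m.group("lat")), float(m.group("lon")))
            for m in pattern.finditer(html)]

points = scrape_points(sample_html)
```

The output of such a scrape is exactly what a mapping API wants as input, which is why Google Maps features in so many mash-ups.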
Google Maps is the mash-up equivalent of the Steve Miller band and is the backdrop to many mash-ups. If you are desperate in San Francisco you will be interested in paul.kedrosky.com/publicloos/. If you are into Wikipedia, you might like to spatialize the encyclopedia with webkuehn.de/hobbys/wikipedia/geokoordinaten/index_en.htm. Oil IT Journal has, without realizing it, already reported on an oil and gas mash-up—the use of Google Earth as an Open Spirit data selector (Vol. 11 N° 6).
The mash-up phenomenon is a facet of what has come to be known as Web 2.0—a loose grouping of technologies—RSS, Asynchronous JavaScript and XML (AJAX) and other web services that are adding functionality to the humble web browser. If you want to code your own, try Pragmatic Ajax*, which will have you mashing up a Google map by page 20.
The mash-up event also raised issues of a non-technical nature. Many protagonists come from the public sector and are sitting on information resources which it would be nice to see in the public domain. But such a move would upset the business models of organizations like the Ordnance Survey. A cross-cutting theme is the open source community’s belief that the code itself should be ‘free’. You can think what you like about these notions, but they are hard to ignore when vast amounts of data and code are already freely available. Some, the UK Geological Survey for one, seem to be thinking, ‘if you can’t beat them, join them!’
Inspired by all this and intrigued by the Wood Mackenzie report (see page 1) on how a country’s tax take can cushion capex overruns, I tried my own mash-up (see graph). Here I ‘mash’ Woodmac’s NPV data** with Transparency International’s Corruption Perception Index for 2005***. I thought that there might be a relationship between companies that appear to ‘help’ oils with project overruns and their position in the TI Corruption index. I confess that the results do not really bear out the theory. But you can make a case for countries in the lower right quadrant as suffering from ‘corruption inefficiency’ as they appear to be discouraging potentially lucrative cost overruns with a high tax take. I’m sure that many good conspiracy theories have been started with less than this!
You may be interested in the technology behind this ‘mash-up’. Web services? SOAP? .NET? Actually no. I’m afraid this was manual re-keying of data, cut and paste and much, far too much, futzing around with Microsoft Excel. But I did discover a rather neat macro**** that adds labels to Excel graphs even though that’s not a web service either!
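For the record, the join at the heart of the mash-up could have been done in a few lines of code. A Python sketch follows: the Iran and Angola NPV impacts echo the Woodmac figures quoted elsewhere in this issue, while the remaining values and all the CPI scores are illustrative placeholders, not the actual datasets.

```python
# Post-tax NPV impact of a $200 million overrun, $ million (Iran and Angola
# per the Woodmac quote; Brazil and Libya figures invented for illustration).
npv_impact = {"Iran": -152, "Angola": 26, "Brazil": -120, "Libya": 10}
# Transparency International CPI, 0 (corrupt) to 10 (clean) -- placeholder values.
cpi_2005 = {"Iran": 2.9, "Angola": 2.0, "Brazil": 3.7, "Libya": 2.5}

def mash(npv, cpi):
    """Join the two datasets on country name, keeping only common keys."""
    return sorted((c, npv[c], cpi[c]) for c in npv.keys() & cpi.keys())

table = mash(npv_impact, cpi_2005)
```

No SOAP, no .NET, but at least no re-keying either.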
* Gehtland et al. ISBN 0 9766940 8 5, Pragmatic Bookshelf.
You’ve all heard of the ‘people problem’, the personnel shortage that has resulted from the fast ramp-up in oil and gas activity brought on by high oil prices. While information technology (IT) has already greatly enhanced knowledge workers’ productivity, oil and gas companies are now seeking even greater efficiency by leveraging IT to manage the workforce and supplier relationships.
Exhibitors at the 9th Annual Human Resources Technology Conference and Exposition (HRTCE) in Chicago this month reported take-up of e-HR solutions targeting oil and gas recruitment. New Jersey-based iCIMS was showing off its Software-as-a-Service (SaaS) iRecruiter solution. iRecruiter was deployed earlier this year by Occidental Oil & Gas to create a global hiring management process for its 7,000-strong workforce. Oxy was faced with problems including poor international collaboration, a ‘cumbersome’ resume-management process and an inability to attract and communicate with candidates.
Oxy recruiters’ wish list included large-scale resume management, worldwide screening capabilities and communication tools for recruiters. Oxy also wanted a tool that would make it easy for candidates to apply online and to facilitate applications from international candidates without web access. With about a third of its recruiters in the Middle East, Oxy needed to find a solution that would offer a consistent international hiring process.
Gwen Hill, Oxy’s Strategic Staffing Specialist explained why iCIMS was selected, ‘iCIMS stood out on every point of our evaluation criteria. iCIMS allows us to run a collaborative global recruitment program, eliminating the paper process and allowing us to track our candidates and employment opportunities around the world. We have seen a huge increase in the number of quality applicants we have identified since implementing iRecruiter.’
Hess Corp., along with solutions provider Authoria, presented its integrated approach to ‘talent management’ at the HRTCE. Hess uses Authoria’s products for recruiting, performance management, compensation management and benefit/policy communication.
Petrobras is tackling shortage of skilled designers in the booming oil and gas market under the auspices of the Brazilian government’s Prominp program. This seeks to increase the local content of equipment and service suppliers and provide more trained personnel. In this context, UK-based Aveva is working with Petrobras to train Brazilian engineers to work on modeling Petrobras refineries. To date more than 300 designers have graduated from these schemes.
A study from Wood Mackenzie, ‘Cost hyperinflation: all is not lost’, investigates the true cost to investors of project overruns. Woodmac VP Graham Kellas explained, ‘Oil and gas hyperinflation has resulted in billions of dollars of cost overruns in field development.’ The report investigates the tax treatment of increasing oil field costs, asking ‘does an extra dollar spent reduce the value of the project by a dollar?’ In general, an extra dollar of capex does not reduce the value to investors by a dollar, because it is deductible against tax or recoverable under production sharing. Kellas explained, ‘For a 100 mmbbl development with a $200 million cost overrun, the post-tax NPV impact ranges from a reduction of $152 million (Iran) to an actual increase of $26 million (Angola).’
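The deductibility point can be made with back-of-envelope arithmetic. A minimal Python sketch follows, assuming (for illustration only) a single marginal rate of government take rather than Woodmac’s full fiscal models:

```python
def post_tax_overrun_cost(overrun, take_rate):
    """Post-tax cost to the investor of a capex overrun that is fully
    deductible against a single marginal rate of government take.
    Real fiscal regimes (royalty/tax, PSC) are far more complex."""
    return overrun * (1.0 - take_rate)

# A $200 million overrun under an (assumed) 85% marginal take costs the
# investor only $30 million post-tax.
cost = post_tax_overrun_cost(200.0, 0.85)
```

The Angola case, where NPV actually rises with the overrun, needs the full production sharing mechanics and cannot be reproduced by a flat rate.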
The report concludes that in general, Production Sharing Contract regimes offer higher levels of protection from cost overruns than royalty/tax regimes. In Azerbaijan, Libya, India and Angola the government assumes virtually all of the impact of the cost overrun. Countries offering the least protection from cost overruns are Iran, Yemen and Brazil.
See also our ‘mash-up’ of the Woodmac data with Transparency International’s 2005 Corruption Index in this month’s editorial.
Given a large, integrated oil and gas company which wants to completely renew its IT/IM, what do you propose?
Bayne—Minimize complexity as follows. Minimize integration complexity via Seabed as a common logical data model. Minimize tool complexity with common applications and data-stores and finally reduce the demands on end-users by enabling workflows like our Petrel plug-ins that deploy IM workflows natively.
Bouffard—End users are looking to new ways of working—these may not be completely defined. But we do have some technology directions to offer. Streaming RT data is becoming part of daily life. Access to production data—WITSML, PRODML. We see convergence of structured and unstructured data.
Bayne—You will always need structured databases but there is huge growth in emails etc. Here we defer to Infosys for help in managing this.
Bouffard—IM may not always be apparent. For instance in Shell’s HDP*, interpretation and data access all goes through the Portal. The future will see interpreters doing more and more in less and less time—and data management will have to be done in the background.
Why do we even bother building projects? Why can’t you offer interpreters windows onto their data à la MapQuest?**
Bayne—Our Seabed paradigm is a step in this direction.
Bouffard—I agree that today’s technology is not up to this. We expect to be able to offer such facility through web services and data replication.
We hear a lot about web services (WS). Where is SIS in WS, with self describing data, tagging, data validation?
Bayne—ProSource and Petrel DM leverage web services today. Our integration engine as deployed in ProSource Results leverages a services-oriented architecture.
Can workflows be captured in XML, edited programmatically and re-run?
Petrel does not support the ability to export the workflows to XML currently. SIS is experimenting with Microsoft Windows Workflow Foundation to enable this kind of functionality.***
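To illustrate what such an export would enable, here is a minimal Python sketch of the round-trip the question envisages: parse, edit a parameter programmatically, re-serialize (and, in principle, re-run). The workflow XML schema shown is entirely hypothetical; Petrel defines no such format today.

```python
import xml.etree.ElementTree as ET

# A hypothetical workflow serialization -- not a real Petrel export format.
workflow_xml = """
<workflow name="depth-convert">
  <step op="load" dataset="survey_A"/>
  <step op="velocity-model" method="v0k" v0="1800" k="0.3"/>
  <step op="convert"/>
</workflow>
"""

root = ET.fromstring(workflow_xml)

# Edit a parameter programmatically, as one might before a re-run.
for step in root.iter("step"):
    if step.get("op") == "velocity-model":
        step.set("v0", "2000")

edited = ET.tostring(root, encoding="unicode")
```

An engine that replays the edited document would complete the loop.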
Talking of ‘openness’ in the context of Petrel is something of a paradox. Petrel is a facet of SIS’ ‘wall to wall Microsoft’ approach to technology—a far remove from what is generally understood as the Open Source movement!
Bouffard—That’s not quite how we see it. At one level, the Ocean API provides openness. But users won’t get access to all of our IP as you would expect from an Open Source environment. But Ocean does allow you to add your own functionality to applications like Petrel. We are completely genuine on this openness issue. Publishing Seabed has been a major event for us. This data model represents tens of man-years for our data modelers. Offering this up to the market is a bold statement and we hope that the industry as a whole will gain from this initiative.
We had a look at the Seabed model—it is a text-document, are you planning to release an implementable DDL version of the data model like PPDM does?
Bouffard—PPDM restricts its DDL to members. SIS ‘only’ publishes the Seabed data model—but makes it available to all. Indeed, we do perceive the physical implementation of the data model as a differentiator. Your readers might like to check the model on www.slb.com/content/services/software/opensystems/index.asp by clicking on the Seabed Data Model link.
* Hydrocarbon Development Planning
** An idea we developed in our September 2006 editorial.
*** This answer was supplied offline.
Two reports address different aspects of information security in the US energy and chemicals sector. The first, prepared by the Department of Energy’s Sandia National Laboratories at the request of the Office of Fossil Energy analyzes e-commerce standards for the energy sector. The second, authored by the American Chemistry Council (ACC), addresses ‘cyber security’ in the chemical industry.
The Sandia report, ‘Wholesale Electric Quadrant Draft Technical Standards for Public Key Infrastructure’ analyzes draft public key infrastructure (PKI) standards developed by the North American Energy Standards Board (NAESB) that support secure electronic bidding and purchase of fossil fuels. The report found some vulnerability within transactions and offered mitigation strategies which ‘will result in higher standards for the future.’
The Chemical Sector Cyber Security Strategy report is strong on entreaties and rather light on solutions. The chemical industry, like oil and gas, relies on technology solutions from the information and telecommunications sector and is highly dependent on service providers. These and other interdependencies demonstrate the importance of ‘proactive risk management and reduction strategies’ to ‘protect chemical industry companies, communities and the nation as a whole.’
According to the report, the physical structure of the chemical industry ‘reduces the likelihood and scope of a cascading failure effect.’ Processes and equipment are contained within the physical boundaries of a facility and security checks determine the validity of incoming information before it is used in a control action.
The report acknowledges however that ‘cyber attacks could result in business interruption, lost capital, risks to plant employees and communities.’ Moreover ‘the potential of a combined physical and cyber attack and the criminal use of illegally obtained information represent threat scenarios that could impact industries such as the chemical sector.’ Further information and guidance documents are available at www.chemicalcybersecurity.com .
Netherlands-based JOA Oil & Gas has just announced the imminent commercial release of a software development kit (SDK) for Jewel Suite, its ‘seismic to simulation’ modeling package. The SDK is already used internally by JOA to integrate Jewel with the Sensor reservoir simulator from Coats Engineering. The SDK will allow users and other third parties to build their own plug-ins using Microsoft Visual Studio 2005—with support for Visual C# and Visual Basic.
Bob Rundle showed us the Jewel Suite SDK at the SEG using the C# visual development environment. Jewel Suite components such as the workflow control panel container can be customized by dragging and dropping workflow steps onto the canvas. Computations can be added, leveraging Jewel’s units management system.
JOA’s Albuquerque development center offers certification to third party plug-in developers who can add their own databases and applications to a Jewel workflow. Prototype development is also possible from within Microsoft Excel. JOA is working on a template that will embed the Sensor reservoir simulator. Jewel Suite is now also OpenSpirit ‘certified’.
Archivas and Enigma Data Solutions have just announced a Hierarchical Storage Management (HSM) solution targeting oil and gas digital archiving and information management. The HSM links Archivas’ Cluster (ArC) archiving solution with Enigma’s PARS data management package. The joint solution claims to assist oil and gas companies with Sarbanes-Oxley reporting.
Enigma VP Tim Bowler said, ‘PARS is now an industry standard for archiving project data from geotechnical applications. The new solution adds industrial strength archival for both disk and tape data. ArC takes PARS archives and stores them securely, adding automatic replication and a powerful search functionality.’
Enigma has also just announced PARS 3, a major rewrite in Java for cross-platform deployment. PARS now archives Kingdom and Petrel projects, capturing metadata to its own database. PARS has had a checkered history since it was developed by PECC. Ownership subsequently migrated from CGG to SmartMove before a management buyout in June 2005.
Perth, Australia-based ISA Technologies is to offer Schlumberger’s LiveQuest hosted applications to the Asia Pacific region. Schlumberger is working with ISA to provide local companies with access to its multi-platform E&P applications via the application service provision (ASP) solution.
Department of Industry
The Petroleum and Royalties Division of the Western Australian Department of Industry and Resources is the anchor client for the LiveQuest solution which will be served from ISA’s high performance computing and visualization centre.
Department of Industry General Manager Mark Gabrielson said, ‘We plan to access the LiveQuest solution via ISA’s facilities rather than using our own infrastructure.’ LiveQuest-served applications include GeoFrame, and the Eclipse reservoir fluid flow simulator.
The United States Geological Survey has just unveiled plans for a ‘National Geological and Geophysical Data Preservation Program’. The implementation plan was delivered to Congress on October 16. The Program will inventory, archive, preserve and catalog geological, geophysical and engineering data, maps, well logs, and samples.
National Digital Catalog
Information in the geological and geophysical database will be accessible through the Internet-based National Digital Catalog, allowing users to quickly identify what is available and where it is located. The Program will ‘provide consistent standards’ and its storage facilities and collections will be managed by the USGS and maintained by agencies within the Department of the Interior (DOI) and State geological surveys.
$30 million per year
The Energy Policy Act of 2005 Sec 2011 made State Agencies responsible for the preservation of geological data and allowed States to acquire data from federal property. The Program is a central metadata catalog which will be separate from data holders. The project is slated for $30 million per year. More on the Program from http://energy.usgs.gov.
Ikon Science and MTEM are to embed MTEM’s multi-transient electromagnetic modeling into Ikon’s RokDoc rock physics package.
Petrobras has joined Computer Modelling Group’s ‘next generation’ reservoir simulation project. CMG has been sponsored by Shell to develop a modeling system for ‘the entire hydrocarbon recovery system through point of sale.’
Object Reservoir has released Resolve 3.4, its finite element fluid flow modeler. The new release provides quick look analysis of individual wells, completion intervals and reservoirs. Updates to the model such as changes in geology are automatically re-meshed with fracture properties re-computed on the fly. The new technology is particularly applicable to unconventional tight gas plays.
Peloton has announced version 3.0 of its SiteView package for surface facility and pipeline management. SiteView now offers ‘full facilities and pipeline asset tracking and management.’
Advanced Geotechnology has released a new version of its well planning and analysis software, STABView. Version 3.0 introduces a mud weight window plot, graphical presentations of borehole breakouts and sensitivity analyses. AGI has also announced the second version of RocksBank, a rock mechanical and petrophysical properties database. The database, which now includes heavy oil and oil sands reservoirs, contains samples from 3,600 locations around the world. Shell International was one of the first RocksBank licensees.
Badley Earth Science and the Dublin-based Fault Analysis Group (FAG) are cooperating to offer structural geological and fault property modeling from FAG’s TransGen3 as modules inside TrapTester EarthGrid.
de Groot-Bril has released two new plug-ins for its OpendTect environment. Developed by Ark cls the plug-ins offer Seismic Colored Inversion for rapid band-limited inversion of seismic data and Seismic Spectral Bluing—a technique that shapes the seismic spectrum to optimize the resolution ‘without boosting noise to an unacceptable level.’
The latest release of Fugro-Jason’s Geoscience Workbench introduces a new model builder, EarthModel, a 3D grid/CPG builder and RockScale modules for constructing and upscaling simulation-ready grids. The new modules leverage the FastTracker technology acquired from Volumetrix in 2003.
Ansys Software’s computational fluid dynamics unit Fluent has announced support for Microsoft Windows 64-bit Compute Cluster Server 2003. Ansys 11.0 and Fluent 6.3 leverage Microsoft’s Message Passing Interface (MPI) for data communication between processors on the cluster. The new releases also use the Microsoft Job Scheduler, an ‘off-the-shelf’ solution for launching and controlling jobs on the cluster.
A new version of Geologic Systems’ GeoScout offers proprietary data import and visualization, PNGR Reports, well data export to Excel spreadsheet templates and calculation and posting of acreage data.
Geosoft’s Oasis Montaj now offers map export to ‘Dapple’, a global data explorer for visualizing, presenting and sharing large geoscience datasets. Dapple rolls in corporate, internet DAP, WMS and government tile servers such as NASA Landsat data, or Geologic Survey of Canada geology layers.
Rackable Systems has unveiled what is claimed as the world’s first cabinet of Quad-Core Intel Xeon 5300 processor-based servers. The new system was showcased during Intel CEO Paul Otellini’s keynote address at the Intel Developer Forum this month. Cabinets house 40 Rackable Systems C1000 1U servers, packing 320 cores in a 22 unit footprint.
Siemens has announced a line of security RFID chips that guarantee tag data integrity, stopping ‘product pirates’ from eavesdropping wireless data transmission. The ‘self-contained’ solution, which does not require a permanent database connection, leverages a new, compact form of elliptic curve-based public key encryption.
The World Wide Web Consortium (W3C) has updated its markup validation service which now includes an API for developers. The open source markup validator and link checker are ‘among W3C’s most popular and useful resources.’ More from w3.org/QA/Tools/#validators.
Impress Software’s Engineering Project Management package is currently being used by BP to synchronize project data between SAP and Primavera in the ‘transitional shutdown’ of BP’s Lingen, Germany refinery. The $60 million project involves 2,100 external contractors and 300 BP employees.
cc-Hubwoo, which acquired the ailing Trade Ranger in 2005, has recorded a ‘partial impairment charge’ of €34.4 million.
Targeting, inter alia, the seismic processing market, graphics chipmaker ATI Technologies has teamed with PeakStream to further commercial adoption of its ‘stream computing’ technology. ATI is also working with AMD to develop a stream computing co-processor based on its Torrenza chipset. Stream computing uses graphics processors (GPU) to solve complex computational problems.
ATI CEO Dave Orton said, ‘Today’s graphics processors are capable of processing more than just graphics. They are capable parallel processors, ideally suited for scientific and business applications.’
ATI processors running on the PeakStream platform have shown up to a 26 fold speed up in Monte Carlo simulations and have accelerated seismic processing by ‘as much as 20 times’.
PeakStream CEO Neil Knox added, ‘PeakStream cost-effectively converts GPUs into powerful computing engines for exponentially increased application performance. Companies can now program ATI’s high performance graphics processors for accelerated processing of non-graphics tasks.’ Stream computing has recently been adopted in Stanford University’s folding@home—a protein folding equivalent of the SETI project.
The Schlumberger Information Solutions (SIS) 2006 Forum was subtitled ‘breakthrough team performance’ but perhaps a better title would have been ‘The Petrel Show’. The plenary session covered familiar ground. Industry is on a roll, but constrained by lack of drilling rigs and other resources. The National Oil Companies are on the up, while the Internationals are constrained by lack of personnel. Everyone is planning to hire—but where these folks are to come from is uncertain. A solution to the people shortage, conveniently for SIS, is IT-led productivity gains. Hence the enthusiasm around Petrel which has built great momentum as a tool for integration of geoscience and engineering data. The ‘fly’ in the Petrel ointment is its own integration with other interpretation tools. Petrel files proliferate like Excel spreadsheets—making the data manager’s job hard.
Two other themes from the conference were ‘openness’ and web services. Openness to Schlumberger means the Ocean API and the Seabed data model. But don’t imagine that SIS is going open source. The Ocean API is for the development of plug-ins to Petrel (and in the future Osprey, Merak and Avocet) and the Seabed data model has been published as a textual description of the database rather than as a DDL. That said, third party developers using Ocean are impressed with its functionality. Likewise, the exposure of the Seabed intellectual property has raised some approving eyebrows in the industry.
On the web services front, the situation is less clear. SIS’ Petrel, Ocean and Seabed flagships do not appear to expose much in the way of web services that could be leveraged by third parties. Indeed there is a natural tension in a services-oriented architecture (SOA) which could for instance, move control of the workflow from Schlumberger’s infrastructure to a client’s script. SIS’ Joe Perino put it succinctly—‘SIS has embarked on a move toward SOA and will offer, for instance, versioning and results management as a service. But it is up to operators to push us to use open systems.’
Charles Johnson, MD of Microsoft’s manufacturing industry unit, described the partnership with Schlumberger. Microsoft is to offer Excel-based spreadsheet access to data in Schlumberger’s data bases, just like the Microsoft/SAP ‘Duet’ product. Johnson noted Microsoft’s arrival in the high performance computing (HPC) marketplace with Windows Compute Cluster Server 2003 (CCS) and the 64 bit edition of Windows XP. This was greeted with some skepticism and Johnson admitted that those requiring 64 bit support and robust debuggers will have to wait on Vista.
Stephen Whitley offered an insight into SIS’ IT architectural vision. Project teams include the developer, a business person and an ‘architect’ to assure conformance with Schlumberger’s Technology Blueprint Vision (TBV). According to the TBV, all products ‘must be open and interoperate’. There is focus on standardization. SIS ‘will adhere as closely to ProdML as possible.’ The roadmap favors software that exploits multi core processors and ‘wide scale XML-based data frameworks’. The desktop shift to Windows .NET was ‘a tough decision for SIS, traditionally a Unix company.’ Enterprise computing is to leverage OpenSpirit and Microsoft BizTalk and will ‘adhere to BPEL 2.0.’
Jevon Williams presented Shell’s hydrocarbon development planning workflow (HDP). This leverages LiveQuest along with an in-house tool for uncertainty management. A ‘Smart Workflow System’ keeps an audit trail of decisions made. The solution is ‘blue printable’ and can be deployed in other contexts. Projects use Shell’s Active Directory to build and notify teams. OpenSpirit is leveraged to build a Petrel project.
The Petrel Papers
Petrel is now integral to Petroleum Development Oman’s (PDO) subsurface asset management, as Talib Al-Ajmi explained. PDO now has 100 active models and 125 licenses (for 200 users). Usage spans petroleum engineering, reservoir engineering and petrophysics (seismic interpretation is under development). Petrel offers good integration of fracture data, mud loss data, PLT (production logs), pressure, seismic and well data, etc. This lets PDO highlight areas where efficiencies are possible. Full field models let PDO keep facility maps up to date by including satellite photos, pipeline data etc. Petrel has proved an ideal collaborative environment for data and interpretation QA/QC and managing uncertainty.
Arun Narayanan presented the Petrel data management roadmap which ultimately will combine flat file performance with the features of a database. The Petrel data management roadmap extends out to 2008 when a Petrel ‘DBX’ database will interface with Seabed—bringing full units and coordinate reference system support.
Seismic in Petrel
Peter Diebold described Shell’s evaluation of Petrel’s seismic interpretation (PSI). Datasets over 10GB are common in Shell, often with a mix of 2D and 3D data. Diebold interspersed his evaluation with proselytizing for a paradigm shift away from what he describes as old-fashioned ‘paper on screen’ (PoS) technology—advocating instead a modern ‘sparse’ 3D approach. Petrel was evaluated in this context across a matrix of eight ‘themes’ and was tested all over the world on different live projects. The results showed that Petrel was great for ‘quick look’ seismic interpretation and for integration with non-seismic disciplines. Users commented that they had ‘never seen volume interpretation come so naturally’. Petrel’s workflow management, multiple realizations and its use in interpretation quality assurance were positive points. On the downside, Petrel’s lacunae in data management were deemed ‘a huge issue’. Legacy data integration was problematic and OpenSpirit proved ‘slow and unpredictable’. Shell has recommended limited deployment of PSI, for example in the New Business Ventures group. PSI requires training and a mindset change for the PoS ‘gang’ found in middle management.
This article has been summarized from a 14 page report produced as part of The Data Room’s Technology Watch reporting service. For more on this subscription-based service please visit the Technology Watch home page or email email@example.com.
The second annual user meeting of the Pipeline Open Data Standards organization (PODS) took place in Houston last month. 70 members were in attendance from as far afield as Russia, Germany, and Norway. President Alan Herbison (KinderMorgan) described PODS as ‘a growing, world-wide organization with members in 11 countries.’ PODS is accelerating delivery of new modules to allow fast track deployment. In July, the North American Corrosion Engineers association (NACE) and PODS jointly approved a new external corrosion direct assessment (ECDA) integrity data standard covering close interval survey data, DCVG/ACVG, depth of cover, centerline and survey data. PODS is also going ‘spatial’ with a variety of GIS developments, notably the ArcGIS Pipeline Data Model (APDM) work with ESRI. The latest release, PODS 4 is now fully documented with Sybase’s PowerDesigner and a workgroup has begun on ‘fine grained’ entitlements-based Oracle access.
Michael Ray reviewed Texas Gas Transmission’s (TGT) migration to PODS from manually drafted location data, alignment, DOT sheets etc. In the 1990s, TGT moved to a Sybase database with an ArcView GIS. This was found ‘slow and cumbersome’ and there were issues with data management and data exchange from risk and corrosion systems. In 2005, TGT decided to migrate to a PODS database to benefit from its better design, extensibility, integrity rules and to share experience with other operators. There was also an expectation that vendor products would integrate better with a PODS-based solution. GIS was upgraded to ArcGIS 9.0 and ArcSDE. Ray offered advice for prospective migrators—leverage vendors who know PODS and stay ‘true’ to the model.
Pipeline awareness (DOT)
John Jacobi is the community assistance and technical services manager with the Southwest Region of the Pipeline and Hazardous Materials Safety Administration (PHMSA), part of the US Department of Transportation. The PHMSA’s Office of Pipeline Safety (OPS) is the federal safety authority for the US’ 2.3 million miles of natural gas and hazardous liquid pipelines. As a component of its program to ensure the ‘safe, reliable, and environmentally sound operation of the nation’s pipeline transportation system,’ the OPS has instigated a pipeline awareness program which sets out to ‘advance public safety, [..] by facilitating clear communications among all pipeline stakeholders, including the public, the operators and government officials.’ The awareness program targets ‘high consequence areas’ (HCA) such as highly populated areas, waterways and drinking water sources. The potential impact radius from a pipeline is computed with regard to buildings’ proximity and occupancy. Here GIS and asset facility management tools are used to determine critical geographic parameters and assure the PHMSA of probable compliance.
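For the curious, one published rule for the potential impact radius of a natural gas pipeline is the formula in ASME B31.8S: r = 0.69 d √p, with pipe diameter d in inches, maximum operating pressure p in psi and r in feet. Whether the PHMSA’s GIS tools use exactly this formula is an assumption on our part. A minimal Python sketch:

```python
import math

def potential_impact_radius(diameter_in, pressure_psi):
    """Potential impact radius in feet per the ASME B31.8S rule of thumb
    for natural gas lines (diameter in inches, pressure in psi)."""
    return 0.69 * diameter_in * math.sqrt(pressure_psi)

# A 30-inch line operating at 1,000 psi: roughly 650 feet.
r = potential_impact_radius(30, 1000)
```

Buildings and occupancy within that radius are what push a segment into the ‘high consequence area’ category.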
Sarah Johnson described pipeline operator Sunoco’s migration from a legacy database and GIS to a PODS/ArcGIS infrastructure. Sunoco operates a 1,800 mile network on the Eastern seaboard, terminal facilities and another 3,600 miles of pipeline in Oklahoma and Texas. A legacy mapping system, ‘SunMap,’ created in the 1990s, deployed AutoCAD and Oracle for live generation of alignment sheets. DOT integrity requirements led to the development of a more flexible system rolling in new data types and to Sunoco’s PODS-based ‘ALIGN’ database. This has been designed to support internet mapping, alignment sheet generation, risk management and other applications. A complex migration and data correction process involved much quality control; when errors are found, all relevant sheets are checked. Johnson believes ‘pipeline is too important’ to neglect data QC/QA, which may include field work and continues post migration. Sunoco has realized the importance of, and effort required in, data management to keep things consistent. Johnson believes that communication is key—data management ‘cannot take place in a bubble’. For specialist data such as corrosion control, data management may require help from domain specialists as well as GIS staff.
Spatial work group
Spatial Chairman Mike King (BP America) reported that the spatial workgroup is developing ‘PODS-compatible database model(s) or extension(s) that will spatially enable the PODS database to underpin proprietary applications and to facilitate PODS/GIS integration.’ Although PODS plans to maintain a vendor neutral approach to spatialization, the need to integrate with proprietary spatial technologies is recognized. Subcommittees have been formed to address both ESRI Geodatabase solutions and an Oracle Spatial solution that will support SmallWorld, Intergraph, and Autodesk. The ESRI Geodatabase comes in both ‘Lite’ and ‘Heavy’ forms. The Lite version consists of common APDM core classes and a limited number of the PODS tables and will be for free public distribution. ‘Heavy’ extends the APDM core with all PODS event and domain tables; access will be limited to PODS members only. The ESRI development is the subject of a memorandum of understanding between ESRI, PODS and the ArcGIS Pipeline Data Model (APDM)—see below.
Other presentations of note included Ken Greer’s (Centerpoint Energy) presentation on linking PODS inventory data to a scheduled maintenance management package (Maximo) and Gary Hoover’s talk on managing derived data in PODS. The latter included linking PODS data with Google Earth. With due attention to the coordinate reference system, Google Earth ‘.kml’ files can be built on the fly, effectively spatializing the PODS database.
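Building ‘.kml’ files on the fly amounts to wrapping feature coordinates in KML Placemark elements (KML expects longitude,latitude ordering in WGS84). A minimal sketch, with invented feature names and a hand-rolled serializer standing in for a real PODS query:

```python
def pods_to_kml(features):
    """Serialize (name, lon, lat) tuples as a minimal KML document.
    Coordinates must already be in WGS84; KML uses lon,lat,alt order."""
    placemarks = "".join(
        f"<Placemark><name>{name}</name>"
        f"<Point><coordinates>{lon},{lat},0</coordinates></Point></Placemark>"
        for name, lon, lat in features
    )
    return ('<?xml version="1.0" encoding="UTF-8"?>'
            '<kml xmlns="http://www.opengis.net/kml/2.2">'
            f"<Document>{placemarks}</Document></kml>")

# Hypothetical PODS facilities rendered as placemarks
kml = pods_to_kml([("Valve V-101", -95.37, 29.76),
                   ("Launcher L-7", -95.41, 29.80)])
```

In practice the feature list would come from a spatial query against the PODS database, and attribute data could be folded into each placemark’s description balloon.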
Following the PODS meet, a memorandum of understanding was signed between ESRI and PODS and the ArcGIS Pipeline Data Model (APDM) Steering Committee. The outcome of this relationship will be a PODS-based data model that uses essential APDM core components. The APDM was expressly designed for implementation as an ESRI geodatabase. The PODS input will add PODS standards to the APDM.
The Petrotechnical Open Standards Consortium is to inaugurate an ‘Energy Standards Resource Center’ in Houston next month.
Canadian Pengrowth Energy Trust has joined the OFS Portal e-commerce hub.
Industrial Software Solutions has received an A$863,373 grant from the Australian ‘AusIndustry’ Commercial Ready fund for R&D of its ‘BabelFish Aspect’ geospatial tool.
Cell/GPU-based computer hardware specialist PeakStream has secured $17 million funding from Foundation Capital, Kleiner Perkins Caufield & Byers and Sequoia Capital.
The Petroleum Technology Alliance Canada (PTAC) has just published a report titled ‘Filling the Gap—an Unconventional Gas Technology Roadmap.’ The Roadmap will shortly be available free from the ptac.org website.
Mukul Sharma’s group at the Center for Petroleum and Geoscience Engineering at the University of Texas at Austin has received a $694,000 grant to improve the 3-D models used to assess hydraulic fracturing. Anadarko will test the new models.
Wireless SCADA specialist vMonitor reports deployment by ExxonMobil Nigeria and SonaHess Algeria, a joint venture between Sonatrach and Hess, of its wireless well monitoring and metering solution.
UK-based AnTech has appointed Rachel Heal as Electronics Engineer, James Kelsall as Design Engineer and Angela Bryant as Sales and Marketing Administrator.
The American Petroleum Institute API has appointed Rex Tillerson (ExxonMobil) as chairman of its board, Larry Nichols (Devon Energy) as treasurer and Clarence Cazalot (Marathon) as chairman of the audit committee.
Aspen Technology has appointed David McKenna to its Board, replacing Chris Pike who has resigned.
BJ Process and Pipeline has appointed Brent Greenway as sales manager for Saudi Arabia.
CGG unit Sercel has acquired wireless seismic acquisition systems developer Vibration Technologies in a cash transaction.
Jo Webber is now CEO of pipeline software house Energy Solutions and Ann Casaday has joined the company as VP Global Sales. Both were previously with InnaPhase Corp.
Following a conflict with another trademark, Finetooth has changed its name to Headwave Inc.
Ken Fox now heads up EMEA marketing for Invensys Process Systems.
Geotrace has promoted Mark Carrigan to VP Western Hemisphere of its newly acquired Tigress Software unit.
The French Petroleum Institute (IFP) has named Olga Vizika director of reservoir engineering research and Dominique Henri as director of industrial development.
Mick Lambert, president of Input/Output’s GX Technology subsidiary, is to retire at the end of the year. Jim Hollis is to assume responsibility for GXT’s business.
LSI Logic has promoted Flavio Santoni to VP Storage Sales and Marketing.
Frank Ingari has joined MetaCarta’s board of directors.
Cristina Robinson Marras is General Manager of ER Mapper UK.
Randy Smith has joined SCA as its Sales & Recruiting Manager.
SGI’s reorganization is complete and the company has ‘emerged’ from Chapter 11. Some 11 million new shares in SGI are now trading on Nasdaq under the SGIC ticker.
Schlumberger has completed its US corporate office relocation to Houston from New York.
Daniel Valot is to step down as Technip Chairman next year. The company is looking for a replacement.
Richard Ellam has joined Techsia as General Manager and Executive Vice President.
TGS-NOPEC has opened an office in Moscow.
MWD specialist Ulterra Drilling Technologies has named Bob Iversen as President and CEO.
Advanced Geotechnology has been awarded a contract from the Plains CO2 Reduction Partnership for a geomechanical investigation into acid gas (CO2 and H2S) sequestration in the Keg River Formation at Apache Canada’s Zama Lake field in northwest Alberta.
Landmark’s Digital and Consulting Solutions (DCS) division has announced new hardware and software bundles for high end visualization and data storage. A deal with Verari Systems and Nvidia will offer Landmark’s flagship GeoProbe interpretation package running on a Verari ‘E&P 7500’ server, with up to eight AMD Opteron processors, 128 gigabytes of memory and Nvidia Quadro graphics. An optional Quadro Plex 1000 visual computer is also available. The Linux-based E&P 7500 has a sub $100,000 price tag.
Landmark DCS also unveiled a visual cataloging and tapeless archiving solution integrating Landmark’s Corporate Data Archiver (CDA) with EMC Centera’s content-addressed storage system. Landmark VP Doug Meikle explained ‘Because data is archived to disk rather than tapes, retrievals that previously took days now take seconds—without assistance from the IT department.’ CDA catalogs and summarizes project data, creating ‘thumbnails’ that allow interpretations to be reviewed without full retrieval.
Austin, TX-based Drilling Info has announced the ‘Well Log Network’, a new well log digitization subscription service. Subscribers can digitize, access, and trade well logs and gain access to a library of well logs and images for research, as well as to a catalog of relevant log files. The Drilling Info Well Log Library holds over 850,000 well logs and images ready for download. Subscribers can license third-party files and earn royalties on the files that are created from their own well logs.
Drilling Info CEO Allen Gilmer said, ‘We have expanded the value of Drilling Info for our subscribers by integrating digitization with our web-based content delivery.’ Subscription rates are based on company size and digitization pricing is tied to subscription level.
Schlumberger has just announced a ‘portable operations center’ for remote control of drilling operations. The Center is a compact unit that fits in the back of a truck. Ping and power (an internet connection and mains electricity) are all that is needed to get up and running. Once installed, the Center gives in-house drilling engineers real time information on the well’s progress through familiar displays of hydraulics, torque and drag and more. A digital record of the well is captured which can be played back later for training, to compare well models with actual recordings and to identify precursors to abnormal events.
The portable operations center is not for super majors which will likely have their own in-house dedicated drilling control rooms. But small to mid-size companies may want to deploy one occasionally.
Following its acquisition of Maurer Technology’s software last June, Petris has now added a ‘user-friendly, multi-language interface’ to Maurer’s drilling toolkit, repackaging the software as ‘PetrisWINDS DrillNET’. DrillNET combines Maurer’s drilling programs into an integrated package that permits data to be captured, reused and shared among users.
Based on Microsoft .NET, DrillNET’s integrated database is coupled with context-sensitive help and a novel ‘traffic light’ approach that checks that all data is complete for a calculation. Reports can be exported to Microsoft Office products. DrillNET has been localized in English, Russian, Spanish and Chinese.
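A ‘traffic light’ completeness check of the kind described can be as simple as testing required and recommended inputs before a calculation is allowed to run. A hypothetical sketch (the field names are invented for illustration, not taken from DrillNET):

```python
def readiness(data, required, recommended):
    """Traffic-light data completeness check before running a calculation.
    Returns 'red' if any required value is missing, 'amber' if only
    recommended values are missing, and 'green' if everything is present."""
    if any(data.get(field) is None for field in required):
        return "red"
    if any(data.get(field) is None for field in recommended):
        return "amber"
    return "green"

# Hypothetical torque-and-drag inputs: hole depth present, mud weight absent
status = readiness(
    {"hole_depth": 9500, "mud_weight": None, "casing_od": 9.625},
    required=["hole_depth", "casing_od"],
    recommended=["mud_weight"],
)
```

The calling application would map the returned status to a red, amber or green indicator next to the calculation button.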
Petris CEO Jim Pritchett said, ‘DrillNET represents a significant advance in usability for drilling engineering. Clients told us that they need a solution that lets them collaborate and reuse drilling programs while making it possible to bring new personnel up to speed faster and with better performance.’
Iron Mountain is to leverage the collaboration and content management functionality of Microsoft’s SharePoint Server 2007 to address the records management challenge. Iron Mountain will embed its records management policies, process and archival into Microsoft’s Office 2007, creating a ‘policy-based lifecycle e-records management solution’.
Microsoft Office SharePoint Server 2007 and Exchange Server 2007 will allow users to apply records classification and retention policies to documents as part of their normal activities. Less frequently accessed information is automatically migrated to Iron Mountain’s hosted digital archive, where retention policies are applied.
Iron Mountain senior VP Ken Rubin said, ‘This is the first time these capabilities are available to the mass business market without requiring huge investments in large-scale document management or traditional enterprise content management deployments. Combining our records management expertise with Microsoft’s mass market reach will bring a first—a cost-effective and practical e-records management solution.’ The new solution also addresses Federal Rules of Civil Procedure N° 26 covering electronically stored information. This is expected to put companies’ ability to respond to ‘e-discovery’ requirements during litigation to a severe test.
Chevron and Los Alamos National Laboratory (LANL) have embarked on a joint research project to improve the recovery of hydrocarbons trapped in oil shales and slow-flowing oil formations. The idea is to develop an ‘environmentally responsible and commercially viable process’ to recover crude oil and natural gas from Western US oil shales—with focus on the Piceance Basin in Colorado. The work will include reservoir simulation and modeling and experimental validation of new recovery techniques such as in-situ processing—a technique that has the potential to mitigate greenhouse gas emissions.
Chevron has also applied to participate in the Bureau of Land Management’s research, development and demonstration leasing program in the Piceance Basin to evaluate these technologies in the field. The R&D will be shared between LANL and Chevron’s technology center in Houston.
The first ‘Semantic Days’ conference was held under the auspices of the Norwegian oil and gas trade association, OLF, in Stavanger last April. Presenters included semantic luminaries Eric Miller (MIT/W3C) and Deborah McGuinness (Stanford), co-author of the W3C’s Web Ontology Language (OWL). The semantic web, the brainchild of the world wide web’s inventor Tim Berners-Lee, revolves around the idea that information on a web page should be presented in a way that will allow automated ‘discovery’ of business relevant information.
Following educational sessions on semantic web technologies and on their application in the public sector, the ‘semantic oil and energy track’ included presentations from Statoil, Hydro, TietoEnator and the POSC Caesar Association. As Jon Atle Gulla of the Norwegian University of Science and Technology (NTNU) explained, semantic search uses ‘ontologies’ to represent domain vocabulary, documents’ content and users’ information needs. The problem is that most documents today do not contain such information, and annotating documents with semantic information is a ‘tedious and labor intensive task.’ Worse, such metadata is not even used by current search engines! Notwithstanding these issues, Gulla expects that in the future, ‘hidden’ ontologies will aid query interpretation and automate semantic indexing. Gulla then turned to the Norwegian semantic flagship project, the ambitious Integrated Information Platform (OITJ Vol. 10 N° 6), showing how ‘morpho-syntactic’ search can use resources such as Schlumberger’s Oilfield Glossary to develop semantic relationships.
IIP project manager Svein Omdal described how XML data standards and the web ontology language (OWL) are being used to classify and retrieve information from ISO 15926 and other data models. The aim is to be able to visualize information from both operations and subsurface. Using an example of condition-based maintenance, Omdal described first generation technology leveraging domain specific XML schemas based on the POSC CAESAR reference data library. A second generation approach will leverage a common ontology ensuring that concepts are consistently defined across domains. ‘Reasoning software’ will combine data from several domains, automatically monitor and control equipment, order spare parts and prepare maintenance plans.
Computas VP Roar Fjellheim described the ‘Active Knowledge System for Integrated Operations’ (AKSIO) project. This includes a Statoil project to assure the exchange of information between the offshore rig, operations centers and expert communities of practice in the organization. Early database approaches failed because such passive knowledge capture is inefficient. This led to the search for knowledge transfer using semantics and ‘active’ knowledge search. Over time, a ‘knowledge resources map’ is built up. AKSIO deploys a drilling ontology in OWL (edited with Protégé). Knowledge maps are represented with the resource description framework (RDF, using Jena). Integration is performed in Microsoft SharePoint.
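A knowledge resources map of the AKSIO kind boils down to subject-predicate-object triples that can be queried by pattern. A toy in-memory sketch (the real system uses OWL/Protégé and RDF/Jena; every identifier below is invented for illustration):

```python
# Minimal in-memory triple store sketch for a drilling knowledge map.
triples = set()

def add(subject, predicate, obj):
    """Record one subject-predicate-object statement."""
    triples.add((subject, predicate, obj))

def query(s=None, p=None, o=None):
    """Return all triples matching the pattern; None acts as a wildcard."""
    return [(ts, tp, to) for ts, tp, to in triples
            if s in (None, ts) and p in (None, tp) and o in (None, to)]

# Hypothetical drilling knowledge: an experience report, its topic,
# and an expert associated with that topic.
add("report:42", "aksio:discusses", "topic:StuckPipe")
add("topic:StuckPipe", "rdf:type", "drilling:Problem")
add("expert:jdoe", "aksio:knowsAbout", "topic:StuckPipe")

# Who knows about anything discussed in report:42?
topics = [o for _, _, o in query(s="report:42", p="aksio:discusses")]
experts = [s for s, _, o in query(p="aksio:knowsAbout") if o in topics]
```

A real RDF store adds URIs, schema-level reasoning and a query language (SPARQL), but the pattern-matching idea is the same.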
Intergraph Corp. has teamed with James W. Sewall (JWS) to combine its facilities management offering with Sewall’s pipeline integrity tools into a ‘complete utilities pipeline integrity management solution.’ Sewall’s integrity software helps gas pipeline operators perform risk assessment, generate alignment sheets, perform class location analysis and high consequence area (HCA) calculations. These applications complement Intergraph’s G/Pipeline suite which couples pipeline industry best practices with geo-located facilities management. The deal claims to ‘streamline integrity data management, risk and integrity assessments.’
Intergraph VP Jay Stinson said, ‘By integrating Sewall’s industry knowledge with our pipeline integrity applications we can provide end-to-end pipeline management that will ensure natural gas network operators operate at the highest level of capacity.’
X-Change Corp. unit AirGate Technologies has licensed sixteen patents relating to radio frequency identification (RFID) tags for tracking oilfield equipment inventory from Den-Con Electronics. Tagged items include drilling, production and work-over equipment, tubulars, valves, and plant equipment. The agreement also lets AirGate use Den-Con’s surface acoustic wave (SAW) RFID technology. SAW RFID devices are suited to high temperature and pressure environments and work in the presence of liquid and metals—common conditions in the oil industry.
AirGate president Michael Sheriff said, ‘This patent portfolio, in tandem with our advanced SAW technology, significantly strengthens our competitive position and supports our RFID effort in the oil industry.’ Earlier this year AirGate signed an agreement for sale and support of SAW RFID systems with Austria-based CTR AG.
The Association of Records Managers and Administrators (ARMA) and the Storage Networking Industry Association (SNIA) are to cooperate on educational and technical programs. The alliance sets out to stimulate collaboration between the organizations’ constituencies—records and information management (RIM) and information technology (IT). A white paper (www.arma.org/pdf/news/collaborationwp.pdf) titled, ‘Collaboration: The New Standard of Excellence’ discusses the need for collaboration between RIM, IT, legal, and security professionals, and addresses the complexities involved in managing records and information. At issue is the fact that today, many disparate operating groups own a piece of the information management puzzle, ‘whether they realize it or not.’
The hope is that this collaboration will ‘quell the impending chaos’ of the convergence of regulatory and legal imperatives. SNIA works on standards for data management and information lifecycle management. More from www.snia-dmf.org, www.snia.org, and www.arma.org.
Hydro and Foster Findlay Associates (FFA) have just announced a new seismic interpretation toolset for advanced volume interpretation. Hydro AVI enables rapid reconnaissance of large 3D seismic data sets, delineation and measurement of potential reservoirs and identification of hydrocarbon indicators.
FFA has been cooperating with Hydro since 2004 when the companies first announced the R&D effort (OITJ Vol. 9 N° 4). The project has developed tools for analyzing ‘multiple frequencies within the seismic data’ and blending them into a single 3D image. The results have been tested in Hydro’s operations.
Hydro’s AVI tools will be available from FFA as components of the January 2007 release of SVI Pro, FFA’s flagship product for processing and analysis of 3D seismic data.
Petrolink has signed a marketing agreement with Milan, Italy-based XMLPower to integrate its WITSML Data Solution (WDS) into the Petrolink service portfolio. Petrolink is an oil and gas communications technology provider specializing in the secure transmission and distribution of geotechnical and associated data throughout the world. XMLPower’s WDS comprises modular, fault-tolerant acquisition-transmission-storage units that interface automatically with a third-party vendor’s or operator’s WITSML-compliant data sources and applications.
By combining the WITSML open-source standards with other emerging XML services, XMLPower intends to be the leading provider of WITSML solutions to the worldwide oil and gas industry. XMLPower has no affiliation with wellsite data acquisition service providers and claims to be positioned to provide an extranet hub for use by operators, partners and service providers. The neutral extranet hub avoids conflicts of interest when service providers are required to transfer data that they do not own.
Best of breed
By using neutral interfaces, operators are less reliant on ‘integrated offerings’ from the larger service providers and can deploy ‘best of breed’ solutions from other specialized vendors—promoting a ‘healthier competitive environment’.
Pavilion Technologies has just released version 2.0 of its Pavilion8 platform. Pavilion’s technology embeds Halliburton/Landmark’s DecisionSpace for Production simulator. The new version offers augmented modeling capabilities, enhanced web service integration and a new metadata warehouse. A Real-time Environmental Management tool now complements the existing process optimization and control applications.
Pavilion8 deploys a service-oriented architecture (SOA) implemented in J2EE. The new release includes enhancements to the Model Analytic Engine to support composite models that aggregate and extend existing models—including those from other vendors. A new Metadata Warehouse aggregates time-series data, providing a ‘system of record’ of production and compliance information.
Other enhancements include integration with ERP applications, data warehouses, historians, distributed control systems and other heterogeneous data sources. Real time environmental management brings compliance and emissions monitoring reporting.
WellDynamics is a joint venture between Shell and Halliburton. So news that it has ‘teamed’ with Halliburton’s Landmark unit is not exactly earth shattering. But the technology to be shared between Halliburton’s ‘smart well’ unit and its software arm is pretty interesting. Landmark is to contribute AssetSolver—the marriage of Pavilion Technologies’ model-based controller package (Oil ITJ Vol. 11 N°4) with Landmark’s DecisionSpace for Production. WellDynamics is bringing its SmartWell technology to the table. The idea is to optimize SmartWell’s fine-grained control of producing zones using AssetSolver.
Landmark’s VP Doug Meikle said, ‘Today, operators can build models of an asset. But without the ability to control what happens downhole, this knowledge cannot be fully leveraged. Combining these technologies gives operators the knowledge they need and the ability to act on it.’
AssetSolver performs multiple runs of steady state models, generating ranges of values that are used to train a neural net-based optimizer. Sub models (such as WellSolver) can be included to create a complete model of the production system. Models can be used to perform front end engineering design (FEED) or to ‘monitor, forecast, and optimize well behavior and production for individual wells and across entire gathering networks.’
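The run-the-model-then-train-an-optimizer pattern can be sketched by sampling a stand-in steady-state model over ranges of control settings and searching the samples. A plain exhaustive lookup stands in for AssetSolver’s neural net here, and the model and its variables are invented for illustration:

```python
import itertools

def simple_model(choke, gas_lift):
    """Hypothetical stand-in for a steady-state well/network simulator:
    production rate as a concave function of choke and gas lift settings."""
    return choke * 10 - 0.5 * choke**2 + 2 * gas_lift - 0.1 * gas_lift**2

# Sample the model over ranges of control settings, as AssetSolver does
# before training its optimizer (here we just keep the raw samples).
samples = [((choke, gas_lift), simple_model(choke, gas_lift))
           for choke, gas_lift in itertools.product(range(0, 11),
                                                    range(0, 21))]

# Exhaustive search over the sampled responses stands in for the
# neural net-based optimizer described in the article.
best_settings, best_rate = max(samples, key=lambda s: s[1])
```

The point of the surrogate approach is that once trained, the optimizer answers ‘what if’ queries in milliseconds, without re-running the expensive steady-state model for each candidate setting.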
WellDynamics hardware is used to control downhole flow and to manage complex recovery methods such as chemical flooding, miscible displacement and thermal recovery. WellDynamics VP Derek Mathieson added, ‘WellDynamics gives our users unprecedented control of what happens below the surface. The joint solution will give them the information they need to use our tools and systems to control their wells and to meet goals like NPV, ultimate recovery or short term production.’
Apache Corp. has chosen McLaren Software’s Enterprise Engineer Application Suite to manage engineering content across its worldwide assets. Enterprise Engineer (EE) is a suite of configurable business applications that manages engineering content and associated work processes. EE provides lifecycle content management for CAD drawings, standard operating procedures, correspondence, email and specifications.
Apache Corp. CTO Mike Bahorich explained, ‘We selected EE to automate our business processes for managing engineering documentation; including ‘as built’ designs and standard operating procedures within and across business boundaries. EE is to automate existing manual processes, creating economies of scale in the management of our capital assets. Our first implementation is in the UK North Sea.’
Earlier this year, Franco-British oil company Perenco deployed EE to manage engineering documents on its UK North Sea Trent development. EMC Documentum was Perenco’s repository of choice, with EE adding business process and engineering content management. Perenco’s Robbie Shields said, ‘EE gave us rapid deployment of configurable applications for engineering content and process control.’
Previously, Microsoft Word was used in a ‘manual’ review process. EE now provides default document types, dialogs, automatic numbering, filing, project control and process automation. EE’s Transmittals application underpins the review and the distribution of content to third parties. McLaren Studio is being used to configure the XML business rules within Enterprise Engineer. The transmittal process also offers an electronic audit trail of documents as they proceed from review to repository. Time limits can be set for review and reminders sent out to reviewers as deadlines approach.
At its Annual Technical Conference and Exhibition in San Antonio last month, the Society of Petroleum Engineers inaugurated two new Sections targeting information technology and R&D. IT Technical Section Chairman Mehrzad Mahdavi, describing the section as being ‘about the digital oilfield,’ said, ‘We’ve got CIOs, presidents of IT, and senior VPs from companies like Chevron, ExxonMobil, Total, Occidental Petroleum, and Petrobras on the committee.’
IT Section founder and ExxonMobil CIO Steve Comstock said, ‘IM and KM are a huge part of a petroleum engineer’s job today. With the IT section, we want to create an environment which supports these new career paths.’ Some vendors queried why they had not as yet been invited to the IT party. Mahdavi, who is VP global security with Schlumberger, intimated that the founders preferred to keep initial access to oil companies.
The inaugural meeting of the R&D Technical Section drew a crowd of 80. Section goals include providing a forum for, and promoting, R&D and technology in the petroleum industry. A nominating committee is drafting a slate of potential officers with voting scheduled for mid November. The section will be supporting the SPE 2007 Research & Development Conference to be held in April 2007 in San Antonio.