May 2012


Oil and gas MDM

Shell’s SAP and Microsoft master data management trials. IPL on Statoil’s model-driven approach. Envizion on GDF Suez’s IBM InfoSphere-based semantics for information management.

Judging from presentations at the IRM-UK Master Data Management conference held last month in London, the MDM problem is easier to state than to resolve. Andrew Schulze outlined proof of concept MDM initiatives in Shell. Both SAP and Microsoft's MDM tools have been trialled, with help from Accenture and Wipro. The tools were assessed as master data 'peers' and evaluated for integration with Shell's information architecture (Oil ITJ January 2011). Scripts for various scenarios were run on hardware provisioned through Amazon web services. A gap/functionality scorecard has been established and today, seven data objects are live, six projects are active and three PoCs are complete. A key finding is that MDM needs to be delivered for the enterprise. Responsibilities must be clearly defined 'so governance does not fall through the cracks.'

Chris Bradley (IPL) and Eldar Bjorge outlined Statoil’s model-driven approach to MDM. Internal metrics showed that 85% of incidents at Statoil’s supply chain service center are due to master data errors. Problems arise from poor master data within and across environments and from silo-oriented, stand-alone applications. MDM is being driven by Statoil’s need to rationalize drilling, well, lease and supply chain data and to support marketing, processing and renewable energy. A pilot covering enterprise content management of location data has allowed for testing of MDM governance workflows and tools.

One issue with MDM is that projects can't wait on master data delivery. The answer is to supply 'just in time' subsets of master data to projects as needed. 'MDM is not the field of dreams.' In E&P, managing complex data types involves compromise and the art of the possible. Statoil has established decision criteria for its approach and architecture, along with a conceptual model for the subject areas that will be put under MDM control. The geography subject area is finalized. Priority and sequencing for the other subject areas have yet to be decided.

Jan Henderyckx (Envizion) and Greet Krekels described GDF Suez's vision of information as a corporate asset—protected through its life cycle, providing trusted information to all stakeholders. GDF Suez uses the InfoSphere toolset to map its information landscape across real-world assets, like rotating equipment, into business and implementation semantics. An InfoSphere business glossary browser contains master definitions. Definition ownership is thrashed out by a coordination board and subject matter expert task forces working across divisions. Multiple models are built and maintained in InfoSphere Data Architect. Henderyckx observed that low maturity in data modeling and poor standards enforcement result in a strong business case and lots of 'low hanging fruit.' More from IRM-UK.


TGS bags Volant

Acquisition of Volant Solutions promises tight integration of vendor data products with geoscience data integration tools EnerConnect and Envoy.

TGS has acquired all the outstanding shares of Volant Solutions. The six employees of the Houston-based software house provide data integration solutions to E&P companies. John Adamick, senior VP of TGS' geological products division, stated, 'Volant helps clients connect different types of well data to multiple geoscience applications with the EnerConnect and Envoy solutions. This deal will allow TGS to grow Volant's business and at the same time will enhance the connectivity of TGS well data sets.'

Volant CEO Scott Schneider added, 'We will be working with TGS to integrate its suite of data products with our technology and provide customers with efficient access to TGS' data products from their interpretation processes. So far we have lacked the key ingredient of tight integration with vendor data.' Referring to multiple recent partnerings, Schneider told Oil IT Journal, 'Our existing partnerships are all still very much in play. TGS fully expects Volant to continue developing integration solutions that are leveraged into partnerships with Neuralog, EnergyIQ, Halliburton, IHS Energy, Paradigm and Perigon Solutions.' More from scott.schneider@volantsolutions.com.


Kaizen, the cluttered desk and upstream workflows

Inspired by Brendan Greeley's Business Week article on Monomoy Capital Partners' approach to shop floor efficiency, Oil IT Journal editor Neil McNaughton wonders if kaizen is really necessary if your window on the world is a computer monitor. Back from this year's PNEC data integration conference he decides that yes, cleanup is needed—both on the desktop and on the desk itself!

This editorial was inspired by an article in Business Week by Brendan Greeley titled 'My week at private equity boot camp' and by my attendance earlier this month at Phil Crouse's PNEC data integration conference, more of which in the June issue of Oil IT Journal. Greeley describes a visit to a mid-sized factory in Scottsboro, Alabama making parts for industrial refrigeration units. The unit, HTPG, was acquired by Monomoy Capital Partners in 2010 and recently underwent an in-depth makeover performed by Monomoy's efficiency gurus. I invite you to read the full piece for a blow-by-blow description of how Monomoy does its stuff—it's pretty informative. Monomoy uses the Toyota production system with its 'kaizen' (continuous improvement) approach. In the plant this translates into a sometimes 'alien obsession' with cleanliness and order.

What caught my fancy here was a kind of triple take I had as I finished the article. Reading about folks cleaning up after themselves to keep the plant shipshape and doing kaizens on this and that, my first reaction was that this is all jolly interesting stuff but that it could hardly apply to a company the size of Oil IT Journal's publisher, The Data Room.

Then I thought, actually our 'plant,' a very small office, is in an incredibly un-kaizen state. There are boxes of old computer leads piled up along the back wall. Ill-advised kit bought years ago on eBay sits atop fading Ikea 'Billy' bookshelves, sagging under the weight of old computer manuals. Several different sizes and geometries of carry-ons are readied for short notice departure, papers and documents are piled up higgledy-piggledy, and two live computers and one dead one sit alongside various scanners, printers and a fax. We are, it would seem, totally un-obsessed with cleanliness and order.

In fact the problem with ping, power and desktop publishing is that you can more or less happily look forward at your computer screen while the crap piles up around you. Stuff comes in on the wire—it is shifted around on the computer and massaged on the screen and eventually it goes out to the website and/or email recipient and the job is done. The real world of documents, paraphernalia and dust is somehow orthogonal to the digital world. I wonder, is there any benefit to be had from a small office kaizen blitz?

While pondering this, my third take came along. Kaizen, cleanliness and all that is pretty well exactly what data management is about. A labyrinthine folder structure is just like the plant filled with spare inventory. A kaizen expert would have a field day in your average IT shop. Maybe they already do. In case they don't, here are some hints from Greeley and Monomoy.

Monomoy sends a team of kaizen experts in to perform a kaizen 'boot camp' on its acquisitions. The process starts with 'cleaning,' which involves sorting, straightening, sweeping, standardizing and 'sustaining' the shop floor. Kaizen, which started out as the Toyota production system (TPS), homes in on three kinds of 'muda' or waste: actions that create no value, that overburden people or machines, or that lead to inconsistent production. The process involves what used to be called 'time and motion studies' to optimize the workspace and define a precise job for everyone.

So how does all this map across to the upstream? At the highest level you can imagine private equity-backed experts coming in and sorting out the tortuous processes of portfolio management, drilling and producing. Perhaps Tullow should abandon exploration in its own right and turn itself into an E&P private equity cum mergers and acquisitions specialist.

At a level closer to the E&P shop floor, a kaizen blitz ought to be a good idea for the data and interpretation cycle. This would involve quality control—not just of the data itself but of the processes surrounding it—attacking 'waste that overburdens,' like reformatting and conditioning data for use, copying files and rework—stuff that happens all the time. It is something of a truism that much of the industry lives with constant fiddling around wasteful processes. These issues are well known and addressed at every PNEC. So what exactly is added by kaizen?

Well, it is not a magic bullet and one should not expect it to introduce concepts that are totally foreign to the enterprise—IT does enough of that anyhow! A couple of things are interesting in the Monomoy approach. First, there is the external eye that is cast on the work process. I remember being very irritated as a seismic interpreter when the boss came in to review my work. Inevitably he homed in on the blindingly obvious weak point that I had been addressing with … denial. Even a non-specialist's eye can have a positive impact by pointing out what you knew all along.

Greeley also argues that the shop floor gets used to the process and begins looking at its own actions through the kaizen lens. The first kaizen is the worst and folks will be irritated by the new interference. But eventually there is a recognition that you need to be constantly re-evaluating processes as the business evolves and to counter 'entropy,' as things inevitably backslide into disarray.

Albert Einstein is supposed to have asked, ‘If a cluttered desk signals a cluttered mind, of what, then, is an empty desk a sign?’ A good question.

But really, I have to do something about the mess in the office. Only this week my digital 'window to a clean world' notion failed as I found the PUG program beneath a pile of invoices. While it's hard to do a kaizen blitz on yourself, that is just what we need. I think we'll start with the physical dust and junk in the office and then move on to the digital jumble spread across our various desktops and the server. Yep, we are definitely going to get down and dirty—just as soon as I get back from the Copenhagen EAGE. See you there?


IBM on Big Data

Oil IT Journal reviews IBM’s booklet—‘Understanding Big Data.’ More marketing material than treatise, UBD provides a peek into the puzzling phenomena of Hadoop, MapReduce and more.

They say that there is no such thing as a free lunch. There is probably no such thing as a free book either. IBM's oeuvre titled 'Understanding Big Data' is a savvy blend of marketing puffery for the company's InfoSphere BigInsights and some insight into what the Hadoop/Big Data hype is about. One use case, intriguingly, is the oil field—where, apparently, a well can have '20,000 sensors generating multiple terabytes per day,' a statement that might surprise some. This and other examples are shoehorned into a FUD-ish description of overwhelming data volumes either 'at rest' (in a database) or 'in motion' (streaming from real time data 'exhausts'). Skipping to page 55, we learn that Hadoop is a distributed file system that stores data and indexes redundantly across large clusters. The system, developed in the Apache community along lines pioneered by Google, provides extensible, resilient infrastructure on commodity hardware. Is Hadoop just a massive RAID system?

Well perhaps not. UBD makes the statement that 'MapReduce is the heart of Hadoop' and goes on to explain the workings of the map and reduce approach to dealing with large amounts of sparse data. Here the brief explanation is at once simple and frustratingly opaque. To understand what is really happening, a quick trip to Wikipedia was necessary, where we learn that Hadoop, rather than having MapReduce 'at its heart,' is a free and open source implementation of Google's MapReduce. All very confusing.
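
Confusing or not, the map/reduce idea itself is simple enough to sketch. Here is a toy, single-process Python word count—not Hadoop, which distributes the map tasks across a cluster and shuffles intermediate key/value pairs between nodes, but the same shape:

```python
from collections import defaultdict

def map_phase(documents):
    """Map: emit a (key, value) pair for every word in every document."""
    for doc in documents:
        for word in doc.split():
            yield word.lower(), 1

def reduce_phase(pairs):
    """Shuffle and reduce: group pairs by key, then aggregate each group."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return {key: sum(values) for key, values in groups.items()}

docs = ["the quick brown fox", "the lazy dog", "the fox"]
print(reduce_phase(map_phase(docs)))  # {'the': 3, 'quick': 1, 'fox': 2, ...}
```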

UBD moves on to describe various Hadoop-based projects such as Hive (a SQL front end), Flume (an ETL adaptor) and others. Amongst these is HBase—a column-oriented database for sparse data sets. Although HBase only gets a short paragraph, it is arguably the non-relational approach of sparseness and de-normalization that makes up the big data difference. On page 81 we return to the BigInsights sales pitch. For IBM, Hadoop is not designed to replace your data warehouse, but rather to complement it. If you have sweated blood getting your business intelligence up and running, the thought of a big data 'complement' may make you feel faint. A good question is the extent to which the technology is really just a rehash of business intelligence (data at rest) and complex event processing (data in motion) for the distributed file system.
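
As a picture of what 'column-oriented and sparse' means, here is a hypothetical, in-memory Python caricature of the HBase storage model (table, row keys and column families are invented for illustration). Rows are wide and de-normalized, and absent cells simply do not exist, so they cost nothing to store:

```python
# Row key -> column family -> qualifier -> value.
# Only populated cells are stored; there are no nulls to pad out.
well_table = {
    "well:42-501-00001": {
        "header": {"operator": "Acme E&P", "spud_date": "2011-06-01"},
        "logs":   {"GR:2500.5": 85.2, "GR:2501.0": 87.9},  # sparse samples
    },
    "well:42-501-00002": {
        "header": {"operator": "Acme E&P"},                # no log cells at all
    },
}

def get_cell(table, row, family, qualifier):
    """A missing cell is simply absent and reads back as None."""
    return table.get(row, {}).get(family, {}).get(qualifier)

print(get_cell(well_table, "well:42-501-00002", "logs", "GR:2500.5"))  # None
```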

IBM states that its BigInsights implementation is, and will always be, based on the core Hadoop distribution. This raises the question of why you would want to use anything other than the free version. I guess it depends on how adventurous you are. All in all, UBD probably provides more insight into IBM's marketing strategy than it does into Hadoop. But this is certainly not without interest—especially if you are, or if you are considering becoming, a client. For our part, consider this a first stab at what will likely be more attempts to come to terms with the new paradigm and to see if these technologies really do have as much potential for application to oil and gas data—whether it is at rest or in motion.


Artificial lift R&D council 2012 gas lift workshop

Unconventionals and the gas lift renaissance. OVS' 'virtual' data integration for Pemex. Honeywell's HGLO optimizer. Batch gas lift troubleshooting at ConocoPhillips Alaska's Kuparuk field.

Speaking at the 2012 Artificial Lift R&D Council's gas lift workshop in Houston earlier this year, ConocoPhillips' worldwide LNG manager Greg Leveille painted an upbeat picture of the oil and natural gas industry's 'renaissance' and the implications for gas lift deployment. The renaissance is of course driven by the unconventional revolution that was 'born in the USA' and has revealed an enormous, previously unrecognized onshore prize of over a century of self-sufficiency in natural gas and a second Hubbert peak in oil production. The renewed interest in gas lift is because it is insensitive to produced gas and solids, is compatible with horizontal wells, minimizes tubing wear and is tolerant of dogleg severity and small diameter wellbores. Gas lift is already the method of choice in the Barnett and Eagle Ford shales, driven by high gas oil ratios and flow rates, centralized pad drilling and deviated wellbores. Gas lift will become even more attractive as the oil component of the revolution gains momentum in the US and internationally.

Sebastiano Barbarino, CTO with OVS Group, described how his company has been addressing production data overload at Pemex's Samaria-Luna asset, which produces over 200,000 bopd from 200 wells, many on gas lift. OVS has designed a virtual data integration layer over Pemex's reference databases and real time data in PI. A web services interface feeds a third party nodal analysis package and Microsoft Office-based reporting. The 'evergreen' well models have leveraged in-place technology and provided a tenfold reduction in lift candidate selection time and a fiftyfold speedup in optimization.

Ravi Nath introduced Honeywell's on-line gas lift optimizer (HGLO), which is claimed to best traditional gas lift optimization by monitoring compressor, valve and water handling constraints along with sales gas pipeline pressure. HGLO offers a real time advisory on potential optimizations in the face of short term opportunities such as changing weather or pipeline pressure. The key is a 'fast executing, on-line, real time optimizer.' HGLO embeds Honeywell's NOVA process optimizer and the URT real time execution platform. A case study of an eight-well Gulf of Mexico platform showed a production hike of 0.8% and a $7 million increase in NPV.
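
Honeywell has not published HGLO's internals, but the underlying allocation problem is a textbook one: distribute a constrained supply of lift gas across wells with diminishing returns so as to maximize total oil. A minimal sketch with scipy and invented lift-performance curves:

```python
import numpy as np
from scipy.optimize import minimize

# Invented concave lift-performance curves for three wells:
# q_oil = a * (1 - exp(-b * q_gas)), oil in bopd, lift gas in MMscf/d.
a = np.array([800.0, 1200.0, 600.0])
b = np.array([1.5, 0.8, 2.0])

def neg_total_oil(q_gas):
    return -np.sum(a * (1.0 - np.exp(-b * q_gas)))

gas_budget = 2.0  # total lift gas available from the compressor, MMscf/d
res = minimize(
    neg_total_oil,
    x0=np.full(3, gas_budget / 3),                 # start from an even split
    bounds=[(0.0, gas_budget)] * 3,
    constraints=[{"type": "ineq",                  # sum of allocations <= budget
                  "fun": lambda q: gas_budget - q.sum()}],
)
print("per-well gas:", res.x, "total oil:", -res.fun)
```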

Grant Dornan presented on batch gas lift troubleshooting on ConocoPhillips Alaska's Kuparuk River field, in production since 1981 and still producing 121,000 bopd from 630 production wells supported by 507 injectors. Compressor constraints mean that gas lift is targeted to where it makes the most oil—optimized in the Scada system. Batch gas lift troubleshooting is controlled by ConocoPhillips' system nodal analysis package, SNAP, which identifies opportunities to increase the gas rate on wells that are not lifting efficiently or where lift performance can be improved. Optimized gas lift designs for 11% of KRU wells were implemented in 2011, resulting in a 1% hike in the oil rate. Further deployment and work on the tool is planned for 2012. Read the presentations on the ALRDC website.


LMKR upgrades GeoGraphix with Blue Marble GeoCalc

New geodetic calculator assures pinpoint horizontal well placement.

The 2012 release of LMKR's GeoGraphix geoscience interpretation flagship includes an upgrade to Blue Marble's GeoCalc 6.5 geospatial toolset. LMKR has also embedded the GeoCalc software development kit (SDK) in the new release to assure accurate 2D and 3D coordinate transformation. LMKR's Scott Oelfke observed, 'A Marcellus well costs $5-7 million. If its bottom hole location is too close to another party's lease line, the investment will be severely compromised. Being able to rely on accurate locations on which to base drilling decisions is critical.'

The new release adds a vertical height offset method, with support for geoid models from Australia, The Netherlands, Denmark, France, Sweden and South Africa. Users can create custom offsets from defined geoids. Accuracy tags have been added to all datum transformations so users can select the most appropriate datum shift for a specific transformation.

Blue Marble VP Kris Berglund said, ‘The 6.5 release contains recommendations made by oil and gas majors, the geomatics committee of the oil and gas producers association (OGP) and the Americas petroleum survey group (APSG).’ The new release allows read/write of GML geodetic objects to the Blue Marble data source and adds support for HTDP reference frames and coordinate reference systems. A data source audit trail tracks edits to the geodetic parameter data source. More from LMKR and Blue Marble.


Paradigm 2011.1 leverages Nvidia Maximus GPU/CUDA

Computation, display tasks shared across appropriate hardware.

The 2011.1 edition of Paradigm's geoscience interpretation suite adds graphics processing unit (GPU)-based computation of seismic trace attributes using Nvidia's Maximus technology. Maximus workstations combine Nvidia's Quadro GPUs for rendering with Tesla GPU-based compute engines for simulation and computation. The combo frees up CPU resources for I/O, running the operating system and multi-tasking, and allows the display GPU to be dedicated to its rendering role. In the seismic context, this means that seismic trace attributes can be computed on the Tesla for display on the Quadro. Paradigm claims the technology offers 'interactive or dramatically reduced calculation times' for the interpreter.
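
Paradigm does not say which attributes run on the GPU, but a representative trace attribute is the instantaneous amplitude (envelope) from the Hilbert transform—independent per trace and hence embarrassingly parallel, which is what makes this class of computation a natural fit for the Tesla card. A plain CPU-bound sketch in Python:

```python
import numpy as np
from scipy.signal import hilbert

# A synthetic 'trace': a 40 Hz wavelet burst sampled at 1 ms.
t = np.arange(0.0, 1.0, 0.001)
trace = np.sin(2 * np.pi * 40 * t) * np.exp(-((t - 0.5) ** 2) / 0.005)

analytic = hilbert(trace)              # trace + i * (its Hilbert transform)
envelope = np.abs(analytic)            # instantaneous amplitude attribute
phase = np.unwrap(np.angle(analytic))  # instantaneous phase attribute
print(envelope.max(), phase[-1])
```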

Oil and gas executive with Nvidia, Paul Holzhauer said, ‘Interpreters can now do complex seismic analysis on their desktops, improving the interactivity and enabling faster and better decisions. By harnessing the parallel processing power of the GPU, Paradigm is leading the way with a new generation of exploration tools.’ More from Paradigm and Maximus.


Petrosys’ dbMap/Web achieves PPDM ‘Gold’ compliance

PPDM web-based data browser complies with 3.8 model at table-level.

Petrosys has achieved ‘gold level’ compliance with the 3.8 edition of the Professional Petroleum Data Management Association’s (PPDM) eponymous data model for its dbMap/Web offering. For PPDM, ‘compliance’ is defined as ‘a level of conformance between a database or software product and the published PPDM standards.’ The highest ‘gold’ level of compliance indicates table-level conformity and ‘the possibility (but no guarantee) that multiple software applications will have update interoperability on the database.’

Petrosys’ submission covers a total of 73 tables mapping to 12 PPDM modules related to well, seismic and GIS data. Petrosys’ Scott Tidemann explained, ‘We are a committed long term supporter, user and contributor to the PPDM data model standard and the global professional data management community. Petrosys operates an open, vendor neutral approach to information management to support our clients’ long term needs.’

dbMap/Web provides access to data in master databases based on the PPDM data model. Query and edit access is controlled through Oracle security. Interactive elements provide look-ups, document linking and map browsing and the system can be customized to client data model extensions, business rules and content. More from Petrosys and PPDM.


Forest Oil reports on AFE navigator deployment

Automated workflow system replaces error-prone manual AFE process.

Denver-based Forest Oil has moved from a manual process for authorization for expenditure (AFE) creation, routing and approval to a solution built on AFE Navigator from Energy Navigator. Previously some 2000 AFEs were handled yearly with routing slips and manual data entry into the Bolo accounting system. AFE packages were being held up and unpaid invoices were waiting on AFE approvals.

AFE Navigator has provided a means of electronic creation, routing and approval along with automated integration with Forest Oil's accounting system—P2 Energy Solutions' Bolo. The solution has also been tailored to Forest Oil's approval matrix and delegation of authority for when approvers are out of the office.

AFE Navigator pulls data from Bolo and other sources and loads it into the AFE Navigator database. Forest Oil utilized this functionality to synchronize data between AFE Navigator and Bolo. The system makes SOAP-based calls to other systems to validate all data prior to the issuance of the AFE. Intermediate files can be generated at certain points in the AFE life cycle. The result is that ‘lost or missing AFEs are a thing of the past.’ Users can link AFEs with support documentation into a uniform workflow across all departments. More from Energy Navigator.
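
Energy Navigator does not document the interface, but a pre-issuance validation call of the kind described might look like the following sketch, using the third-party zeep SOAP client against a purely hypothetical accounting web service (endpoint, operation and field names are all invented):

```python
from zeep import Client  # third-party SOAP client: pip install zeep

# Hypothetical WSDL endpoint for the accounting system's validation service.
client = Client("https://erp.example.com/CostCenterService?wsdl")

def validate_afe(afe):
    """Check AFE coding against the accounting system before issuance."""
    result = client.service.ValidateCostCenter(
        costCenter=afe["cost_center"], wellId=afe["well_id"])
    if not result.isValid:
        raise ValueError(f"AFE rejected: {result.reason}")
    return True
```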


Software, hardware short takes

Baker Hughes, Invensys, Eliis, Expertune, Exprodat, Gedco, Geomodeling Technology, Geosoft, Schlumberger, Petris, Quorum Business Solutions, Senergy, Tecplot.

Baker Hughes has released JewelSuite 2012 introducing reservoir ‘sweet spot’ and scenario analysis, Blue Marble-based geodetics, and connectivity to Baker Hughes’ MFrac hydraulic fracture simulation software.

The 5.0 release of DynSim from Invensys’ SimSci-Esscor unit includes ‘patent-pending’ code for start-up and shutdown sequencing, enhanced compressor models and multi phase flow modelling with PipePhase.

Eliis' PaleoScan 1.2 speeds geomodel building, enhances well and fault display and data management, and adds multiple attribute methods.

A new ‘virtual classroom’ from ExperTune offers on-line training in the control and tuning of PID loops.

Exprodat has announced Team-GIS Unconventionals Analyst, an ArcGIS Desktop extension targeting shale gas/oil and coalbed methane and Team-GIS Discovery, an ArcGIS Server add-on for web-mapping applications.

Release 12.0 of Schlumberger unit Gedco’s Vista 2D/3D seismic data processing package improves data management, adds real time data display and modules for VSP, 2D refraction and sparse decon.

Geomodeling Technology’s Attribute-Studio 7.2 adds neural network analysis for reservoir property prediction and stimulated rock volume calculation from microseismic data.

Geosoft’s new Voxi cloud-based potential field inversion service generates 3D voxel models from airborne or ground gravity and magnetic data. Voxi is a component of Geosoft’s 2012 release.

Schlumberger has added a shale sweet spot plug-in to the Petrel/Ocean store. The app combines reservoir indicators, surface constraints and well and field planning for shale gas, SAGD well pairs and radial well drilling.

The 8.1 release of Petris Winds Enterprise adds Windows 2008 Server R2 and Red Hat Linux 5.5 support and can be deployed on SQL Server 2008, Oracle and Esri ArcGIS.

Quorum Business Solutions' PGAS 9.0 and TechTools 5.0 releases enhance meter configuration data capture and calibration and extend flow measurement capabilities and QC.

The 4.1 release of Senergy’s Interactive Petrophysics adds nDPredictor for porosity and water saturation predictions while drilling, SandPIT 3D for sandstone reservoir rock failure prediction and a total organic content module.

Tecplot RS 2012 allows up to four different grid solutions to be synchronized in time and space and animated. CMG data loading has been improved.


Spotfire 2012 Energy Forum

Spotfire’s statistical calculator and data visualization engine under the Chevron ‘iRave’ hood. Wood Mackenzie consolidates data delivery through Spotfire-powered portal.

Speaking at the 2012 TIBCO Spotfire Energy Forum earlier this year, Chevron's Steve Rees described how Spotfire has been embedded in Chevron's integrated reservoir analysis and visualization environment, 'iRave.' Rees' unit operates a billion barrel offshore oil field presenting significant reporting and project management overhead and data management challenges. The previous state of data management and analysis was deemed inefficient, with complex email and ftp-based workflows and multiple information repositories. Users were accessing and manipulating data in spreadsheets, which often led to duplication of effort and time wasted on managing and manipulating data. Enter Spotfire and iRave—and a new user friendly collaborative environment targeting workflows with immediate impact such as daily and historical production monitoring and analysis, reporting and forecasting and more.

iRave combines information from heterogeneous data sources including Oracle tables, map files and network flat files. Information is displayed in 9 panels with over 60 visualizations. Some were customized using Spotfire extensions designed by Troy Ruths. Following initial deployment in 2009, iRave was re-tooled for deployment on Rees’ asset in under four months. In one workflow optimization use case, variance analysis for monthly forecasting has been reduced from 2 hours to 20 minutes.

Chevron suggests some best practices for such development—leverage a multi-disciplinary project team and include a champion from the asset team with knowledge of critical workflows. This targets higher value workflows while limiting the initial scope. Engaged end users positively impacted the design process and improved deployment effectiveness. iRave 2 is being planned with extensions for benchmarking, business plans, reserves and petrophysics.

Scott Reid traced Wood Mackenzie's history of data delivery—from compact disks, through a web portal, to Excel-based delivery and, as of 2012, a 'new breed of products with embedded Spotfire visualizations.' Woodmac uses Spotfire to analyze fiscal regimes and to perform competitor analysis. The 'new breed' of Spotfire visualizations includes a corporate analysis tool (CAT), developed in .NET and accessed through the Woodmac portal. CAT offers analysis of 2,900 companies worldwide, which can be sliced and diced by company, region and other metrics.

Earlier this year, Woodmac's exploration service released a corporate exploration report in Spotfire—the first Spotfire-based deliverable on the Woodmac portal. Other Spotfire-based products are in the pipeline. Reid observed that even an apparently simple concept like 'government take' can hide considerable complexity. Tax regimes divvy up fields into different profiles and sizes; other parameters like commodity price scenarios and discount rate assumptions make for some 6.75 million ways to interpret government take. Spotfire has allowed Woodmac to consolidate its discovery dataset and migrate its legacy Excel/VBA offering to a web-based toolset. More from the Spotfire Energy Forum.


ESRI PUG, Houston

The oil and gas geographic information systems Mecca that is the ESRI petroleum user group heard of an imminent GIS 'tipping point,' from Shell on the facts and FUD of the cloud, and of ExxonMobil's spatial data framework. Virtual outcrop geology, OMV's WebGIS, navigating the Bakken and more.

Around 1,800 attended the ESRI Petroleum User Group annual conference in Houston this month. PUG chair Brian Boulmay (BP) observed that the PUG now represents over 20 years of knowledge sharing and has evolved into a true community of practice. The PUG now has over 2,000 members and a new website, PUG Online, housing notably the PUG 'list' of desiderata for enhancements, bug fixes and the like.

Keynote speaker ESRI’s Clint Brown thinks that GIS is at a ‘tipping point’ thanks to the universal use of smart devices as a window to online information. ‘The ubiquitous apps that enrich your life are set to do the same for the enterprise.’ Minds used to boggle at data volumes, now all consumer data is in the cloud. Maps are the most popular smart phone app—but folks want maps that do more. On the other side of the balance, IT is suffering from ‘fatigue’ as managers look for ‘sustainable’ IT in the face of faster and faster technology churn.

ESRI’s latest contribution to the churn is ArcGIS Online where ‘the map is the app.’ AGO offers consumer-style mashups of GIS layers both on and off the premises. AGO is heralded as the tipping point for GIS and a new way to manage content.

ESRI’s Damian Spangrud was up next relating how the geospatial ‘ecosystem’ is expanding to embrace RT data, big data (UAV/LIDAR), the cloud, mobility and an increasingly geo-literate population offering ‘crowd sourcing.’ 3D and time are now integral parts of GIS. Maps can now be embedded in Excel, Sharepoint, Cognos and SalesForce and across Apple’s iOS, Qt, .NET, Java etc. A new ‘ArcGIS for Petroleum’ offering provides packaged solutions to get you going.

AGO underpinned Danny Spillmann's demo of a fictitious oil company, 'Clancy Energy,' whose youthful explorationists expect Facebook functionality from their enterprise apps. This is provided with ArcZone (Esri's own social network). Maps are embedded into web pages for drill down to lease information. The new ESRI Maps for Office also ran. A time slider function allows lease maps to be viewed at different points in time—or to move forward to their expiry date. Petrel users can now access live maps from AGO instead of shapefiles. Production engineers can take information out of the office to a mobile device, work it up and then synch on their return to base. Android devices can link to a laser rangefinder over Bluetooth to capture ad hoc measurements. Another theme is the spatialization of business intelligence in ERP and asset management systems—a.k.a. 'location analytics' or 'heat maps' as opposed to 'map maps.'

All was finally rolled into a super dashboard showing real time data of Clancy Energy vehicles, pipeline pressure valves and alerts. Components can be added to the dashboard—such as a 'geochat' object that localizes tweets! Instead of a company report, Clancy produces a collection of maps—with, for instance, a Haynesville holdings atlas that pops up on an exec's iPad. Just to keep things real, a crash held things up momentarily. 'It happens every plenary,' said the ESRI demonstrator. 'It happens every day,' said a voice off.

Keith Frayley described Shell E&P's test of the cloud, separating 'fact from fiction and FUD.' Shell has been piloting Amazon web services from inside its firewall to run ArcGIS Server on an Amazon machine image (AMI). AMI provides a virtual private cloud, meaning there is no need for a systems administrator or even for hardware. To test the cloud's claim to agility, Shell ran some resource hungry PalaeoGlobe plate reconstructions. The hardware was easily switched from 2 to 8 cores overnight. The cost and security claims made for the cloud seem to hold up too. Performance was harder to evaluate—a local file geodatabase outperformed the cloud. Shell is evolving a hybrid model with data and server side components in the cloud and a local file geodatabase for performance. While it will always be hard to beat a local data center, the cloud is great for satellite offices and levels the playing field for smaller operators.

Ronald Lopez outlined ExxonMobil's spatial data framework. Exxon uses ArcGIS Server as its enterprise foundation and as a web-based mapping alternative for 'mid to light' GIS use cases. All this is in the face of a 'tectonic shift' in user expectations, going beyond pan and zoom and into geoprocessing. The ArcGIS API for Silverlight allowed rapid prototyping. Datasets that are identified as 'foundational' candidates are reviewed and may be published as a web service. GIS has enabled the consolidation of every imaginable data type, from geoscience through HSE/facilities to vendor data. There is a big buzz around business intelligence and Lopez agrees that this is an area where GIS can add value.

Shell’s Lionel White showed how large, high resolution photorealistic geological outcrop models can be captured in Arc Scene and viewed in EON’s Reality Viewer. Some 106 photos were assembled over a LIDAR scan for study with GHVM’s GeoAnalysis tools. These allow the virtual structural geologist to measure in strike dip, trend plunge, axial plane and more. The technique was used to analyse a multi kilometre outcrop of the Eagle Ford shale.

A more conventional deployment was presented by OMV's Chris Smolka, who has developed a WebGIS G&G archive for production unit maps. OMV's PU maps are a minimal representation of a field for use in peer reviews and reserves audits. OMV has established a single environment for archiving and searching G&G data across its EDMS, GeoFrame and Petrel environments. The Petrosys plug-in for Petrel is used to create shapefiles and to export to Petrosys map. The backend is ArcSDE with a searchable metadata catalog. SynerGIS Web Office also ran.

No PUG would be complete without its share of geodetical health warnings. TGS' James Stolle used the example of multiple deviated wells beneath a Beverly Hills high school to stress the importance of getting the deviation survey done right. Interpreters, well planners and drillers all need to pay attention to the accuracy of geospatial data. Scare stories abound—from a last minute change to the surface location to mix ups between true and grid north. Unfortunately there are few to zero standards for directional surveys in the US. This is a sensitive issue in areas of intense horizontal drilling like the Barnett shale. Datum shifts from NAD 27 to NAD 83 mean that some Bakken wells are hundreds of feet off target. Perhaps as many as 17% of Barnett wells have this sort of problem.
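
The NAD 27 to NAD 83 shift is easy to demonstrate with the open source pyproj library—a sketch, assuming the relevant datum shift grids are installed and using an arbitrary Williston basin location (the displacement varies with position):

```python
import math
from pyproj import Transformer

# NAD27 (EPSG:4267) to NAD83 (EPSG:4269) for a point in the Williston basin.
t = Transformer.from_crs("EPSG:4267", "EPSG:4269", always_xy=True)
lon27, lat27 = -103.0, 48.0
lon83, lat83 = t.transform(lon27, lat27)

# ~111 km per degree; scale longitude by cos(latitude) to get metres.
dx = (lon83 - lon27) * 111_000 * math.cos(math.radians(lat27))
dy = (lat83 - lat27) * 111_000
print(f"datum shift: {dx:.1f} m east, {dy:.1f} m north")
```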

Tim Downing showed how to bridge the gap between a PPDM-based data model and GIS, sharing findings from Geologic Systems' new website development. All searches now combine ArcObjects with PPDM-stored data. This initially gave very poor performance. Various combinations of SDE, ArcObjects and SQL were tried. In the end, SQL 'ST_' functions and SDE registered views were used in Geologic's lightweight solution.
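
Geologic has not published its SQL, but the flavor of the 'ST_' approach—pushing the spatial predicate into the database alongside PPDM attribute filters—can be sketched as follows (table, schema and spatial reference are hypothetical):

```python
import psycopg2  # any DB-API driver would do; the point is the SQL shape

SQL = """
SELECT w.uwi, w.well_name
FROM   well w                             -- PPDM well table
WHERE  w.current_status = 'PRODUCING'     -- ordinary attribute filter
AND    sde.st_intersects(                 -- spatial filter via ST_ function
         w.shape,
         sde.st_polygon('polygon((-99 32, -97 32, -97 34, -99 34, -99 32))', 4269)
       ) = 1
"""
conn = psycopg2.connect("dbname=ppdm")
with conn, conn.cursor() as cur:
    cur.execute(SQL)
    for uwi, name in cur.fetchall():
        print(uwi, name)
```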

Tracy Thorleifson (Eagle Information Mapping) noted that current pipeline models assume stationing, something foreign to many upstream operators. As GPS has displaced conventional surveying, stationing is now an anachronism that 'places an undue burden on many operators.' Thorleifson suggests a work-around, concentrating on geodatabase-based models and shapes, which are 'easier to manipulate than tables of coordinates.' Stationing can be bypassed by associating events directly with a line. In the Q&A, a PODS representative advised that an alternative referencing workgroup has just started.

Peter Veenstra (Willbros Engineering) likewise revisited the pipeline data model situation, tracing its history from ISAT, through PODS and APDM (with its geodatabase), and on to PODS ESRI Spatial, PPDM and ISO 15926. Data models tend to be large and all-encompassing while remaining country-specific, and their use requires much training.

Veenstra recommends the ESRI geodatabase as an enterprise solution for data management. It sits on top of a relational database, adding geometry to standard database tables, and supports versioning, archiving, replication, network topologies and dataset aggregation. An open design makes it easy to enter data, with validation done after the fact. The geodatabase trade-off is that there are few constraints and no referential integrity checks. This can make for mayhem! Script-based post processing is required for referential integrity. Veenstra suggests nine changes to the data model that will make your life easier (some are already in PODS ESRI geospatial). These have been posted to ideas.esri.com. Sparx Systems Enterprise Architect got a plug as Veenstra's modelling tool of choice.

Stephen Richard told how the Arizona Geological Survey is offering geological survey data as 'ready-made' services. These leverage ArcGIS to serve geologic data for the DOE national geothermal data system, part of the national geoscience information network, USGIN. GML simple features are used to capture formation name, age and rock type. This approach was preferred over the GeoSciML standards—deemed 'too complicated.' GeoSciML has however seen take-up in the EU's OneGeology project. A new OneGeology US has been proposed, leveraging GeoSciML V2.0 to provide geological vector data harmonized with a national age schema but with preferred local legends.

It’s been a couple of years since we covered the PUG. It seems like while the technology evolves, the issues stay the same while the. The most obvious change was that back in the day, folks looked askance if you said ‘esri’—spelling it out as ‘E.S.R.I’ used to be de rigeur. Nowadays, everyone says ‘esri.’ More from the ESRI PUG.


Free and open source software for geospatial a.k.a ‘FOSS4G’

While ESRI is the only game in town for GIS in the oil and gas vertical, other communities are leveraging open source tools. Blogger Anthony Quartararo of Spatial Networks reports.

Anthony Quartararo, president and CEO at Spatial Networks, attended the 'Free and open source software for geospatial' a.k.a. FOSS4G North America meet in Washington last month and has kindly allowed us to cherry pick his blog from the event. Quartararo witnessed 'enthusiasm and passion' from the open source community along with three main themes, as follows.

1) FOSS for geospatial is cool but currently lacks a business model for long-term sustainability. One notable exception is OpenGeo, which has open source products alongside a composite licensing model. Most FOSS4G companies charge for support, training and custom development—putting them in more of a consultancy role.

2) The lack of a visible business model is made up for by the abovementioned passion and ‘a higher percentage of passionate, dedicated professionals than any other community or company I’ve experienced, even my own.’

3) Free and open-source software for geospatial will usurp the incumbents within the next 5 years. Note however that it is unlikely that a single FOSS4G company will replace companies like ESRI, Intergraph or Google. But the technology innovation of the FOSS4G and OpenStreetMap community will best the combined efforts of all the industry giants above. The writing is already on the wall, or more accurately, it's written on GitHub. FOSS4G provides a steady stream of breathtaking technology innovation. Traditional GIS software companies spend too much on marketing and propping up their hegemony and not enough on solving problems.

ESRI-bashing is a popular sport at FOSS4G and, for Quartararo, is largely deserved—although, as a self-confessed ‘persona non grata’ from ESRI, he is ‘not the most objective person at the moment.’ But the growth in FOSS4G ‘poses an existential threat to ESRI, Google and the like in terms of dominance over all things geospatial.’

Since Quartararo blogged his blog, the FOSS4G presentations have been posted. We took a quick look and spotted 'Enterprise web mapping for utilities and telecom companies' by Peter Batty of Ubisense, whose myWorld app embeds open source components including PostGIS, MapFish and OpenLayers and is positioned as the kind of disruptive technology mentioned above.

Another interesting pitch came from Nicholas Knize of Thermopylae Sciences and Technology which has leveraged a NoSQL ‘big table’ implementation of MongoDB to tackle dynamic geospatial data at massive scale. Customers include the US Dept of State Bureau of Diplomatic Security, US Army Intelligence Security Command and US Southern Command.

You might also like to check out David Bitner’s (Metropolitan Airports Commission) presentation on the use of PostGIS, PL/R, and range data types for real time and post processing of 4D flight track data. More from FOSS4G and from Quartararo’s blog.


Folks, facts, orgs ...

Asset Risk Management, Aveva, Celerant Consulting, Chesapeake, Emerson, Fiatech, GE Energy, Fraser Institute, ITF, KBR, Midland Valley, OGC, OSHE Consultants, OVS Group, Senergy Development Solutions, Serafim, Union Drilling, West Engineering, Lloyd’s Register.

Keith Barnett and Bob DeMan have joined Asset Risk Management as, respectively, senior VP and director of fundamentals analysis and senior VP for client hedging. Barnett was previously with Merrill Lynch Commodities, DeMan with BNP Paribas.

Aveva has opened a new office in Wroclaw, Poland.

Celerant Consulting has hired Maarten van Hasselt as senior VP and Americas sector lead of its global energy practice. He was formerly with GE Oil & Gas.

Chesapeake Utilities has appointed Matthew Kim as VP and Thomas Mahn as treasurer. Kim hails from The Carlyle Group, Mahn from Perdue Inc.

Emerson’s Roger Freeman has been elected to the Fiatech board of advisors.

GE Energy is investing $10 million in an oil and gas training facility in Houston.

The Fraser Institute has published its 5th annual survey of industry executives investigating barriers to investment in upstream oil and gas exploration around the world.

ITF is to launch a ‘cluster group’ for the US to focus on ‘solving major technology challenges hampering exploration and production.’ Member company Shell hosts the kickoff meet next September.

David Zelinski has joined KBR as president of its downstream business unit. He was formerly with Fluor's Energy and Chemicals group.

Graduates Luke Smallwood and Nathan Collins have joined Midland Valley’s development team.

Caterina De Matteis has joined the Brussels secretariat of the Open Geospatial Consortium, OGC, as administrative assistant.

OSHE Consultants are offering a ‘NEBOSH’ international technical certificate in oil and gas operational safety course in Houston.

Larry Denver has joined OVS Group as president. Founder and past president Jose Alvarez stays on as CEO. Denver was previously with Knowledge Reservoir.

Senergy Development Solutions has appointed Rob Fisher as director of its subsea, pipeline and construction management business. He hails from Technip.

Former head of reservoir engineering at Addax, Giel Krijger, now heads-up sales and marketing at Serafim.

Alan Roachell has joined Union Drilling as VP HSE and training. He was formerly with Rosewood Resources.

Following its acquisition by Lloyd’s Register, West Engineering Services’ owner and founder Mike Montgomery is to retire as president but will stay on as an advisor. Lloyd’s Register Americas president Paul Huber takes over the presidency. Duco de Haan remains CEO and MD of the ModuSpec unit.


Done deals

AGR, CMG, Deloitte, CRG, DNV, NPS, Expro, Siemens, FMC, CSI, Schilling, Fugro, Lloyd’s Register, West, PetroLogistics, Reiten, Competentia, Statoil, SåkorninVest, Sekal, Schlumberger, more...

AGR has acquired an 80% stake in Steinsvik & Co., supplier of HSE services to operators and rig owners.

The Toronto Stock Exchange has accepted Computer Modeling Group’s notice of intention to make a normal course issuer bid that will allow CMG to purchase approximately 10% of its public float for cancellation.

Deloitte has acquired substantially all of the assets of New York-based provider of operational and financial restructuring services CRG Partners. Terms of the deal were not disclosed. Deloitte’s reorganization services group and the transferring CRG professionals will operate under the name Deloitte corporate restructuring group headed-up by ‘co-leads’ Bill Snyder and Sheila Smith.

DNV has acquired oil-spill preparedness specialist Norwegian Petro Services.

Expro is to sell its connectors and measurements business, including the Tronic and Matre brands, to Siemens AG for a $630 million consideration.

FMC Technologies has completed its acquisition of control and automation system solutions provider Control Systems International. It has also now acquired full ownership of Schilling Robotics.

Fugro has acquired Montpellier, France-based geoconsultancy Geoter SAS.

Lloyd’s Register has acquired West Engineering Services.

PetroLogistics has announced its IPO pricing. 35 million common units are on sale at $17.00. Morgan Stanley, Citigroup and UBS are running the book with SG Americas Securities, LLC, Stifel, Nicolaus & Co. and SunTrust Robinson Humphrey acting as co-managers.

Reiten & Co has acquired 51 percent of the shares in Stavanger-based oil and gas consulting firm Competentia. The founders will retain 49% of the shares.

Statoil has joined SåkorninVest as an investor in the newly established company Sekal AS, which commercializes the products DrillTronics and DrillScene, developed in collaboration with Iris. The software tools use real time data for ‘safer and more efficient drilling operations.’

Schlumberger is to acquire Tesco Corp.'s casing drilling division for $45 million cash. Schlumberger has also acquired Calgary-based geophysical survey design specialist Gedco, which will be integrated into the WesternGeco business unit.

Wells Fargo Bank has closed its acquisition of the North American energy lending business of BNP Paribas. Most of BNP Paribas' Houston- and Calgary-based employees have now joined Wells Fargo.


Logica productizes StreamInsight accelerator for oil and gas

Logica productizes complex event processing—targeting speedy analysis of offshore real time data.

Logica has productized its upstream ‘accelerator’ for Microsoft’s StreamInsight, a complex event processing module in SQL Server. Logica’s accelerators enable rapid development of real-time analytics across diverse data sources and destinations. Real time data streams are read and processed ‘in-stream’ rather than post-storage, accelerating decision-making. The approach is claimed to eliminate the latency of traditional business intelligence.
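
StreamInsight itself is programmed in .NET, with LINQ queries over temporal windows. As a language-neutral illustration of the in-stream idea, here is a minimal Python sketch that raises an alarm on a rapid pressure drop while holding only a small window in memory, never persisting the stream first (threshold and data are invented):

```python
from collections import deque

def pressure_alarms(readings, window=10, threshold=50.0):
    """Yield an alarm when pressure falls by > threshold within the window,
    keeping only the current window in memory, not the whole stream."""
    recent = deque(maxlen=window)
    for timestamp, value in readings:
        recent.append(value)
        if len(recent) == window and recent[0] - value > threshold:
            yield timestamp, recent[0] - value  # the alarm 'event'

# A synthetic stream: steady 500 psi, then a steep decline after t=20.
stream = ((t, 500.0 - (8.0 * (t - 20) if t > 20 else 0.0)) for t in range(60))
for ts, drop in pressure_alarms(stream):
    print(f"t={ts}: pressure dropped {drop:.0f} psi within the window")
```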

StreamInsight combined with Logica's accelerator is claimed to reduce the time required for data cleansing, analysis and event detection to 'milliseconds rather than hours or days.' Joe Perino, oil and gas sector lead at Logica, said, 'We have combined our knowledge of StreamInsight technology and data-driven industries such as oil and gas into a solution that accelerates decision-making and helps clients gain a competitive advantage.' Lightweight applications built on the platform can be deployed as cloud-based services located in rugged environments such as oil platforms. The solutions 'operate alongside' systems such as OSIsoft PI.

Logica spokesperson Mimi Reilly told Oil IT Journal, 'Our real-time data team has incorporated the Microsoft Upstream Reference Architecture (MURA) principles into the accelerator. MURA provides guidelines for open extension points for integration with existing enterprise systems. Users can integrate real-time data sources, such as the OSIsoft PI System and WITSML, and build adapters for canonical data services used in their organizations.'


Millennial Net, IPSO Alliance and the ‘Internet of Things’

Industrial wireless network provider to promote IP-based networking of ‘smart’ objects.

Industrial wireless network systems provider Millennial Net has joined the IPSO Alliance to develop applications for the ‘Internet of Things’ (IoT). The IPSO Alliance seeks to promote the use of the Internet Protocol for the networking of Smart Objects. Dieter Schill, President and CEO of Millennial Net said, ‘More and more devices are gaining the ability to communicate with one another and we believe IP support is an important stepping stone to accelerate wide adoption of networked devices.’

Millennial Net’s MeshScape wireless mesh sensor networking operating system supports dynamic, ad-hoc and mobile applications requiring bi-directional communications. MeshScape is optimized for low power, scalability, recovery time and bi-directional latency.

While most IoT applications are in the utilities and smart grid space, Millennial Net has oil and gas form—notably as a member of the BP-backed Sensor Network Consortium (Oil IT Journal December 2004). Moreover, it seems likely that consumer smart grid technology, with its low cost, ubiquitous, standardized smart sensors, will impact the oil and gas vertical before too long. Well, maybe… More from IPSO and Millennial.


Matlab’s video ‘how to’ on GPU-based seismic processing

80 minute video provides introduction to Matlab, virtual arrays and CUDA/GPGPU acceleration.

Matlab developer MathWorks has put a seismic data processing case study online to demonstrate the use of Matlab on large datasets. The demo shows how to manage out-of-memory data using a memory-mapped file and a custom object for array indexing. This enables reuse of the memory-mapped file inside functions or with parallel computing, without needing to rewrite code or manually recreate the memory-mapped file on each worker. The demo also shows how to speed up the solution of the wave equation using a custom Cuda kernel.

We followed the 80 minute video authored by Matlab's Stuart Kozola, who showed how many data types—from spreadsheets and databases to SEG-Y files—are too large to fit into memory. The workaround is to create 'virtual arrays' which are amenable to parallel computing with GPUs. The demo also provides an introduction to the use of the Matlab desktop, an integrated data and development canvas onto which files and folders (such as the SEG velocity model) can be dragged and dropped.
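
The same out-of-core trick is available outside Matlab. A minimal Python sketch using numpy.memmap, assuming a headerless float32 trace file of known geometry (file name and shape are invented):

```python
import numpy as np

# Map a ~1.6 GB binary volume on disk without reading it into RAM:
# 200,000 traces of 2,000 float32 samples, headerless for simplicity.
traces = np.memmap("survey.f32", dtype=np.float32, mode="r",
                   shape=(200_000, 2_000))

# Ordinary array indexing pulls just the requested slice off disk,
# so gathers or time windows can be processed chunk by chunk.
gather = traces[5_000:5_096, :]                  # 96 traces now in memory
rms = np.sqrt((gather.astype(np.float64) ** 2).mean(axis=1))
print(rms.shape)                                 # one RMS amplitude per trace
```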

A spinoff of the approach is that any experiment (here using code from Gerard Schuster's book on seismic interferometry) is self-documenting and 'reproducible.' The demo works through several seismic techniques including shot simulation and gather generation, finite difference modelling and wave equation solutions. Much use is made of videos to illustrate both wave front propagation and computational progress.

A 20GB data set on disk is addressed as one big virtual array for seamless processing. A Matlab ‘pool’ can be defined. Here, four machines with four cores make for 16 ‘workers.’ Parallelization is said to be good and even more speedup can be obtained by offloading calculations to GPUs.

The Matlab parallel computing toolbox runs on a local machine, on a cluster or on the Amazon web services cloud. Matlab code is portable across all platforms with GPGPU support. For more performance, Matlab code can be compiled to C and invoked along with Cuda kernels from Matlab. Watch the archived webinar and check out the code.


Sales, contracts, partnerships and deployments

Entero, Fluor Corp., DynaEnergetics, GE Oil & Gas, Halliburton, Intergraph, Emerson, OYO Geospace, Paradigm, Petrofac, Rajant Corp., Safe Environment Engineering, Safe Software, Blue Marble Geographics, Triple Point, Transform Software, ESSCA, Wood Group Mustang, Inpex, Expro.

Calgary-based Traverse Energy has licensed Entero’s production accounting software to manage its production and data reporting.

Fluor Corporation’s joint venture with WorleyParsons, the Kazakh Institute of Oil and Gas and KazGiproNefteTrans Engineering Co. has been awarded a contract by Tengizchevroil for a wellhead pressure management system.

DynaEnergetics of Troisdorf, Germany is now a global distributor of GE Oil & Gas' portfolio of free point equipment and wireline accessories.

Halliburton has signed an agreement with Gazprom for the development of novel upstream oil and gas technologies.

GDF Suez E&P UK has adopted the Intergraph SmartPlant Enterprise engineering suite, including SmartMarine 3D, for use on the Cygnus gas field.

Lukoil has expanded its Varandey oil terminal with Emerson’s PlantWeb, including the DeltaV digital automation system for terminal operations control.

OYO Geospace has won a $13.6 million purchase order from Dawson Geophysical for 8,000 single-channel wireless units and 3,000 three-channel wireless units. Another $14.0 million came from Tidelands Geophysical for 13,000 single-channel units and related equipment.

Paradigm has sold a license to its seismic processing suite to Divestco which is to become the primary integration platform for Divestco’s technology portfolio.

African Petroleum has selected Paradigm’s Windows-based interpretation suite as a corporate standard for seismic interpretation.

Petrofac has been awarded a service contract by Nexen Petroleum UK, for the Golden Eagle area development. Petrofac’s Plant Asset Management unit is to implement a maintenance regime for the project, supported by its bespoke data development and management software tool, BuildME.

Rajant Corp. has announced a ‘strategic alliance’ with Safe Environment Engineering for the provision of mesh network-based solutions for hazardous environments.

Safe Software and Blue Marble Geographics have teamed on GeoCalc Extension for FME, offering FME Workbench users accurate coordinate transformation.

Koch Supply & Trading has licensed Triple Point’s flagship Commodity XL platform for its natural gas trading, risk and scheduling activities in Europe. Triple Point was also selected by China National Offshore Oil Corporation to manage trading, risk and logistics.

Transform Software has announced a multi-year software reseller agreement with Beijing-based ESSCA.

Wood Group Mustang has been awarded the topsides detailed engineering and procurement support for the semi-submersible central processing facility at the Ichthys field development by Samsung Heavy Industries, the EPC contractor for the CPF. The project is operated by Inpex.

Expro has secured $10 million of new contracts to supply its 'CaTS' cableless telemetry system, which will be deployed by two Brazilian operators to provide pressure and temperature data from four suspended deepwater subsea exploration/appraisal wells. The ART service will also be used by two North Sea operators.


Standards stuff

EQ-Hub’s data exchange standard. Infoweb floats ISO 15926 Part 10, ‘validation.’ International digital object identifier now ISO standard. ISO/DIS 19115-1 Energy Industry Profile real soon now!

Norwegian e-business initiative EQ-Hub has published its data exchange standards, a 13-page document that leverages the Posc Caesar Association's ISO 15926 standardization process and technology from ShareCat.

Infoweb has published a position paper floating a proposed Part 10 for the ISO 15926 plant data standard. Part 10 addresses validation of Part 9 files prior to loading, and 'target-side' validation to check whether newly loaded information conflicts with information already stored in a façade or in a 'confederation of participating façades.'

The International DOI Foundation's digital object identifier (DOI) is now an ISO standard (ISO 26324:2012). A DOI is a unique object identifier for use on digital networks—a means of identifying a physical, digital or abstract entity over the Internet, sharing it with a user community and managing it as intellectual property. 60 million DOI names have been assigned to date by a federation of registration agencies.
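
Resolution works over plain HTTP via the doi.org proxy: a DOI name redirects to the currently registered location for the object. A quick sketch with the Python requests library (10.1000/182 is commonly cited as the DOI of the DOI Handbook itself):

```python
import requests

# Ask the proxy to resolve a DOI name; it answers with an HTTP redirect
# whose Location header is the URL currently registered for the object.
resp = requests.get("https://doi.org/10.1000/182", allow_redirects=False)
print(resp.status_code, resp.headers.get("Location"))
```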

V1.0 of the Energistics’ ISO/DIS 19115-1 Energy Industry Profile (EIP) is targeted for release in Q2 this year. This standard is a profile of, and is intended as a metadata specification supporting the discovery, evaluation, and retrieval of information (and physical) resources of interest to the Energy Industry. Speaking at the ESRI PUG, Chevron’s Scott Hills and Steve Richard of the USGS described initial focus as on ‘structured and unstructured information with associated spatial coordinates.’ The vision is for seamless transfer of geo-referenced data fed into applications that auto-populate metadata. Active participants include the AAPG, ExxonMobil, ConocoPhillips, IHS , Oracle, P2ES and many more. Flagship project is the USGIN Project, a joint effort between the USGS and all 51 states.


Mellora rolls-out smartphone HSE reporting app

iPhone/Android app simplifies user capture and reporting of HSE incidents.

Bergen, Norway-based Mellora has announced an app for the collection of health, safety and environment/quality data in the field or office. The 'open' smartphone application, 'HSEQ,' lets personnel report non-conformances, accidents and near misses, and provides safety and quality feedback.

Mellora CEO Trond Hansen said, ‘HSEQ data capture has never been easier, a report can be submitted with a few keystrokes. The data generated is application independent so the system is not tied to a particular database or application. Ease of use increases the likelihood that incidents will be reported—contributing to continuous improvement and accident prevention.’

HSEQ runs on an iPhone or Android smartphone and offers six report scenarios. The predefined categories allow capture of text and images documenting a deviation, accident, near miss or preventive proposal. HSEQ is localized to English, German, Norwegian, Spanish and Portuguese.
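Mellora has not published its format, but an 'application independent' report plausibly boils down to a self-describing payload that any back-end system can ingest. A hypothetical sketch, with category names and fields of our own invention:

```python
# Hedged sketch of an application-independent HSEQ report payload.
# Categories and fields are illustrative, not Mellora's actual format.
import base64
import json
from datetime import datetime, timezone

CATEGORIES = {"deviation", "accident", "near_miss", "preventive_proposal",
              "safety_observation", "quality_observation"}  # six assumed scenarios

def build_report(category: str, text: str, photo: bytes | None = None) -> str:
    assert category in CATEGORIES
    report = {
        "category": category,
        "reportedAt": datetime.now(timezone.utc).isoformat(),
        "description": text,
        "photo": base64.b64encode(photo).decode() if photo else None,
    }
    return json.dumps(report)

print(build_report("near_miss", "Unsecured scaffold plank on level 2"))
```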


SPOC Automation releases Well Insight SCADA monitor

Monitoring, data acquisition and control of artificial lift equipment over cellular radio.

Trussville, Alabama-headquartered artificial lift control specialist SPOC Automation has released Well Insight, a SCADA package for monitoring, data acquisition and control of artificial lift and other oil country equipment. Well Insight uses 3G/4G cellular or satellite connectivity to provide browser-based access to well conditions and production information.

Well Insight (WI) acts as a production data historian, capturing data and displaying trends for a well or an entire field at user-controlled time scales. Wells can be annotated to flag significant events or changes in equipment and control settings; notes are stored chronologically and appear in trend plots. WI lets users set condition-based, time-stamped alarms that can trigger email or SMS messages. Data can be visualized in map view or in a spreadsheet. SPOC claims an extensive custom cellular network and its own IP addresses for security and control of data, which is hosted on geographically separated, redundant servers.
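SPOC has not disclosed its rule engine, but a condition-based, time-stamped alarm check of the kind described might be sketched as follows. Tags, thresholds and the notification hook are all assumptions.

```python
# Hedged sketch of condition-based, time-stamped alarming. Tag names,
# thresholds and the notify hook are illustrative only.
from datetime import datetime, timezone

ALARM_RULES = [  # (tag, predicate, message)
    ("tubing_pressure_psi", lambda v: v > 2500, "High tubing pressure"),
    ("motor_temp_degC",     lambda v: v > 120,  "Motor over-temperature"),
]

def check_alarms(sample: dict, notify):
    """Evaluate one sample against each rule; fire time-stamped notifications."""
    ts = datetime.now(timezone.utc).isoformat()
    for tag, predicate, message in ALARM_RULES:
        if tag in sample and predicate(sample[tag]):
            notify(f"[{ts}] {message}: {tag}={sample[tag]}")  # e.g. email/SMS gateway

check_alarms({"tubing_pressure_psi": 2610, "motor_temp_degC": 95}, notify=print)
```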


TeleCoil—downhole communication for coiled tubing operations

Baker Hughes device captures pressure, temperature and casing collar location.

You’ve heard of wired drill pipe (Oil ITJ December 2006); now Baker Hughes has come up with ‘TeleCoil,’ a downhole communication system for coiled tubing operations. The TeleCoil bottomhole assembly (BHA) is injected into the coiled tubing, transmitting data to the surface over a ‘nonintrusive’ conductor. The basic TeleCoil BHA includes pressure, temperature and casing collar location sensors. An optional logging adapter enables wireline logging, including with third-party logging tools.
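Baker Hughes' wire format is proprietary, but for readers unfamiliar with downhole telemetry, a purely hypothetical fixed-format frame carrying the basic BHA's three measurements might be decoded like this:

```python
# Purely illustrative: decode a hypothetical fixed-format telemetry frame
# with pressure, temperature and casing collar locator (CCL) readings.
import struct

FRAME = struct.Struct(">Iffh")  # timestamp (s), pressure (psi), temp (degC), CCL (counts)

def decode_frame(payload: bytes) -> dict:
    t, pressure, temp, ccl = FRAME.unpack(payload)
    return {"t": t, "pressure_psi": pressure, "temp_degC": temp, "ccl": ccl}

sample = FRAME.pack(1335830400, 4350.5, 98.2, -120)  # made-up readings
print(decode_frame(sample))
```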

TeleCoil has been field tested in operations such as perforating, setting plugs, milling, cased-hole logging, acid stimulation through inflatable straddle packers, gas lifting, fill cleanouts, and fishing. An HSE-relevant functionality provides confirmation of discharge of perforating guns through monitoring of downhole temperature. Product line manager Jason Skufca said, ‘TeleCoil provides real-time insight into coiled tubing operation. Operators can see that downhole tools are functioning properly and that the tubing is accurately on depth for critical applications such as perforating and plug setting.’


Kongsberg simulates Superior’s multi-vessel operations

Marine and subsurface operations simulator prepares Alaskan workforce for complex scenarios.

Houston-based drilling service provider Superior Energy Services has taken delivery of an advanced maritime training simulator to be deployed at its new Anchorage, Alaska facility. The simulator, supplied by Kongsberg Maritime, provides ‘full mission’ training for critical operations such as ship bridge maneuvering and navigation, anchor handling, ROV and crane operations, process control, containment, and controlled pumping and flaring of hydrocarbons.

The simulator supports simultaneous operations of multiple vessels and will be used to develop best practices for surface and subsea activities. It features two service vessel bridges, each with a 360-degree field of view, an offshore crane simulator and a ‘DeepWorks’ ROV simulator from Fugro Subsea Services. A process simulator supplied by Kongsberg Oil & Gas Technologies can be used for further operator training, control system checkout and simulation of multi-phase flow production operations.

Superior’s working relationship with Shell Offshore was a primary motivator in establishing this state-of-the-art facility. Superior’s Captain Scott Powell observed, ‘We believe in ensuring our people are as prepared and properly trained as possible. It makes sense from a safety perspective, from an environmental perspective and from a business perspective—it is simply the right thing to do.’ More from Superior and Kongsberg.


AGR Solutions rolls out ‘P1’ probabilistic drill planner

Field testing leads to productized well planner. Riserless drilling achieves 500 well milestone.

Norway-based AGR Solutions’ ‘probabilistic’ well planning tool, P1, has been extended with a ‘live’ project cost tracking module and a materials management option. The new components underwent extensive internal testing prior to their introduction to AGR’s external markets.
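AGR has not published P1's internals, but 'probabilistic' well planning generally means Monte Carlo sampling of per-phase time and cost distributions to yield P10/P50/P90 estimates. A minimal sketch under that assumption, with phase durations and day rate invented for illustration:

```python
# Minimal Monte Carlo well-plan sketch. Phases, triangular distributions
# and the day rate are illustrative, not AGR's actual model.
import random

PHASES = [  # (name, min days, most likely days, max days)
    ("mobilization", 2, 3, 6),
    ("top-hole",     4, 6, 12),
    ("intermediate", 8, 12, 20),
    ("completion",   5, 7, 14),
]
DAY_RATE_USD = 450_000  # assumed spread rate

def simulate(n=10_000):
    totals = sorted(sum(random.triangular(lo, hi, ml) for _, lo, ml, hi in PHASES)
                    for _ in range(n))
    return tuple(totals[int(n * q)] for q in (0.10, 0.50, 0.90))

p10, p50, p90 = simulate()
print(f"P10 {p10:.1f} d, P50 {p50:.1f} d, P90 {p90:.1f} d, "
      f"P50 cost ${p50 * DAY_RATE_USD / 1e6:.1f}M")
```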

AGR also reports that some 500 wells have now been drilled using its enhanced drilling technologies, the cuttings transportation (CTS) and riserless mud recovery (RMR) systems. The proprietary technologies use a subsea pump to assure closed-loop drilling mud circulation during the drilling of the top-hole. The systems avoid ‘pump-and-dump’ drilling, ensuring zero discharge to the environment.

RMR is used in managed pressure drilling applications in extra-deep water to isolate the well from the pressure regime in the riser. RMR technology was used, along with casing while drilling, to set a recent world record in a deepwater well drilled by Woodside Australia.


Baker Hughes embeds Shell safety KPIs in HSE system

Macondo fast tracks health, safety and environment consolidation. Shell’s DROPS scheme shared.

The current issue of ‘Connexus’ magazine describes Baker Hughes’ revamp of its health, safety and environment (HSE) management systems in a post-Macondo world. An earlier project to harmonize HSE standards was fast-tracked to meet a November 2011 deadline for new Gulf of Mexico regulations. The result is that best practices from seven former divisions have been rolled into enterprise-wide controls.

The new common operating standard has been audited by Shell under its global wells business improvement plan—an initiative that Shell shares with its service providers including BHI. The plan includes joint management visits and a ‘deep conversation’ between business managers from both companies. Site visits enable sharing of good practices and opportunities for operator/provider collaboration.

HSE manager Don Elam observed, ‘If you don’t meet Shell’s requirements, you don’t work for them.’ Scorecard key performance indicators track, inter alia, lifesaving, dropped objects prevention, temporary pipe work and process safety. Shell recommended that BHI’s operational controls documents, some of which ran to 200 pages, be summarized in four to five page best practice documents with hyperlinks to the underlying material. Lessons learned from Shell’s work on life saving rules, staff competency development and its ‘DROPS’ dropped object prevention scheme were shared during the project.


Ecom’s ATEX-certified handheld

BP-endorsed, curiously named ‘i.roc Ci70-Ex’ ruggedized handheld runs Windows Embedded OS.

Ecom Instruments has teamed with Intermec to develop an intrinsically safe, ruggedized handheld mobile computer for use in hazardous environments such as oil refineries, drilling platforms and petrochemical plants. The curiously named ‘i.roc Ci70-Ex’ runs applications originally developed for Intermec’s 70 series devices under Microsoft Windows Embedded Handheld 6.5.3.

Ecom claims conformance with ‘all key global certifications, from NEC to ATEX and IECEx—making it a truly global platform.’

The device has backing from BP whose technology consultant Mike Haley observed, ‘At BP, we focus on a few technologies with immediate impact that will also benefit long-term business needs. The integration of current and emerging radio communications for large industrial sites could yield great efficiencies for business operations.’

Wireless networking capabilities include 3G CDMA or UMTS, WiFi and Bluetooth 2.1. A long-range bar code reader and Radio Frequency Identification (RFID) capability are included and a range of peripherals and accessories are available.


IFS—super fast document replication for oil and gas

IFS Applications module promises reliable data transfer in the face of ‘challenging’ communications.

Linköping, Sweden-headquartered IFS has announced ‘Instant Replication’ (IR) for replication of data between offshore rigs or vessels and onshore facilities. The solution replicates documents and transactional data used in maintenance and operations via satellite. IR is a component of IFS’ flagship ‘IFS Applications’ industrial ERP suite of over 100 modules covering the lifecycle of contracts, projects, assets and services.

Real-time document synchronization between offshore and onshore locations means that critical information for decision-making is always current. The solution is said to replicate data at the ‘application level,’ rather than the ‘database level,’ for improved data QC and error management.
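IFS has not detailed the mechanism, but application-level replication typically means shipping business documents as uniquely identified, idempotent messages that are re-sent until acknowledged, so a dropped satellite link delays, rather than corrupts, the transfer. A minimal sketch of the principle (not IFS' implementation):

```python
# Sketch of application-level replication over an unreliable link:
# documents travel as idempotent messages, re-sent until acknowledged.
import uuid

class Replicator:
    def __init__(self, send):
        self.send = send   # transport hook: returns True if the peer acked
        self.outbox = {}   # message id -> document, pending acknowledgement

    def queue(self, document: dict):
        self.outbox[str(uuid.uuid4())] = document

    def flush(self):
        for msg_id in list(self.outbox):
            if self.send(msg_id, self.outbox[msg_id]):  # peer de-duplicates on msg_id
                del self.outbox[msg_id]                 # acked: safe to forget
            # not acked (link down): document stays queued for the next pass

rep = Replicator(send=lambda mid, doc: True)  # stand-in transport, always acks
rep.queue({"type": "work-order", "rig": "unit-7", "status": "closed"})
rep.flush()
print("pending:", len(rep.outbox))
```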

IFS CTO Dan Matthews said, ‘IR assures reliable data transfer under the most challenging conditions. These characteristics make it ideal for upstream players within the oil and gas industry, such as marine seismic companies and drilling contractors, whose offshore units require quick and reliable data replication and transfer of critical business data.’

IR was developed in collaboration with leading oil and gas companies using an ‘agile’ methodology, drawing on customer feedback and IFS’ own oil and gas experience. The solution is a component of the IFS Applications 8.0 release due out this month. IFS’ oil and gas clientele includes PGS, Seadrill, North Atlantic Drilling, MicroSeismic, Babcock, Heerema and Technip.

