November 2010


Fraunhofer SDPA

PGAS-based seismic development and processing architecture, ‘parallelization made easy.’ ‘Excellent’ performance sans tuning for multi-core, parallel and heterogeneous GPU/CPU environments.

With ever-increasing core counts and thousand-node architectures, there is a growing need for an efficient parallel programming paradigm. But, according to the Fraunhofer Institute’s Franz Pfreundt, speaking at the High Performance Computing workshop at last month’s Society of Exploration Geophysicists convention in Denver, in parallel computing and HPC, ‘simple ideas don’t work!’ Parallelism and data management are hard problems. Simply put, the popular Seismic Unix (SU) package does not parallelize across terabytes of data. Today, parallelizing a good serial algorithm with MPI and integrating it into a processing workflow takes months.

To address this, Fraunhofer has developed the Seismic Data Processing Architecture (SDPA). SDPA replaces the popular MPI infrastructure with a virtualized global memory for ‘persistent, performant and fault tolerant storage.’ The global memory is based on GPI, Fraunhofer’s PGAS implementation over InfiniBand; it works at ‘wire speed’ and scales to petabytes. ‘Where MPI fails, GPI takes off.’
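The GPI API itself was not detailed in the talk. As a rough illustration of the one-sided ‘put into remote memory, no matching receive’ semantics that PGAS shares with MPI’s RMA subset, here is a minimal sketch using Python’s mpi4py. This is an analogy for the access style, not the GPI API:

    # Minimal sketch of one-sided, PGAS-style access using MPI RMA (mpi4py).
    # Run with: mpiexec -n 2 python pgas_sketch.py
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()

    # Each rank contributes 1 KB to a partitioned global address space.
    win = MPI.Win.Allocate(1024, comm=comm)

    if rank == 0:
        payload = bytearray(b'trace-0001')
        win.Lock(1)                           # passive target: rank 1 not involved
        win.Put([payload, MPI.BYTE], target_rank=1)
        win.Unlock(1)

    comm.Barrier()                            # crude ordering for the demo

    if rank == 1:
        print('rank 1 sees:', bytes(win.tomemory())[:10])

    win.Free()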

Fraunhofer then addressed programming, ‘the hard part,’ separating workflow orchestration (a.k.a. ‘coordination’) from algorithm development. Algorithms are written as modules in high level languages. SDPA coordination and low level memory management use modified Petri nets and ‘easy to learn’ abstractions. SDPA allows any programming language to be used—Fortran, C or Java. SU modules like segyread can be piped through sugain etc., just as in regular processing. Real world workflow complexity for data load, output and storage is handled by SDPA, leaving geophysical coding to the geophysicist. Orchestration, described in XML, provides auto-parallelization that takes account of hardware constraints such as GPU/CPU availability. SDPA’s ‘magic’ is to recognize data and task parallelism. SDPA uses a functional language, described as ‘close to the compiler.’ Programs can be dynamically re-written on the fly to optimize for hardware configurations. A parallelized version of SU, ‘su-parallel,’ was developed in two hours during a Statoil workshop.
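SDPA’s workflow schema has not been published. Purely as an illustration of the coordination/algorithm split, an orchestration description might look something like the following; every element name below is hypothetical, not SDPA’s actual schema:

    <!-- Hypothetical sketch only; SDPA's real XML schema is not public. -->
    <workflow name="gain-test">
      <module id="read" exec="segyread" args="tape=input.segy"/>
      <module id="gain" exec="sugain" args="agc=1 wagc=0.5"/>
      <pipe from="read" to="gain"/>
      <!-- hints the coordinator could use for auto-parallelization -->
      <partition axis="shot" chunks="auto"/>
      <resources prefer="gpu" fallback="cpu"/>
    </workflow>

The point is that the geophysical modules stay untouched; the coordination layer decides how to split the data and where each instance runs.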

Fraunhofer is now working on a library for Kirchhoff migration using a simulator to adapt code to machine configurations. Some benchmarks compare well with hand-crafted optimization. Pfreundt concluded that SDPA is ‘parallelization made easy’ and provides excellent performance without much tuning. Pfreundt told Oil IT Journal ‘The solutions we have developed make the system highly productive for the seismic processor. Getting optimal performance out of a GPU or CPU multithreaded module is left to the HPC expert. We want to make the life of a geophysicist easier and stimulate algorithmic innovation delivering a quick path to cluster wide parallelization.’ More on the SDPA Consortium from www.oilit.com/links/1011_0.


Hosted EOP

Weatherford hosts CygNet Software’s Enterprise Operations Platform—targeting production optimization for shale gas producers.

Weatherford has signed with CygNet Software to offer its Enterprise Operations Platform (EOP) as a hosted solution for gas well optimization. The ‘exclusive’ OEM agreement extends Weatherford’s production control, monitoring and analysis solution to smaller gas producers.

The Software as a Service/Cloud-based environment lets users collect data from Weatherford’s plunger lift controllers, deliquification devices, rod pump controllers, and variable speed drives and use the data in EOP and Weatherford’s Lowis analytical and data management tools. EOP collects real-time and historical data from field, production and business systems, giving users at all organizational levels the information they need to support daily operations and make informed strategic decisions.

Weatherford VP Dharmesh Mehta said, ‘We can now provide a complete solution for shale gas production optimization as well as more extensive solutions to CygNet’s clients.’ CygNet claims that over 100 customers, representing more than half of North America’s domestic gas production, use EOP. More from www.weatherford.com.


Making Software—what really does work?

Editor Neil McNaughton reviews ‘Making Software,’ a new book from O’Reilly. Subtitled ‘What really works and why we believe it,’ the 600-page oeuvre provides more editorial fodder than answers.

The book ‘Making Software*,’ subtitled ‘What really works and why we believe it,’ had me drooling with anticipation while waiting for my review copy. I was eagerly mulling over a few questions that I would like to see answered, such as: is open source or proprietary development better, and which programming languages are best?

Making Software (MS) is a huge book, nearly 600 pages, although somewhat padded out with references. MS has multiple authors, most from academia and a few from Microsoft. The result is a lack of an overall theme. In the preface the chapters are referred to as ‘essays’ and indeed, MS contains equal measures of editorializing and research.

Behind the question ‘what really works’ is a widely held perception that software development often doesn’t work—or at least not very well. Tales abound of poorly managed developments, especially in government. A chapter on Systematic Reviews debunks this viewpoint—in particular the Standish Group’s ‘Chaos Report,’ which claimed to find ‘huge project overruns.’ Apparently the Standish Group ‘deliberately solicited’ failure stories—thus ‘one of the most frequently cited papers on project overruns cannot be trusted.’

As an older programmer whose skills were honed on Fortran and Visual Basic, I confess a certain unease with modern ‘object oriented’ languages. One of my questions was, ‘Is OO good for productivity?’ Here I was disappointed. MS only provides a 20-page paper on language comparisons and there is no mention of Fortran, SQL, Mathcad, let alone Cobol. Lutz Prechelt’s (Free University of Berlin) comparison of web development productivity covers Perl, PHP and Java, concluding that Java came out both top and bottom (different teams working on the same problem). While PHP is good ‘for small programs,’ it is ‘more important to have the right programmer than the right language!’ I guess anyone could have told you that.

I was still interested to see if anyone else had the same trouble as I did with ‘modern’ languages. The chapter on API usability by Microsoft’s Steven Clarke had eight experienced VB6 programmers tackle the .NET paradigm shift—exactly where I came unstuck! None could complete a simple task in the allotted time—which Microsoft described as ‘a complete bust.’ Videotapes showed programmers scratching their heads over the System.IO namespace. This led to a new field of ‘cognitive dimensions,’ an approach to ‘analyze and better understand’ such difficulties. In the end a new class was created to make the programmers’ lives easier. If you are struggling with .NET arcana though, you are unlikely to get Microsoft to rewrite the API for you. Apparently, ‘When [Microsoft] started looking at API usability we didn’t know how to do it or if the effort would be worthwhile.’ Microsoft is seemingly a convert to usability now—but why was there ever a doubt?

Another of my questions was ‘Is the open programming paradigm that has given us Linux and a multitude of other applications better than the closed proprietary development of Microsoft et al.?’ This question is addressed in the chapter Quality Wars: Open Source vs. Proprietary Software by Diomidis Spinellis (Athens University), whose work was part-funded by the EU 6th Framework programme. Spinellis scored the different operating systems on a variety of quality metrics, finding that OpenSolaris has the best balance between positive and negative marks, while Microsoft’s Windows Research Kernel has the largest number of negative marks and Linux the second highest number of positive marks. ‘The most we can [conclude] is that open source does not produce software of markedly higher quality than proprietary software.’ This is a pretty interesting conclusion in so far as the quality of Linux, developed by a team of enthusiasts, is ‘not markedly higher’ (i.e. it is a bit better!) than that produced by the largest software company in the world, which has of course access to every line of Linux code in existence!

My final question was, ‘how is the programming world shaping up for multi-core and clusters,’ a key topic for the seismic processing community. MS is silent on the subject. There is no indication as to what might make multi-core or parallel software development ‘really work.’

A chapter on ‘why is it so hard to learn to program’ by Mark Guzdial of the Georgia Institute of Technology riffs on poor pass rates in computing courses to conclude that computing education research is a ‘fledgling field’ (really, I took a course in 1970—that is one heck of a long fledge!), that students struggle to learn computing, and that we need more tools to measure their progress. No suggestion here that objects and obscurantism have made learning programming harder!

Dewayne Perry (University of Texas at Austin) studied a million-plus lines of C to ask ‘where do bugs come from?’ This is one of the longest chapters in the book—but it is frustratingly inconclusive. The authors liken the design of their empirical study to the act of coding itself: it is impossible to create a ‘bug free’ study. I would extend the analogy a bit further—locating a concise conclusion in Perry’s study is like finding a bug in code. There is one interesting, if rather obvious, finding. It emerges that ‘problems with requirements, design and coding account for 34% of the total bugs.’ Thus the ‘fastest way to product improvement is to hire domain specialists.’ This is because ‘lack of domain knowledge dominates the underlying causes of bugs—so hire knowledgeable people.’ A more contentious conclusion is that few bugs would have been solved by ‘using a better programming language [than C].’

As an old fart I would summarize the situation as follows. Today’s programming languages address concerns that are far removed from those of business users. Programmers have turned themselves into an elite breed of experts with arcane knowledge of .. programming. This has disenfranchised domain specialists. Back in 1970, a scientist buddy of mine boasted of learning (Fortran) in a single night. Today it might take you a night to get to grips with a simple ‘Hello World’ app—if you are a fast learner.

* Edited by Andy Oram and Greg Wilson. O’Reilly, 2010. ISBN 9780596808327—www.oilit.com/links/1011_16.


Open Text Semantic Network—an exclusive test drive

New cloud-based smart search engine trial leverages www.oilit.com information resource.

Following its acquisition of Nstein earlier this year, enterprise content management specialist Open Text has re-cast Nstein’s technology as the Open Text Semantic Network (OTSN), a cloud-based solution for indexing and searching large volumes of text. OTSN is a search-driven content delivery platform developed in PHP/Java, embedding Open Text’s content analytics engine and Apache’s Solr and Lucene.

Oil IT Journal signed up for the OTSN beta trial and had the system index the 4,000-plus articles on oilit.com. You can view the results as a ‘faceted’ search on www.oilit.com/links/1011_11*. We used the search engine to find a reference to ‘JMP’ while researching our book review below. The search returns one direct reference, to SAS’ PAM offering. This is the easy part—we got this using our vanilla ‘FreeFind’ search tool. But OTSN also returns a list of half a dozen other references that are, as it were, one concept away from ‘JMP’. These point to articles on the use of SAS in Total, our review of SAS’ GGRE analytics in oil and gas and two non-SAS pieces on predictive maintenance from Emerson, IBM and Matrikon. The latter actually refers to the use of a competing stats package from StatSoft.

What OTSN is doing is recognizing key concepts in the material it finds with free text search and kicking off a smart search for related topics. It seems to work. You could argue that there is likely more ‘near relevant’ material to be found on oilit.com. But on the other hand it is nice to have five pertinent suggested documents. The same ‘JMP’ search on Google returns just over 8 million hits. This is pretty impressive stuff for a bare bones implementation of semantic search. There has been no tuning of the tool to this website and no use of the API at all.
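Open Text has not published OTSN’s internals beyond naming Solr and Lucene, but the ‘one concept away’ behavior is reminiscent of Solr’s stock MoreLikeThis handler. A minimal sketch, assuming an ‘mlt’ handler is configured, with hypothetical host, core and field names:

    # Hedged sketch: related-document search via Apache Solr's MoreLikeThis
    # handler. Endpoint, core ('oilit') and field ('body') are hypothetical.
    import json
    import urllib.parse
    import urllib.request

    params = urllib.parse.urlencode({
        'q': 'JMP',         # seed query
        'mlt.fl': 'body',   # field(s) mined for 'interesting terms'
        'mlt.mintf': 1,     # min term frequency in the seed document
        'mlt.mindf': 1,     # min document frequency across the index
        'rows': 6,
        'wt': 'json',
    })
    url = 'http://localhost:8983/solr/oilit/mlt?' + params
    with urllib.request.urlopen(url) as resp:
        docs = json.load(resp)['response']['docs']
    for doc in docs:
        print(doc.get('title'))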

As the technology evolves, OpenText plans to add ‘true’ Semantic Web capability to the toolset, exposing detailed annotations from the annotator. This will let developers blend search results from a ‘local’ resource (such as Oil IT Journal) with public semantic datasets such as www.DBpedia.org (an RDF Database of Wikipedia). Likewise, terms recognized as geographic or other structured entities such as ‘Paris’ will be recognized as such and be linkable to databases such as www.geonames.org.

Open Text is also developing more mainstream functionality for OTSN with plugins for document stores such as Microsoft SharePoint or its own Content Server. These will enable ‘find the expert’ type searches across large organizations.

* The trial was set up on the public www.oilit.com website. Site licensees may have to log out of the corporate site to test drive on the public site.


Analyzing and interpreting continuous data using JMP

Almost a textbook, SAS Press’ step by step guide provides real statistical insights.

There is always the hope that a new book*, even a product manual, will provide insights into those long forgotten (or never really grasped) fields that crop up again and again in one’s everyday life and work. Geoscientists and engineers of all persuasions are constantly confronted by data and expected to extract meaning from it. At the simplest level this is easy—we regularly cross plot this against that to extrapolate, interpolate and what have you. But there often remains a nagging suspicion that we have left something out of our analysis. Is the correlation significant? Is there enough data from which to draw a conclusion at all? The book ‘Analyzing and Interpreting Continuous Data’ (A&ICD) is a step by step guide to the use of SAS’ JMP statistical package to do just that—to seek out causality in data sets and back up one’s conclusions with a demonstration of their significance.
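To make the first of those nagging questions concrete: whether a correlation is significant, given the sample size, is a one-liner in any stats package. A minimal illustration in Python, on synthetic data standing in for any cross plot:

    # Is the correlation significant? Pearson r and its p-value with SciPy.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(42)
    x = rng.normal(size=20)               # e.g. seismic velocity
    y = 0.5 * x + rng.normal(size=20)     # e.g. a weakly related attribute

    r, p = stats.pearsonr(x, y)
    print(f'r = {r:.2f}, p = {p:.3f}')    # small p: unlikely under 'no correlation'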

According to Wikipedia, JMP stands for ‘John’s Macintosh Project’ after JMP’s original author John Sall, a rather embarrassing factoid that A&ICD glosses over. The toolset has application in the oil and gas vertical. JMP’s design of experiment functionality was used by Shell to design its cyclic steam stimulation program on the Peace River Carmon Creek tar sands project. SAS uses the tool as a front end to its ‘PAM’—Preventative Asset Maintenance (Oil ITJ November 2008).

A&ICD provides examples from a wide range of fields including manufacturing. The most immediate oil and gas application is in the downstream, as witnessed by the glowing endorsement from Dow Chemical’s lead data miner and modeler Tim Rey. But anyone who looks at a cross plot of anything—from seismic velocities to net-to-gross ratios—is an implicit user of these techniques.

The book fulfills the promise of educating the reader—in part because it is derived from a US National Institute of Standards and Technology (NIST) online resource** covering much of the same material, but without the SAS/JMP focus.

Not being specialists, we can’t say how original the material in A&ICD is. But it has a ring of authority about it. We started with a dip into the Overview of Exploratory Data Analysis (EDA), which sets the scene with a vivid discussion of the philosophy behind the science behind the statistics. This is definitely not padding, but a serious look at what we are really trying to achieve—before setting out to explain how to achieve it. The same pattern repeats in each chapter, with a look at the problem description, key questions and tools—then an in-depth discussion of the topic in hand. Only after this scene setting is use of the JMP toolset discussed.

A&ICD starts with an introduction to basic statistical concepts, then works through EDA as above before moving on to characterizing manufactured products and materials.

There is a distinct bias towards discrete manufacturing—and there is no mention of design of experiment in A&ICD, let alone of geostatistics. But if you use, or have wondered about using, fancy statistics—be they six sigma, variance analysis or whatever—in any context, then this book will provide a thorough background in the underlying math and science that you really ought to have before leveraging these techniques and software tools.

* Analyzing and Interpreting Continuous Data Using JMP, A Step by Step Guide. José and Brenda Ramirez. SAS Press 2009. ISBN 978-1-59994-488-3 and www.oilit.com/links/1011_21

** The NIST handbook is available online at www.oilit.com/links/1011_20


Earth Sciences Associates GOM3 user meet

ESA rolls-out ‘GOMsmart’ web-based map interface. Anadarko describes remote services usage.

The 2010 User Group meeting of Earth Sciences Associates (ESA) GOM3 was held in Long Beach last month. ESA’s GOM3 is an ESRI GIS-based decision support tool for Gulf of Mexico users of data sources from the US Bureau of Ocean Energy Management (formerly the Minerals Management Service) and third parties including GeoMark, GXT, PFC Energy and others.

Will Ghomi and Kevin Shows (Anadarko) presented a paper on ‘Using remote services in GIS, the good, the bad and the useful.’ Remote services come in several flavors: ESRI map and image services, OGC-compliant services and Google/KML streaming services. The advantage of an online feed is that data is always ‘fresh’ and there is near-zero data management. Proper use of remote services makes it possible to enforce company-approved symbology and reference data sources. On the downside, there is less user control of data, performance may be poor and not all E&P applications can consume such services. Third party downtime can also create availability problems.
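Of the three flavors, the OGC-compliant services are the easiest to show concretely, since a WMS GetMap call is a plain HTTP request. A sketch with an invented endpoint and layer name; the parameters themselves are standard OGC WMS 1.3.0:

    # Hedged sketch of an OGC WMS 1.3.0 GetMap request (hypothetical server).
    import urllib.parse
    import urllib.request

    params = urllib.parse.urlencode({
        'service': 'WMS', 'version': '1.3.0', 'request': 'GetMap',
        'layers': 'gom_leases',            # hypothetical layer
        'crs': 'EPSG:4326',
        'bbox': '26.0,-92.0,30.0,-88.0',   # lat/lon axis order in WMS 1.3.0
        'width': 800, 'height': 600,
        'format': 'image/png',
    })
    url = 'http://example.com/wms?' + params
    with urllib.request.urlopen(url) as resp:
        open('leases.png', 'wb').write(resp.read())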

Anadarko currently uses a variety of remote services that include ‘pre-packaged’ connections in ArcCatalog, browser connections, ESRI layer files with embedded remote services and the GeoPortal, a central catalog of available services. Anadarko uses Bing Maps, ArcGIS Online, Spatial Energy/I3, NOAA and other weather feeds, Premier Data Services, Google Earth and Yahoo’s geocoding and internal ArcGIS/Image Servers.

ESA’s Simon Wright showed off the new GOMsmart interface, a web-based data viewer, reporting and mapping toolset. GOMsmart builds on existing GOM3 functionality with fine-grained customization of the interface, bespoke layers, standard templates and enhanced data I/O. ESA is currently mooting support for ESRI’s freeware browser, ArcGIS Explorer. More from www.earthsci.com.


UKOGL goes live

Lynx rolls out online access to UK geophysical dataset. Download nav data and JPEG seismic lines.

Lynx Information Systems, custodian of much of the UK’s onshore oil and gas data, has announced ‘UKOGL Live,’ a new web map interface to the UK Onshore Geophysical Library. UKOGL Live provides access to a ‘plethora’ of data, including license block information, 3D survey outlines and onshore fields. The database allows map or text based search of navigation data for onshore seismics, with click-through access to high resolution ‘free-to-view’ images of the SEG-Y data.

Users can view well locations and access intersected formation top depths in downloadable PDF format. Regional seismic profiles are also available as downloadable JPEG images. Lynx used the ArcGIS JavaScript API to access the data held in ArcGIS Server. More from www.oilit.com/links/1011_10.


Worldsensing, Octave Reservoir team on seismic monitoring

Proprietary wireless technology at heart of new ‘smart oilfield’ solution.

Octave Reservoir Technologies has signed with Barcelona-based Worldsensing for the development of a near surface monitoring solution leveraging a proprietary wireless seismic acquisition system. Worldsensing eschews Wi-Fi, ZigBee and UWB-based communications, preferring instead its own ‘cutting edge’ wireless technology, which promises high bandwidth, long range and ‘ultra low’ power consumption.

Earlier this year Worldsensing won an IBM SmartCamp award for its prototype ‘smart car parking’ solution and its ‘close alignment with IBM’s Smarter Planet vision.’ Octave is to turn Worldsensing’s technology into a smart oilfield solution. More from www.worldsensing.com and www.octaveresrvoir.com.


HPC Society inaugural meet, Houston

New Society hears from Bureau of Economic Geology, Texas Multicore Technologies.

While the ‘official’ HPC workshop was held alongside the Denver SEG, the upstart HPC Society held its inaugural meet in Houston. Executive Director Gary Crouse introduced the new Society and kicked off a very full day of presentations from Academia and vendors.

Industry watcher Addison Snell from Intersect360 outlined current HPC ‘trends,’ concluding, inter alia, with the rather astonishing claim that Windows is ‘growing in HPC with currently 12% of the installed base.’ Snell obviously is looking at neither the TOP500 (see our SC10 report below) nor the seismic vertical.

A presentation from Doug McCowan and Vladimir Bashkardin of the Bureau of Economic Geology hailed the return of the ‘SIMD’ style of parallel computing of the 1980s with the arrival of the GPU.

Texas Multicore Technologies (TMT), represented by CTO Dan Cooke, was set up to commercialize SequenceL, a NASA-funded solution to parallel computing. Early tests on some (non-seismic) problems show promisingly linear speedup with increasing core count. SequenceL was designed to let developers write code without having to worry about how it should be computed. A ‘ThreadedS’ service is available to companies that wish to accelerate legacy code bases with TMT’s technology. More from www.texasmulticoretechnologies.com and www.hpcsociety.memberlodge.com.


Software, hardware short takes

Oracle, Knowledge Reservoir, EnergyNavigator, Phoenix International, Blueback Reservoir, Earthworks, Austin Geomodeling, Ikon, Exxon, GE, Geosoft, Geomodeling, Neuralog, Badger ...

Oracle has posted some benchmarks for Schlumberger’s Eclipse 300 running on its Sun hardware alongside a 3D Prestack Kirchhoff time migration job. The tests showed a ‘potential opportunity for consolidating applications using Oracle’s grid engine resource scheduling and virtual machine templates’—www.oilit.com/links/1011_15.

Knowledge Reservoir is offering information management services centered on the development of Wiki Portals for its upstream oil and gas clients, leveraging work performed for a major. Portals are built around KR’s template and base content, extended with the client’s own content and best practices—www.knowledge-reservoir.com.

Version 7.3 of EnergyNavigator’s AFE Navigator offers mobile AFE approval from users’ smartphones, along with email reminders and support for 64-bit Windows desktops and servers—www.energynavigator.com.

John Deere unit Phoenix International’s new Satellite Gateway tracks mobile assets over the Iridium-based Quake Global short-burst low latency data transceiver—www.phoeintl.com.

Blueback Reservoir and Earthworks have released ‘ESI,’ a seismic inversion plug-in for Petrel. ESI provides deterministic and stochastic inversion algorithms for pre and post stack inversion—www.ocean.slb.com.

Austin GeoModeling has rolled out a geologic scenario manager (GSM) and a Petrel plug-in for its Recon flagship. GSM provides deterministic analysis of geological uncertainty, automating management of multiple realizations and alternative scenarios—www.austingeo.com.

Ikon Science’s RokDoc 5.5 release bundles Statoil’s rock physics templates, a ‘foundation for clastic lithological analysis and improved rock physics modeling’—www.ikonscience.com.

ExxonMobil has released ‘Fuel Finder,’ an iPhone app providing drivers with real-time maps, driving directions and station information for more than 10,000 US gas stations—www.exxonmobil.com.

V 2.0 of GE’s ‘Proficy’ package claims to reduce the cost of process control in verticals including oil and gas. System integrators, OEMs and end users can create re-usable blocks and graphics tailored to their own processes—www.ge.com.

Geosoft’s new Exploration Information Management Solution (EIMS) bundles professional services, geophysical workflows and the DAP Server for hosted potential field and image data—www.geosoft.com.

Geomodeling Technology’s AttributeStudio 6.6 now includes microseismic data analysis and fracture modeling software for unconventional oil and gas exploration—www.geomodeling.com.

Hyperion’s upgraded interface to Yokogawa’s Centum DCS platform now offers support for ProSafe-RS, fieldbus device display, configurable alarming and record and playback of operator actions—www.hyperion.com.cy.

Moblize’s real-time drilling operations center is manned by ‘experts in Witsml data transfers’—www.moblize.com.

A new release of Neuralog’s eponymous well log digitizing system includes ‘intuitive tools’ for capturing lithology tracks, ROP, gas curves and associated text. New ‘auto-stream’ tracing aids capture of ‘difficult’ curves—www.neuralog.com.

Badger Explorer reports a ‘successful’ demonstration of its improbable autonomous drilling device, ‘processing drill fluid and compacting solids into a plug’ while fully submerged—www.bxpl.com.


Super Computing 2010 November Meet, New Orleans

Petaflops are so 2009! Exaflops on 2020 horizon. Linux unsung hero of TOP500.org HPC list.

A poster* on the top500.org website sums it up: the fastest single high performance computer in the world broke the petaflop** barrier a year ago and is racing on to the exaflop, which should be achieved sometime around 2020. At the SC10 Conference on High Performance Computing this month the winner was ... China’s Tianhe-1A, located at the National Supercomputer Center in Tianjin, with 2.57 petaflops. Tianhe is a CPU/GPU combo with a core count of 186,368 and a healthy 260 terabytes of RAM. Three of the top five machines use Nvidia GPUs (graphics processing units) to accelerate computation.

The Chinese machine is reported as being used, inter alia, for oil and gas exploration (seismic imaging). Seven machines in the TOP500 list broke the petaflop barrier. But it should be noted that commercial seismic processors are conspicuous by their absence from the TOP500—no doubt due to a lack of time and inclination to reveal their secret sauces.

Despite its 2007 claim to ‘dominate’ HPC (Oil IT Journal March 2007), Microsoft continues to slide down the TOP500. The Shanghai Supercomputer Center’s Dawning 5000A running Microsoft Windows HPC Server 2008, which took the number 10 slot in 2008, is now down at N° 35 in an apparently unchanged configuration.

Intel still dominates the high end processor market with an 80% share. AMD’s Opteron family follows with 11%. IBM Power processors follow with 8%. Power consumption is a limiting factor in HPC. 25 systems on the list use over a megawatt of electrical power. IBM’s prototype BlueGene/Q system set a new record in power efficiency with a value of 1,680 Mflops/watt, more than twice that of the next best system.

The more or less unsung hero of the TOP500 remains Linux with 99 out of the top 100 machines running Linux or a derivative. Those who believe that Linux’ success is limited to the engineering sector may be interested in a report*** in Computer World UK which revealed that the London Stock Exchange’s new Linux-based system is now delivering world record networking speed with 126 microsecond trading times.

The Linux-based system replaces Microsoft .Net technology ‘criticized on speed and reliability and grappling with trading speeds of several hundred microseconds.’ The New York Stock Exchange is also a Linux shop. More from top500.org. You can also download an Excel spreadsheet with the complete TOP500 stats from www.oilit.com/links/1011_13.

* www.oilit.com/links/1011_12

** 10^15 floating point operations per second. Top500.org measures this using the standard HPLinpack program. Your mileage may vary.

*** www.oilit.com/links/1011_14


Society of Exploration Geophysicists, Denver

Shale gas, Tarantola memorial, DHI Consortium, Geoprobe on Windows 7, Paradigm Skua 2011.

Shale gas is the theme of the year. The AAPG in New Orleans, the SPE in Florence and now the SEG in Denver all devoted their plenary opening sessions to the unconventional boom. We will spare you another blow-by-blow account—although the SEG did pull off a coup, getting US Secretary of the Interior Ken Salazar onstage, all dressed up in designer jeans and bolo tie.

The industry’s argument goes 1) that gas is abundant and good—almost green, 2) that coal is bad—but has a strong lobby and 3) that more needs to be done to reassure the public with regard to frac fluids. All the speakers thought that they were playing to the oil gallery. Except that this was the geophysics gallery! Geophysics did not even get a mention until, answering a questioner, Range Resources president Jeff Ventura stated that in the Marcellus shale, 3D seismics is used to assess rock ‘frackability,’ although such use ‘is not widespread.’

In the Special Session on the ‘Road Ahead,’ the future turned out to be pretty much what is already happening. Ed Biegert, Shell, enumerated a multiplicity of remote sensing technologies, winding up with a plug for Shell’s ‘LightTouch,’ sniffer technology that dates back to 2003. Craig Beasley (WesternGeco) asked, after a century of seismics, ‘what’s left to do?’ The answer is ‘more of what we are already doing.’ In other words, multi/wide-azimuth and anisotropic acquisition with 30,000 channel onshore ‘super crews’ all amount to a ‘quiet revolution’ in land acquisition. Simultaneous source techniques like WesternGeco’s own ‘SimSrc’ and BP’s ‘ISS’ have brought about a 10 fold productivity gain. ‘Today, new acquisition is driven by requirements not cost.’ On the processing front, full waveform inversion can now produce a 40 Hz sub salt image, an ‘amazing’ improvement over the last 5 years. The main remaining challenge is aliasing, even though ‘you may not realize it until you try to do new stuff with data.’

There was a good turnout for the special session in memory of Albert Tarantola,* professor at the University of Paris and seismic processing maverick-cum-luminary. From a background in astrophysics and general relativity, Tarantola turned to inverse theory and one of the most difficult inverse problems in geosciences—the ‘adjoint’ method of seismic inversion. This led to the incorporation of ‘a priori’ models of ‘acceptable’ geological structure, multiple realizations and a ‘pragmatic’ way of integrating geostatistical and geophysical data that underpins much of modern seismic processing. The new approaches implied compute horsepower way beyond what was available at the time, leading to early proof of concept tests on a Connection Machine 2. But this work came up against the ‘curse of dimensionality’ and led to the development of a new ‘smart’ Monte Carlo approach. One of Tarantola’s seminal papers was described as a ‘litany of failures!’ But even these were ‘tremendously instructive’ and influential. Tarantola could be irritating to colleagues—but he managed to get to the core issue and test folks’ convictions. His legacy was evidenced by the SEG’s five sessions on full waveform inversion.
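The ‘a priori model’ idea has a compact mathematical statement. In the least-squares formulation Tarantola popularized, the model m is found by minimizing a misfit that weighs data residuals against departure from the prior:

    \[
    S(m) = \tfrac{1}{2}(g(m)-d_{\mathrm{obs}})^{T} C_{D}^{-1}(g(m)-d_{\mathrm{obs}})
         + \tfrac{1}{2}(m-m_{\mathrm{prior}})^{T} C_{M}^{-1}(m-m_{\mathrm{prior}})
    \]

where g is the forward modeling operator and C_D and C_M are the data and model covariance operators. The adjoint method’s appeal is that it delivers the gradient of S for roughly the cost of one extra (adjoint) simulation.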

We attended a great presentation on the SMT booth by Mike Forest, who chairs the Rose & Associates DHI Risk Analysis Consortium. Forest, previously with Shell Oil, traced the 40-year history of direct hydrocarbon indicators, starting with a ‘health warning’: bright spots, flat spots, AVO** anomalies and so on do not actually indicate the presence of hydrocarbons! Shell coined the ‘bright spot’ term back in 1960, in the face of considerable skepticism. There followed a period of ‘peak and valley’ days, swinging from optimism to pessimism. Management support and the digital revolution brought better data and significant wins, notably in a 1970 lease sale with the 750 million barrel Eugene Island 330 field, which had been identified with an amplitude/background plot using Aubrey Bassett’s ‘Payzo’ program.

These early successes were followed by pitfall-induced failures and the realization that, for instance, a 10% gas saturation gives the same reflection coefficient as an 80% saturation—and this is still a problem today. Successes including the Gulf of Mexico Bullwinkle, Popeye and Tahoe fields confirmed the general usefulness of the techniques. This led to the establishment of the Rose & Associates DHI Risk Analysis Consortium*** in 2001, now with 35 members. The consortium is developing a systematic and consistent work process for DHI analysis—not a ‘silver bullet.’ For Forest, the three top issues for DHI are ‘calibration, calibration and calibration!’ You need to check the whole processing sequence, look at the gathers and tie everything to rock physics. Pitfalls include wet sands (23% of all failures in the Consortium’s database), geopressure, low saturation gas and hard rock above sand. One example from offshore Barbados shows a huge flat event—but the gas was long gone! Recent DHI successes include the billion barrel discovery in Uganda’s Lake Albert, Ghana’s Jubilee field and (perhaps) McMoRan’s Gulf of Mexico Davy Jones discovery.

Landmark was showing off a port of Geoprobe to 64-bit Windows 7 on a 6xHD, 11 megapixel display from Mersive. Geoprobe is now a true ‘big data solution’ for both Windows and Linux, as witnessed by a 200 million triangle surface. A new data format supports multi-CRS data across NAD zones, leveraging the OpenWorks R5000 data infrastructure. Geoprobe data can now reside in memory or be streamed from disk, and is always at full resolution.

Paradigm was pitching its 2011 release, which is to redefine interpretation around the Skua flagship. The company is now also leveraging its Epos 4 data infrastructure with ‘preview technology’ for ‘tera size’ data and models. Distributed, secure computing, Microsoft Windows ports and HPC with GPU support also ran. Today’s interpreters are overwhelmed with attributes and need systems that are engineered to support multi-attribute interpretation, co-visualization and holistic workflows spanning cross sections to property models, gridding and flow simulation. Data management is a new focus for the 7,500 Epos users, who are going ‘from terabytes to petabytes in prestack data roaming.’

The best laid plans of mice, men and the marketing department do go awry. It behooves us as the intrepid reporters that we are to note that on the WesternGeco corner of the Schlumberger booth, a demo of very high end Gulf of Mexico seismics was running, not on Petrel and Windows, which as you know is capable of displaying ‘all the Norwegian data,’ but on GoCad and RedHat Linux! Can’t WesternGeco afford a Petrel license? Whatever the answer, the Paradigm folks on the booth opposite were amused.


* www.oilit.com/links/1011_5

** amplitude vs. offset

*** www.oilit.com/links/1011_6


Ocean User Group, Denver

State of the Ocean, Petrobras’ BR-PlugIn, Ocean in education, Resoptima, Blueback Reservoir...

Around 60 turned out for Schlumberger’s Ocean User Group* held in Denver following the SEG convention. In his ‘State of the Ocean’ address, product manager Brad Youmans described Ocean as a technology and business ecosystem of 30 oils, 40 software houses and 20 academic institutions. Ocean lets users ‘focus on science, not on infrastructure.’ For vendors, Ocean is an integration pathway to Petrel. Ocean is also having a major impact within Schlumberger. The wireline, drilling and completions divisions are leveraging Ocean to get their products’ data into Petrel’s 3D canvas, and Ocean is now supported by WesternGeco with seismic data management, 2D components and a geoscience window for processors. The Ocean Store launched earlier this year; over 50 plug-ins are available and a further 50 academic projects are ripe for commercialization. Some 870 developers have taken the five-day certification course.

Luiz Araujo revealed that Petrel is now Petrobras’ ‘official’ reservoir characterization solution. Petrel has been fine-tuned to Petrobras’ needs in the ‘BR Plug-In’ program. BR-Plug-In builds on earlier test developments with Ocean and aims to homogenize coding, focus development effort and make data connections easier. When Petrobras saw Microsoft Visual Studio 2010, ‘it was like Alice in Wonderland!’ A consolidation on Visual Studio Team Foundation Server quickly followed. BR plug-ins developed to date include PetroLibNet (licensing) and PetroLibOcean, plus components for simulation, visualization, units and CRS management. For Araujo, despite some limitations, ‘Ocean is a wonderful framework—the best we have found to date.’

Brad Wallet (University of Oklahoma) proselytized enthusiastically for the C# programming language, which he sees as being of ‘more value to employers’ than Matlab, Fortran and Java—hence the University’s initiative to train geoscientists in Ocean programming and prototype new technologies. Student reactions have been mixed; some don’t like Ocean. Rapid prototyping can make for poor code. Ocean is a rich, complex environment with a significant learning curve. Ocean documentation is broad rather than deep and not always up to date.

Tore Felix Munck described how he quit his job at Trondheim University to found Resoptima and devote himself full time to Ocean development. Resoptima has developed a ‘practice-driven, test-driven development’ methodology for Ocean around JetBrains’ ReSharper Visual Studio add-on. The key is to keep tests simple and consistent. Typemock is used to test code mock-ups, ‘faking’ as yet undeveloped objects, and to test methods under development. FinalBuilder is the recommended tool for test automation, Typemock Isolator the recommended mocking framework.
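Resoptima’s stack is C#/.NET, but the ‘fake the object that doesn’t exist yet’ idea is language-neutral. A minimal analogy using Python’s built-in unittest.mock; the simulator object is hypothetical:

    # Language-neutral analogy to Typemock-style mocking (unittest.mock).
    import unittest
    from unittest.mock import MagicMock

    class TestWorkflowStep(unittest.TestCase):
        def test_uses_simulator_result(self):
            sim = MagicMock()                  # stands in for an undeveloped object
            sim.run.return_value = {'oil_rate': 1200.0}

            result = sim.run(case='base')      # code under test would call this
            self.assertEqual(result['oil_rate'], 1200.0)
            sim.run.assert_called_once_with(case='base')

    if __name__ == '__main__':
        unittest.main()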

Rob Hall introduced Blueback Reservoir’s Spatial Image Connector (SIC) for Petrel, developed in partnership with Spatial Energy. SIC allows digital imagery to be streamed into Petrel from web map servers. SIC, complete with an Ocean map view, menus and CRS management took a week to prototype. Apache Corp is a user.

In a webinar follow-up to the OUG, Schlumberger’s Thomas Gehmann explained how the Ocean API has been extended with CRS management. ‘CRS-naïve’ Petrel mandates conflation policies for managing spatial data as it is imported and exported. This is now possible with the concept of a ‘hub’ projection and transformations that leverage the EPSG database. But Gehmann warned, ‘Even if we try to keep it simple, CRS problems are not really related to software. Geodesy-related issues are our clients’ real problem.’
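The ‘hub projection plus EPSG transformations’ pattern is not specific to Ocean; the same EPSG-coded round trip can be sketched with the open source pyproj library (codes chosen for illustration only):

    # Hedged sketch of EPSG-based CRS transformation with pyproj (not the
    # Ocean API): NAD27 geographic (EPSG:4267) to a WGS84 'hub' (EPSG:4326).
    from pyproj import Transformer

    to_hub = Transformer.from_crs('EPSG:4267', 'EPSG:4326', always_xy=True)
    lon, lat = to_hub.transform(-90.0, 29.0)   # x=lon, y=lat with always_xy
    print(f'{lon:.6f}, {lat:.6f}')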

* www.oilit.com/links/1011_7


SEG High Performance Computing Workshop

Sandia Labs’ Trilinos, Fraunhofer’s Green Wave, Maxeler’s MaxRing/MaxBox and Repsol on GMAC.

Michael Wolf (Sandia National Labs) described how to achieve ‘painless parallelism’ on multi core architectures with the ‘Trilinos’ object-oriented framework for large scale science. Trilinos lets scientists write once and run on a variety of shared memory architectures including multi core CPUs, GPU and NUMA. More from trilinos.sandia.gov.

Vlad Bashkardin (UT Austin) was circumspect on the oft-reported ‘orders of magnitude’ performance hike of GPUs. Such claims frequently compare optimized GPU code against an outdated CPU. Using reverse time migration (RTM) and test data supplied by Chevron, UT’s Texas Advanced Computing Center reports a more modest 10 to 15 fold speedup for RTM code—itself something of a best case for parallelization. For more sophisticated imaging, 10x and 200 Gflops would be ‘excellent.’

Jens Kruger outlined the Fraunhofer Institute’s ‘Green Wave’ project addressing power-hungry HPC tasks like RTM.

Green Wave leverages Tensilica’s customizable processor cores, as deployed in the Berkeley Lab Green Flash climate modeling project. Tensilica allows for the development of a processor that is optimized for a particular application. Before the chip is actually fabricated, the Berkeley Emulation Engine* is used to predict performance. Green Wave falls between Nehalem and Tesla in terms of performance but wins out in terms of ‘megapoints’ per watt. The first actual chip will come off the fab line in Q4 2011.

Maxeler’s Oliver Pell also stood out from the GPU crowd, advocating RTM on FPGAs, with another 1,000x speedup claimed. FPGAs offer terabytes per second of memory bandwidth, but only a 100 MHz clock. Maxeler has bundled its FPGAs with a high speed ‘MaxRing’ interconnect, with MaxBoxes surrounding an x86 CPU controller. The MaxGenFD Java library promises good scalability. ‘Propagating 70 Hz waves is practicable.’

Gladys Gonzalez described Repsol’s search for next generation computers and programming tools. GPU accelerators outperform the CPU by 20x, but there are serious issues with host-to-accelerator interoperability, code portability and maintenance. Industrial software needs industrial standards; today programmers need to know every detail of the stack. There is good news coming from the Barcelona Supercomputing Center’s GMAC** library, offering a unified virtual address space and data management. OpenCL also got a plug as a potential standard replacement for NVIDIA’s CUDA.

* www.oilit.com/links/1011_8

** www.oilit.com/links/1011_9


Folks, facts, orgs ...

Acona, Kadme, Baker Hughes, IHS, Barco, Online Systems, DNV, Dyadem, eCorp, Frac Tech, FTC, GE, Algesco, Infield Systems, ENGlobal, GSE Systems, Matrix Service Co., Microsoft, Liaison Technologies, NetApp, P2 Energy Solutions, Spectrum ASA, Ikon Science, Schlumberger, Technip...

Acona Wellpro and Kadme have rolled out the arcticweb.com portal providing free online access to public data sources pertaining to the Arctic region. Arctic Web was funded by six Norwegian oils and the Norwegian Research Council.

Baker Hughes has launched its Reservoir Navigation Services targeting optimized wellbore placement through a combination of reservoir modeling, real-time drilling evaluation and 3D/4D visualization.

IHS president and COO Jeffrey Tarr has elected not to renew his contract. He will be replaced by Scott Key, presently senior VP-global products and services.

Dirk De Man is leaving Barco and will be succeeded by Carl Peeters. Filip Pintelon replaces Peeters at the helm of the MCM business group and as COO operations.

Jack McDonald has resigned as Chairman of the Board of Perficient.

Online Systems has rebranded as Control Point Design, offering expanded 3D laser scanning and 3D CAD modeling.

DNV’s new office in Perth is headed up by Hans Kristian Danielsen. The unit brings expertise in LNG, pipelines, subsea and CCS in support of Australian ‘mega’ LNG developments.

Operational and risk management specialist Dyadem has opened a second EU office in Starnberg, Germany, led by regional director Uwe Schneider.

Tom Harris has joined eCorp International as COO and board member. Harris hails from BlackRock E&P.

Frac Tech has hired Marc Rowland, former Chesapeake executive as president and CFO.

The US Federal Trade Commission has appointed computer scientist, hacker, and security researcher Edward Felten as its first CTO.

GE’s $100 million Brazil Global Research Center will employ 200 researchers and engineers focusing on advanced technologies for oil and gas and other verticals. GE has also signed with Algerian Algesco, a Sonatrach and Sonelgaz joint venture, to open a service center near Algiers. The $36 million facility is the largest GE Oil and Gas service center in the world.

Kader Dicko is to head-up Infield Systems’ new Aberdeen office.

David Barr and Michael Jennings are to join ION Geophysical’s board. Barr comes from Baker Hughes, Jennings from Frontier Oil Corporation.

Neal Vizina is ENGlobal’s new president of engineering. Vizina hails from UniversalPegasus.

GSE Systems has promoted Jim Eberle to CEO and Board Member.

Mike Bradley has resigned as president and CEO of Matrix Service Co. Tom Long, VP and CFO, has also resigned.

Ariane Jayr Coupel is the new global oil and gas industry solutions manager for Microsoft. She was previously with OXY.

Rob Consoli has joined Liaison Technologies as Executive VP of North American Sales.

NetApp has appointed Vic Mahadevan as chief strategy officer. Mahadevan was formerly VP of Marketing with LSI.

David Muse has joined P2 Energy Solutions as senior VP global sales and marketing. Muse was previously with SAIC.

Rune Eng is now CEO of Spectrum ASA. He was previously with PGS.

Steve Hunt is now senior VP Ikon Group and Richard Swarbrick is director global geopressure. Ian Edwards has joined Ikon Science as director—he was previously with CGGVeritas. Rekha Patel is VP sales US and John Turvill, manager sales Asia Pacific.

ShipConstructor Software has promoted Darren Larkins to CEO.

Schlumberger has opened a new research and geoengineering center in Rio de Janeiro, Brazil to house up to 300 scientists, engineers and technical staff.

Societe Generale has hired John Herrlin to head its US oil and gas equity research unit. Herrlin hails from Alpha One Capital Partners.

Pascal Colombani is Technip’s first senior independent director. He was formerly director of the French Atomic Energy Commission (CEA).


Done Deals

Altair Engineering, SimLab, Energy Ventures, Foster Findlay, Hart Energy, Rextag, Intertek, Profitech, Iridium, Ovation, SpectrumData, TruMarx, Vernon & Park.

Simulation technology provider Altair Engineering is to acquire California-based SimLab. Altair will integrate SimLab’s technology and development staff with its HyperWorks computer-aided engineering operation.

Venture capital group Energy Ventures has invested £3.2 million in 3D seismic data specialist Foster Findlay Associates (ffA). The funding will be used to roll out a new generation of 3D seismic analysis tools, to double R&D and to expand global sales and support. ffA expects to up its head count to over 50 in 2011 and forecasts a £12 million turnover by 2013.

Hart Energy Publishing has acquired Rextag Strategies, a mapping and GIS database services company in an all-cash deal. Terms of the transaction were not disclosed.

Intertek has signed an agreement to acquire Profitech (UK) Ltd, a technology provider of advanced mathematical software modeling services serving the oil and gas industry.

Iridium Communications has closed the financing facility for its next generation satellite constellation, Iridium NEXT. $1.8 billion of financing is to come from a syndicate of nine EU banks.

Ovation Data Services has completed its acquisition of SpectrumData of Perth, Australia. The acquisition strengthens Ovation’s Asia Pacific presence in the exploration and production industry. SpectrumData continues to operate within Australia and New Zealand under the SpectrumData brand, with founder Guy Holmes as CEO.

TruMarx Data Partners has closed its latest round of funding led by newly appointed Board members Jerry Putnam and Kevin O’Hara. Other investors include Vernon & Park Capital and Walter G. Kortschak, also named to the Board. TruMarx develops the Comet web-based trading platform for the power and natural gas markets.


OSIsoft 2010 regional seminar—France

French users hear of PI’s growing role in the data center, at BP and chez GDF Suez’ Elengy unit.

OSIsoft’s PI System has a 50% market share in the process industries and a growing role in the data center. eBay, for instance, has around 11,000 tags for power and cooling, all monitored and optimized with PI. Microsoft and Cisco likewise use PI to monitor data center power consumption.

PI is moving steadily from its traditional ‘historian’ role to an enterprise infrastructure with the PI Asset Framework (PI AF). This is concomitant with a move to OPC UA, with tools for .NET, Java and SOA developers. Flagship client BP uses PI AF on a Microsoft stack of SQL Server, SharePoint and Office, with connectivity provided by PI DataLink into Excel 2010, where the new ‘PowerPivot’ functionality consolidates multiple data sources.

GDF Suez’s liquefied natural gas (LNG) unit Elengy operates three LNG terminals in France. Before PI deployment, the company was experiencing problems with data from heterogeneous distributed control systems (DCS) and was having a hard time with data archival and information exchange between terminals and applications. Manual data transfers presented security issues and it was hard to comply with a regulatory requirement to keep 10 years of operational data online. Elengy now has a PI system at each site with ten years of data in the historian. Around 5,000 tags are used for activity monitoring, which can send email alerts to key personnel. Data is consolidated into a national information system—a.k.a. a ‘single version of the truth.’ More from www.osisoft.com.


OMV, Petrom deploy Palantir’s Cash and Plan

Petroleum economics and portfolio management solutions standardized at Austrian major.

Austrian OMV and its 51% owned Romanian unit Petrom have adopted economics and portfolio tools from Palantir Solutions. OMV will deploy PalantirCash and PalantirPlan in a global economic and portfolio planning system for the two units. The new system will improve process efficiency by reducing manual data input, drawing data from multiple third party sources of reference. The aim is to provide OMV’s businesses with ‘a comprehensive overview of global assets to aid decision making and planning at both local and global levels in a standardized manner.’

PalantirCash provides ‘secure and standardized’ economic analyses. PalantirPlan provides planners with insights into the portfolio, creating detailed business plans, setting targets and optimizing holdings. More from www.palantirsolutions.com.


P2 Energy Solutions thrashes out 2011 strategy

Advisory board suggests ‘computer integrated manufacturing’ approach to real time data.

At the P2 Energy Solutions (P2ES) 2010 ‘Ascend’ user conference in San Antonio this month, president and CEO Bret Bolin outlined the company’s ‘four-pronged’ approach to growth to a 500-strong audience. The four areas of growth for P2ES are capturing value in the field, unconventional drilling efficiencies, improved HSE and regulatory reporting, and international expansion. The new focus emerged from the first meeting of P2ES’ Industry Advisory Board last month, where representatives from major, mid- and micro-cap oil and gas clients thrashed out a ‘futuristic’ vision for oil and gas production. The idea is to leverage techniques similar to those used in computer integrated manufacturing*, but over distances of hundreds and even thousands of miles. One user is already moving toward a ‘canopy of surveillance,’ with video feeds from job sites monitoring HSE during drilling activity. Bolin said, ‘Our advisory board is already influencing our decisions about new products and strategic direction. Accurate real-time information is what our business is all about. Companies are also using new information from electronic eyes and ears in the field to accelerate performance improvement.’

One micro-cap client noted how the ‘fast moving complexities’ of leasing, royalties and taxes mandate accurate real-time financial data when locating new shale gas drill-sites. More from www.p2es.com.

* www.oilit.com/links/1011_4


Asset Guardian SCM for Technip

Satellite broadband and source code management system for marine PLC control development.

Technip has consolidated source code management for programmable logic controllers on its construction and pipe laying vessels with a software configuration management package, ‘Asset Guardian’ from UK-based Elite Control Systems. Technip was having trouble managing changes to the software that controls engineering processes and equipment—particularly on vessels at sea.

Asset Guardian supports complex software configuration management, reducing risk in process-critical software development and operation in high-value production environments. The system tracks, backs up and protects software, documents and hardware configurations. Built-in ISO 9001 ‘TickIT’ quality assurance tools allow for revision tracking of code, documentation and correspondence. The toolset has been installed on the Deep Blue, one of the largest pipe laying and subsea construction vessels in the industry. A broadband satellite link gives onshore engineers access to the system. More from www.assetguardian.com.


Sales, contracts, partnerships and deployments

Westheimer, FileTek, ABB, Industrial Defender, AspenTech, CapRock, CGGVeritas, AVEVA, OJSC DneprVNIPIenergoprom, Promon Engenharia, MatrikonOPC, McLaren, IRIS, Kongsberg, Microseismic, Transform, P2ES, Petrofac, PinnacleAIS, RigNet, MIT, IDS, Tieto, WellPoint, Accepta.

Westheimer Energy Consultants and FileTek have announced a joint agreement to deliver an integrated seismic data store based on FileTek’s StorHouse platform.

Aspen Technology has signed a multimillion-dollar contract with Petrofac. aspenONE Engineering will be deployed across Petrofac’s global operations.

CapRock Communications has won a multi-year contract extension to provide VSAT communications to Diamond Offshore’s fleet in the Gulf of Mexico.

CGGVeritas has been awarded a three-year contract by Maersk Oil to perform high-end seismic data processing at a dedicated center in Copenhagen.

AVEVA has signed a contract to supply AVEVA Plant engineering and design software solutions to OJSC DneprVNIPIenergoprom, one of the largest engineering design institutes in Ukraine. The company has also signed with Brazil’s Promon Engenharia to provide its information management solution.

MatrikonOPC is partnering with InduSoft to include MatrikonOPC’s OPC Servers in InduSoft’s web studio platform.

Invensys has entered into a partnership with Russia’s TimeZYX Group which is to combine its reservoir simulation package, MKT, with Invensys SimSci-Esscor simulation software to deliver an integrated reservoir and surface facilities simulator.

Document management recognition specialist IRIS has finalized an agreement with McLaren Software to sell and support Enterprise Engineer in the Nordic region.

Kongsberg Oil & Gas Technologies has signed with Statoil to extend use of the SiteCom real-time drilling data solution, DiscoveryWeb data browser and SmartAgent toolset across Statoil’s drilling operations.

MicroSeismic has contracted with Transform Software to license key components of its TerraSuite software for imaging unconventional gas reservoirs.

Canadian oil and gas producer Enerplus Resources Fund has adopted P2 Energy Solutions’ Qbyte Metrix 2.0, a web-based production accounting system.

Petrofac has won a contract worth £40 million over three years for the provision of engineering services to Maersk Oil North Sea.

PinnacleAIS is working with the Texas A&M Mechanical Engineering Department, part of the Dwight Look College of Engineering, on pressure vessel integrity R&D.

Midstream natural gas services provider Regency Gas has selected Triple Point’s counterparty credit risk and credit scoring software to manage credit risk processes for its growing natural gas and natural gas liquids business.

RigNet has extended its contract with SandRidge Energy to provide fully-managed remote communications services to its fleet of over 20 land drilling rigs operating in the Continental United States.

Shell and MIT have signed an agreement to invest $25 million in the research and development of ‘high value, sustainable technologies designed to drive innovation in energy delivery.’

Cairn Energy has selected IDS ‘StockNet,’ a web-based materials tracking solution for its Greenland exploration program.

Australian natural gas transporter APA Group has selected Tieto’s Energy Components product suite as its gas & customer management solution.

WellPoint Systems and Norwegian accounting group Accepta AS, have announced a partnership to build and provide financial management services in Norway. WellPoint’s Energy Financial Management (EFM) will be customized for Norwegian clients and Accepta will manage ongoing training and support.


Standards Stuff

Cybersecurity framework, MathML3, seismic navigation/velocity data, WITSML StimJob.

ITU-T has announced a new Cybersecurity Information Exchange Framework (CYBEX) to provide a common framework for cybersecurity information exchange. More from www.oilit.com/links/1011_18.

W3C has rolled out V3 of the Mathematical Markup Language. MathML is a component of the W3C’s Open Web Platform, which includes HTML5, CSS, and SVG.

The W3C also got a shot in the arm this month with the global adoption of W3C standards by the International Organization for Standardization (ISO) and the International Electrotechnical Commission (IEC). The two organizations ‘have taken steps’ to encourage greater international adoption of W3C standards. W3C is now an ‘ISO/IEC JTC 1 PAS Submitter,’ bringing the ‘de jure’ standards communities closer to the Internet ecosystem.

The SEG Technical Standards Committee and the Energistics Geophysics Special Interest Group (EGpSIG) are collaborating on a new standard for seismic velocity data exchange in XML and ASCII textual ‘stanzas.’ Separately, OGP and SEG are revising the P1/90 and P2/94 navigation data exchange formats to address modern acquisition and processing requirements and to curb the ‘proliferating’ variations of the legacy formats, which will be deprecated. More from www.oilit.com/links/1011_17.
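
By way of illustration only, the Python sketch below parses a hypothetical ASCII ‘stanza’ of key = value velocity metadata. The keys and layout are invented for the example; the actual SEG/Energistics format may well differ.

    def parse_stanza(text):
        # Collect key = value pairs from a stanza; ignore other lines.
        record = {}
        for line in text.splitlines():
            if '=' in line:
                key, _, value = line.partition('=')
                record[key.strip()] = value.strip()
        return record

    sample = """velocity_type = RMS
    crs = EPSG:23031
    unit = m/s"""
    print(parse_stanza(sample))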

Energistics has announced a new WITSML ‘StimJob’ specification for characterizing frac treatment, reporting and summary interpretations. More from www.oilit.com/links/1011_19.
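
As a purely hypothetical illustration of what a WITSML-style StimJob payload might look like, the Python sketch below assembles a minimal XML document. The element names are assumptions made for the example, not the published Energistics schema.

    import xml.etree.ElementTree as ET

    # Hypothetical element names - not the actual WITSML StimJob schema.
    job = ET.Element('stimJob', {'uid': 'job-001'})
    ET.SubElement(job, 'nameWell').text = 'Example Well 1'
    stage = ET.SubElement(job, 'stimStage', {'number': '1'})
    ET.SubElement(stage, 'fluidVolume', {'uom': 'm3'}).text = '450'
    ET.SubElement(stage, 'proppantMass', {'uom': 'kg'}).text = '68000'
    print(ET.tostring(job, encoding='unicode'))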


Wireless World

Mentor Engineering, Stratos, Alcatel, Hughes, Environmental Safety Systems, Hercules Offshore.

Oil country fleet and work management solutions provider Mentor Engineering has added Iridium satellite connectivity to its in-vehicle computer, Mentor Ranger. Ranger in-vehicle computers maintain constant contact with the office, switching between cellular and satellite mode. Mentor’s ‘work-alone’ monitoring solution for the oil and gas industry lets office staff monitor the work-alone status of all vehicles on screen, with a countdown to each worker’s next required check-in via intrinsically-safe work-alone pendants. The office is automatically alerted if a pendant has been motionless for a specified amount of time or if an operator misses a scheduled check-in. More from www.oilit.com/links/1011_3.
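
A minimal Python sketch of the alert logic just described, assuming invented thresholds: flag a pendant that has been motionless too long, or an operator who has missed a scheduled check-in. This is illustrative, not Mentor’s implementation.

    from datetime import datetime, timedelta

    MOTIONLESS_LIMIT = timedelta(minutes=10)  # assumed threshold
    CHECKIN_INTERVAL = timedelta(hours=1)     # assumed schedule

    def alerts(last_motion, last_checkin, now=None):
        # Return the alerts the office should see for one pendant.
        now = now or datetime.utcnow()
        out = []
        if now - last_motion > MOTIONLESS_LIMIT:
            out.append('pendant motionless - dispatch welfare check')
        if now - last_checkin > CHECKIN_INTERVAL:
            out.append('missed check-in - alert office')
        return out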

Stratos Global and Alcatel-Lucent are to offer IP/MPLS and WiMAX communications links to oil and gas operations in the Gulf of Mexico. Under a two-year agreement worth $5.2 million, the StratosMAX II broadband network, powered by Alcatel-Lucent’s packet microwave and IP/MPLS technology, will provide ‘last-mile’ radio links to remote GoM locations, offering voice service, corporate VPNs and high-speed internet. More from www.stratosglobal.com.

Hughes Network Systems and Environmental Safety Systems are to provide the Hercules Offshore GoM lifeboat fleet with satellite-based broadband maritime services. More from www.hns.com and www.essicorp.com.


Avocet for CBM specialists Ember Resources, Origin Energy

Schlumberger’s Avocet cannibalizes Quorum Volume Management shop.

Ember Resources, operator of a coal bed methane (CBM) exploration program in Alberta’s Horseshoe Canyon trend, has selected Schlumberger’s Avocet Volumes Manager (AVM) solution for gas reporting and compliance. Previously, Ember’s engineers stored data in Excel spreadsheets in parallel with an underused operations data management system*.

Ember production engineer Simon Rivard said, ‘AVM allows us to stick with our own specialized workflow and store data properly in an Oracle database. Since we determine how our software works, we can accommodate evolving processes.’

Revised Alberta regulations have lightened the reporting burden for operating wells with no heat value change in the last three years. Ember used AVM to identify 250 such wells, cancelled the tests and saved CDN$25,000 in yearly operating costs.

Avocet was used in another CBM context by Origin Energy to streamline production data workflows in the Spring Gully coal seam gas field. A paper** presented at the SPE Asia Pacific conference in Brisbane last month described how Avocet has ‘consolidated scattered data sources into a unified repository for reporting and analysis.’ The paper describes the ‘stalwart’ stream of data generated by the thousands of wells that typify a CBM development.

* Actually Quorum Volume Management, acquired by Schlumberger in 2006 (OITJ June 2006).

** www.oilit.com/links/1011_2


ABB teams with Industrial Defender on secure automation

Whitelist-based cyber security solution to protect ABB’s process control systems.

ABB has joined with process control cyber security specialist Industrial Defender (ID) to deliver secure automation systems. The partnership will provide cyber security and compliance automation for process control systems. Jens Birgersson, head of ABB’s Network Management business unit, said, ‘As the world’s power systems become more dependent on IT-based control systems for their safety and reliability, new cyber security standards are emerging. Collaboration with ID will offer customers purpose-built solutions to protect the integrity of their automation systems.’ In what is described as a non-exclusive partnership, ABB will integrate, market and distribute ID’s cyber security and compliance automation products.

ID’s flagship is the Host Intrusion Prevention System (HIPS), a ‘whitelist’-based technology that only lets authorized applications execute on a given device, blocking unauthorized applications and malware. More from www.abb.com and www.industrialdefender.com.
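
The whitelist principle is simple enough to show in a few lines. The Python sketch below, an illustration rather than ID’s implementation, allows a program to run only if its hash appears on an approved list; a real HIPS hooks the operating system’s loader to enforce the decision.

    import hashlib

    # Hashes of authorized executables (placeholder value for illustration).
    APPROVED = {'0123abcd...'}

    def may_execute(path):
        # Allow execution only if the file's hash is on the whitelist;
        # anything else - including malware - is blocked.
        digest = hashlib.sha256(open(path, 'rb').read()).hexdigest()
        return digest in APPROVED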


EnergySys reserves management in the cloud

Hosted production management and compliance application frees up in-house IT resources.

UK-based EnergySys is offering a ‘cloud’ based service for reserves management and reporting. EnergySys Reserves Management (ERM) provides information on reserves classification and value, and decision support for evaluating multiple opportunities. ERM lets users study the impact of oil and gas prices, new drilling viability and potential acquisitions. The system can be up and running ‘in days or weeks.’

The ‘cloud’-based offering (a.k.a. a hosted application) caters to companies with limited IT skill availability. EnergySys MD Peter Black explained, ‘We want to help our customers with the key business planning and audit requirements associated with Reserves Management. This tool is easy to use, always accessible and secure. It will be at the heart of strategic planning for oil companies giving them a clear, efficient route to assessing multiple business scenarios.’

ERM provides audit trails with automatic tracking of changes, including details of what was changed, when and by whom. It is specifically designed to simplify operational processes and ensure compliance with the SPE’s ‘Standards Pertaining to the Estimating and Auditing of Oil and Gas Reserves Information.’ ERM can access live data pulled from multiple sources. More from www.energysys.com.
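
An audit trail of this kind reduces to a simple record of what changed, when and by whom. The following Python sketch shows one minimal way to capture it; the field names are assumptions, not EnergySys’ schema.

    from dataclasses import dataclass, field
    from datetime import datetime

    @dataclass
    class AuditEntry:
        field_name: str   # what was changed
        old_value: str
        new_value: str
        changed_by: str   # by whom
        changed_at: datetime = field(default_factory=datetime.utcnow)  # when

    def record_change(log, name, old, new, user):
        # Append a record of the change to the audit log.
        log.append(AuditEntry(name, str(old), str(new), user))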


Verdande ‘DrillEdge’ case-based reasoning for Shell

Artificial intelligence spots drilling problems before they happen.

Following extensive calibration on its worldwide drilling history dataset, Shell Upstream Americas is to use Verdande Technology’s ‘DrillEdge’ case-based reasoning tool to predict and identify drilling problems in difficult Middle East wells. Eric van Oort, Wells Performance Improvement Manager at Shell, said, ‘Testing with DrillEdge produced compelling results and demonstrated that unscheduled events don’t happen without warning. There are predictable and repeatable signs that occur hours or even days ahead of an event. DrillEdge recognizes these symptoms, allowing corrective action to be taken.’

Verdande CEO Lars Olrik added, ‘Calibration was done on wells with twist-off, lost circulation, stick-slip and stuck pipe problems drilled in the Middle East, Russia, the US and the North Sea. The datasets were blind tested and cross-validated by region and problem area.’ Development is continuing with additional datasets to extend diagnostics. Shell plans to deploy DrillEdge through its network of Real-Time Operating Centers that provide 24/7 monitoring for high visibility and challenging wells worldwide. Verdande is a spin-out of the Norwegian University of Science and Technology (NTNU) in Trondheim. More from www.verdandetechnology.com.
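
Case-based reasoning of this kind works by comparing the pattern of symptoms on the current well against a library of past problem cases. The Python sketch below shows the idea with invented features and thresholds; it is not Verdande’s algorithm.

    import math

    # Invented symptom features and example cases, for illustration only.
    CASE_LIBRARY = [
        ({'torque_trend': 0.8, 'standpipe_drop': 0.1, 'overpull': 0.9}, 'stuck pipe'),
        ({'torque_trend': 0.2, 'standpipe_drop': 0.9, 'overpull': 0.1}, 'lost circulation'),
    ]

    def similarity(a, b):
        # Convert Euclidean distance over shared features to a 0-1 score.
        keys = a.keys() & b.keys()
        dist = math.sqrt(sum((a[k] - b[k]) ** 2 for k in keys))
        return 1.0 / (1.0 + dist)

    def diagnose(symptoms, threshold=0.6):
        # Return the closest past case if it is similar enough, else None.
        best_case, label = max(CASE_LIBRARY, key=lambda c: similarity(symptoms, c[0]))
        return label if similarity(symptoms, best_case) >= threshold else None

    print(diagnose({'torque_trend': 0.7, 'standpipe_drop': 0.2, 'overpull': 0.8}))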


Petris rolls out Operations Management Suite

Integrated data management solution for asset lifecycle management built on a SQL Server database.

Petris has announced its ‘Operations Management Suite’ (OMS), an integrated data management solution for drilling, production and field asset management. OMS supports lifecycle oil and gas asset management, from acquisition through drilling, production and divestiture. The common environment for master well data management comprises components for asset and equipment management, AFE workflows, drilling and completions operations and reporting, production data management and materials management.

OMS is built atop a centralized Microsoft SQL Server database, the PetrisWINDS Operations Data Model (PODM). PODM is an extension of the industry-standard PPDM data model. OMS supports multiple languages, currencies and units of measure, and was developed on the Microsoft .NET platform. More in next month’s Oil IT Journal and from www.petris.com.


Invensys’ Eyesim 3D simulator for Eni

Eyesim first-principles simulator and virtual reality used to train and test refinery operators.

Invensys Operations Management reports deployment of its virtual reality (VR) training system at Eni’s refining and marketing unit. Eni’s refining know-how has been incorporated into Invensys’ ‘Eyesim’ 3D simulator to improve operator safety and productivity. Eyesim ‘kiosks’ are to pilot in Eni’s Gela refinery on the southern coast of Sicily before deployment in other facilities worldwide.

Eyesim uses a stereoscopic headset to immerse trainees in a virtual plant, rendered at 60 frames per second. Trainees learn process operations and procedures from an interactive tutorial and the system tracks performance. Eyesim leverages first-principles simulation and augmented reality in an environment that applies ‘gaming and other skill sets familiar to younger employees.’ The system also covers computer-based maintenance and document management. More from www.invensys.com.


65% of North Sea fields to be abandoned in next decade

Douglas Westwood, Deloitte and Venture look into the upcoming decommissioning bonanza.

Douglas Westwood and Deloitte have authored The UKCS Offshore Decommissioning Report 2010-2040* (UK-ODR). In the next decade, 65% of the UK’s producing fields will be decommissioned, including 45% of platforms and two thirds of all subsea units. Looking out 30 years, over 250 platforms will be wholly or partly deconstructed. Some 5,000 surface and subsea wells and 15,000 km of pipeline will be removed before 2040. Cost estimates for the massive clean-up range up to $30 billion. Assuming, that is, that sufficient engineering capacity, expertise (and funds?) can be found.

Deloitte’s PetroScope economic modeling system was used to provide the economic analyses in UK-ODR. The report includes models of field numbers and facilities, economic limit (the point at which production no longer covers operating costs) as well as incremental and cumulative abandonment expenditure (Abandex). The report envisages a ‘step change’ scenario with the development of a new class of super heavy lift vessels capable of a 15,000-ton-plus lift. Regulatory processes and key technologies are also investigated, along with an overview of completed, planned and in-progress decommissioning projects and detailed case studies.
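
As a back-of-envelope illustration of the economic limit concept, the Python sketch below finds the first year in which revenue from an invented production profile no longer covers fixed operating costs, the point at which abandonment expenditure falls due. All figures are made up for the example.

    def economic_limit_year(production, price_per_kboe, opex_per_year):
        # Return the first year index where revenue falls below opex.
        for year, volume in enumerate(production):
            if volume * price_per_kboe < opex_per_year:
                return year
        return None

    profile = [1000, 800, 600, 400, 250, 150, 80]  # kboe/year, invented
    limit = economic_limit_year(profile, price_per_kboe=60_000,
                                opex_per_year=20_000_000)
    print(f'Economic limit reached in year {limit}')  # year 4 here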

Venture Information Management blogger Mike Grant recently looked at another facet of the big decommissioning. The UK government’s Health and Safety Executive issued ‘Key Programme 4’ earlier this year, focusing on the ageing and life extension of offshore installations. KP4 addresses concerns over the safety and integrity of offshore installations and sets out a variety of requirements for staff competency, management responsibilities and measures to ensure ‘complete and correct information and documentation’ at all stages of the process. A lack of information, meaning that the Duty Holder is unable to satisfy the regulatory requirements, can lead to shutdown. Venture recommends a ‘health check’ of corporate documentation for KP4 compliance.

* www.oilit.com/links/1011_1

