July-August 2012


BPM & EA in oil and gas

ConocoPhillips’ APQC-based business process hierarchy. Shell moves from Excel to a corporate information factory under OGEAP, the ‘onshore growth enterprise architecture program.’

Speaking at the combined IRM-UK Enterprise Architecture/Business Process Management conference in London last month, Matre Nyberg described how ConocoPhillips Norway uses BPM to ‘support operations, promote corporate harmonization and global excellence.’ CPN has developed a four-level process hierarchy, inspired by the APQC*’s upstream industry framework. This is embedded in a custom Norway business unit process model template, a.k.a. a collaboration map, showing process ‘swim lanes’ from different perspectives, along with related applications and procedures. The templates are color coded to show interactions between different organizational groups and provide drill down into the process hierarchy.

CPN is now deploying the process template across all business and IT improvement projects. Nyberg observed that resistance to change can be reduced by involving the right people, but there is no need to ‘sell’ BPM, just to tell folks, ‘this is your job.’ Users can improve their own work processes by participating in the BPM network and managers now have a better view and understanding of their responsibilities and how these relate to other processes. The ISO 9004-compliant system is used for, inter alia, emergency preparedness documentation and training on the greater Ekofisk area.

In a double slot presentation, Shell’s Dan Jeavons and Kathy Young described the BPM-based approach behind Shell’s onshore growth enterprise architecture program (OGEAP). Enterprise architecture is taking hold in Shell where reusable reference architectures are being used to develop architectures for specific operational segments. Shell is also moving from compliance-oriented policy documentation to a more dynamic approach involving BPM, lean methods, Kaizen workshops and more.

OGEAP, a five year program, sets out to wean Shell’s users from plethoric Excel spreadsheets by delivering a trusted data warehouse. The project leverages IP from the Shell/ExxonMobil Aera joint venture and promises a Bill Inmon-inspired ‘corporate information factory’ by 2014. Much thought has been given to governance, with an elaborate structure of process councils and a plethora of swim lanes and other diagrams. The starting point was the Upstream Americas process model, now being adapted for deployment at the corporate level. More from IRM-UK.

* American Productivity and Quality Center.


Wellhead bundle

GE’s ‘pre-engineered’ wellhead production control system offers ‘easy-to-deploy’ safety systems, monitoring and control for harsh environments.

GE is to leverage its ‘deep domain expertise’ in the oil and gas industry with a ‘pre-engineered’ ruggedized onshore production wellhead bundle including integrated control and safety systems. The system, designed for harsh environments, provides single point consolidation of fire and gas detection, as well as emergency shutdown. The bundle enables rapid post-discovery production startup. The system was also developed in response to regulatory compliance requirements, maintenance costs and safety concerns, according to GE oil and gas tsar Mayank Mehta.

The production wellhead includes a hardened RTU* platform and plug and play, hot-standby and ‘non-duplicative’ redundant systems. Safety diagnostics include system validation and security. The system has TUV SIL2 safety certification for emergency shutdown and fire and gas detection applications. The controller is intrinsically safe for Zone 1 and Zone 2 hazardous areas. With less than five watts of power consumption per controller, the solution minimizes battery and solar panel infrastructure installation and maintenance costs. More from GE.

* Remote terminal unit.


Olympic Games ambush marketing special

Editor Neil McNaughton goes ‘off topic’ to unveil a new sporty website, harangue readers to get fit and boast about his sporting under-performance. Anyone up for a run at ECIM?

It’s a few years, since October 2008 actually, since I last abused my editorial position and went completely off topic. Back then I wrote on ‘Gas guzzling, CO2, horsepower and ‘green’ ...’ And I am still interested in feedback and ideas on the unlikely contribution of ‘air geothermal’ heating and cooling to the world’s energy mix as described in another off-topic editorial, ‘Heat pumps, phlogiston and the world wide web’ (March 2008).

With the holiday break imminent (Oil IT Journal takes a very French month off in August), I thought that I should indulge myself again and also do myself a favor with an unashamedly dual-use editorial. Let me explain.

I am something of a sport addict and, as you may have observed, a writer. Like many folks in oil and gas, having lived through a few downturns and changed companies and countries a few times, I am also in the position of not being 100% sure how my retirement will be funded. So I thought that, as a kind of insurance policy, I would bag a website, OldSports.com, that would allow me to combine writing, sports and old age. This seemed to me, when I registered the domain name, like a good idea.

Well, that is as far as I got. The website has been displaying a ‘welcome’ message and a promise that content would be forthcoming, ‘real soon now!’ This is in part due to my other strategy for retirement which is not retiring—meaning that I have as yet no need to seek to ‘monetize’ OldSports.com!

But right now I have a more pressing need—filling in two more columns of editorial before midday. Also, the Olympic Games are about to kick off in London today, offering an excellent opportunity for some illegal ambush marketing. The use of the words ‘Olympic’ and ‘Games’ in any context except for marketing hamburgers and soda is, apparently, a felony. Even the cycling arena has been dubbed the ‘Pringle’ stadium!

Much as I enjoy sitting in front of the TV watching others leap, sprint and pedal, it has to be said that there is something wrong with this as a sporting paradigm—as in, many older unfit viewers watching a super-fit few.

The situation at the other end of the age spectrum is not much better, as the number of people participating in sport has an inverse relationship with age. Many sports clubs act as a filter on young sportspeople’s progress as they seek to identify champions and eliminate the laggards. Things may be different elsewhere, but in my experience as a parent, kids’ sports clubs singularly fail to inspire their young members to do something sporty for pleasure and for their own well being. This is compounded by the ‘Olympian’ spirit of some parents who see in their offspring a future medalist and, in the referee, an idiotic impediment to such.

When the youngster inevitably fails to reach the Wimbledon final or whatever the objective was, disillusion sets in. This will likely coincide with an adolescent desire to do other stuff, like watch TV, or not wash for a week, and sport is forgotten.

In middle age, far too many folks just drift away from sports, and engage in less and less physical activity of any sort—well almost. And of course, as you lose fitness, you can only do less and less. It is a vicious circle.

A recent report in The Lancet described the ‘pandemic of physical inactivity’ as the ‘fourth leading’ cause of death worldwide. Well you may say that the ‘fourth leading’ cause of death is probably not worth worrying about and I would agree. But there is a much better reason to get fit and that is the effect it has on one’s mental well being.

In the rather distant past I myself suffered from depression. But not since I took up running (and later biking) about thirty years ago. Every now and then I still feel the blues creeping back. But I now have a solution. I put my shorts on, go off for an hour or so of running and the endorphins do the rest.

Since one’s mental well being is pretty important to one’s professional life, sport helps one at work too. I like to think that at least some of the increased blood flow is getting to the brain and staving off age-related doltishness! I’m sure that it makes sitting in the plane for a 10 hour trip to Houston more bearable. And I also suffer less from colds and flu. These may be replaced by the odd bout of tendinitis or other aches and pains. But my take on such ailments is that they are too often taken as a pretext to quit. The medical profession does not always help here. Some doctors seem to think that older folks should avoid ‘too strenuous’ activity.

One of the reasons that I have not yet done anything about OldSports.com is that blogging about one’s life, sporting or otherwise, is awfully close to boasting. Well, the internet is full of bloggers ‘boasting’ about what they have done and I for one enjoy reading of their exploits.

So here goes. I run something like 2,000 km per year and bike about the same. This year I turned in a sluggish 4:42 for the Paris Marathon. I also completed my seventh ‘Etape du Tour’ (0304) a popular bike race where around 6,000 cyclists ride one of the harder stages of the Tour de France. This year it was from Albertville to La Toussuire—a ‘short’ stage (150km) but with a lot of climbing (4750 meters). I finished 4301st out of the 4400 qualifiers. You can watch me finish in style here. In 2010 I actually came in last— 6888th out of a field of 6888 and was one of two people referred to in L’Equipe, the French national sports newspaper (OK they named the guy who won and I was just ‘the guy who came last’ but what the heck.)

So what is all this boasting about coming last and running slow? Well, in a way that is the whole point. It’s better to be out there coming in last than to be inside watching someone else come first, isn’t it?

If any of you are attending the ECIM* E&P IM Conference and User Meeting in Haugesund, Norway on 10-12 September, I invite you to join me on a slow run through the lovely Steinsfjellet park just outside the city center. Ping me on neil@oilit.com and bring your trainers.

* Expert Center for Information Management.

Follow @neilmcn

Phil Crouse 1951—2012

A tribute to Phil Crouse who died this month. PNEC Petroleum Data Integration conference to continue—with the 17th edition scheduled for May 2013 in Houston.

On the 10th of July 2012, conference organizer extraordinaire Phil Crouse died, as he would have wanted, ‘in harness,’ while working on the next edition of his industry-leading PNEC Petroleum Data Integration conference. The Dallas Morning News obituary traces Phil’s remarkable career—from roustabout in West Texas to the youngest member, at age 28, of the Conference Board congressional fellow program. In 1979, as a senior staffer on the US Senate Committee on the Budget, Phil was responsible for a $5 billion segment of the budget. In 1990, he established the Petroleum Network Education Conferences (PNEC) as a consultancy, training and conference organizing body.

Phil was a big guy and a larger than life character. His interest in the industry at large came across loud and clear as he would hold forth on subjects from politics, through IT and data, to the latest developments in non conventionals. Phil was also a man of action. When he spotted a promising novelty (horizontal drilling, coiled tubing, geosteering and, later, data management) he would organize a conference or course to educate and promote the new technology.

One interesting facet of Phil’s work with the PNEC Conferences organization is the way he raised the game in conference quality and showed up some of the more established ‘not for profit’ conferences as rather second rate affairs. Phil was a stickler for doing it right—striking a balance between overt commercials and informative content. He was very demanding of authors, who were required to submit both slides and papers ahead of time—no showing up and arm-waving at PNEC! His rigor led to some company-making presentations, notably in the field of data quality, where the combination of papers from a vendor and satisfied clients in major oils was pretty compelling stuff and showed that marketing IT could be an intelligent, hype-free process.

PNEC and Oil IT Journal—then called Petroleum Data Manager—started out at about the same time in the mid 1990s and we shared an interest in promoting the embryonic subject of E&P data management. For those who would like to learn more about PNEC, we ran a three-part ‘history’ last year in our May, June and July issues. One highlight of PNEC was Phil’s 2006 ‘scoop’ when he persuaded Nancy Stewart, Wal-Mart’s CIO, to ‘tell all’ in what was a real technological eye opener. Attendees will also remember how he livened up a rather dry subject with his drawings for Dilbert book prizes and those infernal wire puzzles.

Phil’s spouse Cindy Crouse and the PNEC team are to carry on with the 17th annual event, which will be held in May 2013 in Houston. You can donate to Phil’s preferred charities and sign the online guest book.


Book review—Hadoop: The Definitive Guide

Tom White’s guide to Hadoop from O’Reilly, now in its third edition, is a 650-page introduction to the deployment of ‘big data’ applications. But what is Hadoop? What’s in it for oil and gas?

Hadoop is one of those IT things that the more you hear about, the less you understand—where hype blurs reality. We decided to go to the horse’s mouth, with a quick spin through Tom White’s ‘Hadoop: The Definitive Guide’ (HTDG). Well, a quick spin is not quite right for a 650 page manual—but here goes.

Hadoop is a data storage and analysis platform that was developed for ‘big data’ such as that generated by web traffic to Google and Yahoo, where the tool originated. Analyzing multi terabyte log files was proving hard with the technology of the day for several reasons. The relational database (briefly re-baptized by White, sans explanation, as the ‘rational’ database) requires data to be read in a random fashion from many disk locations. Such ‘seek’ activity is slow. It is better to stream data from disk, leveraging all available bandwidth. Hadoop was also designed to be deployed and scale across the massive commodity clusters used by Yahoo and Google and to tolerate hardware failure—which, when you have 100,000 nodes, is a relatively frequent event. Hadoop adopts a write once, read many approach to minimize data movement and is said to be suited to processing large, unstructured data sets. Interestingly for geophysicists, Hadoop confronts the same problem set as high performance computing environments, i.e. network bandwidth. Hadoop is said to offer a simpler programming model than MPI. Hadoop’s design means that if a problem is running slow, you just add more nodes to the cluster. Hadoop is inherently parallel.

But how does it work? In chapter 2, HTDG walks through a typically Hadoop-esque problem—analyzing a data set from the US National Climatic Data Center. Ascii data is first analyzed with a typical Unix approach using awk, taking 42 minutes on a single EC2 instance. To speed things up, parallelization is required, but at the expense of considerable programmer effort. Enter MapReduce—the essence of Hadoop. MapReduce is a little hard to grasp although the explanation appears simple enough. Like special relativity, one would not want to be tested on it after a first read. But to cut to the chase, the Hadoop program scales transparently and ran in six minutes on 10 EC2 nodes.
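For a flavor of the programming model, here is a minimal sketch of the book’s max-temperature example recast as a Hadoop Streaming job in Python. HTDG’s own code is Java; treat the column offsets and quality codes below as illustrative approximations of the NCDC record format.

    #!/usr/bin/env python
    # Max-temperature example in the spirit of HTDG chapter 2, written as a
    # Hadoop Streaming mapper/reducer pair. The fixed-column offsets and
    # quality codes are illustrative approximations of the NCDC layout.
    import sys

    def mapper(stream):
        for line in stream:
            year, temp, q = line[15:19], line[87:92], line[92:93]
            if temp != "+9999" and q in "01459":   # skip missing/suspect readings
                print("%s\t%d" % (year, int(temp)))

    def reducer(stream):
        last_year, max_temp = None, None
        for line in stream:
            year, temp = line.strip().split("\t")
            temp = int(temp)
            if year != last_year and last_year is not None:
                print("%s\t%d" % (last_year, max_temp))  # keys arrive sorted
            max_temp = temp if year != last_year else max(max_temp, temp)
            last_year = year
        if last_year is not None:
            print("%s\t%d" % (last_year, max_temp))

    if __name__ == "__main__":
        (mapper if sys.argv[1] == "map" else reducer)(sys.stdin)

Locally, the pipeline can be smoke-tested with ‘cat sample.txt | python maxtemp.py map | sort | python maxtemp.py reduce’; under Hadoop Streaming the same two commands become the -mapper and -reducer arguments and the framework handles the sort and the scale-out.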

We’ll skip the next 600 pages, replete with code and Hadoop-derived projects, to check out what Hadoop is used for. A contribution from Facebook outlines the use of Hive, an open source data warehouse and SQL front end to Hadoop which has displaced Oracle for some tasks. Hive allows SQL programmers to play with Hadoop without the ‘complexity of MapReduce.’

So what is Hadoop’s potential in oil and gas? It is tempting to see application in seismic processing (lots of sorting there) and maybe in real time data. The question is who is going to do the MapReduce for seismics and will the results better the years of effort already spent on these problems by the imaging community? Speaking at the EAGE workshop on open source software in geophysics, BG Group’s Chris Jones made a passing reference to running Seismic Un*x on Hadoop (page 7). Maybe we’re onto something.


MVE announces Move on Mac

Geological modeling flagship port to Mac OS. iPad, Android versions ‘real soon now.’

Midland Valley Exploration (MVE) is porting its Move geological structural modeling flagship to 64-bit editions of Mac OS X 10.6 and above. The Mac port will be available in the 2013 Move release and will support all modeling modules, FieldMove and MoveViewer. Move 2013 will continue to operate on 32-bit and 64-bit Windows Vista and Windows 7, and on Red Hat Enterprise Linux 5 and 6 (64-bit only).

Move chief engineer Mike Krus observed, ‘Adding the Mac to our supported platforms provides another option for our clients in industry and academia to utilise the hardware available to them. Long-term it represents the beginning of our software development plans to introduce new products on platforms including iPad and Android.’ The Mac port was facilitated by MVE’s use of the Qt cross platform GUI development environment. More from MVE.


Ryder Scott revamps freeware lineup

Consultancy releases plethoric reservoir engineering tools for Microsoft Excel.

Ryder Scott has released new, Microsoft Office 2010-compatible versions of its Reservoir Solutions freeware. These are compatible with all versions of Excel since Excel 97 and now include native file formats for Excel 2007/2010, including 64-bit versions.

The tools cover a wide range of engineering applications. RscCBM computes coalbed methane volumetrics. TruVert 2-D calculates true vertical thickness and net pay in deviated wellbores that penetrate dipping reservoirs. RyVOL is a menu-driven program for calculating volumetric in-place and recoverable reserves.

The Reservoir Solutions Module addresses common calculations such as pseudocritical properties, compressibilities and formation volume factors. The QuickLook economics evaluation software computes screening economics for prospects, workover evaluations and preliminary lending economics.

The Material Balance application calculates original gas in place (OGIP), estimated ultimate recovery (EUR), BHP/Z vs. cumulative gas production and Tc and Pc properties from gas gravity while adjusting for contaminants.
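The BHP/Z plot rests on the volumetric gas material balance, p/Z = (pi/Zi)(1 - Gp/G): plotted against cumulative production Gp, p/Z falls along a straight line whose x-intercept is the OGIP. A back-of-envelope sketch of the calculation in Python (illustrative numbers, not Ryder Scott’s implementation):

    # Volumetric dry-gas material balance: fit p/Z vs cumulative production
    # Gp to a straight line; the x-intercept gives OGIP and the abandonment
    # p/Z gives EUR. All numbers are invented for illustration.
    import numpy as np

    Gp = np.array([0.0, 10.0, 25.0, 40.0])                 # cumulative gas, Bcf
    p_over_z = np.array([5000.0, 4400.0, 3500.0, 2600.0])  # psia, from BHP surveys

    slope, intercept = np.polyfit(Gp, p_over_z, 1)
    ogip = -intercept / slope                  # Gp at which p/Z reaches zero
    pz_abandonment = 1000.0                    # assumed abandonment p/Z, psia
    eur = (pz_abandonment - intercept) / slope

    print("OGIP ~ %.0f Bcf, EUR ~ %.0f Bcf" % (ogip, eur))  # ~83 and ~67 Bcf here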

The Flowing Pressure Analysis program evaluates the performance of producing gas or gas-injection wells. LogWizard offers density/neutron or sonic log analysis and RamBal performs material balance calculations, especially for abnormally pressured, unconventional gas fields.

Ryder Scott estimates the user base of its engineering and geoscience applications at around 10,000 users in over 80 countries. Download the tools from Ryder Scott.


Shell sponsors PetrisWinds Recall nomenclature

Petris Portal clients get access to time-saving curve data dictionary generator.

Users of the Petris Portal (0801) can now access a PetrisWinds Recall curve data dictionary generator whose development was sponsored by Shell. The generator saves users from ‘laborious’ manual editing, providing definitions for core curves, zonal parameters and more.

Users can download formatted spreadsheets populated with curve definitions. Following review, existing Recall data can be checked against the definitions. The system provides color-coded traffic lights of data quality along with specific details of warnings and failures. If the target Recall system uses custom primary keys, users can reset validation criteria according to in-house rules. The dictionary is also available from Petris via email. More from Petris.


PPDM Denver meet

Compliance test questioned. ‘Ego’ management. PPDM for facilities? Surveying the Barnett Shale.

EnergyIQ’s Steve Cooper offered an interesting slant on data quality and PPDM’s business rules at the Denver chapter’s meet earlier this year. Cooper questions the rationale of current ‘table-based’ compliance tests. Table-based compliance offers ‘slim to no possibility of multiple software applications having update interoperability on the database.’ This is because the complex PPDM database requires a methodical approach to data population*. Cooper suggests that the development of a common object model atop the PPDM database should be a priority for the Association.

You have often heard calls for ‘collaboration’ in various contexts. But what happens when war breaks out between the collaborators? Neuralog’s Tarun Chandrasekhar spoke about ‘Ego management in data management’ and ‘how to please everyone and live happily ever after.’ The answer, it would seem, is to try to keep as many ‘collaborators’ out of the data loop as possible and to make life easy for those remaining.

Wes Baird, speaking on behalf of Emile Coetzer (Axioma Asset Engineering), offered what some might perceive as contentious advocacy for PPDM to pitch into the facilities engineering model space, which is already rather full with representations from Fiatech, Mimosa and POSC/Caesar.

Jan Van Sickle of Downtown Design Services offered an interesting and very thorough presentation on the complexity of survey data management. This is coming to the fore in high intensity environments such as factory drilling in the Barnett Shale. Van Sickle’s tools of the trade include Google Earth (with Arc2Earth), the Pipeline Open Data Standard (PODS) and DDS’ Tract Manager Xtreme database. Download the presentations from the PPDM website.

* A similar observation led both Schlumberger and Landmark to recommend API-based access only to their data stores.


Software, hardware short takes

Geomodeling Technology, ESRI, Cutting Edge, GE, Kepware, Kongsberg Oil & Gas Technologies, New Century, Onset Computer, dGB, Petrosys, Caesar Systems, Software Toolbox.

Geomodeling Technology’s 7.3 release of AttributeStudio adds new features for fractured reservoir characterization from AVAZ, VVAZ, curvature and other directional attributes.

The ESRI/Cutting Edge data appliance for ArcGIS comes in Linux and Windows flavors and offers a secure, in-house equivalent to ArcGIS Online. The hardware bundle includes imagery and vector data that can be augmented with proprietary content.

GE’s ProficySCADA suite is now available on the Apple iPad—providing mobile workers with a window into the control room.

The 5.9 release of Kepware’s KEPServerEX offers upstream oil and gas users ‘affordable’ support for electronic flow measurement. Kepware interfaces with Flow-Cal and PGAS for custody transfer and includes a WITS Level 0 OPC server for real-time drilling data.

Kongsberg Oil & Gas Technologies’ SiteCom 9.3 introduces a new Discovery Mobile module feeding real time drilling data to the iPad and, real soon now, other mobile devices.

Express Loader from New Century Software provides bulk data loading to a PODS database.

A new hybrid wellhead outlet from AnTech provides fibre, electric and hydraulic connectivity for high pressure, high temperature wells. The Type-C4 is designed for use in permanent completions.

Onset Computer’s Hobo data loggers are used to gather baseline water quality data for natural gas fracking operations.

The 4.4 release of dGB’s OpendTect includes new seismic net pay and seismic feature enhancement plug-ins from Ark CLS. CLAS, a new module from Geoinfo, offers open hole log analysis, ‘integrating petrophysics and geophysics.’

Petrosys V17.2 introduces a new ‘Exchange’ feature for data transfer between third-party systems. Transferred objects can be renamed en route and jobs saved for subsequent replay.

Caesar Systems’ PetroVR 2012 offers improved visual analytics, easier update deployment and more flexible time-based display and data merge.

The 5.9 release of Software Toolbox’s TOP Server introduces electronic flow measurement, better diagnostics and new midstream functionality. TOP Server is powered by Kepware (see above).


USPI-NL Annual Management Board meeting

Shell’s engineering information management on Kashagan. EU-backed maintenance knowledge management project. AixCAPE/Bayer—ISO 15926 ‘viable technology, but technically complex.’

Speaking at a meeting of the USPI-NL plant standards body in The Netherlands earlier this year, Shell Global Solutions’ Jason Roberts provided an insight into Shell’s current engineering information management (EIM) with a case history of engineering handover on the giant Kashagan development in Kazakhstan. EIM enables project information workflows and also consolidates asset information from multiple repositories such as SAP, Aspen HYSYS, Aveva PDMS and many more. But the real drivers for EIM are time (a week saved on Kashagan equates to around $35 million) and asset integrity. Because asset integrity is a life-of-facility issue, so too is EIM, which spans design, build, handover, operations and maintenance. Kashagan’s build involved operations around the globe, from Aberdeen to Yokogawa. These produced millions of equipment tags and engineering documents. Great consideration was given to exactly what was required from vendors and to the format of information delivered.

Shell’s standardization effort centers on the Shell reference data library (RDL), a web based application holding engineering objects, relations, properties and reference data. The application allows stakeholders to drill down through the engineering hierarchy, cloning and modifying existing objects to suit requirements and output specifications in Excel or Word for use by members of the front-end engineering design team.

Bas Kimpel (Momentive Specialties) described progress on the EU-backed maintenance knowledge management (MKM) project whose goal is to publish EU guidelines for consistent, complete and timely information required for maintenance activities. The idea is to capture maintenance knowledge into modules so that maintenance is ‘less dependent on the individual carrying out the work.’ A tested knowledge base will be made available to companies in the process industry supply chain. Modules will be available under NEN/CEN/ISO and will be maintained by industry as required—there should be ‘no need to re-invent these all the time.’ The project interviewed maintenance specialists in Shell, SABIC, BP and Dow and has established minimum information, knowledge and competence requirements for use cases such as repairing a process control valve. The study has established that it is possible to define a common repair methodology that can be made available as a standard process. Work orders are likewise amenable to standardization—preferably using a common maintenance dictionary. The work embeds the Orchid plant engineering information standard and will ultimately inform PAS 55/ISO 55000 maintenance standards. Visit the embryonic MyMaintenancePortal (in Dutch).

Manfred Theissen introduced the Aix la Chapelle (Aachen) computer aided process engineering organization AixCAPE (1102), an ‘application-oriented platform for research transfer.’ AixCAPE member Bayer has been testing the ISO 15926 plant information standard for information exchange across tools and companies. Bayer concluded that ‘ISO 15926 is a viable technology, but technically complex.’ Today the project has achieved ‘lab scale status’ and more work is required for use at scale. A common understanding of modeling issues has yet to be reached to avoid ‘starting from scratch every time.’ ISO 15926 needs more take-up in commercial tools and better support from CAE providers. Read the presentations on USPI-NL.


EAGE 2012, Copenhagen

Tullow’s E&P ‘boldness.’ Joanneum’s seismic attribute database. Techlog on cement evaluation. Unicamp’s control valve economics. Intelligent completions data management. Maersk’s ‘3D Close the Loop.’ Statoil’s model building framework. Computer-aided interpretation. Dinosaur seismics. DecisionSpace Desktop revisited. Petrel 2012 and the Studio knowledge database.

In the special ‘Boldness in E&P’ session, Tullow Oil Uganda’s Shane Cowley stated that Tullow’s fundamental premise is that ‘teamwork is the key.’ A well drilled in the 1930s found reservoir, seal, shale and shows. But nobody believed that the threshold could be met. A World Bank funded airborne magnetic survey proved a significant deep basin. This was later refined with an airborne full tensor gravity gradiometry survey that ‘identified most structures we have now drilled.’ The now proven theme of hanging wall anticlines (pizza slice plays) is ‘a somewhat unconventional play’ of sediment against basement. This has been extended into Ethiopia and Kenya where Tullow has one major discovery and ‘an acreage position the size of England.’

Austria-based Joanneum Research Institute’s Johannes Amtmann is building a seismic attribute database for effective literature research. The project compiled attributes described in interpretation software documentation (CGG/Geovation, Kingdom, OpendTect, Petrel, ProMAX…) and publications; these have been classified and stored in a database. The initial work was delivered to OMV as a Microsoft Access database—now migrating to a ‘Seismic Attribute’ web resource.

A Schlumberger Techlog booth presentation called for more rigorous cement evaluation in deep offshore drilling. Deepwater wells with thick casing and lightweight mud present problems for conventional cement bond logging (CBL) tools. A new flexural attenuation tool and specialist processing can sense through cement to the formation. When combined with ultrasonic acoustic impedance, a clearer identification of top of cement is available, along with potential channeling issues and free pipe. The presenters suggested that the API RP96-B3 cement evaluation, which relies on subjective evaluation, could be beefed up.

Denis Schiozer of Unicamp, Brazil has been investigating production optimization and the economics of high-end inflow control valves. A literature review of intelligent wells and optimization revealed some unfair comparisons and confusing terminology. A simple reservoir simulation model was developed to check on the viability of inflow control valves (ICV) and simpler on/off valves. An artificial intelligence method used a genetic algorithm to generate hundreds of cases and compare net present value (NPV) for different oil prices and water disposal costs. This gave some unexpected results. Not all ICV deployments pay back their cost. Several cases have the same NPV with different total production. Schiozer concluded that NPV optimization is a complex process.
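For a feel of the approach, the toy sketch below runs a genetic algorithm over on/off valve settings and scores each case with a crude NPV. The decline, water-cut growth and economics are all invented for illustration; the study used a real reservoir simulator.

    # Toy genetic algorithm over on/off valve settings scored by a crude NPV.
    # The production/cost model and all economics are invented for illustration.
    import random

    N_VALVES, POP, GENS = 6, 30, 40
    OIL_PRICE, WATER_COST, VALVE_COST, DISCOUNT = 80.0, 8.0, 2e6, 0.1

    def npv(settings):
        total = -VALVE_COST * sum(settings)      # up-front cost per open valve
        for year in range(10):
            oil = sum(60e3 / (1 + i) * s * 0.85 ** year
                      for i, s in enumerate(settings))    # bbl, declining
            water = sum(15e3 * (1 + i) * s * 1.10 ** year
                        for i, s in enumerate(settings))  # bbl, rising
            total += (oil * OIL_PRICE - water * WATER_COST) / (1 + DISCOUNT) ** year
        return total

    def evolve():
        pop = [[random.randint(0, 1) for _ in range(N_VALVES)] for _ in range(POP)]
        for _ in range(GENS):
            pop.sort(key=npv, reverse=True)
            parents = pop[:POP // 2]             # truncation selection
            children = []
            while len(parents) + len(children) < POP:
                a, b = random.sample(parents, 2)
                cut = random.randrange(1, N_VALVES)
                child = a[:cut] + b[cut:]        # one-point crossover
                if random.random() < 0.1:        # mutation: flip one valve
                    child[random.randrange(N_VALVES)] ^= 1
                children.append(child)
            pop = parents + children
        return max(pop, key=npv)

    best = evolve()
    print("best settings:", best, "NPV: $%.1f million" % (npv(best) / 1e6))

Even with these made-up numbers the high water-cut valves never pay back their cost, echoing the paper’s conclusion that not all valve deployments are worth it.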

Clifford Allen (Halliburton) is working on a data system for the management of intelligent completions. The idea is to combine Scada data with flow assurance, decline analysis, allocation and well test data in a hierarchical data structure. While conventional Scada devices have a relatively standard interface, downhole instrumentation is more complex. Fields now have two or more sets of RTU/PLCs and dual Scada systems. Polling devices across these systems is hard, so data usually goes to the historian. But even here, different time stamps make it hard to get data into enterprise reporting.

Halliburton’s answer is its Asset Optimization Service (AOS), an ‘open source’ system that handles all data interfaces and adds ‘complex algorithms.’ A screen mimics the well from BHA to surface. Data hooks connect to third-party devices for hydraulic control. AOS, sold as a service offering, can optimize production leveraging all the available data. In the Q&A, Allen was asked about the status of the IWIS downhole data standard which was supposed to address such issues—he responded that while Petrobras was pushing strongly for IWIS, it is ‘hard to see who will pay for such a fundamental shift.’

Mosab Nasser (Maersk Oil) is critical of seismic inversion models that are ‘thrown over’ to interpreters and geologists who often don’t understand them. Nasser advocates using rock physics to build the reservoir model. A case study of a West Africa submarine fan involved multiple possible geological scenarios. These were triaged using seismic amplitudes and rock physics. The ‘3D close the loop’ process involves rigorous rock physics modeling of the reservoir and forward modeling to a synthetic seismic section. This is compared with the 3D data and differences explained in terms of fluid content. The ‘3D CtL’ process is run from Schlumberger Petrel along with an in-house developed ‘Mod2Seis’ that computes the elastic response. Iteration aligns the model with the data—but, Nasser warns, ‘it may not be right!’ The technique has led Maersk to change its geological concepts—losing channeling and gaining faults. Another caveat is that the conditions need to be right—these techniques ‘would not work in the pre-salt.’

Another model building workflow was presented by Statoil’s Xavier van Lanen and Jan-Arild Skjervheim. The idea is for an automated process from geomodeling through to simulation that can be updated when new data arrives. The model is conditioned to seismics and represents and propagates uncertainty. The process spins off multiple fluid flow simulations for history matching which then update the model. Simulation, point estimate and ensemble methods are all run under a workflow manager. The output is forecast production profiles. Subsequent seismic and well data can be used to update depth surfaces and faults. The ‘base case’ model runs on a PC which fires off multiple realizations on the cluster. The workflow manager is used to test sensitivities and scenarios with ‘smart’ workflows. The process can be very compute intensive—for example, 28 CPU hours per realization for a 20 million cell geo grid/4 million cell sim grid. But the results are worth it, showing the effect of structural uncertainty on production profiles. The automated workflow allows Statoil to pursue otherwise unachievable analysis of a field. Whole loop workflows towards an assisted history match and back to the structural model are now feasible and the prior model can be compared with the update and against production.

Several papers addressed computer-aided interpretation. Eric Suter of Norway’s IRIS Research Institute has developed a geometrical transform that ‘stretches and squeezes’ internal bed geometries at will while dissociating rock properties from bed geometry. A geometrical transformation acts on formation boundaries (no grids involved) with intelligent linking across faults. An automated fault update facility inserts new faults, avoiding a lengthy manual update. The idea is an evergreen earth model while drilling. The absence of grids means that complex multi-z surfaces such as recumbent folds can be modeled with properties intact. In the Q&A, Jean-Laurent Mallet noted an apparent similarity between this technique and his own UVT transform as used in Paradigm’s Skua.

Steen Agerlin Petersen (Statoil/TUD) wants to go back in time to when the dinosaurs were around. Current static seismic cube interpretation works, but does not use all the geological processes and leaves a lot of extra information in the seismic data untapped. Enter ‘earth recursion,’ or data restoration/model restoration (DR/MR). DR/MR moves back and forward in time—laying and eroding sedimentary units, adding diagenesis—to end up with what is observed in the seismics. The DR/MR modeler can fault and fold beds and switch between reflectivity and seismics. This leads to dual ‘contexts,’ reality and simulation, that develop over time and intersect at the present, where the model meets reality in the form of rock properties, seismics and logs. Petersen is now working to extend the method to planning and flow simulation. A North Sea example showed how interpretation was facilitated by going back in time and interpreting seismics ‘as the dinosaurs would have seen them!’ DR/MR is claimed to be a great integration workflow for people and disciplines.

On the exhibition floor we saw the latest release of Landmark’s DecisionSpace Desktop with a rather compelling capability for planning and executing multiple horizontal wells for non conventional development. The idea is to move from a blind ‘factory drilling’ concept to a more adaptable approach, targeting shale sweet spots with a holistic integration of all available information—from real time data to GeoEye satellite imagery. For Landmark, this means spanning the traditional Engineering Data Model/OpenWorks divide. These two data sources will soon share a logical data model, currently being developed in an internal project named ‘Common Ground.’

Schlumberger’s Petrel 2012 demos were spectacularly well attended. Petrel 2012 includes just about anything you could think of—with new functionality integrating seismic processing with interpretation for ‘seismic-driven reservoir modeling.’ Even the Schlumberger/Chevron developed ‘Intersect’ high-end reservoir flow simulator is now accessible from Petrel. All of the above and more is now tied together with Petrel Studio—with data stored in a ‘Studio Knowledge Database.’ Schlumberger’s GUI specialists are working on an Office 2010-style ‘ribbon’ interface for the next major release.


Open Source Workshop

BP recap of open source software in E&P. Geophysics and reproducible research. ‘SeaSeis’ a seismic workhorse. JavaSeis at ConocoPhillips. BG Group ‘uses and supports open source.’

The EAGE-sponsored workshop on open source E&P software was a reprise of an event held six years earlier. BP’s Joe Dellinger set the scene with a recap of the open source movement in general and of Unix-based seismic systems including SEPlib, Seismic Un*x (SU) from the Colorado School of Mines and Amoco’s USP. The ‘reproducible research’ movement originated at Stanford, where the inclusion of reproducible code is now mandatory. The 2000s saw a wave of oil company mergers and a software shake out—it was easy to leave with software like FreeUSP, FreeDDS or Amoco’s synthetic seismic data sets. The last decade saw the development of CPSeis from ConocoPhillips and JavaSeis. Madagascar was also released and reproducibility made easier, while Python based processing systems proliferated. dGB’s OpendTect got a mention as an open source commercial success. So what’s in store for the 2010s? GPU-based software is a flash back to earlier days of the array processor. Big data and big graphics are important trends. Companies will ‘cover walls of rooms with high resolution screens.’ And maybe open source will see the industrial strength and easy to install software that has so far been lacking.

Columbia’s Victoria Stodden described the central role of geophysics in reproducible research—thanks to John Claerbout’s influence in research and on US policy. The reproducible movement came about in response to a credibility crisis in computational science in the last century. In the June 1996 Journal of the American Statistical Association, nine out of 20 papers were computational and none included code. By 2011, 29 out of 29 articles were computational and 21% made their code publicly available. An article about computational science is not the scholarship itself, it is just an advert for the scholarship. The scholarship is the complete software development environment and the code. Why is all this important? On the one hand, computation is now a branch of science in its own right. On the other, there is the ‘ubiquity of error’ and a lot of loose thinking that tends to generate ‘breezy demos,’ not reliable knowledge. Intellectual property is an issue for open source. But much thought has been devoted to fixing this, notably with the Apache Foundation’s licensing, Creative Commons and Stodden’s own work.

Bjorn Olofsson presented SeaSeis, his open source sequential pre-stack batch seismic processing system. SeaSeis is a workhorse driven by Ascii control files, with 80 modules and a 2D viewer. The system is based on the idea that ‘simple tasks should be simple to run.’

Chuck Mosher (ConocoPhillips) spoke on parallel I/O and computing in JavaSeis. Originally developed by Arco, JavaSeis is used by ConocoPhillips and Halliburton. JavaSeis is run from the Eclipse IDE and offers Matlab integration.

Chris Jones explained that BG Group uses and supports open source seismic software because it gives BG’s seismologists the ability to do what they want—to experiment with algorithms in a scalable, flexible environment. BG deploys an ‘agile’ approach atop a huge Fortran 77 code base. OpendTect’s attribute engine, Lustre and Slurm job management are used, as are visual programming tools like Scratch and MIT’s App Inventor, and the ubiquitous Perl. Yanghua Wang’s multichannel matching pursuit algorithm has been ported to OpendTect. Other use cases include Bayesian classification of seismic facies and depth sensitivity analysis. BG uses Matlab (grads are more familiar with Matlab than Python) and for performance, Matlab can also be run on the cluster. BG is interested in running Seismic Un*x on Hadoop as it is ‘especially good for parallelization.’ I/O remains a sticking point—JavaSeis is ‘half way there.’ Service providers should provide open source solutions and more reusable code a la Matlab. More from the Workshop website.


Folks, facts, orgs ...

Allied Specialty Vehicles, MIT Energy Initiative, Bill Barrett, Ensco, Sigma3, GFZ-Potsdam, GP Strategies, Intergraph, IHS, Inova, Jefferies, Ryder Scott, KBC, Knight Oil Tools, Kongsberg, Knowledge Reservoir, Neuralog, Noah Consulting, Reservoir Group, Well Data QA, Oildex.

Allied Specialty Vehicles has appointed Don Kyle as president of its oil and gas unit. He was formerly with DynaPump.

Saudi Aramco is a founding member of the MIT Energy Initiative (MITEI), a $25 million, 5 year program. Aramco is to create a satellite R&D center in Cambridge, Massachusetts.

Bill Barrett Corporation has appointed LB Capital’s Carin Barth as an independent director.

Wood Group chairman Ian Wood is to retire in November. He is replaced by CEO Allister Langlands who in turn is replaced by Bob Keiller.

Mark Burns will become executive VP and COO of Ensco when Bill Chadwick retires in August.

Gareth Block has joined Sigma3 Integrated Reservoir Solutions as senior director of technology integration and commercialization. He hails from ExxonMobil.

GFZ-Potsdam has announced ‘GASH II America,’ a joint industry project extending the gas shales in Europe (GASH) work to North American tight oil.

GP Strategies has recruited Helmer Andersen, Ken Daycock and Mike Levesque to its energy services unit to promote its EtaPRO performance and condition monitoring system.

Welch Sun is to head up Intergraph’s new ‘Greater China’ region. The company also recently opened a newly-expanded office in Kuala Lumpur, Malaysia.

IHS has named CERA founder and author Dan Yergin as IHS Vice Chairman.

Inova Geophysical has named John Bell as senior VP business development and global sales. Bell was previously with EMC.

Brad Handler has joined Jefferies as MD and senior equity researcher for oil services and equipment. The company stresses that he is not related to Richard Handler, chairman and CEO!

John Hodgin is to leave his position as president of Ryder Scott to work for the US Securities and Exchange Commission.

Andy Howell has joined KBC and is to promote the recently acquired Infochem Multiflash technology and KBC’s own Petro-SIM process simulator.

Barret Lemaire has been promoted to director of IT at Knight Oil Tools.

Hugh Griffiths heads up Kongsberg Oil & Gas Technologies’ new office in Perth, Australia.

Knowledge Reservoir has appointed Sheldon Gorell as VP technology. He hails from Halliburton.

Texas Tech graduate Shawn Abrams has joined Neuralog as sales manager for its drum label printers.

Fred Kunzinger has joined Noah Consulting. He was previously with Hess.

Reservoir Group has appointed Simon Howes to head up its Interica data management business.

Martin Storey and associates have founded Well Data Quality Assurance.

Transzap/Oildex has appointed Michael Weiss as VP software engineering and Chris Dinkler VP sales.


Done deals

Weatherford, Aker Solutions, Saudi Aramco Energy Ventures, Argus Media, Energy Technology Ventures, Dassault Systèmes, Industrial Defender, IHI, IHS, KBC, Object Reservoir, more ...

Weatherford has acquired oil and gas completion tools specialist Petrowell.

Aker Solutions has acquired Subsea House and SSH Engineering.

Saudi Aramco Energy Ventures is a new 100% Aramco unit targeting ‘sustainable domestic energy and water consumption.’

Argus Media has acquired DeWitt & Company, a provider of market assessments and business intelligence to the petrochemical industries.

GE, NRG Energy and ConocoPhillips joint venture Energy Technology Ventures has invested in On-Ramp Wireless.

Dassault Systèmes has completed the acquisition of Vancouver-based geological modeling and simulation company Gemcom Software International for approximately $360 million cash.

Doyles has been bought by Axon Energy Products.

Drilling Info has acquired County Scans.

Furmanite has acquired ‘certain assets and operations’ of Crane Energy Flow Solutions.

Industrial Defender has acquired backup and disaster recovery specialist Fandotech.

IHI Corporation is to acquire Kvaerner Americas’ operations and the EPC Center Houston. The new company will be called IHI E&C International.

IHS has acquired ‘semantic search’ specialist Invention Machine for $40 million. IHS also bought engineering information specialist GlobalSpec from Warburg Pincus for $135 million. And IHS has acquired Citation Technologies’ CyberRegs unit for $11 million. CyberRegs provides EHS regulatory and compliance information for the US, Canada and Mexico.

KBC has purchased Infochem Computer Services for £9.5m as a ‘first step’ in its expansion into the upstream oil and gas software and services market.

Knight Oil Tools has acquired Tri-State Tools & Inspection.

Object Reservoir has been acquired by Halliburton.

AGR has acquired Ocean Riser Systems, which will be integrated into AGR’s Enhanced Drilling Solutions unit.

Riviera Energy and the transaction advisory group of Ensley Properties have merged. Pritchard Capital Partners has a share in the new venture.

SherWare and Learn Tax have combined to form an accounting and tax resource center for small independent oil and gas producers.

Schlumberger has taken a 20% stake in Chinese oilfield services provider Antonoil.

Statoil’s Technology Invest unit is now a shareholder in HPC specialist and Investinor portfolio company Numascale.


Schlumberger, Altair team on parallel Eclipse speedup

How-to guide to accelerated reservoir flow simulation with PBS Professional workload manager.

A 13 page white paper from Schlumberger and Altair Engineering provides a ‘how-to’ guide to integrating the Eclipse reservoir fluid flow simulator with Altair’s PBS Professional workload management system. The system was demoed at last month’s ISC with Eclipse running on the latest Intel MIC ‘supercomputer on a chip’ (Oil IT Journal October 2011).

PBS Professional allows Eclipse simulations to be run across diverse computing resources and locations to maximize throughput. End users see a single interface to all computing resources and IT managers can optimize available compute resources by dynamically distributing workloads across wide area networks. Large MPI-based jobs keep running as PBS detects failed nodes and reschedules tasks around them.

The integrated solution hides workload management from the end user who does not need to interact with PBS Professional. Full license management is also included. The functionality is available in the 2012 Eclipse and Intersect releases. Request the whitepaper and more on PBS Professional.
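For the curious, a bare-bones submission to PBS Professional looks something like the following sketch. The job name, resource selection and Eclipse command line are invented for illustration; the integration described in the white paper generates the equivalent behind the scenes.

    # Generic sketch of submitting an MPI Eclipse run to PBS Professional.
    # Job name, resource request and runtime command are all illustrative;
    # the Schlumberger/Altair integration hides this from the end user.
    import subprocess

    job_script = "\n".join([
        "#!/bin/bash",
        "#PBS -N eclipse-fullfield",
        "#PBS -l select=4:ncpus=8:mpiprocs=8",
        "#PBS -l walltime=12:00:00",
        "cd $PBS_O_WORKDIR",
        "mpirun -np 32 eclipse CASE1.DATA",
    ]) + "\n"

    result = subprocess.run(["qsub"], input=job_script, text=True,
                            capture_output=True, check=True)
    print("submitted:", result.stdout.strip())   # qsub echoes the new job id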


Siemens—multi disciplinary approach to compliance

New ‘Siemens in a box’ offering addresses oil and gas knowledge management.

Anil Gokhale, general manager of Siemens’ oil and gas consulting unit, formed when Siemens acquired Houston-based Benwanger in 2006, was interviewed by Siemens’ in-house ‘Venture’ magazine this month.

Following the Macondo incident, everyone is concerned about process safety. There are engineering issues that must be dealt with and a host of new regulations involving many different disciplines. Enter ‘Siemens in a Box,’ a combination of different parts of the Siemens organization that helps customers address the new regulatory concerns.

Siemens process safety experts are recognized by the regulator. According to Gokhale, Siemens experts are called on frequently by the regulatory agencies for standards-setting advice. Finally, Siemens has patented its ‘risk-based’ methodology that, it claims, assures a long-term solution rather than a quick fix. The services extend to ‘conceptual engineering’ of new developments, ‘telling clients how much money can be earned by exploiting a new find and what facilities are needed to maximize recovery.’ More from Venture magazine.


Shell, Limit Point work on commercial ‘SheafSystem’

Successful GameChanger trials lead to abstract data ‘munging’ system take-up.

Limit Point Systems (LPS) and Shell Global Solutions are to develop a technology road map for oil and gas applications of LPS’ ‘sheaf’ data technology (Oil IT Journal June 2012). The ‘SheafSystem’ has been under trial in Shell’s GameChanger program. Shell’s Hans Haringa said, ‘GameChanger has had a long relationship with LPS and has funded several sheaf technology projects for oil and gas applications. We believe the technology is now mature enough to start exploring its application in oil and gas and in many other industries.’ The project will gather input from prospective customers, both within Shell and at other organizations, draft a technology development road map, and propose a development plan.

But what the heck is the sheaf? We quizzed LPS president David Butler who explained, ‘The sheaf is not a data format. It is an abstract model, rather like a relational database, for math data. It has particular application in what is known as data munging—performing data transformation chores between applications. Sheaf-based transformations also parallelize well across multi-core architectures.’

We suggested that the sheaf sounded rather like Hadoop. Butler agreed that the sheaf is an abstract data model that can be deployed in many different contexts, but claimed that it is better for math data than the ‘text-oriented’ Hadoop. He suggested that use cases might include manipulating seismic velocity models for processing or reformatting data from geomechanical tetrahedral meshes to cellular models for the reservoir engineer. Butler claims that such transforms see several orders of magnitude speed-up with the sheaf approach. Check out the LPS patent (1801) and visit Limit Point.


Quorum Insight—oil and gas business intelligence

SAP BusinessObjects-based system provides consolidated view.

Quorum Business Solutions (QBS) has announced Quorum Insight, offering oil and gas companies a ‘consolidated view’ of their business activities and key performance indicators (KPIs). QBS developed Insight on the SAP BusinessObjects business intelligence suite, using its functional expertise to deliver a business intelligence solution tailored to the oil and gas vertical. Insight extracts data from oil and gas applications and provides user-friendly, ad-hoc reports and executive dashboards for increased visibility of KPIs. Executives get an overview of operations and can drill down to areas that need attention, regardless of which application generated the data. Analysts can scrutinize KPIs such as profitability by asset across the company’s business units. More from Quorum.


Sales, contracts, partnerships and deployments

Aker Solutions, Cybera, Emerson Process Management, CGI, FreeWave, Pan-Tech, Fugro Offshore, Gregg Marine, Halliburton, IFS, McDermott, FTSI, Petra Energia, Rosneft, ExxonMobil, Senergy, Liquid Robotics, Schlumberger, OPIS, Telvent, vMonitor, Al-Haitam, Wood Group Mustang.

ExxonMobil has awarded Aker Solutions a five year frame agreement covering engineering, procurement, construction and installation for operated assets on the Norwegian continental shelf.

Shell Oil Products US and Motiva Enterprises LLC have chosen Cybera Inc.’s Cybera ONE security services platform for Shell-branded wholesalers nationwide.

BP Exploration has selected Emerson Process Management as automation contractor of choice for offshore oil and gas operations in the North Sea. Emerson will supply integrated control and safety systems for five UK offshore fields including Clair Ridge and the Quad 204 FPSO.

CGI is to supply the US Environmental Protection Agency with hosting and virtualization services, architecture and transition support for moving applications to the cloud.

FreeWave Technologies and partner Pan-Tech Controls have deployed hundreds of I/O expansion modules for a major natural gas producer in the Barnett Shale in Texas. The modules provide remote I/O and control of tubing pressure, casing pressure, chemical tank levels and arrival sensors.

Fugro Offshore Geotechnics and seabed robotic drilling specialist Gregg Marine have formed Seafloor Geotec to offer deepwater seabed-based drilling and soil surveys.

Halliburton has opened new remote operations command and control (ROCC) centers serving North Dakota, the Mid-Continent region, south Texas and Louisiana.

Kuwaiti EPC IMCO is implementing IFS Applications to streamline project and contract management. The contracts include licenses and services worth approximately $1.7 million.

McDermott has been awarded two projects for Saudi Aramco in the Arabian Gulf in the Karan, Safaniya and Zuluf fields. The contracts include a new wellhead platform with a 15kV composite power and fiber optic cable.

FTSI and Petra Energia are to form FTS Brasil to provide well completion products and services to the Brazilian onshore market.

Rosneft and ExxonMobil are to establish a joint Arctic Research Center for Offshore Developments.

Senergy has been awarded a framework agreement worth up to £1.8 million by the UK government. The three-year contract with the Department of Energy and Climate Change seeks to maximize recovery from existing oil and gas developments and provide access to new reserves in technically challenging fields.

Liquid Robotics and Schlumberger have created Liquid Robotics Oil & Gas, a joint venture to develop services for the oil and gas industry using Wave Gliders, ‘wave-powered, autonomous marine vehicles.’

OPIS and Telvent GIT have entered into an agreement to make OPIS price benchmarks available in Telvent’s DTN fuel pricing and trading software.

vMonitor has partnered with Al-Haitam to promote its gas lift data acquisition and processing in Saudi Arabia.

Wood Group Mustang has been awarded a multiyear EPCM framework agreement by Shell Oil Products US for pipeline and terminal projects throughout the Gulf Coast region.


Standards stuff

AIPN LNG sale and purchase agreement. OGC Internet of Things. DCMI in RDFa. OGC GeoSparql.

The Association of International Petroleum Negotiators (AIPN) has published the 2012 version of its unified model LNG master sale and purchase agreement (MSPA). AIPN president William Lafferrandre (ConocoPhillips) explained, ‘The new MSPA will help promote the development of an LNG secondary market by reducing transaction costs and time involved in trading and hedging cargoes.’ The MSPA is the latest in a series of hydrocarbon-related model contracts published by the AIPN to facilitate the negotiation of energy transactions around the globe. The MSPA and other model contracts are free to AIPN members and are available to non-members, for a nominal fee, from the AIPN web site.

The Open Geospatial Consortium (OGC) has formed a working group to promote the sensor web for the Internet of Things (IoT). OGC Sensor Web standards support complex tasks such as controlling Earth imaging satellites. Sensor Web for IoT convener, Steve Liang, director of the GeoSensorWeb Laboratory at the University of Calgary said, ‘These standards will ensure that sensors will be easy to read and control with web services while maintaining security and data integrity.’ More on the IoT from the OGC.

A new release of the Dublin Core Metadata spec now includes HTML markup describing all of its properties, classes, datatypes and vocabulary encoding schemes in machine-readable RDF. The new release leverages the W3C RDFa Lite 1.1 spec, described as ‘the simplest variant of the RDFa syntax for embedding structured data in web pages.’ RDFa web pages provide, in the same source document, both the human-readable text rendered on-screen by browsers and the detailed machine-readable representation needed by Semantic Web applications.

The OGC GeoSparql standard is now available. The standard defines a set of Sparql extensions, rule interchange format (RIF) rules and a core RDF/OWL vocabulary for geographic information based on the general feature model, simple features (ISO 19125-1), feature geometry and SQL/MM (a spatial SQL standard). Download the spec here.
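
For the curious, a GeoSparql query reads like ordinary Sparql plus spatial functions. The following illustrative query (our example, over a hypothetical dataset) selects wells whose point geometries fall within a lease-block polygon:

# Illustrative GeoSparql query held in a Python string; the well and
# lease data are hypothetical.
query = '''
PREFIX geo:  <http://www.opengis.net/ont/geosparql#>
PREFIX geof: <http://www.opengis.net/def/function/geosparql/>
SELECT ?well WHERE {
  ?well  geo:hasGeometry/geo:asWKT ?location .
  ?lease geo:hasGeometry/geo:asWKT ?boundary .
  FILTER(geof:sfWithin(?location, ?boundary))
}
'''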


Oracle’s industry scorecard for big data

Survey of 300 North American execs finds oil and gas a laggard in the data management stakes.

A new study from Oracle provides an ‘industry scorecard’ on big data business challenges, based on a survey of over 300 North American executives. The report is replete with ‘FUD*,’ finding that ‘big data,’ while poorly understood, is perceived as a real threat. Captivating statistics abound—e.g. 93% of executives believe that poor data management is losing them an average of 14% of revenue per year! Oil and gas ‘loses’ most.

Oracle’s study would make a great starting point for those embarking on a data improvement project. In particular, its list of the biggest data management gripes will resonate with many. But it is hard to see what is in it for Oracle, whose omnipresence would seem to make it at least part of the problem.

One solution described in the study came from an oil and gas CIO who has ‘tripled storage capacity in the last month.’ Others are hiring, some are outsourcing. One is trying to ‘figure out how to get different systems to talk to each other.’ Sounds like there is no quick fix—just more of the same. Or maybe Oracle has some Hadoop up its sleeve—see page 3. Download the study here.

* Fear uncertainty and doubt.


Venture’s ‘Discover, transform, manipulate’ methodology

Ian Jones offers some hints on mitigating data entropy—with help from Venture’s DTM methodology.

An interesting new contribution to Venture Information Management’s knowledge exchange from consultant Ian Jones discusses data quality and ‘extracting the truth from multiple data sources.’ Jones observes how poor management leads to data ‘entropy’ and an inexorable slide down the quality ladder. A Shell study found that quality can decline by ‘up to 5% per month’ due to incompleteness, conflicting sources of the same data, inaccuracies such as incorrect values and names, and poor control over edits.
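
To see what ‘up to 5% per month’ implies, compound the decay, assuming (our assumption) that the loss compounds monthly:

# Back-of-envelope: 5% quality loss per month, compounded over a year.
quality = 1.0
for month in range(12):
    quality *= 0.95
print(f'after 12 months: {quality:.0%} of data still good')  # ~54%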

There is a natural tension between user preferences for particular nomenclature and formats and a consistent, managed approach. The solution outline is simple—identify the correct version, correct bad data and render such cleansed information accessible and secure. But the devil is, of course, in the detail.

Enter Venture’s ‘discover, transform, manipulate’ (DTM) methodology, a structured process for moving a data resource from chaos to order. DTM leverages Venture’s V-DAT Oracle/PPDM-based staging database to hold and QC data before loading to ‘gold’ level corporate systems. Read the presentation here.
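
The staging idea can be pictured in a few lines of code. The sketch below, with hypothetical table, column and rule names (V-DAT’s actual PPDM-based schema is considerably richer), holds records in a staging table and promotes them to gold only when basic QC rules pass:

# Sketch of staging-area QC before promotion to the 'gold' store.
# Table, column and rule names are hypothetical; assumes tables exist.
import sqlite3

db = sqlite3.connect('staging.db')
failures = db.execute("""
    SELECT uwi FROM staging_wells
    WHERE uwi IS NULL OR spud_date > completion_date
""").fetchall()
if not failures:
    db.execute('INSERT INTO gold_wells SELECT * FROM staging_wells')
    db.commit()
else:
    print(f'{len(failures)} records failed QC - not promoted')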


Stone Bond’s IIMD monitors and assures network traffic quality

Integration integrity manager device embeds patented technology in hardware appliance.

Stone Bond has announced the ‘Integration Integrity Manager Device’ (IIMD), a hardware appliance that sits in a data network, monitoring traffic integrity. The IIMD appliance embeds Stone Bond’s previously released Integration Integrity Manager, whose technology is covered by US patent 7065746.

The IIMD monitors network traffic, checking it against defined formats and data schemas, and signals any data issues to developers, database architects and network administrators. The device catches changes in data structure and additions to or omissions from the target schema. Stone Bond CTO Pamela Szabo said, ‘Integrated environments involve complex interdependencies. IIMD assures data integrity and offers agility in the face of changing infrastructure.’
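
Schema-drift detection of this kind amounts to a set comparison between the fields a record actually carries and those the target schema expects. A simplified Python illustration follows (our sketch, not Stone Bond’s patented method):

# Compare fields observed in an incoming record against the expected
# target schema and alert on additions or omissions.
EXPECTED = {'well_id', 'timestamp', 'pressure', 'temperature'}

def check(record):
    observed = set(record)
    missing, added = EXPECTED - observed, observed - EXPECTED
    if missing or added:
        print(f'ALERT - missing: {missing or "-"}, added: {added or "-"}')

check({'well_id': 42, 'timestamp': '2012-07-01', 'pressure': 310.5})
# -> ALERT - missing: {'temperature'}, added: -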

When a change in a data source is detected, the IIMD alerts stakeholders and notifies them of what needs to be done to fix the issue and prevent ‘potentially catastrophic impact.’ Corrections can be applied immediately, on site or in a cloud environment. The IIMD operates either stand-alone in any networked data environment or as a component of Stone Bond’s Enterprise Enabler solution, where it can leverage components to create and manage metadata and to access and monitor hundreds of types of data sources and destinations. More from Stone Bond.


Allegro DR offers Dodd Frank compliance for energy traders

Derivative regulation application targets compliance with international regulations.

Energy trading and risk management (ETRM) software house Allegro Development Corp. has released Allegro Derivative Regulation to support Dodd-Frank compliance processes for energy companies. Key functions include regulatory reporting, position limit monitoring and data retention. The system will also support the EU market infrastructure regulation (EMIR) and other future regulatory initiatives.

Allegro CEO Eldon Klaassen warned, ‘Energy companies face an urgency to act, since new requirements may go into effect within the next 60 to 90 days. Derivative Regulation will allow customers to quickly implement the business processes required to mitigate their compliance risk. This new component will help energy firms rapidly achieve compliance readiness.’ Allegro Derivative Regulation is available as a component of Allegro’s flagship ETRM solution or as a stand-alone product for companies with non-Allegro ETRM systems. More from Allegro.
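
Position limit monitoring, one of the key functions cited, reduces to aggregating net positions per commodity and flagging breaches. A generic sketch with invented limits and trades (not Allegro’s implementation):

# Aggregate net positions per commodity and flag limit breaches.
from collections import defaultdict

LIMITS = {'NG': 12000, 'WTI': 6000}                   # hypothetical limits
trades = [('NG', 5000), ('NG', 9000), ('WTI', -2000)]  # invented trades

positions = defaultdict(int)
for commodity, quantity in trades:
    positions[commodity] += quantity
for commodity, net in positions.items():
    if abs(net) > LIMITS[commodity]:
        print(f'{commodity}: net position {net} exceeds limit {LIMITS[commodity]}')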


INT teams with Saddleback Geosolutions on seismics

Platform-independent imaging solutions address gaps in current geoscience workflows.

INT and Saddleback Geosolutions are teaming on a suite of platform-independent scientific computing plug-ins for INTViewer, INT’s development platform for seismic analysis and data QC. The plug-ins will provide interoperability with leading third-party tools, targeting ‘gaps’ in current geoscience workflows. First release candidates are due out later this year with two plug-ins currently being tested by key clients.

The Seismic Workbench plug-in is a workflow builder that integrates popular open source seismic processing packages such as Seismic Un*x, FreeUSP and Madagascar. Other plug-ins offer a Java NetBeans integrated development environment, shell-scripting and a Matlab interface.

The Matlab plug-in is claimed to be an ‘indispensable’ tool for geoscience researchers, allowing data in an INTViewer workflow to be manipulated inside Matlab for interactive algorithm development and custom processing. INTViewer includes data readers and writers for most industry data formats, including SEGY, SEGD, SEG2, SU, SEPlib, RSF, JavaSeis, SVF, CST, LAS and others. INT’s own seismic indexing capability adds rapid access to ultra-large datasets. More from INT and Saddleback.
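
The ubiquity of SEGY means that even a few lines of code can recover key acquisition parameters from a file. The sketch below (ours, independent of INTViewer) reads three fields from the 400-byte binary header that follows the 3200-byte textual header, per the SEG-Y rev 1 standard:

# Read sample interval, trace length and data format code from a SEG-Y
# binary header. Values are big-endian; the file name is hypothetical.
import struct

with open('line_001.segy', 'rb') as f:
    f.seek(3200 + 16)
    sample_interval_us, = struct.unpack('>H', f.read(2))  # bytes 3217-3218
    f.seek(3200 + 20)
    samples_per_trace, = struct.unpack('>H', f.read(2))   # bytes 3221-3222
    f.seek(3200 + 24)
    format_code, = struct.unpack('>h', f.read(2))         # bytes 3225-3226

print(sample_interval_us, samples_per_trace, format_code)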


ISN Solutions provides virtualized desktop to Premier

Geoscience applications run remotely on Citrix XenDesktop and Nvidia Quadro hardware.

Aberdeen-based systems integrator ISN Solutions has helped Premier Oil virtualize its geoscience desktops. Premier’s geoscientists are located in the company’s London, Aberdeen and Stavanger offices and frequent collaboration is required on seismic interpretation projects using tools such as SMT Kingdom. In the past, managing applications, data and interpretations at these sites has been problematic.

Enter ISN’s ‘remote access to 3D applications’ solution, a Citrix XenDesktop-based virtual desktop leveraging the compute power of an Nvidia Quadro graphics card to power 3D applications. A Dell T-series workstation in Aberdeen was repurposed as a XenDesktop host and a Windows 2008 server was deployed as a controller. Geologists in London and Stavanger can now launch a remote desktop session on the XenDesktop host. The high-end graphics card provisions dual remote monitors at 1920×1200 resolution. Early users report improved response over the previous VNC remote desktop running Kingdom. Other applications and usage scenarios are being tested in the proof of concept deployment. More from ISN Solutions.


Troika’s Magneto sparks move to RODE on disk

User-friendly seismic tape-to-disk transcription encapsulates data to disk files.

UK-based Troika International has announced Magneto, a user-friendly seismic tape transcription utility designed to ‘secure’ the contents of any seismic tape by writing it to disk. Magneto transcribes individual SEG-Y files from tape and writes them to an encapsulated SEG-Y file on disk for further analysis and archival. The system can read from legacy 9 and 21 track tape drives as well as modern SCSI devices. A batch mode allows for transcription of multiple projects. The thorny problem of moving non-SEG-Y seismic formats to disk is also solved with encapsulation—using the industry-standard, SEG-supported RODE* format. Magneto runs on Windows XP, Vista, Windows 7 and Linux. More from Troika.
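
RODE itself is a full SEG standard, but the underlying idea, preserving variable-length tape records in a flat disk file, fits in a few lines. A concept sketch only (the real RODE layout differs):

# Concept sketch of record-oriented encapsulation (NOT the actual RODE
# layout): each variable-length tape record is prefixed with its length
# so that record boundaries survive the move to disk.
import struct

def encapsulate(records, path):
    with open(path, 'wb') as out:
        for record in records:                        # each record is bytes
            out.write(struct.pack('>I', len(record)))  # 4-byte length prefix
            out.write(record)

encapsulate([b'\x00' * 240, b'\x00' * 3600], 'tape_image.dat')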

* Record oriented data encapsulation.


Moonwalk builds oil and gas data archive system on NetApp

Industry veteran Duncan Wright behind ‘scalable solution’ for oil and gas ‘big data’ challenges.

Palo Alto, CA-based Moonwalk is offering a NetApp-based solution for oil and gas data management. Moonwalk’s software proactively archives data according to rules and policies based on criteria such as age, size, type, name or creator. The solution is said to be transparent to users and applications.

Moonwalk CTO and industry veteran Duncan Wright (co-founder of PECC and Enigma Data Solutions) said, ‘Moonwalk is a scalable solution to the data management needs of the oil and gas industry. Previously, developments in disk technology have more or less kept pace with data growth, meaning that performant data management and archiving have been seen as ‘nice to have’ rather than mission critical. With the massive expansion of non-conventional activity, we are now seeing data management move center stage.’ Moonwalk’s technology for NetApp targets projects with ‘big data’ challenges such as the management of large unstructured datasets. The Moonwalk product’s core design enables effective data management across storage tiers. More from Moonwalk.
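
Policy-driven archiving of the kind described can be thought of as a rule applied to file metadata. A simplified sketch with invented thresholds and paths (Moonwalk’s own engine is transparent to users and applications):

# Flag files older than a year or larger than 1 GB as archive candidates.
import os, time

AGE_LIMIT = 365 * 24 * 3600   # one year, in seconds
SIZE_LIMIT = 1 << 30          # 1 GB

for root, _, files in os.walk('/projects/seismic'):  # hypothetical path
    for name in files:
        path = os.path.join(root, name)
        info = os.stat(path)
        if time.time() - info.st_mtime > AGE_LIMIT or info.st_size > SIZE_LIMIT:
            print('archive candidate:', path)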

