At this month’s Society of Exploration Geophysicists’ (SEG) annual conference in Denver (full report in next month’s Journal), Halliburton’s Gene Minnich and LMKR founder and CEO Atif Khan teamed up on a surprise announcement regarding the future of Landmark’s PC-based geoscience workstation, Geographix. Acquired back in 1995, Geographix has languished of late in the shadow of Landmark’s OpenWorks mainstream interpretation suite and has lost market share to segment leader Seismic Micro Technology.
The agreement between Landmark and Dubai-headquartered LMKR transfers management, sales, marketing and product development of Geographix to LMKR. Landmark remains ‘vested’ in the success of the Geographix business for an initial five-year period, and customer support and training will continue to be offered through the Landmark organization.
LMKR has doubled the number of developers working on Geographix and has introduced new workflows, including land-based unconventional development challenges.
Khan, a Colorado School of Mines alumnus, told Oil IT Journal that the spin-out deal results from a long-term collaboration between Landmark and LMKR, which has been serving the Halliburton unit and its clients with development and data management services for over a decade—LMKR currently performs all new development on the PetroBank product line. During an initial five-year period, LMKR will develop new Geographix functionality, filling certain seismic and engineering lacunae.
LMKR also plans to build on the Microsoft Xbox controller and Direct3D 11-based ‘in-scene’ reservoir movie technology that was unveiled at last year’s SEG (OITJ November 2009). Following the port of the Discovery 3D module, all of Geographix has now been ported to Windows 7. LMKR also unveiled two new Geographix components at SEG: a geomodeling-while-drilling solution for horizontal drilling in unconventional targets and ‘GeoAtlas,’ a web mapping function with a capability for streaming satellite and other geodata sources.
LMKR has several other software irons in the fire—the XpoSim training solution (OITJ July 2009), PetroHive, an E&P catalog solution (Technology Watch, PNEC 2009), and a workflow/collaboration solution, ‘iScrybe,’ under joint development with Adobe.
Khan told Oil IT Journal of his plans to release a ‘Facebook for the enterprise’ solution, ‘Cotribe’ (previously known as iScrybe). LMKR is half owned by Actis private equity and has some 600 employees worldwide. More from www.lmkr.com.
Following a successful evaluation on behalf of a group of majors, Knowledge Reservoir has determined that Cimarron Software Services’ real time monitoring and control solutions have applications in oil and gas production hubs and real-time operations centers. This has resulted in an alliance between Cimarron and Knowledge Reservoir, which is to provide integration and implementation services and subject matter experts. Alliance partner Cimarron boasts a 29-year history of real time systems engineering and integration. Cimarron contributed to the design and implementation of NASA’s mission control facilities for the space shuttle and international space station.
Cimarron’s commercial off-the-shelf real time systems are an alternative to proprietary systems and custom programming. This approach is said to ease deployment of remote operations and sustainable control centers. Knowledge Reservoir CEO Ivor Ellul noted that, ‘The new solution leverages data across projects, maintenance and production, connecting information to subject matter and domain experts.’ Ellul was encouraged to report that the real-time data solution has been adopted by ‘one of the leading majors.’ More from drees@knowledge-reservoir.com.
New offerings in the geoscience workstation space are looking rather like London buses this month. You hang around waiting for ages, then three come along at the same time. This month, Schlumberger’s hegemonic Petrel sees revitalized competition from Baker Hughes (page 12), Geographix (this month’s lead) and Landmark (page 3).
~
Chatting with a friend at the SPE in Florence last month—BTW, what a great location for a tradeshow—I got into one of my rants about press releases and counterproductive marketing efforts. Reading hundreds of releases every month at least puts me in a good position to contrast and compare—and perhaps offer some advice as to what goes wrong. My friend suggested that I write a book on the subject—which on reflection I think is a great idea. In fact I may start right now by recycling some editorial contributions to this new body of knowledge. What follows is perhaps not Chapter 1, but maybe a section of the future oeuvre.
When I was a teenager learning to drive some schools were introducing ‘dual control’ systems. While instructors could already grab the learner’s wheel and steer the car around a little old lady on a pedestrian crossing, in some circumstances—like when the tyro turned briskly into a one way street going the wrong way—they felt the need for more control of events. Hence ‘dual controls,’ a replicated brake system so that the instructor could stop the car in an emergency.
Now as you all know, driving is, and was even more so in the middle of the 20th century, a quintessentially macho activity. Driving a (preferably big and powerful) car implied that the driver was also big and powerful. The big problem with dual controls though is that they are essentially sissy. What self-respecting (macho) learner driver is going to get into a car with somebody else’s foot on the brake? Dual control manufacturers and driving schools were facing a marketing problem and decided to address it head-on, as it were. If you have a sissy thing to sell, you need some wording that addresses the perceived weakness and makes a strength out of it.
What better way than a brazen claim that black is in fact white? Enter the rebrand of the dual control system as ‘He-Man Dual Controls.’ I kid you not, I learned to drive in an automobile which had ‘Equipped with He-Man Dual Controls’ plastered all over it!
I am not yet sure how I will be weaving the He-Man Dual Controls (HMDC) story into my marketing narrative. Does such an approach actually work? Can you sell a product by blustering its greatest weakness into a strength? Let’s just say that HMDC defines a point in marketing taxonomy—it is a technique that is in use today—in oil country software and IT at large.
One of the most persistent (and possibly most successful) uses of HMDC in IT centers on the dichotomy between buyers and sellers. Buyers want flexibility and freedom of choice. Sellers want lock-in and perpetual license sales. So how does a seller sell a closed system with proprietary formats that doesn’t integrate with anything—not even other products in its own range? With HMDC marketing, that’s how. Thus a closed product is marketed as ‘open.’
The example that springs to mind is of course Schlumberger’s Ocean development environment for Petrel, with its tag lines of ‘open innovation,’ ‘open by design,’ ‘open framework,’ ‘open specification’ and so on. Ocean is a closed, proprietary system. Develop code for Ocean and you can be sure that it will not run with anyone else’s products. But HMDC marketing ‘opens’ what was previously ‘closed.’
HMDC marketing was also in evidence in our story last month regarding Petrel’s capacity to load data. Your users report long load times? Quick! Rush to press with some HMDC marketing about loading an impossibly large data set in record time.
But I don’t want to beat up exclusively on Schlumberger. Landmark was an early adopter of HMDC marketing back in the day, with its ‘OpenWorks.’ What is ‘Open’ Works? It is a closed, proprietary interpretation system. Like Ocean it has an application programming interface—hence the thin claim to openness. Software with an API is better than software without. But develop with the OpenWorks API and you can be sure that your code will only run against Landmark’s products.
I checked out Microsoft’s MURA material to see if HMDC marketing was in evidence here. I have to report that no, Microsoft is not claiming openness for MURA. But this may be more because for Microsoft, ‘open’ is the devil’s work.
Oil companies themselves are not immune to a bit of HMDC ‘marketing’ of their own IT initiatives. Let’s suppose a big company, after years of backing various standards initiatives, has more or less thrown in the towel and decided to go it alone with an in-house system developed entirely with proprietary software. All that is lacking is a bit of HMDC marketing—to define the mish-mash of selected tools as a corporate ‘standard.’
Well I’m not entirely convinced that I can keep this up for the length of a whole book. Not while I have a day job anyhow. But I do think there is an underlying point to be made as to the industry’s use and abuse of ‘open’ and ‘standard.’ There is actually a lot of really good software around that is either open or standard and in some cases both. It can be very useful and if leveraged judiciously in a company’s workflow can avoid reinventing the wheel. If you develop in Java, porting from Linux to Windows is a cinch. If you develop in .NET then you will never port to Linux. You may even have a hard time porting to the next manifestation of the Windows dev kit. In contrast, I have Unix shell scripts I wrote 30 years ago that run fine on my brand new Mac Mini.
The real problem is that, when compared with vendors’ spend on HMDC-type ‘persuasion,’ the marketing clout of the open source movement and even the standards organizations is nearly zero.
This book* targets young engineers starting a career in oil and gas and experienced ones coming in from another vertical. It should also be useful for software engineers trying to see the ‘big picture’ of how oil and gas facilities both onshore and offshore are put together.
Baron is a project manager with French engineering contractor Technip and has 15 years’ experience of building FPSOs and LNG plants. He also lectures at the French Petroleum Institute (IFP) school.
The Oil and Gas Engineering Guide (OGEG) is a 200-page, well-illustrated, English-language publication. Illustrations include CAD imagery, HAZOP plans, spec sheets, piping and instrumentation diagrams and more. In fact, just about everything that you will encounter in a major oil and gas project.
Baron’s motivation for the work came from the observation that existing literature was all highly domain specific—focusing on individual disciplines such as process, electrical or civil engineering. OGEG fills this gap with an overview of the whole process. As well as describing each discipline’s contribution, Baron does a good job of describing the arcane division of labor between the multiple stakeholders. He notes that as projects are increasingly divvied-up and performed at far-flung locations around the world, it is hard today for a young engineer to gain ‘end to end’ knowledge.
The result is a highly accessible book—pitched somewhere between a ‘Dummies’ guide and a textbook. Baron writes with authority and packs as much relevant information into as little space as possible. In an age where ‘knowledge is power’ and information retention is often the norm, this book is a breath of fresh air.
OGEG covers the whole project lifecycle from front end engineering design, through procurement and construction. A dozen or so individual engineering disciplines are covered and copiously illustrated.
What surprised us was the lack of what have become (to us) familiar facets of oil and gas plant information management. Our reporting from events like PlantTech and Fiatech led us to expect more on computerized information handover and databased plant information. OGEG’s focus on the engineering document as the basic element of construction—still mostly in paper form—reflects the real world.
OGEG offers a short list of frequently used acronyms to help beginners get started. But, like many publications from France, it does not have an index, a surprising lacuna that makes the reviewer’s job hard. Are the topics of ‘handover,’ ‘prime contractor,’ and ‘owner operator’ covered? Without an index it is hard to be sure. But really, this is OGEG’s only shortcoming. If your job takes you anywhere near an engineering project, you have to buy this book!
* The Oil & Gas Engineering Guide, H. Baron 2010. Published by Editions Technip. ISBN 9782710809456.
Nick Purday gave Oil IT Journal a demo of Landmark’s new interpretation flagship, the Decision Space Desktop (DSD). DSD is a ‘unified workspace’ for upstream interpretation. The DSD environment comprises an OpenWorks database and a suite of tools for visualization and workflow management. Currently, three major applications plug in to the base framework—for seismic interpretation (derived from SeisWorks), model building (from Earth Model) and a new geological interpretation package developed in collaboration with Statoil. A fourth application is being integrated into DSD—the AssetView well planning environment. Other tools will follow—although not, as we understand, GeoProbe, whose functionality will ultimately be embedded in other DSD applications.
DSD is marketed as a ‘collaboration’ platform for the asset team. Multi-user access to a common database benefits from standard OpenWorks/Oracle transaction management. Workflow control is stored in an XML file which, in a future release, will allow for re-play and batch execution of parameterized jobs.
Our demo began with basin-scale interpretation of geo-referenced sketch maps and cross sections in one window. Cultural data in the form of ESRI ArcMap files can be viewed live from the database—a future DSD release will have a bi-directional link to ESRI. New ‘geo-shaper’ technology allows for stratigraphic units to be defined and interpretation history—who did what and when—is also stored in OpenWorks.
Interpretation proceeds with the creation of a sealed model leveraging the Shapes topology engine that Landmark acquired from GeoSmith in 2007. Events defined from well data can be incorporated into the model using the conformance modeling technology originally developed in Geographix’ cross section tools. Property modeling leverages the Isatis geostatistical engine—thanks to an ‘exclusive’ relationship with Geovariances. Non-sequential methods can be multi-threaded, benefitting from modern multi-core architectures. For non-specialist users, DSD exposes ‘off-the-shelf’ workflows—such as a ‘west Permian basin carbonate’ play. Experienced geostatisticians have the option to tweak their own variograms.
DSD generates vertical (as opposed to pillar) grids. The gridding was developed by Halliburton and aligned with the simulator group. Gridding can create a VDB file for Nexus or write out Rescue formatted data for Eclipse and other tools. The next release will include simulator post processing and visualization.
DSD runs on Red Hat Linux and porting to Windows is underway, leveraging Landmark’s investment in Java, alongside Nokia’s Qt. This leaves the Halliburton engineering data model (EDM) toolset outside of DSD’s immediate scope—although data connectors are available. Early tests show performance on Windows to be ‘equivalent’ to Linux although no benchmarks were available. Purday commented, ‘There is no rush to Windows—most customers are happy with Linux. Some major accounts won’t leave Linux because of Windows’ security issues.’
Our demo benefitted from a very high end display—an 8 megapixel, 4 x HD Barco screen driven by dual Nvidia PCIe x16 graphics cards. Plenty of monitor real estate is required to get maximum use out of DSD’s component apps. In fact, DSD is really more of a ‘workstation’ than a ‘desktop,’ if such a distinction can be made these days. More from www.oilit.com/links/1010_2.
Elsevier’s ‘GeoFacets’ search tool was developed for key oil and gas users of its huge geoscience collection (65,000 articles from 31 earth science journals). In addition to the usual keyword search of titles and abstracts, Elsevier has digitized, indexed and georeferenced every map appearing in the body of an article’s text—currently around 117,000 maps. A user seeking information on the ‘Niger Delta,’ for instance, will see all articles containing ‘Niger Delta’ anywhere in the text—including map captions. Maps appear as overlays in a Google Maps interface and can be downloaded as georeferenced GeoTIFFs for import into Esri’s ArcMap.
Product manager Phoebe McMellon told Oil IT Journal, ‘We have partnered with IHS to add its petroleum basin glossary and outlines to GeoFacets—further enhancing search. We interviewed hundreds of geologists to establish work practices and requirements.’ A pilot is underway to enable Elsevier’s clients to georeference their own maps in reports and internal documents. Watch the GeoFacets video on www.oilit.com/links/1010_6.
Speaking at the 2010 ECIM data management conference last month, Ketil Waagbø (Blueback Reservoir) observed that Petrel’s proprietary, flat file format makes the product ‘fast and popular,’ but creates problems for data management. User ‘freedom’ can lead to data duplication as projects are scattered around the network. The data manager’s challenge is of lost control over proliferating projects and data.
Blueback Reservoir’s data management toolset, the ‘Blueback Reference Project Tracker’ (BRPT), uses the new Petrel Reference Project and the Ocean API to add ‘proper’ data management to Petrel. This lets companies enforce corporate naming conventions, rename existing Petrel objects, set well symbols and report on Petrel projects. BRPT can be used to set and track goals for data management and cleanup. The plug-in can be scripted to run in batch mode across multiple Petrel projects and also exposes Ocean functionality such as coordinate reference system and units management. BRPT also offers data management metrics to track project progress. It is scheduled for commercial release in Q4 2010 and will be available from the Ocean Store. More from sales@blueback-reservoir.com.
Jamie Cruise unveiled Fuse Information Management’s ‘XStreamline’ seismic data management solution at last month’s ECIM data management conference in Haugesund. XStreamline takes a bottom-up approach to solving the everyday issues confronting the asset team. The system replaces traditional paper transmittals and email communication with a ‘collaborative e-transmittal’ solution that captures key metadata before data is shipped. The metadata can be used to kick off project workflows and also to populate a corporate data store. Once in-house, XStreamline initiates the data loading process including QC and validation in a semi-automated process.
By simultaneously populating both projects and the corporate data store, XStreamline avoids potential delays and ensures that the CDS is up to date. Fuse claims that the tool avoids the need for highly skilled data managers and extensive toolkits of data loading utilities. XStreamline offers tools to enable users to ‘find, use, share and explore’ (FUSE!) their data which can be loaded to applications ‘in a single click.’ Once data is in-house, XStreamline provides a standards-based workflow management platform that integrates datastores, applications and (currently) middleware such as OpenSpirit. But Fuse’s mid-term aim is to add client software and replace existing middleware with its own data services. More from www.fuseim.com.
Petrom has awarded IBM a 10-year, ‘multi-million’ dollar contract for the provision of a data center and services at ‘Petrom City,’ Petrom’s future Bucharest HQ. Petrom City will regroup around 2,500 employees in a campus comprising the data center and a 5MW power plant.
IBM will take over operations of Petrom’s data center infrastructure and will provide services to Petrom, OMV and other Romanian clients. Services will include cloud computing, data hosting and business continuity solutions with on-site backup and disaster recovery capabilities.
Petrom CFO Reinhard Pichler said, ‘The collaboration with IBM reaffirms the high standards used for building and equipping the data center in Petrom City. Through this agreement, Petrom will benefit from a market leader’s expertise in data center hosting as well as cost savings from the joint use of the facility.’ Petrom is 21% owned by Austrian OMV. More from www.ibm.com/services.
Schlumberger has rolled out Merak Peep 2010 with 165 ‘transparent’ fiscal models and a new plug-in development API for clients’ workflows. Fifteen oil and gas companies provided input to the new release—www.merak.com.
Geologic has added new features to its GeoScout flagship in an exclusive partnership with www.PNGexchange.com, whose property information is now available to GeoScout Land users. Property listings can be clicked-through for more information on a property—www.geologic.com.
Mathcad 15.0 includes new design of experiment methods, embedded engineering reference materials from Knovel’s 4,000 reference works and integration with Kornucopia’s algorithms for finite element analysis—www.mathcad.com.
ABB and Dresser Masoneilan are to collaborate on a monitoring and testing solution for emergency shutdown valves. The solution combines ABB’s safety instrumented system with Masoneilan’s emergency shutdown device and partial stroke test (PST) controller ‘to improve safety and increase availability in an emergency’—www.abb.com and www.dressermasoneilan.com.
The 202 release of Exprodat’s Team-GIS Segment Analyst includes support for raster and contour datasets as inputs, the ability to apply weightings to input layers and a data extraction tool for integration with third party applications—www.exprodat.com.
The 2.6 release of Kappa’s Emeraude production log interpretation toolset includes a new ‘continuous’ interpretation technique, streamlined workflows for multi-probe tools, a temperature model for energy balance studies and open hole permeability corrections from production logging results—www.kappaeng.com.
Safe Software has added a Tibco/OpenSpirit interface to FME 2010 for integration of geotechnical applications and data. FME’s data transformation engine can now move GIS, CAD and E&P data in and out of oil and gas applications and data stores. Safe claims support for over 250 spatial and non-spatial data formats including OpenWorks, SeisWorks, GeoFrame, IESX, Charisma, Petra, PPDM, ‘managed’ SEG-Y, Finder, SMT, Epos/Geolog, SDE and Recall—www.safe.com.
ExperTune has added a capability to tune cascade control loops from a web browser to its PlantTriage Loop Performance Monitor. Founder John Gerry noted that ‘Most cascade loops in plants are poorly tuned, resulting in oscillations and interactions between the inner and outer cascade loops. These tools make it simple to tune loops right and will help stabilize plant operations’—www.planttriage.com.
KSS Fuels has introduced a pricing performance management (PPM) service for users of its PriceNet and RackPrice fuel price management solutions. PPM lets clients ‘strategically and proactively’ monitor and adapt their software configurations to changing market conditions and competitive influences—www.kssg.com.
Shell’s Neil Shaw introduced the ‘anthropic principle*’ of information management—focusing on how we interact with technology and concluding that two fingers, a keyboard and mouse was ‘primitive.’ Shaw wondered what Steve Jobs would make of the tools used in E&P today. Will future explorationists be using touch screens? Not that long ago, we lived in a world of punched cards and no screens at all! Now we can visualize with virtual reality, haptics and immersive environments.
‘Street level’ computing in the form of Apple’s iPhone and iPad is here now, and Microsoft has been pushing surface computing for a couple of years as ‘as significant a move as the move from MS-DOS to Windows.’ Shaw believes that there is a strong business case for a move away from the desk and workstation paradigm. A change is needed to attract and retain technical talent and to offer a better user experience with ‘end to end’ standards use, improved data workflows and collaboration. The new technology promises an end to repetitive strain injury and improved productivity.
Geoscience Australia could benefit from the new paradigm with controlled and assisted data entry to Government databases and well data filing. PPDM could also play a role in the brave new E&P IM world—as the basis of a ‘federated, or virtual database that achieves the required level of data abstraction.’
Jess Kozman (CL Tech) described a project performed for BP on its east Texas Blocker gas field using technology from 3-GIG and MetaCarta. BP’s users were faced with a litany of problems such as a lack of standards for data exchange, no audit trail for interpretation workflows and poor capture of results from prior field studies. Organizational knowledge was being lost as there was no repository for technically validated data. Enter 3-GIG’s Prospect Director, an asset lifecycle information management solution built around a PPDM data repository.
GeoCom’s Keith Woolard offered a primer in seismic data loading—noting that seismic metadata management was easier in the old days of paper sections! Today there are multiple ways of getting things wrong, misaligning trace and navigation data. One answer is GeoCom’s STQC utility for Landmark environments. The tool checks shotpoint positioning and trace increments before load. Woolard also remarked on a recent trend to do away with data loading specialists. Today more data is loaded by geoscientists, with ‘ramifications’ for disk management and data quality.
Petrosys’ Rob Bruinsma provided a step by step introduction to ‘GIS in PPDM.’ Bruinsma, who sits on the PPDM data modeling committee, believes that storing spatial data in the database is the best way of making sure it can be found later. PPDM 3.8 can store generic spatial data including points, lines and polygons although there is no (easy) place for spatial attribute data. But to fully utilize spatial data in PPDM, it has to be grouped into familiar business objects. To do this the model has to be extended with a few tables—Bruinsma explained how, with minimal extensions, PPDM can be configured to store generic spatial attribute data. Petrosys is now proposing ‘that these or similar extensions be included in PPDM 3.9.’ More from www.ppdm.org.
* www.oilit.com/links/1010_10.
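By way of illustration only, the sketch below shows the kind of minimal extension Bruinsma describes: a generic spatial feature table, a link table to a PPDM business object and a key/value table for spatial attributes. The table and column names are hypothetical, not the extensions actually proposed for PPDM 3.9, and SQLite is used purely to keep the example self-contained.

```python
# Purely illustrative: hypothetical 'minimal extension' tables for generic spatial
# attribute data alongside a PPDM business object. Not the actual PPDM 3.9 proposal.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE sp_feature (
    feature_id   TEXT PRIMARY KEY,
    feature_type TEXT,                 -- POINT / LINE / POLYGON
    geometry_wkt TEXT                  -- geometry as well-known text
);
CREATE TABLE sp_feature_business_obj (
    feature_id   TEXT REFERENCES sp_feature(feature_id),
    business_obj TEXT,                 -- e.g. WELL, SEIS_SET, FIELD
    obj_key      TEXT                  -- key of the business object row
);
CREATE TABLE sp_feature_attribute (
    feature_id   TEXT REFERENCES sp_feature(feature_id),
    attr_name    TEXT,
    attr_value   TEXT
);
""")
con.execute("INSERT INTO sp_feature VALUES ('F1', 'POINT', 'POINT(481012 6781440)')")
con.execute("INSERT INTO sp_feature_business_obj VALUES ('F1', 'WELL', '1234567890')")
con.execute("INSERT INTO sp_feature_attribute VALUES ('F1', 'SOURCE', 'digitized map')")
print(con.execute("SELECT * FROM sp_feature_attribute").fetchall())
```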
The SPE sure knows how to stage a good plenary session, with around 2,000 present for the opening event—a debate on the future of natural gas. Chairperson and Schlumberger CEO Andrew Gould described as ‘spectacular’ the growth in unconventional gas, which now makes up over 40% of US domestic production. Unconventional reserves are known to exist in other parts of the world; the question is, can the success be replicated in populated and environmentally sensitive areas such as Western Europe? Whatever the answer, for Gould, gas has come of age. It is no longer an ‘inconvenience,’ nor even just a ‘transition’ fuel, and may well represent the ‘fuel of the future.’ Howard Paver (Hess) was a bit more circumspect. Demand growth out to 2035 is forecast at 4,000 TCF—the equivalent of 1.3 million Barnett shale wells and a $4.5 trillion investment! Sara Ortwein (ExxonMobil) expects natural gas production in 2030 to be 55% up on 2005, with 50% conventional. Under 15% of the global gas ‘resource’ has been produced to date. Demand growth is mostly from non-OECD countries, so energy needs to be cheap. Mike Stoppard (IHS/CERA) noted that gas is being ‘out-competed,’ not in the marketplace but in the policy arena. Gas needs to raise its voice and make its case in the new ‘decarbonized’ energy economy, where its rightful position is alongside green energy.
Sticking with the natural gas theme, Mike Economides (University of Houston) offered an entertaining talk on the potential of China’s ‘third coast,’ i.e. its river system, to bring natural gas to internal markets. The clean fuel is urgently needed because today, China’s energy comes 70% from coal and, ‘you can’t see the sky in many Chinese cities.’
Speaking at the Digital Energy special session, Mike Crawford (ExxonMobil) and Rick Morneau (Chevron) noted that although companies want integrated operations, the current state of standards development inhibits interoperability. Most standards today are XML/web services, which reduce exchange friction. The Witsml data standard has accomplished this for real time drilling. There may be cooperation between standards bodies, or there may be competition leading to a ‘fork’ in their potential value. A gas lift optimization program might involve seven different standards, from OPC to ProdML. Today’s standards bodies are uncoordinated; PPDM, OPC, Energistics and ISO represent so much conflicting and competing terminology. The authors consider that the next stage of the digital oilfield requires ‘common purpose and alignment of upstream standards.’ There is neither a lack of standards organizations, nor a lack of participation in both personnel and financial commitment. But still, integrated operations eludes the upstream. Maybe the time is ripe for a rethink of standards organizations’ processes and methods—perhaps under the auspices of the SPE.
Ali Dogru described how Saudi Aramco has evolved from a ‘silo’ mentality a couple of decades ago that tended to produce ‘conceptual’ reservoir models. These proved inadequate to model unswept zones in Aramco’s massive reservoirs. Today the company has a comprehensive toolset for modeling, intelligent completions and for keeping track of the ‘explosion’ in measurement and the revolution in computing power, graphics and communications. Aramco still has domain specific departments—but no more silos thanks to improved communications. Aramco now uses ‘high precision’ reservoir models to reduce uncertainty and optimize reservoir management. The reservoir simulator is regarded as a powerful integrator. Here Aramco’s objective is to preserve the fine grain geological models—made up of billions of cells—right through to the simulator. This avoids losing resolution of thin beds and other flow barriers. Today’s model of the Ghawar supergiant has a 250 meter grid size and 10 million cells. This is not fine grained enough. Aramco is now working on a 25 meter grid size and a billion cell model. Running the simulator code on such a monster means massively parallel simulators. But the prize is huge. For Ghawar, a 1% hike in recovery means an extra billion barrels of oil.
Comparisons of different resolution models on the same field have shown that coarse models fail to resolve closely spaced wells. High resolution simulations of water cut against time are ‘much closer to observation.’ Aramco has made a significant investment in cluster technology to run its GigaPowers simulator. A billion cell model of the Safaniya Field, the world’s largest offshore oilfield, took 15 hours to match 50 years of history, leveraging Aramco’s 4,000 core cluster. Dogru wound up saying ‘this is all about finding oil in field outliers with our fine grained model—this is what justifies my existence’ and exhorted others to ‘build bigger models.’ Aramco is already planning for a 100 billion ‘exacell’ model for 2014.
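The arithmetic behind Dogru’s cell counts is easy to check. A minimal sketch follows, assuming only the areal grid spacing is refined and the layering scheme stays unchanged (an assumption; the talk gave no layer counts):

```python
# Rough check of the figures quoted above: Ghawar at 250 m spacing is ~10 million cells,
# the 25 m target is ~1 billion. Refining areal spacing 10x in each horizontal direction
# multiplies the cell count by ~100, layering assumed unchanged.
coarse_spacing_m = 250
fine_spacing_m = 25
coarse_cells = 10_000_000

areal_refinement = (coarse_spacing_m / fine_spacing_m) ** 2    # = 100
fine_cells = coarse_cells * areal_refinement
print(f"Estimated fine-grid cell count: {fine_cells:,.0f}")    # ~1,000,000,000

# The quoted prize: a 1% hike in recovery on Ghawar is 'an extra billion barrels',
# which implies recoverable volumes of the order of 100 billion barrels.
print(f"Implied recoverable volume: {1e9 / 0.01:.0e} barrels")
```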
Pat Leach (Decision Strategies) believes that we are all too risk averse. We evaluate situations of identical risk differently according to how they are presented to us—a phenomenon captured in Kahneman and Tversky’s ‘prospect theory*.’ For most people, the ‘certain equivalent,’ i.e. how much you would take in exchange for an opportunity, is less than the expected value—even though these should be the same. The caveat is that you must be able to afford to lose—or for companies, that no single project will put them ‘in distress.’ Portfolio managers should be risk neutral and base decisions on the expected value. Applying risk at the project level wastes a company’s ability to absorb losses and share gains, although project managers may not like this approach. Companies are even more averse to development project risk because of the magnitude of the potential loss. Here value of information analyses may help. Efficient frontier analysis is also a great tool for applying set levels of risk tolerance.
* www.oilit.com/links/1010_11.
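For readers unfamiliar with the certain equivalent, a minimal numerical sketch follows. The figures and the exponential utility function are illustrative assumptions, not numbers from Leach’s talk:

```python
import math

# Illustrative opportunity (hypothetical numbers): 30% chance of a $100 million success,
# 70% chance of losing the $10 million cost of the well.
outcomes = [(0.30, 100.0), (0.70, -10.0)]   # (probability, value in $ million)

expected_value = sum(p * v for p, v in outcomes)

# Exponential utility u(x) = 1 - exp(-x/R) with risk tolerance R (in $ million).
# The certain equivalent (CE) is the sure amount with the same expected utility.
R = 50.0
expected_utility = sum(p * (1 - math.exp(-v / R)) for p, v in outcomes)
certain_equivalent = -R * math.log(1 - expected_utility)

print(f"Expected value:     ${expected_value:.1f} million")      # 23.0
print(f"Certain equivalent: ${certain_equivalent:.1f} million")  # ~5.5, well below the EV
# A risk-neutral portfolio manager values the opportunity at its expected value;
# a risk-averse individual will accept considerably less, as Leach argues.
```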
This article is an abstract of a longer Technology Report produced by Oil IT Journal’s publisher, The Data Room. More from www.oilit.com/tech.
The Digital Energy panel session at the SPE ATCE in Florence last month was poorly attended (40 in the audience and another 7 on the panel). Total’s Philippe Malzac set the scene, noting that the digital oilfield (DOF) can mean different things in different circumstances. In a brown field, communications may be ‘good to non-existent.’ Pilots may be successful in a specific context but disappoint elsewhere and be viewed as expensive IT ‘gadgets.’ DOF deployment may need organizational reshaping, requiring sponsorship and demonstrated added value—even though a comprehensive business case may be hard to make. DOF deployers may have to include trades unions in their plans. One of Total’s DOF attempts was thwarted by union opposition!
Meyer Bengio (Schlumberger) has been doing key word search across the SPE’s Intelligent Energy/Digital Oilfield library to find that ‘data management’ is down, ‘smart iron’ (intelligent completions) is up and ‘production optimization’ is stable. DOF stakeholders often have different viewpoints and may ‘pull in different directions.’ While IT may be looking for a ‘reference architecture,’ such an objective may be ‘far removed from the asset manager’s viewpoint.’ R&D may have a new algorithm and want a field test. But engineers have a daytime job and may be unavailable for such work. How do we create an environment where new technology can be tested? For Schlumberger this poses existential questions like ‘who are we selling to?’ Some clients have a ‘big brother’ mentality and are resistant to data transparency. Perhaps industry needs a ‘test field’ along the lines of Schlumberger’s own test well.
Peter Kapteijn (Maersk Oil) admitted surprise after early DOF events demonstrated an average 20% hike in NPV. He expected all to rush home and proselytize. It did not happen. He has since quizzed colleagues and found a lack of clarity as to what DOF actually is. While the theory of ‘smart’ is fine, closed loop optimization is a done deal and IT’s value is accepted, it remains hard to communicate to an asset what it is all worth. There are also concerns over software reliability and system complexity, and some are reluctant to share information—particularly with remote operations. One possible explanation of this is the DOF’s ‘risk profile.’ The DOF is a high profile activity and, while it should be a ‘no brainer,’ people shy away because of the potential for failure. There is a tension between individual outcome and business value. Kapteijn suggests that management should ‘de-risk’ the DOF for individuals and their projects—perhaps with a DOF sandbox (like Bengio’s test field). Maybe we should stop communicating the ‘big vision,’ which is too complex and long term. It would be better if ‘leaders were leading!’ But you can do DOF yourself, although you do need to ‘make the right choices regarding architecture and infrastructure.’ Kapteijn did not say what these ‘right choices’ were.
Muhammad Saggaf noted that, for Saudi Aramco, the DOF goes back 30 years with the introduction of SCADA systems in 1982. Today, half of Aramco’s fields are DOF and the others are being retrofitted. The 1.2 million barrel/day Khurais is the largest DOF in the world. Following analysis and intense reservoir simulation, a three month water injection campaign was initiated to boost reservoir pressure. Real time isobaric measurements tracked progress. All this was done before a single drop of oil was produced. The DOF is not a collection of ‘gadgets,’ rather a stack comprising surveillance, integration, optimization and automation. The future is the ‘autonomous’ field combining data collection, simulation and action that produces an optimum strategy at any point in time.
In the Q&A some reported skepticism as to the DOF’s added value. This may be partially due to the fact that many projects are successful but appear outside of DOF scope. An instrumented gas lift optimization exercise may not be called a ‘DOF.’ Others wondered if the NOC culture, with its long term view, was more conducive to DOF projects. There was general agreement that domain/discipline silos remain a barrier to DOF roll-out. Kapteijn warned of projects being hijacked by ‘fundamentalists,’ ‘If you wait until you have solved everything and described all workflows in detail you’ll never get started. You need to start with the basics of a workflow described at the highest level.’ Reporting from an existing DOF, an Aera Energy representative stated that Aera had now automated 90% of its engineers’ repetitive workload, leaving them free to work on ‘investigative stuff.’
The joint industry project on geospatial integrity of geoscience software (GIGS) held an open meeting in ExxonMobil’s UK headquarters earlier this year. JIP members include BP, Devon, ExxonMobil, Petronas and Shell. The project is managed by geomatics specialist Cain and Barnes. Funding has been set at $1.5 million over 3 years. The project is to evaluate the geospatial data handling capacity of some of the industry’s major vendors including ESRI, Halliburton and Schlumberger. GIGS will issue a test data set and provide a mechanism for vendors to self-certify products and work towards a GIGS-compliant specification for upstream software. JIP members will get the detailed software evaluations while GIGS will also release summary information and a test dataset to industry. GIGS also has a role in educating users and developers in the finer points of oil and gas geodetics, data management and geographic metadata. In fact much of the day was taken up with a series of tutorials on geospatial data integrity.
GIGS includes a checklist of recommendations to developers for documenting their software’s data handling capability and troubleshooting activities such as merging datasets with different coordinate reference systems.
A key GIGS evaluator is the degree of compliance of a software tool with the authoritative geodetic data set of the European Petroleum Survey Group—epsg.org. GIGS plans to build on the EPSG work to create a suite of geodetic parameter libraries. Another GIGS sub project covers the user interface for geospatial information—seeking to assure consistency and completeness in geodetic nomenclature. More from www.oilit.com/links/1010_9.
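As an indication of the kind of check GIGS envisages, the sketch below round-trips a position between two EPSG coordinate reference systems and verifies the error is negligible. It uses the open source pyproj library, which wraps the EPSG dataset; it is not GIGS software or the GIGS test data set:

```python
# Illustrative geodetic round-trip check in the spirit of the GIGS test data set.
from pyproj import Transformer

lon, lat = 2.0, 49.0   # a point in WGS 84 (EPSG:4326), degrees
t = Transformer.from_crs("EPSG:4326", "EPSG:23031", always_xy=True)  # to ED50 / UTM zone 31N

e, n = t.transform(lon, lat)                          # project to easting/northing (meters)
lon2, lat2 = t.transform(e, n, direction="INVERSE")   # back again along the same pipeline

print(f"Easting/northing: {e:.1f} m, {n:.1f} m")
assert abs(lon - lon2) < 1e-7 and abs(lat - lat2) < 1e-7, "round-trip error too large"
```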
Advanced Visual Systems has promoted Anoop Chatterjee to CTO. Chatterjee has been with AVS since 1998.
Per Harald Kongelf has been appointed executive VP for Energy Development & Services at Aker Solutions, replacing Jarle Tautra who is now special advisor to the CEO. Tore Sjursen heads-up the new MMO unit.
Edward Flores has been appointed President and COO of Best Energy Services, Inc, succeeding Eugene Allen. He was formerly with Key Energy Services.
Compuware’s Covisint unit has joined Energistics.
Hector Manosalva Rojas has been appointed Executive VP of E&P of Ecopetrol.
Fuel Tech has named Vincent Arnone executive VP, worldwide operations and Paul Carmignani VP new product development.
Ingrain has opened a digital rock physics lab in Mussafah, Abu Dhabi.
Former Governor of Alaska Steve Cowper has joined Knowledge Ops as senior fellow for government affairs.
Brian Gifford has been named VP HR with Lufkin Industries. Gifford hails from X-Treme Oil Drilling.
Michael Baker Corp. has appointed Mark Moore VP and leader of its oil and gas engineering business development.
Investment bank McNicoll, Lewis & Vlak has opened a Denver office headed by Brent Lewis.
Kathy Yelick has been named Associate Lab Director for Computing Sciences at the National Energy Research Scientific Computing Center.
Ed Will, VP Marketing & Strategy for Cameron International, was elected to the board of OFS Portal, replacing Jerry Lummus, who has retired.
Dan Weisberg and Michael Le Lion are joining Panopticon Software, as respectively VP business development for North America and SVP international sales. Weisberg hails from Sybase, and Le Lion from Bishopsgate Financial.
Denise Dorsey, formerly of Lumina Geophysical and Fusion Petroleum, has joined Blueback Reservoir as sales account manager. Former Paradigm Geophysical software developer John Bisbee and ex-Ikon engineer Douglas Waights have also joined the company.
Pacific Gas and Electric is making information about its natural gas pipelines available online to customers.
John Rutherford has been appointed executive VP of Plains All American Pipeline with responsibility for Business Development and Strategic Planning functions. He was previously Head of North American Energy at Lazard Freres.
Qinetiq has appointed a new Health, Safety and Environment Director, Stephen Evans-Howe, formerly of VT Group.
Petroleum engineer Moksh Dani, formerly of Marathon Oil, and Joseph E. Stowers have joined Ryder Scott.
Sentry Petroleum has appointed Dr. Paul Boldy as CFO and member of the Board of Directors.
Halliburton reports that it has deployed 500 ‘SmartWell’ completion systems in 26 countries via its WellDynamics unit.
John Simmons has been appointed CFO and Director of Stewart & Stevenson. He hails from Cooper Energy Services.
T.D. Williamson has promoted Bruce Thames to VP Eastern Hemisphere.
VP Tom Robinson heads-up TerraSpark’s new Houston office.
The National Oil Spill Commission has appointed Tyler Priest of the University of Houston to write a history of offshore oil regulation for its report on the BP oil spill.
Jim Curry has joined Van Ness Feldman to advise on pipeline safety and compliance. Curry hails from the Pipeline and Hazardous Materials Safety Administration.
KemeX founder, director and executive VP Ken James has been appointed to Wescorp’s Board of Directors to oversee technology development.
Wireless Seismic has recruited Kip Ingram as VP Engineering and Lawrence Doudna as VP Business Development.
Fugro has purchased the 30% minority holding in Fugro Jacques GeoSurveys of St. John’s, Newfoundland from Stantec Consulting. The unit has a staff of 80 and will operate under the name of Fugro GeoSurveys. Annual revenues are around $15 million.
Halliburton has acquired The Permedia Research Group, suppliers of petroleum systems modeling software and services.
National Oilwell Varco has signed an agreement, subject to final approval by its board of directors and other conditions to closing, to acquire the Advanced Production and Loading PLC subsidiary of BW Offshore Limited in a $500 million cash transaction.
Qbase Holdings has acquired MetaCarta from Nokia. Nokia will retain the geographic intelligence technology which will be incorporated in its search services.
Qinetiq has sold its S&IS security operations and access control business to ManTech International Corporation for $60 million cash.
Eurasia Drilling Company and Schlumberger have signed a Letter of Intent to sell and purchase each other’s drilling and service assets, while entering a strategic alliance in the CIS. Schlumberger has agreed to sell all drilling, sidetrack and workover rigs currently operating mainly in West Siberia to Eurasia. As part of the sale, the rigs’ crews will transfer to Eurasia. Schlumberger has also agreed to purchase the Eurasia drilling services businesses.
The Building Technologies Division of Siemens Industry has signed an agreement to acquire Site Controls, an Austin, Texas-based provider of enterprise-wide energy management solutions for multi-site commercial businesses. Terms were not disclosed.
Siemens’ IT business has become a limited liability company and will operate under the name SIS GmbH. It will remain a long-term IT service provider and preferred IT solutions partner for Siemens’ Energy and Industry units.
Australian software house Mincom, better known for its presence in the mining vertical, is expanding its oil and gas portfolio with the extensive deployment of a new release of its asset management tool at Colombian NOC, Ecopetrol. Ecopetrol has deployed Mincom Ellipse to consolidate a multiplicity of internal developments with a centralized solution for its corporate maintenance, materials and accounts payable management system (known by its Spanish acronym, SCAM).
Ecopetrol’s asset management environment comprises some 2,900 users in 32 districts, 320,000 assets at 125 warehouses and 15,500 monthly work orders. The company spent around $100 million to catalogue its 289,000 materials items and spends some $350 million per year on maintenance. Ecopetrol’s activity spans national and international exploration and production and refining for the domestic market and export to the southern US.
Alongside the Ellipse enterprise asset management core module, Ecopetrol has deployed Mincom Enterprise Reporting and Analytics for visibility and Mincom Work Management for maintenance. Ellipse provides ‘out-of-the-box’ integration to leading ERP applications including SAP, used by Ecopetrol’s HR and Finance departments. Australian Woodside is another Ellipse user. More from www.mincom.com.
Speaking at the 2010 ECIM data management conference last month, Vasily Borisov introduced Kadme’s WhereOil plug-in for Petrel. WhereOil for Petrel generates metadata for all Petrel projects including information such as date modified and interpreter’s name. WhereOil spiders Petrel projects on a daily basis, indexing such metadata and ‘spatializing’ seismic cube and grid outlines, wells and tops. The technology targets data reuse by tracking previous work and avoiding re-work. WhereOil sets out to help data managers build a Petrel ‘master’ project of quality controlled, spatialized information in Petrel by exposing quality indicators such as ‘projects without name,’ ‘no CRS,’ ‘unrecognized well.’ Top quality projects can be integrated into a GIS, dashboard, or a business intelligence application via the WhereOil REST API. WhereOil has seen take-up in the national data repository space with deployments for Colombia’s ANH and the Nordics ArcticWeb project. An agreement with Tata Consulting allows for enterprise scale support. Watch the WhereOil video on www.oilit.com/links/1010_8.
VRContext is the latest oil and gas software boutique to swear allegiance to Microsoft’s Upstream Reference Architecture (MURA). VRContext has submitted examples of its Walkinside architecture for remote users, distributed control system data access and 3D visualization to the project. Walkinside offers a 3D CAD-type view of sensors on ocean-bottom equipment, allowing drill down through the model to real time data.
Contributed use-case scenarios include subsea asset management and operation. The press release includes what is perhaps the most concrete example of what MURA might include. VRContext’s contribution includes ‘OPC UA/UI collaboration protocols and services, Microsoft SharePoint Server, SOAP HTTP messaging services and XML data access.’ Inputs to Walkinside come from CAD models and ESRI GIS subsea data ‘delivered in a Microsoft .NET and OpenGL environment.’
VRContext CEO Francois Lagae said, ‘We look forward to expanding our MURA contribution by sharing our topside asset operations and maintenance, operator training simulator interfacing and HSE workflows.’ It may or may not be significant but, despite the enthusiasm in the press release, there is no mention of MURA on www.vrcontext.com.
GE continues to build out its $40 billion energy technology portfolio with the acquisition this month of Dresser and Allied Wireline. The $3 billion Dresser deal sees technologies for gas engines, control and relief valves, measurement, regulation and control solutions for fuel distribution added to GE’s Energy unit. 85% of Dresser’s revenue comes from energy customers. Dresser had revenues of $2 billion and earnings of $318 million in 2009.
In a separate deal, GE Oil & Gas’ downhole technology arm has signed a partnership agreement with newly-formed Allied Wireline Services to develop GE’s open hole ‘Ultrawire’ formation evaluation tool suite. GE is also to provide Allied with technical and services support from its established facilities in the United States, Canada and the United Kingdom. More from www.ge.com.
Sharecat Solutions has won an NOK 30 million ($4.9 million) three-year information management contract from Statoil for collecting and classifying oilfield equipment information.
Barco is to provide a 4D visualization room for simulation results and a collaboration room for Petrobras-funded research at the University of São Paulo’s numerical offshore tank (TPN) lab.
DNV has been contracted by the Joint Investigation Team of the US Departments of the Interior and Homeland Security for the forensic examination of the blowout preventer and lower marine riser package from the Gulf of Mexico Macondo/Deepwater Horizon blowout.
Energy Solutions International has signed a contract with Transportadora Brasileira Gasoduto Bolívia-Brasil (TBG) to roll out PipelineTransporter to manage TBG’s transportation contracts, nominations and allocations processes.
Epsis has announced a strategic partnership with Synergy AV of Houston.
Esri has now joined the Microsoft Upstream Reference Architecture initiative.
Foster Findlay Associates and Statoil are continuing their R&D collaboration on 3D seismic analysis. Project results are leveraged in Statoil’s ‘AVI’ advanced volume interpretation application.
Invensys has strengthened its strategic relationship with Cognizant with a ‘go-to-market’ agreement to accelerate delivery of its InFusion Enterprise Control System-based solutions. Invensys has also signed a five-year, multi-million dollar contract with Bangkok-based Thai Oil Public Company for implementation of its SimSci-Esscor Romeo optimization software to improve refinery operations.
Management Controls has sold a license for its Track flagship to ‘one of the world’s largest refining, chemicals, and oil sands companies.’ Track will be deployed to gain transparency and manage costs and project status for complex projects at 17 locations in the US and Canada.
Paradigm has announced an agreement with visualization and collaboration technology provider Cyviz to showcase Paradigm solutions at the Cyviz Technology Center in Stavanger.
Senergy Group has selected Dassault Systèmes’ Abaqus finite element analysis software brand as its primary tool for geomechanics and structural stability assessments of subsurface engineering projects.
Seismic Micro-Technology has joined the NetApp Alliance Partner Program.
WellPoint Systems is to supply its back office system to Alta Mesa Holdings. WellPoint also reports a sale to Mississippi-based Callon Petroleum Co. for the provision of back-office IT services and to deploy its ‘Intelligent Dashboard’ real-time business intelligence solution. Both companies will be using WellPoint Systems’ Bolo product.
ExxonMobil, on behalf of the Marine Well Containment Company (MWCC) has awarded a contract to Technip for front-end engineering and design of underwater well-containment equipment. This system will be used by the MWCC to provide emergency response services in the U.S. Gulf of Mexico.
EMGS reports progress towards a standard format for electromagnetic (EM) data leveraging the HDF5 file format. The JIP that involves EMGS, Statoil and Interaction has now released a format for EM time series data, H5EM-TS. HDF5 includes libraries for C, C++, Fortran, Java, Python and Matlab. A data browser, ‘HDFView’ is also available from www.hdfgroup.org. An overview of the new format can be downloaded from www.oilit.com/links/1010_12.
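To give a flavor of what an HDF5-based time-series store looks like in practice, here is a minimal sketch using the Python h5py binding. The group, dataset and attribute names are invented for the example and are not the H5EM-TS layout, which is defined in the specification linked above:

```python
# Minimal sketch of writing electromagnetic time-series data to HDF5 with h5py.
# Names are illustrative only; they are not the H5EM-TS layout.
import numpy as np
import h5py

samples = np.random.randn(3, 10_000).astype("float32")    # three channels of synthetic data

with h5py.File("em_timeseries.h5", "w") as f:
    grp = f.create_group("receiver_001")                   # hypothetical receiver group
    ds = grp.create_dataset("time_series", data=samples, compression="gzip")
    ds.attrs["sample_rate_hz"] = 250.0                     # illustrative metadata
    ds.attrs["channels"] = "Ex,Ey,Hz"
    grp.attrs["position_utm"] = (435012.0, 6781440.0)

# The resulting file can be browsed with the free HDFView tool from hdfgroup.org.
with h5py.File("em_timeseries.h5", "r") as f:
    print(f["receiver_001/time_series"].shape)             # (3, 10000)
```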
GeoNetwork has issued an update to its eponymous open source toolset for geographic metadata management including search and indexing enhancements, an Inspire search panel and metadata view, an embedded OpenLayers-based web map viewer and multilingual metadata display and editing support for ISO 19115 et seq. metadata. Download the software from www.oilit.com/links/1010_13.
Following on from Jill Lewis’ geophysical standards presentation at last month’s ECIM, we received a statement from the Norwegian Petroleum Directorate (NPD): ‘No decision has yet been taken by the NPD but it is our intention to mandate the most modern standard for the acquisition and storage of seismic field data, i.e. SEG-D Rev. 3, assuming that the standard is fit for purpose, robust and brings real value to the industry. The NPD notes that key players like PGS, WesternGeco, Fugro and Sercel have contributed towards the development of the new standard and take this as a strong indication that the standard is a real step forward. Before any final decision is taken, however, the NPD will consult with industry and with other government authorities.’
The US Federal Geographic Data Committee steering committee has officially endorsed a group of OGC and other standards which underpin the ‘GeoOneStop’ geospatial platform for place-based initiatives. More from www.oilit.com/links/1010_14.
Pearson-Harper (PH) has been awarded a contract for engineering information management (EIM) services and software by Chevron Australia, operator of the Gorgon LNG project. The contract is valued at approx. $10 million. Pearson-Harper’s PHusion will provide a ‘single source of truth’ on engineering equipment to all stakeholders. PH’s Daren Hinchliffe said, ‘Staff and subcontractors from around the world need access to the same engineering information set. We first collect and verify the data and then enable round the clock access to the current data set.’ Both software and data are hosted on Pearson-Harper’s servers with a ‘dual hosted’ disaster recovery system providing data integrity.
The Gorgon Project includes a 15 million tonne per annum (MTPA) LNG plant on Barrow Island and a domestic gas plant with a 300 terajoules/day capacity. Partners in Gorgon include Chevron, ExxonMobil and Shell. PH focuses on major projects. Previous contract wins include the $7.5 billion Azeri-Chirag-Guneshli (Azerbaijan) project, Nexen’s Buzzard development, BP Angola’s Block 31 (4 FPSOs) and Greater Plutonio. More from www.pearson-harper.co.uk.
BP has selected Qinetiq’s OptaSense technology as the security system for the Turkey section of the 1,770 km Baku-Tbilisi-Ceyhan (BTC) pipeline. The optical fiber pipeline intrusion detection system detects and identifies potential threats to buried pipelines. An initial 100 km of pipe was protected by OptaSense last year, and the system successfully detected incidents including vehicular approach, footsteps and digging next to the pipeline. It raised alarms allowing control center operators to alert the police with the exact location of the incident, preventing a potentially significant environmental and business impact.
OptaSense can be retrofitted into existing block valve stations, using existing communications fiber adjacent to the pipeline as an acoustic array to monitor sound and seismic activity. Over 100 km of pipe can be monitored from a single station. Qinetiq is now partnering with Future Fibre Technologies to cross-market OptaSense and FFT’s fence-mounted ‘Secure Fence’ perimeter protection solution. More from www.optasense.net.
Barco has teamed with VidSys on real-time situational visibility for the control room. Barco has extended its ‘TransForm N’ visualization platform and CMS-200 software with VidSys’ physical security information management (PSIM) system. PSIM marries physical security device management with automation and artificial intelligence. PSIM systems analyze information from physical security devices with a view to automatic or manual resolution of incidents.
The project sees integration of Barco’s display wall technology with security camera feeds. Operators can control and view content from PSIM on the display wall from any workstation on the network. The result is a large common operating view of all parts of a facility and current status.
VidSys senior VP Dave Fowler said, ‘The integration of PSIM with the Barco video wall creates a world-class solution for control rooms where real-time visibility is critical.’ Barco’s CMS-200 control room management suite provides operators with configurable access to plant information. Audio-visual-over-IP technology makes for flexible deployment. Barco expects to finalize integration partnerships with other security applications companies in the coming months. More from www.barco.com.
Zebra Enterprise Solutions (ZES) launched its Personnel Safety Solution (PSS) at the Process Safety in E&P and Petrochemical Facilities Conference in Houston last month. PSS leverages the location accuracy of ultra-wide band (UWB) radio technology to track all on-site workers, visitors and truckers and to restrict access to dangerous areas. PSS automates safety processes, real-time hazard detection and incident management. Operators receive notifications when personnel enter dangerous areas without a permit. PSS also counts and locates thousands of people in real time as they assemble in evacuation areas, providing coordination of evacuation procedures and emergency teams.
PSS is also used in ‘what-if’ exercises to run real world scenarios through software analysis, visualizing the location of assets and personnel following an incident to discover spatial patterns. The solution also provides reporting on safety violations and compliance issues, providing feedback on improvements to evacuation processes. The Dart UWB product is ATEX certified for use in harsh, dense metal environments, and in the presence of explosive gases. One leading oil and gas client has deployed PSS to improve work and visitor safety at a 125,000 square-foot facility. More from www.oilit.com/links/1010_7.
Speaking at the SPE ATCE in Florence, Jonathan Zwaan enumerated some of Baker Hughes Inc.’s (BHI) recent acquisitions in the upstream software space including JOA Jewel Suite, GMI and, most recently, Meyer & Associates. The acquired toolset is to be integrated into BHI’s Reservoir Description Service (RDS) unit and will be ported to the ‘Jewel Suite framework,’ now a platform for seismic to simulation (STS) workflows. One differentiator in BHI’s STS offering is the finite element geomechanical modeling from GMI and the integration of tetrahedral meshes with Dassault Systèmes’ Abaqus finite element analysis toolset. Workflows on show included quick-look reservoir screening in IMEX/CMG along with multi-point geostatistics with Iphasia. A ‘multi-scenario’ ensemble Kalman filter has been developed with TNO.
Calculations pass back and forth between grids and tetrahedral meshes, with Abaqus results feeding back into the simulator. BHI is also working with major clients to include their own workflows. Jewel Suite offers copy/paste and version control of interpretation objects. WITSML is used to stream microseismic data from BHI’s PSFusion/Magnitude units for shale gas frac monitoring. Is this the first ‘true shared earth model’? Zwaan thinks BHI is getting close thanks to Jewel’s mature SDK. BHI’s RDS unit includes the 2008 Gaffney Cline acquisition. More from www.bakerhughes.com.
Speaking at the Nvidia GPU technology conference last month, Tom-Michael Thamm and Marc Nienhaus of Mental Images (MI) described how its experience of GPU-based computing for DreamWorks and Sony has been leveraged in a seismic data pilot for an unnamed ‘US oil and gas major.’ MI’s scalable platform for 3D graphics includes the Mental Ray ray-tracing flagship and RealityServer, a web services based GPU compute server.
But the core component for the seismic display environment is MI’s distributed compute environment, ‘DiCE.’ DiCE, whose customers include Autodesk and Dassault Systèmes, hides GPU/multi core complexity from the developer, providing cluster management and networking services, a ‘distributed in-memory’ database, thread management and scheduling.
MI found seismics to be very different from entertainment visualization. While transparency and raytracing are common to both verticals, subsurface data is ‘sparse’ and requires a different approach.
The system is claimed to scale to petabytes of data, distributing and balancing the workload across groups of specialized clusters. Real time interactive volume horizon rendering was tested with 20 horizons with 20 million triangles each. MI’s DiCE is claimed to simplify scaling of high-performance applications, enabling developers to concentrate on the core aspects of their field of expertise.
Mental Images is a wholly-owned Nvidia brand. More from www.oilit.com/links/1010_1.
In a webinar this month, Tibco Spotfire and Drillinginfo teamed to show how data visualization is used to find meaning in data and liberate users from Excel ‘hell.’ Combining Spotfire analytics with Drillinginfo’s live data feeds avoids the need for local copies of data and the risk of out-of-date analyses. Spotfire provides multiple clues and hints to investigators with map-based representations of production data and mouse-over pop-ups showing well or oilfield metadata.
Four data dimensions can be viewed simultaneously and controls used to highlight, say, the top six producers. Data drill down is used to investigate trends and root cause analysis. A click on a trend line shows which wells are involved in a particular event.
Established workflows can be saved as a library file and made available to other users. Analysts don’t need a local copy of Spotfire, they just log on to the web player. Various real world scenarios were demoed—which Eagleford shale operators are selling, where they are located relative to the oil window, what neighboring acreage is about to expire and so on.
At the Spotfire Energy Forum earlier this year Jeff Mathews described how similar functionality has been embedded in Chevron’s integrated reservoir analysis and visualization environment (iRAVE). iRAVE is used to perform due diligence during acquisitions. More from www.spotfire.com.
Oilfield services provider Weatherford International has deployed a contract lifecycle management solution from Dallas-based CLM Matrix to automate internal workflows and meet corporate governance, compliance and reporting needs. The solution will be used in bid tender negotiations with major oil producers to extend existing services or provide new products and solutions.
CLM’s Matrix Enterprise package will be used to establish global standards and procedures for customer contracts. Weatherford’s compliance and risk management teams will now be party to the entire contract process—giving them the visibility they need to stop or change a contract that would have created unwarranted risk.
Matrix Enterprise is the first application that Weatherford’s IT organization has rolled out internationally on Microsoft SharePoint. Deployment will start in the fall of 2010. More from www.clmmatrix.com.