November 2012


Uncertainty mastered

Statoil and Emerson sign multi-year agreements to develop uncertainty management and production forecasting. Enable’s ‘proxy-based’ history match gets EnKF boost. Thin client/NoSQL for EC2 cloud.

In our report from the 2012 EAGE (Oil ITJ July 2012), we summarized a presentation by Statoil’s Xavier van Lanen and Jan-Arild Skjervheim outlining an automated workflow spanning geomodeling to simulation. The compute-intensive technology (28 CPU hours per realization was cited) provides a ‘whole loop’ workflow around structural modeling and production history matching—along with rigorous evaluation of uncertainty in production forecasts.

Oil IT Journal can now reveal that much of the enabling technology for Statoil’s workflow comes from Emerson’s Roxar unit which has now embarked on a three-year ‘Total uncertainty management’ (TUM) program with Statoil to further develop and commercialize the technology. TUM sets out to enhance Roxar’s reservoir management software, notably the Enable assisted history match and uncertainty management tool. TUM will lead to commercial software applications that allow uncertainties to be quantified across the complete reservoir characterization and development workflow. New solutions for horizon and fault uncertainty in Roxar’s RMS structural modeler will be unveiled early in 2013.

TUM builds on the ‘proxy-based’ history matching technique behind Roxar’s Enable, which has undergone field trials by Statoil and will be enhanced with Statoil’s in-house developed ensemble Kalman filter (EnKF), said to be successful in modeling the results of 4D seismics.
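Neither Statoil nor Roxar has published implementation details, but the core of the EnKF analysis step is compact enough to sketch. The following Python/NumPy toy (our variable names, a linear observation operator and perturbed observations; production implementations add localization and regularization) shows how an ensemble of reservoir model realizations is nudged toward observed production data:

    import numpy as np

    def enkf_update(X, d_obs, H, r_std):
        # X: (n_state, n_ens) ensemble of model realizations
        # d_obs: (n_obs,) observed data; H: (n_obs, n_state) observation operator
        n_ens = X.shape[1]
        D = d_obs[:, None] + r_std * np.random.randn(len(d_obs), n_ens)  # perturbed obs
        A = X - X.mean(axis=1, keepdims=True)        # state anomalies
        Y = H @ X                                    # predicted data
        Yp = Y - Y.mean(axis=1, keepdims=True)       # predicted-data anomalies
        C_xy = A @ Yp.T / (n_ens - 1)                # state/data covariance
        C_yy = Yp @ Yp.T / (n_ens - 1) + r_std**2 * np.eye(len(d_obs))
        return X + C_xy @ np.linalg.solve(C_yy, D - Y)  # Kalman-gain update

    # toy usage: 100-member ensemble, 500-cell state, 20 observations
    X = np.random.randn(500, 100)
    H = np.random.rand(20, 500)
    X_post = enkf_update(X, H @ X[:, 0], H, 0.1)

In a real history match the predicted data come from running the reservoir simulator on each ensemble member, which is why the workflow is so compute-hungry (cf. the 28 CPU hours per realization cited above).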

Roxar claims a key differentiator for its toolset is its thin client approach and the ability to spin out compute-intensive work to a Linux cluster. Even if you haven’t got your own cluster, that’s OK: you can run it in the Amazon EC2 cloud. As Roxar’s Dan Dailey told Oil IT Journal, ‘90% of the world’s fastest computers run on Linux. Linux, along with our thin client architecture, eases the transition to the cloud. Other reservoir management software packages [read Petrel] are more ‘fat-client,’ with most of their resources installed locally. This approach is less well-suited to cloud computing.’ Roxar claims a significant cost advantage to running Linux in the cloud over similar number crunching on Windows.

Another key component of Roxar’s cloud/cluster strategy is its ‘.Rox’ (pronounced ‘dot Rox’) environment, first announced in 2010 (OITJ July 2010). The .Rox platform comprises what Roxar describes as ‘a schemaless, transactional, NoSQL object store.’ At first, the system will ship with filesystem-based storage; later on, open-source and/or commercial databases will be available. More on the state of play in high performance computing, as witnessed by the latest TOP500.org list, on page four of this issue and from Roxar.
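Roxar has not published .Rox internals, but ‘schemaless, filesystem-backed object storage’ is easy to illustrate. A deliberately naive Python sketch (all names ours; a real store would add transactions, indexing and concurrency control):

    import json, pathlib, uuid

    class FileObjectStore:
        # toy schemaless store: any JSON-serializable object, one file each
        def __init__(self, root):
            self.root = pathlib.Path(root)
            self.root.mkdir(parents=True, exist_ok=True)

        def put(self, obj):
            oid = str(uuid.uuid4())
            (self.root / oid).write_text(json.dumps(obj))
            return oid

        def get(self, oid):
            return json.loads((self.root / oid).read_text())

    store = FileObjectStore('/tmp/rox_demo')
    oid = store.put({'type': 'horizon', 'name': 'Top X', 'z_unit': 'm'})
    print(store.get(oid)['name'])

No schema is declared anywhere: objects of any shape can be stored side by side, which is also what allows the backing store to be swapped for an open-source or commercial database later.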


Refinery-wide optimizer

Invensys adds Spiral Software’s ‘ground-up’ refinery management solution to its SimSci-Esscor hydrocarbon processing industry (HPI) product line-up.

Invensys has acquired Cambridge, UK-headquartered Spiral Software, a provider of ‘crude oil knowledge-management tools’ for traders and refiners. Spiral’s planning and scheduling solution is a multi-user environment for sharing data and models across supply chain work processes.

The new functionality will be available from Invensys’ SimSci-Esscor unit as a component of its hydrocarbon processing industry (HPI) solution which will now support ‘refinery-wide’ optimization.

Invensys VP Tobias Scheele said, ‘Our software helps clients understand the financial impact of business decisions in real time. The extension of our portfolio will help drive profitability and performance improvements across the refinery.’

Spiral’s yield accounting, off-sites and planning and scheduling functionality allows for the update of linear programming models as feedstock profiles or equipment configurations change. Crude purchasing, production schedules and sales can be aligned with real-time market conditions. Risk-analysis functions provide insights into how planning and scheduling scenarios may be exposed to changes in feedstock costs, product demand and refinery operations. More from Invensys.


Saudi Aramco’s 100,000 trace seismics and 7 petabyte datacenter

Editor Neil McNaughton attends a mind-boggling presentation at the Las Vegas SEG. Processing multi-petabyte surveys forces a rethink of workflows and QC. Field tests and visual shot inspection are old hat, replaced with automated QC and interpretation. ‘Noise’ is no more—it’s all signal now.

I have a confession to make. I have in a previous life destroyed data. I knew I was doing it at the time, but I just could not help myself. Out in the field with a state-of-the-art hard disk and 21 track recording set-up, we were concerned over noise levels and did what was considered, circa 1980, a ‘test.’ This involved fiddling around with a few different patterns of geophone arrays for input to the 48 trace system and visually inspecting the results. Like thousands of seismologists before (and after) us we decided that a largish array gave us significantly better signal.

Even then I knew that there was something wrong with summing in the field. But my own guilt was mitigated by the knowledge that the data would be summed and visually inspected a lot more before the processing shop was through with it.

Fast forward 30 years to the 2012 meet of the Society of Exploration Geophysicists this month in Las Vegas (full report in the December issue of Oil IT Journal) and two presentations from Saudi Aramco’s Peter Pecholcs and Brian Wallick that debunked once and for all the concepts of ‘stack’ and ‘noise.’

These were not theoretical talks. Aramco presented the results from three surveys conducted by WesternGeco using a 100,000 trace land seismic acquisition system. The idea was to abandon arrays in favor of point sources and receivers. Pecholcs observed that, ‘in the old days, the array gave us a good feeling, you could see data.’ But today the clever stuff is done in the computer center, ‘giving the signal some respect.’

The largest survey produced 165 billion traces and three petabytes of data. Processing this data abundance required some 7 petabytes of storage in the data centre and a rethink of processing workflows and quality control. With such large volumes, ‘you can’t just do stuff (like changing a decon operator) over.’ Current visual inspection of a 15dB plot is ‘no good any more.’ A technique evolved around fast-track volumes followed by pilot processing of a mere 55 terabyte/9 billion trace data subset. Substantial up-front 2D field testing failed to give a clear-cut indication of the potential of the mega survey. But the survey went ahead regardless and produced spectacular 3D results. Pecholcs recommends you ‘don’t waste money on 2D tests.’

To process these huge data volumes, computer technology needs to scale and QC needs to improve; difference plots ‘just don’t hack it.’ Moreover, with good spatial sampling, what used to be thought of as ‘noise’ is actually coherent signal, scattered from a shallow low velocity layer. This was the first time the top ten meters of sand had been imaged.

In the Q&A, Pecholcs was asked if ‘visual’ QC could usefully be replaced with statistical methods. He opined that what was needed was ‘just good geophysics, no neural nets to complicate our lives, go back to geophysics 101.’ Finer sampling means that you see what ‘noise’ actually is. But the scary thing is that the signal ‘is so complex that no time domain method can flatten an event, and no one can build a velocity model.’
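Pecholcs did not detail Aramco’s statistical QC, but the flavor is easy to convey. A minimal Python sketch (synthetic data and our own thresholds) that flags anomalous shot records with a robust outlier test rather than visual inspection:

    import numpy as np

    # synthetic stand-in for shot gathers: n_shots x n_samples
    shots = np.random.randn(10_000, 2_000).astype(np.float32)

    rms = np.sqrt((shots ** 2).mean(axis=1))   # per-shot RMS amplitude
    med = np.median(rms)
    mad = np.median(np.abs(rms - med))         # robust spread estimate
    z = np.abs(rms - med) / (1.4826 * mad)     # robust z-score
    flagged = np.where(z > 5)[0]               # shots needing human review
    print(f'{flagged.size} of {shots.shape[0]} shots flagged')

Run over billions of traces, summary statistics like this are cheap; the expensive human eyeballs are reserved for the few records that fail the test.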

Aramco’s project is for an ‘integrated broadband acquisition-to-rock mechanics’ methodology. Brian Wallick took over to address the interpretation and reservoir characterization aspects. The aim is for data that is easy to interpret, ‘data should interpret itself.’ One major problem is intra-bed multiple contamination—but as this is better sampled, it is easier to remove. Bandwidth has increased from 8-30Hz to 3-45Hz, bringing better lateral continuity and data that is much more amenable to autotracking. Aramco has only scratched the surface with this data set; there is lots more to do in terms of rock physics and ‘impedance fidelity.’

Just in case you don’t see the trend, the Aramco presentations were followed by Shell’s Guido Baeten who described tests performed by BGP on the new HP/Shell/PGS wireless sensor network—with a million-channel capacity!

All in all, the race for traces represents a whole new set of challenges and opportunities. Tomorrow’s processing and interpretation will require more rigor and automation right across the workflow. More from Pecholcs’ and Wallick’s abstracts.

Follow @neilmcn


Letter to the Editor

Graham Davidson, CNR International comments on our ‘Training the data managers’ editorial.

I read with interest the October 2012 editorial, ‘Training the data managers.’ I have attended a number of discussions on this subject under the auspices of Robert Gordon University here in Aberdeen and elsewhere. Most such discussions focus on ‘unstructured data,’ the stuff which isn’t in a database, in the mistaken but understandable belief that, for instance, OpenWorks is managing your data for you or at least that once it’s in there your data is somehow ‘clean’ and remains so.

Of course this misunderstanding is not confined to subsurface data. Many believe that the implementation of an enterprise resource planning or asset integrity management system is an end in itself. In fact it is just a milestone somewhere near the beginning of a never-ending road!

I am currently working on an application strategy for our organization and I am starting with an analysis of why we have applications at all. We talk about ‘supporting business processes’ but what exactly do we mean?

One view is that an application strategy is an element of a quality management system for information, an idea that I am now pursuing through the auspices of the Chartered Quality Institute, amongst other avenues. Comments anyone? Graham Davidson.


Interview—Eldad Weiss and Duane Dopkin, Paradigm

Oil IT Journal interviews Paradigm CEO and EVP-Technology. How a ‘pure play’ software house is different. Apax acquisition and an IPO? Linux, Windows and HPC. Automating interpretation.

What differentiates Paradigm?

Eldad Weiss—Paradigm has no attached service business. Our clients like the fact that we are a pure play software provider. Also we like not being in the cutthroat seismic business!

Duane Dopkin—Software developed by a pure play company is different from that developed for a service company’s own needs. We closed our service division a couple of years back because of this and because of conflict with our customers’ in-house seismic processing divisions and reservoir services organizations.

Where does Paradigm originate its software? Universities, consortia?

EW—About 95% of our software is organically developed. Even when we acquire companies, their tools are re-written to a common code standard and interface. Whether you buy or build, you have to re-build anyhow.

DD—We remain active in many consortia. But these provide ideas rather than code.

What was the rationale for the Apax deal?

EW—A company’s development needs to match the size of its financial backers. Ten years ago, Fox Paine’s $2.5 billion assets were a good match for Paradigm. The Apax deal moves us to the next level. With $36 billion under management, Apax is one of the largest private equity companies in the world.

So when’s the IPO?

EW—Apax invests for the long term. Growth is the first objective.

But what does Apax do with acquisitions?

EW—They will eventually seek liquidity.

Presumably it will be hard to find an even bigger buyer! So an IPO is likely?

EW—You are starting to connect the dots... Paradigm has been a public company in the past.

Last time we spoke, the subsalt Gulf of Mexico was Paradigm’s strength.

DD—Our software’s ability to image the subsalt is still key. But it is not necessarily the emphasis now, as you can see from the diversity of our customer base. They are now faced with pressure to meet the new requirements of non-conventionals.

What are your key offerings here?

DD—Our ES 360 technology is important in this space. We are also working on novel seismic processing techniques which avoid traditional data sectoring, using the whole data set all at once. Clients also want to simulate the whole oil field with multi-billion cell reservoir models. These are achievable with Gocad/Skua’s unstructured gridding. This approach has huge implications in the Middle East and China.

We attended an Aramco talk on 100,000 trace seismic acquisition.

EW—Acquisition companies are starting to work with us as they promote new techniques. Processing software has to follow. Our tools are used up front to optimize acquisition.

Where is the Linux/Windows debate going today?

DD—On the desktop we are 50/50 Linux/Windows. On the cluster it is primarily Linux. Microsoft has not really made it in the cluster arena.

EW—All the new broadband processing and acquisition produces great imagery. But what do you do with it? You need to help the interpreter. We are showing novel autopicking technology to selected clients. Constrained autopicking in Skua picks hundreds of horizons at once. The results are displayed as multiple pick scenarios that the interpreter can choose from. Autopick is a big change for the industry—not everyone realizes how big!

Like the shift from 2D to 3D in the 1980s?

EW—Exactly. Today’s high quality data sets are forcing a rethink of how we do interpretation.

DD—Scalability is also important. Other interpretation tools run on a single system. But some of our clients have a thousand users. This requires a different approach to software engineering and security. We will be making announcements around security, data management and vendor neutral data formats over the next couple of years.

What formats?

DD—Open data standards—well accepted, published standards such as ResqML and PPDM. There is a consensus that these now represent mature, rich data models.

And the SEG?

DD—Not so much. This is more up to the data creators.


Book Review—‘After the US Shale Gas Revolution’

Thierry Bros’ exhaustive account of the worldwide natural gas business.

Société Générale analyst Thierry Bros’ new book, ‘After the US Shale Gas Revolution*’ is a 150-page, copiously illustrated and highly readable textbook-cum-treatise on the worldwide gas industry in the light of the ‘shale gale.’

We were concerned that the ‘After’ of the title might signal an attempt to look into the future in what, as was outlined in our September 2012 editorial, is a complex field with many unknowns. But Bros is no Nostradamus and ‘After’ is a solid analysis of the US situation and an overview of the potential of shale gas in other parts of the world (where the future demand will come from) and the impact of ‘green’ issues.

Bros’ argument can be summarized thus. The cost of natural gas production in the US is around $4/mcf but current market ‘imperfections’ (‘free’ gas associated with shale liquids) are ‘blurring’ the price signal—gas was selling for around $2 when ‘After’ was written. As the imperfections ease, prices will move up to around the $4 mark—but not much more as drilling will quickly return, dampening a potential hike.

The forecasting part is contained in the last ten pages, where the impact of LNG export from the US is briefly examined. Bros opines that the EU is today paying almost 1% of GDP for ‘overpriced’ Russian gas. But since Gazprom’s cost base of $6/mcf is about the same as the delivered cost of US LNG to the EU, Russian exports will remain key.

‘After’ is bang up-to-date, including Exxon’s Polish disappointment and the 2012 US Department of Energy downgrade of the Marcellus resource (but not production). It is hard to capture, in a short review, the sheer quantity of information in ‘After’: every page is packed with facts, figures and perceptions. A must read—even sans an index!

* Technip 2012, ISBN 9782710810162—www.editionstechnip.com.


INT+Tibco/OpenSpirit+Matlab=ecosystem?

IntViewer links to Matlab, Seismic Workbench. Spotfire leverages OpenSpirit Connect.

A number of announcements this month suggest that a new upstream development ecosystem is emerging from a few key component and infrastructure providers. The 4.5 release of INT’s IntViewer seismic data visualization application and development platform sees the addition of Matlab and Seismic Workbench plug-ins, providing an integration pathway to script-based and open-source processing systems, including Seismic Un*x.

Tibco has announced OpenSpirit Connect, ‘pre-built’ web services which connect the world of upstream data to Tibco’s hub—linking geotechnical data with corporate and financial data sources. Tibco’s Spotfire data analysis package now plugs and plays with a variety of upstream data sources—again via OpenSpirit—promising ‘event-enabled’ integration, discovery and predictive analysis of energy ‘big data’ sets.

A MathWorks webinar heard Graham Dudgeon vaunt the merits of Matlab and Simulink in the energy sector, where the high level scripting language is used to perform data access and predictive modeling.

Is this an ecosystem? Well, INT’s technology is under the hood of several OpenSpirit tools and Matlab provides an alternative route to high performance computing, including number crunching on the GPU (Oil IT Journal May 2012).


Geophysicus—‘more than meets the eye’ in seismic data

3rd Science Software’s Steven Lynch claims that SeisScape reveals hidden information.

Erudite blogger ‘Geophysicus,’ rumoured to be Steven Lynch, Principal Investigator and CEO of Denver-based 3rd Science, believes that there is ‘more than meets the eye’ in seismic data. Lynch believes that seismic sections contain an ‘almost limitless’ amount of information that may be hard to communicate to the viewer visually—impacting an interpreter’s ability to make informed decisions.

While digital processing has made great strides in improving resolution, our ability to communicate this to the interpreter still relies on the same combination of wiggle trace displays, chromatic variable density displays and achromatic grey scale displays as it did over a generation ago. Lynch believes that such technology is ‘antiquated,’ contending that it fails to reveal ‘entire levels of detail and coherent signal’ that exist in the data.

3rd Science purports to bring this hitherto unseen data into focus with its MeshLab and SeisScape technologies. Actually we reported earlier on SeisScape—then marketed by Birch Tree Software (OITJ March 2000). Those baffled by Lynch’s claims can download his 2008 PhD thesis.


Eliis/PaleoScan unveils OpenCL-based acceleration

PaleoScan user meet sees novel geobody extraction, API and hardware acceleration.

Eliis hosted the PaleoScan user meeting last month in Montpellier, France, introducing new interpretation functionality to its guided sequence stratigraphic seismic interpretation toolset. These include geobody extraction, multi-core optimization, a data link to Schlumberger’s Petrel and an API for developers.

A new flattening/horizon stack function is useful in visualizing the structural evolution of both seismic and log cross sections in synchronized windows. Hardware acceleration eschews Nvidia’s ubiquitous Cuda; Eliis’ developers have opted instead for the OpenCL standard, which promises a more portable route to acceleration via multi-core CPUs and GPUs.
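Eliis’ kernels are proprietary, but the portability argument is easy to demonstrate. A minimal sketch using pyopencl, the Python bindings to OpenCL (the kernel and all names are ours): the same code runs unchanged on a multi-core CPU or a GPU, whichever OpenCL device is present.

    import numpy as np
    import pyopencl as cl

    ctx = cl.create_some_context()        # picks any available OpenCL device
    queue = cl.CommandQueue(ctx)

    src = """
    __kernel void scale(__global const float *in_tr, __global float *out_tr, float k) {
        int i = get_global_id(0);
        out_tr[i] = in_tr[i] * k;     /* trivial per-sample operation */
    }
    """
    prg = cl.Program(ctx, src).build()

    traces = np.random.randn(1_000_000).astype(np.float32)
    mf = cl.mem_flags
    d_in = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=traces)
    d_out = cl.Buffer(ctx, mf.WRITE_ONLY, traces.nbytes)

    prg.scale(queue, traces.shape, None, d_in, d_out, np.float32(2.0))
    result = np.empty_like(traces)
    cl.enqueue_copy(queue, result, d_out)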

Tests using an Nvidia GT440 accelerator showed speed-ups ranging from 3x to over 60x for various compute-intensive tasks. Further ‘dramatic’ speed-up of most tasks was reported by users of solid state disks.

Eliis is now working on enhanced management of CPU/GPU acceleration along with PsDataIO, a C++ library used to develop the Petrel data link and now available for third party use.


TOP500.org list led by 17 petaflop Oak Ridge Cray XK7

Saudi Aramco’s Dell cluster and Petrobras’ Itautec in Top 500.

The November 2012 edition of the TOP500 list of supercomputers includes seven new entries from oil and ‘energy’ companies. While Petrobras and Saudi Aramco are brave enough to name their submissions, the others curiously prefer to remain anonymous.

All of the new energy entries run Linux. Four of the 2012 entries come from Hewlett Packard’s SL230/250 ‘Gen 8’ cluster range. Petrobras’ Xeon/Nvidia-based machine was built by Brazilian Itautec. Saudi Aramco’s supercomputer is a Dell PowerEdge M610 cluster. A ‘UK-based’ oil company deploys an x3650M4 cluster from IBM.

One of the ‘anonymous’ HP supplied machines comes in at a respectable number 48 with a 458 teraflop performance. This compares with a reported 17 petaflop performance from the current leader—the US DOE/SC/Oak Ridge National Laboratory’s Cray XK7. Like the Petrobras machine, the Oak Ridge Cray leverages Nvidia hardware acceleration. The HP and IBM energy clusters appear to manage without GPU accelerators.

As usual, a caveat is in order. Not all HPC installations bother to report to the List and some may be there masquerading as ‘Geoscience’ machines which we have not tallied. Maybe some of the ‘Energy’ companies are actually designing wind farms! More from top500.org.


Software, hardware short takes

Emerson/Roxar, Hitachi Data Systems, Aveva, Microsoft, FFA, Kepware, Ironclad Performance Wear, Larson Software, SAIC, Ubisense, NetApp, Madagascar, Principle Energy Services, Accenture.

Emerson/Roxar’s Tempest 7.0 offers modules for simulation, pre- and post-processing, PVT analysis and economics. It is said to handle very large data sets, CO2 injection, coalbed methane, SAGD and non-conventional reservoirs.

Hitachi Data Systems has announced a project data management solution with enhanced backup and recovery capabilities for Schlumberger’s Petrel. The plug-in enables project-level data management by automatically synchronizing active data to centrally managed storage.

Aveva has teamed with Microsoft on a ‘Future of plant design’ initiative targeting development of a Windows 8 tablet application to access Aveva’s 3D software in the Windows Azure cloud.

The 2012:2 release of FFA’s GeoTeric image processing application adds new ‘high definition frequency decomposition’ that is claimed to highlight ‘the most subtle’ geologic features.

Kepware has enhanced its flagship KEPServer with a new ABB TotalFlow driver and electronic flow measurement support for Fisher ROC and ROC+ devices.

Ironclad Performance Wear has announced the ‘TouchScreen’ work glove to allow users to operate touch-screen devices ‘without having to remove their gloves.’

Larson Software has added new functionality to its Studio plotting application to create posters, export PDF, and import PDF, DWG and DXF files. New products include Larson RIP for Canon plotters, a CGM to vector convertor and an Apple iPad viewer.

SAIC has released GRGlobe, a GIS front end for viewing data in Google Earth. GRGlobe spatializes data into organized layers for display and management.

Ubisense too has unveiled a mapping application, ‘myWorld’ for enterprise users of Google Maps. myWorld combines geospatial information and enterprise data in a single interface.

The latest version of NetApp’s Santricity management software for its E-Series storage adds SSD cache and new mirroring and replication services for data protection.

The 1.4 release of the open source Madagascar seismic processing package adds five new ‘reproducible’ papers and the Iwave modeling package developed by Bill Symes at the Rice Inversion Project. Madagascar reports that downloads have now reached 20,000.

Principle Energy Services has launched its oilfield noise mitigation product and service line. The technology includes sound impact assessment, acoustic modeling and sound proofing for wellsites and compressor enclosures.

The Accenture Life safety solution (ALSS) was named new product of the year by Occupational Health & Safety magazine. ALSS is a wireless-enabled multi-gas detection system that protects workers in hazardous environments. ALSS integrates Wi-Fi and location-based technologies along with detectors for hydrogen sulfide, carbon monoxide, aromatic hydrocarbons and oxygen.


New release of VSG Avizo. Shell, ExxonMobil report usage

XLab simulators model shale reservoirs. ExxonMobil and Shell present use cases at SCA meet.

The latest release of FEI unit Visualization Sciences Group’s Avizo includes new functionality for investigations of shale reservoirs. The XLab simulators model molecular diffusion, absolute permeability and formation resistivity. The simulators were developed in cooperation with the solid state chemistry unit of the French CNRS R&D organization. The new release also includes video animations that can be exported in MPEG and AVI formats.

Papers presented at the International Symposium of the Society of Core Analysts held in Aberdeen earlier this year illustrated the use of Avizo to analyze micro-computed tomography studies of reservoirs and shales. ExxonMobil used Avizo’s XLab solver to study upscaling in a Devonian shale gas reservoir.

In a second presentation at the same event, Shell used the Avizo Matlab bridge module in an investigation into in-situ measurement of capillary pressure in a sandstone reservoir. The technique is now being implemented as a native module using the Avizo API. Read the case studies on VSG’s Papers page.


Transform Software takes AIM

New ‘analytic interpretation and modeling’ package classifies seismics with multivariate statistics.

Transform Software unveiled its analytic interpretation and modeling (AIM) system at the SEG in Las Vegas this month. AIM is a component of Transform’s TerraSuite interpretation platform. President Murray Roth said, ‘AIM replaces qualitative interpretation with statistics-based classification. AIM uses predictive analytics in the form of MVStats, our proprietary multivariate analysis toolkit.’

MVStats provides a variety of linear, non-linear, unsupervised and other classification techniques. These can be applied to a variety of interpretation problems such as facies mapping, well log lithofacies identification and microseismics. Roth observed, ‘Predictive analytics is not new to our industry. But current tools are generic packages built to span dozens of industries and cannot incorporate geoscience data.’
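MVStats itself is proprietary; for readers wondering what ‘unsupervised classification’ of seismic attributes looks like, here is a minimal sketch using scikit-learn’s k-means (synthetic attribute maps; the attributes and class count are our choices):

    import numpy as np
    from sklearn.cluster import KMeans

    # synthetic stand-ins for amplitude, frequency and coherence attribute maps
    ny, nx = 200, 300
    attrs = np.random.rand(ny, nx, 3)

    X = attrs.reshape(-1, 3)
    X = (X - X.mean(axis=0)) / X.std(axis=0)   # standardize: no attribute dominates
    facies = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(X)
    facies_map = facies.reshape(ny, nx)        # one facies class per map location

Roth’s criticism of generic packages concerns everything around this core: loading seismic and well data in their native formats and pushing results back into the interpretation system.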

MVStats was used by Transform to analyze well production and seismic data over the Eagle Ford and Bakken/Three Forks plays. The results are available as a non-exclusive data set.

Transform also rolled out TerraLocate at the SEG, a package for data quality control of microseismic surveys. TerraLocate’s development was guided by a consortium of eight leading operators and addresses ‘persistent QC challenges in microseismic data acquisition and processing.’ Deliverables are ‘living databases,’ with operators agreeing to an initial purchase and annual subscription to accommodate continuous updates and enhancements.


DOME IDOC 2012—Oman

The third Dome Exhibitions ‘International digital oilfield conference’ held this year in Oman hears from PDO’s smart fields program, Statoil’s IO, Kongsberg’s simulators and Microsoft/myr:conn on the new PetroVisor platform. KOC/Weatherford digitize the Burgan field. Shell’s Smart mobile worker.

Khamis Albusaidi of Petroleum Development Oman (PDO) advocates an ‘appropriate level of smartness.’ For PDO’s smart fields program this currently combines smart capabilities and a collaborative work environment (CWE) to enable optimization across the entire production system from subsurface to surface facilities. The future vision will see the introduction of some closed loop control. PDO’s digital oilfield program builds on a plethora of sensors, for water, vibration, oil in water, solids, H2S and more. Projects include gas breakthrough control, a digital net oil computer and WiMax communications to the well head to support real time well management. PDO has nine CWEs across the organization and has established a smart fields governance structure with champions at ops and engineering director level.

Change management and sustainability are challenging. The idea is to ‘make smart the only way of working.’ This is succeeding: well restoration time is down from six to two days, reducing deferment to the tune of $8 million per year. The CWE has brought better data visibility and enables cross-silo workflows. These bring improved operations and better skill sharing and transfer. In 2013, 80% of PDO’s fields will operate at the ‘appropriate’ level of smartness.

Trude Sundset described Statoil’s integrated operations (IO) as ‘very mature’ with multiple activity guideline documents, broadband communications, a data integration architecture and collaboration centers. Statoil has developed a good division of labor with its contractors as witnessed by the recent drilling services contract, awarded to Baker Hughes, that covers 25 fields. Other IO successes include the Vessel Traffic Management information system (VTMIS), telemedicine, and equipment monitoring. IO is bringing ‘deep changes’ to Statoil’s way of working with multi discipline, cross silo and concurrent processes. Rapid access to domain specialists, real time data and situational awareness are making for a ‘proactive’ way of working.

Maersk Oil’s Pieter Kapteijn looked to the future of the digital oilfield (Dof). For Maersk, the core principle of the Dof is a model-based controller and optimizer running in parallel to the real world of the field. Today, almost all optimization is ‘reactive’ and 70% of the potential for ‘smart’ is as yet untapped. We are ‘reasonably smart’ at well delivery, production operations and asset management. Other core activities are not so well served and ‘lifecycle smartness’ is in its infancy. Worldwide, fewer than 0.1% of all wells are smart. We have to live with poor basic data and we struggle with integration. The future tools of the trade will include pervasive sensing, ‘unlimited compute power,’ bandwidth and a new ‘digital’ generation of employees.

Kongsberg’s Shane McArdle argues that there are more uses for a plant simulator than operator training. At Statoil’s Kårstø gas processing facility, a plant lifecycle simulator is used in a variety of roles to ‘break down the silo boundaries’ and enable change management. Kårstø’s operations are complex. Around 700 ships dock every year and Kårstø is Europe’s biggest exporter of NGL and LPG. A dynamic plant simulator has been in use since 2003, at every phase of operations. The simulator is now a Statoil best practice and has a dedicated five person team of automation, operations and process engineers. The system is used to test advanced process control applications such as Statoil’s ‘Septic’ project.

Kongsberg’s simulator has been used in multiple revamp projects and has proved its worth with faster commissioning, reduced DCS and design faults and, of course, in operator training. The simulator also provides model-based decision making for ongoing field development and ‘fast and easy access’ to reliable information. Life cycle simulation is now included in all specifications. Models are built for the main processes and used to identify long lead time items. DCS checkout is now considered a ‘value add’ activity that helps to understand how the simulator will be used throughout the life of the facility.

Microsoft’s Ali Ferling and Michael Stundner of Austria-based myr:conn introduced PetroVisor, a rapid development environment tailored to oil and gas. PetroVisor runs in the Microsoft Azure cloud, is said to be compliant with Microsoft’s upstream reference architecture (Mura) and has support from Accenture. PetroVisor has been used to write a production tuning demonstrator game and, Stundner hopes, will eventually evolve into a digital oilfield ecosystem.

Schlumberger’s Hammad Mohamed argues that data is the most valuable corporate asset since almost everything else can be replaced. Oils recognize this and capture data to in-house or commercial data stores. But data storage and capture over the years has evolved into an assembly of document systems and technical and operational databases—all of which present multiple challenges to data managers. Data growth has been reported at 80% per year for technical data and unstructured data growth is ‘uncontrolled.’ There is a mismatch between traditional data management and the ‘web’ of E&P data. Current data systems lack the flexibility and scalability required to integrate multiple repositories. Data quality is generally low and still adversely impacts users. One answer is Schlumberger’s InnerLogix data quality management solution, which automates data cleansing, aggregation and validation across today’s multiple data sources.

Ahmad Al Jasmi (KOC) and David Joy (Weatherford) reported from KOC’s digital Burgan field pilot. The proof of concept was deployed on a single gathering center of around 100 wells with water issues. A digital Modbus connection to wellheads provided flowline measurement and remote choke control. Fiber links added downhole temperature and pressure measurement along with monitoring of pump wear. The system leveraged the KwIDF (OITJ October 2012) infrastructure of control rooms, computing and communications. A key contribution comes from the ‘i-DO’ toolset, used to automate ‘sick well’ identification and to develop and enact remediation plans. ‘Raw’ digital oilfield automation is not enough. Systems need to allow for human interaction and management of change prior to CWE introduction and afterwards, to ensure that they are used effectively.

Basil Elzin presented Shell’s ‘smart mobile worker’ (SMW) that extends the ‘comforts’ of the CWE to the field worker. SMW offers advice and applications, and monitors worker health remotely. SMW provides an audio/video link, multiple mobile devices and fall protection. More from DOME and IDOC.


SPE Drilling systems automation technical section workshop

‘Shaping the future of drilling’ meet hears from NOV’s Florence on making machines smarter. Emerson’s Berra on misplaced fear of automation. LeBlanc on NI’s FPGA-based CompactRIO controller. InTechSys’ Tovar on working with DARPA. Debate—FlexRig, is drilling behind the curve?

National Oilwell Varco’s Fred Florence kicked off the Drilling systems automation technical section (Dsats) workshop held during the SPE ATCE in San Antonio last month. Dsats is now established as a not-for-profit with its own officers. Dsats holds regular meetings and is cooperating on R&D with universities in the US and EU. A data interoperability workgroup has been formed and is engaged with the Standards leadership council. For those who are wary of drilling automation, Florence stated, ‘Dsats is less about using an offshore pipe handling system onshore—it is more about how to make the machines we use today smarter.’

John Berra, past chairman of Emerson Process Management, observed that despite many examples of successful drilling automation, there is no real momentum. He puts this down in part to the day rate model, but also to fear of change and the fact that folks ‘don’t want to contribute to their own demise!’ Such fears are misplaced. Berra put Emerson’s success down to having spotted the use of electronics in the 1970s, a disruptive technology at the time. In the following decade, Emerson was an early adopter of microprocessors in control systems—again, in the face of great resistance to change. At the time some competitors believed they could corner the market with proprietary systems and protocols. Emerson sought to ‘differentiate in an open environment’ by adopting the PC-based technology and supporting open communication standards—which ‘make the pie bigger for everyone.’ Turning to the drilling arena—efforts are underway on standardization but again, there is fear of change. Berra believes that the process control standards should be leveraged in drilling—‘don’t build from the ground up.’ Somebody needs to take a lead—‘you need a coalition/consortium’ to navigate the ‘political’ process that will take time. ‘Geekery’ should be avoided by setting clear goals and developing use cases. In the Q&A, Berra cited the FieldBus, OPC and Hart Foundations as examples of standards successes.

Chris LeBlanc’s company, National Instruments, supplies hardware and software bundles for drilling control systems. Many control systems operate in silos and don’t scale well. LeBlanc advocates building control systems with ‘common off-the-shelf’ (Cots) technology—even if his idea of Cots is rather esoteric. The dash for unconventional resources with complex frac and completions has led to the adoption of low cost, small footprint tools and hardened embedded control systems. Making these involves a graphical design tool for control systems, a math capability and optional GUIs. Enter NI’s CompactRIO, a reconfigurable embedded control and acquisition system based on a field-programmable gate array (Fpga) and programmable with NI’s LabView. This offers a graphical programming paradigm that can be used by a domain specialist. One NI poster child is Optimation’s OptiDrill, an intelligent top drive control system. Another is Lime Instruments’ frac control platform, which offers redundant peer-to-peer control across pumps, blenders, chemical trucks and the data van.

Ed Tovar (InTechSys) informed the drilling community that the US Darpa defense research organization has a $3 billion ‘discretionary’ budget. So ‘if you’ve got a good idea...’ Darpa’s mission is to prevent technology surprises. Tovar sees potential for collaboration between Dsats and Darpa but ‘wait until the new year after the election.’

In the debate, one drilling contractor opined that although the industry is perceived as lethargic, if new technologies are not in the contract, they will not be deployed. Operators should say what they want and specify required equipment.

One of the few operators present (maybe 6 out of 150) responded that operators are not attuned to process control. It is used on small subsystems, but can be a struggle to understand and apply at scale. What should operators be putting into contracts?

One researcher observed that ‘requirements’ and ‘research’ may not always be easy to match up. As Henry Ford observed, had he asked what his users wanted, they would likely have replied ‘a faster horse.’ Darpa’s environment encourages ‘co-evolution of requirements and research.’

LeBlanc argued that rapid prototyping could facilitate the integration of novel control systems in requirements documentation. The operator supported the development of some clear specifications for such contract documentation. Operators deploy process control in gas plants but not, so far, in drilling.

Integrating technology development in the boom and bust environment can be hard. How is technology to be advanced in the face of the next bust? The answer may be through new builds, where the relationship between operator and contractor is more malleable. Are the new ‘6th generation’ rigs fully automated? Do they need a man suspended 100 ft in the air? These considerations need to be taken into account during the design phase when they are less costly to implement.

A FlexRig representative vaunted the merits of its technology. Here touch screen control has replaced the brake handle. Simple automation has proved very effective. But the technology has not seen much take-up. Despite considerable investment and a technical success, ‘Wall Street made my life hell!’

But the Dsats movement could benefit from more clarity: ‘it’s not just about rate of penetration gains, but also about harder-to-define concepts like well bore quality.’ One shale gas operator seemed quite happy with the current state of the art, reporting sub-seven day ‘factory drilling’ wells and straighter, better quality holes. Industry may not be behind the curve at all, ‘technology is moving ahead at about the right speed.’ The use of semi-automated systems such as managed pressure drilling (MPD) kit was suggested as a significant safety enhancement—a ‘best in class early detection system’ that is used today on some land wells.

Those interested in drilling automation should check out the Summer 2012 issue of Schlumberger Oilfield Review which has an introduction by Fred Florence and a good summary of the state of play from Walt Aldred et al. The next Dsats symposium will be held in Amsterdam on the 4th March 2013. Visit the Dsats home page.


Folks, facts, orgs ...

Caesar Systems, ISS, Absoft, Noah, PPDM, Neos, Katalyst, Technip, Geospace, Emerson, King Fahd University, Southwestern, Meridium, Oiltanking, Argus Media, Bureau Veritas, Hampson-Russell, Hydrocarb Energy, Prospectiuni, AnTech, OGC, McLaren, Panopticon, Inova Geophysical, Ikon Science, SLR Consulting, Brunei Development, Lloyds Register, Emerson, Fieldbus Foundation.

Jean-Claude Goyon has been appointed EMEA Region Director with Caesar Systems. He hails from Shell.

Shane Attwell and Colin Yamey have been re-elected as Directors of ISS Group.

Absoft has appointed Peter Drury as its first chairman.

Noah Consulting’s Fred Kunzinger is chairman of the PPDM association board.

Neos GeoSolutions has appointed Kristina McGrath as VP HR. She was previously with LyondellBasell.

Kelman Data Management has changed its name to Katalyst Data Management.

Alexandra Bech Gjørv, formerly of Statoil, has been appointed to the Technip board replacing Daniel Lebègue. Pascal Colombani has been appointed chairman of the audit committee.

Geospace Technologies Sucursal Sudamericana has opened in Bogota, Colombia.

Emerson has partnered with the King Fahd University of Petroleum and Minerals in Saudi Arabia on a $25 million technology and innovation centre in the Dhahran Techno Valley.

John Gass has been elected to the Board of the Southwestern Energy Company. He was previously with Chevron.

Former Juniper Networks and Microsoft executive Eddie Amos has joined Meridium as CTO.

Anne-Marie Ainsworth has been selected to succeed Carlin Conner as President and CEO of Oiltanking Partners. Carlin Conner is now MD of Oiltanking GmbH.

Argus Media has appointed Bryan Sanderson, chairman of Cella Energy and governor of the London School of Economics, to its board as a non-executive director.

Carlos Esnard has joined Bureau Veritas as CFO for the North American unit. He hails from Ryder Systems.

Emma Pope has joined Hampson Russell’s Crawley office as a Sales and Training Coordinator.

Hydrocarb Energy has appointed Thomas Hoak as manager of geology and geophysics for its Namibian unit.

Bucharest-based Prospectiuni has appointed Andrew Clark to a newly created position of President. He was previously with Geokinetics.

AnTech has recruited Clair Brown as engineering manager, Nicola Monger as office manager, Francisco Arjonilla as electrical engineer, Zoe Williams as production engineer and Lauren Heath as production assistant.

Denise McKenzie has joined OGC as executive director, marketing and communications.

Richard Beck, formerly of WorleyParsons, is director of McLaren’s new Australian business unit in Perth.

Bruno Saint-Cast has joined Panopticon Software as a senior marketing VP.

Inova Geophysical has promoted Glenn Hauer to president and CEO.

Alsing Selnes heads up Ikon Science’s new office in Norway.

Previously with RPS Group, Nicki Bourne has joined SLR Consulting as principal of its oil and gas team.

Alan Smith (also ex RPS Group) is now principal consultant at the Brunei Economic Development Board.

Lloyd’s Register has appointed R. S. Sharma, former chairman and MD of India’s ONGC, as chairman of operations for South West Asia.

Stuart Brown has been appointed as General Manager, UK and Ireland for Emerson. He takes over from Paul Smith who has been named VP Middle East and Africa. Brown was with AMEC before joining Emerson in 1995.

Ulrich Turck is chairman of the Fieldbus Foundation’s EMEA Executive Advisory Council. He replaces Honeywell’s Jean-Marie Alliet.


Done deals

Asco Group buys into Oniqua. Bentley bags EuResearch, Invensys acquires Spiral, McLaren merges with FMx. Pansoft goes private. Petrofac acquires Oilennium. Verdande seeks cash for CBR.

ASCO Group is now a majority shareholder in Australian analytics-based technology solutions company Oniqua MRO Analytics.

Bentley Systems has acquired Eu-Research, developer of Microprotol, an application for the design and analysis of pressure vessels and heat exchangers.

Invensys Operations Management has acquired Cambridge, UK-based Spiral Software, a provider of integrated solutions ranging from crude assay management to refinery supply chain optimization. The business will continue to be managed by Spiral Software’s existing executive team.

McLaren Software has merged with computer aided facilities management solutions provider FMx Ltd. The merger follows Idox Group’s acquisition of McLaren’s parent company.

ERP software service provider for the oil and gas industry in China, Pansoft, is to merge with Timesway Group and Genius Choice Capital. Pansoft will then become a privately held company and its shares will no longer be traded on the Nasdaq.

Petrofac Training Services has acquired Oilennium, a specialist e-learning provider to the energy industry.

Verdande Technology’s investors, including Investinor AS, ProVenture Management and Statoil Technology Invest are driving an internal effort to raise US$8 million for continued enterprise adoption of case-based reasoning. Verdande recently received a US$1.75 million grant from Innovation Norway to expand its development efforts and promote the use of CBR.


Cyber security round-up

Shamoon hits Aramco, RasGas. Kaspersky’s ‘secret’ OS. Tofino on ANSI/ISA-99 deployment.

A Wall Street Journal report this month provides a good summary of recent computer virus activity in the oil and gas sector and reveals that Chevron was affected by the 2010 Stuxnet malware. The Shamoon virus is said to have destroyed data on 30,000 computers in Saudi Aramco’s network—despite what the company describes as ‘rigorous protection technologies.’ Aramco claims however that its incident response plans and protections (firewalls and network segmentation) meant that ‘all our core operations continued smoothly.’ Shamoon is also reported to have hit Qatar’s RasGas.

In his blog last month, Eugene Kaspersky described today’s industrial control systems (ICS) as ‘defenseless’ and unveiled a ‘secret project’ to develop a ‘secure operating system’ for ICS that can be built into the existing infrastructure. Kaspersky Labs is working on an OS that focuses on running a control system and that is ‘not intended for playing Half-Life or blathering on social media.’ The company is also developing ‘software which won’t be able to carry out any behind-the-scenes, undeclared activity.’

Those interested in a more conventional approach to control system security may be interested in a white paper from Tofino Security published earlier this year titled ‘Using ANSI/ISA-99 standards to improve control system security.’ The white paper includes an analysis of data in the Repository for industrial security incidents (RISI), a database of Scada system security incidents, and an informative account of a real world attack on an oil refinery.


Oracle OpenWorld oil and gas track

Hyperion for Halliburton’s business, Marathon’s planning and Murphy’s forecasting. CSC on big data.

JR Irvin described Halliburton’s financial reporting systems, which are divided between SAP, which ‘runs the business,’ and Oracle Hyperion, which ‘manages the business.’ Every three hours, data transfers from SAP into Essbase. This acts as a core repository for Hyperion planning and financial management. Hyperion systems also provide information for SEC and management reporting. Five IT staffers support Halliburton’s 17-strong finance team while IT infrastructure is outsourced.

Michael Elliott described Marathon Oil’s planning and performance management, also based on Hyperion, which was implemented in 2007 to improve data gathering and speed operations, financial and statistical forecasting. Initially, a constellation of Hyperion Planning, Essbase, WebAnalysis and Financial Reporting was deployed to deliver a monthly ‘rolling’ forecast. After roll-out, corporate financial requirements became clearer and the system was re-tooled to improve the Hyperion Strategic Finance (HSF) model calculations. Now Essbase reporting meets management and ad-hoc analysis requirements.

John Dobbs outlined Murphy Oil’s move from a legacy Excel-based long range planning process, hampered by the interface with accounting and by asset valuations that were disconnected from financial projections. Robust price, expense and project maturity scenario analyses were difficult. Hyperion was implemented in 2011 and now asset valuations are aligned with financial projections. Price forecasts can be made easily and Murphy is moving towards ‘evergreen’ forecasting. Following initial roll-out, a phase 2 implementation has simplified customization and application complexity with universal model templates and streamlined data input and integration. Data quality is reviewed in a front-end Essbase staging area. Murphy now plans to further reduce its reliance on Excel with Hyperion Planning and to offer dashboard reporting on the iPad.

Computer Sciences Corp. (CSC) outlined how ‘big data’ and predictive analytics are transforming organizations—a component of an industry-wide revolution that includes social media and proliferating mobile devices. All of which represents both ‘a problem and an opportunity.’ Big data—in-memory analytics, database appliances and unstructured data—must co-exist with traditional business intelligence. CSC has developed an ‘oil and gas reference model’ and ‘petroleum intelligence framework’ which it claims integrate technical, operational and financial systems. This includes production management, laboratory information systems and real time dashboards. All of which, real soon now, are to leverage Hadoop/MapReduce, ‘shared nothing’ architectures, ‘NoSQL’ and/or ‘parallel relational’ databases. More from OpenWorld.


ISS Group hikes sales team as Schlumberger deal expires

2012 growth, new clients for Babelfish’s ‘operation’ as opposed to ‘information’ technology.

ISS Group reported revenue and profit growth for 2012 and new customers including Chevron Australia, Oil Search PNG and Singapore LNG. The company’s flagship Babelfish software suite is described as ‘operation technology’ (OT) as opposed to ‘information technology,’ a distinction made by Gartner, which has it that ‘an independent world of OT is developing separately from IT groups.’ Gartner further warns that ‘if IT organisations do not engage with OT to create alignment and integration, they may be sidelined from major technology decisions and place OT systems at risk.’

ISS is to increase its sales team and marketing effort in 2013, in part to compensate for the expiry of its deal with Schlumberger covering Babelfish sales into the oil and gas vertical. This brought in $4.5 million in 2012, some 20% of total revenues. A final $1.3 million payment is due in 2013. More from ISS.


Sales, contracts, partnerships and deployments

Allied Geotechnical, IFS Applications, GE, PanAmerican, NanoSeis, Wireless Seismic, Geospace, Baker Hughes, CGGVeritas, iSeis, SRD Innovations, Intergraph, FMC Technologies, Technip, KBR, Paradigm, Aveva, Hyperion, Ikon Science, Aker, LMKR, MapMart, Saddleback, INT.

Alliance Geotechnical Services reports that it has been awarded a contract by the Brunei Economic Development Board for the development of a hydrocarbon national data centre.

Maersk’s drilling and supply services units have chosen IFS Applications in a license and service contract worth $11.5 million. The deal includes financials, project management, supply chain management, and maintenance.

GE Oil & Gas has won a £102 million contract to supply production equipment to Chevron’s offshore W. Africa Lianzi project.

PanAmerican Geophysical has entered into an agreement with NanoSeis to commercialize narrow beam scan microseismics.

Pinnacle Asset Integrity Services has been working to enhance inspection and reliability at Husky Energy’s Lima, Ohio refinery.

MicroSeismic has acquired a cable-less RT System 2 from Wireless Seismic for use in passive seismic monitoring operations.

Geospace Technologies has won an order from TGC Industries for a 24,000-channel wireless seismic recording system. Geospace was also awarded a $160 million contract by Statoil for a 660 kilometer seabed seismic reservoir monitoring system for the Norwegian Snorre and Grane fields.

Baker Hughes and CGGVeritas have announced a collaborative relationship to improve shale reservoir exploration by integrating seismic and well services to identify sweet spots and improve development decisions.

Cableless seismic recording manufacturer iSeis and SRD Innovations have launched hyMesh-Sigma, a new wireless seismic data acquisition system.

Intergraph SmartMarine 3D has been selected by Keppel Fels as the 3D modeling and production solution for new offshore projects, including its next-generation semisub.

FMC Technologies has received a $33 million order for subsea equipment on Total Angola’s Pazflor field.

Total E&P Angola also awarded Technip an EPIC contract for the second phase of its Girassol development project.

KBR has been selected by GDF Suez to execute design work on a floating liquefied natural gas production project offshore Darwin, Australia.

PanAtlantic Exploration has adopted Paradigm’s geophysical and petrophysical applications for its exploration and production work in the deepwater environments of Brazil and Africa.

Synergy Engineering reports successful use of Aveva Laser Modeller to generate an ‘as-built’ intelligent PDMS model for an offshore oil and gas modification project.

Hyperion has delivered a complete laboratory infrastructure to Hellenic Petroleum.

Ikon Science Canada has been awarded a project by Nalcor Energy to map regional pore pressures in the offshore shelf and deep water regions of Newfoundland and Labrador.

Aker Solutions has won a two-year contract extension worth NOK 700-800 million with Statoil for the provision of mechanical wireline services.

LMKR has partnered with MapMart to allow GeoGraphix users access to MapMart’s Core Bundle of terabytes of streaming high-resolution world imagery, topographic maps and elevation data.

Saddleback Geosolutions has leveraged the latest versions of INT’s IntViewer and Java-based plugin API to create its first commercial product, the ‘Attribute Workbench.’


Standards stuff

Fiatech & POSC/Caesar unite. OGC GeoPackage, SensorML 2.0. PODS SP0507. EU data exchange.

The US Fiatech and Norwegian POSC/Caesar Association (PCA) are to unify their ISO 15926 interoperability initiatives under the ‘iRing’ name. Projects include Proteus, the ‘dot15926’ editor, iRingTools and the 2010 joint operational reference data (Jord) project. Jord Phase 1 recently completed, delivering a reference data endpoint, compliance guidelines and a ‘methodology for compliant mapping of information interfaces using template signature patterns.’ Jord Phase 2 is underway and will complete delivery of ‘dependable ISO 15926 reference data services, including those for validating third party interfaces.’ More from iRing and Jord.

The Open Geospatial Consortium has kicked off a work group to advance its GeoPackage (GPKG) standard, a means of providing geospatial services to field workers with intermittent connectivity. OGC is also seeking input for its candidate SensorML 2.0 standard for describing sensors, actuators and measurement processes. SensorML 2.0 is a component of the embryonic ‘Internet of Things.’ Evaluate SensorML 2.0.

PODS is seeking volunteers from member companies to join a NACE/PODS joint task force to ‘re-validate’ the SP0507 external corrosion direct assessment (ECDA) integrity data exchange format, a comma delimited text file format.

The EU Process automation user associations EI, EXERA, WIB and NAMUR met in Brussels last month and agreed on information exchange and the formation of joint workgroups in the fields of safety, wireless automation, IT security, flow devices and alarm management.

The EU has kicked off an FP7 project, the ‘Linked data benchmark council’ (LDBC), which aims to develop benchmarks for RDF and graph data management systems (a.k.a. the semantic web) and to ‘spur industry cooperation’ around them.


Monnit teams with M2M on wireless sensing offering

Low cost low power sensors in Nema 4X enclosures monitor temperature, light and more.

M2M Data has teamed with Salt Lake City-based Monnit Corp. to offer Monnit’s wireless sensing solutions to its oil and gas customers. Monnit’s wireless sensors, available in industrial NEMA 4X enclosures, monitor critical equipment and assets for temperature, light, voltage and more. Sensors operate autonomously and are powered by self-contained batteries that provide up to five years of operation without recharging.

M2M Data CEO Donald Wallace said, ‘Monnit’s low-cost, low-power wireless sensors will allow us to expand our sensor offering and enable our customers to monitor more conditional and environmental variables.’ Data from the new sensors will be captured to M2M Data’s remote monitoring services (RMS). The hosted service remotely monitors fluid flow, compressor performance, tank batteries, artificial lift devices and other related surface equipment. More from M2M and Monnit.


SPE PD2A meet hears of ADCO’s top down modeling

Petroleum data-driven analytics technical section hears of a decade of neural net optimization.

Fareed Abdulla, speaking last month at the launch meeting of the Society of Petroleum Engineers’ new Petroleum data-driven analytics (PD2A) technical section, described ADCO’s use of ‘top-down’ modeling for ‘making difficult reservoir management decisions.’ ADCO has been investigating data-driven analytics since 2001 and began at-scale use in 2005 with a well production optimization study on the Asab field. Here, water injected into the lower reservoir tended to break through to an upper zone and kill the well. A study on a surrogate reservoir model used pattern recognition to train an intelligent system to identify candidate wells. Neural nets and genetic optimization of production rates proved very successful.

The technique was then applied to Bu-Hasa, a large, geologically complex field where similar techniques were used to build a comprehensive reservoir model that has provided history matches of static pressure, time-lapse saturation and production rates for all the wells in the asset. The model’s validity was tested by sub-setting the data and performing ‘blind’ history matches—omitting several years of production data and getting the surrogate model to ‘predict’ the missing data.

The technique of top-down modeling requires no knowledge of the underlying physical processes involved, the idea being that such information is in general either unavailable or incomplete. It is better, the argument runs, to build a surrogate model derived empirically from the ensemble of available measured data. More on TDM from PD2A luminary Shahab Mohaghegh, professor of petroleum engineering at West Virginia University, and from the SPE PD2A.
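For illustration only (this is our sketch, not ADCO’s implementation; the toy data and variable names are hypothetical), a top-down surrogate model boils down to fitting a data-driven regressor, here a small neural net, to measured well data and then validating it with a ‘blind’ history match on withheld years:

    # Sketch of 'top-down' surrogate modeling -- not ADCO's code.
    # Toy data stand in for measured well inputs and production rates.
    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    # Hypothetical inputs: injection rate, static pressure, water cut.
    X = rng.uniform(size=(500, 3))
    # Hypothetical output: oil production rate, with measurement noise.
    y = 100 * X[:, 0] - 40 * X[:, 1] * X[:, 2] + rng.normal(scale=2, size=500)

    # 'Blind' history match: withhold the latest data, train on the rest.
    X_train, X_blind = X[:400], X[400:]
    y_train, y_blind = y[:400], y[400:]

    scaler = StandardScaler().fit(X_train)
    surrogate = MLPRegressor(hidden_layer_sizes=(20, 20), max_iter=2000,
                             random_state=0)
    surrogate.fit(scaler.transform(X_train), y_train)

    # If the surrogate 'predicts' the withheld production acceptably, it can
    # be used to screen candidate wells or, with an optimizer, to tune rates.
    print("blind-match R^2:", surrogate.score(scaler.transform(X_blind), y_blind))

In the field cases above, a genetic algorithm would then search such a surrogate for optimal production rates, rather than running the full physics-based simulator.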


Hexagon Leica TruView Integrator for SmartPlant Enterprise

‘Game changing’ status claimed for bridge between laser scans and plant data management.

Stockholm, Sweden-headquartered engineering and geospatial software house Hexagon is claiming ‘game-changing’ status for a new mapping and data capture solution for owner operators. Hexagon has integrated Intergraph’s SmartPlant asset management software with Leica Geosystems’ TruView laser scanning technology, extending the latter’s usefulness from its traditional ‘as-built’ role to ongoing operations, maintenance and revamp activities.

Leica TruView integrator for SmartPlant delivers photorealistic laser scans of the plant, along with the associated information and documentation, to inspection and maintenance operators. Offline mobile device support brings these representations of assets to field workers.

Ludwig Englmaier, head of engineering with AlzChem AG observed, ‘Plant workers performing inspection and maintenance activities spend too much time locating the items they need in the plant. SmartPlant Mobile, along with photorealistic TruView visualization, will help our operators carry out their work faster and more safely.’ More from Hexagon.


MAOP validator backed by Pacific Gas & Electric

Pipeline safety boost from Coler & Colantonio’s maximum allowable operating pressure calculator.

Pipeline software specialist Coler & Colantonio has announced the MAOP validation calculator (MVC), an aid to gas pipeline safety developed in collaboration with San Francisco-based Pacific Gas & Electric Company (PG&E). Pipeline operators use the MVC to calculate a pipeline’s maximum allowable operating pressure (MAOP) as determined by federal and state regulatory requirements. The calculator’s output supports standardized report generation and the engineering analysis of MAOP validation issues.

The MVC includes federal and state regulations and applicable PHMSA* advisory bulletins to perform MAOP validation for pipeline components. An audit trail of the analysis can be exported to a GIS system and operational and regulatory reports can be generated on the fly.
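While the MVC itself is proprietary, the core regulatory arithmetic is public. The following minimal sketch (ours, not Coler & Colantonio’s code; function and variable names are our own) implements the steel-pipe design pressure formula of 49 CFR 192.105, one input to MAOP determination under 49 CFR 192.619:

    # Sketch of the 49 CFR 192.105 design pressure formula for steel pipe --
    # our illustration, not the MVC's implementation. Note that the actual
    # MAOP is the lowest of several values (design pressure, test pressure,
    # historic operating pressure) per 49 CFR 192.619.
    def design_pressure_psig(smys_psi, wall_in, od_in, design_factor,
                             joint_factor=1.0, temp_factor=1.0):
        """P = (2*S*t/D) * F * E * T."""
        return (2 * smys_psi * wall_in / od_in) * design_factor \
            * joint_factor * temp_factor

    # Example: 24in OD, 0.375in wall, X52 pipe (SMYS 52,000 psi), class 1
    # location (F=0.72), seamless (E=1.0), below 250F (T=1.0) -> ~1170 psig.
    print(round(design_pressure_psig(52000, 0.375, 24.0, 0.72)))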

Sumeet Singh, senior director of asset knowledge management for PG&E’s gas operations said, ‘We continue to invest in best-in-class technologies to enhance pipeline safety and have used the MVC for MAOP validation of our own pipelines. Now other pipeline operators can use it to manage critical assets and enhance the safety of their gas transmission pipeline systems.’

* US Pipeline and hazardous materials safety administration.


SIGMA3 targets naturally fractured reservoirs

Geo-engineering solution blends seismic, well and production data for sweet-spot mapping.

Houston-based Sigma³ (Sigma-cubed) has announced a suite of tools for planning, developing and optimizing fields, with particular attention to naturally fractured reservoirs. The ‘GeoEngineering’ solution is a software and services offering that combines geoscience and engineering expertise from Sigma³’s specialists with the company’s proprietary Crystal software to generate high-resolution geologic models constrained by seismic and well data.

Deliverables include maps of sweet spots, a ‘best-in-class’ 3D geocellular grid and high resolution imaging and inversion. The ‘continuous fracture modeling workflow,’ a.k.a. ‘real-time dynamic earth modeling,’ blends completions data, microseismics, well geometries and the geological model to provide field-scale reservoir property predictions. Sigma³ CEO Jorge Machnizh said, ‘Geoscience and engineering integration needs to be compelling for geoscientists and engineers to actually want to work together! Our technology-led approach to reservoir understanding and management is the sole focus for our expertise and R&D dollars.’ More from info@sigmacubed.com.


IFS rolls out corporate social responsibility reporting toolset

Modules address ‘eco footprint,’ HSE, export controls and compliance/audit management.

IFS has announced a new package for corporate social responsibility (CSR) reporting. The four modules cover ‘eco footprint’ management, occupational health and safety, export control and quality assurance. Eco-footprint covers raw material sourcing, logistics, and emissions tracking, compliance and reporting. IFS Health & Safety is an integrated platform for safety policy management, risk assessment, incident management, KPI reporting and compliance.

The Export Control module targets companies subject to export restrictions on high-tech equipment and electronics. The solution, developed in collaboration with industry-leading customers, identifies any restrictions on a product in question and prevents unauthorized sale or purchase without a valid export license. The QA module supports quality standards compliance planning with integrated audit management, non-conformance reporting, and corrective and preventive actions. It also includes a graphical analysis tool.


Recursion Software white paper on mobile SCADA alarms

SCADA-aware Mobile uses ‘most reliable’ available route to ensure alarm reaches destination.

Frisco, Texas-based Recursion Software has announced ‘SCADA-aware mobile’ (SAM), an alarm broadcast system claimed to exceed the capabilities of SMS, e-mail and pagers. SAM delivers alarms to mobile devices using the most reliable available route and guarantees that alarms are delivered on time. Alarms and associated data from the control system are sent to a designated group, escalating at defined intervals until someone acknowledges receipt. SAM runs on iPhone, Android and Blackberry. The application can be implemented non-intrusively on existing infrastructure and requires very little bandwidth; the verification ‘heartbeat’ takes less than 250 bytes.
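The escalation behavior described above follows a simple pattern; the sketch below (ours, with hypothetical names and stubbed transport and acknowledgement functions, not Recursion’s API) shows the basic loop:

    # Illustrative alarm escalation loop -- hypothetical, not Recursion's API.
    import time

    def send_to_mobile(person, alarm, level):
        print(f"level {level}: pushing {alarm!r} to {person}")  # stub transport

    def acknowledged(alarm):
        return False  # stub: would poll the server for an acknowledgement

    def broadcast(alarm, escalation_levels, interval_s=60):
        """Notify successive escalation levels until someone acknowledges."""
        for level, recipients in enumerate(escalation_levels):
            for person in recipients:
                send_to_mobile(person, alarm, level)
            deadline = time.time() + interval_s
            while time.time() < deadline:
                if acknowledged(alarm):
                    return level  # stop escalating once acknowledged
                time.sleep(1)
        raise RuntimeError("alarm exhausted all escalation levels unacknowledged")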

Conventional alarms are associated with a SCADA tag. SAM can also raise its own alarms when a server or mobile device fails to respond. Specific events can be associated with different sounds, vibrations, or a flashing LED on the mobile. The SAM GUI displays tags in the alarm state along with their status, color coded according to the escalation level. Drilling down into an alarm provides more information and graphs of tag history.

A database editor lets administrators configure alarms, users, actions and behavior in the event of timeouts. SAM users can be authenticated against the corporate OpenLDAP server or Microsoft’s Active Directory. An extra level of security is provided by SSL and by encrypted passwords within SAM.
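Directory authentication of this kind is standard fare. A hedged sketch of an LDAP ‘bind’ check using the open source ldap3 Python library (our illustration, with hypothetical host and user names, and in no way SAM’s actual implementation):

    # Sketch: validate user credentials against OpenLDAP or Active Directory
    # via an LDAP bind, using the ldap3 library. All names are hypothetical.
    from ldap3 import Server, Connection

    def authenticate(host, user_dn, password):
        conn = Connection(Server(host), user=user_dn, password=password)
        return conn.bind()  # True only if the directory accepts the credentials

    # e.g. authenticate("ldap.example.com",
    #                   "uid=operator1,ou=people,dc=example,dc=com", "secret")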


NetApp, Landmark, Quantum team on ProMax benchmark

‘Accelerated processing architecture’ gets workout on E-Series, StorNext appliance bundle.

A new white paper, co-authored by NetApp, Landmark and Quantum Corp., provides an analysis of storage options for seismic processors and new ProMax benchmarks. Prestack seismic data usage and increasing channel counts due to wide azimuth and other novel acquisition methods mean that IT infrastructure and storage systems need a rethink.

Enter Landmark’s ‘Accelerated processing architecture,’ a combo of NetApp’s seismic processing solution, Quantum’s Gateway appliances and ProMax software. Landmark’s researchers have worked with NetApp and Quantum to resolve I/O bottlenecks and have tested the performance and scalability of NetApp’s E-Series systems with large prestack JavaSeis datasets.

The APA is claimed to offer a ‘scalable, cost effective solution’ that is easy to manage. Under the hood is the StorNext high-performance, parallel, shared file system that creates shared data pools and provides concurrent access across heterogeneous environments of SAN and LAN infrastructures.

Benchmarks on up to 80 HPC compute nodes and a 3.4TB JavaSeis data set from SEAM showed that sustained JavaSeis data rates of 2.6GB/sec were achievable with a single E-Series storage system. Best I/O performance was achieved by running one process per client host. Write performance reached 76% of the 2.5GB/sec theoretical peak of the dual 10GbE network.
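For reference (our arithmetic, not the white paper’s): two bonded 10Gbit/s Ethernet links carry at most 2 x 10/8 = 2.5GB/sec, so the 76% write figure corresponds to a sustained rate of roughly 1.9GB/sec.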

