Oil IT Journal: Volume 21 Number 7


'Rapid,’ ML4Shale

Apache has tried conventional decline curve analysis and found it wanting. Enter ‘Rapid,’ machine learning for shale production forecasting. But is it ‘reliable technology’ à la SEC?

Speaking at the 12th Annual Ryder Scott reserves conference this month, David Fulford described Apache Corp.’s use of machine learning to model and forecast liquids-rich shale wells. Working on production data from its unconventional wells, Apache noticed a few issues. First, popular methods of decline curve analysis gave a poor fit to the data. Second, least squares curve fitting failed to accurately forecast ultimate recovery. Third, human surveillance of production is impractical, given the huge number of wells that need forecasting every quarter.

But Apache has a lot of data, as one of the first movers in the Eagle Ford play. Some of its 2008 wells are amongst the oldest multi-fractured horizontal shale wells in the world. This has allowed for extensive look-back ‘hindcasting’ of production, testing various methods for forecasting both production and ultimate recovery.

Shale liquids production is complicated by the fact that flow regimes change as time goes on. In the early stages, when fractures are wide open, a linear flow regime predominates. After a while, the fracs begin to close up and subtle changes to the production mechanism occur. These are relatively well understood and can be characterized by a variety of parameters controlling production, both inside the different flow regimes and during the transitions.

The fly in the ointment of the conventional flow-modeling/physics-based approach is that evaluating the different flow parameters, particularly the onset and duration of the transition period, is hard and subjective. Apache found that its forecasting was yielding unreliable results, particularly using the ‘overwhelmingly popular’ approach to reserves forecasting, the modified hyperbolic model.

Fulford summarized the situation saying ‘for the specific case of forecasting production from shale wells there is no theoretical justification or convincing empirical validation of the modified hyperbolic model.’
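For reference, the modified hyperbolic model in question couples an early-time Arps hyperbolic decline to a late-time exponential tail. A minimal statement of the standard formulation (not Apache’s own parameterization) is:

\[
q(t) = \frac{q_i}{\left(1 + b\,D_i\,t\right)^{1/b}} \quad (t \le t^{*}), \qquad
q(t) = q(t^{*})\,e^{-D_{\lim}\,(t - t^{*})} \quad (t > t^{*}),
\]

where \(t^{*}\) is the time at which the hyperbolic decline rate \(D(t) = D_i/(1 + b\,D_i\,t)\) has fallen to the terminal decline \(D_{\lim}\). The subjectivity Fulford flags lies largely in the choice of \(b\) and \(D_{\lim}\).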

So if the physics is faulty, how about letting the machine learn from the data? Enter Apache’s ‘Rapid’ (rate analytics with probabilistic inference and diagnostics). Rapid uses Markov chain Monte Carlo simulation, a ‘proven technology with over 20 years of oilfield use.’ In fact, Fulford believes the approach could qualify as ‘reliable technology’ in the SEC’s sense of the term. At all events, Apache has got something right with its recently announced ‘3 billion barrel’ find on the southern Delaware basin’s Alpine High.
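By way of illustration only (this is not Apache’s Rapid, whose internals were not disclosed), a probabilistic decline fit can be sketched as a simple random-walk Metropolis sampler over the Arps hyperbolic parameters. The priors, noise model and synthetic data below are all assumptions:

```python
# Illustrative sketch only: a Metropolis MCMC fit of Arps hyperbolic decline
# parameters (qi, Di, b). Not Apache's Rapid; priors and noise model assumed.
import numpy as np

rng = np.random.default_rng(0)

def arps_rate(t, qi, Di, b):
    """Hyperbolic decline rate, q(t) = qi / (1 + b*Di*t)**(1/b)."""
    return qi / (1.0 + b * Di * t) ** (1.0 / b)

# Synthetic monthly production with multiplicative noise (stand-in for real data)
t = np.arange(1, 61)                                  # months on production
q_obs = arps_rate(t, 1000.0, 0.15, 1.2) * rng.lognormal(0.0, 0.1, t.size)

def log_posterior(theta):
    qi, Di, b = theta
    if not (0 < qi < 1e4 and 0 < Di < 1 and 0 < b < 2):
        return -np.inf                                # flat priors, plausible ranges
    resid = np.log(q_obs) - np.log(arps_rate(t, qi, Di, b))
    return -0.5 * np.sum(resid ** 2) / 0.1 ** 2

# Random-walk Metropolis sampler
theta = np.array([800.0, 0.1, 1.0])
chain, logp = [], log_posterior(theta)
for _ in range(20000):
    prop = theta + rng.normal(0, [20.0, 0.005, 0.02])
    logp_prop = log_posterior(prop)
    if np.log(rng.uniform()) < logp_prop - logp:
        theta, logp = prop, logp_prop
    chain.append(theta)
chain = np.array(chain[5000:])                        # discard burn-in

# Posterior spread (P90/P50/P10) on a 20-year cumulative forecast
t_fc = np.arange(1, 241)
cums = [arps_rate(t_fc, *s).sum() for s in chain[::50]]
print(np.percentile(cums, [10, 50, 90]))
```

The output is a distribution of outcomes rather than a single best-fit curve, which is the essence of the probabilistic approach described.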

Comment – if Rapid is not ‘reliable technology’ where does that leave the conventional approaches it was designed to fix? Visit this and other presentations from the conference and brush up on the SEC’s position as de facto ‘rating agency’ for shale operators in the downturn.


Cat bags M2M Data

Cat Connect digital platform extended with acquisition of machine-to-machine communications specialist.

Caterpillar Oil & Gas has completed its acquisition of Denver-based M2M Data Corp. M2M (a.k.a. machine to machine) offers a cloud-based service for remote operation and maintenance of industrial equipment. In its seventeen years of operations, M2M has built on its gas compression origins to establish a significant footprint in the upstream.

Caterpillar Oil & Gas’ director Craig Lange said, ‘This acquisition expands our Cat Connect digital services technology platform. Combining our capabilities will enable us to deliver high-level operational support solutions.’

Cat Connect brings industrial internet of things connectivity to oil country power solutions and includes end user analytical tools for remote monitoring of equipment performance. M2M CTO Trevor Harper added, ‘Our suite of hardware, firmware, software and services provides the level of customization needed to address our customers’ business challenges.’

MVP Capital advised M2M on the transaction. More from Caterpillar.

Comment: Cat Connect brings yet another ‘platform’ to the digital oilfield, alongside GE, SAP and NOV to name but three!


A Flanders and Swann-inspired reflection on novelty

Editor Neil McNaughton remembers the 1960s singing duo Flanders and Swann’s satirical commentary on the new technology of the day, the shiny 33rpm vinyl record. In a ‘back to the future’ mood he rolls in a recent exchange in the Financial Times on the merits or otherwise of ‘antiquated’ Cobol and revisits a hobbyhorse, viz. is there anything new in the new ‘data science’ movement?

Current talk of computing ‘in the cloud,’ ‘at the edge’ (i.e. near the sensors) or in the ‘fog’ (i.e. somewhere in between) reminded me of a 1960s song from the singing duo Flanders and Swann. Their records were popular with both parents and children, which was quite unusual at the time. One of Flanders and Swann’s hits was a satirical comment on the new technology of ‘high fidelity.’ Their ‘Song of reproduction’ was written at a time of technological upheaval, as the old, unamplified 78rpm records were being replaced by stereophonic 33rpm micro-grooved LPs.

I had a little gramophone,
I’d wind it round and round.
And with a sharpish needle,
It made a cheerful sound.

And then they amplified it,
It was much louder then.
So they sharpened finer needles,
To make it soft again.

The other day, I noticed my wife’s oldish MacBook Air playing some music while she was out of the house so I switched it off. I was surprised that this took just a few seconds. My much more powerful PC chunters on for yonks when I shut it down. I am certainly not the first to have observed that for many straightforward tasks, increasing compute power is more than offset by lumbering software. Or, as F&S might have had it…

I had a little PC,
On which I wrote my stuff.
It only had a meg of Ram
But that was quite enough.

Then I got a workstation
And gigs more memory
But bags of blooming bloatware
Brought the blighter to its knee(s).

When the move to the cloud started (and I am talking about Office365 as well as enterprise IT here) it was obvious that there would be a problem of bandwidth. Even a modest desktop PC can easily provide almost a hundred MB/s of read/write bandwidth, more if you are prepared to pay for it. The cloud, for most of us, is likely to be a factor of ten below this and also brings problems of quality of service and latency. Microsoft has addressed this by providing two versions of its software: ‘lightweight,’ i.e. crap, versions that clunk along in the browser, and lumbering bloatware to run on the desktop.

The internet of things is likely to be another such train wreck even though the ‘unforeseen’ (i.e. blindingly obvious) consequences of bandwidth and latency are being addressed by adding computing at the ‘edge’ or in the ‘fog.’ This sounds rather familiar. Before the cloud we had real time systems in the factory/plant. In the field, on site intelligence like a pump off controller would do what it had to do locally. Only a subset of data would make it into the network or scada pipeline. This situation was deprecated by the nouvelle vague of IT, as data that did not make it into the network was deemed to have disappeared down the data drain instead of being gathered as it should have been into the data lake.

Cloud, drain, fog, lake … so many metaphors! So many moving parts! So much confusion! What does it all mean? I think it means that just about any new or not so new technology can now be shoehorned into one or other of these new and nebulous paradigms.

~

An interesting exchange took place recently in the Financial Times where Lisa Pollack reported on a study by the US Government Accountability Office that ‘highlighted the continued use of Cobol in public agencies.’ This was deemed by the Office (and by Pollack) to be a ‘bad thing.’ Cobol is seemingly an ‘antiquated’ language that exposes users to ‘code fragility’ and is heading into a ‘digital dustbin,’ presumably along with its coders. Cobol needs to be urgently replaced by the kind of ‘micro services-based architecture’ favored by the proponents of the cloud.

This reminded me of my October 2002 editorial, ‘Don’t mention the ‘F’ word in marketing!,’ in which I opined that a programming language ought to be tuned to its end-users’ needs. This is true for both science and business. I also demonstrated that in 2002 at least, Fortran was alive and kicking and likely providing the world with more real compute cycles than most other languages. While I was mentally drafting a letter to the FT along these lines, I was pipped to the post by someone far better qualified, one Stephen Castell of Castell Consulting. He argued cogently that Cobol-based code has stood the test of time and that it is in fact particularly robust in the face of code fragility. He encouraged developers to ‘get back to the future’ with ‘unfashionable’ Cobol.

~

I recently stumbled across a 2015 paper by David Donoho, statistics professor at Stanford University, titled ‘50 years of data science.’ Well, that sounds like back to the future already. I was even more intrigued in that Donoho proposed to ‘review [...] the current data science moment and [...to investigate...] whether data science is really different from statistics.’

Donoho’s paper was based on an address he gave at the centennial celebration of the birth of John Tukey who ‘called for a reformation of academic statistics [...and...] who pointed to the existence of an as-yet unrecognized science, whose subject of interest was learning from data, i.e. data analysis.’ And that was over 50 years ago! Tukey was a part time geophysicist, famous for his fast Fourier transform. As everyone knows, geophysicists do (data) till it Hertz!

@neilmcn


Oil IT Journal interview - Mark Reynolds, Southwestern

Southwestern’s senior solutions architect talks about data management in the downturn. Low to no drilling means more time for analysis of historical data. Data managers are getting hammered with requests for stuff that has practically never been looked at before. Fulfilling the queries has been made possible with a large SQL/Witsml repository along with Spotfire, a.k.a. Excel on steroids!

What is Southwestern’s data strategy in the downturn?

We are looking at a year without drilling so focus has moved to our production activity, including a lookback at old data to see how to improve drilling and completion when things start up again. Almost all departments are now looking hard at data, even stuff that has practically never been looked at before. Our data managers are getting hammered with requests. One of our first responses to the new interest in data intensive computing was exposing much of our data in Spotfire. Making this formerly siloed information from geological prognosis, geosteering, completion and production accessible has been a big win for us. We consider Spotfire like Excel on steroids. It provides access to data from multiple sources with tabular and pivot table analytics.

What exactly do you mean by analytics?

For us today this means analysis, spotting trends and doing data clustering. We are in a transition stage. We have not yet embarked on predictive analytics or machine learning.

How is data stored?

With help from Petrolink we store real time Witsml data from our wells. Witsml data is deconstructed and stored in a large relational, SQL database. The idea is to be able to find the ‘pacesetter’ well and see why it was successful. We have been storing Witsml data since 2012 as part of our vision for land-based drilling information aggregation, proactive real-time systems, and post-drill analysis.

So how is the data stored? As tables, traces, XML in blobs?

Currently all the data is in the database and can be accessed via Witsml or by direct SQL query. But direct data access is challenging and Witsml is not conducive to analytics and machine learning. We are working on architectural solutions to resolve both the query lag and Witsml variabilities.
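By way of illustration only (the table and column names below are hypothetical, not Southwestern’s actual schema), a direct query against such a deconstructed Witsml store might look like this:

```python
# Hypothetical example of querying a deconstructed Witsml log channel from a
# relational store; table and column names are invented for illustration.
import sqlite3
import statistics

conn = sqlite3.connect("witsml_store.db")   # stand-in for the real SQL server
cur = conn.cursor()

# e.g. rate of penetration vs. depth for one wellbore, ordered for trending
cur.execute(
    """
    SELECT depth_md, value
    FROM log_channel_data
    WHERE well_id = ? AND mnemonic = 'ROP'
    ORDER BY depth_md
    """,
    ("WELL-001",),
)
rows = cur.fetchall()
if rows:
    rop = [v for _, v in rows]
    print(f"{len(rop)} samples, mean ROP {statistics.mean(rop):.1f}")
conn.close()
```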

Have you considered using blobs?

We could use blobs or a NoSQL-type data store. But right now we are working on accessing data in the existing store. We are also working on a similar project with production and operations data. We want to be able to do analytics on live data for production optimization with machine learning, and to use predictive analytics to schedule downtime, maintenance and workovers.

You are storing lots of time series data. Did you consider PI?

PI carries a heavy commitment and we are determined to stay light and nimble. But we are in no rush, we are still looking for the best technological fit.

And Petrolink’s NoSQL repository?

It is an option. But in the first instance we are looking for a permanent storage option for cleansed data from the field. We are working on problems like ‘how many ways can you spell block height?’ PetroLink’s realtime focus is through a relational (SQL) schema.
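The ‘block height’ problem is a classic name normalization task. A minimal sketch using nothing more than the Python standard library (the mnemonics and aliases are invented, and this is neither Southwestern’s nor Petrolink’s tooling):

```python
# Minimal sketch of normalizing raw channel names against a reference list;
# the canonical mnemonics and raw aliases below are invented for illustration.
from difflib import get_close_matches

canonical = ["BLOCK HEIGHT", "HOOKLOAD", "ROP", "STANDPIPE PRESSURE"]
raw_names = ["Blk Height", "BLOCKHEIGHT", "block_hgt", "Hook Load", "SPP"]

def normalize(name, cutoff=0.6):
    """Map a raw channel name to the closest canonical mnemonic, if any."""
    cleaned = name.upper().replace("_", " ").replace("-", " ")
    match = get_close_matches(cleaned, canonical, n=1, cutoff=cutoff)
    return match[0] if match else None   # None = needs a human or an alias table

for raw in raw_names:
    print(f"{raw!r:15} -> {normalize(raw)}")
```

Fuzzy matching catches the easy spelling variants; the leftovers (abbreviations like ‘SPP’) still need an alias table or a data manager’s eye.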

Are you planning for a data lake?

I don’t like the term but we sure have lots of data to archive. We have scada, Witsml, geosteering – you name it. And we got lots more data with the Chesapeake asset acquisition.


ESRI Petroleum User Group

Esri builds big data capability with ‘spatio-temporal’ geodata store. DigitalGlobe connects ArcMap to the cloud. Infosys brings Witsml into ArcSDE, adding risk object. New GeoAnalytics engine revealed.

Esri’s Mark Bramer introduced the Esri ArcGIS spatio-temporal big geo-data store. ArcGIS 10.3 introduced a ‘managed’ relational database designed to capture, inter alia, time series data. In 10.4, this has been upgraded to handle high volume and velocity, streaming real time data at ‘tens of thousands of writes per second.’ The system is connectable to a variety of real time data streams including Hadoop/Kafka, MQTT, MongoDB and even Twitter! Tests on the system have demonstrated single node write-through bandwidth of over 100,000 writes/second, as compared to some 200/second for the ArcGIS relational data store.

In another presentation, DigitalGlobe’s Harsh Govind introduced a new ArcMap to Cloud connector for ‘geo-big-data.’

Infosys’ Sethupathi Arumugam described a novel approach to capturing Witsml drilling data to ArcSDE for real-time monitoring and risk assessment. The idea is to expose generally underutilized spatial information in the Witsml data stream allowing for its use in geosteering and drilling reporting. Infosys has extended Witsml with a ‘risk object,’ an XML description of e.g. fluid losses over a given interval. The risk object adds information on severity, likelihood and possible mitigation measures. In well planning, multiple wells and risk objects can be assessed simultaneously.

~

The geo-big data theme was also a highlight of the 2016 Esri plenary UC in San Diego with a different slant on spatio-temporal analytics. Esri is working on GeoAnalytics, a new server capability that allows spatial analysis to be run as distributed computations across a cluster of machines. The demo covered a real time data feed of billions of financial transactions. GeoAnalytics was used to aggregate and investigate this large and complex data set to spot fraud. The system spotted multiple near simultaneous fraudulent transactions in different localities, all just below the legal reporting threshold, a likely indication of a money laundering operation. Watch the GeoAnalytics video and download the US PUG presentations.


CDA’s unstructured data challenge

How to ‘make sense’ of a disparate document archive of 11,000 wells and 2,000 seismic surveys.

Earlier this year, the UK’s joint industry upstream data holder Common Data Access (CDA) announced, by way of its website, an ‘unstructured data challenge,’ asking the industry if it could ‘make sense’ of the documentation describing over 11,000 wells, and 2,000 seismic surveys. A ‘small number of data and information analytics tool vendors’ were invited to apply their expertise to extract information from 50 years’ worth of reports, log images and other unstructured data types.

Speaking at this month’s ECIM data management conference in Haugesund, Norway, Paul Coles presented Schlumberger’s findings in the context of a wider ‘intelligent repository’ project. This has delivered a spatially enabled keyword database capable of delivering automatic composite logs and assembling interpretations. Tools used included Wipro’s Holmes AI platform, Apache Solr enterprise search and other tools for automated text summary and classification. Machine learning was used to establish data relationships and correct around 15% of the data set with ‘esoteric’ curve names or wrongly labeled logs. Automated petrophysical interpretation was combined with cuttings descriptions from reports. All running on a ‘scalable cloud-based platform.’ The result is an automated mapping system for the UKCS, delivered in Petrel, that brings insights into hydrocarbon systems and future business opportunities. More from CDA.


A storm in the regulatory teacup?

PPDM announces regulatory work group. But what of Energistics’ prior efforts in this space?

Speaking at the PESGB/GeolSoc conference on ‘preserving geological assets’ earlier this year, PPDM association CEO Trudy Curtis surprised at least one person in the audience with her keynote address on ‘the value of regulatory data standards.’ Regulatory agencies need to deal with many operating companies and ideally there should be ‘global standards for vocabulary, data and formats.’ PPDM has been working with the Alberta Energy Regulator to investigate such issues with a questionnaire issued to stakeholders including North American operators and other users of regulatory datasets, along with regulators in Australia, Canada, Mexico, Spain and the US. The survey set out to identify ‘pain points’ such as cumbersome regulations which may divert capital to other jurisdictions. Regulators and reporting companies are having a hard time keeping up with the evolution of exploration technology, especially with unconventionals.

PPDM has kicked-off a regulatory work group to address such issues, to find consensus and develop a regulatory vocabulary to serve as a ‘Rosetta stone’ to disambiguate key terms and phrases such as well, log and completion. The work group is also to investigate the extension of the PPDM data model to better address the regulatory environment. Data exchange formats are considered out of scope for this project.

Comment: This is a laudable effort but Curtis’ apparent failure to refer to the work of Energistics in the regulatory field, and the National data repository work group is puzzling. The next meet of the NDR is scheduled for June 6-8th, 2017 in Stavanger, Norway.


Geospatial Corp’s GeoUnderground pipeline manager

Google Maps and cloud storage help out with US Pipes Act compliance.

Google Maps technology partner Geospatial Corp. reports that its ‘GeoUnderground’ pipeline management system will help operators of gathering, transmission and distribution systems comply with the new rules emanating from the 2016 US Pipes Act. The act empowers the Pipeline and Hazardous Materials Safety Administration (PHMSA) to require various certifications, data management, testing and mapping across all types of the more than 3 million miles of buried pipelines in the USA.

Geospatial chairman and CEO Mark Smith said, ‘We are well positioned to benefit from the new PHMSA regulations. Oil and gas is a major part of our current and future business and clients have been paying attention to the accuracy and completeness of their asset mapping in anticipation of the new regulations.’ GeoUnderground leverages the Google Maps API and cloud to provide a ‘total solution’ to underground and aboveground asset management. The solution assures accurate positioning of assets and creates three-dimensional digital maps and models of underground infrastructure.


Paradigm to Petrel via RESQML

Standards-based data exchange minimizes data loss risk.

Paradigm reports use of the relatively new Resqml model data exchange protocol to exchange geological models between its own interpretation and modeling environment and Schlumberger’s Petrel. Energistics’ Resqml is a non-proprietary data exchange standard for reservoir characterization data and cellular geological and flow models.

Data can be exported from Paradigm applications into Petrel via the Petrel Connector, a plug-in to Petrel developed by Paradigm. Resqml consists of a set of XML schemas (XSD files) and other standards-based technologies like HDF5. Using the protocol minimizes the risk of data exchange issues such as data loss or corruption, enhancing productivity. Other connectors allow for exchange of data from Paradigm’s Geolog and Epos environments. More from Paradigm.
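The pattern, then, is XML metadata referencing bulk arrays held in HDF5. A rough illustration of reading such a pair (file names, element names and HDF5 paths are invented; this is not the actual Resqml schema layout or Paradigm’s connector):

```python
# Rough illustration of the Resqml pattern: XML metadata pointing at bulk
# arrays stored in HDF5. File names, element names and HDF5 paths are
# invented for illustration, not the real Resqml schema.
import xml.etree.ElementTree as ET
import h5py

# Parse the (hypothetical) XML part describing a cellular grid property
root = ET.parse("grid_property.xml").getroot()
h5_path = root.findtext("PathInHdfFile")        # e.g. "/RESQML/porosity"
uuid = root.get("uuid")

# Pull the referenced bulk array out of the companion HDF5 file
with h5py.File("model_arrays.h5", "r") as f:
    porosity = f[h5_path][...]                  # load the full array

print(f"property {uuid}: {porosity.shape} cells, "
      f"mean porosity {porosity.mean():.3f}")
```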


Software, hardware short takes

LandWorks, Trimble, Aveva, Blue Marble, Beicip, ObsPy, Seismic Utensils, SeisWare, Optify, CMG, DNV-GL, Exprodat, Fraunhofer Institute, Hexagon Geospatial, ThingWorx, Petrosys, PetroWeb, Progea, Pegasus Vertex, Safe Software, Red Hen Systems, SAP, WellDog.

The 5.5 release of the LandWorks suite offers full support for the Esri ArcGIS platform in both Online and Portal flavors.

Trimble’s GasOps solution for natural gas utilities is a cloud-based, mobile data collection solution that supports capture to systems of record and streamlined reporting to US PHMSA rules. A barcode scanner auto-populates form fields, reducing input errors.

Aveva’s engineering information management Aveva Net Gateway 5.0 has achieved certification to run with SAP’s ERP application.

Blue Marble’s Geographic Calculator 2016 SP1 provides an improved interface for working with seismic survey conversions, a new online geodetic database update tool and support for reading coordinate systems from a Petrel database.

Beicip’s TemisFlow 2016 introduces a new lithospheric approach for the modeling of basins with complex thermal history. Also new is a drainage area and trap analysis model for rapid estimation of reservoir potential.

ObsPy, an open source Python package from Munich University, gets the thumbs up from Matt Hall as a useful tool for reading Seg-Y seismic data files.

Seismic Utensils has released AVO-Detect, an ‘affordable’ AVO tool that adds functionality to existing workflows. The solution is available under a non-exclusive agreement with SeisWare.

Scott Schneider’s startup Optify is working with Landmark, Schlumberger and IHS Markit to deliver plug-ins and connectors for geotechnical data exchange. First out of the blocks is a plug-in for DecisionSpace to connect with the IHS Enerdeq data library.

The 2016.10 release of CMG’s ‘Results’ reservoir simulation post-processing and visualization tool integrates 2D and 3D functionality for quick comparison of simulation outputs with field history data. A new SR3 file format supports large, memory intensive data operations for navigating thousand well datasets.

A joint industry project led by DNV-GL has validated its Helica software for stress analysis of umbilicals and flexibles. JIP partners included Ultra Deep, Exxon-Mobil, Oceaneering, Shell, Technip and ABB.

Exprodat’s Data Assistant v222 is now compatible with ArcGIS Desktop 10.4 and 10.4.1 and includes new data formats for unstructured grids in SeisWorks, IHS Kingdom, IESX and Charisma. Also new are ‘beach ball’ representations of fault type. The company has also released Exploration Analyst Online, an ArcGIS-based application for web-based play assessment.

Fraunhofer Institute has released XtreemView for Windows with ‘all the features of the Linux edition.’ XtreemView is terabyte-scale 3D visualization software for post-stack seismic data. The single-node-only Windows edition requires a 64 bit edition of Windows 7 or later and ‘as much RAM as you can afford.’

Hexagon Geospatial has launched M.App Enterprise, a privately-hosted solution that enables organizations to share their dynamic location-based information services. Hexagon’s ‘SmartM.Apps’ (maps - geddit?) are lightweight mapping applications. M.Apps can be marketed on the Hexagon M.App Exchange.

The 5.21 edition of Kepware Technologies’ KEPServerEX expands interoperability with the ThingWorx IoT platform, adding over 150 communication drivers to provide real-time, bi-directional industrial controls data to the ThingWorx platform.

New Century Software’s Gas HCA (high consequence areas) Analyst 5.0 offers improved Esri integration (with file geodatabase loading) and more analytical rigor to meet changing regulations.

Petrosys v17.8.1 adds a ‘powerful new’ grid editing capability that updates surfaces as the nodes or associated contours and faults are edited. Connectivity is enhanced with direct import of DEM, ArcGrid binary and other raster files, and objects can be dragged and dropped from a Petrel session onto a Petrosys map.

The 16.1 edition of PetroWeb’s Enterprise DB/Navigator brings enhanced and administrable query tools. A new elevation tool provides an elevation profile for a line drawn on the map. Also new is support for checkshots, time/depth data and more well metadata. Data management has been enhanced with international UWIs, bulk load of well aliases and reference value management.

Progea has announced Movicon.NExT V3.0 with new iOS and Android apps for smartphones/tablet access to scada and HMI data. NExT is based on Microsoft .NET and on WPF/XAML vector graphics rendering.

Pegasus Vertex has announced PlugPRO, a cement plug placement model that calculates under-displacement volumes and optimizes fluid volumes to balance slurry and spacer levels after pull out of the hole. PlugPRO also models the displacement hydraulics of fluids.

Safe Software has added the RCaller transformer to its FME statistics calculator, giving users the ability to run any R script in the middle of an FME workflow. Users can now perform (inter alia) fast Fourier transform, non-linear regression, matrix algebra and Monte Carlo simulations.

Red Hen Systems’ isWhere 3.1.0 media mapping add-on tool for geotagged imagery brings a 10x speed-up and represents a ‘breakthrough’ in video track recording to Google Earth.

SAP has announced SAP Hana ‘express edition,’ a downloadable version of its in-memory database optimized for data-driven applications.

Following successful field trials with industry partner Shell, WellDog has announced a new ‘Shale Sweet-Spotter’ (SSS) service for unconventional operators. SSS uses lasers and sophisticated detectors to pinpoint hydrocarbon-rich zones and optimize development plans. SSS (a.k.a. the reservoir Raman system) was originally developed for coalbed methane development. WellDog claims that the SSS is the industry’s only downhole Raman spectrometer.


EAGE 2016, Vienna

'This time is different.’ Even with $100+ oil, discoveries failed to match consumption. Now, sub $50 oil hits the service sector hard. The downturn may be an opportunity for something new - machine learning, automated interpretation, fractals and high-end computing. Aramco floats upstream ‘W3 Prov’ standard. Schlumberger struts its stuff while Halliburton is a no show!

The 2016 meet of the European Association of Geoscientists and Engineers, held earlier this year in Vienna, Austria, was a subdued affair. In the opening plenary, Joseba Murillas (Repsol) offered an even-handed but bleak view of the current industry situation. Exploration budget cuts mean lost jobs, especially in the US and in the service sector. National oil companies and producing countries are feeling the squeeze. For consumers the news is good. Refiners are doing well and, at least for integrated oils, ‘still paying our salaries!’ Older employees tend to think this is ‘just another cycle.’ Others believe ‘this time is different’ and envisage a ‘deep transformation of the industry.’

Heiko Meyer (Wintershall) observed that CO2 and local pollution in megacities were key issues although coal is ‘hard to beat on price’ and oil is ‘hard to replace in transport, despite electrics.’ ‘Our children are less tolerant to our industry.’ ‘We need to fight for natural gas as a transition fuel.’ Murillas asked how many oils were working in renewables.

Most have some activity here although, as Tim Dodson (Statoil) said, ‘We struggle with this. We are in competition with renewables but also need to be part of the action.’ Statoil has a separate renewables business which is capital intensive but in smaller projects. This all feeds back into ‘what’s different this time’ as renewables get competitive. Oil and gas execs need to do a better job of explaining our primary task. Only 4% of oil goes to power generation and much of the rest is used as feedstock that does not create any emissions. Meyer observed that ‘industry missed a trick with Fukushima.’ Instead of a dash for gas, the Japanese built 40 coal-fired power stations.

The discussion then turned to the future of exploration. Luca Bertelli (ENI) thinks it will be hard to make exploration ‘sustainable.’ Although the oil price ‘can’t stay this low for long,’ it may not get back to $100. Is $30 to $50 an anomaly? Bertelli thinks that the anomaly was the last five years of super high prices, ‘In 2006 $30/40 was the norm, industry was making money and everyone was happy!’ What has changed is the cost base. The oil price is down 70% but costs are only down 25%. This is in part due to the portfolio mix with more high cost deep water plays.

Unsurprisingly, in 2016 discoveries are running at a six decade low, at around one third of consumption. But even in 2013/14, when global exploration spend was at a peak, ‘we did not deliver.’ ‘Industry can’t discover the equivalent of worldwide consumption, even at $100.’ There was a crisis in exploration before the price drop. We need to rethink how we explore, to find new basins, new ideas and reset our costs.

Ceri Powell (Shell) sees light at the end of the tunnel. Shell has rebased costs dramatically, with a 50% year-on-year reduction in the Gulf of Mexico.

Dodson was less sanguine, observing that the fields developed at $100 were mostly ‘3rd and 4th quartile 1970/80s discoveries, with short to no plateaus,’ representing ‘no more than a blip on the curve.’ ‘We are all struggling with our resource base, our projects are at the wrong end of the cost curve. Unconventionals will never be high quality assets. More and better new ideas are needed.’

Jean-Georges Malcor (CGG) is focused on cost reduction. CGG is now 50% down on cost compared to 2013. But, ‘is this sustainable? Nobody is making money, not even covering the cash cost.’ ‘Our shareholders don’t like our capital intensive industry with little visibility. We need to work together for better visibility. Five years for me is game over!’

On the ‘new ideas’ front Powell cited Shell’s ‘heartlands’ activity in mature, well known basins since 2005, driven in part by a low tech ‘rejuvenation opportunity now’ program that replaces interpretation workstations with ‘Mylar and colored pencils.’ The RON approach combines global exploration savvy with deep local knowledge. Elsewhere, the low cost environment has allowed for huge basin wide 3D acquisition such as the Sarawak Broadseis 3D.

The theme of doing more with less is seen by many as an opportunity to take a closer look at the data with a variety of novel(ish) techniques. For the (mostly) geophysicists of the EAGE this means applying machine learning to seismic interpretation. Anders Waldeland (University of Oslo) has used machine learning to automate the identification of salt bodies in seismic reflection data from a variety of 3D attributes. A simple ‘nearest mean’ metric of data from inside a known salt plug was used to train the system. A plot of coherence vs. grey level co-occurrence matrix was used to determine the salt boundary. A North Sea dataset was interpreted successfully using a Gulf of Mexico-derived classifier. The basic technique is not exactly new, one reference dated back to 1973 (EarthDoc 85125).
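For the curious, a nearest-mean attribute classifier of the kind described reduces to a few lines of numpy. The attribute values below are synthetic, and this is an illustration of the principle rather than the Oslo workflow:

```python
# Sketch of a nearest-mean attribute classifier for salt vs. non-salt, in the
# spirit of the Waldeland paper; the attribute volumes here are synthetic.
import numpy as np

rng = np.random.default_rng(1)

# Pretend 3D attribute cube, flattened to (n_voxels, n_attributes)
n, n_attr = 100000, 4
attrs = rng.normal(0, 1, (n, n_attr))
attrs[:20000] += 2.5                      # fake 'salt' voxels with shifted stats

# Training regions: voxels picked inside a known salt plug and in background
salt_mean = attrs[:2000].mean(axis=0)
bg_mean = attrs[50000:52000].mean(axis=0)

# Classify every voxel by distance to the nearest class mean
d_salt = np.linalg.norm(attrs - salt_mean, axis=1)
d_bg = np.linalg.norm(attrs - bg_mean, axis=1)
is_salt = d_salt < d_bg

print(f"{is_salt.mean():.1%} of voxels classified as salt")
```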

Muhammed Shafiq (Georgia Institute of Technology) has evaluated five ‘perceptual and non-perceptual’ measures of textural dissimilarity to develop a 3D gradient of texture metric. This dissimilarity metric is ‘consistent with human perception’ and yields better dissimilarity than non-perceptual measures. Tests on a North Sea dataset show that the perceptual dissimilarity measures are computationally more efficient and better at delineating salt domes (EarthDoc 85267).

Hendrik Paasche (Helmholtz Institute) used data-driven inversion of near surface geotechnical data (logs, seismic, radar) to link shear wave velocity and sleeve friction (a measure of soil strength). Data-driven concepts based on fuzzy sets without prior knowledge did the trick, at least on a site-specific basis. Elsewhere, your mileage may vary. This is not exactly a killer app but interestingly provided some insights into a ‘weak and as yet unrecognized, physical link between electromagnetic wave propagation and sleeve friction’ (EarthDoc 85040).

Xavier Refunjol showed how Swift Energy has used simultaneous inversion and neural network analysis of log and core data to generate impedance, porosity, and TOC volumes in the Eagle Ford shale play. Refunjol studied lateral and vertical variability of reservoir qualities. Eight wells were used to train the system and results were validated by eliminating one well at a time and comparing the resulting log (EarthDoc 85014).

Data management and standards were rather downplayed at this year’s conference. However, an interesting contribution from Aqeel Al-Naser (Manchester University and Saudi Aramco) showed how the World Wide Web Consortium’s ‘W3 Prov’ standard can be used to tag subsurface data objects with a provenance audit trail. Typical workflows across multiple interpretation systems should ideally carry provenance metadata throughout the workflow. W3 Prov is a graph-based data model of information about entities (e.g. a horizon), activities (e.g. an interpretation), and agents (e.g. the interpreter). The prototype was implemented across a seismic-to-simulation workflow spanning Paradigm Echos, Petrel, Gocad and Aramco’s GigaPowers flow simulator. A mouse-over event pops up a text box with provenance information such as the date of interpretation and the name of the interpreter (EarthDoc 85402).
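The entity/activity/agent triple maps directly onto the PROV data model. A hedged sketch using the open source Python ‘prov’ package (the names and namespace are invented, and this is not the Aramco prototype):

```python
# Hedged sketch of tagging an interpretation with W3C PROV metadata using the
# open source 'prov' package; identifiers and namespace are invented.
from prov.model import ProvDocument

doc = ProvDocument()
doc.add_namespace("ex", "http://example.org/subsurface/")

horizon = doc.entity("ex:horizon_top_chalk")            # the data object
interp = doc.activity("ex:interpretation_2016_09_14")   # the interpretation
geo = doc.agent("ex:interpreter_j_smith")               # the interpreter

doc.wasGeneratedBy(horizon, interp)       # horizon produced by this activity
doc.wasAssociatedWith(interp, geo)        # activity carried out by this agent
doc.wasAttributedTo(horizon, geo)         # so the horizon is attributed to them

print(doc.get_provn())                    # human-readable PROV-N serialization
```

A mouse-over pop-up of the kind described would simply render the attributes attached to these nodes.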

High performance computing progress was reported by Hui Liu (University of Calgary) who used a 32k core IBM Blue Gene/Q supercomputer to accelerate large-scale reservoir simulations. A novel scheme allows simulations to be parallelized such that the simulator has linear scalability. Reservoir simulations ‘can be accelerated thousands of times using thousands of CPU cores.’ The huge memory bandwidth of such computers also allows extremely large reservoir models to be computed (EarthDoc 85027).

Michele De Stefano (Schlumberger) has borrowed a fractal generation technique used in the computer graphics/gaming industry to provide ‘pseudo-realistic’ topographies and three-dimensional geophysical models. Applications for the technique include simulating datasets for testing inversion algorithms, interpolation and upscaling. The diamond-square algorithm (DSA) was devised in 1980 by Loren Carpenter of the then Lucasfilm company (EarthDoc 85405).
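The diamond-square algorithm itself is compact. A plain numpy sketch of generating a pseudo-random fractal surface, in the spirit of the paper rather than De Stefano’s implementation:

```python
# Plain-numpy sketch of the diamond-square algorithm for a fractal surface.
# An illustration only, not the implementation described in the paper.
import numpy as np

def diamond_square(n, roughness=0.6, seed=0):
    """Return a (2**n + 1)-square fractal height field."""
    rng = np.random.default_rng(seed)
    size = 2 ** n + 1
    z = np.zeros((size, size))
    z[0, 0], z[0, -1], z[-1, 0], z[-1, -1] = rng.normal(0, 1, 4)
    step, scale = size - 1, 1.0
    while step > 1:
        half = step // 2
        # Diamond step: centre of each square gets the corner mean plus noise
        for i in range(half, size, step):
            for j in range(half, size, step):
                corners = (z[i - half, j - half] + z[i - half, j + half] +
                           z[i + half, j - half] + z[i + half, j + half])
                z[i, j] = corners / 4.0 + rng.normal(0, scale)
        # Square step: each edge midpoint gets the mean of its in-grid neighbours
        for i in range(0, size, half):
            for j in range((i + half) % step, size, step):
                nbrs = [z[i2, j2] for i2, j2 in
                        ((i - half, j), (i + half, j), (i, j - half), (i, j + half))
                        if 0 <= i2 < size and 0 <= j2 < size]
                z[i, j] = sum(nbrs) / len(nbrs) + rng.normal(0, scale)
        step, scale = half, scale * roughness
    return z

surface = diamond_square(7)   # a 129 x 129 pseudo-topography
print(surface.shape, float(surface.min()), float(surface.max()))
```

The roughness parameter controls how quickly the noise amplitude decays at each halving, which is what gives the surface its fractal character.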

Several papers addressed hardware speedup with a variety of accelerators, a field where Intel now challenges Nvidia with its Xeon Phi coprocessor. Gerard Gorman (Imperial College) observed that a) seismic imaging is hard, b) hardware is complex and c) parallel programming ain’t easy either. But computing is changing ‘like it has never changed before,’ with a plethora of different architectures such that it is hard to find people to run the show. Code is costly to optimize. What is needed is code modernization for high level abstractions and performance. This is achieved with code generators for domain specific languages, without which high level languages are ‘slow and expensive.’ Gorman gave a ‘shameless plug’ for his LCS-Fast consultancy. Gorman is bringing together open source software developers and seismologists to develop a domain specific language for seismic imaging. Part of the picture is Firedrake, an auto coder that allows an end user to ‘write Python to run on 10k processors.’ SymPy is presented as a domain specific language for finite difference algorithms. It allows for cross-hardware, rapid development and ‘crucifies’ legacy, hand-tuned code! The work was funded by Intel and BG Group.
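The symbolic-to-stencil idea can be illustrated with SymPy alone. A minimal, hedged example (not Firedrake or LCS-Fast code) that turns a second derivative into the familiar centered finite-difference stencil:

```python
# Minimal illustration of deriving a finite-difference stencil symbolically
# with SymPy, in the spirit of the DSL approach described in the talk.
from sympy import symbols, Function, simplify

x, h = symbols("x h")
u = Function("u")

# Second spatial derivative of u, as it appears in the acoustic wave equation
d2u = u(x).diff(x, 2)

# Replace it with a centred 3-point finite-difference approximation
stencil = d2u.as_finite_difference([x - h, x, x + h])
print(simplify(stencil))   # the classic (u(x-h) - 2*u(x) + u(x+h))/h**2 stencil
```

A code generator then only has to translate such expressions into loops tuned for whichever accelerator is at hand.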

Daniel Grünewald introduced Fraunhofer’s ACE asynchronous constraint execution codebase for reverse time migration at ‘extreme scale.’ Today, a single shot may be too big for a single GPU device. The answer is to parallelize with Fraunhofer’s ACE Communicator GPI 2.0 with both Nvidia GPU and Intel Xeon Phi support.

Lin Gan (Tsinghua University) described the speed up in reverse time migration using multiple K40 GPU cards. Best results with K40s were 28 times faster than an OpenMP/dual Intel E5-2697 CPU implementation although less optimization effort went into the Intel code (EarthDoc 84814).

Gabriel Fabien-Ouellet (INRS-ETE) has also used GPUs to speed seismic inversion but with a twist. Instead of Nvidia’s proprietary Cuda programming language, here the open source OpenCL was used to allow for the use of heterogeneous clusters. Tests on large clusters with nodes built with Intel CPUs, Nvidia GPUs and the Xeon Phi confirmed the 80x supremacy of the GPU. But OpenCL made for ‘a better usage of the computing resources available using a single source code for a multitude of devices.’ An open source code base for full waveform inversion, SeisCL, will be available on Github ‘real soon now’ (EarthDoc 84811).

A sign of the times no doubt, Halliburton/Landmark was absent from the 2016 EAGE, leaving the floor open to arch rival Schlumberger. We spent some time on the booth and heard about recent developments with the Petrel Guru, a workflow management plug-in that is also available in Techlog, Intersect and even the Next training environment. Guru can be branded with a company’s own logo and configured to what are considered local best practices. The Guru offers advice on checking data quality during data transfer although Schlumberger is keen to emphasize that this does not cannibalize its own Innerlogix quality toolset.

Schlumberger continues to enhance its Blue Cube hardware bundle-cum-cloud solution. Petrel was demonstrated running on an iPad, over the conference Wi-Fi, served from the Schlumberger cloud in Aberdeen. Blue Cube is delivered in partnership with Dell EMC as either a private or remote SaaS offering. Schlumberger claims 250 internal users and the system is said to ‘work as well as a workstation.’ Western Geco’s multi-client data now also streams from the cloud. Behind the scenes the Linux KVM provides virtualization while HP’s RGS technology adds remote, ‘thin client’ visualization. The UK Oil and Gas Authority uses the solution to stream seismic data.

Finally, Gaynor Paton (Geoteric) presented the results of a curious investigation into color blindness in seismic interpretation. 23 individuals, five of whom had some form of color deficiency, were invited to carry out various tasks such as determining fault orientation on different data sets with and without color coded orientation. Unsurprisingly, the study found that the use of color helped ‘standardize interpretation.’ The tools used in the study included Vischeck. Konan’s ColorDX was also used to simulate color blindness. The point of the study escaped us. Could this be a very soft sell for Geoteric’s colorful aids to the interpreter? Surely not!


Folks, facts, orgs ...

ABB, Anadarko, API, Arria, Badger Explorer, Calgary Scientific, Cartasite, Caterpillar, Chief Outsiders, SINTEF, CO-LaN, IPL, DAMA, Geospatial Corporation, GSE, iLandMan, National Oilwell Varco, P2 Energy Solutions, SEG, RSI, BP, USPI, National Data Repositories.

ABB has appointed Guido Jouret as chief digital officer. He hails from Cisco.

Former president and CEO at Sasol Limited, David Constable has been elected to Anadarko’s board of directors.

The American Petroleum Institute has named Michael Tadeo and Brooke Sammon as spokespersons to its communications team. Hilary Moffett is director of federal relations.

Sharon Daniel is now chair and Matthew Gould is CEO at Arria following Stuart Rogers’ resignation. Falcon Clouston is deputy chairman of the board and head of the audit committee. Michael Higgins has stepped down from the board.

Roald Valen is now CEO at Badger Explorer succeeding Øystein Larsen who is stepping down to pursue opportunities outside the company.

Laurie Wallace has been named to Calgary Scientific’s board of directors.

Cartasite has named Wes Felteau as director of platform products. Becky Gibbs is senior product manager. Mike Wille is to lead product development. Maria Ingemarson leads QA, Deborah Diaz is user interface engineer and Laura Thompson, senior software engineer.

Morgan Vawter heads-up Caterpillar’s information analytics team, a part of the marketing and digital division.

Andrew Poon is now chief marketing officer at Chief Outsiders.

Olaf Trygve Berglihn is the new representative of Sintef in CO-LaN.

Mark Humphries has been elected to represent IPL at the Data management association’s (Dama) UK chapter.

Geospatial Corp. has added Todd Porter to its executive management team and is to create an Energy services division in Houston.

Grant Thornton retiree, Jim Stanker has joined the GSE board of directors and the audit committee.

Alfred Tovar is account executive in iLandMan’s sales department.

National Oilwell Varco has promoted Isaac Joseph to president of its wellbore technologies segment.

Ben Wilson is now CTO and global head of product management at P2 Energy Solutions. He hails from GE.

Nancy House is the Society of Exploration Geophysicists president elect for 2016/2017. Madeline Lee is second VP, Lee Bell is treasurer. Paul Cunningham and Ruben Martinez have been appointed directors at large.

Anthony Greer is CEO of Rock Solid Images following Richard Cooper’s resignation. Cooper stays on as an advisor. Andy Phipps is president of RSI Americas.

BP has joined the USPI standards body and has named Peter Whittall to represent BP on the management board.

Deaths

Lee Allison, Director of the Arizona Geological Survey and chair of the executive committee for NDR2016, died this summer following a fall.


Done deals

IFS acquired by EQT. Hewlett Packard Enterprise buys SGI. Fitch withdraws Halliburton’s rating. Aveva pays £10.5 million in fees for aborted transaction. Leidos and Lockheed Martin’s information systems unit merge. Canadian Energy Services bags Catalyst. Arria switches listing.

ERP specialist IFS has been acquired by venture capitalist EQT, via its IGT Holding unit. Following EQT’s acquisition of Elliott’s shares in IFS at a price of SEK 396.73 per share, EQT holds some 97% of IFS. The IFS board of directors is to initiate compulsory acquisition proceedings and apply for the delisting of its A and B shares from the Stockholm Nasdaq.

Hewlett Packard Enterprise (HPE) is to acquire supercomputer manufacturer SGI in a transaction valued at approximately $275 million, net of cash and debt. SGI had revenues of $533 million in fiscal 2016 for a loss of $11 million. SGI has a significant footprint in oil and gas, notably with Total’s Pangea supercomputer which was recently upgraded to a sizeable 6.7 petaflops. Since its separation from the old Hewlett-Packard, HPE has done enterprise-scale deals with Computer Sciences Corp, Micro Focus and now SGI.

Earlier this year ratings agency Fitch ‘affirmed’ Halliburton’s ‘A-’ rating and at the same time, withdrew its ratings for the company. Fitch attributed its negative outlook to the debt Halliburton incurred from the failed Baker Hughes merger and the ‘subsequent lack of additional cash flows [..] in the current depressed oilfield services environment.’ Asked about the ‘withdrawal,’ a Fitch representative told Oil IT Journal, ‘Fitch withdrew coverage of Halliburton for commercial reasons as the relationship was unsolicited.’

Aveva reports that in fiscal 2016 it incurred exceptional costs of £15.2 million, including the princely sum of £10.5 million in professional fees, mainly for legal and financial due diligence services related to the aborted Schneider Electric transaction and to the acquisition of FabTrol Systems.

Leidos is to merge with Lockheed Martin’s information systems and global solutions business in what is described as a ‘reverse Morris trust transaction’ (not to be confused with a Morris dance!) The combined entity is a $10.8bn Fortune 250 Company focused on the delivery of IT solutions to both commercial and government organizations.

Canadian Energy Services and Technology has acquired Midland, Texas-based Catalyst Oilfield Services in a cash and paper transaction.

Arria is transitioning its primary stock exchange listing from the London Stock Exchange’s junior AIM board to the main board of the New Zealand Stock Exchange, with secondary listings on the Australian Securities Exchange main board and the LSE main market. The company expects that the potential of its NLG artificial intelligence will be ‘fully realized’ in 2017.


Back to School - upstream data management special

IFPen, Aberdeen University, Robert Gordon University announce upstream data courses.

Europe may be divided by Brexit but it seems united in the need to teach oil and gas data management, with no fewer than three different courses coming on stream over the next few months.

France’s IFP/Energies Nouvelles has announced an 11 month Masters in Petroleum Data Management. Teaching is in English and the course is split between two campuses: at IFP School (Rueil-Malmaison) and at ENSG (Marne-la-Vallée) both near Paris.

Aberdeen University is also offering a Petroleum Data Management MSc, in collaboration with the UK’s joint industry data management organization, Common Data Access (CDA). The course is to kick off in 2017 on a part-time basis, with distance learning and full-time options to follow in 2018.

For those seeking a fast track to upstream data management skills, Aberdeen’s Robert Gordon University is offering a Graduate Certificate in petroleum data management, again in partnership with CDA. The distance learning course will be delivered via RGU’s CampusMoodle online tool.


DNV-led JIP puts lid on the document explosion

New report outlines best practices for subsea documentation rationalization.

A cross-industry project led by DNV GL that set out to ‘halt the boom in subsea documentation’ has completed with the delivery of a free 97 page report*. The two-year collaboration has delivered a recommended practice that is claimed to reduce subsea documentation and encourage reuse. Subsea documentation has increased fourfold since 2012. Today’s projects can entail up to 40,000 documents with three revisions resulting in 120,000 transactions. A major project may require a contractor to have 25 people on document control.

‘Technical documentation for subsea projects’ describes a minimum set of documentation to be exchanged between operators and contractors for construction, procurement and operations. One JIP participant estimated that the adoption of the RP could deliver a 42% reduction in engineering hours from document standardization and re-use and from avoiding ‘unnecessary reviews of non-critical documents.’

Project co-chair, Statoil’s Jan Ragnvald Torsvik said, ‘The approach of package specific requirements has a positive impact on standardization, efficiency and quality. We are already seeing the benefits of implementing a draft version of the RP in Statoil’s Johan Sverdrup project last year.’ JIP partners included Aker Solutions, ENI, FMC, Engie, Kongsberg, Statoil and Subsea 7.

* DNVGL-RP-0101, ‘Technical documentation for subsea projects’ is a free download from DNV-GL.


EPIM news

Norwegian joint industry body rolls out LogisticsHub, EPIM ID and demos future ReportingHub.

Norway’s LogisticsHub is now ready for prime time adoption by oil and gas companies and their suppliers. The LogisticsHub project started in 2013 as a means of sharing tracking information for cargo carrying units and equipment. The Hub is said to simplify the logistics supply chain by enhanced search mechanisms for lost and delayed goods and improved data quality and shipment planning.

In a separate initiative, EPIM is to create a ‘cross organizational identity and access management solution,’ EPIM ID. Initially this will be adopted in EPIM’s own applications and later will be offered to web application developers for inter-company information sharing. The ID project originated as a spin-off from the ‘revitalization’ of Norway’s SOIL network.

Finally, EPIM reports progress on making its ReportingHub data more accessible. ReportingHub is an RDF triple store of drilling and production reports that can be natively queried in Sparql. A web interface now offers mere mortals access to the data, providing a ‘glimpse of ReportingHub’s capabilities.’ Visit the web demo here.
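For the Sparql-curious, a hedged sketch of what such a query might look like from Python (the endpoint URL and vocabulary are invented and will not match EPIM’s actual ontology):

```python
# Hedged sketch of querying an RDF triple store with SPARQL; the endpoint URL
# and vocabulary are invented and do not reflect EPIM's actual ontology.
from SPARQLWrapper import SPARQLWrapper, JSON

sparql = SPARQLWrapper("https://example.org/reportinghub/sparql")  # hypothetical
sparql.setQuery("""
    PREFIX rh: <http://example.org/reportinghub#>
    SELECT ?well ?report ?date WHERE {
        ?report a rh:DailyDrillingReport ;
                rh:forWellbore ?well ;
                rh:reportDate ?date .
    }
    ORDER BY DESC(?date) LIMIT 10
""")
sparql.setReturnFormat(JSON)

results = sparql.query().convert()
for row in results["results"]["bindings"]:
    print(row["well"]["value"], row["date"]["value"])
```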


ITF announces petrophysics JIP

Petgas III to further develop PETMiner software.

ITF, the UK’s ‘Industry Technology Facilitator,’ with partners Energie Beheer Nederland and Petroleum Development Oman, has launched a joint industry project, Petgas III, a.k.a. the petrophysics of tight gas sandstones. The project’s earlier phases saw the creation of a database of petrophysical properties along with the development of ‘PETMiner,’ a software tool for petrophysical data browsing and visualization.

Phase III will involve further development of the database to improve log interpretation and reservoir characterization during exploration, appraisal and production. Project partners, EBN and PDO, are contributing £321K in total and the project, now in its third phase, will run for a period of three years. The project remains open to late participants. More from ITF.


Sales, deployments, partnerships ...

Data Foundry, GE, Atos, DNV-GL, eLynx, Emerson, IFS, iLandMan, OGsys, GE, MIT, Intergraph, Kappa, IFPen, Beicip-Franlab, Geosoft, Botswana Geoscience, OFS Portal, Golar, Schlumberger, Exprosoft, Weatherford, IBM, Wood Group, Librestream, L&T Infotech, Honeywell, IPCOS.

Data Foundry is to provide data center colocation to Carrizo Oil & Gas.

GE Oil & Gas has been selected by Modec to supply and service gas turbines for FPSOs in Brazil using its digital solutions.

GasTerra is to extend its agreement for the outsourcing of IT Management with Atos for another three years.

Wintershall has awarded DNV-GL a five year contract for independent verification services of its offshore installations.

eLynx’s ScadaLynx software was the poster child for Microsoft’s Azure internet of things at the recent worldwide partner conference in Toronto.

Emerson has been selected by Shell Australia to provide automation maintenance and reliability services for its Prelude FLNG facility. The company has also won a ten-year frame agreement from Total for the provision of control and safety system maintenance services to its worldwide upstream operations.

Seychelles Petroleum is to replace its legacy ERP system with IFS Applications 9 suite.

iLandMan has partnered with OGsys to offer an integrated land management and accounting solution for E&P.

GE has funded the MIT energy initiative, providing $7.5 million over a five-year period to develop ‘advanced energy technology solutions.’

JSC Giprokislorod is to use the Intergraph suite to improve engineering quality in its upcoming oil and gas projects.

Kappa has partnered with IFPen and affiliate Beicip-Franlab to offer a ‘comprehensive reservoir engineering software workflow.’ The partnership starts with the addition of IFPen’s PumaFlow and PVTFlow to the Kappa portfolio.

Geosoft, in collaboration with the Botswana Geoscience Institute and industry sponsors, has launched the Botswana Geoscience Portal, providing free access to the country’s multi-disciplinary datasets.

White Rock Oil & Gas, Lime Rock Resources Operating Company, MRC Global, RSP Permian, Memorial Production Operation, and Fleur de Lis Energy have joined the OFS Portal community. Operator membership now stands at 256. Petrobids has also joined OFS Portal, and is the 33rd approved network in the community.

Golar and Schlumberger have signed a joint venture agreement to create OneLNG, an exclusive vehicle for LNG projects.

Total has endorsed Exprosoft’s WellMaster well integrity management system as a means to ‘increase organizational understanding.’

Weatherford and IBM have signed a joint initiative agreement to develop analytics software and internet of things infrastructure for oil and gas producers.

Wood Group and Librestream are to collaborate to provide advanced business solutions for operations, maintenance and integrity challenges in oil and gas.

L&T Infotech and GE Digital have announced a ‘global strategic partnership’ to develop digital industrial solutions.

Honeywell is to provide automation and safety systems and serve as the main automation contractor to Pieridae Energy’s Goldboro LNG project, Eastern Canada.

IPCOS has signed a collaboration agreement with Global Corporation for the joint delivery of process control solutions in Pakistan.


Standards stuff

Energistics Prodml V2.0. IOGP on naturally occurring radioactive material. DNV-GL safety spec. Open Geospatial’s Sensor Things. The Open Group’s business architecture. Fiatech’s ‘Eye in the sky.’

Energistics has announced a candidate release of Prodml V2.0 with new domain capabilities and data objects covering simple volume reporting, fluid and PVT analysis and distributed acoustic sensing data exchange.

An IOGP task force set up to study the management of naturally occurring radioactive material (Norm) in the oil and gas industry has concluded with the issuance of IOGP Report 412. Uncontrolled work activities involving Norm can pose a risk to human health and the environment. The comprehensive, 68 page report addresses Norm risk evaluation and mitigation.

DNV-GL has published a ‘Service specification’ (DNVGL-SE-0466) to help operators comply with the EU directive on the safety of offshore oil and gas operations. The spec addresses key aspects to be targeted by third-party verification during operations.

The Open geospatial consortium’s SensorThings API has been officially adopted. The API provides a ‘geospatially enabled’ internet of things including sensors, smart watches and ‘smart shirts!’ A smart pants edition will be available real soon now. SensorThings leverages the Rest, Json, Mqtt and OData protocols.
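A hedged sketch of what consuming a SensorThings service looks like over plain HTTP (the service URL and entity ids below are hypothetical):

```python
# Hedged sketch of reading from an OGC SensorThings endpoint with plain HTTP;
# the server URL and entity ids are hypothetical.
import requests

base = "https://example.org/SensorThingsService/v1.0"   # hypothetical endpoint

# List available Things (sensors, smart watches, 'smart shirts' ...)
things = requests.get(f"{base}/Things", params={"$top": 5}).json()
for t in things.get("value", []):
    print(t.get("@iot.id"), t.get("name"))

# Latest observations from one datastream, newest first (OData-style options)
obs = requests.get(
    f"{base}/Datastreams(1)/Observations",
    params={"$top": 3, "$orderby": "phenomenonTime desc"},
).json()
for o in obs.get("value", []):
    print(o.get("phenomenonTime"), o.get("result"))
```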

The Open Group has released its Open business architecture (O-BA–Part I), a painfully longwinded description of ‘an approach to the practice of business architecture in the decision making and direction-setting phase of enterprise transformation.’ The 85 page document includes business planning, initiative development and the use of TOG’s companion ‘Togaf’ standard.

Fiatech’s Eye in the sky project addresses the ‘safe and efficient operation’ of unmanned aerial vehicles (drones) equipped with visual sensors to produce accurate and complete 3D imagery for monitoring construction projects. Project members include University of Illinois and engineer Zachry Group.


PODS Next Generation

Pipeline open data standards association delivers proof-of-concept logical model.

The Pipeline open data standards association, Pods, has reported progress on its ‘next generation’ pipeline data model. The new model sets out to address problems with the legacy Pods model including performance, implementation issues and the disparity between vendor solutions. Pods is also to address the ‘exponential’ growth of pipeline data, new reporting requirements from PHMSA and FERC, and new technologies. The aim is a simple, well documented schema along with clear guidance for implementers, data loaders and QA/QC tools.

At the core of Pods-NG is a single logical model, instantiated as either an Esri geodatabase, a relational projection or as a file-based protocol for data exchange. Esri’s APR-compatible linear referencing is to be used throughout. Currently the work group has delivered a proof of concept edition of the logical model and XML-based specs for exchange of pipeline and associated business data and metadata.
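Linear referencing itself is simple enough to sketch: pipeline events are located by measure (stationing) along the centerline rather than by coordinates, and interpolated onto the line when a map position is needed. A toy, hedged illustration (invented vertices; not Esri or Pods code):

```python
# Toy illustration of linear referencing: locate a point at a given measure (m)
# along a pipeline centreline given as (x, y, m) vertices. Values are invented.
def locate(centerline, m):
    """Interpolate the (x, y) position at measure m along the centreline."""
    for (x0, y0, m0), (x1, y1, m1) in zip(centerline, centerline[1:]):
        if m0 <= m <= m1:
            f = (m - m0) / (m1 - m0)
            return x0 + f * (x1 - x0), y0 + f * (y1 - y0)
    raise ValueError("measure outside centreline range")

line = [(0.0, 0.0, 0.0), (100.0, 0.0, 100.0), (100.0, 50.0, 150.0)]
print(locate(line, 125.0))   # -> (100.0, 25.0): an event 125 m along the line
```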

Image Matters was selected to develop PODS-NG following a review process. IM president Kurt Buehler is a specialist in data standards development, having previously worked with the Open Geospatial Consortium on the Geospatial interoperability framework and on NIEM, the National information exchange model. More from Pods.


Patent potpourri

Cartasite secures wells. Adelos takes on Halliburton. Agile on spec data. Gravitomagnetism?

Cartasite has been awarded a US patent (N° 20140091141 A1) that will 'streamline oil and gas field operations.' The patent covers a 'failsafe alternative' to current, error-prone methods of securing oil and gas wells with an improved system of well tag identification. Cartasite's 'high-tech' barcodes, plastic tags and digital recording are claimed to eliminate misreads and false alarms. More from Cartasite.

Distributed acoustic sensing (DAS) specialist Adelos has accused Halliburton of 'improper access to and misappropriation' of proprietary technology that was developed out of its work with the United States Navy's Blue Rose technology (Oil ITJ 2015 N° 5). Adelos alleges that Halliburton 'tried clandestinely to obtain a patent on [the technology] and falsely claimed and marketed it as their own.' Adelos was selected by the United States Navy as its worldwide exclusive licensee to market and commercialize the technology. Others accused of infringing Adelos' patents include Optiphase, Sensortran and Pinnacle Technologies. More from patent search specialist RPX.

Blogger, programmer and geophysical industry observer Matt Hall (Agile) has posted an analysis of the copyright status of seismic spec data in Canada. Following multiple lawsuits brought by GSI over alleged infringement of its rights to spec seismic data, the Honorable Madam Justice Eidsvik of the Alberta Court found that seismic data 'is not like ordinary data.' Justice Eidsvik ruled that 'the creation of field and processed [seismic] data requires the exercise of sufficient skill and judgment of the seismic crew and processors to satisfy the requirements of [copyrightability].' More on the intricacies of the case and its implications for other jurisdictions from Agile.

Most improbable patent of the month (if not the decade) goes to startup Gravitomagnetism and its US patent No. 9,318,031 B2 for 'disk calibration, energy generation, propulsion, and teleportation.' According to founder Michael Boyd, the company's technology will 'one day allow a wind-up watch-sized device that could power your entire house, your boat, your car and your airplane.' Keep on winding us up, Mike!


Yokogawa acquires FogHorn Systems

Engineer to ‘co-innovate’ with fog computing specialist across IT/OT boundary.

Yokogawa has acquired a modest $900,000 stake in Silicon Valley-based FogHorn Systems, a 'fog computing' startup. Yokogawa aims to foster the development of fog computing technology and to expand its own range of solutions. So what is fog computing? According to the release, the 'huge number of devices' operating in the cloud and over the internet of things is creating network congestion and data processing delays. Fog computing addresses this by moving computation to the 'edge' of the network, lowering bandwidth requirements with a 'fog,' a distributed computing layer that sits between the cloud and devices in the field. Suggested use cases include mitigating oilfield electric submersible pump failure through analytics 'at the edge.'
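
As a minimal illustration of the edge-analytics idea (not FogHorn's or Yokogawa's code), the following Python sketch keeps a rolling window of, say, ESP sensor readings locally and flags only statistically unusual values for transmission to the cloud. Window size and threshold are invented.

    from collections import deque
    from statistics import mean, pstdev

    # Illustrative edge filter: hold a short rolling window of readings
    # locally and forward only outliers upstream, rather than streaming
    # every sample. Parameters are made up.
    WINDOW, SIGMA = 60, 3.0
    history = deque(maxlen=WINDOW)

    def at_the_edge(reading: float) -> bool:
        """Return True if the reading should be pushed to the cloud."""
        anomalous = False
        if len(history) >= 5:
            mu, sd = mean(history), pstdev(history)
            anomalous = sd > 0 and abs(reading - mu) > SIGMA * sd
        history.append(reading)
        return anomalous

    # Only the spike at 62.5 is flagged for transmission.
    for value in (48.1, 47.9, 48.3, 48.0, 48.2, 47.8, 62.5, 48.1):
        print(value, at_the_edge(value))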

In a round led by March Capital and GE Ventures, FogHorn has raised $12 million from Yokogawa, Robert Bosch Venture Capital GmbH, Darling Ventures and other investors. Yokogawa is to implement the new technology leveraging its process co-innovation concept for automation, adding its own measurement, control and information technologies to the mix.

FogHorn recently rolled out its new Lightning software platform for real-time analytics applications running on ultra-small footprint edge devices. Lightning allows application developers, systems integrators and production engineers to build ‘edge analytics’ solutions for industrial operations and IIoT use cases. Lightning ingests high-speed data via OPC-UA, MQTT, Modbus and other protocols and includes ‘VEL,’ a real-time streaming analytics engine, along with connectors to data stores such as Apache Hadoop, Kafka, Microsoft Azure, Cloud Foundry RIAK and more. More from FogHorn.
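
The ingest protocols Lightning supports are industry standards. A minimal sketch of an MQTT subscribe loop, written against the paho-mqtt 1.x client API with a hypothetical broker and topic (this is not FogHorn's VEL engine), might look like this.

    import json
    import paho.mqtt.client as mqtt

    # Minimal MQTT ingest loop: subscribe to a hypothetical edge-device
    # telemetry topic and decode JSON payloads as they arrive. Broker
    # address and topic are illustrative; API per paho-mqtt 1.x.
    def on_message(client, userdata, msg):
        payload = json.loads(msg.payload.decode("utf-8"))
        print(msg.topic, payload)

    client = mqtt.Client()
    client.on_message = on_message
    client.connect("edge-gateway.local", 1883)
    client.subscribe("site/esp/+/telemetry")
    client.loop_forever()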


Flotek, YPF team on chemicals data management

Nano technologies to be applied to Vaca Muerta shale development.

YPF Technologia, a joint venture between Argentina's Yacimientos Petrolíferos Fiscales (YPF) and 'Conicet,' the country's national R&D organization, has signed a five-year agreement with Houston-headquartered Flotek for the joint development of technologies to further the development of the Vaca Muerta shale and Neuquen tight gas basins. Initial focus is on the use of custom chemistry for 'full fluid systems,' leveraging Flotek's patented complex nano-fluid and pressure-reducing fluid technologies. The partners are also to design and implement a YPF-centric chemistry data management application. Y-Tec general manager Santiago Sacerdote said, 'Innovative custom chemistry and integrated data analysis will enable us to design efficient solutions for the development of both conventional and unconventional resources.'

Recently commissioned independent studies of Flotek's production enhancing chemistries are available on the Flotek website (registration required).


Emerson gets device power from heat

Perpetua’s Power Pucks integrated with Rosemount wireless data transducers.

Emerson is using a novel power source for its Rosemount line of wireless data transducers. The technology, from Corvallis, Oregon-based Perpetua, uses excess heat from (typically) refinery processes to generate electricity. Perpetua's thermoelectric 'Power Pucks' are said to reduce operating and maintenance costs and to simplify wireless technology adoption.

The Power Puck energy harvester provides continuous, reliable power for the life of the transmitter and includes an intrinsically safe module for back-up power. The Power Puck connects to flat and curved surfaces at temperatures of up to 450°C. The intrinsically safe solutions are classified for use in hazardous areas (ATEX and IECEx). More from Emerson.

Another ‘alternative’ energy source is reported by Honeywell whose UOP modular natural gas processing equipment has allowed Virginia Indonesia Co. to use natural gas straight from the wellhead to power on-site equipment. The fuel gas-conditioning unit uses Honeywell’s Separex technology to remove corrosive contaminants from natural gas streams, allowing it to ‘safely and reliably fuel on-site machinery.’ More from Honeywell.


OMV rolls-out M-Files

Legacy document management and reporting system replaced with M-Files EIM.

Austrian OMV is to deploy M-Files to replace 'most of' its legacy enterprise content management systems covering regulated documents. OMV Group was using a combination of systems for managing its information assets*. But these systems were found to be difficult to use, leading to low user adoption. Many employees continued to store files on internal network folders and local drives.

OMV needed a content management solution for several thousand users across its operations in Norway, the UK, and its Austrian headquarters. After a formal RFP process that included intensive solution usability testing, M-Files enterprise information management (EIM) solution was chosen. The system has since been deployed to OMV’s upstream operations in Romania, Pakistan and Austria. M-Files’ scope has now extended to include technical documentation related to refineries, and the solution also serves as the regulations and standards platform for the OMV Group.

OMV’s Florian Neuböck said, ‘M-Files is not only easy to use but it’s also easy to configure and administer in a manner that enables us to comply with strict regulatory requirements.’ M-Files’ Greg Milliken added, ‘Quick access to accurate and up-to-date information brings a major competitive advantage to information-intensive industries such as oil and gas.’ More from M-Files.

* OMV has previously reported deployments of EMC Documentum in its upstream Isis project (Oil ITJ March 2009) and in a WebGIS environment developed by Austrian Synergis.


Eliis, Geoteric team on 'cognitive' seismic interpretation

Seismic image processing meets sequence stratigraphy as boutiques connect.

Geoteric (a.k.a. Foster Findlay/FFA) has teamed with Montpellier, France-headquartered Eliis to combine its image-processing approach to seismic interpretation with Eliis' 'Paleoscan' guided/automated stratigraphic analysis. The collaboration involves the development of a 'seamless' link between GeoTeric and PaleoScan and is claimed to be a step forward in 'cognitive seismic interpretation.' The aim is to improve the precision of horizon interpretation and the detail obtained from color blends and multi-attribute volumes.

Oil IT Journal caught up with Eliis at the EAGE and checked out functionality in the Paleoscan 2016 release. There really are a lot of attributes on offer, and that was before the hook-up with FFA. Now the temptation for data futzing across the two platforms must be great, although this should be mitigated by Paleoscan's increasingly impressive auto-picking. Eliis also provides links to Petrel. Another connector, for Landmark's DecisionSpace, is under development. For pre-stack work, Eliis teams with Sharp Reflections.

