Oil IT Journal: Volume 23 Number 4


Oil and gas digital transformation, a review

What is it really? Why all the hype? Will it work? We look at transformation as seen by Honeywell, ABB, Siemens, Shell and Accenture to conclude that the putative ‘transformation’ is more driven by marketing than by business requirements.

In assembling this issue, we have studied a plethoric body of literature covering ‘digitalization’ and ‘digital transformation’ of the oil and gas industry. While the exact meaning of digital transformation is obscure (it could just mean ‘buy more of our stuff’) we examine some of the more coherent arguments for change as we a) try to pin down a definition and b) look behind the hype. But first, by way of an executive summary, here are our conclusions from this short trip through the latest thinking on the digital transformation.

While there are many possible interpretations, we consider that digital transformation is essentially about moving data to the cloud. The perception is that the cloud will be a better place to integrate data and make it amenable to artificial intelligence (AI). What is less clear is whether the move to the cloud is an inevitability that all must face up to, or even a prerequisite for doing AI. We consider that digital transformation is part rehash of pre-existing solutions and platforms and part an attempt at a land grab by the IT and consulting community. The Accenture position paper is particularly revealing as it concludes that in the future, ‘[data science] expertise [will] triumph over industry experience.’ At another level, the hardball marketing extends into the IT department itself with, as a recent Forbes article has it, ‘CIOs must adapt their personal and professional skills to meet today’s demands. Those who don’t risk becoming irrelevant.’ You have been warned!

Honeywell - good metadata is the key to cloud success.

Paul Bonner, speaking at the 2018 Honeywell User Group (HUG) in San Antonio, made the case for differentiating between transformations that focus on information technology and those that focus on operations technology. In the IT camp, companies are ‘either engaged in or planning for DT’, driven primarily by IT. While ‘some success’ is reported in data integration and cloud usage, cyber security and data governance challenge the pace of progress. OT has perhaps a head start on IT, with many pilot projects completed. The latter have shown that ‘process data is highly correlated*, general purpose big data tools and data scientists are not effective.’ Also, in the OT domain, streaming process data to the cloud requires a different toolset from IT. Bonner suggests building an OT transformation strategy by starting from an understanding of the existing digital footprint and capability. Then the required ETL tools and processes can be deployed to move data into a data lake in the cloud. A corporation’s readiness for such upheaval should be addressed with a digital maturity model such as those emanating from the Industry 4.0 movement, see for instance Science Direct (open access). Bonner sees the cloud from a different standpoint to some in the IT community. Good metadata is the key to cloud success. The cloud is not a suitable place for storing high volumes of raw process data, which should be cleansed, aggregated and compressed ‘at the edge.’ Various combinations of on-premises and hosted clouds can be deployed depending on governance and regulatory needs. All of these expose different and non-negligible cyber security issues. Which is where Honeywell’s enterprise secure cloud comes in, acting as a single conduit from plant to data lake. Honeywell advocates maintaining dual data stores in the cloud: an enterprise historian for operations and a data lake for analytics. These are linked by a ‘context model’ of plant equipment and data sources. Bonner concluded that the OT journey is not an easy one and there are no correct answers to tool selection - except to choose the right partner, Honeywell HCP of course!

* AI techniques often assume input variables are uncorrelated. Early attempts (from the 1960s!) at numerical taxonomy in biology and geoscience came unstuck as measures (length, breadth, weight) are generally correlated. Correlation is a good issue to raise with your pet data scientist.
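By way of illustration (our Python sketch, not anything from the HUG talk), the following simulates three correlated ‘length, breadth, weight’ style measurements driven by a single latent ‘size’ factor, then shows how a principal component transform decorrelates them before they are handed to a model:

    import numpy as np
    from sklearn.decomposition import PCA

    # Toy data: three measurements all driven by a latent 'size' factor
    rng = np.random.default_rng(0)
    size = rng.normal(10, 2, 500)
    length = size + rng.normal(0, 0.5, 500)
    breadth = 0.6 * size + rng.normal(0, 0.5, 500)
    weight = size ** 2 + rng.normal(0, 5, 500)
    X = np.column_stack([length, breadth, weight])

    print(np.corrcoef(X, rowvar=False).round(2))        # strong off-diagonal terms

    # PCA with whitening decorrelates the inputs
    X_white = PCA(whiten=True).fit_transform(X)
    print(np.corrcoef(X_white, rowvar=False).round(2))  # near-identity matrix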

ABB - Oil and gas transitions to new energy ecosystem.

A 16-page position paper from ABB, explaining “how digitalization enables oil and gas operators to transition to a new energy ecosystem”, rather confusingly conflates the transition of oil and gas to a greener energy ecosystem with the digital transformation. Success during the energy transition requires a ‘robust digital strategy’ with board-level support. Boardrooms need to ‘act decisively and embrace digital’ by (inter alia) accepting a flatter organization where decisions can be made by well-informed colleagues deeper in the organization who are receptive to new ideas and ways of working, by collaborating with the supply chain (i.e. ABB) and by ‘forming digitally-powered, multidisciplinary teams with the freedom to think differently.’ ABB’s copywriters extend themselves somewhat when they advocate ‘social media crowdsourcing’ which apparently ‘has already proved useful in reserve analysis, where seismic data and other information is put into the cloud for crowds to suggest analytical technique improvements.’ The approach is moreover said to ‘work well in the sharing economy that oil and gas operators are now entering.’

A barrier to sharing (of data) across the supply chain is the lack of standardization of sensor data. Other issues involve uncertain ownership of and access to data from suppliers, operators and contractors. ‘There is a lack of standardization and even when data is accessible, it is often too complex or large, obscuring any clear insights.’ ABB cites The Open Group’s work with ExxonMobil and Lockheed Martin as showing promise in this context but observes that where standards are ‘ambiguous or too general’, ABB develops its own.

Siemens’ Mindsphere-based digital hydrocarbon solution.

At the Upstream Intelligence Data-Driven Drilling and Production conference in Houston earlier this year, Jan Pawlewitz presented Siemens’ Digital Hydrocarbon offering, which is based on its ‘MindSphere’ cloud-based, ‘open’ Internet of Things operating system. MindSphere promises out-of-the-box IoT connectivity along with ‘MindApps’ for end user applications. For Siemens, the next stage in oil and gas ‘competitiveness and efficient operations’ will come from digital and advanced analytics. Siemens frames its offering in terms of the German government-backed Industry 4.0 paradigm.

Whereas ‘Topsides 4.0’ is arguably inside Siemens’ purview, a cryptic reference to SIFeld 4.0, an ‘integrated digital twin of reservoir and facilities’, would appear to be a new branding of the work performed for AkerBP, a.k.a. the ‘MindTwin for remote offshore operations’ on the Ivar Aasen field. Another Siemens reference is Bahrain Petroleum (Bapco), which deployed Siemens XHQ operations intelligence software in 2012. This leads to the question of how new and novel all the digital twin stuff really is and the extent to which it is a repurposing of existing ‘solutions.’ Oil IT Journal’s earliest XHQ reference dates back to 2003!

Shell’s master class in digital transformation at TechTonic.

TechTonic 2018, a gathering of Shell’s technologists hosted at its Bengaluru, India IT hub, heard Jay Crotts (executive VP and group CIO) and Nitin Prasad (chairman of Shell Companies in India) expound on ‘digitalization’ and Shell’s future vision. Prasad put digitalization ‘at the heart of the energy transition.’ Shell unveiled ‘Agile’, a new project management tool to manage workstreams and information flow, alongside its integration of SAP in its digitalization journey and new cloud computing capabilities. Gartner’s Rich McAvey led a masterclass on digital transformation and the future of work in oil and gas with emphasis on how ‘CIOs must change IT to remain relevant and impactful.’ T-Systems’ CTO Jo Campbell led another class on IoT/edge technologies that are to ‘transform the landscape of the energy industry and refineries.’ Crotts cited Shell’s increased focus on expanding in-house IT expertise as testimony to the importance of digital ecosystems for driving progress in the energy world. ‘This digital agenda has been at the forefront of our thinking for years - from automation of oil field [production] to the engagement with our end-customers. I believe the digital agenda gives us a platform for standards that allow us to execute business processes cheaper than we have ever done before.’

Accenture - how to compete with free energy!

The title of Accenture’s latest white paper (and full report here) ‘Oil and Gas: How do you compete with free?’ might make you think of open source software. Not at all. The threat to oil and gas is free energy! Oil demand is expected to peak in the next 20 years and its share of the energy mix is expected to fall from 80% to near 50% by 2060. This means that leading oil and gas companies are on high alert: 54% believe their growth strategies are at risk. The rest are ‘less concerned,’ in part because they believe their digital investments will protect them. Enter the ‘tremendous opportunity’ of data that is ‘just waiting to be turned into actionable insights that can reduce the cost of supply, increase operational responsiveness, and open the doors to new and profitable business models’.

Accenture’s analysis suggests that applied intelligence, driven by analytics, has the potential to shift the P&L equation with double-digit gains in efficiency, productivity and cost savings. The shift to an AI-driven world means changes to the oil and gas job market. According to Accenture, geoscience professionals’ jobs will ‘increasingly be filled by image processing experts from other industries such as high tech or health care*.’ Oils need to realize that ‘expertise triumphs over industry experience.’ ‘To compete with free, companies need to shift focus to providing a service, not selling a commodity at the wellhead. To enable this transition, the entire ecosystem must evolve to ensure molecules are dispatched to end users that exhibit the greatest demand. Expanding collaboration and risk-sharing across ecosystem partners, all the way to the customer, will be key.’

* This claim merits a health warning! Geoscientists are already, to a large extent, ‘image processing experts’, witness the huge seismic imaging market and the use of image processing tools (Aviso, Geoteric and others). This is a typical ‘grass is greener on the other side of the fence’ claim, assuming that the reader has a poor knowledge of the state of the art in the other (healthcare) field.


Linux Energy Foundation

Open source software initiative targets energy transition and smart grids.

The Linux Foundation, owner-operator of the ubiquitous open source operating system Linux, has announced ‘LF Energy’, a coalition of open source software users in energy. The initiative is to ‘speed technological innovation and transform the energy mix across the world’, with an initial focus on smart grid and electricity distribution. LF Energy is an umbrella organization that will support and sustain multi-vendor collaboration and open source progress in the energy and electricity sectors.

LF Energy Executive Director Shuli Goodman said, ‘A collaborative open source approach to developing information and communication technologies across companies, countries and end users will provide the innovation needed to meet our respective goals in renewable energy, power electronics, electric mobility and digitalization of the whole energy sector.’

Current projects include ‘PowSyBl’, a framework of reusable modular components for modeling distributed energy resource environments, and ‘RIAPS’, the resilient information architecture platform for smart grids. RIAPS was developed at the Institute for Software-Integrated Systems at Vanderbilt University and funded by the US Department of Energy’s Advanced Research Projects Agency-Energy (ARPA-E). LFE’s anchor tenant is France’s transmission system operator RTE. Others involved in the initiative include the EU Network of Transmission System Operators, Vanderbilt University and the Electric Power Research Institute.

We asked LFE if there were any oil and gas companies in the club, possibly through their involvement in generation or use of microgrids on or offshore. A spokesperson responded that ‘We are still in the very early stages, so we don't have any oil and gas companies just yet. But we will keep you posted’.


A conspiracy theory

Why computers are not going to take your job.

It is now commonplace at conferences and exhibitions to see hackathons with (usually) younger folks beavering away in Python, deriving insights from big sets of data. Should we all be learning to code? Or more specifically, what relative importance should be placed on learning to code as against learning, say, mathematics or physics? If your data contains all you need to know, then why bother with the bottom-up approach of learning geology, geophysics and what have you? The data driven brigade have it that the new paradigm of machine learning will sweep away many traditional jobs – not least seismic interpreters (see elsewhere in this issue).

As an ex-seismic interpreter myself, I confess to being more than a little skeptical about this – but forget that, and assume that we are moving to a world where ML is going to replace knowledge work. Or, as a recent report from that great source of nonsense and editorial inspiration, the World Economic Forum, has it, tomorrow’s workforce will comprise ‘data scientists, developers, social media specialists and marketing’. Or, to rub salt in the wound, you as a geophysicist will be replaced by a data scientist massaging big data. So, your employer is going to replace you, the geophysicist (at $100,000/year), with a hot shot data scientist (at $350,000/year)? There is something wrong here.

I have a crazy conspiratorial theory as to what is wrong with this curious development and it starts in the early days of computing. For a few decades computing did a straightforward job of a) speeding up calculations and b) doing more with a smaller headcount*. To pursue the geophysical example, it was rumored in the West that the Chinese did seismic migration (a compute-intense task) with huge teams of abacus-wielding engineers. The model here is one person and a computer doing the work of thousands. The remark attributed to IBM founder Tom Watson, that in 1943 there was only a market for about five computers, is seemingly apocryphal, but it summed up the idea of multiplying the efforts of a small number of programmers and computers**.

The early promise of the computer was one of a multiplication of effort. A program ought to perform a calculation or a task repetitively. A good program is one that wakes up as a Unix cron task at regular intervals and does its stuff: spiders the file system, QCs some new data and pumps it into the database. The ideal programming workflow is write-once … and then run forever, doing many people out of work, including the programmer! In the plant and process industries there are a lot of these programs around, controlling valves and motors in what is known as operations technology. OT is generally looked down on by IT.
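For the curious, such a ‘write-once, run forever’ job can be a page of Python scheduled from crontab. The sketch below is ours, with hypothetical paths and a hypothetical table:

    #!/usr/bin/env python3
    # Hypothetical housekeeping job. Schedule hourly from crontab with:
    #   0 * * * *  /usr/local/bin/ingest.py
    import sqlite3
    from pathlib import Path

    INCOMING = Path("/data/incoming")     # assumed drop folder
    DB = "/data/measurements.db"          # assumed target database

    def qc_ok(path: Path) -> bool:
        """Trivial QC: file must contain at least one parseable number."""
        try:
            return len([float(x) for x in path.read_text().split()]) > 0
        except ValueError:
            return False

    con = sqlite3.connect(DB)
    con.execute("CREATE TABLE IF NOT EXISTS readings (fname TEXT, value REAL)")
    for f in INCOMING.glob("*.txt"):      # 'spider' the file system
        if qc_ok(f):
            con.executemany("INSERT INTO readings VALUES (?, ?)",
                            [(f.name, float(v)) for v in f.read_text().split()])
            f.unlink()                    # consume the file once loaded
    con.commit()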

This is not quite how things panned out. Over time, the computer industry contrived to produce machines that require constant attention in the way of program maintenance and upgrades. Programming languages were tweaked and re-invented and battles raged between supporters of one programming paradigm and another. This was in part due to genuine technological progress, but not entirely. As the industry grew, Madison Avenue stepped in to help grow the business with its marketing. Nothing wrong with that. Back in the day, buyers recognized that real and not so real technological progress would be matched with a marketing effort designed to foster ‘fear, uncertainty and doubt’ (FUD) in those that failed to catch the next ‘big thing’.

The arrival of the personal computer threw a bigger spanner into the ‘multiplication of effort’ paradigm. Instead of a lonely programmer sitting in front of a console attached to a big machine doing a lot, everyone got to play with their own computer. Instead of a market of “maybe five” there was suddenly a market of “maybe five billion!” For the computer industry, the birth of the PC was manna from heaven. Instead of one programmer doing the work of many, the picture shifted to everyone doing stuff on their own (forcing the marketing department to convince us, despite the evidence, that we were all now ‘collaborating’ with ‘productive’ software).

To understand the computer industry, you need to ‘follow the money’. Imagine what would happen to the industry if somehow there was a shift back, even slightly, from one computer per person. A massive revenue stream would be lost! So Microsoft and others do everything in their power to counter the original nature of computing, i.e. to do more stuff with less human effort. In the PC world this is achieved by trivial ‘upgrades’ to computer operating systems, by inefficient ‘bloatware’, and by eye candy-oriented interfaces. In the larger field of enterprise computing, the objective of most development seems to be a ‘dashboard’, again with the implication that a human looking at a screen is the essential endpoint.

In a presentation at the 2018 ECIM we heard about a proof of concept by Arundo Analytics (report from ECIM in our next issue) which was, on the face of it, an AI success. Arundo’s deep learning-based predictor of compressor failure correctly warned of failure three weeks before it happened. The only trouble was, ‘nobody was looking at the screen!’ The unasked question is, ‘why would anyone want to look at a screen for months on end waiting for such a warning?’ It seems like IT’s task is done once a dashboard is up and running with graphical widgets and cute eye candy to display a few KPIs.

But I have digressed from my original question as to the role of data scientists and subject matter experts (SME) like geophysicists. Let’s assume that AI will replace many SMEs, including geophysicists, and result in machines doing the bulk of knowledge work. This leads to the interesting question of how the computer industry will make its money faced with a greatly reduced head count and no hardware sales (now that all is done in the cloud).

A glance at the GAFA’s (and Microsoft’s) revenues makes it clear that this is absolutely not how things are developing. No way is the AI revolution (or any other next big thing) going to result in a ‘rational’ use of compute resources that reintroduces a computerized multiplication of effort. Marketing will step in to assure that business as usual continues, that more and more powerful computers will have to be deployed, that compute nirvana awaits those with a mastery of ‘R’, CUDA or what have you. I say ‘will’ but the marketing madness is all around us right now. Wild claims abound for AI as “controlling safety-critical infrastructure across a growing number of industries…” (no it does not!). A possible best-in-class wild claim just popped into my inbox from Accenture where we are invited to “find the holy grail, the driverless supply chain, with quantum computing in oil and gas”. Really? Back in the day a mild amount of FUD was accepted as part of doing business. But things are getting out of hand! As I put it in a short letter that the Financial Times had the good grace to publish recently...

IBM and others have been so successful in entangling members of the Fourth Estate*** (FT Big Read Technology September 4) that it is hard to see how near they are to making a quantum computer that actually works. Perhaps the quantum computer is, like the qubits that drive it, in a state of superposition, being both real and not real at the same time. The question is, can the marketing folks keep the promise of quantum computing alive before it, like the qubit, decays and is replaced with the next ‘big thing.’

Best regards Neil McNaughton

* Although the job picture here was actually more nuanced, see the US Bureau of Labor Statistics’ Occupational Changes in the 20th Century.

** It is not even as silly as all that if you allow that today, there are just a handful of GAFAs running ‘real’ computers in behemoths of datacenters.

*** Fancy talk for the media!


Review: Quantifying Uncertainty in Subsurface Systems

AGU textbook provides broad, erudite introduction to resource evaluation with innovative Bayesian approach.

Quantifying Uncertainty in Subsurface Systems (QUSS), aka Geophysical Monograph 236, by Céline Scheidt, Lewis Li and Jef Caers (all from Stanford University) is a co-publication of the American Geophysical Union and John Wiley and Sons.

QUSS is a textbook, almost a work of literature, that provides broad coverage of current subsurface assessment in the fields of oil and gas, water, geothermal, storage and more. QUSS is bang up to date with, for instance, the use of data science in North American shale development. QUSS covers much more than is indicated by its title, discussing the limits of a fundamental scientific approach in the face of complex geological and other phenomena. The authors compare predictive (forward) modeling from first principles with data-derived models. And that is before we get into quantifying uncertainty (QU) in all of the above.

QUSS takes quite a while to get around to the topic of the title. QUSS per se is discussed in a 50-page chapter starting on page 217! As the authors explain in the introduction, “the primordial question is not necessarily the quantification of uncertainty of all the [...] variables but [rather] of a decision-making process involving any of the target variables in question.” Such decision making might include whether to acquire more seismic data over an oil field, introducing the topic of value of information.
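The value of information calculus is easily caricatured. In the following toy Python arithmetic (our numbers, invented for illustration), a perfectly reliable survey is worth buying because it lets the operator avoid developing a ‘bad’ reservoir:

    # Decision: develop or walk away; uncertainty: is the reservoir 'good'?
    p_good, v_good, v_bad, survey_cost = 0.4, 100.0, -60.0, 5.0   # $ millions

    # Without the survey: develop only if the prior expected value is positive
    ev_without = max(p_good * v_good + (1 - p_good) * v_bad, 0.0)      # = 4.0

    # With a (perfectly reliable) survey: develop only on a 'good' result
    ev_with = p_good * v_good + (1 - p_good) * 0.0                     # = 40.0

    voi = ev_with - ev_without                                         # = 36.0
    print(f"Acquire the survey? {voi > survey_cost}")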

A chapter on decision making under uncertainty introduces the ‘science’ of decision analysis. This section is an extensive, well-illustrated overview of data science including dimension reduction, principal component analysis, regression (including boosted trees), kriging and kernel processes and cluster analysis. A further chapter covers Monte Carlo-based methods and their use in model simplification with ‘Sobol’ regionalized sensitivity analysis.
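Readers who want to experiment can reproduce the flavor of a Sobol analysis with the open source SALib Python package. The following is our illustration with a stand-in forward model, not code from the book:

    import numpy as np
    from SALib.sample import saltelli
    from SALib.analyze import sobol

    problem = {
        "num_vars": 3,
        "names": ["porosity", "permeability", "aquifer_strength"],  # assumed inputs
        "bounds": [[0.05, 0.35], [1.0, 500.0], [0.0, 1.0]],
    }

    def forward_model(x):
        """Stand-in for an expensive reservoir simulation."""
        phi, k, aq = x
        return phi * np.log(k) + 2.0 * aq

    X = saltelli.sample(problem, 1024)            # Monte Carlo (Saltelli) design
    Y = np.apply_along_axis(forward_model, 1, X)
    Si = sobol.analyze(problem, Y)                # decompose the output variance
    print(dict(zip(problem["names"], Si["S1"].round(2))))  # first-order indices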

A whole chapter is devoted to the philosophy behind Bayesian methods, with a historical context explaining why Bayesianism is now a ‘leading paradigm’ for quantifying uncertainty. This leads on to a presentation of the role of the prior distribution in both deterministic and stochastic inversion, geological heterogeneity and geostatistics. The philosophy of science approach is leavened with some interesting asides, including the attempts to model Saudi Arabia’s Ghawar field, where flowmeter measurement led to ‘ridiculous’ permeability values (200 Darcys!) and ad-hoc ‘fixes’ to previous models. The authors also note a tendency in the modeling community to ‘shy away from bold hypotheses certainly if one wants to obtain government funding’ and the fact that modelers tend to ‘gravitate toward consensus under the banner of being good at team-work.’ The chapter concludes with a discussion of the nature of geological priors and their relationship to, inter alia, model building and flume tank sedimentological experiments.

Chapter 7, billed as ‘the most novel technical contribution’ of QUSS, introduces a collection of methods called ‘Bayesian evidential learning’ (BEL). These address the problem of matching large, realistic geological models with limited computing resources. BEL leverages Monte Carlo methods to generate a training set of data and prediction variables that can ‘allow for predictions based on data without complex model inversions.’
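As we read it, the BEL workflow can be caricatured in a few lines of Python: sample the prior, run the forward model to produce both data and prediction variables, then learn a direct statistical link between the two. The following toy sketch is ours, not the authors’ code:

    import numpy as np
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(1)
    priors = rng.uniform(0.1, 1.0, (2000, 4))     # prior samples of model parameters

    def forward(m):
        """Stand-in forward model returning (data, prediction) variables."""
        data = m[:3].sum() + rng.normal(0, 0.05)  # e.g. a measured response
        pred = 10 * m.prod()                      # e.g. future recovery
        return data, pred

    d, h = np.array([forward(m) for m in priors]).T
    link = LinearRegression().fit(d.reshape(-1, 1), h)

    d_obs = 1.7                                   # the field observation
    print("prediction given the data:", link.predict([[d_obs]])[0])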

QUSS is a fantastic compendium of terminology and methods addressing a wide range of subsurface problems. Some practitioners may find the comprehensive approach rather bewildering, which is probably in the nature of the subject. But even the amateur decision-maker will find much to intrigue and challenge. QUSS undoubtedly merits a more leisurely read than this reviewer could afford. In our rapid run-through we spotted an amusing typo on page 23 where a geological model of a “buried valet” is discussed. Poor chap!

Quantifying Uncertainty in Subsurface Systems (QUSS) by Céline Scheidt, Lewis Li and Jef Caers all from Stanford University. Wiley ISBN: 978-1-119-32583-3.


Open Subsurface Data Universe

Shell backs The Open Group initiative for a standard upstream data platform. TOG rolls-out DPBoK snapshot.

Speaking at The Open Group’s (TOG) recent Houston event ‘Digital Transformation in the Energy Industry’, Shell CIO Johan Krebbers announced a new initiative, the Open Subsurface Data Universe (OSDU), a forum that sets out to deliver a ‘standard data platform to bring together exploration, development, and well data.’ Krebbers presented the business drivers for an ‘open, cloud-based data architecture for oil and gas’. OSDU is to ‘separate data from applications’ with a data-centric approach ‘supported by metadata.’ OSDU targets the ‘faster delivery of capabilities and lower implementation and lifecycle costs across the subsurface community.’ The forum aims to combine new digital technologies and best business practices to address business and technical issues related to subsurface data. OSDU is to be a consensus-based group of customers, suppliers and academia, relevant to oil and gas operators, cloud services companies, vendors and others.

In a separate announcement, TOG has released the ‘first official snapshot’ of the Digital Practitioner Body of Knowledge (DPBoK), representing ‘the interim results of what is intended to become the DPBoK standard.’ The DPBoK snapshot is a 100,000-word plus discourse on things digital. Companies in the throes of a digital transformation may like to check out TOG’s contribution to the field, which involves a scenario ‘wherein an individual is looking to buy a prosthetic limb for her brother’ and the ‘seven levers of change’ involved in the journey to digital nirvana.


2018 Nvidia GPU Technology Conference

Schneider’s neural nets and latent space deep learning power pump controller.

Speaking at the 2018 Nvidia GPU Technology Conference, in the “AI at the edge” track, Schneider Electric’s Matthieu Boujonnier showed how Schneider is applying deep learning to analyze artificial lift with Dynacard records. Labelled Dynacard records can be treated as image data for classification with a convolutional neural net, just as pictures of cats are classified on the internet! The problem is that it is hard to obtain a decent set of labelled Dynacard data; more usually there is only a smallish set of typical pump responses labelled by an expert.

Schneider’s approach is to ‘augment’ the data set by combining cards showing similar effects. This is said to make for a simpler model and less ‘overfitting’. The dataset can be further augmented using an autoencoder and a latent space methodology. Another approach is to extract ‘new’ features from images, such as gradients. Different statistical models can be combined in an ensemble model, which is said to increase the odds of success. Schneider’s ‘Realift’ rod pump controller solution, bundled with data acquisition and cleansing apps, can now be deployed ‘at the edge’, i.e. in the field, for on-the-spot diagnostics and beam pump optimization.
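For the record, here is our hedged reconstruction of the latent space augmentation idea in Python/Keras, with made-up card data standing in for real Dynacards (an illustrative sketch, not Schneider code):

    import numpy as np
    from tensorflow.keras import layers, Model

    CARD_LEN = 200                                # assumed: cards resampled to 200 points
    inp = layers.Input(shape=(CARD_LEN,))
    z = layers.Dense(8, activation="relu")(inp)   # 8-dimensional latent space
    dec = layers.Dense(CARD_LEN)                  # decoder layer, reused below
    auto = Model(inp, dec(z))
    auto.compile(optimizer="adam", loss="mse")

    cards = np.random.rand(64, CARD_LEN)          # stand-in for expert-labelled cards
    auto.fit(cards, cards, epochs=20, verbose=0)

    # Encode the real cards, perturb in latent space, decode to 'new' cards
    encoder = Model(inp, z)
    latent = encoder.predict(cards, verbose=0)
    jittered = latent + np.random.normal(0, 0.1, latent.shape)
    z_in = layers.Input(shape=(8,))
    decoder = Model(z_in, dec(z_in))
    synthetic_cards = decoder.predict(jittered, verbose=0)
    print(synthetic_cards.shape)                  # (64, 200) augmented samples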

CGG trains LeNet to identify seismic faults.

Steve Dominguez (CGG) compared current interpreter-guided interpretation techniques, such as those deployed in CGG’s InsightEarth* 3D interpretation solution, with a data-driven deep learning approach. CGG is currently working on accelerating compute-intense processes for automatic fault extraction, geobody extraction and noise reduction by applying simpler and faster neural network approaches. Dominguez has retrained the LeNet system to recognize faults and gets 70-80% accuracy in fault identification (a generic sketch of such a network follows the footnote below).

* Developed by the Geoscience Interpretation Visualization Consortium (GIVC) in a 12 plus year R&D effort.
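For illustration, a LeNet-style classifier for fault/no-fault seismic patches might look as follows in Python/Keras. Patch dimensions and training data are our assumptions, not CGG’s:

    import numpy as np
    from tensorflow.keras import layers, models

    model = models.Sequential([
        layers.Input(shape=(32, 32, 1)),             # assumed seismic patch size
        layers.Conv2D(6, 5, activation="tanh"),      # classic LeNet-5 layout
        layers.AveragePooling2D(),
        layers.Conv2D(16, 5, activation="tanh"),
        layers.AveragePooling2D(),
        layers.Flatten(),
        layers.Dense(120, activation="tanh"),
        layers.Dense(84, activation="tanh"),
        layers.Dense(1, activation="sigmoid"),       # P(patch contains a fault)
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

    # Stand-in training data: patches labelled fault/no-fault by an interpreter
    patches = np.random.rand(256, 32, 32, 1)
    labels = np.random.randint(0, 2, 256)
    model.fit(patches, labels, epochs=3, verbose=0)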


BP, BHGE and Predix .. or is it?

Advanced analytics solution, developed with BHGE, will be installed on BP’s upstream assets around the world.

Following trial deployment in 2015 [BP’s production data infrastructure from bespoke to COTS (N° 7 2015)], BP has extended deployment of its Plant Operations Advisor (POA), presented in the release as built on (but see below) GE’s Predix internet of things platform. POA is described as a ‘cloud-based advanced analytics solution’ developed with BHGE that is now in operation on BP’s Atlantis, Thunder Horse, Na Kika and Mad Dog platforms. The solution is scheduled for deployment on some 30 other BP upstream assets worldwide. POA captures 155 million data points per day from 1,200 pieces of equipment, providing insights on performance and maintenance.

BP’s global head of upstream technology Ahmed Hashmi, said, ‘POA, co-developed with BHGE, is a key plank of modernizing and transforming our upstream operations. We expect the technology to deliver improvements in safety, reliability and performance of our assets and help raise the bar for the entire industry.’

POA is said to be built on GE’s Predix platform, an ambitious cloud-based industrial internet whose evolution Oil IT Journal has followed closely since 2013. Originally presented as a joint venture between Accenture and GE, leveraging its own data centers along with Amazon’s analytics, Predix was developed atop Pivotal’s Cloud Foundry. Earlier this year, it was announced that Predix would be standardized on Microsoft Azure in what GE CEO John Flannery described as part of a ‘major change’ in how the company is run, abandoning plans for its own cloud. More recently, the Wall Street Journal reported that GE is planning to seek a buyer for parts of its digital unit, ‘unwinding a signature initiative of former CEO Jeff Immelt that loses money despite billions in investment.’

In a short email exchange, we asked BHGE for clarification on the nature of “Predix” in the BP deployment. A BHGE spokesperson kindly provided the following which we have edited...

“Predix is a platform created by GE Digital. BHGE Digital leverages this platform for some customers. BP’s POA was developed in partnership with BHGE as ‘fit for purpose software’. [ … ] We develop and deliver software that can be deployed on both the Predix platform and in other cloud environments, like Google Cloud Platform, Microsoft Azure, and Amazon AWS. This multicloud approach allows us to be flexible with our customers and help them maximize recovery, optimize production, reduce nonproductive time, and improve operational efficiency. [ … ] We deliver capabilities in asset health, reliability, strategy, integrity, as well as corrosion and process management. For our Fit for Purpose offering, the BP POA deployment is a great example of bringing our advanced analytics expertise together with a customer to meet specific needs. Underlying everything we do is advanced analytics and AI capabilities and the flexibility and domain expertise to work with our customers for improved operational outcomes.”

Reading between the lines, Oil IT Journal therefore concludes that the BP POA does not run on the GE Digital Predix platform as previously presented by GE O&G, i.e. the Pivotal/Cloud Foundry-based ecosystem.


IFPen and Kappa extend partnership

Kappa to add ‘powered by IFPen’ modules. New DDA handbook improves treatment of unconventionals.

In an interview published in the 2017 IFPen Annual Report, Jean Burrus, CEO of the Beicip-Franlab unit, provided an update on its partnership with French well test software boutique Kappa Engineering. Burrus stressed the need for closer coupling between software targeting the well and solutions addressing broader reservoir engineering issues. The partnership, first announced in 2016 (Oil IT Journal), involved the development of data linkages from IFPen’s PumaFlow and PVTFlow into the Kappa toolset. The extended partnership will see Kappa’s software more fully embed these and other IFPen modules and will result in a new ‘powered by IFPen’ brand of Kappa’s product line. IFPen’s consulting arm, Beicip-Franlab, will also add its domain expertise to support users and clients.

Kappa has also released a new version (5.2) of its Dynamic Data Analysis tome, a 700-plus page textbook on well test analysis. The latest edition includes a revised and improved treatment of well testing of unconventional reservoirs with Kappa’s Citrine tool, which was ‘substantially enhanced’ in 2017. Feedback from Kappa’s formation testing training activity has been incorporated into the new edition. Read our review of the previous edition of DDA here.


Software, hardware short takes

Emerson, Yokogawa, SpotSee, Ai Prime, IFPen, Sintef, Neftex, Oseberg, Petrosys, SharpReflections, Safe Software, Roxar, Spectro Scientific, Siemens, Schlumberger, SPOC Automation, Technical Toolboxes, Tecplot, Thermon, Rock Flow Dynamics, Justcroft.

Emerson has announced Paradigm 18, its flagship geoscience and earth modeling package. Paradigm is now said to use machine learning as a data integrator and process automation tool. Paradigm 18 introduces new solutions for unsupervised seismic facies classification of pre- and post-stack seismic data. The SeisEarth interpretation suite now includes geobody detection that can leverage results from the classifier. New sensitivity analytics show the impact of parameter changes across the interpretation workflow. Other enhancements include a stratigraphic context tag to enable versioning of database entities. Time-based production data in Epos can now be accessed from StratEarth and Skua-GoCad for ‘more comprehensive data analysis’. A new 3D hybrid grid for Skua-GoCad adds a state-of-the-art meshing capability for geomechanical finite element computations.

Yokogawa has announced a new solution for mitigating hydrate build-up in offshore production. Two applications from Yokogawa’s KBC unit, Maximus and Multiflash, act on real time scada data. Maximus performs compositional steady state simulations of the pipeline network of wells, choke valves, flowlines and processing equipment. Multiflash adds pressure-volume-temperature modeling for hydrocarbon flow assurance. The combined solution is used to pinpoint the location of hydrate risk and recommend inhibitor injection volumes that balance risk and cost. Operators can optionally deploy automated FluidCom metering valve technology from Yokogawa’s TechInvent AS unit for end-to-end hydrate risk management.

SpotSee’s OpsWatch is a cloud-connected vibration monitor that monitors shocks and vibrations on equipment in use and in transit, alerting users if an item is dropped or damaged. SpotSee provides transparency to the heavy equipment shipping process, monitoring top drives and mud pumps while they are in use, or alerting operators to potential compressor skid damage. OpsWatch is also available in an ‘EX’, Atex-certified version for hazardous environments.

Ai Prime is a new document sharing and mapping platform that its developers claim will replace Google Earth for the Texas oil and gas sector. The free collaboration platform integrates information from multiple sources with imagery that is ‘10 times clearer and more up-to-date’ than other sources. Users can purchase additional information or incorporate their own data. Ai Prime provides 10cm resolution imagery across the Permian basin.

IFPen, the French energy R&D organization has announced GeoAnalog, a web service decision support and training solution that provides access to a database of geological structure analog models. GeoAnalog is the fruit of some 30 years of IFPen research and compiles around 1,500 experimental models.

Norway’s Sintef has released a new edition (2018a) of its Matlab Reservoir Simulation Toolbox (MRST). The new release includes a new corner-point geometry that better agrees with results from commercial simulators and now supports vertical flow performance tables and reservoir voidage rate targets. MRST runs on both Linux and Windows.

Neftex’ new TectonicExplorer is a cloud-hosted combination of the company’s geodynamic plate model and its Insights interpreted wells, geochronology and mineral deposit databases. Users can generate a robust tectonic story for early stage exploration of an area of interest.

Oseberg has released Oseberg Map Services, making its US land, lease, regulatory and drilling datasets available to any ESRI-compliant application. The use of cloud-based data fixes the problem of shapefile imports and out of sync data.

Petrosys PRO 2018.2 introduces a new probabilistic resource calculator, direct display of Excel spreadsheets on maps and improvements to the grid editor.

SharpReflections’ Pre-Stack Pro 5.2 features an interactive spectral decomposition module, with intuitive RGB blending of maps or time slices. All calculations are carried out in memory, and any parameter change is updated instantly in the viewer. Pre-Stack Pro now supports execution of user-created plugins, including complex workflows created with Fraunhofer ITWM’s Aloma tools. PSP development is supported by sponsors of the Foundation Project IV R&D Consortium.

Emerson/Roxar’s RMS 11 reservoir characterization solution includes a new technology plug-in that extends functionality and positions it for future web and cloud-enabled workflows. Other enhancements to the API allow users to integrate their own IP into reservoir modeling workflows. Extensions to Petrel I/O tools and other new features improve data transfer between RMS and flow simulators.

Safe Software’s FME 2018.1 adds an Apache Hive Reader for Hadoop and HDFS data, read/write support of raster data in R and a reader for FME’s augmented reality format. The iOS AR app has also been updated with fine grain scene control.

Spectro Scientific’s TruVu 360 enterprise fluid intelligence platform is a comprehensive web-based fluid analysis data management system. The new platform closes the gap between maintenance recommendations on the oil analysis report and the impact on continuous process improvement. TruVu 360 integrates Spectro’s MiniLab on-site oil analysis hardware used in industrial applications in manufacturing, mining, oil and gas and power generation.

Siemens has announced a mobile process control tele-controller with integrated GPS functionality. The Simatic RTU3031C (Remote Terminal Unit/RTU) is intended for new use cases requiring locality awareness and time synchronization. The unit compares a tag’s nominal position with the GPS for mobile monitoring of measuring points and more.

The 2018.1 release of Schlumberger’s Eclipse reservoir fluid flow simulator allows users to run multiple connected models using the block parallel license to reduce runtime. Well connection factor multipliers provide flexible and dynamic update of connection transmissibility for advanced workflows such as damage due to water influx. The top of the range Intersect simulator is also released as 2018.1 with greater model fidelity to honor geology, reservoir and operation complexity. The new release improves usability, integration and ‘openness’ to ‘help customers explore the spectrum of possible scenarios that they could never explore before.’

SPOC Automation’s new ‘Revelation’ pump-off controller analyzes down-hole pump and surface data cards along with trend data of pump fillage, rod load, and polished rod horsepower. Data is exportable locally or via scada. The controller is integrated with SPOC’s IronHorse rod pump variable frequency drives.

Technical Toolboxes’ new horizontal directional drilling power tool (HDDPT) addresses horizontal drilling projects such as pipeline drilling and installation at road crossings, bodies of water and railroad rights of way. The HDDPT targets pipeline design and construction, lowering costs and monitoring drilling and pulling operations.

The 2019.1 release of the Tecplot RS reservoir simulation post processor adds a new stamp plot feature for simultaneous visualization of well production data with the simulation grid output. The new release also simplifies the process of creating an arbitrary slice through a chain of wells, connecting, for instance, an injector with its producers.

Thermon Group has announced the TraceNet Genesis control and monitoring system for managing heat trace circuit performance on process lines, tanks and instrumentation. TraceNet provides access to trace circuit performance history, fault analysis and circuit drawings. TraceNet Genesis is described as a ‘fully connected industrial internet of things platform’ that delivers, inter alia, up to six months of performance trending data.

The 18.1 release of Rock Flow Dynamics’ tNavigator includes a vectorized version of equation of state flash calculation. Users can now accelerate compositional model simulations with CPU-embedded SSE/AVX vector engines and/or use GPUs for flash calculations. The functionality extends existing capabilities to run the linear solver on single or multiple GPUs.

Justcroft International’s JustImage 5.3 core image data management solution improves localization support and adds new input and printer file formats and a new JustSearch function for document retrieval.


Microsoft brings the mountain to Mohammed

Data residency issues force Equinor to partner with Microsoft on in-country data centers.

Moving to the cloud may be an essential prerequisite for digital transformation but it is not as easy as it might appear. An individual hacker can provision an Amazon web services ‘instance’ and, as we did back in 2015, have a ‘hello world’ app running in the cloud in a matter of … well, a couple of hours in our case (a minimal sketch of such an app appears below). The hacker does not need to worry about what the cloud actually is or where it is located. The situation is different for a corporation, which is subject to internal and external compliance issues, as Equinor (previously Statoil) discovered.
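For the record, the ‘hello world’ in question need be no more than a few lines of Python/Flask run on a freshly provisioned instance. A minimal sketch (the host and port settings are our assumptions):

    from flask import Flask

    app = Flask(__name__)

    @app.route("/")
    def hello():
        return "Hello, world - from the cloud!"

    if __name__ == "__main__":
        # 0.0.0.0 so the instance's public address serves the page;
        # port 8080 avoids needing root privileges for port 80
        app.run(host="0.0.0.0", port=8080)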

Equinor originally intended to run its Omnia database ‘in the Azure cloud.’ Trouble came with the realization that Microsoft’s data centers were located in Ireland. This meant that Equinor was exposed to regulations on ‘exporting’ data and an additional bureaucracy around data movement. The answer was to bring the mountain to Mohammed, or more specifically, a ‘strategic partnership agreement’ between Equinor and Microsoft and the establishment of new data centers in Stavanger and in Oslo.

Statoil CIO Åshild Hanne Larsen said, ‘The partnership enables our digital journey to deliver more safe, secure and efficient operations. Our ambition is to become a global digital leader within our industry. A cloud data center in Norway will simplify and accelerate Equinor’s adoption of the cloud.’ The seven year commitment is said to be in the hundreds of millions of dollars range. Cloud services from the new data center will be available in late 2019. Read the release here.


2018 EU Association of Geoscientists and Engineers (EAGE) Annual Meeting, Copenhagen

Plenary debate - Oil and gas in the energy transition.

The opening plenary session of the 80th annual conference of the EU Association of Geoscientists and Engineers (EAGE) was graced with a royal presence in the person of HRH Prince Joachim of Denmark. In a short but to the point soliloquy, the Prince observed that although Denmark was undergoing a ‘rapid transition from fossil fuel to renewables … until green energy can fully meet our needs, oil and gas will remain the energy backbone of modern society.’

EAGE president Jean-Jacques Biteau announced a registered attendance of 4,500 and an astonishing 1,229 papers and 500 posters. Coverage extends beyond oil and gas, with sessions on geomechanics and geothermal and a renewed focus on geology but, ‘we want to keep our brand which is mostly geophysics*’. The EAGE is also ‘modern’ with special events covering machine learning, AI and more.

The ensuing set piece was a ‘debate’ (in reality there was little debate and ‘no questions’) hosted by Danish environmentalist Martin Breum who observed that the ‘transition era’ has begun as ‘Equinor has taken over Statoil, at least in name.’ Breum noted the paradox of a ‘growing demand for oil and gas at the same time as climate change is at top of the agenda’.

Arnaud Breuillac stated that Total’s role is to provide affordable energy for all. Breuillac cited a study by the Carbon Tracker organization to claim that Total is the only major that has ‘fully integrated [the COP 21] 2°C scenario into its strategy’. This is done by improving energy efficiency and reducing carbon in operations, along with a ‘credible’ (i.e. profitable) reduction of oil and gas use over the next 20 years. Total does not see any short/medium term impact to its business. If, as the IEA forecasts, renewables produce 20% of the world’s energy in 20 years’ time, that means that 80% will be oil and gas! If it is profitable to go green faster, ‘Total will be at the forefront.’

Breum quizzed Statoil/Equinor’s Jez Averty on the name change. We are on a ‘journey’ across an uncertain landscape of scenarios and outcomes. In 2050 the world may see anything from 60 to 120 mmbbl/day of demand. Meanwhile, carbon prices, photovoltaics and e-vehicles are on the up. But so are coal, oil and CO2. ‘Equinor’ reflects a revised strategy addressing high value, safe and low carbon energy needs, including new renewables business areas with an ‘ambitious target’ for capex share across renewables by 2030. The aim is for both profitability and ‘sustainability.’ Breum expressed puzzlement as to what happened to the ‘Stat/state’ part of the old name.

WoodMac’s Paul McConnel was up next, expounding on the latest Global Trends/Energy Outlook surveys. These foresee peak demand coming first for coal and later for oil and gas. Global CO2 will continue to rise and the Paris (COP21) targets will not be met, even though electric and renewables are growing. But there are many uncertainties and portfolios need to be flexible. ‘Uncertainty is the word. Oil, coal, gas and renewables all interact.’ Breum pressed on the likelihood of a drop in demand for oil. This is possible – with the price maybe down to $5-15/bbl before peak oil finally catches us. Different companies have already adopted widely different stances. Dong has already made a complete transition to renewables. Total and Equinor have made some moves but are still basically oil and gas companies. Exxon has unashamedly adopted a ‘last man/last barrel standing’ position. Total does not like the last man standing theme and is working on CCS in a JV with Equinor. Equinor does not like this either, ‘we need a coordinated policy response now to CO2’. The key is to ‘turn CO2 into a value chain and asset’ (thermodynamics notwithstanding!). Total has a new unit investing in solar, wind, batteries (SAFT) and the ‘battery of the future’, all of which will ‘integrate the value chain alongside fossil fuel.’

McConnel wondered aloud why oils should be into renewables. They may not have much choice if electric vehicles (EV) get popular. How will oils respond to an ‘evaporating’ customer base? EVs ‘cost virtually nothing to run’ and wind costs are coming down. Breuillac agreed that EVs’ impact will be ‘massive,’ perhaps accounting for 6-8 million bopd by 2040. While this is significant, it depends where the energy will be coming from. Today, much electricity in China comes from coal. Total plans to integrate across the value chain because ‘we don’t know where the disruption will come from.’ One significant development is Total’s involvement with French smart grid startup Greenflex. Averty opined that a ‘revolution’ is needed to accommodate a 2° scenario, one that decouples prosperity from energy use. Having said that, what happens in Norway is ‘totally irrelevant’ in the greater scheme of things. Even the EU is ‘a bit’ irrelevant. What counts is what happens in China and India. McConnel stated that China is going for an energy transition with lots of EVs and a massive investment in batteries and solar. Total tests all new projects for energy efficiency using a €30-40 carbon price. The company has also established a technology hub in Copenhagen following its acquisition of Maersk. ‘Our message to the young of today is that we are not an industry of the past.’

* ‘EAGE’ originally stood for the EU Association of Exploration Geophysicists.

EAGE Forum on Digitalization.

Martin Breum was back the next day to compere the EAGE ‘Digitalization of the E&P Industry Forum’. While ‘digitalization’ is the big thing today, BP’s seismic imaging guru John Etgen pointed out that the industry has been digital since the 1950s, with seismic as the original ‘digital business.’ Today, BP views digital as a way to stay competitive. Digital barrels are the cheapest in the portfolio, even in a world of relatively abundant resources and production. Digital has proved its worth in interpretation and in production optimization. In the future, ‘machine-driven’ solutions will enable the ‘Connected Upstream®.’ Breum asked how many of BP’s 5,000 employees were involved in the digitalization effort. In fact, there are only some 50 ‘hard core computer nerds’ but another 2,500 engineers are involved.

Schlumberger’s Ashok Belani gave a less nuanced, full-Monty sales pitch for the digital capabilities that ‘allow us to disrupt innovation and business processes.’ Digitalization of E&P is ‘substantially different’ to previous digital work. ‘We should work with all North Sea data 100% of time but we don’t.’ ‘We should work with terabytes all of the time not just today’s multi gigabytes.’ Data is currently loaded into point applications. This is not how things will work in the future. In the future, all data will be available to all apps all of the time. Data will be exposed to machine learning and apps will become ‘interfaces to humans.’ This was demonstrated with ‘AI for tectonics’ automatic fault picking and top salt interpretation. This is ‘not rocket science, all are doing this.’

Chief geophysicist Darryl Harris stated that Woodside is reaping large benefits from data science. Data driven decision making means avoiding pre-conceived ideas and misconceptions. Woodside has hired lots of young data scientists to whom you can ‘ask any question.’ The idea of ‘citizen data scientists’ is also taking hold.

Repsol’s Francisco Ortigosa is also keen on the democratization of computing. There is currently too much software for seismic imaging. Repsol alone had some 27 apps developed for geophysical high performance computing – which were only used by 12 people. Today, these have been combined and made available in the cloud to Repsol’s 500-strong team of geoscience professionals. This is bundled in a joint venture with Microsoft as Inventemos el futuro (‘let’s invent the future’), a democratization of geophysics with ‘fully data-driven automatic seismic interpretation.’ The cloud-based tool has ‘picked 90 horizons in under 5 minutes’ sans human intervention. Repsol has also moved all of its data to Microsoft’s Azure Dublin data center, which adds Microsoft security to its own infrastructure security.

Total SVP Michael Borrel stressed the importance of the EAGE as a key event for Total, having just completed the acquisition of Maersk. Total is currently spending around €300 million on upstream R&D and some €30 million on digital. For Total, digital transformation is about ‘iPhones, Windows phones (sic)…’ and is ‘simply an enabler of more efficient, safer operations and more profit’. A topsides data lake is ‘changing the way we work.’ Other initiatives of note include Maersk drilling’s teaming with IBM and a collaboration with Google in the cloud. Borrel cited the oft-repeated notion that current work practices break down as 85% repetitive tasks and 15% creative tasks. The ‘hope and expectation is that we can reverse this and spend 85% on creativity*.’

John Etgen tempered the enthusiasm for bringing the big IT organizations on board. These are not the only resources, even though there may be some intersections. If you focus too much on the Microsoft/Google ecosystem you are likely to miss much of the potential. BP’s digital business and venture capital arm try to look beyond the public cloud/IT providers.

Belani countered that soon most compute infrastructure will be in the cloud, so the faster you move, the better off you are. Infrastructure is a thing of the past; the way forward is a technology stack that combines many ecosystems and lets ‘upwards of 100 companies’ work together. But Belani was only just getting going. ‘Oil and gas will NEVER lead in the digital world. There is NOTHING, NOTHING in oil and gas that is utilizing the cloud fully today. There is no question as to whether we are leading, we are not even using it! There is all this unused capability, get on with it. If we don’t adopt it as a priority we won’t get people with the right capability.’

Etgen observed that the days of the traditional oil and gas company are numbered. ‘We produce energy now, we don’t need to lead in machine intelligence when we can harvest the technology’.

On the topic of security, Etgen observed that many large enterprises have been hacked. Nobody can claim to be 100% safe. Security is not really a cloud issue. Etgen added that he did not believe that the transition to the cloud would go as fast as some think/advise. No one will deploy HPC exclusively in-house either. There will be a shift to other near-shop suppliers and the cloud. You should think of the computer as a printer or a toaster; sometimes in-house is better. For service providers the cloud is OK, but for specialized R&D, in-house is better.

Harris opined that the Googles and Amazons have been working on security for longer than we have. It is arrogant to think that we could do better. Belani agreed, it’s their profession to do this. Data is safer in the cloud than inside an IOC, let alone smaller companies. Gmail is one of the safest platforms around. Borrel complained that inside Total ‘We are not allowed to use WhatsApp, although it is the most secure system in the world!’ Etgen added that security can be made a non-issue. The real problem is getting the right commercial terms and the fact that many have vested interests in perpetuating the status quo. Harris agreed that cost is a major issue, ‘it can be cheaper to do stuff in house than in the cloud’. Ortigosa disagreed, ‘the cloud is much, much cheaper.’ Belani allowed that an FWI application may not be available in the cloud, but for a 1,000 CPU cluster, there is no comparison. Etgen looked on skeptically.

Breum turned the debate around to the question of what the role of the geoscientist would be in a digital future, with an entreaty to ‘answer with all sincerity!’ Borrel answered that, in the near and medium term, computing will allow geoscientists to create more value by taking out the grunt work. He was less sure about longer term prospects for geophysicists. Harris noted that good geoscience is coming up with hypotheses and testing them. What happens when the computer generates the hypotheses? This is maybe some way down the road. Belani stated that new reserves will be found by individuals, not by machines. Machines will make the geoscientist’s life more interesting. Automation leaves more time for judgement. Is geoscience going to go away? No! ‘I don’t know why people think like this!’ For Etgen, the ‘bandwidth’ of the human eye-brain system is hard to beat, ‘especially for driving insights.’

Breum asked what sort of skills are required today. Etgen observed that jobs have always changed. Today, we still need people trained ‘classically’ but also folks trained in data science. In the field of seismic imaging, the trend towards hyper-specialization needs to stop, although ‘there is no easy answer to this one’.

EAGE Forum on Integration for a more efficient industry.

Howard Leach related how BP has swung back and forth over the years from an ‘asset focused’ organization circa 2000, when geoscience and drilling were co-located in an asset team. A decade or so later, BP ‘re-balanced’ with a more function-based organization, centralizing processes for risk management. This then exposed the ‘challenge of cross-function integration’ and too much multi-tasking, with ‘10 balls on the football pitch’. Individuals experienced ‘contextual overload’, working on one problem for an hour, then switching to another, reading email and so forth. BP also noted that in a data room, people work well together outside of standard processes. The company is now trying to leverage this finding with ‘LEAN’ processes with the objective of delivering a specific product, cutting out the multi-tasking: an agile approach that delivers a ‘product’ inside a week.

Tim Dodson (Statoil/Equinor) observed that in the last decade the context, and expectations, have changed. Companies are at a crossroads of both an energy transition and a digital transformation. Regarding digital, Equinor’s stance has changed since last year. Today, Google and others have overtaken oil and gas, which has lost its lead in big data. Equinor has engaged in self-examination to identify skill gaps. While there are ‘no obvious gaps’, there is a need for people who can take on an integrator role across project, subsurface risk, politics and tax.

For Eni, as Luca Bertelli explained, integration has been a multi-year journey that has resulted in a ‘design to cost’ exploration model, using capital effectively through short cycles. This aims to compress turnaround time to two years from discovery to FID and another 2.5 years to startup. Eni has moved from a sequential to a parallel approach, with early project screening by a multi-disciplinary task force and the application of ‘high performance computing!’ Conventional exploration is now conducted with an ‘unconventional-like’ approach to achieve early cash flow. Digitalization is an enabler of upstream integration and Eni now ‘accepts additional risk mitigated by accelerated digitalization’. At Eni, a big 3D seismic imaging project is now done in five days, ‘it used to take 10 months’.

Rune Olav Pedersen (PGS) presented a less rosy picture from the standpoint of a seismic contractor that has seen a 60% drop in revenue and let go 50% of its staff. The company has moved to a ‘project-oriented’ organization, getting people to work differently, moving people around and training leaders. The company is also working on new stuff: marine vibrators, machine learning and big data, through JIPs, consortia and technology collaboration agreements. Here, there are challenges with IP, ownership and commercial models. Pedersen touched on a sore point re HSEE: ‘You all have your own safety standards and audits and you all audit the same thing! This is time-consuming for you and costly for us and it does not advance safety. Operators should share safety audit standards!’

Marc Gerrits, Shell global EVP, sees data as the fuel of ‘new and disruptive technology’ with new players, new platforms and ‘unlikely’ partnerships. However, while the value of big data is a given, it is not clear which collaboration model will prevail. When will we share, and when will we treat data as providing a competitive advantage? How do we mitigate the risk of being digitally disintermediated? Gerrits agreed with Pedersen that HSEE was one of many examples of gross inefficiency. We need to standardize the safety audit and produce a win-win. We need our existing and new partners; ‘nobody has monopoly of new ideas and best practices’. Shell no longer dictates ‘this way you do it’ to contractors, but now asks ‘what do you think? Can this be improved or done more cheaply?’

Data Standards @ Total.

Data standards underpin Total’s multiple partnerships and ‘extended enterprise’. Total’s Pierrick Gaudin observed that e-standards are a must-have for the business, although awareness has been low in the past. Total is working to rectify this situation with a firm commitment from management and the publication of company rules relating to e-standards. Total also now has a transverse organization to handle its data strategy and every new project embeds data management.

Ross Philo thanked Total as a ‘staunch supporter’ of Energistics and a lead developer of the standards portfolio. Today Witsml, Prodml and Resqml all run atop the common technical architecture and support the combination of data in cross-functional workflows. Today there is a lot of hype regarding big data and AI. Some imagine that ‘somehow data will be magically transformed and that standards don’t matter anymore’. Nothing could be further from the truth. Users, whether humans or machines, must have trust in their data. The ‘best analytics are worth nothing with unqualified data’. Data standards are the key to a successful digital transformation. Here Resqml is envisaged as the focus of 3D gridding, static and dynamic reservoir analysis and geology. All Energistics standards now include a data assurance object and activity log so that users can track trust and establish confidence levels in their data.

Francis Morandini provided an update on the use of Resqml in Total’s in-house developed Sismage-CIG integrated platform for geoscience. Sismage now does data management, interpretation, modeling and field monitoring. Such in-house developed software allows for rapid deployment of novelties and fixes from user requests. Total likes to cherry-pick its interpretation tools. This is possible with Resqml’s open data model and other open source solutions. Most data types (including Petrel) can be exchanged and Resqml can easily be extended to other data types and data models using the open source FESAPI. Resqml is now used by large and small companies to develop in weeks what took months before. Total is seeking partnerships around the use and development of the technology.

Gaudin presented Total’s use of Witsml to stream data from remote drilling locations into its real time operations center in Pau, France. Total has its own Witsml database and a certification module for Witsml data streams. The RTOC is to be extended with machine learning and analytics as a component of Total’s ‘Ambition2025’ program. But there is still a need for data evangelization, both top-down from management and bottom-up from users. Questioned (by Oil IT Journal) on Resqml persistence (Energistics has traditionally focused on ‘data in motion’), Gaudin outlined a joint venture with Emerson/Paradigm on OpenDB, a 1:1 mapping of Resqml into an HDF5 data store. Currently ETP is ‘not too cloud-compatible’, hence the idea of using Resqml for micro-messages to and from the cloud.
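As a back-of-the-envelope illustration of the OpenDB idea (the actual Emerson/Paradigm schema is not public), Resqml-style persistence pairs XML metadata with bulk arrays in an HDF5 store. The Python sketch below, using h5py, shows how a gridded property and its units might be written and read back; all group, dataset and attribute names are hypothetical.

```python
# Minimal sketch of Resqml-style HDF5 persistence (NOT the OpenDB schema):
# bulk arrays live in HDF5 groups keyed by object UUID, with metadata that
# Resqml would normally carry in its XML part attached as HDF5 attributes.
import uuid
import numpy as np
import h5py

grid_uuid = str(uuid.uuid4())
porosity = np.random.rand(50, 50, 20).astype("float32")  # ni x nj x nk cells

with h5py.File("opendb_sketch.h5", "w") as f:
    grp = f.create_group(f"RESQML/{grid_uuid}")            # hypothetical layout
    dset = grp.create_dataset("porosity", data=porosity, compression="gzip")
    dset.attrs["uom"] = "fraction"                          # unit of measure
    dset.attrs["property_kind"] = "porosity"

with h5py.File("opendb_sketch.h5", "r") as f:
    dset = f[f"RESQML/{grid_uuid}/porosity"]
    print(dset.shape, dset.attrs["uom"])                    # (50, 50, 20) fraction
```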

EAGE member meeting.

The uptick in Argentinian shale gas activity forced EAGE president-elect Juan Soldo (YPF) to leave the EAGE board. Current president Jean-Jacques Biteau (Total) is to stay on another year. Biteau recalled the objectives of the EAGE’s strategy for the current year as globalization, membership and a ‘one stop shop’ for E&P knowledge and community. He also announced a new joint venture with the Petroleum Exploration Society of Great Britain, a machine learning workshop to be held in London in November 2018. The EAGE has now deployed Centium Software’s EventsAIR to manage its publishing and events. Treasurer Evert Muijzert reported ‘considerable losses’ at the holding company. Write-offs are being negotiated with the auditor in respect of opex, cancelled workshops and IT. A reduction in journal income and a ‘considerable negative’ impact from the EAGE’s student activities mean that the EAGE is now spending its (considerable) cash reserves. The balance sheet was down €2.4 million in 2018.

AI in interpretation.

Paradigm has been using back propagating neural nets for facies classification since 1991. Paradigm has now made Total’s multi-resolution graph-based clustering (MRGC) algorithm commercially available as ‘Facimage’. MRGC e-log facies classification was introduced (and patented) in Geolog Facimage in 2000. In 2016, SeisEarth introduced facies prediction with ‘democratic neural nets.’ Stratimagic included an unsupervised neural net for seismic facies classification leveraging patented technology from Total’s seismological guru Noomane Keskes. Current automated interpretation is usually done on post-stack attributes. Paradigm is now trialing convolutional neural nets for pre-stack data clean-up and automatic interpretation. Pre-stack EarthStudy 360 local angle domain directivity gathers are used as input to the classifier. Principal component analysis was performed on 7,000 1x1 km images (using SqueezeNet). The approach improves shallow fault imaging compared with diffraction imaging and provides a clearer, automatically generated seismic volume.
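By way of illustration of the SqueezeNet-plus-PCA step (Paradigm’s actual pipeline has not been published), the sketch below uses a pretrained SqueezeNet as a fixed feature extractor and reduces the pooled embeddings with principal component analysis. Batch size, image size and component count are placeholders, not Paradigm’s configuration.

```python
# Hypothetical sketch: SqueezeNet embeddings of seismic image tiles, reduced
# with principal component analysis for subsequent clustering/QC.
import torch
import torchvision.models as models
from sklearn.decomposition import PCA

squeezenet = models.squeezenet1_1(weights="IMAGENET1K_V1").eval()

# Stand-in batch for the 1x1 km image tiles, resized to SqueezeNet's 3x224x224
tiles = torch.rand(64, 3, 224, 224)

with torch.no_grad():
    feats = squeezenet.features(tiles)        # (64, 512, 13, 13) feature maps
    feats = feats.mean(dim=(2, 3)).numpy()    # global average pool -> (64, 512)

pca = PCA(n_components=10)
reduced = pca.fit_transform(feats)            # (64, 10), ready for clustering
print(reduced.shape, pca.explained_variance_ratio_[:3])
```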

Antoine Guitton (DUG) gave a presentation on deep learning in fault detection to a packed room. Deep learning, a class of neural nets, has been used for years in the industry. But now the technique can be extended to maybe hundreds of layers thanks to freely available software and powerful hardware. Enter the Google Inception architecture, deep learning for pattern recognition and classification. This is used to produce likelihood maps of faults. Will it be useful? Is machine learning invading our space? Will you keep your job? Wherever you have a filter, machine learning will help. ‘Low skill’ tasks like picking top salt are at risk. In fact most ‘picking’ will go away (hooray!) as the workflow is ‘parameterized’. ML will get you to an 80% solution right away. Will it replace everything? The answer is yes for IT-related functionality, no if a technique embeds physics, such as wave equation work where the ‘relationship between data and outcome is hard to comprehend’. Tools of the AI seismic trade include Dave Hale’s Java toolkit and labelled training data, ResNet-50 and softmax classification. The technique gives ‘pretty reasonable’ fault extraction and is ‘good with good data’.
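A minimal sketch of the patch-classification approach Guitton describes, assuming a ResNet-50 backbone with a two-class softmax head; sliding such a classifier over a seismic volume yields a fault likelihood map. Patch size and the untrained weights are illustrative assumptions.

```python
# Hypothetical fault/no-fault patch classifier: ResNet-50 + 2-way softmax.
import torch
import torch.nn as nn
import torchvision.models as models

model = models.resnet50(weights=None)            # fine-tune on labelled patches
model.fc = nn.Linear(model.fc.in_features, 2)    # two classes: fault / no-fault

patches = torch.rand(32, 3, 224, 224)            # stand-in seismic patches
logits = model(patches)
fault_likelihood = torch.softmax(logits, dim=1)[:, 1]   # P(fault) per patch
print(fault_likelihood.shape)                    # torch.Size([32])
```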

Agile Scientific Hackathon.

We visited Matt Hall’s (Agile Scientific) popular machine learning hackathon and saw a compelling demonstration of a Geoteric-like application developed in the two-day Python hack. Read Hall’s blog on the event here.

Petrel celebrates its 20th birthday.

Martyn Beardsell presented a potted history of Petrel, celebrating its 20th birthday this year. Petrel was born in somewhat mysterious circumstances when RMS developer Nils Fremming ventured to port RMS to the PC. This proved challenging at the time of (relatively) big iron hardware from SGI and others; ‘few believed it would be possible.’ Working from his garage, Fremming produced Petrel V1.0 in 1998. It was unveiled at Petex that year with a snappy ‘3D 4U ON PC’ tag line. (Note that Oil IT Journal, then Petroleum Data Manager, reported that ‘Shell liked Petrel’s corner point gridding and bought the first license’.)

Meanwhile Geoquest was working to port Geoframe to the PC as ‘iGeoFrame.’ By 2002, as Technoguide built out the subsurface workflow, it was ‘struggling’ with the geomodel. At which point, Schlumberger bought the company, grew the Petrel team and added its own technology (CPS3, FrontSim) and later some of the Eclipse code base ‘completing the seismic to simulation dream.’ There are now ‘hundreds’ of developers working around the world on Petrel whose success is down to ‘good software, openness, customer involvement and quality.’ The last item was met with a few giggles which Beardsell acknowledged with, ‘OK but Petrel is now a very stable product’. The next EAGE will be held in London from 3-6 June 2019.


Folks, facts, orgs...

Agile, Airborne Oil & Gas, API, Argus, AspenTech, Associated Resources, Atlas, Atwell, Aveva, BCCK Holding, Bluware, Diamond Offshore, US DoE, Flotek, Fortis, Cfihos, Hexagon, Higher Landing, Hunting, HydraWell, IFPen, Indegy, WTX Pumping, ITT, Marathon Oil, OGC, OPC Foundation, P2 Energy Solutions, Petrofac, CGG, PRCI, Quorum Software, SAP Ariba, Schneider Electric, SEG, Siemens, SkySpecs, Solaris, Spectris, Sure Shot, Target, Total, Tata, W&T Offshore, Wellsite, Texas A&M.

Robert Leckenby is now geoscientist at Agile.

Oliver Kassam is the new CEO at Airborne Oil & Gas. He hails from SBM Offshore.

The American Petroleum Institute has appointed Amanda Eversole as COO. She was formerly with JPMorgan Chase. Ben Marter joined as director of communications. He was previously communications director for Senator Dick Durbin, D-Ill.

Argus has named Henri de Castries to its board.

Gary Weiss has joined Aspen Technology as COO. He hails from OpenText.

Mike Gibson heads up Associated Resources’ new Houston office.

Atlas has appointed Mark Kryska as VP technology and Rocio Cabrera as director of procurement.

US Marine Corps veteran Rod Townsend is now director of Operations, Oil & Gas at Atwell.

Craig Hayman is now CEO at Aveva. He hails from PTC.

Bob Swann is director of project management and controls at BCCK Holding. The company also announced the opening of a new office in The Woodlands.

Dan Piette is now CEO at Bluware. EV Private Equity’s Per Arne Jensen is chairman of the board.

Scott Kornblau has been appointed Senior VP and CFO at Diamond Offshore.

Lou Hrkman is now deputy assistant secretary for clean coal and carbon management at the US Department of Energy’s Office of Fossil Energy.

Kate Richard, founder and CEO of Warwick Energy Group, has been appointed to the Flotek Industries board.

Joyce Ryel has joined Fortis Energy Services as VP HSEQ. She was previously with Superior Energy Services.

Hanwha Total has joined the Cfihos project. The company will be represented by Uk Chae who developed the company’s new engineering data management process and structures.

Hans Vestberg stepped down as Hexagon’s vice chairman.

Higher Landing has introduced a ‘Shuttle Program’, a career transformation program funded by the Alberta and Federal governments.

Hunting PLC has launched a TEK-HUB at its Badentoy, Aberdeenshire facility. The HUB aims to attract new companies and develop technological partnerships.

Tom Leeson is Chief Commercial Officer at HydraWell. He was previously Interim CEO of Decom North Sea.

IFP Energies Nouvelles has appointed Yves Boscher as director of security management, succeeding retiree Thierry Chappat. Véronique Ruffier-Meray is to replace Yves Boscher as director of HR. Thierry Bécue is director of physical chemistry and applied mechanics research.

Indegy has named Joe Scotto CMO and Todd Warwick VP Sales, Americas.

WTX Pumping Services’ Don Sinclair is now a member of Intrepid’s advisory board.

Luca Savi, EVP and COO at ITT, is to succeed retiree Denise Ramos as CEO and president.

Zach Dailey has been promoted to advisor to Marathon Oil CEO Lee Tillman. Guy Baber is now VP of investor relations.

Open Geospatial Consortium president and CEO Mark Reichardt is transitioning to part-time work. A search for his successor is underway. Joshua Lieberman is director of the innovation program.

SAP’s Veronika Schmid-Lutz has been elected chair of the OPC Foundation’s board.

Dale McMullin is now CTO at P2 Energy Solutions. He hails from Halliburton.

Fady Sleiman is global chief digital officer at Petrofac. He was previously with Waha Capital.

CGG has named Yuri Baidoukov as Group CFO. He was recently with OilServ.

The Pipeline Research Council International (PRCI) has appointed Gary Hines as VP operations. He hails from the Southern Gas Association.

Quorum Software is to open a new regional office in Denver, Colorado for sales, services, and account management.

SAP Ariba has launched datacenters co-located within SAP facilities in the United Arab Emirates and Saudi Arabia to offer a comprehensive range of cloud-based solutions to streamline companies’ entire source-to-settle process.

Schneider Electric is to relocate its current operations to a newly constructed, larger space in Edmonton. The new facility includes a Smart Factory environment.

Richard Miller is president-elect of the Society of Exploration Geophysicists.

Martina Maier is now head of global compliance at Siemens AG following Klaus Moosmayer’s resignation.

Franz LaZerte is account manager at drone specialist SkySpecs’ new EU office in Amsterdam.

Solaris has named Ray Walker as new independent director to its board.

Andrew Heath is to succeed retiree John O’Higgins as CEO of Spectris. He was formerly CEO at Imagination Technologies Group.

Sure Shot Drilling has named Michael Walker as CFO and Debbie Perry as Controller.

Swagelok has promoted Theresa Polachek to VP corporate communications and Joey Arnold to VP continuous improvement and quality.

Target Energy Solutions is opening an office in Houston. Recruitment of sales, business development and technical staff is in progress.

Total and Tata Consultancy Services have partnered to create a digital innovation center in India headed by Sylvain Comiti, Total VP Refinery 4.0, to explore disruptive technologies and solutions.

Danny Gibbons, senior VP and CFO, is to retire from W&T Offshore. Janet Yang has been appointed acting CFO.

Jessica Roger has been promoted to VP of administration at Wellsite.

Deaths

Sam Mannan, regents professor of chemical engineering at Texas A&M University and founding director of the Mary Kay O’Connor Process Safety Center has died.


Done deals ...

3esi-Enersight, Aucerna, Bluware, Voltaware, Total Energy Ventures, NIO Capital, WayKonect, Suse, GE, Dresser Natural Gas, Baker Hughes, Indegy, PDI Software, Siemens, Mendix, SitePro, Schlumberger, Shearwater GeoServices, SSP Innovations, WolfePak Software.

Following the acquisition of Palantir Solutions by 3esi-Enersight, the company is to change its name to Aucerna. Palantir is 3esi-Enersight’s sixth acquisition in four years; the company now serves over 500 customers from 12 locations.

Bluware has opened a funding round led by EV Private Equity and Shell Ventures.

BP Ventures has invested £1.5 million in smart meter provider Voltaware as a part of its alternative energy strategy for low-carbon power, storage and digital energy. Voltaware allows businesses to track their energy demand in detail, down to individual appliances.

Total Energy Ventures and NIO Capital have signed an agreement to cooperate and invest in the Chinese mobility sector, notably in areas such as electric, self-driving and connected vehicles and other mobility services. Earlier this year Total became a founding partner of the Cathay Smart Energy Fund, which focuses on the new energy sector in China. Last year NIO’s EP9 set the lap record for an electric vehicle at Nürburgring’s Nordschleife. Watch the video.

Total has also acquired French startup WayKonect, a provider of B2B connected vehicle solutions for enterprise fleet management. The acquisition follows Total’s 2017 launch of its fleet card service GR Analytics. WayKonect provides secure access to vehicle data on maintenance, odometer readings, fuel consumption and driver behavior. WayKonect is based in Lille, France.

Private equity fund EQT VIII is to acquire Linux boutique Suse from Micro Focus for $2.535 billion.

First Reserve, a private equity investment firm, is to purchase Dresser Natural Gas Solutions from Baker Hughes GE. First Reserve previously owned the business for approximately 10 years when it was part of Dresser, Inc., a former First Reserve portfolio company which was sold to GE in 2011.

In a separate announcement, GE stated that it is to ‘fully separate’ from Baker Hughes, over the next two to three years. GE holds a 62.5% stake in BHGE currently valued at $23 billion.

Industrial cyber security specialist Indegy has raised $18 million in a Series B financing round led by Liberty Technology Venture Capital with participation from Centrica and others. The funds will be used to accelerate growth and expand global go-to-market initiatives.

PDI Software has acquired Inform Information Systems, aka FuelsPricing. The deal expands PDI’s offerings to include pricing solutions for the global retail and wholesale petroleum supply chain.

Siemens is restructuring into three operating companies, ‘Gas and Power’, ‘Smart Infrastructure and Digital Industries’ and ‘Strategic Companies’. Siemens has also acquired Mendix, a ‘low-code’ application developer, in a €600 million deal that is to accelerate adoption of MindSphere.

Water management software boutique SitePro has closed a financing round led by Cottonwood Venture Partners. SitePro’s cloud- and edge-based software comprises an end-to-end solution for water management, digitalizing and automating water infrastructure in the oil and gas industry. Financial terms were not disclosed.

Schlumberger has sold the marine seismic acquisition assets and operations of WesternGeco to Shearwater GeoServices. Schlumberger will receive cash consideration based on an enterprise value of $600 million, a 15% post-closing equity interest in Shearwater plus payments under an earn-out agreement linked to future vessel usage. An additional $50 million cash will be injected into Shearwater as working capital, bringing the total transaction to $650 million, to be funded by $325 million in equity and $325 million in debt financing.

SSP Innovations has acquired TC Technology, a provider of mobile solutions and services for the utility, telecom, and pipeline industries. TCT’s MIMS (mobile information management system) will extend SSP’s application suite as it ‘evolves into the era of the Esri utility network and mobile ArcGIS technology.’

WolfePak Software has acquired Conquest RT, developer of a mobile app for e-ticketing and reporting on the transport of crude oil and water. The acquisition merges WolfePak’s oil country ERP solutions with Conquest’s business operations in a ‘fully integrated’ solution for the collection, management and automation of oilfield tickets. The expanded business will serve over 1,500 oil and gas upstream and midstream companies across the US.


More improbable announcements on the blockchain front!

Diamond Offshore launches blockchain drilling service

Data Gumbo has provided Diamond Offshore with an industrial blockchain-as-a-service solution that underpins Diamond’s ‘first of a kind’ blockchain drilling platform. The blockchain drilling service provides an immutable record of transactions relating to well construction activities, including drilling services, materials management and the supply chain. Users can access and analyze performance from any web-accessible device for ‘near real-time total cost of ownership management’.

Configurable modules can be adapted to the platform for individual on- and offshore well or multi-well campaigns. The platform tracks transactions from the procurement stage through construction, completion and production, ‘reducing spend, eliminating waste and helping to deliver a well successfully.’ Diamond is to implement the service fleet-wide to create the industry’s first ‘Blockchain Ready Rig’ fleet.
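For readers wondering what the ‘immutable record’ claim amounts to, the toy Python hash chain below (emphatically not Data Gumbo’s implementation) shows the basic property: each block’s hash covers its predecessor’s, so tampering with any earlier transaction invalidates everything downstream.

```python
# Toy hash chain illustrating tamper-evidence; the transactions are invented.
import hashlib
import json

def block_hash(prev_hash: str, payload: dict) -> str:
    record = json.dumps({"prev": prev_hash, "payload": payload}, sort_keys=True)
    return hashlib.sha256(record.encode()).hexdigest()

chain, prev = [], "0" * 64  # genesis
for tx in [{"service": "drilling", "usd": 120000},
           {"service": "mud logging", "usd": 8500}]:
    prev = block_hash(prev, tx)
    chain.append({"hash": prev, "payload": tx})

# Tamper with the first transaction: its stored hash no longer verifies,
# and every later block's 'prev' link is broken too.
chain[0]["payload"]["usd"] = 1
print("chain intact:",
      block_hash("0" * 64, chain[0]["payload"]) == chain[0]["hash"])  # False
```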

Big Four team on Asian blockchain banking

The world’s four largest auditors – Deloitte, Ernst & Young, KPMG and PwC – are to add digital technologies, ‘from blockchain to AI and big data’, to XBRL’s computer-readable reporting format and thus ‘transform global business and the financial world’. The initial lucky testers of the transformational technology are a group of 20 Taiwanese banks that are to test blockchain technology to confirm transactions. The pilot will replace manual audit confirmation of transactions with a blockchain that will be accessible by the audit firms. More from Regulation Asia.

EY, blockchain promises auditing without auditors!

Speaking at the GBC IIoT and Digital Transformation in Oil & Gas conference earlier this year (see elsewhere in this issue), Artiom Kozlovski (EY) described blockchain as the most interesting app in oil and gas. Blockchain offers bullet proof protection against tampering, trading without traders and auditing without auditors*. Blockchain can (could?) be used to trace tubulars and other equipment without needing to know everyone on the network. A closed blockchain-based platform promises trading ‘validated by the system.’ The main current problem is scalability.

* Coming from EY, this is beyond improbable!

But not all agree (letter to the editor)

A reader responds to our skeptical editorial ‘Blockchain is bullshit’.

Dear Neil,

I would just like to take a moment to congratulate you on your latest editorial (‘Blockchain is Bullshit’). It takes courage to point out the emperor’s new clothes, when every second piece of IT marketing & research seems to be eulogizing this technology. I have been mildly concerned, for the last year or so, that I just can’t see the revolutionary potential for upstream and have only of late started to think, maybe it’s not me, maybe it’s them. I recently had a conversation with an old mate of mine, now CTO at a major international bank and asked him to explain the benefits of blockchain to me. His response? ‘Can’t see any – we only do transactions with entities we trust already and we don’t need that extra layer of assurance. It’s also already pretty efficient, so where’s the benefit?’ So, we might be proven wrong in time & I suspect there will be some niche uses once the hoopla settles down, but in the meantime, well said!

David Edwards, Group IS Manager, Premier Oil.


Safety first

DNV GL - Can we trust AI in safety critical applications?

DNV GL asks ‘can we trust artificial intelligence to keep the oil and gas industry safe?’ and answers its own question (with a ‘no’) in a position paper, ‘AI + Safety’. The paper’s thesis is that ‘as AI systems begin to control safety-critical infrastructure, the need to ensure safe use of AI in systems has become a top priority.’ The paper asserts that data-driven models alone may not be sufficient to ensure safety. DNV GL is therefore ‘calling for a combination of data-driven and causal models to mitigate risk.’ At issue is the relatively sparse training data that is available from the (fortunately) rare safety incidents. ‘AI and machine-learning algorithms that rely on data-driven models to predict and act upon future scenarios may not be sufficient then to assure safe operations and protect lives.’ Notwithstanding such misgivings, DNV GL has teamed with Equinor, Kongsberg Group and Telenor to establish the Norwegian Open AI Lab to ‘improve the quality of research, education and innovation in AI, machine learning and big data.’

IOGP on safe driving

It is old news but worth repeating every now and again: ‘risk’ to life and limb in the oil and gas industry does not necessarily come from the most obvious sources. Workers may spend their day in a high-risk environment, but it is often the drive home that kills. The International Association of Oil & Gas Producers has determined that ‘driving-related incidents’ are a significant cause of fatalities in upstream operations. Transport-related on-the-job fatalities in the US in 2016 numbered more than 2,000.

Schlumberger’s mobile app for safe driving!

Schlumberger recently announced measures to improve employee safety with regular fit-for-purpose driver training, including the use of driving simulators and driver-improvement monitors to provide real-time in-vehicle feedback on driving performance. Global journey management centers support drivers during each journey and reinforce safe driving behaviors. In-vehicle technology enables the centers to monitor driving behavior in real-time and provide immediate feedback on driver performance. The company has also developed a mobile app for drivers.

TekSolve ‘drivers four times more likely to crash using cell phone’.

If you are concerned about your employees fiddling with a mobile app while driving, TekSolve may be able to help. The company reports that drivers are four times more likely to crash when using a cell phone while driving. TekSolve and the American Allied Safety Council are offering advice on creating a company cell phone policy.

Hexagon/Guardvant detects and responds to driver fatigue.

Hexagon’s recently acquired Guardvant/OpGuard solution detects and responds to driver fatigue and distraction, and adds collision avoidance and proximity detection to give drivers 360° situational awareness. The solution targets industrial worksites, trucking and hauling, and aviation.

CSB reports on US gas well blow-out.

The US Chemical Safety Board (CSB) is investigating the January 22, 2018, blowout and fire at the Pryor Trust gas well in Pittsburg County, OK that killed five workers. A report and video reconstruction of the incident is available on the CSB website.

NAP free e-book on designing safety regulations for high hazard industries.

The US National Academies Press has just published a free e-book on ‘Designing Safety Regulations for High-Hazard Industries’. Arthur Meyer (Enbridge Pipelines) was a co-author of the 165-page investigation of regulations covering, inter alia, the catastrophic risks of pipeline failure. A timely study in view of the September 2018 explosion and fire on Columbia Gas’ ‘century-old’ gas pipeline in Massachusetts, the worst incident in a decade.


Global Business Conferences 2018 IIoT & Digital Solutions for Oil & Gas, Amsterdam

Accenture on ‘compressive disruption’, the ‘wise pivot’ and fighting the corporate ‘immune system’.

Why is everyone talking about digital transformation and disruption today? Because a) the cost of technology is coming down and b) there is a ‘convergence’ of web technology, big data, AI and the internet of things (IoT). Oil and gas is also experiencing ‘compressive disruption’, with pressure on margins coming from a ‘new energy scenario’ of shale, renewables and ‘demand compression’. The answer is to keep shareholders happy with continuous improvement, to ‘wise pivot’ into future energy transition scenarios and to ‘grow the core’ with predictive analytics, drones and cross-industry consortia (e.g. blockchain in trading). Examples are Shell’s acquisition of NewMotion, an e-vehicle charging stations business, BP’s acquisition of NASA AI spin-out Beyond Limits, and ‘data monetization’, a key topic in oilfield services. The digital transformation needs support from the top and a courageous CEO. Accenture’s Leideman likens the role of leadership to ‘fighting the immune system’: ‘when you try something new, incumbents will fight you hard.’ Digital transformation is not just about technology; it needs cultural change and management thereof. On the topic of the proof of concept (PoC), Leideman was nuanced. The PoC is not wrong but it needs to be designed with scaling in mind. Many PoCs stall after their initial success. This has been observed in predictive maintenance, where scaling is the biggest issue, as is resistance to change. ‘Everyone knows how stubborn the community is, every asset is different, scaling up is very hard. I see a lot of solutions looking for a business problem to solve’.

Advisian - more failures than success in digital transformation.

Bradley Andrews, president of WorleyParsons’ Advisian unit, has been working on digital transformation internally as well as for clients. In one example, a new program was rolled out to address widespread use of Microsoft Excel. This failed because it made five people’s jobs easier and 5,000 people’s jobs harder – and they still use Excel! So what problem is digital transformation trying to solve? Fully-formed technology brought in from other industries is unlikely to be fit for (our) purpose. Advisian interviewed 500 individuals re blockchain, AI, VR and so on. All agree on the intent and speed of the transformation but not on the pathway to success. In fact, there are more failures than successes and there is a crisis of confidence in leadership. Another issue is regulatory risk around emerging technology, which is slowing adoption. You need to question your beliefs and the hype! ‘Anxiety engenders creativity’.

Digital transformation in Total. Are the GAFAs friend or foe?

For Gilles Cochevelou, digital transformation in Total means working hand in hand with the CIO, ‘this is not shadow IT.’ The transformation spans social networking, mobile and ATAWAD (any time, anywhere, any device). The magic word, ‘platform’, is close to evoking a natural monopoly. Are the GAFAs friend or foe? Digital is a great opportunity to break the silos. Cochevelou cited Total’s global roll-out of Microsoft Office365 with its Teams functionality (a counterpart to Slack). While this was not an IT program, working with the CIO was critical. Elsewhere Total has many digital initiatives: Booster, Total Energy Ventures, Incubator 4.0 and hackathons. Training and digital ‘acculturation’ are achieved with a digital passport, MOOCs, reverse mentoring, a data science challenge and digital bootcamps. Subsurface represents a big digital-native community that was doing data science before the term was coined. A joint venture with Google has seen ten Total employees move to Silicon Valley to partner on image processing and semantic analysis. Total has several drone flying teams using, inter alia, SciAero’s CyberQuad. Another trial, with Swiss company Flyability, uses a caged drone to inspect the inside of a refinery distillation column. Predictive maintenance of rotating equipment is a field where many vendors have a ‘magic’ solution - on slideware at least! But Total wants to keep control of its data. Alongside its 20 Smart Room collaborative environments, Total has developed a digital twin of a refinery, the Quantum digital twin/virtual plant. This 3D model acts as a single source of truth throughout the asset’s lifespan. Quantum represents a shift from PDFs to an object/tag-centric approach. This is key to Total’s relationships with vendors and is ‘at the heart of the digital transformation’. In retail, Total is working on an e-wallet in a joint venture with SIXT and BMW’s DriveNow unit. Now ‘the car pays for its own gas!’ Total is also working on new sales channels, selling electricity to end users, and on its Fioulmarket heating oil sales portal. This means learning new search engine optimization skills and ‘learning to play chess with Google as it changes the rules’.

Microsoft – Azure ML predicts coke explosions.

Microsoft’s Tibor Bacsó opined that although many speak of digital transformation, few (like Total) are actually doing it. Clients question its applicability to a refinery, raising issues such as data quality, the investment involved and the resources needed. For Microsoft, it is applicable – and provides a better understanding of thermodynamics, bringing theory and practice together and moving from ‘predictive to prescriptive’ operations. Microsoft Azure machine learning underpins a drag and drop GUI for modeling and predicting coke explosions in a Romanian refinery, executing an R script. A model can be tested with little investment as capacity is provisioned on the cloud.
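The production pipeline reportedly executed an R script inside the Azure ML GUI; as a rough Python stand-in for the kind of supervised model involved, the sketch below trains a gradient-boosted classifier on synthetic process variables. Feature meanings and data are invented for illustration only.

```python
# Hypothetical stand-in for the coke-incident classifier: synthetic process
# features (e.g. drum temperature, pressure, flow, differential pressure).
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 4))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=2000) > 1.5).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = GradientBoostingClassifier().fit(X_tr, y_tr)
print("holdout accuracy:", clf.score(X_te, y_te))
```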

Halliburton goes beyond digitalization and into open data future.

Halliburton’s Satyam Priyadarshy went ‘beyond digitalization’ and into true digital transformation in the era of Industry 4.0 and big data. This requires a change of mindset: ‘we still spend a significant amount of time building data models’, doing a project for a few months and dropping it. What is needed is continuous transformation, repurposing your geophysicists to do data science. Despite 60 years of research and 280 published papers, there is still no perfect model for drilling rate of penetration (ROP) prediction. An obstacle to machine learning is folks’ reluctance to share data. E&P leads in this waste of resources! Why do we create multiple copies of data? And we are unprepared for future fiber optic/IoT/cyber/cloud data. Halliburton has established a big data/data science center of excellence. The OpenEarth community (devops), iEnergy (cloud) and DecisionSpace got a plug. One of Priyadarshy’s AI examples goes straight from well data to a reservoir property model, raising the question of whether AI will cannibalize some of Halliburton’s own software.

Extreme teaming whittles down Shell’s PoCs to feasible, scalable projects close to the core.

Shell chief data officer Anosh Thakkar cited a Gartner definition of digitalization as using digital technology to change a business model and provide new revenues and opportunities. As such, it is not new. But data is growing exponentially and the tools available to extract value from big data are getting better. There is the threat of disruption from new players, particularly as oil and gas was ‘late into the digital space’. Currently there is no ‘existential push’ for change but this is changing with AI, blockchain and other novelties. Last year Shell’s digital strategy embraced some 550 proofs of concept, most sans a road map to value. Today, things are being approached coherently across Shell with particular attention to scalability. Shell’s team of 200 digital experts is to focus on digitizing the business core and adjacent areas. Projects must show a scalable minimum viable product within three months. Moreover, a project must address feasible technology. One current area of research is equipment failure and deferment issues. On the Shearwater asset, compressor steady state analysis is helping engineers to move from information overload to a consolidated insight into failures, with millions in savings. Shell’s secret sauce is ‘extreme teaming’ in an ‘accelerator room’ where teams perform daily sprints and standups, bringing together data scientists, programmers, SMEs and ‘business interaction designers’.

TNO on making sense of big data.

Rahul-Mark Fonseca reported on trials of ‘the largest sensor network in the world’, deployed over the Groningen gas field in the Netherlands. The low cost acoustic sensors are used to assess structural damage and are repurposed as seismic sensors for interferometry. In another project, NAM (the Groningen operator) and others are trialing neural nets to forecast oil production from mature assets. While NNs are not new, deep learning is where the progress (and hype!) is. But it is not all easy: an LSTM* auto-tunes and works well forecasting some wells, then falls apart on the next one. Another issue is the question of why NNs work, what is happening inside the black box? Oil and gas engineers are skeptical. Work in DARPA’s XAI program sets explainability against accuracy. Some attempts to open up the NN box have found signs of cheating. One system correctly identified horses from the (horse) photographer’s name. Another Minority Report-style project saw the Netherlands police looking for crooks with police mugshots and LinkedIn data. This worked great, but why? It turned out that the NN was good at spotting LinkedIn photographs of people wearing T-shirts, just like the mugshots. Fonseca advised, ‘Do not forget your domain knowledge!’

* Long short-term memory network.
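A minimal sketch, in Keras, of the kind of LSTM forecaster TNO describes: learn next-month production from a trailing window of monthly rates. Window length, network size and the synthetic decline data are all assumptions.

```python
# Hypothetical LSTM production forecaster on a synthetic decline curve.
import numpy as np
from tensorflow import keras

rate = 1000 * np.exp(-0.05 * np.arange(120)) + np.random.normal(0, 10, 120)
window = 12  # use the last 12 months to predict the next one
X = np.stack([rate[i:i + window] for i in range(len(rate) - window)])[..., None]
y = rate[window:]

model = keras.Sequential([
    keras.layers.LSTM(32, input_shape=(window, 1)),
    keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=20, verbose=0)
print("next-month forecast:", model.predict(X[-1:], verbose=0)[0, 0])
```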

Bentley Systems AssetWise as digital transformer.

Alan Kiraly, SVP asset performance with Bentley Systems, cited a Gartner study that found that ‘85% of oils have digitalization initiatives’ but ‘only 10% are being scaled to production’. At the heart of transformation is the digital engineering model, a bridge between IT and OT that ties all data together. Bentley’s solution in this space is AssetWise as deployed on BP’s Omani Khazzan gas project where it is the central information store holding 60,000 documents and 160,000 equipment tags. Other key AssetWise deployments include Shell’s Project Vantage and the US Geismar A04 extension. See also the video of an AssetWise/MindSphere combination, ‘MindApp’, a joint venture with Siemens.

Equinor’s GoDigital program and the OMNIA data platform.

Einar Landre provided an update on Equinor’s GoDigital program and the digital center of excellence he announced last year. Equinor has six programs that frame its digital activity, built atop its ‘OMNIA’ data platform. Landre stressed that OMNIA is not ‘IT digitizing the business’, rather ‘business driving digital improvement’ in a move away from data silos to ‘enable all data’. OMNIA provides data ingestion and cleanup to consistent naming conventions. Portals and apps for data science, IoT and more run atop the system. OMNIA is built on Microsoft Azure (although it ‘could have been deployed on any cloud platform’)*. Data from US unconventional operations now flows into the OMNIA cloud. The cloud-based architecture facilitates separation of data and apps. Equinor can now manage data appropriately, work through APIs and micro apps. Operators can now see which wells need attention, reducing driving/accident risk. Equinor is now working on a digital twin with Kongsberg to support remote operations and trialing the Hololens and other AR/VR smart devices in a digital worker pilot. ‘There is huge upside in collaborating on developing technology and architectural principles’.

* We also understand that Landmark’s DecisionSpace Information Server is a key component of the solution.

OMV deploys Intel’s Universal Well Controller.

Intel’s Louis Desroches teamed with OMV’s Sasa Blazekovic to report on a wellhead analytics pilot with OMV in Austria, on a brownfield site with around 1,000 stripper wells. The field was originally developed in the 1940s and now covers a 2,400 km² area. Surveillance was previously performed with twice-weekly site visits. Intel has used an ‘open source approach’ to software that allows OMV to pick and mix sensor types. Alarming and soft sensing software was developed by Ipcos. Other components include Siemens’ Simocode smart motor control and Intel’s IoT gateway. Multi-mode connectivity blends WLAN, 3/4G, LoRa and WiMax. Data from beam pumps now streams into the scada control room and Siemens’ MindSphere for remote software upgrade and central shutdown. Edge analytics support local operators and minimize backhaul bandwidth requirements. Edge devices provide load cell calibration, intrusion detection (camera) and Dynacard-based soft sensing of pump performance. A video-enabled inclinometer also ran, as did a near real-time surveillance dashboard/web portal. Closed-loop control starts/stops pumps and the pump-off controller. The low cost solution (a box costs under $1,000) is now extending to gas lift wells. The solution is now being rolled up into a cross-vendor universal well controller and mesh network to chat with neighboring wells. The UWC embeds open source software for ‘zero touch’ provisioning and cloud-based management. Intel is working to build an ecosystem of ISV/OEMs. Other components include the Akraino Edge Stack and the Linux Foundation’s ACRN project. Wind River also ran.
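One simple way edge analytics cut backhaul bandwidth, as in the OMV pilot, is deadband reporting: a reading is forwarded upstream only when it moves materially from the last value sent. A hypothetical Python sketch (thresholds and data invented):

```python
# Deadband filter: forward a sample to the scada/cloud backhaul only when it
# differs from the last forwarded value by more than a threshold.
def deadband_filter(readings, threshold=5.0):
    last_sent = None
    for t, value in readings:
        if last_sent is None or abs(value - last_sent) >= threshold:
            last_sent = value
            yield t, value  # everything else stays at the edge

samples = [(0, 100.0), (1, 101.0), (2, 108.0), (3, 108.5), (4, 95.0)]
print(list(deadband_filter(samples)))  # [(0, 100.0), (2, 108.0), (4, 95.0)]
```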

Gamified learning for Repsol’s refiners.

Iñigo Ribas Sanguesa presented Repsol’s ‘gamified’ learning for its refinery personnel. Apparently, ‘face to face training is very inefficient’. The ‘Legend of Zelda’-inspired course invites participants to solve refinery enigmas by talking to different characters in the game. The somewhat infantilizing cartoon is said to force teamwork, pitting a ‘sulfur team’ against a ‘catalyst team’. The solution was developed by VirtualWare. Watch the online demo.

More from Global Business Conferences.


Sales, partnerships, deployments ...

Ambyint, eDrilling, Aker Solutions, Alcatel, Nexans, AspenTech, Baker Hughes GE, Rock Flow Dynamics, CGG, DNV GL, Elsevier, AAPG, ENGlobal, Exprosoft, Chrome River Technologies, Fairfield Geotechnologies, Ikon Science, FutureOn, Ikon Science, Kadme, Kongsberg Maritime, LMKR, Red Buffer, Texas Advanced Computing Center, Quorum Software, Kappa, Bentley Systems, Siemens, StormGeo, ThoughtWorks.

Following its 2017 investment in artificial lift software house Ambyint, Equinor (formerly Statoil) is to deploy the company’s data-driven AI technology on all rod pump wells in its Bakken asset, North Dakota. The deal follows a pilot project that ‘improved remote data visibility and delivered a more accurate diagnostic of downhole conditions’ according to Equinor production engineer Jack Freeman.

eDrilling’s wellAhead simulator has been deployed by Shell to design an exploration well on the Norwegian Coeus prospect.

Aker Solutions has transferred its internal suite of SAP business applications to the SAP cloud.

Equinor has selected Alcatel Submarine Networks to deliver permanent reservoir monitoring on the Johan Castberg field in the Barents Sea. ASN has subcontracted Nexans Norway to supply the subsea cable for the project.

Petronas is to deploy Aspen Technology’s AspenOne engineering and supply chain management software solutions at its RAPID facility in Pengerang, Johor, Malaysia.

GE’s Baker Hughes unit has teamed with inspection services specialist SGS to provide a predictive corrosion management solution to industrial customers.

Cavitas Energy has acquired licenses of Rock Flow Dynamics’ tNavigator for enhanced oil recovery studies on its heavy and conventional oil fields.

CGG is to supply Petronas Carigali with an advanced imaging datacenter along with its multi-physics imaging suite. Petronas is also to explore the potential of cloud computing and machine learning.

Elsevier and AAPG are teaming up to integrate the AAPG’s content with Geofacets, Elsevier’s GIS-based information solution for exploration and development.

ENGlobal has secured an approximately $11 million contract from a major E&C firm to supply analytical process control and continuous emission monitoring systems to its new ethane steam cracker on the US Gulf Coast.

OKEA is to implement Exprosoft’s WellMaster on all its future operated oil fields on the Norwegian Continental Shelf. Exprosoft is to ensure that well operations comply with NORSOK regulations.

Chrome River Technologies is to deploy its cloud-based expense management solution to ExxonMobil’s 70,000 employees worldwide.

Fairfield Geotechnologies and Ikon Science have partnered to offer geoprediction services to the oil and gas industry in North America.

Equinor has signed a multi-license agreement for FutureOn’s cloud-based software Field Activity Planner (FieldAP), a global collaborative tool that lets oil and gas operators digitally merge large volumes of offshore oilfield data in a centralized cloud platform.

OGA has commissioned a Roknowledge study from Ikon Science for the UK Continental Shelf. The study includes a 40-well rock physics and seismic calibration catalogue.

Tatneft CIO Rustem Pavlov has won a ‘Project of the year’ award for the creation of a digital management platform for the large geological, geophysical and production data of PJSC Tatneft. The platform was built using Kadme’s data access platform, Whereoil. The work involved specialists from Russian IT company ITPS Group.

Kongsberg Maritime is to deliver an advanced real-time monitoring and advisory (RTMA) solution for the Noble Tom Madden drillship. The RTMA is also to be supplied to Odfjell Drilling and to Seadrill.

LMKR and Red Buffer have partnered to provide continuous innovation in AI and deep learning in hydrocarbon exploration.

The National Science Foundation (NSF) has awarded the Texas Advanced Computing Center (TACC) a $60 million grant for its new Frontera high-performance computing system. Frontera’s primary computing system is provided by Dell EMC.

Quorum Software’s Mosaic integration with Kappa’s Citrine enhances data management and well analysis for engineers. Three Span Oil & Gas is to implement myQuorum Land On Demand.

Bentley Systems and Siemens are to reinforce their strategic alliance with an additional €50 million, bringing the total pot to €100 million to further develop their joint business cooperation and commercial initiatives.

StormGeo and ThoughtWorks are teaming up to deliver digital transformation in Weather Intelligence and Decision Support Services.


Standards stuff...

Research Data Alliance, EU on B2B, ISO 31000, ECCMA MDQ certification, Energistics Prodml 2.1, FuelPlus/IATA, OGC, Simulation Interoperability Standards Organization, borehole data, geostatistics standards, PODS, PPDM, WWW.

The Research Data Alliance (RDA) DMP Common Standards working group recently held a workshop titled ‘Data Stewardship Realized: From Planning to Action’. The workshop focused on domain-specific extensions for machine-actionable data management plans. The aim is to migrate ‘not very useful’ succinct descriptions of published research to ‘machine actionable’ fully qualified representations using Dublin Core metadata. The RDA is funded by the EU Commission, the US NSF/NIST and other international research organizations.

The EU is to hold a workshop on ‘advanced and interoperable digital’ business-to-business platforms for smart factories and energy in Brussels on 15/16 October 2018. The event concerns users and suppliers of B2B digital industrial platforms and will discuss user requirements, existing commercial and community-driven solutions, and challenges. At stake is a €300 million contribution from the EU to ‘foster cooperation of stakeholders across value chains, user-supplier integration, and fast adoption of emerging standards’. The workshop is to cover evolving technologies in areas such as artificial intelligence and blockchain.

The International Organization for Standardization (ISO) has revised its ISO 31000 risk management standard. The 2018 edition provides a simpler and clearer guide to risk management principles, planning and decision making. Read the ISO article on changes to the framework here.

ECCMA is providing free certification and registration to master data quality manager level in accordance with the ISO 8000 and ISO 22745 standards. Upon completion of an online quiz, data quality professionals will receive an ISO 8000 MDQM certificate in PDF format. A ‘printed and signed’ edition is available for a $50 fee.

Energistics Prodml V2.1 release candidate is now available for review. Prodml provides open, non-proprietary, standard data interfaces between software tools used to monitor, manage and optimize hydrocarbon production. Comments and issues should be submitted before December 3, 2018.

FuelPlus, a provider of aviation fuel management software, has announced eTender, a solution for paperless fuel tenders. eTender uses the 2016 IATA XML fuel tender data standard.

Standards bulimia continues at the Open Geospatial Consortium (OGC) with (at least) four new initiatives. OGC has announced a teaming with the Simulation Interoperability Standards Organization (SISO) on interoperable modeling and simulation. The idea is to advance 3D geospatial and M&S interoperability with the goal of a global common model. Next up, an OGC initiative to couple artificial intelligence with geoinformatics in a new domain working group. OGC has also announced a borehole and associated data interoperability experiment, a ‘domain-neutral semantic description’ of the general concept of a borehole. The borehole IE addresses geological and geophysical data across, inter alia, oil and gas, civil engineering and environmental sciences. Finally, OGC is seeking comments on its proposal for a statistical domain working group to investigate the integration of geospatial information into statistical systems for discovery, analysis and use.

The Pipeline Open Data Standards (PODS) body has released PODS Lite 1.1 with implementations for SQL Server, Oracle and PostgreSQL in addition to the Esri geodatabase implementation of V1.0.

PPDM’s hydraulic fracturing model will be included as a component of the PPDM 3.10 (ten) release.

The World Wide Web consortium’s devices and sensors working group has published a working draft of its geolocation sensor specification, an interface for obtaining the geolocation of a hosting device.


Back to school

Process Industry Academy announced. Norwell Edge e-training for upstream. Seagull, Marlink satellite e-learning delivery. GSE Systems’ EnVision learning on demand.

Siemens and Bentley have announced the Process Industries Academy. The Academy, located in Karlsruhe, Germany, will offer best practice training for plant engineering and operation.

Aberdeen-based drilling project management specialist Norwell Engineering has rolled out Norwell Edge, an e-training program for the upstream. The Edge offers 50 in-depth upstream awareness training modules, an exam and an advanced course for specialists. Training is available for a monthly individual subscription of $19. More from Norwell.

Seagull Maritime is to use satellite communications technology from Marlink to distribute its onboard training material. Seagull’s training software will run on Marlink’s XChange centralised IT and communications management platform, providing full on-board hosting and monitoring of usage.

GSE Systems’ new cloud-based training platform, EnVision Learning On-Demand, extends the capabilities of its EnVision tutorials and simulations. EnVision LoD provides access via a web browser from any location with no local installation. An early adopter program at a ‘large international oil and gas company’ has seen the new system integrated into its internal training program, providing introductory training ahead of classroom instruction.


© 1996-2021 The Data Room SARL. All rights reserved.