Oil IT Journal: Volume 22 Number 9


Azure ‘transforms’ Chevron

‘Primary’ cloud partnership to ‘more efficiently do’ oil exploration and sensor data management. Azure HPC infrastructure informed by Chevron feedback, with a little help from Cray.

Microsoft bloggeuse Vanessa Ho waxed lyrical about Chevron’s selection of Microsoft Azure as its ‘primary cloud’ in a partnership that is set to ‘fuel’ the company’s digital transformation. The partnership is to provide Chevron with the compute power it needs to ‘accelerate work in data analytics and the Internet of Things (IoT).’ Chevron CIO Bill Braun said, ‘This partnership will allow us to digitally transform and leverage the scale and capabilities of Microsoft to ensure we harness the value of our data.’ The partnership includes technical collaboration, joint innovation and employee cross-training. Microsoft is to develop products that solve Chevron’s business challenges and transform its data into ‘performance-driving intelligence.’

Microsoft’s Tom Keane explained that although Chevron is already a sophisticated consumer of data, compute and IoT, and excels at high-performance computing, the partnership will investigate how all of this can leverage Azure to ‘more efficiently do’ oil exploration and sensor data management. Braun highlighted what has proved to be an extreme use case for data intensity, distributed temperature sensing (DTS). This leverages existing optical fiber installed in wells along with fancy instrumentation to provide a temperature/depth profile of a flowing well. Braun observed that DTS and other onsite data sources ‘can generate up to 1 terabyte of data a day*.’
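
To put that figure in context (our own rough arithmetic, not from the blog), a sustained terabyte a day works out at roughly a hundred megabits per second of continuous bandwidth, which helps explain why most of it stays at the wellsite (see the footnote below).

# Rough arithmetic: '1 terabyte of data a day' expressed as a sustained data rate.
bytes_per_day = 1e12                       # 1 TB (decimal) per day
mbps = bytes_per_day * 8 / 86_400 / 1e6    # bytes/day -> megabits per second
print(f"~{mbps:.0f} Mbps sustained")       # prints ~93 Mbps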

Chevron and Microsoft have partnered for many years (see our 2009 TechWatch). Braun opined that Chevron values Microsoft’s technology, technical leadership and partnership mindset. At the same time, Microsoft gains insight into the oil and gas industry and how its solutions work in a company with a global footprint and harsh operating conditions. Microsoft has improved its high-performance computing infrastructure based on Chevron’s feedback.

Another improvement to Microsoft’s Azure cloud is the recently announced availability of Cray’s XC and CS series supercomputers in Azure to run HPC and AI applications. The Crays are said to ‘easily integrate’ with Azure virtual machines, data lake storage and Microsoft’s AI/ML services. A facet of the deal with Cray, not mentioned in the release, is that Crays run Linux. Linux’s growth in Azure reflects a big shift in strategy from the days when Windows was said to ‘dominate HPC’ and when Steve Ballmer called Linux ‘a cancer’! More from Microsoft.

* DTS multi-terabyte data is usually processed on site, with a reduced data volume (possibly) going into the cloud. DTS could be considered a case of ‘edge’ computing.


Unily at Shell

BrightStarr’s portal technology feeds targeted corporate information and HR functionality to 135,000 users in 70 countries.

Shell is to deploy BrightStarr’s Unily intranet platform in a ‘global digital workplace’ that will connect 135,000 users in over 70 countries. The platform will combine eight legacy applications into a ‘single integrated intranet experience.’ Microsoft Office 365 tools will be accessible through the portal. The new portal will deliver personalized and targeted corporate information to users of desktop and mobile endpoints and act as a gateway to tools used in IT support. For HR, Unily promises an ‘integrated employee profile’ in a single platform ‘that will encourage social networking and knowledge sharing.’

David Harrington, Shell’s VP internal communications, modestly reports, ‘Our people are some of the brightest in the industry and expect the newest tools and applications to drive productivity and engagement. Unily will help us to create the experience our employees are seeking.’ UK-headquartered SharePoint consultancy BrightStarr previously provided ConocoPhillips’ ‘The Mark’ internal web portal, leveraging its ‘Kinetica’ methodology. Unily was also used by Amec engineers to develop a new intranet following its merger with Foster Wheeler back in 2014.


COP23 BECCS, FECCS and the future of fossil fuel

COP21’s best shot was BECCS, biomass energy carbon capture and storage. What might save the fossil fuel industry, or at least prolong its life, is FECCS, fossil energy CCS. A new report from the Global CCS Institute surveys the state of the art. Neil McNaughton reads between the lines.

After COP21 a couple of years back, it is surprising how little is heard from the current COP23. Beyond the fanfare of the 2015 edition, what actually is the game plan for the world, assuming that the consensus of the scientific community is right and that President Trump is wrong?

COP21 produced an impressive statement of intent regarding the need to keep temperatures below 2°C above the pre-industrial baseline, but how this was to be achieved was somewhat obscure. As we reported, the preferred route to saving the world is ‘Beccs,’ biomass energy carbon capture and storage. Beccs is politically correct because it does not involve fossil fuels, and biomass is ‘green,’ isn’t it?

Disposing of CO2 from biomass is a dual-use technology and one that, notionally, could, if not save fossil fuels, at least extend their life span. Enter Feccs, a.k.a. fossil energy CCS.

We have reported from various CCS-oriented gatherings in the past and I have been rather dismissive of the technology’s chances of ever seeing widespread take-up. The recently published Global CCS Institute (GCI) report, ‘The role of CCS in meeting climate policy targets,’ provides a comprehensive summary of current efforts to capture and store CO2 from electricity generation and other industrial sources.

Fossil fuels currently meet more than 80% of global primary energy demand, and CO2 from fossil fuel combustion accounts for over 90% of energy-related emissions. So there is a good case to be made for CCS if their use is to continue. But how good a case is not clear, either from the executive summary or from the conclusions of the report. There is too much politics involved to make a straightforward case for investment in CCS.

In Europe, CCS is perceived by the greens as a get-out clause for the fossil fuel industry and is to be resisted. In the USA, CCS trials are acceptable especially when piggy-backed onto enhanced oil recovery (EOR) projects.

The political brakes in Europe and the commercial accelerators in the US have led to the interesting situation where the denialist US is sequestering far more CO2 than the handwringing nations of the EU. In Germany, which pulled back from nuclear following Fukushima, the return of lignite mining opened a great opportunity for CCS. But the fact that lignite mines are located far from potential sequestration sites has led to large-scale nimbyism, stemming from a ‘Länder clause’ in the German CCS bill that gives regional governments the ability to authorize or prohibit CO2 storage on their territory. Worse, CCS and nuclear now seem to be equivalent in the public eye and are likely vote losers.

The Netherlands CCS flagship, the Rotterdam Opslag en Afvang Demonstratie (Road) project has stalled because of a funding shortfall, even though it includes a potentially commercial component (the CO2 is to be sold to greenhouses).

Meanwhile, the UK has fallen from its early poster child status when it was providing ‘the strongest policy leadership in encouraging CCS’ and when its CCS policy framework was considered a ‘good practice example.’ In 2015 the UK unexpectedly withdrew its financial support for CCS. Norway is doing better and is planning a ‘full scale industrial’ CCS project for 2022.

The US government has supported CCS since 1997 with the aim of ‘safeguarding fossil fuel use, developing global technology leadership and mitigating climate change.’ Between 2008 and 2014, Congress appropriated $6.4 billion for CCS projects. The Kemper County energy facility in Mississippi is set to be the largest CCS power project in the world, capturing 3 Mtpa. Here CO2 is used for EOR, with capture covering some 65% of the 582 MW plant’s emissions, although the project has seen delays and rising costs.

Contrary to what one might imagine, in the US, excess CO2 is defined as a substance that damages its citizens’ health and can be regulated by the EPA, whereas Europe ‘scrupulously avoids’ defining CO2 as a pollutant to avoid it being subject to EU directives on waste disposal!

The GCI report puts the overall cost hike for electricity with CCS at ‘between 26% and 114%’ over plants without such technology. On the other hand, ‘the non-availability of CCS appears to make climate mitigation scenarios at best much higher cost, and at worst infeasible.’ A bit of sophistry that I take as meaning that alternative green energies will not scale enough to mitigate fossil fuel use.

So where are we today with CCS? It’s hard to tell from the study just how much CO2 is being sequestered, but totting up the notional values of the demonstrators around the world I estimate that, to an order of magnitude, the world could be sequestering around 10 Mtpa if they were all up and running.

How much is that compared with worldwide CO2 emissions? The CO2.earth website gives a spuriously accurate 35.9 Gtpa of CO2 from fossil fuels. Given that one third of emissions come from transportation and cannot be considered ‘sequestrable’ (despite my editorial), this leaves a potential target of say 20 Gtpa, some 2,000 times all current capacity.

How much would that cost? Say current capacity has cost $10 billion, scaling up comes to around $20 trillion, roughly a quarter of annual world GDP. And this for the pleasure of paying at least 25% more for your electricity. How much would that add to the bill? Fossil electricity consumption is some 15,000 TWh/year at roughly $100/MWh. 25% of this would be another $0.4 trillion per year. The CCS showstopper is infrastructure cost. The extra cost of electricity is just the coup de grâce!
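
For readers who want to check the arithmetic, here is a minimal Python sketch that simply reproduces the rough, order-of-magnitude inputs quoted above; none of the figures are measured data.

# Back-of-the-envelope check of the editorial's CCS numbers (rough inputs only).
sequestrable_gtpa = 20.0        # target after removing ~1/3 of 35.9 Gtpa for transport
current_capacity_mtpa = 10.0    # notional capacity of today's demonstrators
current_capex_usd = 10e9        # rough cost of that capacity

scale_factor = sequestrable_gtpa * 1000 / current_capacity_mtpa   # ~2,000x
capex_estimate = scale_factor * current_capex_usd                 # ~$20 trillion

fossil_twh = 15_000             # fossil-fired electricity, TWh/year
premium = 0.25                  # low end of the GCI's 26-114% cost hike
extra_power_cost = fossil_twh * 1e6 * 100 * premium               # at $100/MWh -> ~$0.4 trillion/year

print(f"scale-up needed: {scale_factor:,.0f}x")
print(f"infrastructure: ${capex_estimate / 1e12:.0f} trillion")
print(f"extra electricity cost: ${extra_power_cost / 1e12:.2f} trillion/year")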

@neilmcn


Book review - Data analysis for scientists and engineers

Edward Robinson’s textbook principally targets the mathematically inclined. But there is enough narrative to intrigue the philosophically inclined, including an intricate discussion of frequentist vs. Bayesian reasoning. The trendy field of ‘data science’ is conspicuously absent.

Edward Robinson (professor of astronomy at UTx Austin) probably caught our attention with his book, ‘Data analysis for scientists and engineers*,’ because, if you rearrange the title slightly, you might imagine that coverage includes the trendy topic of ‘data science.’ It does not. Neither does it include any computer code. As Robinson explained in an email exchange, ‘I refused to include any code in my book, despite heavy arm-twisting from Princeton University Press. There is publicly available R and Python code for all the techniques I discuss. I hope the book will still be useful when R and Python are remembered only by historians.’

Data Analysis is a math-laden textbook covering statistics and, to a lesser extent, time series analysis. We found Robinson’s mathematical treatment hard going and not necessarily the easiest path to enlightenment. A diagram of Dirac’s spike would be more helpful than his integrals! For the less mathematically inclined there is plenty of narrative. The introductory chapter on the laws of probability begins with an image of a die. Turn the page and you are plunged into a pet topic, the frequentist and Bayesian approaches to statistics. Here we learn that ‘frequency is meaningless for unique events.’ As this presumably includes the estimation of the size of a ‘single’ oil or gas prospect, we are all Bayesians, like it or not.

There is a ‘deep divide’ between the frequentist and Bayesian approaches, but Robinson is hard put to explain quite what this is, in a discussion that is teasingly spread across several chapters. The main chapter on Bayesian statistics starts with a detailed explanation of the false positive problem in drug trials. But both frequentist and Bayesian reasoning give the same result. No ‘deep divide’ here then. A section on fitting a straight line through noisy data makes the differences between the two approaches clearer, although one is left with the impression that ‘Bayesian’ really just equates to common sense**.
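
By way of illustration only (this sketch is not from the book), here is a minimal Python comparison of the two approaches to the straight-line problem, with synthetic data and an arbitrary prior on the slope.

# Illustrative only (not from Robinson's book): frequentist least squares vs. a
# Bayesian fit with a prior on the slope, on synthetic noisy straight-line data.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 10, 20)
y = 2.0 * x + 1.0 + rng.normal(0, 2.0, x.size)       # true slope 2, noise sigma 2

# Frequentist: ordinary least squares
slope_ols, intercept_ols = np.polyfit(x, y, 1)

# Bayesian: grid over slope/intercept, Gaussian likelihood plus a prior on the slope
slopes = np.linspace(0, 4, 401)
intercepts = np.linspace(-5, 5, 401)
S, I = np.meshgrid(slopes, intercepts)
resid = y[None, None, :] - (S[..., None] * x[None, None, :] + I[..., None])
log_like = -0.5 * np.sum(resid**2, axis=-1) / 2.0**2
log_prior = -0.5 * ((S - 1.5) / 0.5) ** 2            # prior belief: slope ~ N(1.5, 0.5)
r, c = np.unravel_index(np.argmax(log_like + log_prior), S.shape)

print(f"OLS slope {slope_ols:.2f} vs. MAP slope with prior {S[r, c]:.2f}")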

Many ‘big data’ techniques (regression, principal component analysis, Markov chains) are covered. But machine learning and artificial intelligence are conspicuously absent, even though they embed conventional statistics. Elsewhere, coverage spans Fourier analysis, convolution and noise analysis, although engineers and geophysicists may have their own favorite resources for these methods.

To sum up, Data Analysis is an impressive compilation of the mathematics behind ‘traditional’ statistical data analysis. The decision not to include computer code is understandable, but it disenfranchises those whose fingers are hovering over the ‘compute’ button of a big data/analytical application. Data Analysis’ coverage underscores the gap between statistics and the way it is practiced today in artificial intelligence-oriented applications. Users of these techniques may find Robinson’s book a useful reference, but they will likely be in a minority. The worlds of data analysis and data ‘science’ are drifting apart, with the former concerned with finding underlying physical causes, the latter more with finding something that just works!

* Princeton, ISBN 9780691169927.

** We are not alone in this surmise.


Interview - Indy Chakrabarti, Paradigm

SVP strategy talks to Oil IT about ‘Paradigm/k,’ the company’s new flow modeler. Current focus is the ‘hard’ problem of shale where k’s physics-based modeling trumps a data-driven approach.

IC Traditionally, a geological model is built in the early days of a field’s life. After a few years, these models are often forgotten, as operational focus shifts to individual well production and test data. This move to quick-look, point analysis means that folks lose contact with the big subsurface picture and the fact that all surface issues originate in the subsurface.

Reservoir modeling is a well-established field. What can there be left to discover?

Previously, the reservoir could be modeled as a tank or a box, which makes for fast but overly simplistic calculations. On the other hand, a million-cell numerical model may take too long to produce a result. A full-scale reservoir simulation exercise may be done once every six months or once a year. In fact, most reservoirs are never simulated!

We hear a lot today about data-driven models. Is your approach data-driven or forward modeling?

We are definitely in the physics-based, forward modeling camp. Paradigm/k is not proxy-model based.

Ok but again, this is a well-studied field, where does the breakthrough come from?

From the guy who wrote the book, Michael Thambynayagam! His 2,300-page ‘Diffusion Handbook*’ is a painstaking compilation of the math behind fluid flow in porous media. Our system embeds his work, figures out the flow regime and applies the appropriate equations. Paradigm/k is 100 times faster than current systems and no special knowledge is required of the user. There are no complex decks to build before running a model.

This is a major leap for Paradigm which has to date been more in the geoscience arena. Did you do a deal with Mr. Thambynayagam?

Yes, we have an exclusive arrangement with him and his colleagues to use and develop their techniques.

Paradigm has been closely involved with standards in the past particularly ResqML. Will this and maybe ProdML be part of the new solution?

Sure, we can read these formats and also grab data from other vendors’ tools. But our true focus is production optimization, nodal analysis and applications such as artificial lift. Another great facet of Paradigm/k is its collaboration platform, a kind of Facebook for wells. The collaboration tool lets users work in channels. Wells too can have a ‘persona’ and contribute to the ‘discussion.’

Does Paradigm/k talk to scada/DCS systems directly or does it operate downstream of the historian?

It is more downstream of the historian. Actually, Paradigm has its own historian but we can also read data from others such as OSIsoft’s PI system.

We have noticed that ‘shale’ is often an assortment of stringers and shales. How can you model such complexity?

We picked unconventionals as a first target because this complexity is solvable with our approach. Paradigm/k is robust in the face of such complexity. We also layer on top of these models a capability to model complex completions like branched and oriented fractures. Conventional reservoirs will follow next year.

Optimization is often divided up into fast, medium and slow loops. Where is k?

We are upending this approach. k makes it possible to do the slow loop every day!

Does this mean conventional geological models (like Skua) are left on the shelf?

No. It means that you can feed the results of k into a complex model like Skua, linking production back into the model. This is an exciting foray for Paradigm as we extend our pioneering, science-based approach into the production arena.

How do you propose to take this forward, are you planning a consortium?

We have shown it in beta to several clients and there is a lot of interest. We are open to how this may progress. We will be working with our initial clients although there may not be a formal consortium.

~

* Thambynayagam’s oeuvre won the 2011 Prose/RR Hawkins Award and he featured in a short film, ‘A Holy Curiosity,’ the ‘poignant story of one man’s quest to create an epic work of scholarship.’ The film is available on YouTube and includes an enthusiastic endorsement of Thambynayagam’s work from Schlumberger’s Michael Prange. At the time of writing the video has had a measly 21 views. Maybe Oil IT readers can move the dial.

As Samhita Shah points out in this issue’s ‘Letters,’ we mangled Thambynayagam’s previous job title in our last issue. He was MD of Schlumberger Cambridge Research, not CTO.


Cognitive Geology raises £2 million for Hutton

Maven Capital Partners and Enso Ventures back Scottish geo-software boutique.

UK-based Cognitive Geology has raised £2 million from private equity houses Maven Capital Partners and Enso Ventures to support the roll-out of its Hutton software. Cognitive CEO Luke Johnson told Oil IT Journal, ‘Hutton solves the issue of spatial sampling bias in geological datasets. Oilfield data is biased as higher-quality reservoir is always over-represented. The problem is worst in offshore datasets. Our TrendAware technology searches the dataset for patterns relating to depositional and burial history, teasing out geologically-realistic ways of explaining the dataset.’

‘While different scenarios may fit the data, this doesn’t mean they are equally probable. Hutton uses quality of fit and the nature of the residual to determine the most realistic scenarios. We find that results emulate the behavior of experienced geologists.’ Hutton’s main contribution is in removing interpreter bias. ‘In almost every peer review I’ve been involved in, geologists get too attached to their preferred scenario and the outcome is too narrow.’ Hutton embeds geostatistical techniques such as variography and sequential Gaussian simulation. Johnson confided, ‘An expert could achieve most of this using any geostatistical package, but we estimate that this would take about 200 mouse clicks in peer packages. In Hutton it takes five or fewer and can be done right by a novice user.’

The funding will accelerate the roll out of Hutton to oil companies and enable Cognitive to further develop its pipeline of petroleum geoscience software.


SEG technical standards committee

SEG-Y R2 take-up in Norway and Saudi Aramco. Sample implementations under development. Energistics contemplates shift from XML to JSON. Encapsulation/encryption back on the agenda.

At the SEG Standards committee meeting, held concurrently with the Houston SEG, the chair role passed from Jill Lewis to Victor Ancira (both with Troika). BHP Billiton’s Shawn is the new vice chair. SEG-Y Rev 2 has seen take-up, notably in Norway (now recommended in the Yellow Book, as is SEG-D Rev 3) and at Saudi Aramco. The committee is to put sample implementations on the SEG website. An upcoming revision to the SPS navigation data format will be based on the IOGP’s P1/11 formats, with extra fields to improve the handshake with SEG-D 3.1.
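
For the curious, a minimal sketch of the kind of thing a sample implementation has to handle: the first 3200 bytes of a SEG-Y file are the textual header, 40 ‘card images’ of 80 characters, traditionally EBCDIC-encoded (the file name below is a placeholder; this is not the committee’s code).

# Minimal sketch: read and decode a SEG-Y textual header ('line.sgy' is a placeholder).
def read_textual_header(path: str) -> str:
    with open(path, "rb") as f:
        raw = f.read(3200)                  # 40 'card images' of 80 characters each
    # SEG-Y Rev 2 allows ASCII as well as the traditional EBCDIC encoding
    if raw.startswith(b"C"):                # ASCII headers conventionally start with 'C 1'
        text = raw.decode("ascii", errors="replace")
    else:
        text = raw.decode("cp037")          # EBCDIC code page
    return "\n".join(text[i:i + 80] for i in range(0, 3200, 80))

print(read_textual_header("line.sgy"))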

Jay Hollingsworth reported on Energistics standards activity. Witsml, Prodml and Resqml have been updated and leverage common technology. The HDF5 format is used for binary data. While Resqml handles horizons and grids, in general, Energistics stays clear of positioning and binning formats and so does not cross over with P6/11. Energistics is now contemplating a shift from XML to JSON following a push from industry.
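
To make the XML-to-JSON question concrete, a minimal sketch with an invented, much-simplified well object; the element names are hypothetical, not the actual Witsml schema.

# Illustrative only: the same (hypothetical, simplified) object in XML and as JSON.
import json
import xml.etree.ElementTree as ET

xml_doc = """<well uid="w-001">
  <name>Demo well</name>
  <operator>Acme E&amp;P</operator>
  <md uom="m">3145.2</md>
</well>"""

root = ET.fromstring(xml_doc)
as_json = {
    "uid": root.get("uid"),
    "name": root.findtext("name"),
    "operator": root.findtext("operator"),
    "md": {"uom": root.find("md").get("uom"), "value": float(root.find("md").text)},
}
print(json.dumps(as_json, indent=2))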

Tom Owen provided an update from the IOGP whose P1/11 and P2/11 were recently updated to V1.1. The P6/11 bin grid exchange format is now completed and user guides are being developed.

Finally the committee has been asked to look into a new format that can handle encapsulation and encryption, possibly with a move to a more modern approach that would be ‘more AI and computer-friendly.’ Current practice involves reformatting into a format that is more amenable to parallel processing.

Visit the SEG Technical Standards home page.


Letters to the editor

Pete Stark on late career renaissance. Corrections from Emerson and Paradigm.

From Pete Stark, IHSMarkit

Many thanks for your years of insightful coverage of evolving oil and gas IT. It was a pleasure to share in part of the evolution with you. As my 82nd birthday approaches it feels like a good time to hang ’em up. The IT evolution passed me by some time ago but I have enjoyed a late career renaissance that has allowed me to focus on global oil and gas resources and the shale revolution. IHSM is finally implementing my last passionate objective, the integration of well and production data with correlated tops (ProdFit) in the Permian Basin. Borehole data for all 430,000 wells in the basin are now tied to a stratigraphic framework across the producing benches. No more wells with blank or spurious information. IHSM has also packaged ProdFit data within Kingdom. Based on mass balance analyses, IHSM estimates 60 to 70 billion barrels of remaining recoverable oil resources in the Permian basin. This is not classic IT, Neil, but the impacts of this data integration effort on established industry IT capabilities are huge. Work is under way in the Anadarko and other basins.

~

From Stian Engebretsen, Emerson

A small comment on the last issue of Oil IT Journal. In the article, ‘Paradigm sold,’ you say, ‘Coverage is now pretty complete with the possible exception of fluid flow modeling.’ Emerson Roxar Mette is a multiphase flow simulator capable of big complex network simulation, typical well and flow line performance calculations, virtual metering and also transient calculations*.

~

From Samhita Shah, Paradigm

Thank you for the coverage of Paradigm/k but this sentence is inaccurate, ‘The science underlying Paradigm/k was largely developed by ex-Schlumberger CTO Michael Thambynayagam.’ Michael did not develop the solution and he was not Schlumberger’s CTO.

* We also forgot about Roxar’s Tempest (see p.5). Apologies to all.


Software, hardware short takes ...

GE Digital OPM/ServiceMax/Predix Studio/Edge. Hashmap’s Witsml Java SDK. C&C Reservoirs DAKS IQ. Entero One 2017. Librestream Onsight 5000HD. OARS360. PetroVR 2018. Ikon Science RokDoc. Emerson/Roxar Tempest/MORE/Enable and ‘big loop’ workflows.

GE Digital has announced OPM, a new operations performance management solution and an add-on to APM, its asset performance management application. OPM uses real-time and historical data to provide early warning of process deviations. OPM initially targets the mining industry, with expansion to other verticals next year. Upgrades to ServiceMax include artificial intelligence-derived recommendations for service intervals, leveraging Apache Spark. GE Digital also introduced Predix Studio to help companies build their own applications. GE has expanded its Predix Edge capability to run analytics ‘as close to the source of data as possible.’ GE also announced Predix availability on the Microsoft Azure cloud.

Chris Herrera (Hashmap) has open-sourced a Java-based Witsml objects library/SDK for working with Witsml 1.3.1.1 and 1.4.1.1 data.

V2.0 of C&C Reservoirs’ DAKS IQ oilfield analog-based knowledge, reporting and benchmarking platform adds ‘frequency figure search’ to retrieve relevant high-resolution figures from a 30,000-strong image inventory.

Entero One 2017, a ‘unified’ energy trading and risk management system, includes enhancements to contract management with automated workflows that ‘pave the way to a paperless environment.’ The new release adds server clustering to separate reporting, processing and real-time activity across virtual or physical servers.

Librestream’s Onsight 5000HD rugged smart camera now integrates augmented reality and a new dimensioning capability. When dimensioning is activated, the Onsight Smartcam deploys an onboard laser to estimate distances which can be shared live with remote experts or stored for later review.

V3.0 of OARS360’s hosted Operations management system supports interactive, map-based asset tracking. Satellite imagery gives an overview of operations and current asset status.

PetroVR 2018 targets interoperability and support for unconventionals. The new release can import well data from Aries, provides access to PetroVR results from a web browser and more. Watch the launch video.

The latest release (6.5) of Ikon Science’s RokDoc includes a rock physics module to improve the modeling and prediction of unconventional oil and gas reservoirs. The module was developed in collaboration with University of Houston’s Lev Vernik. Read his book Seismic petrophysics in quantitative interpretation. Also new is Attrimod, a multi-2D modeling and attribute package co-developed with a supermajor. Watch the webinar.

Emerson’s Roxar Tempest 8.1 provides new ways of modeling flow in the vicinity of fractures in the MORE reservoir simulation module. Tempest Enable includes upgrades to the Roxar App Connector to Nexus and Petrel. Roxar’s portfolio now provides a complete ‘seismic to simulation’ package running in a ‘big loop’ workflow.


ISO 8000 part 115 data quality implications for oil and gas

Peter Eales (MRO Insyte) provides an update on the ISO 8000 suite of data quality standards. Currently awareness is ‘appallingly low.’ Saudi Arabia to mandate use in Vision2030 program.

Data quality and the exchange of data through the asset lifecycle are hot topics in oil and gas. Part 1 of the 2009 ISO 8000 suite of data quality standards is grandly titled ‘Master Data: Exchange of characteristic data: Syntax, semantic coding, and conformance to data specification.’ Since then the standard has grown to include quality management, frameworks, data provenance, accuracy and more. Unfortunately, awareness of the standard is appallingly low, a result of poor marketing by ISO. This is set to change in the oil and gas sector with the inclusion of ISO 8000 in Saudi Arabia’s ‘Vision 2030’ industrial strategy. Imports to the Kingdom will shortly be required to be accompanied by an electronic technical specification that conforms to ISO 8000, and major corporations will include the requirement in their purchase orders.

The new Part 115 draft standard for quality identifiers will simplify the exchange of technical specifications and may bring a breakthrough in ISO 8000 awareness. Part numbers are common across industry, but such identifiers are not necessarily unique. The new identifier includes a reference to the legal owner of the identifier. A bearing reference 6204 could be resolved to any number of manufacturers, but a bearing reference SKF:6204 can only be resolved to SKF. SKF will own the rights to the SKF prefix just as it owns the domain name for its website. Likewise, companies will have a unique identifier*. The finance industry is also on board and will include this as its legal entity identifier in the upcoming MIFID2 reporting mandate which comes into force on 1 January 2018.
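
The idea is easy to sketch in code (illustrative only, not the normative ISO 8000-115 syntax): an identifier without an owner prefix is ambiguous, one with a prefix resolves to a single legal owner.

# Minimal sketch of owner-prefixed identifiers (not the normative ISO 8000-115 syntax).
def split_identifier(identifier: str):
    """Split 'OWNER-PREFIX:LOCAL-ID' into its two parts."""
    prefix, sep, local_id = identifier.partition(":")
    if not sep:
        raise ValueError(f"'{identifier}' has no owner prefix and is ambiguous")
    return prefix, local_id

print(split_identifier("SKF:6204"))                            # resolves only to SKF
print(split_identifier("UK.GOV.companieshouse.E&W:06236771"))  # MRO Insyte (see footnote)
try:
    split_identifier("6204")                                   # bare part number
except ValueError as err:
    print(err)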

In oil and gas, a pilot study between an oil major, six international component manufacturers and a software vendor is demonstrating use of ISO 8000 in reducing the cost of exchanging and maintaining data throughout the asset lifecycle. Technical specifications for products, in multiple languages, can be stored in a common, free-to-access cloud platform, so that the data is available as and when required. This pilot is being carried out as a use case for another standard, ISO 18101, the oil and gas asset management and operations and maintenance interoperability (OGI) standard. More on ISO 18101 progress in December.

* So for MRO Insyte this will be UK.GOV.companieshouse.E&W:06236771.

Peter Eales is director of MRO Insyte and a committee member on ISO 8000 and ISO 18101.

Those interested in oil and gas quality standards should also be cognizant of Energistics’ National data repository data quality guidelines.


Society of Exploration Geophysicists 2017, Houston

A century after the first seismic patent, the industry is a little shaky as ‘technology delivers more barrels than the world is ready to consume.’ We report on Shell’s production-induced sea floor monitoring. PGS, ‘oil and gas falling behind in HPC.’ UC San Diego, reviving a ‘greatly diminished’ CSEM industry. OpenGeoSolutions, ‘beware the AI hype!’ BP, ‘is deepwater dead?’ Halliburton, fixing ‘unrealistic’ DAS expectations. PCA, ‘Bayesian statistics fundament of all risking.’

Surviving the catastrophe that was Harvey proved something of a distraction from the other catastrophe that is the parlous state of the geophysical business. While the SEG made the right decision by holding its annual convention a couple of weeks after the major flood, as witnessed by the 7,000-plus attendees, the plenary sessions were somewhat curtailed, to ‘give more time to visit the exhibition.’ SEG president Bill Abriel noted that 2017 represents a century of seismic prospecting that began with Reginald Fessenden’s 1917 patent application for a ‘method and apparatus for locating ore bodies.’ Despite downturns and the graying workforce, SEG demographics are looking OK, albeit with a slight trough in the 35-55 age group. The long-term push to globalize the Society has borne fruit. Today, only 22% of corporates are US-based and the 27,000 individual members hail from 128 countries. Abriel highlighted the SEG/Halliburton-backed Evolve education program for young professionals. Evolve is also a certification program, ‘but not licensing!’ Looking forward to next-generation exascale, cognitive computing and machine learning, Abriel observed that geophysicists were ‘natural owners’ of high performance computing. The 2018 SEG convention in Anaheim will be a ‘big data/analytics joint venture with Silicon Valley.’

ExxonMobil’s Steve Greenlee recalled that only a few years ago, the perception was that oil resources were limited. Now technology ‘delivers a lot more barrels than the world is ready to consume.’ Although not much conventional oil has been discovered in the last 15 years, ‘unconventionals have changed the game.’ Is this a lasting phenomenon? ExxonMobil’s recent Permian Basin acquisition has brought ‘exposure’ to 6 billion barrels at a supply cost estimated by the IEA at $40-45 (Exxon’s figure is lower). But the big issue is depletion. The IEA sees a decline from 95 to 35 million bopd by 2040. Meanwhile, demand is forecast to rise, ‘so we need to find the equivalent of the whole of today’s production again by 2040.’ For this, the world needs a ‘healthy’ geophysical industry, but one that is ‘capable of supporting a low cost of supply portfolio,’ which we understand to mean cheap! North American capex is shifting towards unconventionals with, typically, a lower geophysical intensity. While this might be seen as a liability for geophysics, unconventionals are ‘really hard to explore,’ so there is a real opportunity here for geophysical innovation.

For conventional/deepwater exploration, ‘today’s technology is not good enough. Seismics takes too long to process and interpret.’ Also, regulations need to be based on sound science. Here the IAGC is working on advocacy in regard to marine life and regulations. A web search for seismic and marine mammals returns ‘lots of misinformation.’ ‘Houston, we have a PR problem!’ In the Q&A, Greenlee was quizzed on the sustainability of shale. He observed that, as elsewhere, there will be cycles related to the oil price. But some of the lower-cost, high-quality shale projects will last for decades and will impact the market for years. Other high-cost projects will not come back for any foreseeable time. Regarding shale outside of the USA, Greenlee observed that it was ‘remarkable that there is no [production] as of yet.’ Abriel added that geophysics needs to get closer to engineering in unconventionals, breaking down the silos and getting more integrated. Greenlee concluded that there is a real issue with the health of the geophysical industry. ‘Everything we do depends on a healthy sustainable geophysical industry, but there is no easy answer to the low price problem.’

The special session on the ‘road ahead’ showed that there is some life left in the geophysical dog. There was a good turnout for Paul Hatchell’s (Shell) presentation on ‘seafloor deformation monitoring for geohazards, depleting fields and underburden expansion.’ Ever since the spectacular subsidence of the Ekofisk production platform in the early 1980s, industry has been aware of the need for sub-cm/year accuracy in monitoring surface deformation. The ‘gold standard’ approach for monitoring was developed by the Scripps Institute and Statoil in 2010 and can measure millimeter changes in thousands of meters of water. The stations were deployed by ROVs across the Ormen Lange field, but the technique is expensive. Shell is now working with Sonardyne on autonomous recorders that can measure every hour for 10 years! Their low cost means that 175 have been deployed across the field, showing subsidence of 2 cm/year.

PGS’ Sverre Brandsberg-Dahl promised that he would not talk about big data and analytics in the cloud, which was a relief to some. High-performance computing has been used in seismic imaging for decades, with the pendulum swinging between different programming models, specialized equipment and now, the promise of commodity HPC services from Google and Amazon. Despite the push for high-end full elastic inversion and reverse time migration, these are not yet routine. The reasons? Turnaround time won the battle and model uncertainty is eating the cake, both of which make it hard to justify fancier physics. To date, algorithmic complexity has been matched by increasing compute power, even for surveys of hundreds of terabytes and up. But new acquisition techniques like continuous shooting and recording and irregular spatial sampling are breaking the classic, easy ‘embarrassingly parallel’ compute paradigm. Brandsberg-Dahl believes that the weather forecasting community does better and has done a great job of explaining the business benefits of HPC, linking the cost of disasters (like Harvey) to the cost of what they are doing. The UK Met Office has just spent £100 million on a new computer for a claimed ‘£2 billion in benefits.’ ‘We need to make the same argument.’ But HPC is at a crossroads today. Will the ‘unlimited’ flops that the cloud promises be enough? Or is a paradigm shift needed? Brandsberg-Dahl hopes that seismic imaging does not jump onto the commodity hardware of the cloud. For the geophysicist, the compute platform is a competitive differentiator, for both service companies and oils.

Leonard Srnka (UC San Diego) traced the use of marine controlled-source electromagnetic (CSEM) prospecting back to around 1999. Since then, EMGS has conducted over 540 surveys. CSEM surveying peaked in 2008 then dropped to 20/year. Today, the CSEM industry is ‘greatly diminished’ due to fundamental physics limits, the non-uniqueness of solutions and ‘unrealistic expectations*.’ CSEM measures resistivity; it is not a direct hydrocarbon detector. It has suffered from competition from enhanced seismic imaging and from the downturn. Has the value of the technology been realized? Not according to Srnka. ‘Dry holes have no CSEM response but discoveries do!’ Moreover, ‘no false negatives of any size have been reported after 800+ surveys.’ Other opportunities lie in petrophysical joint inversion and 4D surveys. CSEM should be great for mapping gas hydrates. In Japan, ‘seismic is no longer the preferred method for hydrate research.’ State of the art is represented by EMGS’ ‘Deep Blue’ CSEM, a joint venture with Statoil and Shell. More from UCSD’s Marine EM Lab.

Gregory Partyka (OpenGeoSolutions) has no qualms about the big data/analytical approach and plans to leverage artificial intelligence and ‘do seismic interpretation and reservoir simulation all at the same time.’ This is a challenge because no single database offers the sensitivity and scale to span all the data types involved. Interacting with today’s data and databases takes too much effort. Moreover, there are inevitably gaps in the skill sets required as we enter the world of big data. The technology behind self-driving cars could provide breakthroughs in seismic processing and interpretation. One word of warning on the big data approach, ‘beware of being dependent on the use of powerful tools in inexperienced hands.’ But if AI is used right we are promised more ‘think time’ and ‘aha’ moments! Another warning, ‘beware the hype.’ In practice, the processing building blocks will stay the same and need to be automated ‘gently.’ AI should make it easier to investigate seismic scenarios. OGS provides a library of pre-rendered images accessible through a browser incorporating ‘motion/animation.’ Suddenly we were watching a commercial, a demo in fact! Partyka shamelessly tweaked frequency, Euler curvature, azimuth, noise, spectral decomp and other obscure attributes before concluding with more ‘in praise of AI.’ Partyka suggested that SEG should curate a library of training images.

BP’s Scott Michell asked, in the current oil price environment, is deepwater dead? Current thinking is somewhere between ‘lower for longer’ and ‘low for ever!’ The Gulf of Mexico creaming curve shows a mature basin; the big stuff has been discovered. But now that infrastructure is in place, pipelines need filling up. That’s our job! From 2000 to 2015, deepwater seismic imaging algorithms saw a lot of refinement. But did this make much difference? Velocity remains the fundamental issue and manual picking of top salt often gives the wrong model. Top salt topography may be very rugose and impossible to pick. Michell intimated that in some cases, automated top salt picking beats manual interpretation. In any event, ‘we need better images more quickly for the Gulf of Mexico to be economic.’ And we need a better low-frequency acquisition source. Enter BP’s Wolfspar low-frequency seismic source. Longer offsets and lower frequencies will help ‘beat the interpreter.’

Brian Hornby (Halliburton) sees distributed acoustic sensing (DAS) with downhole fiber optic cables as the way forward for high-end borehole geophysics. Borehole seismics has evolved from checkshots, through offset VSP and now, 3D VSP imaging. But the acquisition geometry is awkward and image quality, in the early days (2001), was poor. Overoptimistic feasibility models led to unrealistic expectations. Today things are better with anisotropic velocity models and reverse time migration. DAS is the new kid on the block. Deploying DAS is ‘free’ if the well already has fiber, although signal-to-noise is poor compared to a geophone. In the future, DAS will be a routine supplement to 3D seismics, but we need ‘fast-deployable 10k foot arrays.’

Patrick Connolly (PCA) sees seismic inversion as transitioning from deterministic to probabilistic methods, with Bayesian statistics as the ‘fundament of all risking.’ Stochastic inversion gives a range of equi-probable models, a non-unique solution. But, ‘for the past twenty years we have chosen to ignore this and just select one.’ It’s time to take this seriously. AVO studies reduce uncertainty when geological priors such as bed thickness are added to the mix. A lot of software is available for this: Shell’s proprietary Promise, CSIRO’s Delivery, the GIG consortium’s PCube, Ikon’s JiFi and BP/Cegal’s Odisi. Why has take-up been so slow? Because collaboration across many disciplines is involved. But the software is available and ‘dimensionality is not a problem, this is not a compute issue.’

* CSEM’s ‘unrealistic expectations’ were in part fueled by Schlumberger’s Dalton Boutte’s 2004 claim that CSEM ‘could replace seismic’ and also by Srnka himself who, the following year, stated that ‘CSEM methods may prove to be the most important geophysical technology for imaging below the seafloor since the emergence of 3D reflection seismology.’


Folks, facts, orgs ...

Anadarko, Aqualis, Aquilon, Beyond Limits, GE, Chevron, ConocoPhillips, DNV GL, Energistics, ExxonMobil, Flotek, Fugro, Geospatial, Hexagon, Meridian, Michael Baker, Norwegian Data Society, Petrofac, Dresser-Rand, Simmons Edeco, Stratas, Wellsite, Detechtion. Situations vacant.

Anadarko has named Daniel Brown EVP, US onshore operations, Mitchell Ingram EVP, international and deepwater operations and project management and Ernest Leyendecker EVP, exploration.

Bjarte Røed is now head of Aqualis Offshore’s Norwegian operation.

Jeffrey Wagner, founder and CEO of Aquilon Energy services, heads-up the new Houston office.

Shahram Farhadi leads Beyond Limits’ oil and gas unit. He hails from Oxy.

John Flannery takes over from retiree Jeff Immelt as CEO, GE. Vice chair Beth Comstock is retiring and Jeff Bornstein leaves the company. Lorenzo Simonelli is chairman of the Baker Hughes unit where Geoff Beattie is lead independent director.

Mike Wirth is Chevron’s chairman and CEO succeeding retiree John Watson. John Frank (Oaktree Capital) has joined the Chevron board.

Caroline Maury Devine is now a board member at ConocoPhillips.

Liv Hovem is CEO of DNV GL’s oil and gas business. She succeeds Elisabeth Tørstad, who is now CEO of the new digital solutions unit. DNV GL has also named Brice Le Gallo as manager, SE Asia & Australia and Frank Ketelaars as manager oil and gas for the Americas.

Bettina Bachmann (Shell), Gavin Rennick (Schlumberger) and Tommy Inglesby (Accenture) are now members of Energistics’ board.

Vijay Swarup heads-up ExxonMobil’s new New Jersey R&D center.

Josh Snively leads Flotek’s operations in energy and industrial chemistry technology. James Silas joins the executive leadership team. Robert Bodnar ‘will no longer serve’ as EVP performance and transformation.

Øystein Løseth takes over from retiree Paul van Riel as CEO Fugro.

Robert Brook is team lead, Geo-Underground at Geospatial Corporation.

Robert Belkic is acting CEO at Hexagon pending Ola Rollén’s insider trading trial.

Mark Fonda is director of engineering at Meridian Energy Group.

Martin Miner is now CTO at Michael Baker. He hails from Leidos.

Christian Torp is secretary general of the Norwegian Data Society.

John Pearson is head of corporate development and MD, Western Hemisphere, at Petrofac. He was previously with AMEC Foster Wheeler.

Paulo Ruiz Sternadt is CEO at Siemens’ Dresser-Rand unit following Judith Marks’ departure.

Niels Versfeld is CEO of Simmons Edeco. He hails from Gibson Energy.

Stephen Beck is senior director, upstream at Stratas Advisors. He was previously with IHS Markit.

Ryan Henderson is VP sales and marketing at Wellsite. He joins from Peak Completion Technology.

Mark Crews and Dennis Nerland are independent directors at Detechtion Technologies.

Situations vacant

EQT is hiring two independent board members with extensive midstream experience. Russell Reynolds Associates is conducting the search.

Jonathan White has launched oilfield services company The FOS Group in Aberdeen and is looking to create over 100 new jobs.

The Texas Railroad Commission is conducting a search for a permanent executive director to lead the agency’s day-to-day operations.


Done deals

AIP, Brock Group, CGI, Affecto, DNV GL, ComputIT, Pacific Drilling, Nabors, Robotic Drilling Systems, Geospatial, Drillinginfo, DataGenic, EMAS Offshore, Emerson, GeoFields, Intel Capital, SAEV, FogHorn, Hexagon, Luciad, Pelican, Gordon, Sword IT Solutions, Venture.

American Industrial Partners has acquired majority ownership of The Brock Group, a provider of services to the refining and petrochemical industries.

CGI has acquired Finland-headquartered Affecto, developer of ‘Battery’, an energy-focused internet of things platform.

DNV GL has acquired ComputIT whose ‘KFX’ computational fluid dynamics software models flares, dispersion and fire.

Following its delisting from the New York Stock Exchange, Pacific Drilling has entered Chapter 11 and is to restructure its $3.0 billion debt.

Nabors has acquired Stavanger-based Robotic Drilling Systems.

Geospatial Corp. has reached an agreement with shareholder David Truitt to restructure $1.4 million debt.

Drillinginfo has acquired London-based DataGenic, provider of data management and business process automation software to oil, gas and other verticals.

Troubled EMAS Offshore has signed with potential investors for a cash injection of $50 million and is working with its auditors to restart financial reporting.

Emerson has acquired GeoFields, a provider of software and implementation services for pipeline integrity data collection, management and risk analysis.

Intel Capital and Saudi Aramco Energy Ventures are backing analytics and machine learning boutique FogHorn Systems with $30 million raised in a series B round to support ‘disruptive innovation’ at the internet’s edge. Total funding is now $47.5 million.

Hexagon (formerly Intergraph) has acquired Luciad, a Belgium-based specialist in the visualization and analysis of real-time geospatial information.

Pelican Energy Partners, with $330 million of committed capital, has invested in measurement-while-drilling specialist Gordon Technologies.

Sword IT Solutions has acquired UK-based Venture Information Management, provider of data and information management project support and consulting services to oil and gas.


Going, going... green

Shell, Total join Statoil in Gassnova CCS. IBM’s methane detector on-a-chip. Shell blog on COP23. DOE/NETL $4 million for CCS geoscience. Port Arthur milestone. CCG validates VCQ Parachem testbed. Seeq AI for JJ Pickle Separations research. RFF’s ‘E3’ carbon tax calculator.

Statoil, Shell and Total have entered into a partnership to ‘mature’ development of carbon storage on the Norwegian continental shelf. Norske Shell and Total E&P Norge have joined the contract that was awarded earlier this year by Gassnova to Statoil. Phase 1 capacity is 1.5 million tpa. CO2 from onshore industrial facilities will be shipped and pipelined for injection into the Troll field.

IBM has developed a methane spectrometer ‘on a chip’ that could be used to create an inexpensive sensor network that autonomously monitors for natural gas leaks. The device can detect methane in concentrations as low as 100 parts-per-million. IBM is working with partners in the oil and gas industry to deploy the devices and replace in-person site inspection. More from the Optical Society.

Shell’s climate blogger-in-chief, David Hone, back from COP23, concludes that, ‘much remains to be done’ if the full Paris ‘rule book’ is to be finalized by the end of COP24. ‘In all likelihood, resolution will not be complete, but the process won’t falter either and the ongoing mobilization of global effort will continue.’ Negotiators will meet again in Bonn around the middle of the year and then in Katowice, Poland for COP24 where, ‘the biggest fight is most likely to be over hotel rooms.’

The US Energy Department has selected two projects as recipients of $4 million in federal funding for cost-shared R&D into the safe storage of CO2 in geologic formations following funding opportunity announcement DE-FOA-0001725. The National Energy Technology Laboratory will manage the projects which include integration of seismic-pressure-petrophysics inversion of continuous active-source seismic monitoring and joint inversion of time-lapse seismic data. The DOE also reports that its Port Arthur CO2 capture project has hit the 4 million tonne milestone.

Carbon Consult Group has been appointed official partner of the Valorisation Carbone Québec project and is to measure and validate the impact on greenhouse gases of the carbon capture and utilization technologies being tested at Suncor’s Parachem facility in Montreal-East.

At the 2017 Emerson global users exchange, Seeq demonstrated how analytics and AI are helping the venerable Separations Research Program at the UTx/JJ Pickle Research establishment shift ‘from reactive to predictive operations.’ The unit is currently researching CO2 removal from stack gas.

The E3 Carbon Tax Calculator from RFF lets users test different carbon tax scenarios against a model of the US economy. The E3 model is described in a forthcoming book, Confronting the Climate Challenge: US Policy Options from Columbia University Press.


Sharp Reflections’ big seismic data analysis in the cloud

Pre-Stack Pro ‘bridge’ between seismic processing and interpretation now available on Amazon EC2.

Sharp Reflections has announced cloud-based availability of Pre-Stack Pro (PSP), its seismic processing and interpretation package. PSP running on servers in the Amazon cloud costs from €50k per year. The high-end package is accessible over a ‘reasonably good’ internet connection, 25 Mbps for interactive sessions and ‘at least 500 Mbps’ to transfer pre-stack data. If needed, data can be sent to Sharp Reflections for uploading or, in the US, Amazon’s Snowball service can be used for couriered data transfer. User workstations will need the latest version of HP’s remote graphics client.

PSP leverages Amazon’s EBS encrypted storage and security certified to ISO 27001 and SSAE-16 SOC 2. Layered access control includes public/private SSH keys, a random password generated for each session and the Amazon EC2 firewall. PSP acts as a bridge between acquisition and processing, performing remedial processing for AVO studies and parameter selection. PSP includes horizon tools to pick pre-stack reflections on conventional and full-azimuth gathers, said to be key to shale targets where AVO vs azimuth is a reliable predictor of fractures. The system has been tested on an 8 TB TGS multi-client dataset from the Utica shale.
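
For a sense of scale (our own rough arithmetic, ignoring protocol overhead), moving an 8 TB pre-stack dataset over the quoted 500 Mbps link would take around a day and a half, which is where the upload and Snowball options come in.

# Rough transfer-time arithmetic for an 8 TB dataset over a 500 Mbps link.
dataset_tb = 8
link_mbps = 500

seconds = dataset_tb * 1e12 * 8 / (link_mbps * 1e6)   # bytes -> bits, then divide by rate
print(f"~{seconds / 3600:.0f} hours ({seconds / 86400:.1f} days)")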


Accenture/Microsoft annual upstream survey of ‘digital'

Upstream oil and gas ‘trends’ report underwhelms.

It’s been ten years since the combined efforts of the marketing departments of CERA, Accenture and Microsoft ‘invented’ the digital oilfield. You might have expected it to be a done deal by now. For many it is, and the marketing world has moved on to big data and the internet of things.

Accenture and Microsoft’s 2017 Upstream oil and gas digital trends survey finds, underwhelmingly, that operators ‘expect shorter time to produce oil and gas due to digital technology investments.’ The survey reports on complicated but trivial shifts in the digital hit parade, with ‘faster and better decision-making’ and ‘shorter time to first oil’ holding the same number one slot as last year. Is anything new? The ‘next wave’ of digital has the potential to ‘further transform’ the business, despite ongoing low oil prices. The next wave is not about petabytes of data, rather ‘HPC, wearables, robotics, artificial intelligence and blockchain.’


Sales, partnerships, deployments ...

4DMapper/Blue Marble, Atlas Oil/Preventative Maintenance Technology, CH2M, Bahri Data/DNV GL, GE/Apple, Honeywell/Hyperion Group, Key Energy Services/Mix Telematics, DNOW/Badger Meter, XACT Downhole/Tendeka/Enventure Global, EPI Group/ONYX, SAP/Nvidia, Wefic, Kinetic, WellAware/CheckPoint, Wood Group.

4DMapper is using Blue Marble’s Global Mapper SDK to develop its online analytics services.

Atlas Oil and Preventative Maintenance Technology have teamed on fuel supply and generator equipment solutions.

BP has awarded CH2M a contract for marine engineering support on its west African Tortue development.

Bahri Data and DNV GL are to jointly leverage their big data capabilities for safety, quality and compliance.

GE and Apple have partnered to deliver predictive data and analytics from Predix to the iPhone and iPad. A Predix SDK for iOS is available for third party developers.

Honeywell and Hyperion Group have signed a three-year agreement to extend the range of solutions sold in Russia. Hyperion is to sell and support Honeywell Connected Plant UniSim Design Suite.

Key Energy Services has chosen Mix Telematics’ MiX Fleet Manager to improve fuel economy and fleet efficiency and to ensure ELD compliance for its vehicular fleet.

DNOW is now an exclusive international distributor of Badger Meter flow measurement and control technologies.

XACT Downhole Telemetry, Tendeka and Enventure Global Technology have formed a ‘micro-alliance,’ the Well Performance Network to offer a collective solution for ‘challenging’ wells.

EPI Group and ONYX have signed an agency agreement. EPI Group is now part of the ONYX portfolio of services within Malaysia.

As a result of their expanded collaboration earlier this year, SAP has upgraded its DGX-1 systems in St. Leon-Rot with Nvidia’s Volta-architecture Tesla V100 GPUs. The system is used as a testbed for SAP’s Leonardo machine learning foundation.

Singapore-based Wefic and Kinetic have launched a service facility for high-end offshore wellheads in Australia.

WellAware and CheckPoint have signed a partnership to offer pump rate monitoring and control for CheckPoint’s pumps. WellAware’s industrial IoT lets operators monitor and adjust pump parameters remotely, reducing downtime and cutting costs.

Wood Group has won a $12 million FEED contract from Honghua Group on its LNG platform development in the West Delta area of the Gulf of Mexico.


Standards stuff...

Energistics ETP ratified. OGC WFS/CAT validators. IEEE fog computing work group. OSGeo/IGU team on open geo-data and GeoForAll. IIC reports on testbeds. W3C spatial data on the web ontologies.

The Witsml special interest group has ratified the Energistics transfer protocol (ETP) implementation specifications for Witsml v2.0 and Witsml v1.4.1. The SIG is working to add OData-based query for ETP v1.2 and on leveraging the Prodml wireline formation test run object. Certification and compliance were also discussed. Energistics is now looking at how standards can help companies operate in a technology environment spanning data lakes in the cloud. A data assurance and analytics community of practice is to pilot ‘real soon now.’ More from Energistics.

OGC, the Open Geospatial Consortium, has published beta validators for its various standards including WFS, its web feature service, and CAT, the catalogue service. OGC is also asking for comment on TimeseriesML 1.2, an interoperable exchange format for a range of time series data exchange requirements and scenarios.

A new IEEE working group has formed to create fog computing and networking standards based on the OpenFog Consortium’s reference architecture. The reference architecture is designed to enable the ‘data-intensive’ requirements of IoT, 5G and AI applications. The open, interoperable, horizontal system architecture will distribute computing, storage and networking closer to the users along a ‘cloud-to-thing continuum.’

OSGeo, the Open Source Geospatial Foundation, and the IGU, the International Geographical Union, are to collaborate on the distribution and use of open geo-data and on the development of related GIS and remote sensing software. The bodies will also support the GeoForAll initiative.

The Industrial Internet Consortium has just reported on its ongoing series of testbeds. These include a time-sensitive networking (TSN) testbed of an enhanced ‘deterministic’ Ethernet conducted by National Instruments and Bosch Rexroth. Other testbeds include equipment condition monitoring and predictive maintenance with IBM Watson. Network security has been tested against Microsoft’s Stride IoT security analysis tool.

The W3C’s spatial data on the web working group has published a recommendation of a ‘time ontology in OWL’ specification. The ontology provides a vocabulary for expressing temporal facts such as ordering, duration and position in time. The working group has also published a recommendation of the ‘semantic sensor network ontology’ for describing sensors and their observations, with applications in ‘satellite imagery, large-scale scientific monitoring, industrial and household infrastructures, social sensing, citizen science, observation-driven ontology engineering, and the web of things.’
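
As a rough illustration of what the sensor ontology enables (the ‘ex’ namespace, sensor and observation identifiers below are invented for the example, and rdflib is assumed to be installed), a single downhole temperature reading could be expressed with the SOSA/SSN vocabulary in Python:

    # Minimal sketch: one sensor observation expressed with the SOSA vocabulary.
    # The 'ex' namespace and all identifiers are hypothetical.
    from rdflib import Graph, Literal, Namespace
    from rdflib.namespace import RDF, XSD

    SOSA = Namespace("http://www.w3.org/ns/sosa/")   # W3C/OGC sensor vocabulary
    EX = Namespace("http://example.org/well/")       # hypothetical namespace

    g = Graph()
    g.bind("sosa", SOSA)
    g.bind("ex", EX)

    obs = EX["obs-001"]                              # hypothetical observation id
    g.add((obs, RDF.type, SOSA.Observation))
    g.add((obs, SOSA.madeBySensor, EX["dts-fiber-1"]))
    g.add((obs, SOSA.observedProperty, EX["bottomhole-temperature"]))
    g.add((obs, SOSA.hasSimpleResult, Literal(98.6, datatype=XSD.double)))
    g.add((obs, SOSA.resultTime, Literal("2017-12-01T12:00:00Z", datatype=XSD.dateTime)))

    print(g.serialize(format="turtle"))

The resulting Turtle can be loaded into any SPARQL-capable store alongside other ontologies.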


Safety first ...

Fieldbit Hero for BP. ProcessMap for TNT. CSB reports on Torrance. California refinery standards.

BP reports enhanced safety in field service operations following trials of Israel-based Fieldbit’s ‘Hero’ augmented reality and knowledge capture solution. Hero lets a remote specialist see what a field technician sees for accurate diagnosis. Instructions can be fed back to the operator via a mobile device or through ODG R7 ‘smart’ glasses.

TNT has deployed ProcessMap’s enterprise data intelligence cloud software for risk, health and safety management. The solution replaces multiple disparate manual systems and consolidates safety data in real time.

The US Chemical safety board has released its final report and video investigation of the 2015 explosion at ExxonMobil’s refinery in Torrance, California. While there were no fatalities, the blast caused serious damage to the refinery which ran at limited capacity for over a year, raising gas prices in California and ‘costing drivers in the state an estimated $2.4 billion.’ CSB recommended the operator deploy a ‘more robust process safety management system’ and concluded that the unit was operating ‘without proper procedures.’

Title 8/section 5189.1 from the California occupational safety and health standards board strengthens workplace safety and health at oil refineries with a framework for ‘anticipating, preventing and responding to hazards at refineries.’


Nubeva gets Chevron Tech Ventures cash

StratusEdge’s ‘cloud service chaining’ technology secures oil and gas cloud migrations.

Chevron Technology Ventures is backing San Jose, California-based Nubeva, whose StratusEdge software has been developed to support oil and gas cloud migrations. StratusEdge extends a company’s data center security controls into the cloud, preserving investments in security capability, policy and personnel. StratusEdge’s ‘cloud service chaining’ technology plugs gaps in current networking and control functionality in cloud platforms.

StratusEdge monitors packet traffic between virtual machines, networks and platform services such as .NET/PaaS. Security teams can deploy the same ‘next-generation’ firewalls or complete security technology stacks inside their Microsoft Azure or Amazon AWS clouds as they do in their private data centers.

Nubeva CEO Randy Chou said, ‘A migration to the cloud does not necessarily mean replacing and re-architecting everything. We help companies extend existing security and visibility investments into the cloud and to hybrid datacenters.’ The CTV Catalyst program supports early-stage companies working on technologies that can directly benefit the oil and gas industry.


Quorum acquires WellEz

Cloud-based well analytics extends digital oilfield footprint. myQuorum Design Studio announced.

Houston-based Quorum Software has acquired WellEz, provider of cloud-based well lifecycle analytics. WellEz CEO Charles Jeffery said, ‘Customers can now selectively and logically expand their digital oilfield footprint and benefit from an integrated suite of cloud-based applications including AFE, production, land, and business intelligence.’ WellEz expands Quorum’s product portfolio with well lifecycle reporting, downtime and asset tracking and data-driven wellbore schematics. WellEz reinforces Quorum’s ‘cloud-based, mobile-first’ approach that integrates with clients’ business processes. All of the WellEz team are to join Quorum.

Quorum recently unveiled its myQuorum Design Studio at the Microsoft technology center in Irving, Texas. Design Studio extends Quorum’s ‘persona-based’ platform with a workbench and software development kit. Quorum claims that 80% of processed US natural gas is accounted for by its software. Quorum is a portfolio company of Silver Lake/Kraftwerk.


Back to school

Boxley Group CompetencyIQ. Landmark STEPS. SEG Evolve. Lloyd’s Register VR safety simulator.

Houston-based Boxley Group reports that a major North Sea operator received ‘near exemplary’ status from the UK regulator for its drilling and completions competency management program. Boxley’s patent-pending CompetencyIQ technology and processes were key to the success and were described as ‘industry leading.’

STEPS, a component of Landmark’s university outreach program, fosters geoscience excellence by facilitating research with pre-packaged datasets, project outlines, software and support. The theme for 2017/18 is big data in E&P. Tenders are open now.

The Society of Exploration Geophysicists’ Evolve program, a joint venture with Halliburton, leverages the ‘iEnergy’ cloud for virtual, real-time collaboration between scientists and engineers. The SEG is recruiting teams for the 2017-2018 session and will provide data, software and training materials along with an online forum for discussion and technical mentoring. Teams will present their results at the SEG’s 2018 annual meeting in Anaheim.

Lloyd’s Register launched its new virtual reality safety simulator at the 2017 Offshore technology conference in Houston. The VR safety simulator is said to support training and knowledge transfer in the energy industry. Users experience real-life challenges in a ‘non-threatening’ environment.


Artificial intelligence and the oil check

Why ‘predict’ when you can measure? Spectro’s MicroLab claims best of both worlds.

Much AI hype turns on the use of big data in equipment maintenance. So it’s interesting to hear from Spectro Scientific about its specialist hardware and software combination that provides a direct measure of rotating machinery health, using, inter alia, AI.

Spectro Scientific’s MicroLab is an automated engine oil analyzer that provides multiple measurements of chemistry, viscosity and contaminant concentrations. New V11 software adds signal processing methods, automation and artificial intelligence to provide comprehensive results in less than 15 minutes. Detailed oil and machine condition information enables companies to take maintenance actions immediately, minimizing downtime and lowering costs. Onsite oil analysis is said to safely extend the time between oil changes, reducing oil expenditure and helping to meet environmental constraints on oil disposal.

A new user interface permits input of external data to produce consolidated fluid analysis reports that include several oil and equipment condition parameters not previously available in the MicroLab test suite. The update includes a new, easy-to-read report format with improved historical trends.


Weatherford, Intel and the IoT oilfield gateway

Gateway embeds Intel security, Wind River cloud with ForeSite production optimization.

At the recent IoT Solutions World Congress in Barcelona, Weatherford IT head Colin Tait presented a digital oilfield technology collaboration with Intel that has resulted in the Weatherford IoT gateway, a means of transmitting data from production equipment in the field to the cloud or data center. The gateway leverages Intel’s Secure Device Onboard service and the Intel Wind River Helix Device Cloud for device management. Data can be visualized and analyzed with Weatherford’s ForeSite production optimization software.

Tait said, ‘By harnessing cloud computing, advanced analytics and the IoT, we can build an end-to-end digital oilfield solution that yields greater efficiencies across the upstream oil and gas sector.’ The ForeSite platform, released by Weatherford in May 2017, connects and analyzes data from across the production ecosystem to maximize asset performance. Initial focus is on rod-lift systems with planned expansion to other forms of lift, well management and optimization at the reservoir and surface-facility levels.
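
As a rough, generic sketch of the field-gateway-to-cloud telemetry pattern described above (not the Weatherford/Intel implementation; the ingestion URL, credential and payload fields are all hypothetical), a gateway might push a rod-lift reading as follows:

    # Generic illustration of a field gateway posting telemetry to a cloud
    # ingestion endpoint. Endpoint, token and payload fields are hypothetical.
    import time
    import requests

    INGEST_URL = "https://ingest.example.com/v1/telemetry"   # hypothetical endpoint
    API_TOKEN = "replace-me"                                  # hypothetical credential

    reading = {
        "asset": "rod-lift-17",
        "timestamp": time.time(),
        "strokes_per_minute": 6.2,
        "polished_rod_load_lbs": 18500,
    }

    resp = requests.post(
        INGEST_URL,
        json=reading,
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        timeout=10,
    )
    resp.raise_for_status()
    print("accepted:", resp.status_code)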


DNV GL’s 1,000-strong digital solutions unit

New unit to boost Veracity cloud data platform. GE scales back Predix.

GE has previously reported having ‘up to 20,000’ developers working on its Predix IoT platform. DNV GL, more modestly, reports a new, ‘1,000-strong’ dedicated digital solutions organization (DSO). DSO is led by Elisabeth Heggelund Tørstad, previously CEO of DNV GL’s oil and gas unit. DSO will manage the new ‘Veracity’ platform. Veracity, roughly equivalent to GE’s Predix, is designed to ‘extrapolate meaning’ from the user’s data and to be a data source for other DNV GL applications, including software-as-a-service.

DSO is also concerned with DNV GL’s own digital transformation. CEO Remi Eriksen said, ‘Data is the raw material of the 21st century. It is the foundation and driver of the digital transformation and forms the basis of value creation.’ One of Tørstad’s first tasks is to hire a Chief digital transformation officer.

Over at GE, the picture is less rosy. Speaking at an investor update in New York this month, CEO John Flannery announced an exit from $20 billion of ‘non-core or smaller businesses.’ While Predix remains key to GE’s future, its scope will no longer include ‘vertical-specific solutions for adjacent industrials.’ Some $400 million in costs are to be eliminated from the Predix unit.


Baker Hughes leverages Matlab ML/NN toolboxes

Huge savings expected from failure predictions derived from terabyte frac fleet training data set.

GE unit Baker Hughes reports that it has used Matlab’s Statistics and Machine Learning and Neural Network toolboxes to predict failure and optimize maintenance of its frac fleet. High-pressure pumping equipment accounts for some $100,000 of the $1.5 million total cost of a truck. To monitor the pumps for potentially catastrophic wear and to predict failures, a terabyte data set from ten operating trucks was analyzed with a neural network.

Baker Hughes’s Gulshan Singh said, ‘Matlab converted unreadable data into a usable format and allowed us to automate filtering, spectral analysis and transform steps for multiple trucks and regions.’

Singh sees many advantages in using Matlab. ‘The first is speed; development in C or another language would have taken longer. Matlab also helped automate the processing of large data sets. Finally, Matlab offers a variety of technologies for working with data, including basic statistical analysis, spectral analysis, filtering, and predictive modeling using artificial neural networks.’

MathWorks helped Baker Hughes develop a script for parsing binary sensor data. Signal processing determined which signals in the data had the strongest influence on equipment wear and tear. Neural network analysis found that pressure and vibration sensor data were the best predictors of failure. The application is expected to bring savings of ‘more than $10 million per year’ and ‘reduce overall costs by 30–40%!’
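
The workflow above is Matlab-based. As a simple Python illustration of the general approach (not Baker Hughes’ code; the feature names and data below are invented), a small feed-forward neural network could be trained on pressure- and vibration-derived features like so:

    # Illustrative only: a generic analogue of the workflow described above,
    # not Baker Hughes' Matlab code. Features and labels are synthetic.
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier

    rng = np.random.default_rng(0)

    # Hypothetical feature matrix: columns are summary statistics derived from
    # pressure and vibration signals (e.g. RMS levels, spectral peaks).
    X = rng.normal(size=(1000, 6))
    # Hypothetical labels: 1 = pump failed within the following maintenance window.
    y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(scale=0.5, size=1000) > 1.0).astype(int)

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # Small feed-forward network, standing in here for Matlab's neural network
    # toolbox in this sketch.
    clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000, random_state=0)
    clf.fit(X_train, y_train)
    print("held-out accuracy:", clf.score(X_test, y_test))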

