The opening plenary session of the 80th annual conference of the European Association of Geoscientists and Engineers (EAGE) was graced with a royal presence in the person of HRH Prince Joachim of Denmark. In a short but to-the-point address, the Prince observed that although Denmark was undergoing a ‘rapid transition from fossil fuel to renewables … until green energy can fully meet our needs, oil and gas will remain the energy backbone of modern society.’
EAGE president Jean-Jacques Biteau announced a registered attendance of 4500 and an astonishing 1,229 papers and 500 posters. Coverage extends beyond oil and gas, with sessions on geomechanics and geothermal and a renewed focus on geology, but ‘we want to keep our brand, which is mostly geophysics*’. The EAGE is also ‘modern’ with special events covering machine learning, AI and more.
The ensuing set piece was a ‘debate’ (in reality there was little debate and ‘no questions’) hosted by Danish environmentalist Martin Breum who observed that the ‘transition era’ has begun as ‘Equinor has taken over Statoil, at least in name.’ Breum noted the paradox of a ‘growing demand for oil and gas at the same time as climate change is at top of the agenda’.
Arnaud Breuillac stated that Total’s role is to provide affordable energy for all. Breuillac cited a study by the Carbon Tracker organization to claim that Total is the only major that has ‘fully integrated [the COP 21] 2°C scenario into its strategy’. This is done by improving energy efficiency and reducing carbon in operations along with a ‘credible’ (i.e. profitable) reduction of oil and gas use over the next 20 years. Total does not see any short/medium term impact to its business. If, as the IEA forecasts, renewables produce 20% of the world’s energy in 20 years’ time, that means that 80% will be oil and gas! If it is profitable to go green faster, ‘Total will be at the forefront.’
Breum quizzed Statoil/Equinor’s Jez Averty on the name change. We are on a ‘journey’ across an uncertain landscape of scenarios and outcomes. In 2050 the world may see anything from 60 to 120 mmbbl/day of demand. Meanwhile, carbon prices, photovoltaics and e-vehicles are on the up. But so are coal, oil and CO2. ‘Equinor’ reflects a revised strategy addressing high value, safe and low carbon energy needs including new renewables business areas with an ‘ambitious target’ for capex share across renewables by 2030. The aim is for both profitability and ‘sustainability.’ Breum expressed puzzlement as to what happened to the ‘Stat/state’ part of the old name.
WoodMac’s Paul McConnel was up next, expounding on the latest Global Trends/Energy Outlook surveys. These foresee peak demand coming first for coal and later for oil and gas. Global CO2 will continue to rise and the Paris (COP21) targets will not be met, even though electric and renewables are growing. But there are many uncertainties and portfolios need to be flexible. ‘Uncertainty is the word. Oil, coal, gas and renewables all interact.’ Breum pressed on the likelihood of a drop in demand for oil. This is possible – with the price maybe down to $5-15/bbl before peak oil finally catches us. Different companies have already adopted widely different stances. Dong has already made a complete transition to renewables. Total and Equinor have made some moves but are still basically oil and gas companies. Exxon has unashamedly adopted a ‘last man/last barrel standing’ position. Total does not like the last man standing theme and is working on CCS in a JV with Equinor. Equinor does not like this either, ‘we need a coordinated policy response now to CO2’. The key is to ‘turn CO2 into a value chain and asset’ (thermodynamics notwithstanding!). Total has a new unit investing in solar, wind, batteries (SAFT) and the ‘battery of the future’ all of which will ‘integrate the value chain alongside fossil fuel.’
McConnel wondered aloud why oils should be into renewables. They may not have much choice if electric vehicles (EV) get popular. How will oils respond to an ‘evaporating’ customer base? EVs ‘cost virtually nothing to run’ and wind costs are coming down. Breuillac agreed that EVs’ impact will be ‘massive,’ perhaps accounting for 6-8 million bopd by 2040. While this is significant, it depends where the energy will be coming from. Today, much electricity in China comes from coal. Total plans to integrate across the value chain because ‘we don’t know where the disruption will come from.’ One significant development is Total’s involvement with French smart grid startup Greenflex. Averty opined that a ‘revolution’ is needed to accommodate a 2° scenario, one that decouples prosperity from energy use. Having said that, what happens in Norway is ‘totally irrelevant’ in the greater scheme of things. Even the EU is ‘a bit’ irrelevant. What counts is what happens in China and India. McConnel stated that China is going for an energy transition with lots of EVs and a massive investment in batteries and solar. Total tests all new projects for energy efficiency using a €30-40 carbon price. The company has also established a technology hub in Copenhagen following its acquisition of Maersk. ‘Our message to the young of today is that we are not an industry of the past.’
* ‘EAGE’ originally stood for the European Association of Exploration Geophysicists.
Martin Breum was back the next day to compere the EAGE ‘Digitalization of the E&P Industry Forum’. While ‘digitalization’ is the big thing today, BP’s seismic imaging guru John Etgen pointed out that the industry has been digital since the 1950s with seismic as the original ‘digital business.’ Today, BP views digital as a way to stay competitive. Digital barrels are the cheapest in the portfolio, even in a world of relatively abundant resources and production. Digital has proved its worth in interpretation and in production optimization. In the future, ‘machine-driven’ solutions will enable the ‘Connected Upstream®.’ Breum asked how many of BP’s 5000 employees were involved in the digitalization effort. In fact, there are only some 50 ‘hard core computer nerds’ but another 2,500 engineers are involved.
Schlumberger’s Ashok Belani gave a less nuanced, full-Monty sales pitch for the digital capabilities that ‘allow us to disrupt innovation and business processes.’ Digitalization of E&P is ‘substantially different’ to previous digital work. ‘We should work with all North Sea data 100% of time but we don’t.’ ‘We should work with terabytes all of the time not just today’s multi gigabytes.’ Data is currently loaded into point applications. This is not how things will work in the future. In the future, all data will be available to all apps all of the time. Data will be exposed to machine learning and apps will become ‘interfaces to humans.’ This was demonstrated with ‘AI for tectonics’ automatic fault picking and top salt interpretation. This is ‘not rocket science, all are doing this.’
Chief geophysicist Darryl Harris stated that Woodside is reaping large benefits from data science. Data-driven decision making means avoiding pre-conceived ideas and misconceptions. Woodside has hired lots of young data scientists of whom you can ‘ask any question.’ The idea of ‘citizen data scientists’ is also taking hold.
Repsol’s Francisco Ortigosa is also keen on the democratization of computing. There is currently too much software for seismic imaging. Repsol alone had some 27 apps developed for geophysical high performance computing – which were only used by 12 people. Today, these have been combined and made available in the cloud to Repsol’s 500-strong team of geoscience professionals. This is bundled in a joint venture with Microsoft as Inventemos el futuro (‘Let’s invent the future’), a democratization of geophysics with ‘fully data-driven automatic seismic interpretation.’ The cloud-based tool has ‘picked 90 horizons in under 5 minutes’ sans human intervention. Repsol has also moved all of its data to Microsoft’s Azure Dublin data center, which adds Microsoft security to its own infrastructure security.
Total SVP Michael Borrel stressed the importance of the EAGE as a key event for Total, having just completed the acquisition of Maersk. Total is currently spending around €300 million on upstream R&D and some €30 million on digital. For Total, digital transformation is about ‘iPhones, Windows phones (sic)…’ and is ‘simply an enabler of more efficient, safer operations and more profit’. A topsides data lake is ‘changing the way we work.’ Other initiatives of note include Maersk Drilling’s teaming with IBM and a collaboration with Google in the cloud. Borrel cited the oft-repeated notion that current work practices break down as 85% repetitive tasks and 15% creative tasks. The ‘hope and expectation is that we can reverse this and spend 85% on creativity*.’
John Etgen tempered the enthusiasm for bringing the big IT organizations on board. These are not the only resources, even though there may be some intersections. If you focus too much on the Microsoft/Google ecosystem you are likely to miss much of the potential. BP’s digital business and venture capital arm try to look beyond the public cloud/IT providers.
Belani countered that soon most compute infrastructure will be in the cloud, so the faster you move, the better off you are. Infrastructure is a thing of the past; the way forward is a technology stack that combines many ecosystems and lets ‘upwards of 100 companies’ work together. But Belani was only just getting going. ‘Oil and gas will NEVER lead in the digital world. There is NOTHING, NOTHING in oil and gas that is utilizing the cloud fully today. There is no question as to whether we are leading, we are not even using it! There is all this unused capability, get on with it. If we don’t adopt it as a priority we won’t get people with the right capability.’
Etgen observed that the days of the traditional oil and gas company are numbered. ‘We produce energy now, we don’t need to lead in machine intelligence when we can harvest the technology’.
On the topic of security, Etgen observed that many large enterprises have been hacked. Nobody can claim to be 100% safe. Security is not really a cloud issue. Etgen added that he did not believe that the transition to the cloud would go as fast as some think/advise. No one will deploy HPC exclusively in-house either. There will be a shift to other near-shore suppliers and the cloud. You should think of the computer as a printer or a toaster; sometimes in-house is better. For service providers the cloud is OK, but for specialized R&D, in-house is better.
Harris opined that the Googles and Amazons have been working on security for longer than we have. It is arrogant to think that we could do better. Belani agreed, it’s their profession to do this. Data is safer in the cloud than inside an IOC, let alone smaller companies. Gmail is one of the safest platforms around. Borrel complained that inside Total ‘We are not allowed to use WhatsApp, although it is the most secure system in the world!’ Etgen added that security can be made a non-issue. The real problem is getting the right commercial terms and the fact that many have vested interests in perpetuating the status quo. Harris agreed that cost is a major issue, ‘it can be cheaper to do stuff in house than in the cloud’. Ortigosa disagreed, ‘the cloud is much, much cheaper.’ Belani allowed that an FWI application may not be available in the cloud, but for a 1,000 CPU cluster, there is no comparison. Etgen looked on skeptically.
Breum turned the debate around to the question of what the role of the geoscientist would be in a digital future with an entreaty to ‘answer with all sincerity!’ Borrel answered that, in the near and medium term, computing will allow geoscientists to create more value, by taking out the grunt work. He was less sure about longer term prospects for geophysicists. Harris noted that good geoscience is coming up with hypotheses and testing them. What happens when the computer generates the hypotheses? This is maybe some way down the road. Belani stated that new reserves will be found by individuals, not by machines. Machines will make the geoscientist’s life more interesting. Automation leaves more time for judgement. Is geoscience going to go away? No! I don’t know why people think like this! For Etgen, the ‘bandwidth’ of the human eye-brain system is hard to beat, ‘especially for driving insights.’
Breum asked what sort of skills are required today. Etgen observed that jobs have always changed. Today, we still need people trained ‘classically’ but also folks trained in data science. In the field of seismic imaging, the trend towards hyper-specialization needs to stop although ‘there is no easy answer to this one’.
Howard Leach related how BP has swung back and forth over the years from an ‘asset-focused’ organization circa 2000, when geoscience and drilling were co-located in an asset team. A decade or so later, BP ‘re-balanced’ with a more function-based organization, centralizing processes for risk management. This then exposed the ‘challenge of cross-function integration’ and too much multi-tasking, with ‘10 balls on the football pitch’. Individuals experienced ‘contextual overload’, working on one problem for an hour, then switching to another, reading email and so forth. BP also noted that in a data room, people work well together outside of standard processes. The company is now trying to leverage this finding with ‘LEAN’ processes with the objective of delivering a specific product, cutting out the multi-tasking. An agile approach that delivers a ‘product’ inside a week.
Tim Dodson (Statoil/Equinor) observed that in the last decade the context, and expectations, have changed. Companies are at a crossroads of both an energy transition and a digital transformation. Regarding digital, Equinor’s stance has changed since last year. Today, Google and others have overtaken oil and gas, which has lost its lead in big data. Equinor has engaged in self-examination to identify skill gaps. While there are ‘no obvious gaps’ there is a need for people who can take on an integrator role across project, subsurface risk, politics and tax.
For ENI, as Luca Bertelli related, integration has been a multi-year journey that has resulted in a ‘design to cost’ exploration model, using capital effectively through short cycles. This aims to compress turnaround time to two years from discovery to FID and another 2.5 years to startup. Eni has moved from a sequential to a parallel approach with early project screening by a multi-disciplinary task force and by the application of ‘high performance computing!’ Conventional exploration is now conducted with an ‘unconventional-like’ approach to achieve early cash flow. Digitalization is an enabler of upstream integration, and Eni now ‘accepts additional risk mitigated by accelerated digitalization’. At Eni, a big 3D seismic imaging project is now done in 5 days, ‘it used to take 10 months’.
Rune Olav Pedersen (PGS) presented a less rosy picture from the standpoint of a seismic contractor that has seen a 60% drop in revenue and let go of 50% of its staff. The company has embarked on a ‘project-oriented’ organization, getting people to work differently, moving people around and training leaders. The company is also working on new stuff, marine vibrators, machine learning and big data, through JIPs, consortia and technology collaboration agreements. Here, there are challenges with IP, ownership and commercial models. Pedersen touched on a sore point re HSEE. ‘You all have your own safety standards and audits and you all audit the same thing! This is time-consuming for you and costly for us and it does not advance safety. Operators should share safety audit standards!’
Marc Gerrits, Shell EVP global, sees data as the fuel of ‘new and disruptive technology’ with new players, new platforms and ‘unlikely’ partnerships. However, while the value of big data is a given, it is not clear which collaboration model will prevail. When will we share, when will we treat data as providing a competitive advantage? How do we mitigate the risk of being digitally disintermediated? Gerrits agreed with Pedersen that HSEE was one of many examples of gross inefficiency. We need to standardize the safety audit and produce a win-win. We need our existing and new partners. ‘Nobody has monopoly of new ideas and best practices’. Shell no longer dictates ‘this way you do it’ to contractors, but now asks ‘what do you think? Can this be improved or done more cheaply?’
Data standards underpin Total’s multiple partnerships and “extended enterprise.” Total’s Pierrick Gaudin observed that e-standards are a must have for our business, although awareness has been low in the past. Total is working to rectify this situation with a firm commitment from management and the publication of company rules relating to e-standards. Total also now has a transverse organization to handle its data strategy and every new project embeds data management.
Ross Philo thanked Total as a ‘staunch supporter’ of Energistics and as a lead developer of the standards portfolio. Today Witsml, Prodml and Resqml all run atop the common technical architecture and support the combination of data in cross-functional workflows. Today there is a lot of hype regarding big data and AI. Some imagine that ‘somehow data will be magically transformed and that standards don’t matter anymore’. Nothing could be further from the truth. Users, whether humans or machines, must have trust in their data. The ‘best analytics are worth nothing with unqualified data’. Data standards are the key to a successful digital transformation. Here Resqml is envisaged as the focus of 3D gridding, static and dynamic reservoir analysis and geology. All Energistics standards now include a data assurance object and activity log so that users can track trust and establish confidence levels in their data.
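The data assurance idea can be sketched in a few lines of Python. This is a hypothetical illustration of the concept, not the actual Energistics schema: class and field names here are invented for the example.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List

# Hypothetical sketch of a data assurance record with an activity log,
# loosely inspired by the Energistics concept described above. Names are
# illustrative only, not the real Energistics object model.

@dataclass
class ActivityEntry:
    timestamp: str
    actor: str          # human user or automated process
    action: str         # e.g. "loaded", "QC-checked", "interpolated"

@dataclass
class DataAssurance:
    dataset_id: str
    confidence: float                     # 0.0 (no trust) .. 1.0 (fully qualified)
    log: List[ActivityEntry] = field(default_factory=list)

    def record(self, actor: str, action: str, confidence: float) -> None:
        """Append an activity and update the confidence level."""
        self.log.append(ActivityEntry(
            datetime.now(timezone.utc).isoformat(), actor, action))
        self.confidence = confidence

assurance = DataAssurance("well-123/gamma-ray", confidence=0.3)
assurance.record("qc-bot", "range and spike check passed", confidence=0.7)
assurance.record("j.smith", "manually verified against core data", confidence=0.95)
print(len(assurance.log), assurance.confidence)   # → 2 0.95
```

The point, as Philo made it, is that a consumer (human or machine) can inspect the log and the confidence level before deciding whether an analytic result built on the data is trustworthy.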
Francis Morandini provided an update on the use of Resqml in Total’s in-house developed Sismage-CIG integrated platform for geoscience. Sismage now does data management, interpretation, modeling and field monitoring. Such in-house developed software allows for rapid deployment of novelties and fixes from user requests. Total likes to cherry pick its interpretation tools. This is possible with Resqml’s open data model and other open source solutions. Most data types (including Petrel) can be exchanged and Resqml can easily be extended to other data types and data models using the open source FESAPI. Resqml is now used by large and small companies to develop in weeks what took months before. Total is seeking partnerships around the use and development of the technology.
Gaudin presented Total’s use of Witsml to stream data from remote drilling locations into its real time operations center in Pau, France. Total has its own Witsml database and a certification module for Witsml data streams. The RTOC is to be extended with machine learning and analytics as a component of Total’s ‘Ambition2025’ program. But there is still a need for data evangelization, both top-down from management and bottom-up from users. Questioned (by Oil IT Journal) on Resqml persistence (Energistics has traditionally focused on ‘data in motion’), Gaudin outlined a joint venture with Emerson/Paradigm on OpenDB, a 1:1 mapping of Resqml into an HDF5 data store. Currently ETP is “not too cloud-compatible”, hence the idea of using Resqml for micro-messages to and from the cloud.
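The ‘certification module’ idea, gatekeeping an incoming real-time data stream, can be sketched as a simple record validator. This is an assumption-laden illustration: the channel names, ranges and rules below are invented for the example and are not Total’s actual certification logic.

```python
# Hypothetical sketch of a certification check on an incoming real-time
# drilling data stream: each record must carry the expected channels and
# plausible values before being admitted to the store. Channel names and
# thresholds are illustrative, not Total's actual certification module.

REQUIRED_CHANNELS = {"time", "bit_depth_m", "hook_load_kN", "rop_m_per_h"}

def certify(record: dict) -> list:
    """Return a list of problems; an empty list means the record passes."""
    problems = []
    missing = REQUIRED_CHANNELS - record.keys()
    if missing:
        problems.append(f"missing channels: {sorted(missing)}")
    if "bit_depth_m" in record and not (0 <= record["bit_depth_m"] <= 15000):
        problems.append("bit_depth_m out of range")
    if "rop_m_per_h" in record and record["rop_m_per_h"] < 0:
        problems.append("negative rate of penetration")
    return problems

good = {"time": "2018-06-12T09:00:00Z", "bit_depth_m": 3120.5,
        "hook_load_kN": 890.0, "rop_m_per_h": 12.3}
bad = {"time": "2018-06-12T09:00:01Z", "bit_depth_m": -5.0}

print(certify(good))   # → []
print(certify(bad))    # flags missing channels and the bad depth
```

In a production stream the same checks would run per-record at ingest, with failing records quarantined rather than silently loaded, which is the trust-building role the certification module plays.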
The uptick in Argentinian shale gas activity forced EAGE president-elect Juan Soldo (YPF) to leave the EAGE board. Current president Jean-Jacques Biteau (Total) is to stay on another year. Biteau recalled the objectives of the EAGE’s strategy for the current year as globalization, membership and a ‘one stop shop’ for E&P knowledge and community. He also announced a new joint venture with the Petroleum Exploration Society of Great Britain, a machine learning workshop to be held in London in November 2018. The EAGE has now deployed Centium Software’s EventsAIR to manage its publishing and events. Treasurer Evert Muijzert reported ‘considerable losses’ at the holding company. Write-offs are being negotiated with the auditor in respect of opex, cancelled workshops and IT. A reduction in journal income and a ‘considerable negative’ impact from the EAGE’s student activities mean that the EAGE is now spending its (considerable) cash reserves. The balance sheet was down €2.4 million in 2018.
Paradigm has been using back-propagating neural nets for facies classification since 1991. Paradigm has now made Total’s multi-resolution graph-based clustering (MRGC) algorithm commercially available as ‘Facimage’. MRGC e-log facies classification was introduced (and patented) in Geolog Facimage in 2000. In 2016, SeisEarth introduced facies prediction with ‘democratic neural nets.’ Stratimagic included an unsupervised neural net for seismic facies classification leveraging patented technology from Total’s seismological guru Noomane Keskes. Current automated interpretation is usually done on post-stack attributes. Paradigm is now trialing convolutional neural nets for pre-stack data clean up and automatic interpretation. Pre-stack EarthStudy 360 local angle domain directivity gathers are used as input to the classifier. Principal component analysis was performed on 7,000 1x1 km images (using SqueezeNet). The approach improves shallow fault imaging compared with diffraction imaging and provides a clearer, automatically generated seismic volume.
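The dimensionality-reduction step in such workflows can be sketched with plain NumPy: flatten each image into a vector, centre the data, and project onto the leading principal components found by the SVD. The data here is synthetic; the real workflow ran on 7,000 seismic images with SqueezeNet-derived features.

```python
import numpy as np

# Minimal sketch of principal component analysis on image patches, as a
# stand-in for the feature-reduction step described above. Patch sizes
# and counts are arbitrary; the input data is random for illustration.

rng = np.random.default_rng(0)
n_patches, patch_pixels = 200, 64          # e.g. 8x8 pixel patches, flattened
patches = rng.normal(size=(n_patches, patch_pixels))

# Centre the data, then use the SVD to obtain the principal components.
centred = patches - patches.mean(axis=0)
U, s, Vt = np.linalg.svd(centred, full_matrices=False)

k = 5                                      # keep the top-k components
scores = centred @ Vt[:k].T                # each patch becomes a k-vector
explained = (s[:k] ** 2).sum() / (s ** 2).sum()   # fraction of variance kept

print(scores.shape)                        # → (200, 5)
```

The low-dimensional scores are then what a clustering algorithm (MRGC or otherwise) would group into candidate facies classes.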
Antoine Guitton (DUG) gave a presentation on deep learning in fault detection to a packed room. Deep learning, a class of neural nets, has been used for years in the industry. But now the technique can be extended to maybe hundreds of layers thanks to freely available software and powerful hardware. Enter the Google Inception architecture, deep learning for pattern recognition and classification. This is used to produce likelihood maps of faults. Will it be useful? Is machine learning invading our space? Will you keep your job? Wherever you have a filter, machine learning will help. ‘Low skill’ tasks like picking top salt are at risk. In fact most ‘picking’ will go away (hooray!) as the workflow is ‘parameterized’. ML will get you to an 80% solution right away. Will it replace everything? The answer is yes for IT-related functionality, no if a technique embeds physics, such as wave equation work where the ‘relationship between data and outcome is hard to comprehend’. Tools of the AI seismic trade include Dave Hale’s Java toolkit and labelled training data, ResNet 50 and Softmax classification. The technique gives ‘pretty reasonable’ fault extraction and is ‘good with good data’.
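Guitton’s ‘wherever you have a filter, machine learning will help’ remark can be made concrete. A deep net learns a bank of filters whose combined response yields the fault likelihood map; in the toy sketch below a single hand-crafted lateral-gradient kernel stands in for the learned model, and all parameters (kernel, logistic threshold, synthetic section) are invented for the example.

```python
import numpy as np

# Illustrative sketch of the 'likelihood map' idea: a filter response is
# computed over a seismic section and squashed to a per-pixel fault
# probability. A trained CNN learns many such filters; here one
# hand-crafted lateral-gradient kernel stands in for the learned model.

def fault_likelihood(section: np.ndarray) -> np.ndarray:
    """Return a per-pixel score in [0, 1], high where traces jump laterally."""
    # Lateral (trace-to-trace) gradient picks up displaced reflectors.
    grad = np.abs(np.diff(section, axis=1, prepend=section[:, :1]))
    # Squash to [0, 1] with a logistic; the 0.5 threshold is a free choice.
    return 1.0 / (1.0 + np.exp(-(grad - 0.5) * 10.0))

# Synthetic section: flat 'reflectors', with a vertical fault at trace 8
# simulated by vertically displacing the right-hand block.
section = np.tile(np.sin(np.linspace(0, 6, 16))[:, None], (1, 16))
section[:, 8:] = np.roll(section[:, 8:], 3, axis=0)

likelihood = fault_likelihood(section)
# The fault column scores higher than an unfaulted one:
print(likelihood[:, 8].mean() > likelihood[:, 4].mean())   # → True
```

A real network replaces the fixed kernel with weights fitted to labelled examples (as in the ResNet 50 / Softmax setup Guitton mentioned), but the output, a likelihood map over the section, has exactly this shape.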
We visited Matt Hall’s (Agile Scientific) popular machine learning hackathon and saw a compelling demonstration of a Geoteric-like clone developed in the two day Python hack. Read Hall’s blog on the event here.
Martyn Beardsell presented a potted history of Petrel, celebrating its 20th birthday this year. Petrel was born in somewhat mysterious circumstances when RMS developer Nils Fremming ventured to port RMS to the PC. This proved challenging at the time of (relatively) big iron hardware from SGI and others. ‘Few believed it would be possible.’ Working from his garage, Fremming produced Petrel V1.0 in 1998. It was unveiled at Petex in 1998 with a snappy ‘3D 4U ON PC’ tag line. (Note that Oil IT Journal, then Petroleum Data Manager, reported that ‘Shell liked Petrel’s corner point gridding and bought the first license’.)
Meanwhile Geoquest was working to port Geoframe to the PC as ‘iGeoFrame.’ By 2002, as Technoguide built out the subsurface workflow, it was ‘struggling’ with the geomodel. At which point, Schlumberger bought the company, grew the Petrel team and added its own technology (CPS3, FrontSim) and later some of the Eclipse code base ‘completing the seismic to simulation dream.’ There are now ‘hundreds’ of developers working around the world on Petrel whose success is down to ‘good software, openness, customer involvement and quality.’ The last item was met with a few giggles which Beardsell acknowledged with, ‘OK but Petrel is now a very stable product’. The next EAGE will be held in London from 3-6 June 2019.
© Oil IT Journal - all rights reserved.