Speaking at a recent meeting of France’s CRIP* organization, Jean-Baptiste Richard and Olivier Tran presented Total’s ‘Apollo’ project, a blend of DevOps and a major shift to the public cloud. The Apollo project is managed by Total’s Global IT Services division, a ‘solution maker’ for the group. Total currently has some 2,000 legacy apps running in an on-prem data center. Total’s roadmap envisages a rationalization of the portfolio and a ‘lift and shift’ of selected applications leveraging a ‘multi-cloud’ approach. For ‘strategically differentiating’ applications, the Apollo project will provide a framework for home-grown development in the cloud. Central to Apollo is DevOps, at once an ‘IT culture and state of mind’ in which IT works hand-in-hand with the business, sharing its objectives, and a practice of continuous application delivery and automated IT provisioning. DevOps in turn leverages ‘agile’ development, an iterative approach that spans data science, development and cyber security. ‘MLOps’, the machine learning equivalent, also got a mention.
Total is moving to the cloud because ‘everything is there’, from machine learning through containerization and identity management to ‘infrastructure as code’. The first trials were performed on Microsoft Azure, leveraging a continuous integration and deployment approach with Kubernetes orchestration, containers and ‘templatization’. Projects run under GitHub on Azure, leveraging tools such as SonarQube (quality), Veracode (security), Apache JMeter (test) and the Helm Kubernetes package manager.
Some twenty of Total’s developers underwent extensive on-site DevOps training. This experience has shown that DevOps is considerably more demanding than vanilla ‘Agile’, adding more constraints but resulting in a better team spirit and a shared responsibility for the delivered product. The process has resulted in satisfied users and coherent software components for application development. ‘The DevOps hype is not entirely unjustified’.
In 2019 the Apollo program proper kicked off, now running in the Google cloud and adding further security with a ‘DevSecOps’ component. In the Q&A, Richard intimated that, prior to Apollo, Total’s development was spread across multiple in-house teams and external service providers. Apollo means a shift to centralized, in-house development.
* CRIP is a Paris-based not-for-profit IT industry body.
Paul Cleverley, long-time researcher and upstream taxonomy/NLP guru, has launched Opportunity Finder (OF), an online tool that applies artificial intelligence to find new hydrocarbon plays within a corpus of unstructured text (papers, reports and presentations). OF is marketed through Cleverley’s start-up Infoscience Technologies.
OF leverages ensemble machine learning models and some 10,000 individual ‘clues’ for identifying potential evidence for source rock, maturation, migration, reservoir, trap, seal and hydrocarbon occurrence. A patent-pending method, inspired by DNA sequencing, is claimed to ‘go beyond traditional text extraction’, looking for patterns that match with potential, non-obvious oil and gas plays. Oxford, UK-based Infoscience Technologies was founded in November 2018 to apply deep geoscience domain knowledge and computer science to petroleum exploration, mining and other verticals.
I sat in on a meeting hosted by the International Energy Agency here in Paris to hear the great and the good discuss the energy transition. I don’t propose to summarize this right now, perhaps next month, along with the IEA’s informative analysis ‘Oil and gas in energy transitions’. Let’s just state what seems to be the consensus and say that the industry’s days are numbered, even if we don’t yet know what the number is.
Talking of energy in its broader sense I came across a graph that I found interesting. S&P Global Platts Analytics has just released its 2020 energy outlook. The press release includes a graph of energy prices. Not, as we are used to seeing, with oil in dollars per barrel and coal in dollars per tonne. This graph shows the cost of a unit of energy. During 2019, oil varied from $8 to $12 per MMBtu* with an average of around $10. Natural gas at the US Henry hub was a snip at around $2, although as natural gas gets shipped around the world it gets quite a bit more expensive, around $6 in Japan and the UK. Coal is cheap too, at a pretty constant $3.
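The conversion behind such a chart is simple arithmetic; a quick sketch, using round-number heat contents (about 5.8 MMBtu per barrel of crude and about 24 MMBtu per tonne of thermal coal — our assumptions, not Platts’ figures):

```python
# Rough conversion of quoted fuel prices to a common $/MMBtu basis.
# Heat contents are approximate round numbers, not Platts' data.
MMBTU_PER_BARREL = 5.8        # typical crude oil
MMBTU_PER_TONNE_COAL = 24.0   # typical thermal coal

def per_mmbtu(price, mmbtu_per_unit):
    """Convert a $/unit fuel price to $/MMBtu."""
    return price / mmbtu_per_unit

oil = per_mmbtu(60.0, MMBTU_PER_BARREL)       # $60/bbl -> ~$10/MMBtu
coal = per_mmbtu(72.0, MMBTU_PER_TONNE_COAL)  # $72/tonne -> $3/MMBtu
print(f"Oil:  ${oil:.1f}/MMBtu")
print(f"Coal: ${coal:.1f}/MMBtu")
```

With these round numbers, $60/bbl oil works out at just over $10/MMBtu, consistent with the Platts chart.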
Now I know that the world is turning away from fossil fuels. But note how expensive oil is compared to natural gas and coal. This has meant that producers of these cheaper fuels (US for natural gas, China and India for coal) have a huge incentive to use them. China is building new coal plants as is India. So is Japan. Even virtuous ‘green’ energiewende Germany is building a new coal fired generating station at Datteln.
Of course, the above prices are the price for raw heat. To compare these with renewables we have to look at the price of generated electricity. This is quite tricky, depending on geography, taxes and so on, and is pretty comprehensively covered elsewhere, by the IEA for instance. But there is a paradox here. While the reports all have it that wind and solar are now ‘competitive’ with fossil fuels for the electricity generators, this is definitely not the case for the consumer. I can heat my home (and I am afraid I do) much more cheaply with fuel oil than with electricity from the grid. OK, I should have photovoltaics on the roof, but these, although perhaps cheaper than grid electricity, would I think have a job competing with the cheap, untaxed calories that my pink diesel provides.
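To put rough numbers on that claim, here is a back-of-envelope comparison of the cost of a kWh of heat, with illustrative assumed prices (fuel oil at $10/MMBtu through a 90%-efficient boiler versus grid electricity at $0.18/kWh — our assumptions, not measured tariffs):

```python
# Back-of-envelope heat cost: fuel oil boiler vs resistive grid heating.
# All prices are illustrative assumptions, not quotes.
KWH_PER_MMBTU = 293.07  # 1 MMBtu = 293.07 kWh of heat

def heat_cost_per_kwh(fuel_price_per_mmbtu, efficiency=1.0):
    """Cost of one kWh of delivered heat from a fuel priced in $/MMBtu."""
    return fuel_price_per_mmbtu / (KWH_PER_MMBTU * efficiency)

oil_heat = heat_cost_per_kwh(10.0, efficiency=0.9)  # ~ 4 cents/kWh
grid_heat = 0.18                                    # assumed grid tariff, $/kWh
print(f"Fuel oil heat: ${oil_heat:.3f}/kWh")
print(f"Grid electric: ${grid_heat:.3f}/kWh")
```

On these assumptions, heat from fuel oil comes in at around a fifth of the cost of resistive heating from the grid, which is the paradox in a nutshell.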
Going back to oil, why is it so expensive? I guess it is demand and, in particular, demand for it as a mobile fuel for cars, boats and planes. With the exception of planes, oil is now at risk of being displaced by electricity generated from any of the above sources. My evidence for this comes from a fellow dog walker who works in the EU automobile sector, which is currently undergoing convulsions as manufacturers are forced by government to shift their production to e-vehicles. There is little evidence of this happening on the street, except for the paradox that there are more and more gas-guzzling SUVs on Europe’s roads. This is conjectured to be due to manufacturers dumping their last models before the new regulations, creating a likely CO2 blip that will see us through the next few tens of ppm of CO2 and tenths of a degree of warming.
Walking through a Paris suburb recently I came across a heavy-duty drilling rig operating on a rather large pad filled with service company kit. It was a geothermal well (oil exploration has been banned in France). They must have been testing the produced water as there was quite a lot of hot vapor coming off the operation. Subsequent research determined that this was a breakthrough test of the deepest geothermal target in the Paris basin, the Triassic, with water produced at somewhere around 80°C. The idea is that the deeper you go, the hotter the produced water. The hotter the water, the more energy. Pretty obvious really.
Even for the ‘obvious’, confirmation is always useful. I chatted with some heat pump specialists at GE Oil & Gas last year who were showing off some rather impressive industrial systems that manage to use the energy in hot flue gases. These waste heat recovery systems like their heat hot: several hundred degrees C at least; otherwise, forget it!
So how is it that the folks who promote low temperature geothermal energy, including incidentally the IEA through its HeatPumps.org resource, can get energy from tiny temperature differences of a few tens of degrees? Long-time readers of Oil IT Journal will perhaps recall my earlier quixotic tilting at the windmills of the heat pump brigade, in 2008 and 2013. Since these early rants, I have given the heat pump fallacy some further thought, and I think I have got it nailed. The heat pumpers rely on a property of the reversible air conditioning unit (the essence of the heat pump) called the ‘coefficient of performance’, the COP. This is often quoted as a number around 3 or 4 and is presented as an energy multiplier. In other words, for 1 kilowatt of energy input you get 3 or 4 kW out, a fantastic deal! But if you look into how the COP is defined there is a small problem. Wikipedia and other sources give the ideal COP for a cooling system as a ratio of temperatures, T(cool)/(T(hot)-T(cool)). As the temperature difference tends to zero (i.e. when there is no heat flow available), the COP actually increases. This is often cited as a benefit of such systems which are ‘most efficient’ at vanishingly small temperature differences.
This apparent paradox is due to a misunderstanding, or misrepresentation, of what the COP is measuring. As a ratio of temperatures, it is a measure of the gearing in the system. The COP is not a measure of energy (heat) at all**. The COP just tells you how cold your office will be for a given outside temperature. The energy used to do the cooling all comes from the mains electricity that is driving the pump. The shallow borehole, the serpentine under your lawn, the ‘air-geothermal’ fans or pipes going down into the ocean are simply fantastical artifices to fool the unwary.
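The behavior described, a COP that blows up as the temperature difference shrinks, follows directly from the textbook Carnot expression, COP(cooling) = T(cool)/(T(hot) − T(cool)); a minimal sketch (temperatures in kelvin; this is an idealized theoretical upper bound, not any real unit’s rating):

```python
# Ideal (Carnot) coefficient of performance for a cooling cycle.
# This is the thermodynamic upper bound, not a real unit's rating.
def carnot_cop_cooling(t_cool_k, t_hot_k):
    """COP = T_cool / (T_hot - T_cool), temperatures in kelvin."""
    return t_cool_k / (t_hot_k - t_cool_k)

# The COP grows without bound as the temperature difference shrinks:
for dt in (30, 10, 1):
    cop = carnot_cop_cooling(293.0, 293.0 + dt)
    print(f"dT = {dt:2d} K -> ideal COP = {cop:.1f}")
```

Running this shows the ideal COP climbing from about 10 at a 30 K difference to nearly 300 at a 1 K difference, which is precisely the ‘gearing’ effect the editorial describes.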
* Million British thermal units.
** Confusing heat and temperature is quite commonplace as a recent letter to Physics Today explained.
Contrary to what we may have implied in our last issue, Halliburton, far from being left out in the cold by Schlumberger’s land grab of OSDU*, is ‘working diligently’ on its OSDU commitments to the community and to the industry at large and will ‘continue to look for more opportunities and ways to make OSDU successful’. The statement was made in a blog post by Chandra Yeleshwarapu, senior director of customer experience and digital.
Yeleshwarapu reports that the idea of an Open Subsurface Data Universe (OSDU) germinated at the Landmark OpenEarth Community (OEC) founders meetings in 2018. Halliburton joined OSDU as an active member in December 2018 and later committed to implementing OSDU-compliant, cloud-native microservices, with plans for all Halliburton Landmark products and solutions to be OSDU compliant. OSDU will also be available within OEC projects, enabling the community of developers on OEC to write more OSDU-compliant applications and microservices.
Halliburton Landmark has showcased OSDU-compatible, cloud-native applications for AI/ML-based workflows that let data scientists build new models on top of OSDU. DecisionSpace 365 data foundation, integration foundation, assisted lithology interpretation and other apps are now ‘all running on OSDU’. DecisionSpace 365 can run on a customer’s own cloud subscription. The capability was jointly developed with Microsoft and lets users co-locate DecisionSpace 365 with the rest of their data and digital portfolio, including OSDU, taking advantage of corporate Microsoft Azure agreements. Yeleshwarapu concludes, ‘As new versions of OSDU are released, we plan that all of the applications will be compatible with and run on top of OSDU’.
* The Open subsurface data universe – see Oil IT Journal N° 250.
Speaking at a January 2020 earnings call, Barclays’ oil country analyst John David Anderson asked Schlumberger CEO Olivier Le Peuch about the new strategy for an ‘asset-light, more technology-driven’ business in North America and the ‘primary growth driver’ of digital technology. Anderson asked for more details on Delfi sales and the likely progression of revenues from the reservoir characterization business. [Le Peuch had intimated that these might double with the move to a subscription-based model.] Anderson asked for a roadmap.
Le Peuch demurred on the roadmap but did restate the ‘clear ambition’ to double Schlumberger’s digital revenue in the next few years. Furthermore, the transition to a software-as-a-service model should not negatively impact revenue. Le Peuch did acknowledge a ‘pause’ in digital revenue growth over the last couple of years, down to market conditions and the fact that ‘Delfi was not having the readiness and the breadth to expand into the marketplace’. These issues have now been addressed with four new Delfi products and the ‘new momentum’ that Schlumberger’s ‘open’ strategy is bringing. Le Peuch cited flagship client ExxonMobil’s acquisition of its ‘drilling digital products’ in North America. Le Peuch did not explain how a doubling of digital revenues could square with the recent announcements around the ‘open source’-inspired OSDU. But there again, Anderson did not ask that question.
In a separate announcement, Schlumberger has teamed with Dataiku* to ‘enable the E&P industry to build and deploy their own artificial intelligence solutions within the Delfi environment’. Dataiku AI’s availability in Delfi will let petrotechnical experts ‘build and extend workflows that leverage machine learning and data science’.
Dataiku is a New York-headquartered software house whose flagship product, the Data Science Studio, is said to ‘democratize’ access to data, enabling enterprises to build their own path to AI. DSS provides a simple interface for data wrangling, mining, visualization, machine learning and deployment. In December 2019 Dataiku received an equity injection from Google/Alphabet’s CapitalG late-stage growth venture capital fund, bringing Dataiku’s valuation to $1.4 billion and ‘unicorn’ status.
The deal confirms our long-held position that Delfi’s ‘cognitive’ tag has always been a marketing gimmick. Google’s investment in the AI boutique is also interesting in the light of regulators waking up to the GAFAs’ predatory acquisition of smaller potential competitors.
* Dataiku is a mash-up of ‘data’ and ‘haiku’.
A new publication from the US National Academies Press, ‘Deployment of Deep Decarbonization Technologies’ presents the proceedings of a 2019 Washington, DC workshop. The workshop explored the challenges and opportunities for deploying and scaling up technologies that could produce ‘deep’ (80% plus) decarbonization of the US energy sector by 2050. The 127-page free publication is a collection of presentations and discussions from academia and industry. One paper, from ExxonMobil’s Tahmid Mizan, highlighted energy efficiency as one of the ‘low-hanging fruits’, with significant future potential for carbon mitigation. Another possible avenue is substituting refinery heat inputs with low carbon electricity-produced heat or nuclear heat. Also of interest to the refiner is ‘blue hydrogen’ production (adding CCS to steam methane reforming units) and ‘green hydrogen’ production (using renewable electricity and electrolysis).
A new report, ‘Net Zero and Beyond: What Role for Bioenergy with Carbon Capture and Storage?’, from Chatham House, a London-based think tank, warns that ‘policymakers are in danger of sleepwalking into ineffective carbon dioxide removal solutions in the quest to tackle climate change’. The paper warns against overreliance on bioenergy with carbon capture and storage (BECCS). BECCS was mooted in the iconic COP 21 as the most promising route to sub-1.5°C global warming.
Equinor, in cooperation with partners Shell and Total, is studying CO2 storage on the Norwegian continental shelf (NCS). The Northern Lights project includes transport, reception and permanent storage of CO2. The storage project is part of the Norwegian State’s demonstration project ‘Full-scale CO2 handling chain in Norway’. A well is currently drilling into the Johansen formation on Equinor’s Aurora license (EL001) to study the reservoir’s suitability and capacity for CO2 storage.
The US Department of Energy’s (DOE) Office of Fossil Energy (FE) has announced $4 million in federal funding for national laboratories to collaborate with international partners on projects in the Accelerating carbon capture and storage technologies (ACT) initiative. The Lawrence Livermore National Laboratory (LLNL) is the chief beneficiary of the DOE’s largesse. The ‘Digital Monitoring of CO2 Storage Projects’ (DigiMon) project will integrate a broad range of monitoring technologies with data analytics to improve system cost and reliability for carbon storage projects. Another LLNL project is looking into re-using existing wells for CO2 storage (REX-CO2); a ‘procedure and tools’ for evaluating re-use potential will be developed. Yet another project, Assuring integrity of CO2 storage sites through ground surface monitoring (SENSE), involves ground surface movement detection combined with geomechanical modeling and inversion to study pressure distribution and hydraulic behavior of storage sites. Finally, LLNL and subcontractor, the University of Texas Bureau of Economic Geology, are participating in ACT on offshore monitoring (ACTOM), an effort to build ‘a web-based toolkit’ to ‘collect algorithms’ for designing offshore geologic storage monitoring programs. More from the Office of Fossil Energy website.
In a recent white paper, the IOGP calls on Europe to scale up its carbon capture and storage (CCS) effort. CCS will be essential to meet the Paris Agreement goals. It is a proven technology, with its roots in enhanced oil recovery. There are currently 18 commercial CCS projects in operation globally. The potential in Europe is around 134 Gt CO2, equivalent to 446 years’ worth of CO2 storage at the rate suggested necessary in 2050 by the European Commission. The EU should incentivize deployment and scale-up of CCS infrastructure to meet its climate objectives on time. In a separate announcement, the IOGP lashed out at a campaign by a network of environmental organizations for ‘Fossil Free Politics’ that aimed to restrict the IOGP’s right to ‘engage in the democratic and necessary public policy debate around energy and climate’.
OGCI, the Oil and Gas Climate Initiative, has launched a new program to ‘unlock’ large-scale investment in carbon capture, use and storage (CCUS). The OGCI’s CCUS KickStarter initiative is designed to help decarbonize multiple industrial hubs around the world, starting with hubs in the US, UK, Norway, the Netherlands and China. The aim of the KickStarter is to create the necessary conditions to facilitate a commercially viable, safe and environmentally responsible CCUS industry, with an early aspiration to double the amount of carbon dioxide that is currently stored globally before 2030.
In its January Monitor, Gaffney Cline enumerates current CCUS initiatives by Oxy Low Carbon Ventures, Abu Dhabi National Oil Company (ADNOC) and others to conclude, ‘the new wave of CCUS facilities shows the growing momentum around the technology and its role in supporting the energy transition’. However, ‘the reality is that much, much more is needed, with at least 2,400 million tonnes/year of CCUS capacity needed to be in operation over the next 20 years according to the IEA sustainable development scenario’.
The Oxy Low Carbon Ventures CCUS project is a joint venture with Svante, LafargeHolcim, and Total involving commercial-scale carbon capture and end-use at the Holcim Portland cement plant in Florence, Colorado. The project (actually, a ‘study’) will evaluate the cost of a facility designed to capture up to 725,000 tonnes of carbon dioxide per year. The CO2 is to be sequestered underground by Oxy. The project is to benefit from the US 45Q tax credit.
The Energy Futures Initiative (EFI), established in 2017 by former Secretary of Energy Ernest Moniz, has published a summary report titled ‘Clearing the air: Technological carbon dioxide removal R&D initiative’. The report proposes three broad approaches to CDR: ‘natural’ (trees, biomass), technologically-enhanced natural processes (ex-situ carbon mineralization, advanced crop cultivars, ocean alkalinity enhancement, and BECCS*) and ‘technological’ capture, including direct air capture (DAC) and electrochemical separation of CO2 from seawater. Moniz is suggesting a modest $10.7 billion effort over 10 years to come from multiple federal agencies including DOT, NASA, NIST and the Executive Office of the President (good luck with that!).
* Bio-energy with carbon capture and storage.
A new study from MIT, ‘The uncertain role of natural gas in the transition to clean energy’, examines the opposing roles of natural gas in the battle against climate change. Natural gas is touted as a ‘bridge fuel’ toward a lower-emissions future, but it is also a contributor to greenhouse gas emissions. Uncertainty in estimating current levels of fugitive methane makes it hard to evaluate natural gas’ true role. For the climate, ‘strategic choices must be made now about whether to invest in natural gas infrastructure’. MIT has looked at the uncertainty in methane monitoring to conclude that ‘present methods of controlling methane leakage need to improve by anywhere from 30 to 90%’. Methane is a valuable commodity and operators have some incentive to minimize losses. Additionally, intentional natural gas venting and flaring continues.
Crestone Peak Resources, with partners Project Canary and the Payne Institute for Public Policy at Colorado School of Mines, is to implement continuous monitoring of emissions during all phases of its Colorado oil and gas production. The pilot will use Project Canary’s emissions monitoring devices and real-time data capture. The Payne Institute is to act as independent steward of collected emissions data.
BP is to deploy continuous methane measurement across future BP-operated oil and gas processing projects to detect, measure and reduce methane emissions. Measurement, including gas cloud imaging (GCI), will be rolled out to all new major projects worldwide. The technology has also been tested at BP’s giant Khazzan natural gas field in Oman. Technology providers involved include Providence Photonics (Mantis VISR camera for flare performance monitoring), Flylogix’s over-the-horizon drone service, PrecisionHawk geospatial data analytics and methane sensor-carrying drones, SeekOps’ ultra-precise gas sensors, Rebellion Photonics’ gas cloud imager (static emissions monitoring), and Fieldbit and RealWear HMT-1 head-mounted VR tablets.
Gaffney Cline reports on the use of OPGEE, Stanford’s oil production greenhouse gas emissions estimator. OPGEE is an open-source, Excel-based tool that quantifies the carbon intensity (CI) of oil production with a breakdown of the various sources of emissions and their relative contributions. OPGEE takes up to 50 inputs to characterize a field’s production methods and evaluate CI across the field’s lifecycle. Gaffney Cline’s own analysis of multiple oil and gas supply chains suggests that some oil and gas has over six times the CI of others, meaning that total life-cycle emissions (including end-use) vary considerably. ‘Not all oil and gas is created, developed and operated equally’. An important consideration when companies are called on to produce ‘sustainability’ reporting.
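As a purely hypothetical illustration of what a carbon-intensity roll-up looks like (invented emission sources and figures, nothing like OPGEE’s actual 50-input model):

```python
# Toy carbon-intensity roll-up: sum per-source emissions and divide by
# the energy produced. Sources and figures are invented for illustration;
# OPGEE's actual lifecycle model is far more detailed.
def carbon_intensity(emissions_g_co2e, energy_mj):
    """CI in gCO2e per MJ of crude produced."""
    return sum(emissions_g_co2e.values()) / energy_mj

field = {  # gCO2e for a hypothetical production batch
    "extraction": 3.0e6,
    "flaring":    1.5e6,
    "fugitives":  0.5e6,
    "transport":  1.0e6,
}
ci = carbon_intensity(field, energy_mj=1.0e6)
print(f"Upstream CI: {ci:.1f} gCO2e/MJ")
```

The point of such a breakdown is exactly the one Gaffney Cline makes: two fields delivering the same energy can have very different CI depending on how much flaring, venting and energy-intensive production sits behind each MJ.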
PTAC, the Petroleum Technology Alliance Canada, Linear Motion Technologies (LMT) and Spartan Controls have received a CAD 2.1 million grant from the government of Canada to fund development of a ‘state-of-the-art, affordable and fail-safe’ electric valve system to eliminate methane emissions from existing pneumatic valves. The electric dump valve actuator (EDVA) development is also benefitting from grants and in-kind support from Alberta Innovates and other stakeholders, bringing total project funding to CAD 3 million. Houston-based LMT was established to commercialize patented shape memory alloy, aka ‘electric muscle’, and SmartRam actuator technologies.
The EPA is proposing to relax some of the Obama administration’s air regulations for oil and gas to remove ‘redundant’ requirements and reduce the burden on producers. The changes result from the EPA’s review of the 2016 New Source Performance Standards (NSPS) for the oil and natural gas industry, conducted in response to President Trump’s Executive Order 13783, ‘Promoting Energy Independence and Economic Growth’. The latest EPA findings are that the EPA previously failed to show that emissions from the transmission and storage segment of the industry cause or significantly contribute to air pollution. The proposal is to rescind emissions limits for methane, from the production and processing segments of the industry but keep the limits for ozone-forming volatile organic compounds. More than 100 speakers provided oral testimony at a public hearing in Dallas last October. Transcriptions of the proceedings and presentations are available on the US Regulations.gov https://www.regulations.gov/docket?D=EPA-HQ-OAR-2017-0757 website. More information, including a pre-publication version of the Federal Register notice and a fact sheet, is available at https://www.epa.gov/controlling-air-pollution-oil-and-natural-gas-industry.
Industry association Our Nation’s Energy Future (ONE Future) reports that its members reduced their methane intensity by 41% from 2017 to 2018. The improvement, in the face of increasing natural gas production, comes from ‘upgrading and replacing pipeline infrastructure, as well as actively seeking and repairing system leaks’. NW Natural has joined the ONE Future methane emissions reporting initiative. ONE Future recently published its Methane Intensity Report for 2018.
Yokogawa’s KBC unit has produced an excellent analysis of the issues surrounding the energy transition and their impact on the oil and gas sector. We cherry-pick some insights from the 26-page Industrial energy transition manifesto. Energy efficiency is the most economic mitigation strategy and is viable today with a zero (or negative) carbon price. KBC’s own EMISS benchmarks find that upstream and refining are significantly worse than petrochemicals and ‘even a top performer can save 10-15% of energy use worth $20-30 million per year’. Energy efficiency is a ‘robust and low risk option’. While carbon capture and storage gets the thumbs-up from KBC as a viable (albeit expensive) mitigation strategy, carbon capture and re-use is deprecated. While the economics can be made to appear reasonable, re-use is a thermodynamic aberration as the multiple energy conversions ‘lead to enormous system losses’.
KBC observes that, ‘the oil and gas industry will one day face decline’, although the speed and severity of decline is open to debate. Declining industries can still generate value so long as business strategies are adapted. Four paradigms for the oil and gas industry, based on the nature of the decline and the degree of shift in products produced, can be envisioned. KBC applies an end-game* methodology to the oil and gas industry to map a matrix of decline scenarios from an un-managed collapse of the industry to a ‘harvest and protect’ niche leadership position.
As a software house, KBC sees digitalization as germane to the energy transition. Digitalization will play a major role in accelerating efficiency, driving action by providing data transparency across portfolios. Artificial intelligence will show improvement opportunities and automation will ensure the systematic achievement of maximum potential.
* End-Game Strategies for Declining Industries, by Kathryn Rudie Harrigan and Michael E. Porter, Harvard Business Review July 1983.
In a similar vein, a white paper from ABB, ‘The impact of climate change on downstream operations’ concludes inter alia that ‘Technology exists today that can make meaningful efficiency gains through standardization, modularization, energy efficiency and future proofing. The key is digitalization and automation and control’.
MIT engineers have found a novel way of removing carbon dioxide from the air. The system works at any concentration level, even down to the current ambient atmospheric 400 parts per million. The technique involves passing air through a stack of charged electrochemical plates coated with carbon nanotubes composited with polyanthraquinone. No, we have no idea what that is either. The technology is described in a new paper by MIT postdoc Sahag Voskian, ‘Faradaic electro-swing reactive adsorption for CO2 capture’, in the journal Energy and Environmental Science.
BP Ventures has invested $5 million into Finite Resources whose Finite Carbon unit runs voluntary carbon offset programs involving forest carbon management. The investment will enable Finite Carbon to grow a new line of business to incentivize sustainable forest management, financed by businesses seeking to voluntarily offset carbon emissions. Finite Carbon was founded in 2009 and is now the largest developer of forest carbon offsets in North America with more than 40 forest projects covering nearly three million acres.
Total has announced a $400 million global venture fund dedicated to carbon neutrality. Over a five-year period, the monies will be allocated to start-ups that develop innovative technologies and solutions which help companies to reduce their energy consumption or the carbon intensity of their activities. The fund will be known as Total Carbon Neutrality Ventures (TCNV). The fund builds on Total Ventures’ existing portfolio of 35 global start-ups that directly and indirectly contribute to carbon neutrality. That portfolio includes Solidia, Sunfire, Scoop, Shyft Power Solutions, Ionic Materials, MTPV, AutoGrid, Stem and OnTruck.
ISO, the International Organization for Standardization, has just published ISO 14007, Environmental management – guidelines for determining environmental costs and benefits. The standard ‘helps in creating transparent and accurate data’ and should help demonstrate the value of ‘sustainability’. The new standard complements ISO 14008, Monetary valuation of environmental impacts and related environmental aspects, published in March 2019.
Seven IOGP member companies (Chevron, Eni, Equinor, ExxonMobil, Hess, Shell, Total) have launched a joint industry project (JIP34) on environmental genomics. The program will coordinate research into the application of eDNA-based analyses in environmental assessments and monitoring of oil and gas operations. eDNA is used to detect organisms and estimate biodiversity and is said to be faster, cheaper and more comprehensive than conventional sampling.
A paper in Petroleum Geoscience, Geoscience and decarbonization: current status and future directions, by authors from Equinor, BGS and others reports from the 2019 Bryan Lovell meeting of the Geological Society of London. The geologists concluded that ‘geoscience is critical to decarbonization, but that the geoscience community must influence decision-makers so that the value of the subsurface to decarbonization is understood’.
Total has ditched the American Fuel & Petrochemical Manufacturers association. Total considers the organization’s climate stance as not aligned with its own. Total is, however, maintaining its affiliation with the American Chemistry Council, the American Petroleum Institute and the Canadian Association of Petroleum Producers, considering these bodies to be ‘partially aligned’. The French supermajor is to advocate internally for changes in their positions and will reconsider its memberships in the event of lasting divergences.
MIT has produced a 50-page study, ‘Climate-related financial disclosures: the use of scenarios’, a report from a 2018 workshop with participation from several supermajors and international bodies. Our cursory reading suggests that the report kicks the can down the road, concluding inter alia that ‘there is tension between providing specificity about a firm’s climate‑related risks and supporting comparison among firms. Standardization of a reference scenario … would be a contentious task’. ‘Thoughtfully designed transparency requirements of modeling methodologies, rather than full standardization’ are advocated. However, ‘private firms have commercial and legal concerns about the disclosure of financial information beyond what is required by law, and the call for climate‑related data only adds to what are familiar and long‑standing issues’. ‘Combining quantitative analyses of current asset exposures with qualitative expressions of future options could provide a useful picture of a firm’s strategy resilience’, but ‘providing only part of this information, or doing so with inconsistent components, is not helpful’.
GARP, the Global Association of Risk Professionals has launched a certificate in sustainability and climate risk (SCR) to help professionals understand and manage the potential economic and operational impacts of a changing climate on their organizations. The SCR certificate costs $650 and involves some 100 hours of study and a three-hour exam. Registration begins June 1, 2020. Interested parties can sign up for more information here.
Finally a pointer to an excellent BBC Radio 4 program where Oxford University professor Myles Allen explains climate change and what to do about it. Allen’s solution, which incidentally is the only way the oil and gas industry can survive in a ‘net zero’ world, is carbon capture and underground sequestration. ‘Fossil fuel industries must be forced to take back the carbon dioxide that they emit. If carbon capture and storage technologies make their products more expensive, so be it’. Is this possible? Absolutely yes, witness the ‘geological industry’ over the last 100 years. The fossil fuel industry needs to take care of its waste and needs a ‘clear steer from government’.
VP IT/CIO Rob Thomas reports that Kosmos Energy has recently completed a re-architecture of its geotechnical systems utilizing RiVA from Geocomputing LLC. The new platform provides superior system performance and gives Kosmos geoscientists remote access to subsurface projects.
Kosmos started very small and grew organically with the Jubilee discovery in Ghana in 2007. A decentralized geotechnical system environment emerged as the company grew. Kosmos is a frontier explorer with a multi-petabyte seismic portfolio. The legacy, decentralized architecture and growing user count had become unsustainable in terms of system management and performance.
Thomas was hired to address these issues and began looking for a high-performance solution that met the requirements of centralization, a large, vendor-agnostic toolset, remote access and outsourced geotechnical IT support. Geocomputing LLC was selected and the whole Kosmos portfolio, along with its GOM Deep Gulf Energy acquisition, was merged into a single system. From concept to completion took about 18 months.
The system provides a high-performance experience to users across North America, a key requirement as Kosmos likes to keep its dispersed ‘senior’ oil finders in play. When the company farmed out its Mauritania-Senegal discovery, a data room was established in London (due to visitor US visa delays) and was supported remotely from Dallas. Performance exceeded expectations with data latency at around 100ms. Geocomputing RiVA has also been tested in Kosmos’ offices in Accra, Ghana with good (203ms) latency. Thomas concluded that ‘the flexibility of this system is light years ahead of the legacy decentralized system before it’.
Speaking at the 2019 LBCG Results-driven analytics conference in Houston, Nathan Bookwalter (Anadarko/Oxy) traced the history of automation data from the early days of telephone wire to the wireless, OPC-based networks of the 1990s and the domination of embedded Windows in the early 2000s. By around 2015, consumer-oriented IoT devices brought a shift, with the adoption of open protocols and a large-scale move from Windows to Linux. Today, even industrial hardware is supplied by ‘non-typical’ vendors like Dell, Samsung and Amazon. At the same time, polling rates have increased such that real time data is everywhere and end devices can be addressed from remote locations like the control room, via the cloud.
Anadarko has some 18,000 remote devices at 14,000 wells. A single central scada host receives 15 minute data from around 8 million tags and 1 second data from a further 2 million. A standard automation and scada package makes for centralized optimization and system-wide surveillance. The new system represents a huge efficiency gain, with support staff headcount down from 45 to 20 and a reduction from 10 to 3 in the number of historians.
Bookwalter analyzed the pros and cons of the MQTT data transfer protocol. The store-and-forward mechanism, TLS security and the pub/sub model are positives. On the downside, the market is very immature. Higher-level protocols like Sparkplug B or OPC UA Part 14 come with ‘optional’ features that create vastly different results, meaning that legacy models may not work. The claimed reduction in network traffic is ‘over-stated and out of context’.
Having said that, Anadarko has performed extensive MQTT network tests and now, closed loop optimization and process control will interact 100% via MQTT. There will be no direct connections to the control systems. MQTT has already replaced 800 connections and in 2020 these will be deployed at 1,250 well sites with a goal of 100% real time big data, IoT, analytics and edge computing. Other systems (HSE, planning, marketing, historian and production accounting) are moving off the scada system and onto the broker network. Anadarko is calling for an MQTT coalition to steer the industry on future features and priority development.
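MQTT’s pub/sub model decouples well-site publishers from consumers such as scada hosts and historians via topic strings and wildcard subscriptions (‘+’ for one topic level, ‘#’ for the remainder). The minimal in-memory Python sketch below illustrates that topic-matching logic only. It is not Anadarko’s deployment (which would use a real broker and a client library such as Eclipse paho-mqtt), and the Sparkplug-B-flavored topic names are invented for the example.

```python
# Minimal in-memory sketch of MQTT-style pub/sub with topic wildcards.
# Illustrative only: a production system would use a broker (e.g. Mosquitto)
# and a client library such as Eclipse paho-mqtt. The Sparkplug-B-like
# topic layout (namespace/group/message_type/node/device) is assumed.

class MiniBroker:
    def __init__(self):
        self.subscriptions = []  # list of (topic_filter, callback)

    @staticmethod
    def matches(topic_filter, topic):
        """MQTT matching: '+' matches one level, '#' matches the remainder."""
        f_parts, t_parts = topic_filter.split('/'), topic.split('/')
        for i, f in enumerate(f_parts):
            if f == '#':               # multi-level wildcard: match the rest
                return True
            if i >= len(t_parts):
                return False
            if f != '+' and f != t_parts[i]:
                return False
        return len(f_parts) == len(t_parts)

    def subscribe(self, topic_filter, callback):
        self.subscriptions.append((topic_filter, callback))

    def publish(self, topic, payload):
        for topic_filter, callback in self.subscriptions:
            if self.matches(topic_filter, topic):
                callback(topic, payload)

broker = MiniBroker()
received = []
# A control-room consumer subscribes to all device data from one well pad.
broker.subscribe('spBv1.0/wellpad42/DDATA/+/+',
                 lambda topic, payload: received.append((topic, payload)))
broker.publish('spBv1.0/wellpad42/DDATA/rtu1/tubing_pressure', 512.3)
broker.publish('spBv1.0/wellpad07/DDATA/rtu9/casing_pressure', 88.1)  # filtered out
```

The broker, not the publisher, decides who receives a message, which is why MQTT removes the point-to-point connections (the ‘800 connections’ above) between devices and consuming systems.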
More from the LBCG Results-driven analytics home page.
Release 7.1 of Badley Trap Tester adds a capability to transfer fault attributes and displays into Petrel. The in-situ stress analysis tool can now handle multiple well sources to create 3D stress fields and horizon stability maps. Watch the video.
de Groot-Bril has released a Beta version of a machine learning plugin for its OpendTect seismic interpretation software. The plugin comes with pre-trained models that can be applied to unseen data sets or used for transfer training. An interface with TensorFlow/Keras and Scikit-learn is coming real soon now.
Geomodeling has released AttributeStudio 8.3, introducing algorithms and workflows that apply machine learning to reservoir property prediction. New workflows in 8.3 allow geoscientists to quantitatively integrate well data and seismic attributes. AttributeStudio embeds the Matlab Deep Learning Toolbox for supervised learning, unsupervised learning and non-linear regression.
geoLOGIC has released V8.13 of geoSCOUT with new features including date and text-based data classifications and enhanced visualization tools for completions planning. A LAS color gradient can now be added along the wellbore for ‘sweet spot’ identification.
The 9.7 release of Geosoft’s eponymous flagship optimizes and simplifies the UAV (drone) geophysics extension workflow and adds a new 2D/3D SEG-Y import wizard. The induced polarization and gravity and terrain correction extensions have been enhanced and the 9.7 release supports OMF, the open mining format for transfer of 3D scenes into Leapfrog.
Kappa-Workstation v5.30, the fourth major release, includes Saphir, Topaze, Rubis, Azurite, and Citrine and integrates technology developed under KURC, the KAPPA unconventional resources consortium.
A new major version (3.0) of Madagascar, the open source seismic imaging package, has been released with support for Python-3. The new edition also features 14 new reproducible papers, as well as other enhancements. Madagascar 2.0 was downloaded about 6,000 times. The top country, with 27% of all downloads, was China, followed by the USA, Brazil, Canada, and India.
Emerson has released Paradigm 19, now available as both a cloud-hosted and on-premise solution. The 19 release adds new interpretation workflows that bring Stratimagic and VoxelGeo into the interpretation platform. A new quantitative seismic interpretation offering, ‘modified stochastic inversion’, enables multiple, broadband realizations of the reservoir that match all available data, including the seismic. The release also adds automation and machine learning functionality for accelerated interpretation and reservoir characterization. More on the new features in Paradigm 19 from Emerson Software.
The latest 2019.4 release of Thermo Fisher Scientific’s PerGeos digital rock analysis package offers new colormaps, easier script sharing, and a simplified normalization interface. Also new is an enhanced Python experience with access to spreadsheet columns, new remote procedure call package, and fast image augmentation. Read the release notes here.
Sharp Reflections has released Pre-Stack Pro 5.6, the ‘fourth and final’ deliverable from its Foundation Project IV consortium. PSP 5.6 adds a volumetrics calculator, a new well-calibrated velocity model and a multi-well tie QC function.
Target has announced the ‘official’ release of its Meera simulator. Meera combines AI and numerical simulation models in one framework and is claimed to be ‘the first AI-Physics augmented reservoir simulator’. The 3D, 3-phase simulator guarantees mass conservation for all reservoir fluids and wells using a ‘flux-conserved’ form of finite volume discretization. The AI/ML engine is a multi-layer deep learning framework coupled with ‘enhanced LSTM-based’ recurrent neural networks. More from Target.
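The mass-conservation guarantee of a ‘flux-conserved’ finite volume scheme can be illustrated generically: each interface flux is subtracted from one cell and added to its neighbor, so interior fluxes cancel in pairs and the total can only change through the domain boundaries. The sketch below (plain Python, a 1D upwind advection step, in no way Meera’s code) shows the idea:

```python
# Generic 1D flux-form finite-volume step (upwind advection). Not Meera's
# code; it just shows why 'flux-conserved' schemes preserve mass: each
# interface flux leaves one cell and enters its neighbor, so interior
# fluxes telescope away and only boundary fluxes (zero here) change the total.

def step(m, velocity, dt, dx):
    """Advance per-cell masses m by one time step in a closed 1D domain."""
    n = len(m)
    flux = [0.0] * (n + 1)                 # flux[i] crosses interface i
    for i in range(1, n):                  # end interfaces stay at zero flux
        upwind = m[i - 1] if velocity > 0 else m[i]
        flux[i] = velocity * upwind / dx
    return [m[i] - dt * (flux[i + 1] - flux[i]) for i in range(n)]

cells = [0.0, 0.0, 1.0, 0.0, 0.0]          # a slug of mass in the middle cell
for _ in range(10):
    cells = step(cells, velocity=0.5, dt=0.1, dx=1.0)
print(sum(cells))                          # total mass is unchanged
```

Summing the update over all cells, the interior flux terms telescope to zero, which is the discrete statement of mass conservation that a simulator in this form inherits by construction.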
Interica has launched OneView, an ArcGIS map interface that provides windows into live and archived geoscience projects and base data. The four visualizations are high-level data clustering, full project extents, tight project extents and project content. OneView enables fast discovery, analysis and extraction of critical information from structured or unstructured data. Disparate file and application locations and large-scale datasets can be managed, curated and analyzed. OneView integrates with 3rd party applications leveraging Interica’s 30+ proprietary connectors built on graph database and HTML5 technology.
The 2020 release of INT’s GeoToolkit.JS offers new API support for ArcGIS servers, new syntaxes for ES6 harmony modules for faster loading, new interactive zoom and scroll tools and a new multilateral schematic widget. GeoToolkit.JS 2020 now also offers Kriging and ThinPlate algorithms.
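For readers unfamiliar with kriging, the gist is a weighted average of sample values where the weights come from solving a small linear system built from a variogram, with a Lagrange multiplier forcing the weights to sum to one. The pure-Python 1D sketch below is generic, not INT’s GeoToolkit.JS implementation, and the exponential variogram with its sill/range parameters is an assumption for illustration. A useful property to note: ordinary kriging honors the data exactly at sample locations.

```python
# Generic 1D ordinary kriging sketch, not INT's implementation. The
# exponential variogram and its sill/range parameters are assumed for
# illustration. Weights solve the usual kriging system, with a Lagrange
# multiplier enforcing that the weights sum to one (unbiasedness).
import math

def variogram(h, sill=1.0, rng=2.0):
    return sill * (1.0 - math.exp(-abs(h) / rng))

def solve(a, b):
    """Gaussian elimination with partial pivoting; fine for tiny systems."""
    n = len(b)
    m = [row[:] + [b[i]] for i, row in enumerate(a)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[pivot] = m[pivot], m[col]
        for r in range(col + 1, n):
            f = m[r][col] / m[col][col]
            for c in range(col, n + 1):
                m[r][c] -= f * m[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (m[r][n] - sum(m[r][c] * x[c] for c in range(r + 1, n))) / m[r][r]
    return x

def krige(xs, zs, x0):
    n = len(xs)
    a = [[variogram(xs[i] - xs[j]) for j in range(n)] + [1.0] for i in range(n)]
    a.append([1.0] * n + [0.0])              # unbiasedness constraint row
    b = [variogram(x0 - xi) for xi in xs] + [1.0]
    w = solve(a, b)[:n]                      # drop the Lagrange multiplier
    return sum(wi * zi for wi, zi in zip(w, zs))

xs, zs = [0.0, 1.0, 3.0, 4.0], [10.0, 12.0, 9.0, 11.0]
print(krige(xs, zs, 1.0))    # exact at a sample point: 12.0
print(krige(xs, zs, 2.0))    # interpolated between samples
```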
Schlumberger has announced Ocean for Petrel 2020 ‘field introduction 1’ as being in a ‘commercial-ready’ state for plug-ins. The Ocean library and third-party libraries will be binary compatible with the commercial release. Plug-in developers can now compile and submit their plug-ins for Ocean Store acceptance, to be released at the same time as Petrel 2020.1.
SitePro has announced SitePro Mobile, a ‘digital oilfield’ iPhone/iPad app that provides control and monitoring of real-time data and analytics from SitePro’s ‘Command Center’ fluid management software. The native iOS app can be downloaded from the App Store.
Twenty20 Solutions has developed a custom automation and monitoring solution that addresses the operating challenges of saltwater disposal in the oil and gas industry. The system provides SWD facilities with data capture, safety and security alert and alarm capabilities, and automated regulatory compliance reporting. An electronic ticketing system provides driver information and haul data, automating processing, invoicing and reporting.
Colorado-based geospatial data solution provider EcoPoint has announced a flowline mapping service to help operators comply with new regulations (Senate Bill 19-181) from the Colorado oil and gas conservation commission. Operators must provide GIS mapping data on operating and abandoned underground flowlines before year end 2020.
Gulf Energy Information has announced the Energy Web Atlas TechLink, a new spatial research tool that pairs comprehensive technical content with detailed project data. The EWA provides global maps of pipelines, refineries, petrochemical facilities, gas processing infrastructure and LNG facilities. The EWA TechLink provides spatial search across published research papers, technical articles, PDFs, images, and additional Gulf Energy Information content from World Oil, Petroleum Economist and other journals.
MJ Logs has rolled out WLS, its well library search enhancement, a map-based interface to MJ’s 2.1 million well library.
Rockware’s new PetroVisual well data analysis and visualization tool operates across public oil and gas data sets. Users can select, query and analyze data as maps, charts, grids, pivot tables and drilldown trees. PetroVisual cloud-based solutions currently expose data from fourteen US States. Pricing is $5,000.00 per year for a single user.
The Railroad Commission of Texas has launched two new interactive statewide data maps. One shows oil and gas drilling permit approvals, the other, the number of wells spudded. The initiative is said to enhance regulatory transparency and educate the public on ‘one of the state’s top economic drivers’.
Orbital Insight has added an analysis button to GO, its geospatial analytics engine. The new functionality lets users access structured data directly from GO and create additional visualization and analytical workflows, ‘unlocking a new depth of visualization and analysis’. GO adds Orbital Insight’s geodata APIs to the open-source Jupyter notebook ecosystem.
Huber+Suhner has announced the Radox OFL (oil and gas flexible lightweight) cable solution, the ‘lightest and most compact’ cable to date that is set to ‘revolutionize’ offshore connectivity. Radox OFL cables reduce weight and space by up to 60%. A small bend radius and thin wall is said to reduce stripping time and make installation easier and quicker. The Radox OFL cable is also oil, mud and hydraulic oil resistant according to the NEK606 Cat. a-d standard.
AspenTech has announced a ‘full-stack’ industrial IoT solution, the AspenTech Cloud by Mnubo, a cloud-based analytics and data science studio with tools and libraries for problem solving. AspenTech acquired Mnubo in 2019.
Detechtion Technologies has launched a new product, the Enbase Asset Monitor, a rugged, industrial IoT device designed for remotely monitoring the run status and location of oilfield assets.
Release 2.1 of Mette, Emerson’s wells, flow lines and gathering networks modeler, adds simulation of multiphase pumps and subsea pumps used in field development and production. Gathering networks with loops are now also supported.
Endress+Hauser’s Proline Promag W adds a ‘full bore’ option to the electromagnetic flowmeter that provides reliable measurements in ‘close-knit’ pipeline networks without tube restriction or pressure loss. Endress+Hauser has also announced Liquiphant FTL51B and FTL41 point level instruments to reliably detect the point level of liquids in storage tanks, containers, and pipes. The instruments feature Bluetooth connectivity, automated proof tests and verification, and configuration with a mobile device.
Opto 22’s groov EPIC Firmware 1.5.0 adds new shape gadgets to enable creation of custom SVG images and animations that dynamically react to process variables. The new firmware enables tag-driven conditional formatting that can be used to change text color to reflect line status or quality metrics stored locally. OEMs can now deploy external HDMI monitors and touchscreens from Dell, Hope Industrial, SuperLogics and AIS ‘making it easier to eliminate Windows PCs and OITs for local viewing and operator interfacing’.
Hexagon has introduced HxDR, a cloud-based, digital reality visualization platform that creates ‘Smart Digital Realities’, accurate digital representations of the real world from airborne imagery and laser scans, indoor and outdoor terrestrial scan data and mobile mapping data. Users can drag and drop reality capture files into HxDR for automated meshing, or license real-world replicas from Hexagon’s 3.6 petabyte collection of towns, cities and landscapes.
Mechanical Solutions, Inc. (MSI) has released VibVue, a vibration measurement and analysis system that helps identify and solve critical vibration issues. VibVue uses a high-speed camera and novel software for the diagnosis of vibration and system dynamics problems. VibVue’s ‘motion magnification’ technology is licensed from a major university.
Shell Marine has launched a new IT platform for its used oil analysis program, Shell LubeAnalyst. LubeAnalyst offers online sample registration and automated label printing that minimizes errors. A portal with personalized dashboards, interactive charts and easy-to-use oil analysis reporting lets vessel managers oversee lubricant performance across their fleets from the LubeAnalyst homepage. A Shell LubeAnalyst mobile app is available.
Release 3.2 of Yokogawa’s Exaquantum plant information management system, a component of its OpreX asset management suite, expands data acquisition and export with support for OPC UA. Exaquantum also integrates with Microsoft Office 365 and SQL Server. The client now runs on Windows 10 Pro/Enterprise SAC and is compatible with Yokogawa’s Centum distributed control system.
2019 saw the migration of the international National Data Repositories conference (NDR) from under the auspices of Energistics to its new home at the TNO-sponsored North Sea data management forum. Some 115 delegates from around the world attended the NDR 2019 event in Utrecht, NL.
A breakout session chaired by Shell’s Johan Krebbers discussed the ‘robust’ ICT infrastructures and technologies required for receiving, storing and disseminating data. Modern technological advances can be leveraged to obtain better performance and scalability, support for larger data volumes, a higher degree of security and data integrity and, eventually, a smoother transition when migrating between systems.
The discussion focused on cloud technologies and seismics to conclude that storing interpretations was necessary, along with field data. Operators tend to store the data in their own clouds, and NDRs should ‘keep up with industry’. Other learnings are that ‘the cloud is cheaper’, although it is not available everywhere. As much data is under-used, only metadata needs to be in the cloud.
In the EU, anti-competition laws force NDRs to change providers on a regular basis. This can cause problems with the IP of the data model used. Also moving large volumes of data takes time, as does populating a new ‘data model’. The breakout session concluded (rather enigmatically) that repositories should ‘avoid using a relational database’ and that it was a good idea to ‘bring people to the data instead of data to people’. No, we don’t understand either!
In the Energistics breakout, it was argued that standards should be used for data submittal to make it easier for operators to use commercial software. Alongside their data exchange role, standards are now usable as a ‘persistence’ (storage) format which allows analytics directly on the data in a context like OSDU. Energistics recognizes that there are gaps, especially in the energy transition realm and is planning to work with other standards bodies in geothermal, hydrology and gas storage.
At the North Sea Data Management Forum proper, as we reported back in 2017, a loose memorandum of understanding links oil and gas authorities in countries surrounding the North Sea. In previous years the NSDMF has discussed weighty matters such as data confidentiality periods across different legislations. The organization has also mooted workgroups for a common metadata map of North Sea wells, cross-border data sharing and the impact of open data and the EU Public Sector Information Directive.
Johan Krebbers also presented the Open Subsurface Data Universe (see Oil IT Journal Issue 250). Attendees questioned OSDU on data security, to hear that ‘all features of information security were ensured to be intact on the platform’. A recent collaboration between OSDU and PPDM was described as ‘significant’ and ‘an innovative and beneficial leap forward in terms of continuing to deliver the very mission of PPDM, which is to support interoperability of people, processes and data’. How this collaboration was squared with the above entreaty not to use a relational database was unclear.
The Norwegian Released Wells project is now being run by KonKraft, a collaboration between various Norwegian trade bodies. KonKraft provides national strategies to the petroleum sector to maintain the competitiveness of the Norwegian continental shelf. Released Wells involves an in-depth analysis of cores, cuttings and logs from over 1,500 exploration and appraisal wells on the NCS with data and results migrating to 21st Century big data technology. Suppliers are Rockwash Geodata and Stratum Reservoir Labs. The 30 month project kicked off in 2019.
A similar initiative is underway in the UK where the Oil and Gas Technology Centre has kicked off an ‘Overlooked Pay’ project. In 2017 OGTC issued a ‘call for ideas’ which returned a suggestion that machine learning could be used to identify ‘overlooked pay’ opportunities and ‘prove that ML can improve productivity and objectivity and be deployed at-scale’. The project came up with several recommendations for NDRs, viz: historical data ‘gaps’ without digital data need to be filled, NDRs should be home to ‘gold standard’ data, non-standard naming conventions, formatting and coordinates are a challenge, legacy non-machine readable formats are common and need to be ‘upgraded’. On the plus side, the NDR can form an ecosystem for wider data usage, the cloud can now handle and facilitate rapid access and interrogation of data and there is an opportunity for large, regional scale training models for ML. The project is operated by DataCo* using well data from some 7,000 wells from Norway and the UK.
* DataCo was acquired by Sword/Venture in 2019.
Read the NDR presentations here.
At last year’s EAGE we reported on the contrasting themes of ‘going green’ and ‘business as usual’. While there was plenty of ‘business as usual’ on the exhibition floor (and in the talks), the SPE ATCE picked-up the ‘greening’ of the industry theme big time, in the opening general session and in a panel session on ‘responsible’ energy development. Both sessions demonstrated the difficulty the industry is having in the face of the energy transition, a theme that is a major issue for Calgary and Alberta, as witnessed by daily coverage in the Globe and Mail and talk of ‘Wexit’, Western Canadian separatism.
The SPE, as we have previously reported, has collectively drunk the big-data-machine-learning-analytics Kool-Aid, along with its close cousin, the digital transformation. We report from a session on the ‘good, bad and ugly’ of data analytics. This was of course more about the good than the ugly, although Erdal Ozkan (Colorado School of Mines) did a good job of tempering some of the analytical excesses.
We also report from two sessions on digitalization. It is generally believed (at least by the supermajors) that ‘digitalization’ involves, at least to a degree, a move to the cloud. We report on what this means to the software development community that is now engaged on some transformations of its own with the advent of the cloud, Kubernetes, microservices and so on.
Mayor Naheed Nenshi welcomed the SPE to Calgary, ‘where Bow meets Elbow’ (rivers). Calgary has been home to Canada’s oil and gas industry since the Leduc oil boom in 1947. Today the industry is helping ‘fight poverty with access to clean energy’ (ripple of applause). Nonetheless there are ‘well-placed concerns’ about climate change and sustainability. ‘We can no longer ignore this and just be boosters for industry’. Nenshi was in New York the previous week for the climate summit events. ‘If we fail here, generations to come will never forgive us’ (no applause for that!). We need to tell stories about Alberta GHG reduction and ‘lead the fight against climate change thanks to smart PEs’. Nenshi mentioned Calgary-based hydrogen energy startup Proton as a promising way forward.
Sami Alnuaim (SPE president) agreed that a billion people ‘lack access to basic energy’. Oil and gas ‘will continue to be needed as part of the energy mix’. The value to society is ‘immeasurable’. Alnuaim was also just back from New York where he participated in the oil and gas showcase. ‘Net zero by 2050 will require cooperation with coal, steel and others’. An SPE video skillfully blended the sustainability field with the ‘digital transformation’ uber theme. Machine learning, digital transformation and efficiency will lead to a reduced CO2 footprint, lower emissions and water use, and improved safety and environmental performance. The video got a ripple of applause.
Moderator Eithne Treanor (ET Media) boldly pronounced that ‘oil and gas will not be part of a climate catastrophe’, the industry is determined to reduce emissions and leverage its expertise. Climate activists call for action right now. But we also need to assure supply and meet a growing energy demand. Governments must do more, green business is good business. We need to work and find solutions like low carbon technology and emissions reduction. ‘No one is ignoring the risk of global warming’.
Jackie Forrest (ARC Energy Research) agreed, adding some ‘context’. Oil and gas has gone from scarcity and high prices to new technology and lower breakeven prices (50% down!). Investors are changing their stance to ‘lower for ever’. Companies are finding it hard to carve-out money for dividends and buybacks and to pay for emission reductions. It will be hard to transition fast as fossil fuel use is up year on year. This is a generational challenge.
On the topic of ‘net zero’, Forrest observed that the ‘net’ part is key. Can we sequester enough CO2? This would enable us to continue to use fossil fuels. Canada is a leader here. Another idea is to ‘grow seaweed and bury it’. But net zero will be hard without CCS.
And there is the incremental technological approach. Andrey Bochkov reported that Gazprom’s flaring is ‘down by 95%’ and seismic surveying ‘cuts down less trees with wireless receivers’. Leigh-Ann Russel added that BP now uses drones for methane leak detection. Flaring has been reduced with IR cameras. BP is also creating new businesses, Solar Light Source, Fulcrum, and ChargeMaster. Jeanne-Mey Sun added that Baker Hughes is also working on venting/flaring reduction and fugitive emissions with its FlareIQ and Lumen methane monitor. Another innovation is Gazprom’s use of integrated compressors to collect separator gas for use or sale, ‘a non-trivial amount’ in Russia.
The Q&A raised the tricky issue of emissions measurement, ‘do these include end-product burn?’ Forrest answered that 80% of emissions come at combustion and these are not included. Indeed ‘there will be a lot less demand for oil and gas if the aggressive net zero goals are met’. Current emissions reduction efforts, adding a few cents at the pump and planting trees or extracting CO2 from the atmosphere, mean ‘a net zero that would work 100 years from now!’. Industry needs to think more and harder about this. Individual behaviors need to change. Greta Thunberg points her finger at consumption. Almost half of emissions are within our control: put the lights out, stop traveling for holidays, get a smaller car. Personal sacrifices are needed. Unfortunately, e-car sales are now slowing!
Finally, a suggestion that SPE could transmute into a ‘Society of energy professionals’ met with a mixed reception, with some OK for a name change.
This special session addressed ‘what SPE members need to know about sustainability’. Nigel Jenvey (Gaffney Cline) offered some fundamentals that are leading to ‘staggering changes’ in energy. Since the Kyoto agreement, some $3 trillion public money has been invested in renewables. Meanwhile, oil and gas has incurred a $1 trillion debt across its supply chain. Climate is now a key part of ESG efforts, but it needs more engagement from industry regarding carbon emissions. For a stable, low carbon transition, tax is critical. The industry is confronted with a ‘real dilemma’.
Shell’s Josh Etkind pointed out that the world population is increasing rapidly, and the world needs more energy with less emissions. Shell’s activity is now framed by the 17 UN sustainable development goals. The plan is to improve the bottom line and lower the carbon footprint. Finance ‘is really playing a role here’. On the plus side (for the industry), Etkind cited Boston Consulting Group’s Jamie Webster who blogged that better oil and gas economics have increased oil and gas reserves by a factor of 2.5.
The debate turned to the ‘innovative business models’ that might enable sustainability. Here Etkind cited the Environmental Partnership’s push to eliminate pneumatic valves and HARC’s work on shared water infrastructure in the Permian basin. This was in part driven by the fact that currently, 60% of reinjection wells are souring due to poor biocide use. Etkind cited Data Gumbo’s blockchain as having produced ‘a $3.7 billion saving’ in water trading. The OGCI oil and gas climate initiative, a $1bn fund targeting CCUS and emissions reduction, also got a mention.
Jenvey added that the SPE has created a CO2 Storage Resources Management System (SRMS) subcommittee of the SPE CCUS technical section. More from the ‘groundbreaking’ document. Etkind concluded that industry needs to reach out to explain its role in the energy transition, ‘We are seen as dinosaurs that are unable to change. But we do have scale and the ability to move quickly.’ The ensuing debate discussed the role of natural gas as a ‘bridge fuel’ and part of a long-term solution. But this is challenged by the anti-fracking movement. The debate is increasingly polarized. Kamel Bennaceur (Nomadia Energy Consulting) maintained that natural gas remains a ‘destination fuel’ with over 100 years of gas ahead of us. Etkind agreed that the issue is politicized and has reached fever pitch, as both sides think the others are stupid. What makes sense for you? The temperature record, reduced arctic ice, and poor air quality are real, whether you believe or not. We need to come together with respect. Data is less compelling in this emotional debate. In the summing up it was agreed that there needs to be more investment in low GHG technology and cleaner fuels. ‘Growing demand does not mean we can’t cut emissions, witness the US where natural gas has largely replaced coal in power generation.’
Ashwani Dev (Halliburton) observed that others, like Tesla, have created a ‘value shift’. So what is the value shift for oil and gas? Current processes are too complex: make them simpler with ‘optimization across the board’. Make machine learning systematic for engineers. Halliburton has been collaborating with MIT on analytics to turn petroleum engineers into data scientists. Halliburton’s ‘smart oilfield on an open platform’, Open Earth Community, has backing from Shell, RedHat, Equinor, Total and others. Halliburton has ‘completely adopted open source’. ML proofs of concept began four years ago and now represent a major practice. To date successes have come from automated fault extraction, NPT analysis on daily reports and flow prediction.
Pablo Tejera Cuesta (Shell) bemoaned the ‘fact’ that ‘only 5% of seismic data is actually used’ (‘this is shocking’). And only 1% of daily data is ‘used’. Shell is reducing the carbon intensity of its production using digital technology to monitor and improve. One flagship is the Quest carbon capture and sequestration (CCS) project. Digital, analytics and ML are bringing the fourth industrial revolution thanks to sensors, AI, compute power and wireless connectivity. Data in the cloud is key to faster analytics and decisions. Shell’s in-house ‘X-Digi’ subsurface data universe SDU got a mention, even though ‘it’s hard to share some stories because of confidentiality’*. Other projects include seismic image analysis, WLO (well location optimization) and predictive maintenance with a digital twin.
* Although chunks of SDU are now being ‘shared’ as per OSDU.
Robert Heller (BP) reported on the application of data analytics to sand production with an AI-based software tool called the ‘Sandman cognitive sand advisor’. 50% of BP’s production comes from sand-prone reservoirs that represent a safety, environmental and economic risk. Sandman applies expert knowledge and data science to set an optimal production rate that mitigates sanding. Heller contrasted data-driven, numerical AI with ‘cognitive’ knowledge-driven, symbolic AI. While results from the former are harder to explain, creating a knowledge base for the latter requires handcrafting in a ‘quite painful’ process. Once it’s done, knowledge-driven AI offers explainability and a low error count. In fact, Sandman uses both approaches. Regarding the ‘good, bad and ugly’ aspects of AI, BP sees tremendous potential in the approach. But cognitive computing is not right for every problem. It is best applied to a high value problem that can count on help from willing experts. Expert engagement needs to be well-planned, ‘use more than one expert, but not too many!’ Project management is challenging, as developing software is interdisciplinary. IT underpins BP’s digital initiatives. ‘Conversations’ with the IT department for connections and permissions are necessary. Sandman was a big software engineering effort. Finally, a good GUI is also important.
Philippe Herve (SparkCognition) compared descriptive, diagnostic, predictive and prescriptive approaches to maintenance as increasing in both difficulty and value. Rule/physics-based models perform poorly in edge cases and fail if one variable is changed. Data scientists are hugely in demand. Herve proposes automated model building (AMB) aka AutoML where ‘AI designs AI’ and replicates the mind of a data scientist. AMB uses genetic algorithms and deep learning to converge on a generalized solution. AkerBP is a user. One PoC (not AkerBP) used an unsupervised ML model to identify behaviors and correlate anomalies with specific downtime events. The model predicted 75% of production-affecting events several days in advance. The (unnamed) client is now moving into full deployment. Interestingly, 75% of the failures are ‘system failures’ (as opposed to equipment breakdown) when operating parameters have changed, and instability results. SparkCognition has also run another PoC at the Texmark refinery.
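The genetic-algorithm idea behind AMB can be illustrated with a toy sketch (purely illustrative, not SparkCognition’s implementation): a population of candidate hyperparameters evolves by selection and mutation toward a minimum of a fitness surface that stands in for a model’s validation error. The search space (learning rate, depth) and its optimum at (0.1, 6) are assumptions for the example.

```python
import random

random.seed(0)

# Toy fitness surface standing in for a model's validation error; the
# (assumed) optimum is at lr=0.1, depth=6. Lower is better.
def fitness(ind):
    lr, depth = ind
    return (lr - 0.1) ** 2 + 0.01 * (depth - 6) ** 2

# Mutation: jitter the learning rate, nudge the depth, clamp to bounds.
def mutate(ind):
    lr, depth = ind
    return (min(max(lr + random.gauss(0, 0.05), 0.001), 1.0),
            min(max(depth + random.choice([-1, 0, 1]), 1), 12))

# Generational GA: keep the fitter half, refill with mutated survivors.
def evolve(pop_size=20, generations=30):
    pop = [(random.uniform(0.001, 1.0), random.randint(1, 12))
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        survivors = pop[:pop_size // 2]
        pop = survivors + [mutate(random.choice(survivors))
                           for _ in range(pop_size - len(survivors))]
    return min(pop, key=fitness)

best = evolve()  # converges close to (0.1, 6)
```

In a real AMB system the fitness evaluation would be a full model train-and-validate cycle, which is what makes the approach computationally expensive.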
Erdal Ozkan (Colorado School of Mines) portrayed himself as the ‘average Joe’, a ‘skeptical believer’. Why skeptical? Because there are few clear examples of ML success beyond pattern recognition. So, what is the promise of analytics, AI and ML for reservoir science and engineering? Ozkan referred to work done at the Center for Petroleum Geosystems and Engineering. Current trials of AI/ML include optimizing economic performance, estimating PVT properties without samples and ‘forecasting without physics’. These proxy models are presented ‘with a large pinch of salt’. They ‘leave the PEs doing the knitting’. There are lots of gotchas in AI, from model bias to the fact that data-driven models are backward-looking and may miss more current stuff. True physics-constrained/data-driven models have yet to be delivered. So, it’s ‘engineer vs. fortune teller!’ Prediction is particularly hard in shale with changing delivery regimes. If you do get a pattern, should you build a physical model to incorporate the result? Machine learnability has yet to achieve the status and early promises of AI, i.e. beyond pattern recognition. Ozkan concluded with some philosophical reflections on Cantor’s uncountable infinities, on Hilbert’s ‘10 open questions’, on Gödel’s proof of the incompleteness theorem, and Ben-David’s 2019 demonstration that some problems cannot be solved with AI.
A paper by Peidong Zhao (U Texas at Austin) applied machine learning to a big data set (4,000 wells) across the Eagle Ford formation. The project set out to compute EUR (estimated ultimate recovery) across a 50x400 mile area. EUR is considered a proxy for economic success. A rather intricate workflow applied multivariate linear regression across 20 or so parameters that showed ‘multicollinearity’. Data was massaged with ‘backward elimination’ and manual removal of redundant information. Moran’s I test showed that EUR is spatially autocorrelated, which meant more data ‘cleanup’. A Random Forest model proved robust in creating a prediction model with spatial data. The models ‘explain’ around 50-70% of observed variation. The ‘most significant’ variables that predict EUR are TOC, vitrinite reflectance, Poisson, upper Eagle Ford thickness, well depth and lateral length. That’s just about everything, no?
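The backward-elimination step of such a workflow can be sketched as follows (an illustrative stand-alone Python sketch, not the paper’s code): predictors are dropped one at a time so long as removing them barely changes the residual sum of squares. The synthetic data here has a response depending on two of three predictors; the third is pure noise and gets eliminated.

```python
import random

random.seed(1)

# Synthetic data: y depends on x0 and x1; x2 is pure noise (assumption
# for the example). 200 observations, 3 candidate predictors.
n = 200
X = [[random.gauss(0, 1) for _ in range(3)] for _ in range(n)]
y = [2.0 * r[0] - 1.5 * r[1] + random.gauss(0, 0.1) for r in X]

def ols_rss(cols):
    """Fit y ~ X[:, cols] by normal equations; return residual SS."""
    k = len(cols)
    xtx = [[sum(X[i][a] * X[i][b] for i in range(n)) for b in cols]
           for a in cols]
    xty = [sum(X[i][a] * y[i] for i in range(n)) for a in cols]
    # Gaussian elimination (no pivoting; X'X is well-conditioned here)
    for p in range(k):
        for j in range(p + 1, k):
            f = xtx[j][p] / xtx[p][p]
            for c in range(k):
                xtx[j][c] -= f * xtx[p][c]
            xty[j] -= f * xty[p]
    beta = [0.0] * k
    for p in range(k - 1, -1, -1):
        beta[p] = (xty[p] - sum(xtx[p][c] * beta[c]
                                for c in range(p + 1, k))) / xtx[p][p]
    resid = [y[i] - sum(beta[j] * X[i][cols[j]] for j in range(k))
             for i in range(n)]
    return sum(r * r for r in resid)

# Backward elimination: drop the predictor whose removal least degrades
# the fit, stopping when any removal would hurt too much.
cols = [0, 1, 2]
while len(cols) > 1:
    base = ols_rss(cols)
    delta, drop = min((ols_rss([c for c in cols if c != d]) - base, d)
                      for d in cols)
    if delta / base > 0.05:   # removal hurts the fit: stop
        break
    cols.remove(drop)         # noise predictor x2 is eliminated
```

A published workflow would typically use p-values or AIC rather than a raw RSS ratio as the stopping rule; the mechanism is the same.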
Chevron’s Andrei Popa showed how AI has been used to optimize horizontal well and perforation placement in the venerable Kern River field. Kern River (California) has some 21,000 wells with 10,000 producers and 52 active rigs. The program started in 2006 and now around 1,100 h-wells have been drilled to reach the thin oil leg, all that is left after decades of production. H-well candidate selection was a ‘laborious manual process’ and eventually Chevron ran out of candidates. The AI program began in 2012 using fuzzy logic on resistivity, oil, gas saturation, reservoir thickness and temperature. ML was used to understand what is driving performance and the impact of stratigraphic connectivity and heterogeneity. A novel approach used a ‘dynamic reservoir quality index’ (dRQI), computed for all 155 million grid cells. A workflow involved ‘fuzzification’, ‘if-then’ rule evaluation and ‘defuzzification’. A plot of the Lorenz coefficient showed baffles and barriers to flow. The unsurprising conclusion is that h-well performance is best with a good reservoir and no barriers to flow.
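The fuzzify, evaluate, defuzzify loop can be sketched in a few lines of Python. This is a hypothetical illustration of the generic technique; the input variables, membership functions and rules are assumptions, not Chevron’s dRQI.

```python
def ramp(x, lo, hi):
    """Linear membership function: 0 below lo, 1 above hi."""
    if x <= lo:
        return 0.0
    if x >= hi:
        return 1.0
    return (x - lo) / (hi - lo)

def reservoir_quality_index(oil_saturation, thickness_ft):
    # 1. Fuzzification: crisp inputs -> degrees of membership
    so_high = ramp(oil_saturation, 0.2, 0.6)
    thick = ramp(thickness_ft, 5.0, 30.0)
    # 2. If-then rule evaluation (min acts as fuzzy AND); each rule
    #    fires with a strength and has a crisp consequent value
    rules = [
        (min(so_high, thick), 1.0),           # good reservoir -> RQI 1
        (min(1 - so_high, 1 - thick), 0.0),   # poor reservoir -> RQI 0
        (abs(so_high - thick), 0.5),          # mixed signals -> RQI 0.5
    ]
    # 3. Defuzzification: firing-strength weighted average (Sugeno-style)
    total = sum(w for w, _ in rules)
    return sum(w * v for w, v in rules) / total if total else 0.5

good = reservoir_quality_index(0.55, 28)   # thick, oil-rich cell
poor = reservoir_quality_index(0.15, 4)    # thin, swept cell
```

Applied per grid cell, as in the talk, such an index lets candidate selection be ranked automatically rather than hand-picked.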
This special session on well construction automation was jointly organized by the SPE DSATS (drilling automation), DUPTS (drilling risk), OGDQ (data quality) and WBPTS (positioning) committees.
For David Reid (NOV), drilling automation is hardly new, ‘the technology is here, what’s stopping us? Culture!’ When DSATS started there was no business case for drilling automation, ‘just a belief that there ought to be something in it!’ Today there is a business case and ‘we are going to go fast’. Jeff Moss (ExxonMobil, retired) added that DSATS started with some irrational exuberance followed by a long honeymoon period. With today’s unconventionals the business case for automation is good, but it remains difficult to share the spoils of automation between operators and contractors.
In his keynote, Precision Drilling CEO Kevin Neveu described the drilling business in Calgary as ‘on its heels’. Which might mean a ‘tipping point’ for drilling automation. AutoDriller software has been around for decades. What is new is full process multi-machine control. The next step is to leverage the high volumes of data with AI and complete the transformation. But this will take much longer, and each line of code is a potential failure point. Such development is costly and unfortunately ‘finance has abandoned the industry’. Drillers are working against procurement groups here and service companies ‘need to define the value for themselves’. Capital constraints are making the industry very risk averse. But risk will be commonplace when field testing AI and data science. Regarding the human impact, some will benefit, others not. The driller’s lot will be greatly enhanced, but the company man and directional driller will be made redundant! Exceptional leadership will be required to push the changes through.
Matthew Isbell observed that Hess has been in the Bakken since 1951. Unconventional wells are ‘moderate to low’ in complexity but need fast-paced supply chain logistics. The key ‘days to TD’ KPI has consistently improved over the years. Where will the value in automation be? A Hess study broke well delivery into 17 phases to find that two, vertical and lateral drilling, dominate. Drilling automation can improve here. Hess began tests in 2015 with wired pipe and identified the problem as ‘people taking bad decisions!’ New workflows have reduced variation in decision making. Hess also uses a central real time operations center that has enabled fewer drillers to see and learn from more wells. Learning rate improvement is the key. In the future, the plan is for standard operating procedures, set points, and drilling in a process control loop, optimizing one section at a time. Today Hess has one well doing this in the Bakken. Drilling automation is not about the technology, it is about people, minimizing variation and leadership by the business using ‘lean’ methods à la SQDCP.
Lars Olesen (Pason) described himself as a ‘humble peddler of electronic drilling recorders’. Pason’s automatic driller software controls the rig’s draw-works and top drive giving faster ROP and decreased NPT. Olesen noted that, in product development, the algorithm is a small part of the whole. Add in control system integration, workflows/UX, support and ‘you have 10x the effort’. ‘Achieving the overall vision for us and for this panel will take some time’. ‘You can’t tell the company man what to do’! And it can be hard to prove what is contributing to a successful KPI. We need help from the operators here. A meaningful conversation with the contractor is required to deploy an automation product. And ‘most operators are a lot more chaotic than Hess’.
Duane Cuku (Precision Drilling Rig) observed that we are ‘already down from 40 to 23 days per well’ and will soon be ‘down to 13 days thanks to the digital transformation’. But the downturn meant that companies ‘had to offer the moon to survive’, which is unsustainable. A reluctance to invest in technology is understandable, operators are skeptical in the face of ‘extreme cost control’. A fragmented supply chain with multiple providers of the same service doesn’t help. Operators’ personnel are resistant to change. In all events, the rig is the central platform for well construction and drillers are at the heart of the transformation. PDR has developed an ROP optimizer that has now moved from an advisory mode to controlling the auto driller. Key to this development was a ‘satisfactory sharing of the spoils’ (IP and cash). The tool is charged per-day as a separate line item in the drilling contract.
Morten Norderud-Poulsen (Maersk Drilling) referred to automation learnings from offshore drilling on Norway’s Martin Linge field*. Wired drill pipe has proved successful in torsional vibration control, where the auto driller compares modeled and measured weight-on-bit (WOB). But the value creation for the contractor is limited and must be set against the cost of wired pipe. What is now needed are discussions on cost sharing, compensation and risk/liability. While the technology is fine, these considerations make it hard to deploy. The current setup does not drive value creation. But there is the potential for 20-30% savings (on a 400k$/day rig) with a relatively small investment. A new value creation concept is needed, an ‘alliance model’ that focuses on outcomes between operator, drilling contractor and service company.
* As presented in 178863-MS IADC/SPE, Wired drill pipe in a complex environment.
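The modeled-vs-measured WOB comparison used in torsional vibration control can be given a minimal sketch (an illustrative assumption, not Maersk’s or the auto driller vendor’s algorithm): downhole WOB from wired pipe is compared against the surface model, and excessive oscillation of the residuals flags stick-slip.

```python
from statistics import pstdev

def stick_slip_detected(modeled_wob, measured_wob, band=2.0):
    """Flag torsional (stick-slip) vibration when the residuals between
    downhole-measured and surface-modeled WOB oscillate beyond a band
    (klbf). The 2.0 klbf threshold is a made-up example value."""
    residuals = [m - mod for m, mod in zip(measured_wob, modeled_wob)]
    return pstdev(residuals) > band

model = [25.0] * 8                 # steady WOB setpoint, klbf
smooth = [25.3, 24.8, 25.1, 24.9, 25.2, 25.0, 24.7, 25.1]   # healthy
slip = [31.0, 18.0, 32.0, 17.0, 30.0, 19.0, 33.0, 16.0]     # oscillating
```

In a real control loop the detector output would feed back into top drive speed or WOB setpoint adjustments rather than a boolean flag.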
Jim Claunch (Bain & Co.) offered a maturity model for digital technology in oil and gas. The industry is ‘really good at creating silos’, so level 1) involves scaling digital inside the silo, say with pump jacks or gas lift automation. Level 2) extends optimization along the value chain, e.g. using midstream data to optimize the field, which is ‘tough to do’. Level 3) sees a ‘workforce of the future’ and new, agile ways of working. This ‘makes total sense and gets great results’ but involves hiring different people, ‘digital natives’ and data scientists. Finally, level 4) sees digital in the DNA of the organization, the real ‘digital transformation’. Claunch touched on diversity, gender and inclusion and bemoaned the passing of the ‘great leaders of the past’ who knew their stuff. Today we live in a digital society. Too many people are ‘lost in middle management’.
Michelle Pflueger is yet again involved in a digital transformation initiative at Chevron. Previous efforts (digital oilfield) have not changed people’s work experience significantly. So what is new today? Pflueger believes that there is now the potential for major change, as the cellphone has changed our lives. A top-down initiative from Chevron’s management has seen Pflueger create a small team of ‘accelerators’. The business units map out their own transformations and the team helps them go faster. The Permian unit is ‘all-in’ with a world-class data set and analytics. This could not have been done with a company-wide ‘standard’ methodology. Gorgon is also different, with around a million sensors deployed, growing at 100k per year. Advanced process control at Gorgon has brought a $240 million ‘value’. Chevron is ‘moving everything to Azure’. Chevron is working to evolve its digital culture with Shark Tank-style initiatives to encourage entrepreneurship and a Digital DoJo worldwide championship. The shift is also seen in recruitment, ‘five years from now every engineer will have to know digital like I used to know Excel’. While there are ‘a lot of questions facing oil and gas’, students should realize ‘it is an important time to be in the industry’, ‘we enable human progress’. There is a great opportunity for folks with deep digital expertise ... we are just starting out!
Hani Elshahawi (Shell) sees digital as disruptive and ‘a potential threat to many incumbents’. Oil and gas is facing a perfect storm as investors leave in a push for decarbonization. The vanishing workforce (crew change/brain drain) means that in five years the workforce will be 75% millennials. How does digital help? By uncovering invisible insights in an ‘avalanche of data’.
In the session on ‘developing and implementing digitalization business values’, Lucas Gonzalez presented YPF’s ‘Dagma’ (data governance management analytics) program, a push to create a data-driven culture in Vaca Muerta shale operations. Dagma involves data preparation for data science initiatives to optimize frac hit and screen-out detection. Dagma supports business intelligence in the cloud with easy access and visualization of data. To date, 200 people have been trained on self-service BI. YPF is now planning a data lake and data warehouse for upstream data with a ‘balance scorecard connector’ to these databases. Tools of the trade include Azure and PowerBI. Y-Tech, YPF’s technology arm, helped build the data pipeline.
David Joy (HP Enterprise) presented on the use of ‘edge computing’ and the IIoT to modernize a Texmark petrochemical plant. The comprehensive system provides video surveillance, ‘man down’ alerting and a connected worker (with an electronic hardhat). Connectivity is provided via a central Wifi canopy/umbrella. The system provides condition monitoring (e.g. of clogged filters), augmented reality and more, all served from an ‘Edge Center’ which for all the world looks like a medium-sized server! Here data is kept ‘at the edge’, i.e. on-site, an interesting redefinition of what ‘edge computing’ is about. The communications infrastructure used an elevated dual antenna. ATEX issues with the kit were solved. Your mileage, especially in the upstream, may vary, as the HPE/Texmark solution relies on solid, non-intermittent comms. The ‘core industry architecture’ is now being extended to other petrochemical plants and refineries.
First, a confession. We attended the Nanotechnology for the Oilfield session in the vague expectation that we would hear of wonderful nano-robotic devices doing smart, perhaps ‘digital’ things in the reservoir. No such luck. Despite earlier hype, notably Saudi Aramco’s promise, over a decade ago, of ‘autonomous micro machines’, today’s nanotech is a bit more prosaic. As Steve Bryant (U Calgary) put it, nanotech is the synergy of nanoparticles and chemistry. Today, ‘anything you can draw, a smart chemist can make’. Nanoparticles plus surfactants can be powerful. The synergy makes for a larger, more flexible design space, allowing for the tuning of the electrostatic affinity of foam to enhance propagation.
Hugh Daigle (U TX at Austin) reported on the use of nanotech in ‘sustainable’ resource development, where nanoparticles can replace hazardous chemicals, reduce water use and treat produced water. As an example, particles with a magnetite core and SiO2 shell can be manipulated in a magnetic field. Dirt is attracted to the particles and can be pulled out with a magnet. Another application is the use of superparamagnetic nano paint to coat pipelines. These can then be heated up with an external magnetic field to melt paraffin accumulations. The technique was presented in an SPE paper that included a ComSol model of the magnetic field inside a pipeline pig.
In a booth presentation, Jack Nassab demoed Schlumberger’s Stewardship Tool, first released in 2018. The proprietary wellsite modeling software helps communicate the pros and cons of unconventional wells to the public. Public perception is very powerful, ‘we are judged on our few mistakes rather than the many successes’. The tool computes model KPIs for fracking including emissions on location, during operations and flaring, how consumables get there and waste disposal. The web-based tool covers SOx, NOx, CO2, engines, noise, VOCs, worker safety, traffic etc. An augmented reality function encourages stakeholder engagement. At the time of the ATCE, the stimulation module was operational. Early 2020, V3.0 will roll out with a production module. Schlumberger plans to release a single well version as an ‘open source tool for all’.
INT presented IVAAP as a subsurface workspace customizable with GeoToolkit. Widgets fit into the IVAAP canvas. Data selected in a map search flows into a Jupyter notebook for AI. INT is working with CGG on a geophysical Jupyter notebook. Weatherford Central is now on IVAAP along with a Slack-style messaging for well ops.
Touring the exhibition floor, we chatted with some knowledgeable members of the upstream IT vendor community. We learned that some majors are requiring a microservices architecture running on Pivotal’s RabbitMQ bus for integration with company development environments or embedded in control systems for machine to machine interaction.
We also learned that OSDU V1 (as it was handed over from Shell) was AWS based and used technology from 47 Lining, now a Hitachi Vantara unit. See the 47L oil and gas case history (which is probably Shell’s SDU). This code base has now been replaced with OpenDES, the low-level layer of Schlumberger’s Delfi. Some expressed an opinion that this represented a de-facto take-over of OSDU by Schlumberger whose ‘professional code’ will include Petrel log formats. While OSDU was originally on Amazon, objections from Microsoft saw the code re-jigged to be multi-cloud. Although ‘multi-cloud’ is considered by some to be more or less impossible!
Silverwell Energy is working on a low-end edition of its DIAL digital intelligent artificial lift device for unconventionals.
CMG is using AWS S3 storage, Dynamo DB, Lambda and SNS to bundle data plus application (IMEX, GEM, STARS) into a single Docker container. This is said to be more secure and faster. MPI is used to distribute big jobs across multiple machines.
ColdBore’s SmartPad IoT ‘digital wellhead’ data combines with other data feeds from multiple service providers, adding authoritative time stamps for CT/wireline operations and completions. Events are recorded chronologically and are ‘impossible to deny’. Previously each contractor had its own time stamp. An operations database is delivered to the operator and can also populate WellView. NPT alerts allow for on-the-spot reconciliation.
StoneRidge Technology’s Echelon runs on GPUs and has displaced Schlumberger’s Intersect at ENI. Marathon provided the original Echelon code and funded early development. The tool has been tested on a billion cell model.
Peloton’s mobile app monitors emissions and pushes notifications to personnel and to a C-suite corporate reporting dashboard with company-specific KPIs.
Oslo, Norway-headquartered geochemical and biostratigraphy specialist Applied Petroleum Technology has appointed Helge Nyrønning as its CEO. Nyrønning was previously MD of First Geo (now AGR).
AqualisBraemar has ramped-up its international adjusting business with the appointment of Allan Kelly (head of Europe and West Africa), Charles Honner (Americas), Dirk Jol (Middle East and India).
Global commodity price reporting and news service Argus has appointed Neil Fleming as Senior VP, Editorial.
Tom Bradicich has been appointed to the Aspen Technology board of directors. Bradicich is currently VP, Fellow and global head of edge and IoT labs at Hewlett Packard Enterprise.
Basic Energy Services has appointed Keith Schilling as president, CEO and director. Schilling hails from Baker Hughes.
BCCK Holding has appointed Greg Herman as director of business development. He was previously with Waukesha Gas Engines.
Neil Glass is now a director of the board of Borr Drilling.
BP CFO Brian Gilvary is to retire. He is succeeded by Murray Auchincloss, currently CFO of BP’s Upstream segment.
CAM Integrated Solutions has promoted Jason Newton to VP engineering operations and Israel Martinez to SVP business operations.
Polly Courtice is stepping down as director of the University of Cambridge Institute for Sustainability Leadership. The Institute is looking for a replacement.
Kirk Hanes has joined Detechtion Technologies as senior VP global sales. He was previously with Palantir Solutions.
Jack Leahy has joined the Enterprise Ethereum Alliance as membership director.
Jean Cahuzac has been elected president of French industry association Evolen, succeeding Dominique Bouvier. Cahuzac is a retiree from Subsea 7.
Christian Segersven is now head of financial services with TietoEvry. He replaces Wiljar Nesse who is to pursue new opportunities outside the company. Segersven is also head of industry software.
Amy Holmes has joined Express Energy Services as VP and general counsel. Holmes joins Express from Scientific Drilling International.
John Gibson has joined Flotek as chairman and CEO. Current chair David Nierenberg remains as independent director and CEO John Chisholm is to step down from the board. Gibson was most recently with Tudor, Pickering, Holt & Co. Earlier in his 35-year career he was President and CEO of Paradigm Geophysical and Landmark Graphics, as well as president of Halliburton’s Energy Services Group.
Flowserve has appointed Amy Schwetz as SVP and CFO. The company also appointed Sujeet Chand as independent director. Chand is also SVP and CTO at Rockwell Automation.
Foster Marketing has named Bradley Smith as business development coordinator. Smith is a 2019 graduate of the University of North Texas.
Following a decision to reduce Fugro’s board of management, Brice Bouffard, whose term finishes on 30 April 2020, will not be nominated for reelection.
Hexagon has made the following organizational changes: Juergen Dold, manager of Geosystems, geospatial, and safety & infrastructure; Thomas Harring, president of Geosystems; Norbert Hanke, COO; Paolo Guglielmini, president of manufacturing intelligence. Burkhard Boeckem succeeds retiree Claudio Simao as CTO of the Hexagon unit.
IFPen, the French Petroleum Institute has named Olga Vizika-Kavvadias scientific director, Benjamin Herzhaft geoscience director and Sylviane Buttafoco director of digital.
Steve Bate, EVP and CFO at ION Geophysical is to retire and step down as CFO. Bate is succeeded by Mike Morrison as EVP and interim CFO.
Noble Corporation has named Stephen Butz as EVP and CFO.
Oceaneering International has named Earl Childress to succeed retiring SVP business development Steve Barrett.
Opportune LLP, a global provider of business advisory services to the energy industry, has promoted Kent Landrum to Managing Director.
OspreyData has opened a new office in Vancouver, to serve the oil and gas community in Calgary. The company has also hired Jon Snyder as customer engagement manager for North America.
Andrew Gould, former chairman and CEO of Schlumberger, has been elected to the Occidental board of directors.
Recon Technology has elected Ralph Hornblower to its board of directors.
Koby Kendrick is assistant director at the Texas Railroad Commission’s new oil and gas field office in Lubbock, Texas.
Pamela Sabo is the new business development and sales manager with Ryder Scott. Sandeep Khurana has joined as head advisor, upstream and midstream integrated services. Khurana was previously with KBR’s Granherne unit.
Bill McDermott has decided not to renew his contract with SAP and is stepping down from his CEO position. Jennifer Morgan and Christian Klein have been appointed co-CEOs.
Horst Kayser is now chairman of the Siemens Portfolio Companies. He succeeds Jochen Eickholt, who will be a member of the future executive board of Siemens Energy, with responsibility for the power generation and oil & gas units. Michael Sen is designated CEO of Siemens Energy.
Stephane Biguet is to succeed Simon Ayat as EVP and CFO of Schlumberger who has stepped down. Ayat remains as senior strategic advisor for a two-year period.
Stratagraph has appointed William Hagan as president and CEO.
T.D. Williamson has named Jeff Wilson as VP Eastern Hemisphere and Rich Kehl, VP global engineering solutions.
ThoughtTrace has appointed energy executive Arthur Medina as VP of digital transformation.
Daniel Vertachnik is now CEO of Twenty20 Solutions. He succeeds interim CEO Peter Shaper. Vertachnik was previously COO at IQMS before its acquisition by Dassault Systèmes.
David Shaw is now chair of the XBRL Standards Board.
Martin te Lintelo succeeds Paul van Exel as chair of USPI-NL. Van Exel continues with his responsibilities for ISO matters and administration. Accenture has joined USPI and the CFIHOS project as a review member. The Accenture rep is Andrea Dentone.
Weatherford International announced that William Macaulay, chairman of the company’s board of directors has died.
The PPDM Association has announced the death of Mark Priest, a partner with Wipro in Houston. Data management specialist Priest held various roles with Triton Energy, Burlington Resources, Schlumberger and RasGas.
Jeff Pferd, formerly with Petris and Halliburton has died. His family invites in-remembrance donations to the ALS/Lou Gehrig association.
Apergy is to combine with Ecolab’s upstream energy business (aka ChampionX) to create a ‘global leader’ in production optimization solutions, including artificial lift equipment, chemical solutions and digital technologies. The combined companies are said to have an ‘enterprise value’ of approximately $7.4 billion. The ‘reverse Morris trust’ transaction sees ChampionX spun off as a wholly-owned Apergy unit.
BP has invested $3.6 million in R&B’s latest funding round. R&B is a Chinese artificial intelligence building energy management specialist. R&B’s flagship AI platform, BeOP ‘navigates principles and relationships between data and data, things and things and data and things’.
CGG has exited the seabed data acquisition business and terminated the Seabed Geosolutions’ joint venture, transferring its 40% share to Fugro. CGG is also concluding ‘remaining matters’ related to Seabed Geosolutions’ financing with a $35 million payment to Fugro. Fugro is now the sole shareholder of Seabed Geosolutions and is looking to dispose of the ‘non-core’ asset.
Meanwhile, Fugro reports the sale by HC2 Holdings of Global Marine Group, in which Fugro has a 23.6% share. The sale will bring Fugro around $40 million.
FLIR Systems has made a strategic investment in software developer Providence Photonics. Providence’s computer vision technology and infrared imagers are used to monitor emissions and flare combustion efficiency. The deal sees FLIR gain exclusive access to Providence’s IP and expand the FLIR footprint in oil and gas.
Forbes Energy Services is to combine with the US service rig, coiled tubing, wireline and other business lines of Superior Energy Services.
Hexagon is to acquire Cowi’s aerial mapping business, strengthening the HxGN content program with Cowi’s mapping ‘content-as-a-service’. The deal adds to the earlier Hexagon acquisitions of North West Geomatics, SigmaSpace, Melown Technologie, and Thermopylae Sciences and Technologies.
Honeywell has acquired Rebellion Photonics, a Houston-based provider of visual gas monitoring solutions for safety, operational performance, emissions mitigation and compliance in the oil and gas, petrochemical and power industries.
Mistras Group has acquired pipeline data management boutique New Century Software. New Century’s pipeline integrity management software suite complements Mistras’ ‘PCMS’ plant condition management solution. The acquisition was funded with cash on hand. Mistras also recently acquired Onstream Pipeline Inspection, a provider of inline inspection analytics and pipeline integrity solutions.
Tank level monitoring specialist Otodata Wireless Network has received some CAD$7.5 million growth capital financing from CIBC Innovation Banking. The monies will be used to support the company’s product diversification and growth across North America.
Petrofac reports the small ‘bolt-on’ acquisition of W&W Energy Services, bringing an entry-level position in the US onshore operations and maintenance market and an additional platform for growth in the Permian basin.
VMware has acquired Pivotal Software and will combine Pivotal’s offerings with VMware’s cloud-native applications offerings in a new modern applications platform unit. Pivotal’s software will be integrated into VMware Tanzu, a products and services portfolio for Kubernetes users.
Following a one-to-five reverse stock split of its ordinary shares, Recon Technology has regained compliance with the $1.00 per share minimum bid price requirement of the Nasdaq. Recon received the Nasdaq non-compliance warning in January 2019.
Industrial internet of things and analytics software house Seeq Corporation has raised $24 million in a series B funding round led by Saudi Aramco Energy Ventures. The round saw renewed participation from Altira Group, Chevron Technology Ventures, Second Avenue Partners and other existing investors.
Stress Engineering Services has acquired Laserstream, a technical service company specializing in laser mapping of tubular assets. The acquisition will provide high-resolution pipe ID scanning hardware and imaging software, enhancing SES’s competencies across upstream, midstream, downstream and other verticals.
Weatherford International has emerged from chapter 11 protection, having reduced approximately $6.2 billion of outstanding funded debt and secured $2.6 billion in ‘exit financing facilities’.
Rick Gearheart presented Unico’s autonomous expert solution for oilfield production. One trial, dubbed the ‘data of Damocles’, saw a 3 terabyte dataset handed over to a confident AI team. An additional offer to provide domain knowledge assistance was refused, ‘just watch the magic!’ Two months later, the AI experts concluded that the ‘fault reset event’ was the best ‘predictor’ of failure. What went wrong? The AI team was given too much scope. AI/IoT is a toolset, not a project, and requires expert input. Unico’s first foray into the IoT came in 2011 with the GMC System, pulling data from Unico VFDs and displaying it on a mobile/web endpoint. The first GUI was too complicated and the system was redesigned as PumpExpert. Now real-time data is processed to provide simple messages like ‘normal operation’, ‘pumped off’, ‘gas locked’ and ‘sticking’, highlighting abnormal situations so they cannot be overlooked. PumpExpert also constantly adjusts pump parameters autonomously, responding to changing well conditions on a stroke-by-stroke basis.
Blake Burnette described IOT-eq’s use of the IoT to drive machine learning at the well site. Frack truck IoT systems have evolved from local, on-site sensors and processing in the 1980s, through satellite connectivity (2000). Around 2010, big data Map/Reduce algorithms were producing up to 2,000 alarms per day! Burnette observed that some findings were more useful than others. An instruction to ‘wash radiators’ could generate million-dollar savings. The IoT needs attention to detail, ‘talk to people’, you need ‘boots on the ground’ to find and fix sensor and comms issues. IOT-eq has a technology partnership with Weir on a pump health monitor. The Weir/IQ Data Intelligence dashboard runs either in the cloud or at the edge, depending on communications costs. Burnette referenced another IoT gateway, IOT-Glue, as offering a quick path from data to app. IOT-Glue offers drag-and-drop configuration and has been used to auto-fill RRC reporting e-forms and for tank monitoring and alerting for theft. The IoT can be used to monitor and understand machine learning results and avoid situations such as ML-based control tuning ‘learning bad habits’. Finally, a warning to the DIY brigade, ‘avoid the pitfall of using consumer products in harsh environments’. More from IOT-eq.
Jim McKenney (NCC Group) stated that ‘despite the promise of IoT and predictive maintenance, these systems have failed to meet their objectives so far because of system complexity and cascading failures’. For predictive maintenance efforts to succeed, field devices, sensors, data models and the supply chain need to work at 99%+. This may not be realizable at an acceptable cost. IT risk is also an issue. McKenney recommends using tools like Microsoft Threat Modelling, the STRIDE/DREAD approach and ‘many more’ standards and approaches to address IT risk and engineering reliability. IoT tech can reduce wellsite opcosts through predictive maintenance, but a software lifespan of decades exposes systems to future bugs that are beyond the awareness of many IoT providers. Tracking and tracing faults across a complex landscape of software and engineering hardware is difficult. Reliability, availability and maintainability engineering principles need to be applied to IoT devices and pushed upstream. Software should be thought of as a ‘material’ that can be fixed with upgrades, and devices ‘front loaded’ with unnecessary features may expose future risk.
Paul Brager (Baker Hughes) also spoke on the growing threat landscape of IT/OT convergence. Future analytics requirements will grow the attack surface even more. Legacy infrastructure is especially at risk. How much convergence is appropriate is a hotly debated topic, as is the question of which, IT or OT, is more trusted! The answer is ‘smart’ convergence that assesses supply chain attacks and risks. Operators need a cyber assessment of sourcing to avoid cascading disruption. A holistic view is required, as is an intelligent separation of IT/OT, along with a response plan for the eventuality of an attack.
Bob Skiebe presented a range of automation options from Siemens Oil & Gas, spanning field-level, IoT-based next-gen well automation with artificial lift optimization, well AI at the edge to detect anomalies in ESP operations, and enterprise IT/OT integration. At the high end, Siemens can supply an ‘integrated digital twin from subsurface to facilities’. For brownfield applications, simple video surveillance of existing gauges from a stand-alone device at the well site, with a once-a-year battery change, offers a ‘low cost digital transformation’.
Murat Ocalan (Rheidiant) spoke in praise of LPWAN, LoRa and LoRaWAN, with an AES-secured payload, along with the deployment of deep learning-based algorithms (PyTorch, Keras, TensorFlow) on edge devices (ARM Mbed, RTOS, C++). Security is provided by ‘crypto-signed’ firmware.
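Rheidiant’s actual signing scheme is not described. As a stdlib-only illustration of the idea behind crypto-signed firmware, here is an HMAC-based integrity check; real deployments typically use asymmetric signatures so the device holds only a public key, and the key and image bytes below are placeholders:

```python
import hashlib
import hmac

def sign_firmware(image: bytes, key: bytes) -> bytes:
    """Produce an HMAC-SHA256 tag over a firmware image."""
    return hmac.new(key, image, hashlib.sha256).digest()

def verify_firmware(image: bytes, tag: bytes, key: bytes) -> bool:
    """Constant-time check that an image matches its tag, so a
    device can refuse to boot tampered firmware."""
    return hmac.compare_digest(sign_firmware(image, key), tag)

key = b"device-provisioning-key"          # hypothetical per-device key
image = b"\x7fELF...firmware bytes..."    # placeholder image
tag = sign_firmware(image, key)

assert verify_firmware(image, tag, key)                 # intact image passes
assert not verify_firmware(image + b"\x00", tag, key)   # any change fails
```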
Paul Solano presented MAC Engineering’s state-of-the-art well site automation, based on IoT and edge computing. MQTT is the base protocol, along with emerging integration standards like Sparkplug. Data from existing scada systems is translated from Modbus, OPC and other ‘legacy’ protocols. See for instance the Modbus/MQTT comparison from Novus Automation. This enables a ‘transitional’ architecture in the shift to Scada 2.0, with the incremental addition of a big data lake, enterprise asset management and more. The Scada 2.0 reference architecture includes microservices, containerization, ‘open’ APIs and native big data integration.
* Note: ‘SCADA 2.0’ is a rather loose term used by many (Siemens, InFusion, ...). See for instance the 2010 EON/OSIsoft presentation.
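The Modbus-to-MQTT translation described above typically means decoding raw 16-bit register pairs into engineering values before publishing. A minimal stdlib sketch; tag names, topic and register values are illustrative, and a real gateway would use a Modbus client library plus an MQTT client such as paho-mqtt:

```python
import json
import struct

def registers_to_float(hi: int, lo: int) -> float:
    """Decode two 16-bit Modbus holding registers as a big-endian
    IEEE-754 32-bit float, a common 'legacy' device encoding."""
    return struct.unpack(">f", struct.pack(">HH", hi, lo))[0]

def to_mqtt_payload(tag: str, hi: int, lo: int, ts: float) -> str:
    """Wrap the decoded value in a JSON payload ready to publish on
    an MQTT topic such as 'site/well01/tubing_pressure'."""
    return json.dumps({"tag": tag,
                       "value": registers_to_float(hi, lo),
                       "timestamp": ts})

# Registers 0x42F6, 0xE979 encode ~123.456 as a big-endian float32.
payload = to_mqtt_payload("tubing_pressure", 0x42F6, 0xE979, 1580000000.0)
```

Sparkplug goes further, specifying the topic namespace and a binary (protobuf) payload, but the decode-then-publish shape is the same.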
Louis Lambert (Redline Communications) recommends using your installed scada that is already ‘built for safety and compliance’. ‘Do not disconnect it!’ Meanwhile, operators need to become familiar with private LTE cellular comms. When ready, this can provide data access from cell phones and tablets and enable video and analytics. Likewise, old two-way radios can be modernized with private push-to-talk. LTE provides a ‘path to 5G’, with a self-organizing network and dynamic, autonomous configuration of new kit via the 3GPP mobile broadband standard.
Claude Baudoin from the Industrial Internet Consortium stated that, ‘as everyone knows’, one should ‘just follow the standards’. But there are too many in the digital oilfield, around ten at the data protocol level alone. The Industrial Internet Consortium’s high-level IIoT roadmap (possibly the Industrial IoT Analytics Framework) lets users develop business cases and architectures. But the IIoT needs more IT/OT convergence and needs to ‘get out of the proof of concept jail’. The IIC was founded in 2014 by AT&T, IBM, Intel, GE and Cisco. In 2019 the IIC absorbed the OpenFog Consortium. The IIC advocates standards based on the OMG’s DDS and OPC-UA gateway. Some eight vendors implement the protocol, which has been used in projects by Ensign Energy, NOV, Canrig Robotics and TechnipFMC. Check out the IIC’s Smart Factory/ML presentation.
At a more practical level, Noel McKim presented Meyer’s Gen3 ‘Spyder’ multi-station greasing manifold that connects multiple wells to a central grease controller and lubrication flow meter. The system keeps personnel out of the ‘red zone’ with secure access lockout and is said to reduce costs and improve safety.
The IQ-Hub Canadian Well Site Automation 2020 conference will be held on May 11-12, 2020.
Josep Daví and Vicente Rios (both with Emerson) presented a ‘digital twin*’ developed for Total’s Culzean** ultra-high-pressure, high-temperature gas condensate development in the UK sector of the Central North Sea. But first, a definition of the digital twin. Previously, computer simulators were used to train operators prior to commissioning and startup of new plants, hence the operator training simulator. As the simulators improved, their range of application has extended and today, they can support activities across the project lifecycle.
Total’s Culzean project illustrates the use of a high-fidelity process model of the plant, based on equipment data sheets, connected to an integrated control and safety system (ICSS) and adding graphics with the same look and feel as the real thing. To our ears, this sounds pretty much like the OTS, but the difference is that the twin ‘connects with relevant time series data to ensure the model mirrors reality’. More prosaically, Total’s Culzean twin embeds Emerson’s DeltaV Simulate and the ICSS a/a, with some AspenTech stuff alongside. Emerson refers to the ensemble as the ‘multi-purpose dynamic simulator’ (MPDS). In general, the MPDS comprises an ICSS database communicating through OPC with a simulator. The simulator consists of a set of integrated high-fidelity models of different parts of the process, developed using Aspen HYSYS, along with emulations of complex control packages (anti-surge compressor control, combustion heater control) and of external safety system packages.
The first twin was delivered a year before first gas for operator training, including advanced training in abnormal operations and start-up/shutdown procedures. The twin was also used to identify and fix multiple minor and not-so-minor control system issues before first gas. Some 130 PID controllers were tuned virtually on the twin and the parameters exported and passed on to the project. The twin served as the basis of discussion with the design authorities for clarification and, in some cases, modification. Issues solved included engineering unit inconsistencies, missing references and safety system logic errors that could not have been found without a dynamic simulator. Other errors found included wrong tuning parameters and, in some circumstances, divide-by-zero calculations.
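The virtual loop tuning above can be illustrated with a toy example: a discrete PID loop run against a simple first-order process model, the kind of closed-loop experiment a dynamic simulator makes safe to perform before gains are exported to the real plant. The gains and process constants below are arbitrary, not Culzean values:

```python
def simulate_pid(kp, ki, kd, setpoint=1.0, dt=0.1, steps=300):
    """Run a discrete PID loop against a first-order plant
    tau * y' = u - y and return the final process value."""
    tau = 2.0                          # plant time constant, seconds
    y, integral, prev_err = 0.0, 0.0, 0.0
    for _ in range(steps):
        err = setpoint - y
        integral += err * dt           # integral term removes offset
        deriv = (err - prev_err) / dt  # derivative term damps response
        u = kp * err + ki * integral + kd * deriv
        prev_err = err
        y += dt * (u - y) / tau        # Euler step of the plant
    return y

# Reasonable gains settle close to the setpoint within 30 s.
final = simulate_pid(2.0, 0.5, 0.1)
```

Tuning 130 loops on the twin amounts to running experiments like this against high-fidelity models instead of the toy plant above.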
* Lest there be any doubt: the ‘digital twin’ is not exactly new. Both authors claim ‘over ten years of experience’ in digital twin technology, suggesting that the twin is rooted in the ‘prior art’ of the simulator, as we intimated several years ago following our visit to BP’s Saltend UK plant.
** Culzean was designed and built by Maersk which was acquired by Total in 2018.
Petronas’ Sharizan Ramli presented on the use of Emerson’s Plantweb Optics (PWO) asset performance monitor. Early detection of excess vibration in rotating machinery has led to a ‘90% reduction in mean time to repair’. AMS 9420 wireless vibration transmitters have been installed on two production platforms to improve pump uptime and availability. The solution is now being extended to other rotating equipment such as cooling tower fans. PWO will also integrate Petronas’ Protean, an OSIsoft PI System-based rotating equipment analytics platform.
Rahul Raveendran (University of Alberta oil sands engineering research) and Warren Mitchell (Spartan Controls) presented an advanced analytic solution for ESP monitoring in a steam-assisted gravity drainage process. Alberta has a 1.3 million bbl/day SAGD production capacity and sees some 3.6 million bbl/day of steam injected. Electric submersible pumps (ESP) used in production have a run life from ‘2 months to 3 years’. Workover and replacement in the event of failure costs ‘half a million to a million dollars’. The authors have developed performance monitoring algorithms leveraging characteristic curves derived from data-driven models. Real-time operating data is compared with the characteristic curves and pattern recognition is used to spot anomalous conditions. Another approach is to reduce high-dimensional data sets to one or two process health indicators, which are compared with normal operating conditions.
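The authors’ exact algorithms are not public. As an illustration of collapsing high-dimensional operating data into a single health indicator, here is a minimal z-score-based sketch; the tags and readings are invented, not SAGD field data:

```python
from statistics import mean, stdev

def health_indicator(baseline: dict, snapshot: dict) -> float:
    """Collapse a multi-channel ESP snapshot into one health
    indicator: the root-mean-square z-score of each channel
    against its normal-operating baseline history."""
    zsq = []
    for tag, history in baseline.items():
        mu, sigma = mean(history), stdev(history)
        zsq.append(((snapshot[tag] - mu) / sigma) ** 2)
    return (sum(zsq) / len(zsq)) ** 0.5

baseline = {   # historical readings under normal operating conditions
    "intake_pressure": [210, 212, 208, 211, 209, 210],
    "motor_temp":      [95, 96, 94, 95, 97, 95],
    "vibration":       [0.10, 0.12, 0.11, 0.10, 0.12, 0.11],
}
normal  = {"intake_pressure": 211, "motor_temp": 95, "vibration": 0.11}
anomaly = {"intake_pressure": 190, "motor_temp": 112, "vibration": 0.35}
```

A low indicator means the pump is inside its normal band; a large one flags the snapshot for review, well ahead of a half-million-dollar workover.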
The AI/ML approach uses Knowledge-Net technology developed by Tunisia-based Integration Objects (acquired by Emerson in April 2019). K-Net includes tools for data connectivity, preprocessing and data analytics, automated root cause analysis and alarm analytics. Custom-built algorithms can be added for monitoring and prediction applications. K-Net is to be rebranded as the Emerson Analytics Platform and will integrate with the Plantweb digital ecosystem.
A joint industry project to study data from multiple ESP installs at multiple producer sites has been initiated in collaboration with COSIA, the Canadian oil sands innovation alliance.
Jamal al Balushi presented a major computer virtualization initiative at Petroleum Development Oman. PDO operates remote facilities that previously required on-site visits to fix and/or upgrade hardware and software. Today, the IT has been upgraded with Dell VRTX virtualized servers in the control center and Dell Wyse 7010 thin clients at remote locations. The project included upgrades from Windows XP/Server 2003 to Windows 10/Server 2016. Virtualization has eliminated operating system dependency on hardware. Emerson DeltaV Virtual Studio Environment was used in the migration, providing template/GUI-based creation and management of virtual machines and software.
Mohammad Taufiq and Abdulmohsen Al Ahmed (both with Saudi Aramco) also presented a major virtualization project which has seen Aramco’s DeltaV system upgraded to V13.3.1 and conventional workstations/servers replaced with virtual machines on Dell VRTX hardware. The change also saw the installation of McAfee Antivirus and ePolicy Orchestrator, augmented with Aramco’s own cybersecurity hardening. The authors concluded that ‘virtualization of industrial control system applications is highly beneficial in terms of cost saving, scalability and ease of system administration’. The reported technology challenges to virtualization can be avoided with good planning. The technology is ‘highly recommended’ for large scale DCS systems.
For more on DeltaV virtualization read the Emerson DeltaV Virtualization white paper.
The 2020 Americas Emerson User Exchange will be held October 5 - 9, in Phoenix, AZ.
BGS, the British Geological Survey is to convert its open data to the GeoPackage format and release it ‘with styling included’. GeoPackage allows geospatial data to be delivered in an open, non-proprietary and platform-independent format.
Energistics has opened its Energistics Transfer Protocol 1.2 for public review and comment. ETP is a two-way, web socket-based connection solution tailored to oil and gas data challenges. ETP 1.2 reduces data latency from the 10-15 second baseline typical of SOAP protocols to about 1 second, while its use of binary encoding requires 10x less bandwidth than XML. The comment period ends March 31, 2020.
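The bandwidth claim is easy to motivate: the same channel data costs far more bytes as XML than as packed binary. A stdlib illustration of the effect (ETP actually uses Avro encoding, and the exact ratio depends on the payload; the sample values below are invented):

```python
import struct

# Ten (time, value) channel-data samples.
samples = [(1_500_000_000.0 + i, 20.0 + 0.1 * i) for i in range(10)]

# XML-ish encoding of the kind SOAP-era protocols shipped.
xml = "<channelData>" + "".join(
    f"<point><t>{t}</t><v>{v}</v></point>" for t, v in samples
) + "</channelData>"

# Packed binary: two little-endian 8-byte doubles per sample,
# roughly what a binary protocol such as ETP puts on the wire.
binary = b"".join(struct.pack("<dd", t, v) for t, v in samples)

ratio = len(xml.encode()) / len(binary)   # XML is several times larger
```

The gap widens further once XML namespaces, envelopes and repeated element names are added, which is where the order-of-magnitude figure comes from.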
The Industrial Internet Consortium (IIC) has opened a community forum for industry experts to exchange ideas and discuss IIoT problems. The forum was established by the IIC Digital transformation and Ecosystem work groups which ‘worked tirelessly’ to evaluate options and launch the forum. Check out the IIC Community Forum.
IOGP Report 592, ‘Subsea capping response time model toolkit user guide’, is a how-to for users of the IOGP’s response time model toolkit for subsea capping, described in IOGP Report 594, the source control (aka ‘blowout’) emergency response planning guide for subsea wells.
IOGP has also upgraded the EPSG Dataset data model, following the publication of the revised ISO 19111 data model. Changes include identification of dynamic (plate motion) datums and CRSs and a new ‘datum ensemble’ construct, grouping together successive realizations of a datum. The latest model V10.0 is available from the EPSG website. More on the upgrade from IOGP.
The Nov/Dec 2019 issue of ISO Focus described work in progress on the ISO/IEC JTC 1/SC 42 artificial intelligence standard. SC 42 was created under the auspices of ISO/IEC JTC 1, the joint information technology committee of ISO and the International Electrotechnical Commission (IEC), and claims to be ‘the only standards body looking at AI holistically’.
The Open Geospatial Consortium (OGC) has announced the OGC API Records standards working group which is to develop a new version of the OGC catalogue service for the world wide web. The new version will remedy shortcomings in the current catalogue and align better with the W3C’s spatial data on the web best practices. The shift is also to leverage modern tools such as OpenAPI. More from OGC. The OGC has also approved the OSGEO Foundation’s PyGeoAPI with a compliance certification and reference implementation status. PyGeoAPI is a Python server implementation of the OGC API above, allowing deployment of a RESTful OGC API endpoint using OpenAPI, GeoJSON, and HTML. PyGeoAPI is open source and released under an MIT license. More from OSGEO.
PPDM, the Professional petroleum data managers association is sounding out the industry on a new venture into training in oil and gas facilities. The facilities training needs survey (which has just closed) is designed to help PPDM understand how it can best serve the industry and to evaluate a possible expansion of its professional development portfolio to facilities data. We sense a future PPDM ‘What is a facility’ publication.
PPDM also reports that the Indonesian Ministry of Energy and Mineral Resources has endorsed its eponymous data model as an industry standard and is to ‘formally reference’ the PPDM Data Model 3.9 as an ‘industry open standard’. PPDM 3.9 will underpin a metadata catalog of Indonesian data and will also become the foundation of a national data repository, built and administered using PPDM standards as the foundation. The NDR is to be built by Pusdatin, the Pusat Data and Informasi/Center for Data and Information Technology.
BGS has published an explainer on ongoing changes to the earth’s magnetic field that have created a need for a new global geomagnetic model. According to satellite data, the magnetic north pole is moving across the Arctic region at its fastest rate in 400 years. BGS and the US NOAA rejig the world magnetic model every five years. The WMM is used in many global navigation systems. The latest model shows magnetic north racing across the Northern Hemisphere at around 50 km per year, as it moves from the Canadian Arctic towards Siberia, the fastest shift since the mid 16th Century.
Arvizio has partnered with Magic Leap to shrink large CAD/BIM models and LiDAR scans and make them usable on Magic Leap’s spatial VR/AR headset, Magic Leap 1.
Basic Energy Services and Wellbore Integrity Solutions have both joined the OFS Portal supplier organization.
BP is closing its two EU ‘mega’ data centers and migrating all data and 900 hosted applications to Amazon Web Services. Hosted apps include, on the downstream side, AVEVA unified supply chain (previously Spiral Suite), along with ‘approximately half’ of BP’s 65 SAP production environments. BP is also creating a data lake on Amazon simple storage service (Amazon S3), for use across its businesses and plans to leverage Amazon’s Kinesis streaming data service and SageMaker machine learning.
ChaiOne, a provider of ‘behavioral science-led digital solutions’ has joined PIDX, the petroleum industry data exchange body. Another new PIDX member is Payload, a mobile/web-based supply chain software boutique. And yet another is Conexiom, a provider of automated order and invoice processing software. In its annual activity review, PIDX reports that Microsoft ‘agreed to join’ the oil and gas e-commerce standards body in Q4 2019.
CMG is to allocate more resources to CoFlow, its ‘next generation’ reservoir flow modeler, developed under an exclusive agreement with Shell. Shell’s financial contribution is to rise, and CMG has agreed to ‘specific development targets and deployments across a broader range of Shell’s assets’. CMG also reports a revenue increase from clients moving to its cloud-based offerings, introduced in fiscal 2019.
Dell has designed and built a 52 petaflop supercomputer for ENI, claimed to be the ‘world’s most powerful industrial system’. The system, named HPC5, is located in Milan, Italy and comprises 1,820 Dell EMC servers, each with four NVIDIA V100 Tensor Core GPUs, interconnected via Mellanox’s InfiniBand data fabric. The system will be used to accelerate seismic workloads, to run Stone Ridge’s Echelon GPU-based reservoir simulator and to develop new AI-based approaches to energy discovery and processing. HPC5 ‘would’ come in at number 5 in the TOP500, had it been entered. But then again, other oil and gas HPC installations might be competing for the top slots had they entered the race.
DNO has awarded DNV GL a framework agreement for the provision of engineering and management support and verification services in its Norwegian operations. One project already underway is a study of hydrogen-induced stress cracking.
Brunei Shell Petroleum is to deploy GEP’s ‘Smart’ procurement software to manage source-to-pay across its subsidiary operations. Smart supports spend analysis, savings tracking, sourcing, contract and supplier management, purchasing and invoice handling.
Sinopec’s Sripe drilling R&D unit is to deploy eDrilling’s WellAhead drilling decision support system which will integrate Sinopec’s drilling automation solutions.
Emerson E&P Software (Paradigm) is to bring to market Repsol’s ‘cutting-edge’ Kaleidoscope seismic processing, imaging and interpretation technologies.
EnergyIQ has teamed with WhiteStar Corp., making WhiteStar’s grid and wells data sets available from EnergyIQ’s competitive intelligence solution.
ExxonMobil and digital finance specialist Fiserv have teamed to let customers pay for gasoline using Amazon Alexa. Alexa-enabled vehicles, Echo Auto and other Alexa-enabled mobility devices will allow a user to say, ‘Alexa, pay for gas’. Availability is announced at some 11,500 Exxon and Mobil stations in the US later this year. Watch the video.
ExxonMobil has signed with Chinese internet company Tencent and spares platform Tuhu to establish an integrated ‘supply to business to consumers’ car care network. The joint venture will grow the Mobil-branded car care network in China and leverage the strength of all partners to provide a ‘digitally-enabled’ car maintenance experience.
Schlachter Oil has signed with Field Squared to ‘streamline and digitally transform’ its well site operations. Schlachter will use Field Squared’s field service automation platform for production and injection scheduling and reporting, for work order management and to manage its asset portfolio.
Exploration data and software provider Getech/Exprodat has joined OSDU, the Open Group’s Open subsurface data universe forum. More from Getech.
Halliburton has announced a multi-year agreement with Repsol to provide a cloud-based master data management solution for E&P. The software as a service enables users to load, ingest, manage and access log, well and other data across different locations.
Midland, Texas-based Endeavor Energy Resources is to use IFS Applications, running in the Microsoft Azure cloud, as its new ERP system of record at its oilfield services division. IFS Applications spans field ticketing and work order generation from a mobile device and on into the general ledger.
Nigerian E&P Lekoil has selected Infor SunSystems and CloudSuite EAM to underpin its future growth plans. The solutions will be delivered and supported by business solutions consultant Progressive TSL. More from Infor and TSL.
Katalyst Data Management has added over four million kilometers of IHS Markit’s AccuMap seismic data library to its SeismicZone portal.
Schneider Electric reports continued digitalization of its supply chain with ongoing deployment of Kinaxis RapidResponse. Kinaxis provides ‘concurrent planning’ and ‘end-to-end supply chain visibility and execution’ in the face of ‘ever-changing market volatility and constraints’.
Precision-tool and component manufacturer Knust-Godwin is to target the oil and gas vertical with the acquisition of a Velo3D Sapphire 3D printer and flow and quality management solution. The kit joins KG’s fleet of seven metal additive manufacturing machines in Katy, Texas. Sapphire’s low-print-angle capability makes it possible to recreate a spare part ‘as-is’, offering cost-savings and improved turnaround time.
In a new ‘digital subsurface program’, Neptune Energy is working with several suppliers to speed exploration, reduce costs and bringing discoveries into production faster. The effort includes the development of new tools to scan and interpret seismic data. Neptune is also testing workstations provided by Cegal with data hosted in the Azure cloud, leveraging artificial intelligence technology from Bluware. More from Neptune.
Nuverra Environmental Solutions has completed a pilot of Ondiflo, a ‘blockchain platform’ for the oil and gas industry. Ondiflo was used to automate field ticketing of Nuverra’s produced water and frac water hauling and other service jobs. More from Ondiflo.
Terminal automation system provider Blendtech has ‘cut the cost of compliance’ with the API 2350 overfill protection standard using an automation solution from Opto22. More from the case study.
Petrofac has selected the Microsoft IoT toolkit for its digital ‘Connected Construction’ platform. The Petrofac platform leverages Accenture’s Industry X.0 methodology and is undergoing trials at a Petrofac EPC project in the Middle East.
PGS has chosen the Google cloud platform to host cloud-based multi-client data. PGS is also working on seismic imaging in the cloud and on the use of machine learning and artificial intelligence for subsurface data analysis. In another digitalization project, PGS is working with Cognite to improve fleet performance by optimizing fuel consumption and reducing maintenance costs through equipment monitoring.
Recon Technology’s 43%-owned Future Gas Station has provided Xinglin Gas Station (a gas station in Jiangsu Province) with an internet marketing service using its DT Refuel mobile app. The service includes a customer relationship management system and ‘data-driven’ decision support. FGS’ marketing service analyzes consumer behavior to reduce operating costs and facilitate marketing cooperation with third parties to increase gas station revenues. FGS charges 0.5% of the transaction amount as operation and technical service fees. In a separate announcement, Recon’s Nanjing Recon Technology Co. has won a bid to build the automation system for PetroChina’s Jidong Oilfield. The winning bid was RMB 9.5 million per year for a three-year period.
Eurotech is collaborating with Italian Retelit on a new ‘multi cloud’ ecosystem of IoT platforms and infrastructures for industry and government. The solution combines Retelit Multicloud with Eurotech’s Everyware ‘edge to cloud’ IoT solution. More from Eurotech.
Nanalysis Scientific and Sartec are to jointly develop process analytical solutions for the oil and gas industry. The solutions leverage Nanalysis’ compact NMR products and Sartec’s patented machine learning and artificial intelligence software. More from Nanalysis.
Australian Searcher Seismic has built a cloud-based seismic data service using technology from Cloudera. Searcher’s 20 petabyte data library is now available through ‘Saismic’, a data on-demand service with support for deep learning and advanced analytics. Saismic supports object detection and image segmentation from a ‘distributed, scalable, big data store’. Along with Cloudera, Searcher uses Apache Spark and HBase in hybrid cloud environments. More from Cloudera.
Schlumberger and ExxonMobil are jointly working on the deployment of digital drilling solutions around planning, execution, and continuous improvement through learning. The deal sees ExxonMobil implement Schlumberger’s DrillPlan well construction package across its unconventional operations.
Storage Heaven has partnered with infrastructure and data management services provider ThinkOn to provide magnetic tape migration and duplication services. Storage Heaven manufactures ‘TapeMaster’, a stand-alone tape migrator and duplicator. ThinkOn adds a Clone2Cloud service that streamlines the process, ‘moving hundreds of thousands of older tapes directly to the cloud’. Storage Heaven claims extensive experience in oil and gas tape migration, emulation and cloning and has provided OEM solutions to several global oil and gas solution providers. More from Storage Heaven.
Semantic web and graph technology specialist Cambridge Semantics is making a push into the oil and gas vertical, as reported in a recent podcast, ‘Accelerating data integration and insights in oil and gas’*. Dave Lafferty (president, Scientific Technical Services) presented Cambridge Semantics’ ‘Anzo’ graph database as able to ‘take complex data relationships and express them simply’. Heterogeneous data, as encountered in oil and gas, can be ingested and re-assembled in different contexts, re-organizing or pivoting data according to user requirements. Anzo uses the SPARQL query language and the W3C-standard RDF/OWL semantic technology. Support for Neo4j’s Cypher query language is under development.
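Anzo itself speaks SPARQL over RDF, but the underlying idea, facts stored as subject-predicate-object triples and queried by pattern, can be sketched in a few lines of plain Python. All identifiers below are invented, not from any real ontology:

```python
# A tiny in-memory triple store of oil and gas 'facts'.
triples = {
    ("well:A-12",   "hasOperator", "co:Acme"),
    ("well:A-12",   "inField",     "field:Brent"),
    ("well:B-07",   "hasOperator", "co:Acme"),
    ("well:B-07",   "inField",     "field:Forties"),
    ("field:Brent", "inBasin",     "basin:NorthSea"),
}

def match(pattern):
    """Return triples matching a (s, p, o) pattern; None acts as a
    wildcard, like a variable in a SPARQL query."""
    s, p, o = pattern
    return [t for t in triples
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)]

# 'Which wells does Acme operate?' -- in SPARQL, roughly:
# SELECT ?w WHERE { ?w :hasOperator co:Acme }
acme_wells = sorted(t[0] for t in match((None, "hasOperator", "co:Acme")))
```

The ‘overlay’ pitch is that such triples are generated as views over existing sources rather than copied into a new silo, so new questions become new patterns, not new ETL jobs.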
Sam Chance explained that Anzo acts as an overlay to existing data sources, providing a data fabric for ad-hoc access and applications and capable of executing new queries on the fly. Anzo is said to be based on open standards. In the Q&A, Anzo was compared to another integration technique, data virtualization. There are similarities, the ‘one overall source’ is a common metaphor. But Anzo’s graph model overlay and semantics is said to allow a business-oriented view and to enable ‘machine reasoning’.
In a separate announcement, Cambridge Semantics has released AnzoGraph DB V2.0. AnzoGraph DB (AGDB) is a separate product from the Anzo database engine core. AGDB lets third parties build their own graph database applications, scalable ‘beyond the capabilities of the many transactional and single node graph databases on the market’. V2.0 of AGDB adds labelled property graphs in RDF and an SDK for MPP-capable analytical functions that run in a cluster. Cambridge Semantics is also taking an idea from Neo4j’s marketing playbook with the introduction of a free edition to ‘allow the graph community to start using our analytical graph in commercial projects at no cost’.
* The podcast was hosted by the Oil and Gas Journal. A replay is available on the Cambridge Semantics website.
The Alberta Energy regulator is rolling out OneStop, an automated review system for companies applying to the AER for permits. The tool currently supports applications for reclamation certificates, new pipelines, Water Act-related major projects and closure activities. OneStop is a component of the AER’s integrated decision approach and will speed application processing. OneStop streamlines the AER’s baseline review of applications for lower-risk activities. Applications for higher-risk activities undergo a more detailed assessment by AER staffers. By 2022, all application processes will have moved to the new tool.
As of August 2019, the Alberta Energy Regulator has been joined in Calgary by the newly established Canadian Energy Regulator. The CER is to fulfill the government’s commitment to ‘build a modern energy regulator’ and oversee ‘a strong, safe and sustainable Canadian energy sector’ in its ‘transition to a low-carbon economy’. The CER is to ‘enable modern effective governance, more inclusive engagement, greater indigenous participation, stronger safety and environmental protection and timelier project decisions’. The CER is a convenient 12-minute walk away from the AER.
In its 2019 Year in review, the Texas Railroad Commission reports a ‘turning point’ in its progress towards modernization with its mainframe transition project, the largest IT infrastructure project in its history. The RRC’s ITS Division has begun the transition of data and programs from an ‘antiquated’ mainframe system to a modern, cloud-based platform. The RRC has also opened ‘digital avenues’ for customer interfaces with agency processes, enabling ‘streamlined and efficient administration of regulatory requirements’. Public access has been improved with the launch of the RRC Online inspection lookup (RRC OIL), a web-based application that enables the public to search the inspection and compliance history of Texas wells. The RRC has also released several public data sets for download.
The UK’s Oil & Gas Authority has embarked on a Digital Excellence Initiative as presented by OGA CIO Simon James at last year’s AAPG meet in Edinburgh. James observed that the energy debate is ‘more polarized than ever’ and that ‘quality dialogue and evidence-based action’ is required. The digital revolution at OGA was mandated in the 2014 Wood Report and has seen an open data initiative in 2016 and the construction of the ‘world’s largest oil and gas economic model’ using software from Palantir Solutions (now Aucerna). The OGA’s digital offering today consists of the UK National Data Repository, a 170+ terabyte dataset of 12,000 UK wellbores, the ongoing release of ‘carefully targeted, value-adding data’ and the publication of benchmarking reports. The latter include reports on production efficiency, field recovery factors, operating costs and seismic compliance. More from OGA.
Streamline Innovations operates natural gas treating units in South and West Texas that convert hydrogen sulfide gas into fertilizer-grade sulfur. Inductive Automation has automated Streamline’s processing units with remote bi-directional control and historian data collection. Streamline’s ‘Valkyrie’ process leverages a ‘robust but complex’ control and automation system which includes the Ignition software platform. Streamline’s ‘micro’ units required low-cost, basic automation with remote access to data and local access to an HMI; the cost of traditional PLC-based automation was prohibitive. A hub-and-spoke configuration leverages Ignition Edge running locally on a Moxa device, talking to the cloud-based Ignition server via MQTT. Python scripts perform fast Fourier transforms on pump vibration data and blend in weather forecasts from the National Weather Service to determine optimal operating temperatures. Alerts are broadcast to operators via Twilio when issues arise.
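The FFT step can be illustrated with a naive DFT picking out the dominant tone in a synthetic vibration signal. In production one would use numpy.fft on real sensor data; the sample rate and 25 Hz shaft tone below are invented:

```python
import cmath
import math

def dominant_frequency(signal, sample_rate):
    """Naive DFT (what numpy.fft computes fast) returning the
    frequency of the largest-magnitude bin, ignoring the DC term."""
    n = len(signal)
    best_k, best_mag = 1, 0.0
    for k in range(1, n // 2):
        coeff = sum(x * cmath.exp(-2j * math.pi * k * i / n)
                    for i, x in enumerate(signal))
        if abs(coeff) > best_mag:
            best_k, best_mag = k, abs(coeff)
    return best_k * sample_rate / n   # convert bin index to Hz

# Synthetic pump vibration: a 25 Hz shaft tone sampled at 200 Hz.
rate, n = 200, 64
signal = [math.sin(2 * math.pi * 25 * i / rate) for i in range(n)]
peak = dominant_frequency(signal, rate)   # ~25.0 Hz
```

A shift in the dominant frequency, or energy appearing at new frequencies, is the kind of signature such scripts watch for in pump vibration data.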
Streamline has also developed ‘HMI on a Stick,’ using an Amazon Fire Stick or Linux Stick Computer which turns any television into a unidirectional HMI. This has been given to clients, operators and management who can now monitor operations when at home or in a hotel room. More from Inductive Automation.
Opto22 blogger Terry Orchard recently extolled the merits of the new V1.0 release of Node-RED. Node-RED is a programming tool for wiring together hardware devices, APIs and online services in ‘new and interesting ways’. Node-RED has its roots in manipulating MQTT data streams but is now a more general tool. The new release offers multiple enhancements to the browser-based editor. The MQTT nodes now support MQTT v3.1.1, with v5 support on the way. Under the hood, fully asynchronous message routing makes for more predictable stable data flows. ‘The 1.0 release is a more mature, refined version of the software we’ve grown to love, and it continues to get better as more updates are rolled out’.
Canadian netDNA has automated water management for fracking and water transfer for New Wave Energy Services, a Permian Basin service provider. New Wave’s trailer-mounted water transfer units with four 500-800 HP pumps and 12” diameter intakes have been automated by netDNA using Opto22’s ‘Groov’ EPIC (edge programmable industrial controller). EPIC pulls data from the genset controller, including RPMs and associated telemetry, and publishes it to a central broker/server using its built-in open-source tool Node-RED and MQTT. More from Opto22.
An article in issue 259 of the excellent Linux Format magazine, ‘Node-RED: Build a smart thermal monitor,’ shows how easy it is to connect a Raspberry Pi or an Android smartphone to IoT sensors using Node-RED and MQTT. Node-RED enables drag-and-drop configuration of hardware nodes and logic, and makes remote sensor data visible in the public cloud via the HiveMQ broker. Android access uses the free myMQTT app.
CO-LaN, the organization behind the CAPE-OPEN standard for computer-aided process engineering, is seeking a rapprochement with other process standards bodies including CFIHOS and DEXPI. Discussions are at an early stage, but engineering-data standardization overlaps with some of what CAPE-OPEN addresses. Oil IT Journal asked CO-LaN CTO Michel Pons for details.
On engineering data and process model exchange:
Standardization of engineering data within process simulation could help transfer information between process simulation results, PFDs and P&IDs. Since 2019, the CAPE-OPEN standard has included a flowsheet monitoring interface specification that potentially gives access to much of the information available within a process model: stream data and unit operation parameters are among the pieces of information easily accessed by a flowsheet monitoring component. The first business case for flowsheet monitoring revolves around applications such as the US EPA’s WAR algorithm*, but retrieval of process data for transfer to other applications appears to be another valid business case.
CFIHOS and DEXPI aim at data exchange between applications, while CAPE-OPEN aims at interoperability between applications. But all are involved in the description of a digital twin of a process plant. CO-LaN is evaluating its participation in initiatives launched by other standards development organizations in this domain.
Michel Pons Technologie SARL
* The EPA’s Waste Reduction Algorithm is used to evaluate the environmental impacts of chemical process designs to mitigate excess production and waste. WAR is a free download from the EPA website.
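The flowsheet monitoring use case above can be caricatured in a few lines. This is a purely illustrative toy, not the CAPE-OPEN interface definitions (which are COM/.NET middleware specifications); all class and field names here are hypothetical:

```python
# Toy model of what a flowsheet monitoring component retrieves from a
# running simulation for hand-off to another application. Names are
# hypothetical, not CAPE-OPEN identifiers.
from dataclasses import dataclass

@dataclass
class MaterialStream:
    name: str
    temperature_k: float
    pressure_pa: float
    flow_kmol_s: float

def collect_stream_report(streams):
    """Gather per-stream data, e.g. for transfer to a PFD/P&ID tool."""
    return {
        s.name: {"T_K": s.temperature_k, "P_Pa": s.pressure_pa, "F_kmol_s": s.flow_kmol_s}
        for s in streams
    }

report = collect_stream_report([MaterialStream("FEED", 350.0, 101325.0, 0.5)])
print(report["FEED"]["T_K"])  # 350.0
```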
Researchers at Stanford have discovered* a novel way of fingerprinting subsurface fluids by sequencing the DNA of microbial communities in samples of reservoir fluids. The approach is used to identify fluid pathways through the subsurface. Lara Streiff, writing in Stanford Earth Matters, explained that traditional methods for identifying well connections and natural fractures, well logging and borehole imaging, have significant limitations as ‘they do not reach the large spaces between wells. Seismic data can describe a larger area, but with limited resolution’. Analyzing the chemistry of these fluids is also useful, but the DNA of the microbial community is said to produce more specific results.
The Stanford study targeted geothermal wells where water is circulated through the subsurface to recover thermal energy. As water travels through the subsurface it picks up a unique collection of microbes that create a revealing DNA ‘barcode’. The approach has been used to map actual flow trajectories between injectors and producers, and also has application in oil extraction, in mapping the spread of contamination, in assessing artificial fracturing and in determining the leakage potential of a carbon sequestration site. The method was tested at the Sanford Underground Research Facility in Lead, South Dakota, once the deepest gold mine in North America. The study, authored by Yuran Zhang, was published in the November 2019 issue of Water Resources Research. The research was supported by the Stanford TomKat Center for Sustainable Energy.
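The study’s analysis is far more sophisticated, but the underlying intuition, that wells whose fluids carry similar microbial ‘barcodes’ are likely connected, can be sketched with a simple community dissimilarity measure (the abundance values below are hypothetical):

```python
def bray_curtis(a, b):
    """Bray-Curtis dissimilarity between two taxon-abundance vectors (0 = identical)."""
    num = sum(abs(x - y) for x, y in zip(a, b))
    den = sum(a) + sum(b)
    return num / den

injector = [120, 30, 5, 0]    # taxon abundances in injected water (hypothetical)
producer_1 = [110, 25, 8, 2]  # similar community -> likely connected pathway
producer_2 = [2, 5, 90, 140]  # dissimilar community -> likely isolated

print(bray_curtis(injector, producer_1) < bray_curtis(injector, producer_2))  # True
```

Ranking producers by dissimilarity to the injector gives a crude connectivity map of the kind the DNA barcodes support.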
* There is already quite a body of knowledge on microbial DNA in the oilfield. A search on OnePetro produces 33 references to ‘microbial DNA’.
The MIT Energy Initiative (MITEI) has awarded a grant to Rafael Gómez-Bombarelli, assistant professor in MIT’s Department of Materials Science and Engineering, from its million-dollar ‘seed fund’ program, which supports early-stage innovative energy research at MIT through an annual competitive process.
The monies will be used to apply artificial intelligence to the ‘zeolite conundrum’: despite millions of possible molecular configurations of these nano-frameworks, only 248 have been discovered to date. Zeolites are widely used in refining as molecular sieves and catalysts.
The ‘Automatic design of structure-directing agents for novel realizable zeolites’ program will use machine learning and simulation to accelerate the discovery cycle of zeolites and ‘expedite progress toward a variety of innovative energy solutions’. Hitherto, discovery of new frameworks has relied mostly on trial-and-error in the lab, a slow and labor-intensive approach. Gómez-Bombarelli will be using theory to speed up that process. ‘Using machine learning and first-principles simulations, we’ll design small molecules to dock on specific pores and direct the formation of targeted structures. The computational approach will drive new synthetic outcomes in zeolites faster’.
For more on zeolites, read Thomas Degnan’s 2000 paper, Applications of zeolites in petroleum refining.