A recent podcast on the ‘Launchpad’, BP’s new energy community, heard Launchpad CTO Tom Grey grill Karen Scarborough*, Senior Technology Associate at BP, who stated that the earlier uses of blockchain were ‘not so practically applicable’. The use of blockchain in the supply chain has been an ‘interesting journey’ that ‘has not panned out as we thought it would’.
BP was involved in a number of tracking and tracing applications to investigate the use of blockchain. A lot of thought was put into how consortia could be formed, with a private blockchain for energy providers, another private blockchain for, say, banking. However, as Scarborough relates, ‘our original expectations did not pan out as we thought. Only one new member has joined the consortium since it kicked off. Thus far, the consortium model isn’t working. That’s not to say that some of that won’t change in the future.’ So what has made it that way? Is it the use case of the technology? ‘People thought that blockchain was going to be great for track and trace and supply chain management. It turns out that that is not really where blockchain shines. Blockchain is not meant to store a lot of data so it’s not a great tool for tracking and tracing. In fact, there are lots of legacy technologies that do this better.’
So what is left for blockchain? Scarborough, who is a leading light in the Ethereum blockchain community, thinks that although the private blockchains have failed, there is still ‘excitement’ in the public blockchain space, ‘where there are lots of apps with people actually using them, rather than the speculation we have seen in previous years’.
Even today BP’s procurement is less than perfect. Katy resident Angelica Garcia Dunn recently admitted to defrauding BP of some $2.2 million. Dunn was working as a contract escrow agent to BP, making vendor payments to its railcar lessors and repair vendors. A portion of the lump sum payments destined for these third parties ended up in her own business accounts.
Comment: Back in 2017, Oil IT Journal editor Neil McNaughton described blockchain’s use in commodity trading as ‘another misunderstanding’.
* Scarborough is a co-founder of Molecu, a BP-incubated carbon offset advisory boutique.
The UK Government’s Oil & Gas Authority flipped the switch on the UK National Data Repository on the 1st of July 2021, marking the end of an era for Common Data Access, the industry-backed provider of seismic and well data services. CDA was founded some 25 years ago by UK operators and later resulted in the creation of UKOilandGasData, a collaborative online data sharing platform.
The Wood Review of 2013 suggested that oil and gas data would be better managed and distributed by the government and set the ball rolling on a National Data Repository, run by OGA. CDA’s data provided the foundation of the NDR, which launched in 2019. In 2020, OGA contracted Osokey, a UK-based provider of cloud-based subsurface data management, to upgrade the NDR with an Esri ArcGIS Online frontend.
Invited to ‘explore the new improved NDR’, we were met with the warning, ‘Firefox web browser has been detected. A Chromium-based web browser is required to access the full NDR functionality’. Whatever. Some 50 years’ worth of crucial North Sea data is already available from the NDR with ‘up to 4,000% more’ data coming over the next five years to ‘maximize opportunities for the energy sector, inform investment decisions and assist the drive towards net zero’. Data from the NDR is free of charge up to a 3TB/month limit over which a ‘minimal charge’ is made.
As the new NDR goes live, CDA and UKOilandGasData have reached the end of the line. Daniel Brown, CDA’s executive director said, ‘CDA’s efforts to support data sharing between operators laid the foundations of a massively successful industry project that not only saved well over £250 million during its lifetime but was also key to the creation of the UK NDR. CDA wishes the OGA and Osokey well as they launch this next generation of UK NDR, which will be a key resource for oil and gas, for the carbon capture and storage industry, and for geoscience researchers worldwide for many years to come.’
~ ~ ~
Meanwhile, over in Houston it was announced that Energistics, the upstream data standards body formerly known as the Petrotechnical Open Software Corporation (POSC), will likewise be closing its doors at year-end 2021. The move derived largely from the overlap of Energistics’ membership with the shiny new Open Subsurface Data Universe, OSDU. The corporations that call the shots deemed that one membership subscription was better than two and jointly opted to stay with The Open Group, home to OSDU. Energistics is putting a brave face on the changes and is to become an ‘affiliate’ of TOG, retaining a web presence. But the staff will be laid off and ongoing maintenance of the Energistics standards will be up to the goodwill of the community. More from the Energistics in Transition web page.
It is 25 years since I started writing what was at the time Petroleum Data Manager. The newsletter became Oil IT Journal circa Y2K. That means over 200 editorials on this and that, expressing, I’m sure, sometimes contradictory viewpoints. I’d like to highlight a few themes that have emerged from our reporting over recent years. One concerns the yin and yang of GIS* vs. geotechnical software as the de facto integration tool for the upstream. Another, the portability of the oil and gas skill set to new energies. And finally, how BP has nailed my own bête noire, blockchain.
Back in 2014, Total’s seminal PUG presentation demonstrated GIS’ capacity to integrate disparate information in support of a new venture. Geoscience software still struggles in the integration space as witnessed by the ongoing effort in OSDU. Another theme du jour is what the geo workforce will be doing as companies transition to new energy sources. Insofar as most new energy sources (wind, photovoltaic) are located on the earth’s surface, this leaves the subsurface geo/PE brigade at something of a disadvantage. There has been a lot written about the potential for recycling the subsurface knowledge base and workforce in geothermal and underground CO2 storage. As yet, neither of these appears to be hiring. Anyhow, it’s unlikely that either geothermal or CCS would ever pay the kind of salaries that the geo brigade became accustomed to during the fat years of E&P. Facilities engineering should offer opportunities especially in offshore windfarms, although the expertise here is shared between oils and their contractors who are already doing good business with their utility customers.
If oil country subsurface know-how may not be hugely applicable to the world of new energy, what about the surface? As the folks from Esri like to say, ‘everything has location’. Which brings me to the highlight of our virtual report in this issue from the 2021 Esri Petroleum User Group (PUG). Anne Luise Procida headed up Ørsted’s GIS effort back in the day when it was an oil company. Now Procida has demonstrated that GIS not only transfers well to new energy, it is, at least for Ørsted, at the very heart of the new business, the ‘spider in the middle of the web’.
~ ~ ~
I think that our lead this month, where we learn from BP that blockchain is ‘not a great tool for tracking and tracing’, and that ‘there are lots of legacy technologies that do this better’, is one of the highlights of our reporting. A major problem of B2B communications is that the gatekeepers, the press officers and also, it has to be said, many of the company folk who are allowed out to give presentations, are hard-wired to present only ‘success’, often before it has been achieved. Failure is air-brushed out of the picture. If pharma did its business this way, we would all be taking shots of disinfectant! When you google ‘blockchain in oil and gas’ you find hundreds of articles all saying the same thing, all reporting on the successful formation of one or other consortium that is going to fix a problem that nobody ever really thought they had. The puffery is not restricted to the marketing department or the larger B2B publications. The academics are in on the act, with rehashes of the same tired nonsense masquerading as ‘science’ in publications such as OnePetro, ACM, IEEE, Springer … this stuff is everywhere! Except that … ‘Legacy technologies do this better!’ … indeed.
A round of applause to BP’s Launchpad for telling it like it is, even sotto voce. The Launchpad interview is well worth listening to* in its entirety as Scarborough provides an update on where the Ethereum blockchain is heading. Indeed, it should be compulsory listening for all who are thinking of using this in an industrial context. The public blockchain infrastructure that Scarborough now favors is built on an indescribable morass of shifting technology and gobbledygook that seems to be forever firefighting its latest gotcha!
* Geographic information system.
* As is her talk on the State of Ethereum given at the 2021 Ethereum Anniversary Special event.
A new 40-page report from OGUK, The impact of digitalization on data professionals, authored by a team from Aberdeen’s Robert Gordon University, highlights the ‘ever growing range of skills required to work in data in the oil and gas industry, and the challenge of finding all these skills in one individual’. The report is described as a ‘first step towards a digital skills agenda for data professionals’.
RGU makes the unsurprising observation that ‘data and digital skills are critical to the oil and gas industry’. It has rather more trouble defining ‘digitalization’ itself, even though it, along with the ‘fourth industrial revolution’, is ‘redefining the data profession’. Digitalization comes in two flavors, as a) ‘the revision of existing workflows to utilize emerging tools that enable faster, more comprehensive and higher quality outputs, often using automation’, and/or b) the ‘complete transformation of old workflows where the emerging tools offer such a different way of reaching goals that the new workflows are fundamentally different’. The latter overlaps with the ‘cyber-physical systems of Industry 4.0’ and is where most ‘high-profile digital business efforts are located’.
The study embodies the findings of a survey of 76 individuals and nine interviewees. The ‘self-reported’ results tend to show that the data managers mark their own homework highly. 50% or more claim ‘good to excellent’ competency in the understanding of data-related issues and in understanding the data lifecycle. So what is wrong with the status quo?
Seemingly ‘increasing data fluency and data dependency within organizations are challenging established organization and leadership models’ and data professionals are unsure that their data and digital skills will be adequate to address these new challenges. ‘Half of those surveyed did not anticipate support from their employer to keep their skills up to date’. The report suggests that ‘digitalization [should be] a fundamental element of training and skills development for all staff, rather than something that is delivered by a project team, or encountered on a piecemeal basis’.
The study concludes with a succinct graphical digital skills roadmap, illustrated with a neat ternary diagram with ‘data scientists’, ‘data managers’ and ‘data users’ at its apexes. While the roadmap claims to ‘signpost the way forward’, finding one’s way around the triangle could prove hard.
Comment: This conclusion, based on a rather nebulous definition of ‘digitalization’, stems perhaps from a too-literal reading of the output of the IT consultants. Oil country data managers do need to be jacks of all trades. Education in subjects on the periphery of data management is important, but this needs to include the business of geoscience and engineering as much as, if not more than, the latest IT trends. RGU’s researchers cite Deloitte, the World Economic Forum, the International Energy Agency, OPITO and many SPE authors but seem oblivious to the body of knowledge that has emanated over the past decades from the PESGB’s data management SIG, and from conferences on oil and gas data management organized by PNEC, ECIM, SMi and others. All the stuff in fact that Oil IT Journal (formerly Petroleum Data Manager) has reported on for the last 25 years. And, no, Oil IT Journal did not get any citations either!
Back in 2010, when Anne Luise Procida joined Ørsted, oil and gas was the biggest part of the company. All that is gone now: Ørsted, a poster child for the energy transition, operates some 7.5 gigawatts of wind energy with another 2.3 GW under construction. Ørsted’s GIS team provides cradle-to-grave support of offshore wind farms, from planning and bidding to development and operations. Procida proudly states that ‘we are the spider in the middle of the web’. She works with other teams to gather data and define geospatial standards that will be used throughout the project lifetime. The usual geo-data issues are omnipresent and necessitate attention to data quality and geodetics to assure accurate ‘as-built’ documentation at handover. During operations, GIS informs asset integrity activity (cables exposed, seabed integrity issues) and MRO*, with Esri tools, notably ArcGIS Field Maps. Ørsted has been working in the USA since 2015 with a string of acquisitions. The US lacks industry standards for offshore wind; these would help in collaboration with contractors and governments. Many US consultants use CAD or Google Maps, making it hard to exchange data. There are also many overlapping US data sources. Procida reflected on the transition from the oil and gas world to wind to conclude that the power of GIS makes it very transferable. Offshore cables work the same way as pipelines, wind leverages the same data types, and geophysical and geotechnical data is managed with a modified SSDM* model. The same types of collaboration and integration teams are involved across the windfarm lifecycle.
* Maintenance, repair and operations.
* The IOGP Seabed survey data model.
Jamie Lambert (ExxonMobil) described an integrated workflow for decommissioning offshore oil and gas facilities. The offshore Australia Gippsland basin has seen 50 years of production. Exxon operates 24 facilities, 1,000 km of pipeline and 400 wells. The Offshore Oil and Gas Decommissioning Liability (Australia) report (Advisian 2020) found that some $50 bn of decommissioning work is needed. A subsea material data register has been created for planning and execution of decommissioning and the removal of debris and unused equipment. Data has been pulled from many years’ worth of inspection reports into a single, basin-wide dataset with standard attributes. The system provides simple workflows for non-GIS users. An ArcGIS Online application allows collaboration and offline work. Development started with a spreadsheet, designed by all stakeholders. When finalized, the data was migrated into an ArcGIS Hub host. Web AppBuilder, Survey123 and StoryMaps are also used. A Hub dashboard shows current decommissioning status. Recovery campaigns are based on vessel availability, capacity, etc. Survey123 Connect was used to author and report on surveys in real time. A big win was the fact that users are now ‘self-sufficient’.
In an interview with host Sumant Mallavaram, Tony Battle, CIO, Shell Energy Retail traced his own career path from upstream mega projects to the customer-centric downstream. As the industry evolves towards renewables, there is good potential for geospatial professionals to add value. ‘We are here because of Esri, but the future of data is agnostic’. There is a huge gap in data science in regard to transformations, coordinate systems, and geospatial accuracy. ‘In a world where location is key, there is a great opportunity for geospatial pros to fill this gap’. Today Shell’s geomatics community is focused on geomatics in the upstream. But there are many opportunities in the downstream. Folks need to avoid being ‘trapped in geomatics’, raise awareness of their skills, and help others realize what geospatial can bring elsewhere in the business.
Matt Horn of RPS Group discussed how onshore pipeline high consequence area (HCA) analysis is applied offshore. RPS has been working for a Gulf of Mexico operator to address what happens when there is an incident. This involves marine spill modeling to address trajectory, fate, and effects in an approach that mirrors the PHMSA ‘who, what and when’ approach to onshore spills. RPS’ OilMapLand, developed over 20 years, tracks spills flowing across relief and into waterways to determine HCA/AOI impacts. Onshore trajectory mapping is relatively simple as the relief does not change and there is limited variability throughout the year. But in the offshore environment, currents, tides and waves add significant spatio-temporal variability. Traditional HCA cannot hack it. SIMAP, RPS’ spill impact model application, provides a stochastic approach, mapping hundreds of scenarios of trajectories with random start dates and wind directions to get the most probable trajectory and minimum time to land. SIMAP also computes worst case discharge volumes and affected resources. The outcome determines whether a proposed pipeline route is, or is not, a ‘could affect*’ segment in PHMSA terminology. The tool delivers reports to PHMSA or into a PODS database. Operators use the results in emergency response/clean up training exercises.
* an environmentally sensitive area
In a short LinkedIn exchange after the event, Horn added the following. “The HCA ‘could affect’ designation is typically made at the design phase to set up the regularity of inspections that are used during operations. For on-land pipelines, there are design changes (e.g., re-route the pipeline, add HDD, or increase wall thickness) to reduce the likelihood of an incident (accidental discharge). PHMSA regulates pipelines in the on-land environment (with spill response from EPA, or USCG in navigable waterways). But there’s a transition to BSEE in the offshore (with spill response from USCG) and lots of regulations to work around. This type of analysis is useful for operators to get an understanding of the types of receptors that could be affected, how likely (probability) assuming there is a release, and an understanding of how long it would take (minimum time) to those effects.”
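Horn’s stochastic approach can be caricatured in a few lines of code. The following is a toy Monte Carlo sketch, not RPS’ SIMAP: the 3% wind-drift factor, the shoreline geometry and all parameter values are illustrative assumptions, but the structure (many random scenarios, probability of landfall, minimum time to land) mirrors the method described above.

```python
import math
import random

def simulate_spill(n_scenarios=500, shore_bearing=90.0, shore_distance_km=40.0,
                   current_kmh=0.5, dt_h=1.0, max_h=720, seed=42):
    """Toy stochastic spill-trajectory model. For each scenario, draw a random
    wind and advect a slick until it crosses the shore distance (or times out).
    Returns the fraction of scenarios reaching shore and the minimum time to land."""
    rng = random.Random(seed)
    times_to_land = []
    for _ in range(n_scenarios):
        wind_dir = rng.uniform(0, 360)   # random wind direction, degrees
        wind_kmh = rng.uniform(5, 40)    # random wind speed, km/h
        x = y = 0.0                      # spill origin
        t = 0.0
        while t < max_h:
            # slick drifts at ~3% of wind speed plus a steady current toward shore
            dx = 0.03 * wind_kmh * math.sin(math.radians(wind_dir)) * dt_h
            dy = 0.03 * wind_kmh * math.cos(math.radians(wind_dir)) * dt_h
            dx += current_kmh * math.sin(math.radians(shore_bearing)) * dt_h
            dy += current_kmh * math.cos(math.radians(shore_bearing)) * dt_h
            x, y, t = x + dx, y + dy, t + dt_h
            # project position onto the shoreline bearing to test for landfall
            reach = (x * math.sin(math.radians(shore_bearing))
                     + y * math.cos(math.radians(shore_bearing)))
            if reach >= shore_distance_km:
                times_to_land.append(t)
                break
    p_land = len(times_to_land) / n_scenarios
    return p_land, (min(times_to_land) if times_to_land else None)

p, t_min = simulate_spill()
print(f"probability of landfall: {p:.2f}, minimum time to land: {t_min} h")
```

A real model would of course use hindcast current and wave fields rather than a constant drift, but even this sketch shows why hundreds of scenarios are needed: the answer is a distribution, not a single trajectory.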
Speaking at the 2021 Esri plenary event, Steven Bjerring traced BP’s geospatial journey from the 2015 global roll-out of OneMap V2.1 (with BP’s ‘Chili’ geospatial workflows) to OneMap 6, released in 2020. Latterly OneMap has seen a massive cloud migration with all OneMap systems moved, and an expansion of mobile apps and use cases. These include change detection of refinery build progress using machine learning, live price tracking at BP and competitor retail sites, global pandemic response solutions and more. OneMap leverages the ‘Citizen Developer’ concept, defined as ‘a person who creates new business applications for consumption by themselves and others, using platforms that have been licensed and approved by BP’. CD is ‘driving innovation, delivery and value from core technology platforms’. OneMap was designed from the ground up with the CD model in mind. Configurable apps support web/mobile with an operations dashboard, built with ArcGIS WebAppBuilder. ArcGIS Pro tools and Safe Software’s FME automate, schedule and run data processing, integration and analytical workflows. BP’s citizens still need to follow the rules of the road, watching out for data confidentiality, and must always have a manual work around for business critical functions. CDs should not mess with systems of record or duplicate data. It’s also recommended not to use data from other citizen developed apps! BP’s geospatial technology landscape encompasses plethoric data sources and formats that pass into a constellation of ingestion/data wrangling tools, and on for consumption in OneMap/Esri as well as other tools like Autodesk, AWS, PowerBI, Salesforce and more. All in all, there are 200+ tools and 2,000+ citizen web apps to choose from. BP is now working toward the ‘intelligent nervous system’ that will embed 5G, new satellite data, a live globe of data, all feeding predictive AI/ML and ultimately quantum computing and ‘human-less AI-driven choices’.
Darron Pustam (Esri) showed how ArcGIS Mission is used to manage field personnel during an incident. Mission is now a single system that streams and logs data including on- and off-network resources. The ‘all inclusive’ command and control product provides a ‘single pane of glass’ view to create and manage a mission, along with an overview for leadership. The system provides peer-to-peer communications for responders. Scott Noulis demoed the product in use on a spill at a tank battery. The map provides live locations of workers who can chat and send geomessages of spill locations. A bi-directional mission responder mobile app means that all look at the same data. The system can spin off spill flow simulations, assign tasks and capture events for regulatory reporting.
Malcolm Ross presented Eavor Technologies’ closed loop geothermal system. Classic geothermal doublet production has not always been successful. ‘Up to 50%’ of such wells are ‘dry’, i.e. not producing economic heat. Induced seismicity is a problem, as is the ‘dangerous high temperature and pressure environment’. Eavor’s solution leverages horizontal well technology to drill a family of multilaterals that act as a heat exchanger with the surrounding rock. The system is said to operate at temperatures of ‘from 30°C up to hundreds of degrees’. A ‘siphon effect’ means that the system can run ‘without pumps’. Wells are drilled open hole with a polymer seal. A target bottom hole temperature in the 150°C range means that the potential is ‘much more widespread than conventional’. 70% of the world has geothermal potential. GIS risk segment analysis is used to locate optimal sites with respect to geothermal gradient, reservoir thickness and population density.
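For a sense of the arithmetic behind any closed-loop system, the recoverable thermal power is simply Q = ṁ·cp·ΔT. The sketch below uses purely illustrative figures (the flow rate and temperatures are our assumptions, not Eavor data):

```python
def thermal_power_mw(flow_kg_s, t_out_c, t_in_c, cp_kj_kg_k=4.18):
    """Thermal power recovered by a closed-loop geothermal circuit:
    Q = m_dot * cp * (T_out - T_in). Water heat capacity assumed."""
    return flow_kg_s * cp_kj_kg_k * (t_out_c - t_in_c) / 1000.0  # kW -> MW

# Illustrative figures only: 40 kg/s of water heated from 30°C to 120°C
print(f"{thermal_power_mw(40, 120, 30):.1f} MWth")  # ~15 MW thermal
```

The same arithmetic explains why a bottom hole temperature target in the 150°C range matters: the deliverable power scales linearly with the temperature lift the rock can sustain.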
For the IT-minded GIS folks, Esri has announced ArcGIS Enterprise on Kubernetes, as explained in Trevor Seaton’s recent blog.
The Kubernetes option is part of the 10.9 ArcGIS Enterprise release and can be viewed as a third deployment option alongside Esri’s supported Windows and Linux offerings. The Kubernetes microservices-based offering is said to suit high availability/high throughput scenarios and offers a streamlined, single script deployment without the need for multiple setup files or complex configuration of ArcGIS Server, ArcGIS Data Store, or Portal for ArcGIS. Other Esri resources include Determine whether Kubernetes is right for you and the 2021 Dev Summit presentation on ‘Introducing ArcGIS Enterprise on Kubernetes’.
Read the official proceedings from the 2021 PUG here and access the individual presentations here.
Applied Petroleum Technology has launched APT Allomon, a methodology that analyzes produced fluid composition in shale oil wells to identify drainage patterns and maximize production.
Beicip-Franlab has released OpenFlow Suite 2021 with a new basin modeling simulator, resulting from years of R&D at IFPen, and modules for biogenic gas and geothermal studies.
Exprodat has rolled out Unconventionals Analyst 2.1 for ArcGIS Pro. The new release includes an ‘analyze well relationships’ functionality to identify well parent-child relationships, said to be key to predicting interference. A new ‘calculate lateral spacing’ tool computes well spacing statistics across an area of interest to assess the efficacy of a well pattern, drilled or planned.
Kappa Engineering has rolled
out Kappa Server 5.40. Watch the video.
Petrosys PRO 2021.1.1 includes new ArcGIS Pro connectivity, enabling the display of subsurface data directly from E&P software.
The 2021.06.0 release of Ceetron
Solutions’ ResInsight is now ready for download from Github. Read the release notes here.
Schlumberger and Equinor have jointly developed a cloud-enabled 3D workflow using data from the GeoSphere HD reservoir mapping-while-drilling service. The solution is used to optimize well placement in real time.
Aspen Technology has rolled out the Aspen Industrial AI Workbench, a module in the V12.1 release of the Aspen AIoT Hub. The integrated AI environment allows data scientists to accelerate the transformation of data into productized AI/ML algorithms, ‘working hand-in-hand with subject matter experts’.
Brüel & Kjær Vibro has released a new version of its Vibro Condition Monitoring (VCM-3) system and SETPOINT CMS software, providing ‘out of the box’ early diagnostics for a range of rotating assets.
Emerson’s Professional Service Team is offering a Digital Maturity Model Quick Index service to help clients compare operations with their peers and determine where improvements will yield the largest ROI and ‘discover what digital transformation projects will tie to your business KPIs’.
Following its recent acquisition of Zedi’s
software and automation businesses, Emerson
has developed an autonomous rod pump management solution with a machine
learning capability. More from Emerson/Zedi.
Gexcon has announced FLACS-CFD 21, the latest version of its fire and explosion modeling software. Simulations can be run in the cloud from the workstation GUI with result files downloaded to a local system. The Flacs GUI is now also available in Chinese.
Honeywell has added the ‘Operator Advisor’ to its Experion HALO (highly augmented lookahead operations) suite. OA uses ML-powered analytics to provide oil and gas, chemical, refining and petrochemical organizations with a ‘consolidated scorecard’ of automation use, along with recommended steps to address performance-related gaps.
Ben Orchard, blogging on the Opto22 website, explains how to port your DIY automation skunkworks from the Raspberry Pi to run on its industrial-strength Groov EPIC/RIO hardware controllers. Opto22 also provides resources for developers using Python, ‘the most popular language used on Groov hardware’.
Onboard Dynamics has launched the GoVAC Flex, a system that transfers recovered methane to either an adjacent pipeline or tube trailer. The ‘simple, mobile, fully integrated solution’ provides operators of natural gas pipelines with a tool to minimize greenhouse gas emissions during routine maintenance.
Radix Engineering and Software is offering a technology due diligence service to companies interested in mergers and acquisitions. The service consists of analyzing aspects such as architecture, infrastructure, availability, scalability and performance, technology costs, quality, information security and documentation, assigning risk factors. More from Radix.
Reveal Energy Services’ 2021.2 release of its Orchid completions evaluation platform now enables operators to integrate DAS data with offset pressure, microseismic, tracers, logs, and geologic information for a ‘360° interpretation’ of the physics driving well, pad, and unit development performance.
Seeq has released R52 of its eponymous machine learning software with new capabilities including add-on tools, display panels and user-defined functions. End users can now schedule Seeq Data Lab notebooks to run in the background, ‘fulfilling a top customer request’. Seeq has also spotted an opportunity as OSIsoft’s ProcessBook is being discontinued after 25 years. Customers are looking for a replacement and Seeq is stepping in with its own complement to OSIsoft PI Vision.
A recent lesson shared by the Well Control Incident Subcommittee of the IOGP (WCI Lesson Sharing 21-6) underlined the risk of taking a meter reading at face value when, in reality, data ‘validation’ technology embedded in software led to a dangerous false reading. The issue occurred during the drilling of a deepwater exploration well using managed pressure drilling (MPD). During drilling, flow out increased rapidly, in reality to a rate of 3,000 gallons per minute. Unfortunately, the software controlling the Coriolis meter that should have shown the abrupt change included a rather arbitrary determination as to what constituted ‘good’ data: a ‘data validation’ step considered flow-out values below zero or above 3,000 gpm to be ‘bad data’.
Consequently, rig personnel believed that flow out was zero and attributed the increasing surface back pressure to a plugged choke. The ‘zero’ flow out value meant that kick detection logic failed to spot the kick, since the surge occurred within the 20-second trend detection interval. Fortunately, other indications of the kick led to the well being safely shut in. But, as the IOGP points out, ‘If just one or two actions had been different or perhaps, just slightly slower, the outcome may have been much more severe than just time spent recovering from an influx’.
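The failure mode is easy to reproduce. The sketch below is a hypothetical reconstruction of the validation logic (the actual meter software is unnamed in the IOGP lesson), contrasting a naive reject-as-zero filter with a clamp-and-flag alternative that would have left the kick visible to downstream detection logic:

```python
def validate_flow_out(raw_gpm, low=0.0, high=3000.0):
    """Naive 'data validation' as described in the IOGP lesson (hypothetical
    reconstruction): readings outside [low, high] are flagged 'bad' and
    reported as zero, silently masking an off-scale surge."""
    if raw_gpm < low or raw_gpm > high:
        return 0.0, "bad"   # off-scale reading discarded as zero
    return raw_gpm, "good"

def validate_flow_out_safe(raw_gpm, low=0.0, high=3000.0):
    """Safer alternative: clamp to the nearest limit and raise a quality flag,
    so kick-detection logic still sees an abnormal value."""
    if raw_gpm > high:
        return high, "suspect-high"
    if raw_gpm < low:
        return low, "suspect-low"
    return raw_gpm, "good"

# A 3,050 gpm surge: naive validation reports zero flow, masking the kick
print(validate_flow_out(3050))       # (0.0, 'bad')
print(validate_flow_out_safe(3050))  # (3000.0, 'suspect-high')
```

The difference is the whole incident in miniature: both filters reject the raw number, but only the second one propagates a signal that something is off-scale rather than quietly reporting nothing at all.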
The IOGP makes several recommendations resulting from this near miss. These include ‘understanding the logic used in the MPD software and ensuring that conventional flow detection is available for MPD operations’. More generally, IOGP links this incident with other well control events where ‘root causes were associated with lack of understanding of the technology and associated procedures, leading individuals to forget basic rules such as shutting-in a well at the right moment in time.’ As a result, IOGP has collected its analysis of such incidents and advice in a new publication, Managing the introduction of new technology in well operations. More too from the IOGP Safety Zone minisite.
Comment: Suggesting that those involved in well design or rig personnel be cognizant of the code used in the control system is a big ask! It would be better if IOGP communicated details of the meter software (manufacturer, version number) so that other operators could check to see if they are also at risk. We put this to IOGP which replied … ‘When gathering and reporting information on safety incidents, both actual and high potential, as part of IOGP’s safety data collection program, the Well Control Incident reporting program, or other, we do not specify the names of products or services, whether it’s software, a piece of equipment, or a service provider. This information may or may not have been specified in the original report. If it were, it would not be shared with the wider Membership’.
Maria Mutti, geologist, carbonate specialist and current professor at Potsdam University, is president-elect of the American Association of Petroleum Geologists for the Europe Region.
Applied Petroleum Technology (APT) has opened an office in Houston to support US shale operators and companies working in the Gulf of Mexico.
Mark Kostryckyj is now a graduate technical consultant at Asset Guardian Solutions.
Jody Denis has been appointed as Senior Drilling and Completions Manager at Avanti Energy to oversee the company’s drilling programs in both Alberta and Montana.
Charles Beauduin and An Steegen are to lead Barco as ‘co-CEOs’ replacing Jan De Witte who has stepped down. Frank Donck is now Chairman of the Board of Directors.
Aaron Engen and Jonathan Hackett are to co-lead BMO Capital Markets’ new Energy Transition Group.
Michael Halloran is stepping back from his CAPE-OPEN related activities. He will remain an Associate Member of CO-LaN and will still contribute to its work.
Johan Krebbers has left Shell and is now working for Cognite. He remains involved in OSDU.
Eitan Fogel is the new CEO at CybeReady. He hails from GeoEdge.
EQT has named Bahare Haghshenas as Global Head of Sustainable Transformation and Sophie Walker as Head of Sustainability for the EQT Private Capital business line.
Andrew Swiger is to retire as SVP of Exxon Mobil after more than 43 years of service. Kathryn Mikells is now SVP and CFO. She hails from Diageo.
Tiffany Harris is now CEO and president at Foster Marketing. She replaces George Foster who will stay on as chairman of the board and minority stockholder.
Emily Reichert (Greentown Labs) is now a Forum Energy Technologies board member.
Jarad Daniels has been appointed Global CCS Institute’s new CEO. He hails from the US DoE.
Roger Martella is now GE’s Chief Sustainability Officer.
John Hall is now President and CEO at HARC, replacing Lisa Gonzalez who will join the National Audubon Society as VP, and Executive Director of Audubon Texas.
Marathon Oil retiree Mitch Little is now a Helix board member.
Jonathan Hough has joined Marathon Capital as MD Energy Transition Advisory. He hails from BMO Capital Markets.
Martyn Millwood Hargrave, chairman emeritus and founder of Ikon Science, is now a visiting professor in the Oxford University Earth Sciences department.
Jason Coposky has officially resigned from his post as iRODS Consortium Executive Director. Terrell Russell has been named Interim Executive Director.
Tim Ping is now CEO of K-Solv Group’s newly-acquired Energy Completion Services.
Matthew Sutton has been appointed Matrix Solutions President and CEO.
Samik Mukherjee is now McDermott’s EVP and COO.
Jeff Brashear, Gary Weiss, Don Bortniak, and Paul Silvis are now mIQroTech board members.
Navigator CO2 Ventures has named Elizabeth Burns-Thompson as VP of Government and Public Affairs, and Jordan Jones as Director of Business Development.
Einar Hass is the new head of the Norwegian Petroleum Directorate’s office in Harstad. He hails from Aibel.
Paul Addison has been appointed as an independent director of Orbital Energy Group.
Grant Dewbre is now COO and Saleh Sagr is SVP, MENA at Perma-Pipe International Holdings.
David Rincon (Schlumberger) is PIDX International’s new Business Development Committee Ambassador for Latin America.
Global energy technology firm RotoJar has rebranded to become HydroVolve, appointing Jamie Airnes as CEO and launching a range of technology solutions around its hydraulic-powered downhole engine.
Alexander MacKay is now a project engineer at Ryder Scott, Houston office. He was previously with KBR.
SEG and AAPG have appointed Vsevolod (Seva) Egorov as the next editor-in-chief of Interpretation and Bradley Wallet as deputy editor.
Gayle Burleson is now a Select Energy Services board member.
Claire Bramley is now CFO at Teradata, succeeding Mark Culhane who is stepping down. Andrey Alekseenko has been appointed as VP Teradata Nordics, Russia, Poland and Czech Republic.
Jeff Spath, Texas A&M Chair in Petroleum Engineering, has joined Tachyus as an Executive Advisor.
Acting CEO Meg O’Neill is now Woodside’s CEO and Managing Director.
Sergio Fernández Mena has been appointed VP Digital Technologies at YPF.
IOGP is recruiting for a Brussels-based Junior Institutional Relations Officer and a Membership Manager. More from IOGP.
All the folks at Energistics, following its takeover by The Open Group. More from Energistics.
The CO-LaN management board has started developing the first prototype of the new CAPE-OPEN Testing Suite. The prototype’s scope includes testing of thermodynamic process modeling components. AmsterCHEM, Céondo GmbH and Marcus Bruno Fernandes Silva have been contracted to perform the development, with MR Woodman Consulting Ltd supervising the work. More from CO-LaN.
The Industrial Internet Consortium has published a Guide to global industry standards for industrial IoT (spoiler alert - there aren’t any!). The Guide outlines a vision and strategy to enable interoperability and system compatibility across the IIoT ecosystem. Erin Bournival, Dell Technologies and co-chair of the IIC standards task group, said, ‘Integration and interoperability are critical in IIoT environments. That’s not easy to achieve in complex IIoT environments, so standards play a critical role.’ The Guide enumerates various categories of standards and the organizations that produce them and ‘provides business cases for adopting standards as well as strategies for participating in standards development’.
The IOGP has just published an updated ‘Standards and guidelines for well integrity and control’ (2021). IOGP Report 485 includes standards, specifications, and other material, produced both by IOGP and other organizations, that address well construction and well operations. This document was initially published in 2012 and has been updated periodically since then.
The IOGP’s Joint Industry Project (JIP) 33 is to pilot a global equipment hub (GEH), a repository for standard vendor documents and data underpinned by JIP33 specifications and the IOGP’s own CFIHOS international standards. The GEH is to pilot a cloud-based repository of standard vendor information on industrial equipment, such as pumps, motors, and instruments. Vendors can upload their information once and make it available to all of their customers, including package suppliers, EPC contractors, and owner/operators.
The Open Geospatial Consortium (OGC) has approved V 1.0 of the executable test suite (ETS) for CDB, its common database for simulation-oriented geospatial data used in a synthetic environment. A synthetic environment is a computer simulation that represents activities at a high level of realism, from simulation of theaters of war to factories and manufacturing processes. Such environments may be created on a single computer or across a distributed network, ‘augmented by realistic special effects and accurate behavioral models’. Products that implement the OGC CDB 1.0 standard and pass the tests in the ETS can now be certified as OGC compliant. The ETS is downloadable from GitHub.
The OGC is also seeking
public comment on the draft Zarr storage specification 2.0 OGC
community standard. Zarr can represent large, multi-dimensional arrays
of data in a simple, scalable way, and is compatible with cloud object
storage, ‘making it ideal for analysis-ready geospatial data’. Zarr was
originally developed by geneticist Alistair Miles of Oxford University
as a library optimized for massively parallel array analytics. It has
since grown into a community project with a range of developers and
users from fields such as genomics, imaging, astronomy, physics,
quantitative finance, oceanography, atmospheric science and geospatial
imaging. Zarr is used in climate science, in the CMIP6 Google cloud
public dataset, and in oceanography with the ECCOv4r3 ocean state
estimator. Download Zarr here.
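To make the ‘simple, scalable’ claim concrete, here is a minimal, illustrative sketch of Zarr’s v2 storage layout: each chunk of a large array becomes an independent key in an object store, alongside a ‘.zarray’ JSON metadata document. The key naming (‘.zarray’, dot-separated chunk indices such as ‘0.0’) follows the Zarr v2 spec; the dict standing in for cloud object storage and the placeholder chunk payloads are our own simplifications, not real Zarr library code.

```python
import json
import math

def write_zarr_like(store, shape, chunks, dtype="<f8"):
    # Array-level metadata lives under the reserved '.zarray' key (Zarr v2).
    store[".zarray"] = json.dumps({
        "zarr_format": 2,
        "shape": shape,
        "chunks": chunks,
        "dtype": dtype,
        "compressor": None,
        "fill_value": 0,
        "order": "C",
        "filters": None,
    })
    # One object-store key per chunk along each dimension; readers can
    # fetch chunks independently and in parallel, which is what makes
    # the format a good fit for cloud object storage.
    n_rows = math.ceil(shape[0] / chunks[0])
    n_cols = math.ceil(shape[1] / chunks[1])
    for i in range(n_rows):
        for j in range(n_cols):
            store[f"{i}.{j}"] = b"<chunk bytes>"  # placeholder payload
    return store

# A 100x100 array in 50x25 chunks: a 2 x 4 grid of chunk keys.
store = write_zarr_like({}, shape=[100, 100], chunks=[50, 25])
print(sorted(k for k in store if k != ".zarray"))
```

The point of the layout is that a reader needing one slice of the array fetches only the chunk keys that intersect it, never the whole dataset.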
The PIDX international emissions transparency data exchange (ETDX) initiative is to extend to Scope 3 emissions reporting. PIDX has collaborated with the Open Group’s Open Footprint Forum and other emissions reporting initiatives to determine that ‘PIDX is most valuable to enable emissions reporting through commercial transactions’. The ETDX team has modeled use cases with real-world data to track a buyer’s scope 3 emissions, at the line-item level, throughout the supply chain. The ETDX project team is seeking participation from industry, ‘from standards organizations, operators, suppliers, and network providers alike’.
PIDX also recently updated its procedures for standards development. Amongst the many housekeeping changes, PIDX has updated its moniker, eliminating ‘Petroleum’ (the P in PIDX) and henceforth referring to ‘Energy’. The change was made in the light of ‘the move toward renewable energy and the digital transformation’. Details of the changes here. BTW, PIDX standards are ‘technologically agnostic and are free to use’. Access the PIDX standards here.
Peter Maier, SAP Global Head of Industries, observed that the energy industry is in good company regarding transition, ‘all industries are changing, notably pharma’. The outcome of the energy transition is unclear. But it is likely that many routine jobs will be eliminated as they are handled by machines. Investment in business intelligence is a smart move in the face of ‘big landslides only a few years out’. Climate change is indeed a threat to the long term viability of oil and gas. Here it is hard to act, as oil possesses a ‘super high’ energy density. Oil and gas also play a role in chemicals and infrastructure. Shutting all this down in a world addicted to hydrocarbons would cost ‘zillions of dollars’. Finger pointing at oil companies is ‘stupid’. Energy companies are paving the way to the transition, as indeed is the software business. ‘There will never be the new normal, continuous change is the new normal.’
SAP Global VP Oil, Gas, and Energy Benjamin Beberness cited a McKinsey study that found that although renewable energy will grow in the next few decades, oil and gas will still supply about 44% of world energy in 2050 (down from 53% today). SAP has a 40 year track record of supporting energy companies and is well placed to help address the ‘dual challenge’ of the energy transition: more energy and less carbon. Alongside renewables, this will be achieved with a shift to natural gas and hydrogen. Across the board, ‘disciplined capital planning and productivity will be a differentiator’. Beberness stated that ‘95% of the most successful oil and gas companies in the world run SAP solutions’, and 87% of the Forbes Global 2000 oil and gas companies are customers. SAP’s upstream oil and gas effort is informed by the SAP Oil and Gas Consortium, described as the custodians of the industry solution (IS) scope, reviewers of innovation priorities and early adopters of the new cloud services. The SAP S/4HANA Cloud for upstream oil and gas is augmented with quarterly delivery of scope items. In 2021 these include new functionality in JVA/Finance, US hydrocarbon accounting, field logistics and EPPM. Beberness also highlighted the flagship Digital Service Station initiative, more of which below. ETM.next got a plug, as did Bearing Point’s geolocation tools and equipment tracker, DNV GL’s Corrosion Under Insulation manager and Flexinergy, an energy management solution from Evolution Energie. Finally, the curiously-titled Rise with SAP got a shout-out as SAP’s ‘concierge service to the intelligent enterprise’.
Stephane Lauzon outlined SAP’s plans for upstream oil and gas. These include using machine learning to predict inventory shipment dates to allow users to take action and manage delivery delays. An ‘Overdue Materials – Stock in Transit’ module tracks materials and open stock transport orders which have exceeded the estimated time in transit (probably a lot of these today!). Lauzon addressed the ‘industrialization’ of the SAP S/4HANA cloud. This is an intended shift from today’s ‘complex, siloed, customized IT solutions and processes’ to a future state (Lauzon puts this a surprisingly long way off, on the order of ten years!) of ‘simplified, market standard IT solutions and business processes’. As we have reported previously, SAP carves-up the buy vs. build conundrum with solutions being 80% ‘market standard’ (i.e. unmodified SAP deliverable) and 20% custom. The latter is where clients need to ‘focus the effort and drive innovation’. The latest S/4HANA Cloud asset management scope items are described here. Lauzon concluded that while there are many paths to S4 deployment success, some rules should be observed. Use pre-configured solutions with predefined processes, use modern integration technologies, preferably white-listed APIs, and ‘document your customizations’.
Jean-Marc Delbos presented SAP’s Digital Service Station, now backed up with some very futuristic graphics. DSS will let fuel retailers transform conventional vehicle-centric service stations into ‘preferred multipurpose destinations’ offering a ‘personalized consumer experience’. Behind the scenes SAP automates sales and purchase order generation and optimizes fleet usage with automatic reconciliation, replenishment and settlement processes. The DSS embeds SAP Secondary Distribution Management for Oil & Gas and SAP Retail Fuel Network Operations. E-Mobility adds intelligent electric vehicle charging and circular battery management. E-Mobility, a.k.a charge point operations, is integrated with SAP Analytics Cloud, SAP Concur and non-SAP components. The Open Charge Point Protocol OCPP 1.6 is supported as is Gireve, the pan-EU e-mobility roaming platform.
Udayveer Singh outlined Cairn India/Vedanta’s move to S/4 HANA, in preparation for Cairn Oil and Gas’ ‘next phase of digital’. Cairn’s S/4HANA deployment was performed with help from SAP, IBM and Wipro. Cairn wanted to migrate from its old ERP platform, which suffered from lengthy closing and reporting and delays in intercompany consolidation. Joint venture accounting (JVA) was also slow and overall the system was costly, due to customization and an inability to adopt the latest SAP software innovations. S/4HANA benefits include increased productivity from a ‘transformed’ user interface, mobile-enabled transactions and a unified web portal. Self-service functionality leverages the SAP Fiori UX. ‘Next generation’ finance and joint venture accounting integration enable real time reconciliation and the integration of finance data, reducing monthly closing cycle times and facilitating value management in multiple currencies. Singh concluded with the observation that ‘S4 HANA conversion is not only for ERP, it impacts the entire SAP landscape’. A ‘deep dive’ assessment report is essential, with attention to systems that use Java Webdynpro. There are some gotchas: the American Petroleum Institute QCI conversion modules are not compatible with S4. Singh recommends a homogeneous SAP landscape with a single database. S4 includes a major change in database replication techniques so it is key to check your business continuity systems. Migration is an opportunity to look at all stored data, with help from SAP’s DVM service. Cairn’s use cases include rig schedule automation and workflow, eliminating earlier ‘significant manual efforts to update the schedule whenever a new workover is added’ and extra work for the petroleum engineering team. Now an optimization algorithm combines multiple data sources to automatically generate rig schedules.
Madalina Ioana Trifan (SAP) presented a case history from a US oil producer client demonstrating the use of machine learning in operations. Using the SAP HANA automated predictive library (APL), Trifan built ML models for two use cases, scale/paraffin build-up and well loading. The models were trained to detect and flag anomalies in measurement data and classify the root cause for downtime, and proved good at predicting downtime up to three days ahead of time. Key inputs determined from the quantitative approach were verified against known physical relations such as water cut’s impact on well loading and Coriolis gauge temperature on paraffin build-up. SAP Predictive Analytics was used to visualize the models’ ROC*.
* Receiver operating characteristic curve, a graph showing the performance of a model at different classification thresholds.
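A ROC curve is built by sweeping a classification threshold across the model’s scores and recording, at each threshold, the true positive rate against the false positive rate. The following minimal sketch shows the mechanics; the scores and downtime labels are made-up illustrative values, not data from the models presented.

```python
def roc_points(scores, labels, thresholds):
    # For each threshold, classify 'downtime' when score >= threshold,
    # then record (false positive rate, true positive rate).
    pos = sum(labels)            # number of actual downtime events
    neg = len(labels) - pos      # number of normal periods
    points = []
    for t in thresholds:
        tp = sum(1 for s, y in zip(scores, labels) if s >= t and y == 1)
        fp = sum(1 for s, y in zip(scores, labels) if s >= t and y == 0)
        points.append((fp / neg, tp / pos))
    return points

scores = [0.95, 0.80, 0.60, 0.40, 0.20]   # model confidence of downtime
labels = [1, 1, 0, 1, 0]                  # 1 = downtime actually occurred
pts = roc_points(scores, labels, thresholds=[0.9, 0.5, 0.1])
print(pts)  # lowest threshold flags everything, giving (1.0, 1.0)
```

A model whose curve hugs the top-left corner (high TPR at low FPR) separates the classes well; the diagonal corresponds to random guessing.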
Brian Williams presented on SAP’s use of ISO Management System Standards (MSS) as embedded in SAP’s Asset Strategy & Performance Management. There are over 80 ISO MSS standards. All follow the ‘Plan, do, check, act’ (PDCA) cycle which is said to ease the incorporation of multiple standards into core business processes in a consistent way. Management systems are not information technology systems, although information technology is an essential enabler for all MSS. Of particular note is ISO 55000 from the ISO Technical Committee (TC) 251 which covers asset management. SAP IAM is said to act as an enabling platform for ISO 55000. Another standard of note is ISO 14224, described as the foundation of oil and gas asset data governance and data quality. ISO 14224 addresses standard terms and definitions which are key when merging and analyzing data from different sources. Fields of application include failure recording, maintenance and the accurate use of location data assigned to a tag number. Williams also referred to the Oreda Offshore and onshore reliability database that leverages the ISO 14224 taxonomy. Looking to the future, Williams speculated on the use of SAP AIN to share ISO 14224 failure data. Other ISO standards mentioned en passant include ISO 15663 ‘Petroleum, petrochemical and natural gas industries — Life cycle costing’ and ISO 19008 ‘Standard cost coding system for oil and gas production and processing facilities’.
Abdulmunim Balushi and his colleague from PDO, Abhinav, presented on the use of the SAP Fieldglass Contingent Workforce (CW) module to add visibility into PDO’s burgeoning external workforce. Fieldglass CW now provides insights into ‘hidden workers’ in service contracts, enhances contracting and project management and standardizes onboarding/offboarding while enforcing compliance. PDO is currently exploring the Fieldglass Service Procurement Module with an expected go-live at year end 2021.
Farid Akbari (OMV New Zealand) presented on DigitUP, OMV’s upstream digital transformation. DigitUP envisages a network of intelligent digital twins as the ‘digital backbone’ of the company. Field workers are (will be?) equipped with AR glasses to connect them with the digital twin/system of record. While Akbari did not specify whose technology was deployed, OMV has elsewhere been reported as trialing the AR headset from Clark Vision. SAP Asset Manager combines the feed with a visual overlay from the digital twin along with an IAM checklist.
The SAP in Oil and Gas Conference was organized by TAC Events. More from the conference home page.
We dipped into the Aveva World online event, hoping to hear some clear statement as to the whys, wherefores and potential synergies to accrue from its 2020 acquisition of OSIsoft. Either we missed it, or Aveva is not making a big deal out of OSIsoft. CEO Peter Herweck made a brief mention of a combined use of PI and Aveva by EDF at its nuclear plants and also indicated the PI acquisition has meant that ‘teams can do more, bringing IoT data into the cloud and the digital twin’. Herweck gave a preview of the Aveva Connect cloud, with a private cloud available mid-year 2021. We also noticed that ‘oil and gas’ has been airbrushed out of Aveva’s discourse, it is now just ‘energy’.
We also tuned-in to OSIsoft’s Russel Herbert’s podcast for more on the merger. Again we were somewhat disappointed, but we did learn that despite all the new technology on offer, ‘it is false to say that oil and gas is an advanced industry. Many of our customers work with spreadsheets’. You might assume that this is backed up with digital technology. Sometimes this is true, but often companies are not mature in getting value from their data. A lot of data has been collected over the years; the question now is ‘what are we going to do with all this data and how?’
In the midst of its slightly sycophantic analysis of the merger, an ArcWeb blog observed that ‘Aveva’s Wonderware, Historian, Enterprise Data Management, and Plant Scada offers all have historian applications that might compete with … the more dominant PI System’. Maybe some awkward rationalization is in store for Aveva!
Aker Solutions and AF Gruppen have signed a Letter of Intent to merge the two companies’ offshore decommissioning operations into a 50/50 owned company.
Amalto Technologies has been acquired by Sidetrade. Amalto’s e-Business Cloud solutions will become part of the Sidetrade Platform and will be rebranded as Augmented Invoice and Augmented Order.
Bentley Systems’ Seequent business unit has acquired Aarhus GeoSoftware, a developer of geophysical software. AGS Workbench is a comprehensive software package for processing, inversion, and visualization of geophysical and geological data. AGS software is a spinoff from Aarhus University in Denmark.
BP has acquired UK-based digital energy business Open Energi to develop digitally driven integrated energy systems and deliver innovative, efficient, and flexible energy solutions for customers.
BMO Capital Markets has established a dedicated Energy Transition Group to support clients explore potential energy transition alternatives.
CGG has sold its GeoSoftware business to Topicus and Vela Software. The deal sees Topicus holding 60% and Vela 40%. Vela, an operating group of Toronto-based Constellation Software, already owns Coreworx, Petrosys and Tecplot. Topicus is an EU software VC that ‘acquires, builds and manages software companies’.
Cognite has closed a $150 million Series B investment round, valuing the company at $1.6 billion, with TCV, a technology-focused VC.
Enverus has acquired Integrity Title and will integrate its technology into a new platform, Integrity Title Plants, an online platform for title evidence research.
Meanwhile, Hellman & Friedman has completed its $4.25 billion acquisition of Enverus. Genstar Capital, Enverus’ majority owner since 2018, continues to hold a significant minority stake in the company.
Symphony Technology Group has purchased FireEye’s Products business. The two organizations will continue to operate as a single entity until the end of 2021. Post-closing, the Company plans to rebrand as Mandiant.
Immersal Oy, a developer of spatial mapping and visual positioning software, has been acquired by Hexagon. Hexagon is also to acquire Infor’s global EAM business for approximately $2.75 billion. The agreement heralds a ‘deeper relationship with Infor and Koch’.
K-Solv has acquired Energy Completion Services (ECS) and Chaparral Rental Services (CRS), onshore and offshore rental suppliers for the oil & gas industry. Both entities will operate under the ECS name.
mIQrotech has secured a $6 million Series A financing round for the further development of its pipeline optimization and leak prevention tools.
Sanchez Energy has launched OneNexus Environmental, a fintech energy company created to help oil and gas companies ‘systematically and responsibly’ manage their asset retirement obligations and decommissioning activities.
Recon Technology has entered into a securities purchase agreement with certain accredited investors to purchase $55.0 million worth of its Class A ordinary shares.
SeekOps has secured a Series B funding round led by Schlumberger and new investor CVCI, with the support of existing investors Equinor and OGCI. The funds will support both traditional and renewable energy sectors in their decarbonization efforts, with methane leak detection and quantification to enhance ESG reporting and verify ‘responsibly sourced’ gas certification.
Shell has acquired Inspire, a US technology-enabled clean energy company, to accelerate its goal to become a major provider of renewable and low-carbon energy.
Spectris has acquired Concurrent Real-Time for $166.7 million in cash. CRT provides high-performance real-time computer systems, solutions and software and will be integrated into Spectris’ HBK platform.
Stratagraph has purchased Technical Drilling Services, a mud logging and consulting services company based in Oklahoma City.
TechnipFMC has acquired the remaining shares of TIOS AS, a joint venture between TechnipFMC and Island Offshore formed in 2018. TechnipFMC is also to market 16 million Technip Energies shares, around 9% of Technip Energies’ share capital.
Laredo Petroleum has migrated to AWS and built a ‘serverless data lake’, increasing resiliency and scalability while optimizing costs and operation time.
TotalEnergies and Amazon
have signed an agreement whereby Total will help Amazon power its
operations with ‘100% renewable energy’ while Amazon ‘accelerates
Total’s digital transformation’. More from TotalEnergies.
Saudi Aramco Technologies has selected AccessESP for the commercialization of its JumpStart flowback and well cleanup solution.
Ambyint has partnered with AWS
to provide oil & gas E&P companies with solutions to optimize
production at scale. Ambyint artificial lift optimization products are
now available from the AWS Marketplace.
Oil services group Applied Petroleum Technology (APT) and SGS are teaming to combine and enhance their service capabilities to oil and gas operators. APT has also been awarded a master service agreement by Tangram Energy to provide basin modelling, petroleum systems evaluations and geochemical analyses for the operator’s E&P activities on the UK continental shelf on a non-exclusive basis.
Dietsmann has partnered with Arundo to enrich its services with Arundo’s ‘Marathon’ machine learning and data analytics package.
Baker Hughes has deployed
its remote operations digital technology across Aramco’s drilling
operations, the largest deployment of its kind in Baker Hughes’
history. More from Baker Hughes.
KBC has adopted the BHC3.ai Suite and Enterprise AI applications to enhance KBC’s existing software portfolio for oil and gas process simulation, supply chain optimization, and energy management.
CGI and Shell
have extended their long-term partnership, signing a new five-year
contract valued in excess of CAD$200 million to modernize and expand
its Fleet Solutions business. More from CGI.
Core Lab is to provide Carnarvon Petroleum its Advanced Rock Typing technology with analog petrophysical and engineering parameters on drill cuttings samples from the WA-521-P exploration well on the Northwest Shelf of Australia. More from Core.
Corrosion Resistant Alloys and PipeSearch have jointly launched the PipeSearch platform to connect oil country tubular goods supply and demand.
Dynamic Graphics has signed an enterprise software agreement with BP for data visualization and analysis including CoViz 4D and Sim2Seis licenses.
Deloitte and Teradata
are teaming to help mutual customers migrate their on-premise data
management and analytics environments to the Teradata Vantage
multi-cloud data platform. More from Deloitte.
Empire Petroleum has selected PCG Advisory for investor relations, digital strategies, and strategic communications.
UK-based Eserv is to supply its offshore plant 3D visualization technology to Neptune Energy in support of the digitization of five of its North Sea operated platforms, enabling some 90 site inspections/year to be carried out from onshore.
Fluid Automation Station has partnered with Alliance OGP to offer dual-fuel solutions, providing oil and gas E&P operators and completions companies access to cleaner and more affordable energy.
Halliburton is expanding its digital collaboration with Aker BP with the implementation of the Digital Well Program, a DecisionSpace 365 cloud application, powered by iEnergy to automate work processes and accelerate decision-making. Petrofac has also signed a three-year contract for the Digital Well Program.
Kuwait Oil Company has selected Halliburton to accelerate its data-to-decisions cycle by implementing automated work processes and digital twins across KOC’s major assets.
IDS has received a ‘DDR Plus’
certification from the International Association of Drilling
Contractors. DDR Plus is a standardized way of importing data into the
IADC daily drilling report (DDR). More from IDS.
IBM and SAP are teaming to deliver LNG, hydrogen production, and carbon emissions reporting solutions.
Archrock has selected Infosys to integrate digital technologies and mobile tools for its field service technicians.
INPEX has signed a sales and purchase agreement with Adnoc for a clean ammonia demonstration supply chain linking the UAE and Japan.
Equinor has awarded ISS a NOK 5.5 billion contract over ten years for facility management at its Norwegian office locations.
Kongsberg Digital and ExxonMobil are partnering to explore the use of Kognitwin Energy hosted digital twin. Norske Shell is also using Kognitwin on its Ormen Lange field, ‘the first ever fully integrated reservoir-to-market digital twin’.
mCloud Technologies and Prosaris Solutions are collaborating to offer the oil and gas sector intrinsically safe ultrasonic gas detection technology to take direct action on harmful emissions.
Moxa Europe and Robotron Datenbank-Software are collaborating on the configuration and provision of IIoT platforms in process industries including oil & gas and energy technology.
Petrofac and James Fisher Asset Information Services have teamed to eliminate the need for offshore surveys ahead of modification scopes, reducing time and cost.
Recon Technology’s subsidiary Future Gas Station has secured a three-year cooperation agreement from G7 IOT Hui Tong Dalian to establish an electronic integrated service for enterprise fuel consumption management.
Red Hat and Nutanix
have signed a strategic partnership to enable a powerful solution for
building, scaling and managing cloud-native applications on-premises
and in hybrid clouds. More from Nutanix.
Aramco Europe has deployed SAP
S/4HANA Cloud and additional solutions from SAP to help improve
operations, tighten portfolio oversight and governance, and inform
agile decision-making. More from SAP.
Exida has partnered with Sensia to expand Sensia’s current process safety lifecycle services in terms of scope of services offered and increased global coverage.
Seeq and AWS have partnered to accelerate customer migration to the cloud through the Seeq SaaS Workshop on AWS. More from Seeq.
SGS’s oil, gas and chemical laboratory in Apapa, Nigeria has achieved ISO/IEC 17025:2017 accreditation.
Schlumberger has deployed its Delfi cognitive E&P environment, ‘integrated with the OSDU Data Platform’ chez Petronas. More from Schlumberger.
The Subsea Integration Alliance has been awarded a contract by Equinor on its Bacalhau project offshore Brazil covering the EPCI of the subsea production systems and pipelines.
Schlumberger and IBM
have launched the ‘industry’s first’ commercial hybrid cloud Enterprise
Data Management Solution for the OSDU Data Platform. More from IBM.
Surge Energy has partnered with US Well Services on a field trial of USWS’ all-electric Clean Fleet, Surge’s first all-electric hydraulic frack.
Symbio Infrastructure has selected Siemens Energy as a technology provider for the world’s first carbon neutral LNG liquefaction facility and natural gas transmission line project between Ontario and Quebec.
TechnipFMC and Halliburton have received an OTC Spotlight on New Technology Award (SONT) for their Odassea Subsea Fiber Optic Solution, an advanced downhole fiber optic sensing system. ExxonMobil selected the solution for its Payara development project in Guyana.
Dresser Utility Solutions’ Texsteam and WellAware are teaming up to reduce chemical injection operations expenses. More from Dresser.
OMV S&T has hired Voyager to manage its marine supply chain. More from Voyager.
Speaking at the ‘cracking the data challenge’ session of the virtual GLOBUC GO Digital conference, Michel Lutz, Chief Data Officer, TotalEnergies, sporting the de rigueur hoodie of the true data geek, described his excitement, on joining Total back in 2016, in finding so much data and such varied data types. Along with dozens of petabytes of seismics, millions of sensor data points, Total’s new energy business (Total recently rebranded as TotalEnergies) is ‘streaming data from wind turbines’. Lutz cited the HPC computing capacity of Total’s in-house Pangea 3 and the fact that ‘more and more data is moving to the cloud’.
Total has drunk deeply from the big data Kool-Aid, acting at company level on its data culture. Total already has a community of data scientists across its businesses, with AI training and upskilling for people who want to learn and/or become experts. Total has a data science boot camp, a digital academy and offers ‘AI for leaders’ to train top managers.
A digital transformation roadmap is set to make IT systems more data-efficient with, notably, a partnership with Microsoft on Azure ‘low code’ development. Lutz has signed-off on Total’s stated intent of deriving a ‘€1.5 billion value per year’ as of 2025 from its Digital Factory. This is to house some 300 developers at the heart of the Paris ‘digital district’. The factory is ‘already delivering 40-50 applications per year across all TE businesses’.
One upstream example is real time prediction of production and well
behavior analysis to improve reservoir performance. This replaces
today’s infrequent well tests with a real time, data driven virtual
test. This has proved challenging as ‘hundreds of ML models’ are
required. Other output from the factory includes power and gas
consumption analytics to control a facility’s energy footprint and
analysis of power in Total’s Saft battery plant to optimize production.
More from the Factory home page.
Peter van Den Heuvel revealed that Shell’s OSIsoft PI-based real time data infrastructure serves some 15,000 users around the world. 12 million events are captured every second from some 7 million connected instruments. Shell’s historical data stretches back over 25 years. Shell’s digitization effort builds on many PI tools (Super Collective, Asset Framework, Event Frames, Vision, Analytics) with extra help from Seeq. The PI AF SDK exposes data to other applications. These include Seeq, Power BI, Petex, C3.ai and many others, including SAP. Data from PI Collectives strategically located around the world flows into an extensive data platform running on Microsoft Azure. Some data comes back on site for consumption in C3.ai, Seeq, Power BI and an unidentified Digital Twin application (possibly from Akselos).
The system is set up to accommodate ad-hoc data access from other users and applications. But van Den Heuvel insists that while users can connect in many different ways, there are rules making it clear what is allowed and what is not. Most connections pass through the OSIsoft API, with MuleSoft used for external connections. Originally Shell planned for one humongous global PI Super Collective. This proved ‘mission impossible’ because of technology and legal issues with data residency. Today, each regional server has its own PI Vision-based DIY application that lets users make their own trends and perform root cause analysis.
An encounter with (former) Shell visionary Johan Krebbers (of OSDU fame) convinced van Den Heuvel of the importance of the cloud. This has kicked off a long learning curve for the digital team in Shell as data moves into Azure. The effort was originally dubbed ‘PI in the Sky’ before Shell settled on SSIP, the Shell sensor intelligence platform and ‘the next step in our digital transformation’. The plan is to be able to add in imagery, drone data, etc., combining all of the above to ‘get more value from our data’.
Following a somewhat commercial presentation by Paula Doyle of Cognite’s ‘semantic’ data ops platform, Philipp Tippel (OMV) and Sofie Svartdal Berge (Cognite) teamed up to present OMV E&P’s digital transformation, a.k.a. DigitUP. Tippel observed that earlier digital oilfield programs in mature fields seldom managed to go beyond the concept/pilot stage and see a global rollout. OMV, with help from Cognite, has set out to change this. Enter the DigitUP program, which is to run from 2020 to 2025, with an expected ‘tipping point’ circa 2024 (cf. Total’s 2025 ‘jam tomorrow’). At that point OMV will ‘enter the world of autonomous operations’, with software driving first operations, then optimization, handling ‘all possible dynamic conditions’ with engineering support ‘on request’ (from the computer?).
Before then, OMV plans to operate on a ‘best day’ basis by 2022
… increasing production levels by repeating optimum performance
(‘best day’) every day. Best Day 1.0 involves ‘front running’ AI that
screens the entire production system to identify performance
deviations. After BD 1.0, more data sources will be added to the mix
with automated deviation detection and performance comparison. This
will send alarms to engineers and suggest action. Deferments are
classified and written back to the system of record. More from Cognite, and watch the video.
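The ‘best day’ baseline idea lends itself to a simple sketch. The following is a minimal, hypothetical illustration (the function name, sample data and the 10% deviation threshold are invented, not OMV’s or Cognite’s actual logic): the best observed daily rate becomes the target, and days falling materially short of it are flagged as performance deviations for engineer attention.

```python
# Hypothetical 'best day' deviation screen. The 10% tolerance and the
# example rates are illustrative assumptions, not OMV/Cognite parameters.

def best_day_deviations(daily_rates, tolerance=0.10):
    """Return (best_day_rate, [(day_index, rate, shortfall), ...])
    for days falling more than `tolerance` below the best observed day."""
    best = max(daily_rates)
    deviations = []
    for i, rate in enumerate(daily_rates):
        shortfall = best - rate
        if shortfall > tolerance * best:
            deviations.append((i, rate, shortfall))
    return best, deviations

rates = [980, 1000, 870, 995, 760]  # e.g. barrels/day for one well
best, flagged = best_day_deviations(rates)
print(best)     # 1000
print(flagged)  # days 2 and 4 fall more than 10% short of the best day
```

A real deployment would of course compute the baseline per well over a rolling window and classify the cause of each deferment before writing it back to the system of record.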
IBM host Dariusz Piotrowski kicked-off the Q&A asking ‘why has all this taken so long?’ Lutz responded that Total has been active in data and software for a long time, although there was not much talk of data science when he joined, ‘no fancy terminology’. But there was expertise in a numerical approach to geoscience, notably with Total/Elf’s in-house developed SisMage seismic interpretation system. Today’s journey is more about a vision for a company-wide transformation, bringing talents together. Total didn’t wait until 2020 to be active in data.
van Den Heuvel pointed out some of the problems that made digital hard to realize. OPC UA is still not embedded in many production systems. SSIP was conceived in a few weeks, but it took much longer to make it fit for purpose and get the system endorsed. Data contextualization is all very well as a concept, but stakeholders all have a different view of what things mean. It is hard to get all to speak the same language.
Paula Doyle observed that while there is in general ‘lots of innovation’, companies are not so good at using data. This is in part due to a complex supplier network; contractors are not incentivized and may even be protective of acquired data. Engineers often have a ‘not invented here’ complex. On the other hand, lots of technologies are on their way to becoming ‘commodity’; companies don’t build their own ERP systems. This should allow for a focus on where value will be created. The unfortunate reality today is that people take decisions based on a best guess. ‘A guy on a rig is making a decision based on a gut feel, that’s today’s industrial reality’. Are people the problem? In OMV the average age is 45-50*. ‘They don’t all talk IT’. ‘Young people have to explain it to them’.
* The first time we heard this kind of talk was 15-20 years ago, when today’s 40-50 year-olds were the smart young kids of the day. The ageist trope will be with us for as long as IT keeps coming up with ‘fancy new terminology’.
A special session discussed OSDU and its potential for open collaboration, again moderated by IBM’s Dariusz Piotrowski, who asked what monetary benefits will accrue from the open subsurface data universe. Shell’s Johan Krebbers, father of OSDU, replied that it is too early to say but that once data is moved into a single platform, ‘it will be there for many, many years’ (music to the ears of the cloud providers?). Schlumberger’s Jamie Cruise sees OSDU as an opportunity to consolidate data sources which should increase productivity and enable new workflows. Energistics’ Phil Neri commented that the move to the cloud is a ‘huge undertaking’. OSDU provides a ‘2-3 day data migration’ roadmap.
Piotrowski insisted, ‘how much time will it take and at what cost?’. Again, the response was less than direct! Cruise observed that OSDU should be seen as a part of the digital transformation, feeding AI/ML applications. Companies should ask ‘what do you want to curate in the platform’. All this will take time. ‘Be agile, talk to vendors about their data products’. Neri stressed the importance of metadata in OSDU. Today’s silos are ‘economical’ with metadata. Moving to a platform means this needs to change, making data ‘fully described’. This will be a ‘step change of huge value’.
The debate turned to the meaning of OSDU for smaller companies. Cruise anticipates ‘lots more innovation from vendors’. Krebbers added that OSDU gives small companies ‘equal access to data’. Cruise added from his own experience (with Target) that smaller providers needed to form multiple relationships with larger software houses. With OSDU, ‘you just write to the API’ adding ‘One API to rule them all*’. Neri agrees, OSDU is ‘opening the floodgates for small developers with innovative solutions’.
* There is many a true word spoken in jest. The OSDU API is supplied by Schlumberger!
And what of OSDU in the energy transition? Krebbers is confident
that new energy sources will fit into the new, and rebranded ‘open
energy data platform’. The session closed with a call-out to the community to get involved and sign up with OSDU.
Speaking in the midstream/downstream section, Asger Klindt introduced Maana’s
‘Fanar’ application that uses AI to optimize maritime fleet schedules,
matching cargoes with vessel availability. Fanar was proven at Aramco
Trading and in 2020 supported an average of 5 million barrels/day with
130 tankers and over 2000 voyages. Fanar is described as a low code
platform that runs in the Microsoft Azure cloud. The tool is claimed to
leverage information spanning vessels, terminals, bunkering locations,
chartering costs, weather, piracy and war risk, canal fees and more!
Fanar provides a single view of the business, automatically optimizing
schedules across the fleet and offering ‘what if’ scenarios for better
insights. ‘Cargo operations is a highly dynamic environment, planning
never stops, vessel statuses constantly change and veer from the
original plan which demands constant optimization, not just at the
beginning of a voyage.’ More from the release.
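At its core, the cargo/vessel matching that Fanar automates is an assignment problem. The sketch below is purely illustrative (the cost matrix and brute-force search are assumptions, not Maana’s method); a production scheduler would use a real optimizer and many more constraints covering bunkering, weather, piracy and war risk, canal fees and the rest.

```python
from itertools import permutations

# Toy assignment: pick the vessel-to-cargo pairing with the lowest total
# cost. Exhaustive search is fine for a handful of vessels; real fleets
# need proper optimization (e.g. the Hungarian algorithm or a MIP solver).

def best_assignment(cost):
    """cost[v][c] = cost of vessel v carrying cargo c (square matrix).
    Returns (lowest total cost, tuple mapping cargo c -> vessel perm[c])."""
    n = len(cost)
    best_total, best_perm = float("inf"), None
    for perm in permutations(range(n)):
        total = sum(cost[perm[c]][c] for c in range(n))
        if total < best_total:
            best_total, best_perm = total, perm
    return best_total, best_perm

# Illustrative cost matrix: 3 vessels x 3 cargoes
cost = [[4, 2, 8],
        [4, 3, 7],
        [3, 1, 6]]
total, assignment = best_assignment(cost)
print(total)  # 12
```

Re-running the search as vessel statuses change and voyages veer from plan is, in miniature, the ‘planning never stops’ loop the release describes.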
Petronas’ Sharul Rashid presented on process safety and instrument analytics. Rashid enumerated some of the major process safety incidents of the past decades, from Longford (Australia, 1998) and Pascagoula (Mississippi, 2002) to Buncefield (UK, 2005) and Deepwater Horizon (USA, 2010). History is repeating itself as process safety incidents continue to occur. Rashid offers five process safety questions that companies should ask themselves. 1. Do you understand what could go wrong? 2. Do you know what systems prevent this from happening? 3. Are they working? 4. What is your role? 5. How can we leverage digital technology? Rashid discussed the challenges of aligning old/legacy plant with the safety standard IEC 61511. A Petronas refinery in Malaysia was commissioned in 1983 and had an ‘under-engineered’ safety instrumented function (SIF), including a single non-smart sensor that was safeguarding the furnace’s fuel gas burner throat against liquid carry-over from the fuel gas header. This has now been upgraded with multiple smart sensors and a voting protocol. Rashid observed that while voting may ‘close the SIF gap’, it can increase the chances of a spurious trip.
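Rashid’s voting trade-off can be put in back-of-envelope terms. Assuming independent sensors, each with spurious-trip probability p over some interval, a k-out-of-n (kooN) arrangement trips spuriously when at least k sensors misfire. The figures below are illustrative, not Petronas data.

```python
from math import comb

# P(at least k of n independent sensors trip spuriously), given a
# per-sensor spurious-trip probability p. Illustrative sketch only.
def spurious_trip_prob(p, k, n):
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

p = 0.01
print(spurious_trip_prob(p, 1, 1))  # 1oo1 baseline: 0.01
print(spurious_trip_prob(p, 1, 2))  # 1oo2: ~0.0199, nearly double the trips
print(spurious_trip_prob(p, 2, 3))  # 2oo3: ~0.0003, far fewer spurious trips
```

So a 1oo2 scheme improves the odds of catching a real demand but roughly doubles the spurious trip rate, while 2oo3 voting recovers both, which is why it is a common compromise in safety instrumented systems.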
In the upstream session, Marco Ferraz presented Lisbon, Portugal-headquartered Galp’s
GeoScienceAdvisor, developed by Galp and IBM with financial support
from ANP. GSA provides AI/ML-based assistance to seismic interpreters,
performing seismic facies analysis and assessing geological risk and
probability of success. The screenshot looked a little like Eliis’
PaleoScan. More from GALP.
The sixth Global Business Club GO Digital Energy event will be held in Amsterdam on 7-8 June, 2022. More from GLOBUC.
Speaking at the 2021 virtual OPC Day International event, Equinor’s João Pinheiro earned enthusiastic praise from OPC Foundation CEO Stephan Hoppe for his ‘fantastic success story’ of a real world deployment of OPC UA on the Johan Sverdrup offshore North Sea development*.
Equinor is running six digital programs centered on its ‘Omnia’ unified data platform. Pinheiro’s focus is data-driven operations and the field of the future. Here Equinor is moving from data/application silos to a single, common platform to enable ‘data-driven’ operations. Currently, one difficulty is that applications have their own data and data models. Readers will spot the similar problem statement to OSDU!
Data context is also a key requirement and this is what an OPC UA connectivity framework can provide. In fact, for Pinheiro, ‘the most powerful game changer in OPC UA is the ability to turn data to information by providing data in context leveraging the OPC UA information modelling framework’. This has allowed Equinor to standardize information across process control vendors’ products, leveraging OPC UA companion specifications and realizing the goal of interoperability! All that is required is for equipment vendors to implement the protocol.
Equinor’s IT/OT landscape ensures that similar information is described in a similar way, with agreed common open formats and connectivity standards and well documented interfaces. An OPC UA server exposes a context address space described with the UA object-oriented information model. A tree view allows engineers to browse between objects to see how they connect. The system needs to be set up with meaningful tag identifiers for real world use.
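The tree-view browsing described above can be mimicked with a small in-memory sketch. All node names here (Platform, Separator-A, PT-1001) are invented for illustration; a real client would browse a live OPC UA server’s address space through an OPC UA SDK, not a Python dict.

```python
# Hypothetical miniature of an OPC UA context address space: objects
# contain child objects, and leaves carry tag attributes. Tag identifiers
# are invented, illustrating the 'meaningful tag identifiers' point above.

address_space = {
    "Platform": {
        "Separator-A": {
            "PT-1001": {"value": 12.4, "unit": "bar"},
            "TT-1002": {"value": 65.0, "unit": "degC"},
        },
    },
}

def browse(node, path=""):
    """Walk the object tree, yielding (browse_path, attributes) for each tag."""
    for name, child in node.items():
        full = f"{path}/{name}" if path else name
        if "value" in child:          # leaf: a tag with attributes
            yield full, child
        else:                         # branch: recurse into child objects
            yield from browse(child, full)

for tag_path, attrs in browse(address_space):
    print(tag_path, attrs["value"], attrs["unit"])
```

The point of the object-oriented information model is that a client discovering ‘Platform/Separator-A/PT-1001’ this way also learns what kind of object it is and how it relates to its neighbors, rather than receiving a bare tag list.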
OPC UA has been implemented and proven in use at scale on Johan Sverdrup and has been operating since 2019, with 19 OPC UA servers on the platform aggregated into a central OT/IT gateway, using an OPC UA aggregation architecture. Other servers provide an equipment template library and an engineering database. These can be merged with as-built plant-specific information and exposed to the OT network. Equinor’s OPC UA library consists of 50 objects, 90 classes and some 920 attributes; these are also used in Equinor’s renewables business. Johan Sverdrup’s tag server currently exposes 190,000 OPC UA data points, soon to be upgraded to 1 million.
Equinor likes the deliverables from the OPC Foundation and believes that giving back to the community is important. ‘We benefit from the community and being open is key to adoption’. Visit the Equinor GitHub OPC UA home page here.
Pinheiro observed that realizing the ‘huge potential’ means getting more vendors on board. Standard OPC UA information models need to expand, adding cloud connectivity and AutomationML ‘to close the gap between operations and engineering’. Most current process industry standards (NAMUR NOA, MTP, OPA, Industrie 4.0) point to OPC UA as a key building block. More on The Open Group’s O-PAS in our next issue.
* A release from partner TotalEnergies reported that Johan Sverdrup came on stream in 2019 ‘two months ahead of schedule and 30% below the initial budget’.
The Open Group Open Footprint Forum’s first virtual event heard from organizations including Accenture, AWS, Deloitte, ERM, IBM, Infosys and Shell. The OFF started life as a technology spin-out from Shell’s Open Subsurface Data Universe and will leverage a similar open source, cloud hosted approach to deployment. The underlying Open Footprint Data Platform (OFP) was presented by The Open Group’s Heidi Karlsson, and Johan Krebbers, formerly with Shell, now chez Cognite.
When the OFP luminaries were through presenting their wares, others came on stage to show their own work in the same space and the potential for collaboration with OFP. Anna Stanley from the World Business Council for Sustainable Development (WBCSD) presented the Carbon Transparency Partnership. Sonia van Ballaert (IBM) presented OREN, a new platform from IBM and Shell for greenhouse gas reporting in the mining industry that will ‘underpin transparent GHG reporting’. In answer to a question from Oil IT Journal, OFP was said to be ‘a key building block of the Oren platform’.
Liz Dennett (formerly with Amazon Web Services, now chez WoodMac) described sustainability as ‘really just a data problem at its core’, adding that the path to success was to ‘get sustainability and IT in the same conversation’. Last year AWS announced the Climate Pledge Fund, a corporate venture capital fund that invests in companies that can accelerate Amazon’s path to meeting The Climate Pledge. AWS provides publicly-available data on its sustainability initiative. IoT Greengrass and SageMaker AI/ML services got a shout-out as technology enablers.
Deloitte’s Jacques Buith issued a call to action for OFP to survey industry CIOs on their emissions stance and prepare an Open Footprint report, to be launched at the upcoming COP 26 meeting in Glasgow, Scotland.
View the recordings of the presentations on the TOG YouTube channel and download the slides here.