The most popular sessions at the 2022 EU Community for Information Management (ECIM) conference in Haugesund, Norway were those featuring the latest developments on the OSDU front, more on which in our report elsewhere in this issue. But for us, the highlight of the 25th edition of ECIM was Henning Lillejord’s presentation of ConocoPhillips’ subsurface warehouse, developed to support the ongoing development of the North Sea Ekofisk giant oil field.
Ekofisk was discovered in 1969. Fifty-plus years on, its remaining reserves still qualify it for ‘giant’ status. The Norwegian regulator recently extended its production license out to 2048. This has justified the development of a major subsurface data warehouse to capture historical information from the field’s 500 well bores, production and injection activity, horizontal wells and satellite field developments. ConocoPhillips is still trying to figure out why Ekofisk is such a great field. Its best well started producing in 1990 and produced for 30 years without a workover. Why? Where is the waterfront? Answering such questions (and many others) is key to making predictions about future performance and informing development.
Lillejord acts as a ‘data ambassador’ for users, centralizing well master data, business logic, and QC rules. Previously, common workflows involving production and geological logs would be done with Excel. ‘And re-done every time they were needed!’ Now, E&P data has been gathered into a data warehouse that also supports operations and maintenance. But the neat aspect of Lillejord’s approach is that no new tools were needed to build the data warehouse. The main enabler was SAS Institute’s software (with help from Tibco Spotfire) which has been used in-house for years in the finance department. Now log and production data is ‘fresh and integrated’. The end user goes straight to the warehouse. The warehouse ingests data from Petrel Studio, WellView, Energy Components, OFM, PI and more, presenting information in a ‘well on a page’ format. A similar ‘field on a page’ function exposes well tops, tests, NPD data and 4D seismics. SAS has shifted focus to the opinion of the end user as opposed to that of the IT department. Domain understanding is the key.
In a presentation given at Altair’s Future Industry event podcast, Ingo Mierswa, founder of RapidMiner, now part of Altair, gave a ‘requiem for a data scientist’. Here are some excerpts.
It’s 2022. We should have autonomous robot workers and flying cars; the reality is that you have invested in AI & ML and all you’ve got is a lame pie chart with moldy data. Organizations have hired a bunch of nerds who can out-math Archimedes, but they get sweaty palms when it’s time to solve real business problems. Others have advanced data scientists who can do amazing things with deep fakes but end up creating million-dollar solutions to ten cent problems. The AI honeymoon is over, and it SHOULD BE. You may be a data scientist, chief data officer or whatever; you need to be strong right now because it’s time to say goodbye to the profession as we know it. We were told: hire more data scientists, train them, empower citizen data scientists. Now, while some use ML/AI, many are still waiting to reap the rewards. Mierswa finds this ‘troubling’. After our last ML model has been deployed, how should we be remembered? As people who had true impact on the organization, or as a bunch of eggheads toying with the latest algorithms for our own amusement?
86% of C-level respondents to a PwC survey considered AI mainstream. So the problem is not lack of investment, everyone is investing in AI. But 80% of these initiatives are not in production and have had no impact. The usual riposte to this is to bring in more data scientists. ‘I do not believe this for a second!’ The problem is that there are two types of data scientist, the overwhelmed and the overwhelming. The former, fresh from college, have never talked to a stakeholder and get passed around from department to department. They have no deep knowledge of the problems they are supposed to solve. The other type is a bunch of truly brilliant people who won’t do the simple stuff like analyzing the problem. So you can choose between a 10 cent solution for a million dollar problem or a million dollar solution for a 10 cent problem, like AlphaGo, an ‘overwhelming’ data science solution to a problem that nobody cares about.
The root cause of data science’s failure is not a skills gap but rather the huge wall between business stakeholders and data scientists. Such a separation also exists in coding and is solved by scrum teams that collaborate. This is how very complicated things get delivered but it is not how most data science is organized. We need to embed data scientists, citizens or others, in the business. Upskilling is better than outsourcing. And of course, you need the right tools which is ‘why I am pleased that we are now part of Altair’.
More on RapidMiner in oil and gas.
Comment: It seems to us that Mierswa does a better job of explaining the failure of data science than he does recommending a different approach. Most large companies we have looked at to date already deploy embedded/collaborative teams. Perhaps the AI problem is more fundamental than Mierswa suggests.
In my opinion, ECIM continues to be the best upstream data management conference*, a position it has consolidated with the demise of the other ‘best’ show, PNEC. It was gratifying that the organizers gave me some decent exposure with an appearance on the panel session and some amusing, if rather embarrassing, probing of my failing memory by ECIM court jester Graeme Blakey during the gala reception.
One thing about being on a panel (at least for me) is that one is so focused on one’s own discourse, trying not to say anything too daft, that it is hard to listen to what the other panelists are saying. To remedy this I invited the other panelists to provide a short summary of their key points. So far nothing, but the invitation is still open. So that leaves me with the opportunity of summarizing and, why not, embellishing my own contribution. Which, in other words, is pretty much like writing an editorial. So here goes.
One thing I have noted from the many previous panels I have listened to is that panelists often do not seem to be addressing the questions from the master of ceremonies. This might be frustrating for the MC and those in the audience who are expecting some serious answers to good questions. Well I’m not sure which question I was trying to answer, but I first found myself extemporizing on the topic of standards, divulging a key argument from my as-yet-to-be-written book which may or may not have a title along the lines of ‘The IT that Goes Wrong’.
So standards! I have been attending conferences, user group meetings and what have you for many years and have observed two things. One: people entering the profession, confronted by a problem of data exchange or interoperability, all come to the same conclusion, that what is needed is a (new) standard. Two: following possibly many years of effort, the same folks discover that the new standard is not working and move on to other things, to be replaced by new blood with renewed enthusiasm, and the process starts over.
In years of fiddling with my web development, working across platforms (PC, Linux, Mac), I made a major discovery in the standards arena. As you know, the king of all standards is Ascii, the American Standard Code for Information Interchange. This sets out how text and numbers are represented by bits and bytes. But when you are working cross platform with Ascii you will very quickly come unstuck with the manner in which end of lines are handled across the different platforms. Given that Microsoft and Apple are competitors (as is Unix/Linux), it is easy to see that the intent is to hamper interoperability. After all, the last thing that Microsoft wants (and vice versa) is to make it as easy as possible for its user base to up-sticks and transfer all its workload to a Mac environment. The EOL gotcha is real, but it is also a metaphor. Similar gotchas exist between cloud platforms, printer cartridges and just about everywhere in IT. The problem for those advocating standards is that there are far too many people pulling in the opposite direction, and these are the people with the clout!
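To make the EOL gotcha concrete, here is a minimal Python sketch (the file name is invented) that normalizes Windows (CRLF) and classic-Mac (CR) line endings to Unix (LF), the sort of fix-up that cross-platform fiddling invariably requires.

    # normalize_eol.py - a minimal sketch: convert CRLF (Windows) and CR (classic Mac)
    # line endings to LF (Unix). The file name below is hypothetical.
    from pathlib import Path

    def normalize_eol(path: str) -> None:
        raw = Path(path).read_bytes()
        # Replace CRLF first so that lone CRs are not doubled up.
        normalized = raw.replace(b"\r\n", b"\n").replace(b"\r", b"\n")
        Path(path).write_bytes(normalized)

    if __name__ == "__main__":
        normalize_eol("index.html")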
All this was by way of softening up the audience for my bombshell. This was in answer to the question, ‘what has been the greatest data management disaster?’ I found myself saying ‘the semantic web’! Which should have caused a riot in Norway, home to ISO 15926 and a host of academics beavering away to leverage Tim Berners-Lee’s (a.k.a. ‘God’) resource description framework to make a better world of interoperable linked data. Well that hasn’t happened. In fact it hasn’t happened so much that my bombshell went down like a lead balloon. I doubt that many in the audience were aware of the years of semantic effort**. It is true that semweb has been more popular (but essentially unsuccessful) in the engineering community than in the upstream.
Sitting next to an SLB rep, I thought I would give a smart-ass answer to the question, ‘what has been the greatest success in data management?’ Schlumberger’s Finder, I said. Which was ‘smart’ in that Finder has long disappeared from the SLB catalogue, replaced with a host of expensive apps that are now to be replaced with OSDU (maybe). It was also smart because for a while, Finder really was something else, selling like hot cakes in the late 20th century. It was eventually decommissioned when the underlying software stack reached end of life (more IT that goes wrong!).
I also found myself giving another smart-ass answer to the ‘greatest success in data management?’ question. I suggested the greatest success was GIS. Smart, because for all the efforts of the bottom-up data managers to manage E&P’s plethoric data types and re-combine them into a prospect or play, it seems like the top-down GIS approach often provides an easier way of bringing it all together. Perhaps I should have said that GIS is better at stealing the scene from the data managers who still need to do the grunt work of getting stuff in place.
The final two questions were ‘What is the biggest unsolved problem in data management?’ and ‘What are you most hopeful about for the future?’ Tricky ones these. On the one hand, the biggest ‘unsolved problems’ and their hoped-for solutions have remained remarkably static over 20-plus years of data management conferences. I think that on top of these, a new problem is the siloization of data management and the difficulty of communicating between the data managers, IT, data science and ‘the business’. So my hope was to do this better. But since IT is constantly mutating, rediscovering the wheel and inventing new names for stuff, this is not getting any easier. On the subject of naming stuff, another divulgation from my future oeuvre. The great French mathematician Henri Poincaré defined mathematics as the ‘art of calling different things by the same name’. I suggest that IT can be considered the ‘art’ of calling the same thing by lots of different names.
* Of course the other (the real?) reason I like ECIM so much is the opportunity to run in the beautiful Djupadalen park. You can see my 2022 ECIM run on Strava.
** The semantic dream lives on though. In this very issue we report on how Australian academics are still flogging the semantic dead horse in an MRO context.
In his wide-ranging keynote address to the 2022 ECIM Data Management Conference in Haugesund, Norway, Equinor’s Harald Wesenberg stated that ‘Apps age like fish, but data ages like wine’. This explains why so many digital oilfield projects fail (not only in Equinor!). Development methodologies like scrum and agile are easy and cheap to deploy. But software development is not like civil engineering. There are no first principles to build on and little relationship between cost and delivery. Agile teams may involve many players but not, perhaps, so many users. Successful software comes from successful user conversations. Development teams need to be immersed in the environment so that they build what users need and understand the data.
This is where things get complicated. Sensors are inherently unreliable, information needs context. ‘Do you know what you are measuring?’ A term like ‘cargo’ can mean different things to different stakeholders. Terminology alignment between domain specialists and data scientists is tricky. Again, context is key to capturing ‘silent deviations’, i.e. unknown local practices and adjustments. Applying manual data checks up-front may lead to stress and overwork. There are frequently differences between how outsiders think a process works and what is really going on. Developers have strong opinions on how software should work. The project manager in the middle translates requirements, but may have his/her own biases (groupthink). Apophenia (the tendency to perceive meaningful connections between unrelated things) is another gotcha.
Wesenberg warns of the ‘3 Cs’, career, competence and corruption, which have led to falsified nuclear safety data, questionable Alzheimer’s R&D and more. Inspiration may be found in the work of Steven Shorrock. To understand and manage our complex systems, Wesenberg suggests the Cynefin framework for managing complexity. In any event, ‘What got me here will not get you here – you need to make your own journey’. Wesenberg wound up with a quote from Jeff Bezos, ‘When anecdotes and data disagree the anecdotes are usually right!’ In other words, ‘you may have been measuring wrong’.
Hilde Nordbo traced the 50-year history of the Norwegian Petroleum Directorate (NPD). With the 1969 discovery of Ekofisk it was clear that the oil business was going to be massive for Norway. The NPD has maximized the value from oil and gas for Norwegian society and now Norway has ‘the best oil industry in the world’. A key component here has been the proper management of resources and of data. The establishment of a national geoscience resource was prescient. Geobank houses physical samples, logs and tapes, with some 160 km of cores, and Diskos holds 13 petabytes of data. Diskos has also been a success factor. A recent huge find is attributed to reworking old data in Diskos. In the context of the energy transition, is oil still necessary? ‘Yes, in combination with emerging sources’. All are part of NPD’s effort, preparing for the future, with data repurposed for new uses.
Mathias Hartung (Wintershall DEA) addressed the issue of data governance in a ‘mature yet transforming operator’, faced with the challenges of post-merger integration, energy security, the pandemic and ‘net zero’. The focus today is on operational efficiencies, quality decisions and compliance/reporting. This all mandates data quality and fixing the ‘garbage in, garbage out’ problem which ‘even the best data science cannot fix’. Hartung leverages a SIPOC approach (supplier – input – process – output – customer) which is applied ‘right to left’ (q.v. ‘right to left thinking’ elsewhere in this issue). ‘SIPOC fixes GIGO’ in a process that goes from Scada systems into Wintershall’s data warehouse (an implementation of Steinhaus Information Systems’ TeBis historian), a hub that feeds data out to stakeholders in the GPOE unit (well integrity – see also Cegal on ‘New ways of working’), reservoir management (OFM), ESG and finance. Wintershall’s dedicated data and information management organization has a direct line to corporate management. Like the data, the unit crosses all departments – subsurface, process, information management, data science. Documents are managed in a combination of OneDrive, SharePoint and (for enterprise content required for compliance and reporting) a dedicated document management system with links to the physical archive of paper documents. Hartung observed that Wintershall’s climate change KPIs are ‘exploding’ and are migrating from manual estimates to automated reporting and management. There will be some 50k climate KPIs to report on in 2023, a ‘huge data management problem’. Data governance is essential for digitalization and can be done by professionals focusing on business processes and data readiness. Governance delivers efficiencies, boosts morale and assures compliance. ‘Data and information flow is what we do’. Hartung is open to collaboration on data governance best practices.
Henk Tijhof presented Shell’s OSDU* experience to date and its take on OSDU strategy and vision. Shell initiated the OSDU program, contributing components of its in-house developed SDU, the subsurface data universe. Technology is changing relative to legacy systems and Shell is ‘trying to figure out where we are going’, with a focus on workflows. Some users are keen, but many are hard to convince. The majority say ‘OK, if you want to change, go ahead, I have a day job!’ The expectation is of company-wide access to all global data for AI/ML workflows. Such business processes can be digitally orchestrated in the cloud. But Shell is struggling with data in the cloud. Shell’s subsurface and wells (SSW) digital ecosystem is more than OSDU. With the move to an OSDU future cloud architecture, Shell ‘wants to partner, not to develop our own solutions’. With the new ways of orchestrating business workflows, it remains important to keep the benefits of legacy apps. OSDU needs more market solutions. Data migration is an issue, OSDU is currently more of a data transfer platform. In any event, Shell’s OSDU program is large and running at full capacity. Shell is committed to support OSDU, but this has never been done at scale. Generating and consuming data in the cloud via OSDU in support of end-to-end seismic workflows has value, but is technically difficult. The OSDU Forum needs to stay strong with sustainable and equitable contributions from members. We need a commitment to go all-in with OSDU as a platform for data storage. Here there are a lot of things to solve, like data immutability and working with apps. A workspace concept needs to be designed and implemented ‘asap’ to avoid going back to the original project databases. In a slight dig at the scrum approach, Tijhof confessed to being an ‘old school waterfall man’. It’s key to get the design principles right. More thought is needed on data management, catalogues etc. We need more teamwork to test, break and fix.
* Originally the Open Subsurface Data Universe. Now just ‘OSDU’, to allow infinite scope creep!
David Holmes (Dell) provided a more positive spin on the OSDU state of play, speaking on behalf of The Open Group. There are today some 200 companies paying TOG fees and providing resources, an ‘enormous community’. One major release involved a million person-minutes of Webex calls. The OSDU mission is to ‘reduce data silos’ and provide an exit ramp for existing technology*. An environment where we can deploy new stuff without three years of integration effort thanks to a ‘cloud and open standards-based ecosystem’. OSDU needs to be ‘tech-agnostic’, i.e. poly-cloud, on-prem or public cloud. OSDU started in the subsurface but is extending across energy and broadening its scope, notably with the Open Footprint forum (OFP). This means ‘using modern IT to industrialize data management’ with cloud capabilities displacing legacy (although ‘we are not there yet!’). Open source is ‘at the heart of OSDU**’. But not everything is ‘free’: open source means ‘differently expensive’. Service providers are building commercial OSDUs. The consultants, notably SLB, are in on the act. While you could roll your own, a vendor edition means less spent on platform capabilities. You need to re-evaluate your IP and keep just what is core, ‘just write the cool stuff’. Support the platform, grow domain coverage and think how to take this forward. OSDU is one of the few things around which industry is coalescing.
* An interesting remark, potentially with major consequences for the vendors of ‘legacy technology’.
** Actually it is not really clear if OSDU is open source.
Andreas Sandvik Jakobsen (Sopra Steria) and Bruce Chalmers (Vår Energi) offered a DataOps 101. DataOps (DO) will ‘liberate data from its sources and deliver it to a person, app or system’. DO is ‘governed by agile’. Vår Energi uses DO to deliver data products to its geoscientists and engineers, leveraging data science, domain expertise and data management. Cleaning exploration data for data science use is not easy. Quality challenges abound: trends can mislead, deviated wells are awkward to handle and there are many different data sources. DO helps with data cleansing, adding context, dashboard building and integration with Petrel. Wells are consolidated in a Python/Pandas DataFrame along with NPD attributes and viewed in a Python GUI (a minimal illustration of this kind of consolidation follows this report). Was it worth it? Merging multiple data sets adds ‘debugging knowledge’. Most value add came from data consolidation and prep with DO. DO can dispel or verify data myths by comparing different data sets/sources and identifying bias. The resulting data product can be used over time as an ‘always on’ tool for interpreters. The trick is to combine data science, domain expertise and data management into one ‘cog’ that drives the interpretation machine. DO means an automated data pipeline that is accessible and communicable and that supports enterprise data governance. DataOps engineer is ‘the sexiest job in analytics’. In the Q&A, Oil IT Journal asked what is DataOps, is it anything more than just using Python? The answer was that ‘DO is technology agnostic, it could be Python, PowerBI’. OK, but what is it?
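No code was shown, but a minimal, hypothetical pandas sketch of the kind of consolidation described, merging an operator well list with public NPD wellbore attributes, might look like the following. File and column names are invented for illustration.

    # A hypothetical sketch of DataOps-style well consolidation with pandas.
    # File and column names are invented for illustration.
    import pandas as pd

    # Operator well list and public NPD wellbore attributes (hypothetical CSV exports).
    wells = pd.read_csv("operator_wells.csv")   # e.g. wellbore_name, td_m, spud_date
    npd = pd.read_csv("npd_wellbores.csv")      # e.g. wellbore_name, field, content

    # Light cleansing: harmonize the join key before merging.
    for df in (wells, npd):
        df["wellbore_name"] = df["wellbore_name"].str.strip().str.upper()

    # Consolidate into a single DataFrame, flagging wells that lack NPD attributes.
    merged = wells.merge(npd, on="wellbore_name", how="left", indicator=True)
    missing = merged[merged["_merge"] == "left_only"]
    print(f"{len(missing)} wells lack NPD attributes")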
There was a full house for Einar Landre’s report from the trenches on Equinor’s experience of OSDU adoption. Since its 2018 launch by Shell and Johan Krebbers, OSDU has grown to 220 member companies and 2,600 contributors on the Slack channel. OSDU is now ‘near to production ready’. Hitherto data has suffered from broken lineage and lack of provenance. In the well delivery process, knowledge and intent is lost in an operational ‘fog of war’. OSDU can help here to maintain lineage in delivery with artefacts that point to their predecessors in the workflow. Data provenance can be fixed with metadata captured at source. But where is OSDU in Equinor’s journey to the cloud? Equinor’s ‘Omnia’ cloud is a bespoke data warehouse running in Microsoft Azure and preceded OSDU. One Omnia component, the subsurface data lake (SSDL), is being refactored to run OSDU code inside Omnia. This is slotted for ‘soft production’ by year-end 2022. OSDU currently runs with Equinor custom schemas; in the future, Equinor data will leverage OSDU well-known schemas and ultimately move to pure OSDU, with attributes on demand from apps/users. So where is the system of record? Seemingly ‘there is no golden record, only an endless journey of insight’. The golden record/single source of truth is a fallacy. A data platform needs to support the endless data lifecycle, the continuous distillation of raw and immature datasets into fit-for-purpose datasets that are ‘bearers of knowledge’. These can be archived either along the datatype/context axis or along the artefact/project axis; OSDU supports both. A legal risk assessment of OSDU is underway in Equinor. The first data to store will be non-confidential, ‘low hanging fruit’. Legal data tags are work in progress, as is data management, governance and compliance. Are we there yet? No!
Maria Juul (NPD) presented Diskos 2.0, the fifth manifestation of Norway’s oil and gas data repository. Diskos is ‘the world’s largest NDR’ with 13 petabytes of data, over 300 users and 33 member organizations. Diskos is managed by three partner organizations: the NPD (regulator and administrator), its members (contributors, users) and the operator (the technology provider). Landmark and Kadme return as operators of Diskos 2.0, which will go live in January 2023. Enhancements in 2.0 include a new API for access to well, seismic, trade and production data, automated reporting, dropsite data ingestion, improved QC and tagging of reporting requirements. Third parties can add data for completeness. More data will be available from the public portal. Diskos now runs in the public cloud, on both AWS and Azure. A virtual data room, served from AWS Stockholm, is available as an additional service. While Diskos is said to be ‘OSDU ready’, there are no plans to ‘merge’ with OSDU. OSDU is not considered mature and ‘we need to understand more to see if this is where we want to go’.
Bee Smith and Zahir Ibrahim described how the UK North Sea Transition Authority is ‘driving data quality improvement through Section 34 infrastructure reporting’. NSTA took over management of the UK National Data Repository in March 2022. The initial focus is on data quality improvement under impetus from the 2016 Energy Act. Ready access to climate data is a prerequisite to the energy transition. The Energy Pathfinder has launched, extending NSTA’s brief to span oil and gas, CCS, electrification, hydrogen and offshore wind. One emerging use is decommissioning, where new data attributes and a system of record are needed. Section 34 of the UK Offshore Energy Act empowers the NSTA ‘to require information and samples’. To which end, an ‘agile’ approach has been deployed to improve data reporting. This involves some 45 types of infrastructure and here, ‘only shapefiles or geodatabases are accepted, no more spreadsheets!’ Data is run through an FME Workbench which applies some 36 checks on duplicates, attributes and spatial integrity (an illustrative sketch of such checks follows this report). Future plans include tighter definitions, more attributes, more checks and more automation. NSTA is driving innovation through better quality data. In the Q&A, NSTA was asked why there was no facility for non-GIS data (such as PDFs or reports of corrosion) and why the IOGP Spatial Data Model was not considered. This was acknowledged to be a possible future option.
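NSTA’s checks run inside FME Workbench. Purely by way of illustration (and not NSTA’s actual workflow), a hypothetical geopandas sketch of two of the check families mentioned, duplicates and spatial integrity, could look like this; file and field names are invented.

    # Hypothetical illustration (not FME) of shapefile QC with geopandas:
    # duplicate records and invalid geometries. File/field names are invented.
    import geopandas as gpd

    gdf = gpd.read_file("infrastructure.shp")

    # Duplicate check on an assumed asset identifier column.
    dupes = gdf[gdf.duplicated(subset=["ASSET_ID"], keep=False)]

    # Spatial integrity check: flag invalid or empty geometries.
    bad_geom = gdf[~gdf.geometry.is_valid | gdf.geometry.is_empty]

    print(f"{len(dupes)} duplicate records, {len(bad_geom)} suspect geometries")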
Sakthi Norton (Offshore Energies UK) addressed the evolving value of data in ‘a North Sea in transition’. ESG factors are gaining in importance and are now ‘as important as profits’. This is changing data management as ESG data is now required to cover carbon, biodiversity risk, water management, social, safety, governance and anti-bribery. There is a lot of data to collect and maintain, and also lots of overlap with different traditional functions in HR, operations and finance. Much of this data is already captured but it is ‘siloed, messy, and not necessarily purposed for ESG’. OEUK’s current focus is on large operators and the supply chain. OEUK is there to support industry-wide ESG reporting via the SEQual registry of suppliers and buyers. ‘We need to be ready for ESG data’.
Chandra Yeleshwarapu (Halliburton) bravely stated that ‘Data management as a function is going to disappear!’ It will be replaced by ‘DataOps’. You need to think of data as a product, you need a data pipeline and thus you need an enterprise architecture. Today, companies are funding data science rather than data management. This means developers are in the wrong place. Better to bring them together in one scrum team, interlocking the cogs of MLOps, DataOps and DevSecOps that ‘constitute the enterprise architecture’.
Having already heard two talks on DataOps without understanding much, we listened closely to SLB’s Jamie Cruise’s talk on ‘data without borders’ and digital transformation via ‘DataOps’. For Cruise, ‘data chaos is the enemy of the industry’ and current E&P environments impede digital transformation, especially at the data layer. What is needed is a platform that connects producers to consumers. This is hard to achieve because companies have spent years developing ‘something that looks like a platform’. Now we have OSDU, the data platform, and a ‘pure open source community*’. So now what? Cruise invokes ideas from outside the industry, notably ‘data as a product’. He cited McKinsey and the Harvard Business Review’s ‘A better way to put your data to work’ (HBR, June 2022). Here we have a system of record feeding a data platform feeding applications. ‘Alongside of data as a product we have DataOps’ and ‘data starts speaking to humans in a trusted voice’ (one might think that Cruise has his tongue in his cheek at this point). Product thinking will change data boundaries, beyond corporate, vendor and national data silos. ‘The sheer weight of buzzwords will mean we start thinking of data products!’ (OK, he definitely is kidding!) Standard platforms productize the collaboration model between producers and consumers. However, data is a ‘business’, enabled by data loading, curation and peer-to-peer data marketplaces. All of which could make up a digital energy environment and all of which are fee paying. OSDU is ‘just a piece of the puzzle’.
* Again – this is moot.
More from ECIM, the EU Community for Information Managers.
Blogging on the OSDU Forum home page, Dean Samara-Rubio (Intel) announces that the OSDU ‘Edge Lab’ is now open. As OSDU scope creeps beyond its initial subsurface focus, the Forum is ‘looking to’ other phases of the oil and gas operations business including drilling and production. Here the OSDU approach will enable data interoperability with APIs for ingestion of production data to the cloud. But Samara-Rubio’s concern is for the ‘millions of widely distributed and remote sites’ with edge computing and communications devices. Enter the OSDU Edge working group which is to ‘investigate edge computing for the energy sector’. Current members of the new work group include Chevron, ExxonMobil, Red Hat and Intel who are funding development of APIs and architecture for the edge through the OSDU Edge Lab, with initial member contributions from Aveva, Dianomic, Naonworks, Petrabytes, and Softdel.
The OSDU Forum also recently announced the OSDU Innovation Marketplace. Although OSDU already exposes a catalog where vendors can showcase their wares, this lacks functionality for consumers to request solutions and products from vendors. The Innovation Marketplace closes this gap by allowing ‘upfront demand specification’ such that the developer ecosystem can build solutions for known consumer needs. The Marketplace is a pass-through for requests, the OSDU Forum is not becoming a software development house and ‘will not, and must not’ engage in commercial activity or conversations. Requests are reviewed by a marketplace triage team to ensure that they are appropriate and comply with anti-trust guardrails. Any changes needed to the OSDU data platform to facilitate innovations will be ‘donated or communicated to the shared backlog’. The OSDU marketplace was built with Aha’s idea management software. Despite the aim for a ‘vibrant marketplace’, five months since inception, it does not appear to have garnered much interest from OSDU innovators!
More on OSDU elsewhere in this issue (ECIM) and from The Open Group.
A recent publication from IBM titled ‘The Quantum Decade, A playbook for achieving awareness, readiness and advantage’ is a 140 page analysis of the ‘weird but wonderful realm of quantum mechanics’ and how harnessing the power of qubits should allow future quantum computers to perform more powerful computations than traditional computers. IBM expects practical applications that exhibit the ‘quantum advantage’ in this decade with widespread adoption expected by 2030. It’s all about the number of qubits. In 2020, the state of the art in quantum computing was an IBM system with 65 qubits. A 1,000 qubit machine is forecast for 2023.
BP’s VP Digital Technology Richard Debney believes that, ‘Moore’s Law is coming to an end and classical computing is reaching its limits just as our demand [for compute power] is starting to surge.’ To harness the power of quantum computers, programming needs to adapt. As Debney explained, ‘Quantum computing is not just an expansion of classical computing. We can’t just port problems to quantum computers. We need to break them down and build communities that can effectively apply this technology to the right problems.’
Last year IBM identified a potential use for quantum computing in deep learning, using ‘quantum kernels’ to solve ML problems that are hard for classical methods. Woodside Energy is working with IBM’s quantum researchers to investigate practical applications of quantum kernels in machine learning workflows. The ongoing research, a ‘pathfinder project’ for Woodside, addresses, inter alia, petrophysical analysis of well log data.
Another focus for quantum research is materials discovery, in pharmaceuticals and petroleum refinery catalysis. Here Doug Kushnerick, formerly with ExxonMobil Research, says, ‘[Today] the materials discovery process is unbearably slow. Companies don’t have time to experiment endlessly. Quantum computing can give us an exponential leap in discovery.’ ExxonMobil is also investigating the use of quantum computing to optimize journeys of its tanker fleet, a problem which, at scale, is said to be ‘intractable’ for classical computers. This led ExxonMobil to join the IBM Quantum Network to get access to advanced quantum computing systems and tools including the open source Qiskit quantum optimization module to test quantum algorithms. The outcome? ‘Depending on the aspects of the problem, some heuristic quantum algorithms performed slightly better than others, and variational quantum eigensolver-based optimization performed better depending on the choice of the ansatz.’ So there you have it!
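For a flavor of how such routing problems are posed for quantum optimizers, here is a toy, classical sketch of a QUBO (quadratic unconstrained binary optimization) formulation, the form that tools such as Qiskit’s optimization module accept. The costs and penalty are invented and the tiny problem is brute-forced rather than run on quantum hardware.

    # Toy, classical illustration of a QUBO formulation of a two-route choice.
    # Costs and penalty are invented; quantum optimizers accept the same formulation.
    import itertools
    import numpy as np

    # Binary variables x0, x1: ship the cargo on route 0 and/or route 1.
    costs = np.array([3.0, 5.0])
    penalty = 10.0  # forces exactly one route to be chosen

    def qubo_energy(x):
        x = np.array(x)
        return float(costs @ x + penalty * (x.sum() - 1) ** 2)

    best = min(itertools.product([0, 1], repeat=2), key=qubo_energy)
    print("best assignment:", best, "energy:", qubo_energy(best))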
Heightened expectations of quantum computing are evidenced in a paper on the EOS website where researchers Annarita Giani (General Electric Research) and Zachary Goff-Eldredge (US Department of Energy) explain how quantum computing can tackle climate and energy challenges. Climate modeling, in particular, ‘simulating the nuanced effects of ever-shifting clouds on climate’, is apparently ‘proving intractable to classical computing’. Quantum computing may be able to solve the nonlinear differential equations that are key for working on fluid dynamics problems. We have asked the authors if this means that they are calling into question the current climate model from the IPCC. So far no reply!
The 2022 Oil & Gas Session at the Huawei Connect Dubai event heard from Li Yangming, who announced two ‘scenario-based’ oil and gas solutions: an integrated oil and gas field network and a smart gas station. The network blends edge computing, AI, ‘native hard pipe’ optical comms and IPv6+ in a centrally-managed, secure network architecture that supports multiple technologies including industrial PON, Wi-Fi 6, and 5G. Hard pipe technologies further leverage Flex-E and NHP for enhanced security and network isolation. On the smart gas station front, Huawei has integrated its full-stack technical capabilities and worked with partners to build the smart gas station solution (see below). An integrated and converged edge platform based on FusionCube adds an AI capability to existing service station infrastructure.
Yangming also lifted the veil (a little) on the Pangu Supermodel that Huawei has co-developed for CNPC. Pangu, named after the Chinese creator of the universe, proposes to add AI functionality to a cross domain upstream data model. The supermodel ‘has OSDU inside’. AI smarts are delivered from Huawei MindSpore, an open AI framework (tuned to its Ascend processors) and ModelArts, a one-stop AI platform for citizen data scientists.
Souvatra Mukherjee from partner 3W Networks (a unit of Egypt-headquartered Elsewedy Electric) presented work for Adnoc to deliver a ‘high bandwidth, data centric, trustworthy network’ for diverse applications. Robert Kwong presented Singapore-based WeCar Technology’s approach to the digital transformation of gas stations. For Kwong, technology transition for retail is already a done deal (otherwise, ‘you are no longer in business’). The next frontier is the customer, ‘how are we going to delight our customers?*’. The answer is in knowing your customer and tuning the offer to his or her preferences, while respecting data privacy (of course!).
* Delighting the gas station customer through digital is not all that new. Back in 2001 we reported on Hess’ use of VoiceXML technology to satisfy hungry motorists with a pre-ordered ‘Blimpie’ sandwich.
Visit the Huawei oil and gas home page here and watch the recording of the Huawei Connect oil and gas track here. But a word of warning and a suggestion for an improvement. Some of Huawei’s Chinese speakers’ English is quite hard to follow. Maybe some of that Pangu AI could be applied to providing closed-caption subtitles. Or even better, a transcript of the proceedings!
Badleys TrapTester 7.2 introduces ‘Strata-Cubes’ and user-defined macros for seismic processing. A Strata-Cube is a 3D container for gridded seismic attributes which will support new functionality in future T7 releases. More from Badleys.
Eliis’ Paleoscan 2022.1.1 improves data exchange with OpenWorks, improves well log management and optimizes fault import.
The 2022.10 release of Resoptima’s ResX package includes a validation feature to check configuration prior to running a whole ensemble-based simulation. The process has also been upgraded to enable one-shot QC checks before run submission. The ‘create initial ensemble’ process supports drag-and-drop of facies data into the model. Resoptima also announced a partnership with Aker BP, Sval Energi and NORCE on the reduction of CO2 emissions from oil and gas production.
The HDF5 community has announced NeXpy, a GUI toolbox for analyzing HDF5 data. HDF5 is used inter alia in the Energistics standards for E&P data.
Pegasus Vertex’ new (4.3) release of CEMLab introduces new functions for fluid search, multiple water densities, and a ‘show on report’ option for fluid tests.
Release 4.9 of HiveMQ’s Enterprise MQTT platform offers distributed tracing, underpinned by OpenTelemetry, and IoT data ingestion from MQTT clients into the public cloud, leveraging the enterprise extension for Google Pub/Sub.
Avalon, from Sensia, is a new cloud platform for oil and gas operators. The ‘open, vendor-neutral’ platform can be deployed over existing infrastructure. Applications can be migrated to the platform or customized for specific visualization, alarming and data-reporting needs. Sensia is a Rockwell Automation/Schlumberger oil and gas automation joint venture.
The 14.0 release of AspenTech’s AspenOne includes over 100 sample models to help customers manage and report Scope 1 and 2 emissions. More in the release.
CSA Ocean Sciences has launched ‘Slick Kit’, a portable oil sampling system for slick sample characterization by offshore professionals. The kit includes standard operating procedures and supplies needed to collect slick and sheen samples.
DNV is leading three parallel joint industry projects to develop its KFX gas dispersion, fire and explosion simulator. The JIPs target CO2 dispersion, hydrogen and ammonia safety.
Seeed Studio has announced new compact, high-performance Nvidia Jetson-based ‘mini AI’ computers designed for ‘edge AI’ in the Industrial Internet of Things. The high end T906 sports the Jetson AGX Orin module and an eight-core Arm CPU plus 1,792-core Ampere GPU to provide 200 trillion operations per second for ‘sparse INT8 workloads’. More on the T506S and T906 boxes from Seeed.
Emesent’s Hovermap ST-X augments the reach of autonomous Lidar with a sensing range of 300 meters and over a million points per second.
The MFE Inspection Solutions handheld optical gas imaging camera allows operators to visualize gas leaks in real-time. The 640 x 512 high operating temperature mid-wave infrared camera is powered by its 70 Wh rechargeable battery handle.
Schneider Electric has rolled out the EcoStruxure Micro Data Center R-Series, expanding its ruggedized micro data center offering for IT applications in remote/harsh industrial environments. The 905 kg package houses a NEMA 12/IP54 rated 42U cabinet and APC Smart-UPS backup power.
SAP has announced SAP Build (previously AppGyver), a ‘low-code’ environment for citizen developers. Build exposes SAP’s Business Technology Platform and Signavio solutions to business users.
The latest 5.6.3 release of L3Harris Geospatial’s Envi includes a machine learning toolbox that offers random forest models, anomaly detection algorithms and K-means unsupervised classification.
The 2022b release of Matlab and Simulink includes a new Simscape Battery simulator. Simscape Battery provides design tools and parameterized models to let designers create digital twins, run virtual tests of battery pack architectures, design battery management systems, and evaluate battery system behavior across normal and fault conditions.
The 2023 release of OriginLab’s Origin graphing software adds over 100 new features and extended graphing and analysis capabilities. A list of oil and gas users of Origin can be found here.
Autonomous Underwater Vehicles specialist Terradepth’s ‘Absolute Ocean’ solution allows survey companies, marine organizations, and ocean data consumers to interact with all their data sets in one geospatially-referenced place. Alongside Terradepth’s survey data, Absolute Ocean houses public data sets of side-scan and synthetic aperture sonar, magnetometer grids and satellite imagery. The solution also serves as a storefront where customers can purchase commercial products, such as satellite-derived bathymetry from TCarta of Denver. More from Terradepth.
Speaking at the PTN Events Oil and Gas Digital Transformation virtual/online conference, Rob Kennedy (Wood) advocated the application of ‘right to left thinking’ to ‘transform lifecycle economics’. Right to left means starting with what is needed at the end of the facility lifecycle and working back to design, allocating capex appropriately. For Wood, the process revolves around a full asset lifecycle methodology that includes building a digital twin ‘from the ground up’. Wood’s flagship digital twin was developed for Turkish Petroleum’s Sakarya Gas Field to accelerate project delivery, maximize operating performance and reduce costs. The ‘twin’ comprises an asset twin, GIS and process twin, built with component software from Hexagon, ESRI and Wood’s own Virtuoso asset performance management tool that is said to embed data standards based on CFIHOS and PODS.
Daniel Fleck (RedEye) described engineering information management (EIM) as an often-overlooked area of asset management. EIM involves multiple documents and data types, from 2D drawings and CAD files to 3D models and BIM* data, through images, documents and controlled records. Working with this disparate information remains hard. A survey revealed that engineering information mis-management is ‘significant and costly’. Redeye proposes a generic template for the development of a digital engineering standard that is comprehensive and future-proof for digital twin and BIM usage. EIM needs to be agnostic, supporting all platforms and tools natively. Enter Redeye’s Common Data Environment, a hosted, multi-tenant solution for workflow and engineering information management.
* Building information model/management.
Maria Coelho presented digital engineering work performed at DICE, the Digital Innovation Center of Excellence (a unit of INL, the Idaho National Laboratory) spanning model-based design, digital threads/twins, artificial intelligence and extended reality. INL, a national center for nuclear research, is managed by the Battelle Energy Alliance. Model-based systems engineering is described as the application of modeling and data-driven engineering in support of systems-of-systems design, analysis and life cycle operations. MBSE transforms typical systems artifact documents to data objects, creating a ‘single source of truth’. MBSE informs the digital twin, a ‘living virtual model of the physical asset’ that is used to predict future behavior. Real-time bi-directional communication between the twin and the asset compares simulated and measured information. It is this integration of real-time data and dynamic model update that distinguishes the digital twin from the ‘traditional’ simulator. Moreover, the digital twin blends sensors and instrumentation, artificial intelligence, and online monitoring into a ‘single cohesive unit*’. INL’s digital twins are built around DeepLynx (available on GitHub), a central data warehouse of live ‘ontological’ and time series data. AI is used to automate expensive and manual human activities and to predict unobserved and difficult to measure events.
* One might think that this is easier said than done!
Sarunas Strasevicius (StackFlows) claims that ‘60% of skilled worker time is wasted on coordination’ activities, responding to emails, chats, follow-up meetings and tracking down missing input. ‘Employees feel that two-thirds of scheduled meetings are unnecessary*’. ‘Over 10 percent of an employee’s day is spent on tasks that have already been completed, either by themselves or by a colleague.’ Strasevicius enumerated some misconceptions of digital transformation. In-house IT will never have the resources to realize the transformation. On the other hand, contrary to the legacy IT approach, modern tools can make a transformation ‘remarkably affordable’. The key here is across-the-board business process automation, led by the operations team who ‘know how processes should run’, and who are the best-equipped to achieve the transformation. Strasevicius recommends the use of the Object Management Group’s BPMN process modelling standard. The ISO standard modeling language is ‘easy and intuitive’ and can be learned in a month, along with the companion OMG Decision Model and Notation spec which adds rules and automation to BPMN. StackFlows’ integration platform leverages all of the above with bi-directional integration of third party (ERP, CRM … ) systems.
* A statement that recalls John Wanamaker’s observation that ‘Half the money I spend on advertising is wasted; the trouble is I don't know which half.’
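For readers unfamiliar with BPMN, the following sketch shows what a minimal (hypothetical) BPMN 2.0 process definition looks like under the hood and how it can be inspected with Python’s standard library. This is not StackFlows code; the process and task names are invented.

    # Minimal, hypothetical BPMN 2.0 process (start -> user task -> end), parsed
    # with Python's standard library. Not StackFlows code; names are invented.
    import xml.etree.ElementTree as ET

    BPMN = "http://www.omg.org/spec/BPMN/20100524/MODEL"
    MODEL = f"""
    <definitions xmlns="{BPMN}" id="defs1" targetNamespace="http://example.org/bpmn">
      <process id="approve_invoice" isExecutable="true">
        <startEvent id="start"/>
        <userTask id="review" name="Review invoice"/>
        <endEvent id="end"/>
        <sequenceFlow id="f1" sourceRef="start" targetRef="review"/>
        <sequenceFlow id="f2" sourceRef="review" targetRef="end"/>
      </process>
    </definitions>
    """

    root = ET.fromstring(MODEL)
    for task in root.iter(f"{{{BPMN}}}userTask"):
        print("user task:", task.get("name"))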
Matthew Moore is the global subject matter expert for condition monitoring at contractor Petrofac. He observed that predictive maintenance and condition monitoring in oil and gas is ‘very mature’ due to the exceptional return on investment associated with avoiding downtime and reducing maintenance costs. Petrofac has reaped the benefits of digitalization through its CBMnet reporting tool and advisory for critical rotating machinery. Petrofac is now working to progress condition monitoring with automated data collection and analysis, moving towards real-time data and the use of wearable technology (headsets), and to ‘facilitate the use of AI and machine learning’. Real time vibration data collection is getting attention today with the advent of IIoT wireless vibration sensors and cloud-based AI and ML. Condition monitoring has been ‘glamorized’ as an IIoT digital initiative and introduced to more industrial sectors, particularly manufacturing. The market is now ‘hugely competitive’ due to the availability of low cost MEMS technology. Moore walked through some rather intricate failure cases to show that while the AI/ML approach has a lot to offer, the new technology brings new challenges and increased costs. ‘Condition monitoring is set to become significantly more expensive, but will value scale proportionally?’ ‘AI and ML for prescriptive maintenance is still some way off but remains the ultimate prize’.
Kai Eberspaecher works for Bengal Energy, a Canadian operator exploring in the Australian outback. He presented a ‘day in the life’ of a field operator, an activity that is digitally-enabled with a range of off-the-shelf applications. The operator’s 200 mile drive is tracked with a journey management system from Brisbane-based JMS. Driver fatigue and heat exposure is monitored with Canaria. Work orders are managed with Redeye (see above) and captured in Rockwell Automation’s Factory Talk Fiix computerized maintenance management (CMMS) package. Data from the Ignition Scada/historian system is analyzed in Energysys’ production accounting. Bengal’s digital transformation design principles are ‘off-the-shelf, cloud-based and value for money’.
Johnpaul Portelli presented Canadian Natural’s clean resource innovation network (CRIN). In 2020, the Alberta government instituted FEMP, its fugitive emission monitoring program, obliging operators to visit sites and scan for emissions with OGI cameras and fix methane leaks. As initially proposed, this was an expensive procedure and an Alt-FEMP approach has been developed that combines OGI cameras in specific areas, aerial detection with LIDAR, and ground-based, truck-mounted equipment. Modeling techniques showed that a combination of different frequencies and resolutions gave better outcomes at lower costs. Canadian Natural now uses the University of Calgary’s PoMELO technology, a truck-mounted multi-sensor package that is used by operators to scan for emissions during regular site visits. Aerial Lidar from Bridger Photonics also ran. Canadian Natural is also exploring the use of satellite technology to add another layer of high frequency/low resolution detection at ‘minimal cost’.
Next year’s PTN Events Oil and Gas Digital Transformation Conference will be held virtually on 12 - 13 September, 2023.
Donald MacMillan heads-up ABL Group’s new Saudi Arabian subsidiary, ABL Marine Services LLC.
Singapore-based LNG developer Atlantic, Gulf & Pacific Co. has promoted Sandeep Mahawar to chief commercial officer at its LNG Terminals & Logistics unit.
In an internal move, Altair has appointed Ravi Kunju to chief product and strategy officer.
Asset Guardian Solutions has appointed Stephanie Calder as chief commercial officer. She hails from semiconductor manufacturer Winji.
Cheniere Energy has named Brian Edwards to its board as an independent director. Edwards is SVP at Caterpillar.
Zied Ghazouani heads-up Clariant Oil Services’ new EAME technical center located in Dubai Science Park, UAE.
Following Al Monaco’s retirement, Greg Ebel has been named president and CEO of Enbridge. Pamela Carter takes over from Ebel as chair of the board.
Al Cook, Equinor’s EVP E&P, is quitting to take up a CEO role in a company ‘outside the energy industry’. His position will be filled by Philippe Francois Mathieu, currently director of strategy.
David Mills is to be appointed new CFO at Hexagon. He is currently CFO of Hexagon’s Manufacturing Intelligence division. EVP Jürgen Dold has resigned to ‘spend more time with my wife and family’.
Zurinah Yen Puasa (Shell) is now chair of the IOGP geomatics committee.
Lisa Edvardsen Haugan is the new president of Kongsberg Maritime, taking over from Egil Haugsdal who is to take on a new role in the group ‘to be announced shortly’.
Michael Baker International has named Rich Driggs as EVP and COO. Driggs hails from WSP Parsons Brinckerhoff.
Scott Robertson is to leave the UK North Sea Transition Authority at the end of 2022 ‘to pursue new opportunities’. His roles will be taken on by Tom Wheeler who is in turn replaced by Jane de Lozey, ‘on a temporary basis’.
The PPDM Association welcomes back Ali Sangster (S&P Global), David Hood (geoLOGIC systems), Daniel Perna (EPAM Systems), and Jamie Cruise (SLB), all re-elected to the board. Melinda East (Focus Forward) and Jesus Rodriguez (Marathon) are new appointees to the PPDM board.
Natalie-Nguyen La has joined Ryder Scott’s Houston office as senior petroleum engineer. She hails from Shell Oil Co. Ekene Ohaegbu has joined as senior PE. She was previously with EP Valuation.
Schneider Electric has appointed Manish Kumar as EVP of its digital energy business.
Seeq has appointed George Skaryak as its first chief revenue officer. He was previously with Cyara.
Pam Presswood is the new chief information officer at Valor, a mineral management company. She hails from Luther King Capital Management.
Jessie Lockhart is now ‘chief people officer’ at Velo3D. She comes from Lam Research.
Weatherford International has appointed Chuck Davison as EVP operational excellence. He joins from Strike, LLC.
Linda Nolting Kristensen heads-up Welltec’s new test flow loop facility in Esbjerg, Denmark. The facility is to provide corrosion testing services to carbon capture and storage projects, notably Wintershall’s Project Greensand.
STEM Returners has partnered with Petrofac on a new jobs scheme to help engineers overcome ‘career break bias’. The scheme sets out to help engineers get back into science, technology, engineering and mathematics (STEM) related work after a career break. The scheme is currently offering four fully-paid placements in Petrofac’s teams in Aberdeen.
Dina Alnahdy has joined the mCloud board of Directors. She is also CEO of Entec Environmental Technology and holds a Guinness world record for the ‘world’s largest handprint painting’, a 10,000 square-meter map of Saudi Arabia made with 1.2 million green handprints!
Linux Foundation Energy (LF Energy) has announced three new open source software projects: the carbon data specification consortium (CDS) standard, which defines the raw data required to track the carbon intensity of power systems; GridLAB-D, a next generation power system simulator; and the OCPP cloud connector, a cloud-based implementation of the open charge point protocol. Shell recently joined LF Energy as a ‘strategic member’. More from LF Energy.
Chris Frost (Fujitsu), blogging on The Open Group website, opined that the standards body’s output to date addresses some aspects of developing and running digital businesses. However, the TOG standards are ‘largely stand-alone’. What is needed is an overview of the whole portfolio of standards. Frost proposes a straw man portfolio (sans OSDU) and invites others to join him in the TOG digital practitioners working group.
Report 653 (recommended practices for electrification of oil and gas facilities) and Report 647 (recommended practices for flare gas recovery systems) have just been released by the IOGP’s energy transition directorate. The reports are a free download from IOGP.
The International Sustainability Standards Board (ISSB) of the IFRS Foundation has issued drafts of two sustainability-related disclosure standards, IFRS S1, ‘general requirements for disclosure of sustainability-related financial information’ and IFRS S2, ‘climate-related disclosures’. The documents cover company disclosures on Scope 1, Scope 2 and Scope 3 greenhouse gas emissions. More from IFRS.
The Norwegian Petroleum Directorate is investigating the use of the Society of Exploration Geophysicists’ SEG-Y Rev 2.0 format for seismic data. Rev 2.0 promises greater automation of seismic reporting. The evaluation contract has been awarded to Troika International, which is contacting NCS operators to evaluate the new format and its deployment. If deemed appropriate, the format will be the basis for Norwegian seismic data reporting regulations.
A position paper from Namur, the German automation standards body, presents ‘next generation’ Ethernet-APL for safety systems. Ethernet-APL provides a state-of-the-art physical layer to support Industry 4.0 applications and reduce owner-operator effort. The new protocol is said to be ‘the most important revolution in process safety since the introduction of safety systems, around three decades ago’. Ethernet-APL provides up to 10 Mbit/s to the field and is tailored for the process industry. It supplies field devices with electrical power and can be used in explosion hazardous areas (Ex) Zone 1/0.
A recent article, ‘How to make models more useful’ in the Proceedings of the (US) National Academy of Science (PNAS) posits that while ‘computational modeling has become a valuable tool for science and policy […] community standards to share model details have not kept pace’. For research to be replicated, evaluated and improved, the computer code in the model should be comprehensible and published alongside the articles that describe the results. This is not yet the case for most modeling science. Models in PNAS’ sights include earth tectonics, global temperature change, sea-level rise, loss of biodiversity and more.
The problem is that published articles do not usually contain enough information to reproduce the models and their source code, if available, may not be understandable and runnable by others, a particularly important consideration when computational models are the basis for high-impact policy decisions regarding such things as climate change and disease spread.
While a lot has been written about open sharing of data and software, notably on the ‘FAIR’ (findability, accessibility, interoperability and reusability) principles, a recent survey of some 8,000 articles on model-based research found that a majority do not make the code available. For the most recent articles, over 80% do not provide access to the model code. While peer review follows widely understood and accepted scientific norms, there are no equivalent standards for code. There are currently no guidelines on applying the FAIR principles to model code.
The 2009 'climategate' affair, when climate researchers' emails were hacked, was particularly damaging to public confidence in climate science, because of the lack of scientific transparency and restricted access to climate models and data sets. More than a decade later, 'little has changed'.
Confusingly, the PNAS authors consider the attacks on climate models as 'somewhat ironic' since they have 'some of the most rigorously tested and reliable scientific code'. While the Community Earth System Model (CESM) from the US National Center for Atmospheric Research stands out as 'one of the few climate models used by the IPCC to make its code and data openly accessible and documented', this is not enough [ … ]: 'the CESM is complicated to download and difficult to install and run'.
These issues, articulated a decade ago, still persist. Why are models not more accessible? Developers may be concerned they will not receive recognition or rewards for the extra effort involved. Plagiarism is also a concern. To meet these challenges, representatives of leading organizations that support computational modeling met in December 2021 to establish the Open Modeling Foundation (OMF) to proselytize the use of FAIR principles in computer modeling. A central mission of the OMF is to adopt existing standards or develop new ones, if needed, to help modeling researchers, research and academic organizations, journals, funders, and other stakeholders to define what it means for a model to be FAIR. It will also offer guidance to help researchers meet these standards. More from the OMF’s development site on Git.
Chris Welsh presented OFS Portal's activity to date and its likely evolution towards an Energy Supply Chain Network. OFS Portal was established in 2000 by a group of some 20 oilfield services companies*. The organization provides a standard legal framework for interoperability and digital integration to over 500 oil companies with connections to some 45 e-commerce networks. OFS Portal provides a standard catalog process (leveraging J-Catalog from OpusCapita), transaction management (with PIDX standards) and government e-invoicing with Edicom. On the subject of emissions reporting, OFS Portal is working with the PIDX ETDX workgroup to syndicate content as a catalog service. There are two approaches to emissions monitoring: 'top-down', rough-and-ready estimates using industry averages, and 'bottom-up', using measurements from operations. [For more on the subject, see our report from the PIDX 2022 European Conference: Journey to Net-Zero]. The plan is to move from today's predominantly top-down reporting to more detailed bottom-up line-item reporting by 2030. Meanwhile, a lot is going on in the e-commerce space outside of oil and gas. Four e-commerce networks have consolidated in different geographies: Peppol, EESPA, EIN (all in the EU) and BPC (US). A move is now afoot to ensure interoperability with a Global Interoperability Framework (GIF). GIF is currently a talking shop but the ultimate aim is to leverage common building blocks to connect the networks. To integrate with the GIF movement, Welsh proposes a new Energy Supply Chain Network (ESCN), leveraging PIDX as an entry point to the non-oil and gas e-business networks. The ESCN will provide a legal framework for oil and gas content publishing, catalog management, e-invoicing, tax and emissions reporting and a point of connection between oil and gas and the GIF community.
* OFS Portal is now supported by its members Baker Hughes, SLB, Halliburton, Weatherford, Select Energy Solutions and Well Integrity Solutions.
Todd Albers (Federal Reserve Bank of Minneapolis) provided an update from the Business Payments Coalition (BPC) and set out a roadmap to a future 'B2B Digital Highway'. Currently, in the US, 75% of invoice delivery and processing is manual. This is costly and hampers e-payment adoption. Some 40% of B2B payments are by check. The Business Payments Coalition started in 2011 as an industry forum that brings all stakeholders together to drive end-to-end B2B payment processing efficiency. This has led to the Exchange Framework Initiative, an operational B2B exchange framework for the US set to kick off in 2023. Longer term, the BPC/EFI will support the delivery of electronic payment and supply chain documents.
Andrew Mercer presented Baringa's solution* for oil and gas emissions management (EM). Climate change represents an existential risk to oil and gas. Industry is under pressure from society and the regulator. Investors need to model long term climate risk. Baringa scores fossil fuel producers based on their current implied warming temperature and the changes to their business model required to align with a 1.5° target. Thus a coal producer might have a 4.5° current score that would need a 95% cut in output to reach 1.5°. Baringa's scorecard for the major oils shows ENI, Repsol, TotalEnergies and BP as leaders, thanks to their announced oil production cuts. Devon, Pioneer, Suncor and ExxonMobil come last, with no reduction. But all claim net zero by 2050! As Mercer puts it, 'there is a high degree of variance in emissions reporting across the sector, pointing to underlying issues and inconsistencies in the boundaries and methodologies applied'. The US Inflation Reduction Act has increased the urgency to overhaul emissions measurement, with a $900/ton (rising to $1,500) penalty on excess methane. Methane emissions measurement and reporting systems may need to be rebuilt to accommodate the new requirements. On the plus side, there is a 'hefty carrot' for carbon capture and biofuels.
* Baringa’s Aladdin Climate was acquired by Blackrock in 2021. Aladdin assesses climate risk, informs capital allocation, loan approvals, and portfolio monitoring and reporting in response to regulatory and investor pressures.
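To give a feel for how the new methane penalty scales, here is a minimal back-of-the-envelope sketch in Python. The $900 and $1,500 per-ton rates are those quoted above; the intermediate 2025 rate, the exemption threshold and the example facility figures are assumptions for illustration only, not part of any reported methodology.

```python
# Back-of-the-envelope sketch of the IRA methane charge described above.
# The $900 (2024) and $1,500 (2026+) rates are quoted in the article; the
# $1,200 step for 2025, the exempt threshold and the facility figures below
# are illustrative assumptions.

RATE_PER_TON = {2024: 900, 2025: 1_200}  # $ per ton of excess methane
DEFAULT_RATE = 1_500                     # assumed to apply from 2026 onward

def methane_charge(year: int, reported_tons: float, exempt_tons: float) -> float:
    """Charge only the methane emitted above the facility's exempt threshold."""
    rate = RATE_PER_TON.get(year, DEFAULT_RATE)
    excess = max(0.0, reported_tons - exempt_tons)
    return excess * rate

# Hypothetical facility reporting 1,200 tons of methane, 200 tons exempt.
for yr in (2024, 2025, 2026):
    print(yr, f"${methane_charge(yr, 1_200, 200):,.0f}")
# -> 2024 $900,000, 2025 $1,200,000, 2026 $1,500,000
```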
Charles Bryant provided more information on the Global Interoperability Framework. The GIF includes recommended practices, policies, standards and guidelines to connect existing e-business networks from EESPA, Peppol, Connect Once and BPC. The GIF recommended practices document version 1.0 was published in 2020 and is a free download. Peppol, the GIF ‘pioneer’ offers mature solutions across numerous markets and recently delivered streamlined agreements and a continuous transaction control solution for fiscal reporting. As Chris Welsh mentioned above, Connect ONCE/OFS Portal is evaluating a GIF-compliant Energy Supply Chain Network.
Ahti Allikas (OpusCapita) presented on GIF implementation in Peppol, the business network that is managed by the OpenPeppol not-for-profit, member-led international association. Peppol was launched in 2008 and OpenPeppol established in 2012. Peppol provides standard messaging, a discovery mechanism and delivery in AS4 format. More from Peppol and watch the SPDM ‘All about Peppol’ presentation by Tormod Tønnesen (Offshore Norge) and André Hoddevik, (EU OpenPeppol).
Arne Johan Larsen presented Equinor's strategy for B2B interaction. Equinor was an early adopter of e-business-to-business transactions. Over 20 years ago, Equinor and other operators started the Trade-Ranger industry initiative. [Trade Ranger was acquired by Hubwoo (now Proactis) in the early noughties]. A 2015 initiative to reduce administration costs in Europe saw Equinor leverage the Peppol format and eDelivery infrastructure as 'the fastest and easiest to implement for both Equinor and suppliers'. A solution was in production in under three months. Equinor is now looking to realize 'touchless' digital interaction with its suppliers covering electronic purchase orders, order confirmations, advance shipping and more. For an oil and gas operator, a substantial amount of procurement is not related to oil and gas: general consumables, IT equipment, consultancy hire and the like. Hence the interest in the emerging Global Interoperability Framework model, the Oasis Universal Business Language (UBL) and a 'Four corner*' eDelivery model. Equinor has already implemented several ordering and invoicing functions including a blockchain/smart contracts application for automated service approval, which is to replace field tickets and manual verification. Equinor is porting Peppol functionality to the US Exchange Framework and is inviting suppliers and partners to join it in the BPC's ongoing E-invoice Exchange Market Pilot.
* More on the ‘Four corner model’ here.
The OFS Portal event also heard from service providers DocStudio (Scope 3 Emissions Reporting), Sovos Saphety (multi-exchange B2B) and Halliburton, which used third-party supplier EDICOM to validate and format PIDX XML invoices for submission to tax authorities.
More from OFS Portal.
Following a 2016 loss of containment incident resulting from a landslide, Husky Midstream has implemented a state-of-the-art instrumentation monitoring program from HiFi Engineering and Stantec.
The solution leverages real-time geotechnical instrumentation, distributed fiber optic sensing, repeat ILI* and weather data monitoring to locate areas of ground and pipeline instability. The early-warning system includes alarm thresholds that proactively shut-in the pipeline when required. More from HiFi.
* inline inspection.
Working with ExxonMobil, Nabors has announced a ‘major industry milestone’ with the retrofit of a ‘fully automated’ robotics module on a land rig. The Canrig Red Zone Robotics system automates the rig floor, increasing safety and performance. Nabors’ RZR-equipped rig has already drilled multiple horizontal wells in the Permian Basin.
Blackline Safety reports multi-year contracts with large, international customers across the Middle East and Europe valued at $1.8 million. The largest, a three-year deal worth some $500,000 with OQ Oman covers Blackline’s G7c cloud-connected wearable gas detector with two-way emergency voice calling and push-to-talk service.
Cenovus Energy is using Bridger Photonics’ gas mapping lidar to detect methane emissions in a large-scale monitoring program across its Canadian assets. The program is run under the auspices of the Alberta Energy Regulator’s alternative fugitive emissions management program (Alt-FEMP).
Brunei Shell Petroleum has awarded CGG a multi-year contract extension for the operation of a dedicated seismic imaging center at its head office in Seria, Brunei.
Thailand's PTT Oil and Retail has chosen Cloudera to create an integrated retail and fueling customer experience through the enhancement of its data analytical capabilities. The Cloudera data platform will provide PTT with a clearer view of its customers and enhance engagement across PTT's 1,900 gas stations and 3,000 retail branches. Full implementation is slated for 2024. More from Cloudera.
SLB (formerly known as Schlumberger) has been awarded a 'moonshot pioneer' award for its use of Dataiku's analytics technology to perform predictive maintenance on its drilling rigs. The work involved a multi-terabyte collection of time series data, contextualized with meta information about drilling context and maintenance actions. One case study addressed failure mechanisms of rotary steerable systems. More from Dataiku.
Equinor is the first operator to scan an offshore structure with a rolling ultrasound scanner from Elop Technology. The Elop Insight was used to perform non-destructive testing and monitoring by ‘seeing’ deeper into concrete material with a view to extending the lifetime of offshore structures.
Equinor has awarded Emerson a five-year framework agreement for the provision of operational support services on its Norwegian North Sea Martin Linge platform. These cover maintenance and upgrades of the control technology, software and instrumentation to ‘accelerate carbon-efficient production’ and enable remote operation from onshore. The award follows Emerson’s implementation of a complete automation solution leveraging its DeltaV distributed control system.
Siemens Energy is using Gecko Robotics' wall-climbing robots to offer ultrasonic robotic inspection services across the EU. The solution collects 'rich multi-modal' data for condition-based maintenance of rotating equipment. Gecko's robots are remote controlled and equipped with ultrasonic transducers, localization sensors, lasers and HD cameras. They climb vertically and horizontally, adhering magnetically to an extensive range of equipment types to scan for changes in thickness, cracks, corrosion, blistering and other forms of degradation. More from Gecko Robotics.
Petronas has implemented Halliburton’s Digital Well Program and Digital Well Operation DecisionSpace 365 cloud solutions as the foundation of its ‘enterprise digital well integrated operation’.
Equinor has signed a collaboration agreement with Hitachi Energy covering electrification, renewable power generation and low-carbon initiatives worldwide. Hitachi has previously provided Equinor with microgrid solutions on several projects, such as the Dogger Bank offshore wind farm and Troll A, the world’s first HVDC* power-from-shore connection.
* high voltage direct current
Klarian is teaming with InfinyOn, a 'real-time event streaming company', to run its Digipipe pipeline monitoring solution and dashboard atop the InfinyOn cloud. Data from Klarian sensors stream into the platform 'with single digit millisecond latency'. Klarian evaluated 'traditional' event streaming solutions such as Kafka, Airflow and Broadway before settling on InfinyOn.
Petrobras has awarded a three year, R$49 million (approx. $9.5 million) contract to Radix to develop and maintain scientific software focused on well and subsea engineering. Radix will develop software for two lots of subsea and one lot of wells, marking its return to large contracts with Petrobras. The partnership will last for three years, extendable for an additional two years. The deal will generate 50 new job positions at Radix’ Rio de Janeiro HQ.
TotalEnergies is to ‘explore’ the use of Regent’s electric seaglider for travel to offshore platforms. The seaglider is a new category of marine electric vehicle with a range of 180 miles and a top speed of 180 mph with existing battery technology. Total’s earlier dalliance with unconventional vehicles envisaged the use of an airship from FlyingWhales in seismic survey.
Schneider, Aveva and Shell have formed a strategic alliance to support their transition to net-zero. The partners will explore opportunities to co-develop integrated end-to-end energy solutions designed to decarbonize hard-to-abate industries such as cement. Aveva and Schneider are to bring digital engineering, operational process and energy optimization technologies to the table. Shell adds its sustainable energy supply solutions, project engineering capabilities and its renewable energy portfolio.
Shell and GE Additive have 3D printed a complex (but non-operational) oxygen hydrogen mixer. The demonstration part was printed in nickel alloy 718 on a GE Additive Concept Laser M Line system installed at Shell's 3D printing center of excellence at its energy transition campus in Amsterdam. The mixer's design was 'inspired by natural geometries and symmetry including the Fibonacci sequence as replicated in flowers and petals'.
Aker BP has selected Tietoevry as digital services partner in a long-term partnership to deliver on Aker BP’s ambitions on digitalized operations. In a separate announcement, Tietoevry has formed a strategic partnership with Google Cloud to ‘scale data innovation and digital transformation in the Nordics’.
Weatherford is adding drilling analytics from Kwantis to its Centro well construction platform. Kwantis’ ID3 (integrated drilling data discovery) software will enable real time well planning and post-well analysis, predicting downhole drilling issues and identifying opportunities to reduce cost and emissions.
Evolution Markets, a provider of energy and environmental financial and transactional services is to deploy cQuant’s price simulation, renewable energy, and battery storage analytic offerings to assist clients in procuring or selling renewable energy and managing energy price risk.
mCloud has partnered with Google Cloud to launch new sustainability applications notably its AssetCare oilfield emissions management solution. mCloud also announced that it was exiting its low-margin Technical Project Services business and has ‘retired’ low-value legacy connected assets and workers.
Ipieca* has just released the 2022 edition of its sustainability reporting survey, subtitled ‘advancing sustainability reporting across the energy industry’. 31 oil and gas majors and service companies responded to the survey, conducted since 2012, to assess member companies’ reporting practices and expectations for the future. The result is a dense collection of some 50 graphics which are hard to summarize. Cherry picking some findings more or less at random we have the following. Climate change reporting beats all other sustainability concerns. Most (77%) do not have an external sustainability council/advisory, although external ‘audit/verification’ is often reported. Some 12 different assurance standards are used and these are increasingly verified by third parties. Other issues addressed in the survey cover corporate motivation for sustainability reporting, materiality of such, stakeholder engagement and more. There is a huge amount of information in this document and it is a recommended read, not least because it underscores what a confusing landscape sustainability is today. The survey results will also inform further development of the Ipieca, API and IOGP Sustainability reporting guidance for the oil and gas industry (2020 edition).
* Originally the ‘International petroleum industry environmental conservation association’, but now … ‘our name is now just IPIECA’.
In an interesting piece of upscale marketing, CGI showcased its 'metaverse innovation' and sustainable technology solutions to fight climate change during COP27. Seemingly, the metaverse can reduce the environmental impacts of travel. CGI is also to 'share' the investments it is making to address environmental impacts, notably its Sustainability Exploration Environmental Data Science (SEEDS) UN-backed R&D program.
In its annual review of corporate reporting, the UK Financial Reporting Council found that companies’ net zero and carbon neutrality reporting is too often ‘aspirational and high level’ and ‘fails to provide users with sufficient information’. Investors are calling for better information in financial statements, including connecting net zero targets to relevant disclosures. The FRC’s report is designed to be of use to reporting teams as they prepare disclosures on net zero and other GHG emission reduction commitments.
The New York headquartered International Federation of Accountants has released the recordings of its recent Climate Week event. IFAC is working with 'Accounting for Sustainability' (A4S) and others to ensure that the accounting profession can advise businesses and play a key part in decarbonization. In a panel session, VP and Controller Shamsul Bahar set out Petronas' ambitious goals for net zero by 2050 and its target of 30-40 gigawatts of renewable energy by 2030. Petronas is 'moving away from manual data collection processes, using the skills of the finance team to enhance data quality'.
ISO has released its 'Net Zero Guidelines', a 48-page document that provides guiding principles and recommendations to enable a common, global approach to achieving net zero greenhouse gas emissions through alignment of voluntary initiatives and adoption of standards, policies and national and international regulation. ISO stresses the voluntary nature of the standards. The document has 161 references to what 'should' be done, but only one to what 'shall' be done. The latter refers to the fact that 'ISO shall not be held responsible … etc.' The Net Zero Guidelines are free, but to access the accompanying ISO 14091 standard you will have to pay 158 Swiss francs. We intimated to ISO that making the standard free would hasten take-up. They were having none of it.
Accenture reports on a ‘big step forward’ in methane mitigation with the recently passed US Inflation Reduction Act (IRA), which is seen as a way to address climate change. The legislation levies charges against the largest emitters, ‘placing accountability for methane action (and inaction) at the top of c-suites’ agendas’. Total cost to industry would exceed $2.5 billion by 2024 without action on methane reduction. Corporate focus will shift 180°, from lost sales, to limiting the new tax penalties. On the plus side, the IRA has set aside a ‘sorely needed’ $1 billion for improved mitigation and reporting of methane.
Expro has received some funding for two carbon-reduction projects from the Aberdeen-based Net Zero Technology Centre. The NZTC has awarded a total of £8 million to fund net zero technologies as part of its 2022 Open Innovation Program. The award will fund Expro R&D into real-time flare emissions measurement and control and a novel approach to well testing without flaring.
MiQ and TruMarx Data Partners have announced the CG Hub, a platform for trading 'certified' natural gas that intends to incentivize reduction in methane emissions. The CG Hub runs on TDP's 'Comet' cloud-based energy trading platform. Certification is provided by 'non-profit' MiQ, 'the fastest growing and most trusted methane emissions certification standard'. Certification is assessed against a 'credible and transparent' standard, leveraging 'data-led' emissions measurement. MiQ was established by RMI (formerly the Rocky Mountain Institute) and global sustainability consultancy SystemIQ to facilitate a 'rapid reduction in methane emissions from the oil and gas sector'.
KeyLogic unit OnLocation is upgrading the National Energy Modeling System (NEMS) it originally developed for the Department of Energy. NEMS informed the 2021 US Dept. of State's report on Pathways to Net-Zero Greenhouse Gas Emissions by 2050 and is also used routinely by the Energy Information Administration for its Annual Energy Outlook. OnLocation is now enhancing NEMS to include Direct Air Capture (DAC), combined bioenergy and coal retrofit technology with carbon capture, and other enhancements. A new Hydrogen Market Module will provide granular projections of hydrogen production technologies, costs, volumes, transportation and storage and end use. These enhancements, being developed for the DOE, are important because of the expectation that technologies and fuels such as DAC and hydrogen will play a central role in decarbonization. More from OnLocation.
The intelligent Pipeline Integrity Program (iPIPE) has re-contracted with Orbital Sidekick to provide data from its GHOSt global hyperspectral observation satellite constellation, set to launch in 2023. The hyperspectral imaging (HSI) constellation consists of six satellites to be launched on SpaceX's Transporter program. The iPIPE consortium targets the prevention and detection of pipeline leaks and is housed at the University of North Dakota Energy & Environmental Research Center.
Validere reports that its Carbon Hub emissions measurement and reconciliation software has been deployed by both TRP Energy, a private E&P operator in the Midland Basin and PureWest, an independent natural gas company based in Wyoming’s Green River Basin.
Climate tech firm Scepter is to launch methane-detection balloons in Texas' Permian basin. The balloons' payload includes hyperspectral sensors operating in short-wave infrared that can detect 'moderate sized' leaks down to 50 kg of methane per hour. The first balloon will go up in January 2023 to target ExxonMobil-operated areas of interest. A second balloon will launch in April to survey oil fields operated by Pioneer, Chevron, Occidental and others. Data will be analyzed by Atmospheric and Environmental Research, a Verisk unit. The stratospheric balloons are a prelude to Scepter's 'ultimate goal' of methane monitoring from low-Earth orbit satellites. The balloons are provided by Tucson-based World View.
CF Industries, ExxonMobil and EnLink Midstream are collaborating on a CCS project that sets out to capture some 2 million tonnes of CO2 from a CFI dehydration and compression unit in Donaldsonville, Louisiana. This will be transported through the EnLink pipeline network for sequestration at an ExxonMobil facility in Vermilion Parish, LA.
The Global Status of CCS 2022 report from the Global CCS Institute found that while the number of CCS plants in operation is virtually unchanged since 2021, there has been considerable progress in the 'advanced development' category. The total capacity of CCS projects in development was 244 million tonnes per annum (Mtpa) of carbon dioxide (CO2), an increase of 44% over the past 12 months. The 70-page document lists current and planned CCS projects, which appear to be reaching a new high following a 2016-18 low point. Worldwide CO2 emissions are currently put at around 36 billion tonnes/year, so the current 'capacity' (244 Mtpa against some 36,000 Mtpa emitted) is under 1% of global emissions. But it's a start!
The UK-based International Oil & Gas Producers association has just published IOGP Report 652 on recommended practices for measurement, monitoring, and verification plans associated with geologic storage of carbon dioxide. The report provides a guide for developing a measurement, monitoring, and verification plan for geologic storage of carbon dioxide and compares CCS technologies to monitor storage performance and assess site-specific risks. The report is a free download from IOGP.
The Greater Houston Partnership has announced HETI, the Houston Energy Transition Initiative, dedicated to strengthening Houston’s leadership as the energy capital of the world. A report, co-authored with McKinsey explains how Houston can be the funding leader for the energy transition, leveraging the new US Inflation Reduction Act.
Oslo-based Resoptima has partnered with Aker BP, Sval Energi, and the Norwegian NORCE R&D establishment to reduce CO2 emissions from oil and gas production. The joint project has the support of the Research Council of Norway. The project will deploy Resoptima's reservoir modelling and reservoir management technologies to enhance oil production while minimizing CO2 emissions. Optimization will span well placement, reservoir drainage and water injection. When validated by all parties, the software will be added to the commercial portfolio of Resoptima for licensing on a global basis. More from Resoptima.
The sixth edition of the Future Investment Initiative (FII) in Dhahran, Saudi Arabia saw the ‘largest-ever’ carbon credit sale and a conversation between FII Institute CEO Richard Attias and Trian Fund Management’s Nelson Peltz on ‘ensuring long-term success throughout the ongoing global changes’. The mega carbon sale was announced by PIF, the Saudi Public Investment Fund and Tadawul Group in the context of the ‘transition to a carbon neutral future’, said to be ‘integral to Saudi Arabia’s efforts to achieve net-zero goals by 2060’.
Vysus Group, in partnership with Siccar, a blockchain-based enterprise data sharing platform, has announced the Energy Transition Databox. The ETD is a 'complete emissions management solution' to assist oil and gas operators during the energy transition.
NASA is cancelling its GeoCarb mission and is to explore other options to measure and observe greenhouse gases. The Geostationary Carbon Observatory was to study the global carbon cycle by mapping concentrations of key carbon gases from a satellite in geostationary orbit. GeoCarb was canned because of technical concerns and escalating cost, up from a budgeted $171 million to a current estimate of over $600 million.
A letter to the Financial Times penned by Justin Steinberg (Steinberg Asset Management) nuances some recently reported factoids on Walmart’s supply chain emissions. Steinberg refers to the ‘increasingly treacherous waters’ of ESG investing, observing that Walmart’s celebrated sustainability goal, ‘Project Gigaton’ does not in fact mean that Walmart is to reduce its supply chain emissions by a gigatonne by 2030. Gigaton’s goal is ‘aspirational’ and includes ‘avoided’ emissions. This is a ‘complicated concept with no accepted definition’. A company could in theory continue to increase emissions while counting theoretical business-as-usual emissions as a reduction. ‘Whether or not this is occurring is impossible to know because the underlying data are impenetrable’. Moreover, Walmart acknowledged that calculating supply chain emissions … is an unreliable exercise that ‘involves estimations on top of assumptions that are repeatedly layered to arrive at a falsely precise number’.
ABL Group has acquired the operations of well control equipment specialists Hose International.
Cathedral Energy Services has acquired the operating assets and personnel of Ensign’s Canadian directional drilling business in a CAN$5.0 million deal financed with a share issuance by Cathedral to Ensign.
Shareholders of North American MWD specialist Gordon Technologies have each sold 25% of their equity to Alpha Dhabi Holding in an all-cash deal.
Hexagon has acquired AI-powered BIM (building information modelling) specialist Avvir, whose design-vs-reality analysis platform is to be integrated with Hexagon's BIM and virtual design and construction solutions.
Shell and ExxonMobil chemicals joint venture Infineum is acquiring Entegris' pipeline and industrial materials business.
Mesquite Technologies has acquired artificial lift and production optimization software company OspreyData. The software will integrate with Mesquite's 'Taproot' line of AI-powered control equipment.
ProFrac has acquired US Well Services in a stock-for-stock merger transaction valuing the deal at approximately $270 million.
In a recent virtual investor call, troubled Chinese oil service company Recon Technology informed shareholders that almost all of its revenue comes from its Chinese operations. Its Future Gas Station app currently generates revenues of $1-1.5 million per year. Recon's business has been negatively affected by China's national policies 'that discourage the development of platform companies'. The company also announced that it has regained compliance with Nasdaq's continued listing requirements and that it has a $47 million cash position.
Oil and gas automation specialist Sensia, a Rockwell Automation and Schlumberger joint venture, has acquired Swinton Technology, a provider of oil and gas measurement and metering supervisory systems.
Copenhagen-based ZeroNorth has acquired Prosmar Bunkering AS, an online platform supporting the bunker fuel market. Prosmar’s Bunker Dashboard solution and Bunker Pricer module are included in the deal. Prosmar Risk and Prosmar Price Matrix are not. Earlier this year ZeroNorth raised $50 million in capital.
mCloud has received a written notification from the Nasdaq indicating that its market value over the past 30 days was below the required minimum of $35 million for continued listing. The company has 180 days to rectify the situation.
There were a reported 120 participants in the first post-covid face-to-face (F2F) meeting, hosted by Equinor in Stavanger, of Cfihos, the capital facilities information handover specification. Cfihos is currently managed by the UK-based IOGP as Joint Industry Project 36. At its inception, in 2012, the early hopes for Cfihos were for a ‘lightweight mapping [of engineering information] with a realistic amount of work’. Earlier attempts at standardizing plant information (ISO 15926 and Fiatech/Jord) were not widely implemented. Ten years on, is Cfihos succeeding where these initiatives failed?
Participation in the 2022 F2F suggests renewed interest in Cfihos. This appears to be coalescing around the deliverables to date, which comprise a suite of interlocking Excel spreadsheet definitions. A parallel deliverable is the Cfihos relational model (currently at V1.5), described in an 84 page PowerPoint presentation. This currently represents a 'logical' model; physical implementation to a database is being outsourced. Based on the little information released from the F2F, it would appear that a 'lightweight' model remains elusive.
Cfihos was also highlighted in a joint presentation at the Aveva World conference by Erin Jones (ExxonMobil) and Peter Townson (IOGP JIP36/Cfihos). The presentation, 'A practical implementation of Cfihos to meet data surveillance needs in the energy industry', addressed 'inconsistent data across the suite of design tools applied by your engineering-procurement-construction (EPC) firm'. Cfihos was reported as creating consistency and interoperability in ExxonMobil's equipment data. ExxonMobil is playing catch up here, 'our competitors have been leveraging a data centric approach for years'. Shell initiated the Cfihos initiative back in 2012, donating its in-house work to USPI-NL. For ExxonMobil, Cfihos's main contribution stems from its common language that 'elevates' existing information in EPC systems from traditional PDF documents to 'true data', captured in ExxonMobil's 'RED' (repository for engineering data) application, an implementation of Aveva AIM & ISM*. RED 'enables data centric engineering surveillance, improve data handover to operations, and build an engineering data foundation to allow for global asset analytics'.
In a short email exchange, Jones added, 'The relationship between Cfihos and RED has allowed us to solve the business problem we were experiencing. We were getting data from our contractors but it was not standardized in a meaningful way so it was difficult for us to perform any kind of surveillance other than with lots of human intervention. By implementing an industry standard which specified the data that we wanted, we were able to implement a more automated approach to surveillance through AIM & ISM. I suppose we could have established a bespoke ExxonMobil standard to accomplish similar consistency but we're trying hard to leverage pre-existing standards and minimize creating our own answers where there is industry consensus on things like a data standard'.
* Asset Information Management and Aveva Information Standards Manager that uses ‘industry-standard templates such as CFIHOS’.
Cfihos is planning a version 2.0 for Q3 2023. More (but not much more) from the JIP 36 home page.
Melinda Hodkiewicz (BHP Fellow, Engineering for Remote Operations) and her colleagues at the University of Western Australia's Natural & Technical Language Processing Group have just published a paper* on the use of knowledge graph technology to build and explore large maintenance data sets.
The authors have been studying how short text fields on maintenance work orders (MWO) can be extracted and combined with structured data into a knowledge base. This can be used to find information in historic asset data for failure modes and effects analysis, maintenance strategy validation and process improvement. A failure mode taxonomy was derived from the ISO 14224 standard.
The researchers have developed two open source tools to facilitate the work: MWO2KG uses deep learning supported by annotated training data to automatically build a knowledge graph, and Echidna, an intuitive query-enabling interface, visualizes the historic asset data in the graph database. A demonstration of Echidna is available online and the source code for both tools is on GitHub.
The research is said to leverage the Semantic Web's resource description framework (RDF), which has been used to build a triple store of maintenance data. Triple stores are particularly amenable to visualization in a graph, an approach which is said to be 'increasingly used' in the engineering sector (the 'Bosch Materials Science Knowledge Base'** was cited).
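For readers unfamiliar with triple stores, the following minimal sketch shows how a couple of maintenance 'triples' might be built with the open source rdflib Python library. The namespace, predicate names and work order content are illustrative assumptions, not taken from the paper.

```python
# Minimal RDF triple-store sketch with rdflib; the namespace, predicates and
# work order content are illustrative only, not those used by the UWA group.
from rdflib import Graph, Literal, Namespace, URIRef

MNT = Namespace("http://example.org/maintenance#")  # hypothetical namespace

g = Graph()
wo = URIRef(MNT["workorder/12345"])                 # a single maintenance work order
g.add((wo, MNT.rawText, Literal("replace leaking pump seal")))
g.add((wo, MNT.mentionsItem, MNT["pump_seal"]))     # item extracted from the text
g.add((wo, MNT.observedState, MNT["leaking"]))      # failure mode / observed state

print(g.serialize(format="turtle"))                 # inspect the triples as Turtle
```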
A supervised deep learning model driven by (expert) annotated data was used to extract named entities. Here the authors acknowledged that access to maintenance work order data is 'problematic', being considered 'commercial-in-confidence'. Also, while it is desirable to have multiple annotators, 'getting them to agree is a challenge'. Named entity recognition was performed with the open source Flair library. At the end of the pipeline, a Neo4j graph database was used for further data exploration. The researchers are now working on functionality that will enable data owners to upload their own datasets for analysis.
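To give a flavor of such a pipeline, here is a minimal sketch combining Flair and the Neo4j Python driver. It is not the MWO2KG code: the off-the-shelf 'ner' tagger shown here recognizes generic entities, whereas the researchers trained a custom model on annotated maintenance text, and the graph schema and connection details are assumptions for illustration.

```python
# Sketch of an NER-to-graph pipeline in the spirit of MWO2KG. The pretrained
# 'ner' tagger stands in for the custom maintenance model; the Neo4j schema
# (WorkOrder/Entity nodes, MENTIONS relationships) is an illustrative assumption.
from flair.data import Sentence
from flair.models import SequenceTagger
from neo4j import GraphDatabase

tagger = SequenceTagger.load("ner")  # pretrained Flair tagger (generic entity types)

def extract_entities(text: str):
    """Run NER over one work order text and return (text, label) pairs."""
    sentence = Sentence(text)
    tagger.predict(sentence)
    return [(span.text, span.get_label("ner").value) for span in sentence.get_spans("ner")]

def load_into_graph(work_order_id: str, text: str):
    """Write the extracted entities to Neo4j as a simple work-order graph."""
    driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))
    with driver.session() as session:
        for name, label in extract_entities(text):
            session.run(
                "MERGE (w:WorkOrder {id: $wo}) "
                "MERGE (e:Entity {name: $name, type: $label}) "
                "MERGE (w)-[:MENTIONS]->(e)",
                wo=work_order_id, name=name, label=label,
            )
    driver.close()

load_into_graph("MWO-12345", "replace leaking pump seal on compressor C-101")
```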
* Constructing and exploring knowledge graphs from maintenance data in Proceedings of the Institution of Mechanical Engineers, Part O: Journal of Risk and Reliability.
** The Bosch Materials Science KB appears to be something of an Arlésienne. We found nothing on the Bosch website but there are many references in the semantic web community to research in progress on the BMSKB such as this one.
Back in 2008, the then Public Petroleum Data Model Association (PPDM) changed its name to become the Professional Petroleum Data Management Association, reflecting its expanding scope from data modeling to include best practices, certification and training. PPDM is now changing its name again, to become simply the 'PPDM Association' with a tagline of 'the global energy data professionals'. The latest change reflects a shift from just 'petroleum' to embrace 'new energies and more environmental responsibility'. 'Energy' and 'data professional' are said to 'more fully reflect the value and service we provide'.
In a similar vein, the oil and gas service behemoth that used to be called Schlumberger is changing its name to just 'SLB'. We have two problems with the new moniker. First, how will it be pronounced? 'Ess, ell, bee' as the SLB marketing folks would no doubt like, or 'Slub' as it might trip off one's tongue? Like PPDM, SLB's name change reflects a desire to downplay its oil and gas heritage in favor of its 'vision for a decarbonized energy future'. The other downside for SLB is that its founders, the Schlumberger brothers, will be turning in their graves.