I’m not usually short of ideas for editorials; this is my 258th, or thereabouts. I had a few of my usual rants about this and that in the pipeline before I realized that the best editorial fodder I could dream up comes from this action-packed edition of Oil IT Journal. There is just so much going on in oil and gas IT. Where to begin?
OSDU, the open subsurface data universe, has just released ‘Mercury’, its first ‘operational’ edition. That is great, but what is spectacular is the amount of attention OSDU has gotten, with plethoric announcements from software houses teaming up with the cloud providers on various OSDU-based platforms. We have 78 mentions of OSDU in this issue.
A bird’s-eye view of what has happened is rather perplexing. Not long ago, a suggestion that the whole industry would ‘collaborate’ around Schlumberger’s Delfi data infrastructure would have met with some skepticism. One might have imagined that Halliburton, for instance, would push back. But no, Halliburton is offering DecisionSpace 365 ‘on OSDU’ and a ‘publicly available’ OSDU reference implementation in its Open Earth Community. Almost all the vendors we have come across have backed OSDU with unbridled enthusiasm. A feeding frenzy and a software supercycle? With regards to Schlumberger’s generous donation, it is more ‘don’t look a gift horse in the mouth’ than ‘beware the Greeks … ’!
One facet of OSDU which may or may not have come to the attention of the holders of the purse strings is the requirement for a massive data migration effort into the new cloud environment. Data under consideration includes … ‘ProSource, R5000/OpenWorks, OSIsoftPI, IHS and more’. This is conceivable for software in the geoscience area. But a mass migration of data in OSIsoft (now Aveva) and/or IHS Markit is a bit harder to imagine. The size of the data migration task is also difficult to imagine in the context of a cash-strapped industry.
OSDU, like NASA’s Martian helicopter, has kicked up some dust on planet subsurface. Notably, chez the Society of Petroleum Engineers, whose planned workshop on open source software is, at the time of writing, an OSDU-free zone. Shell’s own recently-announced ‘OpenAI’ collaboration with Baker Hughes/C3 is likewise OSDU-free. Another significant open source movement in geosciences, the Software Underground, is currently holding its annual Transform event. We understand that there has been a lot of discussion of OSDU, but the outcome is, as far as we can tell, that the SU will proceed sans OSDU. The two bodies have very different approaches to ‘open’: SU’s freewheeling hackathons versus OSDU’s members-first approach.
We also report from the other The Open Group/Shell-backed open initiative, the Open Footprint Forum. This has now teamed with the venerable PIDX oil country e-business standards body. At the TOG-hosted event, we learned that the OFF APIs will open up emissions data, but that companies can deploy their own implementations and can ‘hide their own data from competitors’. This may be problematic in the context of transparent open data.
It’s interesting to reflect on the two memes of ‘open data’ and ‘open standards’. Objectively, IT standards are orthogonal to actual reporting, although the two are often conflated. What’s key in reporting is a genuine need (due to regulations) or desire (due to self-motivated transparency) to report! It does not matter a jot whether this is XBRL, XML, JSON or what. A corporate logic that proceeds from a perceived obligation (as opposed to a wish) to report, through a few committees and on to an ‘IT standard’, is really just kicking the can down the road.
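The point that the wire format is secondary to the obligation to report can be made concrete with a short sketch. This is purely illustrative; the field names and values are invented for the example, not taken from any actual OFF, XBRL or regulatory schema:

```python
import json
import xml.etree.ElementTree as ET

# A hypothetical emissions record (field names are invented for
# illustration, not drawn from any real reporting standard).
record = {"facility": "Plant-42", "period": "2021-Q1", "co2_tonnes": "1234.5"}

# The same datum serialized as JSON ...
as_json = json.dumps(record)

# ... and as XML: different syntax, identical information content.
root = ET.Element("emissions")
for key, value in record.items():
    ET.SubElement(root, key).text = value
as_xml = ET.tostring(root, encoding="unicode")

# Round-trip both back to the same dict: nothing is gained or lost
# by the choice of format. What matters is that the data is reported.
from_json = json.loads(as_json)
from_xml = {child.tag: child.text for child in ET.fromstring(as_xml)}
print(from_json == from_xml)  # prints True
```

Whichever syntax a committee settles on, the round trip above recovers the same record; the hard part is the willingness to publish it in the first place.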
But the Great Conflation, as admirably exposed at last year’s EAGE, is that of just about any IT development with ‘green’. At a recent OSDU event, Reuters’ John Nixon doggedly questioned presenters with ‘and how is OSDU going to help with the energy transition?’ Answer was there none. Elsewhere, quantum computing is conflated with GHG mitigation. Heck, if the world is to wait for QC to mature enough to ‘solve’ global warming we are really in trouble. But the conflation is enough to allow QC aficionados to qualify for taxpayers’ ‘green’ money.
No doubt financial considerations are behind Total’s decision to dispose of its Alternative Subsurface Data facility, a test center in Pau, France for rock physics and logging tool calibration. Meanwhile, Total continues with its considerable investment in data science. Perhaps the connection is just a figment of my imagination. But it reminds me of the pride (?) the seismic contractors took in going ‘asset light’ a couple of years ago. Good for the balance sheet. But ‘data science’ needs real data and, as it is now emerging, a helping hand from ‘real science’.
As folks in the OSDU constellation puzzle over what Schlumberger has to gain from opening up its data infrastructure, I draw your attention to a statement made by Schlumberger’s Steve Freeman at the 2020 EAGE: ‘If you need a head of IT then the service companies have failed you’. This does not need too much unpicking. Schlumberger wants to cut out the corporate IT middlemen. Software-as-a-service will be served to end geoscience users, armed with their Python notebooks for some added data futzing. SaaS all sounds very 1999/dot-com boomish, which got me rereading some back editions of Oil IT Journal, always a worthwhile exercise, though I say it myself. I found this evidence of ‘prior art’ from Schlumberger in our February 1997 issue …
As they say over here, ‘plus ça change, plus c’est la même chose’: the more things change, the more they stay the same.
© Oil IT Journal - all rights reserved.