At any point in time, the world, and what we perceive as ‘knowledge’ or ‘science,’ contains a mixture of stuff that later turns out to be right and stuff that turns out to be bunkum*. Back in the day we had phlogiston, the luminiferous ether and Mars’ canals. All bunkum! Today we hear news reports of many medical remedies and beliefs that turn out to be bunkum.
I have just got back from the excellent PNEC data integration conference (report on page 6) and it might seem churlish to argue that much that is said on the subject of data management tends towards the bunkum side of the equation. But that is what I propose to do.
For me, PNEC had two enlightening moments. During the panel session on future trends and best practices, a speaker from the floor who had returned to the geodata world after a stint in accounts said he was ‘appalled at the lack of automation in geodata.’ In finance, the expectation is that data comes into the system and flows through untouched by human hand. This interesting observation got short shrift from the panel, along the lines of ‘oil and gas data is different,’ and the debate went on to discuss what a well was and other ‘best practices.’
The other enlightening moment came in Karl Fleischmann’s answer to my own query (sometimes enlightenment needs a bit of priming) as to whether Shell’s analytical approach to data management might pave the way to automating parts of the workflow. Fleischmann, who was presenting on what technical data management has to learn from modern manufacturing, came back with something along the lines of ‘yes indeed.’
The problem today is that far too much of technical data management is considered to be a ‘workflow,’ i.e. a ‘people thing,’ when it should be a process, i.e. automated. Data management is at the stage where financial services were many years ago, when a multitude of individuals were involved in a transaction, filling out forms, stamping and signing and passing them over to the next person. Today’s forms may be digital and the messages may be files or emails, but the result is the same.
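To make the workflow-versus-process distinction concrete, here is a minimal Python sketch of what ‘straight-through’ processing might look like for incoming well header records. Everything here is a hypothetical illustration, not any real oil and gas data model: the field names, the validation rules and the fourteen-digit identifier convention are all assumptions made for the example. The point is the routing: clean records flow to the database untouched, and failures land in an exceptions queue rather than in somebody’s inbox.

```python
# A minimal sketch of straight-through processing: validate -> load,
# with no human touch on the happy path. All field names and rules
# below are hypothetical, chosen only to illustrate the idea.

from dataclasses import dataclass

@dataclass
class WellHeader:
    uwi: str      # unique well identifier (14-digit convention assumed)
    lat: float    # latitude, decimal degrees
    lon: float    # longitude, decimal degrees
    datum: str    # coordinate reference system label

def validate(rec: WellHeader) -> list[str]:
    """Return a list of rule violations; an empty list means the record passes."""
    errors = []
    if len(rec.uwi) != 14 or not rec.uwi.isdigit():
        errors.append("UWI must be 14 digits (assumed convention)")
    if not -90.0 <= rec.lat <= 90.0:
        errors.append("latitude out of range")
    if not -180.0 <= rec.lon <= 180.0:
        errors.append("longitude out of range")
    if rec.datum not in {"WGS84", "NAD27", "NAD83"}:
        errors.append(f"unknown datum {rec.datum!r}")
    return errors

def process(records, load, quarantine):
    """Route each record automatically: clean data flows straight to the
    database; failures go to an exceptions queue so the *rules* or the
    upstream source get fixed, not re-keyed by hand."""
    for rec in records:
        errors = validate(rec)
        if errors:
            quarantine(rec, errors)
        else:
            load(rec)

if __name__ == "__main__":
    incoming = [
        WellHeader("05123456780000", 39.7, -104.9, "NAD27"),
        WellHeader("BAD-ID", 95.0, -104.9, "WGS84"),
    ]
    process(incoming,
            load=lambda r: print("loaded", r.uwi),
            quarantine=lambda r, e: print("quarantined", r.uwi, e))
```

The design point, as in finance, is that exceptions drive changes to the rules or to the data source, rather than spawning a manual form-filling step.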
Why are things this way? The computer business has a lot to answer for here, in the way it has to reinvent itself all the time. Again, people are center stage and we are encouraged to ‘collaborate,’ BYOD** or network socially. Such a free-for-all makes a straightforward process approach hard to realize. Putting the individual at the heart of the workflow is good for the IT business, but it means that much of what we have come to know and like is an obstacle to the automation of data flows.
Matthias Hartung’s presentation on Shell’s attempt to provide ‘trusted data at your fingertips’ was a good summary of the state of the art. Hartung called for the harmonization of applications and architectures, for more and better standards from the SEG, PPDM and Energistics, and saw hope in the new Standards Leadership Council.
But wait a minute. Calls for standards and application harmonization: that reminds me of something. Well, it reminds me of the whole 17 years of PNEC and Oil IT Journal. If you don’t believe me, read my very first 1996 editorial from what was then Petroleum Data Manager, a call for more and better standards and interoperability. As they say, ‘plus ça change, plus c’est la même chose’ (the more things change, the more they stay the same).
Hartung advocated the professionalization of data management, turning it from ‘Cinderella to enduring beauty.’ The call for professionalization was the big thing to emerge from this year’s PNEC. PPDM, the UK’s Common Data Access (CDA) and the emerging Saudi Aramco-backed DMBoard initiative are all pushing for professionalization. For PPDM and CDA the plan is for a data management ‘society’ along the lines of the SPE or the SEG. There are a few problems with this.
First of all, petroleum engineering and geophysics have a hundred years or so of academic and practical experience to build on; data management has twenty years of experience and little to show for it. Academic bunkum!
Secondly, the ‘call for standards’ has been heard for so long that it is getting a bit rusty. What exactly does it mean when a multi-billion-dollar major or national oil company makes a cry for help to an underfunded standards body? I submit that it means that the problem at hand is in reality less than mission critical. Standards bunkum!
Thirdly, consider the bigger picture, that of the business. It is a truth universally acknowledged, if rarely acted on, that business is hampered by the lack of communication between its traditional silos: geosciences, drilling engineering, production and so on. The only major change to this picture in the 17 years of this publication is the emergence of yet another silo, the IT department, another disconnect in the way we do business. Silo bunkum!
Finally, we have the mantra that a data project is all about people and not about technology. This effectively precludes any serious attempt to treat the problem as one of automation, which is all about technology. People bunkum!
The mistake of sanctifying data management as a profession is that it perpetuates the data problem with yet another silo. Instead of one hard-to-negotiate gap between IT and the business, there will now be three hard-to-negotiate gaps: between the business and IT, between the business and data management, and between IT and data management. And inside this new silo, what do we find? Useful knowledge of data flows and applications, just what is needed to support an automation effort.
But all this lovely know-how is in the wrong place! It would be put to much better use if the business and IT could get together to eliminate vast swaths of data management completely, just as finance has managed to do.
* Nonsense; the word has an amusing etymology, deriving from a notoriously pointless congressional speech made on behalf of Buncombe County, North Carolina.
** Bring your own device, i.e. an iPhone.