Macondo and the HMI

Chevron’s David Payne argues that the human-machine data interface is failing drillers as real time data disappears down a ‘data exhaust.’ New workflows and open data systems are needed.

In a well-received keynote to the Society of Petroleum Engineers’ Digital Energy Conference this month (full report next month), David Payne, VP of Drilling and Completions at Chevron, observed that a year had passed since the Deepwater Horizon/Macondo blowout. Since then there have been ‘conversations’ around well design, blowout preventer design and so on. But there has been little discussion of human factors.

Payne argues that a major factor in Macondo was that the human interface to data failed the people on the rig. The human brain can only handle 6-8 data points at a time and has a limited capacity to manage data. On Macondo, the data was there but operators were not capable of handling and interpreting it.

A modern deepwater rig produces masses of data: ‘We used to have a little Geolograph and there was a lot of manual data entry; now we have measurement and logging while drilling (MWD/LWD) data coming in in near real time.’

A deepwater Gulf of Mexico rig has thousands of sensors. One rig today can collect more data in a year than Chevron D&C has collected in the whole of its history. How is this handled? It isn’t! There is a big ‘data exhaust’ on the back of the rig, and the data is gone! It scares folks how much it would cost to keep and manage.

What are we doing about all this? We have proliferating drilling decision support centers (DSC). Chevron’s DSCs provide a lot of help in geosteering, but they don’t really change the way we do business.

They do, however, offer a huge opportunity for analytics, doing stuff we have not thought of yet. We would like to be able to compare real time data with a global database—the laws of physics are the same worldwide! We need to connect earth science with drilling data, so that we know, for instance, when to change drilling parameters and when to pull out of hole. We also need to use IT and data mining to find out why connection times are too long, linking multiple data streams and databases.
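The connection-time data mining Payne describes can be sketched in miniature: link a rig’s per-connection durations with contextual data from a second source, then flag connections that sit far outside a global baseline. Everything here—the stream names, fields, sample values and the three-sigma threshold—is an illustrative assumption, not a description of Chevron’s actual systems.

```python
# Toy sketch of cross-stream data mining on connection times.
# All data and field names below are hypothetical illustrations.
from statistics import mean, stdev

# Per-connection durations (minutes) from the rig's real-time feed.
rig_connections = {
    "C1": 4.2, "C2": 5.1, "C3": 14.8, "C4": 4.7, "C5": 5.0,
}
# Contextual data on the same connections from a second database.
rig_context = {
    "C1": "day shift", "C2": "day shift", "C3": "night shift",
    "C4": "night shift", "C5": "day shift",
}

# Baseline durations from an (assumed) global offset-well database --
# the 'laws of physics are the same worldwide' comparison set.
global_durations = [4.5, 5.2, 4.8, 5.5, 4.9, 5.3, 4.6, 5.0]
mu, sigma = mean(global_durations), stdev(global_durations)

# Link the two streams by connection id and flag outliers
# more than three standard deviations from the global mean.
outliers = {
    cid: (dur, rig_context[cid])
    for cid, dur in rig_connections.items()
    if abs(dur - mu) > 3 * sigma
}
print(outliers)  # → {'C3': (14.8, 'night shift')}
```

In practice the joined context (crew, rig state, hole section, weather) is what turns a flagged outlier into an explanation of *why* the connection took too long.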

There is also a demand for open systems. The very idea of proprietary data is unacceptable. We need to set expectations around industry standards. Software standards have had limited success to date. WITSML is good but is just the very beginning. We need to change D&C workflows and to move beyond traditional process.

Initially it was suggested that information from the DSC should go through the usual ‘chain of command.’ This is ridiculous! We need to change the paradigm, leveraging video links to the rig and ‘face-to-face’ contact from DSC to rig. We will see dramatic IT-driven change. IT represents the single biggest opportunity for improvement in D&C.


© Oil IT Journal - all rights reserved.