Oil IT Journal Interview—Lars Olav Grøvik, StatoilHydro

StatoilHydro’s information architect tells of a constantly rising bar in data management, of creative chaos in the workplace and of ‘sustainable’ data management on the digital oilfield.

This interview was sparked off by the May 2008 ‘Tale of two tradeshows’ editorial*.

Yes. I feel that something is missing from data management conferences. We discuss a lot of practical day-to-day problems, and maybe even look ahead a little, but we never really seem to lift the horizon to a few years out, to the level implied by ‘digital oilfield’ deployments. I am interested in having a cross-industry dialogue on the medium-term future of data and information management. We need a roadmap to get from how data is managed today to how it will be managed in the ‘razzmatazz’ world of the digital oilfield, as is on show at the Intelligent/Digital Energy events.

What’s wrong with the status quo?

Today, the reality is that management is screaming, ‘why isn’t IM working despite the millions invested?’ The short answer is that management has ‘raised the bar.’ We are now dealing with more complex data, in larger data sets and with shorter cycle times than before, all in the face of staff reductions in the library and elsewhere—and we are still in business! Because we are constantly raising the bar we will never get rid of quick fixes like spreadsheets and PowerPoints. We will always have clever engineers and geologists creating value. If we deprive users of the tools they like, we damage creativity. The real question is how to avoid spreadsheets creating chaos—how to capture good new ideas into the corporate workflow, moving the chaos and creativity to a new level. The problem is that almost all data management solutions are ‘static.’ What we need is an approach that accommodates this constant raising of the bar. What happens, for instance, when a new data set arrives and needs to be ‘managed’? How do you handle the legacy data?

What’s the answer?

That’s a good question!

How does the Intelligent/Digital Energy paradigm actually work then?

Today’s Digital Energy implementations work because they leverage a highly customized infrastructure. But before you even get to this situation, you may need to spend hours cleansing the data for a single asset. You should ask the IE/DE protagonists how long they spent getting data from the asset to the corporate database. This usually involves blending data from both clean and unclean sources, so you either exclude the latter, clean the data, or reduce the scope!

Are these fixes done for the lifetime of the asset or for the demo?

They can be both! Some really are examples of how StatoilHydro and other operators really work—examples of ‘sustainable’ data management. But for some of these, you need to check how many support people are involved. For some flagship NOC digital oilfield deployments, the support requirements are pretty amazing—more than would be possible for a company like StatoilHydro.

But if that’s what it takes to ‘digitize’ a major asset why not bring in a significant IT staff?

One reason is that a lot of bulk data is stored near the end user, so you can’t really outsource to cheap IT providers in India or elsewhere, although there are some interesting attempts to develop a ‘global data store’ concept. There is no doubt that we need to go beyond the ‘pain barrier,’ but it’s hard to achieve this while still doing business. It’s like trying to change the wheels on a moving vehicle!

What about production data—that’s gaining traction at ECIM?

This is a very active field that is in the same situation as geoscience was a decade ago. But the direct connection between production systems and reporting makes data quality a sensitive issue. Discrepancies between reports and field data cannot be brushed under the carpet.

It’s interesting though that business systems like SAP scored highly in quality**?

This could be a case of garbage in, garbage out I’m afraid!

Semantics have got a lot of traction in Norway—what’s your take?

There is a lot of new technology coming up and there is value here. Semantics, like other new technologies, is following a fairly classic path: at first, a few ‘believers’ actively promote the technology, which appears to provide all the answers and explain everything (like sequence stratigraphy!). Then the pendulum swings. The semantic pendulum is still swinging! Some think they have seen the light and are very positive, some remain skeptical. Semantics will likely find its place—and this may come sooner than we think. It depends a lot on data providers. If service companies and authorities take up the technology, it may come faster.

But semantics is not a done deal in the context of integrated operations as it sometimes seems?

No. We are looking at technology choices.

What is happening in real time data and automation?

I should be more involved in real time data than I actually am. We have been too busy with the mergers. Some semi-automated processes have been developed using Perl scripts to move real time data around. These tools may or may not be maintained. This has led to a kind of loose automation.
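For illustration, here is a minimal sketch of the kind of script-driven ‘loose automation’ described above, rendered in Python rather than Perl. The folder layout, tag names and CSV columns are invented assumptions for the example, not a description of StatoilHydro’s actual scripts.

```python
# Illustrative sketch only: a semi-automated 'data mover' of the kind described
# above, written in Python rather than Perl. Paths and column names are hypothetical.
import csv
import shutil
import time
from pathlib import Path

INCOMING = Path("/data/rt/incoming")            # hypothetical drop folder for field exports
ARCHIVE = Path("/data/rt/archive")              # processed files are parked here for audit
TARGET = Path("/data/rt/curated/readings.csv")  # hypothetical 'corporate' store


def row_is_plausible(row):
    """Cheap sanity checks of the kind such scripts typically hard-code."""
    try:
        value = float(row["value"])
    except (KeyError, ValueError):
        return False
    return bool(row.get("tag")) and bool(row.get("timestamp")) and -1e6 < value < 1e6


def sweep_once():
    """Append one batch of plausible incoming readings to the curated store."""
    for folder in (INCOMING, ARCHIVE, TARGET.parent):
        folder.mkdir(parents=True, exist_ok=True)
    with TARGET.open("a", newline="") as out:
        writer = csv.writer(out)
        for path in sorted(INCOMING.glob("*.csv")):
            with path.open(newline="") as src:
                for row in csv.DictReader(src):
                    if row_is_plausible(row):
                        writer.writerow([row["timestamp"], row["tag"], row["value"]])
            shutil.move(str(path), str(ARCHIVE / path.name))


if __name__ == "__main__":
    # 'Loose automation': the loop runs only as long as someone keeps the script alive.
    while True:
        sweep_once()
        time.sleep(60)
```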

And master data—a shared subset of corporate data? Such ideas were touted as a panacea a couple of years ago.

We have had lots of discussions around these issues. If you have two databases with similar kinds of data, you can opt to standardize and merge the two. Or you can keep both databases running, especially if you need extra parameters on different fields or for other use cases, and blend information from the two by sharing top-level information. StatoilHydro’s approach is for an ‘official’ dataset (the first solution). But I can see cases where the second approach is valid. But a well master data footprint does have to be relatively large. And sensor data may or may not be standardized and can blend high and low frequency information. It is a complex issue. But as I said, StatoilHydro’s approach is for a single master data set. One thing is for sure: without a minimum set of parameters ‘masterized,’ you can’t have true integrated operations. Having said that, today’s reality is one of manual or semi-automated data links that are very hard to sustain.
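A minimal sketch of the second option, blending two databases by sharing top-level ‘masterized’ information, assuming two hypothetical well inventories keyed on a common identifier. The field names, the choice of master keys and the sample records are invented for illustration, not StatoilHydro’s data model.

```python
# Minimal sketch of blending two databases by sharing top-level ('masterized')
# information. All identifiers, attributes and values below are hypothetical.
wells_db_a = {
    "NO-15/9-19": {"operator": "StatoilHydro", "spud_year": 1993, "td_m": 3200},
    "NO-34/10-2": {"operator": "StatoilHydro", "td_m": 2800},
}
wells_db_b = {
    "NO-15/9-19": {"operator": "StatoilHydro", "status": "P&A", "td_m": 3100},
    "NO-34/10-2": {"status": "producing"},
}

# The minimum set of parameters agreed ('masterized') across both databases.
MASTER_KEYS = {"operator", "status"}


def blend(well_id):
    """Merge both records; flag conflicts on master keys, keep the first value otherwise."""
    merged, conflicts = {}, []
    for source in (wells_db_a.get(well_id, {}), wells_db_b.get(well_id, {})):
        for key, value in source.items():
            if key in merged and merged[key] != value and key in MASTER_KEYS:
                conflicts.append((key, merged[key], value))
            merged.setdefault(key, value)
    return merged, conflicts


if __name__ == "__main__":
    record, conflicts = blend("NO-15/9-19")
    print(record)     # blended view; note td_m silently keeps the first value seen
    print(conflicts)  # empty here because the master keys agree across databases
```

The point of the sketch is only that anything outside the agreed master key set (here, td_m) can silently disagree between the two stores, which is the argument for the minimum ‘masterized’ parameter set mentioned above.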

And PPDM/Energistics. What’s their role here?

We are involved in Energistics’ projects—but here again the bar is being constantly raised in the workplace. Standards bodies’ work will never be on a par with a major development from Schlumberger or Landmark, nor will it measure up to a super database from a major oil company.

I am curious to know how other industries manage—pharma, internet and mobile telephony must all have similar issues. All seem to manage to raise the bar constantly without interfering with their ongoing business. How other industries achieve this would be an interesting study topic for a forward-looking meeting.

* Oil IT Journal May 2008.

** ECIM presentation.

© Oil IT Journal - all rights reserved.