Oil IT interview, DNV GL and the Veracity data platform

Oil IT Journal talks to Cathrine Torp, Kenneth Vareide and Jorgen Kadal about DNV GL’s new data platform. Today ‘absolutely nobody is handling data as an asset.’ An enhanced data value chain is needed to fix the familiar problems of data quality and provenance and of data lakes full of garbage. DNV GL’s position as a ‘trusted third party’ has the potential to bridge the proprietary silos.

What is Veracity’s background?

KV A few years back we set up a digital accelerator group for data. We hired experts in data, software and IT security and made several acquisitions including QLabs (IT risks), Echelon (IT security) and Tireno (IT infrastructure). Our previous work in standardizing marine software development is also highly relevant to oil and gas and its big data opportunities.

CT We also employ technology domain experts. We have already developed a big data solution for drilling.

JK Our core offering and differentiator is our independent third party role and our ability to unlock siloed data. Our role of data curator and broker goes back to 1980.

That’s interesting in the context of the new big data silos from GE, Rockwell, Yokogawa and others. How do you get inside these walled gardens?

JK We are in a dialog with GE re Predix, also with Kongsberg and IBM. The issue of multiple big data silos is unresolved. There is a need for an enhanced data value chain to assure veracity. Can you trust the algorithms that have been applied to data, the data context and the outcomes? This is where the ‘platforms’ struggle. However, two platforms can co-exist. There are two ways of doing this. One option is to chain them so that, say, engine anomaly data from one platform is passed to the second platform for independent QC. But you do need to access source data. You can’t clean your way out of these data quality issues. So a better option is to use the DNV platform as a source of curated data that can be pushed back to, say, Predix for monitoring. Such problems exist in the maritime sector where different OEM data silos cause frustration and data ownership issues.

Is DNV GL a standards setting body like the American Petroleum Institute?

KV I’m not sure about the comparison but, for instance, 60% of the world’s pipelines are fabricated to DNV specifications. We are worldwide standards-setters. Our JIPs last from one to three years and the outcomes are published as a recommended practice (RP). We then work with industry to keep the standard fresh. It is a bit like ISO. In fact we build on ISO standards.

CT But we are a lot faster than ISO!

Is Veracity a full services data storage deal or just on the quality/verification side?

KV It could be anything between these extremes. Today we are doing more cleaning and verification but with the present announcement it is clear that we are moving in the direction of full service.

JK We set standards, particularly with our RP for data quality*. We also offer data maturity assessment services to clients. Despite grand claims to the contrary, absolutely nobody is handling data as an asset. A lot of data we see is complete rubbish, with meaningless tag numbers and so on!

Isn’t this the problem with the data lake concept, i.e. GIGO**?

JK This has been one of the key findings of our research to date. Even a slight mismatch, a time shift in data from rotating equipment, can make the data useless. This is where our automated QC checks come in. We develop these in joint industry projects which deliver open source fixes that operators can deploy as services.
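As an editorial aside, the time-shift problem JK describes can be made concrete. A minimal sketch of such an automated alignment check, using cross-correlation to estimate the lag between two supposedly synchronized sensor streams (the function names here are illustrative, not DNV GL’s actual implementation):

```python
import numpy as np

def estimate_time_shift(reference, candidate):
    """Estimate the lag (in samples) between two sensor streams
    via the peak of their full cross-correlation.
    Returns 0 when the streams are aligned."""
    a = (reference - reference.mean()) / reference.std()
    b = (candidate - candidate.mean()) / candidate.std()
    corr = np.correlate(a, b, mode="full")
    # Re-center the argmax so that 0 means 'no shift'.
    return int(corr.argmax()) - (len(b) - 1)

def qc_alignment(reference, candidate, max_lag=0):
    """Pass/fail QC check: reject streams misaligned
    beyond max_lag samples."""
    return abs(estimate_time_shift(reference, candidate)) <= max_lag

# Example: a stream delayed by 5 samples fails the check.
rng = np.random.default_rng(0)
ref = rng.standard_normal(400)
delayed = np.concatenate([np.zeros(5), ref[:-5]])
print(qc_alignment(ref, ref))      # aligned stream passes
print(qc_alignment(ref, delayed))  # shifted stream fails
```

A deployed check would of course also handle sampling-rate mismatches, gaps and sensor noise; the point is simply that an integer time shift is mechanically detectable and can be flagged before the data reaches an analytics platform.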

What partners are involved?

JK We are in partnership with Hortonworks for the big data stuff. We have Hortonworks on premises. Cloud providers are becoming more interoperable thanks to open source software. Even Microsoft is getting into the open source thing.

What about other upstream standards from, say, Energistics, such as Witsml and Prodml?

KV We don’t have a big role in IT standards, our focus is on qualifying software. We have a role in maritime standards for functional description of ships. We also have a domain model of risk. We build on the CMMI approach, tailored to suit our industries. The IoT brings new challenges especially when ‘solutions’ come as closed, black boxes.

So what is new in Veracity is the data QC?

KV Yes. Previously we looked at data quality outside of the process in which the data was originally used. The previous RP on IT/OT data quality is now 10 years old. The new (2017) release adds coverage for big data, data ownership and security. You should visit Oreda.com, the Offshore reliability data portal. The Oreda dataset holds years of curated asset reliability data supplied by asset owners. It is used by insurance companies and others for costing and forecasting.

On the subject of data ownership. Owner-operators (OOs) see a small amount of kit from many OEMs. OEMs see lots of data but only from their own kit. Nobody sees everything!

JK You summarize the situation well. GE wants to compile data from other OEMs. But there is push back from a conservative industry. Equipment comes from many suppliers and OOs want to have a clear picture of asset reliability. This can’t be done without standards and curation. It may be easier for us to do this than an individual OEM.

We have written a lot about data handover to OOs on commissioning. In fact we will be reporting from the CFIHOS initiative in a future issue.

JK Yes, this is another open field. You need a holistic view of the supply chain and between operators. Some do share. Two months ago one very mature operator gave the OK to share turbine data and analytics with OEMs.

They do say that they don’t like having to pay to get their own data!


JK Things have come a long way. We are working on pilots and a proof of concept that we will be showing at the OTC.

* To be reviewed next month.

** Garbage in, garbage out!


© Oil IT Journal - all rights reserved.