Oil IT Journal interview, Andreas Jagtøyen, Kongsberg

Following the release of Kognifai, Kongsberg’s new cross-industry artificial intelligence software platform, Andreas Jagtøyen, who heads up Kongsberg’s marine/energy division, explains how the offering consolidates Kongsberg’s prior IT art. He also opines on the role of the data lake, the internet of things, and on the risk to industry of multiple vendor-specific ‘platforms.’

Why a new IT platform?

I head up Kongsberg’s energy division, which covers oil and gas production assurance, drilling and wells, along with wind power and systems for ship owner-operators. Each business has some kind of digital platform (SiteCom for oil and gas) and there are significant commonalities across them all. So our intent now is to bring all of these together in Kognifai and provide a development platform and interfaces for clients to develop their own apps alongside ours. Our ambition is to cover the whole value chain across oil and gas, wind and shipping in a single, end-to-end energy ecosystem.

In oil and gas you mentioned SiteCom. Is the Witsml server still relevant?

Yes, it is very important to a lot of our clients. The SiteCom servers are in regular use and this will continue as our Discovery and WellAdvisor applications are ported to the Kognifai platform, along with Witsml.
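For readers unfamiliar with Witsml, the sketch below shows the kind of request a client might send to a Witsml store such as SiteCom. It is an illustration only: the endpoint URL is a placeholder and the zeep SOAP library is an arbitrary client choice; only the WMLS_GetFromStore operation and the empty-element ‘wildcard’ query style come from the WITSML Store API specification.

```python
# Minimal sketch of a Witsml Store API call. The server URL is a
# placeholder; 'zeep' (pip install zeep) is one SOAP client among many.
from zeep import Client

# A Witsml query document: empty elements act as wildcards, so this asks
# the store for the uid and name of every well it holds (WITSML 1.4.1.1).
WELL_QUERY = """\
<wells xmlns="http://www.witsml.org/schemas/1series" version="1.4.1.1">
  <well uid="">
    <name/>
  </well>
</wells>
"""

client = Client("https://witsml.example.com/store?wsdl")  # placeholder URL
reply = client.service.WMLS_GetFromStore(
    WMLtypeIn="well",                     # object type being queried
    QueryIn=WELL_QUERY,                   # the query document above
    OptionsIn="returnElements=requested",
    CapabilitiesIn="",
)
print(reply.XMLout)  # the server's <wells> reply document
```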

We recently reported on GE’s Predix platform, which seems to play a similar role. But it can be a long haul from legacy software to true platform-based apps. A huge software re-write is needed.

That is a challenge… and the license model also needs to evolve. We currently sell software licenses and maintenance, but with Kognifai we are moving to a pay-by-use model.

This is all in the cloud?

Yes, that will happen going forward. Our LedaFlow and WellAdvisor applications currently run as separate processes. They can be, and already are, combined into ‘digital twin’-type functions, demonstrating the potential of connected apps. This has changed the way we and our customers work. The Kognifai digital platform will further streamline the flow of data across the value chain as all data is made available to the digital twin. The twin can then be used to predict the outcome of process changes. If real-time measurements differ from the twin’s predictions, then something is going on and an alarm can be triggered. This is really good for catching sensor failures. In extreme cases it may help detect upsets that require production to be shut down.
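By way of illustration, here is a minimal Python sketch of the residual check described above. The tags, tolerances and function names are invented for the example; this is not Kognifai code.

```python
# Minimal sketch of a digital-twin deviation check. All names and
# thresholds are illustrative, not the Kognifai API.
from dataclasses import dataclass

@dataclass
class Reading:
    tag: str          # sensor tag (invented format)
    value: float      # real-time measurement
    predicted: float  # digital twin's prediction for the same instant

# Hypothetical per-tag tolerances in engineering units; in practice these
# would be tuned from sensor noise and model accuracy.
TOLERANCE = {"20-PT-1017": 0.5, "20-TT-2003": 1.5}

def check(reading: Reading) -> str | None:
    """Return an alarm message if measurement and twin prediction disagree."""
    limit = TOLERANCE.get(reading.tag)
    if limit is None:
        return None  # the twin has no model coverage for this tag
    residual = abs(reading.value - reading.predicted)
    if residual <= limit:
        return None
    # Disagreement on a single tag often means a failed sensor; correlated
    # disagreement across many tags may signal a real process upset.
    return f"ALARM {reading.tag}: residual {residual:.2f} exceeds {limit}"

if __name__ == "__main__":
    print(check(Reading(tag="20-PT-1017", value=101.3, predicted=100.1)))
```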

But to go back to re-engineering, this will entail a huge effort. What’s in it for you folks? It is still an enormous job.

Yes, but much of the work has already been done with our customers. This has let us deliver solutions that are not generally available today. The idea is to make these more generally available on the Kognifai platform.

Is this a data lake based platform?

Well… our customers own their own data, which we may use with their approval. Data ownership is a sensitive issue, so we do not collect customer data into a data lake of our own! LedaFlow, for example, can push data into the cloud where it can be connected to and reused by the cased pipe tool through our internet of things (IoT) gateway. Reusing data from our own systems is relatively easy as we have good knowledge of the data structures. But data from third-party systems can be problematic. Siemens tag formats, for instance, may need to be processed through the IoT gateway and translated into a common notation.

Like OPC-UA?

We do have OPC-UA in the gateway, but no, there is more to it than that. ABB, Siemens and other third parties all name tags differently. It is differences like these that are managed by the gateway.
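Again by way of illustration, the sketch below shows what such tag-name mediation might look like. The vendor tag conventions and mapping rules are invented for the example; the real gateway’s rule base is Kongsberg’s own.

```python
# Minimal sketch of vendor tag-name normalization in an IoT gateway.
# Tag formats and mapping rules are invented for illustration; real
# vendor conventions and the Kognifai gateway internals will differ.
import re

# Per-vendor parsers extracting (unit, measurement, sequence) from a raw tag.
VENDOR_PATTERNS = {
    # e.g. a hypothetical Siemens-style tag "20PT1017"
    "siemens": re.compile(r"^(?P<unit>\d{2})(?P<meas>[A-Z]{2})(?P<seq>\d{4})$"),
    # e.g. a hypothetical ABB-style tag "20-PT-1017"
    "abb": re.compile(r"^(?P<unit>\d{2})-(?P<meas>[A-Z]{2})-(?P<seq>\d{4})$"),
}

def normalize(vendor: str, raw_tag: str) -> str:
    """Translate a vendor-specific tag name into one common notation."""
    pattern = VENDOR_PATTERNS.get(vendor)
    if pattern is None:
        raise ValueError(f"no mapping rules for vendor {vendor!r}")
    m = pattern.match(raw_tag)
    if m is None:
        raise ValueError(f"tag {raw_tag!r} does not match {vendor} convention")
    # Common notation: unit-measurement-sequence, e.g. "20-PT-1017"
    return f"{m['unit']}-{m['meas']}-{m['seq']}"

if __name__ == "__main__":
    print(normalize("siemens", "20PT1017"))  # -> 20-PT-1017
    print(normalize("abb", "20-PT-1017"))    # -> 20-PT-1017
```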

Will you be offering the internal tag description protocol up as a standard?

This is an open platform. We want to give customers a generic resource to help them kick off their projects, one that provides all of the above plus data security.

Previously the issue was with multiple control system protocols. But now we are facing multiple ‘platforms’ like Predix from GE, Veracity from DNV GL and others. Is this progress?

It is true that the lack of a standard notation in industry is a problem; other verticals do better. We need a more standard tag notation. Statoil, for instance, requires a standard here. For new builds this is fine, but brownfield sites can be very hard to sort out.

Does Statoil use the Kongsberg tag notation?

Some new projects are on the standard. This is an interesting and important topic and it is good to see the press interested.

What role does the proposed ExxonMobil process control standard play in all this?

I’m not sure about this; it appears to be a similar initiative to Statoil’s. Really this should be an ISO/IEC standard. But there again, it would not help with brownfields!

Is your platform a product or a standard?

There is a lot of uncertainty around this issue. The majors are debating internally whether to go with a single platform from, e.g., GE (Predix), SAP or ourselves. We are constantly being asked about these as yet unresolved issues. These platforms will need to live alongside each other in an integrated oil company, and it is a challenge for us to transport data between different platforms. Let’s say that today this is a good topic for discussion! It is another reason why there will not be a single data lake; that is unrealistic. More likely there will be many data lakes and storage systems that share data between multiple platforms. There is a need for industry to work together, and if some vendors keep their platforms proprietary, this is not going to happen.

More from Kongsberg.


© Oil IT Journal - all rights reserved.