Integrated Operations in the High North final report

The Norwegian IOHN initiative set out to design a ‘robust’ semantic software architecture for an Arctic setting. The final report finds semantic web technology immature and hard for domain specialists to use.

On the ‘better late than never’ principle, we report that the much ballyhooed Norwegian ‘Integrated operations in the high north’ semantic R&D project produced its final report in mid-2012. IOHN set out in 2008 to ‘design, implement and demonstrate a reliable and robust software architecture to be used in an Arctic setting.’ This of course raised the question, ‘what is special about software in an Arctic setting?’ The answer, IOHN’s rationale, was that operational models for such environments depend on ‘an extended support network that requires collaboration across disciplinary, geographical and organizational boundaries.’ Enter IOHN’s ‘open standards’ for interoperability, integration and data transfer.

These involved the development of a semantic web-based integration platform for sensor data and semantic models for the upstream. Along with the W3C’s semantic web technology, IOHN was to achieve its ‘flawless’ information exchange using POSC/Caesar’s ISO 15926 standard. The integration platform was developed using Cambridge Semantics’ Anzo Enterprise. This provided ‘virtualized’ access to information in source data stores, including Microsoft Excel. Anzo leverages the Open services gateway initiative (OSGi), a Java module framework also used in the open source Eclipse IDE.

Anzo was successful at combining data from Excel and other sources, but the researchers determined that the underlying semantic web technology is ‘better suited for meta-data rather than sensor data due to the high overhead of RDF.’ The team nonetheless managed to represent instrument data and query it with Sparql, and the returned RDF/XML could be parsed by client applications.
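The report gives no code, but the round trip is simple enough to sketch. The snippet below uses the open source rdflib Python library (not IOHN’s Anzo stack) to store a few readings from a hypothetical wellhead sensor as RDF, query them with Sparql and serialize the result as RDF/XML; the ex: namespace, sensor names and values are all invented for illustration.

```python
# Sketch of the pattern the IOHN team describes: instrument readings
# stored as RDF triples and retrieved with a Sparql query. Uses the
# open source rdflib library; namespaces and sensor names are invented.
from rdflib import Graph, Literal, Namespace, RDF
from rdflib.namespace import XSD

EX = Namespace("http://example.org/iohn/")  # illustrative namespace

g = Graph()
g.bind("ex", EX)

# Three pressure readings from a hypothetical wellhead sensor. Note the
# overhead: each scalar value costs several triples, which is why the
# report judged RDF better suited to meta-data than to sensor data.
for i, (ts, value) in enumerate([("2012-01-01T00:00:00", 171.3),
                                 ("2012-01-01T00:01:00", 171.9),
                                 ("2012-01-01T00:02:00", 172.4)]):
    reading = EX[f"reading{i}"]
    g.add((reading, RDF.type, EX.PressureReading))
    g.add((reading, EX.sensor, EX.wellheadSensor1))
    g.add((reading, EX.timestamp, Literal(ts, datatype=XSD.dateTime)))
    g.add((reading, EX.valueBar, Literal(value, datatype=XSD.double)))

# Sparql query for all readings above 171.5 bar.
results = g.query("""
    PREFIX ex: <http://example.org/iohn/>
    SELECT ?ts ?v WHERE {
        ?r a ex:PressureReading ;
           ex:timestamp ?ts ;
           ex:valueBar ?v .
        FILTER (?v > 171.5)
    }""")
for ts, v in results:
    print(ts, v)

# The graph serializes to RDF/XML for downstream applications to parse.
print(g.serialize(format="xml"))
```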

Modeling work on the Snorre field showed that semantic software tools need improvement. ‘There are hundreds of competing solutions for mapping between relational/tabular sources and the RDF graphs.’ Moreover, modeling tools are ‘not mature and not usable by oil and gas engineering domain experts whose knowledge is required to build a semantic model.’
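None of those competing solutions is named in the report, but they all perform essentially the same transformation, sketched below by hand in Python with rdflib over a stand-in for an Excel or relational export; the column names, well identifiers and ex: namespace are invented.

```python
# Minimal hand-rolled mapping from a tabular source to an RDF graph,
# the kind of transformation the "hundreds of competing solutions"
# automate. All column names and identifiers here are invented.
import csv
import io

from rdflib import Graph, Literal, Namespace, RDF
from rdflib.namespace import XSD

EX = Namespace("http://example.org/snorre/")  # illustrative namespace

# Stand-in for an Excel/relational export of well master data.
TABLE = """well_id,field,spud_year
34/7-A-1,Snorre,1992
34/7-A-2,Snorre,1993
"""

g = Graph()
g.bind("ex", EX)
for row in csv.DictReader(io.StringIO(TABLE)):
    # One subject URI per row, one triple per column: the essence of
    # every tabular-to-RDF mapping tool.
    well = EX[row["well_id"].replace("/", "_")]
    g.add((well, RDF.type, EX.Well))
    g.add((well, EX.field, Literal(row["field"])))
    g.add((well, EX.spudYear, Literal(row["spud_year"], datatype=XSD.gYear)))

print(g.serialize(format="turtle"))
```

The hard part, as the report notes, is not the mechanics but capturing the domain semantics, which is exactly where engineering experts found the tooling unusable.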

A sub-project involved developing a ‘Software-related technology qualification process’ for qualifying software systems and components for dependability. This included the qualification of ‘architectures, systems and components (ASC) used in complex software-intensive (CSI) and software-intensive (SI) components.’ One unqualified IOHN success seems to have been acronym development!

The team used the free software tool ProtoSeq to map the safety requirements of the IEC 61508 standard into a collection of patterns using the goal structuring notation. Other software quality testing methods were trialed, such as failure mode and effects analysis and a ‘cloned buggy code detector.’
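The report does not say how the clone detector works. A common approach, assumed in the sketch below, is to normalize identifiers out of the source and hash overlapping token windows, so that a renamed copy of a buggy fragment still shares fingerprints with the original; the code snippets and keyword list are illustrative.

```python
# Sketch of one common clone-detection technique (normalized token
# shingling), offered as a plausible reading of IOHN's "cloned buggy
# code detector"; the report itself gives no implementation detail.
import hashlib
import re

KEYWORDS = {"if", "else", "for", "while", "return"}

def fingerprints(code: str, window: int = 6) -> set[str]:
    """Hash every `window`-token shingle of identifier-normalized code."""
    tokens = re.findall(r"\w+|[^\w\s]", code)
    # Replace identifiers with a placeholder so renamed clones still match.
    normed = ["ID" if t.isidentifier() and t not in KEYWORDS else t
              for t in tokens]
    return {
        hashlib.md5(" ".join(normed[i:i + window]).encode()).hexdigest()
        for i in range(max(1, len(normed) - window + 1))
    }

buggy = "if (idx <= len) { buf[idx] = val; }"   # off-by-one original
clone = "if (pos <= size) { arr[pos] = x; }"    # renamed copy, same bug

overlap = fingerprints(buggy) & fingerprints(clone)
print(f"shared fingerprints: {len(overlap)}")   # > 0 flags a likely clone
```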

One NTNU student developed ontology tools for Scada security and performed a literature survey to classify security attacks and incidents and to help developers protect the industry from cyber threats.
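The student’s ontology is not reproduced in the report. The toy rdflib sketch below, with invented class names, shows the general shape of such a classification: a class hierarchy of incident types that can be queried transitively.

```python
# Toy attack-classification ontology in the spirit of the NTNU work on
# Scada security; class names are invented, not taken from the report.
from rdflib import Graph, Namespace
from rdflib.namespace import OWL, RDF, RDFS

EX = Namespace("http://example.org/scada-sec/")  # illustrative namespace

g = Graph()
g.bind("ex", EX)

# A small class hierarchy of incident types.
for cls, parent in [(EX.Incident, None),
                    (EX.CyberAttack, EX.Incident),
                    (EX.DenialOfService, EX.CyberAttack),
                    (EX.Spoofing, EX.CyberAttack),
                    (EX.Malware, EX.CyberAttack)]:
    g.add((cls, RDF.type, OWL.Class))
    if parent is not None:
        g.add((cls, RDFS.subClassOf, parent))

# Classify a surveyed incident, then ask for everything that is,
# transitively, a kind of ex:Incident.
g.add((EX.stuxnet2010, RDF.type, EX.Malware))
results = g.query("""
    PREFIX ex: <http://example.org/scada-sec/>
    PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
    SELECT ?i WHERE { ?i a/rdfs:subClassOf* ex:Incident . }""")
print([str(i) for i, in results])
```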

Another sub-program investigated ‘digital innovation dynamics,’ i.e., the role played by digital technologies in ‘creating, not simply representing, the materiality of physical phenomena.’ Such insight is said to explain the ‘ongoing transformations of the offshore petroleum industry’ with the advent of the ‘fully digital oil field.’ Here a ‘conflation’ of the material and digital worlds is transforming the nature of work, technology and organization offshore.

Each of IOHN’s sub-projects (and there are many more we have not covered) is summed up with ‘successes’ and ‘lessons learnt.’ The latter, while not always equating with failure, are in general more informative than the successes. If you view IOHN as a large-scale R&D funding exercise, this is OK. Unfortunately, over the years, IOHN has been hyped as a panacea for collaboration and integrated operations. In this context it is hard to identify a concrete outcome.

Whether it was wise to direct so much of Norway’s oil and gas R&D funding towards what remains unproven technology is another question. But Norway is in good company here. The EU has just launched ‘Optique’ and is to plough some €14 million into research on ‘semantic, scalable end-user access to big data.’ See above.

© Oil IT Journal - all rights reserved.