Oil IT Journal: Volume 22 Number 4


Shale gains ‘overstated’

MIT finds ‘flawed logic’ in US EIA shale forecasts that ‘significantly overestimate’ the impact of technology over sweet-spot based drilling. Current forecasts reflect ‘unrealistic’ expectations.

Recently published research* from the MIT Energy Initiative has demonstrated significant bias in current unconventional oil and gas well modelling. Francis O’Sullivan, Mitei’s director of research, and researcher Justin Montgomery have developed a statistical model that is claimed to ‘reliably distinguish’ between the impact of location and of completion technology on well productivity.

Reporting on the research, MIT’s Kelly Travers writes that ‘continuing low prices have led to substantial uncertainty about future production levels of unconventional oil and gas and the long-term economic viability of these resources.’ Technological advances in drilling techniques and hydraulic fracturing are key to developing tight oil and gas. But operators are simultaneously raising average productivity by ‘sweet-spotting,’ i.e. cherry picking the best drilling locations.

Sweet spotting implicitly downgrades the remaining acreage and makes forecasting problematical. MIT applied a spatial error model and regression-kriging to public data from the Bakken formation in North Dakota to better understand the impact of sweet spotting and improve forecasting.

The statistical approach is popular in the earth science community but, according to the researchers ‘has never before been used in well productivity modeling.’ Regression kriging accounts for spatial variability at a high resolution and is said to improve on models that are currently used in the industry, notably by the US Energy Information Administration in its Annual Energy Outlook forecasts.
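
The idea behind regression-kriging can be illustrated with a minimal sketch on synthetic data (not the MIT model itself): well productivity is first regressed on completion covariates, then the residuals are kriged against well location so that the spatially correlated (sweet-spot) component is separated from the technology effect.

```python
# Minimal regression-kriging sketch on synthetic data (illustrative only):
# 1) regress productivity on completion covariates, 2) krige the residuals
# against well location so spatial (sweet-spot) and technology effects separate.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
n = 500
xy = rng.uniform(0, 100, size=(n, 2))              # well locations (km)
frac = rng.uniform(size=(n, 2))                    # completion covariates (e.g. proppant, stages)
geology = np.sin(xy[:, 0] / 15) + np.cos(xy[:, 1] / 20)   # hidden sweet-spot signal
prod = 2.0 * frac[:, 0] + 1.0 * frac[:, 1] + geology + rng.normal(0, 0.1, n)

# Step 1: ordinary regression attributes everything it can to completion design
reg = LinearRegression().fit(frac, prod)
resid = prod - reg.predict(frac)

# Step 2: kriging (Gaussian-process interpolation) of the residuals captures
# the spatially correlated geological component the regression leaves behind
gp = GaussianProcessRegressor(kernel=RBF(length_scale=10.0) + WhiteKernel(0.1),
                              normalize_y=True).fit(xy, resid)

new_xy, new_frac = np.array([[50.0, 50.0]]), np.array([[0.5, 0.5]])
prediction = reg.predict(new_frac) + gp.predict(new_xy)   # technology + location terms
```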

Five different models were trialed on 42 months of data from 4,000 Bakken wells. Current modeling techniques were found to be incapable of capturing the rapid spatial geological variability that underpins the sweet spots.

The EIA’s approach in particular was not flexible enough to account for short-distance variations. Half of the gains in Bakken productivity were due to changes in where companies were drilling wells rather than how they were drilled.

MIT concluded that ‘current forecasts for future production and cost of US tight oil and shale gas may be overoptimistic due to unrealistic expectations of future technology-driven productivity gains.’

The researchers concluded that developing shale fields economically at current prices is ‘very challenging.’ The research ‘should help both policymakers and commercial entities better understand what can be expected from these important resources going forward.’

* Elsevier Applied Energy June 2017.


KognifAI

Kongsberg’s artificial intelligence platform to optimize data access and re-use across drilling and production.

Kongsberg has announced ‘KognifAI,’ an ‘open and collaborative’ platform that spans information technology and operational technology by optimizing data access and analysis for its maritime and energy industry customers.

In a presentation to investors, Jeppe Sverdrup, VP Digital Energy at Kongsberg, stated that two thirds of the world’s largest oil and gas companies use Kongsberg’s software, IoT services and advanced analytics. Some 10,000 wells are already monitored with its technology and live data feeds from 100 rigs amass terabytes of data daily.

Kongsberg has been working to rationalize digital technology across its different business units since 2014. The result is KognifAI, a ‘single digital platform for all data produced across the technology spectrum.’ Kongsberg’s latest digital foray is a joint venture with Norwegian eSmart Systems, an advanced analytics boutique that leverages the Microsoft Hadoop distribution HDInsight and Azure machine learning. Visit the KognifAI AppStore and read our exclusive interview with Andreas Jagtøyen, head of Kongsberg’s energy division, on page three of this issue.


Oil and gas a digital laggard? Big data disruption? ... maybe

Neil McNaughton reports on recent developments in the big-data-internet-of-things-advanced-analytics space. Is oil and gas a technology laggard? Is the digital grass greener on the other side of the fence? Will disruptive technology like the WebIsADatabase replace the knowledge worker? He also gets political with some AI-related news hot in from the French election.

An apology to our upstream subscriber base. But as focus shifts from exploration to production, and as what exploration is being done (onshore US) seems to be making do with less G&G, there is less information coming our way from the ‘E’ end of the E&P spectrum.

Elsewhere the information vacuum is filled with contributions from what can loosely be described as the big-data-internet-of-things-advanced-analytics brigade. Some of which is interesting, but all of which is getting a bit tired. Too much razzmatazz, not enough meat. No proof of the pudding. No killer app to replace the knowledge workers that we all are, which I suppose is just as well!

We do of course continue to report dutifully on the outpourings of the big data etc. movement in the hope and expectation that eventually, something of interest will turn up. But so far, we, like others, are pushed into the uncomfortable position of ‘reporting’ on stuff that has not yet happened. All those jobs that will be lost to AI at some time in the future? All that ‘digitization’ stuff you thought you’d already done? Well, it seems like you haven’t even started!

On which topic, we hear from the consulting community that oil and gas is a technology laggard. A report from EY recently popped into my inbox where I read that ‘The digital revolution disrupting so many industries has been slow to make its presence felt in the oil and gas sector. [ … ] The energy industry has traditionally lagged behind other sectors when it comes to adopting technology for above ground uses. [ … ] But the dramatic changes digital is bringing to the modern enterprise can’t be ignored forever, and oil and gas executives are beginning to recognize the promise and challenge of adopting a digital strategy.’

Strong stuff indeed. But what exactly is all this stuff going on in all those other ‘disrupted’ industries? It so happens that I also recently chanced on an opinion piece in Nature by one Andrew Kusiak who, as professor of industrial engineering at the University of Iowa, ought to know. Kusiak has it that ‘… big data is a long way from transforming manufacturing. Leading industries - computing, energy and aircraft and semiconductor manufacturing - face data gaps. Most companies do not know what to do with the data they have, let alone how to interpret them to improve their processes and products. Businesses compete and usually operate in isolation. They lack software and modelling systems to analyze data.’ So much for the grass being greener on the other side of the fence!

Another popular argument doing the rounds has it that oil and gas has piles of data just hanging around doing nothing. The implication being that, if you are not doing stuff with all your data all the time, then you should be replaced by a robot that can! I’m not sure exactly what is amiss with this notion, but I suggest that it is a bit like asking why you are not doing something right now about all the air that surrounds you un-breathed. Not perhaps a perfect analogy but you get my drift.

I may have misunderstood the import of another announcement, the first release of the WebIsADatabase (Wiadb) a linked, open database from the Data and web science group at the University of Mannheim. The dataset contains nearly 12 million ‘hypernym’ relations collected from the web. Hypernyms, aka ‘is a’ relationships, tell you stuff like ‘red is a color’ or ‘iPhone 4 is a smartphone.’ The researchers provide their scraped data along with confidence scores, ‘rich’ provenance information and interlinks to other ‘LOD’ (linked open data) sources like DBpedia and Yago. The whole dataset of over 470 million RDF triples is provided as linked data, Sparql endpoints or as downloadable dumps.
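
For the curious, here is a minimal sketch of how such a linked open data resource might be queried programmatically. The endpoint URL and predicate IRIs below are placeholders, not the actual Wiadb vocabulary (check the Mannheim documentation for those), and the SPARQL is kept deliberately simple.

```python
# Minimal sketch of pulling hypernym ('is a') triples from a SPARQL endpoint.
# The endpoint URL and predicate IRIs are placeholders, not the real
# WebIsADatabase vocabulary.
import requests

ENDPOINT = "http://example.org/sparql"          # hypothetical SPARQL endpoint
QUERY = """
SELECT ?instance ?cls ?confidence WHERE {
  ?stmt <http://example.org/vocab#instance>   ?instance ;
        <http://example.org/vocab#class>      ?cls ;
        <http://example.org/vocab#confidence> ?confidence .
} LIMIT 10
"""

resp = requests.get(ENDPOINT, params={"query": QUERY},
                    headers={"Accept": "application/sparql-results+json"},
                    timeout=30)
for row in resp.json()["results"]["bindings"]:
    print(row["instance"]["value"], "is a", row["cls"]["value"],
          "(confidence:", row["confidence"]["value"] + ")")
```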

Is it possible to consider the whole of the web as a reliable information source? If you believe in the ‘wisdom of crowds’ then maybe. Personally I am skeptical. I wrote about the usefulness of the web as a source of information in my March 2008 editorial ‘Heat pumps, phlogiston and the world wide web.’ I concluded then that ‘on the web you are more likely to find bull from enthusiastic amateurs than gospel from the experts. There is a distinct weighting of the knowledge scales in favor of the unqualified hordes.’

In a way, the bigness of the dataset means that ‘knowledge’ eventually becomes a matter of opinion. The web is more of a massive voting machine than a repository of knowledge and is easily biased by misconceptions or gamed by the unscrupulous.

Speaking of voting, as you probably know I live in France where there have been some interesting happenings on the political front of late. In the run-up to the elections there was some angst among the chattering classes over a hitherto unknown (over here at least) Canadian polling institute (I will refrain from mentioning its name on the ‘do not feed the trolls’ principle). These folks had seemingly used big social media data to correctly predict the result of the US presidential elections last year.

In the run-up to the French presidential elections, the same polling concern, using the same techniques, predicted a win for the National Front. This of course flew in the face of all the old-fashioned local pollsters. But what would they know, with their ‘legacy’ technology of small samples of ‘representative’ individuals? The Canadian outfit was out ‘disrupting,’ scraping masses of information from the buzzing social media. Big data, advanced analytics, how could they go wrong?

@neilmcn


Oil IT Journal interview, Andreas Jagtøyen, Kongsberg

Following the release of KognifAI, Kongsberg’s new artificial intelligence cross-industry software platform, Andreas Jagtøyen who heads up Kongsberg’s Marine/Energy division explains how the offering is consolidating Kongsberg’s prior IT art. He also opines on the role of the data lake, the internet of things and on the risk to industry of multiple vendor-specific ‘platforms.'

Why a new IT platform?

I head-up Kongsberg’s energy division that covers oil and gas production assurance, drilling, wells, along with wind power and systems for ship owner-operators. Each has some kind of digital platform (SiteCom for oil and gas) and there are significant commonalities across them all. So our intent now is to bring all these together in Kognifai and provide a development platform and interfaces for clients to develop their own apps alongside ours. Our ambition is to cover the whole value chain across oil and gas, wind and shipping in a single end to end energy ecosystem.

In oil and gas, you mentioned SiteCom, is the Witsml server still relevant?

Yes it is very important to a lot of our clients. The SiteCom servers are in regular use and this will continue as our Discovery and WellAdvisor applications are ported to the Kognifai platform, along with Witsml.

We recently reported on GE’s Predix platform that seems to play a similar role. But it can be a long haul between legacy software and true platform-based apps. A huge software re-write is needed.

That is a challenge… and also the license model needs to evolve. We currently sell software licenses and maintenance but with Kognifai we are moving to a pay-by-use model.

This is all in the cloud?

Yes, that will happen going forward. Our LedaFlow and WellAdvisor applications currently run as separate processes. They can be, and already are, combined into ‘digital twin’ type functions, demonstrating the potential of connected apps. This has changed the way we and our customers work. The Kognifai digital platform will further streamline the flow of data across the value chain as all data is made available to the digital twin. The digital twin can then be used to predict outcomes for changing processes. If real time measures differ from the twin then something is going on – an alarm can be triggered. This is really good for catching sensor failures. In extreme cases it may help detect upsets that require production to be closed down.
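
As an aside, the twin-deviation idea Jagtøyen describes boils down to something like the following toy sketch. It is purely illustrative: the measurements, threshold and patience values are invented, and this is not Kognifai code.

```python
# Toy illustration of twin-deviation alarming: compare a live measurement
# stream against the digital twin's prediction and raise an alarm when the
# discrepancy persists. Thresholds and values are invented.
def check_against_twin(measured, predicted, threshold=0.05, patience=3):
    """Yield an alarm after `patience` consecutive out-of-tolerance samples."""
    strikes = 0
    for m, p in zip(measured, predicted):
        deviation = abs(m - p) / max(abs(p), 1e-9)
        strikes = strikes + 1 if deviation > threshold else 0
        if strikes >= patience:
            yield f"ALARM: measured {m:.1f} vs twin {p:.1f} ({deviation:.0%} off)"

# e.g. a drifting pressure sensor against a steady twin prediction
for alarm in check_against_twin([101, 102, 110, 111, 112, 113], [100] * 6):
    print(alarm)
```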

But to go back to re-engineering, this will entail a huge effort. What’s in it for you folks? It is still an enormous job.

Yes, but much of the work has already been done with our customers. This has let us deliver solutions that are not generally available. The idea is to make these more generally available on the Kognifai platform.

Is this a data lake based platform?

Well … our customers own their own data which we may use with their approval. Data ownership is a sensitive issue. So we don’t collect customer data into our own data lake! LedaFlow for example can push data into the cloud where it can be connected to and reused by the cased pipe tool through our internet of things gateway. Reusing data from our own systems is relatively easy as we have good knowledge of the data structures. But data from third party systems can be problematic. Siemens tag formats for instance may need to be processed through the IoT gateway and translated into a common notation.

Like OPC-UA?

We do have OPC-UA in the gateway, but no, there is more to it than that. ABB, Siemens and other third parties all name tags differently. It is differences like these that are managed by the gateway.
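
By way of illustration only, the gateway’s job amounts to normalizing vendor-specific tag names onto a single internal notation, along the lines of the sketch below. The tag patterns are invented; they are not the real ABB or Siemens conventions, nor Kongsberg’s actual mapping.

```python
# Minimal sketch of the gateway idea: map vendor-specific tag names onto a
# single internal notation. The patterns below are invented for illustration.
import re

RULES = [
    (re.compile(r"^SIE[-_](\w+)[-_](\w+)$"), r"PLANT.\1.\2"),   # hypothetical Siemens style
    (re.compile(r"^ABB\.(\w+)\.(\w+)$"),     r"PLANT.\1.\2"),   # hypothetical ABB style
]

def normalize(tag: str) -> str:
    for pattern, template in RULES:
        if pattern.match(tag):
            return pattern.sub(template, tag).upper()
    raise ValueError(f"no mapping rule for tag {tag!r}")

print(normalize("SIE_20PT1001_PV"))    # -> PLANT.20PT1001.PV
print(normalize("ABB.20FT2002.OUT"))   # -> PLANT.20FT2002.OUT
```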

Will you be offering the internal tag description protocol up as a standard?

This is an open platform. We want to give customers a generic resource to help kick off their projects. One that provides all of the above plus data security.

Previously the issue was with multiple control system protocols. But now we are facing multiple ‘platforms’ like Predix from GE, Veracity from DNV GL and others. Is this progress?

It is true that the lack of a standard notation in industry is a problem. Other verticals do better. We need a more standard tag notation. Statoil for instance requires a standard here. For new builds this is fine but brownfields can be very hard to sort out.

Does Statoil use the Kongsberg tag notation?

Some new projects are on the standard. This is an interesting and important topic and it is good to see the press interested.

What role does the proposed ExxonMobil process control standard play in all this?

I’m not sure about this, it appears to be a similar initiative to Statoil’s. Really this should be an ISO/IEC standard. But there again, it would not help with brownfields!

Is your platform a product or a standard?

There is a lot of uncertainty around this issue. Majors are debating internally whether to go with a single platform from, eg, Predix, SAP or ourselves. We are constantly being asked about these issues which are unresolved. These platforms need to live beside each other in an integrated oil company. It is a challenge for us to transport data between different platforms. Let’s say that today this is a good topic for discussion! It is another reason why there will not be a single data lake, that is unrealistic. More likely there will be many data lakes and storage systems that share data between multiple platforms in the future. There is a need for industry to work together. If some vendors keep their platforms proprietary this is not going to happen.

More from Kongsberg.


SeismicZone ‘microsites’ to host sellers’ seismics

Katalyst’s seismic data marketplace now offers branded showcases for third party data.

Katalyst has announced a new microsite functionality, an extension of its SeismicZone online seismic data marketplace. The microsite lets seismic data owners market their proprietary data from a private space hosted by Katalyst. Data owners can thus leverage the full functionality of SeismicZone to showcase their data libraries on interactive maps specific to their proprietary data.

SeismicZone stores subsurface data at its private Tier 2 data centers. iGlass, a customized PPDM 3.8 database, serves metadata to a web-based ESRI map portal. While SeismicZone is independent of Katalyst’s data management solution, data management clients can view data available for license in the iGlass map.

Katalyst CEO Steve Darnell said, ‘In the current environment, many companies are having to explore every avenue for saving time and money. We have been working to address the needs of data owners and purchasers alike to create the best experience for both sides of the resale market.’


Palantir and PetroVR/Caesar Systems combine

PetroVR and PlanningSpace planning support platforms to merge.

Palantir Solutions and PetroVR (formerly Caesar Systems) have entered into a definitive merger agreement under which Palantir and PetroVR are to combine in a ‘stock-for-stock’ transaction. The new company is to be called Palantir Solutions. The companies’ software platforms, PetroVR and Palantir’s PlanningSpace will together form an ‘industry-leading’ oil and gas planning software application with a global delivery and support network from nine offices in eight countries. The deal also sees PetroVR joining Palantir in the Halliburton-Landmark-Palantir alliance that delivers petroleum investment lifecycle management solutions.

Palantir MD and CEO Jason Ambrose said, ‘Our combined teams are dedicated to serving customers and providing software that meets the planning needs of multi-functional asset teams, corporate planning groups and executive decision makers. PetroVR employees will join the Palantir team and the combined company is positioned for future growth and continued value creation for our shareholders and our clients.’

Advisors on the deal were (for Palantir) One-to-One, Paladin, Allan Tiller Law, Outside GC CA and Quayle Munro. Integrity Advisors, Baker Williams Matthiesen and PKF Texas advised PetroVR. More from Palantir.


Exprodat’s free online GIS benchmark

Industry at large scores a meagre 2.6 out of 5 on geographic best practices self-test.

Getech unit Exprodat has released an online benchmark app to help operators ‘take their geographic information system (GIS) operations to the next level.’ The free web app provides immediate feedback and an overview of an organization’s geospatial strengths and areas where attention is required. Exprodat’s app also provides a comparison with oil industry peers and competitors (with anonymized data). The benchmark assesses current GIS best practices, helps make a business case for future investments and allows users to track progress over time.

Some 50 organizations have already used the benchmark methodology and some have opted for a follow-up onsite audit. Exprodat expects that the new online app will encourage others to test themselves. To date companies have scored an average of 2.6 on a scale of 1 to 5 in the Exprodat GIS maturity matrix.

E&P companies are ‘slowly beginning to deploy GIS in a true enterprise manner’ as indicated by scores of 3 and above. While a small number of operators score very highly, the vast majority of organizations ‘still struggle with data management, integration, skills and governance.’ Moreover, progress over time is poor. Take the test and check out the GIS benchmark story map on ArcGIS Online.


Fred Simkin comments on AI in Oil and Gas

SmartFix president thinks editor Neil McNaughton missed a trick in 2016 editorial.

I enjoyed your 2016/3 editorial ‘From linear programming in 1958 to winning at Go.’ But I find it hard to believe that you could discuss AI in oil and gas without mentioning important applications like Schlumberger’s DipmeterAdvisor and the use of constraint-based reasoning for automating the configuration of equipment like BOPs, equipment maintenance advisers and tools for well safety, inspection and regulatory compliance. These deserve attention because while there have been some ‘hitches,’ like DipmeterAdvisor in fact, they have, in many cases, been commercially successful.

I understand that those coming from the data side of the IT house may be more comfortable with the analytical, ‘computational intelligence’ approach with its roots in linear programming. But we AI ‘symbolists’ are still here and we are still making a difference, particularly as the industry hemorrhages domain expertise due to economics and the age of the folks with the heuristic knowledge.

It is especially important today to implement, test and deploy applications that capture and share knowledge across the enterprise. Analytical applications don’t do this, even though they can add value. It isn’t an either/or proposition, fuzzy logic/neural net hybrids are extremely powerful. You mentioned Google’s Go app, which was a hybrid, as is IBM’s Watson.

Yours, Fred Simkin, President/Sr. Knowledge Engineer SmartFix LLC.


Software, hardware short takes ...

Assai on Vaizr. Battelle PipeAssess. Blue Marble seismic calculator. PDS AVA Clastics. IHS EWB 2.0. Industrial Skyworks’ Blue VU. INT Viewer 5.2. Kappa Workstation 5.12. Roxar Tempest 8.0. Schlumberger Lift IQ. Seeq R17 for Devon. Terrasciences’ TLAS LAS reader for Android.

Assai has ported its engineering document management solution from Oracle Forms to freeware Vaizr running on Tomcat, eliminating the need for an Oracle application server.

Battelle has announced PipeAssess PI (pipeline integrity, not ‘PI!’). PA-PI embeds physics-based modeling and prediction for axial crack growth.

The 2017 release of Blue Marble’s Geographic Calculator includes a new quality control tool for seismic survey data to assess pre and post stack seismic data in UKOOA and SEG-Y files.

Petrotechnical Data Systems (PDS) has announced the Ava Clastics analogue database and clastic sedimentology package. This includes ‘GeoCypher,’ co-developed with the University of Leeds.

DNV GL has launched the PKI methane number calculator for pipeline gas to ensure that the fuel is fit for purpose. The algorithm quantifies the effect of pipeline gas quality on engine knock to ensure safe and efficient engine operations.

IHS Markit’s Engineering Workbench 2.0 retrieves relevant information from the ‘universe’ of technical knowledge that includes standards, codes, specifications, patents and technical references.

Industrial Skyworks has announced Blue VU, software for drone-based building, and oil and gas infrastructure inspections. Blue VU offers intelligent organizing of UAV images, synchronized use of UAV imagery, 3D point clouds and vectorized inspection results.

INTViewer 5.2 brings improved performance, a new GUI and Python automation.

Kappa Workstation 5.12 includes advanced models released by the Kappa unconventional resources consortium.

Emerson’s Roxar Tempest 8.0 provides risk mitigation, increased field productivity and expanded integration from geosciences to production. V8 features additional history-matching capabilities and an integrated ‘big loop’ workflow.

Schlumberger has announced a new artificial lift management service, Lift IQ offering soup-to-nuts production life cycle management operated from remote surveillance centers around the world.

Seeq R17 adds visual analytics on ‘past, present and future data’ and new multivariate regression and reference signals for ‘golden batch’ analytics. The new version was unveiled at the 2017 OSIsoft user conference with Devon Energy’s work on oil field tank haul-off improvement as poster child.

Terrasciences has released ‘TLAS,’ an Android app for viewing log ascii standard (LAS) well log files. TLAS comes as both ad-supported freeware and a Pro version with more functionality including LWD data display.


Rice Oil & Gas High Performance Computing 2017

ExSeisPiol - parallel I/O for extreme workloads. Devito - SciPy for seismics ‘because not everyone is a polymath.’ Nvidia’s Occa cross platform API. BP on ‘disruptive’ OpenSFS. Argos frac simulator.

Introducing the 10th anniversary edition of the Rice oil and gas high performance computing conference, chairman Jan Odegard outlined an expanding role for HPC that includes, along with its traditional seismic imaging home ground, big data applications in operations and management, deep learning techniques and the convergence of existing systems into next-generation architectures.

Cathal Ó Broin from the Irish centre for high-end computing introduced ‘ExSeisPiol’ aka extreme-scale parallel I/O for seismic workflows. ESP was developed with support from Tullow Oil to handle real-world issues of multiple SEG-Y versions and data stored in legacy formats. The ESP C/C++ API and stack builds on Data Direct Networks’ Infinite Memory Engine’s burst-buffer technology. This uses high performance flash storage to enhance noncontiguous write performance. ESP will be released as open source software later this year.

Navjot Kukreja (Imperial College London) demonstrated ‘Devito’ a SciPy-based domain specific language for seismic imaging. HPC in seismics requires an intimate knowledge of physics, programming and hardware optimization. The problem is, ‘not everyone is a polymath.’ Devito exposes a symbolic math library, useable by seismologists, that hides the algorithmic/hardware complexities to ‘bring the latest in performance optimization closer to real science.’ The Devito code generator has been tested and verified on the full SEG/SEAM dataset. Devito has backing from Intel and BG Group (now Shell).
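
To give a flavor of the DSL, here is a minimal sketch in the style of Devito’s public tutorials (exact call signatures may vary between releases): the wave equation is written symbolically and Devito generates and JIT-compiles the optimized stencil code, so the geophysicist never touches the low-level loop nests.

```python
# Minimal Devito-style sketch of a 2D acoustic wave stencil, following the
# public Devito tutorials (API details may differ between releases).
from devito import Grid, TimeFunction, Eq, Operator, solve

grid = Grid(shape=(201, 201), extent=(2000.0, 2000.0))    # 2 km x 2 km model
u = TimeFunction(name="u", grid=grid, time_order=2, space_order=4)
c = 1500.0                                                 # constant velocity, m/s

pde = u.dt2 - c**2 * u.laplace                             # acoustic wave equation
stencil = Eq(u.forward, solve(pde, u.forward))             # update rule for u(t+dt)

op = Operator([stencil])                                   # JIT-compiled C kernel
op.apply(time=1000, dt=0.5)                                # run 1,000 timesteps
# (in a real job, u.data would first be seeded with a source wavelet)
```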

Thomas Cullison (Nvidia) introduced Occa, an open-source library used to program multi-core architectures. Devices such as CPUs, GPUs, and FPGAs are abstracted using an ‘offload-model’ for application development. C and Fortran kernel languages allow developers to use run-time compilation for device kernels. The Occa API provides comparable performance across Cuda and OpenCL with code running on Intel, Nvidia and AMD hardware.

Shawn Hall (BP and OpenSFS) presented on the ‘disruptive technology’ that is OpenSFS, a vendor neutral, non-profit whose mission is to ‘keep the Lustre file system open.’ OpenSFS offers a low, flat membership fee and gives a ‘unified voice’ to the vendor community.

Dylan Copeland presented Geonumerical Solution’s ‘Argos,’ a 3D HPC simulation of hydraulic fracturing. Argos performs 3D, coupled multiphysics simulation of fractured rock mechanics, fluid dynamics, and proppant transport. Argos supports unstructured grids, fracture networks, and arbitrary configurations of multiple wellbores and completions. To date, studies have demonstrated the significance of anisotropic stress fields, near-wellbore friction, proppant bridging, and slurry rheology.

Many more presentations from Rice O&G HPC 2017 here, along with videos.


Capital Facilities Information Handover Standard

USPI-backed standards group face-to-face meet chez BP. Shell on handover and mounting tag counts. Chevron on ‘over ambitious’ prior art. Total’s Quantum digital twin. AmecFW’s information hub. Yokogawa on IOGP standards. USPI on ISO accreditation. Shell on broadening CFIHOS’ scope.

The handover of data and information on the ‘as-built’ state of facilities such as offshore platforms, FPSOs or refineries represents something of a holy grail for the construction industry. While parts of the constellation of technologies that is ISO 15926 have been tried by operators and software vendors, a ‘practical’ implementation that can be used by engineering prime contractors (EPC) to hand-over to owner-operators (OO) has proved elusive. The developers of Cfihos (pronounced ‘see-foss’), the capital facilities information handover standard, have set out to simplify the process, abandoning ISO 15926’s semantic web technology in favor of the exchange of a limited amount of ‘must-have’ information via standardized Microsoft Excel templates. Cfihos is owned by the USPI standards body and has backing from Chevron, Shell and BP which hosted this face-to-face meeting earlier this year.

Paul van Exel, USPI chair, presented Cfihos as a handover specification (not just a guide) of what OOs want in terms of the documents and information required to populate their information systems. Cfihos will deliver a process industry standard document with rules and principles for information handover. The specification will point to the ISO 15926 reference data library and act as a common language for all stakeholders. Cfihos has been available to members since 2015.

Jason Roberts (Shell) provided the background to the initiative. Labor productivity is flat to downhill in oil and gas (it is rising in other verticals). Time spent on engineering design increases steadily while the oil price crash has impacted project viability. On safety, Roberts cited the Marsh study of the 100 largest losses in the hydrocarbon industry (1972-2009) which initially led Shell to think it was doing a good job. A refinery fire with fatalities was a wake-up call when it emerged that unclear ownership meant that some lines were not in anyone’s corrosion inspection systems. Elsewhere untagged valves had corroded – they too were not in the maintenance system. In answer, Shell developed EIS, an internal engineering information specification that was a forerunner of Cfihos. Feeding EIS meant addressing the problem of information handover. Supplier data comes in different formats and tools don’t talk to each other. Despite some contracts that specify ‘handover information in ISO 15926,’ nobody knows what that actually means! Meanwhile data volumes grow. Kashagan has around a million tags and as many engineering documents, even though the original design was for around 15,000!

Cfihos means that OOs can tell EPCs what is required to be handed-over. Operators need to be able to tell their procurement people ‘don’t go for the cheapest contractor because their IM is a mess!’ Shell’s push for a standard was inspired by the 2010 IOGP position paper on the development and use of international standards. Roberts observed that there are big gaps in ISO 15926. But one solution is not to ‘put everything in a standard.’ It’s better to check if there are existing standards (eg the ISO 13706 standard for heat exchangers) that can be used. Cfihos goes beyond data centricity, adding rules for data context, values and ranges. There is also a standard taxonomy for documents and metadata, the ‘initial discipline document type.’

Josh Vincent (Chevron) described the Cfihos philosophy as follows, ‘we build the same types of assets and buy the same types of equipment, so there is no reason we should have different ways of describing them.’ Of course, this has been tried before, notably with the Posc Caesar Association’s ISO 15926 standard, a reference data library of ‘all of the engineering terms and objects required to design, build and operate an engineering facility.’ But Part 4’s scope has maybe proved a bit ambitious. Cfihos’ scope is to be limited to around 600 items that will be needed for handover. Such scope reduction is possible since it is only necessary to describe what is in place at handover as opposed to what might be required in the design phase. The spec will also make for a more consistent way for owner operators to ask suppliers for information.

Paul Charles presented Total’s ‘Quantum’ Virtual Plant initiative, a new digital twin approach that shares the same objectives as Cfihos. Charles believes that Cfihos could need some rationalization and an improved data model. Total is prepared to share its RDL work with Cfihos to maximize alignment between the two RDLs. Combining Total’s RDL (and Exxon’s) will likely need to wait on Cfihos 1.3.

Charles Samkin presented Amec/Foster Wheeler’s (AFW, now a Wood Group company) asset information hub (AIH), another ‘digital twin’ of the physical asset. The AIH is hosted by Amec using the Aveva Net platform said to be ‘based on’ Cfihos and integrated with Amec’s software portfolio. Aveva Net serves as a single source of the truth for other compliance and integrity applications deployed at Amec. All data, documents and models share the same platform. The solution was co-developed with BP and is now used by Amec world-wide and shared with customers. Today’s handover specs are many and varied. Cfihos is expected to ‘fix the historical mess.’

Andy Davidson took over to explain how AFW has used Aveva’s Information Standards Manager (ISM) to evaluate Cfihos V1.2. ISM is a ‘powerful tool’ for managing different standards and for communicating with clients. A grid of ‘permissible attributes’ locks down what information can be exchanged, using an ‘extended version’ of Cfihos. ISM is a gateway to 2D/3D schematics, piping and instrumentation diagrams (P&ID) and other Aveva tools. ISM does data validation for attributes including units of measure. There is also a gateway to Bentley’s ProjectWise construction information management system. The hub has an interface for inputting data from the Cfihos Excel spreadsheets but, as Davidson opined, ‘Whenever I see a stack of Excel spreadsheets there is always the chance of getting something wrong.’ Such reticence over the data format that has been selected by Cfihos was shared with others in the software community.

Elbert van der Bijl presented Yokogawa’s approach to information exchange on automation projects. Yokogawa’s measurement systems for field instrumentation and production control must communicate with third party systems such as OSIsoft PI and SAP. Information exchange is usually Excel-based or on PDFs which are ‘still unfortunately the state of play.’ There is often a lot of missing information and a tendency to ‘focus on capex over opex.’ The instrumentation business also comes with its own standards, Ecl@ss and BMECat. Control systems vendors deliver in their own software tools. Yokogawa has its Centum VP design suite, ADS master database and FieldMate validator. The objective being smooth commissioning, field acceptance testing and startup. Recently OOs have asked Yokogawa to ‘take care of the information lifecycle.’ This is a ‘free competitive area where all are looking to help the OO.’ For Yokogawa, standardization is driven via the IOGP’s standards subcommittee, by ISO/IEC and, in Norway, by EPIM with its STI data sheet standard.

Elsewhere, Namur’s NE150 standard for e-data exchange has been used to exchange data with Siemens’ toolset. Most of this is happening in the chemicals business where a different set of standards are being developed to do much the same things as Cfihos, making life harder for companies that work in both verticals! van der Bijl expressed some polite frustration with the fact that contractors build and model ‘logically’ and then ‘someone says hand over in Excel! What do you do?’ Then there is the issue of modern instrumentation that produces big data – maybe 500 parameters for a single flow meter (maybe too much for an OO!) There is also a lot going on in the IIoT space, notably the joint OpenGroup/Exxon Mobil standard for control systems. A ‘very aggressive approach designed to put us out of business!’

Paul van Exel (USPI) described the complex relationships between the ISO TC184/SC4 committee, Posc/Caesar and ISO 15926. The units of measure update took five years and is still not published. There are issues with different conveners, time and a lack of dedicated resources. Then there is the thorny question of who owns the standard. van Exel recommended that members make representations to ISO TC184 about their desire to see a properly maintained Cfihos standard. Progress on Cfihos is by and large on track. There is now a need to deliver quick wins and documentation, perhaps with a ‘Cfihos for Dummies’ style publication. The activity has grown to the extent that a part time project manager is required. Release 1.4 is slotted for delivery in October.

Anders Thostrup (Shell Global Solutions) led a discussion on widening Cfihos and on making it easier to understand and deploy. Scope could extend to plant and process, spare parts, document content as data (hard to do), materials management, full design data, datasheet content, 2D/3D ‘maybe for our children!’ Thostrup recently visited Sakhalin where a successful data handover leveraged an earlier version of Cfihos-type technology. ‘All stand to benefit from quality information.’

Vic Samuel (Chevron and IOGP) sees opportunity in the downturn for a renewed push for standards. Information management standards can ‘reduce exposure to legal, safety and competitive risks.’ Cfihos can be seen as fitting with the UK Government’s mandate for building information management aka ‘BIM.’

USPI is working on a memorandum of understanding for the use of Datum360’s CLS360 engineering information management tool in the project. Others in the software vendor community, Aveva, Phusion, Intergraph, are also tracking the project closely.

Comment - Previous attempts in this space (PCA/Fiatech) have not been successful. Cfihos’ approach of a reduced information set looks promising. The use of Excel (actually CSV files) is a double-edged sword. Easily understood by the engineers, but less than state of the art IT-wise. In one sample data set we spotted a cell containing ‘6 DEG C’ for a temperature value. Clear, but awkward for data ingestion! The geeks see JSON as the way forward. But the geeks have form… remember RDF?
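
To illustrate the ingestion awkwardness, here is a minimal sketch of the kind of value/unit splitting a loader would have to do on such free-text cells. The unit spellings and mappings below are assumptions; a real Cfihos loader would validate against the standard’s own units of measure.

```python
# Illustrative parser for free-text cells like '6 DEG C': split value and unit
# on ingestion. The unit map is an assumption, not the Cfihos unit list.
import re

UNIT_MAP = {"DEG C": "degC", "DEG F": "degF", "BARG": "bar(g)", "KPA": "kPa"}

def parse_cell(cell: str):
    m = re.fullmatch(r"\s*(-?\d+(?:\.\d+)?)\s*([A-Za-z() ]*?)\s*", cell)
    if not m:
        raise ValueError(f"cannot parse {cell!r}")
    value, unit = float(m.group(1)), m.group(2).strip().upper()
    return value, UNIT_MAP.get(unit, unit or None)

print(parse_cell("6 DEG C"))    # (6.0, 'degC')
print(parse_cell("250 barg"))   # (250.0, 'bar(g)')
```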


ALRDC Gas Well Deliquification Workshop

Theta Oilfield Services on the ‘pendulum’ of AI in artificial lift. Encline on what the Internet of Things offers the oilfield. Colorado School of Mines creates production test labs.

The Artificial Lift R&D Consortium’s 2017 Gas Well Deliquification workshop took place earlier this year in Denver. Theta Oilfield Services’ Terry Treiberg traced the ‘pendulum’ of artificial lift software beginning with the majors’ in-house developments of the 1980s and 1990s. The early 2000s saw the emergence of platforms from Wonderware, Case Services, Cygnet and XSPOC. These were followed by data warehousing historians, notably OSIsoft’s PI System, along with visualization tools.

Treiberg noted the ‘lure and disappointment’ of analytics with a proliferation of machine learning, predictive analytics and IoT offerings with exaggerated claims for ‘deep and predictive insights.’ The problem with these approaches is that statistical models alone are ‘average at best, and useless in many cases.’ Domain knowledge is needed and data must be correlated, interpreted and understood in context. Statistical methods can help but they are at their best when combined with proven engineering algorithms. Treiberg illustrated such domain smarts with a variety of downhole pump dynamometer card signatures illustrating different situations. Artificial intelligence has a role to play here in a hybrid approach that combines scada data flows and analytical engines with ‘customer-configured, open, extensible and integrated tools.’

Bill Elmer (Encline Artificial Lift Technologies) asked ‘what does the internet of things mean to the oil and gas industry?’ The IoT’s goal is to turn everyday production equipment into a smart device that allows a lease operator to remotely ‘check-in’ to equipment to monitor indicators of interest from a webpage or app. IoT reality is closer to the E&P industry than you might think. Devices communicate using Modbus TCP, and conventional scada programs can pull or push data using remote procedure calls. This provides the ability to perform engineering calculations at the wellsite and to create equipment KPIs, alarms or shutdowns based on calculated indicators. But what of security? Elmer warns against listening to the sirens of the IT industry. ‘You don’t need a massive cloud-based system.’ Operators just need to maintain private internet networks behind firewalls that are ‘never in contact with the cloud!’ ‘Keep your data private and never share outside of the internal network!’
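
A minimal sketch of the pattern Elmer describes might look like the following: poll a wellsite device over Modbus TCP, compute a KPI locally and flag an alarm. The register map, scaling, thresholds and the pymodbus 3.x-style client API are all assumptions, not a real device configuration.

```python
# Sketch of wellsite IoT polling over Modbus TCP with a local KPI/alarm.
# Register addresses, scaling and thresholds are invented for illustration.
from pymodbus.client import ModbusTcpClient   # assumes pymodbus 3.x-style API

client = ModbusTcpClient("192.168.1.50", port=502)   # private network, behind the firewall
client.connect()

rr = client.read_holding_registers(address=0, count=2)   # hypothetical: [casing, tubing] psi x10
casing_psi, tubing_psi = (r / 10.0 for r in rr.registers)

# simple wellsite KPI: casing-tubing differential as a crude liquid-loading indicator
differential = casing_psi - tubing_psi
if differential > 150.0:                                  # invented threshold
    print(f"ALARM: casing-tubing differential {differential:.0f} psi")

client.close()
```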

The ALRDC meet also heard from Rosmer Brito on a proposal to create a production laboratory at the Colorado School of Mines. The School is looking for relatively small donations to create test labs covering downhole and surface facilities. More on the CSM proposal here.

More presentations from the ALRDC.


Folks, facts, orgs ...

ABB, Aqualis, Bahwan CT, Barco, Ceiba, CH2M, BRWWO, Circor, Devon, Entero, Foster, GE, Inpex, KBR, Kleinfelder, Kongsberg Digital, Michael Baker, Maana, MacGregor, Implico, PIDX, Ansys, C-Core, OGC, Oseberg, Precision, QS Energy, SAP, Subsurface Consultants & Associates, Shell, Schlumberger, Spectris, TRC, Ziyen, Alberta TSB, Warren Business Consulting, Assai, USPI.

Chun Yuan Gu is now president of ABB EMEA region. He takes over from Frank Duggan, who succeeds retiree Bernhard Jucker as president, Europe.

Tim Ho heads-up Aqualis’ Taiwan office.

Clay Harter has started a new business line at Bahwan CyberTek focused on selling and customizing Tibco/OpenSpirit-based solutions to the oil and gas vertical. Harter was previously with Tibco.

Mike Benson is the new national director of sales at Barco’s control rooms business.

Richard Lane has stepped down as director of Ceiba.

Tawny Chritton (CH2M) is chair of the Building Responsibility Worker Welfare organization.

Tina Donikowski has been appointed to Circor International’s Board of Directors. She retired from GE in 2015.

Jeff Ritenour has been promoted to EVP and CFO at Devon Energy.

Gary Gonzenback is now senior industry advisor in Entero’s Houston office.

Lindsay Brown has been promoted to event coordinator at Foster Marketing.

GE Oil and Gas has opened a new facility in Takoradi Port, Ghana.

Hiroshi Takuwa has been appointed to a senior role at Inpex’ business planning and strategy division.

Richard Slater is to retire from KBR.

Louis Armstrong has joined Kleinfelder as EVP and west division director.

Michael Link is now VP of advanced analytics and machine learning at Kongsberg Digital. He was previously CIO at Opera software.

Michael Baker International has named Lisa Roger EVP and CIO. She hails from SC3. Leanna Anderson (previously with ServiceLink) is EVP and chief communications officer.

Maana has appointed Roop Lakkaraju as CFO, Steven Gustafson as Chief Scientist, Len Emmick as CSO and Azita Martin as CMO.

Jan Finckenhagen heads-up MacGregor’s (part of Cargotec) new training academy and VR showroom in Arendal, Norway.

Michael Martens (Implico) has been appointed to the PIDX membership committee.

Matthew Zack is VP of business development and corporate marketing at Ansys. He was previously with SAP.

C-Core has named Mark MacLeod as president and CEO. He succeeds retiree Charles Randell.

Luis Bermudez heads-up the OGC’s innovation program (previously the interoperability program).

Rich Herrmann is product director at Oseberg.

Robert Phillips is retiring from his role as chairman of Precision’s board. He is succeeded by Steven Krablin.

QS Energy has appointed Jason Lane as CEO and chairman.

Bjoern Goerke is now CTO at SAP.

Jim Granath, Catalina Luneburg and Jill Almaguer have joined Subsurface Consultants & Associates as instructors.

Shell has opened a new ‘major’ technology hub in Bangalore, India.

Juan Carlos Picott heads-up Schlumberger’s new Houston production technology center of excellence.

Spectris has named Karim Bitar as non executive director.

DeWitt Burdeaux and Lane Miller are now training and market directors at TRC.

Shane Fraser has joined Ziyen’s board of directors to lead the new Oil Intelligence division.

The Alberta Transportation Safety Board has deployed a team of investigators to the site of a pipeline ‘occurrence’ at a storage facility near Edmonton, Alberta.

Warren Business Consulting has published the 2017 edition of its Survey of talent management in oil and gas.

Elbert van der Bijl is to represent Assai on the USPI board.


Done deals

Aker, Atos, Badger, Concentric, Divestco, Emerson, Esia, Westwood, Ezra, NOV, ESG, Trican, PDI.

Aker Solutions has bought oil-services provider Reinertsen in a NOK 212.5 million deal that excludes liabilities.

Atos has acquired big data consulting and solutions provider zData.

Badger Explorer is to acquire Dwellop in a NOK 190 million cash and paper transaction.

Concentric Energy Advisor is to purchase the assets of Gannett Fleming Canada’s Calgary depreciation and valuation practice.

Divestco has negotiated a $CAD 6 million secured loan with BC-OSB Holdings at an interest rate of 17% (!) per year. Proceeds will pay down a $3.2 million bridge loan and augment working capital.

Emerson has completed its $3.15 billion acquisition of Pentair’s valves and controls business.

Private equity backed ESIA has regrouped its energy research, analysis and consulting units (Hannon Westwood, Novas, Richmond Energy Partners, Douglas Westwood and JSI Services) as Westwood Global Energy Group. The company also recently acquired Houston-based market research firm, Energent Group Software.

Singapore-based oilfield services firm Ezra Holdings has filed for US Chapter 11 bankruptcy, blaming a ‘prolonged slump’ in the energy industry.

Flotek Industries has sold its drilling technologies segment to National Oilwell Varco in an approx. $17 million deal.

Spectris recognized a £19.9 million impairment charge on its ESG microseismic business due to ‘difficult market conditions caused by a low oil price.’

Trican Well Service is buying Canyon Services in an approx. $CAD 637 million deal.

PDI has acquired DataMax, Lomosoft and FireStream WorldWide.


IBM/Stone Ridge ‘bests’ Exxon reservoir simulation record

Performance milestone claimed for IBM Power System with Nvidia Tesla P100 accelerators.

IBM and Stone Ridge Technology claim a ‘performance milestone’ for a reservoir simulation run. The test used Stone Ridge Technology’s Echelon reservoir simulator to run a billion-cell job in ‘only 92 minutes.’ The hardware was a cluster of 30 IBM Power Systems S822LC for HPC servers equipped with 60 Power processors and 120 Nvidia Tesla P100 GPU accelerators. IBM compares this with the previously published ‘record’ run reported by ExxonMobil on the NCSA’s Blue Waters supercomputer that used over 700,000 cores (Oil ITJ Vol 22 N°2). IBM worked with Nvidia to port the code to the GPU architecture and achieve ‘10x the performance and efficiency over legacy CPU codes.’

Stone Ridge president Vincent Natoli said, ‘This demonstrates the computational capability and density of a GPU-based solution. By increasing compute performance and efficiency by over an order of magnitude, we’re democratizing HPC for the reservoir simulation community.’ Sumit Gupta, IBM VP HPC added, ‘By running on IBM Power Systems, users can achieve faster run-times using a fraction of the hardware. The previous record used a supercomputer that occupies nearly half a football field. Stone Ridge did this calculation on a system that could fit in the space of half a ping-pong table.’ IBM and Stone Ridge present their achievement as a victory for ‘fully GPU-based codes’ over ‘legacy CPU codes.’ Your mileage may vary…


Zenotech, Epistemy, RFDyn team on cloud-based flow simulation

'Masters of uncertainty’ leverage ‘elastic private interactive cloud.'

David Standingford of Bristol, UK-headquartered Zenotech reports on trials of a cloud-based combination of reservoir fluid-flow simulation with optimization technology from ‘masters of uncertainty’ Epistemy. Jonathan Carter, former CTO with E.On, initiated the study that involved tweaking multiple input parameters to Rock Flow Dynamics’ tNavigator flow simulator to quantify model uncertainty and optimize well placement. tNavigator was controlled from Epistemy’s Raven Bayesian history matching and optimizing toolset using a 10-dimensional Latin hypercube. The trial involved 31 models and 2,000 runs, more than in-house resources could handle.

Carter turned to Zenotech to use its EPIC front-end to compute resources in the Amazon web services cloud. EPIC (elastic, private, interactive, cloud) provides ‘easy and secure’ access to AWS resources. Standingford emphasized the flexibility of cloud-based HPC in oil and gas and also the ‘huge economic benefits’ that accrue from improved modelling and simulation. The whole optimization task was completed in 12 days. The AWS bill amounted to $750.
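
For readers unfamiliar with the sampling scheme, the sketch below builds a 10-dimensional Latin hypercube design of the kind used to drive the tNavigator runs. The parameter ranges and their interpretation are invented for illustration.

```python
# Minimal sketch of a 10-dimensional Latin hypercube experimental design;
# parameter names and ranges are invented, not the actual Raven setup.
import numpy as np
from scipy.stats import qmc

sampler = qmc.LatinHypercube(d=10, seed=42)
unit_samples = sampler.random(n=2000)                 # 2,000 runs in [0, 1]^10

# scale each column to an (invented) engineering range, e.g. permeability
# multipliers, aquifer strength, fault transmissibilities ...
lower = np.full(10, 0.5)
upper = np.full(10, 2.0)
designs = qmc.scale(unit_samples, lower, upper)       # one row per simulation run
print(designs.shape)                                  # (2000, 10)
```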


Emerson DLO embeds PI. New Roxar subsea wet gas meter

Dynamic lift optimizer at Cera Week big data tech ‘agora.' Microwave resonance salinity meter.

A new release of Emerson’s Dynamic lift optimization (DLO) software introduces an Industrial Internet of Things (IIoT) capability for cloud-based deployment. DLO dynamically adjusts lift gas flows or electric submersible pump speed based on the latest well test data. The application adjusts lifting power to maximize production and adjusts to well shut-ins and compressor trips. A typical 10% production improvement is claimed for the technology. DLO now embeds OSIsoft’s PI System data infrastructure and historian.

Emerson presented its integrated operations (iOps)/IIoT offerings at Cera Week earlier this year in an ‘ROI from big data technology agora.’ An iOps command center showcased IIoT-based shale gas operations, ‘expanding digital intelligence to the entire manufacturing enterprise.’

Emerson also recently announced a new microwave resonance technology-based salinity measurement system for its Roxar subsea wet gas meter, providing real-time salinity measurements in high gas volume fraction/wet gas flows.


Connecting ‘dark assets’ to IBM Watson

Prismtech/Intel team on edge computing solution.

Speaking at IBM InterConnect 2017 in Boston, PrismTech CTO Toby McClean described how the company’s Vortex Edge connectivity solution is ‘lighting up dark assets’ connected to IBM’s Watson internet of things platform. Prismtech has teamed with Intel on an edge computing solution that reduces the effort and resources required to acquire data from dark (i.e. unconnected) assets. Data can be either edge processed using Prismtech’s predictive analytics and/or forwarded to IBM Watson for ‘cognitive services.’ Vortex Edge provides a fully integrated hardware, data connectivity and analytics package, turning Industrial IoT data into real-time predictive maintenance insights and actions. More nifty apps on show in the Innovation Factory Playroom.


Sales, deployments, partnerships ...

Opto 22, IBM, Geospatial, Google, TomTom Telematics, T-Systems, Ampelius, Phusion, Aptomar, Saudi Aramco, ADNOC, McDermott, Dassault Systèmes, Elite Control Systems, Exova, FairfieldNodal, Ikon Science, Vepica, Intergraph, Lloyd’s Register, Gosco, Software AG, OSIsoft, PDS, RPSEA, SEAM, SAP, Google, Siemens, Schlumberger, Weatherford, Terrabotix, Geological Remote Sensing Group, Trendsetter Engineering, Add Energy, Yokogawa, USPI.

Opto 22 has partnered with IBM to connect industrial equipment to the Watson Internet of Things ecosystem.

IDS DataNet is to sell its drilling technology through Kongsberg’s KognifAI online app store.

Geospatial has signed a technology partnership agreement with Google to advance its GeoUnderground service of buried asset cartography.

TomTom Telematics and BP are to collaborate on a fuel and driver management solution.

Shell has renewed its global master services agreement with T-Systems for hosting and storage services through 2022.

Ampelius and Phusion are to combine the former’s oil country equipment and parts trading service with Phusion’s ‘colossal’ engineering data resource.

Total has selected Aptomar’s tactical collaboration and management system for environmental monitoring and oil spill detection at its Martin Linge field.

Saudi Aramco has signed a memorandum of understanding with ADNOC to ‘improve operational performance and efficiency across the oil and gas value chain.’

McDermott is to deploy Dassault Systèmes’ 3DExperience design software.

BP has renewed its process control software contract with Elite Control Systems for the Azeri Chirag Guneshli oilfield in the Caspian Sea.

Exova is to provide a range of testing services for McDermott’s offshore fabrication projects in the oil and gas sector in the UAE.

FairfieldNodal and Ikon Science have partnered to offer complete reservoir services to the oil and gas industry.

Vepica is to implement Intergraph’s design software at its Davis refinery project in North Dakota.

Lloyd’s Register and Gosco are to team on the provision of well project management, engineering and other services offshore Ghana.

Software AG is to augment its digital business platform with OSIsoft PI System data.

PDS has expanded its collaboration agreement with the University of Leeds to support joint technology development.

RPSEA and the SEAM have signed a memorandum of understanding to combine resources and jointly pursue research projects.

SAP and Google have teamed to serve SAP Hana from the Google cloud platform (GCP). GCP is to offer an automated provisioning capability of certified Hana instances with enterprise-grade security, high availability, disaster recovery, and scalability. SAP is also now a global reseller of Siemens’ EnergyIP meter data management solution.

Schlumberger and Weatherford have formed OneStim to deliver completions products and services for North American unconventional plays.

Terrabotix is now a member of the Geological Remote Sensing Group.

Trendsetter Engineering and Add Energy are to supply engineering expertise and access to Trendsetter’s Rwis (relief well injection spool) to the unnamed operator of a newly-sanctioned field development.

Yokogawa is now a USPI member.


Standards stuff special - the birth of a protocol

Well test software consultant Laurence Ormerod tells Oil IT Journal how PTAML (pressure transient analysis and data transfer standard) originated and ultimately fused with Energistics’ ProdML.

The pressure transient analysis and data transfer standard was initiated by Schlumberger, which was looking for a way of importing well test data into Petrel. The well test is a key data set for reservoir diagnostics. A high end deep water well test can cost $50 million and take a month of rig time. Schlumberger picked Kappa’s Saphir as the source for data transfer, although initially Kappa was not involved in the project.

Schlumberger’s Tony Fitzpatrick turned to Energistics for help on the PTAML standard. That is when I got involved, as I have a good knowledge of well testing and of the Energistics approach to open source software.

We decided to build in the Prodml environment rather than Resqml, which is more concerned with the 3D reservoir model, with grids and well trajectories. Well testing is an appraisal stage activity and is primarily concerned with production tests although the pressure transient information is used in both environments. Ptaml, like ExxonMobil’s PVT data standard, also wound up in Prodml for a variety of administrative reasons to do with Energistics’ workgroup organization. Some Resqml data is included in the standard such that it can transfer, for instance, partial grids and fault elements which may be important to the interpreter.

Later on, Kappa was involved, helping to test the data transfers and to fine tune the scope of the data model. Kappa and Schlumberger from then on pooled their resources on the development – still with the objective of donating the software to Energistics. Kappa helped expand the model to cover more facets of the well test process – this is now quite comprehensive.

Initial development centered on the earlier PRODML 1.2 release. We are now working with Energistics to align with PRODML 2.0 and the recently released Common Technical Architecture (CTA). Energistics is also soliciting interest from its member operating companies for real-world testing of the new protocol. Operators will also be able to add some new requirements to the standard. The first PTAML release will likely be in PRODML 2.1 out late 2017/early 2018.

More from Energistics.


WIB, the Dutch process automation users’ association, seminar

LeiKon on context-based online data management in process industries. ISPT on energy optimization with Shell’s GMOS network simulator. More on ExxonMobil’s new process control standard.

In his keynote address to the annual public seminar of the Dutch WIB process automation users’ association held recently in The Hague, Udo Enste (LeiKon/Namur) observed that, since the introduction of process control systems, industry has been trying to standardize processes in the face of growing demands for asset management solutions, condition monitoring tools, energy efficiency and now something else … operating performance. This has its own KPI, the performance index (PI), defined as the benefit gained over the effort required to implement a new system. Different PIs can be combined, with weights, to produce an overall performance measure. Such measures can be applied over the long term, ‘big loop,’ for plant redesign, in the medium term, for maintenance and on a daily basis to adjust set points. LeiKon’s bad actor analysis is used to analyze maintenance activity (using word counts on SAP messages) and connect the results to assets whose replacement would be costly. A LeiKon word cloud showed leaks as a major item. LeiKon’s technology is used in the EU-sponsored MORE project (more efficiency for Europe).
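
The roll-up Enste describes is simple enough to sketch. The individual benefits, efforts and weights below are invented numbers, used only to show the arithmetic.

```python
# Toy illustration of the weighted performance-index roll-up: PI = benefit / effort,
# then a weighted average across PIs. All figures are invented.
def performance_index(benefit, effort):
    return benefit / effort

pis = {"energy": performance_index(120, 40),
       "maintenance": performance_index(80, 50),
       "throughput": performance_index(200, 100)}
weights = {"energy": 0.5, "maintenance": 0.3, "throughput": 0.2}

overall = sum(weights[k] * pis[k] for k in pis) / sum(weights.values())
print(f"overall performance measure: {overall:.2f}")
```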

Enste introduced the concept of ‘context-based online data management’ (Cbodm), a new application of IT in the process industry. KPIs are easy to calculate using existing DCS, SCADA and other systems, but doing this site-wide or across multiple sites requires a standard information model along with data analysis and data context. Enter an activity that captures a holistic picture of connected resources and their dependencies. Cbodm provides process-oriented context to online and historical data sources, adding generic algorithms to reconstruct missing values (via mass/energy balances) and adjusting KPIs when conditions change. Ineos is the poster child for the Cbodm approach. Here some 7,000 process flows and 2,000 equipment objects are connected to PI data. The goal is to model 30 Ineos plants by next year in a ‘drag and drop’ design framework.
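
A minimal sketch of the kind of generic gap-filling described above, reconstructing one missing flow reading from a steady-state mass balance around a unit. The tag names and values are invented for illustration; this is not LeiKon’s implementation.

# Illustrative sketch: infer a single missing flow reading from a steady-state
# mass balance (total flow in = total flow out). Tags and values are invented.

def fill_missing_by_mass_balance(inlets, outlets):
    """Fill exactly one missing flow (None) so that flow in balances flow out."""
    flows = {**inlets, **outlets}
    missing = [tag for tag, v in flows.items() if v is None]
    if len(missing) != 1:
        raise ValueError("can only reconstruct a single missing value")
    tag = missing[0]
    known_in = sum(v for v in inlets.values() if v is not None)
    known_out = sum(v for v in outlets.values() if v is not None)
    flows[tag] = known_out - known_in if tag in inlets else known_in - known_out
    return flows

# Two feed streams known, one product flowmeter reading missing:
print(fill_missing_by_mass_balance(
    inlets={"FI-101": 80.0, "FI-102": 20.0},
    outlets={"FI-201": None, "FI-202": 15.0}))
# {'FI-101': 80.0, 'FI-102': 20.0, 'FI-201': 85.0, 'FI-202': 15.0}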

Andreas ten Cate (ISPT.eu) observed that many things are changing in the process industry with the energy transition and digitization. ISPT is working with Shell on energy optimization using Shell’s AIMMS GMOS/NETSIM network simulator and supply chain optimizer. This includes a portfolio of data-driven plant model templates that can be populated with local plant information and run to produce capex/opex and SRI reports. Agent-based modeling also ran (Northwestern/CCL’s NetLogo got a plug here), as did the EU SPIRE 2030/cognitive plants project.

Bert de Wilde presented ExxonMobil’s joint venture with The Open Group to develop a new standards-based, commercial replacement for today’s obsolete refining/chemical DCS fleet. This is said to be a $5 billion project that should reach maturity in 2021.

Read our previous coverage of the Exxon standard here and visit The Open Group. In the Q&A de Wilde explained that the Exxon protocol will go ‘way beyond a simple comms protocol like OPC-UA.’ It will leverage the complete Purdue horizontal architecture and enable ‘more intelligence in the field.’ The project has support ‘from the highest level in the company, the business directed us to do something different.’

More from WIB, the Werkgroep voor Instrument Beoordeling.


IT Vizion addresses nightmare of cross-system synchronization

Blogger Alex Ivascu on IT Vizion’s ‘generic, self-sustaining’ PI AF meta-model.

A recent blog posting by IT Vizion’s Alex Ivascu asks ‘why is cross-system model synchronization still a nightmare?’ ERP systems, maintenance management systems, data historians and other applications need to embed an information hierarchy from tags, across sub-assemblies, right up to the level of the plant. Such systems likely had an accurate version of the model when first implemented, but as the plant and its processes evolve, things can easily get out of sync.

One of IT Vizion’s oil and gas clients needed an asset model for its maintenance management system (MMS). One possibility was to build the model in the MMS, but most such systems do not support proper modeling or integration with other applications.

Instead, IT Vizion has created a ‘generic self-sustaining’ model in the OSIsoft PI Asset Framework (AF), capturing work performed by the client’s engineers to convert process drawings into intelligent P&IDs. PI AF now supports management of complex process models, based on class templates and tied into the company’s financial systems. An ‘intelligent’ tag naming convention will also tie the model into CAD-defined assets, Microsoft SSIS and the Siemens XHQ framework. The expectation is now that ‘the nightmare of cross-system model synchronization will be a thing of the past!’
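
IT Vizion does not publish its naming convention, but the general idea is simple: a structured tag name encodes the asset’s place in the plant hierarchy, so any consuming system can place the tag without a separate mapping table. The sketch below uses an invented SITE-UNIT-EQUIPMENT-MEASUREMENT format for illustration only.

# Hypothetical sketch: parse a structured tag name into a plant hierarchy path.
# The SITE-UNIT-EQUIPMENT-MEASUREMENT format is invented for illustration and
# is not IT Vizion's actual convention.
from dataclasses import dataclass

@dataclass
class TagPath:
    site: str
    unit: str
    equipment: str
    measurement: str

    def hierarchy(self):
        # Render as a backslash-delimited path, AF-element style.
        return f"\\{self.site}\\{self.unit}\\{self.equipment}\\{self.measurement}"

def parse_tag(tag):
    site, unit, equipment, measurement = tag.split("-", 3)
    return TagPath(site, unit, equipment, measurement)

print(parse_tag("ANG01-CRU02-P101-DISCH_PRESS").hierarchy())
# \ANG01\CRU02\P101\DISCH_PRESS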


LR deploys Axxim risk profiler on Angolan Olombendo FPSO

Bumi Armada Berhad’s floater gets corrosion baseline inspection for reliability-centered maintenance.

Lloyd’s Register has won an inspection contract with Bumi Armada Berhad (BAB) for work on its offshore Angolan Armada Olombendo FPSO. The agreement covers LR’s risk-based inspection services and its Axxim software, which is used to develop quantitative risk profiles and to support the BAB management team’s short- and long-term capex/opex decisions.

Axxim provides risk-based inspection, reliability-centered maintenance, root cause analysis and more. The project includes corrosion risk assessment and risk-based inspection expertise for the hull, topside structures, pipelines, turret, mooring system, pipework and pressure safety valves. The initial six-month contract will provide a baseline and template for BAB’s subsequent inspection and maintenance programs.


SkyX launches pipeline drone

Visual, infra-red and multispectral VTOL sensor platform operates at 150kph for 70 minutes.

Ontario-based SkyX has announced an unmanned aerial vehicle (UAV, aka drone) to serve the ‘multi-billion’ pipeline monitoring market. SkyOne drones offer vertical take-off and landing, autonomous flight and autonomous recharging. SkyOne can travel at 150 km per hour for 70 minutes, looking for leaks, vandalism, vegetation encroachment and more. SkyOne innovates with on-site recharging at strategically located ‘xStations’ housed in weather-shielded domes.

SkyOne can be equipped with a range of sensors including visual, infra-red and multispectral. The SkyX OS software provides at-a-glance monitoring of a single drone or a whole fleet, flagging anomalies in real time for rapid response or further investigation.

SkyX reports that current oil and gas pipeline surveillance relies on road vehicles and helicopters to detect damage and threats. SkyOne brings a 24/7 capability for real-time collection of a ‘far wider scope’ of information. The solution is provided through a leasing service model. SkyX can remotely control either an individual UAV or an entire fleet, monitoring events with real-time video.

SkyX launched in April and is seeking Round A financing to address the estimated $37 billion per year monitoring market. More from SkyX.


DNV GL studies wind-powered offshore water injection

WinWin project moves from drawing board to lab test of physical demonstrator.

DNV GL reports completion of phase one of its ‘WinWin’ joint industry project, which set out to assess the feasibility of using wind power to provide energy for offshore water injection. WinWin plans to use an ‘off-the-shelf’ commercial floating wind turbine adjacent to an oil or gas field to produce electricity to power pumps and treatment units. Phase one, completed in 2016, was a desktop technical and commercial simulation performed by DNV GL on behalf of the JIP partners Statoil, ExxonMobil, VNG Norge, ENI, Nexen, Flow Solutions and Catapult Offshore Energy. Phase one results indicate ‘significant potential’ for the technology and DNV GL is now kicking off phase two to validate the simulation findings with a physical demonstrator, involving lab testing of the electrical systems at the DNV GL power laboratories in Arnhem, the Netherlands.

Three of the original partners (DNV GL, ExxonMobil and ENI Norge) are moving forward with phase two, joined by the Norwegian Research Council. DNV GL ‘encourages industry to jointly assess the risks and validate the concept.’ Project sponsor Johan Sandberg said, ‘WinWin has shown great potential for the oil and gas industry to lower costs and increase efficiency, while also reducing its environmental footprint.’


GE ‘digitalizes’ Noble rig fleet. Meridium, ServiceMax on Predix

Partners announce ‘Digital Rig’ powered by Predix. Asset management meets field service software.

GE and Noble Corporation are teaming on a ‘Digital Rig,’ leveraging GE’s marine asset performance management (APM) solution. The Digital Rig will enable data-driven efficiencies with the target of a 20% reduction in operational expenditure. GE is to deploy its latest marine APM system, ‘powered by Predix,’ on four of Noble’s rigs. The partnership will provide visibility of equipment anomalies and drilling process deviations and will enable a shift to predictive maintenance.

Noble president David Williams said, ‘The shift to data-driven decisions will have a significant effect on drilling efficiencies. Our industry must embrace the digital revolution to stay efficient and nimble, and Noble is leading the way.’ APM combines ‘digital twin’ data models with advanced analytics to detect signs of potential failure or performance degradation sometimes weeks before a problem occurs.
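
GE does not disclose the analytics behind APM, but the basic early-warning pattern, flagging a sensor trend that drifts outside its recent normal band well before an absolute alarm limit is reached, can be sketched as follows. The data and thresholds are illustrative only and not GE’s algorithm.

# Illustrative only: flag readings that drift more than k standard deviations
# from a trailing baseline window. Not GE's APM algorithm; data is invented.
from statistics import mean, stdev

def detect_drift(readings, window=12, k=3.0):
    """Return indices where a reading deviates > k sigma from the trailing window."""
    anomalies = []
    for i in range(window, len(readings)):
        baseline = readings[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and abs(readings[i] - mu) > k * sigma:
            anomalies.append(i)
    return anomalies

# A bearing temperature trend that starts creeping upward:
temps = [71.8, 72.1, 71.9, 72.0, 72.2, 71.7, 72.0, 72.1, 71.9, 72.0, 72.2, 72.1,
         72.0, 74.5, 76.0]
print(detect_drift(temps))
# [13, 14] - the upward drift is flagged before any absolute alarm limit is hit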

GE also recently announced the combination of its APM portfolio with Meridium, the asset management solution that it acquired last year. GE is now working to integrate field service specialist ServiceMax’s technology (acquired last year for $915 million) within the GE Digital/Predix umbrella. For more on Meridium/APM/Predix read our report from the 2017 GE Oil & Gas Annual Meeting.


Intelligent Plant’s Industrial App Store

Astrimar, IDS, Leidos, Siemens and Wood Group tools join IP’s ‘Gestalt Alarm Analysis.’

Speaking at the UK ITF* showcase in Aberdeen recently, Steve Aitken presented Intelligent Plant’s industrial app store. The IP app store lets operators share technologies and access a market of apps from vendors including Astrimar, IDS, Leidos, Siemens and Wood Group. Vendors can publish apps and ‘get paid as they are used without deployment and maintenance costs.’

Recently announced apps include failure mode, effects and criticality analysis (Fmeca) preparation and management tools from Astrimar and Wood Group Kenny, and augmented reality-based support for field workers from Essert.

IP, an MS Azure and OSIsoft partner, also helps operators make use of the data they already have and claims to ‘understand time-series data.’ IP’s own Gestalt Alarm Analysis app is available on the app store and provides ‘a consistent set of key performance indicators compliant with EEMUA 191 Rev 3.’ This helps pinpoint ‘bad actor’ equipment types from a consolidated enterprise database of alert and event information from other applications. IP has also developed an interface that allows OSIsoft PI to query the archive and trend or display the results.
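
The ‘bad actor’ idea itself is easy to illustrate: rank alarm sources by their share of the total alarm load and flag the handful that dominate. The sketch below is a generic example, not IP’s Gestalt implementation, and EEMUA 191 defines many more KPIs than a simple count (alarm rates per operator, floods, standing alarms and so on).

# Generic illustration of bad-actor ranking from an alarm/event log.
# Not Intelligent Plant's implementation; the events below are invented.
from collections import Counter

alarm_log = [  # (timestamp, source tag) - invented sample events
    ("2017-05-01T00:01", "LIC-204"), ("2017-05-01T00:03", "LIC-204"),
    ("2017-05-01T00:07", "PIC-110"), ("2017-05-01T00:09", "LIC-204"),
    ("2017-05-01T00:12", "TIC-330"), ("2017-05-01T00:15", "LIC-204"),
]

counts = Counter(tag for _, tag in alarm_log)
total = sum(counts.values())
for tag, n in counts.most_common():
    share = 100 * n / total
    flag = "  <-- bad actor" if share > 20 else ""
    print(f"{tag:10s}{n:3d} alarms {share:5.1f}%{flag}")
# LIC-204 accounts for ~67% of the alarm load and is flagged as the bad actor.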

* Industry technology facilitator (UK).

