Oil IT Journal: Volume 21 Number 4


Chevron’s wall-to-wall PI

100 year old San Joaquin Valley field puts PI through its paces in comprehensive automation effort. Upstream Foundation embeds PI Server, a central PI ‘collective’ and secure OT/IT link.

Speaking at the 2016 OSIsoft users conference in San Francisco this month, Neel Chakraborty showed how Chevron has unified production data across its San Joaquin Valley business unit’s assets, including the 100+ year old Kern River field*. Chevron’s data challenge was a familiar one, that of multiple process data repositories, from Excel/Access to various vendors’ historians.

Back in May 2008 we reported on the first stirrings of Chevron’s Upstream Foundation which now includes a template for the use of OSIsoft’s PI System for process data unification. In 2009, Chevron specified PI Server as its ‘common architecture for process control data utilizing an aggregate process historian throughout upstream business units.’

Before the UF was implemented Chevron had multiple connections across the OT/IT firewall, between various scada servers and PLCs in the field and a large number of endpoints (workstations and mobile devices) in the business network. The plan is to replace all of the above with a central scada PI ‘collective’ on the operations side of the firewall with a single secure connection to a mirrored ‘business PI collective.’

Early on, Chevron recognized that equipment tag naming was critical to the project’s success and that the UF was a great opportunity to fix this with a consistent naming convention. A structured ‘information objective analysis’ process was developed to rationalize end-to-end tag mapping. This included establishing metadata requirements and the development of PI asset framework models of groups of tags, all of which was discussed and approved by stakeholders before deployment. We understand that Industrial Evolution (now a Yokogawa company) designed and implemented the enterprise-wide historian and helped with the tag standardization. This process started in 2011 and it took Chevron a further three years to complete the unification of all its sub systems.
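By way of illustration only, here is a minimal Python sketch of the kind of end-to-end tag mapping such an exercise involves. The naming convention, legacy tags and assets are hypothetical, not Chevron’s actual scheme.

# Map legacy historian tags onto a structured field/asset/measurement convention
LEGACY_TO_STANDARD = {
    # legacy tag: (field, asset, measurement)
    'KR_W1234_THP': ('KernRiver', 'Well-1234', 'TubingHeadPressure'),
    'KR_W1234_STM_RATE': ('KernRiver', 'Well-1234', 'SteamInjectionRate'),
}

def standard_tag(field, asset, measurement):
    # Standardized tag name: <field>.<asset>.<measurement>
    return '{}.{}.{}'.format(field, asset, measurement)

def remap(legacy_tag):
    # Translate a legacy tag to its standardized equivalent
    field, asset, measurement = LEGACY_TO_STANDARD[legacy_tag]
    return standard_tag(field, asset, measurement)

for old_tag in sorted(LEGACY_TO_STANDARD):
    print(old_tag, '->', remap(old_tag))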

Alongside the data historian role for PI, Chevron uses PI event frames to capture non-time series data such as information from automated well tests and cyclic steam flooding.

The large community involved in the project meant that attention to organization was key. A three tier governance model was developed, involving all stakeholders. The project also saw structured knowledge transfer from the PI specialists in the contractors to Chevron personnel. Chevron is now looking to implement predictive analytics and to extend the infrastructure to embrace ‘future data.’ Read the paper here. More from OSIsoft next month.

* The 1899 Bakersfield/Kern River discovery is said to have inspired the 2007 movie ‘There Will Be Blood.’


More on Macondo

US Chemical Safety Board finds that latent design faults in the BOP, ‘vague’ safety roles and poor risk management contributed to 2010 blowout.

The latest (Volume 4) in a series of reports from the US Chemical Safety Board’s investigation into the Macondo blowout includes lessons for the regulator. Previously, legal considerations limited the flow of information from BP and Transocean.

In 2016, testimony from rig personnel who had previously invoked the Fifth Amendment brought new insights. The final phase of BOP testing revealed latent failures in the BOP, even before it was deployed to the wellhead. Other records have shed light on the interplay between BP, Transocean and the regulator. The CSB found ‘vaguely-established safety roles and responsibilities that affected human performance and major accident risk management.’

In its 130 page report, the CSB judges that while the April 2015 BSEE* well control regulations and pilot risk-based inspection program are ‘positive efforts,’ they do not constitute an adequate framework for major accident prevention. An improved approach is needed to avoid another Macondo-like event and to reduce risks to ‘as low as reasonably practicable.’ The CSB found that ‘a culture of minimal regulatory compliance continues to exist in the Gulf of Mexico and risk reduction continues to prove elusive.’

* Bureau of Safety and Environmental Enforcement.


The outlook? Cloudy.

Neil McNaughton quits kicking the GE Predix can down the road and takes a look at the Pivotal/Cloud Foundry-based platform as a service. Back from the EU SAP in oil and gas tradeshow, he reports on another oil and gas-relevant cloud platform. And this month’s actualité brings yet a third from National Oilwell Varco! Will the proliferating platforms bring micro services-based IT nirvana? Or will the obstacle of ‘indirect access’ to data prove insurmountable?

As I have been promising for a couple of months now, I am going to take a closer look at GE’s Predix offering and other similar uses of big data science to optimize equipment maintenance. I have been kicking the Predix can down the road for the past three issues so here is what I have gleaned from GE’s website and our own observations on how the Predix concept has evolved over the last couple of years.

We first came across Predix in 2014 in the context of predicting machine failure. Predix was the platform that supported GE’s ‘Predictivity’ offering. But how did ‘prediction’ become a ‘platform?’ My guess is that the marketing folks may have put the platform cart before the predictive horse but I digress.

Anyhow, a visit to the Predix home page certainly clears up what the platform is: a mechanism for connecting distributed machines and sensors to the cloud. How is this done? Predix, at least in its current manifestation, builds on top of Pivotal’s Cloud Foundry. Which, Wikipedia informs me, is an open source, cloud computing platform-as-a-service, originally developed by VMware and now owned by Pivotal Software, which is a joint venture between EMC, VMware and GE itself.

Users of Predix therefore get two things. On the one hand they get the data access and information needed to support engineering and operations. On the other hand, they get some hand holding from GE’s cast of thousands of developers as they venture into the brave new world of the cloud. A quick spin through Predix’s software components suggests that such help may be appreciated. And it is always a good idea to offer something for both the engineers and for IT to get their teeth into.

As an aside, and as we observed before, the cloud, along with containerized services, notionally makes it possible to run your software across multiple service providers, all supplying ‘micro services.’ Turning today’s monolithic apps into decoupled micro services is another area where considerable hand holding may be required. The future of Paradigm’s reservoir-driven production optimization may be worth watching in this context. On the other hand, the soup-to-nuts nature of the platform may be scary and oils are unlikely to countenance handing over their whole IT stack to GE. But if it does work, providing access to PI, to SAP, to this and that, and if, as we understand, GE has 800 developers working on Predix in oil and gas, the option may be attractive.

I just got back from the EU SAP in oil and gas conference (more on that next month) where I learned that SAP is also offering data science-based analytics. ‘Yes,’ I hear you say, but SAP is about finance. ‘No,’ says SAP, which has just offered its high end Vora analytics to the (real) scientists operating the Square Kilometre Array radio telescope. The endgame for SAP is similar to GE’s. In fact there is considerable overlap in the MMO* space between the two companies’ offerings. SAP also offers a platform with its Asset intelligence network and repository.

In a more modest and focused example, National Oilwell Varco is offering remote BOP monitoring and predictive analytics to users of its subsea blowout preventers. Behind NOV’s RigSentry BOP monitoring service is ‘Max,’ NOV’s very own industrial data platform (see page 12).

So now, after a short spin through three service providers’ offerings, we have three ‘platforms.’ Those enamored of the cloud and ‘micro services’ will argue that this is not a problem, that all of the above will be consumable as micro services and that all your IT/data integration worries will be over. That supposes that the different platforms will allow for such seamless interoperability. Experience suggests that this may not be quite how things will work out. Cynics will remember a similar false dawn back in the days of business objects.

Another problem looming for platform users is data ownership. The issue of data ownership and data transfer across borders is often raised in the context of foreign countries with awkward regulations. But the issue of who owns data, who gets to keep it, and who pays for added value services is even more pressing with the kind of platform offerings above.

Claude Molly-Mitton, president of SAP’s French user group writing in Le Monde, addresses the issue of ‘indirect access’ to data. This has it that your data, once it has passed through SAP’s software, increases in value. Thus SAP expects you, the data owner, to pay (again) for such indirect use. This is not totally unreasonable as in some circumstances, indirect use can be a way of sneaking in extra licenses/seats to the software.

In the context of a supplier of equipment (compressors, BOPs) the notion of data ownership and re-use is also interesting. Even a large oil company only sees a subset of the data that comes in from a single vendor’s compressors. The vendor is in a good position to collate data across all of its kit, worldwide. On the other hand the operator has a significant amount of information about operating conditions that can inform decisions as to why a particular problem is recurring and what the root causes are. So the vendor can improve its knowledge from the operator and vice-versa.

While the big data idea catches the attention, and while the intricacies of programming access to data in the cloud are fascinating, the big issue here is less the size of your data than its ownership. The cloud magnifies the problem of indirect access to the nth power!

* maintenance management and operations.

@neilmcn

Oil IT Journal Interview, Ken Evans, SAP

The head of SAP’s oil and gas industry business unit comments on various client strategies for the downturn. He compares ‘interoperability’ with ‘winner takes all’ IT solutions and the changes brought by the cloud and SAP’s asset intelligence platform, and discusses the elusive goal of a ‘single source of the truth.’

Last year you said, ‘longer term projects are not what’s happening.’ Are they now?

I see a bifurcation in the industry. Those who are too highly leveraged are going out of business. Others are just cutting everything. For some, this is an opportunity to benefit from the cycle, to consolidate and simplify operations. This is where SAP comes as a safe choice. 2014 was a record year for us in oil and gas and our cloud numbers are up.

These sales are mostly to larger shops?

We gained over 300 new customers last year, bringing us to 2,200 oil and gas clients. So no, we have a lot of smaller shops. There are 95 companies here in The Hague. We ourselves have 800 people in oil and gas.

What is the situation now regarding interoperability? Are companies still interested? Or is SAP’s strategy that the winner takes all?

Interoperability is something we have been talking about for 30 years or so. For network-focused companies like Google and Facebook it is definitely winner takes all. Today, interoperability issues stem from on-premises, independent, disaggregated systems. But with the cloud you can provide access to any object or service that you like. So yes, we are open, we are a platform company. But the truth is that there is less and less demand for this as apps provide and consume directly from the cloud. For us this means the asset intelligence network.

You still have to program your way into the platform. How does a developer work to integrate the SAP cloud? Do you have to roll up your sleeves and learn some esoteric language (I’m thinking ABAP)?

At the SAP TechEd we had high schoolers develop apps. To fully leverage the stack you may need some modern programming skills like Java. But we are trying to lose end user programming and replace it with self-service analytics etc.

Having listened in to a few presentations, it seems that your clients are still having to cope with multiple information silos…

The original promise of the ERP was and remains a single source of the truth for the enterprise. This is now being consolidated into the cloud.

But that is not the case for ExxonMobil and BP which have multiple SAP instances and seem to consider the single source of the truth as a stretch goal! Do any majors deploy a single global SAP instance?

Most do indeed have several SAP sources. Maybe only ConocoPhillips has a single global instance. Again the cloud should help here.

Which cloud are we talking about in general, Amazon, Azure?

All of the above are options but we do have our own, the Hana enterprise cloud. We also partner with IBM; in fact we made an announcement last week.

Last year we noted that one major client was specifying Rest/Odata endpoints for data integration. What’s the thinking on this issue now?

The idea now is that data is not getting moved. In our hookup with OSIsoft, we are not moving bulk PI data into Hana. Hana manages the metadata and the data stays in situ.

Hadoop seems to be downplayed compared with last year...

There was probably more attention to Hadoop in previous years. This is mostly because our data science offering works across any data store.


Rice University high performance computing in oil and gas

PGS brings back the supercomputer for massive Triton survey. Univa trials Docker for seismics and finds its latency wanting. CGG announces GPUWrapper for AMD/Nvidia code portability.

In his keynote address to the Rice oil and gas high performance computing conference held last month in Houston, PGS’ Sverre Brandsberg-Dahl announced the return of the supercomputer in exploration seismology. For the last decade or more, clusters have performed the bulk of seismic processing, often described as ‘embarrassingly parallel.’ For PGS, this changed with the huge (10 million sq.km.) 2014 Triton Gulf of Mexico survey. This five vessel, high-fold, long offset, dual-sensor, broadband survey provides full azimuth data. To process the 660 terabyte data set, PGS built ‘Abel,’ a special purpose, massively parallel supercomputer, actually a 24 cabinet CRAY XC40. Abel is an all CPU machine (for ease of programming) and has a total 600 terabytes of memory.

Univa’s Ian Lumb looked at the potential of Docker’s container technology in oil and gas. Docker has performed well in proof of concept trials but the transition to production use requires containers to be incorporated into existing IT infrastructures and workflows. Docker users outside of oil and gas report easier replication, faster deployment and lower configuration and operating costs. But the technology is unlikely to support ‘latency intolerant’ MPI applications such as seismic imaging. On the other hand, the use of GPUs via Docker containers appears promising.

A team from CGG has been working on GpuWrapper, a cross platform API for heterogeneous GPU environments. Today’s oil and gas market is dominated by Nvidia GPUs and the proprietary Cuda programming language. The compatibility layer allows for device-independent programming across Cuda and OpenCL, and promises ‘maximum performance’ on both Nvidia and AMD GPUs from a single code base. GpuWrapper is said to bring a negligible performance hit and is ‘fully transparent’ for geophysicists. The company has over a petaflop of AMD GPUs successfully deployed in production and reports great performance on wave-equation modeling.

Bags more presentations and videos on the conference website.


CGG GeoSoftware Deploys Objectivity’s ThingSpan

Strata/Hadoop World presentation runs Jason Workbench data model on big data infrastructure.

Speaking at the Strata/Hadoop World big data conference in San Jose, California last month, CGG’s Marco Ippolito presented the company’s high performance computing capability, which includes a 10 petaflop GPU-based machine at one processing center (see page 3). Even so, increasing processing workload has created technical challenges, notably of processing to storage bandwidth. Another issue is the marketing ‘noise’ coming from the ‘big data’ movement.

Salvation has come from Objectivity and its ThingSpan platform for ‘information fusion’ solutions that enrich big data assets with real-time data streams. CGG uses ThingSpan to enable ‘parallel seismic data visualization and analysis.’ ThingSpan deploys a Lambda architecture and runs natively on the Hadoop file system as a Yarn application. Apache Spark is also used for workflow and data transformation. The result is a data store that spans seismic volumes, petrophysical data and reservoir models, with CGG’s Jason Workbench common data model running on the same infrastructure.
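By way of illustration only, the following minimal PySpark sketch shows the kind of Spark-based data transformation described above, joining seismic attribute samples with petrophysical measurements. The dataset names and columns are invented for the example and are not CGG’s actual schema.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName('seismic-petro-join').getOrCreate()

# Hypothetical seismic attribute samples extracted at well locations
seismic = spark.createDataFrame(
    [('W-01', 2150.0, 0.12), ('W-02', 2300.0, 0.31)],
    ['well_id', 'depth_m', 'amplitude'])

# Hypothetical petrophysical measurements from the same wells
petro = spark.createDataFrame(
    [('W-01', 2150.0, 0.21), ('W-02', 2300.0, 0.09)],
    ['well_id', 'depth_m', 'porosity'])

# Join the two datasets and derive a simple combined attribute
joined = (seismic.join(petro, ['well_id', 'depth_m'])
          .withColumn('amp_por_ratio', F.col('amplitude') / F.col('porosity')))
joined.show()
spark.stop()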

CGG GeoSoftware SVP Kamal Al-Yahya commented, ‘Over the past ten years, Objectivity has helped CGG leverage various IT advances in our commercial software. ThingSpan now provides us with a low-risk avenue into the Hadoop ecosystem and increases deployment options for our clients.’

Read the presentation here.


University of Michigan’s comprehensive fracking study

Graham Sustainability Institute study underpins State’s fracking green light.

The scope of the 200 page integrated assessment of hydraulic fracturing by University of Michigan researchers and the Graham Sustainability Institute is too great to allow for our usual short summary. The document is the fruit of three years of work that set out to inform decision makers in industry and government, setting out the pros and cons of the many facets of shale development.

Dan Wyant, director of the Michigan Department of Environmental Quality, praised the report and expressed his appreciation that many of the state’s suggestions were incorporated into the final version. The report’s findings have helped shape changes to state fracking regulations particularly with more preparatory work and monitoring of water levels. Operators must now report the pressures and volumes pumped and post chemical additive information to the FracFocus disclosure registry.

The report, which cost $600,000 to produce, is neither ‘pro’ nor ‘anti’ fracking. Instead it provides a detailed analysis of the likely outcomes of various courses of action. It would appear that the report was taken by the State as a green light for fracking. The UK Energy Institute also partnered in the study.


Katalyst adds interpretation data management to iGlass

Repsol shows how the KDM ProjectDataStor has rationalized PPDM-based seismic data.

Katalyst Data Management (KDM) has added interpretation data management to iGlass, its PPDM-based data management solution. The new KDM ProjectDataStor supports interpretation project and metadata archival in an ‘end-to-end’ hosted offering. iGlass comprises a web-based, Esri map interface and a PPDM 3.8 backend.

iGlass was cited in a recent PPDM presentation by Repsol’s Cindy Cummings who observed that hitherto, industry has never agreed on a standard for a unique survey identifier. While the PPDM data model can accommodate many seismic data types, often different data components are not managed as a single group making for complex real world workflows.

KDM has followed the PPDM model as closely as possible when developing the iGlass database and much has been learned by its staff. Katalyst will feature iGlass and the ProjectDataStor module at the PNEC Data Management Conference in Houston next month and at the Vienna EAGE in June.


NETL’s game-changing ‘public’ frac dataset

Cores taken in Wolfcamp test wells suggest current hydraulic fracturing design is ‘inefficient.'

The US Department of Energy’s National Energy Technology Laboratory (NETL), working with the Gas Technology Institute, Laredo Petroleum, and industry partners, has collected what is claimed as the ‘world’s most comprehensive’ hydraulic-fracturing research dataset. At a Permian Basin test site, eleven 10,000 foot long horizontal wells were drilled and stimulated in the upper and middle Wolfcamp formations. Some 600 feet of core was cut through the fractured zones, providing researchers with ‘phenomenal quality’ core samples.

Current fracturing operations are said to be ‘inefficient.’ By improving frac design and execution, it should be possible to reduce the number of wells drilled along with the amount of water and energy needed. The team describes the results as ‘game-changing.’

NETL says that the data will be made publicly available, but don’t hold your breath. The NETL’s Andrew Gumbiner told Oil IT Journal, ‘Data from the project is proprietary for two years. Some project results will be made public during the two year period but there are currently no specific dates or venues for publication.’ Other partners in the test included Core Laboratories, Devon Energy, Encana, Total and Halliburton. Laredo Petroleum provided technology leadership during all operational phases of the research. More from NETL.


Software, hardware short takes

Emerson/Roxar, Eliis, Petrosys, Nvidia, Kongsberg, INT, Ikon Sciences, GeoTeric, EnergyIQ, Dynamic Graphics.

The new 7.2 release of Emerson/Roxar’s Tempest reservoir engineering suite enhances the ‘Enable’ history matching and uncertainty analysis module. A new ‘App Connector’ supports complex, multi-application workflows for stochastic, ensemble-based analysis. Tempest 7.2 is the commercial release of ongoing bespoke development under Statoil’s total uncertainty management program and builds on a long-term collaboration on algorithm development with the University of Durham, UK.

The 2016 edition of Eliis’ PaleoScan adds new modules for sequence stratigraphy, automatic geobody extraction, rock property modeling, colored inversion and seismic to well tie at sub-seismic resolution. Also new is a multi-z picking tool, spectral blueing and RGB blending. Parallel computing has also been upgraded with the latest multi-core technologies.

The 17.7 release of Petrosys’ eponymous oil and gas mapping package brings an up to 100x performance hike through intelligent data caching. Other enhancements include on-the-fly contouring, new depth conversion options and velocity models. Fault surfaces from Petrel, OpenWorks and Paradigm Epos can now be directly displayed in the 3D Viewer. The release adds more direct connections and data management functionality to popular interpretation suites.

Nvidia’s new Tesla P100 hardware accelerator and NVLink interconnect promise an up to 50x performance boost for data center applications over the previous K80 edition. Nvidia’s Pascal architecture drives the new units to 5 teraflops of double precision performance.

The 2.0 release of Kongsberg’s LedaFlow transient multiphase flow simulator adds new functionality for wax and hydrate modelling in addition to core model improvements for low liquid loading and vertical flow. LedaFlow is co-developed with industry partners ConocoPhillips and Total under the LTDA consortium. Experimental verification is provided by Sintef.

INT’s GeoToolkit.NET (V3.7) and HTML5 (V2.1) releases include support for specialized seismic, contour, log and well schematics. The HTML5 edition uses WebGL technology to interact with geophysical data from desktops or mobile devices.

A new RokDoc 6.3.2 release from Ikon Science improves well ties with Bayesian wavelet estimation for broadband seismics. The geomechanical module adds guided workflows for novice users and new mud weight vs. deviation/azimuth and stress polygon plots. Existing modules have been enhanced to include tensile strength as a variable, and pore pressure-stress coupling can be computed for horizontal stress. Ikon, in collaboration with Schlumberger is adding colored inversion and 2D forward modelling workflows to the Petrel quantitative interpretation solution.

GeoTeric 2016 reduces processing time by ‘up to 90%.’ Disk space requirement is also down 80%. The new release adds context sensitive navigation, improvements to the spectral expression tool and enhanced color blending and seismic attribute volume visualization.

IBM has announced the Data Scientist Workbench, an all-in-one tool for programmers, data engineers, data journalists, and data scientists who are interested in running their data analysis in the cloud. Python, R, and Scala are currently supported.

EnergyIQ has announced Active-Exchange, an event-based upstream master data management solution that enables companies to exchange and synchronize data between disparate applications. The company has also announced IQinsights, a single point of access to corporate well performance information.

Dynamic Graphics’ CoViz 4D visualization flagship is now optimized for rapid loading and display of ‘billion point’ Lidar datasets.


Kappa’s ‘generation 5’ software and public KURC report

Kappa unconventional resources consortium report shows diversity of frackers’ numerical models.

Kappa Engineering has rolled out its ‘generation 5’ software with the ‘official’ release of Kappa Workstation 5.1. The new 64 bit-only Microsoft .NET software includes Saphir, Topaze, Rubis and a new formation testing application, Azurite.

Azurite provides an integrated environment for processing raw formation tester data. Functionality includes quick look pretest regime identification and computation, full pressure transient analysis and pressure gradient and fluid contact determination.

The company has also released a short public report on the outcome and future of the Kappa unconventional resources consortium (KURC). Phase 1 of the project saw some 28 companies invest €3.5 million in new software tools for unconventional well testing and reserve estimation. Participation in phase 2, a one year extension, suffered from the downturn, but is scheduled to complete by year end 2016 with a possible one year extension.

The main project deliverable is KURCApp, a suite of Topaze workflows tuned to shale evaluation and ‘technically proper’ reserves booking. These include discrete fracture network models, stimulated reservoir volume and well flowback studies and microseismic data analytics. Integration with geomechanical models for re-fracking also gets a brief treatment. The 28 page public report is more of a teaser and an invitation to join the consortium than a treatise on shale production analysis. But it provides some illumination into the diversity of the frackers’ numerical models and their intricate workflows.


SMi E&P Data Management, London

Chris Bradley on the art of the data possible. BG’s ‘pet hate’ of data ownership. Dong on the intricacies of Norway’s production data. Troika, getting serious with seismics. NDB gets production data out of the closet. Arundo and the perfect data storm. Sintef’s DataGraft. ArcaStream, GE...

First and foremost, hats-off to SMi for organizing this, the 17th edition of its E&P data management conference which was held in London earlier this year in the throes of what is probably the deepest industry downturn in the UK to date. Chris Bradley (Independent consultant) reminisced on the days of ‘big money’ data projects when a major could afford $10-15 million on a ‘big bang’ enterprise development. Even then the outcomes were disappointing. Today, there is renewed interest in ‘big data.’ But the problems of data governance and stewardship remain. Most organizations have not sorted out their ‘little’ data. Bradley advocates a Dama-type approach starting with a high level map of the business to identify key concepts that are shared across exploration, production, downstream and retail. This provides a framework for a conceptual data model, itself the basis for more drill down into detailed objects and complex relationships. You also need to understand data ownership, and ‘prepare to be horrified’ as some data may be held in ‘27 different systems.’ Data management is not the ‘field of dreams,’ but rather the ‘art of the possible.’ Plan big, implement small. Data models and governance are vital.

While we were reflecting on the vitality of data governance and ownership, Neil Storkey (BG Group) cited ‘ownership,’ along with a ‘single version of the truth’ as two of his pet hates! ‘We don’t want ownership in our vocabulary.’ Bradley’s Dama approach has failed while Storkey has had ‘pockets of success.’ But pockets are not enough, a new data order is needed. First try to get IT out of the loop and stop switching to the next new technology panacea. What is required is more attention to strategy, Storkey’s specialization. This means adopting the RACI framework and getting the business to say what outcomes are needed. BG’s data LifeSaver defines responsibilities of data providers and consumers. Next you should empower citizen stewards of data, ‘even though Gartner says they will fail!’

Peter Jackson provided a plug for Bain & Co.’s Rapid toolset for clarifying decision accountability and its application to big data projects. According to a Bain study, only 4% of companies ‘execute well’ on big data strategy and oil and gas particularly lags on adoption. This is due to a lack of understanding of big data’s capability and an unwillingness to change how we work. Companies should create a big data center of excellence and staff it up with data scientists, analysts, developers, data managers, engineers, product managers and business/legal folks (blimey!). Alteryx’ workflow-based cleansing and Tableau both got plugs as key analytical tools used inter alia to identify pump seal failure, fluid contamination and out-of-spec operations. All achieved with data mining, regression analysis and fuzzy logic.

Magnus Svensson (Dong E&P) traced Norway’s rather intricate history of production reporting and the EPIM standards body. There is more to production reporting than meets the eye as oil and gas flows from a subsea module, into a production platform and on through export pipelines to the shore. The process involves complex volume reporting by the operator to partners and by partners to the government. There has been a decade or so of various production standards development and ‘we are still not finished.’ Reporting still involves manual massaging of data, but it is getting harder to tamper with data now it is not just in text format. This is an important consideration as a terminal operator may report for 20 different companies. Despite the current standards’ imperfections, implementation has been a worthwhile, albeit costly exercise. The authorities are keeping a close eye on developments as the industry ‘may have been reporting the wrong numbers for 10 years.’ In the Q&A it emerged (to some laughter) that a parallel UK initiative, PPRS, the production data exchange format has ‘lost focus.’

Jill Lewis (Troika) reported on another standard that was having trouble with adoption. SEGY Rev 1 has been out for 13 years and is still hardly used. SEGY Rev 2 will be out ‘real soon now.’ Even with a standard, individuals tend to do things differently so companies need to automate checks and leverage expertise. Data management is a vast subject and you need to prioritize. All field data should be in SEGD. But some companies are still using formats from 1975! ‘You can’t be serious!’ Specifying a required format in a contract is a necessary but not sufficient condition. One company specified SEGD Rev 3 in the contract. But the data came back in Rev 1 (an old contract had been used). Format and QC checks (with Troika’s software) are particularly important before old tapes are destroyed. ‘Make sure that you have control of this stuff even though your outsourcing partners may not relish your efficiency.’
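As a sketch of the kind of automated format check Lewis advocates, the following Python snippet reads the SEG-Y revision word from a file’s binary header. It assumes the Rev 1 layout (a 3,200 byte textual header followed by a 400 byte binary header with the big-endian revision number at bytes 3501-3502); real-world tapes may not honor this, which is rather the point.

import struct
import sys

def segy_revision(path):
    # Return (major, minor) revision read from the binary file header
    with open(path, 'rb') as f:
        f.seek(3500)  # 3,200 byte textual header plus 300 bytes into the binary header
        rev = struct.unpack('>H', f.read(2))[0]
    return rev >> 8, rev & 0xFF  # e.g. 0x0100 -> (1, 0)

if __name__ == '__main__':
    major, minor = segy_revision(sys.argv[1])
    print('SEG-Y revision {}.{}'.format(major, minor))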

Jonathan Jenkins (NDB) and co-author Cindy Wood of the UK Oil & Gas Authority presented on production data management, allocation and reporting. Recent Guidance Notes for Petroleum Measurement heralded an OGA review of production allocation systems. These dovetail with a 2016 Aupec/NDB data maturity survey which found that most companies have no dedicated production data management at all and simply rely on Excel. One result is that engineers don’t trust allocation figures. Volumes can be 30-40% adrift. Deployment at client sites is almost exclusively a choice between different vendors’ software. ‘Standards? They could not care less, so we did not do it.’ Jenkins’ goal is to ‘get production data out of the closet,’ with an objective of saving 50 days per engineer per year by building a central body of truth exposing the same figures to both allocation and production systems.

Tor Jakob Ramsoy (Arundo Analytics) sees a ‘perfect storm’ about to hit oil and gas, driven by the internet of things, the cloud and by data science. At the nexus of the storm is Arundo’s offering. Companies need to ready themselves for the storm by high grading their CIO and adopting a digital strategy. Deep learning is exploding across the whole industry. Data, not software, is the real challenge. Ramsoy advocates data transparency across the value chain. FMC, Aker and Schlumberger can all provide embedded software and an integration layer to support a cloud-based data mart. Asset owners will subscribe to apps running against the data platform.

Robert Bond (Charles Russell Speechlys) provided a salutary reminder that data, big or small, is subject to a host of regulations covering its protection, cyber security and due diligence requirements in M&A. These requirements differ from country to country. If big data is where the money is, folks will try to steal it! So rule number one is, if you don’t need it, get rid of it! Other issues arise on the transfer of data from one company to another. EU legislation requires that the recipient of such a transfer must be able to demonstrate its right to use the data for specific purposes. In a merger, the US is particularly watchful about divulging personal data to non-US entities. Some jurisdictions have extremely prescriptive laws, including the possibility of sending you to jail. Some companies like PeopleSoft and Salesforce used to rely on US safe harbor legislation for data held in the US. But safe harbor ended in October 2015. Roll in social media, home working and BYOD* and you are entering a minefield!

Dumitru Roman presented Sintef’s DataGraft cloud-based service that offers developers ‘simplified and cost-effective solutions for managing their data.’ The system went live late last year and allows users to transform tabular data into semantic-web style RDF**. Roman opined that even ‘open’ data such as that provided by the NPD Fact Pages is hard to query and hard to integrate with other Norwegian datasets such as the Norway business registry or publicly available real estate data. Roman’s answer is to ‘scrape’ data from the PDF sources and create a cloud-based data service. DataGraft is a part of the linked open data movement and assumes familiarity with W3C standards like RDF, Sparql and Owl. Data can be hosted on DataGraft’s semantic graph database. The work stems from the EU 7th Framework DaPaaS project.
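To give a flavor of the tabular-to-RDF transformation that DataGraft performs, here is a minimal Python sketch using the rdflib library. The namespace, record and property names are hypothetical, not the NPD’s.

from rdflib import Graph, Literal, Namespace, RDF

EX = Namespace('http://example.org/npd/')

# One 'scraped' tabular record: a field name and its operator
row = {'field': 'TROLL', 'operator': 'Statoil'}

g = Graph()
field_uri = EX[row['field']]
g.add((field_uri, RDF.type, EX.Field))
g.add((field_uri, EX.operatedBy, Literal(row['operator'])))

# Serialize as Turtle, ready for loading into a graph store and Sparql querying
print(g.serialize(format='turtle'))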

ArcaStream’s James Pitts enthusiastically announced that he was going to talk about technology. Specifically, about ArcaStream’s software defined storage, which leverages commodity IT pricing. Often, vendor data management strategies are over-reliant on human intervention and you end up with a ‘data junkyard.’ Tiering, replication and other data services may only work with the vendor’s solution, leading to lock-in. Other ArcaStream tools automate data ingestion, extracting and indexing metadata from magnetic tape. (Some seismic data specialists appeared skeptical of this possibility when confronted with real world ‘warts-and-all’ seismic tapes!)

Pat Schiele (GE) thinks that, in the downturn, the upstream can learn a lot from the downstream, especially in using data for asset maintenance. The downstream, along with power gen and aviation, have all ‘been here before’ and have moved from time-based to condition-based maintenance. In the upstream though, every well’s Christmas tree is custom, making for high cost of design, manufacturing and maintenance. Schiele argues that more standard designs should stay within a ‘range of suitable’ as opposed to the current ‘every well is different’ approach. He also observed that the current practice of paying a company like GE for maintenance means that the ‘cost/reward structure is wrong.’ We need more data sharing and collaboration. This happens in utilities and power gen, not so much in the upstream. ‘We can make money at $40 oil, just not very much.’

* bring your own device.

** resource description framework.


Prospero 3rd oil and gas cyber and scada security conference

Microsoft InTune, ABB Network Manager, Nixu cyber defense, Stonesoft, Wireshark, Nessus ...

There were interesting papers presented at the Prospero 3rd cyber and scada security for oil and gas event held earlier this year in Amsterdam. Unfortunately, the ‘Chatham House’ rule that governs the event prevents us from reporting who said what or acknowledging anyone but the conference organizers.

One Norwegian service company has deployed Microsoft’s InTune company portal to securely manage its mobile users’ devices. All devices (corporate or BYOD*) must have the InTune software running to communicate with the corporate network.

An EU gas distribution network operator showed how the security of its trans-national scada network has evolved over the last 25 years or so. Initially, scada systems were protected simply by isolating them from external traffic. Over the years the network has been opened up to more and more services and systems have been hardened. In the last few years, a new operational model has been deployed that necessitates more calculations and services that are consumed by geographically dispersed users. Data use has extended from operations to invoicing. In the meanwhile, the operator has decommissioned its legacy ABB Spider scada control system. This has been replaced with ABB’s Scada Network Manager. Security in the new environment is assured by homologated ICCP protocols between control centers, secure gateways and by real time monitoring and analysis of network traffic. The operator considers ICCP as a necessary but not sufficient condition for its cyber protection. This has been further bolstered with the use of IPSec communications across the board. The company is keeping a close watch on evolving EU regulations touching on cyber security of critical infrastructure including oil and gas distribution.

Two presenters emphasized the need for network segmentation as a means of reducing the attack perimeter and limiting the impact of a possible hack. Segmentation involves dividing the network into smaller segments and isolating critical infrastructure. While this helps prevent unauthorized access and restricts the spread of malware, the challenge is then how to assure access for authorized users and to re-establish connections for essential systems. In which context the Nixu cyber defense center got a brief plug.

A major EU refiner also insisted on the need for network segmentation and a strong separation between the enterprise information system and the industrial control system. Communications between the two environments must pass through a ‘physical firewall’ and dual-attached devices that span both systems are prohibited. Great attention is placed on the physical security of ICS components which are assessed to assure appropriate degrees of protection and equipment redundancy. Wireless connections for control and safety functions are proscribed. While suitably robust wireless devices are authorized in some circumstances, such networks are considered potentially ‘hostile’ and must be isolated from process control at least through a level 1 dedicated firewall. The refiner is about to add further protection to its systems with the deployment of next-generation Intel/Stonesoft firewalls. These include intrusion detection, deep packet inspection, application-level control and a secure VPN. Other tools mentioned included Wireshark, Nessus (vulnerability scanner) and Nmap.

* bring your own device.


Folks, facts, orgs ...

Kongsberg Digital, AccessESP, Acteon, Aquatic, Audubon Field Solutions, Blackhawk, C&J Energy Services, Canary Labs, Chevron, Department of Energy, SSG Limited, Energistics, EQT Corp., Gemini, Honeywell, IDC, Western Digital, iRODS, KBR, OFS Portal, Royal Dutch Shell, IOGP, Actuant, Allegro, NCS Multistage, Petrowest, Precision Drilling, Primoris, Quanta, PODS.

Hege Skryseth is to head-up the new Kongsberg Digital unit and also assumes the position of chief digital officer of the parent company.

John Algeroy has joined AccessESP as Europe and Africa Region Manager.

Bob Terrell is now regional manager at Acteon’s Aquatic Engineering & Construction unit. Andrew Blaquiere has also transferred to Aquatic.

Audubon Field Solutions has hired Kurt Newbrough as director of survey and Scott Shaver as operations manager. Both hail from Energy Surveys, Inc.

Blackhawk has opened a new subsidiary and operating facility in Villahermosa, Mexico to hire and train local employees.

Randy McMullen is now CEO at C&J Energy Services succeeding retiree Josh Comstock. McMullen also retains the CFO role.

Jeff Knepper is the new executive sales director of Canary Labs.

Mark Nelson is Chevron’s VP of strategic planning, replacing Joe Naylor who is now VP policy, government and public affairs.

Neelesh Nerurkar is now the US Department of Energy’s deputy assistant secretary for oil and natural gas. Doug Hollett is principal deputy assistant secretary in the Office of Fossil Energy, replacing Julio Friedmann who returns to the Lawrence Livermore National Lab.

SSG Limited is now a member of Energistics.

Bob McNally is now senior VP and CFO at EQT Corp, succeeding Philip Conti who remains on the board.

Gemini has named Peter Sametz president and CEO succeeding retiree Doug Lautermilch.

Sean Fuller has joined GSE Systems as senior VP sales. He hails from GE Hitachi Nuclear Energy.

Darius Adamczyk is now president, COO and CEO with Honeywell.

John Villali has joined IDC as research director. He hails from DNV-GL.

Western Digital is now an active contributor to the iRODS Consortium.

Greg Colon has joined KBR as president, engineering & construction, Asia-Pacific. He hails from Worley Parsons.

Dave Wallis is to retire from OFS Portal as director for eastern hemisphere. CEO Chris Welsh will assume his duties.

Royal Dutch Shell’s Monika Hausenblas succeeds John Chaplin as chair of the IOGP. Caterina De Matteis has been promoted to policy officer.

Ken Bockhorst is executive VP, Energy at Actuant. He replaces Brian Kobylinski who is now president and COO of Jason Industries.

Allegro has hired Paul French as its new Chief Marketing Officer. He hails from Armor.

Robert Nipper, founder and CEO of NCS Multistage, has been appointed executive chairman of the board. Marty Stromquist is the new CEO.

Dan Tsubouchi has been appointed to the Petrowest board of directors.

Carey Ford has been appointed as interim CFO of Precision Drilling following Robert McNally’s resignation. The search is on for a full time successor.

Thomas McCormick is COO at Primoris.

Jim O’Neil has stepped down as president, CEO and member of the board of directors of Quanta. His place is taken by Earl Austin, who retains the COO title.

The PODS board of directors has elected Novara GeoSolutions’ Scott Blumenstock (Treasurer), New Century Software’s Ron Brush, and TRC’s Peter Veenstra for one year terms as service provider representatives. TransCanada’s Kartin Franke, Chevron’s Paul Herrmann, Oneok’s Chad Hultman, BP America’s Michael King, Plains All American Pipeline’s Michael Ortiz and PG&E’s Wen Tu have been elected for two year terms as operator representatives.


Done deals

Wood Group, Ingenious, Wilks Brothers, Trican, WellDog, Cash Resources, Gasmet, Quantitech, OFSCap, Oil and Gas Asset Clearinghouse, CenterGate Capital, Navig8, RKOffshore, Iron Mountain, ION, Honeywell, Movilizer, RSI Video, Canaccord Genuity, FFT, Actuant, FourQuest.

Wood Group has acquired e-learning and training specialist Ingenious.

Wilks Brothers has acquired 16% of Trican Well Service.

WellDog has received A$4.25 million in new debt financing from Cash Resources Australia.

Gasmet Technologies of Helsinki, Finland has acquired UK-based Quantitech following the retirement of the current owner and MD Keith Golding.

OFSCap has acquired The Oil and Gas Asset Clearinghouse from its owner, CenterGate Capital.

Navig8 Group has acquired offshore oil and gas QHSSE service provider RKOffshore, now rebranded as RK8.

Australian, Canadian, UK and US regulators have ordered Iron Mountain to divest records management assets in order to proceed with its $2.6 billion acquisition of Recall Holdings.

ION Geophysical has acquired Global Dynamics, developer of the SailWing marine towing system.

Honeywell has acquired Movilizer, a provider of cloud-based support for field service operations. The company also recently acquired intrusion detection specialist RSI Video Technologies for $123 million. The proposed ‘combination’ with United Technologies has been abandoned due to UT’s ‘unwillingness to engage in negotiations.’

Canaccord Genuity is advising the Formation Fluid Technology board on strategic alternatives including a possible sale.

Actuant has acquired FourQuest Energy’s MENA assets in a deal worth $60 million.


BP shifts to standardized rig ITC

SMi Oil & Gas telecommunications presentation shows how BP is rationalizing its ‘fragmented’ rig IT with a standard infrastructure to support the SiteCom/OpenWells-based BP Well Advisor suite.

Stephen Teale, BP’s global rig IT manager, speaking at the recent SMi Oil & Gas Telecommunications event in London, observed that in the current low cost environment, digital technology promises much needed efficiencies and cost savings and that telecommunications are key to realizing the digital opportunity. BP’s objective of drilling ‘safe, compliant and reliable wells’ is hampered by current rig IT that is ‘disordered and fragmented’ with multiple solutions deployed and with bespoke, regional developments. This leads to increased cost and complexity in application and infrastructure support. Installations and upgrades have led to equipment rooms packed with multiple systems.

Teale’s team has therefore developed a standard rig IT infrastructure that will be deployed on new rigs over the next couple of years and retrofitted to some 30 existing rigs. BP’s ‘well advisor’ (BPWA) is based on Kongsberg’s SiteCom* communications technology and Landmark’s OpenWells operations reporting infrastructure, all with ‘stringent’ digital security.

Real-world deployments on major platforms such as Great White and Ocean Victory illustrated the scope of rig IT. VMware’s ESXi hypervisor serves multiple virtual machines running the BPWA suite of applications. Physical server management is achieved with HP’s iLOX. The BPWA suite is comprehensive and multi-vendor. Well planning, for instance, couples Schlumberger’s Techlog with Landmark’s DecisionSpace AssetView. Other systems of note include PI, Primavera and Petex. The global wells organization’s back office knowledge management system has been developed in SharePoint.

New subsystems are developed as proof of concept and validated in a test lab environment. Teale’s group then provides templates for different use cases, from a bill of materials, through project plans, system of record to an IT&S requirements document. These guide personnel through commissioning, delivery and testing.

Cramming all this kit onto the rig mandates attention to details such as space, weight, power and cooling requirements and cost. BP is also moving away from a diverse legacy communications infrastructure with many contracts and vendors, and potentially multiple points of failure. Now a single contract and unified vsat communications have simplified delivery. End-to-end redundant design has lowered the risk of failure, heightened security and improved performance. The standardized system has reduced costs through bulk purchase of satellite bandwidth, with outages of less than one hour/month/rig.

* See also Tessella powers BP’s Well Advisor (N° 3 2016).


IRMS rolls-out valve integrity management software

New ‘VIMS’ software and services offering from Independent Risk Management Systems.

Delft, Netherlands-headquartered pipeline repair specialist IRMS* has announced a new software and service offering for pipeline valve integrity assurance. According to IRMS, industry data on valve integrity and maintenance is poorly analyzed. Enter IRMS’ valve integrity management solution, ‘Vims.’

IRMS general manager David Obatolu said, ‘We see many valve problems arise that could have been avoided with a methodical data review of the pipeline system. Vims software and our review process provides operators with a straightforward way of monitoring all aspects of their valves.’

The process starts with an audit of the number and condition of installed valves. Audit data is logged and analyzed before capture into the operator’s existing MRO systems and processes. Data can be benchmarked according to valve type, location and reliability. The standardized approach leverages ‘solid risk and reliability engineering’ and is claimed to support informed decisions on inspection plans and repairs.
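As a purely illustrative sketch (the columns and figures are hypothetical, not IRMS’ actual data model), the following Python snippet shows the sort of benchmarking of valve audit data by type and location described above.

import pandas as pd

# Hypothetical valve audit records
audit = pd.DataFrame({
    'valve_type': ['ball', 'gate', 'ball', 'gate'],
    'location': ['riser', 'manifold', 'manifold', 'riser'],
    'passed_test': [True, False, True, True],
})

# Reliability benchmark: pass rate per valve type and location
benchmark = audit.groupby(['valve_type', 'location'])['passed_test'].mean()
print(benchmark)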

* Independent risk management systems BV.


Santos deploys RedEyeWFM

Software manages commissioning and inspection activity across 200 well sites.

RedEye Apps has announced the imminent launch of RedEyeWFM with Santos’ GLNG unit its first customer. RedEyeWFM* is used by Santos and its suppliers to manage commissioning and inspection activity across its 200 CNG-producing well sites and its Queensland, Australia gathering network.

RedEye helps manage mobile workers’ activity with customizable templates for audits, inspections, work requests and other activities. Templates can be edited and reused and even bought and sold on the Redeye template store. Jobs can be scheduled and staff allocated as required. Mobile workers see jobs pop up on the RedEye app. CEO Wayne Gerard said, ‘RedEye is designed with zero training in mind. It’s easy to use and perfect for the field environment.’ Users can chat and share photos via the app and if necessary raise another work order for review and action.

* presumably work flow manager.


Sales, deployments, partnerships ...

Elsevier, SEG, Geosoft, Ingenu, Koncar, Upland Consulting, Schlumberger, BMT Argoss, Aker Solutions, ASD Global, AspenTech, Aveva, Cortex, OFS Portal, DocStar, Eco-Stim, Exprodat, Frontica, GE, SapuraKencana, GlobalView, Kongsberg, Halliburton Landmark, Oniqua, Veriluma, PetroWEB, Rock Flow Dynamics.

Elsevier has published more than 14,000 maps sourced from the SEG through its online geoscience solution, Geofacets.

Geosoft has launched the Botswana Geoscience Portal offering free access to multi-disciplinary datasets from the Ngamiland district.

Ingenu and Koncar are to deliver a pipeline surveillance and wellhead monitoring solution to Shell Nigeria. The solution is integrated and supported by Upland Consulting.

Schlumberger’s ActiveQ real-time downhole flow measurement service has been deployed on Kuwait Oil Company’s Sabriya field.

Sakhalin Energy has appointed BMT Argoss to deliver weather forecasting services in support of operations in Russia’s Sakhalin Island.

Aker Solutions has signed a renewable five year agreement with BP to deliver engineering services, asset integrity management and operations on BP-operated subsea oil and gas fields. Statoil has awarded Aker a contract for preliminary engineering work on a tie-in of the Utgard gas and condensate field to the Sleipner facilities in the North Sea. The company’s MMO unit has signed a subcontractor agreement with Kværner to provide engineering, procurement and construction services for upgrading Statoil’s Njord A semi-submersible platform.

ASD Global’s design automation suite has been interfaced to AspenTech’s capital cost estimator, Aspen basic engineering and AspenHysys.

BashNIPIneft has deployed Aveva E3D and Laser Modeler for use on upgrade and modernization projects in Russia.

Cortex has renewed its partnership with OFS Portal to implement industry best practices for eInvoicing.

Sun Oil Limited is to implement DocStar Eclipse enterprise content management and ‘smart’ automation software to improve workflow in the accounts department.

Eco-Stim has been selected by a major oil and gas operator in Argentina to provide well stimulation services in the provinces of Neuquén, Mendoza and Rio Negro.

Exprodat is to provide companies participating in the UK Oil and Gas Authority’s exploration license competition with access to its Exploration Analyst software.

Frontica has signed a five-year contract with Aker Solutions to deliver staffing, IT and consultancy services within HR, finance and procurement.

GE and SapuraKencana have signed a memorandum of understanding to provide light well intervention services in Asia-Pacific.

Oman Trading International has deployed GlobalView’s interface to its Allegro 8 solution to manage its oil and refined products trading business.

Kongsberg has signed a 12-year contract with Statoil to maintain, modify and upgrade its existing dynamic process simulators in Norway and abroad.

Eni is to implement Halliburton Landmark’s DecisionSpace suite, ‘Smart Transform’ and training services to E&P units worldwide.

Oniqua and Veriluma are teaming up to explore potential applications designed to leverage predictive intelligence technology to enhance MRO inventory management.

PetroWEB has launched its Global Seismic Library, a catalog of commercial multi-client seismic and gravity/magnetic data coverage.

Lukoil has acquired licenses to Rock Flow Dynamics’ tNavigator and Model Designer simulation solutions for use on its North American assets.


Standards stuff

PIDX field ticket best practice. OGC and ASTM team. Cloud Standards Customer Council publishes IoT architecture. IOGP releases EPSG developer’s guide. OGC’s common database RFC.

The PIDX business processes work group has approved the field ticket best practice guideline document for upstream operators and suppliers. The next step is a vote in the standards committee and final approval from the executive committee. PIDX is working on a pilot implementation to automate field ticket data capture, to document efficiencies and to leverage internet of things technology to exchange invoices.

ASTM International and the Open Geospatial Consortium are to jointly develop standards, best practices, and other tools to support the geospatial industry, with an initial focus on point cloud data. The collaboration will cover data acquisition and dissemination, location-based services, and unmanned/autonomous navigation.

The Object Management Group-sponsored Cloud Standards Customer Council has published the cloud customer architecture for the internet of things.

The IOGP has published the EPSG registry developer guide, a.k.a. IOGP 373-7-3, geomatics guidance note number 7, part 3. The document is intended to help users of the EPSG’s application programming interface to develop applications that retrieve entities and attributes from the EPSG dataset.
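For illustration, the pyproj Python library ships a copy of the EPSG dataset and can show the kind of entity and attribute retrieval the guide describes; it is not the EPSG registry API itself, so treat the snippet as a sketch only.

from pyproj import CRS

crs = CRS.from_epsg(23031)   # ED50 / UTM zone 31N, a common North Sea CRS
print(crs.name)              # human-readable name from the EPSG dataset
print(crs.to_authority())    # ('EPSG', '23031')
print(crs.datum.name)        # underlying geodetic datum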

The Open Geospatial Consortium is requesting public comment on its common data base (CDB) candidate standard, an ‘open format and encoding for the storage, access and modification of a representation of the natural and built environment for simulation applications.’ The CDB embeds commercial and simulation data formats in widespread use in industry. Data in the CDB is tailored for real-time applications. The CDB storage model supports applications in which inter-connected simulators share a common view of the simulated environment. Comments close May 27, 2016.


ABC Powering onshore oil and gas facilities, Houston

Southwestern on fuel cells. ConocoPhillips on solar. SandRidge’s ACSelerator. Aggreko, Zrho on gas.

The American Business Conferences event, Powering onshore oil and gas facilities, held earlier this year in Houston, was an opportunity to learn about the complex nature of the task and the wide range of options available.

Southwestern Energy’s Don Sevier presented the results of various trials of natural gas powered fuel cells. Sevier observed that using pressurized natural gas to power oilfield instrumentation is wasteful. Also, the EPA is tightening regulations regarding venting such instrument gas to the atmosphere. Better options include using the gas to generate electricity through a fuel cell or a thermo-electric generator. These, along with solar packages, have all been trialed and are applicable in different circumstances.

ConocoPhillips’ Rob Dwyer showed how solar power can be combined with thermo-electric generators acting as base load for when the sun doesn’t shine. Such systems are used to trickle-charge batteries overnight and to ensure that charge/discharge cycles are optimized and battery life is maximized. Battery failure and the resulting opex are a major drawback for solar, but they can be mitigated with these hybrid systems.

SandRidge Energy’s Jason Niven has developed a sophisticated software monitoring and control system to manage the company’s 1,150-mile network of power lines and seven substations. The system is particularly useful in managing outage response. Weather-generated outages can be widespread and may lead to information overload if alarming is not properly filtered. The in-house developed ACSELerator substation control system includes an interactive GIS map and scada tools for overview, troubleshooting and operations. It captures data from wellsite power meters and artificial lift run status, with links to weather radar (including lightning strikes) and to other systems such as Maximo and Cygnet.
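ACSELerator’s internals were not described in detail. Purely as a hypothetical illustration of the alarm-filtering idea Niven raised, the following Python sketch groups raw outage alarms by substation and suppresses repeats within a time window; all names, fields and thresholds are invented.

# Hypothetical sketch: de-duplicate storm-driven alarms so operators see
# one actionable event per substation rather than an alarm flood.
from collections import defaultdict
from datetime import datetime, timedelta

WINDOW = timedelta(minutes=15)        # assumed suppression window

def filter_alarms(alarms):
    # alarms: time-ordered list of (timestamp, substation, message) tuples
    last_seen = {}
    grouped = defaultdict(list)
    for ts, substation, message in alarms:
        key = (substation, message)
        if key in last_seen and ts - last_seen[key] < WINDOW:
            continue                  # repeat within the window - drop it
        last_seen[key] = ts
        grouped[substation].append((ts, message))
    return grouped

alarms = [
    (datetime(2016, 4, 1, 2, 0), 'Substation 3', 'Feeder breaker open'),
    (datetime(2016, 4, 1, 2, 5), 'Substation 3', 'Feeder breaker open'),  # suppressed
    (datetime(2016, 4, 1, 2, 7), 'Substation 5', 'Loss of voltage'),
]
for substation, events in filter_alarms(alarms).items():
    print(substation, events)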

A presentation by Aggreko’s Tony Walluk demonstrated better economics from a large-scale, 5 MW central gas-fired power plant than from per-site generation. The system also gave a better return than taking the gas to market.

Jon Hesse presented Zrho’s field gas conditioning system, used to bring produced gas up to spec for power generation. Zrho’s 1500CMT unit uses catalytic reforming to produce a consistent methane stream. Each unit offsets around 1500 tonnes of carbon annually compared with diesel-based generation. More from ABC.


Patent potpourri

Arria patents brevity. ConocoPhillips sued over shareholder voting. Auto-Dril vs. Pason, case dismissed! Rapid Completions sues Baker Hughes.

Arria NLG has been granted a US patent for a ‘method and apparatus for referring expression generation.’ The breakthrough means that ‘the second or subsequent mention, in a report, of a machine, device or process can be expressed in a word or two, rather than by its full name.’ The company claims inter alia that its natural language technology will allow devices deployed in the internet of things to ‘talk naturally.’ Natch!

Marshall Feature Recognition is suing ConocoPhillips for the infringement of its US patent No. 6,886,750 covering a ‘method and apparatus for accessing electronic data via a familiar printed medium.’ Marshall alleges that ConocoPhillips’ shareholders ‘use a smartphone with a QR Code scanner to scan shareholder voting documents.’

The case brought against Pason Systems by Auto-Dril in respect of an alleged infringement of Auto-Dril’s US 6,994,172 B2 patent on a well drilling control system has been dismissed with prejudice by the Court.

Acacia unit Rapid Completions LLC is suing Baker Hughes for infringement of US 9,303,501 B2, a method and apparatus for wellbore fluid treatment. The patent was originally assigned to Packers Plus Energy Services which holds several patents covering multi-zonal completion of horizontal wells.


Emerson educates the regulator on tank monitoring

NIOSH hazard alert sparks call for new tank gauging standard and VeriCase radar accuracy checks.

Emerson bloggers Jim Cahill and Christoffer Widahl have drawn attention to the risks associated with manual gauging of production tanks. The danger was highlighted by a recent worker fatality and a hazard alert (HA3843) from NIOSH, the US National Institute for Occupational Safety and Health. Nine fatalities were caused by manual gauging or sampling of production tanks between 2010 and 2014, with workers exposed to toxic hydrocarbons when opening hatches for manual gauging. Emerson therefore recommends automated tank gauging (ATG). But while the API 18.1 safety standard covers large (>100 bbl) tanks, there is no standard supporting cost-effective ATG on smaller tanks, where high-end custody transfer kit would be uneconomical.

A new standard in preparation, API 18.2, addresses automated tank gauging on small production tanks, which often already have level-monitoring radars for overfill prevention. Emerson recommends that operators of such tanks adopt the new standard without delay and use radar level measurements for custody transfer. Emerson’s Rosemount unit also recommends its VeriCase tool to ensure that tank radars comply with the accuracy verification requirements of the new standard.


Forecasts and fantasies

Latest reports from Research & Markets, Technavio, MHI, RepRisk, IDC, Navigant.

A report from Research & Markets predicts that the global oil and gas drones market will be worth $4 billion by 2020, up from $609 million in 2014. Drones are used to monitor pipelines, roads and storage tanks and for flare stack and rig inspections.

Technavio puts the value of the global actuator market at $6.15 billion in 2015 and projects it to reach $7.47 billion in 2020, driven by the ‘trend of automation.’ Another Technavio study puts the market for distributed acoustic sensing at over $418 million by 2020.
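As a quick sanity check (our arithmetic, not Technavio’s), the actuator figures imply compound annual growth of roughly 4%.

# Back-of-the-envelope implied CAGR from the quoted actuator market figures.
start, end, years = 6.15, 7.47, 5     # $bn in 2015, $bn in 2020, five-year span
cagr = (end / start) ** (1 / years) - 1
print(f'Implied CAGR: {cagr:.1%}')    # about 4% per year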

MHI’s third annual report on ‘potentially disruptive’ digital, ‘always-on’ supply chains sees challenges from a lack of trained personnel. MHI found that ‘at least one’ of eight disruptive technologies could be a source of competitive advantage to the supply chain over the next 10 years. The ‘lack of a clear business case’ is the major barrier to investment.

RepRisk has released the third edition of its ‘most controversial projects’ report of environmental, social and governance risk. The study covers worldwide risks in oil and gas projects inter alia.

A new study from IDC Energy Insights finds that oil and gas CIOs ‘have their hands full’ in the current climate as they try to be more productive while reducing costs. Companies are tuning existing applications and processes and updating applications that help automate and optimize processes and workflows. Mobile is growing rapidly but cloud growth is only ‘moderate.’

Navigant Research puts the upstream communications market value at over $1.5 billion in 2016. Satellite is the leading technology with fixed fiber and wireless cited as ‘cost-effective alternatives.’ Semaphore is dead!


National Oilwell Varco announces IoT for BOPs

RigSentry monitors blowout preventer integrity over the ‘Max’ industrial internet platform.

National Oilwell Varco has announced RigSentry, a remote condition monitoring service providing real time analysis of subsea blowout preventer health. The new predictive capability is expected to give customers insights into probable BOP performance issues with a prediction horizon of approximately 14 days.

NOV analyzed some 14 years of historical sensor data and maintenance logs, adding expert knowledge gleaned from the company’s 60-year history of designing, testing and manufacturing BOPs. RigSentry is claimed to identify the specific point of failure and alert customers earlier than was previously possible.

The system runs on top of NOV’s ‘Max’ industrial data collection platform. NOV president and CEO Clay Williams said, ‘Big data’s potential in condition monitoring and predictive analytics will change the way we support, maintain and design our equipment.’


Watson, Hana to marry

IBM’s cognitive computing, SAP’s business suite and database available from Austin, TX base.

IT behemoths IBM and SAP are teaming on a ‘co-innovation’ initiative to combine IBM’s Watson-based ‘cognitive computing’ with SAP’s S/4 Hana business suite and database. IBM SVP Bridget van Kralingen opined that ‘business strategy and value will derive from the foundational elements of cognitive computing, the cloud and from consumer-quality experiences in industry.’

The collaboration extends a strategic partnership covering Hana enterprise cloud services. IBM and SAP are now to develop industry-specific cloud solutions and expand current cloud services to include application maintenance and support services. For companies wary of the cloud, hybrid and on-premises options will be available, with Hana running on IBM Power systems and supported from IBM’s new SAP Hana center of excellence in Austin, Texas.

IBM will develop cognitive solutions for Hana and line-of-business solutions using its ‘cognitive’ APIs. A joint effort will also target the end user experience, with input from IBM’s iX unit and from the team behind SAP’s Fiori UX. The new team will be ‘co-located’ in Walldorf and Palo Alto. Watch the video.


Linux Foundation backs oil infrastructure project

Civil infrastructure platform to provide ‘sustainable’ software bricks to industry.

The Linux Foundation (TLF) has announced the civil infrastructure platform (CIP) that is to provide software building blocks to assure reliable operations in various industries including oil and gas. The announcement was made at the Embedded Linux conference in San Diego this month where the nonprofit organization unveiled its plans for ‘mass innovation through open source software.’ The CIP is to provide the software foundation required for essential services for civil infrastructure and economic development on a global scale.

The CIP seeks to mitigate the ‘duplication of effort, loss of development time, fragmentation and interoperability issues’ that bedevil major projects. TLF executive director Jim Zemlin said, ‘Open source software can accelerate innovation, enable interoperability and transform technology and economics for an industry. The CIP will provide a common framework to support some of society’s most important functions for decades to come.’

The CIP will add a base layer of industrial-grade software to the Linux kernel adding safety, security and reliability to the open source flagship operating system. Software ‘sustainability’ is also a key objective, particularly in the face of infrastructure life cycles of 10-60 years. Early CIP backers include Hitachi, Siemens and Toshiba.

