Oil IT Journal: Volume 22 Number 2


DNV GL’s data platform

DNV GL finds lack of awareness of digital potential in oil and gas. To help things along, the ‘trusted third party’ rolls out ‘Veracity,’ an independent data platform for frictionless connectivity.

DNV GL, the Oslo-headquartered consulting, testing and certification company for the global energy sector has surveyed the oil and gas business and found that ‘digitalization of the oil and gas sector has been increasing steadily in 2016, with significant progress in digital technologies, including artificial intelligence, automation, predictive analytics and machine-to-machine communication.’

39% of respondents to the survey, now in its seventh year, report that ‘digitalization is the area in which companies are most likely to invest in 2017,’ with plans for R&D, trials and full-scale implementations aimed at improving profitability and reducing risk.
On the downside, 30% of respondents report that lack of digital culture and funding are major barriers as is the presence of an ‘old-fashioned organizational culture.’ More generally, a lack of awareness among senior management about digitalization’s potential benefits, a skills shortage and ‘bureaucratic obstacles’ are more likely to impede progress than technical and data-related issues.

To help companies accelerate their digital initiatives, DNV GL is leveraging its role as a ‘trusted third party’ to propose an independent industry data platform, ‘Veracity,’ that is intended to ‘facilitate frictionless connections between different industry players, domain experts and data scientists.’ The company is to leverage its experience of oil and gas big data projects that focus on reduced downtime, improved safety, predictive maintenance, performance forecasting, energy efficiency and real-time risk management. In previous work it has emerged that ‘data quality is a major barrier to overcome.’

DNV GL Oil & Gas CEO Elisabeth Tørstad said, ‘Our new industry data platform combines domain expertise and data science to put quality-assured data, the veracity of data, at the center of industry-wide innovation. The aim is to not only build trust, but also boost knowledge and encourage collaboration. Industry needs to be successful at this to leverage the benefits of digitalization.’ DNV GL president and CEO Remi Eriksen added, ‘Companies have always turned to us for independent, expert assessments and best practices – to build trust in the safety, efficiency and sustainability of their physical assets and operations. Now, we are exercising this same role in the digital domain with our Veracity data platform. We are not looking to own data, but rather to unlock, qualify, combine and prepare data for analytics and benchmarking.’

More on Veracity from DNV GL. Read also our interview with DNV GL data specialists on page 3.


Predix, Maana converge

GE and Maana’s ‘intertwined vision’ realized in converged, cloud-based industrial internet. Predix meets EKT, the ‘Enterprise knowledge technology.'

Mike Dolbec, who heads up digital infrastructure investments at GE Ventures, recently explained the relationship between Maana, a company GE Ventures is backing financially, and GE’s own industrial internet initiative. Maana’s ‘Enterprise knowledge technology’ (EKT) powers critical decision support workflows of some of the world’s largest industrial companies. GE’s Predix is a cloud-based operating system for industrial applications.

EKT is claimed to ‘go beyond data lakes and data-centric technologies,’ allowing experts to ‘understand and exploit the relationships between data, physical properties and processes.’ EKT lets users develop ‘hundreds of at-scale models for asset optimization.’

In the past three years, Maana has executed on a vision that is ‘uniquely intertwined’ with GE’s. Predix and EKT have now ‘converged into a solution for industrial companies’ digital transformation.’

GE is a pioneer in the field of the digital twin, a digital representation of a physical asset. The twin combines physics-based modeling, data and artificial intelligence to understand the past and predict future outcomes. More on GE’s digital twin in our on-the-spot report from the GE Oil & Gas Florence event in next month’s Oil IT Journal.


Big data and legal risk. World’s future energy needs.

Neil McNaughton wonders what the regulator thinks of big data-derived stochastic reserves numbers. Lone Star reports on similar warnings from the Association for Computing Machinery. Mark Razmandi warns of bias in SCADA analytics. Also... the Church of England blesses fracking (official!). Oft-cited IEA fossil fuel forecasts sneak in a 2 billion world population hike for 2040!

It may be something of a canard but fairly often in data conferences there are vague warnings of potential non-compliance issues that stem from the fact that today’s reserves reporting, ultimate recoverable numbers and so on are obtained in part through various numerical modeling techniques. The warnings come from those in the data management community who are anxious not to be in the firing line if and when the SEC, or whoever is your regulator, comes back with embarrassing questions as to how such numbers were arrived at.

Consider for a moment ExxonMobil’s humongous calculation performed recently at the NCSA (see page 5), which briefly monopolized all the processing power of the Blue Waters supercomputer to produce an optimal development plan, one that might include some reportable reserves numbers. What if, some way down the road, the regulator comes back and asks for justification of the reported numbers? The operator fires up the behemoth, re-runs the analysis and comes up with a different number! For yes, these calculations are statistically derived, using either old-fashioned Monte Carlo techniques or more fancy big data-esque computation.
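By way of illustration (and emphatically not ExxonMobil’s workflow), a toy volumetric Monte Carlo in Python gives a slightly different P50 on every run unless the random seed is recorded along with the result, which is exactly the reproducibility gap the regulator’s question would expose. All distributions and factors below are invented.

```python
import numpy as np

def p50_reserves(n_trials=100_000, seed=None):
    """Toy volumetric Monte Carlo: reserves = area * thickness * recovery.
    Distributions are illustrative, not field data."""
    rng = np.random.default_rng(seed)
    area = rng.lognormal(mean=np.log(2000), sigma=0.3, size=n_trials)   # acres
    thickness = rng.normal(50, 10, size=n_trials)                       # feet
    recovery = rng.uniform(0.15, 0.35, size=n_trials)                   # fraction
    reserves = area * thickness * recovery * 7758 / 1e6                 # MMbbl (7758 bbl per acre-foot)
    return np.percentile(reserves, 50)

print(p50_reserves())          # differs slightly on every run
print(p50_reserves(seed=42))   # reproducible only if the seed is kept with the filing
```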

OK, perhaps this is rather contrived. Exxon was more interested in testing its modeling on a really big computer than in reporting. More seriously, the issue of risk, big data and analytics was highlighted recently in a release from consultants Lone Star Analysis. Lone Star reports that while ‘a consensus is beginning to define acceptable practices and define legal risk for users of big data and analytics,’ such new developments ‘pose significant legal risks for careless uses of big data.’

In the US, the Association for Computing Machinery (ACM) recently issued a statement on ‘Principles for Algorithmic Transparency and Accountability.’ The ACM outlines seven principles for big data users in business and government. These include awareness of possible bias in a model, the need for access to data and calculations, and accountability for AI-based decisions. Big data users need to be able to explain procedures and decisions and to assure auditable data provenance and model testing.

While this may be of less application to oil and gas, Lone Star also flags up recent EU legislation on data privacy. Big Data advocates who want to ‘discover’ new relationships and patterns in consumer data will have to be careful how they explore the digital unknown universe, while regulators insist that such data is only used for ‘specified, explicit purposes and only those purposes.’

Lone Star CTO John Volpi concluded, ‘We always advocate the use of redacted and abstracted data and the targeted use of machine learning over brute force big data. As society comes to grips with the risks of big data done badly, we should not be surprised to see more regulations in this space.’ And CEO Steve Roemerman added a warning on the risk of litigation, ‘The ACM principles may help US plaintiffs who feel they have been harmed by big data and AI. If models are not transparent, auditable and explainable, such risk cannot be effectively determined.’

Another contribution to the analytics discussion came from Anadarko’s Mark Razmandi in a presentation given at the LBGC Wellsite automation conference held last month in Houston (report on page 6). Refreshingly, where others like to present oil and gas as a technology laggard, Razmandi has it that oil and gas is the original big data proponent. What has changed in recent years is a shift to ‘comprehensive mass data acquisition paradigms and progressive analytics.’ In a companion presentation, Bias semantics in big data, Razmandi describes ‘objective’ bias, due to the particularities of data acquisition and the more familiar subjective bias of those seeking to make more of the data than they should. And this is no abstract concept, Razmandi’s field is scada systems engineering and oilfield automation.

~

A rather amusing claim appeared in the latest issue of the IOGP Highlights newsletter where I read that the ‘Church [of England] blesses fracking.’ In the same newsletter it appears that, ‘progress [has been made] in standardizing Christmas trees.’ Nothing like having God on your side!

On a more serious note, I have heard and read a lot recently that oil and gas has a good future. Global oil production is forecast to stay around the 100 million bopd mark out to 2040, making the noble goals of COP21 appear somewhat illusory, which is of course great news for the industry. However, it is worth looking at how the IEA, the source of various forecasts from BP, ExxonMobil and others, derives its numbers. Embedded in the IEA’s forecast is an increase in the world population of around 25% from today out to 2040.

Some of the growth is forecast to come from poor, low-to-no fossil fuel users, to whom the IEA offers sympathy but little else. But in my admittedly rather cursory scan of the IEA’s recent output, the massive growth in world population (from 7 to 9 billion) appears to be a given, and is hardly commented on.

Doing something about world population growth is probably about as hard as achieving significant COP21-style decarbonization. On the other hand it is curious that the IEA, and for that matter, pretty well everyone else, does not appear to be interested in trying to do something about both of these interlinked issues. From COP to POP?

@neilmcn


Oil IT interview, DNV GL and the Veracity data platform

Oil IT Journal talks to Cathrine Torp, Kenneth Vareide and Jorgen Kadal about DNV GL’s new data platform. Today ‘absolutely nobody is handling data as an asset.’ An enhanced data value chain is needed to fix the familiar problems of data quality and provenance and of data lakes full of garbage. DNV GL’s position as a ‘trusted third party’ has the potential to bridge the proprietary silos.

What is Veracity’s background?

KV A few years back we set up a digital accelerator group for data. We hired experts in data, software and IT security and made several acquisitions including QLabs (IT risks), Echelon (IT security) and Tireno (IT infrastructure). Our previous work in standardizing marine software development is also highly relevant to oil and gas and its big data opportunities.

CT We also employ technology domain experts. We have already developed a big data solution for drilling.

JK Our core offering and differentiator is our independent third party role and our ability to unlock siloed data. Our role of data curator and broker goes back to 1980.

That’s interesting in the context of the new big data silos from GE, Rockwell, Yokogawa and others. How do you get inside these walled gardens?

JK We are in a dialog with GE re Predix, also with Kongsberg and IBM. The issue of multiple big data silos is unresolved. There is a need for an enhanced data value chain to assure veracity. Can you trust the algorithms that have been applied to data, the data context and the outcomes? This is where the ‘platforms’ struggle. However, two platforms can co-exist. There are two ways of doing this. One option is to chain them so that, say, engine anomaly data from one platform is passed to the second platform for independent QC. But you do need to access source data. You can’t clean your way out of these data quality issues. So a better option is to use the DNV platform as a source of curated data that can be pushed back to, say, Predix for monitoring. Such problems exist in the maritime sector where different OEM data silos cause frustration and data ownership issues.

Is DNV GL a standards setting body like the American Petroleum Institute?

KV I’m not sure about the comparison but for instance, 60% of the world’s pipelines are fabricated with DNV specifications. We are worldwide standards-setters. Our JIPs last from one to three years and the outcomes are published as a recommended practice (RP). We then work with industry to keep the standard fresh. It is a bit like ISO. In fact we build on ISO standards.

CT But we are a lot faster than ISO!

Is Veracity a full services data storage deal or just on the quality/verification side?

KV It could be anything between these extremes. Today we are doing more cleaning and verification but with the present announcement it is clear that we are moving in the direction of full service.

JK We set standards, particularly with our RP for data quality*. We also offer data maturity assessment services to clients. Despite grand claims to the contrary, absolutely nobody is handling data as an asset. A lot of data we see is complete rubbish, with meaningless tag numbers and so on!

Isn’t this the problem with the data lake concept whereby GIGO**?

JK This has been one of the key findings of our research to date. Even a slight mismatch, a time shift in data from rotating equipment, can make the data useless. This is where our automated QC checks come in. We develop these in joint industry projects which deliver open source fixes that operators can deploy as services.
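To give a flavor of the kind of automated alignment check described here, a minimal sketch (an editorial illustration, not DNV GL’s code): estimate the lag between two supposedly synchronous sensor channels by cross-correlation and flag the pair when the offset exceeds a tolerance. Sample rates and tolerances are assumed.

```python
import numpy as np

def estimated_lag(a: np.ndarray, b: np.ndarray) -> int:
    """Return the sample offset between two channels via full cross-correlation."""
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    corr = np.correlate(a, b, mode="full")
    return int(np.argmax(corr)) - (len(b) - 1)

def qc_time_alignment(a, b, sample_rate_hz=100.0, max_skew_s=0.01):
    """Flag a channel pair whose time skew exceeds the tolerance."""
    skew_s = estimated_lag(np.asarray(a), np.asarray(b)) / sample_rate_hz
    return {"skew_seconds": skew_s, "pass": abs(skew_s) <= max_skew_s}

# Example: a 1 Hz vibration signature shifted by 50 samples (0.5 s at 100 Hz)
t = np.arange(0, 10, 0.01)
pump_a = np.sin(2 * np.pi * 1.0 * t)
pump_b = np.roll(pump_a, 50)
print(qc_time_alignment(pump_a, pump_b))   # fails the 10 ms tolerance
```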

What partners are involved?

JK We are in partnership with Hortonworks for the big data stuff. We have Hortonworks on premises. Cloud providers are becoming more interoperable thanks to open source software. Even Microsoft is getting into the open source thing.

What about other upstream standards from say Energistics like Witsml, Prodml?

KV We don’t have a big role in IT standards, our focus is on qualifying software. We have a role in maritime standards for functional description of ships. We also have a domain model of risk. We build on the CMMI approach, tailored to suit our industries. The IoT brings new challenges especially when ‘solutions’ come as closed, black boxes.

So what is new in Veracity is the data QC?

KV Yes. Previously we were looking at quality outside of the process that was originally used. The previous RP on IT/OT data quality is now 10 years old. The new (2017) release adds coverage for big data, data ownership and security. You should visit Oreda.com, the Offshore reliability data portal. The Oreda dataset holds years of curated asset reliability data supplied by asset owners. It is used by insurance companies and others for costing and forecasting.

On the subject of data ownership: owner-operators (OOs) see a small amount of kit from many OEMs. OEMs see lots of data but only from their own kit. Nobody sees everything!

JK You summarize the situation well. GE wants to compile data from other OEMs. But there is push back from a conservative industry. Equipment comes from many suppliers and OOs want to have a clear picture of asset reliability. This can’t be done without standards and curation. It may be easier for us to do this than an individual OEM.

We have written a lot about data handover to OOs on commissioning. In fact we will be reporting from the CFIHOS initiative in a future issue.

JK Yes, this is another open field. You need a holistic view of the supply chain and between operators. Some do share. Two months ago, one very mature operator gave the OK to share turbine data and analytics with OEMs.

They do say that they don’t like to have to pay to get their own data!

[Laughter]

JK Things have come a long way. We are working on pilots and a proof of concept that we will be showing at the OTC.

* To be reviewed next month.

** Garbage in, garbage out!


Paradigm 2017

75-page presentation covers new and updated functionality and the ‘Paradigm on top’ data strategy.

Paradigm CTO Bruno de Ribet has just published a 75-page presentation of new functionality in Paradigm’s 2017 across-the-board software release. The Echos seismic imaging package has been upgraded with new workflows for broadband and 5D seismics. All coordinate data manipulation is now 64-bit.

GeoDepth has been enhanced with improvements to the EarthStudy 360 multi-azimuth pre-stack environment. 3D gather analysis is now available on Windows.

On the interpretation front, a new integrated environment presents a 3D canvas, base map and section windows with one click data loading, synchronized navigation and new guided workflows for pore pressure evaluation including GPU-based multi-volume M2MP work. The quantitative seismic interpretation module has seen refinements to inversion with the inclusion of Skua-GoCad stochastic inversion. Again, the full QSI function is now available on Windows.

Modeling has been enhanced, notably with a link to Dassault Systèmes Abaqus for geomechanical studies. Models can be exchanged as Resqml data files.

Petrophysical analysis in Geolog has been improved with new Monte Carlo-based uncertainty analysis of rock type.

Finally, Paradigm presses ahead with its ‘Paradigm on top’ data strategy whereby Epos co-exists with and leverages third-party platforms for data management and sharing. The 2017 release sees a new Petrel direct connection, new sync-back to OpenWorks functionality and direct access to SEP-formatted seismic data on disk.


Architectural style a la REST key to interoperable software

EnergySys’ MD advises caution on current microservices fad.

EnergySys MD Peter Black picked up on our interview with Landmark/Dell-EMC (Vol 21 N° 10) to opine that today’s enthusiasm for microservices needs to be put into context.

What makes the web work is not the size of a ‘service’ but its architectural style. Here the key is the fact that no ‘out-of-band’ (OOB) information is required to access and use something on the web. This is largely down to the REST architectural style, as defined by Roy Fielding in his 2000 PhD thesis. REST-style programming underpins the web and has made browser based access possible to a vast range of resources with no need for a manual or OOB information.

Unfortunately, much enterprise software is not written this way and its use, even via a web-based microservices API, will require some OOB information or a peek into the software manual. Every system has a different manual and perhaps even another language to be learned.

Here at EnergySys we use the Open Data Protocol (OData), an OASIS standard for RESTful web services. With OData, there is no OOB information, and all a programmer needs is a starting web address. I can pick up an off-the-shelf product like Excel or Tableau, and point it at EnergySys and immediately start consuming the data. Integration is also easier with REST. Want to build a workflow? Go to Zapier.com, sign up for free, and link together over 750 apps on the web. This is revolutionary!
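To illustrate the ‘no out-of-band information’ point, here is a minimal Python sketch (the service root and entity set names are hypothetical, not EnergySys’ actual endpoints): starting from a single root address, everything else is discovered from the OData service document and the standard query conventions.

```python
import requests

BASE = "https://example.com/odata/"   # hypothetical OData service root

# 1. The service document lists the available entity sets -- no manual needed.
service_doc = requests.get(BASE, headers={"Accept": "application/json"}).json()
entity_sets = [e["name"] for e in service_doc["value"]]
print(entity_sets)

# 2. Standard OData query options then work on any entity set.
resp = requests.get(
    BASE + "Wells",                                  # assumed entity set name
    params={"$filter": "Status eq 'Producing'", "$select": "Name,DailyRate", "$top": "10"},
    headers={"Accept": "application/json"},
)
for well in resp.json()["value"]:
    print(well["Name"], well["DailyRate"])
```

The same two conventions (service document plus $filter/$select/$top) are what let off-the-shelf tools like Excel or Tableau consume the data with nothing more than a starting address.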

Today’s enthusiasm for microservices echoes the earlier, failed service-oriented architecture (SOA) paradigm whereby monolithic, legacy applications were to be delivered as neat functional components. As usual, industry over-promised and under-delivered and we don’t hear so much about SOA today. Instead, we’ve started talking about microservices. The idea here is very similar, but this time we’re not going to wrap our monolithic applications, we’re going to break them up into microservices.

If the result of a microservices effort is a set of well-written services that support RESTful interfaces that can be combined easily and securely with tools like Zapier, then we can consider that a success. However, simply announcing that we’ll build our software as microservices doesn’t say much, and has the potential to generate the same disappointment and disillusionment that now surrounds SOA.


Review, ARMA Glossary 5th Edition

American Records Management Association on records management and information governance.

The American Records Management Association published the 5th edition of its Glossary of records management and information governance terms* in 2016 and kindly provided Oil IT Journal with a review copy. The Glossary is a 78-page document that targets records management personnel with succinct (sometimes very succinct) definitions for terms that span information technology, records and information management (RIM), legal services and business management.

Is there a place for such a published Glossary in the world of Wikipedia? On the one hand Wikipedia provides much more breadth and depth to definitions. On the other hand, breadth can be a nuisance when confronted with multiple usages.

The ARMA glossary will help those working in a RIM environment home in on a usage that is likely to be relevant to them, especially for ARMA-specific terms like the ‘International information governance maturity model.’ But while such terminology is accompanied by useful cross-references as hyperlinks to other points in the glossary, it comes up short with no links to authoritative sources of information elsewhere on the web. The IGMM, by the way, is here.

* ARMA International TR 22-2016.


Common Data Access’ unstructured data challenge

Open source software’s field day at UK upstream linguistic analysis testbed.

The results of the challenge set by Common Data Access (the UK’s joint industry oil and gas data body) were presented late last year. Nine companies took up the challenge of ‘using data and linguistic analytics’ on a heterogeneous, multi-terabyte North Sea dataset.

Both New Digital Business and Venture IM teamed with Cray on an analytics pipeline that embedded Cray’s Graph Engine running on a Cray Urika-GX supercomputer. In fact Cray went all-in on this effort, providing a comprehensive open source stack that used Apache Tika to parse the plethora of file formats. Tika’s optical character recognition function was used to read scanned images. In all some 3.5 terabytes of data were ingested and indexed. Cray is very keen to promote its graph technology in this context, ‘graph will be a significant part of any similar projects because G & G data is so interconnected,’ and is offering ‘unlimited free access’ to its Urika-GX for further research.
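For readers unfamiliar with Tika, a minimal sketch of that ingest step (an assumed setup, not the Cray/NDB/Venture IM pipeline itself) using the Apache Tika server via its Python binding, which also runs OCR on scanned images when Tesseract is installed. The data folder name is hypothetical.

```python
# pip install tika  (spawns a local Apache Tika server on first use)
from pathlib import Path
from tika import parser

def ingest(path: Path) -> dict:
    """Extract text and metadata from any of the plethora of well-file formats."""
    parsed = parser.from_file(str(path))          # PDF, TIFF, Word, LAS, ...
    return {
        "file": path.name,
        "content_type": parsed["metadata"].get("Content-Type"),
        "text": (parsed["content"] or "").strip(),
    }

for doc in Path("north_sea_well_files").rglob("*"):   # hypothetical data folder
    if doc.is_file():
        record = ingest(doc)
        print(record["file"], record["content_type"], len(record["text"]))
```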

For most, analyzing this multi-discipline data set involved some sort of taxonomic analysis to align the different terminologies used. Flare Solutions, a specialist in the field, showed how its technology is used to build a synthetic well file by classifying documents against the taxonomy, identifying key industry information products. Flare noted that ‘having some structure in unstructured information supports text analytics.’ The CDA data set ‘stress tested’ current classifications and represented a learning opportunity as many additional synonyms were found, highlighting terminology variations across organizations. Flare is also planning a move to a graph model.

Hampton Data Services teamed with Zorroa whose convolutional neural net technology was used to extract information from scanned images and classify report types prior to OCR. Iterative fuzzy search and guided machine learning provided a mechanism for improving classification with use.

Independent Data Services also pitched in with an open source-based solution, using Tesseract OCR and the OpenStack private cloud that embeds Elasticsearch (web-based text search), Logstash (document processing) and Kibana (data exploration). The stack enables mining of structured and unstructured data.
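A minimal sketch of the search end of such a stack (index name, well identifier and document fields are our own assumptions, not IDS’ schema): OCR a scanned page, push the extracted text into Elasticsearch and run a full-text query.

```python
# pip install elasticsearch pytesseract pillow   (elasticsearch-py 8.x style API)
from elasticsearch import Elasticsearch
from PIL import Image
import pytesseract

es = Elasticsearch("http://localhost:9200")   # assumed local cluster

# OCR a scanned report page and index it alongside minimal source metadata.
text = pytesseract.image_to_string(Image.open("scanned_report_p1.tif"))   # hypothetical file
es.index(index="well-reports", document={"well": "21/10-A1", "page": 1, "body": text})

# Full-text search across everything indexed so far.
hits = es.search(index="well-reports", query={"match": {"body": "drill stem test"}})
for hit in hits["hits"]["hits"]:
    print(hit["_source"]["well"], hit["_score"])
```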

Agile Data Decisions demoed its iQC tool, observing that it is better to maximize the use of what structured database information is available rather than trying to extract information from unstructured documents. Again, open source software predominated with Python-based machine learning and a Hadoop ecosystem.

Schlumberger used Wipro Holmes and Solr to derive value from unstructured data, although the slide set is rather light on the details!

CDA project manager Dan Brown summarized the outcome of the exercise for Oil IT Journal. ‘We have incorporated an analytics program in our business plan, building on what we’ve learned from the challenge, and are considering working with other industry bodies to deliver a second challenge for 2017, in the seismic domain. We are continuing to share lessons learned from the challenge with a second ECIM workshop in Stavanger and we will be presenting the results at upcoming conferences. The key data management lesson is that modern analytical techniques depend upon access to large quantities of high quality, well organized data, for training and tool evaluation, as well as problem solving. There is a clear role for national data repositories in facilitating this.’

Read the CDA Challenge presentations here.

~

Although this is unrelated to the CDA challenge, those interested in such matters may like to follow the ‘TextExt’ DBpedia Open Text Extraction Challenge.


NCSA Blue Waters runs reservoir fluid flow models

ExxonMobil’s John Kuzan talks to Oil IT Journal about record-breaking supercomputer trial.

ExxonMobil has used the National center for supercomputing applications’ Blue Waters supercomputer to benchmark a series of multi-million to billion cell reservoir models using its own proprietary code base. The parallel simulation test used all of Blue Waters’ 22,640 Cray XE6 32 core nodes and 4,228 Cray XK7 GPU hybrid nodes for an aggregate 716,800 processors. The system was used to help ExxonMobil ‘make better investment decisions by more efficiently predicting reservoir performance under geological uncertainty to assess a higher volume of alternative development plans in less time.’ Blue Waters’ parallel processing capability was used to speed modeling of multiple realizations of fluid flow models of various reservoirs which are currently ‘hampered by the slow speed of reservoir simulation.’
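As a purely illustrative sketch of the ensemble pattern involved (nothing like ExxonMobil’s proprietary simulator), independent model realizations can be farmed out across processor cores and the results collected into a distribution of outcomes. All numbers are invented placeholders.

```python
from multiprocessing import Pool
import random

def run_realization(seed: int) -> float:
    """Stand-in for one reservoir simulation under one geological realization.
    Returns a recovery estimate in MMbbl (toy numbers only)."""
    rng = random.Random(seed)
    porosity = rng.uniform(0.12, 0.25)
    net_to_gross = rng.uniform(0.5, 0.9)
    return 800.0 * porosity * net_to_gross   # placeholder for the 'simulation'

if __name__ == "__main__":
    with Pool() as pool:                      # one worker per available core
        results = pool.map(run_realization, range(1000))
    results.sort()
    print("10th/50th/90th percentile recovery (MMbbl):",
          results[int(0.1 * len(results))],
          results[int(0.5 * len(results))],
          results[int(0.9 * len(results))])
```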

ExxonMobil’s scientists worked closely with the NCSA to benchmark a series of multi-million to billion cell models on NCSA’s Blue Waters supercomputer. The ExxonMobil/NCSA team tuned all aspects of the reservoir simulator from input/output to improving communications across hundreds of thousands of processors. These efforts have delivered strong scalability on processor counts ranging up to Blue Waters’ full capacity. In an email exchange, Oil IT Journal asked John Kuzan, manager, reservoir function for ExxonMobil Upstream Research for more on the trial.

Was the idea to run many alternative development plans on smaller models or to perform an exhaustive analysis of an extremely large model?

Yes to both. We have techniques for running coarser models that are trend-accurate or simply taking longer with fewer processors on very large models. In short, lots of options to quantify the subsurface uncertainty and have an impact on business decisions.

Was the standard Blue Waters configuration used in the trial?

Yes, the standard Blue Waters configuration was used so that we could test the ability to run the code on different architectures with minimal need to re-code.

How easy was it to port your proprietary code base to Blue Waters? Any insights on leveraging the GPUs for instance?

It was fairly easy, but I can’t get into the details on GPUs at this time. Give us six months and I’ll be able to comment more readily!

Are there plans to try out Blue Waters on seismic imaging?

Not at this point because it does not represent such a grand challenge. That is not to say seismic is easy, just that we viewed the seismic problem as easier to parallelize at such a massive scale.

The test sounds like a success. Is ExxonMobil going to scale up its in-house IT to match Blue Waters?

The test was a success. It exposed certain areas of the code that were bottlenecks, has allowed us to enhance the performance when using a few thousand processors and has given insights for performance enhancements with smaller models, too. Stressing our code has great value for robustness, stability, and scalability for the smaller models and even for far fewer processor counts. On the last question, I think business demand will drive the direction we go…

~

The $200 million Blue Waters was originally designed by IBM which pulled out to be replaced by Cray. The petaflop-scale machine was completed in 2012 but was not submitted to the authoritative Top500 list (see here for why).

Other NCSA partners include BP, Caterpillar, Cray, Dassault Systèmes, Dell, GE, and Siemens.


ABC Wellsite Automation, Houston

The third American Business Conferences Wellsite Automation was an information-packed event. Devon, Exco, Windy Cove, Williams, New Horizon, Permian and Discovery shared experiences of multi-vendor scada/PLC deployments. Vendors including ABB, eLynx and Emerson provided insights into automation and communications. EDF presented on measuring and mitigating emissions.

Brandon Davis described Devon Energy’s eclectic, multi-vendor automation environment. scada data is hosted on Cygnet alongside Dover Exploration’s Xspoc smart lift optimizer. A standardized PI historian/Asset Framework structure is deployed across different Cygnet sites. Sites are equipped with high throughput 900MHz Ethernet radios and corporate WiFi connectivity on the pad. Data flows into a central production control room for remote operations, management by exception and role-based analytical decision making.

Automation is baked into Devon’s operations from design through to the field with attention to process hazard analysis safety and environmental reporting.

The Cygnet scada object-oriented model supports detailed metadata on all equipment and provides cross references for multiple source systems at all levels. A standard PLC program allows for template device creation and screen design. The system provides flexible access to data for updating set points automatically and provides high frequency data logging. Comprehensive data access also underpins real-time production optimization, using remote expertise and software-based tools for gas lift, rod pump performance monitoring and decline curve analysis.

In 2017 Devon will be introducing predictive analytics for route optimization, rotating equipment surveillance, leak detection and more. A part of this effort involves ‘new ways of gathering data.’ These will likely include PI System and the Microsoft Azure cloud. Two internet of things-style communications protocols are under evaluation, IBM’s MQTT and OSIsoft’s PI OMF. Rolling out the automation system has seen a 3% production hike across Devon’s assets.

Johnathan Hottell (Exco Resources) advocates deploying scada systems under a rapid application development environment comprising Schneider Electric’s Wonderware and Clearscada tools. Again the MQTT protocol got a plug for real time data and as a pathway to data science, machine learning and ‘virtual operations.’ The new tools are rejuvenating the scada ecosystem which, according to Hottell, is ‘never done!’

For Mark Peavy (Windy Cove Energy) the majors and some independents have had great success in data management and analytics and have achieved a ‘huge competitive advantage.’ But there is much to be done in terms of scada efficiency, security, safety and environmental concerns. On communications, although ‘wired is always better,’ this may not be cost effective. Users need to be familiar with the intricacies of telecoms and communications. Looking ahead, the (ubiquitous?) MQTT protocol was mentioned as a route to a secure automation/internet of things future. Companies need to adopt machine-to-machine intelligent communications and to ‘blend computer technology with oil and gas experience.’ Again, communications and data management are seen as foundational for analytics and for operators’ ‘creative competitive advantage.’

Renner Vaughn described ABB’s building blocks for the modern field network. These target, in particular, ‘data-driven’ younger workers with an architected end-to-end communications network. ABB advocates a shift from today’s field-wide 900MHz radio network to a hybrid upstream field network, with a local 2.5GHz TropOS broadband mesh connecting pads over a few miles and moving the narrowband to the edge of the field. ABB’s self-organizing TropOS and MicrOS radios provide communications solutions at different points in the field network. A single TropOS node can replace four legacy subsystems for PoE, WiFi, serial and Ethernet. ABB also offers professional consulting services to cost and optimize deployment.

For those getting started on oilfield communications and control, Rob Graham (Williams) offered some advice on selecting an entry-level system, weighing the merits of PLC and RTU-based scada systems. The RTU beats the PLC on communications options, ease of deployment and use. But the PLC is more scalable. In all events, the key to successful deployment and use is to ‘read and apply the manual,’ to use best practices for installation, respect environmental ratings and make sure everything is properly grounded! In reality, modern systems blur the RTU/PLC boundary, leading to both confusion and the potential for smart hybrid deployment.

In a follow-up talk, Graham described how Williams assessed the cost benefits of various scada systems for use by smaller operators, in particular the use of scada as a service-style offerings. These have low up-front costs but require an ongoing subscription. Phased deployment allows for improvements and ongoing tweaks, as knowledge of systems increases and use cases are discovered. scada systems can also input data into production accounting systems. In the minefield of communications protocols and hardware standards, Graham recommends using as few protocols as possible. Data concentrators can be deployed to translate esoteric or legacy protocols as required.

Greg Boyles provided a detailed analysis of New Horizon Exploration’s switch from manual to automated monitoring of its remote Trinidadian operation. Here sand production was causing multiple issues that were frequently misdiagnosed by field personnel. Manual adjustments to pump rates based on unreliable data were often counterproductive and were jeopardizing field economics.

An automated controller and progressive cavity pumps now respond to real time sensor data and keep fluid levels where they should be. The automation system includes a ‘scada on steroids’ component that can be accessed through a web interface. The distributed production team can see, plan and react to the same data from any location. The ‘standard’ platform was assembled with an eclectic technology stack from ABB, Allen Bradley, Baldor, Fuji and others. Production has seen a significant hike (17-58%) and op costs are down 50%.

Rob Warren (Permian Resources) showed how ‘cutting-edge’ report-by-exception methods can deliver consistent production while reducing manpower per well. Choosing the right scada host and software system is key. For Permian, this meant choosing between an internal system and a third party hosted solution from eLynx. In a move to decouple applications from plethoric field communications protocols, Warren advocates a central MQTT-based server. MQTT, originally developed in 1999, provides report by exception and is used in Amazon’s internet of things and Facebook messenger. Warren extends the ‘by exception’ concept right up from the M2M protocol to an ‘operations by exception’ management culture.
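A minimal sketch of report-by-exception publishing over MQTT (the broker address, topic and deadband below are assumptions, not Permian Resources’ configuration), using the common paho-mqtt client: a reading is only transmitted when it moves outside a deadband around the last reported value.

```python
# pip install "paho-mqtt<2"
import time
import paho.mqtt.client as mqtt

BROKER, TOPIC, DEADBAND = "broker.example.com", "field/pad07/well03/tubing_psi", 5.0

client = mqtt.Client()
client.connect(BROKER, 1883)
client.loop_start()

last_reported = None
def report_by_exception(value: float):
    """Publish only when the value leaves the deadband around the last report."""
    global last_reported
    if last_reported is None or abs(value - last_reported) >= DEADBAND:
        client.publish(TOPIC, payload=f"{value:.1f}", qos=1, retain=True)
        last_reported = value

for reading in (812.0, 813.5, 811.2, 820.4, 799.9):   # simulated sensor scan
    report_by_exception(reading)                      # only the 1st, 4th and 5th are sent
    time.sleep(1)
client.loop_stop()
```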

Landon Schaule compared various field radio topologies and solutions to conclude that for Discovery Natural Resources, a cellular mesh topology, Rajant’s ‘breadcrumb’ network was the way to go. Cellular communications can be flexible but costly large bandwidth uses without the right agreements in place. Discovery has used cellular Raven modems in every phase of operations at one time or another. They can be used as temporary communication devices on plunger controllers or on tank battery scada systems.

In a joint presentation, Robert Vela (SM Energy) and Stephen Jackson (eLynx) showed how the latter’s technology underpins SM’s operations. eLynx’ scada functionality allows SM Energy to leverage its in-house expertise and data and focus on production. eLynx provides a set of tools for organizing and viewing production data in the central control room or on mobile devices.

Jody Overshiner compared traditional well-test based allocation of gas and oil production with that offered by Emerson’s Roxar in-line multi-phase meter (MPM). While the MPM does not replace the separator and periodic well tests, it does provide continuous real time monitoring of gas, oil and water production. Such continuous monitoring can identify transient issues that might be overlooked otherwise.
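For readers outside production accounting, a minimal sketch of the traditional pro-rata allocation that the MPM complements (all numbers invented): each well’s share of the measured pad total is allocated in proportion to its most recent well-test rate.

```python
def allocate(measured_total: float, test_rates: dict[str, float]) -> dict[str, float]:
    """Allocate total measured production back to wells pro rata to well-test rates."""
    theoretical = sum(test_rates.values())
    factor = measured_total / theoretical          # allocation factor
    return {well: rate * factor for well, rate in test_rates.items()}

# Pad sales meter reads 2,350 bbl/d; last monthly well tests summed to 2,500 bbl/d.
tests = {"Well-1": 1000.0, "Well-2": 900.0, "Well-3": 600.0}
print(allocate(2350.0, tests))
# {'Well-1': 940.0, 'Well-2': 846.0, 'Well-3': 564.0}
```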

Alex Barclay (U Tulsa) and Papa Mauricio (eLynx) presented on scada cyber security. Several well-documented incidents, the latest a Russian hack of a Ukrainian electricity substation, have shown that control systems are vulnerable to attack. Modern scada systems use internet TCP/IP protocols for communications which make them vulnerable to a range of attacks. Moreover, they were, in general, designed for functionality over security. Solutions include the use of various standards and best practices from ISA (SP-99), NIST (SP 800-82) and the API (1164). The U Tulsa/NSA National center of academic excellence in cyber operations performs industrial cyber R&D and maintains ‘honeynets,’ cyber traps to catch hackers.

Aileen Nowlan from the Environmental Defense Fund described methane as a powerful greenhouse gas. Its unwanted, fugitive production is also a considerable waste, amounting to $2 billion per year in the US alone. The lion’s share of fugitive emissions comes from hard-to-trace small leaks. What is needed is a low cost, ubiquitous monitoring technique. Statoil set up the methane detector challenge to find a solution. The winners were Acutect and Quanta3. The latter has now been deployed on Statoil’s Eagle Ford operations where smart detection algorithms are being developed to pinpoint leaks and develop optimal mitigation strategies.

More from American Business Conferences.


OSIsoft Internet of Things/Big Data, Houston

NOV Max/RigSentry for BOP. Revenos Limitless Well. Element Analytics and CRISP data mining.

A better-late-than-never summary from OSIsoft Internet of Things, Analytics and Big Data forum held late last year in Houston. CTO Helge Kverneland explained how NOV is embedding more automation into its systems. Drill bits and drill pipe may now be connected with (relatively) high data rates. In production systems, flexible risers have been equipped with sensors for decades to detect when things go wrong. But these also generate a lot of data, much of which went unused until now. Kverneland sees this increasing use of data in the context of the ‘fourth industrial revolution.’ Here, oil and gas has been a bit behind in terms of M2M, cyber, robots and ‘we want to be part of this too!’ The internet of things is where the action is. NOV may be a mechanical company but it has been developing software for a long time and has many programmers working on its control systems. To date, the focus has been on asset performance management (in collaboration with the event host OSIsoft).

The BOP stack is a good illustration of the intersection of machinery and control systems. If a 500-tonne deepwater BOP assembly fails, retrieval and repair might mean a couple of weeks of downtime. This has been addressed on the mechanical side with a pod of control systems that can be retrieved with an ROV. NOV is now looking to use big data and analytics to predict failures from real time and historical data. Equipment failures tend to follow a ‘bath tub’ curve where failures are highest early and late in the asset’s life. So there are good reasons not to ‘fix what ain’t bust!’ BOP availability is monitored with NOV’s MAX industrial internet of things/big data platform, with a PI System under the hood. ‘Rigsentry for BOP’ is NOV’s first big data breakthrough. When the ‘quad’ safety system detects a failure on a pod, it switches over to another. NOV is now working on a similar ‘RigSentry for top drive.’ Ultimately, data scientists will warn customers that ‘the top drive bearing is about to fail’ and match this fact up with an upcoming window of opportunity for maintenance. Kverneland wound up saying ‘connected products are the future of our industry and data will get us there.’ NOV straddles manufacturing and the internet of things. But ‘we can do this better because we know our equipment.’
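A minimal sketch of the ‘bath tub’ behavior referred to (illustrative parameters only, not NOV’s reliability model): superposing a decreasing infant-mortality hazard, a constant random-failure hazard and an increasing wear-out hazard, as is commonly done with Weibull terms, gives a failure rate that is highest early and late in life.

```python
import numpy as np

def weibull_hazard(t, shape, scale):
    """Weibull hazard rate h(t) = (k/lambda) * (t/lambda)**(k-1)."""
    return (shape / scale) * (t / scale) ** (shape - 1)

t = np.linspace(0.1, 20, 200)                        # years in service
bathtub = (weibull_hazard(t, shape=0.5, scale=2.0)   # infant mortality (k < 1, falling)
           + 0.02                                    # constant random failures
           + weibull_hazard(t, shape=4.0, scale=15)) # wear-out (k > 1, rising)

for yr in (0.5, 5, 10, 18):
    print(f"hazard at {yr:>4} years: {bathtub[np.searchsorted(t, yr)]:.3f}")
```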

Michael Maguire & Wesley Dyk introduced Revenos’ ‘Limitless Well,’ a ‘multi-state well completion and production analytics discovery tool.’ The proof of concept targets a value case for standardizing disparate data across big data and traditional database structures. PI AF and the PI integrator for business analytics are the building bricks. The game plan is to ‘democratize data and deliver actionable insights to decision makers.’

In a similar vein, Sameer Kalwani introduced Element Analytics’ approach to predictions, machine learning, data lakes and data readiness. EA’s web-based platform transforms raw, real-time operational data into a form that data scientists can use to perform analytics quickly and repeatably. EA’s toolset includes an OSIsoft PI System hosted in the Microsoft Azure cloud. The CRISP-DM standardized data mining process also ran. Referring to the data-driven vs. physics-based modeling debate, Kalwani opined that we need both; the two approaches are complementary. In any event, the PI System is foundational to such initiatives.

Read these and presentations from Devon, MOL and OSIsoft on the conference website.


Folks, facts, orgs ...

Agile Data Decisions, ARMA, BJ Services, C&C Reservoirs, Chevron, Civica Digital Business, Cornerstone Completion Services, Energy Impact Partners, Emerson, Petrolink, Infosys, Energistics, Enservco, ExxonMobil, Flotek, London Geological Society, Honeywell, Hexagon, IOGP, iRODS, Michael Baker International, OFS Portal, P2 Energy Solutions, Spectris, TIBCO, Verve Industrial Protection, Westwood Professional Services.

Phil Neri is marketing consultant at startup Agile Data Decisions. He hails from Ikon Science.

Ryan Zilm is president of ARMA, the American records management association. Susan Goodman is director of the ARMA board.

BJ Services has appointed Andrew Gould and Bill Stewart to its board of directors. Evelyn Angelle is EVP and CFO.

James Faroppa is chief geoscientist and senior VP services at C&C Reservoirs. He hails from Murphy Oil Corp.

Chevron EVP midstream and development, Michael Wirth has been promoted to vice chairman.

Chris Doutney is to lead the newly launched Civica Digital Business. He was previously with Fujitsu.

David Reynolds is the new business development and advisory manager at Cornerstone Completion Services, a recently-launched division of Wellsite Fishing and Rental Services. He hails from Columbia Pipeline Group.

Don Wood has joined Energy Impact Partners as venture partner and west coast advisor.

Emerson has opened a training center in Aberdeen with a ‘live’ liquid flow loop with three configurable metering streams.

Petrolink VP R&D David Johnson and Infosys’ Robin Goswami have been elected to the Energistics board of directors.

Chris Haymons has joined the Enservco board of directors. He succeeds retiree Steve Oppenheim.

Former president and director of the Woods Hole Oceanographic Institution, Susan Avery has joined the ExxonMobil board of directors.

Current worldwide VP for IBM Watson, Michelle Adams has been appointed to Flotek’s board.

Richard Hughes is executive secretary at the London Geological Society.

Honeywell has appointed Stephen Gold as VP and general manager, connected enterprises. He hails from IBM Watson.

Hexagon’s Chief Strategy Officer, Mattias Stenberg is to succeed Gerhard Sallinger as president of Intergraph PP&M. Sallinger moves over to senior VP, strategic alliances at Hexagon.

Michel Contie has stepped down as Total’s representative on the IOGP’s management committee. He is succeeded by Total’s Michael Borrell.

The iRODS consortium has named Jason Coposky executive director and Terrell Russell as chief technologist.

Brian Lutes is president and CEO, and Dale Spaulding is executive VP and COO at Michael Baker International.

Kaisen Energy Corporation, Canbriam Energy Inc., Pennsylvania General Energy Company, and Headington Energy Partners are now OFS Portal community members.

Scott Lockhart has been appointed P2 Energy Solutions’ CEO. He hails from IHS.

Petroplan has named Rory Ferguson CEO. He was previously with Lawrence Harvey.

Ned Deets is now director of the SEI CERT division following founder Richard Pethia’s retirement.

Kjersti Wiklund is now a non-executive director at Spectris.

Steve Hurn is senior VP global sales at Tibco.

Verve Industrial Protection (formerly RKNeal) has named Eric Byres as senior advisor.

Westwood Professional Services has named Michele Guy to lead its Phoenix office.


Done deals

Schneider buys MWPowerlab. ERF Wireless acquires Accordant and seeks new OTC/ALS listing. Greenwell buys Exclusive Energy Services. SCF to fund BCCK expansion.

Schneider Electric has acquired real time 3D virtual reality boutique MWPowerlab. The acquisition adds immersive simulation and training capabilities and advanced 3D visualization to Schneider’s industrial software portfolio and enterprise asset performance management platform. The deal cements a long-term collaboration between the companies on simulation, asset management and HMI supervisory solutions.

ERF Wireless is to acquire privately-held Accordant Communications in a $14 million, all-stock deal. Previously, ERF was advised by Dallas-based turnaround specialist Asset Econometrics. The Accordant deal is the ‘first-fruit’ of AEI’s efforts. The ERF board has now directed the company management to obtain a market listing with the OTC Markets Group under the alternative listing standard. Last year, ERF terminated its SEC registration and is no longer required to file financial reports. The ALS listing will allow for replacement financial reporting to shareholders and others.

Greenwell Energy Solutions has acquired Exclusive Energy Services. EES’ data acquisition system and mixing plants assure optimal chemical mixing and delivery for coil tubing, work-over and frac jobs. The plants will be combined with Greenwell’s chemical supply portfolio.

BCCK Holding Company and SCF Partners have entered into a strategic partnership whereby SCF will provide growth capital for BCCK’s expansion opportunities. BCCK provides engineering, procurement, fabrication and field construction services to the midstream oil and natural gas processing sector.


NIST’s ‘Introduction to Cyber Security'

Federal information security organization releases cyber risk management framework.

The US NIST* has just released a draft of Special Publication 800-12, ‘An introduction to information security,’ authored by Michael Nieles, Kelley Dempsey and Victoria Yan Pillitteri. The 97-page document targets ‘those new to the information security principles and tenets needed to protect information and systems in a way that is commensurate with risk.’ SP 800-12 provides tips and techniques that can be ‘applied to any type of information or system in any type of organization.’ The basic principles of information security apply to government, academia and industry and SP 800-12 provides a backgrounder in information security basics as well as a high-level view of the topic.

Central to the approach is the NIST Risk management framework (RMF), that promotes the concept of near real-time risk management and ongoing system authorization through the implementation of ‘robust continuous monitoring processes.’ The RMF also ‘provides senior leaders the necessary information to make cost-effective, risk-based decisions on the organizational systems supporting their core missions and business functions, and integrates information security into the enterprise architecture and system development life cycle.’

The RMF is a six-step program that addresses security categorization, control selection, implementation and assessment, system authorization and ongoing monitoring. A rigorous, if somewhat academic, approach is advocated in the development of both commercial off-the-shelf products and customized systems. The draft advocates the use of ‘trusted system architectures.’ These are realized through best practice software engineering techniques including security design and development reviews, formal modeling, mathematical proofs, ISO 9000 quality techniques, ISO/IEC 15288 systems engineering standards and architecture concepts such as a ‘trusted computing base’ or reference monitor.

Security needs to be assessed to establish a level of confidence that the security meets requirements. Here NIST recommends the use of the Common criteria portal in procurement of IT products with security functionality.

* National institute of standards and technology of the US department of commerce.


SEC pushes in-line XBRL. CFA endorses digital data.

iXBRL signals end to dual filing. CFA Institute reports on 'transformational’ digital technology.

The US Securities and Exchange Commission (SEC) is proposing to ‘green light’ in-line XBRL (iXBRL) for US listed company filings. iXBRL combines human readable HTML and machine readable XBRL in a single document. Structuring financial information can also assist in automating regulatory filings and business information processing. By tagging the numeric and narrative-based disclosure elements of financial statements and risk/return summaries in XBRL, disclosure items are standardized and can be immediately processed by software for analysis. iXBRL signals the end of ‘dual filing’ whereby both an HTML version and an XBRL version were submitted that could lead to inconsistencies between the two filings. The SEC has released a video explaining its new ‘viewer’ that is said to support in-depth analysis of individual filings. The SEC also reports a new xBRL-JSON draft specification which will provide a ‘very simple way to make XBRL data easier to consume.’
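As a minimal sketch of what ‘machine readable’ means here (the filing name is a placeholder and the tag-handling is our own illustration, not the SEC’s viewer): an iXBRL document is ordinary XHTML whose figures are wrapped in ix: elements, so tagged facts can be pulled straight out of the page.

```python
# pip install lxml
from lxml import etree

NS = {"ix": "http://www.xbrl.org/2013/inlineXBRL"}   # Inline XBRL 1.1 namespace

tree = etree.parse("company-10k.xhtml")              # hypothetical iXBRL filing
for fact in tree.iterfind(".//ix:nonFraction", namespaces=NS):
    print(fact.get("name"),                          # tagged concept, e.g. a revenue item
          fact.get("contextRef"),                    # reporting period/entity context
          fact.get("unitRef"),                       # e.g. a USD unit
          "".join(fact.itertext()).strip())          # the human-readable figure
```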

~

XBRL received a strong endorsement from the CFA Institute last year with the publication of a 60-page report, ‘Data and technology, transforming the financial information landscape.’ Authors Mohini Singh and Sandra Peters argue that the same technology that is transforming regulatory filing can be further harnessed to ‘reform the financial reporting process end to end.’ The report examines the effects of data and technology on the finance function, on data capture, management, analysis, and use in financial reports and audit.

Companies need to structure data early in the reporting process and to see structured data as a form of communication, not just a delivery mechanism. Today, companies view structured reporting as a compliance exercise and cost center. Data is not structured at source, early in the financial reporting process, but rather as an additional step needed to fulfill regulatory filing requirements. Regulators need to require structured reporting beyond financial statements and allow investors a deeper look into annual reports (see above!).


Yokogawa EPM for UK Shell/BP joint venture

Enterprise Pipeline Management and FAST/Tools scada deal.

Yokogawa is to supply the British Pipeline Agency, a Shell/BP joint venture engineering company, with a management and control system for a major multi-product fuel pipeline system, replacing its legacy scada systems. The three integrated multi-product fuel pipelines total 650 km, extending from Ellesmere Port in northwestern England to the country’s southeastern coast and London’s major airports.

Yokogawa’s Enterprise Pipeline Management Solution and FAST/Tools scada software will be deployed to schedule, monitor and control the pipelines and compressors. The EPMS supplements a basic pipeline management system with specific gas and liquid applications that help manage delivery contracts and their associated logistics. EPMS also provides a scada data management function.


Sales, deployments, partnerships ...

Aker Solutions, Blue Marble, Pointerra, Fluor, Geoscience BC, LMKR, Landmark, Paradigm, Beicip-Franlab, PetroDE, OneSubsea, TechnipFMC, Weatherford, Nabors, BuildSourced, EnergyIQ.

BP has awarded Aker Solutions front-end engineering design (FEED) services framework agreements.

Blue Marble Geographics has teamed with Pointerra to provide cloud-based LiDAR data delivery via Global Mapper.

Fluor Corporation is to provide FEED and procurement services to Tengizchevroil’s multi-phase pump project in Kazakhstan.

The BC government is to fund Geoscience BC with $10 million over two years for, inter alia, energy/earth science research.

LMKR has secured a long term exclusive agreement with Landmark for continued maintenance, support and marketing of GeoGraphix.

Paradigm and Beicip-Franlab have signed a multi-year agreement whereby Beicip-Franlab will offer consulting services to Paradigm’s clients.

Samson Energy uses PetroDE’s acreage screening tools in its shale play workflows.

Statoil has awarded OneSubsea an engineering, procurement and construction contract for the subsea production system of its Utgard North Sea development.

TechnipFMC is to deploy its riserless light well intervention services on Inpex’ Ichthys project.

Weatherford and Nabors are to jointly supply drilling solutions in the US lower 48 land market.

Xtreme Drilling is to implement BuildSourced’s asset tracking platform to replace its manual spare parts inventory control system.

EnergyIQ’s IQlifecycle software has been selected by a ‘Dallas-based customer’ to automate back-office operations across the well lifecycle.


Standards stuff

IOGP Seabed Survey guidelines. CAPE-OPEN interop installers. RTI’s DDS-based industrial internet. ISO/IEC 80000 Part 13 IT definitions for quantities and units.

The International association of oil and gas producers, IOGP, has just issued Reports 462-01, Guidelines for the use of, and 462-02, Guidelines for the delivery of, its Seabed Survey Data Model (SSDM). The reports emanate from the IOGP’s Geomatics Committee and concern the SSDM V2.0 release. The new data model is delivered as a UML template in Enterprise Architect for both ArcGIS implementation and GML encoding.

The interoperability special interest group of the downstream Colan standards body has released a developer guide for installers of V2.0 CAPE-OPEN type libraries and primary interop assemblies (PIA). The .NET PIAs target process simulation software developers implementing the CAPE-OPEN interoperability standards. PIAs ensure consistent installation, registration and removal of older software on an end-user’s machine. The document explains the requirements governing what the installers need to do, the technologies used to develop the installers and how they should be used by software vendors delivering CAPE-OPEN compliant software.

A ‘second version’ (a.k.a V1.8!) of IIRA, the Industrial internet consortium’s reference architecture has just been published. The new IIRA includes ‘tweaks, updates and improvements,’ in particular a new ‘layered databus’ architectural pattern. The new pattern was contributed by Real-Time Innovations to accommodate companies that embed RTI’s DDS protocol. For the uninitiated, a databus is a ‘data-centric information-sharing technology that implements a virtual, global data space, where applications read and update data via a publish-subscribe communications mechanism.’ The RTI implementation adds ‘rules and quality of service parameters, such as data rate, reliability and security.’ RTI blogger Brett Murphy commented, ‘You can implement a databus with a lower level protocol like MQTT, but DDS adds higher-level QoS, data handling, and security.’
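To make the ‘data-centric’ distinction concrete, here is a minimal in-process sketch of the pattern (plain Python, not DDS nor any vendor API): the bus keeps the last value per keyed topic as a shared data space and pushes updates to subscribers rather than having them poll, which is the behavior QoS parameters then refine in a real databus.

```python
from collections import defaultdict
from typing import Callable

class MiniDatabus:
    """Toy keyed publish-subscribe 'data space': last value cached per (topic, key)."""
    def __init__(self):
        self._last = {}                                    # (topic, key) -> sample
        self._subs = defaultdict(list)                     # topic -> callbacks

    def subscribe(self, topic: str, callback: Callable):
        self._subs[topic].append(callback)
        for (t, key), sample in self._last.items():        # late joiners get current state
            if t == topic:
                callback(key, sample)

    def publish(self, topic: str, key: str, sample: dict):
        self._last[(topic, key)] = sample
        for cb in self._subs[topic]:
            cb(key, sample)

bus = MiniDatabus()
bus.publish("pump.telemetry", "P-101", {"rpm": 1450, "temp_c": 61})
bus.subscribe("pump.telemetry", lambda k, s: print("update", k, s))   # receives cached state
bus.publish("pump.telemetry", "P-101", {"rpm": 1500, "temp_c": 63})   # pushed to subscriber
```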

The ISO/IEC 80000 series of quantities and units standards is updating Part 13: information science and technology covering ‘names, symbols and definitions for quantities and units used in information science and technology.’


Emissions monitoring and reporting

PTAC event hears from Spartan Controls, Process Ecology, CapOpEnergy, Osprey and Envirosoft.

The Petroleum Technology alliance, Canada (PTAC) hosted a recent information session on IT tools for cost reduction and operational improvement, with a particular focus on emissions monitoring. Steve Barker with Emerson local partner Spartan Controls presented the ‘Bliers*’ reporting tool for NOx emission compliance. Spartan also provides solutions for CO2 venting and process optimization and comprehensive data analysis and visualization including on mobile endpoints.

Laura Chutny presented Process Ecology’s MethaneAdvisor, observing that while regulatory focus on emissions is increasing, current measurement techniques are ‘all over the map!’ MethaneAdvisor, on the other hand, is a modern, web-based application for estimating and managing flaring, venting and methane/GHG emissions from oil and gas operations. Operators can reduce their potential carbon-tax liability while minimizing losses from leaks.
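
By way of illustration only, and not Process Ecology’s methodology, the arithmetic behind a venting estimate of this kind typically combines a vented volume, a methane fraction, a gas density and a global warming potential; the sketch below uses round-number assumptions throughout.

```python
# Illustrative venting-emissions arithmetic -- not Process Ecology's method.
# All factors are round-number assumptions for the sketch.
CH4_DENSITY_KG_M3 = 0.68   # methane density at standard conditions (approx.)
GWP_CH4 = 25               # 100-year global warming potential (AR4 value)
CARBON_PRICE = 30.0        # assumed carbon price, $/tonne CO2e

def vented_co2e_tonnes(gas_volume_m3: float, methane_fraction: float) -> float:
    """Convert a vented gas volume to tonnes of CO2-equivalent."""
    ch4_tonnes = gas_volume_m3 * methane_fraction * CH4_DENSITY_KG_M3 / 1000.0
    return ch4_tonnes * GWP_CH4

volume_m3 = 12_000.0  # gas vented in the reporting period (made-up figure)
co2e = vented_co2e_tonnes(volume_m3, methane_fraction=0.85)
print(f"{co2e:.1f} t CO2e, ~${co2e * CARBON_PRICE:,.0f} potential carbon-tax exposure")
```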

Cooper Robinson demonstrated Cap-Op Energy’s DEEPP tools for planning and monetizing carbon management in the upstream. Cap-Op provides a hosted solution for emissions reduction, covering the full cycle from installation to audit. Watch the video on YouTube.

Jeremy Bernard outlined Osprey Informatics’ intelligent visual monitoring solution for industrial operations. Alongside its security-oriented video camera monitoring offering, Osprey offers thermal imaging with automated alerts. In one instance, a visual alarm produced a ‘100% ROI’ from a single methane emission event.

Tim Biggs presented Envirosoft’s EmPower suite for oil and gas environmental and regulatory compliance. EmPower offers on-time emissions data capture for fuel, flare and vent reporting, emissions inventories and Dehy** monitoring. The components provide comprehensive data preparation for reporting to Environment and Climate Change Canada’s single window information manager (Swim) reporting system. More from PTAC.

* Base-level industrial emission requirements (2016).

** Glycol dehydration emissions.


eCompliance rolls-out Safety Intelligence

Cloud-based best practices, reporting and analytics platform for EHS professionals.

Toronto-headquartered eCompliance has launched ‘Safety Intelligence,’ a cloud-based analytics and reporting platform. Safety Intelligence provides a cross-company view of real-time risk, enabling health and safety professionals to proactively detect, control and reduce workplace hazards. The eCompliance mobile app connects workers to head office, creating a two-way conversation. Safety leaders can make faster, fact-based decisions while executives gain a comprehensive view of safety risk across the company.

eCompliance CEO Adrian Bartha said, ‘EHS* departments capture a lot of critical data, but have lacked the ability to use it in real-time. Safety Intelligence provides executives with safety performance visibility across the organization. With powerful analytics, EHS professionals can quickly identify risk and prevent incidents from occurring.’

The solution includes a library of safety reports, based on best practices from over 300 EHS professionals. Data is analyzed and viewed in site-specific interactive dashboards to create and track custom KPIs and charts. Stakeholders receive reports on safety performance, helping predict and prevent incidents.
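
A concrete, if simplified, example of the kind of KPI such dashboards track is the total recordable incident rate (TRIR), i.e. recordable incidents normalized to 200,000 hours worked, roughly 100 full-time workers over a year. The snippet below is a generic sketch with made-up site figures, not eCompliance product logic.

```python
# Illustrative site-level safety KPI -- not eCompliance's product logic.
# TRIR = recordable incidents x 200,000 / hours worked.
sites = {
    # site: (recordable incidents, hours worked) -- made-up figures
    "Site A": (3, 410_000),
    "Site B": (1, 180_000),
    "Site C": (6, 950_000),
}

def trir(recordables: int, hours: float) -> float:
    return recordables * 200_000 / hours

# Rank sites from highest to lowest TRIR for the dashboard view.
for site, (incidents, hours) in sorted(sites.items(),
                                       key=lambda kv: -trir(*kv[1])):
    print(f"{site}: TRIR = {trir(incidents, hours):.2f}")
```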

Safety Intelligence targets users in hazardous industries including construction, energy, utilities and infrastructure services to ‘understand and eliminate risk through dynamic data visualizations and interactive safety analytics.’ More from eCompliance.

* Environment, health and safety.


‘Leonardo’ IoT announced. Accenture’s hydrocarbon logistics.

SAP to unveil €2 billion internet of things offering. Accenture expands SAP collaboration.

German software behemoth SAP has announced a ‘jump-start enablement program’ for its internet of things (IoT) ‘innovation portfolio.’ The program is intended to help customers connect the emerging world of intelligent devices with people and processes to achieve tangible business outcomes. Rather immodestly, SAP has named its yet-to-be-released IoT portfolio ‘SAP Leonardo,’ after the renaissance master who ‘ushered in a groundbreaking era of science and discovery.’

Leonardo represents a €2 billion investment over five years that is to provide ‘adaptive applications,’ big data and connectivity in packaged solutions for line-of-business and industry use cases such as connected products, assets and infrastructure.

Leonardo builds on Germany’s Industrie 4.0 R&D initiative. The worldwide jump-start program includes consulting services from SAP experts to match IoT innovations with customer strategies in ‘achievable pilots with a clear path to business value.’ Leonardo will launch next July at KAP Europa in Frankfurt.

~

In a separate announcement, SAP and Accenture are to expand their collaboration in the oil and gas industry with two cloud-based initiatives: the Connected hydrocarbons logistics solution and an expanded Upstream production operations solution. Both will leverage the latest SAP technologies, SAP S/4HANA, the cloud and SAP’s role-based ‘Fiori’ GUI.


Coreworx for Omani petrochemical new build

Construction software interface management solution supports $3.6 billion mega project.

Ontario-headquartered Coreworx is to provide its interface management software to Oman’s state-owned ORPIC for its $3.6 billion Liwa petrochemical new build in the northern industrial city of Sohar. Coreworx’ software addresses the complex dependencies between the diverse software used by different engineering contractors, which ‘can create challenges to successful project execution.’ Coreworx provides a pre-configured, web-based solution that supports ‘direct and formalized communication’ between the four major international EPCs working on the project from different geographies.

Coreworx CEO Ray Simonson said, ‘This project required a proven solution that could manage the large volume of project interfaces between EPCs as well as the numerous interfaces with regulatory agencies, local government offices and suppliers. Coreworx Interface Management was designed for capital projects of all sizes and offers stakeholders unparalleled transparency and decision support with its owner oversight tools and dashboards.’

Coreworx clients include Fluor, Chevron, Shell and Saudi Aramco; the company claims an aggregate of over 100,000 users.


ExxonMobil’s drilling data modeling and analytics for Pason

Drilling advisory system a.k.a. Fast Drill licensed for third party use.

ExxonMobil has licensed its patented Drilling Advisory System (DAS) to Calgary-headquartered Pason Systems. DAS embeds ExxonMobil’s Fast Drill technology that combines full-physics modeling of drilling with structured well planning and design to ‘identify limiters and maximize performance.’ ExxonMobil’s drilling rate has improved 80% since it introduced Fast Drill over a decade ago. DAS automates and enables Fast Drill in real time with proprietary data modeling and data analytics.
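
ExxonMobil’s models and analytics are proprietary, but the ‘identify limiters and maximize performance’ idea can be caricatured as a search over drilling parameters within an operating envelope. The sketch below uses an invented rate-of-penetration response and made-up limits purely for illustration.

```python
# Toy parameter-sweep illustration of 'identify limiters, maximize performance'.
# The ROP response and operating envelope are invented; they stand in for
# ExxonMobil's proprietary full-physics drilling models.
import itertools

def modeled_rop(wob_klb: float, rpm: float) -> float:
    """Made-up rate-of-penetration response (ft/hr) with a founder point."""
    founder = max(0.0, wob_klb - 35.0)   # beyond ~35 klb, extra WOB hurts
    return 2.2 * min(wob_klb, 35.0) + 0.35 * rpm - 1.5 * founder

WOB_RANGE = [20, 25, 30, 35, 40, 45]     # klb -- rig/vibration limiters
RPM_RANGE = [80, 100, 120, 140, 160]     # top-drive limiter

best = max(itertools.product(WOB_RANGE, RPM_RANGE), key=lambda p: modeled_rop(*p))
print(f"Best setpoint in envelope: WOB={best[0]} klb, RPM={best[1]}, "
      f"ROP ~{modeled_rop(*best):.0f} ft/hr")
```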

Exxon has tested Fast Drill on nine fields to date. The application of the Fast Drill process is (rather curiously) measured in terms of an annual energy saving ‘equivalent to removing 1,200 cars from the road’ through a reduction in fuel use and a decrease in emissions.

Pason president and CEO Marcel Kessler said, ‘Exxon’s DAS extends our portfolio of technologies that improve the efficiency of drilling operations. We worked with ExxonMobil through the development and field demonstration phases of this project that resulted in the grant of this license.’ More from Pason.


OSIsoft adds TIF-backed TrendMiner AI to portfolio

Belgian machine learning specialist adds self-service analytics to PI System.

OSIsoft, developer of the PI System, has signed with TrendMiner, a unit of Belgium-headquartered DSquare, adding the latter’s ‘advanced self-service predictive analytics’ technology to the OSIsoft portfolio.

TrendMiner is a long-term OSIsoft technology partner. The new agreement positions TrendMiner as an integral part of the OSIsoft ecosystem, ‘strengthening the existing partnership and increasing alignment in product development.’

TrendMiner is a high-performance analytics engine for time series data. TrendMiner CEO Bert Baeck said, ‘Self-service analytics needs more than just a historian, it needs a powerful infrastructure. This is why we have leveraged OSIsoft’s technology in our solution. In addition, the OSIsoft ecosystem benefits both us and our customers.’

TrendMiner uses pattern recognition and machine learning. Users can query PI System data directly with Google-like searches to identify process trends, locate similar past behavior, spot problems and determine the ‘golden fingerprint’ of an optimal process.
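
TrendMiner’s engine and scoring are proprietary, but the ‘find past periods that look like this one’ idea can be sketched as a sliding-window similarity search over a historian tag. The example below uses z-normalized Euclidean distance on synthetic data; the tag values, window length and numpy-based approach are all assumptions for illustration.

```python
# Minimal sliding-window similarity search over a time-series tag --
# an illustration only, not TrendMiner's algorithm.
import numpy as np

def z_norm(x: np.ndarray) -> np.ndarray:
    return (x - x.mean()) / (x.std() + 1e-9)

def similar_windows(series: np.ndarray, query: np.ndarray, top_k: int = 3):
    """Start indices of the top_k historical windows most similar to `query`."""
    w = len(query)
    q = z_norm(query)
    dists = [np.linalg.norm(z_norm(series[i:i + w]) - q)
             for i in range(len(series) - w)]
    return np.argsort(dists)[:top_k]

# Synthetic 'historian' tag: a noisy periodic signal.
rng = np.random.default_rng(0)
tag = np.sin(np.linspace(0, 40, 2000)) + 0.1 * rng.standard_normal(2000)
query = tag[500:560]   # the trend the engineer has highlighted
print("Similar past periods start at samples:", similar_windows(tag, query))
```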

TrendMiner reported ‘250% growth’ for 2016 from rapid adoption of its software. Also in 2016, the company received a $1.1 million investment from The Innovation Fund, an EU investment vehicle backed by BASF, Solvay, Arkema and Total.


Yokogawa+Microsoft+Foghorn+Bayshore+Telit=IIoT

‘Practical’ Industrial Internet of Things offering spearheaded from Silicon Valley HQ.

Yokogawa reports that IIoT* technology is now ‘ready for practical use,’ thanks to advances in network technology, the availability of low-cost, large-capacity data communications and the move of corporate information systems to the cloud. Yokogawa’s IIoT has been enabled by a partnership with Microsoft, FogHorn Systems, Bayshore Networks and Telit IoT Platforms, all of whom are to integrate their technologies into an IIoT architecture for the delivery of new services that are set to ‘transform Yokogawa’s business model.’

The initiative is driven from Yokogawa’s architecture development division, to be co-located at FogHorn’s Silicon Valley HQ. Yokogawa’s IIoT architecture will integrate the cloud-based Microsoft Azure IoT Suite, FogHorn’s ‘fog/edge’ computing solution, Bayshore’s OSI layer 7 security and Telit’s wireless communication modules, sensor onboarding and device management.
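
As an illustration of the fog/edge layer in such an architecture (and not Yokogawa or FogHorn code), the sketch below aggregates readings at the edge and forwards only threshold exceedances and periodic summaries to the cloud tier; send_to_cloud and read_sensor are hypothetical stand-ins for the real ingestion and field-device interfaces.

```python
# Minimal fog/edge pre-processing sketch -- not Yokogawa/FogHorn code.
# Raw readings stay at the edge; only alarms and summaries go to the cloud.
import random
import statistics
import time

ALARM_LIMIT = 95.0   # assumed alarm threshold
BATCH = 60           # readings per summary message

def send_to_cloud(payload: dict) -> None:
    # Hypothetical stand-in for the real cloud ingestion call.
    print("-> cloud:", payload)

def read_sensor() -> float:
    # Hypothetical stand-in for a real field-device read.
    return random.uniform(85.0, 100.0)

buffer = []
for _ in range(BATCH):
    value = read_sensor()
    if value > ALARM_LIMIT:   # exceptions are forwarded immediately
        send_to_cloud({"type": "alarm", "value": value, "ts": time.time()})
    buffer.append(value)

send_to_cloud({"type": "summary",   # one summary instead of 60 raw points
               "mean": statistics.mean(buffer),
               "max": max(buffer),
               "n": len(buffer)})
```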

The plan is for plug-and-play business process applications, ‘sensing clouds’ with automatic provisioning, and databases and historians in the cloud, all tied together with an application development environment. Yokogawa recently made separate $900,000 investments in both Bayshore and FogHorn. More from Yokogawa.

* Industrial internet of things.


© 1996-2021 The Data Room SARL. All rights reserved.