Oil IT Journal: Volume 22 Number 8


Common fluid connector

Norwegian Billington Process Technology’s common fluid connector protocol connects best-in-class, first-principles simulation tools into cloud-based digital twins of wells and plants.

Speaking at a Franco-Norwegian ‘Digital competences’ event in Paris earlier this year, Wim Van Wassenhove introduced Billington Process Technology’s (BPT) new Common fluid connector protocol (CFCP). The protocol is designed to fill a gap in ‘digital twin’-style high fidelity simulation, where it is currently hard to deploy best-in-class, rigorous simulation tools in a real-time environment. A proof of concept involved the exchange of well production data between simulators (Petro-SIM, UniSim, HySys), flow assurance tools (OLGA, LedaFlow) and training simulators (K-Spice, Indiss, Yokogawa). These can be driven from BPT’s Excel add-in, allowing engineers to develop extra functionality in a familiar framework.

The CFCP socket connects fully compositional tools using first-principles thermodynamics. The approach will allow oil and gas users to develop or improve in-house solutions as alternatives to ‘packaged proprietary solutions.’ BPT provides the CFCP on top of Prediktor’s APIS Foundation for real-time data management. APIS provides a set of field-proven APIs and ‘best-in-class’ OPC-UA gateway performance. The solution allows operators to ‘make smarter use of plant as well as predicted data, either in the cloud or on-premises.’ Ultimately, the solution will deliver a combination of proven components, cloud-based data storage and a set of APIs for connectivity.
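For the curious, the plumbing is standard OPC-UA under the hood. The minimal sketch below, in Python with the open source python-opcua package, shows the generic read/write pattern such a gateway exposes. The endpoint address and node identifiers are illustrative placeholders, not actual CFCP or APIS identifiers.

```python
# Minimal sketch of reading and writing simulator values over OPC-UA, the transport
# that underpins APIS and, per BPT, the CFCP gateway. Endpoint URL and node ids are
# hypothetical placeholders. Requires the open source python-opcua package.
from opcua import Client

ENDPOINT = "opc.tcp://apis-gateway.example.com:4840"  # hypothetical gateway address

client = Client(ENDPOINT)
client.connect()
try:
    # Read a wellhead pressure published by a flow assurance tool (node id is made up)
    pressure_node = client.get_node("ns=2;s=WellA.WellheadPressure")
    print("Wellhead pressure:", pressure_node.get_value())

    # Write a setpoint back for a training simulator to pick up (node id is made up)
    setpoint_node = client.get_node("ns=2;s=WellA.ChokeSetpoint")
    setpoint_node.set_value(42.0)
finally:
    client.disconnect()
```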

BPT CEO Per Billington told Oil IT Journal, ‘As larger companies develop digital ecosystems, we see this as an opportunity for a practical solution that helps the digitalization processes and improves operational efficiency. Our solutions help with plant integrity and provide key operational limits to plant management and analytic systems. The plant data exposed through CFCP will be available to anybody with the right access privileges. We also see the CFCP data as a better source of information for artificial intelligence and machine learning systems than the raw plant data alone. We completed our proof of concept using internal funding. The next step is to launch a pilot with one or more operators on a live plant. We would welcome an initiative from any plant operator.’

As we reported last year, Statoil is to leverage the OPC-UA communications standard in its ‘OneIMS’ initiative. OneIMS represents a unified way of accessing operational data from different assets from enterprise systems, with a standard protocol and standard data semantics. Prediktor’s Apis platform will act as the OneIMS OPC-UA gateway, with standards such as S95, Prodml and Witsml mapped to the OPC-UA semantic model. More from BPT.


Paradigm sold

Emerson’s $510 million deal extends software portfolio upstream with seismic processing. Skua geomodelling overlaps with RMS.

Emerson has ‘agreed to acquire’ Paradigm Geophysical for $510 million from its current investors, a VC group including Apax Partners and JMI Equity. The deal is said to represent ‘13x expected 2017 Ebitda.’ Apax/JMI acquired Paradigm from Fox Paine in 2012 for $1 billion cash. Emerson CEO David Farr said, ‘This significant technology investment meets a growing demand for an independent, global provider of E&P software solutions. When combined with Emerson’s Roxar portfolio, Paradigm expands the global upstream oil and gas capability of our Plantweb digital ecosystem.’

A three-slide deck on the Emerson website shows how Paradigm’s software will extend Emerson’s portfolio upstream into seismic imaging. Coverage is now pretty complete with the possible exception of fluid flow modeling. But there is an interesting overlap in the static geomodelling space, with Paradigm’s Skua and Emerson’s RMS as competing flagships. According to the NDB SAB benchmark (page 7), RMS and Skua/Gocad are in the top three of their category. Emerson also reports that ‘the $3.6B E&P software market is down 10% since its 2015 peak.’


The great misunderstanding

Editor Neil McNaughton argues that the popular narrative of data-driven disruption leveraging internet technology is all wrong. The GAFAs’ success is due not to information technology but to disruptive business models. Expecting IT to disrupt oil and gas is naive. As GE may be finding.

They used to be called GAFA, now I understand they are the Intel 7*. Whatever. These are the folks who are going to disrupt your business. Or so goes the narrative. It is a little hard to see how these virtual (as opposed to bricks and mortar) businesses will really disrupt oil and/or gas, although I suppose such a case could be made for disruption in trading. This is about as likely to happen as GAFA disrupting the banking business which is proving a lot harder than originally thought. But I digress. The ‘pure’ i.e. Silicon-Valley style disruption play is unlikely to work in a diversified, established and technologically challenging field like oil and gas. So the narrative has been spun around the impact of the information technology that is spinning out of the GAFAs and into business, a phenomenon known as ‘digital transformation.’

The narrative is wrong on two counts. First, the oil and gas industry is an extremely computer-literate business with a large installed base, decades of R&D and a very recent bout of ‘disruption’ in the form of the ‘digital oilfield’ movement of the 2000s. Oil and gas is not your yellow/black taxi cab business waiting around to be disrupted by an Uber. The other reason the current narrative is wrong is that none of the GAFAs are really IT businesses. Each is built on a great idea for a new business, or on the ruthless exploitation of somebody else’s great idea, followed by a first mover/monopolistic/winner-takes-all situation. Despite appearances, the compute/IT side is totally subordinate to the business. If you doubt this, think of how freely Google gives away its IT and contrast this with how jealously it guards its business data!

The disruption narrative, although false, has been successful in the IT/consulting community, which is always on the lookout for a new way of creating fear, uncertainty and doubt. The IT community has grabbed the wrong end of the stick in Google’s success (an awkward metaphor, sorry) and is rushing around trying to convince all and sundry (perhaps that should be oil and sundry?) that they need Hadoop, BigTable, Spark and what have you, all the shiny new stuff that has ‘enabled’ the disruption elsewhere.

There is another weakness in this software free-for-all that derives originally from the GAFAs’ in-house requirements. It is worthwhile reflecting on exactly what these are. I’m not an expert on Google’s IT, but just, for a minute, imagine what the internet looks like from Google HQ. It is the mother of all data deluges! Bazillions of clickstreams coming in from internauts all over the world in real time. Somehow all this needs capturing (more on this in ‘the future of seismic data storage’ on page 8) and turning into a database of who clicked on what, where and when.

In one sense, this is mindbogglingly smart. In another, it is stultifyingly dumb. For an oil and gas company, an immediate application of the Hadoop ecosystem is in extracting business-relevant information from computer log files, for instance in the cyber security field. This rather dumb application maps pretty well to the inverted clickstream model.
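For the record, here is roughly what that ‘dumb but useful’ log file use case looks like in practice, sketched with PySpark. The log path, line format and regex below are invented for illustration.

```python
# Minimal sketch of the Hadoop-ecosystem use case described above: scanning machine
# log files for security-relevant events. File path and log format are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, regexp_extract

spark = SparkSession.builder.appName("auth-log-scan").getOrCreate()

logs = spark.read.text("hdfs:///logs/auth/*.log")  # one line per record, column 'value'

failed = (
    logs.filter(col("value").contains("Failed password"))
        # pull out the source IP with a regex (assumes a syslog-like line layout)
        .withColumn("src_ip", regexp_extract(col("value"), r"from (\d+\.\d+\.\d+\.\d+)", 1))
        .groupBy("src_ip")
        .count()
        .orderBy(col("count").desc())
)

failed.show(20, truncate=False)
```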

Needless to say, those who are selling disruption have chosen to emphasize the mindboggling. Hence the talk of scalable systems, big data and artificial intelligence. And, more generally, of solving problems which you did not even realize you had! The Hadoop ecosystem (whose main job is to do the dumb stuff) has been extended with an Alphabet (sic) soup of acronyms that point us in the direction of the smart stuff. I, like everyone else, am much more interested in the smarts, even if I doubt that all of it is terribly relevant. Many of the big data/artificial intelligence tools are ‘highly scalable’ hammers looking for nails. I sometimes wonder if Google didn’t chuck all this stuff over the perimeter wall as a diversion. Rather like the British are said to have done in WWII by spreading the rumor that pilots were given carrots to help them see in the dark when, in fact, radar had just been invented.

One company at the cutting edge of all this is GE, whose departing CEO Jeff Immelt was one of the first to put his money where his mouth is on big data and the Industrial Internet of Things. Five years ago, Immelt stated that ‘The industrial internet is revolutionizing our services. We will leverage our $150 billion backlog to grow this technology and our revenues 4-5% annually.’ A year later Jesse Demesa revealed GE had sunk $1 billion into the IIoT and that around 1,000 developers were beavering away on Predix, GE’s core industrial platform. Predix, by the way, is quite a remarkable infrastructure in that it shares much of the GAFA-derived open source software stack, and GE really has been pretty ‘open’ itself in showing Predix’ innards to the world.

Five years is a long time at ‘internet speed’ and you might expect some signs of success. These do not appear to be materializing. The Financial Times of 21st October performs quite a hatchet job on Immelt and on GE. The FT’s Lex column has it that ‘Immelt’s abrupt departure suddenly makes sense [as the] scale of the debacle he created [..] became clear.’ Immelt’s replacement John Flannery spoke of the need to make ‘major changes with urgency.’ The FT editorial has it that GE’s bets on the IIoT ‘have yet to pay off.’ There is talk of sell-offs. One might think that Predix would be a sell-off opportunity. But IMHO this is a case where the part would be worth a lot less than the whole. For GE and Google, the value is in the data not the software. That’s why Google gives its stuff away for free!

* Google, Amazon, Facebook, Apple. For the Intel 7 see below.

@neilmcn


Interview - Amor Bekrar, president IFS France

Is IFS really the ‘leading provider of EAM software to oil and gas?’ IFS claims its Apps suite is a leader in asset management, a niche activity that is proving rather resilient in the current downturn. Bekrar outlines recent advances in big data-driven ‘operational intelligence’ and the promise of AI.

We were surprised that Arc Advisory found IFS to be a ‘leading provider of EAM software in oil and gas.’ Isn’t that SAP?

We do compete with SAP and also with Oracle and Microsoft. All three are multi-billion giants, but they are also generalists. We, on the other hand, are specialists in enterprise asset management with our IFS Apps suite. We have 3,300 employees and a turnover of $400 million. We have been classified by Arc for the last five years as the leader in the oil and gas asset management niche. Others like SAP are more focused on finance and operations; our focus is on maintaining assets in optimum operating condition. Along with oil and gas we operate in other asset-intensive industries like defense and automotive.

But we cover SAP pretty extensively and report frequently on majors like Total and ExxonMobil which are big SAP shops.

Sure, but the majors have, to a large extent, outsourced much of the activity that we support. Our offshore operations and marine services solution targets the service providers that work for these major operators. We also support oil country manufacturers and service providers like TechnipFMC and Songa Offshore.

This activity must be suffering in the downturn…

This is a cyclical business that depends on the oil price. Opec has little respect for quotas and US shale has revolutionized the business. Our outlook is pretty pessimistic out to 2021-2022. Companies are preparing for the day when things pick up. This does give us a good target as clients’ focus shifts from investment to maintaining equipment in an operational state.

But presumably, operators are replacing kit less frequently.

Yes, there is not much new equipment being bought. Focus is more on maintenance and on extending asset lifetime. This is a real market opportunity today where our software fits in with the new Internet of Things paradigm. Each of Songa Offshore’s fleet of six rigs will have around 600 sensors that connect into our software via the IoT business connector.

So, you offer a big data framework like GE’s Predix?

Well, you have to be careful to distinguish between the promise of a technology and the hype! Our experience from outside of oil and gas has shown three key concepts: big data, IoT and what we call EOI, enterprise operational intelligence. The latter is not just marketing but reality. The IoT is producing more and more data and bigger and bigger databases. We know how to exploit this stuff. Prediction is a reality that helps prolong asset life. Maybe not at 100% of the promise/hype, but real nonetheless.

We see big data as being on a spectrum from situational awareness (i.e. not so much smarts) across to artificial intelligence and machine learning. Where are you?

Clearly the IoT provides awareness. AI today is part real, part promise. For instance, we are working on sentiment analysis, piping unstructured text (and soon voice) operational records into an analyzer that identifies words and phrases used by operators describing equipment behavior. In oil and gas this could be deployed to avoid breakdowns and/or to bring tech support into play in a timely fashion. But for sure, we are only just starting on this kind of work.
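By way of illustration, the phrase-spotting idea can be sketched in a few lines of Python. The phrase list and example records below are invented; IFS’s analyzer is, of course, not public.

```python
# A toy illustration of the kind of phrase-spotting described above: scan free-text
# operator notes for wording that tends to precede equipment trouble. The phrase list
# and example records are invented for illustration only.
DISTRESS_PHRASES = ["abnormal vibration", "grinding noise", "oil leak",
                    "running hot", "pressure spike", "tripped again"]

def flag_note(note):
    """Return the distress phrases found in a single operational record."""
    text = note.lower()
    return [p for p in DISTRESS_PHRASES if p in text]

records = [
    "Pump P-101 running hot after restart, slight oil leak at seal",
    "Routine inspection, no findings",
    "Compressor C-3 tripped again, abnormal vibration on bearing 2",
]

for rec in records:
    hits = flag_note(rec)
    if hits:
        print(f"FLAG ({', '.join(hits)}): {rec}")
```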


Statoil invests in AI/big data oil country boutiques

‘Series A’ funding rounds initiated by AI ‘startup’ Ambyint and seismic big data specialist Sharp Reflections catch the VC arm’s eye.

Statoil’s Tech Invest (STI) venture capital arm has made two recent investments in big data companies. Statoil participated in an $11.5 million ‘Series A’ funding round* launched by ‘artificial intelligence startup**’ Ambyint, an artificial lift specialist with a 12-year history of providing control and monitoring solutions. The company combines physics-based methods with modern artificial intelligence and machine learning.

Ambyint claims the industry’s largest repository of labeled training data (nearly 100 million operating hours and around 50 million terabytes) from its high resolution adaptive controllers. Ambyint’s data lake includes over 10 years of operating data from thousands of horizontal and vertical wells, sampled every 5 milliseconds, including more than 33 million dynocards.

The funds will be used to expand deployments of Ambyint’s oil and gas artificial lift optimization solutions in the US and Canada, and to work towards the vision of autonomous well operations. CEO Alex Robart said, ‘AI has shown promise in multiple industries, but many attempts will fail, not because of poor machine learning techniques, but due to the lack of high resolution, labeled training data. We have over a decade of high quality data that we use to train and improve our analytical capabilities.’

Moving upstream, STI has picked Fraunhofer spin-out Sharp Reflections (SR) for another (undisclosed) investment in a Series A financing round. Using pre-stack seismics is a tough ‘big data’ problem, a trilemma of data volumes (terabytes per day in acquisition), visualization and interaction.

STI MD Kristin Aamodt thinks SR is a potential market leader, ‘We have followed them for some time and have been impressed by their capabilities. Following the equity investment, we now look forward to working with the team to achieve their ambitious goals and to contribute to commercializing Pre-Stack Pro.’ The investment will go towards sales and marketing activities and accelerated development of the scalable cloud computing platform. More on SR in our next issue.

* Other round partners are Mercury, GE Ventures and Cottonwood.

** A little puffery here given that the company (as Pumpwell) has been in business since 2004, has been working on its ‘next generation’ platform since 2012 and relaunched as Ambyint in 2015.


More from data science @ Paris EAGE

Shell: GeoDNN, deep neural net-based seismic feature extraction, and AutoSum, ML-based automated simulation post-processing. Onera/Total: neural nets automate hyperspectral satellite image interpretation. Agile Data Decisions: putting the structure back into CDA’s dataset.

Jan Limbeck (Shell) opined that conventional workflows suffer from poor scalability in the face of big data. Shell has applied machine learning to seismic interpretation in its in-house developed GeoDNN deep neural network. Conventional interpretation workflows are based on ‘siloed,’ semi-automated steps. Physics-based models may not be designed for the business at hand, the process takes too long, and uncertainties may go unrecognized. GeoDNN performs ML-derived feature extraction on raw seismic data, creating an approximate subsurface model early on. GeoDNN is not designed to replace the geoscientists (heaven forbid!). Other ML techniques help in well design, drilling and logging. GeoDNN’s geophysical feature detection is set to ‘greatly accelerate seismic processing’ but it is not (yet) perfect. GeoDNN was developed with help from MIT.
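Shell has not published GeoDNN’s architecture, but the generic idea of convolutional feature extraction from seismic amplitudes can be sketched as follows in PyTorch, on a random stand-in for a seismic patch.

```python
# Generic sketch of convolutional feature extraction from a seismic amplitude patch.
# The architecture below is illustrative only; GeoDNN's design has not been published.
import torch
import torch.nn as nn

class SeismicFeatureNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=5, padding=2),   # learn local texture filters
            nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1),  # combine into larger-scale features
            nn.ReLU(),
            nn.MaxPool2d(2),
        )

    def forward(self, x):
        return self.features(x)

# A single-channel 128x128 patch of seismic amplitudes (random stand-in data)
patch = torch.randn(1, 1, 128, 128)
feature_maps = SeismicFeatureNet()(patch)
print(feature_maps.shape)  # torch.Size([1, 32, 32, 32])
```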

Shell is also using ML in reservoir engineering with a prototype tool for simulator post-processing. Here the ‘AutoSum’ tool provides automatic summaries across large ensembles of reservoir models to help with understanding key sensitivities, ‘current tools are not adapted to this.’ AutoSum combines traditional physics-based models with analytics to identify the main contributions to geological uncertainty and minimize development strategy risk. If more data is required, the simulator can be run again to further investigate the uncertainty space. Machine learning has been applied to relate subsurface features with overall production strategy. Shell’s preferred ML tool is Python. Challenges remain: it is hard to persuade those used to physics-based models to switch. Data access and scale are issues, as is the lack of ground truth (especially in seismics). But the approach has spin-off benefits. Data quality issues are found faster and the approach ‘frees up experts to focus on the important stuff.’ Finally, Limbeck warned of the crucial need to maintain underlying databases. [EarthDoc 89275]
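One common building block of such ensemble post-processing, ranking uncertain inputs by how strongly they drive a simulated outcome, can be sketched as follows. The parameters and ‘simulation results’ are synthetic; Shell’s actual AutoSum method is not public.

```python
# Rank uncertain ensemble inputs by absolute Spearman correlation with an outcome.
# Parameter names, ranges and 'results' below are synthetic stand-ins.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
n_models = 200

# Uncertain inputs sampled per ensemble member (illustrative names and ranges)
inputs = {
    "porosity_mult":    rng.uniform(0.8, 1.2, n_models),
    "perm_mult":        rng.uniform(0.5, 2.0, n_models),
    "aquifer_strength": rng.uniform(0.0, 1.0, n_models),
}

# Stand-in for cumulative oil production from the ensemble of simulator runs
cum_oil = (2.0 * inputs["perm_mult"] + 1.0 * inputs["porosity_mult"]
           + 0.2 * inputs["aquifer_strength"] + rng.normal(0.0, 0.1, n_models))

# Rank parameters by absolute rank correlation with the outcome
ranking = []
for name, vals in inputs.items():
    rho, _ = spearmanr(vals, cum_oil)
    ranking.append((name, abs(rho)))
ranking.sort(key=lambda kv: kv[1], reverse=True)

for name, rho in ranking:
    print(f"{name:17s} |rho| = {rho:.2f}")
```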

Nicolas Audebert from the French R&D establishment Onera described the use of deep learning on hyperspectral data. The work was supported by Total as part of its Naomi (‘new advanced observation method integration’*) project. Hyperspectral data comes from airborne and satellite-mounted sensors. The term refers to the wider than visual bandwidth, from infrared to UV, with a spatial resolution of around a meter. Imagery is used for geology and surface material identification. Convolutional neural nets have proved powerful for multimedia and RGB imagery. The neural net approach has been used since the 1970s on a pixel by pixel basis. Today, 2D/3D techniques use the full raw images, noise and all. Deep learning on these huge datasets has ‘blown everything else away.’ Adam stochastic gradient descent, PyTorch and Nvidia Titan X GPUs also ran, as did the Pavia Center dataset. Audebert is still on the lookout for more annotated data. ‘The potential of deep learning is not yet fully realized.’ [EarthDoc 89272]
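For readers who want to see the moving parts, here is a minimal per-pixel spectral classifier trained with Adam in PyTorch, in the spirit of the pixel-wise baselines mentioned above. Band count, class count and data are synthetic stand-ins; Audebert’s actual 2D/3D convolutional models are described in the EarthDoc paper.

```python
# Minimal per-pixel hyperspectral classifier trained with Adam. Band count, class
# count and data are synthetic; this is not the Onera/Total model.
import torch
import torch.nn as nn

n_bands, n_classes = 144, 9          # e.g. sensor bands, surface material classes
model = nn.Sequential(
    nn.Linear(n_bands, 64), nn.ReLU(),
    nn.Linear(64, n_classes),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Random stand-ins for labeled pixels (spectrum -> material class)
X = torch.randn(1024, n_bands)
y = torch.randint(0, n_classes, (1024,))

for epoch in range(20):
    optimizer.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    optimizer.step()

print(f"final training loss: {loss.item():.3f}")
```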

Henri Blondelle (Agile Data Decisions) outlined his work on the CDA unstructured data challenge. CDA provided a ‘fantastic dataset’ of logs and reports from decades of North Sea exploration. But much (80%) comes as scanned TIFF imagery or PDF documents. Blondelle wryly observed that the first task of a geoscientist is, ‘unstructure your data by making a print.’ The big question for the CDA challengers is how to put the structure back in. AgileDD’s IQC tool was used to classify documents, extracting metadata such as well status from reports, along with a measure of the probability of correctness. A range of tools contributed to the initiative: the Python ML library, Java, MySQL, JBoss Tools, Azure and a ‘hybrid cloud.’ CDA’s CS8 hardcopy catalogue formed the basis of a classification taxonomy, along with 2,000 annotated documents used as a training set. [EarthDoc 89273]
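The generic document classification step, OCR’d text in, document type and a probability of correctness out, can be sketched with scikit-learn as below. The training snippets and labels are invented; AgileDD’s IQC pipeline and the CDA taxonomy are considerably richer.

```python
# Generic sketch of document classification with a probability of correctness.
# Training snippets and labels are invented; this is not the IQC pipeline itself.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

train_texts = [
    "composite log run 1 gamma ray sonic caliper",
    "end of well report drilling summary casing cement",
    "well test report flow rates choke pressure buildup",
    "composite log density neutron porosity scale 1:500",
]
train_labels = ["well_log", "end_of_well_report", "well_test_report", "well_log"]

clf = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
clf.fit(train_texts, train_labels)

new_doc = "final well report with drilling and casing summary"
probs = clf.predict_proba([new_doc])[0]
best = probs.argmax()
print(clf.classes_[best], f"p={probs[best]:.2f}")   # predicted type and its probability
```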

* Cheekily described in Le Figaro as a relaunch of the sniffer plane (avion renifleur).


More on Schlumberger’s Delfi

Google blogger reveals inside workings of Schlumberger’s flagship ‘cognitive’ data platform.

Before we gave Schlumberger last month’s lead we did ask politely for more information on the breakthrough cloud-based infrastructure. None was forthcoming. However, Schlumberger’s information-retentive guardians of the truth forgot to tell partner-in-crime Google, whose SVP Urs Hölzle has been blogging away regardless. Hölzle, reprising his address to the private, clients-only Schlumberger Forum in Paris last month, provided more on Delfi’s innards. So, according to Google, the Delfi E&P data lake is based on Google BigQuery (data warehouse), Cloud Spanner (RDBMS) and Cloud Datastore (NoSQL) ‘with more than 100 million data items, some 30TB of petrotechnical data.’ According to Hölzle, Schlumberger’s petrotechnical flagship, Petrel, and the Intersect simulator are running in the Google cloud, ‘integrated into Delfi.’ WesternGeco’s Omega geophysical data processing is ‘running at a scale not possible in traditional data center environments’ thanks to Google cloud-based Nvidia GPUs and ‘custom machine types’ giving a compute capacity of ‘over 35 petaflops* and 10PB of storage.’ Other novel tools include TensorFlow, open source artificial intelligence, used for log QC and interpretation and also for 3D seismic interpretation.
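By way of illustration, here is how an application might query a BigQuery-hosted data lake of the kind Hölzle describes, using Google’s own Python client. The project, dataset and table names are hypothetical; Delfi’s actual schema has not been published.

```python
# Sketch of querying a BigQuery-hosted petrotechnical data lake. Project, dataset and
# table names are hypothetical. Requires the google-cloud-bigquery package.
from google.cloud import bigquery

client = bigquery.Client(project="my-eandp-project")  # hypothetical GCP project

sql = """
    SELECT well_id, curve_name, COUNT(*) AS n_samples
    FROM `my-eandp-project.petrotech.log_curves`      -- hypothetical table
    WHERE acquisition_date >= '2017-01-01'
    GROUP BY well_id, curve_name
    ORDER BY n_samples DESC
    LIMIT 10
"""

for row in client.query(sql).result():
    print(row.well_id, row.curve_name, row.n_samples)
```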

Hölzle reports that Schlumberger has deployed the Apigee API management platform (acquired by Google in 2016) to provide ‘openness and extensibility,’ allowing clients and partners to add their intellectual property and workflows into Delfi. Read Hölzle’s blog here.

Chatting with some Schlumberger people at the SEG, we found it unclear how much, if any, of this laundry list of Google software is fully deployed in Delfi.

* Although SLB’s Paal Kibsgaard reported that Schlumberger had a 27PF capacity way back in 2013.


Software, hardware short takes ...

Coreworx Express, GE blockchain-as-a-service, Naveego DQS, Tracts, EcoSys V8.0, Probe iQ, Spectraseis, Devon, ExproSoft WellMaster IMS, Frogtech Geoscience Terraflux, NSI Technologies StimPlan, Wireless Seismic RT3.

Coreworx has released a low cost, cloud-based ‘Express’ version of its eponymous change management solution for engineering projects. Coreworx CM Express lets project participants manage the change order process and includes templates ‘used by thousands of capital projects around the world.’ In a similar vein, Coreworx has rolled out Coreworx RFI for collaborative management of engineering project requests for information.

GE has announced blockchain-as-a-service, a new component of its Predix big data/IoT platform. The service combines peer-to-peer communication, cryptography and ‘game theory.’

Naveego has released a new version of DQS, its cloud-based data quality and master data management solution. New features include cross-system data comparison, a data health dashboard and ‘big data technology.’

Tracts automates oil and gas title reporting with ‘patent-pending’ technology. Tracts is backed by VC Houston Ventures.

The V 8.0 release of EcoSys sees the integration of the stand-alone Portfolios and Contracts solutions with the flagship EcoSys Projects.

Probe iQ is a multi-purpose well inspection tool that measures casing thickness and inner diameter and analyses casing material properties.

ESG Solutions’ Spectraseis surface microseismic array provides direct location of proppant placement during fracking. The solution was featured in a joint presentation with Devon at the 2017 SEG.

ExproSoft has added reporting and trend analysis of preventative maintenance activity to its WellMaster IMS flagship.

Australian Frogtech Geoscience has released Terraflux, a tool for the evaluation of regional heat flow in petroleum system modelling. Terraflux leverages Frogtech’s Seebase studies of the Gulf of Mexico, South China Sea, the North Sea and elsewhere.

V8 of NSI Technologies’ StimPlan fracking software is a ‘complete rewrite’ that has increased stability and improved processing time. New features include stress ‘shadowing’ between fractures, batch processing for sensitivity studies and export of fracture networks to a reservoir simulator.

Wireless Seismic has rolled out RT3, an ultra-high channel count onshore seismic recording system. RT3 supports interactive management of a 250,000 plus channel recording spread in real time.


Regulatory round-up

Updated APEGA (Alberta) reserves reporting. Alberta Geological Survey models Peace River in Minecraft! US Groundwater Protection Council releases WellFinder. PPDM Regulatory meet update.

Apega*, the Association of Professional Engineers and Geoscientists of Alberta, has issued a ‘major’ update to its Professional Practice Standard (PPS) for the evaluation of oil and gas resources and reserves for public disclosure. The new version aligns Apega’s work with recent changes to the Canadian Securities Administrators’ National Instrument 51-101 and the COGE handbook from the Calgary chapter of the Society of Petroleum Evaluation Engineers. The 13-page Apega PPS is full of entreaties to do right, not wrong, and makes a valiant attempt to explain the intricate relationship between it and the multiple ‘related documents’ that govern reserves reporting.

The Alberta Geological Survey has developed a 3D geological model of the Peace River area. The model was built to assist with the geological and geochemical investigation of odors and emissions from heavy oil and bitumen production in the Peace River Oil Sands region in response to a formal proceeding by the Alberta Energy Regulator. While the model was created with 3D geomodelling software (Voxler), AGS has also released a ‘360 virtual reality tour’ of Peace River developed in Minecraft. Take the tour on Youtube or (if you have Minecraft) download the 20MB model.

The US Ground Water Protection Council has released RBDMS WellFinder, a free mobile application powered by data from state regulatory programs. WellFinder allows users to select oil and natural gas wells from an interactive map and display ‘valuable data and information.’ The app was originally developed in collaboration with the Oklahoma Corporation Commission. Since then, regulatory agencies in nine states, including Oklahoma, New York, Nebraska, Arkansas, Alabama, Mississippi, Colorado and Kentucky, provide well data. Download WellFinder from the App Store and Google Play.

The PPDM Regulatory Data Standards (RDS) Committee met earlier this year. The Well Status and Classification work group has received ‘regulatory feedback’ and is to submit a revised set of values and recommendations for regulatory endorsement. PPDM is also working on well milestones and dates disambiguation and is to report real soon now. A roundtable discussion on ‘potential data problems for pipelines’ found that ‘more research into different jurisdictions’ legislation is required, particularly to understand whether these present common terms and definitions for pipelines and their components.’ More disambiguation is in the err.. pipeline? The RDS has participation from Canadian, US and Australian regulators.

* APEGA is not a regulator but the Canadian Securities Administrators’ Standards of Disclosure for Oil and Gas Activities refers to the Apega Coge handbook in its authoritative National Instrument 51-101. All engineers and geoscientists practicing in Alberta are required to register with APEGA.


ECIM 2017, Haugesund

NPD - digital to counterbalance low oil price. IBM - tape still going strong. R2I’s PowerBI showcase. PetroChina - AI lightens dark big data. Teradata and the Teashops. Dell/EMC, ‘Buy not build is dead!’ NDB updates subsurface applications benchmark. Statoil’s data platform in the cloud. Teradata’s managed data lake and the future of subsurface data management. Halliburton on disrupting with open source big data analytics. EPIM on the vicissitudes of Norwegian production reporting.

In her keynote address, Maria Juul (Norwegian Petroleum Directorate) explained that the ‘digital transformation’ is now a central part of Norwegian national strategy. The transformation includes the internet of things, big data and technology, with the ‘theme of data’ running throughout the ‘data-driven’ transformation. The downturn has put the Norwegian continental shelf under pressure. But the digital transformation can counterbalance the low oil price through ‘IT, standards and automation.’ Machine learning and robotics will ‘transform the way we work.’ But ML is ‘also about people.’ Statistics Norway, the national statistics agency, has predicted that AI and robotics will see ‘one in three jobs disappear.’ For Juul, these technological advances bring opportunities for innovation and new jobs, so long as new skills and proficiencies are learned. Key among these are IT security, HSE and data quality. The future is ‘both complex and simplified.’ IT security is a priority in NPD’s 2016-2020 strategy plan.

Mark Lantz (IBM) entertained the audience with a super-geeky presentation on the future of tape. Worldwide data growth is running at 42% annually and storage is a race between disk and tape. Hitherto, $/GB storage costs have been driven by disk technology but, as the 2015 INSIC tape storage roadmap showed, disk drive scaling is slowing while tape is forecast to raise the storage bar at a steady 33%/year for another decade. 80% of data is inactive and can be stored on tape. Tape is energy efficient and secure, with a long media life. A 2015 investigation by the Clipper Group of the total cost of ownership, over a nine-year period, of a petabyte archive growing at 55%/year found a 6.7x cost advantage of LTO over disk. Could it be that tape’s evolution will slow down too? Seemingly not. While disk is approaching the ‘superparamagnetic limit’ due to grain size, tape’s much lower areal density can continue to scale log-linearly for maybe 20 years. INSIC is forecasting 250TB cartridges by 2025. Lantz concluded that today, ‘tape is good for big data and its cost advantage over disk will continue to grow.’
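A little back-of-the-envelope compounding with the growth rates quoted above (starting values are our own assumptions, for illustration only) gives a feel for the numbers.

```python
# Back-of-the-envelope compounding of the growth rates quoted above (illustrative only).
import math

def compound(start, rate, years):
    """Value after compounding 'start' at 'rate' per year for 'years' years."""
    return start * (1 + rate) ** years

# Clipper Group TCO scenario: a 1 PB archive growing at 55%/year over nine years
print(f"archive after 9 years: {compound(1.0, 0.55, 9):.0f} PB")                  # ~52 PB

# INSIC-style tape scaling: capacity multiplier after a decade at 33%/year
print(f"tape capacity multiplier over 10 years: {compound(1.0, 0.33, 10):.0f}x")  # ~17x

# Worldwide data growth quoted by Lantz: 42%/year, i.e. a doubling time of about two years
print(f"data doubling time at 42%/yr: {math.log(2) / math.log(1.42):.1f} years")
```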

Kine Johanne Aardal gave an assured presentation of her startup’s (Robotic Resource Insight, R2I) work on creating business value through data integration, analytics and visualization. The online demo shows R2I’s Microsoft PowerBI development accessing public NPD data via a compelling GUI.

Li Dawei (a.k.a. David Lee) outlined PetroChina’s use of data mining algorithms in petroleum. PetroChina has 1.6 million employees and operates the largest oilfield in China with 100,000 wells, and 70 large IT systems. Moreover, PetroChina has 2 petabytes of data ‘whose deep value has not yet been realized.’ Enter data mining as a way of leveraging this huge dark dataset. In China, data mining PhDs peaked in 2012. The subject is now considered ‘mature.’ All that is needed is to select the best algorithm from ‘classification, regression, clustering, estimation, prediction and association.’ Dawei has tested the approach on the C&C Reservoirs global oil and gas field database, looking for key factors that influence recovery. It turns out that ANN (artificial neural network) and SVC (support vector network) regression techniques are both ‘applicable,’ with ANN best for recovery factor regression and SVC the tool of choice for classification.
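The two model families Dawei compared can be sketched with scikit-learn on synthetic data, as below. The C&C Reservoirs database is commercial and is not reproduced here; attributes, labels and coefficients are invented.

```python
# ANN regression and support vector classification on synthetic field attributes.
# Everything below is a stand-in; the C&C Reservoirs data is not reproduced here.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n = 500
# Stand-in attributes: porosity, log10 permeability (mD), API gravity, depth (km)
X = rng.uniform([0.05, -1.0, 15.0, 1.0], [0.35, 4.0, 45.0, 5.0], size=(n, 4))
recovery = 0.10 + 0.8 * X[:, 0] + 0.0125 * X[:, 1] + rng.normal(0, 0.03, n)  # synthetic RF
drive = (X[:, 1] > 1.5).astype(int)   # synthetic categorical label (e.g. drive mechanism proxy)

X_tr, X_te, rf_tr, rf_te, d_tr, d_te = train_test_split(X, recovery, drive, random_state=0)

# ANN for recovery factor regression
ann = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0).fit(X_tr, rf_tr)
print(f"ANN R^2 on recovery factor: {ann.score(X_te, rf_te):.2f}")

# Support vector classifier for the categorical attribute
svc = SVC(kernel="rbf").fit(X_tr, d_tr)
print(f"SVC accuracy: {svc.score(X_te, d_te):.2f}")
```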

Duncan Irving, now a ‘think big’ consultant chez Teradata, agreed that data science has made it into the upstream. Data-driven decision making has a long history in business-at-large (Irving cited Joe Lyons’ deployment of the LEO computer at his eponymous teashops in 1951). But despite BP’s multi-million dollar investment in NASA AI spin-out Beyond Limits, oil and gas has in general missed out on the big data movement because of culture, skillsets, platforms and, until recently, the lack of an economic driver. The culture issue is well illustrated in geophysics, which turned its back on IT twenty years ago and built its own HPC infrastructure. This is now ‘quite impenetrable to generic IT.’ But this can be fixed through improved data management. Companies have a choice: either keep storing data as we have done for years or ‘add a few smarts’ to data management and make it (and the data managers) more useful to the business. Enter business-focused data management and the chief data officer, an ‘emerging’ C-level role in oil and gas. There needs to be a shift from ‘custodianship’ (creating walls, avoiding change) to ‘stewardship’ (sharing, teaching). Stewards can mentor and support ‘citizen data management’ and ‘do better than storing everything in PowerPoint.’

David Holmes (Dell/EMC) also thinks that access to data should not confer a competitive advantage. In Norway, the NPD has promoted open access to data for decades via Diskos. So how does IT confer a competitive edge? Perhaps it is the applications and how we use them that are transformational. But deploying applications is costly in terms of infrastructure and they are expensive to deliver and maintain. The cloud promises to change this, but there is an assumption that apps will take care of persistence and availability. Unfortunately, today’s ‘legacy, pre-cloud’ data management apps (Schlumberger’s Petrel Studio, Landmark’s OpenWorks) just assume that such infrastructure is already there. Cloud-native means a different set of assumptions and a shift to a ‘software-defined’ environment.

Holmes advocates a ‘multi-cloud’ environment with cloud-native new apps and infrastructure-as-a-service for legacy. ‘Look at how much you spend just keeping the lights on, most IT budgets are spent on keeping things alive.’ He recommends a cloud-first strategy, either on or off premises, or with one of the private cloud providers, like Dell’s own Virtustream. Some large ERP-type apps may require specialist skills to migrate, probably with an abstraction layer to separate apps from virtualized hardware. Today all E&P software vendors are racing to be cloud native. Holmes, citing Gartner, sees a return of in-house app development. ‘75% of business apps will be built not bought in 2020.’ The movement will also see the rise of open source software. It used to need a huge team to do a ‘hello world’ app. Today teams can put together their own apps and avoid vendor lock-in. While it may not be possible to write a conventional reservoir flow simulator in a couple of weeks, ‘machine learning will write you one!’ Will a ‘no physics’ methodology be OK for the SEC? Probably not. Will it be useful? Definitely. The tools of the trade, the Hadoop ecosystem, may be terrifying, but you can always use a packaged solution like the Dell-backed Cloud Foundry, the fastest path to innovation and a virtuous circle of development. In the Q&A, Holmes admitted that the risk of competing cloud ‘platforms’ in the future was real.

New Digital Business’ Jonathan Jenkins provided a progress report on the Subsurface Applications Benchmark, a joint venture with Aupec. The SAB kicked off in 2011 and has studied the marketplace for six years, providing detailed trends in subsurface application and data tool usage. Schlumberger Petrel ‘absolutely dominates’ the subsurface, per company and per user. Since the downturn there has been growth in cheaper tools like OpendTect. Landmark’s DecisionSpace ‘is not dead’ and is starting to creep up the charts, ‘offering some competition for Petrel.’ For seismic interpretation the top three are Petrel, Kingdom and Geoprobe.

SeisWorks ‘is still there’ and OpendTect is ‘coming up strong.’ For mapping and visualization, ArcGIS and Petrosys share the top spot. For static geomodelling it’s Petrel, RMS and Skua/Gocad, with DecisionSpace making some headway. Reservoir engineering is still dominated by Schlumberger, either with Eclipse or, increasingly, Petrel RE. Landmark continues to dominate drilling. Despite the best efforts at portfolio rationalization, the number of tools in a workflow remains stable. In data management, Petrel Studio ‘does not appear to be getting huge traction,’ the Petrel reference project and OpenWorks ‘dominate’ and ‘spatial databases are no longer dominated by ESRI.’ BlueCube and Landmark Earth are now launching as cloud ready and EnergySys is ‘designed for the cloud’ and available in a ‘pay by barrel’ license. OpendTect with all bells and whistles is available for $200/day. Some majors still do in-house development. Both Total and Shell prefer to build in the face of expensive vendor tools. But their peers still buy software. Inter-tool compatibility is much better today than before.

Nina Reiersgård and Per Kåre Foss described how the new cloud-based Statoil data platform is being built. The drivers for the SDP were a) the complex and non-scalable legacy infrastructure, b) more real time data, c) the difficulty of finding stuff in the 30 petabyte archive and d) the advent of novel data sources like drones, the robot ‘snake’ and satellite imagery. 2015 saw the kick-off of Statoil’s future IT project and the search for a new (additional) data platform. The idea was for connectivity to the legacy platform from a ‘one stop’ data platform in the cloud, supporting analytics and external collaboration. A bold, multi-cloud strategy means that Amazon is used for infrastructure and HPC, the data platform is on Microsoft Azure and some subsurface and future HPC is on Google. The SDP spans data storage across multiple databases and API connections to SAP and other enterprise tools. Legacy systems without an API are replicated to the SDP. The cloud ingests both streaming and batch data from the IoT. Predictive maintenance is currently the main app, but Statoil is working on subsurface analytics. All data in the cloud is encrypted so it is ‘actually more secure than legacy.’ Loading processes are all automated and only seven people run the platform. The recently announced Statoil digital center of excellence was enabled by the SDP. The company is ‘recruiting heavily,’ both internally and externally.

Teradata’s Jane McConnell provided a peek into the future of subsurface data management in the ‘managed data lake.’ It is clear that a lot is happening in IT with analytics, unstructured data and hackathons. Data use is changing so data management needs to change too. A data lake groups original format data in a ‘collection of storage instances.’ These need modelling and management, ‘otherwise you will have a data swamp.’ Whereas in the old world data was loaded manually, in the new world of the data lake, data is picked up from a directory and automatically ingested into the lake through predefined pipelines. Enter Kylo* (a recent Teradata acquisition) as a data lake management platform. Kylo manages enterprise class data lakes in Hadoop and Spark. Teradata has developed a pipeline for LAS (log) data using an Apache NiFi template. Regarding seismic data McConnell observed, ‘standards are good but it would be nice if they did not assume we were writing to 9 track tape!’ In another nod to Norway’s regulators, McConnell also stated that ‘the ability to access data should not confer a competitive advantage.’
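Teradata’s pipeline uses an Apache NiFi template, which we have not seen. But the parsing step of a LAS ingestion pipeline can be sketched in Python with the open source lasio package (our choice, not named in the talk), producing a tidy table ready to land in a data lake.

```python
# Sketch of the parsing step of a LAS ingestion pipeline, using the open source lasio
# package (our choice, not named in the talk). The file path is a placeholder.
import lasio

las = lasio.read("example_well.las")

# Curve data as a pandas DataFrame indexed by depth
df = las.df().reset_index()

# Attach well-level metadata from the LAS header so each row is self-describing
df["well_name"] = las.well["WELL"].value

# From here the table could be landed in HDFS, a Kylo feed or a relational staging area
df.to_csv("example_well.csv", index=False)
print(df.head())
```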

Halliburton’s Ashwani Dev recalled the disruption of Amazon, Uber, AirBNB and others to ponder the impact of all this on the upstream. Well completions ‘create 12 terabytes of data/day,’ which is amenable to big data analytics. BDA is coming in right now in the form of open source tools like R and Python, but the industry is reluctant and slow to adopt. Dev observed that despite fifty years of stuck pipe research on OnePetro, ‘we are still nowhere!’ But if we manage to use all the data, leveraging the emerging technologies, we can expect actionable insights. Dev’s open technology slide was very busy, with maybe 50 or so ‘open source emerging technologies’ making up the big data ecosystem. One current use case is reservoir petrophysical property prediction, now ‘90% accurate.’ An advisory tool has been developed to rationalize unused functions in software by monitoring mouse clicks. ‘We (or maybe you) are paying for software that we don’t use.’ Dev wound up with a plug for Halliburton’s OpenEarth community, backed by RedHat, Energistics, Total, CGG, Shell, Devon, Statoil and IHS, which is set to drive the open source model à la Linux.

Magnus Svensson of Norway’s Epim joint industry body traced the tortuous history of Norway’s production reporting. Back in 2006, the first version of Epim’s daily production report was released. This has been through several evolutions, most recently the Epim reporting hub, a common data sharing platform that connects field production data to operations and the regulator. Reflecting on weaknesses in the current systems, after fourteen years of effort, Svensson wondered, ‘what went wrong?’ Some technology components (read the semantic web) did not fulfill their early promise. Also, production reporting is complex, with different daily, monthly, partner, asset and yearly reports along with updates and reconciliations. Such issues are particularly relevant as we move into the new world of big data, digitalization and the data lake. Text input can be valuable but is hard to capture, especially when in ‘offshore language.’ PDF still rules, ‘this is a challenge, we can’t get rid of it.’ Even a ‘trivial’ daily production report is not trivial at all!

Visit the ECIM home page here.

* Kylo looks like a GUI-driven reinvention of the Unix Shell!


Folks, facts, orgs ...

ENGlobal, Atlantic Gulf and Pacific, BCCK, Americas Natural Resources, Digital Route, Flowserve, H-D Advanced Manufacturing, KBR, NextDecade, NGP Energy Capital, National Oilwell Varco, Orbital, Seeq, S&P Global Platts, Swagelok, Victrex, Weatherford, Williams.

Shalon Simmons has joined ENGlobal as program manager cyber security/operational technology.

Abhilesh Gupta is the new global CFO and commercial head of Atlantic Gulf and Pacific. He hails from Asia Genco.

Kevin Blount has been named VP corporate development at BCCK Holding.

Scott Van Bergh is vice chairman of Americas Natural Resources Group. He was previously with Global Energy Investment Banking.

Digital Route has promoted Andreas Zartmann to CEO. He replaces Johan Bergh who steps down from the CEO role to lead DR’s sales organization.

David Wilson replaces retiree Tom Pajonas as president, Industrial Products Division at Flowserve. Lee Eckert has joined Flowserve as senior vice president and chief financial officer. He hails from CHC Group.

Michael Vincent has been promoted to President and CEO at H-D Advanced Manufacturing.

KBR has appointed Lieutenant General Wendy Masiello to its board.

NextDecade has named Matt Schatzman to the newly created position of President. He was previously with BG.

James Wallis has joined NGP Energy Capital as partner in Houston. He was previously with Lime Rock Partners.

Chevron retiree Melody Meyer has joined National Oilwell Varco’s board.

Openlink has promoted Rich Grossi to CEO. John O’Malley continues as executive chairman.

Jacob Tivey is now trace measurement specialist at Orbital. He recently graduated from the University of Birmingham.

Seeq has hired Todd Amy as sales executive in Houston, Cody Ray Hoeft as software developer and Mike Talmadge as analytics engineer in Houston.

Chris Midgley is to lead S&P Global Platts’ analytics content. He hails from Shell.

Bill Canady is now president and COO at Swagelok. He was previously with Hillenbrand.

Jakob Sigurdsson is to succeed retiree David Hummel as CEO at Victrex.

Weatherford has appointed Karl Blanchard as executive VP and COO. He hails from Seventy Seven Energy.

Williams has named John Chandler senior VP and CFO. He succeeds retiree Don Chappel.


Future of seismic data storage

Eurotech, ‘large companies have no long term storage strategy.’ Iron Mountain, ‘not all are ready to face storage challenges.’ Statoil as remastering poster child. IBM ‘cloudifies’ tape.

At a seminar on the future of seismic storage organized by the Norwegian Petroleum Directorate earlier this year in Stavanger, operators were warned that inaction is a high-risk option with regard to preserving legacy seismic data on tape. Egil Simones has worked with seismic-related data since 1992, starting as a tape monkey, then field engineer and, latterly, as CEO of Eurotech Computer Services Norway, one of the few companies that still specialize in tape technology. Simones has experience with tape from many large companies and has ‘seen all the problems there are.’ Many large companies have no long term migration strategy. Only when somebody dies does stuff get thrown away. Equipment and tapes are assumed to last perhaps thirty years. Information and systems are managed and planned for based on today’s viewpoint and values, not on what the future will bring. It is now hard to source the equipment needed to read older tapes and it is not going to get any easier (or cheaper) in the future. Companies should have a data migration strategy. If this is done right, ‘you can actually get rid of the old tapes, not just add another set to manage!’

John Kjetil Pedersen (Iron Mountain) traced the history of media and technology for seismic and well data recording, from 7, 21 and 9 track tapes to modern 3592 media. There is now a general recognition that media and data formats change over time and that data deteriorates due to aging and storage conditions. The seismic industry is not alone here as film, broadcast and others face the same challenges. However, most data owners have no clear media strategy. Old media and formats are kept too long and meanwhile it is getting harder and more expensive to recover old data as hardware expires and operators retire. Unfortunately, ‘not everyone is willing to face these challenges properly.’

Judging by Sivert Kibsgaard’s presentation, Statoil should be a poster child for an orderly seismic remastering program. This has run in four phases from 2004 through to 2017 and has seen over half a million tapes upgraded to modern 3590/3592. Looking to the future, Kibsgaard weighed up the pros and cons of LTFS (see below). The latest media offers easy access to content. Even though it is tape, it is more like a USB stick or disk and there are no issues regarding block size and tape handling commands. On the other hand, its relationship with SEG-D is uncertain and its use may not be standard across vendors.

IBM’s Robert Haas described how IBM is ‘cloudifying’ tape storage with storage objects and by extending OpenStack Swift to high latency media (i.e. tape). IBM provides OpenLTFS as an entry point to its Spectrum LTFS tape libraries. If there remains any doubt as to tape’s continuing importance in the modern IT/big data world, Haas pointed out that Google and others in the ‘Intel Super 7*’ offer various combinations of LTO and Jag tape drives in the cloud. Haas also provided a pointer to an old but interesting presentation by Google’s Raymond Blum on ‘How Google backs up the internet.’ Blum observed that a) backups are useless in themselves, what is important is the restore, b) internet/Google scale mandates taking humans out of the loop and c) diversity is key, tape is great because it is not disk! More next month in our report from the SEG Technical Standards committee.

* The Intel Super 7 are the GAFA plus others which influence chip development.


Going, going... green

ProGHG reporting. CCS makes major strides. $36 million for US CCS. Quanta3 methane sensing.

Wood Group has launched ProGHG, a new application for onshore oil and gas producers to report greenhouse gas emissions under EPA subpart W (40 CFR 98) regulations. The solution targets gas producers with emissions greater than 25,000 metric tons of CO2 equivalent or operating more than 800 wells.

According to the Global CCS Institute, the world made ‘major strides’ in carbon capture and storage (CCS) in 2017. For fossil fuel producers, CCS is ‘the only climate mitigation technology that can rescue the trillions of dollars of fossil assets that may otherwise be stranded.’ Although more CCS facilities are operational, the current level of CCS deployment falls short of what is required to meet the Paris ‘well below 2°C’ target. CCS uptake ‘must be accelerated.’ OTOH, since the early 1970s some 200 million tonnes of CO2 has been securely injected into the subsurface, ‘putting paid to assertions that CCS is an experimental or untried technology.’

US secretary of energy Rick Perry recently announced $36 million for projects to ‘advance carbon capture technologies.’ Perry described carbon capture technologies as ‘one of the most effective ways we can continue to leverage the sustainability of our Nation’s fossil fuel resources while advancing environmental stewardship.’

Statoil, Vattenfall and Gasunie have signed an agreement to evaluate the possibilities of converting Vattenfall’s gas power plant Magnum in the Netherlands into a hydrogen-powered plant with a 4 Mtpa reduction of CO2 emissions. The plant is to extract hydrogen from natural gas and sequester the CO2 on the Norwegian continental shelf.

Shell has kicked-off a methane detection pilot at its Rocky Mountain House shale gas facility in Alberta. The test is part of the Methane Detectors Challenge (MDC), a partnership between the Environmental Defense Fund (EDF), oil and gas companies, US government agencies and technology developers to test methane detection technologies.

Technology provider for the Shell pilot is Quanta3 whose sensing system continuously monitors methane emissions, providing Shell with real time information on the integrity and performance of their sites. Shell is also involved in the Oil and Gas Climate Initiative (OGCI) that sets out to ‘understand the gaps’ in methane data and detection technology. The US EPA puts current methane emission estimates (leaks) at 1.1% of total gas production in the US.


Noggateway.org

US EIA website consolidates well data previously only available from commercial providers.

The US Energy Information Administration has consolidated well data from ten participating states* into the National oil and gas gateway, a collaborative initiative between the EIA, the Groundwater protection council, member states and the Department of energy’s office of oil and natural gas. NOGG participation is open to all oil and natural gas-producing states which update well-level data monthly.

Public users of the gateway can view, analyze and export well name, location, API number and operator. Also available is the current well type and status along with production, injection, disposal and completion data. The website reproduces hydraulic fracturing chemical disclosure reports from FracFocus.

Previously, well-level data was only available on individual state websites or from commercial databases. Only individual states may modify the data in the Gateway, and state agency websites should be considered the definitive source for all data in the Gateway.

* Currently Alabama, Arkansas, Colorado, Kentucky, Mississippi, Nebraska, New York, Oklahoma, Utah and West Virginia.


Paradigm/k

Cloud-based production management solution rolled out at San Antonio SPE ATCE.

At the 2017 SPE ATCE in San Antonio, Paradigm announced ‘Paradigm K’ (for permeability*), a new cloud-based production management solution. K is said to provide ‘unprecedented speed’ and a holistic view of physical and virtual measurements, built on reservoir physics-based predictive analytics. The science underlying K was largely developed by ex-Schlumberger CTO Michael Thambynayagam. More on K next month in our interview with Paradigm’s Indy Chakrabarti.

Also, in a joint ATCE presentation with Dassault Systèmes, Paradigm described a novel way to optimize grids for accurate representation of geology in geomechanical simulations. The ‘3D hybrid grid’ allows for a combination of hexahedral and tetrahedral cells that adapt to various complex geometries. The hybrid gridding is said to lead to more accurate fluid flow modeling around wells, faults and fractures.

* K is a universal symbol for permeability. But why K? A quick spin through the 600 plus pages of Henry Darcy’s original work, ‘Les fontaines de la Ville de Dijon,’ leads us to believe that it might be because, in French, K is phonetically closest to the ‘c’ of coefficient. Another theory is that K relates back to Darcy’s teacher Joseph Fourier’s use of K as a constant for heat flow, from the Greek καύσωνας. Other ideas to info@oilit.com please.
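For reference, Darcy’s law in its modern differential form puts k (Darcy’s K) as the constant of proportionality between flow and pressure gradient. This is the textbook statement, not Darcy’s original 1856 notation.

```latex
% q: volumetric flux (Darcy velocity); k: permeability; \mu: fluid viscosity;
% \nabla p: pressure gradient. Modern textbook form, not Darcy's 1856 notation.
q = -\frac{k}{\mu}\,\nabla p
```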


Sales, partnerships, deployments ...

OSIsoft, Rheidiant, Chevron Technology Ventures, Aker Solutions, Archeio, Clariant, Custom-Weather, GeoView Data Services, Aoheds, EDrilling, Hexagon, Plansea Solutions, ION Geophysical, Lloyd’s Register, MRC Global, Oniqua, SAP, Recon Technology, Seeq, Inductive Automation, Shell, Petrobras, TechnipFMC, Wavefront Technology, ENGlobal.

Shell has signed an enterprise agreement with OSIsoft to deploy PI System software across its global operations. The EA decouples the cost of a PI system from a company’s tag count and includes advanced analytics and digital services.

Rheidiant has joined Chevron Technology Ventures’ new Catalyst program. Catalyst is a network of start-ups with innovative products that may have positive impact on the oil and gas industry. Chevron is also to implement Rheidiant’s Smart Sign leak detection system.

Aker Solutions has secured a four-year, extensible to seven, framework agreement from Shell to provide brownfield modifications services and maintenance support for the Nyhamna and Draugen facilities in Norway.

Browning Oil Company is to use Archeio’s cloud-based well and land file management software to digitally manage and search a large volume of unstructured data related to its extensive lease holdings.

Scotia Gas Networks has chosen CA Technologies’ Privilege Access Management to secure its cloud infrastructure.

Clariant has announced a ‘low single-digit million’ dollar investment that strengthens its North American oil services. The investment upgrades and expands facilities in the Mid-Continent region and sees the construction of a new regional technical lab at the Permian Basin Midland facility.

CustomWeather is to provide GeoView Data Services with real-time marine weather, including wind speed and direction, wave height and direction, humidity and temperature. The weather data is utilized within GeoView’s custom software to help customers’ offshore operations.

Aoheds is now EDrilling’s regional well planning and well control systems integrator in China. EDrilling will provide WellPlanner alongside Aohed’s own service and support offerings.

Samsung Engineering has selected Hexagon SmartPlant Foundation to build an engineering data warehouse. The EDW will house critical engineering information on active projects across Central America, the Middle East and Asia.

Plansea Solutions and ION Geophysical Corporation have teamed to embed PlanSea’s logistics optimization algorithms into ION’s Marlin offshore operations optimization software. The ensemble provides a ‘comprehensive, real-time solution for marine logistics management that reduces costs and risks.’

Statoil has awarded Lloyd’s Register’s risk management consulting team a contract for a risk analysis of the riser platform modification project on the Johan Sverdrup field.

Schlumberger Technology Corporation’s completions unit has used Lloyd’s Register’s expertise to improve global quality management systems for Schlumberger’s oil and gas operation. Systems will be aligned with the 2016 American Petroleum Institute quality auditor certification program.

MRC Global will now be the ‘primary global provider’ of valves and valve products and services to ExxonMobil. The agreement includes global projects and maintenance, repair and operations.

Oniqua is to integrate Oniqua IQ with SAP Business Suite on SAP HANA.

Recon Technology has won contracts to develop four internet of things oil and gas production projects for three facilities at PetroChina’s Changqing Oilfield.

Seeq has introduced an updated version of its connection module for Inductive Automation’s Ignition Scada system. Seeq has also announced a new reseller program for Inductive Automation systems integrators.

Shell and Petrobras are teaming on a ‘long-term mutual collaboration’ to develop pre-salt fields in Brazil.

TechnipFMC is to provide engineering, procurement, construction and installation for Hurricane Energy’s Lancaster early production system project. TechnipFMC will also install the subsea equipment, turret buoy and mooring system.

Wavefront Technology has announced that the Powerwave Neptune pulsating selective injection valve used in waterflooding has been deemed a commercial technology by Ecopetrol’s flow assurance group.

ENGlobal’s Government Services unit is henceforth to offer its ‘heritage’ government-only engineering, automation and cyber security services to the private sector. EGS’ services to the US military include design, installation, and maintenance of automated fueling systems, tank gauging, Scada development and integration, cyber security and other engineered solutions.


Standards stuff...

OGC Underground Infrastructure. EdgeX Barcelona. W3C Web of Things.

The Open Geospatial Consortium is seeking sponsors for its Underground infrastructure pilot project to implement and demonstrate underground infrastructure information sharing. The Pilot will implement sponsor requirements based on the findings of OGC’s Underground Infrastructure Concept Study. The study is now published as a free summary engineering report.

The EdgeX internet of things open ecosystem has announced ‘Barcelona,’ its first major code release and a milestone on its roadmap towards ‘product-quality,’ interoperable open source components that enable ‘commercial differentiation.’

Rather late in the day, the World Wide Web Consortium (W3C) has climbed onto the internet of things bandwagon with the launch of a Web of Things working group. The WoT WG has released drafts of its proposed architecture, JSON-LD-based interaction models and a scripting API.
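To give a concrete feel for the proposal, the sketch below shows what a WoT ‘Thing Description’ for a hypothetical wellhead pressure transmitter might look like, expressed as a Python dict, together with a trivial property read over HTTP. Field names loosely follow the W3C drafts (the @context URI shown is the later published TD 1.0 one) and the device, URL and property names are illustrative assumptions of ours, not part of any W3C deliverable.

```python
# A minimal sketch of a W3C Web of Things 'Thing Description' for a
# hypothetical wellhead pressure transmitter. Field names loosely follow the
# WoT interaction model (properties, events, forms); the device, URL and
# property names are illustrative assumptions only.
import json
import requests  # pip install requests

thing_description = {
    "@context": "https://www.w3.org/2019/wot/td/v1",  # JSON-LD context (later TD 1.0 URI)
    "title": "WellheadPressureSensor-01",
    "properties": {
        "pressure": {
            "type": "number",
            "unit": "bar",
            "readOnly": True,
            "forms": [{"href": "https://iot.example.com/things/whp-01/properties/pressure"}],
        }
    },
    "events": {
        "highPressureAlarm": {
            "forms": [{"href": "https://iot.example.com/things/whp-01/events/highPressureAlarm"}]
        }
    },
}

def read_property(td, name):
    """Resolve a property's form href from the Thing Description and read it over HTTP."""
    href = td["properties"][name]["forms"][0]["href"]
    return float(requests.get(href, timeout=5).json())

if __name__ == "__main__":
    print(json.dumps(thing_description, indent=2))
    # read_property(thing_description, "pressure")  # requires a live endpoint
```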


Cyber sec round-up

CERT/SEI threat models ‘too optimistic.’ OSIsoft on securing PI. LR and Petras. More cyber help from Honeywell, Leidos, Schneider, Claroty, Waterfall, Siemens. Deloitte on how not to do it.

Lots of recent activity sets out to secure industrial control systems (ICS) and the ‘internet of things’ (IoT). Why? Best read the 120-page CERT/SEI report on Coordinated Vulnerability Disclosure which states that ‘we have observed that overly optimistic threat models are de rigueur among IoT products. Many IoT products are developed with what can only be described as naïve threat models that drastically underestimate the hostility of the environments into which the product will be deployed.’ Ouch!

Recent Ponemon Institute research on the state of cybersecurity in the US oil and gas industry found that cybersecurity measures ‘are not keeping pace with the growth of digitalization in oil and gas operations.’ 61% of respondents reported that their organization’s industrial control systems protection and security is ‘not adequate.’

When you are through with CERT, and supposing you have a PI System deployed, you will likely want to review a recent presentation from Harry Paul, cyber security advisor at OSIsoft, titled ‘How secure are your PI Systems? A primer for PI System security baselining.’

Lloyd’s Register is also taking an interest in ‘safety and security’ in the IoT and, through its LR Foundation, is supporting Petras, a £10 million multi-industry consortium investigating ICS threats, blockchain applications for resilience in the smart energy sector, use of the IoT ‘to understand dynamic risks,’ and the mitigation of botnet attacks.

Honeywell observes that ‘those little connectors can cause big cybersecurity trouble at plants’ and has introduced the Secure Media Exchange. Users check a USB thumb drive by plugging it into an SMX Intelligence Gateway to analyze and secure the entire drive or specific files. SMX also runs in the background on the network to control and log USB device connections. Elsewhere, Honeywell and the Singapore Economic Development Board have established a new industrial cyber security center of excellence for Asia Pacific in Singapore.

A six-page white paper from Leidos proffers advice on ‘proactive detection of advanced persistent threats’ and introduces the Cyber Kill Chain. The CKC looks at cyber security from the adversarial standpoint and models the actions an adversary takes to achieve a breach. CKC analysis is represented as a threat campaign heat map, a high-level view of a potential hack. A corresponding mitigation scorecard helps an organization assess its internal security posture against specific threats.
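Purely by way of illustration (this is not Leidos’ implementation), the toy Python sketch below shows the kind of bookkeeping behind a threat campaign heat map: observed adversary actions are mapped onto kill-chain phases and tallied per campaign. The phase names follow the commonly cited Lockheed Martin kill chain; the observation data is invented.

```python
# Illustrative only: a toy 'threat campaign heat map' that tallies observed
# adversary actions against Cyber Kill Chain phases, per campaign.
from collections import Counter

KILL_CHAIN = ["reconnaissance", "weaponization", "delivery", "exploitation",
              "installation", "command_and_control", "actions_on_objectives"]

# (campaign, phase) observations, e.g. SIEM alerts mapped to phases by an analyst
observations = [
    ("campaign_A", "delivery"), ("campaign_A", "delivery"),
    ("campaign_A", "exploitation"), ("campaign_B", "reconnaissance"),
    ("campaign_B", "command_and_control"),
]

def heat_map(obs):
    """Return {campaign: [count per kill-chain phase]} as a simple matrix."""
    counts = Counter(obs)
    campaigns = sorted({c for c, _ in obs})
    return {c: [counts[(c, p)] for p in KILL_CHAIN] for c in campaigns}

for campaign, row in heat_map(observations).items():
    print(campaign, row)   # one row per campaign, one column per phase
```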

Schneider Electric has teamed with ICS security boutique Claroty to address safety and cybersecurity challenges for the world’s industrial infrastructure. Claroty’s real-time OT/ICS network monitoring and detection solution is now available to users of Schneider’s EcoStruxure IoT-enabled, open and interoperable system architecture.

Atos has launched a ‘prescriptive’ security operations center (SOC) that leverages big data and analytics to ‘predict security threats before they occur.’ Atos claims detection and neutralization times are significantly shorter than with existing solutions. The SOC runs on the Atos data lake appliance and embeds McAfee’s OpenDXL (open data exchange layer) and Threat defense life cycle technologies.
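For the curious, the following minimal sketch shows how an analytics result might be published onto a DXL fabric using the open source OpenDXL Python client (dxlclient). The topic name and payload are hypothetical and this is not Atos’ SOC code, merely an illustration of the message-bus plumbing the OpenDXL layer provides.

```python
# A minimal sketch of publishing a threat indicator over McAfee's OpenDXL
# fabric using the open source Python client (pip install dxlclient). The
# topic name and payload are hypothetical illustrations.
import json
from dxlclient.client import DxlClient
from dxlclient.client_config import DxlClientConfig
from dxlclient.message import Event

# dxlclient.config holds broker addresses and client certificates, typically
# generated when the client is provisioned against a DXL fabric.
config = DxlClientConfig.create_dxl_config_from_file("dxlclient.config")

with DxlClient(config) as client:
    client.connect()
    event = Event("/example/soc/threat/indicator")      # hypothetical topic
    event.payload = json.dumps({
        "indicator": "198.51.100.23",                    # documentation IP range
        "type": "ip",
        "severity": "high",
        "source": "predictive-analytics",
    }).encode("utf-8")
    client.send_event(event)                             # fire-and-forget publish
```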

Waterfall Security Solutions has partnered with FireEye to integrate the FireEye cloud-based Threat analytics platform with industrial networks using Waterfall’s unidirectional CloudConnect. Customers can monitor and protect their ICS networks using FireEye’s cloud-based Helix service.

Siemens has teamed with PAS Global on a strategic ICS cybersecurity offering. The partnership promises deep analytics to identify and inventory proprietary assets and to detect and respond to attacks across the operating environment. The offering targets utilities and oil and gas, sectors that regularly confront ‘sophisticated, persistent and aggressive’ cyber threats to their operational environments.

A final salutary tale from the hapless Deloitte, whose own system was breached recently. The Guardian reported that emails between Deloitte’s 244,000 staff stored in the Microsoft Azure cloud were compromised and client information obtained. In 2012, Deloitte was ranked the ‘best cybersecurity consultant in the world.’ Ouch again!


Teradata for refiner Andeavor

Excellent results from Teradata Analytics Platform. ConocoPhillips on SAP integration at Eagle Ford.

San Antonio-headquartered Andeavor (formerly Tesoro) is to report ‘excellent results’ from the deployment of the Teradata Analytics Platform at its Western Refining unit when it presents at the upcoming Teradata Partners Conference. Andeavor operates in 18 states and its ten refineries have a combined capacity of 1.2 million barrels per day. The Teradata system has accelerated information processing, saving ‘millions of dollars a year.’ Presenter David Brand said, ‘Our analytics platform gives us an integrated, holistic view of the business and allows users to see sales and customer activity across three systems as one in near-real time. This is exactly the kind of business outcome we had hoped that analytics technology would enable. We started with one project and we now have over 30 projects that have been integrated on the Teradata platform.’ Today, Teradata gives decision-makers critical business reports in near real time. Inventories can be tallied accurately and the company’s credit line adjusted appropriately. Andeavor is now upgrading the system to use sensor data for machine learning and predictive analytics.

In another presentation, Gisle Karlsen is to show how ConocoPhillips has integrated Teradata and SAP to provide a ‘deeper understanding of the production characteristics of their Eagle Ford operational assets.’ The system combines production and operational data in an analytics environment that connects engineering components with an integrated dataset.


Collaboration to ‘converge process and operational thinking’

Baker Hughes/GE, Yokogawa/KBC team on preferred partnership for the digital twin.

Baker Hughes, now a GE Company, has signed with KBC, a Yokogawa unit, to ‘combine’ KBC’s Petro-SIM package with BHGE’s digital twin simulation software. The ‘preferred partnership’ leverages GE’s Predix industrial internet platform, extending KBC’s Petro-SIM process simulation modeler upstream and providing end-to-end optimization and connectivity across ‘assets, people and business processes.’

BHGE chief digital officer Matthias Heilmann said, ‘This partnership showcases our commitment to break down data silos and converge process and operational thinking. Customers can build a digital twin of a plant, refinery or rig, that incorporates end-to-end process and operational analytics and machine learning. Petro-SIM adds simulation technology to our portfolio and heralds a new era of operational improvement.’

Petro-SIM already provides a cloud/IIoT data-as-a-service solution, which raises the question of which ‘cloud platform’ will predominate in the expanding digital twin space. Is Petro-SIM running on Cloud Foundry? We did ask, but no answer so far.


SHAARCs in the cloud

Seismic harmonic analysis and reflectivity toolset now pay-as-you-go service.

Pays International has ported Shaarc, its seismic harmonic analysis and reflectivity computation toolset, to the cloud. The Pays GeoCloud brings ‘pay-as-you-go’ to geophysical processing, spanning 3D visualization, interpretation, sparse spike inversion, attribute classification ‘and more.’

Pays sees software as a service as the future of commercial geophysical software services. ‘Gone are the days of expensive workstations and geo-software licenses. GeoCloud needs nothing more than a basic computer and internet connection, yet this offers no compromise in performance.’

Pays offers potential clients a free, four-week pilot study. This now includes a ‘seismic health check’ prior to the fault analysis, inversion and attribute analysis workflow. The health check pinpoints common defects in seismic data such as residual multiples, non-flat gathers and poor AVO integrity.
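To give a flavor of what such a check involves (a generic sketch, not Pays’ Shaarc workflow), the snippet below flags a non-flat gather by measuring the residual time shift of each trace against the gather’s pilot trace via cross-correlation. The tolerance, sampling interval and synthetic data are all assumptions for illustration.

```python
# A simplistic QC illustration: flag non-flat gathers by measuring the
# residual time shift of each trace against the gather's pilot (mean) trace.
# Generic sketch only; thresholds and data are invented.
import numpy as np

def residual_shifts(gather, dt):
    """gather: (n_traces, n_samples) NMO-corrected gather; returns shifts in seconds."""
    pilot = gather.mean(axis=0)
    shifts = []
    for trace in gather:
        xcorr = np.correlate(trace, pilot, mode="full")
        lag = int(np.argmax(xcorr)) - (len(pilot) - 1)   # lag of best alignment
        shifts.append(lag * dt)
    return np.asarray(shifts)

def is_flat(gather, dt, tol=0.004):
    """True if residual shifts stay within +/- tol seconds (here, one sample)."""
    return bool(np.max(np.abs(residual_shifts(gather, dt))) <= tol)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    t = np.arange(1000) * 0.004                          # 4 ms sampling
    wavelet = np.sin(2 * np.pi * 25 * t) * np.exp(-((t - 1.0) ** 2) / 0.01)
    demo = np.tile(wavelet, (24, 1)) + 0.05 * rng.standard_normal((24, 1000))
    print("flat gather?", is_flat(demo, dt=0.004))
```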


Ivar Aasen poster child for Siemens Topsides 4.0

Engineering solution sees projects through design and build to handover of a digital twin.

Siemens rolled out ‘Topsides 4.0’ at the recent SPE Offshore Europe event in Aberdeen, a ‘comprehensive digital lifecycle’ that is to ‘help offshore production transition to fully digital operations.’ Topsides 4.0 repurposes Siemens’ industrial plant digitalization smarts for the oil and gas industry. Judy Marks, CEO of Siemens’ Dresser-Rand unit, said, ‘Digitalization is not a passing trend, but rather a foundational value-add technology in oil and gas.’

Topsides 4.0 spans the conceptual and design phases of an offshore project, enabling digital project management and manufacturing, virtual testing and commissioning. On handover to the owner operator, Siemens delivers an ‘intelligent digital twin’ of the facility. The digital twin comprises a virtual replica of the plant, underpinned by secure communications and centered on the key modules of compression, power generation and automation.

Marks concluded, ‘To reap the benefits of digitalization, we must combine customers’ domain expertise with our knowledge of products, automation and data analytics. Getting companies to be comfortable sharing data in this secure but open ecosystem will be critical.’

Poster child for Topsides 4.0 is the digital performance analytics solution deployed on Aker BP’s Ivar Aasen project. This has cut the offshore headcount and optimized maintenance schedules. Following the Ivar Aasen success, Aker BP and Siemens have partnered to provide analytics across future Aker BP developments.


Prosume Energy Foundation

Blockchain-based market supports ‘prosumers’ in ‘decentralized, digitized, decarbonized ecosystem.’

Prosume, a Swiss foundation, has announced the Prosume Energy Foundation (PEF), a blockchain-based, peer-to-peer online marketplace for the exchange of energy assets. The ‘revolutionary’ PEF platform will allow communities to exchange electricity from both renewable and fossil sources. PEF connects independent power producers, consumers, utility companies and energy communities in a locally shared market where each peer is free to interact in a multi-tenant ecosystem. The decentralized, self-regulated monitoring system guarantees an autonomous, independent and digital ‘smart place’ where users can exchange different energy sources, promoting and accelerating new energy community models.

Energy producer-consumers, a.k.a. ‘prosumers,’ will become actors in a new ‘decentralized, digitized and decarbonized’ ecosystem. Seemingly, ‘starting a fossil or nuclear company today is not economically convenient anymore’ as years are required to amortize the high up-front investment.

The Prosume network and ‘energy layer’ will form a ‘blockchain-based Internet of Energy.’ Prosume R&D includes ‘innovative hardware and IoT-devices related to smart metering, smart grid, smart billing, energy routers and devices.’

The company is currently running a crowdfunding campaign and is negotiating eight different pilot projects in Europe. The twenty-strong PEF advisory board includes two representatives from Ernst & Young.


© 1996-2021 The Data Room SARL All rights reserved. Web user only - no LAN/WAN Intranet use allowed. Contact.