Speaking at the virtual 2020 Nvidia global technology conference, Xiaoqing Ge unveiled Baker Hughes’ ‘SafeSense’, a truck-mounted mobile ‘edge analytics’ platform. The idea behind SafeSense is to collect and analyze video data from a Google Street View car-inspired device mounted on some of Baker Hughes’ 4,800 North American service vehicles. These already travel to remote production locations on a regular basis, making them a great platform for in-the-field intelligence gathering. Ge suggested some use cases: HSE, road condition monitoring and scouting, but there are undoubtedly many others.
The SafeSense device comprises a roof-mounted housing for cameras that provide a 360° view of the surroundings, captured in real time and processed with mobile ‘edge’ analytics. The process starts with assisted machine learning using a human-in-the-loop approach. This trains the system to recognize oil country facilities and activities such as oil trucks, flares, drilling rigs and tank farms. For many applications the system bests satellite monitoring, offering better resolution and more frequent coverage.
Training is performed on a high-end Nvidia DGX system using the SSD (single shot multi-box detector), MobileNet and TensorFlow/TensorRT inference engine. The resulting models are transferred to an Nvidia edge GPU computer (a Jetson TX2). A video analytics pipeline builds on the Nvidia DeepStream API, a component of Nvidia’s Metropolis platform for ‘transforming pixels and sensor data to actionable insights’. Frames captured from DeepStream are stored in a NoSQL database for further analytics and visualization.
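Baker Hughes has not published SafeSense code, but the inference side of such a pipeline is easy to sketch. The minimal Python example below assumes a TensorFlow 2 SSD-MobileNet SavedModel whose outputs follow the TensorFlow Object Detection API conventions; the model path, class labels and threshold are our own illustrative assumptions, and the production system runs TensorRT-optimized models inside DeepStream on the Jetson TX2.

```python
# Minimal sketch of single-frame SSD-MobileNet inference, loosely following
# the pipeline described above. Paths, label map and threshold are
# hypothetical; SafeSense itself runs TensorRT-optimized models inside
# Nvidia DeepStream on a Jetson TX2.
import numpy as np
import tensorflow as tf

LABELS = {1: "oil_truck", 2: "flare", 3: "drilling_rig", 4: "tank_farm"}  # assumed classes

model = tf.saved_model.load("ssd_mobilenet_oilfield/saved_model")  # assumed path
infer = model.signatures["serving_default"]

def detect(frame_rgb: np.ndarray, score_threshold: float = 0.5):
    """Run detection on one HxWx3 uint8 frame; return (label, score, box) tuples."""
    batch = tf.convert_to_tensor(frame_rgb[np.newaxis, ...], dtype=tf.uint8)
    out = infer(batch)
    boxes = out["detection_boxes"][0].numpy()
    scores = out["detection_scores"][0].numpy()
    classes = out["detection_classes"][0].numpy().astype(int)
    return [(LABELS.get(c, "unknown"), float(s), b.tolist())
            for b, s, c in zip(boxes, scores, classes) if s >= score_threshold]
```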
Work on SafeSense started in 2018. Generation 3 rolled out in 2019 with six 2K-resolution CSI cameras, the unit analyzing 20GB/hour of imagery in real time. A five-truck pilot in the Permian Basin covered over 17,000 miles and gathered over 3TB of data.
Ge thinks that the SafeSense infrastructure and deep learning pipeline have broader application to other ‘visual domain’ problems. The neural net backbone can easily be swapped for other deep learning tools and the results are exportable as JSON files. Baker Hughes envisages various business models for SafeSense: selling the data, software or derived intelligence to customers in the energy industry, investors or regulators.
Comment – SafeSense brings the kind of AI used in remote (satellite) sensing down to earth. To get an idea of the power of the technology, read our report from the Esri EU PUG elsewhere in this issue where we quote WoodMac’s Stephen Bull who observed (in the context of satellite imagery), ‘If you can see it you can train a model’. Makes you think…
Petrobras’ 2019 Annual Report provides an update on the Brazilian oil company’s digital transformation. In September 2019, the company created a ‘Digital transformation and innovation executive office’ headed by Nicolás Simone. The new office is to develop ‘a more consistent and synergic [digital] journey’. The new unit operates five major initiatives: ‘Go Digital’ (the technology platform), ‘Be Digital’ (digital and agile innovation), ‘Lean Petro’ (optimization and automation), ‘Innovating and R&D’ and ‘Protect’ (information security).
Under the ‘Go Digital’ banner, Petrobras is offering integrated data platforms and new technologies such as artificial intelligence. The upstream unit’s high-performance computing capability has tripled from 3 to 9 petaflops (slated to exceed 30 PF by year-end 2020). The adoption of cloud-based solutions is said to ‘enhance our make vs. buy strategy’. A ‘Future ERP’ program involves a corporate SAP S/4 HANA deployment set to foster ‘agility and analytics-based decisions’. Petrobras expects this to increase productivity through process redesign and facilitate activities such as mergers and acquisitions.
Petrobras has also launched an internal startups program whereby entrepreneurs present potentially strategic digital technologies to a panel of execs. One successful project, ‘Trip Detector’, uses AI to mitigate process upsets and suggest actions that avoid automatic shutdown. A 70% success rate is reported. Petrobras likes its TLAs*, viz: robotic process automation (RPA), enterprise service management (ESM) and business process management (BPM).
All of Petrobras’ digital initiatives come together in the deployment of digital twins, digital representations of operating facilities such as a platform, an oil reservoir, a submarine system, critical equipment or a refinery. The twins are said to have the potential to contribute to a reduction of operating costs and to increase efficiency and safety in operations.
See also our 2018 interview with Petrobras’ Nelson Silva and Augusto Borella who discussed the opportunities expected from ‘digital transformation across the value chain’ and on ‘targeting the [disruptive] digital moonshot.’
* Three letter acronyms!
It has been 15 years now since I paid tribute to Jerry Pournelle, now the late Jerry Pournelle; he died back in 2017. I was an avid reader of Jerry’s ‘Chaos Manor’ column in Byte Magazine which, like Pournelle, is now deceased. My 2005 piece was a ramble around a dilettante build of a super (micro) computer, assembled from esoteric parts bought on eBay. I never said what became of it. So, for those of you with long memories, here is the end of that story.
When I finally assembled the dual Xeon monster with super-fast SCSI drives and found an OS that worked, I switched the thing on and immediately realized why servers live in server rooms. This thing made a noise like a helicopter. No way I was ever going to use it in the office. It was relegated to the basement where it still sits. Alongside, by the way, maybe a dozen other old computers that have for the most part been rendered obsolete by software upgrades.
My next couple of computer purchases, a Mac mini server and a black box Windows PC, were made a couple of years after this experience, in the late 2000s. The Mac mini reached its end of life (in software terms) a few years later and was mothballed – along with its super high-res display, with its Mac-only connectivity. The black box, which I named ‘BlackBox’, has been my workhorse ever since. Which meant that come January 14, 2020 and the end-of-life for Windows 7, I had to upgrade. To Windows 10.
I really didn’t want to add another piece of junk to my collection so, despite Microsoft’s warnings that I would probably need a new machine, I decided to stay with BlackBox, especially as I had just added a 1TB solid state drive (a great €100 investment). Windows 10 installation (from a bought USB stick) went fine. One curiosity though: at one stage the installation asked me ‘how I wanted my adverts served’. There was no option for ‘not at all’, which I would have been happy with. After some fiddling around I managed to keep Windows 7 on a partition so that my ancient VB6 programs can run natively. I reboot to W7 without internet access so this ought to be safe, even if it is a bit of a pain.
During the process it crossed my mind that maybe this would be a good time to jump off the Windows bandwagon completely, so I downloaded Linux, which now sits on yet another partition on BlackBox. I found that Linux installs are now as easy as, if not easier than, Windows’. Which made me think that there was more life in the old Mac mini. There was. I now have Ubuntu running on the Mac mini and the beautiful Apple 24” screen is back in action.
So why didn’t I make the switch? There are a few reasons. One is that leaving things to the last minute does reduce one’s room for maneuver. But mainly, I didn’t make the switch because Microsoft has me by the short and curlies, as they say in the UK. Microsoft 365, OneDrive and SharePoint are fantastic at assuring lock-in and I am still trying to find the way out.
I mentioned VB6, which may surprise you. In this context I have a few things to get off my chest. From the above you will have gathered that I don’t like to throw away old hardware. In fact, I am even more attached to old software. I figure that once you have solved a problem and written the code, you should not have to write it again just because the old development environment (hardware or software) is at its ‘end-of-life’. Unix shell scripts I wrote back in the 1980s run fine under today’s Linux. VB6 was officially obsoleted in 2008.
Which brings me to the Oil IT Journal website. This too uses technology that is not only old but actually deprecated by the W3C: HTML Frames and Framesets. I use these stubbornly because they involve a minimum of programming effort and they work (at least so far) in all browsers without fancy browser-specific coding.
The current coronavirus crisis has given me some extra time to spend on the website. I could have devoted this to a port to some fancy CMS or other. Instead I decided to double down on my Frames. On www.oilit.com you can now manipulate the frame boundaries to optimize your view of the main frame containing the Journal text, the navigation menu and the newly revamped Conferences page at the bottom of the screen. If you are new to the site, grab the frame boundaries, move them around and scroll through the text. You may notice how fast the site works (well, I hope you do). This is because of another against-the-norm technology choice: no database. The whole site is built from small files which the Apache server assembles in the twinkling of an eye. Oh, and another unconventional technology choice of which I am especially proud is … NO COOKIES! I think that web cookies represent one of the craziest facets of the internet. Facebook, Google, LinkedIn and just about every website you ever visit write stuff to your hard drive! Not Oil IT!
Speaking of the Conferences section, this, and indeed our great relationship with the major conference organizers, is one of our main assets. We have put a lot of effort into this page, keeping it as up-to-date as we can and tracking conferences as they move to online ‘virtual’ events or get rescheduled. Let me know (info@oilit.com) how you find the tweaks to the site. Should I ditch the frames? I still have that project of a shift away from Windows and am looking at a more stable (and up-to-date) programming environment than VB6!
Quantitative Geosciences* (QG) is a 640-page textbook by Zee Ma, a scientific advisor on geosciences and mathematics with Schlumberger. QG sets out to present ‘quantitative methods and applications of integrated descriptive-quantitative geoscience analysis to reservoir characterization and modeling’. From the preface we learn that while geoscience data analysis has increased significantly in the last three decades, coverage in the literature has been ‘uneven’ and there are ‘significant gaps’ between descriptive and quantitative geosciences. QG ‘attempts to fill some of these gaps through a more systematic, integrative treatment of descriptive and quantitative geoscience analyses by unifying and extending them’. Moreover, although descriptive and quantitative methods may appear unconnected, they can be ‘annealed into a coherent, integrative approach’. Ma likes his hyperbole!
‘Earth science, like other scientific disciplines, is increasingly becoming quantitative because of the digital revolution’. Ma places QG in the context of current IT tropes such as the ‘4th industrial revolution’, ‘digitalization’ and ‘artificial intelligence’. This reviewer first encountered what was then termed ‘numerical taxonomy’ (of fossils) in 1968 so ‘quantitative’ in fact has a pretty long history in earth sciences. Ma frequently prefaces technology as having made significant progress ‘in the last few decades’. ‘Geological models, once drawn by hand, have evolved into digital reservoir models that can integrate various geoscience disciplines, [.. with ..] both descriptive and quantitative data’.
However, ‘the potential of big data and quantitative methods is not yet universally recognized in the geoscience community due to a lack of familiarity’. Hence the aim of QG is to familiarize the reader with data analytical methods including probability, statistics, geostatistics, data science and integrated geosciences. An ambitious goal indeed. Does he succeed?
First an admission. This book is far too big and wide-ranging for this reviewer to be able to read and digest in our allotted time frame, so this review dips in and out and makes frequent use of Acrobat’s word count function. But dipping in and out does give a feel for Ma’s thinking and writing. On the thinking side, Ma has opinions on just about everything and provides insights that go beyond the usual textbook. On the writing side, let’s say that conciseness is not his main concern. That’s OK though, Ma has a lot to talk about!
Ma does not pretend to compete with earlier books on ‘mathematical geosciences’, presenting instead ‘data analytics and descriptive quantitative integration’. He focuses on ‘multidisciplinary applications of geosciences to reservoir characterization and modeling’ with the aim of providing a ‘basic understanding of relevant theories and how to put them into practice’.
QG’s table of contents occupies some 15 pages. Part 1 (data analytics) covers data analysis, correlations, principal component analysis, regression and machine learning. Part 2 (reservoir characterization) includes geological heterogeneity, petrophysical analysis, lithological characterization, spatial analysis, seismics and geostatistics. Part 3 (reservoir modeling and uncertainty) covers 3D models, kriging, stochastic modeling, more geostatistics, porosity and permeability modeling, water saturation modeling and hydrocarbon volumetrics, upscaling and uncertainty.
QG is heavy on analytical methods and statistics. Thirty pages are devoted to kriging, another thirty to the stochastic modeling of continuous variables. The presentation is at times heavy on mathematics, elsewhere packed with interesting discussion on topics such as whether stochastic realizations are truly ‘equiprobable’ and the degree to which such models are ‘real’.
With the arrival of ‘big data’ and ‘analytics’ what is the role of all this statistical stuff? Ma argues as follows: ‘Before the arrival of big data, statistical methods used in science and engineering were dominantly model-based with an emphasis on estimation unbiasedness. Although many traditional statistical methods work well with small datasets and a proper experimental design, they are less effective in handling some of the problems that have arisen out of big data. Artificial intelligence has led the way to data mining for discovering patterns and regularities from big data and for making predictions for scientific and technical applications. Although the movement was initially led by computer scientists, statisticians and engineers are now all involved, thus strengthening the trend’.
A chapter on machine learning includes an enlightening discussion of AI and of under- and overfitting models. ML can produce extremely accurate models that include impossible conditions such as ‘the creation of unphysical or unreasonable values, such as negative porosity or permeability, porosity greater than 100%, etc.’ Avoiding such ‘overfitting’ is ‘one of the most challenging problems in using a machine learning algorithm’. Ma also draws attention to ‘one of the biggest problems in big data’, collinearity, a problem that also arises in classical multivariate linear regression. The effect of collinearity can be ‘dramatic and often difficult to interpret’, especially as ‘the concept of big data promotes the use of many related variables, which makes collinearity even more severe’. Ma discusses means of mitigating collinearity issues and their effect on model fit.
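To make the collinearity effect concrete, here is a small numpy experiment on synthetic data (ours, not the book’s). Two nearly identical predictors leave least squares unable to apportion their individual coefficients, even though their sum remains well determined.

```python
# Illustration of collinearity destabilizing regression coefficients,
# the effect discussed above. Synthetic data, not from the book.
import numpy as np

rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(size=n)
x2 = x1 + 0.01 * rng.normal(size=n)      # nearly collinear with x1
y = 1.0 * x1 + 1.0 * x2 + 0.1 * rng.normal(size=n)

X = np.column_stack([x1, x2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print(coef)  # the individual coefficients drift far from (1, 1);
             # only their sum is stable across noise realizations
```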
Another ML ‘gotcha’ is the ‘no free-lunch principle’ whereby a good learning algorithm in some situations may not be so in others. ‘There is no such a thing as a universal machine learning algorithm that gives the best solution to every problem of applications.’ Ma concludes the ML chapter with a warning, ‘Machine learning methods can be very powerful, but they have many pitfalls waiting for the unwary … such as collinearities, inconsistencies and noise. Data integration and prediction using machine learning can be very useful when combined with subject-matter knowledge and other modeling techniques, such as geostatistical methods’.
More advice from the field comes in the contrast between people who are ‘more tuned to modeling procedures/workflows’ and those who are ‘focused on […] integrated analyses and inference’. Knowledge of software modeling tools along with some quantitative aptitude does not make you a modeler. ‘Using modeling tools without understanding the scientific problems and integrated analytics is like giving someone a hammer to interpret an outcrop. The hammer is only a tool in the process of understanding the rock’.
Addressing the ‘disruption’ of the AI/ML revolution, Ma opines that ‘a modern geoscientist should be a catalyst, not a casualty, of digital geosciences. Knowledge of multidisciplinary geosciences is a start, analytics will lead to capability, and experience will foster proficiency’. Ma warms up in his critique of ‘questionable practices’ in big data, warning of the ‘excessive appreciation of model appearance’ and the use of ‘exotic methods’. Such approaches have been deprecated as ‘glory models’.
To return to our question: does QG help realize ‘the potential of big data and quantitative methods’? The answer is yes and no. Ma is more reticent than the introduction would suggest as to the ‘progress’ made in ‘data science’. Those seeking cookie-cutter TensorFlow code will be disappointed; there is no mention of Python. The main thrust of Ma’s argument is that in-depth domain knowledge and first-class statistical know-how will trump the naïve ‘data science’ approach. One can only agree, although some of today’s Python script kiddies might recoil at such heresy.
* Quantitative Geosciences: Data Analytics, Geostatistics, Reservoir Characterization and Modeling. Springer. ISBN 978-3-030-17859-8; eBook ISBN 978-3-030-17860-4.
A recent report from Oil & Gas UK puts the annual expenditure on decommissioning in the UKCS at £1.5 billion. To date some 9% of UK platforms have been decommissioned, with an estimated £15.2 billion more to be spent over the next decade. The global market is put at around £67 billion.
Ikon Science, developer of RokDoc, has produced a timely white paper titled ‘Meeting the Decommissioning Challenge, addressing the cost and safety issues of abandoning wells’, which investigates the geotechnical challenges of well abandonment in the context of safety and the regulatory environment.
Various ‘smart’ engineering approaches have led to substantial cost reductions in decommissioning large offshore structures, but these only represent a fraction of the overall decommissioning costs. The largest capital expenditure in decommissioning (50% of cost) is well plugging and abandonment. New, efficient P&A techniques are the easiest way to reduce costs, as long as a ‘zero risk’ of integrity failures can be achieved. One concern is the long-term sealing requirement of an abandoned well, a risk which can be assessed by in-depth analysis of log data, pressure data and other data such as drilling reports.
Formation pressure is key to evaluating these risks and here, Ikon’s database of well data and RokDoc analyses is claimed to offer tailored workflows adapted to regional and local data access protocols and formats. Initial focus is the UKCS where Ikon has analyzed some 6,000 wells.
The UK Petroleum Act of 1998 is the regulatory foundation for decommissioning. However, ‘an overly strict following of the guidelines can lead to poor technical choices depending on local pore pressure distribution’. Ikon provides ‘independent reviews and due diligence on proposed plans, evaluating plug placement suitability and optimizing P&A design’.
A literature review in the Elsevier Journal of Petroleum Science and Engineering by Timur Bikmukhametov and Johannes Jäschke (NTNU, Trondheim) compares first principles and machine learning-based approaches to virtual flow metering (VFM). VFM can replace expensive hardware metering devices and reduce the need for well tests. The review compares traditional VFM methods and commercial software with emerging data-driven methods.
Data-driven approaches circumvent the simulation of complex systems or processes for which the exact solution can be difficult to find numerically. The review covers operational experience from trials of data-driven models. Much commercial software (OLGA, K-Spice/LedaFlow, FlowManager and others) falls into the ‘first principles’ camp.
A notable exception to this is FieldWare Production Universe (FW PU), a data-driven VFM software package used by Shell. FW PU is said to be robust and accurate for conventional and multizone wells and capable of tracking changes in production regimes. Baker Hughes’ NeuraFlow is also cited as being data-driven and working for extended periods without re-calibration. The authors discuss, with copious references, different approaches to solving the ‘highly non-linear problems’ involved in data-driven VFM, notably the computationally efficient ensemble Kalman filtering (EnKF).
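Neither FW PU nor NeuraFlow internals are public. As a generic illustration of the data-driven approach the review surveys, here is a hedged scikit-learn sketch that learns a mapping from wellhead sensor readings to liquid rate from historical well tests; the file name and feature names are our assumptions.

```python
# Generic data-driven virtual flow metering sketch: learn a mapping from
# wellhead sensor readings to liquid rate using historical well tests.
# Feature names and the data source are assumptions; FW PU and NeuraFlow
# internals are proprietary.
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

df = pd.read_csv("well_tests.csv")  # hypothetical file of historical well tests
features = ["whp", "wht", "choke_opening", "downstream_p"]  # assumed sensors
X_train, X_test, y_train, y_test = train_test_split(
    df[features], df["liquid_rate"], test_size=0.2, random_state=0)

vfm = GradientBoostingRegressor().fit(X_train, y_train)
print("R^2 on held-out well tests:", vfm.score(X_test, y_test))
```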
Summing up, today’s commercial VFMs are deployed either as standalone solutions or as back-up systems for physical multiphase flow meters. However, in these systems, model and PVT data tuning and transient flow behavior are problematical. Data-driven methods ‘are becoming popular as more field data is available and as algorithms advance and compute power increases’. The authors consider that a promising research direction is the development of a hybrid VFM that combines both first principles and data-driven modeling. The study was carried out as part of SUBPRO, an NTNU unit.
~
Another Journal of Petroleum Science and Engineering paper, ‘Leveraging digital rock physics workflows in unconventional petrophysics: A review of opportunities, challenges, and benchmarking’ by Ayaz Mehmani, Carlos Torres-Verdín (both UT Austin) and Shaina Kelly (ConocoPhillips), reviews the state-of-the-art in numerical simulation and pore-network modeling. The authors observe that, despite advances in micro-computed tomography (µCT) and scanning electron microscopy (SEM) techniques, obtaining sufficient information to capture dual-scale porosity and surface textures remains ‘a formidable challenge’. The paper reviews the current status of digital rock physics (DRP) in tight and/or diagenetically-altered rocks.
The review concludes that performing DRP on a core scan is not enough, even for many conventional rocks. It is crucial to interface µCT scans with experimental data from core analyses or microfluidics. Most current studies of shale tend to focus on discrete pore-scale scenarios that are often not benchmarked with SCAL data. The paper offers a technical roadmap for the ‘robust application of unconventional DRP for the petrophysical and general subsurface community’.
~
An enthusiastic report from Norway’s Norce R&D body has it that novel approaches to understanding the reservoir have ‘triggered large volumes’ of oil resources. The Norce-developed methods use the ensemble Kalman filter (EnKF) to merge large amounts of diverse data on the reservoir.
Norce researcher Geir Evensen manages the RCN Petromaks-2 project DIGIRES where the EnKF technology is being developed for operational use in ‘digital decision support’. The project is a collaboration with seven oil companies. EnKF, originally used in weather forecasting, is now said to work well in combination with reservoir modeling and simulation. Several oil companies, including Equinor, now include EnKF technology in their recommended workflows.
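For the curious, the heart of the method is the ensemble analysis step. Here is a minimal numpy sketch of the textbook stochastic EnKF update, in Evensen’s generic formulation; this is not Norce’s code.

```python
# Textbook stochastic-EnKF analysis step (generic formulation, simplified):
# update an ensemble of reservoir-model states X so that simulated
# observations HX move toward the measured data d. Not Norce's code.
import numpy as np

def enkf_update(X, d, H, R, rng):
    """X: (n_state, n_ens) ensemble; d: (n_obs,) data; H: (n_obs, n_state)
    linear(ized) observation operator; R: (n_obs, n_obs) obs-error covariance."""
    n_obs, n_ens = d.size, X.shape[1]
    D = d[:, None] + rng.multivariate_normal(np.zeros(n_obs), R, n_ens).T  # perturbed obs
    A = X - X.mean(axis=1, keepdims=True)           # state anomalies
    Y = H @ A                                        # observation-space anomalies
    K = A @ Y.T @ np.linalg.inv(Y @ Y.T + (n_ens - 1) * R)  # ensemble Kalman gain
    return X + K @ (D - H @ X)                       # updated ensemble
```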
A Rystad Energy presentation at the 2020 Offshore Strategy Conference in Stavanger claimed that ‘reservoir characterization, including the use of EnKF technology, is estimated to account for over half of the technology area’s realized reserve increase of 540 million barrels of oil equivalents’. Make of that what you will! More from the Norce ‘Digifuture’ minisite.
~
A presentation at the 2019 Norwegian Conference on ICT by researchers at Sirius (an Oslo University unit) investigated the use of ‘rewriting logic’ to formalize the description of geological processes. The presentation on ‘Geological multi-scenario reasoning’ has it that the approach may replace today’s ‘ad hoc manual work practices’ for developing and communicating multiple geological hypotheses. The formal framework for geological reasoning is currently written in Prolog and leverages the Maude rewriting system. As an example of some geo-Prolog code, consider this: ‘Fault has top-layer Layer if Fault is a fault, Layer is a geological unit, Fault goes through Layer, and there does not exist a layer that is on top of Layer which Fault goes through’. The researchers claim that applying geological rewrite rules to ‘proto-scenarios’ can deliver multi-scenario reasoning ‘beyond human capacity’.
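For illustration only, the quoted rule re-rendered in Python over toy data structures; the Sirius prototype itself is written in Prolog with Maude rewriting.

```python
# Rough Python re-rendering of the quoted geo-Prolog rule; the Sirius
# prototype is written in Prolog with Maude. Data structures invented
# for illustration: goes_through holds (fault, layer) pairs, on_top_of
# holds (upper, lower) pairs.
def top_layer(fault, layers, goes_through, on_top_of):
    """Return the top-most layer cut by `fault`: a layer L that the fault
    goes through, such that no layer on top of L is also cut by the fault."""
    cut = [l for l in layers if (fault, l) in goes_through]
    for layer in cut:
        if not any((upper, layer) in on_top_of and upper in cut for upper in layers):
            return layer
    return None
```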
Amazon oil and gas blogger-in-chief Don Bulmer reports on ‘serverless*’ seismic data management as proposed in a white paper from UK-based seismic boutique Osokey. Osokey services include SEG-Y and SEG-D data storage in the Amazon S3 cloud. Data is then available to processes running as AWS Lambda instances. The offering also includes the DynamoDB metadata store and ‘Athena’ for search.
The whitepaper describes a cloud-based seismic data management offering that enables a ‘lift and shift’ of SEG-Y or SEG-D formatted data into Amazon Simple Storage Service (S3). An ‘event-driven’ architecture ingests seismic data and generates a file inventory that can be searched using Amazon Athena. Data is passed through AWS Lambda to extract header information which is stored in Amazon DynamoDB. Trace-level indexes allow for data viewing output to geoscience applications on premises. Using multiple AWS Lambda instances, an aggregate read performance of 42 GB/s was achieved. Amazon S3 batch operations are said to be a cost-effective way of bulk processing files and de-duplicating SEG-D files. More from Bulmer’s blog.
* You may be wondering what ‘serverless’ actually means. In this context it appears to refer to Amazon AWS Lambda which Wikipedia tells us is a virtual machine that spins up for the duration of a process, running Amazon’s own-brand Linux.
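Osokey’s implementation is not public, but the ingest pattern described is straightforward to sketch. The following hedged boto3 example shows a Lambda handler that pulls the 400-byte SEG-Y binary file header from S3 with a ranged read and writes a few fields to DynamoDB; the table name and event wiring are our assumptions.

```python
# Sketch of the ingest pattern described above: a Lambda handler that
# reads the 400-byte SEG-Y binary file header (file bytes 3200-3599,
# after the 3200-byte textual header) straight from S3 and stores a few
# fields in DynamoDB. Table name and trigger are assumptions; Osokey's
# actual code is not public.
import struct
import boto3

s3 = boto3.client("s3")
table = boto3.resource("dynamodb").Table("segy_inventory")  # hypothetical table

def handler(event, context):
    rec = event["Records"][0]["s3"]        # an S3 put-event triggers the Lambda
    bucket, key = rec["bucket"]["name"], rec["object"]["key"]
    hdr = s3.get_object(Bucket=bucket, Key=key,
                        Range="bytes=3200-3599")["Body"].read()
    sample_interval, = struct.unpack(">h", hdr[16:18])    # microseconds
    samples_per_trace, = struct.unpack(">h", hdr[20:22])
    format_code, = struct.unpack(">h", hdr[24:26])        # data sample format
    table.put_item(Item={"key": key,
                         "sample_interval_us": sample_interval,
                         "samples_per_trace": samples_per_trace,
                         "format_code": format_code})
```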
Another seismic blog, authored by Amazon’s Mehdi Far, investigates the automated interpretation of 3D seismics with Amazon SageMaker. SageMaker enables data scientists to build, train and run machine learning models in the cloud. Far used the Apache MXNet deep learning library running within SageMaker to create a horizon picking algorithm. Far shows how to identify salt bodies on seismics using U-Net semantic segmentation trained on the TGS Kaggle public domain image dataset.
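Far’s notebook lives on the AWS blog. For flavor, here is a much-reduced U-Net-style encoder/decoder in MXNet Gluon, the library Far used inside SageMaker; layer sizes and depth are arbitrary and far smaller than a real salt-segmentation network.

```python
# Much-reduced U-Net-style encoder/decoder in MXNet Gluon. Two levels
# only; channel counts arbitrary. A real salt-segmentation net is deeper.
import mxnet as mx
from mxnet.gluon import nn

class TinyUNet(nn.HybridBlock):
    def __init__(self, **kwargs):
        super().__init__(**kwargs)
        self.enc1 = nn.Conv2D(16, 3, padding=1, activation="relu")
        self.pool = nn.MaxPool2D(2)
        self.enc2 = nn.Conv2D(32, 3, padding=1, activation="relu")
        self.up = nn.Conv2DTranspose(16, kernel_size=2, strides=2)
        self.dec = nn.Conv2D(16, 3, padding=1, activation="relu")
        self.out = nn.Conv2D(1, 1)               # per-pixel salt/no-salt logit

    def hybrid_forward(self, F, x):
        e1 = self.enc1(x)                        # full-resolution features
        e2 = self.enc2(self.pool(e1))            # half-resolution features
        u = self.up(e2)                          # back to full resolution
        d = self.dec(F.concat(u, e1, dim=1))     # skip connection, U-Net style
        return self.out(d)

net = TinyUNet()
net.initialize()
seg = net(mx.nd.zeros((1, 1, 128, 128)))          # (1, 1, 128, 128) logits
```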
Comment: Amazon Athena, DynamoDB, Lambda, S3, SageMaker … anyone smell ‘lock-in’? See this month’s editorial.
Bordeaux, France-headquartered F2i Consulting has published a white paper titled ‘Simplifying the deployment of Energistics data transfer standards’. F2i’s ‘FESAPI’ application programming interface is a high-level, open source software development tool kit for reading and manipulating Energistics’ XML data transfer formats and making them accessible from multiple programming languages. Initial focus was ResqML v2.0.1 and the key components of the Energistics common technical architecture, the packaging conventions and HDF5, the technology used to store and manage large, complex data arrays. FESAPI development is ongoing and will extend to all Energistics’ data formats: WitsML, ProdML, ResqML and ETP, the Energistics transfer protocol.
FESAPI is multi-platform (Windows, Linux and MacOS) and polyglot, with its native C++ classes available to .NET, Java and Python developers through SWIG wrappers. gSOAP classes allow for easy XML serialization and deserialization, while HDF5 libraries handle the import and export of numerical values to binary files. FESAPI is released under an Apache 2.0 license and hosted on GitHub.
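Note that the numerical arrays behind RESQML objects live in ordinary HDF5 files, so the bulk data is also readable with generic tools alongside FESAPI. A short h5py sketch; the dataset path is hypothetical, as real paths are referenced from the RESQML XML parts.

```python
# The numerical arrays behind RESQML objects live in ordinary HDF5 files,
# so generic tools can inspect them alongside FESAPI. The dataset path
# below is hypothetical; real paths are referenced from the RESQML XML.
import h5py

with h5py.File("model.epc.h5", "r") as f:
    f.visit(print)                                     # list every group/dataset path
    grid_points = f["/RESQML/some_grid/points"][...]   # hypothetical path
    print(grid_points.shape, grid_points.dtype)
```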
Dynamic Graphics’ Carl Godkin gave a strong endorsement to the toolkit, ‘FESAPI has saved us an enormous amount of development and debugging effort, we rely on FESAPI for our CoViz 4D RESQML implementation because of its high quality and straightforward API’.
An Energistics consortium has been established to prioritize and fund future development. ExxonMobil, Total and Dynamic Graphics are on board. Others (operators, service companies and software developers) are invited to join the open source project. Interested parties should contact Jana Schey at Energistics.
The 2019.2 release of Geoteric now functions in a distributed compute environment. Machine resource use has been optimized and support for 4k resolution screens added to the ‘cloud-friendly’ edition. More from Geoteric.
The new 2.5 release of INT’s IVAAP upstream data visualization platform adds new production dashboards and support for curve dictionaries and aliases. IVAAP displays raster logs in well and correlation views. Improved synchronization between widgets makes it easier to navigate complex dashboards.
Version 2020.02 of Geovariances’ Isatis.neo automates the identification of multiple reservoirs and spill-points and helps resolve mis-ties.
The 3.3 release of Geophysical Insights’ Paradise seismic interpretation workbench adds deep learning for fault detection and seismic facies classification. DL fault detection comes with pre-trained models that run without user-provided fault examples. GPU technology is said to ‘dramatically’ speed fault extraction. Also in the new release is PSLi, an interactive tool for scripting in the Paradise scripting language, a procedural language for geospatial signal processing and neural network analysis.
Lynx iSeisview has a new ‘cube’ viewer for 3D seismic surveys. The interactive cube allows for zoom and rotation to view inline and crossline profiles from any angle. The viewer works in all modern browsers (Chrome, Firefox, Safari and Edge) on desktop and mobile devices.
Neuralog has rolled out Neuralog Pro, providing North American users with ‘budget-friendly’ access to, and interpretation of, well log data. Pro includes automated log digitizing, visualization, volumetrics, mapping and more.
‘Allez’, the latest edition of Peloton’s WellView, provides more intuitive access to reports, schematics and timelines in a single well dashboard tailored to screen size. Allez provides configurable workflows that guide users through import, data entry and QC with context-sensitive templates. The company has also announced Peloton GO, an iOS app for mobile access to maps, well and production KPIs and daily reports.
Petrosys PRO 2019.3 adds connectors to Eliis PaleoScan and Rock Flow Dynamics’ tNavigator. PRO also now displays OpenWorks contours and Petrel and DecisionSpace fault sticks on the map. The new release adds support for Energistics’ RESCUE data format for wells, 3D grids and faults.
Petroweb has announced the Energy Data Curator, combining Petroweb’s team of data curators and partners with the latest in ETL technologies and machine learning. A standardized catalog provides access to third-party data coverage worldwide. Currently, the Curator spans 146 countries, over 1.1 million 2D/3D seismic lines and surveys and 7.0 million wells.
The 2019.12.1 release of ResInsight, the open source 3D visualization, curve plotting and post-processing tool for reservoir models and simulations, includes bug fixes and enhancements. ResInsight is developed by Ceetron Solutions in collaboration with Equinor. The tool is a part of the Open Porous Media Initiative and the software is hosted on GitHub.
Tecplot RS 2019 R1 includes a new well path capability to plot grid values along any well path. Tecplot RS can now read CMG SR3 data files.
Rock Flow Dynamics’ tNavigator 20.1 introduces a special mode for wells with a large number of perforations, speeding calculation of models with hydraulic fractures. Assisted history matching and uncertainty analysis have been enhanced, as has visualization of calculated parameters. Object-based simulation has been implemented in Geology Designer. In PVT Designer, the Herning and Zipperer correlation is now supported for viscosity calculation.
Schlumberger announces that the Ocean Alpha/Petrel 2020 field introduction preview 5 release is now in a ‘commercial ready state for plug-ins’. All Ocean and third-party libraries are binary compatible with the commercial release. Plug-in developers can now compile and submit for Ocean Store acceptance at the release of Petrel 2020.1. A data model change necessitates a replacement API for 2021. The current API will be obsolete before 2020.1 is released.
Thermo Fisher Scientific has rolled out Open Inventor Toolkit 10.5 introducing a Java build for ‘headless’ servers. The use of the EGL library allows applications to run without starting an X Server. More in the release notes.
V 21.1 of Blue Marble Geographics’ Global Mapper introduces new raster functionality and subscription-based access to Blackbeard Tanuki, an online data service for pipeline, well and lease information. The Lidar module has been updated with a new tool to automatically align point clouds with a best-fit 3D transformation. A point cloud change detection tool does just what it says.
Spectra Logic’s new StorCycle storage management software creates a ‘perpetual tier’ of storage either locally or in the cloud. StorCycle scans primary storage data for inactive files and migrates them to the perpetual tier which can be any combination of cloud, disk, network-attached storage or tape. A project archive feature addresses data preservation in seismic research, oil and gas studies and other fields.
Seeq Corp. has announced a beta release of Seeq Data Lab, a Python interface to its process data analytical toolset. A new R22 release adds support for OSIsoft PI security, new conditional filtering options and a connector for the NOAA Weather Service API.
A new release of FieldComm FDI, a tool for developing process automation applications, adds support for HTML5, JSON and enhances OPC UA connectivity. The software also provides access to FieldComm’s online FDI device package and device description repository of registered device files.
IBM’s new Maximo Asset Monitor embeds AI into its asset monitoring solution. IBM is also working with Novate Solutions on a new remote monitoring and support service for industrial manufacturers, providing data analysis by engineering professionals at Novate’s support operations center.
DNV GL has announced a new methodology and digital tool for corrosion under insulation monitoring, said to be a major safety threat and multi-billion-dollar cost to industry. CUI Manager works on live asset data and provides insights on CUI risks, potential mitigation measures and cost.
Weatherford has announced ‘Centro’ digital well delivery, a ‘holistic’ approach to the digital management of complex wellsite operations. Centro makes consolidated well data available in real time, enabling advanced domain viewing and live analytics.
A ‘total redesign’ of Wellsite Navigator adds new functionality including 5-day satellite imagery updates of 22 states and a modern user interface showing nearby wells, saltwater disposal facilities and rig locations. 5-day satellite imagery is said to best Google imagery ‘that is often years out of date’.
Halliburton’s SPIDRlive streaming unconventional well test data retriever captures high-resolution, high-frequency data at the wellhead, without downhole equipment, to provide real-time monitoring and optimization of fracture performance. Proprietary modeling software converts wellhead pressures to bottom hole pressures which are broadcast to multiple endpoints. One use case is real-time monitoring of fracture interactions in offset wells.
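Halliburton’s modeling is proprietary, but the zeroth-order physics of the wellhead-to-bottomhole conversion is a hydrostatic correction, as in this deliberately simplified Python sketch (static, single-phase liquid column; no friction, gas column or transient effects).

```python
# Deliberately simplified wellhead-to-bottomhole pressure conversion:
# a static, single-phase hydrostatic column. SPIDRlive's proprietary
# modeling handles far more (friction, gas columns, transients); this
# shows only the zeroth-order physics.
G = 9.81  # gravitational acceleration, m/s^2

def bottomhole_pressure(whp_pa, rho_kg_m3, tvd_m):
    """BHP = WHP + rho * g * TVD, all in SI units (Pa, kg/m^3, m)."""
    return whp_pa + rho_kg_m3 * G * tvd_m

# Example: 20 MPa at the wellhead, water-like fluid, 2,500 m TVD -> ~44.5 MPa
print(bottomhole_pressure(20e6, 1000.0, 2500.0) / 1e6, "MPa")
```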
Getac’s new ruggedized UX10-Ex tablet is ATEX certified for use in explosive atmospheres. The tablet offers a range of 8th generation Intel Core i5/i7 processors and a 10” high visibility touch screen.
OptaSense can now offer high backscatter optical fiber in its DAS interrogators for ‘unparalleled’ downhole measurement performance. A comparison of high-backscatter and single-mode fibers has shown clearer seismic signals, reduced background noise and improved signal fidelity.
Emerson’s new DataManager V8.2 helps refiners monitor corrosion of hydrofluoric acid alkylation units to prevent costly, unplanned shutdowns. DataManager is a component of the Rosemount Wireless Permasense corrosion and erosion monitoring system.
Emerson has also released ProcessViz, new data visualization software for flow data from its Micro Motion Coriolis flow meters.
ExxonMobil has launched Mobil Serv lubrication management, a digital maintenance management platform for fleet operators. Mobil Serv organizes, automates and streamlines maintenance-related activities, including predictive maintenance task management, lubrication planning, performance tracking, and safety and compliance reporting. The new platform is powered by Redlist’s ‘mobile-ready, cloud-based app’.
Sharc has released Periscope 7.0, a new version of the post processor and viewer for output from its Harpoon computational fluid dynamics flagship. Harpoon was recently used by Abbott Risk Consulting in a fire and explosion risk assessment of a new offshore platform design. More on the study from Sharc.
Open iT LicenseAnalyzer2020 adds some 50 new features for software licensing analytics. Open iT has also published a white paper, ‘Common myths about migrating to the cloud’, with advice on how and when to migrate (or not!).
The Repsol-backed Barcelona Supercomputing Center (BSC) has announced ‘MEEP’, the MareNostrum Experimental Exascale Project, a next-generation, ‘full stack’ open source software and hardware facility. BSC, in collaboration with Arm and others, is to develop open source RISC-V chips and a software ecosystem. MEEP will leverage field-programmable gate arrays (FPGAs) to develop the chips. MEEP’s software development will translate into a proof-of-concept for industrial usage, enabling next-generation exploration of computer architecture as well as software development of existing and future HPC applications, including emerging artificial intelligence workloads. The EU-funded MEEP project will run for three years with a budget of €10.3 million.
Repsol’s involvement is couched in environmentally friendly terms such as ‘sustainable energy’ and ‘following the company’s decarbonization policy’. Notwithstanding these noble goals, BSC has a long history of seismic research, notably the 2006 Kaleidoscope project. 2010 saw the creation of the Repsol-BSC Research Center. The collaboration also extends to the construction of a new facility in Barcelona which will house BSC’s MareNostrum 5 supercomputer, which will run the MEEP codes. MareNostrum 5 installation is to start at year-end 2020, with a €217 million budget for completion by 2025. The initiative sets out, inter alia, to defend EU ‘technological sovereignty’. Last December, Repsol president Antonio Brufau received a ‘national’ public-private R&D award from the Catalan Government on behalf of the Repsol–BSC joint research center. More from BSC.
In her keynote address, Elena Farnè described Equinor’s sideways move into offshore wind. While current energy scenarios out to 2050 show that large oil and gas investments are needed (‘reassuring for us in oil and gas’), the resulting CO2 emissions remain far from a 2° scenario. The floating offshore wind market has a potential of 12GW by 2030. Currently, Equinor’s largest windfarm is Dudgeon, at 402MW; the planned Dogger Bank installation will be the biggest in the world. The Hywind demonstrator between Snorre and Gullfaks is to produce 88MW of electrical power for offshore oilfields. In the Q&A Farnè was asked about competing demands for capital between oil and gas and wind. Equinor plans to devote 15-20% of CAPEX to renewables, i.e. one Dogger Bank-sized project per year.
Matt Ballard (Esri) continued the going-green theme, presenting new functionality in ArcGIS Pro such as pie charts of EU renewables, graphs of growth in solar and a ‘Living Atlas’ of windfarms in Germany. A geospatial workflow for windfarm location leveraged DTN wind speed data in another ArcGIS Living Atlas, incorporating distance from shore, shipping lanes, water depth and infrastructure. Other functions of note are a 3D visual impact assessment and ‘viewshed analysis’, i.e. what can be seen from where. The green functionality is delivered (broadly) from one of Esri’s Operations Dashboards and also from third-party Getech/Exprodat as ArcGIS for Renewables, built with the JavaScript API.
Dal Hunter (Esri) puts GIS at the heart of the digital transformation. Web GIS allows all data types to be consolidated in the geospatial cloud and system of record. Data can be fed to multiple stakeholders including ‘non-GIS people’. Esri has previously shied away from development, but the operations dashboards are changing this, making real time IoT data available for advanced analytics. Here, ‘there is a spatial aspect’, fulfilled by the Notebook Server, a ‘platform for spatial data scientists’. ArcPy offers tools for geospatial AI and a big data toolkit along with integration with R. One use case is satellite image analysis to spot pipeline encroachment (see also the WoodMac presentation below).
Adam Pittman reprised an ExxonMobil presentation made at the 2019 Esri plenary on the use of ArcGIS Indoors (AGI), Esri’s venture into BIM (building information management). A CAD model was imported into AGI and connected with operational data from safety systems, security cameras, badging systems and HVAC. The 15 million square foot Exxon campus in The Woodlands has 23 miles of ‘walkable space’, hence the need for a wayfinding app. The Apple Bluetooth beacon system and ITS Systems indoor positioning also ran.
Jeff Allen presented on new developments in pipeline GIS. Here, a collaboration between Autodesk and Esri is working to bridge the pipeline design/build gap. Moreover, while different tools are used to model upstream, midstream and downstream activity, ‘customers want only one’. AutoCAD/Civil 3D/Revit data can blend with ArcGIS pipeline data, combining linear referenced data with a network into a ‘seamless’ geodatabase from ‘wellhead to meter’. A demo showed a trace for min/max AOP design across a gas compression/pig station. A container concept, ‘GIS in GIS’, drills down inside a pump station to show finer detail in the BIM model.
Gregor Calderwood presented on Tullow Oil’s use of Survey 123 during a seismic survey in Côte d’Ivoire. Survey 123 forms (created using the Borealis Excel template) record stakeholder engagement meetings and feed into the operations dashboard for reporting. A ‘grievance workflow’ captures complaints which are uploaded to the Borealis stakeholder engagement database. The system provides land access parcel mapping for affected landowners. The Côte d’Ivoire legal compensation formulas are programmed into Survey 123 forms. Tullow is now waiting to use UAVs for land parcel mapping and will be evaluating Drone2Map.
An Esri presentation on Tracker for ArcGIS showed how tablets, phones and smart watches can now feed into the Operations Dashboard for tracking activity in the field. Use cases include checking which well pads are visited and right-of-way patrolling. A ‘Quick Capture’ app is presented as an improvement on Survey 123 (‘too fiddly’) for aerial or walking pipeline control. Data can be captured into an on-site Esri spatio-temporal big data store or (real soon now) to the Esri Cloud. Analytics in ArcGIS is a ‘hot topic’: ArcGIS Online offers some 25 analytics capabilities and ArcGIS Pro comes with 200+ pre-configured analytics tools. Many data scientists don’t know Esri but are trying to do geospatial stuff, unaware of the Esri big data toolkit. Esri has decoupled the ArcGIS binaries so they can run in the big data cloud and the Python ecosystem.
Stephen Bull presented a trial of AI/ML at WoodMac (now a Verisk unit). The idea is to automate the identification of well pads from satellite imagery using Esri and open source technology, with help from sister company Geomni. WoodMac used ESA Sentinel 2 imagery and manual labelling, with training in TensorFlow and Jupyter Lab. A single shot detector was trained with COCO (Common Objects in Context). The resulting model was deployed in ArcGIS Pro. 10m ESA S2 imagery, updated every few days, proved good for change detection on Texas well pads. In conclusion, good tagging is key and in general, ‘if you can see it you can train a model’. See also the Esri Sentinel-2 image services and the ‘Deep dive into ML on ArcGIS’ on YouTube.
In the exploration-oriented session Galvin Tan and Jack Luo (both Shell) presented on GIS-enabling exploration in a UK license round. In June 2019, the UK opened a ‘mature areas’ round with a 120-day application period. Shell assembled a team of G&Gs, a data manager and a GIS specialist and set up a project using Shell’s ‘worldwide standard structure’. The project included data mining of corporate internal and regulator (OGA) sources, global third-party data providers, Shell’s past project archive and hard copy reports/books. Interpretations spanned geological models in ArcGIS (some on mylar/paper), Shell’s cultural/surface data and ‘a lot of Shell PowerPoints’ from the geologists covering dry hole analysis and polygons of common risk segments. All of this was combined using ‘classic’ spatial analysis to produce a heat map of sweet spots, overlain with the blocks on offer. Next add infrastructure, pipelines, seismic cover and non-technical risk (can we shoot seismics?). After the round and the award, the project data is cleaned up and published to corporate data sets for use by a wider audience and for archival. The process is managed under a project assurance checklist to capture metadata and context. Components of the data repository include Shell’s corporate data model, the EP Catalog and ‘maps’.
John Monkman (ExxonMobil) investigated the accuracy of spatial data. ExxonMobil is confident in its own acquired survey data, but ‘most wells are not XOM drilled and surveyed’. What is the confidence in their location? A comparison of XOM data with an unnamed third-party data vendor showed 450 wells with between 500m and 20km of discrepancy. The main factors affecting accuracy are 1) a known coordinate reference system (CRS), 2) the accuracy of transformation to WGS84 (see EPSG.org for more on transform accuracy), 3) coordinate numerical precision, 4) survey equipment (today’s DGNSS can give sub-meter accuracy), 5) when drilled/onshore-offshore. XOM uses a quality flag based on all of the above and a ‘traffic light’ system to indicate potential issues. In the future XOM plans to identify well spots from remote sensing imagery, and to data mine end-of-well reports to grab CRS/coordinates. XOM is also interested in large scale geographic conversion initiatives in collaboration with government or partners. Out of scope of the current work are altitude (Z), updated well surveys and time-dependent/dynamic datums.
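Factor 2, transformation accuracy, is easy to demonstrate. Here is a hedged pyproj sketch showing how the same NAD27 well location shifts when expressed in WGS84; the coordinates are arbitrary and, for real work, the transformation should be chosen per EPSG.org guidance for the area of use.

```python
# Demonstration of factor 2 above: the same NAD27 well location expressed
# in WGS84. Coordinates are arbitrary; choose the actual transformation
# per EPSG.org guidance for the area of use.
from pyproj import Transformer

t = Transformer.from_crs("EPSG:4267", "EPSG:4326", always_xy=True)  # NAD27 -> WGS84
lon, lat = -102.5, 31.9                 # an arbitrary Permian Basin location
lon2, lat2 = t.transform(lon, lat)
print(lon2 - lon, lat2 - lat)           # a few 1e-4 degrees, i.e. tens of meters
```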
Bryan Yates (Orbital Insight) presented some possible uses of geospatial analytics. Orbital Insight’s GO platform claims the largest collection of satellite data available from multiple partners. Orbital collates multi-sensor data from satellites, smart phones, AIS and more, to allow remote intelligence gathering across areas of interest. One example was tracking Innergex’ progress on solar farm construction in Texas by watching workers’ foot traffic and evolving land use. Elsewhere, plant expansion and refinery outages can be spotted from Planet imagery. Foot traffic plus vehicle counts may signal the restart of a refinery. A ‘tip and cue’ service means that if something interesting is spotted, a satellite can be cued to look at the parking lot. An activity indicator for the Permian basin has shown a real slowdown (this in fall 2019!) with hotel foot traffic down and traffic at the Halliburton Field Complex at Odessa ‘all down since July 2019’. Global oil storage can be tracked by looking for floating top tanks, and then moving a radar sat in, or checking the shadow. Orbital Insight monitors 500 million barrels of storage for clients to spot build-ups, tracks gas station traffic in the US to monitor demand and counts cars in emerging markets (Rio, Egypt) for pollution and/or macroeconomic health. Supply chain monitoring for sustainable sourcing also ran: ‘don’t buy stuff from non-sustainable areas in Brazil’. Chevron is a strategic investor.
Anuar Ospanov (NCOC) and Sam Bishop (Total) presented on the IOGP Geomatics Committee’s new Offshore infrastructure survey data model (OISDM). The OISDM targets operations and maintenance of pipelines and platforms and supports ‘as built’ and ‘operating life’ survey comparison. The standard can capture side-scan sonar, ROV, hi-res video and still imagery. OISDM was originally created for Chevron; twelve operators are now on board, plus six survey companies. OISDM is based on the Esri geodatabase format. Tools of the OISDM trade include a geodatabase template, GML encoding and Enterprise Architect. A Rev C release is scheduled for Q2 2020 for use in inspect-repair-maintain surveys.
Tanya Knowles (UK Oil & Gas Authority) traced the UK Government’s push towards open data and the evolution of the UKCS as a ‘data-driven basin’. Previously, data release went through the joint industry CDA (Common data access) body which was expensive for newcomers. Today, ‘anyone can log in’ and access 173TB of data, 265k well logs, 12k well bores from the open data portal. OGA has also developed licensing round-specific portals. More from the OGA’s ArcGIS portal.
Oliver Morris traced the recent evolution of mapping technology at Neftex (a Halliburton unit). Neftex Insights (a service that provides regional geological interpretations) was previously delivered as ArcMap .mxd files. Generating these was labor intensive and the company lost staff in the downturn. A partnership with Safe Software and Esri saw the creation of an ArcGIS interoperability extension based on Safe’s FME Cloud. This now allows Neftex to aggregate geological data across palinspastic maps and other data. Morris observed en passant that the Esri shapefile is still going strong after 30 years! Neftex also leverages the open source PostGIS system. A SQL query on PostGIS outputs to FME for delivery as file geodatabase, Zmap, Petrel, SHP and, most recently, 3D PDF for VR. FME ships with over 5k coordinate transformations. More from Neftex’s QuickPlates and from Neftex associate PalaeoGIS.
Keith Fraley (40Geo) plans to turn IoT APIs into location-based intelligence. 40Geo’s Raptor Geo-IoT is an off-the-shelf solution for aggregating geospatial IoT feeds and streaming them into enterprise GIS systems. Data comes from (inter alia) AIS vessel tracking and ADS-B for aircraft.
Geochemical/petroleum systems analysis boutique Applied Petroleum Technology has hired Carl Peter Berg as CFO. Berg hails from Agility Fuel Solutions.
Mona Buckley has joined ARMA International as CEO.
Aspen Technology is to appoint Adriana Karaboutis to its board. She is currently group CIO and digital officer at National Grid.
Bruce Napier has joined Atwell, LLC as director of oil and gas operations. He hails from Universal Pegasus International.
Kevin Blount, currently COO, is to be BCCK’s next CEO.
Bluefield Geoservices has appointed Bruce Pudney to MD of its Americas unit.
The University of Texas at Austin Center for Petroleum and Geosystems Engineering has appointed Hildebrand Department professor Matthew Balhoff as director. He succeeds Kishore Mohanty, the center’s previous director.
Current senior executive VP Hiroaki Sakashita is now president and CEO of ClassNK. Koichi Fujiwara has been appointed as chairman of the board.
Computer Modelling Group’s Anjani Kumar, VP Engineering Solutions, now oversees the development of Builder and Results, CMG’s data import, model build and visualization applications. Jason Close is now VP, CoFlow commercialization. Long Nghiem is the new VP R&D and CTO. He hails from Tech Mahindra.
Denna Arias has been named VP corporate development at EnergyNet. Prior to joining the team in 2017, she served as VP business development for the Oil & Gas Asset Clearinghouse.
Kjetil Ebbesberg has been appointed to Group CFO at DNV GL.
Dresser Natural Gas Solutions has appointed Chris Watkins as Industrial Products Group district sales manager.
EFG, The EU federation of geologists has named Nikolaos Koukouzas as coordinator for the EFG panel of experts on carbon capture and storage.
Samantha McPheter has been named president and CEO at eLynx Technologies, succeeding company co-founder Steve Jackson. Ryan Richison has joined eLynx as CIO.
The UK Engineering and Physical Sciences Research Council (EPSRC) has appointed Mark Parsons as director of research computing. Parsons directs the Edinburgh Parallel Computing Centre.
Nick Bigney has joined Flotek as SVP, general counsel and corporate secretary. He hails from Oiltanking North America.
Terry Jbeili has joined geoLOGIC systems as president and COO. He was previously COO at Premier Oilfield Group.
Golder has appointed Andreas Rothe as CFO, taking over from Lee Anne Lackey who is to retire.
Magne Reiersgard of PGS is the 2019–2021 chair of the IAGC, the International Association of Geophysical Contractors.
Hiroshi Hagiwara is INPEX’s chief technical representative at Japan Oil Development Co.
Dick Hansen has joined Lineal Industries as VP business development.
Dan Kieny has joined Michael Baker International as CTO. Kieny hails from Black & Veatch and its Pivvot spin-off.
Trevor Stapleton has assumed the role of OGUK HSE director. Katy Heidenreich is Operations director. Matt Abraham is Supply Chain and Exports director and Mike Tholen is Sustainability director.
BGS’ chief digital officer Kate Royse has taken over Matt Harrison’s role as chair of the OneGeology Operational Group.
Dennis Stevens is the new OSDU Forum Director at The Open Group. Stevens worked previously on TOG’s Open Process Automation Forum and the FACE consortium.
Sandy Esslemont is now President and CEO at Parker Drilling. He hails from Abrado, a Houston-based company he helped found.
A reshuffle of the Petrolink board sees Nick Baker in the CEO role. Gary Hickin is COO and David Johnson is CTO.
PLH Group has appointed Peter Howe as pipeline segment EVP. He was previously with KBR.
Mike Hormell has joined Plutoshift as SVP business development and strategy. He hails from PA Consulting. Sunny Dronawat is SVP Technology.
Pipeline Research Council International (PRCI) has announced the promotions of Laurie Perry and Gary Choquette. Perry is now senior program manager. Choquette is executive director of research and IT.
John Clay Nunnally is now CEO at Quality Companies. Richard Tang is director of business development.
Yonggang Duan has been appointed to the Recon Technology board as an independent director. Duan is director of the Oil Well Completion Technology Center at the School of Petroleum Engineering, Southwest Petroleum University in Sichuan, China.
Former Red Hat chairman Arvind Krishna is now CEO of IBM. Red Hat president and CEO Jim Whitehurst also moves to IBM as president, also taking the chairman role at Red Hat. Paul Cormier takes over both the president and CEO roles at Red Hat.
Errol Olivier has joined RigNet as SVP and COO.
The Houston Chronicle’s Fuel Fix newsletter reports that Jim Wright has unseated incumbent Texas Railroad Commissioner Ryan Sitton in a surprise Republican primary victory.
Yao Tian has joined Ryder Scott as petroleum engineer in the reservoir simulation group. Steven Beck has joined the newly formed facilities engineering group as a project engineer.
Peter Weckesser is to join Schneider Electric as chief digital officer. He hails from Airbus.
Ed VanWieren is transitioning to a product manager role at SeisWare. Former COO Murray Brack takes over as CEO.
Sercel-GRC has named Alejandro Villa as global customer support engineer.
Roland Busch has been named President and CEO of Siemens AG. Joe Kaeser is now chairman and Christian Bruch is designated CEO of Siemens Energy, a new unit to be spun-out in September 2020.
Stratagraph has appointed Wayne Allen Cook as director of business development.
Structural Integrity Associates has appointed Mark Marano as president and CEO.
Ben Scott has joined Surtek as director of subsurface engineering and development.
Technical Toolboxes has appointed Joseph Ladner as engineering performance advisor. He joins TT from Alliant Group.
Jim McGowin is now VP Americas with Tendeka. He was previously with Packers Plus Energy Services. Renato Barbedo has joined as business development manager for South America.
Dawn Summers is now COO of Wintershall DEA. She hails from Beach Energy.
CDA has achieved ISO 9001:2015 quality certification for the provision of data management services. The certification demonstrates that ‘data management services are well managed and well documented, that risks to delivery are understood and properly mitigated, and that [there is a] solid foundation for service delivery and service improvement’.
Bechtel, JGC Corporation, KBR, McDermott, Wood Group, Worley and Saipem have signed up as partner contractors on the IOGP JIP33 engineering standards initiative.
Death
eLynx co-founder Steve Jackson has died, aged 74. Read his obituary here.
Advisory, tax and assurance firm Baker Tilly Virchow Krause is to acquire Oil and Gas Business Solutions, a Dallas-based firm specializing in ‘complex and unique’ bookkeeping and back office services in the oil and gas upstream sector.
DNV GL has launched a venture fund to address the energy transition and the ‘fourth industrial revolution’. DNV GL Ventures will build a portfolio of 15 to 20 startups over the next four years, taking stakes of up to 20% in the companies. One early beneficiary of DNV GL largesse is VeChain, a ‘blockchain company’. DNV GL is inviting startups to make 90 second video pitches directly to the venture team.
Enverus has acquired RS Energy Group and is to combine 40 years of data and energy insights into a publicly available source of data science-backed research and market education for companies navigating the Covid-19 crisis and the crude price collapse.
IDEX Corp. has acquired Flow Management Devices, a privately held provider of custody transfer solutions for the oil and gas industry.
ION Geophysical has received notice from the New York Stock Exchange that its share price has fallen below the NYSE criteria for continued listing. ION is to submit a plan to the NYSE to demonstrate its ability to regain listing conformity within 18 months.
Koch Industries has completed its acquisition of Infor from Golden Gate Capital. Infor has been a key component of Koch’s technological transformation. Koch companies have made more than $26 billion in technology-related investments in the past six years. In a separate announcement, Infor announced the launch of a customer user community for its CloudSuite Enterprise Asset Management solution in the Middle East.
Aveva has acquired production accounting software from South Korean MESEnter. MESEnter’s flagship ErrorSolver software is to be rebranded as Aveva Production Accounting.
NexTier Oilfield Solutions has sold its Well Support Services segment to Basic Energy Services for $93.7 million. The sale includes the company’s rig services, special services and fluids management businesses.
Petrosys has acquired Interica, developer of the PARS upstream archival solution. The deal provides Interica with access to a broader shared client base and pool of specialist expertise. Both Petrosys and Interica are now part of Vela, an operating group of Constellation Software Inc.
Quorum Software has acquired Denver-based EnergyIQ, a provider of well master data management software applications. The acquisition expands and strengthens Quorum’s oil and gas software and services portfolio with well lifecycle data management solutions.
Rockwell Automation is to acquire privately-held Kalypso, LP, a US-based software delivery and consulting firm specializing in the digital transformation of industrial companies.
Schlumberger is cancelling its listing on the London Stock Exchange, citing small trading volumes and ongoing regulatory compliance and administrative costs. The shares will continue to be listed on the New York Stock Exchange and on Paris Euronext.
Spectris has completed the sale of its interest in the EMS Brüel & Kjær joint venture to Envirosuite Ltd.
Following the recent acquisition by the Sword Group of DataCo, Sword Venture and Sword DataCo will be merging to form a single entity that will specialize in energy sector data services and solutions. The combined organization will be known as Sword Venture. Sword has also announced a new Sword Venture Digital Solutions Factory, an in-house R&D department and commercial analytics delivery team leveraging data science across the oil and gas industry.
Wolters Kluwer has completed the acquisition of CGE Risk Management Solutions, a provider of risk management software, notably the ‘industry-standard’ BowTieXP solution. CGE will be part of Wolters Kluwer’s environmental, health and safety and operational risk management software group. The company has also announced an EHS and Risk solution addressing the Covid-19 situation.
Yokogawa has acquired Danish startup Grazper Technologies, a specialist in AI for image analytics. Grazper runs its AI software on a field-programmable gate array. Yokogawa is to embed the technology into its Edge industrial LTE gateway, currently under development.
Mehdi Sadeghi and Peyruz Gasimov (AmeriCo Energy Resources) have analyzed historic data to maximize profitability at a mature Permian basin waterflood, developing cost-effective strategies to optimize rod pump performance. Discovered in 1946, the Coleman Ranch property now has 48 producers and 20 active injectors. Rod pump failures are a major cost center and cause of production loss. A data pipeline from historical daily reports and emails was analyzed with Python code and regex text parsing into a well operations activity matrix covering 15 years of data from 88 wells and 500k data points. A reliability model was constructed from downhole failure data comprising pump time to failure, time to repair and other KPIs. An ‘average’ reliability model was developed, with the Akaike information criterion used to choose the best-fit reliability model. Wells were then ranked with a profitability index, combining failure history and productivity into a single metric. The study demonstrated the value of chemical treatment. ‘For every dollar spent on chemical treatment we save on average 6 dollars on workovers’. AmeriCo is now investigating the effect of weather, injection parameters and workover cost as predictors of time to failure.
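For the curious, the model selection step can be illustrated in a few lines of Python. The sketch below fits several candidate lifetime distributions to hypothetical time-to-failure data and ranks them by Akaike information criterion; the distributions and data are illustrative assumptions, not AmeriCo’s actual pipeline.

```python
# Minimal sketch of AIC-based reliability model selection (illustrative,
# not AmeriCo's actual workflow). Candidate lifetime distributions are
# fitted to pump time-to-failure data and ranked by Akaike information
# criterion: AIC = 2k - 2*ln(L), lower is better.
import numpy as np
from scipy import stats

# Hypothetical time-to-failure data, in days, for a set of rod pumps
ttf = np.array([120., 340., 95., 410., 260., 180., 530., 75., 300., 220.])

candidates = {
    "weibull": stats.weibull_min,
    "lognormal": stats.lognorm,
    "exponential": stats.expon,
}

aic = {}
for name, dist in candidates.items():
    params = dist.fit(ttf, floc=0)           # fix location at zero for lifetimes
    loglik = np.sum(dist.logpdf(ttf, *params))
    k = len(params) - 1                       # fitted parameters (loc held fixed)
    aic[name] = 2 * k - 2 * loglik

best = min(aic, key=aic.get)
print(f"Best-fit reliability model by AIC: {best}")
```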
Ron Peterson extolled the merits of Unico’s suite of sucker rod pump controllers. Unico’s variable speed drive controllers use dynamometer data to adapt pump speed to well conditions in real time. The system avoids ‘over-pumping’ and minimizes energy consumption. An inclinometer or crank sensor compares Dynacard data with pre-calculated rod-string models in real-time, providing continuous inferred surface and downhole states and allowing accurate estimation of pump load and fill. The system is said to generate ‘a comprehensive well report for every pump stroke’. Interaction is via an HMI on a touchscreen in the field, laptop or smartphone, or from Unico’s GMC remote monitoring station. Unico’s system would appear to be a prime example of ‘edge computing’ although it was probably developed before the term appeared!
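The real-time comparison of a measured card against pre-calculated rod-string models can be thought of as a nearest-neighbor search. The following minimal sketch, with invented card data and a simple RMS distance, illustrates the idea; Unico’s actual implementation is assuredly far richer.

```python
# Sketch of matching a measured dynamometer card against a library of
# pre-computed rod-string model cards. Names and data are hypothetical;
# real controllers use full wave-equation rod-string models.
import numpy as np

def nearest_card(measured: np.ndarray, library: dict) -> str:
    """Return the key of the model card closest to the measured card (RMS)."""
    return min(library, key=lambda k: np.sqrt(np.mean((library[k] - measured) ** 2)))

# Load samples (klbs) at fixed positions over one stroke, hypothetical
library = {
    "full_pump":   np.array([8.0, 9.5, 9.8, 9.6, 7.0, 5.2, 5.0, 6.1]),
    "fluid_pound": np.array([8.0, 9.5, 9.8, 7.5, 5.5, 5.2, 5.0, 6.1]),
    "gas_locked":  np.array([7.0, 7.8, 8.0, 7.9, 6.5, 6.0, 5.8, 6.2]),
}
measured = np.array([8.1, 9.4, 9.7, 7.6, 5.6, 5.1, 5.0, 6.0])
print(nearest_card(measured, library))   # -> 'fluid_pound'
```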
On the topic of what could be termed ground-truthing the analytics, Mick Harrison (Harrison Wright Oil Co.) argued in favor of ESP ‘DIFA’ (dismantle inspection failure analysis). DIFA is the subject of the API Recommended Practice 11S1*. Harrison believes that pump run-life is often not maximized because ESP failures are not properly identified. DIFA aims at determining the root cause of failure and involves physical inspection of the pump along with an analysis of operating conditions and externalities including amperage data, downhole gauge data, scada and field records. ESP run-life can be extended when more care and attention is given to failure mode and root cause analysis. ‘If you don’t have a DIFA standard then establish one, set a goal and establish a run-life improvement plan’.
More from the LBCG Artificial Lift Conference website.
* API RP for Electrical Submersible Pump Teardown Report 3rd Edition September 1997.
The US Chemical Safety Board has requested a $13 million budget for financial year 2021, plus a onetime request of $400,000 in support of a new chemical incident reporting rule initiative. More on the request here. The CSB has also released an updated animation covering the 2005 explosion at the BP America Texas City refinery. The landmark investigation ‘revealed safety gaps at refineries across the country’. In a unanimous decision, a three-judge panel of the US Ninth Circuit Court of Appeals has ruled that Exxon Mobil must produce information to the CSB related to a tank filled with hydrofluoric acid at the site of a 2015 oil refinery explosion in Torrance, California. A PDF of the case is available at US Courts.
DNV GL is working on Safety 4.0, a framework of work processes, methods, and tools to assure the safety of novel subsea technologies. Safety 4.0 includes integrated solutions, advanced sensor data and analytics and ‘new safety philosophies’. ‘Novel subsea technologies can be complicated because current standards may be hard to apply and there may be a bias against new concepts’. More from DNV GL.
A study of over 1,000 failure cases by DNV GL finds that tubes and piping are the most failure-prone components in oil and gas. Fatigue and corrosion are the most common failure types. 27% of failures occur in tubes and piping, and 20% in rotating machinery. Fatigue (30%) and corrosion (19%) make up nearly half of the primary failure types occurring in the cases DNV GL analyzed. Brittle fracture, overload and wear are also in the top five.
A new publication from the EU Agency for Safety and Health at Work investigates the impact of digitalization on occupational safety and health. The study explores the potential and challenges of digitalization, examining how VR, automation, robotics and worker monitoring are shaping working lives and workers’ safety and health, and the regulatory context in which they arrive.
The Gulf Research Program is awarding $7.25 million to eight projects to address safety culture in the offshore oil and gas industry. Safer Offshore Energy Systems covers four projects focusing on different aspects of safety culture, including employee well-being and mindfulness, safety incident data sharing, risk scenario modeling, and the measurement of organizational safety culture.
Lloyd’s Register’s Aurora is applying data science to occupational health and safety information. Natural language processing (NLP) can transform textual information into normalized, structured data that can be interpreted and analyzed. Correlating OHS information with operational data, such as vehicle telemetry, working hours or HR demographics, can reveal factors that influence performance. The LR Aurora SafetyScanner turns accident description data from multiple sources into ‘meaningful insights’.
Following a competitive pilot, National Oilwell Varco has awarded MiX Telematics a contract for the provision of its fleet safety solution across its light-duty vehicles in US and Canada. MiX technology is used to decrease risky driving behavior amongst employees.
UL (formerly Underwriters Laboratories) has opened a new hazardous locations customer service center in Houston. The new facility offers collaborative safety services to the upstream, midstream and downstream oil and gas industry. The center was established to mitigate ignition source fires with risk management. In the greater Houston area, a major chemical incident occurs once every six weeks. More from UL Houston.
The IOGP has opened a new portal for interactive data reporting and download. The portal exposes IOGP-curated data on safety, health, and environmental performance.
A study, Suicide rates by industry and occupation, from the US Centers for Disease Control and Prevention finds that in 2017, men working in mining, quarrying and oil and gas extraction had the highest rate of suicide, at 54.2 per 100,000. The average across all industries studied was 18.0, up from 13 in 2000. The CDC recommends a comprehensive approach to suicide prevention, as proposed in its publication Preventing Suicide: A technical package of policy, programs and practices.
The Netherlands-based USPI-NL standards body has kicked off a Facility Lifecycle 3D Model Standard (FL3DMS) project in a web conference attended by most major owner operators and EPCs, as well as several software vendors and institutes. FL3DMS will be a practical specification for use by owner operators in their contracts with EPCs. The specification should capture current best practices and be usable across construction and operations. FL3DMS will support the creation of a digital twin, allowing real time data to be ingested into the model by standardizing model structure and labelling. Initially the specification will leverage proprietary modelling tools, but may evolve into a neutral format over time.
FL3DMS has an ambitious scope, spanning the validation of construction planning and work package completeness, the use of laser scans and digital photogrammetry in model data acquisition, GIS, control systems, historians and ERP systems, all connected via a common taxonomy. A subsequent phase envisages the integration of civil engineering building information model (BIM) data into the 3D model, extending scope with consistent cross-domain tagging.
FL3DMS will leverage the IOGP JIP36/CFIHOS specification for model reference data. The ISO 19650/PAS 1192 BIM standard is also cited as foundational to the new initiative, in what appears to be a departure from the venerable ISO 15926 approach and (possibly) mirroring the subsuming of the US Fiatech by CII/BIM. USPI is seeking partners for the initiative and envisages a €7,000/year fee for USPI members.
Chris Smith (Enverus) advocates the use of machine learning and natural language processing to categorize invoice line items. Meaningful savings opportunities can be found in a buyer’s smaller, less visible items, something that was previously ‘exceedingly difficult to do’. Using their own data, buyers can quickly discover their most cost-effective suppliers. One study identified a per-valve saving of $72 which, at an average annual volume of 20,000 valves, works out at some $1.4 million per year.
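A minimal sketch of the kind of line-item categorization Smith describes might use TF-IDF text features and a linear classifier, as below. The categories and training rows are invented for illustration and are not Enverus’ model.

```python
# Toy invoice line-item classifier: TF-IDF features plus logistic
# regression. Training rows and category labels are hypothetical.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

train_lines = [
    "2in ball valve 316SS flanged",
    "gate valve 4in class 600",
    "diesel fuel delivery 500 gal",
    "crew transport hot shot",
    "valve actuator repair kit",
]
train_labels = ["valves", "valves", "fuel", "logistics", "valves"]

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(train_lines, train_labels)

# Categorize a previously unseen line item
print(model.predict(["3in check valve carbon steel"]))  # -> ['valves']
```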
Eduardo Nunez (Blue Wave) presented on local content development*. Local content providers are sometimes challenged by a lack of clarity in contracts and procurement standards. This makes local content obligations hard to fulfil. Blue Wave is an international corporation formed by ExxonMobil and BP veterans to help develop local suppliers and improve the conditions of local communities. Blue Wave helps local companies understand energy sector standards and develops them into suppliers of major international or state-owned energy companies and their partners. Blue Wave also provides a collaboration framework for governments, IOCs and local suppliers along with a mechanism for technology transfer to local suppliers. Blue Wave recently launched in Mexico with support from the State of Tabasco. A kick-off workshop saw one IOC commit to sponsor 25 local suppliers for a year. International expansion is planned for 2021.
* Local content development, i.e. providing local jobs or engaging local service providers and suppliers, is often a pre-requisite for international companies to operate in developing nations.
Sirion Labs president Claude Marais categorized current procurement practices as ‘hunting mice while the elephants run wild!’ Traditional supplier management and account governance strategies take a one-size-fits-all approach for suppliers and customers. What’s needed is a tiered approach, at both the process and technology level. The executive/strategic tier sets the overall tone of the relationship. A management tier adds a performance check, dispute resolution and scorecard review. The operational tier adds detailed contract, performance and financial management. All of which is enabled by an integrated contract lifecycle management (CLM) platform that provides templates for contract clauses, authoring workflows and e-signature. The CLM also provides OCR/ICR and a document and metadata repository. ‘Today’s CLM technology is uniquely positioned to enable this integrated approach’.
Glenn Healy presented Appian’s ‘low code’ platform for modernizing enterprise and operational applications. Appian connects to data sources such as ERP, legacy systems, asset/inventory systems, IoT/scada and blockchain, and enables these to be extended with cloud-based solutions from AWS/Google/Azure AI or Blue Prism’s robotic process automation. Appian promises an intuitive experience on mobile and other endpoints, with out-of-the-box AI and machine learning.
Gwen Mitchell, president of the Houston affiliate of the Institute for Supply Management, introduced the 750-member-strong organization. ISM is a cross-industry body that is a ‘pace setter’ for supply management learning, networking and opportunity. Mitchell sketched out the top supply chain technology trends as robotic process automation, ‘autonomous things’(!), the digital supply chain twin, immersive experience and, you got it, blockchain.
Jeff Houtz provided an overview of Fluor’s supply chain material planning and how this benefits Fluor’s construction projects. The supply chain discipline is responsible for ensuring the ‘right material, at the right place and at the right time’. Schedule and material data are used to plan ahead and meet construction’s work packaging needs. Enter the concept of advanced work packaging (AWP) defined by the CII as ‘the overall process flow of detailed work packages (construction, engineering, and installation), in a planned, executable framework for productive and progressive construction’. Prerequisites for the AWP include accurate material, work package and schedule data. A focus on ‘bulks’ (material) is key to AWP success as this is where there is the most risk. On surpluses and ‘bump factors’ Houtz opined that it is hard to influence bump and emergencies, but ‘we must do our part to assure that data is trustworthy as a loss of confidence in material data is a major cause of surplus’.
Chevron’s Raquel Clement, who is also on the board of the Oil and Gas Blockchain Consortium enumerated some oil country blockchain pilots. These included truck ticketing (Equinor), AFE balloting (ConocoPhillips), seismic entitlements (Repsol) and joint interest billing (ConocoPhillips). The truck ticketing pilot is said to have ‘transformed’ the salt water disposal payment cycle and Equinor aims to realize ‘over 25% operational cost savings’ from here on. Better still, ‘suppliers can finally expect to be paid on time!’. The system is ‘paving the way for automated vendor validation of any delivery of goods and services’.
Requis CEO Richard Martin wants to transform the supply chain into the ‘ultimate value network’. The Requis supply chain platform was inspired by consumer platforms and ‘built for the enterprise’. Requis covers asset procurement, management and disposal ‘with unprecedented visibility, simplicity and efficiency’. A survey* of 500 supply chain professionals found ‘disconnected, siloed, highly manual enterprise procurement processes’ resulting in ‘billions in surplus assets and dismal returns on asset sales’. The solution? A shift from the ‘supply chain’ to a ‘value network’ that brings supply and demand together to ‘drive efficiency and market equilibrium on a local, regional and global basis’. The value network is driven by digitization using Requis’ network of around a million ‘listed assets’ and some 300 enterprise clients including BP, Chevron, Ineos, Shell, Siemens and Worley.
The 2020 Energy Conference Network Oil & Gas Supply Chain and Procurement Summit is scheduled for 2-3 December in Houston.
* Probably Deloitte’s 2019 Chief Procurement Officer survey.
Norway’s Force industry body is inviting participation in the 2020 Force machine learning competition. Interested parties are invited to predict facies and stratigraphy from well logs, and map faults on seismic. Force is to make available some 150 wells, with logs tagged with stratigraphic and lithological labels. Competitors will use ML to produce pseudo lithologies for a further 20 wells that will be held back. Agile’s Matt Hall is to organize the ‘Kaggle (Google) style’ competition and will help out with ML tutorials. More from Force.
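For those tempted to enter, the task is classic supervised classification. The sketch below, with synthetic stand-in data and assumed log mnemonics, shows the general shape of a solution: train a classifier on labelled wells, then produce pseudo lithologies for the blind wells.

```python
# Sketch of the supervised facies prediction the Force contest calls for.
# Log mnemonics, labels and the model choice are assumptions, and the
# random data stands in for real well logs.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(42)
# Stand-in training data: gamma ray, density, neutron, sonic per sample
X_train = rng.normal(size=(1000, 4))
y_train = rng.choice(["sandstone", "shale", "limestone"], size=1000)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)

# 'Blind' wells: logs without labels, as in the 20 held-back wells
X_blind = rng.normal(size=(200, 4))
pseudo_lithology = clf.predict(X_blind)
print(pseudo_lithology[:5])
```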
Meanwhile Matt Hall (Agile) invites geos to try their hand at his own geological coding challenge in a ‘Kata’. Working from an online dataset of 20,000 lithology codes, coders are invited to answer questions like ‘what is the total thickness of sandstone?’ More from Agile.
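A stab at the sandstone question might look like the following Python fragment, assuming depth-regular lithology samples and a hypothetical numeric code for sandstone; the real Kata dataset and codes will differ.

```python
# Toy answer to 'what is the total thickness of sandstone?', assuming
# lithology codes sampled at a regular depth interval. The interval,
# codes and data are hypothetical.
SAMPLE_INTERVAL = 0.5   # metres between successive lithology samples
SANDSTONE = 30000       # hypothetical numeric code for sandstone

lithologies = [30000, 30000, 65000, 30000, 99000, 30000, 30000]

total_thickness = sum(SAMPLE_INTERVAL for code in lithologies if code == SANDSTONE)
print(f"Total sandstone thickness: {total_thickness} m")   # -> 2.5 m
```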
A team of Fugro employees won the geotechnical machine-learning competition conducted as a prelude to the 2020 International Symposium on Frontiers in Offshore Geotechnics (ISFOG). Competitors were supplied with a dataset of cone penetration test results, hammer energy and pile dimensions and had to derive the optimum blow count vs. depth for an offshore jacket. A team comprising data scientists, geotechnical consultants and pile installation specialists worked on the project for four months! More on the competition’s Kaggle home page.
AI analytics start-up Contilio won the Industrial Internet Consortium’s ‘Smart Construction Challenge’. The challenge involved construction applications integrating cloud, edge, fog and IoT technologies. Contilio used 3D computer vision and deep learning to provide intelligent insights from 3D data and photos captured at a construction site and processed in the cloud. Contilio won the €25,000 prize and the opportunity to deliver a live proof of concept at the TÜV SÜD International Business Park in Singapore. More from the IIC and the MachNation summary white paper.
A report from Trend Micro Research investigates the cyber risks that face the oil and gas industry and its supply chain. Trend Micro finds that geopolitics and espionage motivate attackers targeting the oil and gas industry. While attacks are not always sophisticated, they often target and impact production, causing real-world damage. Trend Micro recommends deployment of a range of defensive strategies including two-factor authentication for changes to DNS settings, data integrity checks, implementing DNSSEC, SSL certificate monitoring and training. Read the complete report here.
The US National Academies Press recently published the proceedings of a workshop on the implications of artificial intelligence for cybersecurity, a free, 99-page PDF download. Interest in artificial intelligence (AI) and machine learning (ML) has boomed in recent years. At the same time, these computing and communications technologies present serious security concerns. The report provides a potted history of cyber security and AI. Notable prior art includes the Lockheed Martin cyber kill chain, the DARPA High-assurance cyber military systems program and others. While AI’s role in cyber defense has yet to be established, its role in cyber attack is evidenced in authentication-based attacks (spoofing voice-activated systems) and in spear-phishing. DARPA’s Cyber Grand Challenge competition investigated the potential for AI to ‘assist deeper’ into the cyber kill chain. Tufts’ Kathleen Fisher observed that ‘AI has the potential to fuel a cyber arms race as cyber weapons operate much faster than humans’. The workshop was held under the auspices of the National Academies’ Computer Science and Telecommunications Board.
Cegal’s Henrik Skandsen blogged recently on the importance of asset and plant security. Previously operational technology (OT) and industrial control systems (ICS) were separate from an organization’s IT systems. The advent of Industry 4.0 and the Industrial Internet of Things (IIoT) are making for IT/OT convergence and new security threats. The SANS Institute* 2019 State of OT/ICS Cybersecurity Survey found that over 50% of respondents perceived OT/ICS cyber risk as either severe, critical or high. Skandsen recommends leveraging three cybersecurity standards, IEC 62443 (OT security), ISO 27000 (IT security) and the NIST Cyber Security Framework. Operators can either plough through this voluminous paperwork (ISO 27000 alone has 46 parts) or call on Cegal whose Connect@Plant security solution will do it for you.
* SysAdmin, Audit, Network, Security. See also the upcoming SANS 2020 Automation and integration survey panel discussion.
The risks associated with API access to the cloud are highlighted in McAfee Labs’ 2020 Threats Predictions Report. 2020 will see APIs exposed as the weakest link leading to cloud-native threats, particularly as API security readiness lags behind other aspects of application security.
Wyoming-based oil and gas producer Ultra Petroleum has selected Datrium’s ‘DRaaS’ (disaster recovery as-a-service) to combat ransomware and recover from disasters. DRaaS provides VMware workloads with a built-in backup and instant recovery service. Ultra Petroleum now uses the cloud for disaster recovery ‘at a fraction of the cost of a second data center’. More from Datrium at datrium.com.
Mol Group has deployed MobileIron’s ‘zero trust’ mobile security platform to provide employees with secure access to remote resources. The unified endpoint management solution was deployed by MobileIron partner S&T Consulting.
Nexus Controls, a Baker Hughes business, is to add Tripwire’s industrial cybersecurity solution to its SecurityST cybersecurity offering for operational technology. SecurityST offers proactive protection and centralized reporting to manage cyber risk and comply with global security standards. Tripwire adds expanded threat monitoring and mitigation with passive data collection and advanced logging capabilities. Tripwire also helps detect configuration drift and maintains system integrity and compliance with industry standards such as IEC 62443 and NIST SP 800-82. More from Tripwire.
The EU Commission has issued a proposal for a European Cybersecurity Taxonomy, to align cybersecurity terminologies, definitions and domains into a coherent and comprehensive taxonomy to facilitate the categorization of EU cybersecurity competencies.
Ken Munro (Pen Test Partners), speaking at the 2019 OilComm conference in Houston, gave a keynote talk on ‘hacking a mobile drilling platform’. Most ships and MODUs are connected to the internet and are likely visible with a tool like Shodan, possibly exposing sensitive information like passwords. Vessels can be tracked in real time with live AIS data. Satcom systems and other onboard hardware including IoT devices represent a multiplicity of attack points unless password access is configured correctly. By default, this is unlikely. System upgrades may reset passwords to ‘admin’. Some hydraulics and industrial control systems have no security at all, a fact that does not seem to trouble the vendors unduly! Marine electronic chart displays have been spoofed to make vessels appear to be 1km wide ‘blocking’ shipping lanes. Autopilots have been hacked. Check out Munro’s blog for advice on satcom systems security hardening and to get a security audit of your rig from the experts.
The US National Institute of Standards and Technology has also been investigating cyber risks across the supply chain. A draft publication, ‘Key Practices in Cyber Supply Chain Risk Management (Draft NISTIR 8276)’, proposes strategies to address cybersecurity issues posed by systems built using components and services supplied by third parties.
Yokogawa has obtained ISASecure security development lifecycle assurance (SDLA) certification from the ISA Security Compliance Institute. The certification, obtained through a third-party evaluation, assures that Yokogawa’s development processes meet the requirements for developing secure control system products. Yokogawa previously obtained ISASecure for its Centum VP integrated production control system and ProSafe-RS safety instrumented system. Certification was granted on the basis of an examination to verify compliance with the IEC 62443-4-1 standard and certain other requirements.
Aker BP and Cognite are to explore how robotics systems can be used to make offshore operations safer and more efficient. Cognite’s AI toolset will be tested using robots and drones on the Aker BP operated Skarv Norwegian Sea facility.
Baker Hughes’ first venture company, Avitas, a provider of inspection and monitoring solutions is using machine learning to automate manual processes in customers’ operations. Robotics and drones can automate well pad inspections reducing ‘windshield-time’ and enhancing safety. Avitas also contributes to Baker Hughes’ Lumen offering for emissions reduction.
Baker Hughes and C3.ai have launched BHC3 Production Optimization, an AI-based production data visualization, forecasting and optimization toolset, the latest addition to the BHC3 portfolio of AI applications.
Total has renewed its contract with CGG for the operations of its dedicated processing center in Pau, France. The new contract runs for five years.
Wintershall Dea is to deploy the Coupa BSM spend management solution across the merged company.
Ecolog International and Smartline Technology are to partner on a pit-less/de-watering solution, a mobile, closed-loop drilling mud management system.
eDrilling and SK Oilfield are teaming to promote modern drilling technology in India. eDrilling has also reported on successes with clients Gazpromneft and Sinopec.
Emerson has teamed with Quantum Reservoir Impact (QRI) on AI-based analytics and decision-making tools for exploration and production. QRI provides ‘augmented’ AI, machine learning and advanced analytics for asset and reservoir management.
Equinor and Shell are to collaborate on digital solutions, exchanging expertise in data science, artificial intelligence and 3D printing. The deal entails ‘co-innovation’ across maintenance, production optimization and supply chain management. Equinor’s Torbjørn Folgerø cited OSDU, the Open Subsurface Data Universe as an example of the benefits to accrue across safety, reduced emissions and value-add from digital technologies.
Linde is to leverage Fieldbit’s augmented reality technology to enhance remote support and safety. Operators with mobile devices or smart glasses will be coached by remote experts on inspection and maintenance tasks.
geoLOGIC has announced ‘basinINTEL’, providing ‘pre-computed, data-driven, accurate, unbiased and repeatable well production forecasts for every well in western Canada’. basinINTEL combines first principles-based analysis of geoLOGIC’s production data with BetaZi’s ‘physio-statistical’ algorithm, trained on 4 million wells. More from geoLOGIC.
Pertamina has awarded Halliburton a contract for the deployment of Landmark’s DecisionSpace 365 petro-technical applications on the iEnergy cloud. The multi-year contract covers artificial intelligence, machine learning and data analytics in support of Pertamina’s digital transformation.
Hexagon and OSIsoft are collaborating on digital twins for industry with the HxGN SDx connector for OSIsoft. The HxGN SDx digital twin platform combines multiple data streams to improve asset performance and reduce costs at oil and gas operations and other process industry facilities. PI tags in SDx P&IDs can be clicked on to display trend data.
Tank terminal IT providers Implico and Brainum have announced Supply Chain United and QINO vNext. SCU is a cross-company, cloud/web services supply chain orchestration solution for tank storage, forecasting, dispatching and sales. SCU embeds QINO vNext, a new cloud-based terminal management system.
Kongsberg Digital is to deliver its K-Sim Safety firefighting simulator to the Norwegian Society for Sea Rescue (NSSR) for use at the Horten, Norway training center. K-Sim’s interactive 3D ‘WalkThrough’ software combines object and equipment models with immersive visuals, exposing trainees to ‘all conceivable scenarios and situations’.
Petrofac has selected Microsoft Azure’s Internet of Things for its digital Connected Construction platform. The solution was developed in conjunction with Accenture’s Industry X.0 initiative and is undergoing trials at a Petrofac EPC project in the Middle East.
CGG Jason has ported Jason RockMod to the Azure cloud with help from Microsoft’s AzureCAT unit. A master node of RockMod on a Windows virtual machine distributes multiple realizations across Linux-based HPC nodes. CGG has also been trialing InsightEarth on Azure.
Nextoil has partnered with Nutech Energy Alliance to combine its reserves certification and intelligence with Nutech’s reservoir characterization, leveraging the Next-ai analytic engine and analog database. More from Nextoil.
Norwegian Omega provided document management, risk analysis and cost control through its PIMS flagship to the Maersk (now Total) Culzean North Sea HP/HT development.
Vermillion Energy is to deploy Omnira Software’s Mosaic reserves and economic forecasting tool for its corporate portfolio and asset management requirements. The choice was made after an ‘extensive evaluation process’. Omnira is a Constellation Software unit.
Petrofac has signed a partnership with MODS, a digitalization and dimensional control service company, to deploy its flagship product ‘Virtual Manager Enterprise’ in the North Sea. The VME hub facilitates collaboration between onshore and offshore personnel and provides real-time progress tracking.
Petrolink has joined the Open Subsurface Data Universe which it sees as ‘a key step in the journey towards true collaboration across the industry’.
Petrosys has partnered with Rogii to develop its StarSteer well placement/geosteering platform and SOLO collaborative field development solution in the Pacific region.
PGS has entered into an agreement with Google Cloud as its ‘preferred cloud provider’. PGS plans to image seismic data in the cloud and launch a cloud-based multi-client sales platform. Longer term, PGS plans to leverage machine learning and artificial intelligence for subsurface data analytics.
Energistics has certified proNova’s WITSML Store v4.7 as compliant with the WITSML data exchange standard.
Dallas-based oil and gas private equity company PetroCap has chosen Quorum Software’s Upstream On Demand cloud-based solution to streamline financial and operational reporting across its projects.
Italgas is to deploy RealWear’s HMT-1Z1, an intrinsically safe smart helmet that runs OverIT’s field services software. The helmet and AR communications device is certified for operations in ATEX Zone 1 restricted zones.
Northern Offshore has awarded RigNet a contract for VSAT managed communications and other services for its fleet of rigs in the Middle East.
Robert Bosch GmbH, through its global center for artificial intelligence has joined Norway’s Sirius R&D establishment, a University of Oslo unit.
Metroval, working for EPC Modec, has specified Rotork Intelligent IQ3 multi-turn electric actuators and other kit for an FPSO that will operate on the Libra oil field in Brazil’s Santos Basin. Actuator data is analyzed with Rotork’s Insight 2 software.
SAP and Accenture have co-developed a cloud solution for upstream oil and gas clients. SAP S/4HANA Cloud for upstream oil and gas uses AI to increase visibility into operations and cash flow. The solution includes contributions from end users ConocoPhillips and Shell.
Schlumberger has announced the Egypt Upstream Gateway, a joint venture with the Egyptian Ministry of Petroleum for digitizing subsurface information and delivering a digital subsurface platform built atop GAIA, along with components of the Delfi E&P ‘cognitive’ suite.
Stress Engineering Services and Optimum Program have formed the RiGUARD joint venture to combine SES’ offshore drilling analytics expertise with Optimum’s data-driven drilling riser maintenance program. The solution targets riser condition-based maintenance and fatigue analysis.
VELO3D and Duncan Machine Products are now capable of 3D printing downhole tools for on and offshore oil and gas facilities. VELO3D’s Sapphire metal additive manufacturing printer handles complex geometries without part redesign.
Chevron has awarded Wood a multimillion-dollar engineering design project for its Anchor deep water development in the Gulf of Mexico. Anchor is a ‘wet tree’ development using a semi-submersible floating production unit, and is said to be the ‘first deep water high-pressure development to achieve a final investment decision’.
The first Deep-Time Digital Earth (DDE) standards task group face-to-face meeting was held in Beijing earlier this year. DDE is to harmonize global deep-time digital earth data and share global geoscience knowledge. Tim Duffy from the OneGeology organization is advising on standards-based informatics and interoperable standards in geoscience.
The Open Group has just released O-PAS Version 2.0, Part 1 – a technical architecture overview, the first of 13 planned subsections of the ExxonMobil-backed process control standard. The document describes OCF, the O-PAS connectivity framework, a royalty-free specification for secure and interoperable communications, to be released in the upcoming Part 4. The OCF is conceptually a single network but will be implemented as separate, segmented networks to assure security and quality of service. Version 2.0 defines a ‘system of control systems and applications created and maintained by external management services’. A future version of the standard is to address platform-independent applications and the physical hardware layer. ExxonMobil has contracted with Yokogawa to run a new OPA Test Bed Collaboration Center in The Woodlands, Texas.
The Industrial Internet Consortium and the 5G Alliance for connected industries and automation (5G-ACIA) are to cooperate on interoperability, prevent fragmentation and maximize synergies of 5G for the Industrial Internet. The collaboration will address wireless connectivity and digital transformation across industries. More from the release.
ISO has announced an update of the international standard for descriptive metadata. ISO 15836-2, also known as the Dublin Core metadata element set Part 2 adds 40 new metadata properties to improve the precision and expressiveness of Dublin Core descriptions. ISO 15836-2 was developed by the ISO/TC 46 technical committee.
The Open Geospatial Consortium seeks public comment on the OGC API - Features - Part 2: Coordinate Reference Systems candidate standard. OGC API - Features allows for the creation, modification and query of geographic features on the web. The new standard is consistent with the OGC/W3C spatial data on the web best practices.
OGC and Natural Resources Canada are requesting input to the proposed revamp of Canada’s spatial data infrastructure (SDI). OGC is seeking suggestions for open APIs, machine learning, data lakes, the cloud and … blockchains!
The OMG’s AI Platform Task Force held its first meeting in September 2019 chaired by Claude Baudoin (cébé IT). The task force has set up a Wiki https://www.omgwiki.org/ai/doku.php open to OMG members and non-members. The aim is for the OMG to be ‘the place to come for AI technology standardization’.
The OPC Foundation has been ‘aggressively’ working on its OPC UA certification program, helping vendors verify and certify applications. OPC expects the certification tool to be of use in the ExxonMobil/The Open Group’s O-PAS standard (above) of which OPC UA is a ‘key part’. OPC has it that O-PAS ‘will require 100% certification of all compliant products’. The Compliance Test Tool is available for both Windows and Linux and includes PLCopen test scripts.
The OPC UA Safety working group has published Release 1.00 of its specification for interoperable communications functional safety, OPC UA Safety. OPC UA Safety comes under the auspices of the Field Level Communication (FLC) initiative.
The European Petroleum Survey Group (EPSG), the geodetics arm of IOGP, is in the process of upgrading the EPSG online registry and support web site to align with the revised ISO 19111:2019 data model. A beta EPSG repository is available for evaluation, testing and API access. The new platform is expected to become the master EPSG registry mid-year 2020. More from the IOGP blog.
PIDX International has revised its PriceSheet XSD schema. Originally published in 2014, the 2020 edition is available royalty-free. The new schema allows for specific part information parameters to be exchanged between trading partners, ‘improving the efficiency and data accuracy of exchanging price books between suppliers and operators’ according to Margret Saniel of OFS Portal and chair of the price sheet project.
Weatherford’s Adrian Vuyk teamed with Jordan Reynolds (Kalypso) to present the results of a trial of AI/ML in optimizing milling through casing operations by determining the best settings for faster and safer drilling in changing formation conditions. ML was applied to data from past successful and failed casing exit jobs to identify the key factors involved. Open source tools and Python exploratory data analysis across some 18 measured parameters were run on a GPU machine. Particle swarm optimization was found to out-perform gradient-based algorithms that tend to get stuck in local minima. An ‘edge’ intelligent device talks to Weatherford’s AccuView system onshore. The results are ‘preliminary’ but AI is believed to have enormous potential.
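Particle swarm optimization is a gradient-free, population-based search, which is why it copes better with the local minima that trap gradient methods. The sketch below runs a textbook PSO on a deliberately multi-modal test function standing in for a drilling-parameter cost function; it is an illustration, not Weatherford’s code.

```python
# Minimal particle swarm optimization on the Rastrigin function, a
# standard multi-modal test case with many local minima and a global
# minimum at the origin.
import numpy as np

def objective(x):
    return np.sum(x**2 - 10 * np.cos(2 * np.pi * x) + 10, axis=-1)

rng = np.random.default_rng(0)
n_particles, dims, iters = 30, 2, 200
pos = rng.uniform(-5, 5, (n_particles, dims))
vel = np.zeros_like(pos)
pbest, pbest_val = pos.copy(), objective(pos)      # per-particle bests
gbest = pbest[np.argmin(pbest_val)].copy()         # swarm-wide best

for _ in range(iters):
    r1, r2 = rng.random((2, n_particles, dims))
    # Inertia plus attraction to personal and global bests
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = pos + vel
    val = objective(pos)
    improved = val < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], val[improved]
    gbest = pbest[np.argmin(pbest_val)].copy()

print(gbest, objective(gbest))   # should approach the global minimum at [0, 0]
```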
Debbie Rothe showed how Dow Chemical has deployed enterprise manufacturing intelligence (EMI) dashboards across its environmental assets. The dashboards show an aerial view of the plant with pop-ups of asset data, highlighting environmental issues. Following successes at local plants, Dow has now developed a global roll-up dashboard for its worldwide plants. Management of change was key to the program and was enabled by engagement with the Prosci community and the Prosci Change Triangle, a methodology for driving through change in the face of different levels of opposition.
Robb Bunge (Noble Energy) and Rebecca Thomas (i2k Connect) began with a timeline for analytics stretching from linear regression in the 1700s to today’s (or tomorrow’s?) OSDU. Today, analytics can unveil nonlinear, multi-dimensional relationships between different data types. Noble leverages i2k Connect’s AI platform for automated data QC and loading of log curves and other sources, along with integrated document search from the SPE’s OnePetro and other repositories. Use cases include pore pressure/geomechanical studies for drilling safety and logistics efficiency.
Steve Bitar provided an update on ExxonMobil’s ambitious Open process automation (OPA) initiative. This is currently undergoing a two-year (2020-2021) test of OPA standards and components in collaboration with Yokogawa and other partners (notably Aramco and ConocoPhillips). Field trials will come in 2021-2022. Exxon has trialed a prototype at its catalyst testing R&D facility, connecting various data sources through a real-time bus (Matrikon/OPC-UA) to applications running in a Dell/EMC ‘advanced computing platform’. The trial has leveraged the IEC 61499 function block, a software object used to build applications. Each FB is independent and encapsulates data, variables and programs. FBs can be combined into applications. Several vendors involved in the trial (Siemens, Yokogawa, Schneider and Rockwell) were invited to build FBs independently. The resulting test showed that it was possible to assemble a ‘cohesive’ application from four different developers’ function blocks. A written interface description is sufficient to ensure correct use. Intellectual property can be protected via pre-compiled target libraries. Developers can begin to write interoperable function blocks today for licensing on future OPA-compliant systems. Bitar concluded that ‘the OPA interface will do for FBs what Foundation Fieldbus did for devices!’
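Those unfamiliar with IEC 61499 may find a loose software analogy helpful. In the Python sketch below, each ‘function block’ encapsulates its own logic behind a small published interface, and independently written blocks are chained into an application, which is the interoperability property the Exxon trial demonstrated. This is an illustration only, not IEC 61499 itself.

```python
# Loose Python analogy for IEC 61499-style function blocks: encapsulated
# state and logic behind a small interface, composable into applications.
class FunctionBlock:
    """Encapsulated state plus an 'execute' interface."""
    def __init__(self, name):
        self.name = name

    def execute(self, inputs: dict) -> dict:
        raise NotImplementedError

class ScaleBlock(FunctionBlock):
    def execute(self, inputs):
        return {"value": inputs["value"] * 0.1}   # raw counts to engineering units

class AlarmBlock(FunctionBlock):
    def execute(self, inputs):
        return {"alarm": inputs["value"] > 80.0}  # high-value alarm check

# Blocks written independently, composed into an application
app = [ScaleBlock("scale"), AlarmBlock("hi_alarm")]
signal = {"value": 900.0}
for block in app:
    signal.update(block.execute(signal))
print(signal)   # {'value': 90.0, 'alarm': True}
```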
Barry Kelley described Koch Industries’ work on leak detection and repair (LDAR). Today this is manual and repetitive requiring complex, physically demanding and potentially hazardous work in industrial settings. Finding quality talent capable of accurately collecting and analyzing the high volumes of data needed for regulatory reporting is a challenge. Koch’s vision is to automate LDAR process using modern sensing technologies and analytics to reduce emissions, improve reporting and optimize operations. An example of the new approach is an automated leak detection sensor that uses Molex Sensorcon wireless emissions detectors and mSyte analytics platform. When the monitor detects a leak, an alert is sent to designated managers who dispatch an LDAR technician to pinpoint the leak using portable instruments such as a VOC sniffer. The system was calibrated with controlled release testing at the EPA’s Research Triangle Park facility in North Carolina. Longer term tests at the Flint Hills (a Koch unit) Sour Lake Olefins facility preceded a full-scale demonstration in process units at the Corpus Christi refinery. Koch is now augmenting the system with multi-level sensors across the plant. Kelley concluded that Koch and the EPA have successfully developed and tested a first-of-its-kind leak detection sensor network that will lead to cleaner air for all. Koch is now seeking regulatory approval for the new approach.
Fakhri Landolsi sees Equinor’s future as ‘robotized and automated’. But today, ‘where do you put your AI/ML investment?’ Beware of the ‘AI return on investment fallacy’. AI models do not deliver value in a ‘linear case by case fashion’ as ‘the time and cost of building a model exceeds most returns from deploying them’. Operators need to build ‘nonlinearities’ into the way ML models are delivered across large, complex organizations. Success will (eventually) come from ‘ubiquitous AI/ML driving machines everywhere’. Success means scaling fast, which can be an organizational challenge, and ‘creating non-linearity’ by solving classes of problems whose solutions can be deployed multiple times at very low cost. Enter the Equinor AI/ML ‘chassis’, an AI/ML framework upon which business teams can build and deploy models. ML experts need to automate their work, turning their AI/ML ‘cottage industries’ into factories. Communication is key!
Next year’s IQPC Houston Intelligent Automation in Oil & Gas is scheduled for 22-23 February 2021.
Speaking at the 2019 Complex Systems Design & Management conference, Thierry Forsans presented Total’s Digiref project, an initiative to harmonize and simplify Total’s engineering procurement documentation by ‘digitalization’. The project, which kicked-off in 2019, involved around 530 documents and 104,000 individual requirements from 17 technical disciplines.
Digiref involves the migration of specifications and requirements documentation from paper and PDF documents into a cloud-based database using an ‘off the shelf’ software package. The new system aims to simplify authoring and use across Total E&P projects and operations and, in the future, contractors and vendors. Documents and datasheets have been analyzed down to the level of individual specifications and rules to distinguish between technical, quality, contractual, and design requirements.
Digiref specifies areas where international standards and Total’s own company-specific requirements apply. ‘Criticality ratings’ and their justification are said to ‘ease the derogation process’. Digiref is set to ‘increase the visibility of technical requirements for all stakeholders and lead to a better understanding of Total’s engineering specifications’ and will ‘reduce project costs by facilitating the work of engineering contractors and equipment vendors’.
It is important to remember that the IEA’s report on the oil and gas industry in energy transitions* was written before both the coronavirus pandemic and the subsequent oil price implosion. This review likewise is written without considering the impact of these game-changing events. It is also important to underline the IEA’s premise, as it says on the can, that the oil and gas industry will be impacted by the energy transition. Or, as is stated in the introduction, ‘Rising concentrations of GHGs in the atmosphere, changing energy dynamics, and growing social and environmental pressures represent huge challenges for the oil and gas industry. The twin threats are a loss of financial profitability and a loss of social acceptability. There are already signs of both, whether in financial markets or in the reflexive antipathy towards fossil fuels that is increasingly visible in the public debate, at least in parts of Europe and North America. Either of these threats would be sufficient to fundamentally change the relationship of oil and gas companies with the societies in which they operate. Together, they require a rethink of the way that the industry conducts its business. Climate change is not a problem that can be solved in the existing oil and gas paradigm’.
To get an idea of the magnitude of the problem for oil and gas in the context of the above, page 97 shows that ‘stranded reserves’, i.e. oil and gas that has already been found, amount to around three times the allowable carbon budget for 2°C of warming. On page 100, we read that ‘stranded capital’, i.e. monies that have already been invested in condemned oil and gas projects, amounts to some $250 billion.
Oil IT Journal’s position on the greening of the industry is that carbon capture and storage (CCS) is the only real ‘solution’ for oil and gas in a low to zero carbon world. Oils investing in wind or PV is all well and good, but these are orthogonal to the fate of oil and gas. As long as there is demand for oil, any diversion of an oil company’s efforts towards green technologies will be compensated by others stepping into the breach. Another lever that oils can activate for a greener industry is reducing or eliminating the ‘15% of global energy related GHG that come from methane leaks to the atmosphere’.
But to get back to CCS, what does the IEA have to say here? ‘The oil and gas industry will be critical for some key capital-intensive clean energy technologies [.. including ..] the development of carbon capture storage and utilization (CCUS**)’. ‘Scaling up these technologies and bringing down their costs will rely on large-scale engineering and project management capabilities, qualities that are a good match to those of large oil and gas companies’. ‘For CCUS, three-quarters of the CO2 captured today in large-scale facilities is from oil and gas operations, and the industry accounts for more than one-third of overall spending on CCUS projects’. So far so good, but then ‘If the industry can partner with governments and other stakeholders to create viable business models for large-scale investment, this could provide a major boost to deployment’. Now what exactly does that mean? Apart from a few well-endowed companies seeking to burnish their green credentials, the only way that large scale ‘investment’ will be made in something that brings in no revenue is government and ‘other stakeholders’ forcing oils to do CCS.
The IEA reports that ‘financial, social and political pressures on the industry are rising’. Capital markets are affected as climate-related shareholder resolutions and investor collaborations, such as the Climate Action 100+, ‘increasingly seek to facilitate engagement on sustainability issues’. The IEA also sees banks and other financial institutions reducing their exposure to oil and gas as they have already done for coal. Other pressures come in the form of ‘sustainable finance’ as per the Michael Bloomberg-backed Task Force on climate-related financial disclosures. Nimby-style opposition to infrastructure projects and a push to keep fossil fuels in the ground have led to lengthy permitting procedures and litigation leading to project delays and cost overruns. Some projects have been indefinitely postponed or canceled and fracking is either banned or impossible in much of Europe, in New York, California and Quebec and in some states of Australia.
The IEA sounds wobbly when trying to explain the role of oils in the energy transition. ‘The transformation of the energy sector can happen without the oil and gas industry, but it would be more difficult and more expensive’. ‘Oil and gas companies need to clarify the implications of energy transitions for their operations and business models, and to explain the contributions that they can make to accelerate the pace of change’. ‘Climate impacts will become more visible and severe over the coming years, increasing the pressure on all elements of society to find solutions. These solutions cannot be found within today’s oil and gas paradigm’.
The IEA report has it that while worldwide wind and solar investment in the period from 2015-18 was just short of a trillion dollars, investment in CCUS was under $5 billion, of which only $2 billion came from a study group of oil and gas companies and NOCs. Unfortunately, investment in CCUS (the only technology that can mitigate oil’s carbon footprint) has declined enormously, from $400 million in 2015 to around $150 million. Interestingly though, corporate venture capital spend on CCUS has increased significantly, from zero in 2015 to a (still rather measly) $40 million.
The IEA does offer some comfort to those working in oil and gas. While oil demand is forecast to fall by around 2.5% per year out into the 2030s, this rapid drop is well short of the decline in production that would occur if all capital investment were to cease immediately. This would lead to a loss of over 8% of supply each year. Thus ‘continued investment in existing oil fields, as well as some new ones, remains a necessary part of the energy transition’.
In conclusion, the world is not on track to deliver the emissions reductions required for the SDS, the IEA’s sustainable development scenario. Achieving these will require well-designed policies from governments (including carbon pricing) to promote research, development and large-scale deployment of the relevant technologies and infrastructure.
The IEA’s flagship in this context is Denmark’s DONG (Dansk Olie og Naturgas) which sold its oil and gas business to INEOS and is now ‘a leading light’ in offshore wind in particular. However, as the IEA admits, ‘the oil and gas assets continue to produce under different ownership, a point sometimes overlooked by the divestment movement’ while DONG, now Ørsted, ‘has achieved impressive reductions in its overall emissions intensity’.
Apart from the irony of such ‘production under new ownership’, the IEA reports on the dilemma for today’s fuel companies that are looking at becoming ‘energy’ companies, a dilemma that stems from the fact that oil and gas has been and remains a successful business that has rewarded shareholders with robust dividends. Any transition to ‘energy’ could risk, at least in the near term, these returns. While low-carbon energy businesses can be profitable, the returns for these segments have generally been lower than for hydrocarbons. Oil companies are also at risk as they move into areas where they may currently have much less of a comparative advantage. ‘Companies that are embracing the transition from fuel to energy are attempting to straddle divergent possible outcomes and risks’. IEA forecasts however suggest that ‘however fast energy demand grows in the future, electricity demand grows more quickly’. The composition of this electricity will shift towards lower-carbon sources, presenting ‘real market opportunities for related businesses seeking to grow or to offset shrinking markets elsewhere’.
* International Energy Agency The oil and gas industry in energy transitions.
** While CCS, carbon capture and sequestration, is fairly clear-cut, the addition of ‘U’ for use opens the debate to some more contentious issues. Using CO2 in enhanced oil recovery (one of the main ‘U’s today) begs the question of the carbon produced from the extra production. Other uses are even more fantastical. Turning CO2 into a syn-fuel is thermodynamically expensive and releases more CO2 when it is burned. Using CO2 in greenhouses is a neat idea but we understand that most greenhouse CO2 is vented before it is absorbed.